Helping the Environment

I’ve put up a few posts here (Green ICT, How to move 160 Servers, Did You Notice?) about moving our datacentre and how amongst other things this will help the University reduce its environmental footprint. Well, we’ve set up the shared datacentre along with the University of Aberdeen and North East Scotland College and it’s great to see that our achievement has been recognised at national level.

EAUC (The Environmental Association of Universities and Colleges) have an annual Green Gown award ceremony, and the Shared Datacentre was a winner in the category of Modernisation: Effectiveness & Efficiency in the Estate. There’s a wee writeup of this in RGU’s “Green Times”.

The British Computer Society UK IT Industry Awards recognised our venture as a winner in their “datacentre of the year” category.

Computer Weekly has featured the project in their European User Awards "as an exemplar of public-sector excellence and green efficiency". It’s worth reading this article for some insight into the slightly trickier moments of the project – ". . . like a Bond movie."

With this success under our belt, we are now actively pursuing a shared project to develop a backup datacentre – work should start in earnest in January. This one will be on RGU’s Campus and once again will be designed with the environment in mind and take advantage of the abundant supply of cool fresh air of the North East (yes, summer as well as winter) to keep our servers at the optimum temperature without spending a fortune on air conditioning. I’m definitely hoping for a white Christmas . . .

It’s not a side of IT that our staff and students often get to see, but I just thought you’d like to know how much we are leading the field here and what we’ve managed to achieve by sharing with other institutions. My thanks to RGU IT Staff and colleagues in the University of Aberdeen and North East Scotland College!


Did You Notice?

Sometimes the greatest successes are the ones which go largely un-noticed. I reminded you last week about the work that was taking place this past weekend to move our Faculty of Health and Social Care datacentre across to the new shared datacentre at the University of Aberdeen. I’m pleased to say that the work went very well and that all the servers were successfully relocated and are now operating in the new datacentre with little disruption over the weekend.

Yes, there were plenty of glitches along the way, but the team had, and used, plans “B” and “C” and successfully overcame all the critical problems. For most of the weekend, the University was running on just one datacentre instead of the normal two. Thanks to much work over the past several years, however, all our critical services continued to operate as normal over the weekend with just the occasional short pause when things were being restarted. E-mail, web site, Moodle VLE, My Apps etc were all working over the weekend. A few pieces of hardware had problems starting up, but thanks to the use of “virtualisation” technology we were able to just move the “virtual” servers onto other hardware and continue services as normal until the faulty hardware is repaired.

My thanks and well done to all those involved from IT Services!


The First Items to be removed from the Old Data Centre.

Server No. 7 on its way out.

1 engineer happy to be found amongst the cables.

Let’s get some bubble wrap around server No. 16.

35 Servers, 2 large Tape Libraries all wrapped up and ready to go.


Where did I put that server?
(It’s behind you mate).

A collection of worried engineers.

A nice collection of cables.
(Connected at Last)

Moving Datacentre – next two weeks

There have been a couple of postings about moving our servers into a new datacentre (Green ICT, How to Move 160 Servers). Well, now the time has come! As a reminder, we plan to move all of our servers out of the Faculty of Health and Social Care Server room into a refurbished datacentre shared with the University of Aberdeen and Aberdeen College. Over the past few months IT Services staff have been preparing the ground, making sure that the network connections are all working and that the services being moved are ready. Some of this work is highly specialised – one of the essential components to link the network had not been properly configured and had to be returned to Japan for further work, a round trip of 3 weeks. IT Services also found that the configuration of some services had to be changed to allow them to operate in the new datacentre, and you will have seen a few small outages to allow these services to be updated.

The moves are going to happen over the next 2 weeks. ITS are not moving everything in one go, but will carry out the move in 2 or 3 stages. ITS will be able to move some of the services without any downtime at all; others will require some unavoidable downtime.

The first batch are going to move this week, but the main move is planned to be the weekend of 27th July. That weekend in particular will be a substantial move and there is likely to be some downtime over the weekend, so do keep an eye on the information notices which will be issued and plan your work and studies around this.

Once this is complete, we will have removed a significant risk from our IT infrastructure by being able to decommission the old server room (see When Things Get Hot). We will also greatly improve our environmental credentials. One key measure for datacentres is “Power Usage Effectiveness”, or PUE. This is a measure of the total amount of power used by the datacentre, divided by the raw power used just by the servers – and the reason this figure is important is that older datacentres use a lot of extra power just to keep the servers cool. So, for example, if you have servers consuming 100kW of power, and if you need another 100kW of air conditioning to keep them cool, then the datacentre is using 200kW of power in total. The PUE is 200/100 = 2. We want a figure which is as close to 1 as possible – the lower the better.
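The PUE arithmetic above can be sketched in a few lines of Python. This is purely illustrative – the 100kW/200kW figures are the worked example from the text, and the 1.2 and 1.08 figures are the targets quoted for the shared datacentre, not live measurements:

```python
def pue(total_power_kw: float, it_power_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by
    the raw power used just by the servers. 1.0 is the theoretical
    ideal; older server rooms often sit near 2.0."""
    if it_power_kw <= 0:
        raise ValueError("IT power must be positive")
    return total_power_kw / it_power_kw

# Worked example from the text: 100kW of servers plus 100kW of cooling.
print(pue(200, 100))   # -> 2.0

# The shared datacentre's figures (per 100kW of servers):
print(pue(120, 100))   # target average of 1.2 or less
print(pue(108, 100))   # best figure reached so far: 1.08
```

The lower the result, the less power is being spent on anything other than the servers themselves.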

In the shared datacentre we’re aiming for an average PUE of 1.2 or less, and have already reached figures as low as 1.08 at times, which means we are using much less additional power to cool the servers. That sort of figure is close to Facebook’s big new datacentre in Sweden, and we anticipate we will reduce our carbon output by around 230,000 kg per annum.

Green ICT

Hopefully you will have read the recent edition of RGU’s “Green Times” – if not you can read it here.

It includes an article which shows the environmental impact of PCs being left switched on and what you can do to help – by turning your PC off when it is not in use. What is less obvious to our staff and students is the impact of “behind the scenes” ICT. Way back in 2007, the Gartner Group estimated that, globally, information and communications technology contributes some 2% of total carbon emissions.

That’s about the same as the aviation industry. A quarter of the ICT related emissions come from data centres running servers, which then require further energy to keep them cool. . .

We have many servers running in our server rooms on Campus, and the rooms themselves are nowhere near as efficient as a modern datacentre. We expend much more energy on cooling than we need to. Aberdeen University is in a similar situation, as are Aberdeen College and Banff and Buchan College (although they have recently upgraded their server room), and at the moment we all run our data centres independently of each other. Over the past 3 years we have been working hard to see how we could collaborate to reduce all our costs and carbon emissions.

This culminated in an agreement earlier this year to move into a shared datacentre by upgrading space in the University of Aberdeen. It’s currently under construction, and will be ready by the Spring of 2013. Initially, it will become the primary datacentre for Aberdeen University, Robert Gordon University and Aberdeen College, with Banff and Buchan using the facility at a later date.

The environmental impact of this will be substantial. We estimate that the total power consumption of all the servers from the 3 institutions is 220kW. In our own separate, old data centres at the moment we probably use the same amount of power again just to keep the servers cool (220kW is roughly 22 electric cookers, with everything switched on, running all the time – just picture it). CO2 emissions from all that will come to 2030 metric tonnes per annum.

By packing all these servers into one modern datacentre we will slash the energy required for cooling. We estimate that our total power consumption will drop from 441kW to about 264kW. As an added bonus, much of the electricity will be generated by Aberdeen University’s combined heat and power plant, with lower associated carbon emissions. In total, we anticipate saving 1061 tonnes of CO2 per annum.
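As a back-of-envelope check, the savings above can be reproduced in a few lines of Python. Note the carbon-intensity figure at the end is simply the value implied by dividing the quoted 1061 tonnes by the energy saved – it is a derived number for illustration, not an official grid conversion factor:

```python
HOURS_PER_YEAR = 24 * 365  # 8760 hours, running continuously

old_power_kw = 441.0   # separate, old datacentres: servers plus cooling
new_power_kw = 264.0   # estimated shared datacentre consumption
saved_kw = old_power_kw - new_power_kw             # 177 kW of continuous load

saved_kwh_per_year = saved_kw * HOURS_PER_YEAR     # roughly 1.55 million kWh

# The post quotes ~1061 tonnes of CO2 saved per annum; dividing by the
# energy saved gives the grid carbon intensity implied by that estimate.
implied_intensity = 1_061_000 / saved_kwh_per_year  # kg CO2 per kWh
print(f"{saved_kw:.0f} kW saved, ~{saved_kwh_per_year/1e6:.2f} GWh per year")
print(f"implied carbon intensity: {implied_intensity:.2f} kg CO2/kWh")
```

Running this shows the quoted tonnage is consistent with a grid intensity of roughly 0.68 kg CO2 per kWh, a plausible figure for the UK grid at the time of writing.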

And into the bargain, the institutions will save £2.6m collectively over the next 10 years.