New Building for the Scott Sutherland School of Architecture and Built Environment

The next phase of construction in our Riverside East building – the new Scott Sutherland School – is now nearing completion. This has been built as a new wing onto the south of the existing building, and will house the School and the Faculty Office for the Faculty of Design and Technology. Early in June, the contractors expect the building to be ready for the University to start fitting it out with its own furnishings and equipment, ready for occupation from July. Have a look at Daniel Doolan’s blog for some great photos of the construction work for this project – and indeed the rest of Riverside East.

One of the most important tasks in getting the building ready is the installation of the IT and Audio Visual (AV) equipment. Some of this has already taken place – the IT network cabling was installed as part of the construction work, and the company installing the AV equipment has been working alongside the main contractor, so much of the AV kit is already in place.

Over the next few weeks, staff from IT Services and the University IT/AV team will be working to install and commission all the remaining IT facilities. These activities can’t really start until the main building work is finished as they depend on a clean environment and the furniture installation.

First of all, IT staff will test the fibre optic connection that links the new building to the rest of Riverside East. The next stage is to start installing the network switches in the communications rooms within the building. The switches connect to the cabling and control all the IT network traffic throughout the building so nothing really works until these switches are in place. It’s not just a question of plugging them in – they all have to be configured correctly to operate with the rest of the University network.
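For the technically curious, here is a rough idea of what that configuration work looks like in practice. This is only an illustrative Python sketch using the netmiko library against a generic Cisco IOS-style switch – the post doesn’t say which vendor or settings we actually use, so the hostname, credentials and VLAN numbers below are entirely hypothetical.

```python
# A minimal sketch, assuming Cisco IOS-style switches managed with the netmiko
# library. Hostname, credentials and VLAN IDs are all hypothetical examples.
from netmiko import ConnectHandler

switch = {
    "device_type": "cisco_ios",
    "host": "sss-comms-room-1.example.ac.uk",  # hypothetical address
    "username": "netadmin",
    "password": "********",
}

# Example configuration: define a staff VLAN and trunk it up to the building core.
config_commands = [
    "vlan 210",
    "name STAFF-DESKTOPS",
    "interface GigabitEthernet1/0/48",
    "description Uplink to building core",
    "switchport mode trunk",
    "switchport trunk allowed vlan 210",
]

with ConnectHandler(**switch) as conn:
    output = conn.send_config_set(config_commands)
    conn.save_config()  # write the running config to startup config
    print(output)
```

Multiply that by every switch, every port and every uplink in the building, plus testing, and you get a sense of why the commissioning work takes several weeks.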

As the switches are being commissioned, IT staff will also be checking all the AV equipment and making sure that it works across the network. Then, once the furniture arrives and the switches are ready, IT staff can start to install PCs on staff desks ready for staff to move in. Following this, the WiFi access points will be installed so that WiFi access is available in the new building, the printer/copiers will be installed and linked up to the network, and PCs will be installed in the IT labs and studio spaces in time for students to start using the new building.

This is a busy and very intensive period of work for IT staff and will be a key priority for them over the next two months. We are all looking forward to seeing the new building completed and occupied, and staff and students enjoying a modern new environment for the School!

Goodbye St Andrew Street!

Moving a data centre from one location to another without disrupting the organisation is a major undertaking. We successfully moved one data centre to a shared facility with the University of Aberdeen and North East Scotland College last year, and on the weekend of 24th / 25th May we moved our second data centre from St Andrew Street down to a new purpose-built facility on our Garthdee Campus. This one will also be shared with the University of Aberdeen and North East Scotland College.

The St Andrew Street building has been the University’s IT hub pretty much since IT began. When the national JANET network came into being, its main entry point was into St Andrew Street, and for many years the University’s main computer room was there, with network links to the rest of the Campus. The St Andrew Street building will be sold at the end of June, and this data centre move is part of a wider set of tasks to decommission the whole building. This has involved re-routing our external fibre connections (sorry about the roadworks!), and significant changes to the University’s overall Campus network so that we can unplug St Andrew St.

Down at Garthdee, there was construction work to build the new data centre, and this also involved significant changes to the Campus network at Garthdee so that the new data centre was connected into our network. All this work has been going on quietly over the past several months stage by stage.

Prior to the weekend of 24th / 25th of May, all of our critical systems had to be switched so that they were running fully from the North East Shared Data centre at the University of Aberdeen. That allowed the IT team over the weekend in May to shut everything down at St Andrew Street, move it down the road, and bring it back up again without any disruption. Once everything was up and running, critical services had to be rebalanced to run across the two data centres.

The School of Computing and Digital Media also had to move their servers from St Andrew Street down to the new data centre and they had completed their physical moves ahead of the 24th and 25th of May.

In a couple of weeks, the three remaining University departments will leave the St Andrew Street building for ever. Once they are gone, IT Engineers will disconnect the building and decommission the internal networks – that truly will represent the end of an era for us, but we are looking forward to moving down to Garthdee!

Here is the St Andrew Street data centre before the move:

And here it is looking pretty empty after everything was moved down to Garthdee:

And here is the new datacentre down at Garthdee:

Wi-Fi and Eduroam

Commissioning of the Wi-Fi system in the Riverside East building and Aberdeen Business School started on 14th August and will continue through to 27th September. The service will initially be focused on the Library tower, followed by the rest of Riverside East, and finishing with the redeveloped area of Aberdeen Business School.

The initial service will cover Windows devices and Apple phones and mobile devices. Other devices (e.g. Android devices and Macs) will be added as commissioning continues.

Further details on the commissioning of Wi-Fi are available on RGyoU. Follow the link and enter your username and password to access the page.

Staff and students at RGU will be pleased to know that as part of implementing this new Wi-Fi system we will be implementing “Eduroam”. What is Eduroam? Very simply, it is a worldwide arrangement whereby staff from one academic institution, when visiting another academic institution, can log in to the wireless network there using the username and password they normally use at home. So, staff or students at RGU can visit other participating institutions and log in with their RGU username and password.

Likewise, visitors to RGU from other participating institutions can sign on to the RGU wireless network using their own username and password.

The key benefit is that when you are visiting another institution, you don’t need to request a “guest” username and password, you can just log straight in. It works on laptops and mobile devices, and some institutions may allow you to connect to their wired network as well if you need to.
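For the technically curious, the underlying idea is that your login attempt is routed back to your home institution for checking, based on the realm part of your username. The real system uses a federated RADIUS hierarchy; the Python sketch below is only a toy illustration of that routing idea, and the institutions, usernames and passwords in it are made up:

```python
# Toy illustration of the Eduroam routing idea: the visited institution does not
# check your password itself, it forwards the request to your home institution
# based on the realm after the "@". This is not the real RADIUS protocol, and
# the institutions and credentials below are entirely made up.

HOME_SERVERS = {
    "rgu.ac.uk": {"jsmith": "correct-horse-battery"},
    "abdn.ac.uk": {"abloggs": "staple-gun-42"},
}

def authenticate(username: str, password: str) -> bool:
    user, _, realm = username.partition("@")
    home = HOME_SERVERS.get(realm)
    if home is None:
        return False                      # unknown institution: reject
    return home.get(user) == password     # "forwarded" to the home institution

# A visiting RGU member logs in at another participating institution
# with their normal RGU username and password.
print(authenticate("jsmith@rgu.ac.uk", "correct-horse-battery"))  # True
print(authenticate("jsmith@rgu.ac.uk", "wrong-password"))         # False
```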

We are implementing Eduroam as part of the implementation of the new wireless network this summer. This wireless network will be implemented in Riverside East first, and then Aberdeen Business School as part of its refurbishment, and then later in the year in the rest of the Campus. When the new wireless system is up and running, the SSID will just be called “eduroam”. This means that we will also be using Eduroam for the normal wireless access for our own staff and students on Campus – it’s easier just to have one system.

For a few months, we will be operating with two wireless systems. Buildings other than Riverside East and Aberdeen Business School will continue to run the current wireless system until it is fully replaced. However, as soon as Eduroam is available in Riverside East you will be able to use it at other participating institutions if you happen to be visiting them.

Before you can actually use Eduroam, there will be some things to check and configure on your own equipment, you will need to comply with Eduroam and institutional policies, and you’ll need to enter your username in a particular way – but more details of all of that will be issued nearer the time. There’s loads of information available about Eduroam, but a good place to start is on the Janet web site.

If you are looking to see where you can use Eduroam, you will see that each institution can be listed as a “Home” or “Visited” institution, or both. You want to look for institutions listed as “Visited” – these are the ones which will allow you to log in when you visit them.

There’s also an amusing animation which you can view to get an easy understanding of what Eduroam is about.

Did you Notice?

Sometimes the greatest successes are the ones which go largely unnoticed. I reminded you last week about the work that was taking place this past weekend to move our Faculty of Health and Social Care datacentre across to the new shared datacentre at the University of Aberdeen. I’m pleased to say that the work went very well: all the servers were successfully relocated over the weekend with little disruption, and are now operating in the new datacentre.

Yes, there were plenty of glitches along the way, but the team had, and used, plans “B” and “C” and successfully overcame all the critical problems. For most of the weekend, the University was running on just one datacentre instead of the normal two. Thanks to much work over the past several years, however, all our critical services continued to operate as normal over the weekend with just the occasional short pause when things were being restarted. E-mail, the web site, the Moodle VLE, My Apps etc were all working over the weekend. A few pieces of hardware had problems starting up, but thanks to the use of “virtualisation” technology we were able to just move the “virtual” servers onto other hardware and continue services as normal until the faulty hardware is repaired.

My thanks and well done to all those involved from IT Services!

OLD DATA CENTRE

The first items to be removed from the old data centre.

Server No. 7 on its way out.

One engineer happy to be found amongst the cables.

Let’s get some bubble wrap around server No. 16.

35 servers and 2 large tape libraries all wrapped up and ready to go.

NEW DATA CENTRE

Where did I put that server?
(It’s behind you, mate.)

A collection of worried engineers.

A nice collection of cables.
(Connected at last.)

Update on “Riverside East” IT

I’ve written a couple of posts (here and here) about the work involved in providing the IT facilities in the new Riverside East building and this is just by way of an update.

As I mentioned previously, the main priority is to ensure that the supporting IT network cabling and equipment is ready in time for each move into the building. There has been other final construction and snagging work taking place in the building at the same time, but IT staff have been able to work around this and get the IT desktop kit, networks and phones ready in time for the Library, School of Pharmacy and Life Sciences, School of Engineering, and the “IDEAS” (Innovation, Design And Sustainability) Research Institute, all of whom have now moved into the building. The next School to move will be the School of Computing at the end of the week.

There is a core group of IT staff involved in these activities, and we have to schedule their work over the next two months to meet both the occupancy schedule of new / refurbished buildings and other important IT priorities elsewhere in the University. Their overall priorities over July/August are as follows:

1) Moving the School of Computing into Riverside East.
2) The datacentre move. Over the next two weeks, many of the same IT staff will be busy with this move, which we have to prioritise so that it is complete ahead of resits and the start of Semester.
3) Completing all the network resilience in Riverside East.
4) Commissioning the WiFi system in Riverside East.
5) Working with suppliers to complete the Audio Visual fitout of all the teaching spaces in Riverside East.
6) Commissioning the IT networks and AV facilities which are part of the refurbishment of Aberdeen Business School.

I know that many of you are keen to see the WiFi system operational in the new building, and IT staff will fit that in as soon as they can, but they have to focus on some of the other work first to allow others to meet key deadlines, and to protect the overall Campus IT infrastructure. Will keep you posted!

Moving Datacentre – next two weeks

There have been a couple of postings about moving our servers into a new datacentre (Green ICT, How to Move 160 Servers). Well, now the time has come! As a reminder, we plan to move all of our servers out of the Faculty of Health and Social Care server room into a refurbished datacentre shared with the University of Aberdeen and Aberdeen College. Over the past few months IT Services staff have been preparing the ground, making sure that the network connections are all working and that the services being moved are ready. Some of this work is highly specialised – one of the essential components linking the network had not been properly configured and had to be returned to Japan for further work, a round trip of 3 weeks. IT Services also found that the configuration of some services had to be changed to allow them to operate in the new datacentre, and you will have seen a few small outages to allow these services to be updated.

The moves are going to happen over the next 2 weeks. ITS are not moving everything in one go, but will carry out the move in 2 or 3 stages. Some of the services ITS will be able to move without any downtime at all; some will require a period of unavoidable downtime.

The first batch are going to move this week, but the main move is planned to be the weekend of 27th July. That weekend in particular will be a substantial move and there is likely to be some downtime over the weekend, so do keep an eye on the information notices which will be issued and plan your work and studies around this.

Once this is complete, we will have removed a significant risk from our IT infrastructure by being able to decommission the old server room (see When Things Get Hot). We will also greatly improve our environmental credentials. One key measure for datacentres is “Power Usage Effectiveness”, or PUE. This is a measure of the total amount of power used by the datacentre, divided by the raw power used just by the servers – and the reason this figure is important is that older datacentres use a lot of extra power just to keep the servers cool. So, for example, if you have servers consuming 100kW of power, and if you need another 100kW of air conditioning to keep them cool, then the datacentre is using 200kW of power in total. The PUE is 200/100 = 2. We want a figure which is as close to 1 as possible – the lower the better.
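To make that arithmetic concrete, here is a tiny Python sketch of the PUE calculation using the illustrative figures above (these are example numbers, not measurements from our datacentres):

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT equipment power."""
    return total_facility_kw / it_load_kw

# The illustrative example above: 100kW of servers plus 100kW of cooling.
print(pue(total_facility_kw=200, it_load_kw=100))   # 2.0

# At a PUE of 1.2, the same 100kW of servers would need only ~20kW of extra power.
print(pue(total_facility_kw=120, it_load_kw=100))   # 1.2
```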

In the shared datacentre we’re aiming for an average PUE of 1.2 or less, and have already reached figures as low as 1.08 at times, which means we are using much less additional power to cool the servers. That sort of figure is close to Facebook’s big new datacentre in Sweden, and we anticipate we will reduce our carbon output by around 230,000 kg per annum.

Moving to Riverside East

It’s just a few weeks before the first part of Riverside East, our new building down at the Garthdee Campus, will be open for staff and students.  The Library is moving first and they plan to have moved out of the Aberdeen Business School and into the new building around the end of May. Sounds like they are having fun with red and green dots getting ready to move thousands of books – have a look at their blog.

We’re having fun too in IT Services, although we’re not quite seeing dots in front of our eyes yet. Before anyone can move into the new building we need to get all the IT equipment set up ready for staff and students to move in.

First priority is to build the IT network. All the cabling work has been done as part of the construction project, but what we have to do now is install all the network routers and switches which will drive the whole network across the building, and connect it up to the rest of the Campus. The network kit has been bought and the IT Engineers are on site to start the installation. It’s a complex process. The network equipment needs to be physically installed into the communications rooms and communications risers in the building, and thousands of cables need to be connected up to the correct network equipment. It’s essential that this is done systematically and neatly from the beginning to make access for future maintenance easy. Then it’s all got to be systematically configured and tested.

The priority is to get everything ready for the Library first, but in order to do that much of the whole building network core needs to be done anyway. So in terms of sequencing, we’ll get the Library done first and then move on to the remainder of the building according to the move schedule for the Schools and Departments moving in. Because some of the construction work is still ongoing, health and safety is an important priority and all the IT engineers who will be working on site are going through formal safety training first.

Next come the IT workstations – approximately 400 are going into the Library space in time for it to open for students. There is a team of IT staff, ably assisted by some students from the School of Engineering, who are currently taking lots of workstations out of their boxes and cabling them up ready for installation in the new building. That’s fine, but of course you can’t put the workstations into the Library until there are desks to sit them on. So, all this IT work is actually now part of a very intensive programme of work to co-ordinate all the activities for the move into the building. The University has contracted with a company called Space Solutions and they are managing the overall programme of work – actually right down to scheduling the use of the lifts in the building. There’s no point in a bunch of IT guys arriving with hundreds of workstations and furniture guys with loads of desks and then fighting over who gets to use the lift. Just goes to show what level of detail has to be planned at this stage.

At the tail end of all of this there will be a fairly intensive period when the desks are going into the Library, the workstations are going on the desks and being connected up and tested, and somewhere round about this time thousands of books will be getting moved across and into their proper places on the shelves. Then of course there will be printers to install, and the self service issuing terminals for borrowers to use.

I’m sure it will look like an oasis of calm when the doors open and students go into the new building. Inevitably there will be some snagging at that stage, but spare a thought for a month of very hard work which will have preceded the opening!

 

How to move 160 servers without moving 160 servers

What are some of the challenges faced by IT Services staff? Here is a guest contribution from “Bobby G” – one of our senior IT technical staff:

“It may not be the type of question we ask ourselves every day, but we have recently been in a position where we were required to move 160 of the university’s key servers onto new computer hardware, often in diverse locations, and we wondered how to carry this work out as quickly as possible and with as little impact on our customers as possible.

The servers are all real working servers providing many important roles for the University, including Library, Teaching, Financial, Research, and Support services. The amount of data involved is also quite large – around 5TB in all (for those of us familiar with 1.4MB floppy disks, that’s around 3.5 million floppy disks’ worth of information).
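As a quick sanity check on that comparison, the arithmetic is simply the total data volume divided by the capacity of a single disk – a short Python sketch, assuming the usual 1.44MB high-density floppy (rounded to 1.4MB above):

```python
# Rough check of the floppy-disk comparison above (decimal units for simplicity).
total_data_mb = 5 * 1_000_000            # 5TB expressed in MB
floppy_mb = 1.44                         # one high-density floppy disk
print(round(total_data_mb / floppy_mb))  # 3472222 -> roughly 3.5 million disks
```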

There is a trick, as I suppose there usually is with these types of questions, and the answer is to move most of the servers in a “virtual” manner. This still involves moving where the server really runs in terms of its intelligence (CPU and memory), but it leaves the data, with all of its information and disks, appearing to be exactly where it always was. There are now a number of systems which allow us to carry out this type of work, and the University has used a tool from VMware in this instance. This has allowed us to reduce the total number of real physical servers used by half, from 20 to 10, while almost doubling the amount of computing power available.

This makes the system much greener as there are significant savings in electricity and room cooling costs, and makes it much easier to add additional servers at a very low cost.

With a little careful planning we were able to move all 160 servers in around 9 hours one weekend with most services being unaffected by the move and most of those that were affected only being shut down for around 10 minutes.

The new setup of the system has been automated in such a manner that, as servers get busy or if a physical server fails, the “virtual” server will now move around to find a comparatively quiet working server, and no one will even know it has moved (unless they have access to the log files). So we currently know the room that your server runs in, but not exactly where it is, as it may have moved itself in the last 10 minutes. One of the next challenges we are giving ourselves is to set up the system so that we don’t even know which room the server is running in, allowing the systems to move between buildings for themselves when a service is busy or there is some form of problem (e.g. a power outage) in a building.
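As a toy illustration of that balancing idea – and only an illustration, not how VMware’s own scheduler works – a single rebalancing pass boils down to something like the following Python sketch, where the host names, loads and threshold are entirely made up:

```python
# Toy sketch of the rebalancing idea only: pick the busiest host and move one of
# its virtual servers to the quietest host. The real VMware scheduler is far
# more sophisticated; names and thresholds here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    vms: dict[str, float] = field(default_factory=dict)  # vm name -> load

    @property
    def load(self) -> float:
        return sum(self.vms.values())

def rebalance_once(hosts: list[Host], threshold: float = 0.2) -> None:
    busiest = max(hosts, key=lambda h: h.load)
    quietest = min(hosts, key=lambda h: h.load)
    if not busiest.vms or busiest.load - quietest.load < threshold:
        return  # already balanced enough
    # Move the smallest VM off the busiest host (a "live migration" in reality).
    vm = min(busiest.vms, key=busiest.vms.get)
    quietest.vms[vm] = busiest.vms.pop(vm)

hosts = [Host("host-a", {"web01": 0.6, "db01": 0.5}), Host("host-b", {"mail01": 0.2})]
rebalance_once(hosts)
print([(h.name, sorted(h.vms)) for h in hosts])
```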

So in answer to the question “how do you move 160 servers without moving 160 servers” – you only move the little bit of intelligence that runs the servers and leave the rest set up as it is. (i.e. move where the server thinks it is).”

Wireless Coverage at RGU

We’ve had a few issues with our wireless network over the past few weeks – apologies for that to our staff and students. Without going into the details, there were a number of unexpected issues with the central controllers and we’ve been in regular contact with the supplier and manufacturer to get these sorted. We still have some background work to do in order to get to the root cause of the issues, but we have an interim solution in place now and the service itself is much more stable for our users. We are, as it happens, shortly going to install a new Wireless Network Infrastructure anyway to meet the requirements of the new Riverside East building down at Garthdee, and to meet the growth in demand for coverage across the whole of the Garthdee Campus.

Use of wireless networks, as we all know, has exploded over the last few years to the point where we increasingly take it for granted that we will have wireless access in many public spaces and places of work. In designing our “Riverside East” new building down at the Garthdee Campus, we were clear from the outset that we wished to see wireless coverage available within the entire building. Our current wireless infrastructure varies across the Campus. When it was first implemented, the intention was to make sure that the key “public” areas were wireless enabled, including the core committee rooms, library and teaching areas. But, it was not designed to be a complete solution – especially in our older buildings it did not at the time make sense to attempt to put wireless coverage into every single room.

Requirements change, however, and with the range of mobile devices in use today complete wireless coverage is now a growing expectation, and we are receiving requests for the wireless service to be extended to areas that currently have poor coverage. As mentioned above, we are starting the process of refreshing our wireless network with the new Riverside East building in the first instance. We need to get that building commissioned as our first priority, but whatever solution we procure for that will be sized so that it can be expanded across to the rest of the Campus.

Installing a wireless network in a large building with thousands of users is a complex task. Wireless signals are broadcast by what we call “wireless access points” – you’ve probably seen them on the walls around the campus. Each access point can only handle a limited number of connections without losing performance, it must not be too close to another access point using the same radio frequency, and there is only a limited number of radio frequencies that we are allowed by law to use. So, we have to position the access points across the building (vertically as well as horizontally) to match what we think the demand will be in each area, and configure them so that they don’t interfere with each other. To achieve that, we need to calculate how far each signal will broadcast, and that depends on the construction materials used in the building. Early in the design of the building, we put the CAD drawings through a software program which calculates how the wireless signal will behave theoretically, and we use that to estimate where the access points should go. That is only an approximate guide, however, and once the building is complete engineers have to physically walk through the building to measure the actual signal loss in each area before they can finalise where all the access points should go.
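For the curious, the theoretical calculation mentioned above can be illustrated with the standard free-space path loss formula. The Python sketch below is purely illustrative – real planning tools add measured attenuation figures for each wall and floor material, and the transmit power and per-wall loss numbers here are assumptions, not values from our own design:

```python
import math

def free_space_path_loss_db(distance_m: float, frequency_mhz: float) -> float:
    """Free-space path loss in dB (standard formula; ignores walls and floors)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(frequency_mhz) - 27.55

# Illustrative only: a 2.4GHz access point transmitting at 20dBm (100mW),
# with an assumed 5dB of extra loss for each internal wall in the path.
tx_power_dbm = 20
wall_loss_db = 5
for distance_m, walls in [(10, 0), (10, 1), (25, 2)]:
    rx_dbm = tx_power_dbm - free_space_path_loss_db(distance_m, 2400) - walls * wall_loss_db
    print(f"{distance_m:>3}m, {walls} wall(s): received signal ~{rx_dbm:.0f} dBm")
```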

Even having done all of that, there are still some limitations on wireless technology. If you are doing heavy downloads of large files, and particularly if there are a number of people doing that in the same physical location, you may find that the wireless network will slow down – that’s just the laws of physics and how much traffic can be carried across one radio signal in one location. That said, we are looking at newer wireless technologies that will improve performance even in areas of dense use. So, while there might still be some occasions where it is better to “plug in”, for most everyday tasks – email, web browsing, Facebook usage etc. – the new wireless network will be fine.

Green ICT

Hopefully you will have read the recent edition of RGU’s “Green Times” – if not you can read it here.

It includes an article which shows the environmental impact of PCs being left switched on and what you can do to help – by turning your PC off when it is not in use. What is less obvious to our staff and students is the impact of “behind the scenes” ICT. Way back in 2007, the Gartner Group estimated that globally information and communications technology accounts for some 2% of total carbon emissions.

That’s about the same as the aviation industry. A quarter of ICT-related emissions come from data centres running servers, which then require further energy to keep them cool.

We have many servers running in our server rooms on Campus, and the rooms themselves are nothing like as efficient as modern datacentres. We expend much more energy on cooling than we need to. Aberdeen University is in a similar situation, as are Aberdeen College and Banff and Buchan College (although they have recently upgraded their server room), and at the moment we all run our data centres independently from each other. Over the past 3 years we have been working hard to see how we could collaborate to reduce all our costs and carbon emissions.

This culminated in an agreement earlier this year to move into a shared datacentre by upgrading space in the University of Aberdeen. It’s currently under construction, and will be ready by the Spring of 2013. Initially, it will become the primary datacentre for Aberdeen University, Robert Gordon University and Aberdeen College, with Banff and Buchan using the facility at a later date.

The environmental impact of this will be substantial. We estimate that the total power consumption of all the servers from the 3 institutions is 220kW. In our own separate, old data centres at the moment we probably use the same amount of power again just to keep the servers cool (220kW is roughly 22 electric cookers, with everything switched on, running all the time – just picture it). CO2 emissions from all that will come to 2030 metric tonnes per annum.

By packing all these servers into one modern datacentre we will slash the energy required for cooling. We estimate that our total power consumption will drop from 441kW to about 264kW. As an added bonus, much of the electricity will be generated by Aberdeen University’s combined heat and power plant, with lower associated carbon emissions. In total, we anticipate saving 1061 tonnes of CO2 per annum.
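For anyone who wants to see how those figures hang together, here is a rough back-of-the-envelope Python sketch. The carbon factors are my own assumptions, chosen only to be broadly consistent with the totals quoted above; the actual factors used in our estimates may differ:

```python
# Back-of-the-envelope check of the figures quoted above. The carbon factors
# are assumptions, picked to be broadly consistent with the stated totals;
# the post itself does not say which factors were used.
HOURS_PER_YEAR = 24 * 365

server_load_kw = 220   # total IT load across the three institutions
old_total_kw = 441     # quoted: servers plus cooling in the old rooms
new_total_kw = 264     # quoted: the shared datacentre target

old_kwh = old_total_kw * HOURS_PER_YEAR
new_kwh = new_total_kw * HOURS_PER_YEAR
print(f"PUE old ~{old_total_kw / server_load_kw:.1f}, new ~{new_total_kw / server_load_kw:.1f}")
print(f"Energy saved per year: ~{old_kwh - new_kwh:,.0f} kWh")

# Assumed carbon factors (kg CO2 per kWh): grid supply for the old rooms,
# a lower figure for the new datacentre fed partly by the CHP plant.
old_tonnes = old_kwh * 0.53 / 1000
new_tonnes = new_kwh * 0.42 / 1000
print(f"CO2: ~{old_tonnes:,.0f} t/yr before, ~{new_tonnes:,.0f} t/yr after, "
      f"saving ~{old_tonnes - new_tonnes:,.0f} t/yr")
```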

And into the bargain, the institutions will save £2.6m collectively over the next 10 years.