CUSTOMER CASE: University of Southampton
Country: United Kingdom


If you can’t stand the heat, call for Raritan!

The University of Southampton is midway through a programme of implementing Raritan PDU technology within its main data center and more than 200 multi-campus IT hub locations. The university’s decision to standardize on Raritan technology follows on from issues experienced with its previous supplier’s PDU technology and support.

Discover the full story:

The University of Southampton has around 22,000 enrolled students, 5,000 staff, an annual turnover in excess of £400 million, and is a founding member of the Russell Group – an organization of 24 top UK universities dedicated to maintaining the highest research and teaching standards. Additionally, the university has upwards of 170,000 subscribers to its Massive Open Online Courses (MOOCs) – free study programmes designed to be studied online by large numbers of students. Course materials such as video lectures, reading material, course work and tests are augmented by forums which help students and tutors build an online community.

As with any modern educational environment, the university’s IT infrastructure plays a crucial role in ensuring the smooth running of almost every aspect of campus life. Digital infrastructure underpins the university’s research and innovation excellence (a major revenue source), which relies on two supercomputers for much of its work, as well as the day-to-day learning (and social) activities of students, academic teaching, and support staff administration.

The major switch to online learning in response to coronavirus has put extra pressure on, and raised expectations of, the university’s IT resources, the nerve center of which is the data center.

Data Center Manager Mike Powell explains: “We offer an average of more than 350 different IT services to our user community, spread across the main Highfield campus, the Avenue arts, humanities and foreign languages campus, the oceanography and earth sciences dockside campus, the Winchester School of Art and Southampton University Hospital Trust.”

The university’s Tier 2 data center went live in March 2013 and is currently configured for a day-one load of 1.1MW. The facility’s infrastructure has been provisioned to provide an easy, seamless, non-disruptive upgrade to a 2.5MW load as and when required. With another supercomputer on the horizon, some of this extra capacity is likely to be taken up in the next couple of years. That said, some of the day-to-day, commodity IT applications and workloads have already been moved to the cloud as part of a ‘cloud first’ strategy, so the data center is unlikely to experience any workload capacity issues any time soon.

The data center was designed with a smaller capacity and footprint than most, due to the university’s early investment in an aggressive virtualization exercise (the initial plan for 40 racks was reduced to 12). Add in the two supercomputers and it is fair to say that the data center is something of a trailblazer when it comes to high density operation.

The data center infrastructure has in-row 30kW chillers at its core, which offer excellent high density cooling characteristics and a high level of resilience within the rack rows.

One of the challenges faced when operating high density rack loads is the heat to which the power distribution unit (PDU) at the rear of the rack is exposed.

Back in 2013, when the data center went live, the PDUs chosen were seen to be best in class at that time. However, in the high temperatures generated by the facility’s high density environment, some of these units were failing prematurely.

Temperatures around 45 degrees

Mike takes up the story: “Back in 2019, I had a chance meeting with a Raritan representative at a data center conference and the discussion was around why is the Raritan product better than what we had already. And it was around the fact that we were starting to see premature failures of the existing PDUs, due to heat we think. What we were seeing at the rear of the racks were temperatures getting around 45 degrees and that was too much for the existing product.

“Some of the features that Raritan portrayed to us were that the Raritan product could withstand 60 degrees and the fact that we could have hot swappable management consoles, which the previous product set didn’t do; and that it used less power to operate its intelligent features – independent outlet control and independent power monitoring per outlet. We don’t change brands lightly, so we had a sample product sent down for evaluation, we were very impressed with it and then we made that decision. We’d tried a number of other products and manufacturers, but it was the Raritan product we opted for, based on those three criteria initially. So we swapped and we’ve almost replaced all the PDUs in the data center, we’re just waiting on the final supplies.”

Additionally, Mike has undertaken a Raritan PDU replacement programme in the more than 200 data distribution hub rooms. As he explains: “We now have one consistent product line across the estate – that’s really important when we’re interfacing with our management platform to see any overheats that are going on in the data center or in the hub rooms: we have one product set that we interface into, not numerous ones we have to try and connect to.”
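
To illustrate the kind of single-product-set integration Mike describes, here is a minimal monitoring sketch. It is an assumption-laden illustration, not the university’s actual tooling: the hostnames, the SNMP community string and the SENSOR_OID value are hypothetical placeholders that would need to be replaced with real values from the PDU vendor’s MIB, and the 45-degree threshold is simply the rear-of-rack figure quoted in the article.

```python
# Illustrative sketch only (not the university's tooling): poll each rack PDU's
# temperature sensor over SNMP and flag readings at or above the overheat level
# the article describes. PDU_HOSTS and SENSOR_OID are hypothetical placeholders.
from pysnmp.hlapi import (
    getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity,
)

PDU_HOSTS = ["pdu-rack01.example.ac.uk", "pdu-rack02.example.ac.uk"]  # placeholder hostnames
SENSOR_OID = "1.3.6.1.4.1.0.0"   # placeholder; substitute the temperature-sensor OID from the vendor MIB
OVERHEAT_C = 45.0                # rear-of-rack level the article cites as too hot for the old PDUs


def read_temperature(host: str) -> float:
    """Fetch one temperature reading (degrees C) from a PDU via SNMP v2c."""
    error_indication, error_status, _, var_binds = next(
        getCmd(
            SnmpEngine(),
            CommunityData("public", mpModel=1),                     # read-only community, v2c
            UdpTransportTarget((host, 161), timeout=2, retries=1),
            ContextData(),
            ObjectType(ObjectIdentity(SENSOR_OID)),
        )
    )
    if error_indication or error_status:
        raise RuntimeError(f"SNMP query to {host} failed: {error_indication or error_status}")
    return float(var_binds[0][1])


if __name__ == "__main__":
    for host in PDU_HOSTS:
        try:
            temp = read_temperature(host)
        except RuntimeError as exc:
            print(f"{host}: {exc}")
            continue
        status = "OVERHEAT" if temp >= OVERHEAT_C else "ok"
        print(f"{host}: {temp:.1f} C [{status}]")
```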

Refurbishment programme

Currently, the hub rooms are undergoing a refurbishment programme, to ensure that they all have the same infrastructure. Temperature sensors are fitted as standard. “One of the beauties of the Raritan is that you can plug a temperature sensor straight in,” says Mike. “We also do leak detection in some of our major rooms for any of the air conditioners which may suffer leaks or water ingress. And one of the reasons we chose Raritan was the connectivity opportunities on its management interface, with numerous different sensors that are plug and play.”

Mike continues: “In the data center we use the intelligent PDUs, where we do need that control per outlet, but in the hub rooms we use the metered PDUs. So, we have a couple of different types that we tailor to the end use.”
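
As a rough sketch of how that tailoring might be captured in an inventory or provisioning script (purely illustrative, with hypothetical room names, and not a description of the university’s systems):

```python
# Illustrative only: a toy inventory model distinguishing intelligent PDUs
# (per-outlet switching and metering) from metered-inlet PDUs, matching the
# split the article describes between the data center and the hub rooms.
from dataclasses import dataclass
from enum import Enum


class PduClass(Enum):
    INTELLIGENT = "intelligent"  # per-outlet switching and per-outlet power monitoring
    METERED = "metered"          # inlet-level metering only


@dataclass
class Room:
    name: str
    is_data_center: bool


def pdu_class_for(room: Room) -> PduClass:
    """Per the article: intelligent PDUs in the data center, metered PDUs in hub rooms."""
    return PduClass.INTELLIGENT if room.is_data_center else PduClass.METERED


# Hypothetical room names for illustration.
rooms = [Room("Main data center", True), Room("Campus hub room", False)]
for room in rooms:
    print(f"{room.name}: {pdu_class_for(room).value} PDU")
```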

Other Legrand solutions are being implemented as part of the refurbishment work. The company’s MIGHTY MO network frames are being used extensively across the hub rooms - where a conventional rack needs replacing - as are Cablofil steel wire cable trays. Additionally, EZ-PATH fire stopping devices are being installed in the hub rooms. As Mike details: “Basically, you install it into a hub room wall, a fire break, and you can keep putting new data cables through it. If and when there is a fire, the device seals up.”

He adds: “These products, plus the PDUs, are listed in our specification not only for new builds, but also for refurbishing any of the existing university buildings, as our chosen product set for deployment. So, Legrand is in our data center, they are in our landing rooms and our core edge data distribution rooms already and in the specification for any new deployments.”