Inside tomorrow's smarter data centre


The power, cooling and physical space that underpin all modern IT systems are often overlooked. But there is plenty of innovation in this hidden world that supports all the virtual ones.

Software

Perhaps the most surprising aspect of critical infrastructure is how much innovation is being driven by software. 

The control systems for power, cooling and even security are all software systems, and the same trends we see in the broader IT market are playing out in critical infrastructure.

Major suppliers are building software platforms with APIs that customers, both data centre operators and end customers, can plug into to find more efficient ways to consume resources, or simply to keep tabs on equipment housed within the facility.

“The biggest gains are efficiencies driven by software,” says John Atherton, general manager, power quality for Eaton Australia and New Zealand. The vendor’s Intelligent Power Manager software has been certified as VMware-ready and integrates with VMware’s vRealize Operations Manager software, for example.
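As a sketch of what plugging into such an API can look like, the snippet below polls a power distribution unit for per-outlet load. The endpoint, field names and authentication here are hypothetical; real products such as Eaton’s Intelligent Power Manager expose their own interfaces.

```python
# Hypothetical sketch: polling a rack PDU's REST API for load data.
# Endpoint, fields and auth scheme are illustrative only -- real
# products (Eaton IPM, Emerson Trellis, etc.) each define their own.
import requests

PDU_API = "https://pdu-01.example.dc/api/v1/outlets"  # hypothetical URL

def fetch_outlet_loads(session: requests.Session) -> dict[str, float]:
    """Return a map of outlet name -> current draw in amps."""
    resp = session.get(PDU_API, timeout=5)
    resp.raise_for_status()
    return {o["name"]: o["current_amps"] for o in resp.json()["outlets"]}

if __name__ == "__main__":
    with requests.Session() as s:
        s.headers["Authorization"] = "Bearer <token>"  # placeholder credential
        for outlet, amps in fetch_outlet_loads(s).items():
            print(f"{outlet}: {amps:.1f} A")
```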

Mark Deguara, director of data centre solutions at Emerson Network Power Australia, says adding monitoring equipment is the first step to making even legacy data centres more efficient. Emerson’s Trellis thermal management platform monitors heat across a data centre in real time, so providers can adjust systems as conditions change.

“The data centre becomes autonomous,” says Deguara. “Optimisation of legacy data centres starts with getting metering in and using it. First you measure, then you baseline and then you improve.”
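Deguara’s measure-baseline-improve loop is simple to express in code. A minimal sketch, with invented readings and thresholds:

```python
# Minimal sketch of the "measure, baseline, improve" loop Deguara
# describes. Readings and tolerance are invented for illustration.
from statistics import mean

def baseline(readings_kw: list[float]) -> float:
    """Baseline = average draw over an initial measurement window."""
    return mean(readings_kw)

def flag_deviations(readings_kw, base, tolerance=0.10):
    """Yield readings more than `tolerance` above baseline -- candidates to investigate."""
    for hour, kw in enumerate(readings_kw):
        if kw > base * (1 + tolerance):
            yield hour, kw

history = [42.1, 41.8, 43.0, 55.7, 42.5]  # hypothetical hourly kW samples
base = baseline(history[:3])              # baseline from the first window
for hour, kw in flag_deviations(history, base):
    print(f"hour {hour}: {kw} kW is >10% above the {base:.1f} kW baseline")
```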

It’s all part of a movement towards virtual data centres, where individual data centres work together like blades in a blade chassis and workloads are shared between them. Coordinating those workloads, and all of the critical infrastructure they rely on, requires the sophisticated new software platforms that run modern data centres.

Power

Without electrons there would be no computing, so keeping all the IT equipment fed with a steady supply of clean power is perhaps the most critical part of all critical infrastructure. The biggest recent development in power is energy storage: batteries. Storing energy on site is what allows an uninterruptible power supply (UPS) to carry the load if the main power feed fails, but how that energy is stored can vary enormously.

Most UPS units use banks of lead-acid batteries to store enough power to run the data centre for a short period of time if the grid power feed fails, but newer battery technologies with superior features are becoming more affordable.
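The sizing behind that “short period of time” is straightforward arithmetic: stored energy divided by load gives ride-through time. A back-of-envelope example with hypothetical figures:

```python
# Back-of-envelope UPS runtime estimate; all figures are hypothetical.
battery_capacity_kwh = 100      # usable stored energy
it_load_kw = 400                # critical load the UPS must carry
inverter_efficiency = 0.95

runtime_minutes = battery_capacity_kwh * inverter_efficiency / it_load_kw * 60
print(f"Estimated ride-through: {runtime_minutes:.0f} minutes")
# ~14 minutes -- typically just enough to start and stabilise generators.
```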

Eaton has entered a partnership in Europe with carmaker Nissan to reuse batteries from the Nissan Leaf electric car. The lithium-ion batteries that store power for electric and hybrid vehicles are no longer viable for cars once they drop to about 85 percent efficiency, but adding Eaton’s power management software can give them a second life as data centre energy storage.

Lithium-ion battery systems cost about 1.5 to 3 times as much as the lead-acid batteries UPS systems have used for decades. By reusing batteries from Nissan, Eaton can tap a supply that is substantially cheaper than newly made cells, making the economics more compelling, especially given lithium-ion’s lower maintenance requirements.
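One rough way to frame the comparison is cost per year of service. The figures below are invented; only the 1.5 to 3 times price premium comes from the article:

```python
# Rough framing of the battery economics, using invented figures.
lead_acid_cost = 100_000          # upfront cost, arbitrary units
lead_acid_life_years = 5          # assumed replacement cycle

li_ion_new_cost = lead_acid_cost * 2.5   # mid-point of the 1.5-3x premium
li_ion_life_years = 10                   # assumed longer service life

for name, cost, life in [("lead-acid", lead_acid_cost, lead_acid_life_years),
                         ("new li-ion", li_ion_new_cost, li_ion_life_years)]:
    print(f"{name}: {cost / life:,.0f} per year of service")
# Second-life packs shift the comparison further by cutting the upfront cost.
```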

Since data centres need to store energy locally anyway, renewable generation sources such as wind or solar can be used to supplement grid feeds to help bring costs down. While renewables can’t yet provide the bulk of a data centre’s substantial power needs, they can help to improve efficiency, which reduces costs. 

The marketing benefits of going green are also becoming an important part of data centre providers’ pitch to corporate customers looking for lower TCO.

Cooling

Keeping all the equipment cool is the second-most important part of critical infrastructure, and doing it without major additional energy use is a key part of keeping costs down.

“We use mostly free-air and evaporative cooling,” says Josh Griggs, managing director of data centre provider Metronode. “We have sensors throughout the facility automatically adjusting the cooling mix as needed. We can have a 1kW rack running right next to a 30kW rack with no problems.”

Free-air cooling mixes outside air with data centre air when the external air is cooler. The outside air is filtered to remove pollutants such as smoke and dust; because clean, cool outside air needs little or no mechanical chilling, it can save a lot of energy.

With evaporative cooling, warm air is cooled by evaporating water, using the same principle as the old Coolgardie Safe bush fridge. Modern systems don’t mix the cooled air with the water itself, to keep humidity within the data centre low.
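A control system choosing between these modes is, at its core, a simple decision based on outside conditions. A simplified sketch in the spirit of what Griggs describes, with invented setpoints:

```python
# Simplified cooling-mode decision; setpoints and thresholds are hypothetical.
def choose_cooling_mode(outside_c: float, supply_setpoint_c: float,
                        air_quality_ok: bool) -> str:
    if air_quality_ok and outside_c <= supply_setpoint_c:
        return "free-air"          # filtered outside air alone is cool enough
    if air_quality_ok and outside_c <= supply_setpoint_c + 10:
        return "evaporative"       # evaporative assist bridges the gap
    return "mechanical"            # fall back to conventional chillers

print(choose_cooling_mode(outside_c=18.0, supply_setpoint_c=24.0,
                          air_quality_ok=True))   # -> free-air
```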

Inside the data centre itself, modern designs deliver cool air from underneath the floor, since hot air rises. Rather than cooling the entire room, air is channelled to the racks themselves. Careful monitoring of temperatures throughout the data centre, with airflow adjusted as needed, ensures the right amount of cooling is guided to where it is required.

Right at the edge of cooling technology, liquid cooling is starting to make a comeback. 

“Market acceptance of liquid cooling isn’t there yet,” says Griggs, “but it’s only a few years of R&D away.”

Dell EMC recently released its Triton liquid-cooled server technology, developed in partnership with customer eBay. As this kind of technology filters down from the very large-scale operators, it will start to be offered by data centre providers as an option. We can expect liquid cooling to supplement traditional air cooling within a few years as companies seek ever denser compute.

Security

Physical security is one of those things we take for granted, but here, too, things are changing. Data centres that host multiple tenants need to protect each tenant from the others, as well as from outside threats. “Security and uptime are table-stakes,” says Griggs. While every data centre CRN has visited recently has the same array of security guards, ubiquitous cameras and multi-stage locks, electronic access tracking is becoming more granular.

Some facilities, including Equinix’s latest SY4 facility in Sydney, offer swipe-card access to individual racks instead of physical keys. The big advantage of swipe cards over keys is the solid audit trail of who accessed which rack, and when. That trail is useful not just for review after a security incident, but also when trying to figure out who mis-cabled a server.
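The value of that audit trail is easy to see in code: every swipe becomes a queryable record. A toy sketch with invented data:

```python
# Sketch of why swipe-card logs beat keys: every access is a queryable
# record. Schema and data are invented for illustration.
from datetime import datetime

access_log = [
    # (timestamp, card holder, rack)
    (datetime(2016, 9, 1, 9, 15), "j.smith", "SY4-R42"),
    (datetime(2016, 9, 1, 14, 3), "a.jones", "SY4-R42"),
]

def who_accessed(rack: str):
    """Everyone who opened a given rack, newest first."""
    return sorted((t, who) for t, who, r in access_log if r == rack)[::-1]

for when, who in who_accessed("SY4-R42"):
    print(f"{when:%Y-%m-%d %H:%M}  {who}")
```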

Remote sites

Another aspect of critical systems that’s easy to ignore is that plenty of IT systems don’t live in a data centre at all. Think of all the mobile base stations providing LTE and faster data services to the army of mobile devices. Modern farms are full of electronics – sensors, actuators, cameras, you name it – and they all need critical infrastructure.

Rural areas are hit with sun, dust, wind and other elements that quickly destroy standard data centre gear. To survive in these conditions requires specifically designed equipment.

“We ran a project with battery maker Redflow at sites that used to use on-site generators for power,” says Deguara. The project combined on-site renewable energy generation using solar and wind power, with energy storage using Redflow batteries.

“A site that would have required a generator to run 24 hours a day could use a generator for just 3.5 hours a day,” says Deguara. 

Apart from the environmental savings, there are immense cost savings from reduced fuel consumption, less generator maintenance and humans not needing to attend remote sites.
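To get a feel for the scale of the fuel saving alone, here is some back-of-envelope arithmetic. Only the 24-hour to 3.5-hour runtime reduction comes from the project; the fuel figures are assumptions:

```python
# Scale of the fuel saving; burn rate and price are assumed figures.
burn_rate_l_per_hour = 4.0        # assumed small-generator consumption
diesel_price_per_l = 1.50         # assumed delivered fuel price

old_daily = 24.0 * burn_rate_l_per_hour * diesel_price_per_l
new_daily = 3.5 * burn_rate_l_per_hour * diesel_price_per_l
print(f"Fuel cost: {old_daily:.0f} -> {new_daily:.0f} per day "
      f"({(1 - new_daily / old_daily):.0%} less), before maintenance and call-outs")
```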

“It’s all about the economics,” says Deguara. “If the marketplace isn’t willing to take on the technology, it’s not important.” 


Factfile: Bevy of activity in Australia’s independent data centre scene

Perth
Data Centre Limited has 500 racks in specially renovated ISO containers. The firm hopes to go public with an IPO by the end of the year.

Hobart 
Red Cloud bought about 7000m2 of land in a Hobart business park in August. Its data centre is expected to be delivered in 2017 at a cost of $40 million.

Adelaide
YourDC completed a $30 million data centre in May. The facility houses 800 racks with three petabytes of capacity.
