DIY data centres

Using bed sheets and desk fans to alter in-room airflows and Perspex sheets for containment is part of data centre industry folklore.

They are often cited as examples of bad practice, or as proof that a generation of centres designed and configured around mainframe-era thinking faces refresh or redundancy as newer, more cost-efficient designs emerge.

But discounting the DIY elements of data centre construction and configuration also serves the handy purpose of driving sales of more expensive custom kit from vendors.

For example, a containment system of Perspex roofs and sliding doors from Rittal covering two rows of 10 racks costs around $7,500, says the company's IT business development manager Mark Roberts.

But Perspex sheets themselves cost up to a couple of hundred dollars at a hardware store or a plastics manufacturer. Add a PVC strip or swing door of the kind you might find in a cold room or butcher's shop, seal any holes or gaps properly, and the cost of containment could be a fraction of the commercial version.
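As a back-of-envelope comparison (the $7,500 figure and the two-rows-of-10 layout come from Rittal's quote above; the DIY line items are illustrative guesses only, and exclude labour and any compliance checks), the per-rack arithmetic looks something like this:

```python
# Back-of-envelope comparison of commercial vs DIY aisle containment.
# The $7,500 figure and the two-rows-of-10 layout come from Rittal's quote above;
# the DIY line items are illustrative guesses only and exclude labour.

RACKS = 20  # two rows of 10 racks

commercial_total = 7500  # quoted roof-and-sliding-door system

diy_materials = {
    "perspex_roof_sheets": 400,        # assumption: a couple of hundred dollars per row
    "pvc_strip_or_swing_doors": 300,   # assumption: cool-room style doors, one per aisle end
    "sealing_and_fixings": 150,        # assumption: tape, brush strips, grommets, screws
}
diy_total = sum(diy_materials.values())

print(f"Commercial: ${commercial_total} total, ${commercial_total / RACKS:.0f} per rack")
print(f"DIY (est.): ${diy_total} total, ${diy_total / RACKS:.0f} per rack")
```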

Australia is a nation consumed by renovation and DIY. Data centre consultants readily acknowledge that it is part of the data centre and computer room culture for many small to medium businesses.

But there are consequences to getting it wrong - big ones - and some in the industry are just waiting to reap the consultancy fees when DIY jobs go pear-shaped.

"Customers often try to do something and don't quite get it right," says APC's data centre solutions advisor Adam Wilkinson."We get called in to help them do it properly."

"Taking a short-term focus on investment like [DIY Perspex containment] with the knowledge that is out there is risky. You've got to think big - [besides] $7500 aggregated across that many racks really isn't that bad," adds the Frame Group's data centre practice manager Greg Goode.

CRN's investigation of construction and configuration techniques drew a wide variety of responses, ranging in price from sub-$100 to well into the five figures.

As a result, we present our guide to putting together the most cost-effective SMB or SME data centre today.

The floor

Floor design is linked to a number of factors, including power, equipment weight and cooling.

But the traditional raised floor in data centres is in many ways a design leftover from the mainframe era.

Most small businesses are unlikely to have mainframes. As a result, it could be worth dropping racks straight down onto the concrete slab and containing them rather than building a raised floor, particularly if the average rack density is under five kilowatts.

"One of the minimal advantages of having a raised floor - though unadvised - is that you can run a small amount of cabling under it," says Shaun Vosper, the director of Brisbane-based consultancy Data Centre Technologies.

Goode concurs: "Raised floors were initially built to throw cabling under and pump air to the racks," he says. "In the mainframe era, air management was a load of rubbish. Most of the sub-floor plenum was totally occupied by cabling and a mish-mash of other pipes, so the cooling never worked that effectively.

"Slowly people worked out that it was better using the sub-floor plenum just for the distribution of air, and they moved cables into managed or overhead systems. Other people say that with containment you can do away with the raised floor altogether."

Goode says the choice between raised floor and the bare slab is ideological. "The jury's 50/50 at the moment," he says.

Vosper is convinced the choice is more practical.

"If you're doing a really basic setup I'd suggest not to worry about a raised floor because there's a lot of air conditioning systems on the market that don't require floor-based distribution," Vosper says.

"For the SME market, it's a cost they just don't have to incur."

The raised floor can be costly not just in installation but in floor space, according to Wilkinson.

"The smaller the room, the greater percentage of space is lost by putting in a raised floor," Wilkinson says.

"By the time you address pedestrian access and occupational health and safety issues around the step [up to the floor level], you could lose five out of 25 square metres in the room."

Adds Gordon Makryllos, APC's vice president for Pacific, "The default should be that you don't need a raised floor."

Cooling

If the raised floor is canned, a different approach to cooling than the standard computer room air conditioning (CRAC) unit is required.

Advocates of the slab say that in-row cooling - a method of sticking air conditioners between the racks themselves - combined with hot or cold aisle containment is the main alternative to under-floor air distribution.

"If you have a slab then you have to cool the equipment in-row," says Peter Spiteri, director of marketing at Emerson Network Power.

"Although there's some efficiency benefits in not having a raised floor, there could be issues with having plant and IT equipment side-by-side because you've just put in infrastructure that may or may not carry chilled water right beside racks of your server equipment.

"That plumbing requires service on a monthly basis so you could have maintenance cleaning trays right next to someone setting up blade servers."

Most industry players CRN spoke to were broadly dismissive of this risk. Such maintenance visits are generally supervised, they say.

Others, such as Vosper, believe in-row cooling could be unnecessary at the extreme low-end of the market.

"If your business has 2kW of IT load per rack do you need an in-row cooler? No, you don't," he says. "Most SMEs usually just take a basic air conditioner for home use and buy the more industrial model.

"If you've got three racks averaging three or four kilowatts each, a couple of top quality base air conditioners would pose few dramas.

"If you were going to push above five kilowatts per rack as part of your IT strategy only then would you really need to look at other air conditioning solutions."

Vosper, however, urges small businesses considering a base air conditioner to think about redundancy.

"If the unit fails what are you going to do?" he says. "What people have to look at is the cost of downtime to their business because it will really drive their choice."

Containment

Containment has become a de facto design methodology in the data centre. It's about controlling air flows - getting the most efficient use of available cold air while expelling hot air and ensuring the two flows don't mix.

There are a number of ways to achieve this.

Many larger data centres arrange racks into alternating "hot" and "cold" aisles. Cool air is drawn into the racks from the cold aisle on both sides, while the warmed air is exhausted into the hot aisle and expelled - at least in theory.

To augment this arrangement, either the hot or cold aisle is often "contained" - that is, a roof is added on top of the racks with sliding doors at each end.

"You need to start compartmentalising the data centre and lock up areas where air is bypassing the normal cooling cycle," Goode says.

DIY types use Perspex or other acrylic or polycarbonate sheets for the roof and PVC doors to save money. But other DIY forms of containment are also emerging, according to Vosper.

"I've seen a basic data centre built with the type of refrigerant cooling panels you find as walls in a cool room," Vosper says.

"The panels have significant heat and cooling properties so you don't lose a lot of the cooling you're providing.

"I've also seen centres built out of gyprock with house insulation in the walls. Some of these DIY jobs are in the premises of extremely large companies."

Regardless of construction, experts agree there are some cheap ways to optimise the effectiveness of containment architectures.

These include putting blanking plates over the front of empty spaces in the racks to prevent cool air from passing through, and ensuring the containment area itself is properly sealed.

"You need to make sure there's no leakage of air into service corridors or outside of the contained area," Spiteri says.

"For example, where a cable comes in or out of a room you've got to seal around the hole."

The future of containment isn't limited to walling in a number of racks; several vendors including APC are pushing containment within a single rack, opening the door for much smaller computing room deployments.

"We have a single rack configuration that is totally contained and capable of capacities up to 30kW," Makryllos says.

"It's like a data centre in a rack."

Chris Molloy, chief executive of TEX Solutions, says that US rack manufacturer Chatsworth is also pursuing a rack containment model.

"I'm definitely seeing a move towards rack containment," he says.

Goode concurs. "It's becoming the next wave of thinking in data centres."


Racks

Apart from containment, there is a general understanding of the importance of rack design in the data centre. But that hasn't always been the case.

"I've generally considered cabinetry to be the poor side of the data centre," Goode says.

"In the past racks were literally just ways to hold equipment up against gravity. Now what is a simple 'dumb' piece of metal has become a highly specialised area with cabinetry optimised for cooling, data cabling management, electrical cabling management and access."

If there's one area of the data centre that should be exempt from an aggressive price focus, it seems, it's the rack.

What Goode calls the "elegant solutions" don't come cheap - in the "ballpark of $3500" per rack, he says - but the additional cost is worth it if energy efficiency is one of the aims of the refit.

"You might get the price down to $2000 at volume," he says.

"Anything cheaper and you'll find you're retrofitting solutions into the rack to optimise cooling and data cabling that boost the costs beyond what you thought you'd saved. It's a bit of a fool's paradise.

"I wouldn't go for a cheap rack because they cost more down the track. You're wasting your money as far as I'm concerned."

Colocation

Of course, not every small or medium business will choose to house its data centre on-premises. Many will look to third party service providers such as carriers to host their racks at a specialist site.

CRN doesn't intend to weigh up the merits of colocation (colo) against building your own facility, but it is important to consider the price difference between the options.

"What hampers a lot of people in making the decision [on which way to go] is working out how each data centre charges for space," Vosper says.

"Some data centres charge up to $3000 per square metre whereas current office space [in Brisbane] can be found for around $400 per square metre. But the base infrastructure you get for $3000 reduces downtime or risks associated with your computing environment."

Other pricing models exist but Vosper warns they can come with hidden costs.

"Other data centres charge $1600 per rack, which is worth around three square metres on average," Vosper says.

"You're still getting fairly significant redundancy built in [to the cost] but you've got to be aware that you'll be charged more in other places [to make up] for the reduced rack charge."

Per-megabyte communications costs and equipment servicing fees can add significant overheads for small to medium businesses.

"Every data centre is built for a reason. For example, telecommunications providers build centres to get clients within the facility so they can sell them other services," Vosper says.

"I had a customer recently looking for space for one server and one tape drive but as part of that they were looking for someone to do tape rotation for their backups.

"What they had to work out was whether it was worth putting that kit into a facility or just a relatively clean office space, because they will really pay for that person [in the facility]."

Colo could also be a money-saver if the cost of power is built into the contract, primarily because power costs are rising across the region and it could lock down that cost for the business.

"The price of power per kilowatt hour is probably going to double in the next five years," Spiteri predicts.

"If the colocation environment includes power as part of their charge and the customer has signed a three-year contract, it might be cheaper to be in colo," Vosper says.

Modular plant

Data centre construction is undergoing a fundamental shift. Gone are the days when operators built 100 per cent of the capacity upfront and then tried to fill it. Now centres deploy just enough space and plant to meet current market demands.

"In the past you'd build to what you thought the maximum capacity would be," Makryllos says.

"The new approach is not to make all that investment up front but to build out in a scalable, modular fashion. Many customers start with a single rack configuration then add more power and cooling modules or racks as needed."

"People now build data centres in total modularity and scale up the space and infrastructure as electrical and mechanical loads increase," adds Goode.

"It's about optimisation of investment capital from day one."

Every consultant and vendor CRN spoke to touted modular architecture as a key way for small businesses to save money.

Goode says the viability of modular plant, such as uninterruptible power supplies (UPS), has increased dramatically in the past four years.

"It has definitely become the norm," he says.

One reason for this is efficiency. A disadvantage of running small workloads on high-end plant is the power inefficiency involved.

"What we find with modular systems is they work more efficiently when running close to their full rating," says Michael Mallia, general manager of power quality at Eaton Industries.

"A modular approach means you can operate the UPS at peak efficiency by matching the load more closely with the rating on the UPS.

"It also means the UPS generates less heat which reduces its running costs."

Loren Wiener, data centre product manager at NEC Australia, says, "thinking modular is the biggest opportunity for SMBs".

Wiener also cites the increased use of "pods" - such as Sun's data centre in a shipping container - as further examples of the way modularity has permeated the market.

"[But] pods are probably not as affordable for SMBs," Wiener says.

Thermal dynamics

Despite best intentions, air often doesn't move in quite the way we expect it to. This makes many data centres - particularly older ones - less efficient than they could be, and a burgeoning services business has sprung up to help customers optimise in-room dynamics.

"A lot of people still work on gut instinct for how they think air moves," Goode says.

Computational fluid dynamics (CFD) software can be valuable in determining how an aisle should be set up and where equipment should be placed.

"The algorithms have been adapted to measure the movement and efficiency of air flows, and the models I've seen of data centres are very close to reality," Goode says.

Wiener believes engaging a consultancy to conduct a thermal imaging test is a good idea, but says there are DIY ways for small businesses to trial the concept.

"I tend to use a spot laser thermometer," Wiener says. "You just zap it at different bits of the rack for a closer look [at the temperatures]."

Dick Smith Electronics offers the thermometers online for $99.
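Spot readings are only useful if you record them and compare intakes against exhausts across the racks. A minimal sketch of that record-keeping follows; the readings are hypothetical and the 27°C intake ceiling reflects commonly cited ASHRAE guidance - check the operating range specified for your own kit:

```python
# A sketch for recording spot-thermometer readings per rack and flagging hot intakes.
# The readings below are hypothetical; the 27C intake ceiling reflects commonly
# cited ASHRAE guidance - check the operating range specified for your own kit.

INTAKE_CEILING_C = 27.0

readings = [
    # (rack, position, intake_c, exhaust_c) - hypothetical spot readings
    ("rack-1", "top",    24.5, 35.0),
    ("rack-1", "bottom", 22.0, 31.5),
    ("rack-2", "top",    28.5, 39.0),  # hot-air recirculation at the top of the rack?
]

for rack, position, intake, exhaust in readings:
    delta = exhaust - intake
    flag = "CHECK" if intake > INTAKE_CEILING_C else "ok"
    print(f"{rack} {position:>6}: intake {intake:.1f}C, delta-T {delta:.1f}C [{flag}]")
```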

Molloy recommends engaging a consultancy for more complex requirements, such as to test the expansion of heat loads on the floor. Emerson, Eaton and APC all offer this type of audit service.

"We look at the complete power chain," Mallia says. "Ongoing monitoring is the key because you can't manage what you can't measure."

But others believe there are simpler ways to resolve heat and cooling issues that won't break the bank.

"Ninety per cent of the time when you bump into excess heat or cooling issues they can quite often be resolved by spreading the equipment out across more racks," Wiener says.

"It will cost you more in space or rack costs but you'll end up using less power and cooling."

Establishing access control could also resolve cooling issues. "One smaller data centre I worked at was in a general office area. The tea room was across the hall [from the main office space] but the quickest way to get to it was through the data centre. People were constantly opening and closing the data centre doors and walking through with hot cups of tea, which caused fluctuations in temperature," Molloy recalls.

Another alternative is to buy off-the-shelf monitoring kits from the likes of APC and Emerson that can be mounted in racks and feed back information on operating temperatures via a network of sensors, according to Vosper. "They're not that expensive," he says.

On the cooling side, Molloy also offers a simple DIY solution to test airflow in raised floor environments.

"An easy way to see if floor tiles are in the right spot is to take a piece of paper and put it over the vent," Molloy says.

"If it's not being lifted up then you know it's not set up right in the first place."

Conclusion

While it would be largely inadvisable to build an entire data centre yourself "on the cheap", there are a number of ways businesses with small budgets can optimise the cost at each point of the build.

From Perspex and home air conditioning systems to coolroom panels and Dick Smith thermometers, it is more than possible to save some money as long as you have some knowledge of what you're doing.

The next generation of data centre is modular, scalable, flexible and relatively quick to deploy. Whether off-the-shelf or DIY, it's time to embrace the concepts.

(*CRN recommends obtaining independent advice for your specific configuration before attempting DIY alternatives.)
