Racks
Apart from containment, there is a general understanding of the importance of rack design in the data centre. But that hasn't always been the case.
"I've generally considered cabinetry to be the poor side of the data centre," Goode says.
"In the past racks were literally just ways to hold equipment up against gravity. Now what is a simple 'dumb' piece of metal has become a highly specialised area with cabinetry optimised for cooling, data cabling management, electrical cabling management and access."
If there's one area of the data centre that should be exempt from an aggressive price focus, it seems, it's the rack.
What Goode calls the "elegant solutions" don't come cheap - in the "ballpark of $3500" per rack, he says - but the additional cost is worth it if energy efficiency is one of your aims for the refit.
"You might get the price down to $2000 at volume," he says.
"Anything cheaper and you'll find you're retrofitting solutions into the rack to optimise cooling and data cabling that boost the costs beyond what you thought you'd saved. It's a bit of a fool's paradise.
"I wouldn't go for a cheap rack because they cost more down the track. You're wasting your money as far as I'm concerned."
Colocation
Of course, not every small or medium business will choose to house their data centre on premises. Many will look to third-party service providers such as carriers to host their racks at a specialist site.
CRN doesn't intend to weigh up the merits of colocation (colo) against building your own facility; but it is important to consider the price difference between the options.
"What hampers a lot of people in making the decision [on which way to go] is working out how each data centre charges for space," Vosper says.
"Some data centres charge up to $3000 per square metre whereas current office space [in Brisbane] can be found for around $400 per square metre. But the base infrastructure you get for $3000 reduces downtime or risks associated with your computing environment."
Other pricing models exist but Vosper warns they can come with hidden costs.
"Other data centres charge $1600 per rack, which is worth around three square metres on average," Vosper says.
"You're still getting fairly significant redundancy built in [to the cost] but you've got to be aware that you'll be charged more in other places [to make up] for the reduced rack charge."
Per-megabyte communications costs and equipment servicing fees can all add significant overheads for small to medium businesses.
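The trade-off between the two pricing models above can be sketched as a quick back-of-envelope comparison. The headline rates are the ones Vosper quotes; the add-on charge for the per-rack model is a hypothetical placeholder standing in for the "other places" a provider recoups the reduced rack rate, so substitute real quotes before drawing conclusions.

```python
# Rough comparison of the two colo pricing models described above.
# Headline rates are from the article; ADDONS_PER_RACK is a hypothetical
# figure representing the extra charges (comms, servicing) that offset
# a low per-rack rate. All figures are per billing period.

RACK_FOOTPRINT_M2 = 3  # a rack is "worth around three square metres", per Vosper

def per_m2_cost(racks, rate_per_m2=3000):
    """Provider charging by floor space."""
    return racks * RACK_FOOTPRINT_M2 * rate_per_m2

def per_rack_cost(racks, rate_per_rack=1600, addons_per_rack=900):
    """Lower headline rack rate, with hypothetical add-on charges elsewhere."""
    return racks * (rate_per_rack + addons_per_rack)

for n in (1, 4, 10):
    print(f"{n} rack(s): per-m2 ${per_m2_cost(n)}, per-rack ${per_rack_cost(n)}")
```

Even with generous add-ons, the per-rack model stays well under the per-square-metre rate in this sketch, which is why the hidden costs Vosper warns about deserve a line-by-line comparison rather than a glance at the headline figure.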
"Every data centre is built for a reason. For example, telecommunications providers build centres to get clients within the facility so they can sell them other services," Vosper says.
"I had a customer recently looking for space for one server and one tape drive but as part of that they were looking for someone to do tape rotation for their backups.
"What they had to work out was whether it was worth putting that kit into a facility or just a relatively clean office space, because they will really pay for that person [in the facility]."
Colo could also be a money-saver if the cost of power is built into the contract: power prices are rising across the region, and a fixed contract locks in that cost for the business.
"The price of power per kilowatt hour is probably going to double in the next five years," Spiteri predicts.
"If the colocation environment includes power as part of their charge and the customer has signed a three-year contract, it might be cheaper to be in colo," Vosper says.
Modular plant
Data centre construction is undergoing a fundamental shift. Gone are the days when operators built 100 per cent of the capacity upfront and then tried to fill it. Now centres deploy just enough space and plant to meet current market demands.
"In the past you'd build to what you thought the maximum capacity would be," Makryllos says.
"The new approach is not to make all that investment up front but to build out in a scalable, modular fashion. Many customers start with a single rack configuration then add more power and cooling modules or racks as needed."
"People now build data centres in total modularity and scale up the space and infrastructure as electrical and mechanical loads increase," adds Goode.
"It's about optimisation of investment capital from day one."
Every consultant and vendor CRN spoke to touted modular architecture as a key way for small businesses to save money.
Goode says the viability of modular plant, such as uninterruptible power supplies (UPS), has increased dramatically in the past four years.
"It has definitely become the norm," he says.
One reason for this is efficiency. Running small workloads on large, high-end plant means the equipment operates well below its rated load, where it is least efficient.
"What we find with modular systems is they work more efficiently when running close to their full rating," says Michael Mallia, general manager of power quality at Eaton Industries.
"A modular approach means you can operate the UPS at peak efficiency by matching the load more closely with the rating on the UPS.
"It also means the UPS generates less heat which reduces its running costs."
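Mallia's point can be illustrated with a toy efficiency model. The curve below is invented for the sketch (real curves come from the manufacturer's data sheet), but its shape - poor at light load, best near full rating - is the behaviour he describes, and it shows why a right-sized module beats a lightly loaded monolithic UPS.

```python
# Illustrative sketch of why matching UPS rating to load improves efficiency.
# The loss model is hypothetical; consult the vendor's efficiency curve for
# real figures. Fixed losses dominate at light load, so a big UPS carrying
# a small load wastes proportionally more power (and sheds it as heat).

def ups_efficiency(load_fraction):
    """Hypothetical efficiency vs load fraction for a double-conversion UPS."""
    if load_fraction <= 0:
        return 0.0
    fixed_loss = 0.03  # losses present regardless of load, as share of rating
    prop_loss = 0.02   # losses proportional to the load carried
    return load_fraction / (load_fraction + fixed_loss + prop_loss * load_fraction)

load_kw = 10.0

# One oversized 40 kW UPS carrying 10 kW runs at 25 per cent load...
monolithic = ups_efficiency(load_kw / 40)

# ...while a single 10 kW module carrying the same load runs near full rating.
modular = ups_efficiency(load_kw / 10)

print(f"monolithic: {monolithic:.1%}, modular: {modular:.1%}")
```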
Loren Wiener, data centre product manager at NEC Australia, says, "thinking modular is the biggest opportunity for SMBs".
Wiener also cites the increased use of "pods" - such as Sun's data centre in a shipping container - as further examples of the way modularity has permeated the market.
"[But] pods are probably not as affordable for SMBs," Wiener says.
Thermal dynamics
Despite best intentions, air often doesn't move in quite the way we expect it to. It makes many data centres - particularly older ones - less efficient than they could be, and a burgeoning services business has sprung up to help customers optimise in-room dynamics.
"A lot of people still work on gut instinct for how they think air moves," Goode says.
Computational fluid dynamics (CFD) software can be valuable in determining how an aisle should be set up and where equipment should be placed.
"The algorithms have been adapted to measure the movement and efficiency of air flows, and the models I've seen of data centres are very close to reality," Goode says.
Wiener believes engaging a consultancy to conduct a thermal imaging test is a good idea, but says there are DIY ways for small businesses to trial the concept.
"I tend to use a spot laser thermometer," Wiener says. "You just zap it at different bits of the rack for a closer look [at the temperatures]."
Dick Smith Electronics offers the thermometers online for $99.
Molloy recommends engaging a consultancy for more complex requirements, such as modelling how heat loads will grow across the floor. Emerson, Eaton and APC all offer this type of audit service.
"We look at the complete power chain," Mallia says. "Ongoing monitoring is the key because you can't manage what you can't measure."
But others believe there are simpler ways to resolve heat and cooling issues that won't break the bank.
"Ninety per cent of the time when you bump into excess heat or cooling issues they can quite often be resolved by spreading the equipment out across more racks," Wiener says.
"It will cost you more in space or rack costs but you'll end up using less power and cooling."
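Wiener's suggestion comes down to heat density: cooling problems usually respond to the load per rack, not the total load. The arithmetic below uses illustrative numbers to show how spreading the same IT load over more racks lowers the density each rack's cooling has to handle.

```python
# Back-of-envelope illustration of Wiener's advice: the same total IT load
# spread over more racks means less heat concentrated in any one of them.
# The 24 kW figure is purely illustrative.

def per_rack_density_kw(total_it_load_kw, racks):
    """Average heat load each rack's cooling must remove."""
    return total_it_load_kw / racks

load = 24.0  # kW of IT equipment, hypothetical

for racks in (3, 4, 6):
    print(f"{racks} racks -> {per_rack_density_kw(load, racks):.1f} kW/rack")
```

As the quote notes, the saving in power and cooling comes at the cost of extra floor space or rack charges, so the two need to be weighed against each other.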
Establishing access control could also resolve cooling issues. "One smaller data centre I worked at was in a general office area. The tea room was across the hall [from the main office space] but the quickest way to get to it was through the data centre. People were constantly opening and closing the data centre doors and walking through with hot cups of tea, which caused fluctuations in temperature," Molloy recalls.
Another alternative is to buy off-the-shelf monitoring kits from the likes of APC and Emerson that can be mounted in racks and feed back information on operating temperatures via a network of sensors, according to Vosper. "They're not that expensive," he says.
On the cooling side, Molloy also offers a simple DIY solution to test airflow in raised floor environments.
"An easy way to see if floor tiles are in the right spot is to take a piece of paper and put it over the vent," Molloy says.
"If it's not being lifted up then you know it's not set up right in the first place."
Conclusion
While it would be inadvisable to build an entire data centre yourself "on the cheap", there are a number of ways businesses with small budgets can trim costs at each stage of the build.
From Perspex and home air-conditioning systems to coolroom panels and Dick Smith thermometers, it is more than possible to save some money as long as you have some knowledge of what you're doing.
The next generation of data centre is modular, scalable, flexible and relatively quick to deploy. Whether off-the-shelf or DIY, it's time to embrace the concepts.
(*CRN recommends obtaining independent advice for your specific configuration before attempting DIY alternatives.)