CRN sat down recently with a number of the industry’s leading experts on data centre building and management for an intelligent and enlightening discussion about the IT challenges and opportunities facing organisations today and what we can expect to see in the data centre tomorrow.
Panellists
Jacques Tesson, CEO, DPSA
Jason Rylands, data centre architect, DPSA
Paul Tyrer, managing director, Schneider IT Business (APC)
Andrew Sylvester, data centre software sales manager, Schneider IT Business (APC)
Trevor Voss, chief technology officer, Linktech
Greg Boorer, managing director, Canberra Data Centres
Marcus Stock, managing director, Innov8
CRN Jacques, you said you’ve seen a lot of change in the data centre or computer centre over the past few years. Talk us through some of those changes and what you see happening in the next couple of years, and the business opportunities.
Jacques For the reseller the challenge is to identify new opportunities and incremental business. Over the past three years there's been a dramatic increase in data processing, driven by the advanced communications we now have.
Now users want to know where the data is, and to locate it very fast. For this you need very advanced storage, server and networking solutions. And for it all to be efficient and work at its optimum, you have to have proper hosting.
We got hit by the dotcom crash in 2001, so the data centre came to be managed by the CFO, the people with the money; the IT manager disappeared and the financial people were left with a legacy of inefficient data centres. Now we're hit by another crisis, the 2010-2012 crisis, and the demand is basically to make the existing system efficient: use what you've got.
If you've got a 50 square metre room and you want 20 more servers, what are you going to do? You look at efficiencies, you remove all that UPS in each rack, you find 40 RU (rack units) of space and you make it efficient, so you increase the density vertically rather than horizontally and then you can pack more and more IT in there.
I've seen a dramatic evolution of the computer room. The cost of electricity and energy has gone up, the cost of employees has gone up, and the risk brought by employees is up, because you've got solutions which have to be managed. If you don't monitor you can't manage, and if you can't manage, you can't reform.
CRN Marcus, are you finding a similar level of conservatism in the market and what are you doing to counter that with your customers?
Marcus I think the two challenges which people haven't addressed are the lack of qualified people and the lack of available resources. So if you think about the data centre life cycle, it's the old saying: 'piss poor planning gets poor results'.
The one thing you can guarantee is that power is going to cost more and more. If you haven't got the right resources to do the design, planning and preparation to be efficient, not just on day one of operations but 10 years down the track, you're guaranteed to fail.
A data centre was basically a comms room. If you actually look at the top 50 organisations, say, in Australia, how many of those have IT as part of the corporate mission statement?
A classic example is Commbank [The Commonwealth Bank]. Their IT is a differentiator in the market.
You will see that all the way through business. People are seeing now, in the evolution of data centres, that you've got to put the metrics in place.
If you haven't understood how you're going to run your data centre, how are you going to run your business? And if you haven't defined it back at that preparation phase, you will never measure it properly.
CRN It sounds like a no brainer, but as you’ve discovered Andrew, there are probably a lot of companies that don’t really appreciate that fact.
Andrew There are a lot of companies collecting lots and lots of data, and very few companies turning that data into useable information. How they operate their data centre and how they drive performance improvements are often optimised around the data centre. If we can help customers understand the knowledge journey and how they can take the data that they’re collecting, turn it into information and then take that and turn it into business knowledge, I think that’s a benefit of integrated data centre infrastructure management.
What's interesting is that all the data that's collected, and all the information we create out of that data, is retrospective. It's all about how well we are doing and what we have achieved, historically. Once you get up into more of the knowledge management area, you can start to forecast with much greater accuracy, so you can do things like capacity planning more effectively.
Taking that data and turning it into useable information and turning it into business knowledge is a critical driver we need to foster, and educate our clients and help our partners identify opportunities in that space as well.
Marcus We are not reinventing the wheel. These processes are taking business information, analytics, metrics, and converting them into reporting. It’s been set up by the guys who did SAP and Oracle and all of the ERP systems. That whole life cycle of how you build the software platform, how you build your metrics and how you do your reporting, has been done in a financial world. Now, instead of a sales ledger and a purchase ledger, you’re using your power consumption and your output and processor site. It’s still the same order of magnitude of data.
Andrew It’s how you take that data and align it better to the business metrics. So how is the business measuring its core business function? How is it measuring performance? And how does the information it’s collecting from the IT space and the data server space align to support those more business-oriented metrics?
Trevor In the mid 2000s a lot of the analytics was actually provided by finance to help IT change, because it was very easy around virtualisation to say ‘I’ve got 50 servers, I can get away with four’; that’s a financial decision they really had to make, or someone made for them. Now it’s about IT standing up and making decisions in their own right, and I don’t think a lot of them have the skills to do it. They certainly have the capability in terms of their tools, whether it be standard delivered tools from some of the server manufacturers or whatever, yet they still don’t analyse it.
Ask a lot of them about their disaster recovery, and they don’t even know if the sites are up. They’ve got a DR plan that’s been signed off by the board and feeds into the business continuity plan. They’ve got a DR site but they can’t tell you if it’s hot or cold and they don’t know the integrity of the data, because they don’t test it. CIOs and IT managers are now expected to drive change, and a lot of them don’t know what power they’re consuming at the moment. It’s not even in their budget. So why should they have a focus to try to reduce it?
Paul This is a massive opportunity for the reseller and integrator community. There are a lot of people out there with data centres that are really struggling. We're seeing these power densities, these cooling densities, increase, and these organisations are not equipped to address it. If we look at the projections here in Australia, we're seeing a fanning out of hardware spend. There are supposedly 90,000-plus data centres in the market. A data centre would typically run for 15 to 18 years; for the average data centre today it's around about eight years.
Data centres have seen massive changes in their IT environment over the past seven or eight years. They’re going to see massive change over the next 10 years, and they’re going to struggle. Integrators have a big opportunity in that space.
Jason Yes, there's a huge opportunity for those resellers looking to add incremental revenue and get into this data centre space. When we go on site with resellers and talk to IT managers, we start to ask some questions around the servers and the data centre, and they really don't know how much power they draw. The budget for power has generally been lumped into their lease or the building costs. They're not actually sure how much it is costing them to run their data centre. A lot of money has gone into IT but there's been no measurement and no metrics.
If we don’t have those figures, of how much it’s actually costing, we can’t do any sort of comparison of what it’s going to cost us. So for the channel, it’s a huge opportunity to go in there and work with their customers and then get things in place to do the measurement.
Metering is a massive opportunity. Metering at the rack level means knowing exactly how much it's costing you to run your equipment. That allows you to do your budgets and make those comparisons, because without that information, resellers can't help their clients.
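A minimal sketch of what that rack-level arithmetic looks like, assuming an illustrative 4.5 kW average draw and a 20-cents-per-kilowatt-hour tariff (neither figure comes from the discussion):

```python
# Rough sketch of rack-level metering arithmetic. The 4.5 kW average draw and
# the $0.20/kWh tariff are illustrative assumptions, not figures from the panel.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours


def annual_rack_cost(avg_draw_kw: float, tariff_per_kwh: float) -> float:
    """Annual electricity cost for one rack at a given average draw."""
    return avg_draw_kw * HOURS_PER_YEAR * tariff_per_kwh


print(f"Estimated annual cost for one rack: ${annual_rack_cost(4.5, 0.20):,.0f}")
# -> roughly $7,900 per rack per year under these assumptions
```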
Andrew A lot of the things we've been talking about refer to what I call baselining. A reseller could assist by baselining power. That's everything from what they are being charged by their electricity provider, through how they distribute power within the data centre, to recording that power baseline, so they can start to measure their power usage within the data centre and look at where they can make savings.
Then you create a whole data centre baseline, or data centre model, from which to improve.
For a lot of resellers and clients I talk to it’s about ‘we don’t even know what we’ve got today, can you help me create a baseline?’, ‘can you put a stake in the ground for me and then work with me to improve where I want to go?’
It could be something as simple as mapping out their power supply chain, or using a tool, like a DCM [data centre management] tool, to do that for them. So I think there are opportunities for resellers to use DCM tools to help their own business, and also to provide those tools to their clients as well.
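One way to start the baselining Andrew describes is to aggregate rack-level readings into an IT load and compare it with the billed energy; everything above the IT load is overhead to chip away at. A sketch with invented readings, assuming the bill covers only the computer room:

```python
# Sketch of a power baseline: sum rack-level meter readings into an IT load,
# then compare the implied annual energy with what the utility billed.
# All readings and the billed figure are invented for illustration.

HOURS_PER_YEAR = 8760

rack_readings_kw = {"rack-01": 3.8, "rack-02": 5.1, "rack-03": 2.4}  # avg kW
billed_kwh_per_year = 250_000

it_load_kw = sum(rack_readings_kw.values())
it_kwh_per_year = it_load_kw * HOURS_PER_YEAR

# Everything on the bill above the IT load is overhead: cooling, UPS losses,
# lighting and so on. This ratio is the baseline to improve against.
overhead_ratio = billed_kwh_per_year / it_kwh_per_year

print(f"IT load: {it_load_kw:.1f} kW ({it_kwh_per_year:,.0f} kWh/year)")
print(f"Billed:  {billed_kwh_per_year:,} kWh/year")
print(f"Total-to-IT ratio (baseline): {overhead_ratio:.2f}")
```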
Paul This is critical. A typical one-megawatt data centre would consume about $20 million worth of electricity over its lifespan. That is about 50 percent of the running costs of that data centre over its lifespan. So if you can't measure, manage or optimise it, you are really going to struggle to operate the data centre as efficiently as possible. The IT manager should be renamed the 'IT and energy manager', because IT has such a significant bearing on the total actual spend of an organisation.
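Paul's $20 million figure is easy to sanity-check. A rough calculation, assuming an average one-megawatt facility draw, a tariff of around 15 cents per kilowatt-hour and a 15-year life (none of which are stated above):

```python
# Back-of-envelope check of "$20 million over its lifespan" for a one-megawatt
# data centre. The tariff and lifespan here are assumptions.

facility_draw_kw = 1_000    # 1 MW average facility draw
tariff_per_kwh = 0.15       # assumed average electricity price in $/kWh
lifespan_years = 15         # assumed lifespan
hours_per_year = 8760

lifetime_cost = facility_draw_kw * hours_per_year * tariff_per_kwh * lifespan_years
print(f"Lifetime electricity cost: ${lifetime_cost / 1e6:.1f} million")  # ~$19.7m
```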
Jacques What we're discovering now is that your facility manager and IT manager are working together, whereas they used to debate at board-level meetings about who's going to pay for what. So the IT guy says 'you provide me with UPS power and you pay for it', and the network guy says 'I'm just providing the connection to the user', and the IT guy says 'well, the client is going to have to pay for that'.
There used to be this debate. But when you look at a building, the power drawn by computing in that building can account for between 40 and even 60 percent of the whole building's power. So the facility manager needs to report to the board every 12 months on how much power is being consumed, but he doesn't know. So who does he turn to? The IT manager.
There is a difference between being effective and being efficient. Effective is doing the right thing, and being efficient is doing the right thing in the right manner. So you have to look at best practice and Greg Boorer from CDC has got the best example.
Greg Yes, we've taken a bucket of energy that the government has conservatively estimated at between 27 and 36 megawatts to support as little as 9 or 10 megawatts of IT load, and we've reduced that to 12 megawatts for the same load. It's significant. But that's just the monetary side. It's not all about the carbon tax; that will only make up about 50 percent of the increases in our power cost.
Generally, the more efficient you can be the better, and that's driving a lot of people to ask whether it's more efficient to place their equipment in purpose-built facilities rather than doing it in-house. With all the monitoring and metering tools, people often know more about their equipment in a purpose-built facility, how much energy it's using, how secure it is and who's had access to it, than they do in their own basements.
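Greg's earlier figures map onto what the industry usually expresses as power usage effectiveness (PUE), the ratio of total facility power to IT load. Taking the midpoints of the ranges he quotes:

```python
# PUE (total facility power / IT load) implied by the figures quoted above.
# Midpoints of the quoted ranges are used; PUE itself isn't stated in the panel.

it_load_mw = 9.5          # midpoint of "9 or 10 megawatts of IT load"
before_total_mw = 31.5    # midpoint of the 27-36 MW government estimate
after_total_mw = 12.0     # the figure quoted for the same load today

print(f"Implied PUE before: {before_total_mw / it_load_mw:.1f}")   # about 3.3
print(f"Implied PUE after:  {after_total_mw / it_load_mw:.2f}")    # about 1.26
```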