Until recently, adding another terabyte might have seemed like the right thing to do to future-proof storage needs. However, this overprovisioning is now part of the problem. Customers who use smarter tools to manage their storage may find there is already more than enough for their requirements.
When IT managers forecast their storage requirements in boom times, they tended to play it safe and procure more space than initially required. Their loyal suppliers then doubled that again, just in case. The result today is an unprecedented amount of unused storage capacity clogging up back offices and data centres around Australia, demanding more management, more power, more cooling, more floor space and, surprisingly, even more storage.
Some forecasters put storage utilisation at 30 to 40 percent, but Simon Piff, IDC storage research program director, Asia Pacific, believes it is even lower.
"Personally I think it's 20 to 30 percent, but there aren't any hard statistics because people don't want to look foolish and admit they made a mistake."
With data volumes growing at 50-60 percent a year in Australia - faster in some other parts of the region - due to the unprecedented growth of online transactions, video, email and presentation files, the need for optimised storage is pressing.
However, the solution might lie in the crisis itself: now is the time for resellers and their clients to rethink storage requirements and prove they can make their data storage smarter.
"We're seeing 60 percent compound annual growth in data, but we are not getting 60 percent compound increase in IT budgets to cope with it, so we're coming to a crossroads where something has to change for organisations to continue operating," says Scott Morris, NetApp director of channels, Australia and New Zealand.
Clive Gold, marketing chief technology officer, Australia and New Zealand, at EMC concurs: "In times like these, struggling SMEs tend to do just what they did yesterday. They just buy another server because that's what they did before. But now is the time to stop and think about it. And that's where the channel can help."
The key
The key to making storage smarter is to consolidate first. This applies to hardware and software as well as the data itself.
"There's a proliferation of storage devices at the moment. The first point to start is to see how many assets you have and how many are underutilised," says Morris. He suggests resellers and customers start by auditing the hardware they do have before exploring ways to store more data in the same footprint. They should also consider reducing the number of devices by choosing higher density, higher volume drives.
Consolidation of software is more difficult given the abundance of specialised tools marketed by different vendors.
Gold says customers, confused by this complexity, end up adding another layer of redundancy to their storage systems.
"It's too hard for customers to get all the benefits from their storage. To really use all the fancy little tools you get in the storage arena you need human intervention. Vendors have created lots of functionality and individually solved lots of different problems, but they are just tools," Gold says.
Piff says IDC surveys show customers want vendors and resellers to install and configure their solutions. "They just want them to make it work. It's a great opportunity for resellers, but it does require them to get close to their customers."
Backup is not archive
Customers need to distinguish between archive and backup before they can begin storage consolidation, says Dr Kevin McIsaac, advisor for virtualisation, storage and data centre infrastructure at research firm IBRS.
"Many organisations use the term ‘backup' and ‘archive' interchangeably, but there is a distinct difference in these processes," McIsaac says in his latest briefing paper which will be the subject of a seminar in June.
"A backup copies data to a secondary medium - such as tape and increasingly disk - leaving the data on the original system. The purpose of the backup process is to enable recovery. An archive migrates data from the primary system to the archive system, removing the data from the primary."
McIsaac says failure to understand the difference and to sort out the active data from the inactive results in backup processes that can't be completed inside established backup windows, longer-than-needed data recovery times, and unnecessary costly human intervention to find archived files to restore.
He says many organisations respond tactically to this problem by increasing backup throughput through virtual tape libraries, backup to disk or deduplication technologies.
"While these approaches can be effective in the short term, they are a band-aid solution that fails to address the underlying issue - that most of the data being backed up every night never changes."
Deduplication
Used in archiving and backup, deduplication shrinks storage needs by eliminating the repeated storage of unchanged data. In other words, it stores only data that has been modified, and only once, avoiding a full backup every time. Where more than one instance of a file needs to be referenced, deduplication stores pointers to the single unique copy.
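In code, the pointer mechanism might look something like this toy sketch - a rough illustration, not any vendor's implementation - which hashes fixed-size chunks, keeps each unique chunk exactly once and records files as lists of pointers:

```python
import hashlib

class DedupeStore:
    """Toy block-level deduplicating store: each unique chunk is kept once;
    a file is recorded as an ordered list of pointers (chunk hashes)."""

    def __init__(self, chunk_size: int = 4096):
        self.chunk_size = chunk_size
        self.chunks = {}   # hash -> unique chunk data
        self.files = {}    # filename -> ordered list of pointers

    def put(self, name: str, data: bytes) -> None:
        pointers = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # store only never-seen chunks
            pointers.append(digest)
        self.files[name] = pointers

    def get(self, name: str) -> bytes:
        # Retrieval re-assembles the file from its pointers - the extra CPU
        # cost that comes with finer-grained deduplication.
        return b"".join(self.chunks[p] for p in self.files[name])
```

Storing a second, identical copy of a file adds only another pointer list; the chunks themselves are never written twice.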
Deduplication can be used to reduce backup volumes, save space on existing disks and tapes, lower storage procurement needs, reduce data transmission and deliver faster recovery times. But depending on the level of granularity, it can also require more processing power to re-assemble the pointers when retrieval is needed.

"Deduplication is such a huge topic now. The technology is very young - everyone defines it as they wish," says Gold.
"EMC's Avamar, for example, will look into the files it has been requested to backup and will see if anything in them has changed. Then it will see if any other files already in the backup contains that change. If nothing has changed, it won't move the file," he says.
NetApp's Scott Morris says deduplication makes mathematical sense. "If you have 10TB of data, you need 10TB for a second copy and another 10TB to keep a third copy. It adds up very quickly." Deduplication can eliminate up to 95 percent of the data needing to be replicated.
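Taking Morris's figures at face value, the arithmetic might run as follows - a back-of-the-envelope sketch that applies the 95 percent saving only to the replicated copies, which is one plausible reading of the claim:

```python
primary_tb = 10                      # the primary data set
copies = 3                           # primary plus two additional copies
raw_tb = primary_tb * copies         # 30TB without deduplication
dedupe_rate = 0.95                   # up to 95% of replicated data eliminated
replicated_tb = raw_tb - primary_tb  # the 20TB of second and third copies
deduped_tb = primary_tb + replicated_tb * (1 - dedupe_rate)
print(f"{raw_tb}TB raw vs {deduped_tb}TB deduplicated")  # 30TB vs 11.0TB
```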
"Deduplication is not just suited to backup and disaster recovery environments. We also do it at the primary device. Our competitors say we can't, but we can. We dedupe at the source," Morris claims.
Mark Nielsen, StorageWorks product marketing manager for HP's enterprise storage and servers division, says resellers should talk to customers about how deduplication leads to more efficient backups and helps reduce the storage footprint - and how much faster their storage ROI can be.
"Typically, we are seeing the two- to three-year ROI window shrink to between six to 12 months," he says.
Included in the deduplication bag of tricks are snapshots: the ability of dedupe systems to store a snapshot of a file rather than the file itself. The snapshot contains pointers to the data contained in the file and already stored elsewhere. For example, a PowerPoint presentation may include photos and graphs used by a number of users.
Under conventional backup, those photos and graphs would be stored several times. A snapshot will store a picture of the presentation and point to where the photos and graphs are, eliminating the need to store them again.
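A toy copy-on-write sketch (illustrative only, not any vendor's on-disk format) shows why this is cheap: taking a snapshot copies a small pointer table, while the underlying blocks - the photos and graphs - stay where they are:

```python
import hashlib
from typing import Optional

class SnapshotVolume:
    """Toy copy-on-write volume: a snapshot is a frozen copy of the
    block-pointer table, not a copy of the blocks themselves."""

    def __init__(self):
        self.blocks = {}      # hash -> block contents, shared by all snapshots
        self.live = {}        # path -> pointer (hash) to the current block
        self.snapshots = {}   # snapshot name -> frozen pointer table

    def write(self, path: str, data: bytes) -> None:
        digest = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(digest, data)  # old blocks are never overwritten
        self.live[path] = digest

    def snapshot(self, name: str) -> None:
        self.snapshots[name] = dict(self.live)  # copies pointers only

    def read(self, path: str, snapshot: Optional[str] = None) -> bytes:
        table = self.snapshots[snapshot] if snapshot else self.live
        return self.blocks[table[path]]
```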
The delay associated with restoring those elements to the backed-up presentation, should it be required, is negligible, according to vendors CRN spoke to.
"If it was slower, the market wouldn't take it up," says Morris.
While deduplication is moving from buzzword to reality, there's tangible demand in the marketplace for resellers who can translate the benefits to customers in business terms.
"While all these features are great to have, going in and talking at a feature level, like "I can deduplicate' isn't actually the answer organisations want to hear," Morris says.
They would rather learn how to achieve sustainable reductions in the total cost of ownership of storage devices and how to reduce their overall storage operating budget.
"It's endemic in our industry today that we get so caught up in the technology that we forget people are buying it for business, not technology, reasons."
Thin provisioning
One of the newest storage concepts, thin provisioning refers to the ability of storage software to allocate capacity to different users on demand from a shared pool in a single storage area network (SAN), rather than dedicating it up front.
Inflexible allocation of capacity according to corporate hierarchy or department requirement has proven one of the greatest sources of overprovisioning and underutilisation in SANs.
Now thin provisioning promises to end the silo structure at the back end.
"Traditionally, IT departments would forecast how much space they needed for each application on a three-year basis and from day one, the storage would be carved up in the buckets people asked for, with no opportunity to change, only to wipe and bring it back," explains Morris. "This created a lot of unused storage and perpetuated itself, because in the next three-year cycle, they'd ask for more just in case."
Thin provisioning accommodates the need for flexible allocation while keeping track of usage by application or client.
FlexVols (NetApp's flexible volumes) allow managers to shrink or grow storage volumes with simple commands while the system is running.
"When we combine deduplication and thin provisioning we are actually able to get utilisation beyond 100 percent," Morris claims.
Other novel terms include thin replication, FlexClone, SnapMirror and SnapVault, which individually or combined promise further storage optimisation.
Gold says the bottom line is that through thin provisioning "some customers have reduced the amount of storage they had to buy by 40 percent".
Storage as a service
Symantec has joined the chorus of vendors professing the virtues of storage optimisation by launching a new campaign titled "Stop Buying Storage".
David Dzienciol, senior director, enterprise sales and partners, Pacific region, Symantec Australia, says customers must employ the best tools to maximise their existing assets and should defer any future storage investment until the financial storm has passed.
That might also be the time when Symantec's North American storage-in-the-cloud service becomes available in Australia, but Dzienciol won't be drawn on when that might be.
"Certainly there is a trend towards a cloud service," he says, adding the vendor is exploring ways to bring the Symantec Protection Network (SPN) - an 18-month-old online backup service - down under.
He says a Symantec survey of companies' IT plans last year found 64 percent were looking at using cloud services in some way. "SaaS is a way to manage cost, reduce the risk of patches, reduce the need to acquire software and satisfy demand issues. But our goal this year is to help customers utilise the assets they do have onsite," says Dzienciol.
"The opportunity in 2009 will be ‘how do I get more out of what I have?'," he says, suggesting software tools such as Symantec's Command Central Storage as a solution.
"Where the customer already has the tools we are working with our partners to help them better understand the technology, reclaim and repurpose the storage. We are spending a lot of time communicating this message to our partners for them to turn a challenging economic time into an opportunity for them."
Strategic consulting
Cue the opportunity for consultants to enter the storage market. Are customers using all the storage they have? Could they use it better? Is replication more efficient than deduplication? How much time and resources are wasted re-storing data that is not needed, or not managing data that is crucial? These are only some of the questions a new consulting service from Hitachi Data Systems promises to address.
Under the umbrella of "Storage Economics", the vendor is promising a technology-agnostic analysis of a client's overall storage requirements and a long-term strategic plan to better address them.
Simon Elisha, Hitachi chief technologist, Australia and New Zealand, says Storage Economics is a set of some 30 metrics that enable the Hitachi consultant, together with the reseller, to evaluate a client's needs in relation to their business objectives.
The metrics include management-time-by-terabyte, power consumption, outage and downtime, disk procurement, data transfer, floor space, depreciation and rate of growth by tier of storage, among others.
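Hitachi has not published its formulas, but the flavour of such metrics is easy to sketch. The function below folds management time, power, floor space and depreciation into an annual cost per terabyte; every parameter name, value and the formula itself are illustrative assumptions, not Hitachi's model:

```python
def annual_cost_per_tb(capacity_tb, admin_hours_per_tb, hourly_rate,
                       avg_power_kw, cost_per_kwh, floor_m2, rent_per_m2,
                       annual_depreciation):
    """Illustrative TCO-per-terabyte from a few Storage Economics-style inputs."""
    management = admin_hours_per_tb * capacity_tb * hourly_rate
    power = avg_power_kw * 24 * 365 * cost_per_kwh  # cooling folded into draw
    floor_space = floor_m2 * rent_per_m2
    total = management + power + floor_space + annual_depreciation
    return total / capacity_tb

# e.g. a 100TB estate: 5 admin hours/TB/year at $90/hr, 8kW draw at $0.20/kWh,
# 20 square metres at $800/m2/year, $150,000/year depreciation
print(round(annual_cost_per_tb(100, 5, 90, 8, 0.20, 20, 800, 150_000), 2))
# -> 2250.16 dollars per terabyte per year
```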
"It's an opportunity for our channel partners to change the conversation they are having with the customer," says Elisha.
"The conversation is not just about needing more storage. During boom times, organisations got away with it because budgets were growing, but now we need to change that thinking.
"We need to sweat their storage asset to get more out of repurposing and making subtle changes."
It's a big-picture discussion better held in the boardroom than on the IT floor.
"If the conversation is held at the correct level, that of director of IT, chief operating officer, chief information officer or senior operations manager, it's a very quick and very well-received conversation because it's done in business terms."
But he says such long-range strategic discussions can fall on the deaf ears of IT managers who are "up to their neck in running the day-to-day".
"The chance is for good channel or sales people to have the conversation with the right people. It comes down to the relationship they have with their customers. We don't want to be box-dropping, saying ‘see you next time'. We actually do care about what they are doing from a business perspective."
IDC's Simon Piff says vendor-tied consultants ultimately have one thing in mind: "sell more kit". "But [Hitachi] is taking the right approach in trying to get closer to the customer."
The bigger opportunity for consulting, he says, lies with resellers who represent more than one vendor and are not yet engaging customers in strategic discussions.
"Some resellers just ask ‘what do you want to buy - we have everything'. The market has changed a lot in the last six months," Piff says.