Well, here we are in mid-2010 and cloud computing is still in its nascent stages, despite heavy industry hype and market anticipation.
Depending on where you stand in the world of information technology, or where your customers stand, the cloud is either a far-off promise of powerful solutions and growth or a viable, efficient and strong option today.
Much investment has yet to be made in building out application support for cloud-based IT, and just as much in delivering enterprise-level uptime and reliability. It's getting better, though. For example, companies like Amazon.com provide more transparency than ever into their own uptime and reliability numbers, and a number of third-party companies, such as Apparent Networks, provide better tools for monitoring and benchmarking the performance of cloud services and infrastructure.
Microsoft, too, has launched its own platform and set of tools for developers to start writing for the cloud and provides new options for ISVs that didn't exist a year ago.
This month we take a look at several companies that are bringing new weapons to the cloud computing battle and examine whether they make sense for most developers, VARs and customers. It's still a mixed bag of functionality and complexity and, in the end, we're left waiting for continued improvement throughout the cloud.
But in this issue, we can point to several companies, technologies and solutions as reasons for optimism.
Microsoft's Windows Azure
Microsoft's flagship cloud product isn't a product, it's a service targeted at its development community: Windows Azure Platform.
What's great about Windows Azure is also its biggest drawback: It is integrated so tightly into Microsoft's existing Windows ecosystem that for those schooled in Visual Basic and .Net, it should be an absolute breeze. But some would say that also means there's a catch.
Microsoft saw its ascension to the top of the industry in the '80s and early to mid-'90s when it made it simple and profitable for ISVs to build applications for its Windows platform. When moving up the chain to network- and Internet-based computing, Microsoft moved its model to the .Net platform and continued to grow. Now, with the advent of cloud computing, Microsoft brings us Windows Azure.
Because the platform is built on the foundation of Visual Basic and .Net, Red Hat executives, among others, say Microsoft has built it with a Windows "lock-in" for developers. To a large degree, that criticism is correct. For something as simple as integrating Windows Azure development into your environment, Windows Azure Tools for Microsoft Visual Studio requires .Net Framework 3.5 SP1, plus either Visual Studio 2008 SP1 (Standard or above), Visual Web Developer 2008 Express Edition with SP1, Visual Studio 2010 (Standard or above) or Visual Web Developer 2010 Express Edition.
Just to access the tools and infrastructure, we needed to install everything on an instance of Windows Server 2008 (we used Standard Edition, 32-bit).
To our way of thinking, that's a lock-in. But not everybody minds a lock-in. Some would say Apple's strategy surrounding the iPhone and iPad is to build a lock-in, and that hasn't hurt Apple. And in previous computing eras, lock-in certainly didn't hurt Microsoft in any noticeable way.
So you've met the requirements to install the Azure tools, you're familiar with Microsoft's Visual Basic-based programming environment and you want to build cloud-based applications. What will it cost you?
Azure isn't free. Under an SLA, users pay 12 cents an hour for compute cloud infrastructure, 15 cents per GB stored per month and 1 cent per 10,000 storage transactions, among other costs. (Microsoft isn't charging for inbound data transfers during off-peak times through June 30.)
While not free, the cost is competitive with other cloud computing or cloud storage services.
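To see how those per-unit rates add up, here's a rough back-of-the-envelope estimate in Python. The rates are Microsoft's published figures above; the usage numbers are hypothetical, for illustration only.

    # Back-of-the-envelope Windows Azure cost estimate.
    # Rates are Microsoft's published consumption prices (mid-2010);
    # the usage figures below are hypothetical.
    COMPUTE_PER_HOUR = 0.12       # US$ per compute instance hour
    STORAGE_PER_GB_MONTH = 0.15   # US$ per GB stored per month
    PER_10K_TRANSACTIONS = 0.01   # US$ per 10,000 storage transactions

    def monthly_estimate(instances, gb_stored, transactions, hours=24 * 30):
        """Estimate one month of Azure charges for a simple deployment."""
        compute = instances * hours * COMPUTE_PER_HOUR
        storage = gb_stored * STORAGE_PER_GB_MONTH
        txn = (transactions / 10000.0) * PER_10K_TRANSACTIONS
        return compute + storage + txn

    # Example: two instances running around the clock, 50 GB stored,
    # one million storage transactions in the month.
    print("US$%.2f" % monthly_estimate(2, 50, 1000000))  # US$181.30

Even a modest always-on deployment, in other words, runs into the low hundreds of dollars a month, with compute hours, not storage, dominating the bill.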
Microsoft's top executives have made it clear to channel partners that every single, solitary part of its product lineup, from Windows to SQL Server, will be ported over to the cloud, which means it's going to have to take its development community with it, or try. Azure is a robust platform for those sticking with Microsoft environments and will be a big part of the Microsoft ecosystem's future.
Ctera C200
Ctera's motto doesn't exactly roll off the tongue: "Enter the Age of Cloud-Attached Storage."
But if the phrase doesn't make for a good motto, it does make for sound advice when you consider its approach and technology.
The company has gone to market with an approach to cloud storage that simply makes very good sense for small or midsize businesses, or workgroups, that seek a cost-effective way to straddle both the hard-wired world of hands-on information technology and the developing world of hosted infrastructure. With the Ctera C200, a toaster-size, network-attached storage device, Ctera allows for on-site data backup with a secondary backup into the company's own hosted site.
With a caveat that each enterprise would need to evaluate its own set of security and regulatory requirements, the CRN Test Center can fairly say that the solution is fast and easy to deploy, works as advertised and provides solid management and administration capabilities.
Let's start with the device itself. Small enough to fit on a desktop without getting in the way, the C200 is built as a two-bay appliance that can hold a couple of 3.5-inch SATA drives and support two USB storage devices. The device connects to the network via a Gigabit Ethernet port, and its administration console is easily discoverable on the network via browser. That leads us to a discussion of the hosted part of the solution.
Through the management console, VARs or administrators can set permissions, provision resources, schedule backups, set file sharing, configure alerts and notifications, and monitor the system. It's also where cloud backup services are managed, including cloud backup scheduling. The console is well designed and easy to navigate, which we found cuts down on the time needed for administration. Like other hosted storage services, Ctera also provides very simple remote access to files via a custom URL to its Ctera Portal service.
The C200's US$499 list price includes one year of 10 GB of hosted backup, with more capacity available. Ctera works with several partners in its hosted services, including Rackspace, one of the leading providers of hosted infrastructure. (The CRN Test Center has found Rackspace to be reliable and transparent in its infrastructure and performance.)
Ctera's technology and service are elegant, easy and fast to deploy, flexible and cost-effective.
The company heavily favors the VAR channel with its sales model and program, and the C200 is sold only via partners. It has positioned itself well for enterprises that want to migrate slowly from on-premise to hosted IT, and it has also positioned itself to be in a strong spot for some time to come in the small and midsize markets. It's a solution we can recommend.
EMC Atmos
EMC, one of the industry's pioneers in virtualization and an emerging leader in cloud computing, had appeared tentative for a while in rolling out its new cloud computing platform. Atmos, its hosted storage offering, launched last year, but EMC was relatively quiet about it. That is, until last month, when EMC announced it would give its Atmos cloud infrastructure platform a boost by adding data protection technology, which it calls GeoProtect, and by upgrading its entire hosting infrastructure to servers running Intel Xeon 5500 chips.
The Xeon 5500s are, simply, the most powerful processors we've ever seen in the CRN Test Center lab. Last year, in one instance, we took a dual-Xeon server with two 5570 CPUs and installed Windows Server 2008 R2 and 20 functioning virtual servers in about an hour. In a server the size of two pizza boxes, we were able to build a stable, virtual data center in no time.
The system itself registered a Geekbench score of almost 15,000. Even though this happened a year ago, that's still the highest system score we've ever seen with that benchmark.
When one of the biggest players in the industry adopts the most powerful hardware standard for one of its most critical, emerging lines of business, it makes a statement. Hardware still counts. EMC could have just said it would upgrade its data center and left it at that. But by going public with the details, EMC was making a statement and challenging the rest of the industry to do the same.
Amazon Web Services (S3, Simple Storage Service)
When we last examined Amazon.com's cloud offerings a year ago, we found its EC2 "elastic cloud" to be a turnkey-simple, pay-as-you-go, Web-based hosted server solution.
A lot has happened since then: Competition has increased; cloud companies have built out a lot more infrastructure and more enterprises than ever are open to giving it a try. But Amazon fueled perhaps the biggest alteration to the cloud market when it decided to slash pricing dramatically over the past several months with its Amazon Simple Storage Service.
Enterprise cloud storage is the most crowded portion of the hosted IT space, it's the easiest place to find value (EMC, Rackspace and Data Deposit Box all offer laudable solutions here), and reliability and availability are steadily improving.
Amazon Web Services (AWS) now provides software development kits for Java and .Net, including libraries, sample code and APIs, meaning it offers a platform for application development and deployment, not just raw capacity. Amazon S3 (Simple Storage Service) essentially sets the industry standard for cloud-based storage and backup. The pieces are integrated well into the AWS fabric: data stored on the cloud can be imported, exported or accessed in a matter of a few clicks; Amazon CloudFront, for example, can distribute that data via download or streaming; and users can allocate Elastic IP addresses, take snapshots, create security groups and load balancers, and bundle tasks.
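To get a feel for what that programmability means in practice, here's a minimal sketch of an S3 round trip in Python using boto, an open-source community library for AWS (Amazon's official kits are Java and .Net). The bucket and object names here are hypothetical.

    # Minimal S3 round trip using boto, an open-source Python library
    # for AWS. The bucket and object names are hypothetical.
    import boto

    conn = boto.connect_s3()    # reads AWS credentials from the environment
    bucket = conn.create_bucket('crn-test-bucket-example')

    key = bucket.new_key('hello.txt')              # an object in the bucket
    key.set_contents_from_string('Hello from the cloud')

    print(key.get_contents_as_string())            # round-trip check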
As it provides greater levels of management, customization, security and accessibility, Amazon's approach also brings greater complexity. For example, creating the "buckets" that hold objects on Amazon's cloud requires a degree of coding and command-line work that other services don't demand for baseline service. (Creating a "bucket" with a raw HTTP request requires 32 lines of code.)
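For contrast, here's roughly what the raw REST route involves: building and cryptographically signing the HTTP request by hand. This is a compressed, Python 2-era sketch of S3's request-signing scheme, with placeholder credentials, not production code.

    # Creating an S3 bucket with a raw, hand-signed HTTP PUT request.
    # A compressed sketch of S3's signing scheme; placeholder credentials.
    import base64, hmac, hashlib, httplib
    from email.utils import formatdate

    ACCESS_KEY = 'YOUR_ACCESS_KEY'   # placeholder
    SECRET_KEY = 'YOUR_SECRET_KEY'   # placeholder
    BUCKET = 'crn-test-bucket-example'

    date = formatdate(usegmt=True)
    # S3 signs: verb, content-md5, content-type, date and resource
    string_to_sign = 'PUT\n\n\n%s\n/%s/' % (date, BUCKET)
    signature = base64.b64encode(
        hmac.new(SECRET_KEY, string_to_sign, hashlib.sha1).digest())

    conn = httplib.HTTPConnection('%s.s3.amazonaws.com' % BUCKET)
    conn.request('PUT', '/', headers={
        'Date': date,
        'Authorization': 'AWS %s:%s' % (ACCESS_KEY, signature),
    })
    print(conn.getresponse().status)   # 200 on success

Add error handling, retries and the XML payloads some operations require, and the 32-line figure cited above starts to look realistic.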
For many enterprises just beginning to migrate to a cloud-based model for data storage and backup, Amazon is a natural choice to evaluate, and it offers good pricing once you pass the petabyte level of stored data. For those that want to make the move to the cloud a little easier, however, solutions such as Rackspace or Data Deposit Box may be the wiser alternative.
KineticD/Data Deposit Box
Data Deposit Box announced last month that it is changing its name to KineticD. Whatever you wish to call the company, we think its flagship Data Deposit Box storage solution is a strong one for many enterprises, including small or midsize businesses looking to migrate to cloud solutions a little at a time.
With a quick, "out-of-the-box" installation, Data Deposit Box offers the ability to automate cloud-based data backup. Its management console is straightforward and simple, yet provides clear visibility into a network's data sets that require backup and whether scheduled backups have been successful. The CRN Test Center evaluated the service over several days and found it provided a level of uptime and reliability that would be fine for most enterprises.
Like the other solutions we reviewed, Data Deposit Box provides password-level administration security. Through the application-based management console, we were able to manage users, run reports, get a networkwide view of data being managed through the service, examine daily activity and establish e-mail alerts for a variety of activities. While not as robust as Amazon's storage offering, it's aimed at a different level of enterprise and business strategy, namely entry-level storage and backup rather than data application development and deployment.
For a deployment of 28 computers, five servers and 90 GB of total data, KineticD estimates a monthly cost of US$180 (US$2 per GB per month).
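As a quick sanity check on that estimate, and for rough perspective against the raw per-GB storage rates cited earlier in this issue (keeping in mind that the services aren't strictly comparable, since KineticD's price bundles a managed backup service):

    # Sanity check on KineticD's estimate, plus a rough per-GB comparison
    # with the raw-storage rate cited earlier. Not strictly comparable:
    # KineticD's price includes a managed backup service.
    DATA_GB = 90

    rates = {   # US$ per GB per month, as cited in this article
        'KineticD (managed backup)': 2.00,
        'Windows Azure (raw storage)': 0.15,
    }

    for service, rate in rates.items():
        print('%-28s US$%7.2f per month for %d GB'
              % (service, rate * DATA_GB, DATA_GB))
    # KineticD: 90 GB x US$2 = US$180, matching the company's estimate.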
The Bottom Line: What all of these solutions prove is that there is no single, cookie-cutter approach to enterprise cloud IT, at least not yet. For VARs, there remains the very real possibility of delivering a client too much or too little compute capability or storage capacity, or too much or too little headroom. For now, then, best practices will include a full audit of an enterprise's data usage, compliance requirements, business road map and budget. Most enterprises are taking a slower approach to migrating data and compute resources to the cloud, and for good reason. There is a wide gulf between simple storage and backup on one hand and robust application development and delivery to and from the cloud on the other.
While companies like Microsoft and Amazon.com provide resources and infrastructure for longer-term, more robust cloud solutions (and it will take time for developers and technology providers to get there), companies like KineticD, EMC and Ctera provide half-steps that we can clearly recommend.