Studying the 2008 server scene

The server market in Australia is undergoing some fundamental changes. With a new generation of servers backed by multi-core and many-core processors becoming more mainstream, developers face challenges in optimising their applications to take advantage of the new performance capabilities.

At the same time, virtualisation and emerging mobile and communications applications are exerting more pressure on existing server infrastructures, driving higher traffic volumes onto corporate networks and creating manageability issues for administrators.

Resolving these issues is currently the domain of vendors, OEMs, partners and systems integrators. To that end, here are some thoughts on the top challenges IT professionals face across various server markets, along with where some of the key opportunities now lie.

Virtualisation on the rise
Virtualisation isn’t new – it’s been around since the early 1960s, when IBM introduced virtual machine technology on mainframe computers. In later years, Microsoft Windows NT included a virtual DOS machine, and VMware (acquired by EMC in 2004) introduced its first product, VMware Workstation, in 1999.

Although virtualisation has been around for more than four decades, the software industry is just beginning to understand the full implications of this important technology.

Server virtualisation – consolidating multiple machines onto a single server – is the most common form of virtualisation in use today, but it is still very early in the adoption cycle. Indeed, analysts estimate fewer than 10 percent of servers are currently virtualised.

We believe that, in coming years, server virtualisation will become more ubiquitous as products are introduced that target today’s high-volume, low-cost hardware.

Beyond server and storage
There are many different types of virtualisation. Machine virtualisation uses software to create a virtual machine that emulates the services and capabilities of the underlying hardware, making it possible to run more than one operating system on a single machine. On servers, this approach is called server virtualisation; on end-user PCs, it is called desktop virtualisation.

Other types such as application virtualisation (which separates the application from the operating system to reduce conflicts between applications, and simplify deployments and upgrades), presentation virtualisation (which enables an application on a computer in one location to be controlled by a computer in another), and network virtualisation (which allows remote users to tap into a company network as if they were physically connected) are now finding a space in the market.

Adoption of these forms of virtualisation is just beginning and their potential value remains largely untapped. While each layer of virtualisation delivers an important set of benefits, the real power comes when companies implement an integrated strategy that extends across their IT infrastructure. Strong opportunities exist for the channel to assist customers in implementing virtualisation technologies that solve problems from the desktop to the data centre.

Virtualisation within the OS
Rather than remaining independent of the operating system, virtualisation will become a default capability within it – be it Windows or non-Windows. Today, vendors such as Sun, Novell and Red Hat incorporate virtualisation into their x86 operating systems, and HP and Hitachi incorporate a virtualisation layer into their Itanium-based systems. We’ve also seen a kernel-based virtual machine (KVM) added to the Linux kernel.

Microsoft has brought hypervisor-based virtualisation to Windows Server – and a wider range of customers – with Windows Server 2008. Certainly, as Microsoft and others offer operating systems with virtualisation, AMD and Intel design microprocessors with virtualisation capabilities built in, and hardware vendors explore ways of engineering virtualisation capabilities into firmware, server virtualisation will become a greater commodity, and customers will be the big winners.

Linux loses monopoly on PHP
Websites and applications developed in PHP will increasingly be uncoupled from exclusively Linux-based servers, as FastCGI extensions coupled with PHP improvements close the performance gap between Internet Information Services (IIS) and Apache.
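
The general shape of the setup on IIS 7.0 is to register the PHP FastCGI binary and then map .php requests to it. The commands below – run with the appcmd tool in %windir%\system32\inetsrv – are a minimal sketch that assumes PHP has been unpacked to C:\PHP; paths and the handler name will vary by installation:

    rem Register php-cgi.exe as a FastCGI application
    appcmd set config /section:system.webServer/fastCgi /+"[fullPath='C:\PHP\php-cgi.exe']"

    rem Map *.php requests to the FastCGI handler
    appcmd set config /section:system.webServer/handlers /+"[name='PHP-FastCGI',path='*.php',verb='*',modules='FastCgiModule',scriptProcessor='C:\PHP\php-cgi.exe',resourceType='Either']"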

The shift is starting to register in the hosting market, with providers such as Brisbane-based Emantra launching services to allow companies and developers to run existing websites and applications on the new IIS 7.0 platform.

“We are seeing a lot of interest in the service,” said Ross Dewar, managing director of Emantra. “Earlier in the year, we provided a free beta service open to the developer and integrator communities in Australia and across Asia Pacific that allowed them to experience the performance, usability and security improvements of the new platform. Now, with Microsoft’s official release of the new technology on 28 February, we have launched a commercial release of Hosted Windows Server 2008/IIS7 for our customers.”

Management is the key
The reality is that the single largest cost area contributing to total cost of ownership is the manageability of the solution. Better and easier manageability translates to time and cost savings.

As IT environments mix physical and virtual servers, in many cases running multiple operating systems, manageability increasingly becomes the key to success. In particular, customers want physical and virtual management to occur from the same console.

“Virtualisation without good management is more dangerous than not using virtualisation in the first place,” said Tom Bittman, Gartner vice president and analyst.

Automation and delegation of administrative rights (without compromising the integrity of the box) are also features likely to rise in popularity. Delegation is already gaining a foothold in the hosting market, where letting the website or application developer/ISV self-manage the hosted side of their solution means fewer calls to IT support over issues they can easily fix themselves, given the right level of access.

Servers become modular
Attack surface reduction (ASR) has been a buzzphrase for the last couple of years. In the past, ASR meant disabling unnecessary services and features and uninstalling components that weren’t critical to the server’s role, thus reducing the number of points that are vulnerable to attack or that need to be maintained and patched.

As servers become more modular, administrators will be able to choose to install only the server modules and subsystems needed to fulfil the role of the box, e.g. only file server roles for a file server. Put simply, they won’t have to install everything and then go back and disable or uninstall unnecessary features.

The availability of minimal install options will also create an opportunity to reuse older hardware that perhaps lacks the capacity to run a full install, but could still be useful in a low-footprint role, e.g. as a print server.
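
Windows Server 2008’s Server Core option is an early example of the approach: a minimal installation with no graphical shell, onto which individual roles are added from the command line. As a rough sketch (the exact component names vary by release):

    rem List available roles and whether they are installed
    oclist

    rem Install only the roles the box actually needs, e.g. print services
    start /w ocsetup Printing-ServerCore-Role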

Location intelligence goes the distance
The ubiquity of geographical services such as Virtual Earth and the increasing sophistication with which users consume data mean that spatial information is just another component to be used as a basis for making better decisions and providing higher-value services.

The recent trend towards Web 2.0 mash-up solutions in which information and content from multiple sources is combined to create versatile online applications is indicative of the way that computer users make sense of the vast amount of information that is available to them.

Already, ISVs are testing database server functionality that lets them create applications that add a spatial layer to existing data – for example, allowing a sales manager to define geographic sales regions, and use them to match customers to sales representatives and perform analysis of sales performance per region.
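
SQL Server 2008’s new spatial data types show the shape of this. In the hypothetical sketch below – the table and column names are invented for illustration – regions are stored as polygons and customers as points using the built-in geography type, and the STIntersects method pairs each customer with the region that contains them:

    -- Sales regions as polygons, customers as points (SQL Server 2008 geography type)
    CREATE TABLE SalesRegions (
        RegionId   INT PRIMARY KEY,
        RegionName NVARCHAR(50),
        Boundary   GEOGRAPHY
    );

    CREATE TABLE Customers (
        CustomerId   INT PRIMARY KEY,
        CustomerName NVARCHAR(100),
        Location     GEOGRAPHY
    );

    -- Match each customer to the sales region whose boundary contains them
    SELECT c.CustomerName, r.RegionName
    FROM Customers c
    JOIN SalesRegions r
      ON r.Boundary.STIntersects(c.Location) = 1;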

Many cores mean many threads
Future improvements in server processing power are proceeding on a new model. Instead of deriving extra performance from scaling clock speed, which eventually drives power usage and heat emission to unmanageable levels, chip manufacturers have begun to increase overall processing power by adding additional CPUs, or “cores”, to the microprocessor package.

Most mainstream server systems now ship with dual- or quad-core microprocessors – and with Intel’s 80-core research chip showing just how far the concept can scale, the challenge now lies with operating system and application developers to continue to optimise their code for multi-core hardware.
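
The common pattern, whatever the platform, is to stop assuming one fast processor and instead spread work across however many cores the machine reports. Below is a minimal illustrative sketch in Java – the doWork method and the task count are hypothetical stand-ins for real application work:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class CoreScaling {
        public static void main(String[] args) {
            // Size the worker pool to the number of cores the OS reports,
            // so throughput scales with cores rather than clock speed
            int cores = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(cores);

            for (int i = 0; i < 100; i++) {
                final int task = i;
                pool.submit(() -> doWork(task)); // one hypothetical unit of work
            }
            pool.shutdown();
        }

        static void doWork(int task) {
            // CPU-bound work for a single task would go here
        }
    }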

For server OS developers, kernel improvements will in most cases be required to divide available processor time efficiently among the processes and threads that need it. Hypervisors are following suit: Microsoft’s new Hyper-V, for example, supports multiple processors and cores on the host, and allows a virtual machine to be assigned up to four virtual processors.

Beyond mobile email
People expect to be able to do more and more with their mobile phones. One of the trends to emerge recently is the creation of server roles to manage mobile devices in much the same way servers currently manage desktop and laptop environments.

The need for this is two-fold: as smart phones become further embedded in the enterprise, companies are increasingly delivering new applications to phones over the air, and finding ways to connect people securely via a mobile virtual private network (VPN) to give them access to critical data such as expense reports and other internal systems.

Community support
As server platform updates and new roles emerge, the specialised skills and knowledge of partners, OEMs, ISVs and resellers will remain integral to the success and adoption of these trends. Microsoft Australia, for example, is working with more than 30 local ISVs in early adoption programs to help them test and validate their solutions on products including Windows Server 2008.

“Partners and IT professionals alike are gearing up for the next wave of server innovations and advances,” said Nick Mayhew, group manager, partner strategy and marketing team at Microsoft Australia. “As with any technology trend, partners remain at the forefront in terms of delivering value to Australian customers. The ball is firmly in the vendor community’s court to provide the technical readiness and marketing tools and support to ensure the ecosystem can maximise value from these opportunities.”