This trilogy of articles looks at the development of bigger and better data centres across the world. It considers what is driving this construction boom, some of the challenges it faces and examples of how it is being implemented.
For a number of years the buzzword in the IT industry has been “cloud”. The word represents a shift in the way all of us access and carry out our computing, with an increased reliance on the internet. The term has even crossed into mainstream parlance thanks to services such as Apple’s iCloud.
The idea behind cloud computing is that users access computing resources, such as applications or storage space, on remote computers via the internet, rather than on local individual machines. The advantages are plentiful: consumers can access just the services they need, as and when they need them (akin to traditional utilities), and from a multitude of devices and locations.
Many of us use explicitly labelled cloud services, but most of the cloud services we rely on, particularly in the consumer market, don’t prompt the same recognition. For example, the massively popular social media tools Twitter and Facebook allow us to share communications and media by storing them on remote servers, where they can be accessed via the internet, i.e. through the cloud, by anyone with whom we wish to share them.
In a sense, cloud computing is a natural evolution of the world wide web, which was originally intended to provide a network of shared documents. With the wide availability of higher-bandwidth internet connections and the adoption of Web 2.0, including the idea of user-generated content, those simple documents have evolved into complex web-based applications and rich media.
The flexibility, scalability and cost effectiveness of cloud computing offer considerable benefits for both private and enterprise consumers compared to ‘traditional’ one-off installations on individual devices, and adoption rates therefore continue to rise. The concept can appear fairly ephemeral to consumers, who access all of their computing resources via the internet, but ultimately all of that digital information needs to be stored somewhere. As these services grow in popularity, social network providers, cloud storage providers, cloud application providers and IaaS (Infrastructure as a Service) providers need ever more physical capacity. Recent years have therefore seen what some have described as a data centre arms race to build more, and bigger, data centre facilities.
The challenge for data centre providers is therefore to build bigger, to reach the capacity they need, while at the same time minimising the power they consume and their impact on the surrounding environment. Efficiency is measured by a PUE (Power Usage Effectiveness) score: the ratio of a facility’s total power draw to the power delivered to the core servers, the difference being consumed by supporting infrastructure (mainly heat management). The industry benchmark is a PUE of 2.0, which represents one unit of power consumed on infrastructure for every unit consumed by servers, while the theoretical ideal is a PUE of 1.0. Not only is a more efficient facility greener and more sustainable but, for the provider, it minimises cost and increases the scope for raising capacity. Consequently, data centre constructors are increasingly looking to new and innovative ways to bring this score below 2 and as close to 1 as possible.
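The arithmetic behind the PUE score is straightforward, and a minimal sketch makes the benchmark figures concrete. The function and kilowatt figures below are illustrative assumptions, not measurements from any real facility:

```python
def pue(it_power_kw: float, overhead_power_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by
    the power delivered to the IT equipment itself."""
    total_power_kw = it_power_kw + overhead_power_kw
    return total_power_kw / it_power_kw

# At the industry benchmark of 2.0, supporting infrastructure
# (cooling, power distribution, lighting) draws as much power
# as the servers themselves: 1 MW of IT load, 1 MW of overhead.
print(pue(1000.0, 1000.0))  # 2.0

# A more efficient facility: only 300 kW of overhead per 1 MW of IT load.
print(pue(1000.0, 300.0))   # 1.3

# The ideal of 1.0 would mean every watt entering the building
# reaches the servers, with zero spent on supporting infrastructure.
print(pue(1000.0, 0.0))     # 1.0
```

Since the IT load is in both numerator and denominator, the only route below 2.0 is cutting the overhead term, which is why the innovations discussed in this series focus so heavily on heat management.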