When Google filed its S-1 form in April 2004, ahead of its stock market flotation, it revealed not only that it was already very profitable – making $105m (£67m) on $1bn of revenue in 2003 – but also that it had invested hundreds of millions of dollars in building a network of servers around the world.

Onlookers quickly realised there were hundreds of thousands of servers – and that Google was building them itself from spare parts. At the time, it was the third-largest server manufacturer in the US. But it didn’t sell a single one.

The cost of building a “cloud service” like Google’s has only gone up: in its latest results, the company revealed that it had spent $1.6bn on data centres in the three months from April to June alone. Spending on that scale – upwards of $4bn a year – is now typical for Google, paying for the systems it uses to index the web, answer searches, serve adverts, handle email, store photographs and provide maps and Street View photos.

And it’s far from alone. Globally, spending on data centres will hit $143bn this year, according to the research group Gartner, and $149bn next year, continuing slow but steady growth. The cause? “Big data”, as companies try to cope with a growing flood of information about their own business and others’, along with the rush to enable “cloud computing”, so that data can be accessed from anywhere with an internet connection.

Handling “big data” for millions of people generally involves billion-dollar cheques. That was the price of Apple’s third data centre – in Maiden, North Carolina – covering 500,000 sq ft (4.65 hectares) and cooled by water from nearby rivers.

It’s not alone: Google, Facebook, the US phone company AT&T and the services company Wipro also have centres there, attracted by cheap electricity and plenty of space in the rural state; a single centre can cover as much ground and use as much electricity as a small town. They are power-hungry and data-hungry. The servers that process the data are arranged in racks, with cooling air forced over them; storage sits in racks of hard drives set up in the expectation that some will fail and be swapped out. The data enters and leaves via thick fibre-optic cables routed through the floor.
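The point about failure is easiest to see with arithmetic. As a back-of-the-envelope sketch (the fleet size and failure rate below are assumptions for illustration, not figures from any company named here), consider how often drives die at this scale:

```python
# A back-of-the-envelope estimate of drive failures in a large storage fleet.
# Both numbers below are assumptions for illustration, not published figures.

drives = 100_000   # hypothetical fleet size for one large data centre
afr = 0.02         # assumed annualised failure rate of 2%, in line with
                   # commonly cited figures for commodity hard drives

failures_per_year = drives * afr
failures_per_day = failures_per_year / 365

print(f"Expected drive failures per year: {failures_per_year:,.0f}")
print(f"Expected drive failures per day:  {failures_per_day:.1f}")
# Roughly 2,000 failed drives a year, five or six a day, which is why
# racks are designed for routine hot-swapping rather than emergency repair.
```

The exact numbers matter less than their shape: at data-centre scale, drive failure is a daily event, so the storage layer is built to carry on regardless.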

Yet two things you’ll rarely find in data centres are people and light. Many operate as “lights-out” facilities, because the machines don’t need to be watched in person; they can be monitored remotely. But the need to move data quickly means it matters that data centres are built close to their users. Hence Google built one near Dublin covering 4.45 hectares (11 acres) and costing €75m. Unusually, it is air-cooled.

Even so, building data centres is increasingly a tussle between access to space and power on one hand, and speed of connection on the other. Ahead of the 2012 London Olympics, some IT administrators in the financial district of Canary Wharf in east London fretted that they wouldn’t be able to get enough electricity for their new centres – which had to sit close to the dealing-room floors so as not to give up precious milliseconds of trading time. Some briefly considered moving out of London, but relented, and now benefit from the extra power capacity installed to meet the demand.
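Those milliseconds are ultimately a matter of physics. As a rough sketch (the distances below are assumptions for illustration), light in optical fibre travels at about two-thirds of its speed in a vacuum, so every extra kilometre of cable between a trading floor and its servers adds a measurable delay:

```python
# A rough latency calculation showing why proximity matters to traders.
# Distances are assumptions for illustration; the physics is the fixed part.

C_VACUUM = 299_792.458       # speed of light in a vacuum, km/s
C_FIBRE = C_VACUUM * 2 / 3   # light in glass fibre travels at roughly 2/3 c

def round_trip_ms(distance_km: float) -> float:
    """Round-trip time in milliseconds over a fibre run of the given length."""
    return 2 * distance_km / C_FIBRE * 1000

for km in (1, 50, 300):      # same building, edge of the city, another country
    print(f"{km:>4} km of fibre: {round_trip_ms(km):.3f} ms round trip")
# Output: 1 km costs about 0.01 ms, 50 km about 0.5 ms, and 300 km about
# 3 ms round trip, before any switching or routing delay is added.
```

In a market where trades are won or lost in fractions of a millisecond, even a modest move out of the city would have been a real handicap.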
