Data centres are increasingly vulnerable to extreme weather events, and resiliency remains paramount. Can it be achieved sustainably?
On 19 July, as the UK faced record-high temperatures, Google Cloud’s data centres in London suffered cooling failures, resulting in outages and connectivity issues. Oracle’s data centre, too, was forced into a protective shutdown due to what the company described as “unseasonable temperatures”.
As global temperatures rise, the changing climate threatens the uninterrupted services of data centres. In a recent survey of data centre operators by the Uptime Institute, almost half (45%) of respondents said they had experienced an extreme weather event which threatened their continuous operation. Further, nearly one in 10 (8.8%) suffered an outage or significant service disruption as a result, making extreme weather one of the top causes of outages or disruption.
The number of data centre outages is increasing globally year on year, but largely because more data centres are being built than ever before. According to IDC, in 2012 some 500,000 data centres were handling traffic worldwide. Today, there are 8 million.
“The industry is getting much bigger and certain companies are becoming more powerful,” observes Andy Lawrence, a founding member and the executive director of research for Uptime Institute Intelligence, an organisation that analyses and explains trends shaping the critical infrastructure industry. “When they fail, more fails.” He notes that we’re all becoming much more dependent on data centres, which means when one does fail, it has a wider-reaching impact.
A quarter of respondents in Uptime’s survey indicated that their most recent outage had cost more than $1m (£0.9m) in direct and indirect costs, with a further 45% reporting that their most recent outage cost between $100,000 and $1m.
Data centres themselves are notoriously bad for the environment. They have the same carbon footprint as the aviation industry and are set to account for 3.2% of the total worldwide carbon emissions by 2025, while consuming a fifth of global electricity.
Consequently, efforts are focused on how data centres can meet the demands of digitisation and create infrastructure resilience, while having as little impact as possible on the environment. These are five of the most popular solutions.
Waste heat utilisation
Across Europe, tech companies are experimenting with waste heat recovery in their data centres. Meta has been reusing heat from data centres to warm 6,900 homes in Denmark. Microsoft powered a data centre in Finland with carbon-free energy and recycled the waste heat to nearby homes and businesses. Energy efficiency agency Codema partnered with an Amazon data centre in Ireland to capture the waste heat for use in homes and council buildings. And in Sweden, a project called Stockholm Data Parks is running in partnership with the city’s government, the local heating and cooling agency, and multiple data centres with the goal of heating 10% of the city by 2035.
“Data centres in Germany have evolved from being enemies of the state to now becoming one of the heat sources,” says Stefan Mink, head of TechOps hosting at IONOS, who has been in charge of the planning, building and running of 20 data centres in Europe and the US. “It becomes a circular economy where the data centres are using the energy, but then also providing energy in terms of heat use.” A 2017 white paper from the Alliance for the Strengthening of Digital Infrastructures in Germany noted that the more than 13 billion kilowatt-hours of electricity converted into heat in Germany’s data centres would, if reused, have met the energy needs of Berlin.
In 2019, an investment analysis of waste heat from data centres showed that reusing waste heat was financially viable and could deliver a positive return on investment. Further, by easing pressure on the main grid, the practice would ultimately help make data centres themselves less prone to outages.
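The economics here come down to a simple payback calculation: how many years of heat sales does it take to recoup the retrofit. As a back-of-envelope sketch only (every figure below is an invented assumption, not a number from the 2019 analysis):

```python
# Back-of-envelope payback sketch for a waste-heat-recovery retrofit.
# All figures are illustrative assumptions, not data from the study above.

def payback_years(capex: float, heat_sold_mwh_per_year: float,
                  price_per_mwh: float, opex_per_year: float) -> float:
    """Years until the heat-recovery retrofit pays for itself."""
    annual_margin = heat_sold_mwh_per_year * price_per_mwh - opex_per_year
    if annual_margin <= 0:
        raise ValueError("retrofit never pays back at these rates")
    return capex / annual_margin

# Assumed: a €2m retrofit, 20,000 MWh of heat sold yearly at €30/MWh,
# and €150k/year spent on pumps and maintenance.
years = payback_years(2_000_000, 20_000, 30, 150_000)
print(f"payback in ~{years:.1f} years")  # → payback in ~4.4 years
```

The same function makes Daan Terpstra's point about hardware refresh cycles concrete: if payback lands inside the five-to-seven-year refresh window, the retrofit can ride along with an upgrade that was happening anyway.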
There is still a way to go before waste heat utilisation can go mainstream, however. Most data centres still use air-based cooling but because air isn’t an efficient medium to transport heat, the consumer of the captured heat needs to be located close to the data centre. Added to this, the existing infrastructure would need an upgrade. “Capturing and reusing heat does require a full overhaul of your entire facilities while [other options] may be less invasive to your hardware setup,” says Daan Terpstra, executive chairman of the Sustainable Digital Infrastructure Alliance. “But a typical hardware refreshment cycle of all data centres being somewhere between five to seven years, I think this is an ideal moment to start plotting the chart and placing this at the top of the list.”
Liquid cooling

With the demand for data rising exponentially, data centres need a large amount of energy and electricity to stay running – and cool. Specialised computing equipment can emit large amounts of heat and it’s important to regulate this heat to keep the data centre functioning.
Traditionally, this was done by creating almost sub-zero, freezer-like conditions for the equipment. But in recent years, it’s been understood that data centres are most efficient at ambient temperatures of 18-27C. As recently as five years ago, 40% of the total energy consumed by data centres was going towards cooling the IT equipment, although that currently stands at approximately 10%.
While we’re currently in the era of air-based cooling, experts agree that liquid cooling – where heat from the equipment is transferred to a liquid and siphoned away – is far more energy-efficient. “Air-based cooling pushes the hot air out, you exhaust it,” says Lawrence. “And that’s wasted energy.”
With the opportunity to transfer waste heat from data centres and use it as an energy source, liquid cooling becomes an important technology. Even in insulated pipes, hot air can’t travel very far before becoming useless. Hot liquid, on the other hand, is much more transportable. “The other thing about direct liquid cooling is it uses very little water,” says Lawrence. “It’s quite innovation-proof in the sense that however hot it gets over the next two decades, it will be easy to use it.”
Clean energy

Large technology companies such as Microsoft, Google and Meta have set aggressive renewable energy targets for their data centres. Meta, which has more than 20 data centres in operation around the world, committed to 100% renewable energy in 2011, followed by Apple, Google and Amazon.
Microsoft has pledged to be carbon negative by 2030 and has committed to removing all the carbon the company has emitted, either directly or by electrical consumption, since it was founded in 1975, by 2050. A blog post on the company’s website states: “To reach this, data centres must be part of the solution for broad decarbonisation.” Microsoft currently manages 200 data centres and is on pace to build between 50 and 100 new data centres each year.
Still, for many operators the path to net zero runs through buying offsets, which means many tech giants are effectively still using fossil fuels. That situation may change, and quickly, many experts believe, in part due to societal pressure and upcoming legislation. “Based on the current social and economic climate in continental Europe and the UK, sustainability will become a licence to operate,” says Terpstra.
Artificial intelligence

AI is one of the most cost-effective and scalable ways to improve energy efficiency in data centres. In 2018, Google and DeepMind jointly developed an AI-powered recommendation system to directly control data centre cooling, resulting in consistent energy savings of 30% on average, the company claimed.
AI can bring much more than energy savings – and therefore, cost-effectiveness. There’s also resiliency. Alibaba Cloud deployed machine learning-based temperature alert systems in its global data centres. “We took hundreds of temperature sensors’ time series monitoring data, using an ensemble graph model to quickly and precisely identify a temperature event due to cooling facility faults,” a company engineer told DatacenterDynamics. “It generated alerts much further in advance and provided the data centre operation team precious time to respond to the fault, reducing failure impact.”
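Alibaba has not published its model, but the underlying idea – flag a sensor whose readings climb far outside their recent baseline – can be sketched in a few lines for a single sensor. Everything below, from the window size to the sensor values, is an invented illustration, not Alibaba’s system:

```python
# Minimal single-sensor temperature alert: flag readings that deviate
# sharply from a rolling baseline (a stand-in for the ensemble model
# described above; all thresholds and data here are invented).
from collections import deque
from statistics import mean, stdev

def temperature_alerts(readings, window=30, threshold=3.0):
    """Return indices of readings that would raise an alert."""
    history = deque(maxlen=window)
    alerts = []
    for i, t in enumerate(readings):
        if len(history) >= window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and (t - mu) / sigma > threshold:
                alerts.append(i)  # climbing abnormally fast vs baseline
        history.append(t)
    return alerts

# Stable ~22°C readings, then a sudden climb as if a cooling unit failed.
readings = [22.0 + 0.1 * (i % 3) for i in range(60)] + [24.0, 26.0, 29.0]
print(temperature_alerts(readings))  # → [60, 61, 62]
```

A production system would fuse hundreds of sensors and model their relationships, but the payoff is the same as in the quote: the alert fires while the temperature is still climbing, buying the operations team time to respond.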
Cooling and predictive maintenance are the most cited use cases for data centre AI. Microsoft is developing an AI system to analyse data and generate alerts to “prevent or mitigate the impact of safety incidents” and Meta is investigating ways that AI can anticipate how its data centres will operate under “extreme environmental conditions”.
Microgrids and hydrogen

Most data centres have multiple sources of power so that if one source fails or goes down, another can keep it functioning. Resiliency has always been a primary concern for data centre operators and while the threats and the solutions might be evolving, the ability of a data centre to withstand failures cost-effectively remains paramount.
Microgrids are increasingly seen as an excellent backup solution for data centres. A microgrid is an autonomous local energy grid that allows an operator to generate its own electricity, meaning it isn’t dependent on the traditional grid. Microgrids can not only keep the data centre’s power on during grid outages but also store electricity and sell it back to the grid during peak demand. “So many outages are happening that any critical facility – whether it’s a hospital or a data centre – is thinking about how to make sure they’re able to run if the grid goes down, not just for an hour or two but potentially for days or weeks,” says Jayesh Goyal, chief revenue officer at Enchanted Rock, a company contracted by Microsoft to develop California’s largest microgrid. The microgrid will use renewable natural gas and provide Microsoft’s San Jose data centre with backup power.
What makes microgrids especially noteworthy, observes Goyal, is that you can choose how you want to power them. Renewables, natural gas or fuel cells: the choice is yours, constrained only by cost and space. Natural gas is a popular choice for microgrids because of its accessibility and lower environmental footprint. But what’s exciting to many experts is the opportunity to bring hydrogen fuel cells into the picture.
In 2020, Microsoft and Power Innovations powered a row of data centre servers for 48 consecutive hours using hydrogen fuel cells with a first-of-its-kind hydrogen generator. Hydrogen is described as a clean fuel because it only produces water as a by-product. But hydrogen only occurs naturally in compound form and the cost and technology of separating it from other elements have been prohibitive. This status has begun to change, though, and as it does so, generators and microgrids fuelled by hydrogen start to look like a real possibility.
In Terpstra’s view, for hydrogen fuel cells to be scalable they need to be used for more than backup generators and microgrids. “The business-case calculations I’ve seen mean that the investment cost of setting up hydrogen backups versus the number of times you would need it are completely out of balance. The runtime on backups is too little, compared to the investment costs.” He judges that the only way it becomes cost-effective with a positive return on investment is to build a data centre that is fully powered by hydrogen fuel cells.
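Terpstra’s objection is easy to check on the back of an envelope: a backup asset that runs only a few hours a year carries an enormous cost per useful hour. With invented figures (none from any of the companies named above):

```python
# Rough illustration of the backup-economics argument: spread the capital
# cost of a hydrogen plant over its actual runtime. Figures are invented
# assumptions, not industry data.

def cost_per_backup_hour(capex: float, lifetime_years: float,
                         runtime_hours_per_year: float) -> float:
    return capex / (lifetime_years * runtime_hours_per_year)

# Assumed: a €5m hydrogen backup plant, 10-year life, grid down 10 h/year.
print(f"€{cost_per_backup_hour(5_000_000, 10, 10):,.0f} per backup hour")

# Fully powering the facility instead (8,760 h/year) spreads the same
# investment over hundreds of times more runtime.
print(f"€{cost_per_backup_hour(5_000_000, 10, 8760):,.0f} per hour full-time")
```

Under these assumptions the per-hour cost falls by nearly three orders of magnitude when the plant runs full-time, which is exactly why Terpstra argues hydrogen only pencils out as a primary power source.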
Underwater data centres
In 2018, Microsoft ran Project Natick, dropping a data centre containing 855 servers 35m below the sea near the Orkney Islands. The aim was to insulate it from temperature fluctuations and test whether underwater data centres could be reliable and practical while using energy sustainably.
Two years later, in 2020, the data centre was retrieved, and only eight servers were down. The company said the average on land in the same time frame would have been 64. Subsea Cloud, which plans to start operating an underwater data centre off the US West Coast before the end of this year, said underwater data centre construction is cheaper and would reduce carbon emissions by 40%.
In an attempt to meet their stated targets, Microsoft and other hyperscale companies are experimenting with ways to make data centres more sustainable. While this is to be lauded, many of these experiments are also impractical in terms of cost and scalability, says Terpstra.
“We’re running out of time. It’s super cool to have an underwater data centre, yes, but there are so many other, less investment-heavy, less engineering-heavy solutions possible that would result in the same effect by looking at the reliability and climate impact from a holistic design.”
Heat recovery from existing data centres, more clean energy sources and easily implemented AI: things that move the needle now – and that will continue to create an impact as the infrastructure improves.