Microsoft hails success of its undersea data center experiment—and says it could have implications on dry land, too
That’s one of several problems Microsoft is attempting to solve in Project Natick, an audacious scheme that aims to make undersea data centers a thing. In 2015, the company dunked a prototype off the Californian coast for 105 days, to check that the concept was feasible. Then, in 2018, it launched a new phase of the project by sinking a data center to the seabed off Scotland’s Orkney Islands.
The point of the Scottish experiment was to see if the concept was “logistically, environmentally and economically practical.” On Monday, Microsoft reported back after two years to say the data center has been retrieved and yes, it is practical.
What’s more, the IT giant reckons the experiment could have big implications for data centers on dry land, too.
Briny things first, though. According to a Monday blog post from Microsoft, the consistently cool underwater temperatures made it possible to use similar heat-exchange plumbing to the kind found on submarines.
The team also placed the data center—dubbed “Northern Isles”—at a renewable-energy test site where it could be powered by experimental tidal turbines and wave energy converters, along with wind and solar power from the local grid. In the future, Microsoft reckons it could make sense to co-locate such underwater data centers with offshore wind farms.
But the big revelation from this latest phase of Project Natick is about the advantages of designing a data center that is completely sealed and out of harm’s way—Northern Isles is not made to be entered by engineers looking to fix broken components, as standard data centers are.
“The team hypothesized that a sealed container on the ocean floor could provide ways to improve the overall reliability of data centers. On land, corrosion from oxygen and humidity, temperature fluctuations and bumps and jostles from people who replace broken components are all variables that can contribute to equipment failure,” Microsoft’s post read. “The Northern Isles deployment confirmed their hypothesis, which could have implications for data centers on land.”
Remarkably, it turned out that Northern Isles’ servers were eight times more reliable than those in standard data centers.
“I have an economic model that says if I lose so many servers per unit of time, I’m at least at parity with land,” project lead Ben Cutler said in the post. “We are considerably better than that.”
It’s not yet entirely clear why the underwater servers are quite so reliable, compared with their land-bound alternatives. The team thinks it’s down to the use of dry nitrogen rather than corrosive oxygen as the servers’ atmosphere, plus the lack of people who bump into things accidentally, but they’ve sent some failed servers and related cables—and some air samples—back to Microsoft headquarters in Redmond, Washington, to check.
There’s another major benefit to this model: it gives Microsoft a potential boost in the “edge computing” trend, where data centers are deployed closer to the customers they serve, for efficiency’s sake. As the firm noted in its post, over half the world’s population lives on or near coastlines, so undersea data centers might improve their online experience.
“We are populating the globe with edge devices, large and small,” William Chappell of Microsoft’s Azure cloud team said in the post. “To learn how to make data centers reliable enough not to need human touch is a dream of ours.”
Microsoft plans to move to 100% renewable energy by 2025. Google says it will hit that same target by 2030, though it claimed on Monday that it has already retroactively offset all its carbon emissions since the company’s 1998 founding. Facebook says it will meet its 100%-renewable-energy target this year.