With demand for data centers expected to rise significantly in the foreseeable future, tech firms providing cloud and distributed computing services are increasingly looking for swift, cost-effective and environmentally friendly ways to deploy and maintain their hardware. Microsoft, one of the biggest players in this field, experimented with submerging a data center underwater as part of Project Natick, making its hardware physically inaccessible to humans in a bid to improve reliability and compare performance with on-land deployments. After two years, the 40 ft long steel vessel housing 864 servers on 12 racks has been brought back to the surface, with Microsoft finding the solution to be feasible, reliable and energy-efficient.
Back in June 2018, Microsoft launched Project Natick, which saw a 40 ft long steel tube filled with dry nitrogen submerged 117 ft underwater on the seafloor off Scotland’s Orkney Islands. Powered by wind energy from a nearby grid, the experiment was designed to study the feasibility of an underwater data center that could offer lower latency to nearby coastal populations with significantly reduced deployment time compared to traditional installations.
After two years, the sealed container, covered in a thin coat of algae, barnacles and sea anemones, has now been brought up to the surface, power-washed, removed from its ballast-filled triangular base and transported to the mainland for detailed analysis.
In its findings, Microsoft revealed the underwater data center to be eight times more reliable than a duplicate setup deployed on land. The chemically inert dry-nitrogen atmosphere, free of oxygen and humidity, is said to have protected the servers from corrosion, alongside other factors such as stable temperatures and the absence of vibrations and bumps caused by human activity.
The servers, a handful of which failed, are now being removed by Project Natick’s researchers and shipped to Redmond for detailed analysis to determine the cause(s) behind their higher reliability over traditional deployments.
Used to conduct research on climate change, rainfall forecasting in sub-Saharan Africa and infectious bacteria, Project Natick was also involved in finding ways to detect and treat cancer, and took part in Folding@home’s initiative to better understand Covid-19.
Instead of relying on Microsoft’s Azure platform, this distributed computing project ran on generic servers with processing capability similar to that of “several thousand high-end personal computers.”