Tuesday, August 17, 2021

Microsoft’s Underwater Datacenter: Future of Cloud and Edge Computing

 

Underwater data centers for cloud and edge computing have become a hot topic among companies investing in cloud infrastructure after Microsoft found them reliable, practical, and energy sustainable. The project, named "Natick", has drawn the attention of research communities and tech companies alike. An underwater data center serves the main objectives of edge computing: because undersea data centers can be geographically distributed and placed closer to users, they give quicker response times and reduce data latency. Server failure and security, two of the main concerns in this field, can also be addressed by an undersea data center.

Edge computing is a distributed computing model in which computation and data storage are brought closer to the data sources. This improves response times and conserves bandwidth. Rather than a single technology, it is an architecture: a form of distributed computing that is sensitive to topology and location. Implementing this model depends on smaller data centers located closer to users, so instead of a large warehouse data center, a smaller underwater data center can be a reliable solution for edge computing as well as for the cloud. About 50% of the world's population lives within 200 km of a coastline [1], and underwater data centers can be deployed quickly in coastal areas around the world.
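To put rough numbers on the latency argument, the short sketch below estimates pure propagation delay over optical fibre for a coastal user roughly 200 km from an offshore data center versus a user reaching a distant inland facility. The fibre speed (about two-thirds of the speed of light) and the 2,000 km inland distance are illustrative assumptions, not figures from Project Natick.

```python
# Back-of-the-envelope estimate of propagation delay over optical fibre,
# illustrating why siting datacenters near coastal populations cuts latency.
# Assumptions (not from the article): signals travel through fibre at roughly
# two-thirds the speed of light in vacuum, over a direct route.

SPEED_OF_LIGHT_VACUUM_KM_S = 300_000                 # km/s, rounded
FIBRE_SPEED_KM_S = SPEED_OF_LIGHT_VACUUM_KM_S * 2 / 3

def one_way_delay_ms(distance_km: float) -> float:
    """Propagation delay in milliseconds for a straight fibre run."""
    return distance_km / FIBRE_SPEED_KM_S * 1000

# A coastal user ~200 km from an offshore datacenter vs. a user reaching a
# distant inland warehouse datacenter ~2,000 km away (illustrative figures).
for distance in (200, 2_000):
    print(f"{distance:>5} km  ->  {one_way_delay_ms(distance):.1f} ms one-way, "
          f"{2 * one_way_delay_ms(distance):.1f} ms round trip")
```

Real-world latency is higher than these figures because of routing, queueing, and peering, but shorter physical paths shrink that overhead as well.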

                                    Figure 1: Project Natick Northern Isles Data Center

The underwater data center is a revolutionary model that places servers under the sea instead of in land-based data centers. A research team at Microsoft implemented this model successfully in practice and found it more efficient, more economical, eco-friendly, and effectively zero-carbon, making it well suited as a cost-effective platform for edge computing.

The idea for an underwater data center first surfaced at Microsoft in 2014 during ThinkWeek, a gathering at which staff members explore novel concepts. It was viewed as a potential means of saving energy while offering coastal populations lightning-fast cloud services. Microsoft launched Project Natick to implement this idea, in line with its aim of becoming a carbon-negative technology company by 2030. The Natick Phase 1 vessel was stationed off the coast of California from August through November of 2015 [2]. To withstand pressure and the forces of nature, the team created a small, self-sufficient cylindrical container. To monitor the container from their offices, researchers also installed sensors and cameras, capturing information on the temperature, humidity, power consumption, and speed of the system. During this phase the team demonstrated that data centers can be set up and run in an underwater environment. In Phase 2, Microsoft submerged a shipping-container-sized data center in the UK for two years beginning in June 2018 [3]. During this time, the Project Natick team tested and tracked the servers' dependability and performance. Marine experts retrieved the vessel, which housed the servers, storage, and networking equipment, from the seafloor off Scotland's Orkney Islands in July 2020; by then it was covered with algae, barnacles, and sea anemones. With the retrieval, a multi-year project that demonstrated the feasibility of underwater data centers from a logistical, environmental, and economic standpoint entered its final step.

Power-efficient datacenter architectures are possible thanks to the consistently cool water below the ocean surface; for instance, they can make use of heat-exchange plumbing of the kind used in submarines. The team also hypothesized that a sealed container on the ocean floor could reveal ways to boost datacenter dependability. On land, factors that can cause equipment failure include temperature changes, corrosion from oxygen and humidity, and jostling and bumps from workers replacing broken parts.

In 2015, more than half a mile from the shore, starfish, octopuses, crabs, and other Pacific Ocean animals discovered a temporary addition to the seafloor: a 38,000-pound container. At 10 feet by 7 feet, however, it is fairly small by ocean standards [4]. The datacenter inside, which packed the compute equivalent of 300 desktop PCs, was quieter than the shrimp investigating the ocean floor.

Table 1: Comparison between the two phases of Project Natick

                     Phase I          Phase II
Container size       10 x 7 feet      40 x 7 feet
Servers              300              864
Deployed in          105 days         90 days
Submerged for        5 months         24 months

Phase 2 of Project Natick's undersea data center packed 12 racks, 864 servers, and 27.5 PB of disk storage into a 40-foot-long container. It was connected to a 250 kW power supply and to the networking infrastructure of the neighboring Orkney Islands [5]. The Orkney Islands, off the northern coast of Scotland, have a tidal, solar, and wind power grid that is entirely renewable; Orkney's grid supplied the prototype data center while continuing to power the islands and the wider Scottish power system. Project Natick is an out-of-the-box design for serving the exponential increase in demand for cloud computing infrastructure close to population centers.
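Dividing the Phase 2 figures quoted above gives a feel for the density of the Northern Isles vessel. The per-rack and per-server numbers below are simple derived estimates, not values published by Microsoft, and the 250 kW feed is treated as an upper bound on IT load.

```python
# Per-rack and per-server breakdown of the Phase 2 (Northern Isles) figures
# quoted above: 12 racks, 864 servers, 27.5 PB of storage, a 250 kW feed.
# Simple division only; the article does not give these derived numbers.

RACKS = 12
SERVERS = 864
STORAGE_PB = 27.5
POWER_KW = 250

print(f"Servers per rack:        {SERVERS / RACKS:.0f}")                 # 72
print(f"Storage per server:      {STORAGE_PB * 1000 / SERVERS:.1f} TB")  # ~31.8 TB
print(f"Power per rack:          {POWER_KW / RACKS:.1f} kW")             # ~20.8 kW
print(f"Power budget per server: {POWER_KW * 1000 / SERVERS:.0f} W")     # ~289 W
```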

The data center sat 35 meters (117 feet) below the surface of the ocean and resembled a submarine in certain respects. Microsoft says that the data center was contracted for, built, and deployed in less than 90 days. It was designed to be smaller than a standard ISO shipping container and was transported on an 18-wheeler, ferry crossings included, from the site where it was assembled to Orkney. There it was mounted on a triangular base structure, towed out to sea, and lowered to the seafloor.

Early results indicate that the servers in the underwater data center experienced one-eighth as many failures as those in a control data center on land. Sending personnel down to the facility to repair a component when something fails would be challenging, though not impossible. According to the team's economic model, losing a certain number of servers per unit of time would still leave them at parity with land-based operation, and the observed failure rate is vastly better than that [6].

Microsoft credits the improved server reliability to the use of a 100% nitrogen atmosphere (at 1 atmosphere of pressure) rather than regular air, and to the absence of people jostling the machinery or disturbing the environment. It is also likely that the submerged data center experienced less temperature variation than a typical land-based data center, which may further help explain its reliability.

Table 2: Land-based datacenter vs underwater datacenter

                               Land-Based Datacenter          Underwater Datacenter
Atmosphere                     Normal air                     100% nitrogen
Power usage effectiveness      1.8                            1.07
Deployment duration            Minimum 365 days               90 days
Corrosiveness                  High                           Low
Floor space required           Comparatively large            Comparatively small
Server failure rate            5% - 11% (based on age)        0.6% - 1.37% (based on age)
Power supply                   Not renewable                  100% renewable
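As a rough cross-check, the ratios of the failure-rate ranges in Table 2 are consistent with the "one-eighth as many failures" claim. The snippet below simply divides the two ranges; it makes no assumptions about the underlying methodology (server ages or observation window), which the article does not give.

```python
# Cross-check of the "about 1/8th the failures" claim against the
# failure-rate ranges listed in Table 2.

land_rate = (0.05, 0.11)        # 5% - 11% failure frequency (land, based on age)
subsea_rate = (0.006, 0.0137)   # 0.6% - 1.37% (underwater, based on age)

low_ratio = land_rate[0] / subsea_rate[0]    # ~8.3x
high_ratio = land_rate[1] / subsea_rate[1]   # ~8.0x

print(f"Land/underwater failure ratio: {low_ratio:.1f}x to {high_ratio:.1f}x")
```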

 

Modern servers (and storage) must be kept cool. According to the US National Renewable Energy Laboratory (NREL), the majority of data centers run at a PUE (power usage effectiveness) of 1.8, meaning they draw 180 percent of the power actually needed by the servers, disks, and networking equipment. The extra 80 percent goes primarily to cooling the electronics, but also covers lighting, HVAC (heating, ventilation, and air conditioning), and other services needed by people on site. Highly efficient data centers can reach a PUE of 1.2, according to NREL, though this is rarely achieved in practice. The Project Natick Phase 2 data center, by contrast, ran at a PUE of 1.07: the only significant overhead was cooling, accomplished by pumping seawater through heat exchangers at the backs of the server racks, and the electricity supplied to the data center was entirely renewable.
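To see what the two PUE figures mean in practice, the sketch below converts them into overhead power for an assumed IT load. Treating the 250 kW Phase 2 feed as a pure IT load is a simplification made here purely for illustration.

```python
# What PUE means in kilowatt terms: PUE = total facility power / IT power.
# Uses the figures quoted above (PUE 1.8 for a typical land datacenter,
# 1.07 for Natick Phase 2) and, for illustration only, a 250 kW IT load.

def overhead_kw(it_load_kw: float, pue: float) -> float:
    """Non-IT power (cooling, lighting, etc.) implied by a given PUE."""
    return it_load_kw * (pue - 1.0)

IT_LOAD_KW = 250  # illustrative assumption, not a published IT-only figure

for label, pue in (("Typical land datacenter", 1.8), ("Natick Phase 2", 1.07)):
    print(f"{label}: PUE {pue} -> ~{overhead_kw(IT_LOAD_KW, pue):.0f} kW of "
          f"overhead on a {IT_LOAD_KW} kW IT load")
```

On that basis, the land-based case spends roughly 200 kW on overhead where the underwater vessel would spend under 20 kW.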

These data centers would need to be built to last for years without maintenance, which does not yet seem practical, especially if a leak develops. Placed only 35 meters below sea level, a data center might also be vulnerable to attacks by extremist environmentalist organizations or terrorists. On the other hand, by placing data centers undersea close to coastal communities, data would have only a short distance to travel, enabling fast and seamless web browsing, video streaming, and gaming. By improving the effectiveness and sustainability of its cloud infrastructure, Microsoft is pursuing its goal of becoming carbon-negative by 2030; the company has pledged to run all of its data centers entirely on renewable energy by 2025, drawing on onshore wind and solar as well as offshore tidal and wave power for electricity and cooling [7].

The Azure team at Microsoft is eager to serve customers who need to deploy and operate tactical or critical data centers anywhere in the world, and the demonstrated dependability of underwater data centers has sparked discussions with them. By locating underwater data centers close to coastal communities, the company hopes to cut data latency dramatically. Furthermore, data security, which Microsoft describes as a key objective of Azure, might be improved by an undersea data center; the project even included tests of post-quantum encryption technology.

To power the full range of Microsoft Azure cloud services, underwater data centers will need to be scaled up, which may mean connecting a dozen or more vessels the size of the Northern Isles. An image of a design called "Azure Natick Gen 3.12" has surfaced that indicates a significant increase in capacity: it depicts a 300-meter-long steel frame holding 12 data center cylinders identical to those of Phase 2.

                                        Figure 2: Surfaced Image ‘Azure Natick Gen 3.12’ 
 

The whole structure's capacity has been estimated at 5 MW, consistent with the 144 racks it might accommodate and a fairly low power density. As if that were not a bold enough goal, Microsoft has indicated that several 5 MW modules might be combined to construct underwater Azure availability zones [8].
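A back-of-the-envelope extrapolation, assuming each Gen 3.12 cylinder matches the Phase 2 vessel (864 servers, 27.5 PB of storage, a 250 kW feed), is sketched below. It is an illustration of the scale involved, not a published specification.

```python
# Illustrative scaling of the "Gen 3.12" concept: twelve cylinders assumed
# identical to the Phase 2 vessel, compared against the 5 MW capacity
# estimate quoted above.

CYLINDERS = 12
SERVERS_PER_CYLINDER = 864
STORAGE_PB_PER_CYLINDER = 27.5
FEED_KW_PER_CYLINDER = 250   # Phase 2 power feed, used here as an assumption

total_servers = CYLINDERS * SERVERS_PER_CYLINDER            # 10,368
total_storage_pb = CYLINDERS * STORAGE_PB_PER_CYLINDER      # 330 PB
scaled_feed_mw = CYLINDERS * FEED_KW_PER_CYLINDER / 1000    # 3.0 MW

print(f"Servers:  {total_servers:,}")
print(f"Storage:  {total_storage_pb:.0f} PB")
print(f"Power if each cylinder drew its Phase 2 feed: {scaled_feed_mw:.1f} MW "
      f"(vs. the 5 MW structure estimate)")
```

On these assumptions the 5 MW estimate leaves comfortable headroom over a simple twelve-fold scaling of the Phase 2 power feed.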

China's Hainan Province has announced plans to build the world's first commercial undersea data center, expected to be completed within five years. The project aims to set up 100 data centers in three phases during the 14th Five-Year Plan period (2021-2025) [9]. An initial survey of an area off the coast of Hainan has been conducted, and the center's general design has been completed.

China's plan for a commercial undersea data center and Microsoft's Azure Natick Gen 3.12 concept indicate that undersea data centers are a realistic, commercially feasible solution for today's edge computing and cloud needs. They are also a green technology, running on 100% renewable electricity, and they can be deployed off almost any coast to bring low-latency computing to the roughly 50% of the world's population that lives near a coastline.

For security, undersea data centers should be placed at greater depths so that they are safer from attack by extremist groups. Further research is also required to determine whether they affect the marine environment over long periods: a larger data center will emit a substantial amount of heat, which could harm sea life, so measures will be needed to manage this problem.


References

[1] John Roach (June 5, 2018). Under the sea, Microsoft tests a datacenter that’s quick to deploy, could provide internet connectivity for years. Retrieved from https://news.microsoft.com/features/under-the-sea-microsoft-tests-a-datacenter-thats-quick-to-deploy-could-provide-internet-connectivity-for-years/?utm_source=innovation-huib&utm_medium=feature

[2] Athima Chansanchai (February 1, 2016). Microsoft research project puts cloud in ocean for the first time. Retrieved from https://news.microsoft.com/features/microsoft-research-project-puts-cloud-in-ocean-for-the-first-time/

[3] John Roach (September 14, 2020). Microsoft finds underwater datacenters are reliable, practical and use energy sustainably. Retrieved from https://news.microsoft.com/innovation-stories/project-natick-underwater-datacenter/

[4] Athima Chansanchai (February 1, 2016). Microsoft research project puts cloud in ocean for the first time. Retrieved from https://news.microsoft.com/features/microsoft-research-project-puts-cloud-in-ocean-for-the-first-time/

[5] Ray (September 16, 2020). Undersea datacenter in our future? Retrieved from https://silvertonconsulting.com/blog/2020/09/16/undersea-datacenter-in-our-future/

[6] Rich Miller (September 14, 2020). Microsoft: Servers in Our Underwater Data Center Are Super-Reliable. Retrieved from https://datacenterfrontier.com/microsoft-servers-in-our-underwater-data-center-are-super-reliable/

[7] Mark Haranas (September 22, 2020). Microsoft’s Underwater Data Center A Success; Azure Ahead. Retrieved from https://www.crn.com/news/data-center/microsoft-s-underwater-data-center-a-success-azure-ahead

[8] Peter Judge (March 15, 2022). Projects in the US and China have shown that data centers underwater could be more efficient than those on land. But how do you build them? Retrieved from https://www.datacenterdynamics.com/en/analysis/building-underwater/

[9] News Correspondent (May 26, 2021). China starts building world's first commercial undersea data center. Retrieved from https://news.cgtn.com/news/2021-05-26/China-starts-building-world-s-first-commercial-undersea-data-center--10zOmgcZp6g/index.html


