User:Andreadiddioelle/sandbox

= Natick Project =

The Natick Project is an experiment carried out by Microsoft in collaboration with Naval Group. It tests underwater data centers to determine the feasibility of subsea deployment. So far the project has consisted of two phases. The first phase started in August 2015 and operated on the seafloor, approximately one kilometer off the Pacific coast of the United States. The second phase started in June 2018, when Microsoft placed a new data center in the North Sea near Scotland's Orkney Islands.

== The idea behind the experiment ==
Building land-based data centers is a long and expensive process. A large data center takes about two years to build. The construction duration is also influenced by variables such as land cost, energy cost and temperatures.

Cooling is an important aspect of data centers, which normally incurs significant costs to run the refrigeration systems that keep the computers inside from overheating. In addition, temperatures that fluctuate over a range of more than 100 °F in a single year can force a redesign of the cooling system. Data centers consume more than 2% of the total energy generated in the US, and air cooling costs there are estimated at $1.4 billion.

In addition, these facilities can consume a lot of water, because they often use evaporation to cool the air before blowing it over the servers. This can be a problem in drought-prone areas, such as California, or where a growing population depletes local groundwater, as is happening in many developing countries. Even where water is abundant, adding it to the air makes the electronic equipment more susceptible to corrosion.

About 40% of the world's population lives within 100 km of the coast. Coastal property is much more expensive than inland land, which is one of the main reasons why data centers are located away from towns; this distance also slows down data transfer.

Microsoft's project aims to address these issues through an alternative approach. Ben Cutler, a member of the Natick Project team, said that perhaps the most important aspect of an underwater data center is its short implementation and deployment time of about three months. Underwater data centers could also benefit customers by reducing latency (the time it takes a packet to travel from source to destination).
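The latency benefit of siting data centers near coastal populations can be illustrated with a rough, back-of-the-envelope calculation (the distances and the fiber propagation speed below are illustrative assumptions, not figures from the project):

```python
# Rough, illustrative estimate of one-way network latency over optical fiber.
# Light travels at roughly 200,000 km/s in fiber (about 2/3 of c in vacuum),
# so every extra 100 km of cable adds on the order of 0.5 ms one way.

SPEED_IN_FIBER_KM_PER_S = 200_000  # approximate propagation speed in fiber

def one_way_latency_ms(distance_km: float) -> float:
    """Propagation delay only; ignores routing, queuing and switching."""
    return distance_km / SPEED_IN_FIBER_KM_PER_S * 1000

# A subsea data center 100 km offshore vs. a land one 1,000 km inland:
print(one_way_latency_ms(100))    # 0.5 ms
print(one_way_latency_ms(1000))   # 5.0 ms
```

Real round-trip times are several times larger because of switching and routing overhead, but the propagation term scales with distance either way.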

With regard to cooling, Natick uses two different heat exchangers that use fresh water as their working fluid. Of course, the colder the surrounding ocean, the better this system works. To have access to cold sea water even during the summer or in the tropics, it is sufficient to place the pods deep enough: at 200 meters deep off the east coast of Florida, for example, the water remains below 59 °F throughout the year.
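The capacity of such a fresh-water loop follows from basic heat-transfer arithmetic. The sketch below uses assumed flow-rate and temperature figures, not Natick's actual design values:

```python
# Back-of-the-envelope model of a fresh-water cooling loop: the heat it can
# carry away is Q = m_dot * c_p * dT, where m_dot is the mass flow rate,
# c_p the specific heat of water, and dT the temperature rise of the water
# as it passes over the servers. All numbers here are illustrative.

CP_WATER = 4186.0  # specific heat of fresh water, J/(kg*K)

def heat_removed_kw(flow_kg_per_s: float, delta_t_k: float) -> float:
    """Heat carried by the fresh-water loop, in kilowatts."""
    return flow_kg_per_s * CP_WATER * delta_t_k / 1000

# e.g. 6 kg/s of water warming by 10 K carries away roughly 250 kW:
print(round(heat_removed_kw(6.0, 10.0)))  # 251
```

A colder ocean lets the external heat exchanger return the loop water at a lower temperature, which increases the usable dT for the same flow rate.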

Sean James, Director of Energy Research at Microsoft, says the approach could also reduce construction costs and make it easier to power these plants with renewable energy, such as tidal turbines, wave energy converters and offshore wind power, which would halve infrastructure and operating costs.

== The beginning of Natick ==
It all started in 2013, when Microsoft employee Sean James, who had served on a US Navy submarine, submitted a paper during ThinkWeek (a seven-day event during which Microsoft teams pitch ideas and think about the future of technology) describing an underwater data center powered by renewable ocean energy. Norm Whitaker read the paper and built a team to explore the idea. The team's first meeting was held in Redmond, Washington. The research group consisted of Eric Peterson, Spencer Fower, Norm Whitaker, Ben Cutler and Jeff Kramer. In late 2014, Microsoft kicked off Project Natick.

== Phase one ==
Phase one started in August 2015 with the prototype Natick pod, dubbed the "Leona Philpot" (named for an Xbox game character). The team left the vessel under water for 105 days at just 11 meters depth in the Pacific near San Luis Obispo, California, where the water ranged between 57 and 64 °F. It weighed 38,000 pounds (17 tons). For its construction and shape, the engineers took inspiration from submarines. They built a rounded vessel because nature attacks sharp angles and ridges: a round tank with few edges and angles is less likely to be damaged. The researchers noted that marine life adapted quickly to the presence of the tank and had even begun to inhabit the system. The round shape also allows the tank to better withstand pressure, preventing deformation and damage to the equipment.

The data center includes servers, a cooling and heat-exchange system, and sensors. The servers are placed in a rack. Eric Peterson specifies that he and his team used standard servers adapted for the marine environment. On the outside of the rack is the cooling system, which is essential to prevent the servers from overheating; the electronic controls are also outside the rack. The whole assembly is enclosed in a sealed tank measuring 10 feet by 7 feet (about 3 m by 2 m). Outside the tank is the heat exchanger, which evacuates the heat and is itself cooled by ocean water.

The team showed that it is possible to keep the submerged computers at temperatures at least as cold as mechanical cooling can achieve, and with even lower energy overhead than the free-air approach (just 3%). That energy-overhead value was lower than that of any production or experimental data center the team was aware of. Inside the pod the atmosphere was oxygen-free, and all water vapor and dust had been removed. That made for a very benign environment for the electronics, minimizing problems with heat dissipation and connector corrosion.
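An energy overhead of this kind is usually expressed as Power Usage Effectiveness (PUE), the ratio of total facility power to the power consumed by the IT equipment alone. The sketch below illustrates the relationship with assumed power figures; it is not Microsoft's published methodology:

```python
# Power Usage Effectiveness (PUE): total facility power divided by the power
# used by the IT equipment itself. A perfect facility would score 1.0;
# every watt spent on cooling, power conversion, etc. pushes it above 1.0.

def pue(it_power_kw: float, overhead_power_kw: float) -> float:
    """PUE = (IT power + overhead power) / IT power."""
    return (it_power_kw + overhead_power_kw) / it_power_kw

# A 3% energy overhead, as reported for the Natick pod, corresponds to a
# PUE of 1.03 (here shown with a hypothetical 100 kW IT load):
print(pue(100.0, 3.0))  # 1.03
```

Typical land-based data centers historically operate well above this figure, which is why the 3% result stood out.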

The researchers monitored the container from their offices in Building 99 on Microsoft's Redmond campus. Using cameras and other sensors, they recorded data such as temperature, humidity, the amount of power used by the system, and even the speed of the current, in order to better understand the environment in which the data center sat. During the experiment a diver went down monthly to check on the vessel.

After a very successful series of tests, the Leona Philpot was lifted out of the water and brought back to Redmond for analysis and refitting.

In this first phase, the Microsoft researchers said they studied the impact their computing containers might have on fragile underwater environments. They used acoustic sensors to determine if the spinning drives and fans inside the steel container could be heard in the surrounding water. What they found is that the clicking of the shrimp that swam next to the system drowned out any noise created by the container.

One aspect of the project with the most obvious potential is harvesting electricity from the movement of seawater. This could mean that no new energy is added to the ocean and, as a result, there is no overall heating, the researchers asserted. In their early experiment the Microsoft engineers said they had measured an "extremely" small amount of local heating around the capsule. Since the results were promising, they decided to continue the project by building a new, bigger vessel.

== Phase two ==
For the second phase of this project, Microsoft decided to work with pioneers in marine energy and to use submarine technology, with the aim of building a self-sufficient underwater data center much larger than the first vessel. Naval Group was selected to lead design, fabrication and deployment. On 1 June 2018 the vessel, named "Northern Isles", was deployed at the European Marine Energy Centre (EMEC) in Scotland's Orkney Islands (approximately 58.93° N, 3.01° W), a test site for experimental tidal turbines and wave energy converters that generates more than enough electricity to supply the islands' residents with 100% renewable energy.

The data center receives electricity through a cable and requires approximately a quarter of a megawatt of power when operating at full capacity.

The partnership between Microsoft and EMEC is a step toward realizing Microsoft's vision of an energy self-sufficient data center. This would make it possible to bring cloud services to places with unreliable electricity, to deploy data centers that can quickly respond to market demand (for example during natural disasters or special events such as the World Cup) and, additionally, to eliminate costly backup generators.

This new data center, composed of twelve racks holding a total of 864 servers, is bigger than the previous vessel, measuring 12.2 m by 3.2 m. It was assembled in Brest, in the north of France. After being tested in France, it was moved on a flatbed truck to Scotland. Before being deployed on the seafloor it was attached to a ballast-filled triangular base.
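The published figures are internally consistent, as a quick sanity check shows (the per-server wattage is a derived average, not a number stated by Microsoft):

```python
# Cross-checking the Phase two figures: ~0.25 MW of power at full capacity,
# spread across 12 racks holding 864 servers in total.

TOTAL_POWER_W = 250_000   # roughly a quarter of a megawatt
SERVER_COUNT = 864
RACKS = 12

print(SERVER_COUNT // RACKS)                 # 72 servers per rack
print(round(TOTAL_POWER_W / SERVER_COUNT))   # ~289 W per server on average
```

An average budget of a few hundred watts per server is in line with ordinary rack-mounted hardware, supporting the statement that standard servers were used.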

The data center was towed out and partially submerged at sea; then a remotely operated vehicle retrieved a cable containing the fiber optics and power wiring, brought it to the surface, where it was checked and attached to the data center, which was then powered on. The most complex task was lowering the vessel to the seafloor; the marine crew used 10 winches, a crane, a gantry barge and a remotely operated vehicle. Once the vessel was on the seafloor, the chains were released and the winch cables were brought back to the surface. At that point, operational control moved to the shore station.

Natick operates like a standard land data center: the computers inside it can be used for machine learning to provide artificial intelligence to other applications, just as in any other Microsoft data center.

Microsoft's vision is for Natick data center deployments of up to 5 years, which is the anticipated lifespan of the computers contained within the vessel. After each 5-year deployment cycle, the vessel would be retrieved, reloaded with new computers, and redeployed. The target lifespan of a Natick data center is at least 20 years. After that, it is designed to be retrieved and recycled.