Microsoft Says It Has Tested Undersea Data Center
By Shirley Siluk / CRM Daily
Subsea fiber cables already help Microsoft keep data flowing between data centers and customers on either side of the Atlantic, but now the company is looking at putting the data centers themselves below the ocean's surface. Redmond revealed today that it successfully tested just such a data center -- code-named Project Natick -- off the Pacific coast for four months last year, and is now planning a larger follow-up deployment.

The underwater data center, a cylindrical container 10 feet long and 7 feet across, was planted on the sea floor about one kilometer (0.6 miles) offshore in August. It remained in operation until it was retrieved and brought back on land in November. The container was named Leona Philpot, after a character in the Xbox game Halo. Microsoft said the project name doesn't really mean anything, but added that Natick is the name of a town in Massachusetts.

Why an underwater data center? Microsoft envisions several potential benefits. Containerized data centers can be quickly deployed wherever they're needed, and placing them underwater can significantly reduce the cost of keeping servers cool and efficient.

What's more, with half of the world's population living within 120 miles of an ocean coastline, undersea data centers could help bring more computing power closer to where the people are, which could greatly cut lag times -- or latency -- in data traffic.
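To see why distance matters, here is a back-of-the-envelope sketch (not from the article) of propagation delay over optical fiber, where light travels at roughly two-thirds the speed it does in a vacuum, or about 200 kilometers per millisecond. The distances below are illustrative assumptions, not Microsoft's figures.

```python
# Illustrative estimate of one-way propagation delay over optical fiber.
# Light in fiber travels at ~2/3 c, i.e. roughly 200 km per millisecond.
# This ignores routing, queuing, and processing delays, which in
# practice add considerably more.

FIBER_SPEED_KM_PER_MS = 200.0

def one_way_latency_ms(distance_km: float) -> float:
    """Propagation-only delay in milliseconds for a given fiber distance."""
    return distance_km / FIBER_SPEED_KM_PER_MS

# A coastal user ~200 km (about 120 miles) from an offshore data center:
print(one_way_latency_ms(200))    # 1.0 ms
# The same user reaching a data center ~4,000 km inland or overseas:
print(one_way_latency_ms(4000))   # 20.0 ms
```

Even on this simplified model, moving a data center from thousands of kilometers away to just offshore cuts propagation delay by an order of magnitude, which is the effect the Natick team is pointing at.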

Inspired by Service on Submarines

Microsoft isn't the only tech company whose engineers have seen the potential for a new type of data center in the ocean. Google, for example, made headlines back in 2009 when the U.S. Patent and Trademark Office granted its request for a patent on a platform-based floating data center design. And its data center in Hamina, Finland, has been using seawater from the Bay of Finland for cooling for the past couple of years.

The idea of putting an entire, albeit small, data center below the ocean's surface came from a research paper co-written by Microsoft technology research program manager Sean James for ThinkWeek, a cross-company brainstorming effort. Ben Cutler, an engineering executive and "visionary entrepreneur" at Microsoft, happened upon the paper and saw the possibility of testing the idea in the real world.

"What helped me bridge the gap between data centers and underwater is that I'd seen how you can put sophisticated electronics under water, and keep it shielded from salt water," said James, who had spent three years serving on board submarines while in the Navy. "It goes through a very rigorous testing and design process. So I knew there was a way to do that."

20 Years Under the Sea?

While a typical data center requires hands-on human care for operations and maintenance, the Leona Philpot was designed as a "lights out" facility, which means it required minimal on-site physical support once it was deployed. Once the container was placed on the sea floor, the Project Natick team was able to monitor server operations remotely from their location in Building 99 at Microsoft's headquarters.

In addition to the servers, an assortment of cameras and other sensors was installed in and on the container to monitor everything from humidity and power consumption to underwater current speeds. The only hands-on intervention during the vessel's four months underwater came during a monthly visit from a diver to check on how things were going.

With the first test of an undersea data center now behind them, the Microsoft researchers are planning for the next phase, which could involve a container that's four times as big as the Leona Philpot and capable of holding 20 times the compute capacity.

The team eventually hopes to operate servers underwater for up to five years at a time before the equipment would have to be replaced, while the submersible data center itself might be able to last up to 20 years.

"The reality is that we always need to be pushing limits and try things out," said Christian Belady, Microsoft's general manager for data center strategy, planning and development, in a statement. "The learnings we get from this are invaluable and will in some way manifest into future designs."

Image Credit: Microsoft, Project Natick Undersea Data Center.

Posted: 2016-02-02 @ 1:15am PT
© Copyright 2018 NewsFactor Network. All rights reserved. Member of Accuserve Ad Network.