> 50% OF US LIVE NEAR THE COAST. WHY DOESN'T OUR DATA?
It's corrosive, it's expensive to get things to and from it for replacement, leaks destroy the hardware, it's not close to power generation, internet access needs cables because RF doesn't penetrate water, and everything is going to need water cooling, which is rather expensive.
Even imagining they completely solve the problems of seawater, leaks, etc., it's still amazing to think that you would want to do your server maintenance by pulling a data center out of the ocean on a boat and replacing hard drives and the like.
The only way this makes sense to me is if they can create something akin to the cargo container as a building block of a data center, where arbitrary compute and storage can plug into a greater complex.
I think they should be able to handle most of those issues; for example, they may be able to use wave power for power generation. Furthermore, I don't see how RF opacity is an issue, seeing as anyone running a data center over RF is criminally insane.
> everything is going to need water cooling, which is rather expensive
Why would everything need water cooling? I'd expect that something using water would be used to keep the air inside the unit cool, and then the cooling for the servers themselves would be ordinary air cooling.
Assuming they have ways around some of those issues, this could work out rather well. The important thing is that it's only a research project. Microsoft's research turns out some really awesome stuff, but plenty of it fails or gets cut. Who knows what'll happen to this, but it's a really interesting proposition!
For power you could use a small nuclear reactor, just like submarines and aircraft carriers. I've always thought it would be a fun exercise to take a decommissioned nuclear submarine and turn it into a floating datacenter.
Because 50% of us live near the coast, not past it. Maintaining anything in close association with an ocean is painful. Everything rusts. Even the stuff they say doesn't, does. Anything that moves ages at an accelerated rate. As soon as the slightest waves start, little salt crystals appear on every surface.
Water cooling significantly reduces running costs, which is why many DCs are switching to it. Over the long term you save money.
http://www.romanconcrete.com/docs/spillway/spillway.htm
If we want to find a way to build beneath the sea, that's a good place to start.
And why would "everything need water cooling, which is rather expensive"? If I remember correctly, every OVH DC server uses it.
A clever idea. People are wondering why such a thing might be useful, so let me advance a theory:
Latency.
Suppose you have a bunch of people somewhere, say, the US, and a bunch of other people somewhere else, say, China, and there's an ocean in between. If they need to work collaboratively on something, placing a datacenter in one country or the other yields asymmetric latency; one side gets a lot more of it.
If you can just plop a datacenter exactly at the midpoint, everyone wins. It needn't be the biggest datacenter ever, just one that can handle the latency-sensitive tasks. Neat project.
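To put rough numbers on that, here is a back-of-envelope sketch (my figures, not from the article): it assumes a ~9,900 km San Francisco-Shanghai great-circle distance and signals moving at roughly 2/3 the speed of light in fiber, and it ignores routing and switching overhead.

    # Back-of-envelope one-way propagation delay: server on one coast vs. at
    # the midpoint of the crossing. Distance and fiber speed are assumptions.
    C = 299_792.458            # speed of light in vacuum, km/s
    FIBER = 2.0 / 3.0 * C      # rough effective speed in optical fiber, km/s
    TOTAL_KM = 9_900           # assumed SF-Shanghai great-circle distance

    def one_way_ms(km):
        return km / FIBER * 1000.0

    # Server on one coast: the far side eats the whole ocean crossing (~50 ms),
    # while the near side pays almost nothing.
    print(f"one coast: {one_way_ms(0):.0f} ms vs {one_way_ms(TOTAL_KM):.0f} ms")

    # Server at the midpoint: both sides see roughly half the crossing (~25 ms).
    print(f"midpoint:  {one_way_ms(TOTAL_KM / 2):.0f} ms for each side")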
They pretty much say why on the project page: renewable energy [tidal, currents?] and cooling. The third, as you mention, is latency -- they want to be where the people are.
Plus, at depth, storms and typhoons don't affect things all that much; it's rather calm. So the main threat might be from saboteurs rather than natural disasters [besides the salty environment], because apart from a coast guard at the surface [which, if incoming contraband is any proxy, is pretty porous], you don't have a "police presence". So they'd have to rely heavily on monitoring systems.
That may be a part of it, but I also think a major component is cooling. Cooling accounts for 30-40% of the running cost of a datacenter, so by building with access to enough water they've essentially cut the running cost to about 2/3 that of conventional datacenters.
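A minimal sketch of that arithmetic, assuming seawater cooling makes the cooling line item close to free (pumps and heat exchangers would still cost something in practice):

    # If cooling is 30-40% of running cost and essentially free seawater
    # cooling removes most of it, roughly 60-70% of the cost remains.
    for cooling_share in (0.30, 0.40):
        remaining = 1.0 - cooling_share
        print(f"cooling at {cooling_share:.0%} of cost -> about {remaining:.0%} remains")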
My guess is they are more concerned with latency for real-time services used by millions in big cities, where land is expensive. Think VR servers.
Quick googling yields [1] datacenter land selling for more than $1 million per acre in SV and [2] Google's requirements for datacenter placement. The first four points listed are cheap electricity, carbon neutrality, lots of water, and large parcels of land; the 215 to 1200 acres mentioned in [2] would cost $240 million to $1.5 billion at the price quoted in [1]. Sealed containers anchored to the free sea floor, running on free wave energy and cooled with free sea water would be a very clever way to satisfy those requirements while staying close to the customers.
[1] http://www.datacenterknowledge.com/archives/2015/07/24/equin...
[2] http://www.datacenterknowledge.com/google-data-center-faq-pa...
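As ballpark arithmetic only (the per-acre price below is an assumption in line with the "more than $1 million per acre" figure from [1], not an exact quote):

    # Rough land cost for the parcel sizes Google mentions in [2], at an
    # assumed Silicon-Valley-style price of ~$1.2M per acre.
    PRICE_PER_ACRE = 1.2e6     # assumed; [1] only says "more than $1 million"
    for acres in (215, 1200):
        cost = acres * PRICE_PER_ACRE
        print(f"{acres:>5} acres -> ${cost / 1e6:,.0f} million")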
More likely it's to reconcile latency requirements with national borders. If you want to be close to a country to offer low latency, but political or legal or tax reasons mean you don't want to be in that country, then an ocean datacenter can get you close enough.
Speaking of effectiveness: I'm still amazed at how these modern Bootstrap-y landing pages, which in this particular case contain basically nothing more than text and a couple of pictures, can make the browser noticeably slower. I mean, yeah, it works, it can be built quickly, and since users are used to it, nobody really complains. If it were yet another tiny startup, I wouldn't even bother to comment.
But when it's the landing page for some futuristic Microsoft project, which is about doing significant work to achieve a relatively small improvement in something, and which is very likely to be non-environmentally-friendly… Really, just look at it. Enjoy how scrolling up and down makes your browser lag. And then look at the source. Just marvelous.
It's rendered entirely with JavaScript, has images (and GIFs), an iframe with a Bing map on it, some sort of plugin for "Azure Media Player", and it appears to also have some scrolling-based JavaScript. It's a good example of an obese website: http://idlewords.com/talks/website_obesity.htm
Cooling data centers with seawater is not new; Google has been doing it since at least 2011. There are many ways to mitigate the corrosive effects on the equipment. "For instance, Interxion uses materials like Cunifer and titanium, which will last approximately 10 years in seawater." [1]
There also are many tried and true ways to manage the heated water so that it is safe for the environment [1, 2].
"We pump that seawater through cooling modules - which are direct water to water heat exchange modules - and then the water is gravity fed from the cooling modules back out to a temporary building, which serves the purpose of mixing incoming seawater with outgoing return water, so when we return the water to the Gulf it is at a temperature more similar to the incoming water." [2].
I'm sure Microsoft was aware of all this before publishing their report. Good on them for thinking creatively.
[1]: http://www.datacenterjournal.com/industry-perspective-seawat...
[2]: http://www.datacenterdynamics.com/power-cooling/googles-finl...
The main advantage that I think is not mentioned is mitigating the price of real estate in expensive regions like Hong Kong, New York, San Francisco, etc.
Since the current post is more of an original source and is currently ranked higher on the front page, we treated that one as the dupe and merged its comments here.
I'm still not sure why one would want to place the data center at the bottom of the ocean. I would think that the disadvantage of not being able to perform maintenance for 5 years would be more significant -- can't they just build the data center near a water source and pump the water through pipes that run across a heat exchanger on the back of the servers?
Your idea is what's being done at the largest computing center in Switzerland, the CSCS. They pump water from the relatively deep Lake Lugano [1]. Europe's currently fastest supercomputer (world No. 7) is hosted there [2].
[1] http://www.cscs.ch/cscs/an_innovative_centre/cooling_system/...
[2] http://www.top500.org/lists/2015/11/
Environmentally, though, pumping heat into a river would not be acceptable, I would think, so the water source probably has to be the ocean, and running pipe across beachfront property might not be as doable.
> perform maintenance for 5 years would be more significant
They would be built the way Google builds: you don't waste money doing maintenance.
I think the main issue is that it's unusual, and as such the costs of complexity and legal come in.
I assume it is for space reasons. You could put down a data center in a harbor, for example. The amount of metal that would be required to scale this model to that size, however, seems excessive.
It seems like putting this in freshwater would help with a lot of issues. A data center in say Lake Huron gets you relatively low latency access to a lot of US & Canadian population. Putting it in an artificially dammed lake on a river gives you free-ish power & cooling (and probably population proximity as well, since river-side areas tend to be densely populated).
It would be pretty amazing to put servers undersea in the middle of an ocean for transaction efficiency reasons -- say between NYC and LON directly on the straight line path.
The crazy thing is there probably are financial market applications/HFT where having processing power exactly equidistant between two market centers with the most efficient path would make sense. (Running microwave instead of fiber, to get ~1c instead of the ~2/3c you get in fiber, would also be interesting, but there are different engineering challenges there.) Amazed Microsoft didn't mention this.
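A quick illustration of the propagation-speed point (distance and speeds are rough assumptions, not quoted figures):

    # One-way NYC-London delay over an assumed ~5,570 km great-circle path:
    # fiber at ~2/3 c versus a hypothetical near-c path (e.g. microwave or
    # hollow-core fiber along the same route).
    C = 299_792.458            # km/s
    DISTANCE_KM = 5_570        # assumed NYC-London great-circle distance
    fiber_ms = DISTANCE_KM / (2.0 / 3.0 * C) * 1000.0
    near_c_ms = DISTANCE_KM / C * 1000.0
    print(f"fiber (~2/3 c): {fiber_ms:.1f} ms one way")
    print(f"near-c path:    {near_c_ms:.1f} ms one way")
    print(f"saving:         {fiber_ms - near_c_ms:.1f} ms")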
One question: what happens when there's a tsunami? This is a particularly relevant question on the US west coast: http://www.newyorker.com/tech/elements/how-to-stay-safe-when...
A tsunami would have relatively little effect on a pod suspended/anchored to the sea floor. They only get violent when they reach the shallows and come onto land. It'd have more effect on the shore connection point where the data and power go into the ocean. These could be buried up to the shore and anything above ground secured in a sealed bunker, though that doesn't solve the problem of keeping the pods powered through the power outage after a tsunami.
I don't get it. If this is all about heat, I would think that putting the data center beside the ocean and then pumping seawater around to cool things would be far easier than sinking the entire kit. Even if they really really want to be underwater, I'd assume digging an artificial lake and pumping water in and out would be easier than dealing with an actual ocean.
At first I was concerned that there would be so much humidity inside of the capsule from the condensation caused by the temperature difference between the outside and the inside, but I see they've addressed that by replacing the atmosphere with Nitrogen.
I wonder if they've considered an inert fluid to immerse the computers in? If you can use something like Fluorinert, or even high-grade mineral oil, the vessel might not need to withstand such high crush forces, since the fluid in the capsule can be at the same pressure as the surrounding water.
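For a sense of the crush forces involved, a small sketch (the seawater density and depths are assumed; the article doesn't state a deployment depth):

    # External hydrostatic pressure on a sealed, air-filled hull rises by
    # roughly one atmosphere per ~10 m of seawater; a fluid-filled,
    # pressure-equalized capsule would largely avoid that differential.
    RHO = 1025.0               # seawater density, kg/m^3 (assumed)
    G = 9.81                   # gravitational acceleration, m/s^2
    for depth_m in (10, 100, 1000):
        pascals = RHO * G * depth_m
        atm = pascals / 101_325.0
        print(f"{depth_m:>5} m depth: ~{atm:.0f} atm above surface pressure")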
I do love this idea because they can start putting data centers along submarine cables. One in the middle of the Atlantic, between London and NYC, would be great for HFT traders.
I wouldn't be surprised if the NSA would love something like this too. A whole data center on cross-oceanic cables would provide a lot of infrastructure they could use to analyze traffic in real time.
> At first I was concerned that there would be so much humidity inside of the capsule from the condensation caused by the temperature difference between the outside and the inside, but I see they've addressed that by replacing the atmosphere with Nitrogen.
Whatever reason they have for using nitrogen (making the thing fireproof is one reason), avoiding condensation isn't it. You could do that just by using dry air and packing a little silica gel. It's not like you're going to introduce more moisture over time into a sealed system.
Data Sovereignty? Could this, for better or for worse, allow Microsoft to completely dictate the terms of how it stores and manages its data in international waters?
> 50% OF US LIVE NEAR THE COAST. WHY DOESN'T OUR DATA?
Regardless of the effort and results, I think they should reconsider putting an illogical and patently false statement in the header of this article in an attempt to gain interest in it.
Some of my data, or more specifically "my data", lives in my house, in servers in my garage. While the collaborative argument has some merit in proposing splitting latency differences, I think the vast majority of "our data" should end up living near where we are.
That results in a better question to ask, and that's "Why do people tend to put their data where they don't live?" Following, one may ask "Are there valid business models that can be created to help people put their data near where they live?"
When the USS Jimmy Carter [1] splices an intercept into an undersea fiber optic cable, it probably leaves something similar to this behind to locally filter the data and only uplink relevant data.
Makes you wonder what vendors the Navy uses and if MSFT is using the same.
[1] http://www.nytimes.com/2005/02/20/politics/new-nuclear-sub-i...