This is the future: using algorithms to optimize the use of renewable energy. Not only will it be lower-carbon, it will be cheaper. What's interesting is that they describe it working on forecasts (for wind and sun) instead of instantaneous renewable production. I wonder what the rationale for that was? Basing the algorithm on instantaneous information should be more accurate and thus give better savings, but maybe it varies too much to reliably run the loads they want.
Imagine when your fridge can do this: freeze extra cold when the sun is shining (or wind is blowing), don't run the compressor when it's not, only run the blower after you open the door to move that extra cold from the freezer, allow a slightly larger temperature range, and of course run as necessary to avoid spoilage. It's not a simple algorithm: it has to handle various timeframes, such as solar being a daily cycle, except there's less in winter and it can go for a week or more with very little (storm/overcast). Maybe it could also use a bit of "learning" like the Nest thermostats to optimize for predicted usage.
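A minimal sketch of what that fridge control loop might look like, assuming a made-up renewable-supply signal; all the thresholds and names here are illustrative, not any real appliance's API:

```python
# Hypothetical carbon-aware fridge controller: pre-cool when renewable supply
# is high, coast when it is low, and always override to avoid spoilage.

SAFE_MAX = 5.0   # degC: never let the fridge get warmer than this
SAFE_MIN = 1.0   # degC: never overcool the food compartment
TARGET = 4.0     # degC: normal setpoint

def compressor_on(temp_c: float, renewable_fraction: float) -> bool:
    """Decide whether to run the compressor right now.

    renewable_fraction: assumed 0.0-1.0 signal for how green the grid is.
    """
    if temp_c >= SAFE_MAX:        # spoilage override: always cool
        return True
    if temp_c <= SAFE_MIN:        # never go below the safe band
        return False
    if renewable_fraction > 0.6:  # green power: pre-cool toward the floor
        return True
    if renewable_fraction < 0.3:  # dirty power: coast within the band
        return False
    return temp_c > TARGET        # otherwise behave like a normal fridge
```

The multi-day weather horizon the comment mentions would sit on top of this, widening or narrowing the band based on the forecast.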
I know of one commercial product that sort of does this: the Zappi electric car charger. If you have grid-tied solar, it measures the current being fed back to the grid and adjusts the charging current to match. So if a cloud goes over your house, or you turn on a big appliance, the charger reduces the power to the car by the same amount. This maximizes the use of your own solar energy and minimizes the use of grid energy.

https://myenergi.com/product/zappi/
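Roughly, the control rule being described could be sketched like this (the current limits are illustrative assumptions, not Zappi's actual firmware logic):

```python
# Sketch of "export matching": set the EV charging current to whatever is
# currently being exported to the grid, clamped to the charger's limits.

MIN_CHARGE_A = 6.0    # EVs typically won't charge below ~6 A
MAX_CHARGE_A = 32.0   # assumed single-phase charger limit

def charge_current(grid_export_a: float) -> float:
    """Amps to send to the car, given measured export to the grid."""
    if grid_export_a < MIN_CHARGE_A:
        return 0.0   # pause charging rather than import from the grid
    return min(grid_export_a, MAX_CHARGE_A)
```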
I've been posting for years that an effective grid "battery" is internet connected refrigerators, water heaters, A/C, car chargers, etc., that only run when power is cheap, i.e. when solar/wind is providing excess power.
A great deal of our demand for electricity is elastic and shiftable, which would eliminate a huge chunk of the need for grid batteries. Glad to see this finally gaining some traction!
>> What's interesting is they describe it working on forecasts (for wind and sun) instead of instantaneous renewable production. I wonder what the rationale for that was? Basing the algorithm on instantaneous information should be more accurate and thus give better savings
The article says they are “shifting the timing of our compute tasks”, so if they think that there will be cheap electricity later in the day (because it’s going to be especially windy or something) it would make sense for them to schedule some of their heavy compute tasks at that time, rather than right now.
The future is nuclear power, and when companies buy hardware they use it at max performance around the clock because energy is cheap and does not depend on weather.
There’s a huge lie (by omission) about renewables: nobody has explained how to convert the world to 100% renewable energy without coal backup.
Optimizing for this is a perfect task to throw at a simple market, especially because actually reworking the software to take advantage of resources at different times will often require a decent amount of engineering work.
One way to do it would be to assign each job a value (which could be dynamic, e.g. it might grow as information becomes stale) and have jobs bid on compute power. You could make the value virtual.
Or you could use real money. This is the premise behind EC2's spot instances. So when power is abundant, your prices drop and the relevant jobs kick off.
Using real market prices makes sense, especially if you're renting out computing power; most customers will be happy to adjust workloads to save money.
Even if it's entirely internal, it's good to have a facility to "optimize for cost" and then report the savings. That's helpful to get the engineering resources devoted towards it, because "I saved $X" is a great bullet point to put in anyone's promotion packet or to base a bonus on.
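A toy version of that bidding idea, assuming each job carries a (bid, capacity cost) pair; this is just an illustration of the internal-market concept, not how EC2 spot actually clears:

```python
# Minimal sketch of the internal "market": when power becomes abundant,
# dispatch the highest-bidding jobs first until capacity runs out.
import heapq

def dispatch(jobs, capacity):
    """jobs: list of (bid, cost, name) tuples. Returns names scheduled,
    highest bid first, skipping jobs that don't fit remaining capacity."""
    # Max-heap by bid (negate because heapq is a min-heap).
    heap = [(-bid, cost, name) for bid, cost, name in jobs]
    heapq.heapify(heap)
    scheduled = []
    while heap and capacity > 0:
        _neg_bid, cost, name = heapq.heappop(heap)
        if cost <= capacity:
            scheduled.append(name)
            capacity -= cost
    return scheduled
```

With capacity 5, a bid-9 job costing 3 and a bid-5 job costing 2 both run, while a bid-1 job is left for the next abundant window.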
> One way to do it would be to assign various jobs a value. Or you could use real money.
It's not the value of the job's outcome that you're interested in, but rather its sensitivity to delay in executing it.
For example, preemptively converting YouTube videos to lower resolutions with optimal compression, to avoid having to do it in real time (when the video is played) with crappy compression (to be fast), is valuable for sure. It's just that it can be postponed for 24 hours without real impact. Executing a search for a single user is less valuable in terms of overall impact but much more latency-sensitive.
(You can think of value and latency-sensitivity as two independent dimensions.)
This idea helps save the planet for sure, but it requires cloud providers to build APIs that enable devs to switch from "here's the SSH to the server, do what you want with it" to a model where the devs instead say "here's a lambda function and its desired execution latency; please schedule it for me and let me know when the result is ready" ( https://en.wikipedia.org/wiki/Inversion_of_control )
Google was able to do that because it owns a large share of the jobs executed in its datacenters, so it could build this adaptive scheduling for its own jobs quickly, without necessarily going through a cloud-based API that inverts control of job scheduling.
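The deadline-oriented model described here can be sketched roughly like so, assuming a per-hour carbon-intensity forecast (the data format and numbers are assumptions, not Google's actual scheduler):

```python
# Sketch: given a job, a deadline, and a carbon-intensity forecast, pick the
# start hour that minimizes total carbon while still finishing on time.

def pick_start_hour(carbon_forecast, deadline_hour, duration_hours=1):
    """carbon_forecast: list of gCO2/kWh per hour, index 0 = now.
    Returns the start hour with the lowest total carbon such that the job
    (running duration_hours) completes by deadline_hour."""
    latest_start = deadline_hour - duration_hours
    best_start, best_carbon = 0, float("inf")
    for start in range(latest_start + 1):
        total = sum(carbon_forecast[start:start + duration_hours])
        if total < best_carbon:
            best_start, best_carbon = start, total
    return best_start
```

So a transcode job with a 24-hour deadline simply slides to the greenest window in the forecast, while a user-facing search (deadline "now") runs immediately.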
On a smaller scale, you could do what Low Tech Magazine [0] does and actually have downtime when sunlight is low. Since this doesn't happen too often and users can just save articles (with RSS, email newsletters, etc.), websites like this can just be powered by a single computer, solar cell, and small battery in the owner's Barcelona apartment. Thanks to small static pages and tiny dithered images, the site is almost always up.

The future doesn't always need to be as "webscale" as Google; sometimes, scaling down is the smart thing to do. The minimal approach of LTM is the technology equivalent of riding a bicycle (or electric velomobile [1]) to work instead of driving.

[0]: https://solar.lowtechmagazine.com

[1]: https://solar.lowtechmagazine.com/2012/10/electric-velomobil...
It's good to an extent, but if optimization for cost gets too intense then it will seek out the flaws in market rules. This will be true whether machines or humans are doing the optimizing.
I guess it's okay as long as the people making the rules have good monitoring and are watching out for weird exploits and fixing them. The flexibility to change the rules tends to be more common internally than externally where customers want more guarantees.
As we've seen, there also needs to be a balance between cost-optimization and preparedness. If the wind patterns don't match the prediction then you need to be ready for that.
Also, as we've seen with cryptocurrency, real money attracts theft. A human-adjusted credit system is better. In the real world, this looks like support having the discretion to forgive big bills. But to do that they need to know their customers. It's hard to automate.
Borg has concepts of quota and priority, which function as the internal market you're talking about: Verma et al., "Large-scale cluster management at Google with Borg".
Tangentially related, but in Australia we run a household utility company that operates on that same assumption: https://www.amberelectric.com.au/
Our hypothesis is that market signals combined with the right tools (a friendly app and home automation) can help households shift demand into less carbon-intensive periods. So far it's working pretty well.
Yes! Especially when you have multiple data centers participating in different locations. It might be cloudy in one place but not another, so jobs get re-routed accordingly.
I really hoped this was what EC2 spot instances would be, but it doesn't seem to work that way. My spot instances usually get terminated due to "no available capacity" without any major price movement.
It would also be pretty neat to integrate processing-power markets with the wholesale energy markets. Energy prices are quite volatile, and making load responsive to them would actually be quite helpful in stabilizing them.

I've wanted realtime pricing like that for a while; it seems to be becoming available again. I honestly thought that was what the advanced electricity meter roll-out was going to do, but it seems not. A more direct link from energy cost to the service price charged seems like a good thing in general.
This is a really good argument for carbon taxation to appropriately increase the cost of dirty energy. Send the correct price signal everywhere rather than making your own software do the equivalent of looking out the window at the weather and trying to decide if it's a sunny or rainy or windy or calm day, and thus whether solar or wind generation is making the grid cleaner, or whether those are likely offline and the grid is dirtier today.
Best thing. Then you incentivize a cleaner grid overall and you don’t even have to worry eventually about this kind of thing.
In particular a revenue neutral carbon tax with dividend should be politically as uncontroversial as it gets because it is also economically equitable. It's totally perplexing to me why these relatively low-hanging fruit solutions are not being pursued.
Exactly. It then also starts to make local storage (like a Powerwall) more interesting. When your PV generates a lot, prices will go down, so better to store the excess for times with higher prices.

And when you don't generate PV power but can still store, you also buy cheap to use when prices go up.
If there is also a dynamic price for using the grid, that usage will also spread.
Working out the optimal amount of spare capacity to allow time/location shifting of workloads, while minimizing carbon per unit of compute, seems like it must be a really difficult problem.
This Dell paper [0] suggests that 16% of the carbon over a typical server lifecycle comes from manufacture, so you probably don't want a server sitting there unused for 23 hours per day, since the overall carbon/compute ratio would be worse.

The post doesn't mention this metric, but it would be really nice to see something more detailed in time, especially with this overall efficiency of the server/datacentre lifecycle in mind, rather than just the energy consumed in use.

[0]: https://i.dell.com/sites/csdocuments/CorpComm_Docs/en/carbon...
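To see why utilization matters here, a rough amortization model: the embodied (manufacturing) carbon is fixed, so the fewer active hours it is spread over, the worse the carbon per unit of compute. All the specific numbers below are illustrative, not from the Dell paper:

```python
# Amortize fixed embodied carbon plus per-hour operational carbon over the
# hours the server actually spends computing.

def carbon_per_compute(embodied_kg, lifetime_hours, kg_per_active_hour,
                       duty_cycle):
    """kg CO2 per active compute-hour at a given duty cycle (0 < duty_cycle <= 1)."""
    active_hours = lifetime_hours * duty_cycle
    operational = kg_per_active_hour * active_hours
    return (embodied_kg + operational) / active_hours
```

With made-up figures (1000 kg embodied, 10,000-hour life, 0.05 kg/active-hour), a server busy 1 hour in 24 comes out far worse per compute-hour than one running flat out.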
Carbon consumed in building a server is sunk cost and would be paid independent of whether the server does any kind of carbon-footprint-aware load shifting.
Assuming the server is "sitting unused for 23 hours a day" is the wrong model for what this work changed. You're assuming the server could be running at 50% duty cycle vs. 100% duty cycle. It isn't: since we're talking about the batch load, there's a roughly fixed amount of low-priority work to be done, and doubling the CPU active-duty time allotted to the work doesn't get it done faster (the details are complicated, but that's the right model for what Google's describing here).

One should model the duty cycle as fixed relative to the processor (i.e. "this global datacenter architecture, over the course of its life, will do a fixed N units of computronium work on these batch tasks") and then ask whether that work should be done using coal to power the electrons, or wind.
It is ultimately, in a certain sense, a mathematical optimization problem: determine the optimal configuration of the entire infrastructure of additional power sources. Perhaps it is like finding the optimal set of cell-phone-tower locations, perhaps using k-means clustering. Furthermore, additional issues must be resolved, like compliance with legal regulations; the decision maker or engineering agent has preferences or desires to satisfy.
> for having spare capacity to allow time/location shifting
Part of that calculation should be the amount of compute capacity headroom you'd choose to have anyway even if you didn't care about carbon.
Compute demands can vary from one day to the next. Maybe tomorrow people uploaded 3 times as many YouTube videos as they did today. Maybe load varies based on day of the week or day of the month. To some extent, you can smooth that out by delaying jobs, but there are practical limits.
You also want some spare capacity just for safety. Efficient utilization is important, but things like performance regressions or spikes in demand can happen.

- Spawn nightly regressions when wind power starts to pick up, instead of at some arbitrary wall-clock time.

- Dispatch compute-heavy jobs during low-energy-cost times; dispatch IO-heavy or memory-limited jobs during high-cost times.
FWIW, for the build2 project we host our own CI servers on premises, and the power supply is supplemented by a solar array. We have configured our daily package-rebuild window to coincide with maximum solar output, so that on a sunny day it is all done using renewable energy.
Am I crazy or is this website capturing down-button clicks and ignoring them? I typically use down and up to slowly scroll as I read an article. This page is driving me nuts.
Seems like an intuitively good idea to me! It'd be great to see how effective this change was.
Regardless of this change, I wonder if they share their forecasted non-renewable energy needs with their energy supplier so that the energy supplier can prepare for changes to the expected base load.
Do any factories or other energy intensive operations do this?
It's probably not at the same level of granularity that Google is trying to accomplish here, but I believe that power-hungry commercial systems have tried to move to when power is cheapest for many years now.
Aluminum smelters in particular are extremely power-intensive and have long been run during off-peak times (or are built in areas with cheap, plentiful electricity, like near hydroelectric dams).
Still, I'd love to see this concept made a lot easier for the average consumer. Many people already have smart thermostats; why can't mine talk to my power company and over-heat/cool when the impact is lowest? Why can't my dishwasher run automatically when it would impact the world the least? Why can't my EV automatically charge when power is most available?
I know most of those things are possible, but they sure as hell aren't easy, and IMO they won't truly have an impact until they're on by default and don't require the user to do much of anything.
These things seem like they are easily doable, but we just need the different industries to work together to come up with ways to have all of this stuff interoperate.
It tends to work the other way around - at least in the markets I'm familiar with.
If you're a large consumer of energy and can turn that consumption on or off at short notice (on the order of seconds) then the grid operator will pay you to allow them to scale your consumption up or down.
The classic example of this is cold storage. If you have a warehouse full of freezers which need to be kept within a certain temperature threshold then it doesn't really matter when you run the freezers and you could switch off at several points during the day.
Having worked in a factory before, I can tell you that the factory called the electricity company before shutting down or starting up. It consumed a noticeable chunk of what the whole surrounding city was consuming.

Given how much electricity a datacenter consumes, Google surely must have a direct support contact at the electricity provider, and it had better work both ways, if it doesn't already.
Quick math: a datacenter is 60,000 servers, so 6 MW of consumption at 100 W per server (moderate load). That is about 1% of the peak output of a nuclear reactor. You bet the electric company wants to know when they need to adjust their reactors.
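Spelling out that arithmetic (the reactor size is an assumption; the commenter's 1% implies a ~600 MW reactor, while many units are closer to 1 GW):

```python
# Back-of-envelope check of the datacenter-vs-reactor comparison above.
servers = 60_000
watts_per_server = 100                              # moderate load, assumed

datacenter_mw = servers * watts_per_server / 1e6    # 6.0 MW

reactor_mw = 600                                    # assumed reactor output
share = datacenter_mw / reactor_mw                  # fraction of one reactor
```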
Seems like getting rid of the middleman at a basic, physical level. Power is available for very low cost at certain times. So let's time-shift computation that doesn't have to be done at a specific time! Really, it's just the same trick as time-shifting your EV charging and other power draws. It costs money to run battery banks and inverters. Let's just take them out of the process, where we can.
Great collab with Google! They've probably been the most serious corporation about getting all-renewable offsets for their operations... this is helping them reach the next milestone, where the renewable offsets are time-matched. It's a great example of being serious about this stuff rather than just greenwashing.
It sounds like this project is for their own operations. Have you guys thought about how to offer this as something closer to turnkey cloud-ops SaaS / an API? What kinds of abstractions would you present to developers building non-time-critical compute loads?
Could be a great differentiator for GCP vs. AWS (I have heard of some companies choosing GCP over AWS due to Google's green energy cloud). And for you guys, the only thing better than Google being a customer is all of Google's customers being your customers.
That looks like a good opportunity for an (at least tangentially related) big shout-out to your great electricityMap website ( https://www.electricitymap.org/map )!
I wish more countries were covered (in particular Switzerland), but I guess you depend on live data being provided in those countries.
Hey Martin, this is super cool! Congrats on the collab!
Would you mind commenting on what your tech stack is like? Looking at your github repo it seems like you're combining a lot of data sources. Can you comment on your approach?
Also, considering this collaboration, are you running on GCP?
While the green angle on this story is definitely good to highlight, I wonder if they're seeing any cost savings too?
Solar and wind on the grid increase supply, which should drive down the price per kWh (of course, the equation isn't quite that simple, since demand near human population centers is also highest during the day in most of the world).
I don't think the electric grid follows a traditional supply-and-demand curve. Every kilowatt used must be generated at the same time. Generation comes in tiers: cheap baseload (coal or nuclear, each with their issues), expensive peaker plants (nat. gas), and cheap but unpredictably intermittent renewables. Prices are highly regulated and agreed to in long-term contracts that take into account peak usage and minimal generation capacity. If demand increases, it is really expensive to supply until the level is enough to build a new baseload plant, but even those are expensive now.
Big users such as Google with its datacenters will of course negotiate their own electricity contracts. I think renewables are the cheapest to buy right now, so by moving load around to maximize the use of cheap renewable electricity, they will definitely save money.
I've had an idea for the longest time that we should get rid of all the processes we've put in place to deliver everything "on-demand" and instead work with nature to get what we can.
What I mean by this is that instead of deciding "I want to drive 200 miles to the beach" and buying a tank of petrol, you would instead wait for favourable wind/solar conditions in order to "save up" the energy you need such that you can afford to drive to the beach. If you are unfortunate one year you might only end up with half of what you need, but you'll still be able to do something.
This goes for things like food too. Stop demanding the same food year round. Instead work with the seasons and eat what is available locally at that time of year.
This would be such a huge boost to happiness. You can't see light if it's light all the time. We just don't know how great our lives are because we simply expect it to all be available at all the time. Expectations are simply assimilated and become invisible very quickly. Not only that but it turns out that meeting these expectations comes at a huge price. Let's instead take what nature gives us, but no more.
I frequently criticize Google harshly for everything from search becoming more and more useless to pushing Chrome way too hard.

It seems some people at Google still haven't gotten the memo that the "not evil" days are now a thing of the past. This looks amazing, and more like something I would expect from old Google.
Not sure I'm very impressed by the plot they show here. The results during the day look OK, but then they only translate two nightly peaks (low carbon) into one slightly larger one... couldn't even more of the work be done at night? It's also strange that there is a dip at both ends of the plot (maybe they just plot one 24-hour period, ignoring the previous day's load and the next day's; I think it would be more appropriate to consider those as well, as a 24-hour snapshot within a multi-day view).
A more interesting measure would be the actual reduction in CO2 emissions.
Everyone wanting to really understand what is going on with the new green economy and these platitudes should watch Michael Moore's new documentary, Planet of the Humans, just released free on YouTube: https://m.youtube.com/watch?v=Zk11vI-7czE
This documentary is without nuance and without pragmatism. And it criticizes without proposing a path forward -- it's a bunch of cheap pot-shots, and it demands perfection instead of proposing progress.
Yes, there are valid criticisms. Wind turbines are made of unrecyclable fiberglass. It takes energy to build them (truck rolls to the site, concrete for the foundations), and it's important to make sure the energy return on energy invested is net positive. We use fossil fuels to produce these renewables technologies. That's all true, but not insurmountable.
They say battery storage makes up only a tiny percent of the capacity needed to overcome renewable intermittency. Sure, but that omits how solar has dropped two orders of magnitude in price over the last few decades as we've built more of it and gotten better at making it (the "learning curve").
It follows a group of Vermont hikers hiking to a wind turbine site and then being NIMBY about it, but none of them talk about where their energy SHOULD come from.
Look, it raises a lot of critical questions. But it also seems to expect a single magic pill that just doesn't exist. 2/3's of the way through they talk about the misrepresentations in biomass and point out how many organizations seem to be both for it and against it. "Which side are they really on?" says the classic accusatory documentary voiceover with scary music. Well, it's complicated! Clearly you don't want to burn all the forests all at once. And yeah, if you burn pressure-treated wood, those chemicals go into the local community. At the same time, wood does grow back. The nuance that's missing in this documentary is questions like "how many acres of rotationally-harvested woodlands are required to power a 1MW biomass plant sustainably in perpetuity? And can such projects exist in practice?"
Biomass isn't a panacea, and the HN startup mindset of "can I scale up a technology to dominate everything" doesn't apply, because biomass has limits to its scalability. It's just one of many tools, and the problem with this documentary is that it can't envision a future where many tools are used together. When a Sierra Club exec is questioned about biomass, they kept the part where she says their "position is nuanced", but then they cut to something else without explaining that nuance. That's lazy documentary filmmaking.
The complicated thing about energy is there is no silver bullet. This documentary finds the bad in each technology without considering how all the pieces could fit together. It presents the bad sides of each technology as if that should disqualify the tech instead of asking how can we improve each over time. There aren't easy answers to these questions, but this documentary just wallows in how bad everything is without asking the hard questions about how things can be made to work or what the alternative of doing nothing is.
Just a marketing device. Carbon neutral since 2007? Let me laugh in CO2, "green energies" are nowhere near carbon neutral. See planet of the humans by Moore.
I don't see your point. That alternative title is exactly as accurate, so it seems like a good idea to pick the version that sounds a bit better and is easier to understand.
Why would AI solutions help the fossil fuel industry burn more oil and gas? The fossil fuel industry sells oil and gas to others to burn, it doesn't light the stuff up itself for fun.
While the story is very positive and encouraging, unfortunately an unintended side consequence of these kinds of efforts (unless you're very conscientious about maintaining the correct incentives, generally through pricing) is sometimes that the gains in energy efficiency and savings are clawed back by an increase in overall energy consumption, because the same number of compute cycles has gotten effectively cheaper to run.
Just like with energy-efficient LED light bulbs: although overall energy use goes down, often it doesn't go down as much as it ideally could, because people start lighting places that didn't have light before, since it's gotten so much more affordable to do so!
Or like when you add highway lane capacity, traffic gets worse...
Or in this case, the Google video engineers come up with new useless filters and resolutions to occupy the newly freed-up compute capacity.
Just something to be aware of. The people who do this have to monitor and put in place controls so that the outcome is what they intended. Otherwise people are more clever than you think.
If it is still an improvement in both end usage and utility isn't that letting the perfect be the enemy of the good?
LEDs have to be one of the worst possible examples for claims of induced demand as a bad thing, given that the efficiency gains outstripped the proliferation of additional always-on devices and a cellphone per person.

While induced demand may exist, it too saturates and hits diminishing returns.
Seirdy|5 years ago
The future doesn't always need to be as "webscale" as Google; sometimes, scaling down is the smart thing to do. The minimal approach of LTM is the technology equivalent of riding a bicycle (or electric velomobile [1]) to work instead of driving.
[0]: https://solar.lowtechmagazine.com
[1]: https://solar.lowtechmagazine.com/2012/10/electric-velomobil...
skybrian|5 years ago
I guess it's okay as long as the people making the rules have good monitoring and are watching out for weird exploits and fixing them. The flexibility to change the rules tends to be more common internally than externally where customers want more guarantees.
As we've seen, there also needs to be a balance between cost-optimization and preparedness. If the wind patterns don't match the prediction then you need to be ready for that.
Also, as we've seen with cryptocurrency, real money attracts theft. A human-adjusted credit system is better. In the real world, this looks like support having the discretion to forgive big bills. But to do that they need to know their customers. It's hard to automate.
smadge|5 years ago
johnb|5 years ago
Our hypothesis is that market signals combined with the right tools (friendly app and home automation) can help households shift demand into less carbon intensive periods.
So far it's working pretty well.
bo1024|5 years ago
lukev|5 years ago
It would also be pretty neat to integrate processing power markets with the wholesale energy markets. Energy prices are quite volatile and making load responsive to that would actually be quite helpful to stabilize them.
pbhjpbhj|5 years ago
I've wanted to have realtime pricing like that for a while, it seems to be becoming available again.
I honestly thought that was what the advanced electricity meter roll-out was going to do; but it seems not.
More direct energy cost to service price charged seems like a good thing in general.
erentz|5 years ago
Best thing. Then you incentivize a cleaner grid overall and you don’t even have to worry eventually about this kind of thing.
Barrin92|5 years ago
ragebol|5 years ago
If there is also a dynamic price for using the grid, that usage will also spread.
unknown|5 years ago
[deleted]
bizzleDawg|5 years ago
This Dell paper[0] suggests that 16% of the carbon over a typical server lifecycle comes from manufacturing, so you probably don't want a server sitting unused for 23 hours per day: the carbon/compute ratio over its lifetime would be worse.
The post doesn't mention this metric, but it would be nice to see something more detailed over time - especially with the overall efficiency of the server/datacentre lifecycle in mind, rather than just the energy consumed in use.
[0]: https://i.dell.com/sites/csdocuments/CorpComm_Docs/en/carbon...
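The amortization argument can be sketched numerically. The 16% manufacturing share is from the linked paper; the absolute figures below (embodied kg, kg per active hour, lifetime) are illustrative assumptions, not real server data:

```python
# Illustrative amortization of embodied carbon over a server's useful compute.
# Assumption: a fixed embodied-carbon cost from manufacturing, plus an
# operational cost that scales with active hours. All numbers are hypothetical.

def carbon_per_compute(hours_active_per_day, embodied_kg=500.0,
                       kg_per_active_hour=0.05, lifetime_days=365 * 4):
    """Total lifecycle kgCO2 divided by total active compute-hours."""
    active_hours = hours_active_per_day * lifetime_days
    operational = kg_per_active_hour * active_hours
    return (embodied_kg + operational) / active_hours

# A server active 1 h/day spreads its embodied carbon over far fewer
# compute-hours than one active 23 h/day, so its ratio is much worse.
low_duty = carbon_per_compute(1)
high_duty = carbon_per_compute(23)
print(low_duty > high_duty)  # True: idling most of the day worsens the ratio
```

The fixed embodied cost is why utilization, not just clean energy, matters for lifecycle carbon.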
shadowgovt|5 years ago
Assuming the server is "sitting unused for 23 hours a day" is the wrong model for what this work changed. You're assuming the server could be running at 50% duty cycle vs. 100% duty cycle. It isn't; since we're talking about the batch load, there's a roughly fixed amount of low-priority work to be done, and doubling the amount of CPU active-duty time allotted to that work doesn't get it done faster (the details are complicated, but that's the right model for what Google's describing here). One should model the duty cycle as fixed relative to the processor (i.e. "this global datacenter architecture, over the course of its life, will do a fixed N units of computronium work on these batch tasks") and then ask whether that work should be done using coal to power the electrons, or wind.
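Under that "fixed N units of work" model, the scheduling question reduces to placing deferrable work into the lowest-carbon hours of a forecast. A minimal sketch, with a made-up 24-hour carbon-intensity forecast (not Google's actual scheduler):

```python
# Carbon-aware placement of a fixed amount of deferrable batch work:
# rank the forecast hours by grid carbon intensity and take the cleanest ones.

def schedule_batch(carbon_forecast, work_units):
    """Return the hour indices chosen for the deferrable work (cleanest first)."""
    ranked = sorted(range(len(carbon_forecast)), key=lambda h: carbon_forecast[h])
    return sorted(ranked[:work_units])

# Hypothetical 24-hour forecast in gCO2/kWh; midday dips as solar ramps up.
forecast = [450] * 10 + [200, 150, 120, 130, 180, 300] + [450] * 8
print(schedule_batch(forecast, 4))  # -> [11, 12, 13, 14], the sunniest hours
```

The total work done is unchanged; only its timing moves toward the cleaner hours.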
webdva|5 years ago
adrianmonk|5 years ago
Part of that calculation should be the amount of compute capacity headroom you'd choose to have anyway even if you didn't care about carbon.
Compute demands can vary from one day to the next. Maybe tomorrow people uploaded 3 times as many YouTube videos as they did today. Maybe load varies based on day of the week or day of the month. To some extent, you can smooth that out by delaying jobs, but there are practical limits.
You also want some spare capacity just for safety. Efficient utilization is important, but things like performance regressions or spikes in demand can happen.
ip26|5 years ago
- Spawn nightly regressions when wind power starts to pick up, instead of at some arbitrary wall clock time
- Dispatch compute-heavy jobs during low energy cost times; dispatch IO-heavy or memory-limited jobs during high cost times.
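The second idea above can be sketched as a simple dispatch policy. The price threshold and the two-queue classification are illustrative assumptions:

```python
# Price-aware job dispatch: drain the compute-heavy queue when energy is
# cheap, and prefer IO- or memory-bound jobs when it is expensive.

def pick_queue(price_per_mwh, cheap_threshold=30.0):
    """Choose which job queue to drain based on the current energy price."""
    return "compute_heavy" if price_per_mwh < cheap_threshold else "io_bound"

print(pick_queue(12.0))  # windy night, cheap power -> compute_heavy
print(pick_queue(85.0))  # evening peak -> io_bound
```

A real scheduler would blend this with deadlines and cluster capacity, but the core signal is just price (or carbon intensity) per interval.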
boris|5 years ago
mabbo|5 years ago
Am I crazy or is this website capturing down-button clicks and ignoring them? I typically use down and up to slowly scroll as I read an article. This page is driving me nuts.
crazygringo|5 years ago
Surprised to see that get through QA. (Up arrow works just fine.)
vlasev|5 years ago
dkarp|5 years ago
sambroner|5 years ago
Regardless of this change, I wonder if they share their forecasted non-renewable energy needs with their energy supplier so that the energy supplier can prepare for changes to the expected base load.
Do any factories or other energy intensive operations do this?
Klathmon|5 years ago
Aluminum foundries in particular are extremely power-intensive and have been run during off-peak times (or are built in areas with cheap, plentiful electricity, such as near hydroelectric dams).
Still, I'd love to see this concept made a lot easier for the average consumer. Many people already have smart thermostats; why can't mine talk to my power company and over-heat/cool when the impact is lowest? Why can't my dishwasher run automatically when it would impact the world the least? Why can't my EV automatically charge when power is most available?
I know most of those things are possible, but they sure as hell aren't easy, and IMO they won't truly have an impact until they're on by default and don't require the user to do much of anything.
These things seem like they are easily doable, but we just need the different industries to work together to come up with ways to have all of this stuff interoperate.
cdmp|5 years ago
If you're a large consumer of energy and can turn that consumption on or off at short notice (on the order of seconds) then the grid operator will pay you to allow them to scale your consumption up or down.
The classic example of this is cold storage. If you have a warehouse full of freezers which need to be kept within a certain temperature threshold then it doesn't really matter when you run the freezers and you could switch off at several points during the day.
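The cold-storage idea amounts to a thermostat with a price input: the freezer must stay within a safe band, but inside that band the compressor runs only when grid power is cheap. A hedged sketch with illustrative temperatures and thresholds (not from any real product):

```python
# Demand-response freezer control: safety limits always win; between them,
# the current grid price decides whether to pre-cool ("bank" cold) or coast.

def compressor_on(temp_c, price_cheap, t_min=-22.0, t_max=-18.0):
    """Decide whether to run the compressor this control interval."""
    if temp_c >= t_max:   # at the warm limit: must cool, whatever the price
        return True
    if temp_c <= t_min:   # at the cold limit: never over-cool further
        return False
    return price_cheap    # in between: run only on cheap power

print(compressor_on(-19.0, price_cheap=True))   # True: pre-cool while cheap
print(compressor_on(-19.0, price_cheap=False))  # False: coast on thermal mass
print(compressor_on(-17.5, price_cheap=False))  # True: safety overrides price
```

The warehouse's thermal mass is effectively the "battery" here, which is why grid operators will pay for this kind of flexibility.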
user5994461|5 years ago
Given how much electricity a datacenter consumes, Google surely must have a direct support contact at the electricity provider, and they'd better be working both ways, if they aren't already.
Quick math: a datacenter is 60,000 servers, so 6 MW of consumption at 100 W per server (moderate load). That's on the order of 1% of the peak output of a nuclear reactor. You bet the electric company wants to know when to adjust its reactors.
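Checking the back-of-envelope numbers above (the reactor size is an assumption: a typical large unit is roughly 1 GW electrical):

```python
# Back-of-envelope: datacenter load as a fraction of one reactor's peak output.
servers = 60_000
watts_per_server = 100            # moderate load, per the comment
datacenter_mw = servers * watts_per_server / 1e6
reactor_mw = 1_000                # ~1 GWe, assumed typical large reactor

print(datacenter_mw)                                # 6.0 (MW)
print(round(100 * datacenter_mw / reactor_mw, 1))   # 0.6 (% of reactor peak)
```

So roughly half a percent of one reactor per datacenter, which is indeed on the order of 1%.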
stcredzero|5 years ago
Seems like getting rid of the middleman at a basic, physical level. Power is available for very low cost at certain times. So let's time-shift computation that doesn't have to be done at a specific time! Really, it's just the same trick as time-shifting your EV charging and other power draws. It costs money to run battery banks and inverters. Let's just take them out of the process, where we can.
"The best part is no part." -- Elon Musk
martincollignon|5 years ago
floatrock|5 years ago
It sounds like this project is for their own operations. Have you guys thought about how to offer this closer to a turnkey cloudops SASS / API? What kinds of abstractions would you present to developers building non-time-critical compute loads?
Could be a great differentiator for GCP vs. AWS (I have heard of some companies choosing GCP over AWS due to Google's green energy cloud). And for you guys, the only thing better than Google being a customer is all of Google's customers being your customers.
ar0|5 years ago
I wish there would be more countries covered (in particular Switzerland), but I guess you depend on the live data being provided in these countries.
ZeroCool2u|5 years ago
Would you mind commenting on what your tech stack is like? Looking at your github repo it seems like you're combining a lot of data sources. Can you comment on your approach? Also, considering this collaboration, are you running on GCP?
mempko|5 years ago
shadowgovt|5 years ago
Solar and wind on the grid increase supply, which should drive down the price per kWh (of course, the equation isn't quite that simple, since demand near most of the world's population centers is also highest during the day).
205guy|5 years ago
Big users such as Google with its datacenters will of course negotiate their own electricity contracts. I think renewables are the cheapest to buy right now, so by moving load around to maximize the use of cheap renewable electricity, they will definitely save money.
chickenpotpie|5 years ago
globular-toast|5 years ago
What I mean by this is that instead of deciding "I want to drive 200 miles to the beach" and buying a tank of petrol, you would instead wait for favourable wind/solar conditions in order to "save up" the energy you need such that you can afford to drive to the beach. If you are unfortunate one year you might only end up with half of what you need, but you'll still be able to do something.
This goes for things like food too. Stop demanding the same food year round. Instead work with the seasons and eat what is available locally at that time of year.
This would be such a huge boost to happiness. You can't see light if it's light all the time. We just don't know how great our lives are because we simply expect it all to be available all the time. Expectations are quickly assimilated and become invisible. Not only that, but it turns out that meeting these expectations comes at a huge price. Let's instead take what nature gives us, but no more.
fhennig|5 years ago
eitland|5 years ago
Seems some people at Google still haven't gotten the memo that the "not evil" days are a thing of the past. This looks amazing, and more like something I would expect from old Google.
meling|5 years ago
A more interesting measure would be the actual reduction in CO2 emissions.
seanwilson|5 years ago
mempko|5 years ago
floatrock|5 years ago
Yes, there are valid criticisms. Wind turbine blades are made of unrecyclable fiberglass. It takes energy to build them (truck rolls to the site, concrete for the foundations), and it's important to make sure the energy return on energy invested is net positive. We use fossil fuels to produce these renewable technologies. That's all true, but not insurmountable.
They say battery storage makes up only a tiny percent of the needed capacity to overcome renewable intermittency. Sure, but it also omits how solar has dropped two orders of magnitude in price over the last few decades as we've built more of it and gotten better at making them (the "learning curve").
It follows a group of Vermont hikers hiking to a wind turbine site and then being NIMBY about it, but none of them talk about where their energy SHOULD come from.
Look, it raises a lot of critical questions. But it also seems to expect a single magic pill that just doesn't exist. Two-thirds of the way through, they talk about the misrepresentations in biomass and point out how many organizations seem to be both for it and against it. "Which side are they really on?" says the classic accusatory documentary voiceover with scary music. Well, it's complicated! Clearly you don't want to burn all the forests at once. And yeah, if you burn pressure-treated wood, those chemicals go into the local community. At the same time, wood does grow back. The nuance missing from this documentary is questions like "how many acres of rotationally harvested woodland are required to power a 1 MW biomass plant sustainably in perpetuity? And can such projects exist in practice?"
Biomass isn't a panacea, and the HN startup mindset of "can I scale up a technology to dominate everything" doesn't apply, because biomass has limits to its scalability. It's just one of many tools, and the problem with this documentary is that it can't envision a future where many tools are used together. When a Sierra Club exec is questioned about biomass, they kept the part where she says their "position is nuanced", but then cut to something else without explaining that nuance. That's lazy documentary filmmaking.
The complicated thing about energy is there is no silver bullet. This documentary finds the bad in each technology without considering how all the pieces could fit together. It presents the bad sides of each technology as if that should disqualify the tech instead of asking how can we improve each over time. There aren't easy answers to these questions, but this documentary just wallows in how bad everything is without asking the hard questions about how things can be made to work or what the alternative of doing nothing is.
alcover|5 years ago
This is beyond depressing. All cope and hope peddling.
cryptonector|5 years ago
PopeDotNinja|5 years ago
qu-everything|5 years ago
elwell|5 years ago
duncan_bayne|5 years ago
"Here's a barge full of coal. Maybe you can fix it with that."
driver8_|5 years ago
kerberos84|5 years ago
yjftsjthsd-h|5 years ago
btbuildem|5 years ago
hajderr|5 years ago
thebigshane|5 years ago
"Google: Data centers now perform LESS when the sun is not shining or the wind is not blowing"
jorams|5 years ago
rcMgD2BwE72F|5 years ago
They've worked so hard lately to sell their AI solutions to the fossil fuel industry, helping them extract and burn more oil and gas[0].
[0] https://www.vox.com/recode/2020/1/3/21030688/google-amazon-a...
jdm2212|5 years ago
mav3rick|5 years ago
supernova87a|5 years ago
Unfortunately, an unintended consequence of these kinds of efforts (unless you're very conscientious about maintaining the correct incentives, generally through pricing) is that the gains in energy efficiency and savings are sometimes clawed back by an increase in overall energy consumption, because each compute cycle has gotten effectively cheaper.
Just like with energy efficient LED light bulbs, although the overall energy use goes down, often it doesn't go down as much as it could have ideally, because people start lighting places that didn't have light before, because it's gotten so much more affordable to do so!
Or like when you add highway lane capacity, traffic gets worse...
Or in this case, the Google video engineers come up with new useless filters and resolutions to occupy the newly freed-up compute capacity.
Just something to be aware of. The people who do this have to monitor and put controls in place so that the outcome is what they intended. Otherwise, people are more clever than you think.
Nasrudith|5 years ago
LEDs have to be one of the worst possible examples for claims of induced demand as a bad thing, given that the efficiency gains outstripped the proliferation of additional always-on devices and a cellphone per person.
While induced demand may exist, it too saturates into diminishing returns.