This is the first nonsense talk by Jeff Dean. AI doesn't help at all in battling climate change; only politics does. The models are already accurate enough on century scales; AI would only help with hard forecasting of local events in the usual two-week window. Long term, on a global scale, there's no AI needed at all. So it looks like he ran out of topics to entertain himself. Or he went politician. Which would be a welcome change.
Does anyone have tips on how a Europe-based developer with machine learning expertise can get involved with projects battling climate change like Jeff is talking about here?
My current client is a sophisticated AI/ML startup/consultancy, Faculty (https://faculty.ai). They have extensive experience in a variety of areas, and do some cutting-edge stuff. If you're interested, either ping me (email in the profile) and I can connect you, or use the website.
I am happy to share my perspective on this. A really good blog post on this topic is Bret Victor's "What can a technologist do about climate change?"
I hope one of those trends will be fixing (or abandoning) TensorFlow, because it's really disorganized, full of legacy cruft and mysterious behaviors, and in some places just plain badly engineered.
He probably does know about it, because he seems to be a well-read guy, but I think it's too late for him and for people like him now: the pay they receive is too good for them to leave it all for some "principles", and on top of that I think they've also managed to acquire some cognitive dissonance traits that allow them to get out of bed in the morning and go to work without feeling guilty.
Is Jevons paradox really a problem when your carbon footprint is zero?
I think Google as a whole understands this concept. Isn't that why they've held off on the release of their self-driving car? I can't remember where I read this, but I remember an interview where someone said they didn't want to release their self-driving car to the world until it was X times better than the average human driver, and extremely competitive on price.
I don't like the idea of a computer that can think for itself, I don't like the idea of computers replacing human jobs, and I don't like the way we are heading.
[+] [-] thundergolfer|6 years ago|reply
- much more multitask learning and multimodal learning
- more interesting on-device models, so consumer devices (phones and the like) work more effectively
- work on AI principles is going to be important
- ML for chip design
- ML in robots
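On the on-device point: a big part of making models run well on phones is shrinking them. A minimal sketch of symmetric int8 weight quantization (purely illustrative; the function names are my own and this is not how any particular Google runtime does it):

```python
import numpy as np

def quantize_int8(weights):
    """Map float32 weights onto int8 with a single scale factor
    (symmetric linear quantization)."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float32 weights from the int8 codes."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.2, 0.03, 0.9], dtype=np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
# The int8 copy uses 4x less memory, and the round trip stays within
# one quantization step of the originals:
assert np.max(np.abs(w - w_hat)) < scale
```

The 4x memory saving (and faster integer arithmetic) is the usual motivation for running quantized models on phones rather than full float32 ones.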
[+] [-] rurban|6 years ago|reply
[+] [-] cyorir|6 years ago|reply
https://venturebeat.com/2019/12/16/ai-experts-urge-machine-l...
"Dean said the company is thinking of including more information in Google search results to give users a predicted carbon output for choices they make, like ordering a certain product."
An AI system that predicts carbon output for consumer choices and delivers that to consumers could be useful in battling climate change, even if only in a small way.
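As a sketch of what surfacing such an estimate might look like (the product names and emission factors below are made-up placeholders, not real data; a real system would draw factors from life-cycle-assessment databases or learn them):

```python
# Hypothetical emission factors in kg CO2e per item -- illustrative only.
EMISSION_FACTORS = {
    "t-shirt (cotton)": 7.0,
    "smartphone": 70.0,
    "paperback book": 1.0,
}

def predicted_carbon(item, quantity=1):
    """Return an estimated footprint for an order, or None if the
    product isn't in the table."""
    factor = EMISSION_FACTORS.get(item)
    return None if factor is None else factor * quantity

print(predicted_carbon("smartphone"))  # 70.0
```

The ML part would come in estimating factors for the long tail of products that no database covers; the lookup above is just the delivery mechanism.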
[+] [-] thundergolfer|6 years ago|reply
Of course the engineering is important, but politics is 1000% the roadblock on climate action, and that is where efforts should be focused.
[+] [-] summerlight|6 years ago|reply
[+] [-] Nashooo|6 years ago|reply
[+] [-] listentojohan|6 years ago|reply
[+] [-] mrischard|6 years ago|reply
[+] [-] chaosphere2112|6 years ago|reply
- UK: Met Office https://www.metoffice.gov.uk/about-us/careers/vacancies
- France: Institut Pierre Simon Laplace https://www.ipsl.fr/en/ (their job page is busted)
- Germany: Max Planck Institute https://www.mpimet.mpg.de/en/institute/opportunities/
- Netherlands: KNMI https://www.werkenvoornederland.nl/organisaties/ministerie-v...
I'm sure there are more, but these are the ones that I've worked with in the past. European climate research agencies seemed to have their act together a lot better than US ones; before I left the industry, they were much further on the path of moving computation to the commercial cloud (instead of trying to keep up in the supercomputer arms race, which always seemed like a losing proposition to me), and I think ML has been increasingly integrated into climate models as a way to approximate complex dynamical systems.
In terms of which of these labs had their head screwed on straight from a technical side, it was probably KNMI / Met Office, followed by Max Planck and then IPSL.
[+] [-] kot-behemoth|6 years ago|reply
[+] [-] ViralBShah|6 years ago|reply
http://worrydream.com/ClimateChange/
One of the clear recommendations is - contribute to Julia. A lot of what Bret said in 2015 has actually panned out. Julia has become a powerful language for scientific computing and machine learning. As a result it is being used in climate projects such as Climate Machine (MIT and Caltech). The Julia Lab at MIT participates in this project:
Climate Machine: https://clima.caltech.edu/
Github: https://github.com/climate-machine/
Julia Lab: http://julia.mit.edu/
Contributions to Julia packages (compiler, stdlibs, math packages, ML packages, parallel computing) will end up finding their way into climate research, because of the extensive reuse of code within the Julia ecosystem. Specifically, capabilities such as Zygote.jl (https://github.com/FluxML/Zygote.jl) for differentiable programming have the potential to make it dramatically easier to apply ML techniques to scientific codebases. Compiler contributors are hard to come by, so all contributions to compiler technology are incredibly valuable. The DiffEqFlux.jl ecosystem in Julia is a good example of combining mechanistic models with ML (https://github.com/JuliaDiffEq/DiffEqFlux.jl). Hop on to the Julia Slack or Discourse to dig in deeper.
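Zygote itself does source-to-source reverse-mode AD in Julia, but the core idea of differentiable programming, getting exact derivatives out of ordinary numeric code, can be sketched in a few lines. A minimal forward-mode illustration in Python using dual numbers (the names `Dual`, `derivative`, and `physics_step` are my own, not from any package mentioned above):

```python
class Dual:
    """Minimal dual number: a value plus its derivative, enough to
    differentiate plain arithmetic code without modifying it."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):  # product rule
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x while carrying a unit derivative alongside."""
    return f(Dual(x, 1.0)).dot

# An ordinary numeric function with no AD-specific code in it:
def physics_step(v):
    return 3 * v * v + 2 * v  # d/dv = 6v + 2

print(derivative(physics_step, 5.0))  # 32.0
```

The point, as in the scientific-codebase case above, is that the simulation author writes normal code and the derivative machinery rides along underneath.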
Another thing Bret Victor speaks about in his blog post is working with agencies such as ARPA-E on advanced projects. Julia Computing is participating in an ARPA-E project to bring these capabilities to many energy related simulation and development technologies. This press release gives a broad idea:
ARPA-E press release: https://www.energy.gov/articles/department-energy-announces-...
Funded projects: https://arpa-e.energy.gov/sites/default/files/documents/file...
Julia Computing press release: https://juliacomputing.com/communication/2019/12/09/arpa-e.h...
[+] [-] m00dy|6 years ago|reply
I'm also working on something similar to this. Would you leave your contact info?
[+] [-] ma2rten|6 years ago|reply
[+] [-] summerlight|6 years ago|reply
This paper seems relevant to the climate change part. It's more about what to do than about specific technologies.
[+] [-] m0zg|6 years ago|reply
[+] [-] mstokholm|6 years ago|reply
[+] [-] narrator|6 years ago|reply
[+] [-] deepaksurti|6 years ago|reply
An example from [1]:
```
Jevons observed that England's consumption of coal soared after James Watt introduced the Watt steam engine, which greatly improved the efficiency of the coal-fired steam engine from Thomas Newcomen's earlier design. Watt's innovations made coal a more cost-effective power source, leading to the increased use of the steam engine in a wide range of industries. This in turn increased total coal consumption, even as the amount of coal required for any particular application fell. Jevons argued that improvements in fuel efficiency tend to increase (rather than decrease) fuel use ...
```
[1] https://en.wikipedia.org/wiki/Jevons_paradox
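A toy constant-elasticity model (my own illustration, not from the article) shows when the rebound effect flips from savings into the Jevons paradox:

```python
def fuel_use(efficiency_gain, elasticity, base_fuel=100.0):
    """Toy model of the rebound effect.

    An efficiency gain cuts the effective price of useful work by the
    same factor; demand for useful work responds with the given price
    elasticity; fuel burned = demand / efficiency.
    """
    efficiency = 1.0 + efficiency_gain
    effective_price = 1.0 / efficiency          # cost per unit of useful work
    demand = effective_price ** (-elasticity)   # constant-elasticity demand
    return base_fuel * demand / efficiency

print(round(fuel_use(0.30, 0.5), 1))  # 87.7: inelastic demand, fuel use falls
print(round(fuel_use(0.30, 2.0), 1))  # 130.0: elastic demand, Jevons paradox
```

With a 30% efficiency gain, total fuel use rises exactly when the elasticity exceeds 1, which is Jevons' claim about coal in Watt's era.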
[+] [-] paganel|6 years ago|reply
Otherwise I cannot understand how he can really think that "more AI" is going to help with deforestation, as "more AI" probably means lower overall costs for bad actors in the Amazon (where your major costs are people-related; it's physically very demanding to cut down trees in an equatorial climate), which in turn means more trees being cut down. And this is just the beginning of it.
[+] [-] rejschaap|6 years ago|reply
> VentureBeat: One of the things that’s come up a lot lately, you know, in the question of climate change — I was talking with Intel AI general manager Naveen Rao recently and he mentioned this idea [that] compute-per-watt should become a standard benchmark, for example, and some of the organizers here are talking about the notion of people being required to share the carbon footprint of the model that they trained for submissions here.
> Dean: Yeah, we’d be thrilled with that because all the stuff we trained in our Google Data Center — the carbon footprint is zero.
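For anyone wanting to report the footprint numbers the organizers propose, the usual back-of-envelope accounting multiplies energy drawn by the local grid's carbon intensity. Every figure below is an illustrative placeholder, not a measurement:

```python
def training_footprint_kg(gpu_count, hours, watts_per_gpu,
                          pue, grid_kg_co2_per_kwh):
    """Back-of-envelope CO2e estimate for one training run: energy in
    kWh (GPUs * hours * watts / 1000), scaled by the data center's
    power usage effectiveness (PUE), times grid carbon intensity."""
    kwh = gpu_count * hours * watts_per_gpu / 1000.0 * pue
    return kwh * grid_kg_co2_per_kwh

# Made-up example run: 8 GPUs for 3 days at 300 W each.
print(training_footprint_kg(gpu_count=8, hours=72, watts_per_gpu=300,
                            pue=1.1, grid_kg_co2_per_kwh=0.4))
# roughly 76 kg CO2e with these placeholder numbers
```

Dean's "zero" answer amounts to claiming the grid-intensity factor is driven to zero by renewable energy matching, which is why the same formula reports a zero footprint for Google's data centers.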
[+] [-] unknown|6 years ago|reply
[deleted]
[+] [-] derangedHorse|6 years ago|reply
[+] [-] outside1234|6 years ago|reply
[+] [-] option|6 years ago|reply
[+] [-] bytematic|6 years ago|reply
[+] [-] symplee|6 years ago|reply
Some highlights include:
-Jeff Dean's PIN is the last 4 digits of pi.
-He once shifted a bit so hard it ended up on another computer.
-He wrote an O(n^2) algorithm once. It was for the Traveling Salesman Problem.
-Jeff Dean once implemented a web server in a single printf() call. Other engineers added thousands of lines of explanatory comments but still don't understand exactly how it works. Today that program is known as GWS.
-There is no 'Ctrl' key on Jeff Dean's keyboard. Jeff Dean is always in control.
-Jeff Dean's watch displays seconds since January 1st, 1970. He is never late.
-Jeff's code is so fast the assembly code needs three HALT opcodes to stop it.
[+] [-] Keloo|6 years ago|reply
[+] [-] majewsky|6 years ago|reply
At a Hacker Jeopardy some time ago (I think it was the one at 29C3), the final round ended in a tie, so a tie-breaker was needed. The tie-breaker question was "What is the current Unix timestamp?" The contestants struggled hard, leading the moderator to exclaim "For god's sake, don't you ever check the clock!?"
[+] [-] boris|6 years ago|reply
[+] [-] GrryDucape|6 years ago|reply
[+] [-] cookie_monsta|6 years ago|reply
[+] [-] 2sk21|6 years ago|reply
[+] [-] lonelappde|6 years ago|reply
[+] [-] rimliu|6 years ago|reply
[+] [-] AllasAskBar|6 years ago|reply
[deleted]