nyolfen|4 years ago
Are you saying that people playing video games are more energy intensive than literal warehouses full of GPUs running 24/7 mining Ethereum? Nobody talks about that because it's a horrible comparison.
Does one person playing a game on their PS5 for a few hours a day really have the same footprint as one crypto miner running a swarm of several hundred or even thousands of GPUs for the same period of time (ignoring the reality that those things are mining 24/7)?
ljm|4 years ago
I don't really know how you can equate the two and also say that an individual person has no incentive to keep their household energy costs at a minimum. Especially when you consider the socioeconomic status of the average gamer or their family, compared to the average owner of a crypto mining enterprise.
lwansbrough|4 years ago
Eh, I don't think it's the same. People don't run games on warehouses of GPUs 24/7. They boot up games for a couple of hours and then turn them off.
danShumway|4 years ago
A big reason these comparisons fall flat is that the only way people can make gaming and crypto energy use look the same is by comparing a niche activity to a mainstream one. Gaming overall takes up a lot of energy because a lot of people do it. When we ignore that fact and only compare the global numbers, NFTs look a bit better than they should; they look like they're maybe competitive by some metrics. But that's because they're an incredibly niche product with double-digit transaction fees that people don't really spend much time interacting with.
When we break things apart into a per-person comparison, there's basically no way I've been able to find to make proof-of-work energy usage look good. Just think about it from a purely physical perspective: every transaction on the blockchain uses GPU time spread across multiple actors. Of course that uses more power than a single person's GPU running for their own personal gaming system. Distributed systems pretty much always use more power because they duplicate work.
We know this; it's the entire foundation of proof-of-work systems. They're impossible for a single actor to keep up with, and that's why they have security guarantees. But the other side of that is that a distributed network of multiple GPUs that needs to be run for every chunk of transactions on the chain is of course going to be more expensive per-transaction than the equivalent "play session" of an individual playing a game.
Maybe if everyone did their gaming on Google Stadia the comparison would be more accurate (although even there I bet Google's less-decentralized system would still probably be more efficient than a network of GPUs duplicating the same work). But that's not the case right now: most GPU work for game graphics is happening locally, and those GPUs get powered down afterwards. There's no way someone playing a Switch for an hour is using the same amount of energy as an NFT transaction.
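To put rough numbers on the per-person comparison above, here's a back-of-envelope sketch. The figures are assumptions picked for illustration (a ~100 kWh per-transaction estimate for pre-merge Ethereum, a 300 W gaming PC, a two-hour session), not measurements; swap in your own numbers as needed.

```python
# Back-of-envelope: per-transaction proof-of-work energy vs. one gaming
# session. All constants below are assumed illustrative figures, not
# measured data.

PER_TX_KWH = 100       # assumed energy per pre-merge Ethereum transaction
GAMING_PC_WATTS = 300  # assumed power draw of a gaming PC under load
SESSION_HOURS = 2      # assumed length of a typical play session

# Energy for one play session, in kWh
session_kwh = GAMING_PC_WATTS / 1000 * SESSION_HOURS

# How many play sessions fit into the energy budget of one transaction
ratio = PER_TX_KWH / session_kwh

print(f"One play session: {session_kwh:.1f} kWh")
print(f"One transaction:  {PER_TX_KWH} kWh (~{ratio:.0f}x a play session)")
```

Even if the per-transaction estimate is off by an order of magnitude in either direction, the per-person gap stays large, which is the point of the argument above.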
blondin|4 years ago
my gut feeling is that the blockchain is more energy intensive. but i am curious how much energy is consumed by a global online multiplayer game like fortnite or similar.