> The average lightning strike contains about 1 million joules, enough energy to fry the founding father in his boots. “The typical house in the U.S. has 100 amp service or about 28 horsepower,” says Kirtley.
Boy, do I get frustrated when people use compatible units without converting between them. The unit I hate more than any other unit in the universe is the kWh, which is dimensionally equivalent to the joule, so I don't understand why we don't just use that instead.
"The typical house in the U.S. has 100 amp service or about 28 horsepower" -- seems that it would be way more interesting to say that "the typical house has 100 amp service at 120V, which means 12,000 J/s".
The way the original quote is phrased (and the introduction of horsepower of all things) seems insane to me; the clarification adds zero value. You still haven't addressed the main question, which is "is the energy in a lightning bolt a significant amount of energy compared to household usage". For all I know 28 horsepower is 1,000,000 J/s, so a lightning bolt would only power a house for a second.
EDIT: as many commenters have pointed out, apparently most houses get 240V service, so just double the number above. Still, this is easily fixable, and the main point is that horsepower does not add any value to this discussion.
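To actually answer that question, here's a quick back-of-envelope in Python, assuming the article's ~1 MJ per strike and 100 A service at 240 V (both figures come from this thread, not from measurement):

    bolt_energy_j = 1e6          # joules in a typical strike, per the article
    service_power_w = 100 * 240  # 100 A at 240 V = 24,000 W nominal full load
    seconds = bolt_energy_j / service_power_w
    print(f"{service_power_w} W service; one bolt lasts {seconds:.0f} s")  # ~42 s

So at full nominal load a bolt is gone in well under a minute; at a realistic average draw of around 1 kW it stretches to roughly a quarter of an hour, as comments below work out.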
Strictly speaking, 240V. Normal electric service in North America is 240V split-phase, with the distribution transformer's center tap grounded and serving as the neutral line. We normally only use the full 240V for heavy loads like electric ovens, arc welders, large air conditioners, and such.
Large buildings often use 208V three-phase power, yielding 120V phase-to-neutral, and large commercial lighting installations are often 277V taken from one leg of a 480V three-phase feed. Voltages greater than 240 are not permitted in residential service, and I wouldn't be surprised if phase-to-neutral > 120 is out as well for homes.
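Those numbers aren't arbitrary: phase-to-neutral voltage is phase-to-phase voltage divided by sqrt(3). A quick sanity check in Python:

    import math
    for phase_to_phase in (208, 480):
        neutral = phase_to_phase / math.sqrt(3)
        print(f"{phase_to_phase} V phase-to-phase -> {neutral:.0f} V phase-to-neutral")
    # 208 V -> 120 V, 480 V -> 277 V, matching the figures above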
The kilowatt hour is a fantastic unit when talking about electrical consumption.
Are you running a 100W load (0.1kW) for an hour? That's 0.1kWh. Running it for ten hours would make a whole kilowatt-hour.
This allows for easy calculations of how much something is going to cost in electricity, and the units are such that it's easy to do the math in your head.
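For instance, the cost arithmetic really is head-math simple. A minimal sketch (the $0.15/kWh tariff here is a made-up illustrative rate, not a real price):

    load_w = 100             # a 100 W load
    hours = 10
    rate_usd_per_kwh = 0.15  # hypothetical tariff, for illustration only
    kwh = load_w / 1000 * hours                       # 1.0 kWh
    print(f"{kwh} kWh costs ${kwh * rate_usd_per_kwh:.2f}")  # $0.15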
> The unit I hate more than any other unit in the universe is the kWh, which is dimensionally equivalent to the joule, so I don't understand why we don't just use that instead.
kWh/yr is worse. It's just watts but obfuscated. Gets used for appliances.
The kWh makes sense if you consider that while units are largely path-independent, mental calculations are not. At any given moment, the sensible measure of your house's electricity usage is in kW. And to work backwards to figure out consumption, the hour certainly beats the second. Sure, you could call it 3.6e6 joules, but at that point, what does it buy you?
Horsepower is clearly insane though, I have no idea why you'd bother.
Idk, I think OP indicated that a typical lightning bolt would power a typical house (presumably at a typical average load rather than the nominal full load, which would drain it in under a minute) for about 15 minutes. I.e., you'd need 4 strikes per hour, per house. Now granted most houses won't run at peak load all the time, so maybe you'd only need one strike per hour per house - but that's still clearly a lot more lightning than is generally seen.
The kWh is of course exactly what you want if you're working with power over time.
Example: a 100 amp house circuit running maxed out in the US will use 12 kWh per hour, or 0.2 per minute. Try doing it in your head with joules. Annoying right?
12 kWh / h? Am I a crazy person? No. I'm working on a useful problem.
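Here's the same comparison laid out, using the 100 A at 120 V figure from the comment above:

    power_w = 100 * 120               # 12,000 W
    kwh_per_hour = power_w / 1000     # 12 kWh per hour -- trivially the same number
    kwh_per_minute = kwh_per_hour / 60        # 0.2 kWh per minute
    joules_per_hour = power_w * 3600          # 43,200,000 J per hour
    print(kwh_per_hour, kwh_per_minute, joules_per_hour)

The kWh figures you can carry in your head; the joule figure you can't.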
> "On an ordinary day over flat desert country, or over the sea, as one goes upward from the surface of the ground the electric potential increases by about 100 volts per meter. Thus there is a vertical electric field E of 100 volts/m in the air. The sign of the field corresponds to a negative charge on the earth’s surface. This means that outdoors the potential at the height of your nose is 200 volts higher than the potential at your feet! You might ask: “Why don’t we just stick a pair of electrodes out in the air one meter apart and use the 100 volts to power our electric lights?”
> "Although the electric current-density in the air is only a few micromicroamperes per square meter, there are very many square meters on the earth’s surface. The total electric current reaching the earth’s surface at any time is very nearly constant at 1800 amperes. This current, of course, is “positive”—it carries plus charges to the earth. So we have a voltage supply of 400,000 volts with a current of 1800 amperes—a power of 700 megawatts! With such a large current coming down, the negative charge on the earth should soon be discharged. In fact, it should take only about half an hour to discharge the entire earth. But the atmospheric electric field has already lasted more than a half-hour since its discovery. How is it maintained? What maintains the voltage? And between what and the earth? There are many questions."
Lightning-powered bitcoin mining is obviously the next big thing for green energy!
Now you need a double helping of luck - first that you get struck by lightning, and second that your miner guesses the right hash to make a block. But maybe the two somehow combine? After all, people who survive a lightning strike are said to be so lucky that they should buy a lottery ticket... so following this logic, lightning powered mining would be extra efficient!
Now I'm off to patent my new PoS invention - proof of strike :)
I know a number of people who have been struck by lightning. It seems like you've got a decent chance at survival if you're not touching anything metal.
To illustrate that even more vividly: if a typical lightning bolt is a few km long, the energy released is equivalent to detonating a stream of gasoline roughly 0.1mm in diameter (not much thicker than a human hair).
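That figure is easy to verify. A quick sketch, assuming a ~3 km bolt, the article's 1 MJ, and gasoline at roughly 34 MJ/L (all three are the assumptions, not measured values):

    import math
    bolt_energy_j = 1e6
    bolt_length_m = 3000                      # 'a few km'
    gasoline_j_per_l = 34e6                   # ~34 MJ per litre
    volume_m3 = bolt_energy_j / gasoline_j_per_l / 1000   # ~29 mL of gasoline
    area_m2 = volume_m3 / bolt_length_m                   # cross-section of the stream
    diameter_mm = 2 * math.sqrt(area_m2 / math.pi) * 1000
    print(f"~{diameter_mm:.2f} mm diameter")              # ~0.11 mm, about a hair's width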
You have to remember that most of the energy in a storm has more to do with its exchange of air masses. It's more like a boiler than a dynamo. So it doesn't surprise me that lightning in practice isn't energy dense. But figuring out how to capture the energy seems like a fun experiment, more to see how far we can push materials science than anything else.
This reminds me of an old (misleading) graphic in Wired magazine suggesting that people on treadmills could be significantly reducing their electricity bills.
An Olympic track cycling medallist pedals to power a toaster for a single slice of bread; it takes everything he's got and leaves him exhausted: https://youtu.be/S4O5voOCqAQ
I've often wondered if it would be viable to run a gym where all of the equipment is designed to harness customers' energy to help power the building. I suspect it wouldn't make enough of a difference to be worth it but have never seen anyone run the numbers.
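Running those numbers with hedged guesses (a fit person sustains maybe 100-200 W; the occupancy, hours, and $0.15/kWh tariff below are all made up for illustration):

    machines_in_use = 20
    watts_per_person = 150   # generous sustained output per machine
    hours_per_day = 12
    rate_usd_per_kwh = 0.15  # hypothetical tariff
    kwh_per_day = machines_in_use * watts_per_person * hours_per_day / 1000
    print(f"{kwh_per_day} kWh/day, worth ${kwh_per_day * rate_usd_per_kwh:.2f}/day")
    # 36 kWh/day -> about $5/day

A few dollars a day, before conversion losses and the cost of the generator hardware, which is presumably why nobody bothers.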
Therefore, one could easily generate lightning with equal destructive power? Why is it not used yet in war (airplane to ground) or for entertainment (airplane to airplane)?
This looks like the opposite of low-hanging fruit. This fruit is hanging very high, and on top of that it is very small and hard to get to the edible part.
Solar, wind, hydro, biofuel, geothermal, maybe even day-night temperature cycles - all of these look much more promising in the "free" energy department. Actually it's hard to think about a worse energy source. Earthquakes maybe? :-)
A lightning bolt is literally the opposite of clean power. Clean in this sense means lack of electrical noise. Random.org built their RNG on atmospheric radio noise, much of it from lightning: https://api.random.org/features
If you really mean gigawatts (a unit of power), then you have to multiply by the time during which that peak power is sustained to get the energy output. Lightning bolts have very short durations, tens of microseconds.
Let's be generous and give it 1 millisecond. 1 millisecond times 1 gigawatt is 1 million joules, which is the estimate that the article gives.
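In other words, energy = power × time:

    peak_power_w = 1e9   # 1 GW, the figure under discussion
    duration_s = 1e-3    # a generous 1 ms
    print(f"{peak_power_w * duration_s:,.0f} J")  # 1,000,000 J, matching the article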
For example, the flash on a standard camera (not a smartphone, which uses LEDs, but an ordinary compact camera) is about 1kW!! But you only get that 1kW for about 1ms (a millisecond!!), so only about a joule in total. It seems to last longer because the light gets “burned” into your eye.
At one point, I realized that lightning is the breakdown of the dielectric in a capacitor. This means you could harvest electricity by draining it with a large sheet or web of conductors at cloud level. No idea how well this would work or how much energy there is to collect, and it would be pretty impractical.
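For a sense of scale, a parallel-plate sketch of that capacitor. The cloud-base area, height, and potential below are rough guesses for illustration, not measurements:

    eps0 = 8.854e-12    # permittivity of free space, F/m
    area_m2 = 1e7       # ~10 km^2 of cloud base -- a guess
    gap_m = 2000        # cloud base to ground -- a guess
    voltage_v = 100e6   # ~100 MV potential difference -- a guess
    c = eps0 * area_m2 / gap_m          # ~4.4e-8 F
    energy_j = 0.5 * c * voltage_v**2   # E = 1/2 C V^2
    print(f"C ~ {c:.1e} F, stored energy ~ {energy_j / 1e6:.0f} MJ")  # ~220 MJ

Under those guesses a whole storm cell stores a few hundred MJ, so a 1 MJ bolt releases only a small fraction of it at a time.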
To say this another way... A lightning bolt would have to hit the average US home every 12 minutes to keep everything powered (assuming perfect conversion and no losses)
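Checking that with average consumption rather than nominal full load (the ~10,700 kWh/yr figure below is an approximate US average; the exact number varies by source):

    avg_kwh_per_year = 10700   # approximate US household average
    avg_power_w = avg_kwh_per_year * 3.6e6 / (365 * 24 * 3600)  # ~1.2 kW
    bolt_energy_j = 1e6
    minutes = bolt_energy_j / avg_power_w / 60
    print(f"average draw ~{avg_power_w:.0f} W; one bolt lasts ~{minutes:.0f} min")  # ~14 min

Same ballpark as the 12-minute figure, depending on which consumption number you assume.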
It's really hard to do because within the cloud each water droplet contains a tiny bit of charge. The air between them is an insulator. The cloud is many cubic miles. So to extract the charge you either need to touch every droplet, or make a spark between each droplet and a neighbouring one (which is what happens during a lightning strike).
It's not that it's so little power, it's so little _energy_ (power delivered over a period of time). It's a crap-tonne of power, just over a _very_ short period of time.