While 5G will be a great boon, especially the beam-forming satellite version, another unintended consequence besides weather remote sensing is nuking the extremely important 24 GHz range (K band) for radio astronomy. There are a few narrow protected windows for absolutely critical spectral lines, but the truth is that nature doesn't play by the spectrum allocation rules, and hundreds if not thousands of lines are routinely observed outside the protected bands. The band is also remarkably free and clear of radio frequency interference (RFI), in part because industry has chosen other frequencies not attenuated by atmospheric water vapor. This isn't to say we should halt global human progress to save a local river bait fish, but the threat to forecasting is only one of the serious consequences major spectrum reallocation can have. This is especially true for passive use in the sciences, which has a weaker lobby than the private sector.
This is something I've been having trouble with.
Lately I've become more aware of the secondary effects of 5G (on weather forecasting, on the radio spectrum, possibly on bees) and it's got me wondering why we need it for telecom. I just don't see the value added. I can already communicate with anyone in the world, access any information, and find my way anywhere with 4G. A significantly higher rate of data transfer just doesn't seem to add any new functionality to my phone. Can anyone give me a good rationale for 5G? Entertainment doesn't count.
I'll grant right off the bat that it'll have some fantastic industrial applications; my issue is with personal telecom. It just feels like a new planned-obsolescence vector.
A few or more of those and we have a real loss in biodiversity. Maybe your "local river" can sustain that for a bit, but overall they are all important.
If 5G is going to impact radio astronomy then the governments that license the spectrum should fund alternatives. Some simple space-based telescopes orbiting out beyond the 5G bubble would be expensive but not terribly difficult (radio telescopes, not the JWST). Put a couple out beyond the moon and the next image of a black hole won't be so blurry.
Maybe the economic benefit of 5G would be enough to justify a one-time cost of a telescope launch, especially if the US, EU, and Russia all pitch in.
A primary design goal was terrestrial signal rejection; we would get nuked by ABQ Traffic Control radar etc. I haven't kept up. I suppose I should find out.
Has anyone done a deep dive on 5G health concerns? E.g., 240-some scientists and 40 doctors signed a letter of discouragement (or something), claiming research indicates 5G interacts with human biology in poorly understood ways: https://ehtrust.org/key-issues/cell-phoneswireless/5g-networ...
Unfortunately, as a society I think we're going to have to "pee on the electric fence" for ourselves to find out.
Despite the fact that this spectrum has never been used for any widespread purpose, we're rolling it out, and the burden is not on the implementers to prove that it is safe. It's on researchers to prove, publicize, and convince society as a whole that 5G has health impacts.
I am not going to go all conspiracy-theory and say that the research is being suppressed, but certainly funding for this research is not going to be a priority for the US government, as they've been thoroughly bought and paid for. Most research into the health effects of non-ionizing radiation is not funded by the US government, so draw your own conclusions from that.
It's all nonsense. The higher frequencies are non-ionizing and in common use today. See all of those microwave antennas on buildings and towers? A lot of them transmit around there.
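For intuition on the non-ionizing point, here's a back-of-the-envelope Python check: a 24 GHz photon carries roughly five orders of magnitude less energy than the ~10 eV ballpark needed to ionize typical molecules. The constants are standard physics; the ~10 eV threshold is a rough illustrative figure, not a precise biological limit.

    PLANCK_H = 6.626e-34  # Planck constant, J*s
    EV_IN_J = 1.602e-19   # joules per electronvolt

    def photon_energy_ev(freq_hz: float) -> float:
        """Energy of a single photon at the given frequency, in eV (E = h*f)."""
        return PLANCK_H * freq_hz / EV_IN_J

    e_24ghz = photon_energy_ev(24e9)
    print(f"24 GHz photon: {e_24ghz:.2e} eV")                      # ~1.0e-04 eV
    print(f"Shortfall vs ~10 eV ionization: {10 / e_24ghz:.0e}x")  # ~1e+05x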
Your neighbor can blast 24 GHz right at your house with a free licence: https://en.wikipedia.org/wiki/1.2-centimeter_band
The reason microwaves (like those from a microwave oven) are dangerous is their power levels. Like, they'd cook you if they leaked out. Not because of ionizing radiation.
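As for why that Wikipedia page calls this the "1.2-centimeter band": wavelength is just c/f. A tiny sketch, with the usual 2.45 GHz oven frequency included for comparison:

    C = 299_792_458  # speed of light, m/s

    for f_ghz in (2.45, 24.0):  # microwave oven vs. the band under discussion
        wavelength_cm = C / (f_ghz * 1e9) * 100
        print(f"{f_ghz:5.2f} GHz -> {wavelength_cm:.2f} cm")
    # 2.45 GHz -> 12.24 cm; 24.00 GHz -> 1.25 cm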
Some of those items on the list are sending up red flags to me. The weapon use is purely because it heats skin; if you apply a million times less energy for a million times as long, the danger level from heat is zero. When it can't penetrate skin, it does make sense to classify head skin exposure the same as foot skin exposure.
The thing about sweat glands is interesting. And sadly I have no idea how to evaluate the quality of the studies linked.
The "good news" is that if it was serious, we would very likely already know. Therefore via some Bayesian inference we can claim that it's relatively harmless, but obviously worth keeping an eye on.
The bad news is that there are and will be entrenched interests that will likely try to "work around" any health concern, and maybe, potentially we will hear that the existence of 5G is worth it. (For example the tech advantage helps with healthcare more than the radiation harms us.)
I was wondering what exactly 5G will bring us. For the most part, all the tasks I need to do on a phone (pocket computer / communicator) can be done even with 3G (video, streaming music, and any website's loading time are more than acceptable at 3G speeds).
The only thing I can think of is that 5G will allow for more overall network bandwidth, so the data caps on "unlimited" plans wouldn't be needed. But compared to how we use our phones today, what new things will we be able to do with 5G that we can't do with current 4G/LTE?
With data caps I don't see much use for it. I still think it's hilarious that carriers advertise download speeds that would blow through most people's monthly data allowance in at most a couple of minutes.
> Does this also mean that 5G will suck, when it’s raining? [from a comment below the article]
https://en.wikipedia.org/wiki/Rain_fade
If 5G uses almost the same frequency at which microwave sensors detect water vapour (around 24 GHz), won't the weather have a great impact on it?
Also, I always thought that such short wavelengths would have problems with obstacles, with a good signal only when your phone is in line of sight of an antenna.
That's all correct; the higher frequency suffers from worse object penetration. One solution I've heard is that 5G would likely involve neighborhood or even per-building repeaters.
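For a rough sense of the range penalty, here's a minimal free-space path-loss sketch. It uses the standard FSPL formula and deliberately ignores walls, rain, and atmospheric absorption, all of which penalize 24 GHz further, so treat the gap as a lower bound:

    import math

    C = 299_792_458  # speed of light, m/s

    def fspl_db(distance_m: float, freq_hz: float) -> float:
        """Free-space path loss between isotropic antennas, in dB."""
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

    for f in (0.7e9, 2.4e9, 24e9):  # low-band cellular, Wi-Fi, FR2-ish
        print(f"{f / 1e9:5.1f} GHz over 100 m: {fspl_db(100, f):5.1f} dB")
    # Every 10x in carrier frequency costs 20 dB (a factor of 100 in power)
    # before any absorption or blockage is even counted.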
IMO 5G is massively overhyped. My iPhone 7+ isn't limited by 4G LTE; it's limited by Verizon deciding to only allow it 10 Mbps down (with a great signal). 5G won't matter one bit if the current bottleneck isn't 4G LTE in the first place.
> If 5G uses almost the same frequency where microwaves detect water vapour (around 24 GHz), won't the weather have a great impact on it?
There's a general misunderstanding about the technology that leads people down this road of thought.
5G is broken up into two frequency ranges, FR1 and FR2. FR1 is everything below 6 GHz and encompasses the same spectrum as traditional cellular technologies. FR2 is everything over 24 GHz, and that's the bit everyone is confused about.
FR1 is like traditional cellular and will be slapped on cell towers to provide broad coverage over a wide area with performance characteristics similar to what we have today with LTE. It's not very exciting, but it's 5G, and this is what everyone is currently rolling out.
FR2 is meant to be absorbed; otherwise you'd have a big problem. Unlike FR1, which limits you to 100 MHz of bandwidth per channel, FR2 mandates that channel bandwidth be between 50 and 400 MHz. So at a minimum, an FR2 channel will have half the maximum allowable bandwidth of FR1. If FR2 propagated more than a very short distance, the airwaves would be quickly saturated by a small number of users.
FR2 is intended to be deployed in very dense areas like indoors. You'd be able to deploy many cell sites without worrying about overlap or signal propagation because everything from walls to moisture in the air will absorb the signals.
It might also be possible to slap an FR2 cell site on top of every lamp post going down a street.
I think the main point of 5G keeps getting missed when people ask about cell phones and their broadband speed vs. capacity, etc.
The only reason telcos are going to put in 5G is IoT coverage: low-powered trickle data from billions of devices.
Stuff for your personal cellular use would never come close to covering the costs involved. And 4G will still be used for many years to come for that.
I’m somewhat confused. I’ll admit that I’m not very familiar with super high frequency radio, but isn’t the difference at least 200 MHz, approximately 10 times larger than the entire FM radio spectrum? Doesn’t out-of-band emission stop being a problem at that much separation? Or should we look at it relative to the base frequency?
edit: For what it's worth, I found this paragraph from the FCC last year: https://www.federalregister.gov/d/2018-14806/p-20 It sounds like they're saying "we don't know if this will be a problem yet, but be prepared to limit emissions in the 23.6-24 GHz range because we might require it at some point".
Also, paragraph 9 of the same document has the actual band limits (with a special requirement) if anybody is interested:
> The 24 GHz band consists of two band segments: The lower segment, from 24.25-24.45 GHz, and the upper segment, from 24.75-25.25 GHz
> any mobile or transportable equipment capable of operating in any portion of the 24 GHz band must be capable of operating at all frequencies within the 24 GHz band, in both band segments
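To make the scarcity concrete, here's a small sketch combining the FCC segment edges quoted above with the 50-400 MHz FR2 channel widths mentioned earlier in the thread (the three channel sizes are illustrative picks from that range):

    # Width of each FCC band segment, in MHz.
    segments_mhz = {
        "lower (24.25-24.45 GHz)": 24_450 - 24_250,  # 200 MHz
        "upper (24.75-25.25 GHz)": 25_250 - 24_750,  # 500 MHz
    }

    for name, width in segments_mhz.items():
        for ch in (50, 100, 400):
            print(f"{name}: {width // ch} channel(s) of {ch} MHz")
    # Only a handful of wide channels fit, which is one reason FR2 cells are
    # kept small: short range lets the same channel be reused a block away.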
"At the high end of the electromagnetic spectrum, signals travel over a band of 10 million trillion Hz (that is, 10^22Hz). This end of the spectrum has phenomenal bandwidth, but it has its own set of problems. The wave forms are so miniscule that they're highly distorted by any type of interference, particularly environmental interference such as precipitation. Furthermore, higher-frequency wave forms such as x-rays, gamma rays, and cosmic rays are not very good to human physiology and therefore aren't available for us to use for communication at this point."
Not the best analysis, but I'm at work. Basically, as the frequency increases, the space between peaks and valleys decreases and signals become harder to distinguish and separate from one another: 30 Hz vs. 230 Hz is much easier to tell apart than 15 kHz vs. 15.2 kHz if you're listening to audio tones. Once you get to microwaves this of course becomes much more difficult.
http://www.informit.com/articles/article.aspx?p=24687&seqNum...
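Making that concrete: what matters for telling signals apart (and for building filters to separate them) is usually the spacing relative to the frequency itself, not the absolute difference in Hz. A short sketch; the 23.8 vs. 24.25 GHz pair uses the water-vapour sensing frequency and the lower FCC band edge discussed elsewhere in the thread:

    pairs = [
        ("low audio tones", 30.0, 230.0),
        ("high audio tones", 15_000.0, 15_200.0),
        ("radiometer vs. 5G band edge", 23.8e9, 24.25e9),
    ]

    for label, f1, f2 in pairs:
        rel = (f2 - f1) / f1 * 100
        print(f"{label:28s}: {f2 - f1:>13,.0f} Hz apart = {rel:6.2f}% of carrier")
    # 200 Hz is over 600% of 30 Hz but only 1.3% of 15 kHz; the 450 MHz gap
    # between 23.8 GHz sensing and the 24.25 GHz band edge is under 2%.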
While bandwidth is typically measured as an absolute difference in Hz, because the spread directly relates to information-carrying capacity, when you are actually generating signals many things scale with the carrier frequency.
Most of these signals are generated by creating an "information signal" (usually referred to as a modulator) within your device, then moving it up in frequency by combining it with a "carrier signal". This separation allows the information signal's properties (such as bandwidth, modulation scheme, bitrate/Hz, etc.) to be more or less independent from the physical characteristics of propagation, which are more strongly related to the carrier signal's properties (e.g., water absorption, reflection off of or transmission through materials, etc.).
However, no process is perfect, and although we would like to generate perfect signals, we can't. Distortion appears in multiple parts of this pipeline, both in the generation of the low-frequency modulator and the high-frequency carrier. Distortion typically comes from "linear" components not being perfectly linear and thereby generating harmonics of the signal passing through them. In the case of the modulator, this splatters signal energy up and down the spectrum on the order of the bandwidth of the modulator, but in the case of the carrier it does so on the order of the carrier frequency. This is all a matter of degree and depending on the application may not be that big a deal, but it definitely must be addressed for high-density communications equipment like cell networks.
All transmitters have filters at multiple stages of their signal processing chains: lowpass filters that filter the modulator before it's mixed with the carrier signal to boost it up in frequency, as well as bandpass filters that ensure the output stays within the bounds it's meant to. But these filters only do so much, and they can be expensive to build (they can have rather fine physical tolerances), so everybody is always playing the "do the best job we can for the least money" game.
Luckily, most transmitters are also connected to antennas that provide convenient filtering on the output (they only resonate near the frequencies they're designed for), which helps for specialized systems that operate at a single transmit frequency. It's less helpful for something like 5G, where the many channels supported mean the antenna must cover a wide range of frequencies.
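To see the splatter effect in miniature, here's a toy NumPy sketch: a clean two-tone signal run through a slightly nonlinear "amplifier" (a cubic term) grows spectral lines at frequencies the original never contained, including third-order intermod products that land right beside the wanted signal. All frequencies are arbitrary illustration values, not real 5G numbers:

    import numpy as np

    fs = 100_000                   # sample rate, Hz
    t = np.arange(0, 0.1, 1 / fs)  # 100 ms of signal
    f1, f2 = 9_000, 10_000         # two closely spaced "wanted" tones
    clean = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

    # A mildly nonlinear transfer function standing in for a real amplifier.
    distorted = clean - 0.1 * clean**3

    spectrum = np.abs(np.fft.rfft(distorted))
    freqs = np.fft.rfftfreq(len(distorted), 1 / fs)

    # The 8 strongest lines: f1 and f2, third-order intermod at 2*f1 - f2 and
    # 2*f2 - f1 (8 and 11 kHz, right next to the wanted tones), and harmonic
    # products up at 27-30 kHz.
    peaks = freqs[np.argsort(spectrum)[-8:]]
    print(sorted(round(p) for p in peaks))
    # [8000, 9000, 10000, 11000, 27000, 28000, 29000, 30000]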
Rejecting on the grounds that there's no technical basis? I'd like to see more on that. I would hope that when NASA raises a flag with the FCC, it's taken seriously.
But for 99% of the cases, why would we need it?
IoT doesn't need 5G; it needs LoRa.
Streaming applications? I can stream with 4G.
50 ms latency with 4G, so what. Except for competitive multiplayer gaming, perhaps, I don't see the issue. But I think they want everything wired ;)
Industrial applications, outside of IoT? Give me a valid example that needs countrywide coverage.
I hardly notice a difference between 4G and my WiFi. Increase coverage for 4G before implementing 5G.
FYI: 4G offers maximum real-world download speeds up to 60 Mbps. Currently, that is more than enough.
You still need more capacity: as the population grows, more devices will compete for the same 4G bandwidth, and unlike optical fibre you cannot increase the bandwidth by adding new wire.
Police CAD systems, streaming video from body cameras, oil well monitoring.
Just because you can't imagine an application of 5G for your consumer needs doesn't mean it isn't needed.
Pilot here - they used the example of a hurricane; however, I think it would have a daily impact on thousands of flights (general aviation and commercial), which all rely on accurate weather forecasting. Weather is no joke in aviation, even if you're flying a 747.
> ... a letter from NASA Administrator Jim Bridenstine and Secretary of Commerce Wilbur Ross requesting that it be delayed. FCC Chairman Ajit Pai rejected the request ...
Ajit Pai strikes again!
I would be much happier to first have reliable 4G, or at least 3G.
I suppose quite some of the support comes from people who have connection issues with weak 4G and assume 5G will solve them.
But since 5G will apparently consume 3G towers and has much less range, quite the opposite could happen: even less connectivity for people outside the cities.
Slightly OT or meta. I keep bumping up against these nutty conspiracy theories about 5G being dangerous in various forums. Has anyone done a study of the effects of certain frequencies and energy levels on the human body that I can use to refute these fools? Also, what is the canonical source on 5G spectrum and power levels?
While most of the promise of 5G feels like overhype, this article has a bit of a flat-earth feel to it. For example, we've been trying to convert obsolete UHF frequencies to usable bandwidth for years, only to find all sorts of reasons why they can't be killed off. If 5G really threatened weather forecasting and radio astronomy, I would think at minimum there would be some sort of initiative to address it prior to a cutover - and yet, this article doesn't seem to acknowledge that such a thing exists, so I'm left to conclude the article may be nothing more than a hypothetical what-if that likely won't have much impact in the real world. This is just my gut feel...
Can someone help me understand this better? What is "very close" to the 23.8 GHz frequency? I don't know which bands 5G operates on, but it seems [1] that the closest they get, at least in the US, is ~27 GHz. If the FCC is auctioning 3000 licenses for the 24 GHz space, is that the space that can potentially interfere more? Can 5G operate on just any frequencies, then?
[1]: https://www.cablefree.net/wirelesstechnology/4glte/5g-freque...
So how hard is it to limit the 5G signal to bandwidths that don't interfere with weather forecasting, and how hard is it to detect and enforce laws against such bandwidth spillover?
> detect and enforce laws against such bandwidth spillover?
This phenomenon is called adjacent-channel interference, and violations can and would be enforced by the FCC. The challenge isn't detection or enforcement; it's getting the different government agencies to balance the public's needs properly.
FCC wants to auction off the spectrum to benefit telcos and their customers. NOAA wants to protect people with accurate predictions of hazardous weather.
Your question presumes that the weather prediction function is more valuable, but the government may not reach that same conclusion.
Stupid question. I was under the impression that one of the limits of 5G was that it was a short-distance signal, easily blocked by a wall or any obstacle. Is it really going to create interference all the way to space? I thought satellites measured the temperature of the top of the atmosphere, not of stuff on the ground.
They actually measure a lot of parameters, not only temperatures, and in all the layers of the atmosphere. And they're _very_ sensitive. For some forecast needs (short-term forecasts, storms, etc.), the conditions below the atmospheric boundary layer (https://en.wikipedia.org/wiki/Planetary_boundary_layer) are what matter most, so microwave noise near the ground is definitely an issue.
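For a sense of just how faint passive sensing is, here's a minimal sketch using the standard noise-power relation P = k*T*B: the entire thermal emission a radiometer collects from a warm scene is around a picowatt. The 300 K scene temperature and 400 MHz bandwidth are illustrative assumptions, not actual instrument specs:

    import math

    K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K

    def scene_power_dbm(temp_k: float, bandwidth_hz: float) -> float:
        """Thermal noise power collected from a scene of brightness
        temperature temp_k over bandwidth_hz, via P = k*T*B."""
        watts = K_BOLTZMANN * temp_k * bandwidth_hz
        return 10 * math.log10(watts * 1000)  # convert W to dBm

    print(f"{scene_power_dbm(300, 400e6):.1f} dBm")  # about -87.8 dBm
    # The weather "signal" is a fraction-of-a-kelvin change on top of this,
    # so any man-made leakage into the band reads as a spuriously warm scene.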
Typical signal transmission uses a signal-to-noise margin of something like 20-40 dB ( https://documentation.meraki.com/MR/WiFi_Basics_and_Best_Pra... ) for high-speed data transmission, but if you just want to get a big fat one or zero across, you don't need that much. And antennas are really good at picking up resonant EM radiation, even if it's not the "full signal".
But these weather sensors are very sensitive. And they already work by detecting a trend and then detecting a big blip over it. (So rain currently looks like some sort of interesting blip in the noise.) It might be possible to map cities with a differing trend, but that would further complicate the models. And currently there's not much noise over land, because humans don't use this part of the EM spectrum - mostly because it's not great at long range, since it attenuates very fast precisely due to water vapor in the air. So it was "easy" to exploit this for getting weather data, because it was reasonable to assume close-to-constant natural emissions. (Probably only a simple daily and seasonal trend, though it might already be necessary to handle differences between woodlands and urban areas.)
(See also, why it's hard to do the same over water: https://www.researchgate.net/publication/252663726_The_Effec... )
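Since the thread leans on dB figures, a two-line converter makes the scale explicit (every 10 dB is a factor of 10 in power; the listed values are just common reference points):

    def db_to_power_ratio(db: float) -> float:
        """Convert a decibel value to a linear power ratio."""
        return 10 ** (db / 10)

    for db in (3, 10, 20, 40):
        print(f"{db:3d} dB = {db_to_power_ratio(db):>8,.1f}x in power")
    #   3 dB ~ 2x, 10 dB = 10x, 20 dB = 100x, 40 dB = 10,000x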
Any assurances that this won't seriously disturb the earth's ecology and human health, or do we no longer bother with that when manipulating the whole planet?