1. The advertising-supported media model is broken.
2. People on the internet are distressingly credulous.
3. There is a huge pent-up demand in the US for a next-generation communications network at reasonable cost. Somehow we can do long distance phone service for a low monthly flat rate, and have done for ~20 years, but you can't get moderate speed consumer internet service in a major metro for less than about $65/month.
I read some of the other comments and don't understand your point 3. Do you expect huge demand to drive the price down? I'd tend to expect the opposite.
Regarding #3, I'm not sure I quite see the link. Long-distance phone service is at most 64kbps of data, and these days typically far less. I agree that there's demand, and current service and prices are greatly sub-par, but the comparison with phone service seems completely wrong, since phone service is so much easier to provide.
"Cognitive" or "White-space" radio, the subject of the actual FCC notice, is real though. In simplistic terms, it means making fixed "FCC style" spectrum management largely obsolete, by pairing each radio transmitter with a receiver. The receiver is used to monitor the surrounding radio environment, and the system makes a determination, in real-time, of what (if anything) it can transmit whilst avoiding interference with other users of the spectrum.
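The sense-then-transmit cycle described above can be sketched roughly as follows. This is a toy illustration, not real SDR code: all the names, the noise floor, and the detection margin are assumptions, and a real cognitive radio would use a hardware front end with proper energy or feature detection.

```python
import random

# Toy model of a sense-before-transmit ("listen-before-talk") cycle.
# NOISE_FLOOR_DBM and DETECTION_MARGIN_DB are illustrative values.

NOISE_FLOOR_DBM = -100.0
DETECTION_MARGIN_DB = 10.0  # transmit only if the band looks this quiet


def sense_channel(freq_mhz):
    """Pretend to measure received power on a channel (random stub)."""
    return NOISE_FLOOR_DBM + random.uniform(0.0, 30.0)


def channel_is_free(freq_mhz):
    """Declare the channel free if measured power is near the noise floor."""
    return sense_channel(freq_mhz) < NOISE_FLOOR_DBM + DETECTION_MARGIN_DB


def pick_channel(candidates_mhz):
    """Return the first channel that currently appears unoccupied."""
    for f in candidates_mhz:
        if channel_is_free(f):
            return f
    return None  # stay silent rather than interfere
```

The key design point is the last line: when nothing looks free, the correct behaviour is to not transmit at all, which is what distinguishes this from conventional fixed-assignment radio.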
Cognitive radio offers the potential for large increases in data capacity since the radio spectrum is currently underutilised. I was at a conference a few years ago, where a presenter had done the measurements, and the utilisation of the radio spectrum (averaged across time and frequency) was about 5%. It turns out that most licensees hardly, if ever, transmit.
There's currently a regulatory push to allow others to use a chunk of spectrum, as long as they don't interfere with the licensee, meaning the licensee gets guaranteed access, rather than exclusive access. It should be interesting, since it will open the (cheaper?) alternative of buying smarter radios instead of buying spectrum.
Note that sensing-based whitespace radio has a number of issues. The biggest one is the hidden node problem - you can interfere with receivers without being able to hear the transmitter. There is also the issue that the licensed users of the band need to be very interference-resilient for the first few ms of their transmissions, until all of the sensing-radio users realise that they aren't allowed to be there anymore (and then the sensing radios need to have a consistent way to keep the link up).
The first problem can be mitigated by making the sensing receiver significantly more powerful than the transmitter, although you still have issues with specific terrain layouts (e.g. something big and RF-attenuating between you and the transmitter). The second problem makes it very difficult to interoperate with existing licensees (who assume exclusive use of the spectrum); it may be possible to make it work better with new licensees.
There is another interesting approach to cognitive radio, which is to use a centralised database of spectrum allocation, and then use RF propagation models & surveys to find gaps in space & frequency where you can let people transmit - for example, there are big gaps between terrestrial TV transmitter zones, to avoid the multi-kW transmitters interfering with each other, where you can easily fit "big WiFi"-style transmitters transmitting at a few watts.
Obviously, there are a number of issues here (need GPS or an accurate location on installation, need internet access for the DB, need to restrict movement, inefficient without really good RF models, weather affecting RF propagation) which make this less of a panacea than sensing-based cognitive radio, but it is relatively simple and robust to implement. In particular, a lot of the issues disappear if you use it for "big WiFi" applications - access points already have a fixed location, internet access, and mains power.
It also opens the possibility of a much more dynamic marketplace for spectrum - if all spectrum users are checking with a centralised DB (at least in a particular band), then it becomes much easier to handle short-term/local licenses - for example, providing massive short-term additional cell/wifi service to big events and festivals.
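The database lookup at the heart of this approach can be sketched in a few lines. This is a simplified model under stated assumptions: coordinates are flat kilometre offsets and each allocation protects a fixed radius, whereas a real whitespace database would use proper geodesy and RF propagation models rather than a circle.

```python
from dataclasses import dataclass
from math import hypot

# Toy geolocation-database lookup: each record says "channel X is
# protected within R km of (x, y)".


@dataclass
class Allocation:
    channel: int
    x_km: float
    y_km: float
    protect_radius_km: float


def free_channels(db, x_km, y_km, all_channels):
    """Channels with no protected allocation covering this location."""
    blocked = {
        a.channel for a in db
        if hypot(a.x_km - x_km, a.y_km - y_km) <= a.protect_radius_km
    }
    return sorted(set(all_channels) - blocked)


# Two TV transmitters 100 km apart, each protected for 50 km: a device
# sitting 10 km from the first sees only that transmitter's channel blocked.
db = [Allocation(21, 0.0, 0.0, 50.0), Allocation(24, 100.0, 0.0, 50.0)]
print(free_channels(db, 10.0, 0.0, range(21, 26)))  # [22, 23, 24, 25]
```

Note that nothing here requires the device to sense anything; the trade is that it must know its own location and be able to reach the database.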
tl;dr Cognitive radio is a very interesting and promising development, but it's much harder than initial intuition suggests.
That sounds right for the Land Mobile service (and even more so for the vast swaths of spectrum that the government retains for itself), but less so for the existing network operators; they have carriers lit up all the time.
It's pretty clear that two things need to be changed about how articles are written and propagated on the internet.
First, only primary sources should be cited. If an article is written by a news source, it stands to reason that it is not suitable for citation by another news source. Both are at the same level of empiricism; one cannot be an adequate citation for the other.
And second, the exchange of data has not changed drastically in the days since the story was first reported, so this isn't an internet speed issue. It's an issue of people being less receptive to the truth once they've hit upon a wonderful fallacy.
What that translates to is that news articles need to be less sensationalized, more factual and have a much more rigorous set of criteria for what's true and what's not.
I'm not a journalist, but I wouldn't be surprised if journalists deliberately look for what can be sensationalized in a potential story, to the point of unintentionally fabricating material - it is not inconceivable that a reporter simply skimmed some material about the FCC and then this story hit him like a "Eureka!" moment. I doubt it was fully intentional, it was just being caught up in the prospect of a great story.
And when a primary source can be provided, it ought to be linked, e.g., articles based on a study, "new report," or court document. I feel like I am constantly hunting these things down for myself.
So while many of the posts seem to be focusing on how to pay for it, there is another issue: it's not technically feasible. In urban areas at 700 MHz you would need a very wide block of spectrum, several hundred megahertz, plus backhaul, to provision something that could offer 50 Mbit/s service to everyone. You need this because of frequency-reuse issues (700 MHz carries too far; there isn't enough atmospheric attenuation). In rural areas, you can't pay for the cost of the infrastructure on the subscriber density; rural areas need a much lower frequency to make an affordable site density possible.
For this to be possible, we need much denser modulation than we currently have, and a wide block of spectrum. I really believe that for urban areas the future is mesh, à la Ricochet (even using some of the same methods), which means cells that extend over a 1-3 block radius and extremely fast/wide backhaul. The backhaul needs to be automatic and self-healing; wireline backhaul at anything more frequent than every third to fifth site breaks the cost equation. High sites won't work in any event. You could deploy something in rural areas looking more like the cell phone network, but again, to enable reuse, you need a really wide block of spectrum.
In either case, be it mesh or cell-site-like deployments, these have to be engineered networks; you can't just throw it up and expect it to work. Also, the chance of great financial success as a commercial entity is low; see Ricochet and Clearwire.
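A back-of-envelope calculation shows why the spectrum requirement balloons. The numbers below are assumptions for illustration (a spectral efficiency of ~2 bit/s/Hz and a frequency-reuse factor of 3, since at 700 MHz adjacent cells cannot reuse the same channels), not figures from the comment above.

```python
# Rough estimate of spectrum needed to offer a given rate to every
# simultaneously active user in a cell, accounting for frequency reuse.


def spectrum_needed_mhz(rate_mbps, active_users, bits_per_hz=2.0, reuse=3):
    per_cell_hz = rate_mbps * 1e6 * active_users / bits_per_hz
    return per_cell_hz * reuse / 1e6


# 20 simultaneously active users at 50 Mbit/s each:
print(spectrum_needed_mhz(50, 20))  # 1500.0 MHz
```

Even with these generous assumptions, 20 concurrent users per cell would demand 1.5 GHz of spectrum, which is why shrinking the cells (mesh, small cells) is the only way to make the arithmetic work.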
I've read more stories denying this was true than the reverse. Actually, I hadn't heard about that rumor before reading all the blogs denying it since yesterday. Am I the only one here?
This reminded me of that post here on HN last week about how a guy invented a false lead about the new Xbox, and news sites ran with it without even sending an e-mail back to him asking for more information.
Even worse, some sites tried to just copy and paste from other sites, changing a bit to look original, and ended up reporting wrong (invented) facts.
What the Washington Post lacks in understanding the facts it makes up for with a lack of journalistic integrity.
No surprise here.
Notice that the "old media" in the article are the ones that refuse to correct incorrect reporting. At this point, the only people continuing to subscribe to their publications can't be driven off with a stick. Their revenue will finally go dry when their subscribers succumb to old age.
Does anyone think that having Wi-Fi that's "as free as air" would be a bad thing? If the government was actually doing this, wouldn't everyone react positively?
Excluding lobbying from telcos, is there any reason that the FCC hasn't already done this?
It wouldn't be "Free as air", it would be buried in your taxes along with twice the bureaucracy necessary to run it. Judging by other government IT projects, especially at the fed level, it would also be more concerned with filling out forms in triplicate than actually getting anybody connected to the Internet.
At the municipal level, though, things seem to run more smoothly and the amount of tax waste seems to be less. Keep the Feds out other than to provide the bandwidth, IMO.
The left would require that pro-violence sites be blocked, the right would require that child porn sites be blocked, and everyone would complain about access to "terrorist" sites. And the RIAA would get involved. Having a single government point of failure seems to introduce a lot of problems.
Having a free internet would be as nice as having free electricity, water, roads, etc. Unfortunately, designing, implementing and running an infrastructure costs money.
I wouldn't mind as long as it was cheap and I could still get competing services. If it drove out all competitors and left me stuck with an inevitably sub-par public wireless network, I'd be pretty annoyed.
surrealize (13 years ago): I'm no RF expert, but couldn't you decrease power to increase cell density?
ars (13 years ago): People apparently want this so badly that they'll believe anything.
MBCook (13 years ago): Now that's journalism.
saraid216 (13 years ago): It took me about three passes to understand this sentence. A comma would have helped.