It seems that the past six years or so saw most big ISPs dropping USENET support, mostly citing piracy concerns. Was it piracy, or the fact that it's tough for the government to control what people say on USENET?
Old usenet-head here (on it regularly from 1991, first met it 1986) ...
First problem: there's no identity authentication mechanism in NNTP. So spam is a problem, forged moderation headers are a problem, general abuse is a problem. (A modern syndicated forum system with OAuth or some successor model would be a lot easier to ride herd on.)
Second problem: storage demands expand faster than the user base. Because it's a flood-fill store-and-forward system, each server node tries to replicate the entire feed. Consequently news admins tended to put a short expiry on posts in binary groups so they'd be deleted fairly promptly ... but if you do that, the lusers can't find what they're looking for so they ask their friends to repost the bloody things, ad nauseam.
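For what it's worth, that short-expiry policy usually lived in INN's expire.ctl. A sketch from memory (field order roughly pattern:modflag:keep:default:purge, values in days; exact syntax varies by INN version):

```
## How long the history database remembers seen Message-IDs
/remember/:11
## Text groups: keep articles up to 10 days
*:A:1:10:never
## Binary groups: expire aggressively
alt.binaries.*:A:1:2:2
```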
Third problem: etiquette. Yeah, yeah, I am coming over all elitist here, but the original usenet mindset was exactly that. These days we're used to being overrun by everyone who can use a point-and-drool interface on their phone to look at Facebook, but back in September 1993 it was a real shock to the system when usenet was suddenly gatewayed onto AOL, I can tell you. Previously usenet more or less got along because the users were university staff and students (who could be held accountable to some extent) and computer industry folks. Thereafter, well, a lot of the worse aspects of 4chan and Reddit were pioneered on usenet. (Want to know why folks hero-worshipped Larry Wall before he wrote Perl? Because he wrote this thing called rn(1). Which had killfiles.) Anyway, a side-effect of this was that when web browsers began to show up, the response was to double down on the high-powered CURSES-based or pure command-line clients rather than to try and figure out how to put an easy-to-use interface on top of a news spool. Upshot: usenet clients remained rooted in the early 1990s at best.
These days much of the functionality of usenet (minus the binaries) is provided by Reddit. Usenet itself turned into a half-assed space-hogging brain dead file sharing network. And we know what ISPs think of space-hogging half-assed stuff that doesn't make them money and risks getting them sued.
> First problem: there's no identity authentication mechanism in NNTP.
Yup. Which is weird, since every message is sent by someone at a host; it should have been possible to simply use signatures to prove which site generated a message—and punish sites which didn't police their users. But crypto was hard (and illegal to export, once upon a time).
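A toy sketch of that idea in Python. A real deployment would use per-site public-key signatures (e.g. ed25519) so peers never share secrets; stdlib HMAC stands in for the signing primitive here, and the site name, key, and header are all made up for illustration:

```python
import hashlib
import hmac

# Hypothetical per-site signing key. With public-key crypto, peers would
# instead hold each site's public key and sites would keep the private half.
SITE_KEYS = {"news.example.edu": b"site-secret"}

def sign_article(site: str, message_id: str, body: str) -> str:
    """Produce an X-Site-Signature value binding an article to its origin site."""
    mac = hmac.new(SITE_KEYS[site], None, hashlib.sha256)
    mac.update(message_id.encode())
    mac.update(body.encode())
    return mac.hexdigest()

def verify_article(site: str, message_id: str, body: str, signature: str) -> bool:
    """A peer that knows the site's key can check the article wasn't forged."""
    expected = sign_article(site, message_id, body)
    return hmac.compare_digest(expected, signature)
```

The point is accountability, not secrecy: a peer that sees abuse with a valid signature knows exactly which site to de-peer.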
> Because it's a flood-fill store-and-forward system, each server node tries to replicate the entire feed.
The upshot of this is that reading news was fast. So fast that folks these days can't believe what a user-friendly experience reading from a local news feed can be. Imagine reading a website where pages come up in milliseconds—that's Usenet on a local feed.
> Consequently news admins tended to put a short expiry on posts in binary groups
Frankly, binaries were Usenet's downfall. Had those been eliminated, I suspect Usenet would be a lot healthier today. I couldn't get numbers on the size of a daily text feed nowadays, but I imagine it's pretty manageable.
> Upshot: usenet clients remained rooted in the early 1990s at best.
Back in the 90s I used a pretty nice Mac GUI client.
But really, it doesn't get better than gnus…
Anyway, Usenet's not dead—it's still alive, people are still posting and some groups are doing pretty well.
The world could use a new Usenet, with the lessons learned from the first one: site-to-site; non-commercial; anti-spam measures built in.
I ran a Freenix-competitive Usenet server for a popular ISP in the mid-1990s (by way of bona fides: we were competitive because I hacked a history lookup cache into INN, a concept we apparently co-invented alongside Netcom). Usenet was by far our most expensive and most time-consuming infrastructure.
The reason for that was binaries. The amount of storage we were required to keep online for binaries was staggering. We ended up buying those ridiculous chrome-plated NetApp fileservers to handle the load. The hardware was expensive, but more expensive was the admin overhead: things went wrong with the INN filesystems regularly, and there was nothing you could do to recover from them quickly; simple filesystem errors that were really just an fsck away from repair could mean 4-6 hours of downtime. Which, by the way, tended to happen at night.
File-sharers sprayed multiple copies of huge files across several newsgroups, in little chunks. If any of those chunks went missing, our users screamed bloody murder. ISPs that tried to host no-binaries Usenet became the target of PR campaigns. Hosting discussion groups on Usenet was cheap and easy. Running a competitive full-feed server, on the other hand, required nearly full-time attention from an admin that could do light filesystem hacking.
The result was a death-spiral: as Usenet got more expensive to host, fewer ISPs hosted it; many outsourced to other providers. They could easily have hosted just the discussion groups! Usenet could have been kept alive and decentralized, and maybe even evolved alongside the web. Instead, software and pornography pirates coerced the network into a few centralized providers, who eventually decided not to waste huge amounts of money hosting infrastructure for those kinds of users.
> (Want to know why folks hero-worshipped Larry Wall before he wrote Perl? Because he wrote this thing called rn(1). Which had killfiles.)
Perl came into being in 1987, which is before September 1993, so his creation of rn must have had traction/worship before the "Eternal September." I'd hardly say that said event was the reason that people worshipped him for rn, especially since Perl had been out for ~6 years at that point.
I especially agree with your third point. Any kid getting his first Windows machine was now a computer expert and ran to the (then) only knowledgeable place to get more info, and that was usenet. That turned usenet into reddit without any controls at all. Reddit, at least, does some minimal clean up.
(Background: I'm a Usenet user from the late '80s to the early noughties; did outsourcing at netaxs as newsread.com, then ran readnews.com from 2004-1024.)
Usenet is still around, but mostly for binaries. The market is of pretty stable size, dominated by a few large wholesale players.
My take on what happened with text groups is that the S/N ratio just went to hell. In the 90s the problem was spam, but in the 2000s the problem was too many loudmouths who wanted to hear themselves talk drowning out the useful experts.
Like some of the other folks commenting, I've been pissed as hell at the phpBB/vBulletin monstrosities. My original plan with readnews was to try to build a great web UI for discussion, but we got distracted by wholesale customers wanting service - and front-end is not my area of expertise.
For folks looking for something modern with promise, the news is good, with Discourse and a few others coming up. I'd love to see something distributed, but if it were really distributed I suspect we'd see binaries and/or commercial spam and/or people with nothing interesting to say dominate - just like Usenet...
What Usenet did well was that it was completely decentralised, had zero cost of engagement (despite 'hundreds, if not thousands of dollars'), and was everywhere.
What Usenet did badly was that there was a complete absence of identity management or access controls, which meant no accountability, which meant widespread abuse; and no intelligence about transmitting messages, which meant that every server had to have a copy of the entire distributed database, which meant it wouldn't scale.
It's a tough problem. You need some way to propagate good messages while penalising bad messages in an environment where you cannot algorithmically determine what good or bad is, or have a single unified view of all messages, all users, or even all servers. And how do you deal with bad actor servers? You know that somewhere, there's a Canter and Siegel who are trying to game the system in order to spam everyone with the next Green Card Lottery...
I think reddit is the reinvention of USENET. It is mod-heavy and has enough critical mass of users to provide excellent results from its upvoting system. And many subreddits are extremely well maintained with a very high signal to noise ratio.
It even has its equivalent of alt.binaries.pics.* if one is so inclined.
A few years ago I created http://www.newswebreader.com (still functioning), a website which is a web frontend for USENET. It has an NNTP server in the background connected to other NNTP servers, and it displays groups, headers and posts similar to three-pane Thunderbird.
You can create an account and subscribe to groups, and it remembers which messages you've read.
The idea, in the end, was to make a frontend to USENET that would look like Stack Overflow, with voting, where your replies would propagate back to USENET.
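A three-pane reader like that ultimately sits on top of NNTP's overview data (the OVER/XOVER response). A rough Python sketch of parsing overview lines and bucketing replies under a thread root; the field order follows the standard overview format, but threading by the first Message-ID in References is a simplification of what a real reader does:

```python
from collections import defaultdict

def parse_xover_line(line: str) -> dict:
    """Parse one tab-separated NNTP overview line into a dict.

    Standard overview order: article number, Subject, From, Date,
    Message-ID, References, byte count, line count.
    """
    num, subject, sender, date, msg_id, refs, _bytes, _lines = line.split("\t")[:8]
    return {
        "number": int(num),
        "subject": subject,
        "from": sender,
        "date": date,
        "message_id": msg_id,
        "references": refs.split() if refs else [],
    }

def thread_overview(lines):
    """Group articles by thread root (first Message-ID in References)."""
    threads = defaultdict(list)
    for line in lines:
        art = parse_xover_line(line)
        root = art["references"][0] if art["references"] else art["message_id"]
        threads[root].append(art)
    return threads
```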
Already done, it's called reddit. And the main problem with Usenet was its replication architecture and not its identity/authentication.
reddit doesn't have any identity system in place and it has hundreds of millions of users.
reddit improved on Usenet by adding voting, which is something that at least one Usenet client tried to implement (gnus) but which should have been implemented in the architecture itself.
You don't have to penalize bad messages. Just don't link to them. Curation and moderation seem to be higher level problems that don't need to be specifically addressed by underlying storage/transport layers.
ipfs[1] is an interesting project that could be used to develop applications in this area.
Defining the goals is a key aspect. If re-invention is what we desire, then I would like to take a shot at outlining the positive aspects of usenet, as well as the negatives.
Positive:
* Anonymity possible (to an extent)
* Moderation possible (to an extent)
* Caching of desired content at the network edge
* Binary data (though obviously no more uuencode/yEnc-style text encoding)
* Libre (as in freedom of speech)
* Free (as in beer)
* Useful, if probably illegal, content
* Distributed
The negatives:
* Impersonation/other false claim to identity.
* Spam
* Illegal content (to whom? how to identify? intractable)
* Flame wars
* Difficulty of setting up a 'feed'
I'd like to take a small stab at these various problems.
For identification I would specify the use of public-key cryptography; it's the only decentralized option I know of. OpenPGP with some extensions (i.e., ed25519 signing keys) seems the obvious choice.
With identification in place, spam-filtering technologies can also be brought to bear. Have users 'file' copies of messages into several training bins via flags. Flags would be ternary-state entities (true/false/null): Liked, On Topic, and 'harmful content' (the catch-all would be used, in a design sense, to include any type of illegal content; however, for some groups that content /is/ the signal. This is meant to inform users so they can choose, not to be a nanny for them).
The above tagging would allow for aggregation to determine the 'health' of a data-pool, as well as how useful it was to the user base of a given server.
Data pools would, in themselves, be another type of tag. The built in base tags defined above would be the only 'required' ones, but a firehose of all data is crazy. Thus tags (similar to keywords) would also be attached. Advanced users (any that provide 'detailed' feedback) could 'vote' on the accuracy of applied tags including the base tags (which would be inferred as necessarily existing).
Base tags become 'groups' in this distributed database.
Critically, servers aggregate and thus anonymize the tag weighting of their own userbase (even from that userbase itself).
Every tag-sync period, an enumeration of all non-default tags (and their yes/no vote counts) would be computed and the result published.
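A minimal sketch of that tallying in Python; the tag names and data shapes are illustrative, not a protocol:

```python
# Illustrative base tags from the scheme above; any other tag is user-defined.
BASE_TAGS = ("liked", "on_topic", "harmful")

def tally(votes):
    """votes: iterable of (tag, value) pairs where value is True/False/None.

    Returns {tag: {"yes": n, "no": n}}. None (unset) votes are skipped,
    matching the true/false/null ternary flags described above.
    """
    counts = {}
    for tag, value in votes:
        if value is None:
            continue
        bucket = counts.setdefault(tag, {"yes": 0, "no": 0})
        bucket["yes" if value else "no"] += 1
    return counts

def publish_enumeration(votes):
    """The per-sync-period published summary: tag -> (yes, no).

    Only aggregates leave the server, so individual users' votes stay private.
    """
    return {tag: (c["yes"], c["no"]) for tag, c in tally(votes).items()}
```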
Also published, would be a list of the other 'servers' which this current server is aware of. SOME of these would be replication servers (which would have a non-zero weight that isn't required to be published), while others are just the servers known by other servers. Each entry would have an age; this would be the last time that the tag stats of that remote server was successfully polled (thus low entries are likely to be replication sources, BUT might be 'validation' of other servers as obfuscation).
Servers might only share post contents with authorized connections. Anyone able to connect would be able to source the other server and therefore replicate whatever tagged data it chooses to cache. The other server may require something like providing account data before it will sync your server's userbase stats. Comparing the relative accuracy of stats would enable it to determine whether your userbase is real, as well as how your userbase votes on things its own userbase does not. This would be the reason that (semi-anonymous) peering between even not-like-sized servers would be permitted, particularly if your own server is frugal and normally doesn't download things that aren't voted on.
Obviously server to server communication would involve the automated use of signing keys /for the server/.
Binary groups were huge and users expected them for free. And users would download huge amounts of stuff. So it's pretty much a cost sink, and ISPs who tried to start charging (for this service that had dramatically increased costs) were faced with vigorous campaigns. At some point it's easier to just cancel and tell dissatisfied customers to get a new ISP if they're unhappy.
The amount of groups distributing images of child sexual abuse created some risk (not every ISP is in the US) and things like stealth binary groups distributing porn put a bunch of people in oppressive regimes in tricky situations.
ISPs could have dropped binaries and only carried text groups. But this means putting up with groups of people who strongly held but conflicting opinions:
1) be a dumb pipe and provide everything
2) be a dumb pipe, but filter spam using a Breidbart Index threshold of some value or other.
3) make the news server operate according to rules laid out in the ISP's ToS. (Young people may not realise it, but a lot of effort on the early Internet was spent on "what do we do if our users go on the Internet and start swearing?" Many ISPs had rules forbidding swearing; at least, they did in Europe.)
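For reference, the Breidbart Index mentioned in (2) scores a set of copies of a substantively identical article as the sum of the square roots of the number of newsgroups each copy was crossposted to. A sketch (the cancel threshold of 20 was a common convention, not a standard):

```python
import math

def breidbart_index(crosspost_counts) -> float:
    """Breidbart Index for copies of a substantively identical article.

    crosspost_counts: one entry per posted copy, giving the number of
    newsgroups that copy was crossposted to. One post crossposted to 9
    groups scores 3; 25 separate single-group posts score 25.
    """
    return sum(math.sqrt(n) for n in crosspost_counts)

def is_spam(crosspost_counts, threshold: float = 20) -> bool:
    """Apply the conventional cancel threshold."""
    return breidbart_index(crosspost_counts) >= threshold
```

The square root is the clever bit: it punishes mass multi-posting (many copies) much harder than a single wide crosspost.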
Then www forums sprung up and they had some advantages: avatars, mods, etc.
I don't recognize the bit about "dumb pipes". News is a service, not a pipe. Very few servers carry all groups. Some didn't even carry the alt groups, that wasn't fun.
The ISPs I worked at in the late 90s did not carry the binaries groups, becuase of the outrageous storage requirements. I think that was quite common.
This all more or less kind of happened eventually, but the parts that were early did not go the way you state, and the parts that did go that way came much, much later and did not directly influence the overall fate of usenet.
All in all, no: you're only talking about some side stuff that clouds the history.
I skimmed the comments here, and never saw the real answer (to what I understand the question to be). Even though it was public knowledge, I had some extra insight from working for a large Usenet provider.
The New York Attorney General started a campaign against child porn groups on Usenet. In the end, his office identified a small number of groups they said were used for child porn -- I think it was fewer than 100 groups. Many ISPs jumped on the opportunity to stop paying for Usenet service.
In the 90s it was just assumed that an ISP's service would include Usenet. With the growth of binaries groups, the quality of service declined. I remember retention would be a day or two, with about 50% completion. So, for most ISPs, the service was unusable, and only a small number of subscribers knew or cared about it. Those who did care paid quite a bit for service from a third party, like my employer. I don't know why the ISPs didn't shut down service earlier, but once the NYAG campaign started, they could cancel Usenet, saving themselves money and getting good press for fighting child porn.
I found USENET and associated newsgroups to be better than the WWW, especially for discussions of software. I once even promoted the use of internal newsgroups w/in a corporate environment, where a history of topics (discussions, problems, and decisions) would have IMO proven extremely useful.
But the idea never got traction: people were unwilling to participate because newsreaders were too different from the browser and they'd had enough trouble learning to navigate the WWW. Once blogs and browser-based "newsgroups" and forums began showing up, the handwriting was on the wall. In the end, the WWW browser's low bar to entry ate USENET.
I still value the treasure trove of information stored in the archives. And some people still actively participate in USENET and other newsgroups, just as some still participate in IRC (Internet Relay Chat, which also is fading). I think these are valuable tools with a lot of greybeard expertise held in reserve.
There's a sort of Gresham's law of the Internet: "The browser drives out every other interface."
I have to mention that the D programming language has forums that can be accessed both with a newsreader and from a web browser[1]. They're built on a framework called Vibe.d[2], serving both the NNTP protocol and HTTP[3], which I think is fascinating. My only complaint is that since USENET has mostly died out as a discussion medium and is now used largely for piracy (do a bit of research and you'll find it's still about as active as torrenting, just a lot more automated; I used to use USENET, but I'd rather stick to legal alternatives and avoid the paranoia), the old clients, while they still work, could use some touching up, which probably won't happen.
I enjoy the idea that if all discussions in a support forum are on the NNTP protocol, I can archive them all, so I hardly have to open up a browser to search through years (decades?) of threads to see if anyone else has had the same issues as me. Imagine something like Stack Overflow all of a sudden at your finger tips without any internet access. It's a really nice thing, sometimes the internet just dies on you when you need it most.
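A sketch of that offline-archive idea using Python's stdlib sqlite3; the schema and the crude LIKE-based search are illustrative (a real archive would want full-text indexing):

```python
import sqlite3

def build_archive(articles, path=":memory:"):
    """Store (message_id, subject, body) rows pulled from an NNTP spool,
    so years of support threads can be searched with no network at all."""
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS articles "
        "(message_id TEXT PRIMARY KEY, subject TEXT, body TEXT)"
    )
    db.executemany("INSERT OR REPLACE INTO articles VALUES (?, ?, ?)", articles)
    db.commit()
    return db

def search(db, term):
    """Crude offline substring search over archived posts."""
    pattern = f"%{term}%"
    rows = db.execute(
        "SELECT message_id, subject FROM articles "
        "WHERE subject LIKE ? OR body LIKE ?",
        (pattern, pattern),
    )
    return [r[0] for r in rows]
```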
As for IRC, people are willing to use it if you put something useful on there (support for a project, or a community people are interested in). If you want adoption from users who are just browsing the internet, maybe offer a web client / desktop client combination that makes IRC a lot more seamless for the average "I don't know" type of user.
On the subject of IRC: it wasn't fading all that strongly, IMO, at least for open-source and free software discussions. Then Slack came along. Startling to see a proprietary clone of IRC (albeit with some nice extra features, namely history) come along and start taking over. See, e.g., the Clojurians Slack community.
USENET has always been used for porn and piracy, since at least the early 90s. Of course, most of the great newsgroups were discussions-based, but probably most of the bandwidth was porn and piracy.
When I was in college, I remember someone on my floor had written a program in Pascal that automatically downloaded porn off USENET. He would leave his computer running all the time, connected to the college's internet connection via modem, and we would occasionally see a flash of a porn pic on his screen and ask "What was that?". This was before the days of integrated TCP/IP stacks in the OS, so if I remember correctly he had to dial in via modem and then use something called Slurp or something like that, I can't remember exactly now.
This continued all through the 90s. A bunch of my friends had Airnews accounts and downloaded mp3s and porn 24x7, during what we called the "Golden Age" of piracy, when Napster was starting up in 97 up until the early 2000s, when the bust hit.
At some point, the medium for discussions moved off of USENET and went to more user friendly places like email mailing lists, google groups, yahoo groups, reddit, etc. This left only piracy and porn on USENET, and I'm actually surprised that some ISPs still support USENET at this point.
It felt like Usenet died as a meaningful place for discussion in the mid-to-late '90s, for all the same reasons that most (or all?) electronic communities eventually die. Bad posters drive away good posters and encourage even worse posters, which eventually results in something akin to YouTube. Forum entropy for lack of a better term.
By the time most ISPs started dropping it, a vanishingly small percentage of most ISPs' users even knew what it was, and the binaries groups had turned it into a source of both cost and legal risk. The heavy users were people who incurred that cost and risk to the ISPs because they were using it for pirating software and porn. The icing on the cake would've been the fact that it's a terribly inefficient way to distribute those things and the ISPs have to store all that stuff locally on servers they own.
From an ISP's perspective, maintaining Usenet feeds became all downside and no upside.
Regarding government control, I would think that Usenet would've been far easier to monitor and censor than the web.
> Bad posters drive away good posters and encourage even worse posters, which eventually results in something akin to YouTube. Forum entropy for lack of a better term.
I've heard it labeled "evaporative cooling", per [0].
> It seems that the past six years or so saw most big ISPs dropping USENET support, mostly citing piracy concerns. Was it piracy, or the fact that it's tough for the government to control what people say on USENET?
No conspiracy theories needed here.
Copyright infringement is one angle; the other is that it costs ISPs a huge amount of resources for something few people use.
Once upon a time, a single server could easily mirror all of USENET for all users of an ISP, and almost every user expected it, so they'd treat it as an essential part of the service. Now, it would take far more storage to do so, and almost nobody expects it, so why should an ISP provide it? It's easier to let people get USENET from a third-party service, and it'd be a better experience for the people who actually want it, too.
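To make the storage point concrete, a back-of-the-envelope sketch; the numbers are assumptions for illustration, not measurements:

```python
def storage_needed_tb(daily_feed_tb: float, retention_days: int) -> float:
    """Spool size needed to keep `retention_days` of a feed online."""
    return daily_feed_tb * retention_days

# Assumed figures: a full binaries feed of tens of TB per day, with the
# long retention users came to expect, versus a text-only feed measured
# in single-digit GB per day.
full_feed_tb = storage_needed_tb(30, 30)  # ~900 TB of spool for one month
```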
If an ISP has resources to burn and wants to make their technical users happy, they'd get far better results for more users if they provided things like local Linux distribution mirrors instead. Far more users would make use of that than USENET.
And if they want to make the vast majority of users happy, and save resources on their end in the process, they can provide local CDN nodes for YouTube, Netflix, and similar.
Usenet isn't dead. I still use several Usenet groups via Thunderbird. Google Groups is a Usenet host/client, and many groups belong to both the Google and Usenet spaces. The Usenet interface is easier to use, has no ads, and doesn't require a Google account.
One thing that hasn't been mentioned yet is that it is essentially closed now. You can't just set up your own USENET server easily, because you have to pay someone $$$ to get federation (if it's possible at all). Existing providers will try to keep competitors out, because they sell access to newsgroups for a lot of money so people can download warez.
I think it should be possible to get replicas of non-binary newsgroups, but a quick search hasn't found any free option.
The single biggest issue was spam. Being largely unmoderated, Usenet became flooded with garbage as the reach of the Internet expanded. Conversation moved to web-based forums, which IMO had worse UI in the early days, because there was more ability to moderate.
And not just traditional spam. There were active campaigns to "sporge" thousands of news groups with thousands of junk posts. Hipcrime is probably something that would return ghits.
Well, sorta. Yeah, spam was a big problem. The other problem is that there WERE moderated groups, and the original intent when creating moderated groups was that moderators would act like adults. They largely did not, and even if they did, they were usually strongly opinionated and would moderate according to those opinions.
I wonder just how big a non-binaries feed is these days. A tiny engineering company I worked for in '98-99 had its own Usenet server with a no-binaries feed going into a SPARCstation 2 (think 386/486-class x86 equivalent) and it kept up just fine.
A couple years earlier I'd been one of the senior admins at Texas.Net (now DataFoundry) and helped build out what eventually turned into GigaNews, which used multiple dual-proc Sun E450s.. I think they're still one of the "biggest" Usenet providers these days.
USENET (the network) may be dying but NNTP is still going strong as a better interface to mailing lists. See for example http://gmane.org/ or the new GNU Mailman 3 gateway.
I am now subscribed to maybe 2 mailing lists; the rest (two dozen) I read via gmane.
I still use usenet every day. There are, admittedly, only a few good groups left. But where there's a high barrier to entry there's a high reward. The discussion is of high quality. Higher than most mailing lists and reddit/HN, at least.
There is a lot of history and useful knowledge archived in Usenet. A lot of that content (e.g., the early UNIX newsgroups) puts today's forums and blogs to shame.
Google acquired Deja News (if Usenet is worthless, why?) and now all the archived Usenet messages are web-access only, fronted by Java and Javascript nonsense.
If the Usenet archives are no longer important or if everyone thinks Usenet is "dead", then why put these messages behind Javascript and try to prevent bulk downloads (which is how NNTP was designed to work)?
The Deja News acquisition was in 2001, lifetimes ago in web time. And it does fit with Google's stated mission of putting everything online in an easily searchable format.
The "let's put a thin web veneer over X" approach has never been a great one, perhaps sufficient for Web 1.0, but not these days.
When DejaNews made USENET searchable, you could actually nuke your messages from DejaNews. But then Google bought DejaNews, and suddenly every nuked message was made available again, forever. Google killed USENET.
It's expensive to keep binary groups online (bandwidth), and the text groups are all spam these days.
Edit: Forgot to say that the tech is fine; a member of my family operates a usenet server over in Switzerland for our family. Works well for that sort of thing and avoids facebook etc.
tptacek | 10 years ago:
Not that I'm bitter.
[+] [-] pyre|10 years ago|reply
Perl came into being in 1987, which is before September 1992, so his creation of rn must have had traction/worship before the "Eternal September." I'd hardly say that said event was the reason that people worshipped him for rn, especially since Perl was out for ~5 years at that point.
[+] [-] davidgerard|10 years ago|reply
my thesis: the atom of NNTP is the message, the atom of web forums is the thread.
(Commenters on that are mostly old BOFHnetters you may still recognise.)
[+] [-] justwannasing|10 years ago|reply
[+] [-] avifreedman|10 years ago|reply
Usenet is still around but mostly for binaries. The market is pretty stable in size, dominated by a few large wholesale players.
My take on what happened with text groups is that the S/N ratio just went to hell. In the 90s the problem was spam, but in the 2000s the problem was too many loudmouths who wanted to hear themselves talk drowning out the useful experts.
Like some of the other folks commenting, I've been pissed as hell at the phpBB/vBulletin monstrosities. My original plan with readnews was to try to build a great web UI for discussion, but we got distracted by wholesale customers wanting service - and front-end is not my area of expertise.
For folks looking for something modern with promise, the news is good with discourse and a few others coming up. Would love to see something distributed, but if really distributed I suspect we'd see binaries and/or commercial spam and/or people with nothing interesting to say dominate - just like Usenet...
[+] [-] codinghorror|10 years ago|reply
And yeah, distributed discussions are moon-shot hard, unfortunately.
[+] [-] spiritplumber|10 years ago|reply
Not sure if typo or integer overflow there :)
[+] [-] BillyParadise|10 years ago|reply
[+] [-] wcummings|10 years ago|reply
Sounds like reddit
[+] [-] david-given|10 years ago|reply
How would you reinvent Usenet?
What Usenet did well was that it was completely decentralised, had zero cost of engagement (despite 'hundreds, if not thousands of dollars'), and was everywhere.
What Usenet did badly was that there was a complete absence of identity management or access controls, which meant no accountability, which meant widespread abuse; and no intelligence about transmitting messages, which meant that every server had to have a copy of the entire distributed database, which meant it wouldn't scale.
It's a tough problem. You need some way to propagate good messages while penalising bad messages in an environment where you cannot algorithmically determine what good or bad is, or have a single unified view of all messages, all users, or even all servers. And how do you deal with bad actor servers? You know that somewhere, there's a Santor and Ciegel who are trying to game the system in order to spam everyone with the next Green Card Lottery...
[+] [-] stevewepay|10 years ago|reply
It even has its equivalent of alt.binaries.pics.* if one is so inclined.
[+] [-] sasoon|10 years ago|reply
The idea, in the end, was to make a frontend to USENET that would look like Stack Overflow, with voting, and your replies would propagate back to USENET.
[+] [-] incepted|10 years ago|reply
Already done, it's called reddit. And the main problem with Usenet was its replication architecture and not its identity/authentication.
reddit doesn't have any identity system in place and it has hundreds of millions of users.
reddit improved on Usenet by adding voting, which is something that at least one Usenet client tried to implement (gnus) but which should have been implemented in the architecture itself.
[+] [-] harshreality|10 years ago|reply
ipfs[1] is an interesting project that could be used to develop applications in this area.
[1] http://ipfs.io
[+] [-] partomniscient|10 years ago|reply
[+] [-] mjevans|10 years ago|reply
Positive:
* Anonymity possible (to an extent)
* Moderation possible (to an extent)
* Caching of desired content at the network edge
* Binary data (though obviously no more yyencode/etc)
* Libre (as in freedom of speech)
* Free (as in beer)
* Useful, if probably illegal, content
* Distributed
The negatives:
* Impersonation/other false claim to identity.
* Spam
* Illegal content (to whom? how to identify? intractable)
* Flame wars
* Difficulty of setting up a 'feed'
I'd like to take a small stab at these various problems.
For identification I would specify the use of public key cryptography; it's the only decentralized option I know of. OpenPGP with some extensions (i.e. Ed25519 signing keys) seems to be the obvious choice.
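A minimal sketch of what that buys you, using the third-party Python `cryptography` package. The envelope here is invented for illustration (there is no such field in the NNTP article format today); the point is that a detached Ed25519 signature makes forged posts and forged moderation approvals detectable.

```python
# Hypothetical sketch: detached Ed25519 signatures over article bodies.
# Requires the third-party `cryptography` package; function names and the
# "envelope" concept are illustrative, not part of any NNTP standard.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

def sign_article(private_key: Ed25519PrivateKey, body: bytes) -> bytes:
    """Return a detached signature over the canonical article body."""
    return private_key.sign(body)

def verify_article(public_key, body: bytes, signature: bytes) -> bool:
    """True if the signature matches; a tampered or forged body fails."""
    try:
        public_key.verify(signature, body)
        return True
    except InvalidSignature:
        return False

key = Ed25519PrivateKey.generate()
body = b"Path: example!not-for-mail\n\nHello, net."
sig = sign_article(key, body)
assert verify_article(key.public_key(), body, sig)
assert not verify_article(key.public_key(), b"tampered body", sig)
```

Key distribution (how readers learn which public key belongs to which poster) is the genuinely hard part, and this sketch deliberately says nothing about it.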
With identification in place, spam filtering can also be addressed. Have users 'file' copies of messages into several training bins via flags. Flags would be ternary (true/false/null): Liked, On Topic, and 'harmful content' (the catch-all would be used, in a design sense, to cover any type of illegal content; for some groups that content /is/ the signal, so the flag is meant to inform users so they can choose, not to be a nanny for them).
The above tagging would allow for aggregation to determine the 'health' of a data-pool, as well as how useful it was to the user base of a given server.
Data pools would, in themselves, be another type of tag. The built in base tags defined above would be the only 'required' ones, but a firehose of all data is crazy. Thus tags (similar to keywords) would also be attached. Advanced users (any that provide 'detailed' feedback) could 'vote' on the accuracy of applied tags including the base tags (which would be inferred as necessarily existing).
Base tags become 'groups' in this distributed database.
Critically, servers aggregate, and thus anonymize, the tag weighting of their own userbase (even from their own userbase).
Every tag-sync period, an enumeration of all non-default tags (and their yes/no vote counts) would be computed and the result published.
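A toy sketch of the tally described above. The tag names and the aggregate shape are invented; the point is that null (abstain) votes are dropped and only per-server totals are ever published, never per-user votes:

```python
from collections import defaultdict

def aggregate(votes):
    """votes: iterable of (tag, flag) pairs from this server's users,
    where flag is ternary: True, False, or None (abstain).
    Returns {tag: {"yes": n, "no": n}} -- the only thing published."""
    tally = defaultdict(lambda: {"yes": 0, "no": 0})
    for tag, flag in votes:
        if flag is True:
            tally[tag]["yes"] += 1
        elif flag is False:
            tally[tag]["no"] += 1
        # None is dropped: abstentions leave no trace in the aggregate
    return dict(tally)

stats = aggregate([("on-topic", True), ("on-topic", True),
                   ("harmful", False), ("liked", None)])
assert stats == {"on-topic": {"yes": 2, "no": 0},
                 "harmful": {"yes": 0, "no": 1}}
```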
Also published would be a list of the other servers that the current server is aware of. SOME of these would be replication servers (which would have a non-zero weight that isn't required to be published), while others are just the servers known by other servers. Each entry would have an age: the last time the tag stats of that remote server were successfully polled (thus entries with low ages are likely to be replication sources, BUT might be 'validation' of other servers as obfuscation).
Servers might only share post contents with authorized connections. Anyone with such a connection could use the other server as a source and replicate whatever tagged data it chooses to cache. The other server may require something like account data before it will sync your server's userbase stats. Comparing the relative accuracy of those stats would let it determine whether your userbase is real, as well as how your userbase votes on things its own userbase does not. This is why (semi-anonymous) peering would be permitted even between servers of unlike size, particularly if your own server is frugal and normally doesn't download things that haven't been voted on.
Obviously server to server communication would involve the automated use of signing keys /for the server/.
[+] [-] DanBC|10 years ago|reply
The number of groups distributing images of child sexual abuse created some risk (not every ISP is in the US), and things like stealth binary groups distributing porn put a bunch of people in oppressive regimes in tricky situations.
http://www.exit109.com/~jeremy/news/misplacedbin.html
ISPs could have dropped binaries and only carried text groups. But this meant putting up with groups of people with strongly held but conflicting opinions:
1) be a dumb pipe and provide everything
2) be a dumb pipe but filter spam above a Breidbart Index of something-or-other.
3) make the news server operate to rules laid out in the ISP's ToS. (Young people may not realise it, but a lot of effort on the early Internet was spent on "what do we do if our users go on the Internet and start swearing?" Many ISPs had rules forbidding swearing. (At least, they did in Europe.))
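For the curious: the Breidbart Index mentioned in option 2 has a simple definition. For a batch of substantively identical articles, sum the square root of the number of newsgroups each copy was crossposted to; a BI at or above a chosen threshold (conventionally around 20) marked a post as cancelable spam. A minimal sketch:

```python
import math

def breidbart_index(crosspost_counts):
    """Breidbart Index of a spam incident.

    crosspost_counts: one entry per copy of a substantively identical
    article, giving the number of newsgroups that copy was posted to.
    BI = sum over copies of sqrt(groups), so crossposting one copy to
    many groups scores lower than posting many separate copies.
    """
    return sum(math.sqrt(n) for n in crosspost_counts)

# One copy crossposted to 9 groups plus one to 16 groups:
# sqrt(9) + sqrt(16) = 3 + 4 = 7, below the usual threshold of 20.
assert breidbart_index([9, 16]) == 7.0
# Four separate copies to 25 groups each reaches the threshold.
assert breidbart_index([25, 25, 25, 25]) == 20.0
```

The square root is the clever bit: it penalizes spraying separate copies across the net more heavily than a single honest crosspost.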
Then www forums sprung up and they had some advantages: avatars, mods, etc.
[+] [-] xorcist|10 years ago|reply
The ISPs I worked at in the late 90s did not carry the binaries groups, because of the outrageous storage requirements. I think that was quite common.
[+] [-] steveax|10 years ago|reply
[+] [-] DougMerritt|10 years ago|reply
All in all, no, you're only talking about some side stuff that only clouds the history.
[+] [-] inyourtenement|10 years ago|reply
The New York Attorney General started a campaign against child porn groups on Usenet. In the end, his office identified a small number of groups they said were used for child porn -- I think it was less than 100 groups. Many ISP's jumped on the opportunity to stop paying for Usenet service.
In the 90's it was just assumed that an ISP service would include Usenet. With the growth of binaries groups, the quality of service declined. I remember retention would be a day or two, with about 50% completion. So, for most ISP's, the service was unusable, and only a small number of subscribers knew or cared about it. The others paid quite a bit for service from a third party, like my employer. I don't know why they didn't shut down service earlier, but once the NYAG campaign started, they could cancel Usenet, saving themselves money, and getting good press for fighting child porn.
[+] [-] thret|10 years ago|reply
[+] [-] jivardo_nucci|10 years ago|reply
I found USENET and associated newsgroups to be better than the WWW, especially for discussions of software. I once even promoted the use of internal newsgroups w/in a corporate environment, where a history of topics (discussions, problems, and decisions) would have IMO proven extremely useful.
But the idea never got traction: people were unwilling to participate because newsreaders were too different from the browser and they'd had enough trouble learning to navigate the WWW. Once blogs and browser-based "newsgroups" and forums began showing up, the handwriting was on the wall. In the end, the WWW browser's low bar to entry ate USENET.
I still value the treasure trove of information stored in the archives. And some people still actively participate in USENET and other newsgroups, just as some still participate in IRC (Internet Relay Chat, which also is fading). I think these are valuable tools with a lot of greybeard expertise held in reserve.
There's a sort of Gresham's law of the Internet: "The browser drives out every other interface."
[+] [-] giancarlostoro|10 years ago|reply
I enjoy the idea that if all discussions in a support forum are on the NNTP protocol, I can archive them all, so I hardly have to open up a browser to search through years (decades?) of threads to see if anyone else has had the same issues as me. Imagine something like Stack Overflow suddenly at your fingertips without any internet access. It's a really nice thing; sometimes the internet just dies on you when you need it most.
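Part of what makes an NNTP archive so searchable offline is that articles are plain RFC-style messages that thread themselves via the References header. A minimal sketch using Python's stdlib `email` parser; the sample articles are invented for illustration:

```python
import email

def thread_roots(raw_articles):
    """Map each article's Message-ID to its thread root's Message-ID.

    By Netnews convention the first entry in the References header is
    the root of the thread; articles with no References start a thread.
    """
    roots = {}
    for raw in raw_articles:
        msg = email.message_from_string(raw)
        mid = msg["Message-ID"]
        refs = (msg["References"] or "").split()
        roots[mid] = refs[0] if refs else mid
    return roots

a = "Message-ID: <1@a>\nSubject: linker error\n\nAnyone seen this?"
b = "Message-ID: <2@a>\nReferences: <1@a>\nSubject: Re: linker error\n\nYes."
roots = thread_roots([a, b])
assert roots == {"<1@a>": "<1@a>", "<2@a>": "<1@a>"}
```

With the whole spool on disk, grouping decades of posts into threads is a few lines like this plus grep, no browser required.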
As for IRC, people are willing to use it if you put something useful on there (support for a project, or a community that people are interested in). If you want adoption from users who are just browsing the internet, maybe a web client / desktop client combination that makes IRC a lot more seamless to the average "I don't know" type of user would help.
[+] [-] gcv|10 years ago|reply
[+] [-] stevewepay|10 years ago|reply
When I was in college, I remember someone on my floor had written a program in Pascal that automatically downloaded porn off USENET. He would leave his computer running all the time, connected to the college's internet connection via modem, and we would occasionally see a flash of a porn pic on his screen and ask "What was that?". This was before the days of integrated TCP/IP stacks in the OS, so if I remember correctly he had to dial in via modem and then use something called Slurp or something like that, I can't remember exactly now.
This continued all through the 90s. A bunch of my friends had Airnews accounts and downloaded mp3s and porn 24x7, during what we called the "Golden Age" of piracy, when Napster was starting up in 97 up until the early 2000s, when the bust hit.
At some point, the medium for discussions moved off of USENET and went to more user friendly places like email mailing lists, google groups, yahoo groups, reddit, etc. This left only piracy and porn on USENET, and I'm actually surprised that some ISPs still support USENET at this point.
[+] [-] PopeOfNope|10 years ago|reply
[+] [-] mwfunk|10 years ago|reply
By the time most ISPs started dropping it, a vanishingly small percentage of most ISPs' users even knew what it was, and the binaries groups had turned it into a source of both cost and legal risk. The heavy users were people who incurred that cost and risk to the ISPs because they were using it for pirating software and porn. The icing on the cake would've been the fact that it's a terribly inefficient way to distribute those things and the ISPs have to store all that stuff locally on servers they own.
From an ISP's perspective, maintaining Usenet feeds became all downside and no upside.
Regarding government control, I would think that Usenet would've been far easier to monitor and censor than the web.
[+] [-] TeMPOraL|10 years ago|reply
I've heard it labeled "evaporative cooling", per [0].
[0] - http://lesswrong.com/lw/lr/evaporative_cooling_of_group_beli...
[+] [-] JoshTriplett|10 years ago|reply
No conspiracy theories needed here.
Copyright infringement is one angle; the other is that it costs ISPs a huge amount of resources for something few people use.
Once upon a time, a single server could easily mirror all of USENET for all users of an ISP, and almost every user expected it, so they'd treat it as an essential part of the service. Now, it would take far more storage to do so, and almost nobody expects it, so why should an ISP provide it? It's easier to let people get USENET from a third-party service, and it'd be a better experience for the people who actually want it, too.
If an ISP has resources to burn and wants to make their technical users happy, they'd get far better results for more users if they provided things like local Linux distribution mirrors instead. Far more users would make use of that than USENET.
And if they want to make the vast majority of users happy, and save resources on their end in the process, they can provide local CDN nodes for YouTube, Netflix, and similar.
[+] [-] Animats|10 years ago|reply
[+] [-] brobdingnagian|10 years ago|reply
[+] [-] captainmuon|10 years ago|reply
I think it should be possible to get replicas of non-binary newsgroups, but a quick search hasn't found any free option.
[+] [-] NoPiece|10 years ago|reply
[+] [-] DanBC|10 years ago|reply
[+] [-] MisterBastahrd|10 years ago|reply
[+] [-] JoshTriplett|10 years ago|reply
[+] [-] mrbill|10 years ago|reply
A couple years earlier I'd been one of the senior admins at Texas.Net (now DataFoundry) and helped build out what eventually turned into GigaNews, which used multiple dual-proc Sun E450s. I think they're still one of the "biggest" Usenet providers these days.
[+] [-] gioele|10 years ago|reply
I am now subscribed to maybe 2 mailing lists; the rest (two dozen) I read via gmane.
[+] [-] nickysielicki|10 years ago|reply
[+] [-] rjsw|10 years ago|reply
[+] [-] ised|10 years ago|reply
Google acquired Deja News (if Usenet is worthless, why?) and now all the archived Usenet messages are web-access only and fronted by Java and JavaScript nonsense.
If the Usenet archives are no longer important or if everyone thinks Usenet is "dead", then why put these messages behind Javascript and try to prevent bulk downloads (which is how NNTP was designed to work)?
[+] [-] codinghorror|10 years ago|reply
The "let's put a thin web veneer over X" approach has never been a great one, perhaps sufficient for Web 1.0, but not these days.
[+] [-] tmpusenet|10 years ago|reply
[+] [-] batou|10 years ago|reply
Edit: Forgot to say that the tech is fine; a member of my family operates a usenet server over in Switzerland for our family. Works well for that sort of thing and avoids facebook etc.
[+] [-] randcraw|10 years ago|reply
http://www.cnet.com/news/n-y-attorney-general-forces-isps-to...
Thank Andrew Cuomo.