The year is 2050. You are reading this comment from a compatibility layer in your open-source browser that translates HTML from the 2010s into Thought-Interface Language 3.2, which was an open standard ratified in 2045 by a global consortium of content and browser developers.
Back in the 2010s, web access was peculiarly gated in a dendritic configuration: ISPs provided all the single-point-of-failure interconnections between end users (both content providers and consumers) and the true "internet", a multiway, resiliently-routed interconnect of servers. As we know now, extending the peer-to-peer core of the internet down to the consumer has had a lasting impact, including breaking up the routing monopolies of the ISPs and making it possible for anyone willing to spend a few grand a year on server capacity to host a new peer-to-peer router for nearby Internet users.
Many of you may not remember the origins of Google as a "search engine", a monolithic index of "every reachable page on the internet." Such a quaint idea has long since joined even older historical curiosities such as Yahoo's "human-curated list of pages on the Internet". Ever since the Searchtorrent protocol was introduced and consumer searches were conducted on one of several competing distributed hash tables across the internet, no one entity has had to shoulder the responsibility of storing all the web content on the internet. This author gladly pays a small monthly fee to a local search cache provider for reliably fast localized caching of search results.
The web is here to stay. Remember your history next time you visit the local Homo Sapiens preserve and give thanks to the carbon-based beings that invented the Internet.
DNS through a massive distributed ledger. The blockchain. I can't wait.
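A minimal sketch of what DNS-on-a-ledger could mean, purely illustrative: every name here (the class, the record fields, the last-write-wins rule) is hypothetical and not any real naming protocol.

```python
import hashlib
import json

class LedgerDNS:
    """Toy sketch of DNS on an append-only ledger: registrations are
    chained blocks, and resolution replays the chain. Hypothetical
    names throughout; not a real protocol."""

    def __init__(self):
        self.chain = []  # list of (block_hash, record) entries

    def register(self, name, address):
        record = {"name": name, "address": address}
        # Each block hashes its record together with the previous
        # block's hash, so history can't be silently rewritten.
        prev = self.chain[-1][0] if self.chain else "genesis"
        payload = json.dumps(record, sort_keys=True) + prev
        block_hash = hashlib.sha256(payload.encode()).hexdigest()
        self.chain.append((block_hash, record))
        return block_hash

    def resolve(self, name):
        # Replay the chain; the most recent record for a name wins.
        address = None
        for _, record in self.chain:
            if record["name"] == name:
                address = record["address"]
        return address
```

Real proposals (Namecoin, for instance) layer consensus and name-ownership rules on top; this only shows the replay-the-chain resolution idea.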
Individuals' devices will be the backbone of the web/internet, not massive server farms owned by Google and the like. A small group of smartphones distributed across a region will be able to handle massive amounts of traffic with the help of some amazing new cache protocols.
The only worrying thing about such decentralization is all of the extra tons of CO2 it would produce. Of course, to robots, such concerns wouldn't be nearly as acute.
>If you’re over 50 you might just remember the birth of Google, with their famous motto ‘Do No Evil’.
I love how people misremember this motto. The original slogan was "Don't be evil" which is quite different and far more subjective to start with. Now they have updated it to "Do the right thing" and you can imagine how easy it is to dance around that.
But people seem to think Larry and Sergey were actually trying to be ethically meticulous. Nonsense--the slogan always had the subtle meaning of "Don't be Microsoft-level evil" and it turns out that was not an easy hurdle to clear.
Facebook changed theirs in 2014 from "Move Fast and Break Things" to "Move Fast with Stable Infrastructure". It's funny how they predictably become vaguer and less controversial.
I think it's similar to Obama's "don't do stupid shit". It's not really supposed to mean anything concrete, it's just a good philosophy for approaching big decisions. I don't know if it works, but I understand the intent.
I disagree. Cryptocurrencies have shown that the new generation (as well as the old one) can embrace new and decentralized technologies.
The decentralized web is already a "successful" idea. The correct implementation for its wide use is not there yet. But it will be there.
It is just a matter of time before we have a bigger "dark web", a decentralized web, decentralized payment networks, and still have Google, Facebook, and the likes.
As the internet population grows and people move to more digital lifestyles, they won't be limited (or gravitate) to a single portal. Instead, they'll spread over different networks/infrastructures for their different needs. Facebook can still be successful and grow while the decentralized internet happens.
The Internet is growing both in number (population) and in use. People today use the Internet to surf, chat, read the news, buy stuff online, book flights and hotels, pay taxes, work, study, find partners, buy drugs, etc...
Cryptocurrencies are used for financial speculation, fraud, and illegal trade, with only a tiny fraction of the cryptocurrency-using fraction of society using them for other purposes. I would not call that nonsense the embrace of a generation.
As for cryptocurrencies, as long as they are dependent on ISPs, they aren't actually decentralized. They're quite vulnerable, and the only way out is to not depend on ISPs or the whole wired internet to be honest. Blockstream is a good way forward.
But all of those things will never be found if Google or Facebook don't point you to them. The author was saying that these two portals control access to everything else. Without joining those two platforms, you have no visitors; hence the dark web.
Basically all decentralised systems share the same flaw: either you have to download the entire dataset, or you use an index for every possible query, which means the index will be larger than the data itself.
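The trade-off above can be sketched as a toy keyword index partitioned across nodes by hashing, so no peer downloads the whole dataset; all names here are hypothetical.

```python
import hashlib

class DhtIndex:
    """Toy DHT-style keyword index: each keyword's postings live on
    exactly one node, chosen by hashing the keyword. The index holds
    one entry per (keyword, document) pair, which is how it can grow
    larger than the data it indexes."""

    def __init__(self, n_nodes):
        self.nodes = [dict() for _ in range(n_nodes)]

    def _owner(self, keyword):
        # Hash the keyword to pick the node responsible for it.
        h = int(hashlib.sha256(keyword.encode()).hexdigest(), 16)
        return self.nodes[h % len(self.nodes)]

    def add_document(self, doc_id, text):
        # Every distinct word produces an index entry on some node.
        for word in set(text.lower().split()):
            self._owner(word).setdefault(word, set()).add(doc_id)

    def lookup(self, keyword):
        # Only the owning node is contacted; no full-dataset download.
        return self._owner(keyword.lower()).get(keyword.lower(), set())

    def index_entries(self):
        return sum(len(postings) for node in self.nodes
                   for postings in node.values())
```

Even for two short documents, `index_entries()` exceeds the document count, illustrating how a query-everything index outgrows the raw data.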
I like the idea of "rebooting the web". If things continue in the direction they are going now, I could see many forms of the internet existing. Just as the Darkweb exists, I could see other splinter networks and technologies taking shape as the internet we know now becomes more homogenized, whether it is because of giants like Google and Facebook or government control (oh god pls no) or any other factor.
I still fondly remember looking at Nike's newest shoe offerings in 1997, waiting for the photos to download and listening to my dad complain about the phone line being tied up. I looked at my girlfriend the other day in fact, and just went "god, think of how different the internet is now compared to when we were younger. What the hell will it look like in twenty years?" She called me a nerd, but still considered the question. Exciting and slightly terrifying thought to ponder, really.
If I know anything about the future, it doesn't look like the present.
The web won't look like it does now in 2050, and neither will the internet.
But it might very well be built on webassembly on browsing engines cum operating systems on top of hypervisors on top of verified microkernels, and the web will probably be delivered on top of HTTP/2 on top of TCP/UDP and so on. The layers probably won't change that much.
If things really are so dire in...33 years, then it won't be Facebook or Google's fault, it'll be the fault of hundreds of thousands of hackers who had the technology available and did nothing because everyone knows those two are unbeatable, despite the fact that the tech gets cheaper and more accessible every single day.
We've got a long way to go. They're not unbeatable. They're massive goliaths, yes, but they're also bloated and slow to adapt, can't focus on any one thing, and don't have consumer loyalty. They can be beaten. Not saying they will be, but they can.
Side note: Halt and Catch Fire, which has always tried to be technically accurate, starts focusing a lot on the early web in seasons 3 and 4. CERN, NeXTcubes, and related all make an appearance. It's a fun watch if you're interested in that stuff. The pilot starts with them reverse engineering an IBM PC.
If you reduce the details of the story into the statement "the future of the web will be driven by anti-trust", I'd probably agree. The _present_ of the web is driven by anti-trust, and there's always more consolidation.
Where machine learning, social networks, and advertising have economies of scale, a tolerable future for the web would necessarily involve diseconomies of scale. Personal connection, concierge service, local long-term engagement with communities.
Absolutely. Yahoo! used to be a particular favourite of mine. Search, email, chat and games. A great place to hang out, and weirdly not that far removed from Facebook. Sure, the UI has evolved quite a bit, but the basic services are similar.
Servers are only going to get cheaper. Programming is only going to get easier. If anything, things like search engines and social networks are going to become more competitive.
If someone has a genius idea for making a better engine, he won't work for Google, he'll create his own.
Implicit in this fear of centralization is a kaczynskiist belief that "everything that can be invented has been invented".
People predicted some company taking over everything forever, and in fact even before the web existed, sci-fi authors imagined a centralized network where, from the servers to the software, everything is provided by the government. It's never going to happen.
>Servers are only going to get cheaper. Programming is only going to get easier. If anything, things like search engines and social networks are going to become more competitive
Programming skill and the number of programmers are not the limiting factor in product and social-network development. In fact, there is a vast oversupply of talent. If you look at the talent-to-opportunity ratio, it's enormous.
Look at Product Hunt. Dozens of potentially brilliant projects are built and released every day, and yet only a tiny percentage will ever be successful. Most fail because either they've built something that too few people find useful, or the market they're trying to address is too crowded (already has too many people offering similar services).
The rebooted decentralised web sounds exciting, but it's hard to deny that there are a large number of projects that only Google can carry out. At what point does the dominance become irresponsibly large and require intervention?
If history repeats itself, then some new technology will take Google and Facebook by surprise. And let a new player rise to the top.
AI is the obvious elephant in the room here.
If in 10 years Apple, Amazon, Tesla or some new startup has the better AI, then this AI will search and present content better. And market it better. And monetize it better. It might also produce its own content. Perfectly customized interactive 3D surround sound content.
Maybe it will be some decentralized autonomous organization that lives on a blockchain. Driven by AI, doing its thing. Outside of what a human mind can understand.
Relics like this exist today -- there are still Gopher servers out there. Current browsers no longer support the protocol, but you can tour the relics through a proxy -- info here: http://gopher.floodgap.com/gopher/
The links to gopherspace itself are on the upper right ("standard version"/no-javascript); I'm honoring their request not to link to the proxy itself directly.
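The Gopher protocol itself is tiny: a client opens a TCP connection to port 70, sends a selector string followed by CRLF, and reads until the server closes the connection (RFC 1436). A rough sketch of a client, not tested against any live server:

```python
import socket

def gopher_fetch(host, selector="", port=70, timeout=10):
    """Send one Gopher request: selector + CRLF on a TCP connection
    to port 70, then read until the server hangs up."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(selector.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

def parse_menu(raw):
    """Split a Gopher menu into (type, display, selector, host, port)
    rows. Each menu line is a one-character item type, then
    tab-separated fields; a lone "." terminates the listing."""
    rows = []
    for line in raw.decode("latin-1").splitlines():
        if line == "." or not line:
            continue
        parts = line[1:].split("\t")
        if len(parts) >= 4:
            rows.append((line[0], parts[0], parts[1], parts[2], parts[3]))
    return rows
```

Something like `parse_menu(gopher_fetch("gopher.floodgap.com"))` would list the top-level menu of a still-running server.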
If I was to bet, I would bet that in 2050 the web will be mostly replaced by some kind of VR network with a lot of sound, 3D videos and interactive objects. The web as it is already decays under tons of legacy cruft, the insane complexity of doing trivial things, oceans of bad content, and hyper-centralization. And all of these things are getting worse every year. VR is our best bet for a clean start.
Directionally, AMP does make the Google search page start to look more like the semi captive AOL interface of old. Content producers upload stuff to AOL, and pay for things like ads and "AOL keywords" to get an audience. AOL, meanwhile, controls the UI, puts whatever they want in the sidebars, has all the analytics data, etc.
icebraining | 8 years ago
So, they renamed YaCy? http://yacy.net/en/Technology.html
RonanTheGrey | 8 years ago
The alternative is... dystopian.
jstoiko | 8 years ago
sounds like there has been a lot of inflation in only 33yrs. what caused this?
staltz | 8 years ago
Basically, the new decentralized web must be wireless-first: http://ssb.staltz.com/view/%25a1xQAO6/UCC370Cq+HcWjEni1ziXH+...
obiefernandez | 8 years ago
LOL... he would run as a Democrat, wouldn't he?
mdekkers | 8 years ago
That's when I realised this article was Fake News!
rst | 8 years ago
http://gopher.floodgap.com/gopher/
swiftting | 8 years ago
Hopefully this will not be the result of AMP but interesting take nonetheless.
cerealbad | 8 years ago
what will replace it? a copy of all your favourite people stored on the implant in your brain. the interface will be a waking dream.