Tim Berners-Lee: We need to re-decentralise the web

585 points | amirmc | 12 years ago | wired.co.uk | reply

200 comments

[+] doctoboggan|12 years ago|reply
Perfect timing, I just released a tool that enables a more decentralized web. It uses BitTorrent Sync to distribute files, and I plan on implementing Namecoin or colored coins for name resolution. As soon as an open source, stable btsync clone is released I plan on swapping it in.

The biggest drawback is that you can only serve static content, although I think many websites today would be fine with this restriction.

You can read more on my website: http://jack.minardi.org/software/syncnet-a-decentralized-web...

Follow along with the development on github here: https://github.com/jminardi/syncnet

[+] Sir_Cmpwn|12 years ago|reply
BitTorrent Sync is the wrong foundation for this revolution. It should be open source from top to bottom.
[+] p4bl0|12 years ago|reply
This is quite nice. You should submit it as a "Show HN". You can be sure that as soon as you have a free software replacement for btsync I'll use it at least to serve my personal website.

By the way, have you heard about Freenet? https://freenetproject.org/

[+] xerophtye|12 years ago|reply
While I love your idea (and am glad you found the context to show your project off again; I read the original post), I do have two nitpicks.

You said it downloads the entire site when I open a site. Now, this is completely fine for normal text-based blogs, but what if it's an image-heavy site, like theChive? Wouldn't I end up storing several gigabytes (possibly terabytes) of data for that site?

Secondly, if somehow we extend the concept to dynamic content (don't ask me how), wouldn't it effectively mean each of us running a server and hosting the site on it?

Maybe your idea can be implemented at the server level instead of the client level... like proxy servers doing what you want the clients to do, so the content would be spread all over the internet instead of sitting on a few specific servers. (Not saying that's a good idea, btw.)

[+] zooko_LeastAuth|12 years ago|reply
You could consider using Tahoe-LAFS instead of BitTorrent sync. Tahoe-LAFS is open source and stable, but it isn't a clone of btsync, and it might not fit for your purposes.
[+] MikeTaylor|12 years ago|reply
It's great that Berners-Lee is saying this, but he'd have retained a lot more of his authority to pronounce on the future of the Web had he not thrown his weight behind the W3C's horrible plan to put DRM into HTML.
[+] desireco42|12 years ago|reply
Let's crucify him for that :)

I hear what you are saying, but there is also a reason for DRM; otherwise we will have Flash forever. What I am saying is, I at least can understand why he did it.

[+] RexRollman|12 years ago|reply
I was thinking the same thing. Thank you.
[+] dbingham|12 years ago|reply
Decentralizing the web's software isn't good enough. We need to decentralize the hardware. Right now, connections to the web look like a tree, where a whole bunch of connections get funneled through an ISP. That ISP has the power. The power to throttle, the power to block, the power to record. And that ISP can be pressured by other powers. We need to decentralize so that instead of looking like a hierarchical tree, the internet looks like a graph, with each building forming a node that connects to its neighbors.

Of course, the amount of work it would take to build such a web and move to it likely rules out the possibility of it ever happening. I mean, how do we go about forming a movement to build this? It only works, really, if everyone's on board.
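The tree-vs-graph point can be sketched with a toy topology (node names here are made up for illustration): in a tree, removing the ISP hub strands every home; in a mesh, no single node is a choke point.

```python
from collections import deque

def reachable(adj, start, removed):
    """BFS over an adjacency dict, skipping one removed node."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in adj.get(node, []):
            if nxt != removed and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Tree topology: every home routes through a single ISP hub.
tree = {
    "isp": ["home1", "home2", "home3"],
    "home1": ["isp"], "home2": ["isp"], "home3": ["isp"],
}

# Mesh topology: each building peers directly with its neighbors.
mesh = {
    "home1": ["home2", "home3"],
    "home2": ["home1", "home3"],
    "home3": ["home1", "home2"],
}

# Knock out the ISP: the tree fragments, the mesh stays connected.
print(len(reachable(tree, "home1", removed="isp")))  # 1 -- home1 is stranded
print(len(reachable(mesh, "home1", removed=None)))   # 3 -- still fully connected
```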

[+] thisiswrong|12 years ago|reply
>Right now, connections to the web look like a tree, where a whole bunch of connections get funneled through an ISP

This is very true. More so if you look at countries like the UK, where ISPs have recently been consolidated into MAFIAA-complying monopolies that censor just about anything that Hollywood and the government ask them to. It's scary how close ISPs are to the governments of the UK and USA. Obama's recent NSA 'reform' proposal even suggested that ISPs be in charge of surveillance & storage of metadata instead of the NSA! WTF? Fascism.

Mesh networks have a big role to play in decentralizing the post-NSA internet. Micropayments through Bitcoin and cryptocurrencies will encourage individual nodes to provide bandwidth to users [1]. Smartphone apps like OpenGarden [2] and systems like the Athens Wireless Metropolitan Network (AWMN) [3] are the frontrunners of the new internet.

[1] https://crypto.cs.virginia.edu/courses/14s-pet/2014/01/30/aw... [2] http://en.wikipedia.org/wiki/Open_Garden [3] http://www.motherjones.com/politics/2013/08/mesh-internet-pr...

[+] archagon|12 years ago|reply
I sometimes wonder about this. Pretty much everyone already has a connected computer in their pocket. Wouldn't it be nice if we could use the phone without a cell provider? The web without an ISP? Connect to our friends without a social network? Exchange money without a bank?

Thinking further, what if all these services could be plugged into a well-abstracted peer-to-peer network, consisting of every connected device in the world? Services similar to Twitter or Facebook would no longer require a central host. Redundancy would be built in. Uptime would be pretty much guaranteed. Ads would go away. Freedom would be an implicit part of the system; no longer would profit motives sully (or censor!) services that people use and enjoy. And it would be more natural, too: pumping all our data through a few central pipes makes a lot less sense than simply connecting to our neighbors. Our devices would talk to each other just like we talk in the real world, only with the advantage of being able to traverse the global network in a matter of milliseconds.

New technologies like Bitcoin and BitTorrent would crop up naturally as part of this arrangement. It would put power back into the hands of the people.

Sadly, it seems a future of locked-down app stores controlled by a small number of large corporations is much more likely.

(Sorry, I realized this started sounding like a Marxist rant halfway through. And I'm not even a Marxist! It just sucks that this fascinating future is one that we're not likely to experience, even though we're almost at the point where we could actually implement it. I think.)

[+] aij|12 years ago|reply
Where I live at least, nearly everyone has a wireless router, which hardware-wise should have no trouble talking to other wireless routers in nearby buildings. Often, there are even multiple wireless routers in the same building, which can't talk to each other except through the ISPs, which I find quite ridiculous.

So, in urban areas of the US at least, it really is just a matter of software.

Of course, a large scale wireless mesh network would probably be somewhat lacking in terms of performance, but we also still have wired connections to ISPs and could start setting up wired connections to each other. Peering with your neighbor (wirelessly or not) to share idle bandwidth shouldn't make things any slower than they already are...

The problem of course is that everyone would have to use a compatible mesh protocol. There are several options for a mesh network protocol now, but I don't know of any that scales well while remaining decentralized. Even if someone did come up with a good protocol now, considering how long the switch to IPv6 has been taking, I wouldn't expect any switch away from IPv4 to happen very soon.

[+] dclowd9901|12 years ago|reply
Yes, probably too difficult to move entirely, but you can build it off of the current web and just migrate over. Or: you don't move a mountain, you move stones.
[+] m_mueller|12 years ago|reply
Just throwing an idea at the wall here: maybe it could be combined with the decentralization of the power grid, i.e. smart grids? This would (a) give additional incentives to home owners (subsidizing their investment) and power companies (going after the broadband market) and (b) solve the question of where to get the bandwidth from.
[+] takeda|12 years ago|reply
One possible solution is a different protocol: http://named-data.net/

This looks extremely interesting, basically the entire network works like a giant CDN.

[+] donpdonp|12 years ago|reply
The IndieWeb movement is pushing the constructs built on top of the web into a decentralized form using self hosted content and webhook notifications.

Example: Comments on a blog post.

Given a blog post at URL A, comments are created by posting a note on your own site at URL B, then using a webhook to notify the blog that such a comment exists.

Your content cannot go away as long as you host it, and it's not dependent on the blog's cooperation to be available, just to be findable.

http://indiewebcamp.com/comment
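The notification step above is what IndieWeb calls a webmention: a form-encoded POST carrying just the two URLs. A minimal sketch (all URLs here are hypothetical placeholders):

```python
from urllib.parse import urlencode
import urllib.request

def send_webmention(endpoint, source, target):
    """Notify target's site that source links to (comments on) it.
    A webmention is just a form-encoded POST with the two URLs."""
    body = urlencode({"source": source, "target": target}).encode()
    return urllib.request.Request(
        endpoint, data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

# URL B (your own note) notifies URL A (the blog post it comments on).
req = send_webmention(
    "https://blog.example/webmention",       # hypothetical endpoint
    source="https://me.example/notes/42",    # URL B, your copy
    target="https://blog.example/post/1",    # URL A, the blog post
)
print(req.get_method(), req.data)  # the caller would urlopen(req) to send it
```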

[+] GrinningFool|12 years ago|reply
Doesn't that introduce the reverse problem though?

That makes someone else's content depend on the availability of yours: in the example of comments, the context and any intermediate replies can be lost if you stop hosting them, leaving behind not a record of a conversation but only indecipherable fragments.

A quick scan through the linked page also seems to indicate the protocol depends a lot on content providers being well behaved. I'm assuming that somewhere in there is handling for when content simply goes away without notification?

[+] riffraff|12 years ago|reply
so, trackbacks?
[+] soapdog|12 years ago|reply
The idea of an open, free and decentralized web created by all for all is one of the main reasons I became a Mozillian. I invite all that think this is a good idea to check out their local Mozilla communities and help carry this mission onwards.

You can get guidance on how to get involved with Mozilla by following the instructions and filling the form at http://www.mozilla.org/en-US/contribute/

=)

[+] rasz_pl|12 years ago|reply
Funny you say that. If you study history you will learn that the Mozilla org was big on pushing proprietary, non-standards-compliant extensions. The rise of IE curbed their enthusiasm because the roles changed. I think it was in one of the interview videos at https://www.coursera.org/course/insidetheinternet
[+] nawitus|12 years ago|reply
The internet is decentralized; the web never has been. The HTTP protocol doesn't support it. But it's time to change that. What we need is the possibility of peer-to-peer connections between browsers.
[+] zackmorris|12 years ago|reply
While I agree that TCP/IP is decentralized, unfortunately due to NAT and corporate firewalls, vast swaths of users have been relegated to second class citizens who can only act as data consumers. Unfortunately this same misguided approach to security, or as I call it suckerity, has been carried forward into IPv6.

There is some hope that WebRTC and other p2p technologies might alleviate the problem but then the task of punching through firewalls with techniques like STUN falls on developers. I'm not seeing an industry-led campaign to standardize a true p2p protocol based on something like UDP, because it would undermine the profits of data producers.

I could go into how the proper layering of TCP should have been IP->UDP->TCP instead of IP->TCP and a whole host of technical mistakes such as inability to set the number of checksum bits, use alternatives to TCP header compression, or even use jumbo frames over broadband, but what it all comes down to is that what we think of as “the internet” (connection-oriented reliable streams and the client/server model) is not really compatible with a distributed content-addressable connectionless internet that can work with high latencies and packet loss.

I think if this network existed, then extending HTTP to operate in a p2p fashion wouldn’t be all that complicated. Probably the way it would work is that you’d request data by hash instead of by address, and local caches would send you the pieces that match like BitTorrent. Most of the complexity will center around security, but I think Bitcoin and other networks of trust point the way to being able to both prove one’s identity and surf anonymously.
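That hash-addressed lookup can be sketched in a few lines (a toy in-memory store, not any particular protocol): content is named by the SHA-256 of its bytes, so any peer holding a matching blob can answer, and the requester can verify it without trusting the sender.

```python
import hashlib

class HashStore:
    """Toy content-addressable store: content is requested by the
    SHA-256 of its bytes rather than by a server address."""
    def __init__(self):
        self.blobs = {}

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self.blobs[digest] = data
        return digest  # the "address" anyone can use to fetch it

    def get(self, digest: str) -> bytes:
        data = self.blobs[digest]
        # The name is self-certifying: re-hash before trusting a peer.
        assert hashlib.sha256(data).hexdigest() == digest
        return data

store = HashStore()
addr = store.put(b"<html>hello</html>")
print(addr[:12], store.get(addr) == b"<html>hello</html>")
```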

Tim Berners-Lee may not be thinking about the problem at this level but we need more people with his clout to get the ball rolling. I wasted a lot of time attempting to write a low-level stream API over UDP for networked games before zeromq or WebRTC existed and ended up failing because there are just too many factors that made it nondeterministic. It’s going to have to be a group effort and will probably require funding or at least a donation of talent from people familiar with the pitfalls to get it right. Otherwise I just don’t think a new standard (one that makes recommendations instead of solving the heart of the problem) is going to catch on.

[+] icebraining|12 years ago|reply
Why do you say HTTP doesn't support it? If a browser doubled as a server, and each person hosted their own pages, how would that not be decentralized and P2P? Hell, it's been done: Opera Unite had just that model, with web pages and applications hosted by the browser itself: http://www.operasoftware.com/press/releases/general/opera-un...
[+] jnbiche|12 years ago|reply
Peer-to-peer connections are possible right now with Chrome and Firefox using WebRTC peerconnection.

The WebRTC connection is bootstrapped through a server (STUN and TURN), but that's the case with almost all peer-to-peer software (take a look at how Bitcoin or BitTorrent is bootstrapped if you don't believe me). And anyone can run a STUN and/or TURN server.

Once the connection is bootstrapped, it's entirely peer-to-peer between browsers.
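For a feel of how small that bootstrap step is, here's a sketch of the RFC 5389 STUN Binding Request a client fires at a STUN server over UDP (the response tells it its own public address/port, which is what lets the peers rendezvous):

```python
import os
import struct

MAGIC_COOKIE = 0x2112A442   # fixed value from RFC 5389
BINDING_REQUEST = 0x0001    # message type: Binding Request

def stun_binding_request() -> bytes:
    """Build a minimal STUN Binding Request: 2-byte type, 2-byte
    attribute length (0, no attributes), 4-byte magic cookie, and a
    random 12-byte transaction ID."""
    txn_id = os.urandom(12)
    return struct.pack("!HHI", BINDING_REQUEST, 0, MAGIC_COOKIE) + txn_id

pkt = stun_binding_request()
print(len(pkt))  # 20 -- header only, no attributes
```

Sending it and parsing the XOR-MAPPED-ADDRESS in the reply is a few more lines; the point is that the "server dependency" is one tiny UDP exchange.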

[+] EGreg|12 years ago|reply
I wrote about this as well: http://myownstream.com/blog#2011-05-21

Two years ago I became really passionate about the problem of decentralizing the consumer internet again. We can see with git and other tools how distributed workflows are better in many ways than centralized ones. The internet was originally designed to be decentralized, with no single point of failure, but there's a strong tendency for services to crop up and use network effects to amass lots of users, and VC firms have a thesis to invest in such companies. Even so, the future is in distributed software, like WordPress for blogs or Git for version control.

I started a company two years ago to build a distributed publishing platform. And it's nearly complete.

http://qbix.com/blog

http://magarshak.com/blog/?p=135

Soon... it will let people run a distributed social network and publish things over which they have personal control. And I'm open sourcing it:

http://github.com/EGreg/Q

http://qbixstaging.com/QP <-- coming soon

[+] wwwtyro|12 years ago|reply
Part of the difficulty of bringing this about is making p2p easy for users. I have high hopes that WebRTC data channels can start tearing these walls down by making it as simple as opening a web app.
[+] valisystem|12 years ago|reply
Not only easy, but also fast and reliable enough to compete with centralized services.
[+] chimeracoder|12 years ago|reply
> Part of the difficulty of bringing this about is making p2p easy for users.

As a general concept, P2P is easy for users - using bittorrent clients, TPB, etc. is essentially mainstream.

The one thing that would be nice is browser support - I would love it if Chrome had the option to download .torrent files using the Chrome download manager.

Thanks to web seeds[0], this would allow providers to convert any download to a torrent instead, both increasing use of P2P and decreasing their own server load.

Users wouldn't even need to know that they're using Bittorrent to download files - all they'd see are faster download speeds.

[0] https://en.wikipedia.org/wiki/BitTorrent#Web_seeding
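The web-seed idea lives in the torrent metainfo itself: a `url-list` key (BEP 19) pointing at an ordinary HTTP mirror. A minimal bencoder makes the format concrete (the tracker and mirror URLs below are hypothetical):

```python
def bencode(value) -> bytes:
    """Minimal BitTorrent bencoding for ints, bytes, lists, and dicts
    (dict keys are emitted in sorted order, per the spec)."""
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        return b"d" + b"".join(
            bencode(k) + bencode(v) for k, v in sorted(value.items())
        ) + b"e"
    raise TypeError(value)

# Toy metainfo dict: "url-list" is the BEP 19 web-seed key that lets
# a plain HTTP mirror serve pieces alongside ordinary peers.
torrent = {
    b"announce": b"http://tracker.example/announce",   # hypothetical tracker
    b"url-list": [b"http://mirror.example/file.iso"],  # hypothetical web seed
    b"info": {b"name": b"file.iso", b"length": 1024, b"piece length": 512},
}
print(bencode(torrent)[:20])
```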

[+] sergiosgc|12 years ago|reply
We need IPv6. We need every device to be fully connected, not the half-assed solution we have now. Full connectivity plus ever-increasing distributed and unused computing capacity is the right playground for a new set of network usage paradigms.

The mainframe->distributed->mainframe cycle will then once again swing to distributed (yes, we now live again in a mainframe age).

[+] peterwwillis|12 years ago|reply
This actually has nothing to do with the web in general, or the internet, or politics, or personal liberties, or peer to peer networks. This is about designing stable distributed applications.

Almost every really big, really stable website (or network service) is built on a set of distributed applications. Global data redundancy is just one of the considerations. If you want your application available everywhere, all the time, you have to design it to withstand faults, to distribute data and computation, and to do this over long distances, and still perform the same actions the same way everywhere. That's all a de-centralized web needs to look like, and it's actually already implemented in many places.

What I think Tim is saying is that we need to move away from concepts that centralize data and computation and use existing proven models to make them more stable in a global way. And I'm totally behind that. But if you think we're going to get there with p2p, self-hosted solutions, new protocols, new tools, new paradigms, or a new internet, you've completely missed the boat. We have had everything we need to accomplish what Tim wants for a while. It just has to be used properly [which some companies actually do].

But good luck convincing most companies to spend the time and money doing that...

[+] dclara|12 years ago|reply
I partially agree with you in the sense that we don't need a new network with p2p or self-hosted solutions, but we do need stable distributed applications. How?

I cannot agree with you that we can realize it via data redundancy and replication. What you describe is just what Google is doing right now, which is not true distributed computing. Google's architecture relies on replicating services across many different machines to achieve distributed computing/storage and fault-tolerance across the world. See the reference here: http://static.googleusercontent.com/external_content/untrust...

The real distributed computing solution for providing a well-organized web has been proposed: build a layer on top of the existing physical infrastructure of the internet that presents the web in a decentralized, systematic way. http://bit.ly/MwT4rx

[+] aaronpk|12 years ago|reply
I always chuckle when the title of an article has obviously been changed after it first went online, because the slug uses a different word:

"Tim Berners-Lee: we need to re-decentralise the web"

vs

"tim-berners-lee-reclaim-the-web"

[+] dkuntz2|12 years ago|reply
That doesn't imply the title was changed. Shorter slugs are nicer, and there could be some policy involved in keeping them short. It doesn't mean anything.
[+] swalsh|12 years ago|reply
I've been trying to think about "rethinking" government for quite a while. The idea of a web-based governance system seems really appealing, in the sense that I think the greatest strength of the web is how quickly ideas can go from concept to execution. It's about the closest thing to a meritocracy we've ever come, and that's simply because the gatekeeper is thrown away. Think about how many great things have sprung out of places like Reddit. Ordinary people who have a quick flash of a good idea can take 30 minutes to type something up... and sometimes it gets traction. Getting things done in our current system requires an almost inhuman persistence, and the luck of making a relevant connection. Even if you have a good idea, it's a difficult thing to execute.

I have this image of a framework/protocol in my head for allowing various policies to be created and built on top of each other. As a programmer I pretty much used object-oriented programming as an inspiration. Though I guess even OOP had its inspiration from biology. It's a way for an organic system to grow and adapt robustly. I think that just like computer programs, government needs that too.

Added to that, if the web has a decent self regulation mechanism, that makes the justification of regulating the web even harder.

The one problem is, the advantage "physical" governments have is that they have sovereignty. If Amazon starts doing illegal things, the FBI can arrest Jeff Bezos. The internet doesn't have an equivalent means.

However I think there is a way to have the essence of sovereignty, and that is through a new cryptocurrency like Bitcoin. Imagine if, to be a part of and receive the benefits of this online sovereign nation, you had to register your receive address. If the sovereign entity had a way to embargo your ability to accept payments in the currency, it might be enough of a penalty to fall back in line with whatever regulations were created. I think it also provides another means to give legitimacy to a cryptocurrency.

An example of something that would be cool: you want to create an open source space program, so you propose the idea on your local netizen forum. The idea has overwhelming support, so someone creates a new policy component and gives it permission to draw on tax funds. I can imagine something like that happening in hours. Imagine a public effort going from idea to reality in hours.

[+] cromwellian|12 years ago|reply
My own rant on this from 2008 from a different perspective: http://timepedia.blogspot.com/2008/05/decentralizing-web.htm...
[+] oneofthose|12 years ago|reply
Excellent article, and almost 6 years old. There are many projects to mention here, but so far none of them has taken off as far as I know. Or did they?

After publishing my own rant about the same topic [0], Dave Winer commented via Twitter and argued that RSS is a very successful tool that decentralizes the web. I didn't understand his point at the time but today I think he is absolutely right.

His point was that when trying to fix the problem we should look at what already exists and works. Why did RSS succeed? Probably because it solved a problem for many people. It did not try to reinvent old things in a decentralized way but offered real benefits. Another good example of this is svn vs git (centralized vs decentralized version control).

[0] http://www.soa-world.de/echelon/2011/09/the-decentralized-we...
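Part of why RSS decentralizes so well is how little machinery it needs: every site publishes a small XML file, and any reader can poll it with no central service involved. A minimal sketch (feed content and URLs below are made up):

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed, the kind any self-hosted site can serve
# as a static file (hypothetical URLs).
FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Blog</title>
  <link>http://blog.example/</link>
  <item><title>Post one</title><link>http://blog.example/1</link></item>
  <item><title>Post two</title><link>http://blog.example/2</link></item>
</channel></rss>"""

def item_titles(feed_xml: str) -> list[str]:
    """Pull the item titles out of an RSS 2.0 document."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title") for item in root.iter("item")]

print(item_titles(FEED))  # ['Post one', 'Post two']
```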

[+] nathana|12 years ago|reply
Fantastic! Thanks for sharing. I've been beating the same drum for years within my circle of friends, and I always tend to get the "huh?" reaction. It's always refreshing to discover somebody else who seems to be running on the same wavelength. :)

I miss USENET. Nowadays, my digital identity has been spread to the four corners, across Facebook, Twitter, various unrelated phpBB-based sites, and places such as /. and HN. Even if I wanted to, there is no possible way that I could re-locate every digital utterance I have ever made publicly. Not only that, but I don't truly "own" anything I write anymore. It's stored in some unmarked server somewhere, stuffed into a schema that has no published documentation. I still have e-mail archives going back 10+ years, but if Facebook ceases to exist 10 years into the future, it will be like the (admittedly few) worthwhile written conversations and interactions I had with people on there never happened. There will be no record of it.

There's got to be a better way.

[+] gress|12 years ago|reply
Given the support Google has from the tech community, I don't see much movement on this in the near future.
[+] techaddict009|12 years ago|reply
Tim Berners-Lee: "I would have got rid of the slash slash after the colon. You don't really need it. It just seemed like a good idea at the time."

When asked what he would have done differently!

[+] zokier|12 years ago|reply
I'm surprised by the number of comments declaring "Berners-Lee supported DRM so his words are meaningless". Not that I myself put a lot of weight on his words, but not because of some silly DRM issue (frankly I didn't even remember that); rather, I can't really say what his contribution to the web (or anything relevant) has been in the past couple of decades. Looking at his Wikipedia page, it mentions that he founded the W3C in 1994, and that's about it. Everything since seems to be some sort of fluff, often with grandiose names and concepts but very little substance.

So what surprises me is not that people don't put weight on his words, but rather the reason why they don't, and the apparent recency of this opinion.

[+] munificent|12 years ago|reply
Wow, the giant flashing ad on the left of that article was so distracting I was unable to read the text. I eventually opened up the web inspector and just deleted the damn thing from the DOM.