
Do we need browsers?

122 points | barnacs | 10 years ago | blog.justletit.be | reply

113 comments

[+] mbrock|10 years ago|reply
Personally, I don't enjoy using web browsers much, either for "documents" or apps. There's a lot of good stuff out there made with hard work by talented people, but when it comes to my own personal computing, I would much rather use other kinds of interaction.

99% of my internet life consists of a few homogeneous styles of interaction: browsing pseudo-hierarchical directories; communicating via text; purchasing; looking at pictures and movies; etc. In a hypothetical cyber-utopia or something, this could all just be a simple protocol (say, um, combining aspects of Direct Connect, BitTorrent, NNTP, Gopher, Bitcoin, GPG) that I could use with my own client, which would probably be an Emacs interface. Instead we live in a world where the web standards seem to encourage teams to spend their energy on reimplementing drop-down widgets.

It's wasteful and doesn't even lead to particularly good user interfaces. How many sites or web apps work well (actually well, not just semi-tolerably) with keyboard navigation? How much bandwidth and CPU goes to waste? How many sites work passably offline? How easy is it to automate tasks as a user? How easy is it to learn how to make your own sites?

I think "browsers" should be something very different from what they are today, and maybe something new will come along in the near future.

[+] pjc50|10 years ago|reply
The browser is what it is because it's the camel designed by global committee. It's the place where competing interest groups fight one another to a standstill on a daily basis. Quite a lot of the stranger aspects aren't so much "features" as blast craters of past disasters.

For example, consider the five technologies for interactive, animated things in the browser: Java applets, ActiveX, Flash, Silverlight, and Javascript. Apart from Silverlight, which never really took off, they're all 20 years old. Only Javascript has survived the security wars; Java applets are dead, Flash gets a CVE bullet with its name on it every few weeks, and ActiveX was killed long ago.

The near-death of NNTP is a story of spam cancels and cost allocation.

The browser covers all use cases and is available everywhere. That means it's necessarily horribly compromised compared to native solutions, but is an absolutely killer advantage for adoption. Incrementalism nearly always wins.

Furthermore, the unwillingness of people to pay directly for software leaves us with a continual problem of exploitative software. Everything from flashlight apps that steal your contact list to ads that steal your battery to connection-sharing apps that open you to liability for the actions of others. For the moment, we keep other people's software securely nailed shut in the browser.

[+] kylebrown|10 years ago|reply
> In a hypothetical cyber-utopia or something, this could all just be a simple protocol (say, um, combining aspects of Direct Connect, BitTorrent, NNTP, Gopher, Bitcoin, GPG)

The article does mention p2p and decentralization, but the browser isn't what's holding that back. The client-server development paradigm is.

Currently, every dev making a p2p app has to roll their own platform because there's no standard (i.e. no LAMP, rails, or Heroku for deploying p2p apps). Some projects are working to change this, namely BitTorrent Maelstrom (which is a fork of Chromium with native support for magnet URLs, so it auto-renders an html/js package distributed via torrent); and http://ipfs.io (a sort of content-addressable p2p filesystem, or bittorrent on steroids).
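Content addressing, the core idea behind IPFS, can be sketched in a few lines: an object's address is the hash of its bytes, so any peer can verify what it fetched no matter who served it. A rough illustrative Python sketch (an in-memory dict standing in for the swarm; not the actual IPFS API):

```python
import hashlib

store = {}  # stand-in for a swarm of peers

def publish(data: bytes) -> str:
    """The address is derived from the content, not from a location."""
    addr = hashlib.sha256(data).hexdigest()
    store[addr] = data
    return addr

def fetch(addr: str) -> bytes:
    data = store[addr]  # could come from any untrusted peer
    if hashlib.sha256(data).hexdigest() != addr:
        raise ValueError("content does not match its address")
    return data

addr = publish(b"<html>hello p2p</html>")
assert fetch(addr) == b"<html>hello p2p</html>"
```

Because the address commits to the content, integrity checking comes for free; location and trust become separate problems.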

[+] teknologist|10 years ago|reply
Or, in short, you want everything to run in an Emacs interface. The world has moved on. We have mice now. Sorry. Google implemented keyboard navigation in their search results and apparently the uptake has not been great.

CPUs are there to be "wasted". You pay for that clock frequency to use it and make your life better. Most people don't want to do everything in a terminal. It's depressing.

[+] nso95|10 years ago|reply
That's the entire point of the semantic web.
[+] asdfaoeu|10 years ago|reply
Regardless of what he's said, it's nice to read a webpage that for once didn't take 5 seconds to load, didn't clobber my screen with unnecessary junk or have jerky scrolling.
[+] qznc|10 years ago|reply
I would have appreciated a sensible max-width on my fullscreen desktop browser.
[+] onion2k|10 years ago|reply
The user opens a URL that downloads an 'app' and all its dependencies. So, to start with, you'd have dozens of UI frameworks, network libraries, parsers, etc on your system. Some of them would be great, others would be terrible. Slowly, as the best ones bubble up to the top and become 'standards', each of those dependencies would disappear. Developers would settle on one of a few different engines and frameworks. Apps would be the content, a few scripts to drive the engine, and a manifest to tell the user's computer which standard engine to download.

Which would essentially be the modern web as it is now.
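The bubbling-up process described above could be sketched as a manifest naming a required engine, with the user's machine fetching and caching each engine once. The manifest format and engine name here are entirely made up for illustration:

```python
# Hypothetical app manifest: the content plus a pointer to a "standard" engine.
manifest = {
    "engine": "acme-ui-engine@2.1",   # one of the few engines that "won"
    "entry": "main.script",
    "assets": ["index.doc", "logo.img"],
}

engine_cache = {}  # engines are downloaded once and shared by many apps

def launch(manifest: dict) -> str:
    engine = manifest["engine"]
    if engine not in engine_cache:
        # In a real system this would download and verify the engine.
        engine_cache[engine] = f"<downloaded {engine}>"
    return f"running {manifest['entry']} on {engine_cache[engine]}"

print(launch(manifest))
```

Once a handful of engines dominate the cache, the manifest plus content is, as the comment notes, essentially a web page plus a browser.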

[+] ajankovic|10 years ago|reply
As soon as the author started explaining problems about browsers I started thinking about solutions. And I realised that "we" already tried to solve this problem with plugins. Flash, Java applets, Silverlight, etc. were all envisioned as solutions to these problems. And it doesn't take too much brain to realise they all failed (at least in the popularity contest).

The RESTful nature of the Web gives a good structure to build on, and time shows that evolution, rather than forcing one good solution, is a much better path to technology adoption when it comes to the masses.

[+] barnacs|10 years ago|reply
What you describe would be some kind of a natural selection among lower level abstractions. Maybe they would indeed converge into a "few standard engines", maybe not. Either way, they wouldn't force your hand: you could always use whatever UI framework or network library you want and still distribute your app as a simple URI.

In contrast, "the modern web as it is now" gives you a single built-in layout engine and not even a chance to implement network libraries. And that's exactly my point: For a document viewer, that's fine. For an application distribution platform, not so much.

[+] anon4|10 years ago|reply
Sometimes I dream of an alternate future where SUN weren't asshats and Java won and we're all running applications that come with maven POMs. Then I remember how badly the average web developer can bungle simple static HTML and realise how happy I am I don't have to debug spaghetti XML.
[+] MadcapJake|10 years ago|reply
The author meant "app" as in a real application on your computer not a "webapp" like from the Chrome Web Store.

Interestingly, I'd say it's closest to how the latest Android does it. E.g., click on a wikipedia link and it opens the page in the wikipedia app.

I'm not really following your fragmentation/consolidation line of argument or how it relates to the modern web...

[+] rimantas|10 years ago|reply
The modern web is nothing like that.
[+] aikah|10 years ago|reply
Hyperlinks are successful. If "native apps" had a standard way to handle links between apps, then we wouldn't need browsers. If "native apps" ran automatically in a secure and sandboxed environment without requiring any installation, then we wouldn't need native apps. We can argue about web techs, but there is no alternative to browsers and the web.

URLs / Security / Platform independence

Browsers based on standards offer all of these already. The problem is people trying to make products the web was not designed to run. That's why Flash and co became so popular at one point, and that's why browser vendors are now trying to come up with something like WebAssembly. I say sometimes it makes much more sense to write a native app. One isn't going to run a video-editing app like Final Cut on the web (especially since the file system API has been dropped, since Mozilla wasn't interested in it).

[+] MadcapJake|10 years ago|reply
The thing is, people create software to advance a purpose/goal or (perhaps more often) to make money. The web is the most accessible/available platform for either. That is why so many non-web-like technologies/paradigms end up on the web (flash, SPAs, huge client-side frameworks).

The author is saying, let's take the accessibility/entrepreneurial-nature of the web and bake it right into the user's desktop environment rather than placing it on top of a platform (the browser) that has to rewrite much of what's already there in the OS.

[+] black_knight|10 years ago|reply
Off on a tangent here. He mentions he wants a tool which takes URIs and figures out the correct program to handle them. This sounds a lot like plumber [1] from Plan 9. Plumber has a powerful, configurable method of finding the right tool to handle a string, and other programs can make it aware of the context the string should be interpreted in.

[1] http://doc.cat-v.org/plan_9/4th_edition/papers/plumb
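The plumber model is essentially a list of (pattern, handler) rules tried in order until one matches. A rough Python sketch of the idea (the rules and program names are invented; Plan 9's actual rule language is richer):

```python
import re

# Ordered rules: first match wins, like plumber's rule file.
rules = [
    (r"^https?://.*\.(png|jpg)$", "image-viewer"),
    (r"^https?://",               "web-renderer"),
    (r"^mailto:",                 "mail-client"),
    (r"^magnet:",                 "torrent-client"),
]

def plumb(uri: str) -> str:
    """Return the first program whose rule matches the string."""
    for pattern, program in rules:
        if re.match(pattern, uri):
            return program
    return "default-handler"

assert plumb("https://example.com/cat.png") == "image-viewer"
assert plumb("mailto:user@example.com") == "mail-client"
```

The key design point is that the dispatch table lives with the user, not the content producer, so the user decides which tool handles which kind of resource.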

[+] ReaperOfCode|10 years ago|reply
Sounds to me like how things already work on mobile OSes, e.g. on Android tapping a YouTube link can take you to the app to view it instead of the browser. Perhaps you could have the same kind of function on desktop OSes; it shouldn't be too hard, all the apps just need to understand URLs. Could even go further than a lookup table for certain domains, and have different protocols etc.
[+] DanielBMarkham|10 years ago|reply
We do not need browsers, but for different reasons than the author uses.

Browsers are just one of many tools to consume one of many types of data on the internet. The fact that we've fallen into this browser-web-address-as-a-location-app-content-mining paradigm initially does not mean that it is the optimum one going forward.

To see how misaligned we've gotten, play a thought-game: if I were a blind, privacy-sensitive person, how would I consume content on the web? I wouldn't want or need ads, user-tracking, or any of the rest of it. All I'd want is somebody to read me some text for 5-10 minutes, maybe a few times a day. Each time I might have something to reply -- or not. This would provide me everything the current web does, and would be much more lightweight and flexible. 99% of the bytes we're pushing and the interactivity we experience from browsers has nothing at all to do with long-term value. It's much more focused on stickiness and engagement. One might use the word addiction. The core text data itself, while somewhat useful, is not very much at all compared to the rest of it.

Browsers are not built in users' best interest. Therefore I predict they'll be around for a very long time.

[+] JadeNB|10 years ago|reply
> I wouldn't want or need ads, user-tracking, or any of the rest of it.

Does anyone, blind or not, want or need ads or user tracking? (I suppose you could argue that any login constitutes a (hopefully) benign form of user tracking; but then I could say that blind people will need such logins too!)

[+] AshleysBrain|10 years ago|reply
Facebook is arguably a sophisticated real-time communications and social networking app. Imagine if Facebook launched a desktop app to do that. Who would use it? Would they ever have become as successful as they are today? My reading is the author seems to think something like Facebook is entirely unsuitable for the web and contrary to what it was designed for, but I think it excels in a browser because of, not in spite of, the strengths of the web: cross-platform, URLs, connectivity, low barrier of entry, and so on.

Someone will probably bring up the Facebook HTML5 mobile app thing, but consider on desktop it was always good enough, and in fact they still have a mobile web version of Facebook, which as far as I am aware is also pretty good. I think the main problem Facebook faced with their mobile app was the immaturity of web view controls on mobile, which have since come on by leaps and bounds (WKWebView on iOS 8+, Chromium web view on Android 4.4+).

[+] inDigiNeous|10 years ago|reply
The mobile version of Facebook is crap. At least on android. Locks up and jams the machine running it, the one running inside a browser is much better. This on a performant NVidia Shield tablet, so hardware is not an issue.
[+] mbrock|10 years ago|reply
Lots of people used Napster, DC++, email, Usenet, Kazaa, IRC, ICQ, MSN, AIM, RSS, SMS, all kinds of stuff. There are lots of possibilities. Facebook's success is interesting but it's totally possible that something similar could have happened with another protocol than HTML over HTTP, no? Either way, Facebook is basically a commercial success involving network effects, vendor lock-in, good marketing, timing, etc; it's not a technical breakthrough at all.
[+] mixmastamyk|10 years ago|reply
I've been thinking the same thing for years. There were a few attempts at parts of the problem like Java, XUL, XForms, etc., but they didn't break out.

Might be time for a rethink of the whole thing. An open, remote updatable, sandboxed, cross-platform, app platform, and leave the docs to the web browser.

[+] tritium|10 years ago|reply

  open, 
  remote updatable, 
  sandboxed, 
  cross-platform, 
  app platform
The problem isn't that we haven't faithfully accomplished all of those ideals set forth. The problem is that there were much too few people sophisticated enough to appreciate what was placed before them.

And amongst the minuscule audience that did understand what lay in their hands, half chose to abuse the unwashed masses that didn't tend to the honor system that stood in place of proper technical security practices at the time.

Were such things simply way too far ahead of their time, lost on a market too immature for such luxuries to be made generally available?

Will there ever be a time when people care enough about anything other than instant messaging, to invest hours learning the intricacies of how to make a VCR stop flashing 12:00 AM?

[+] CmonDev|10 years ago|reply
WebAssembly is the latest hope.
[+] pjmlp|10 years ago|reply
No, we don't; network protocols are what matter.

I like to have my email client, my native RSS reader, still jump to newsgroups occasionally, use my desktop chatting applications...

[+] EvanAnderson|10 years ago|reply
I often have mental flights of fancy when I wonder what the world would have been like if end-to-end had been preserved and IPv4 NAT hadn't arisen to put a stranglehold on protocol development. Twitter, Facebook, eBay, et al. should be protocols, not freestanding businesses.

It's much harder to "monetize" a protocol. That's a feature, not a bug.

[+] amelius|10 years ago|reply
The problem in my opinion is that the browser is addressing too many layers of abstraction at once. This makes it very difficult to get the specs right, security becomes immensely difficult to get right, and it becomes almost impossible to implement a browser (bad for competition).
[+] lmm|10 years ago|reply
Sadly there's no way to get to there from here. Something like Java Web Start was and remains a much nicer application platform than a web browser. But every device has a web browser, and manufacturers fall over themselves to add support for the latest "web" functionality.

And ultimately it doesn't matter. Many of the layers of the computing stack are over- or under-engineered for the task they end up performing (have you seen the x86 ISA? The ELF spec?). But computers are very good at abstractions, so ultimately none of that matters. Running our applications on the web costs us a bit of complexity, a bit of performance, but we'll reach the point where all the mess is hidden from the ordinary day-to-day programmer.

[+] z1mm32m4n|10 years ago|reply
The article mentions that "web browsers have become resource hungry beasts with millions of lines of code," suggesting that much of this cruft has been the result of having to support backwards-compatible standards for the web.

I'd love to see a project similar in spirit to Servo that, instead of refactoring the language browser engines are built with, refactors the functionality they provide. Something that identifies the Majority Use Case™ and tries throwing out the rest.

I'm not saying that we should push for deprecation of certain functionality, but I think it'd be interesting if people would start using this browser for the promise of faster, snappier surfing.

[+] gunn|10 years ago|reply
I've been thinking about something like this for a while - a standard for an easily optimised subset of html technologies. To conform to this standard, pages would be restricted in the ways they can manipulate the DOM, have a simpler DOM, use only a small fraction of CSS properties, and not use some JS features e.g. eval, delete.

We can use the asm.js model for opting in. Browsers that support it run the pages super fast, other browsers run them just as fast as usual.

A browser engine supporting just this standard would be considerably smaller, more embeddable, and a nicer base for current webkit based apps (e.g. spotify, steam, or atom). It might also help apps that want to use something like webviews for embedding content but need to be careful with memory / performance.
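The asm.js-style opt-in described above amounts to a pragma the fast engine recognizes, plus a validator that falls back to the full engine on any violation. A toy Python sketch (the subset rules, pragma string, and page structure here are all invented for illustration):

```python
# Hypothetical "fast subset" rules: no eval/delete, tiny CSS whitelist.
BANNED_JS = ("eval(", "delete ")
ALLOWED_CSS = {"color", "margin", "padding", "display", "font-size"}

def conforms(page: dict) -> bool:
    """Check a page against the hypothetical fast-subset standard."""
    if page.get("pragma") != "use subset":   # opt-in marker, like asm.js
        return False
    if any(tok in page["js"] for tok in BANNED_JS):
        return False
    return set(page["css"]) <= ALLOWED_CSS

page = {"pragma": "use subset", "js": "render();", "css": ["color", "margin"]}
engine = "fast-subset-engine" if conforms(page) else "full-engine"
assert engine == "fast-subset-engine"
```

As with asm.js, the crucial property is that non-conforming pages still work; they just don't get the fast path.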

[+] fridek|10 years ago|reply
From what I've heard, Microsoft Edge looks like such an attempt. When a site requires some kind of compatibility mode, the not-so-good-but-indeed-old IE is spawned to serve it. Great approach, and I hope they set the bar high to make the common use case faster.

My ideal browser would support only something like "use strong" from [1] and spawn Netscape 4.0 for all pages that abuse JS.

[1] https://developers.google.com/v8/experiments

[+] annnnd|10 years ago|reply
I think OP is missing the point. Browsers of today are taking over the space which Java tried to capture (remember the "write once, run everywhere" slogan?) but failed. They provide a (mostly) unified development platform. But the catch is that the unification comes as a direct result of the fact that they are meant to be something else. Any other unified platform faces an uphill battle while browsers are... just there. Are they perfect as a development platform? Hell no. But in the absence of any other option, well... we take what we can get.
[+] TuringTest|10 years ago|reply
Do we need Swiss Army knives? For every tool that's included in the knife, you can get a much better stand-alone version that performs the same task much more efficiently.

So, what possible benefit could you get from having so many poor tools in a single place, that you can't improve by carrying around the equivalent set of separate high-quality tools? The Swiss knife ought to be such a terrible idea that nobody would ever use it, right?

[+] mbrock|10 years ago|reply
The Swiss Army knives are of course known to be simple, beautiful, time- and battle-tested, coherent, reliable, etc. There may be better multi-purpose tools, but the ideal of the Swiss Army knife sets a high bar. Compare that to the browser... You can't bring a browser with you to the woods unless you have really good 3G. No single human can even understand everything a browser does. Browsers are huge, unwieldy, and change constantly. Army knives, like Zippo lighters, pride themselves on having near-Platonic designs that haven't changed in a hundred years. An intelligent extraterrestrial could grok them. A more appropriate metaphor for the browser is a tarpit, as in "Turing tarpit."
[+] geon|10 years ago|reply
I envision a switchable client, specified by a URI in the response headers. The client would be a platform-independent bytecode of some sort, like NaCl.

The only thing the browser would supply would be the chrome, networking, sandboxing, and a canvas for the client to draw to.

The current web runtimes could be refactored into one of these clients. If you don't like the way CSS works, or if you think JS is weird, just write another client.
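The scheme above could be sketched as dispatch on a response header naming the client runtime, with the browser supplying only the sandbox and a canvas. The header name and runtime URIs here are invented; this is a sketch of the idea, not a proposal:

```python
runtime_cache = {}

def handle_response(headers: dict, body: bytes) -> str:
    """Pick the rendering client named by the server; default to the
    built-in HTML/CSS/JS runtime when no header is present."""
    client_uri = headers.get("X-Client-Runtime", "builtin:html-css-js")
    if client_uri not in runtime_cache:
        # A real implementation would fetch, verify, and sandbox the bytecode.
        runtime_cache[client_uri] = f"<sandboxed runtime from {client_uri}>"
    return f"rendering {len(body)} bytes with {runtime_cache[client_uri]}"

out = handle_response({"X-Client-Runtime": "https://example.com/my-ui.wasm"},
                      b"app payload")
```

The appeal is that disagreeing with CSS or JS no longer means forking a browser, only shipping a different runtime URI.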

[+] Boldewyn|10 years ago|reply
Well, we have a scenario rather close to the one described in the article on mobile: Browsers for text content and apps for specialized stuff, that can incorporate web views, if needed, and access the net. (Yes, I’ve read the second-to-last sentence.) In a nutshell, it sucks.

To start with, I don’t buy the premise “Okay, so web browsers are awful for applications.” The statements before are way too generic to prove anything.

“[...]resource hungry beasts with millions of lines of code” falsely connects those two properties.

“[...]use several gigabytes of RAM, even when just displaying document-like content” might also be rooted in advertisers packing megabytes of rubbish in an iframe or web devs loading tons of unneeded web fonts. So, that’s bad engineering on the server side, not the browser’s.

“[...]that reimplements much of the features of an operating system on top of a real operating system” Chromebook anyone? Yes, that’s actual, ready-to-be-bought devices out there right now, that do exactly this. And lo, the problems are somewhat contained.

The conclusion also does not show any solution to the non-problem discussed above. “Imagine something like xdg-open.” I don’t need to imagine that, I have it right before me available in the terminal. And packing another service discovery on top of the stack is, to come back to my opening words, not so different from the closed-world app stores. Even Ubuntu has such a thing. And guess what? For people without technical knowledge keeping everything in the browser is way more efficient (work-wise, not performance-wise) than explaining arbitrary switches in context from browser to some app to some other app and back to the browser.

Security: “I’m no expert [...but...] doesn’t seem to be completely unrealistic.” The devil’s in the detail, as virtually everyone who works on browsers’ JS engines can tell you. A runtime that downloads arbitrary binaries from the web to be executed sounds in every regard like a bad idea, even if you put it in a full virtual machine. The two-word argument against this is basically “Flash exploit”.

Platform independence: The author might be too young to remember Java’s “write once, run everywhere” claim, which turned out not to be fully true. And trading the current state of almost-full platform independence in the browser for some proposed from-scratch infrastructure will become exactly the disaster that Joel Spolsky warned about 15 years ago in the context of the Netscape rewrite (http://www.joelonsoftware.com/articles/fog0000000069.html).

“But one thing is certain: the web platform we have today is already bloated, does not suit our needs and severely limits innovation.” No. It is not certain. Browsers today run on low-profile smartphones. Bloated web platform? Most of these things are opt-in, and many clever people build fallback strategies in new specifications to enable _everyone_ to become part of the web. Limiting innovation? Quake runs smoothly in the browser. Who would have figured that 10 years ago?

All in all, to me it seems the post is written by someone who hasn’t yet fully grokked the web.

[+] Slartie|10 years ago|reply
I'm not sure that "a decade-old game runs smoothly" would qualify as an innovation, rather than as a showcase of the exact problem the author was hinting at. Just getting stuff shoehorned into the browser that we already had running perfectly fine outside of it is not "innovation".

What about truly new stuff that nobody has seen before, neither inside browsers nor in native applications?

[+] black_knight|10 years ago|reply
Surfing on a smartphone is a real pain. Pages take longer to load than in the 90s and, unlike in the 90s, you can't start reading before it has all loaded.

Java tried to be C++, but run on every machine. That turned out to be difficult. But I don't think the author is thinking of Java. I guess he rather has in mind domain-specific languages, which are abstract enough in nature to be executed faithfully on any system with the given capabilities.

Security also goes hand-in-hand with this form of abstraction. If the language can only express safe actions, the program will not be malicious. In pure languages, such as Haskell, one can use type-guarantees to enforce these restraints. One could imagine a virtual machine with this kind of typing.
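The "language that can only express safe actions" idea can be illustrated with a tiny evaluator: programs are data, and the interpreter simply has no case for unsafe effects, so no expressible program can perform them. A Python sketch of the concept (Haskell, as the comment notes, would enforce this in the types rather than at runtime; the DSL here is invented):

```python
# A tiny DSL whose interpreter can only perform whitelisted actions.
# There is no 'write', 'delete', or 'network' case, so no program
# expressible in this language can do those things.
def run(program, doc):
    results = []
    for op, arg in program:
        if op == "show":
            results.append(doc.get(arg, ""))
        elif op == "count":
            results.append(len(doc.get(arg, "")))
        else:
            raise ValueError(f"unsafe or unknown operation: {op}")
    return results

doc = {"title": "Hello", "body": "safe by construction"}
assert run([("show", "title"), ("count", "body")], doc) == ["Hello", 20]
```

Safety here comes from the language's expressiveness being bounded, not from inspecting or trusting the program.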

[+] JadeNB|10 years ago|reply
> A runtime that downloads arbitrary binaries from the web to be executed sounds in every regard like a bad idea, even if you put it in a full virtual machine. The two-word argument against this is basically “Flash exploit”.

Not sarcasm, but an honest question: barring the argument "even full virtual machines have bugs", to which one might as well retort "even heavily tested browsers have bugs", why isn't it safe to run such a program in a virtual machine? It seems that most of the pain of Flash exploits comes from the fact that Flash doesn't run in a (proper) sandbox.

(I'm not a web developer, so I could easily be talking nonsense.)

[+] dsfsdfd|10 years ago|reply
I would have written something similar, had I not been so lazy. Good job, well put.
[+] edwinjm|10 years ago|reply
1) Almost every application uses technology that's built on older technology that's built on older technology. That's not necessarily a bad thing.

2) Nobody says browsers want to replace all applications. It's a false condition the whole article is based on.

And why is it anonymous? Does he/she know it's nonsense?

[+] pdkl95|10 years ago|reply
The web browser is not the shell[1] or window manager, and it never will be. It may render GUI widgets just fine, and you could use it as a shell, but only because you can technically use any program[2] as a shell.

I know developing native client applications is not popular recently; there are good reasons for that, such as wanting to develop your software for as wide an audience as possible. What you have to remember, when choosing your development environment, is that there are always limitations to every platform. On the web that means you are always going to be sandboxed not only in what the application can do, but also in how it can interact with the user. Given that we always have to care about phishing, CSRF/clickjacking, and numerous other types of malware, applications developed for the web will never[3] be able to do many of the things we expect from a native application.

This doesn't mean you can't do good things on the web (we could list numerous examples of Great Tools that are available on the web); it just isn't ever going to have all of the features you get with a "real" native app. Even if you try really hard to get away from the "document"-style nature of the web, the sandbox and the realities of making things safe for the user will always be a problem. Yes, we can try to work around that by rebuilding another OS -inside- the browser. Some people are certainly trying. Instead of throwing your sanity away on that never-ending pile of problems and endlessly expanding complexity, I suggest simply realizing that while some things just aren't going to be practical inside the browser, for other problems it's still a decent platform to develop for, and it is slowly getting better.

Oh, and the Mac/OSX people will be angry if you try to force them to use too many non-native GUIs.

/* I'm skipping the discussion of the "software as a service" scam... I'll assume, for the moment, that the desire to make the web into a GUI shell is not simply part of a scam to convert one-time sales into a recurring service fee. */

[1] https://en.wikipedia.org/wiki/Shell_%28computing%29

[2] I once saw /usr/bin/gopher stuffed into /etc/passwd as the shell

[3] at least I hope it's "never" - making a platform where I can impersonate too much of your native GUI over the network is just asking to be attacked

[+] zlnimda|10 years ago|reply
Thanks, finally someone who says why the web is going too far from its original purpose.

The major problem with the web is: we use frameworks to abstract differences between browsers, which abstract the web for different operating systems, which abstract the hardware.

[+] pjmlp|10 years ago|reply
I keep repeating it, even though I do have lots of experience in web development, but I also do have even more experience developing native applications and came to realize the same thing.

Nowadays, when given a choice I always pick native projects over web ones.

But this type of statement usually earns downvotes on HN.

[+] test1235|10 years ago|reply
According to my company firewall, this site has been blocked for malware ... ?
[+] fkooman|10 years ago|reply
Proving (part of) the point of the article ;)