item 10646100

What Web Can Do Today

354 points | porsager | 10 years ago | whatwebcando.today

127 comments

[+] majika|10 years ago|reply
What the web is increasingly unable to do today: provide text content without requiring a code execution environment. This site is another example of that.

All non-application websites should provide all their content in semantic HTML at appropriate HTTP endpoints, with CSS styling (in as few requests as possible) as required per the design, and JavaScript (in as few requests as possible) that takes the semantic HTML and makes it interactive (potentially adding and removing elements from the DOM) as required per the design. The CSS should not depend on mutations resulting from the JavaScript, nor should the JavaScript assume anything of the applied styles (as the user agent should be able to easily apply custom user-styles for your site; e.g. Gmail only providing a limited set of styles that are managed server-side is laughable).
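That prescription can be sketched in a few lines (all names here are illustrative, not from any real site): the server ships complete semantic HTML, and this script, if it runs at all, only layers behavior on top and never gates the content itself.

```javascript
// Upgrade plain semantic sections into collapsible ones. With JavaScript
// disabled the sections simply stay expanded and readable; the CSS, per
// the comment above, must not assume this script ran.
function makeSectionsCollapsible(root) {
  let enhanced = 0;
  for (const section of root.querySelectorAll('section[data-collapsible]')) {
    const heading = section.querySelector('h2');
    if (!heading) continue; // leave malformed sections alone
    heading.addEventListener('click', () => section.classList.toggle('collapsed'));
    enhanced += 1;
  }
  return enhanced; // how many sections were upgraded
}
```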

Thus, all content is readable and styled properly without requiring an arbitrary code execution environment. That is what the web was meant to be. Unfortunately, most "web developers" have made the web worse over the past 10 years because simple, functional, minimal technology is not impressive, and hipsters love to show off.

Nor does it help that there are few capitalist incentives for the web being open and malleable -- e.g. so users can easily use a different front-end for Facebook, or users can easily choose to avoid analytics or advertisements, or users might prefer to use the website rather than the app (providing access to personal details, contacts, location, tracking, etc).

The state of the web is emergent and I'm not sure what anyone could do about it (perhaps make a better browser?), but it really irks me when web developers pretend like they're actually doing something good or useful, or that the web is actually in a healthy state. In my experience, it's the people who don't talk about web development who are the best web developers; these are the people who don't wince when they write an HTML document without a single `<script>`.

[+] bad_user|10 years ago|reply
You're talking about "progressive enhancement". It's a romantic idea, but it never happened, probably because it's too hard and the cost is not justified given most users run with their browser's default settings.

The precursor of the web made by Tim Berners-Lee dates back to 1980, but it was not based on HTML or HTTP. These happened later in 1990 and early 1991. But then CSS happened in 1994. And Javascript happened in 1995 at Netscape, but then Javascript was completely useless until Microsoft came up with the iframe tag in 1996 and then with XMLHttpRequest in 1999, which was later adopted by Mozilla, Safari and Opera. And people still couldn't grasp its potential until Google delivered Gmail in 2004 and Google Maps in 2005.

Not sure what "the web was meant to be" means; we should ask Tim Berners-Lee sometime. But in my opinion the web has been and is whatever its developers and users wanted it to be, with contributions from multiple parties such as Netscape, Microsoft, Mozilla, KDE/KHTML, Apple, Google and many others, being a constantly evolving platform.

[+] nandhp|10 years ago|reply
> What the web is increasingly unable to do today: provide text content without requiring a code execution environment. This site is another example of that.

I was about to argue that this website is actually an excellent example of what you seek -- each link has a separate URL associated with it that returns a page containing that content. The links point to these real URLs so they work with "open in new tab" and "copy link" and in browsers without JavaScript enabled, while the JavaScript that runs when you click it changes the page content via AJAX (possibly saving a few round-trips) and updates the current page URL so that back/forward history and the address bar both work just like you're navigating between real webpages.

And this works perfectly in Firefox (with JavaScript) and almost perfectly in Lynx (the table of contents still fills the first screenful, but that's hard to fix since Lynx doesn't support CSS). But it completely fails if you have JavaScript disabled in Firefox.

Every page starts with the table of contents visible and the content collapsed (through CSS). The page then seems to assume that JavaScript will be able to immediately switch the page to the correct view (i.e. the site is broken if you have working CSS but not JavaScript). Navigation to a given page directly should start the other way by default, and to make that happen is just 21 missing characters (` class="page-feature"` on the <body> tag). However, this unfortunate error completely ruins this otherwise beautiful example of progressive enhancement.
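The navigation pattern described above can be sketched as follows (function and selector names are illustrative): real hrefs in the markup, with JavaScript that upgrades clicks to AJAX plus pushState when it can, and falls back to a normal page load otherwise.

```javascript
// Only same-origin links qualify for in-page navigation; anything else
// (external links, malformed hrefs) gets the browser's default behavior.
function isSameOriginLink(href, currentOrigin) {
  try {
    return new URL(href, currentOrigin).origin === currentOrigin;
  } catch {
    return false;
  }
}

// In a browser, the wiring is roughly:
//   document.addEventListener('click', (e) => {
//     const a = e.target.closest('a[href]');
//     if (!a || !isSameOriginLink(a.href, location.origin)) return; // normal load
//     e.preventDefault();
//     fetch(a.href).then(r => r.text()).then((html) => {
//       document.querySelector('main').innerHTML = html;
//       history.pushState(null, '', a.href); // back/forward keep working
//     });
//   });
```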

[+] mei0Iesh|10 years ago|reply
It's the Flashification of the web. People who wanted to show off, or code web apps, used to use Macromedia Flash. People complained about it, in part because sometimes if you accessed a site without the Flash plugin you'd see a blank page.

But Flash was great in many ways, and it was self-contained in objects, so websites were mostly still websites. JavaScript had been around for a long time, but there was still a cultural norm that most people respected about not requiring JavaScript. This was mainly because a lot of browsers still didn't fully support it, or people had it turned off. It was also when some people had cookies disabled.

Then Adobe bought Flash, and then Apple blocked it, which really killed the Flash way. All of that spilled over into HTML with HTML5 and a new cultural norm of kids who are more concerned with showing off socially than with the meat and potatoes of hypertextual information.

It should've been obvious that there was a need for a new web, for code and multimedia. But in the .com boom nobody would dare try to start with something unpopulated, since it'd risk losing their chance at fortune.

Today, instead, maybe we should go in the opposite direction and create a new old web; a hypertext network that specifically only works for HTML, so people can have this one to morph into an app network, and we'll not lose the text-linking place we've grown accustomed to.

[+] userbinator|10 years ago|reply
> Thus, all content is readable and styled properly without requiring an arbitrary code execution environment. That is what the web was meant to be.

In other words, it was supposed to be a worldwide hyperlinked document library --- and we have mostly achieved that goal, although it is a library wherein you are constantly tracked and bombarded by books flying off the shelves at you, screaming at you to read them, and most of the books consist solely of ads with very little useful informational content.

> In my experience, it's the people who don't talk about web development who are the best web developers; these are the people who don't wince when they write an HTML document without a single `<script>`.

Agreed completely. The ones who write information-dense HTML pages, often by hand, would not be considered "web developers" nor would they consider themselves to be; but they are what the web needs most. I've done that, and I don't consider myself a "web developer" either.

> it really irks me when web developers pretend like they're actually doing something good or useful, or that the web is actually in a healthy state

I wouldn't doubt that they genuinely feel like what they're doing is good or useful; I've noticed the appeal of "new and shiny" is especially prevalent in the web development community, with the dozens of frameworks and whatnot coming out almost daily, proposals of new browser features, etc. Very little thought seems put into the important question of whether we actually need all this stuff. It's all under the umbrella of "moving the web forward", whatever that means. But I think we should stop and look back on the monstrosities this rapid growth has created.

[+] BurningFrog|10 years ago|reply
Strong opinions on how things should be, but no arguments for why.

That is never going to convince me.

[+] Pxtl|10 years ago|reply
I tend to think rants like this are just luddism, but all this JavaScript and all these external resources really are ruining the experience of the web. Holy crap is everything slow now. On desktop we've all gotten into the habit of opening new pages in background tabs and waiting for them to load while doing something else, but on mobile, where that workflow isn't as easy and you have to watch a page load? I'd say HN is one of the very few sites I can stand anymore.
[+] TheAceOfHearts|10 years ago|reply
I think this is because so many websites try to shove ads and tracking crap down your throat.

It's not even particularly difficult to pull off for certain websites. For example, my blog uses React and it does server-side rendering, so it'll work without JS at all. However, if you do have JS enabled, it'll let you avoid doing full-page reloads to navigate around. I totally agree that for content-focused websites, you tend to get a better experience by limiting gratuitous JS abuse.
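The shape of that setup, minus the framework (a generic sketch, not the commenter's actual React code): one render function builds the full HTML on the server, so the page reads fine with JS off; if JS runs, the client can call the same function to update the page without a full reload.

```javascript
// Shared render function: the server calls it to produce the initial
// response; the client (if JS is enabled) calls it again to re-render
// in place. Real code must HTML-escape the inputs.
function renderPost(post) {
  return `<article><h1>${post.title}</h1><p>${post.body}</p></article>`;
}
```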

However, this isn't very feasible when you're building highly interactive applications. In the case of something like Facebook, they do have a version of their app that works without JS... But how many people can afford to maintain multiple versions of their applications?

I think a good compromise is achievable with well-documented APIs, or even better with a public GraphQL schema! If we don't use magic private APIs to build our frontend app, you can build a different frontend that's tailored to your needs.

[+] ryandrake|10 years ago|reply
What we need is a new protocol: something that lets an author write and publish text documents, marked up with basic styling, with "hyperlinks" to other text documents. The protocol could allow embedding of simple inline figures as well. Users would run a "browser" whose function was limited to requesting and displaying these documents.
[+] ocfx|10 years ago|reply
> Unfortunately, most "web developers" have made the web worse over the past 10 years because simple, functional, minimal technology is not impressive, and hipsters love to show off

No it's because when I go into work someone says to make it a certain way and if I want to get paid I have to. If you want to blame anyone blame designers who see proof of concept stuff from developers and throw it in designs.

[+] Sephr|10 years ago|reply
Why is the Network Information API still a thing? I don't feel like it should even be listed on this website, as it's an anti-feature. It only encourages discrimination on connection types, as it doesn't expose the only important part: is this connection metered?

Developers are going to use this API to serve me higher resolution assets over my metered WiFi & Ethernet connections (assuming they are unmetered) and lower resolution assets over my unmetered cellular LTE connection (assuming it's metered).

WiFi vs cellular vs Ethernet is not important. What's important is the usage policy behind the connection, and in its current state, the Network Information API can only be used to harm my user experience.
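A sketch of exactly the misuse described above: with no "metered" bit in the API, the only available decision keys off connection *type*, which is the wrong-in-both-directions guess this comment complains about.

```javascript
// "wifi/ethernet = unmetered, cellular = metered" is the assumption baked
// in here; for a metered hotspot or an unmetered LTE plan it backfires.
function pickAssetQuality(connectionType) {
  return connectionType === 'cellular' ? 'low' : 'high';
}
// In a browser, roughly:
//   pickAssetQuality(navigator.connection ? navigator.connection.type : 'unknown')
```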

[+] mmahemoff|10 years ago|reply
How can a web browser, or even a native OS, determine if a user's wifi network is metered? I agree it would be nice in theory if there were some kind of "dollars per MB" API, but I don't see how web browsers could implement that in practice.

I work on a podcast app. When people are downloading hundreds of MB or more daily, they emphatically want control over when downloading takes place. In practice, almost all of them are satisfied with a binary choice between cellular or not. Those who do sometimes use expensive or slow wifi networks can blacklist them, which is also what network info APIs support.

There are also other applications, such as metering apps and utilities that trigger actions upon network changes. Some very popular Android apps (e.g. Tasker, IFTTT) rely on network info for that; the web falls further behind if web apps can't access that data.

[+] friendzis|10 years ago|reply
Usage policy is not necessarily metered vs. free. It's a mixed bag of user expectations, developer needs and technical properties of the connection. Connections are by no means only unmetered broadband and metered LTE. I have recently experienced several connection "quirks" that require different approaches:

  * punishingly high RTT
  * packet-per-timeframe ratelimit
  * throttling by queueing packets
  * instability (some requests get served in tens of ms, others get lost in the void)
  * connection jumping from one provider to another (e.g. mobile/wifi while next to home)

For example, jaggy GPRS in the middle of nowhere can mean two things: I'm working in a field and need to check something as quickly as possible, without ads and other unnecessary cruft getting in the way, or I'm resting in a cabin and am willing to wait for a "proper" app to load.

I may also have two connections, fast metered and slow unmetered, and would like applications to load important stuff fast over the metered network and cosmetics over the unmetered one. Too bad I have to pull a whole JavaScript framework first in order to see anything, because web components, and it's uncacheable because it's webpacked together with application code.

[+] Raed667|10 years ago|reply
I think this is important in the case of the 2G/3G/LTE distinction, where phones that can only access 2G networks should be served a "light" version of websites.
[+] eridius|10 years ago|reply
Network type can be used for more than just serving higher-resolution assets. It can also be used for doing aggressive pre-fetching of assets that may not be needed when on wifi. For example, my native RSS client will pre-fetch images from blog posts when it refreshes on wifi, but won't do that on cell. It's not unreasonable to think that various web apps might want to do the same sort of thing.
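That prefetching idea, as a hedged sketch: gate aggressive prefetching on connection type, and also honor the Save-Data hint, which is the closest thing to a real "metered" signal where browsers expose it.

```javascript
// Prefetch only on connection types usually assumed to be unmetered,
// and never when the user has asked for reduced data use.
function shouldPrefetch(connectionType, saveData) {
  if (saveData) return false; // user explicitly opted into data saving
  return connectionType === 'wifi' || connectionType === 'ethernet';
}
// In a browser, roughly:
//   const c = navigator.connection || {};
//   if (shouldPrefetch(c.type, c.saveData)) prefetchArticleImages();
```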
[+] dheera|10 years ago|reply
More stuff the web can't do that native apps can do:

* Request stuff from arbitrary URLs without a CORS proxy

* Open TCP and UDP sockets

* Be in the play store/app store AS a web app (home screen installation isn't enough; people still search for your thing in the app store, so even if the other problems aren't issues for your app, you still have to wrap your web app in a dummy native framework and upload it if you want users to discover you)

* Scroll like a native app without Safari's stupid rubber band scrolling the entire app at very erratic times

* Barometric sensors

* Background geolocation, background anything really

* launchActivityForResult so you can work with other native goodies

* Intent resolution for native actions

* Geolocation when the user has inadvertently blanket-blocked Geolocation for all of Safari instead of per-webpage. Thankfully Chrome comes pre-approved on Android and you can't accidentally make this mistake as a user

* Face tracking and other OpenCV-based stuff (it's been done, but JS is still not fast enough on mobile to handle these jobs)

* Display long lists of styled content and scroll without stalling

Still, though, I think touch gestures are really the killer missing feature. There isn't any canonically supported way to do things like a simple Android ViewPager or pinch-to-zoom. You end up implementing a bunch of spaghetti code to do these things even in the best frameworks (Meteor, Phonegap + Polymer, Angular et al.). And then you find out the way your spaghetti code reacts is a tad different from the way someone else's spaghetti code reacts to the same gestures. This stuff really needs to be standardized at an OS, browser, or at least JS-framework level.

[+] deathanatos|10 years ago|reply
I'm not saying that native apps > web apps isn't true, but…

> * Request stuff from arbitrary URLs without a CORS proxy

CORS isn't a proxy; it's a browser policy/algorithm that allows a site the opportunity to say "yes (or no), other websites may (or may not) make AJAX requests here". The native app (unless you've informed it somehow, or it has done something malicious) doesn't have the user's cookies, whereas the browser does, and needs to be a bit more cautious.
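That opt-in, seen from the server's side (the origin list below is made up): the target site chooses, per requesting origin, what to put in the Access-Control-Allow-Origin header, and the browser enforces the rest.

```javascript
// Return the value to send in Access-Control-Allow-Origin, or null to
// send no header, in which case the browser blocks cross-origin scripts
// from reading the response.
function corsAllowOrigin(requestOrigin, allowedOrigins) {
  return allowedOrigins.includes(requestOrigin) ? requestOrigin : null;
}
```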

> * Open TCP and UDP sockets

There are websockets, which aren't the same, I'll admit. Frankly, I'm not sure I want my web pages to be able to arbitrarily open TCP/UDP sockets. (I don't really want my native apps to be able to on whim, either…)

> * Barometric sensors

You mean like current air pressure? Are there common devices out there with these? (None of my computing devices have anything like this, for example.) (and what would I want this for?)

[+] venning|10 years ago|reply
> Scroll like a native app without Safari's stupid rubber band scrolling the entire app at very erratic times

This is defeatable, either with `-webkit-overflow-scrolling: touch;` or other techniques.

> Background geolocation, background anything really

You mean, Service Workers?
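For reference, what that registration looks like (the script URL is a placeholder; the navigator object is injected so the flow can be followed outside a browser). One caveat: service workers cover push and background sync, but not continuous background geolocation, so this only partially answers the parent's point.

```javascript
// Register a service worker if the browser supports it; degrade silently
// otherwise.
function registerWorker(nav, scriptUrl) {
  if (!nav || !('serviceWorker' in nav)) {
    return Promise.resolve(null); // no support
  }
  return nav.serviceWorker.register(scriptUrl);
}
// In a browser: registerWorker(navigator, '/sw.js').then(reg => { ... });
```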

> Display long lists of styled content and scroll without stalling

I could be wrong, but I think this is a Safari-specific concern. Android Chrome has never had this problem for me. In older versions of iOS, Safari was not able to execute Javascript while scrolling which could be perceived as "stalled" rendering. I'm not sure if that's what you mean, but Apple has been gradually working to allow JS execution during scroll. But they're so behind (in years) that I wonder if there are still some cases where the execution can't keep up with the scroll.

[+] m_fayer|10 years ago|reply
* Offline mode:

If you have large amounts of structured data and high performance requirements for it... well, good luck.

[+] faizanbhat|10 years ago|reply
I think this is a well thought out list. I find it saddening that the 'health' of the web has deteriorated to a point where it's arguably uncompetitive for a vast number of modern applications including information discovery. I find it troubling that information discovery is moving away from the open web into non-neutral native application environments such as Facebook, Snapchat, and Twitter. Sure, information discovery on the web was driven by search engines. But I think search engines are really just part and parcel of the web (and arguably a feature that ought to have been built into web browsers from the beginning). A search engine doesn't express bias towards any information source (at least not by principle).

What troubles me is imagining a future where a vast majority of information that we consume is selected and curated by commercially driven, black box, non-neutral platforms.

I read an article recently about how Google is experimenting with removing the need to download apps by 'streaming' app content through a search box. Presumably, this is how Google search stays relevant in the post-web era where more and more information flows through walled gardens installed on mobile devices. Now, instead of re-inventing the web, shouldn't we be working towards 'fixing' the one that exists? This list provides a decent 10,000 ft overview of the problems.

Eager to hear your thoughts.

[+] slimsag|10 years ago|reply
This is really awesome :)

The truth is that the 'web' is really becoming the cross-platform application architecture.

Just a few years back I despised the very idea of this -- I want to program in my favorite language, not JavaScript (no matter how nice it is these days). I'd love to write applications that can act like native ones while also being cross platform (and use multiple threads at that!).

The fact that WebAssembly[1] intends to eventually add support for multi-threaded programs and languages aside from JavaScript (including GC'd ones) is amazing. Personally I believe these are the two biggest blockers for most people who would want to write an app using web technology.

Although they explicitly mention that their goal is not to replace JS in their FAQ, that IMO is more akin to a _"nothing will ever replace C"_ statement rather than a _"no apps will be written using other languages than C"_.

[1] https://github.com/WebAssembly

[+] r3bl|10 years ago|reply
I really wish there were a way to check what other popular browsers can do without having to install them and open up the website.
[+] paulirish|10 years ago|reply
Just click in on any of the features and you'll get details including the full cross-browser support.
[+] lucb1e|10 years ago|reply
Proximity sensor, ambient light sensor, vibration, access to contacts, change screen orientation... gee, I didn't know Firefox could do all that on my laptop!
[+] _urga|10 years ago|reply
Browsers are great at rendering.

But they are terrible when it comes to exposing the raw power and capability of a machine to web apps that the user wants to trust. They spec their API implementations by committee (and committees of committees) and they rarely implement any spec in its entirety. The web is becoming increasingly fragmented as a result.

The web is good for "web pages", but bad for "web apps". The web currently has no concept of a trusted web app.

For example, we have been waiting for years for browsers to give web apps some way to access the filesystem, and all we have is an open dialog and a file instance (and the debris of the failed filesystem api). And when will web apps (not "web pages") get TCP or UDP? The browsers will never be able to match the module ecosystem and core power of Node.

The way forward:

1. Give the power back to users. Give them a boolean way to indicate that they trust and want to install a web app.

2. If the web app is trusted and installed, give it access to Node.

In a matter of months, this could exponentially boost web apps, lighten the bloated browser codebase, keep the focus on browser rendering, and keep committee fingers off web app innovation.

[+] djtriptych|10 years ago|reply
Great site - quick usability notes:

- Left align all features

- Align icons

- Align check marks

- Consider using heavier-looking icons for supported/not-supported.

- Promote the checkmark / ex legend to the top of the page.

[+] kristopolous|10 years ago|reply
There are some issues with the detection code. navigator.contacts, for instance, is not supported in my browser (or any browser at all except the most recent mobile Firefox; I'm on a desktop), yet the website says that my desktop browser indeed supports it. The same is true of navigator.getBattery(), which supposedly will run fine in my console (it doesn't).
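The distinction the site's detection code seems to miss: a property merely existing on navigator is not the same as the feature being usable. A sketch of a slightly safer check (it can still over-report if a browser ships a non-functional stub):

```javascript
// Require the property to be an actual callable function before claiming
// support, rather than merely present.
function supportsMethod(nav, method) {
  return Boolean(nav) && typeof nav[method] === 'function';
}
// e.g. supportsMethod(navigator, 'getBattery')
// rather than just ('getBattery' in navigator)
```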
[+] eridius|10 years ago|reply
I'm very surprised to see this claim that Safari on OS X[1] can't handle Push Notifications, because it's been able to do that for a few years now. It's just not using Service Workers. It uses a separate solution built around the Apple Push Notification service. Granted, this isn't cross-platform and isn't a W3C standard, but it is a capability that can be used today. But of course the whole section on "Push Notifications" is actually just a description of Service Workers.

[1] Which isn't actually a listed browser, but it has the X next to it when viewed in Safari and the underlying caniuse.com data is for Service Workers, which Safari doesn't support.
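The non-standard path referred to here is desktop Safari's APNs-based "Safari Push Notifications", which predate Service Worker push. A hedged sketch of the permission flow; the website push ID and web-service URL below are placeholders, and the safari object is injected so the logic can be followed outside Safari:

```javascript
// Ask desktop Safari for push permission via Apple's proprietary API.
function requestSafariPush(safariObj, done) {
  if (!safariObj || !safariObj.pushNotification) return done(null); // not Safari
  const state = safariObj.pushNotification.permission('web.example.news');
  if (state.permission !== 'default') return done(state); // already decided
  safariObj.pushNotification.requestPermission(
    'https://example.com/push', // web service URL (placeholder)
    'web.example.news',         // website push ID (placeholder)
    {},                         // user info forwarded to the service
    done
  );
}
// In Safari itself: requestSafariPush(window.safari, s => console.log(s));
```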

[+] Navarr|10 years ago|reply
This website is about implementation of W3C standards, from what I grasped.

So it is not surprising that it doesn't claim Safari can handle push notifications, as by that standard it cannot.

[+] realityking|10 years ago|reply
If I'm not mistaken, on iOS you can only do it if you wrap your web app in a native app. On OS X, however, you can actually get real push notifications, even for websites.
[+] adevine|10 years ago|reply
I think this should be called "What the web can do someday." I mean, push notifications? If it's not supported by mobile safari, it's a non-starter for actual widespread adoption.
[+] djm_|10 years ago|reply
The ticks/crosses on the front page are related to "your current browser" (see key). Click through for detailed stats from caniuse.com.
[+] JohnTHaller|10 years ago|reply
While this will affect the 44% of users that are locked into Mobile Safari IE6-style, there's no reason not to start adding the features supported by the open mobile platforms and desktop OSes to your web app.
[+] sawwit|10 years ago|reply
I've been imagining a website lately that displays progress in a similar way for all kinds of things. Sort of like a global to-do list. Should be fun to build.
[+] kalmar|10 years ago|reply
This is pretty great! Looking forward to more service worker support for more offline and push notifications.
[+] azakai|10 years ago|reply
What does the X axis in the "browser support" graphs mean? It seems to go up to 30, and has fractions along the way, both of which suggest it isn't a simple version number. A percentage of something, perhaps?

edit: I guess it's market share? but which measurement?

[+] phonyphonecall|10 years ago|reply
It's interesting to compare the site opened in Chrome vs. Safari. I count 12 features desktop Chrome supports that desktop Safari does not (on El Capitan). A mobile comparison would be intriguing as well.
[+] asdfprou|10 years ago|reply
Excellent choice of colour and tasteful use of animations =)