Brendan Eich: WebAssembly is a game-changer

316 points | alex_hirner | 10 years ago | infoworld.com

303 comments

[+] jacquesm|10 years ago|reply
Personally, I think this is terrible (and it really is a game-changer, only not the kind that I'd be happy about). The further we get away from the web as a content delivery vehicle and more and more a delivery for executables that only run as long as you are on a page, the more we will lose the things that made the web absolutely unique. For once the content was the important part and the reader was in control; for once universal accessibility was on the horizon, and the peer-to-peer nature of the internet had half a chance of making the web a permanent read/write medium.

It looks very much as if we're going to lose all of that to vertical data silos that will ship you half-an-app that you can't use without the associated service. We'll never really know what we lost.

It's sad that we don't seem to be able to have the one without losing the other, theoretically it should be possible to do that but for some reason the trend is definitely in the direction of a permanent eradication of the 'simple' web where pages rather than programs were the norm.

Feel free to call me a digital Luddite; I just don't think this is what we had in mind when we heralded the birth of the www.

[+] _qbjt|10 years ago|reply
I hate to break it to you, but we're already there. People have been using the web to deliver desktop-like applications for the past decade. Over that period, the number of people connected to the Internet has more than doubled [1] and will continue to increase. Whether an application is delivered through a browser or natively is inconsequential to most of these users. If we look at the list of the most popular websites [2], we see mostly content-delivery platforms (Google, Bing, Wikipedia, etc.) alongside some popular web apps that resemble desktop software in complexity (Facebook, YouTube, Windows Live, etc.)

So we have two paths forward. One, we could try to influence the habits of billions of Internet users who use desktop-like web applications, in an attempt to restore the document-based nature of the web; or two, we could provide an alternative to the artifice of modern JavaScript development, one that allows better applications to be written and distributed to the users who rely on them. The latter initiative is the more realistic and productive one, in my opinion.

WebAssembly will not bring about the end times of the Internet as a content delivery vehicle. It is a net positive for the parts of the web that do not fulfill that purpose. If you're worried about the free and open web as a publishing platform, look more to the governments and corporations around the world that collude to limit our freedom of expression (Facebook, we're all looking at you [3]).

[1] http://www.internetlivestats.com/internet-users/

[2] https://en.wikipedia.org/wiki/List_of_most_popular_websites

[3] http://www.theatlantic.com/technology/archive/2016/02/facebo...

[+] pcwalton|10 years ago|reply
Eh, I think things were a lot worse during the heyday of Flash and Java applets. Sites have been delivering this kind of content since WebRunner in 1997. At least now we have an open, vendor-neutral, consensus-based standards process and a commitment to multiple major open source implementations, which is something we never had with Java or, worse, Flash.
[+] s3th|10 years ago|reply
The web is an amazing content delivery vehicle, I totally agree! There's a whole class of content I want to be able to just `wget` and be done with it.

The goal of WebAssembly is to open up the reach and easy user experience of the web browser to new types of applications that just aren't possible to build efficiently with current tech.

We're also making sure that wasm is a first-class citizen of the open web: see, for example, our thoughts around ES6 module interop, GC + DOM integration, and view-source to see the textual encoding of wasm [0][1].

[0]: https://github.com/WebAssembly/design/blob/master/Web.md

[1]: https://github.com/WebAssembly/design/blob/master/TextFormat...

(Disclaimer: I work on V8.)

[+] Sir_Cmpwn|10 years ago|reply
I agree. I'm really running dry on respect for Brendan Eich. None of the moves he's making are for the benefit of user privacy - look at Brave, his new browser project. It replaces ads on the web with his own ads, tracks you, and puts money in his pocket instead of publishers' pockets. I'm struggling to remember why he was respectable in the first place - for making JavaScript, an awful programming language we've spent 20 years trying to fix? I don't think his word on these issues is worth anything any longer.
[+] losteric|10 years ago|reply
I prefer the term "mindful" to Luddite. Luddites opposed technology; you oppose the direction it's headed.

But I think you're completely right. We took all that made the web unique, and turned it into a black box for abstracting away hardware/OS.

It's hardly surprising though... you can decentralize a network, but power and control over the medium were bound to become centralized in some form.

[+] tclmeelmo|10 years ago|reply
I think what I worry about the most is how poorly the modern web partners with assistive technology: it feels like we're actually taking steps backwards.
[+] return0|10 years ago|reply
Despite all the JavaScript, the content stayed on the web. That's the big win, that's what draws people to use it, and that's what guarantees its popularity for a long time. The really great, awesome thing about the web is its openness. No company can tie you to their programming language, their "app store" ecosystem, insane policies, and authoritative restrictions. We should be eternally grateful to Berners-Lee for that.

The silos exist today: the Facebook platform, etc. Despite how hard they tried, they did not take over the web.

[+] ZenoArrow|10 years ago|reply
> "It's sad that we don't seem to be able to have the one without losing the other, theoretically it should be possible to do that but for some reason the trend is definitely in the direction of a permanent eradication of the 'simple' web where pages rather than programs were the norm."

What does it matter what the norm is? Static HTML/CSS is going nowhere; you can still create static content, as you well know (IIRC you run a static blog). The improvements to the dynamic side of the web do not come at the expense of the document-oriented side. Both currently coexist, and I see no reason why making the dynamic side faster will change that.

Furthermore, changes to dynamic content can enhance the functionality of the document-focused side of the web. Consider Wikipedia. In some ways a Wiki is a set of documents, but it's a set of documents that grows based on utilising input from those using the service, democratising the accumulation of knowledge. For all its flaws, I can think of no other resource that better embodies the virtues of the web than Wikipedia, and Wikipedia would not have grown to the size it is now without the technology that supports web apps.

That said, I don't agree with the trend for moving everything to the cloud, and I hope we can see that trend reverse with better tools for people to take control of their own data. If more people had cheap home servers that were easy to maintain then the issues surrounding lack of control should be greatly reduced.

[+] pedalpete|10 years ago|reply
The web is and will always be a 'content delivery vehicle'. Apps which run on the web are a form of content.

Not all data is open. That is unfortunate, but realistic. At the same time, huge amounts of data are open and available without an app.

What is the use case where we lose something because of native-level performance improvements in JavaScript?

Anybody who wants to build a simple static site can still do that, and I'd suggest the majority of the web is still just that, or very close to it.

I really don't understand your comment about 'executables that only run as long as you are on a page'. You can only read content as long as you are on a page as well. Or are you concerned about our ability to do search and data mining on large volumes of available data?

[+] acabal|10 years ago|reply
I upvoted you, but I want to add my voice too.

The web succeeded in part because it was possible for anyone to do "view source" and see what was going on under the hood.

Having that source available is also an important aspect of software freedom.

Losing all that--especially the freedom to see exactly what your browser is executing--for a slight speed increase is ludicrous and I'm very, very sad to see this is being taken so seriously.

We had a huge opportunity here to shape an open and free web. Turning the web into nothing more than a binary distribution platform will undo decades of work, and we may never again find ourselves in the lucky confluence of economic prosperity, technological advancement, and governmental benign neglect that made the open web possible.

[+] stcredzero|10 years ago|reply
> The further we get away from the web as a content delivery vehicle and more and more a delivery for executables

Interactivity is an increasingly important aspect of media and of our culture. People spend more money and time on games than on movies and TV.

> I just don't think this is what we had in mind when we heralded the birth of the www.

It's never like the framers imagined. It's always stranger and more wonderful than they could have imagined. (And horrible in some ways they couldn't have imagined.)

[+] girvo|10 years ago|reply
> Feel free to call me a digital Luddite

Sure. But don't despair, because this future isn't as bleak as you'd assume. With things like Hoodie [0], GunDB [1], and other amazing bits of technology, we can keep the benefits of web tech for application development but still allow the user to own their data. Offline-first, easy sync when needed. And really, anything more complex than delivering static HTML pages has the downsides you're mentioning, so unless you want to live in 1995 I can't really understand the objection from a practical perspective ;)

[0] http://hood.ie/

[1] http://gun.js.org/

[+] metaphorm|10 years ago|reply
I agree with quite a lot of your critique and share some of those concerns, but something doesn't quite add up to me. Why are you assuming that the two models of web-as-delivery-system are mutually exclusive? I don't see how WebAssembly competes with, or causes movement away from, the web as we've known it.
[+] chipsy|10 years ago|reply
There's always an element of "one step forward, two steps back" with major technology shifts. Every time we've had a big change in platforms, we also had to redo existing engineering work.

I don't think this is doom and gloom for accessibility, though. The future is in general-purpose assistance technologies that mediate any application. You can smell it with the new work in ML. It is not here now, but as with everything in technology, by the time it's mature and widely available, it's nearly obsolete.

[+] chc|10 years ago|reply
I don't see what vertical data silos have to do with the technology the OP is talking about. Vertical data silos would exist without the ability to run programs client-side, they'd just be less pleasant (e.g. lots more reloading), and less accessible for many people (harder to make usable AI).
[+] hughw|10 years ago|reply
Why can't both styles coexist? Declarative hypertext resources for things that are document like, and dynamic applications for things that are app like? We simply are augmenting the plain old HTML web with the ability to link to resources with new capabilities. We haven't subtracted anything.
[+] modeless|10 years ago|reply
I hear you, but what is there to do about it? You can't stop people from wanting these kinds of features, and there are huge, real benefits.
[+] reitanqild|10 years ago|reply
I have a suggestion: if we all start blogging again on a myriad of domains and subdomains, and start linking to each other not for SEO but to give users links to interesting stuff, then all this could go away like an ugly nightmare once you are halfway into breakfast.

I know there are a few of you ex-bloggers here: I loved the time when I could find lots and lots and lots of technical stuff in a never-ending web of blogs. Not everyone agreed, but we linked back to the ones we disagreed with without caring about SEO.

Hey, we could even do web rings and RSS and and and...

Edit: There is still a lot of content; I just miss the time when everyone blogged, and I wish we would decide to go back, and then do it.

[+] unethical_ban|10 years ago|reply
The WWW is currently a bunch of apps: MediaWiki, WordPress, Node, Django, and thousands of others.

What is the difference between serving HTML/CSS/JS to the browser, and some other stack of UI and algorithms?

[+] Touche|10 years ago|reply
They're not pulling JavaScript out of browsers; it's time to give up that dream.

If you're really so passionate about static content, why not be part of a project with that aim? You can host a Gopher server and publish to it; it's stunningly easy to do.

Or maybe an alternative that runs on https? I'd be interested in that personally.

[+] danharaj|10 years ago|reply
I think the solution is to think of code as data that should be freely distributed and hackable as well. It isn't the garden that is the issue, but its walls.
[+] seagreen|10 years ago|reply
There's... already a complete programming language in web pages. I'm not sure what you're trying to accomplish here.
[+] lawnchair_larry|10 years ago|reply
Stallman was right, just for the wrong reasons. Scary. Offline computing is probably going to die.
[+] ihsw|10 years ago|reply
It's not the web that we are losing -- it's the browser.

It's all still HTTP requests and responses, but the web browser itself is becoming something very different from what it was a decade ago.

[+] moron4hire|10 years ago|reply
What's hilarious is that A) you're pining for something that hasn't existed for years, and B) you're arguing against something (Service Workers) that would bring it back.
[+] kibwen|10 years ago|reply
On the Rust side, we're working on integrating Emscripten support into the compiler so that we're ready for WebAssembly right out of the gate. Given that the initial release of WebAssembly won't support managed languages, Rust is one of the few languages that is capable of competing with C/C++ in this specific space for the near future. And of course it helps that WebAssembly, Emscripten, and Rust all have strong cross-pollination through Mozilla. :)

If anyone would like to get involved with helping us prepare, please see https://internals.rust-lang.org/t/need-help-with-emscripten-...

EDIT: See also asajeffrey's wasm repo for Rust-native WebAssembly support that will hopefully land in Servo someday: https://github.com/asajeffrey/wasm
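
For a taste of what this looks like on the developer side, here's a minimal sketch. The exact target names and build flags are still in flux, so treat these as illustrative, not gospel:

    // lib.rs -- a function exported so JS glue code can call it.
    // `#[no_mangle]` and `extern "C"` keep the symbol name stable.
    #[no_mangle]
    pub extern "C" fn add(a: i32, b: i32) -> i32 {
        a + b
    }

    // Illustrative build invocation via the Emscripten-based target
    // (hypothetical flags; the real target names are still settling):
    //   rustc --target=wasm32-unknown-emscripten --crate-type=cdylib lib.rs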

[+] s3th|10 years ago|reply
As we get closer to having a WebAssembly demo ready in multiple browsers, the group has added a small website on GitHub [0] that should provide a better overview of the project than browsing the disparate repos (design, spec, etc.).

Since the last time WebAssembly hit HN, we've made a lot of progress designing the binary encoding [1] for WebAssembly.

(Disclaimer: I'm on the V8 team.)

[0]: http://webassembly.github.io/

[1]: https://github.com/WebAssembly/design/blob/master/BinaryEnco...

[+] KMag|10 years ago|reply
About the binary encoding... It's a bit easy to armchair these things, and it's too late for WebAsm now... but if you're on the V8 team, you have access to Google's PrefixVarint implementation (originally by Doug Rhode, IIRC from my time as a Google engineer). A 128-bit prefix varint is exactly as big as an LEB128 int in all cases, but is dramatically faster to decode and encode. It's closely related to the encoding used by UTF-8. Doug benchmarked PrefixVarints and found both Protocol Buffer encoding and Protocol Buffer decoding would be significantly faster if they had thought of using a UTF-8-like encoding.

LEB128 requires a mask operation and a branch on every single byte (maybe skipping the final byte), so a full 128-bit value costs around 19 masks and 19 branches. Using 32-bit or 64-bit native loads gets tricky, and I suspect all of the bit twiddling necessary makes it slower than the naive byte-at-a-time mask-and-branch.

    7 bits -> 0xxxxxxx
    14 bits -> 1xxxxxxx 0xxxxxxx
    ...
    35 bits -> 1xxxxxxx 1xxxxxxx 1xxxxxxx 1xxxxxxx 0xxxxxxx
    ...
    128 bits -> 1xxxxxxx 1xxxxxxx 1xxxxxxx ... 0xxxxxxx
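
To make the per-byte cost concrete, here's a sketch of the naive decode loop (Rust, illustrative only, not any particular engine's actual decoder):

    // LEB128 decode: one mask and one branch per byte of input.
    fn leb128_decode(bytes: &[u8]) -> Option<(u64, usize)> {
        let mut result = 0u64;
        let mut shift = 0;
        for (i, &b) in bytes.iter().enumerate() {
            if shift >= 64 {
                return None; // malformed: too many continuation bytes for u64
            }
            result |= ((b & 0x7f) as u64) << shift; // mask off the data bits
            if b & 0x80 == 0 {
                return Some((result, i + 1)); // branch on the continuation bit
            }
            shift += 7;
        }
        None // input ended mid-varint
    }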
Prefix varints just shift that unary encoding to the front, so you have at most 2 single-byte switch statements, for less branch misprediction, and for larger sizes it's trivial to make use of the processor's native 32-bit and 64-bit load instructions (assuming a processor that supports unaligned loads).

    7 bits -> 0xxxxxxx
    14 bits -> 10xxxxxx xxxxxxxx
    ...
    35 bits -> 11110xxx xxxxxxxx xxxxxxxx xxxxxxxx xxxxxxxx
    ...
    128 bits -> 11111111 11111111 xxxxxxxx xxxxxxxx ... xxxxxxxx
There's no real advantage to LEB128 other than that more people have heard of it. A 128-bit PrefixVarint is always exactly the same number of bytes as the LEB128 equivalent; it just puts the length-encoding bits together so you can branch on them more easily, and keeps them out of the way of native loads for your data bits.
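
A sketch of the corresponding decode for a 64-bit prefix varint, assuming a little-endian tail layout (one of several reasonable layouts; Google's actual PrefixVarint may differ):

    // Prefix-varint decode: count leading ones once, then one native load.
    fn prefix_varint_decode(bytes: &[u8]) -> Option<(u64, usize)> {
        let first = *bytes.first()?;
        let extra = first.leading_ones() as usize; // extra bytes after the first
        if extra > 8 || bytes.len() < 1 + extra {
            return None;
        }
        let mut buf = [0u8; 8];
        buf[..extra].copy_from_slice(&bytes[1..1 + extra]);
        let tail = u64::from_le_bytes(buf); // a single unaligned load in practice
        let value = if extra == 8 {
            tail // first byte was pure prefix; all 64 data bits are in the tail
        } else {
            let head = (first as u64) & (0x7fu64 >> extra); // data bits left in byte 0
            (head << (8 * extra)) | tail
        };
        Some((value, 1 + extra))
    }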

Also, for variable-length integers, zigzag encoding and decoding is faster than sign extension. Protocol Buffers got that part right.
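
For reference, zigzag is just shifts and an xor in each direction (sketch):

    // ZigZag: interleave signed values so small magnitudes encode small.
    // 0 -> 0, -1 -> 1, 1 -> 2, -2 -> 3, ...
    fn zigzag_encode(n: i64) -> u64 {
        ((n << 1) ^ (n >> 63)) as u64
    }

    fn zigzag_decode(z: u64) -> i64 {
        ((z >> 1) as i64) ^ -((z & 1) as i64)
    }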

On the security front: if there are no non-canonical representations, there can't be security bugs caused by developers forgetting to check for non-canonical representations. For this reason, you may want to use a bijective base 256 [0] encoding, so that no integer has multiple encodings. In the UTF-8 world, there have been several security issues due to decoders not properly checking for non-canonical encodings while programmers did slightly silly checks against constant byte arrays. A bijective base 256 saves you less than half a percent in space usage, and the cost is only one subtraction at encoding time and one addition at decoding time.
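
To illustrate the biasing trick on a toy two-byte varint: at encode time, subtract the number of values the shorter form already covers, so every integer gets exactly one representation (sketch; the field layout follows the 7/14-bit rows above):

    // Biased ("bijective") encoding: the two-byte form starts where the
    // one-byte form ends, so 0..=127 has exactly one representation.
    fn encode_biased(n: u16) -> Vec<u8> {
        if n < 128 {
            vec![n as u8] // 0xxxxxxx
        } else {
            assert!(n < 128 + (1 << 14), "needs more than two bytes");
            let m = n - 128; // subtract the 128 values the short form covers
            vec![0x80 | (m >> 8) as u8, m as u8] // 10xxxxxx xxxxxxxx
        }
    }

    fn decode_biased(bytes: &[u8]) -> Option<u16> {
        match *bytes.first()? {
            b0 if b0 < 128 => Some(b0 as u16),
            b0 => {
                let b1 = *bytes.get(1)?;
                Some((((b0 & 0x3f) as u16) << 8 | b1 as u16) + 128) // add the bias back
            }
        }
    }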

[0] https://en.wikipedia.org/wiki/Bijective_numeration

[+] machuidel|10 years ago|reply
Since I started hearing about WebAssembly, I cannot stop thinking about the possibilities. For example: npm compiling C dependencies together with ECMAScript/JavaScript into a single WebAssembly package that can then run inside the browser.

For people thinking this will close the web even more because the source will not be human-readable: remember that JavaScript already gets minified, and is already a compilation target (via Emscripten). The benefits I see compared to what we have now:

- Better sharing of code between different applications (desktop, mobile apps, server, web etc.)

- People can finally choose their own favorite language for web-development.

- Closer to the way it will be executed, which will improve performance.

- Code compiled from different languages can work / link together.

Then for the UI part, there are the common languages/vocabularies we can use to communicate with humans: HTML, SVG, CSS, etc.

I only hope this will improve the "running same code on client or server to render user-interface" situation as well.

[+] rl3|10 years ago|reply
Considering how critical SharedArrayBuffer is for achieving parallelism in WebAssembly, I'm hoping we see major browsers clean up their Worker API implementations, or even just comply with the spec in the first place.

Right now things are a mess in Web Worker land, and have been for quite some time.

[+] stevenh|10 years ago|reply
If anyone at infoworld.com reads these comments:

On the top of the page, there is a horizontal menu containing "App Dev • Cloud • Data Center • Mobile ..."

When I position my cursor above this menu and then use the scroll wheel to begin scrolling down the page, once this menu becomes aligned with my cursor, the page immediately stops scrolling and the scroll wheel functionality is hijacked and used to scroll this menu horizontally instead.

It took a few seconds to realize what was happening. At first I thought the browser was lagging - why else would scrolling ever abruptly stop like that?

I closed the page without reading a single word.

[+] eggy|10 years ago|reply
I still think there is a lot of room for static pages with links, in the style that people seem to be prematurely waxing melancholy about when forecasting where WebAssembly _may_ lead the internet. I was always able to find sites of interest that didn't include Flash, Java applets, and company when I just wanted to read something. I find some of the scroll-hijacking and other JavaScript goodies on modern pages to be either a distraction or non-functional on different devices. On the other hand, I am particularly happy about, and working with, Pollen in Racket, a creation by Matthew Butterick. Pollen is a language created with Racket for making digital books (books as code), bringing some long-needed, real-world publishing aesthetics back to the web [1,2]. I may even buy a font of his to get going and support him at the same time!

[1] http://docs.racket-lang.org/pollen/

[2] http://practical.typography.com

[+] no1youknowz|10 years ago|reply
I think the web may split into two.

1) 'Simple' web pages will stick with jQuery, React, Angular, etc. type code. Where you can still click view source and see what's going on. Where libs are pulled from CDNs, etc.

2) 'Complex' SaaS web apps, where you need native functionality. This will be a huge bonus. I'm in this space. I would love to see my own application as a native app. The UI wins alone make it worth it!

[+] gsmethells|10 years ago|reply
To me, it's more about choice of programming language than performance. Though the latter is very important, I think the former is what will open up doors to making the browser a platform of choice (pun intended). Currently, it feels like JavaScript is the Comcast of the web. Everyone uses it, but that's only because there aren't any other options available to them.
[+] hutzlibu|10 years ago|reply
Sorry, but most of the discussion here is completely missing the point about WebAssembly.

It is just a technology to make things delivered over the web faster. And it is open. And no less secure than JS. So I think it's great.

Good technology does exactly what its creator wants. And if people don't like some of the things that get created with it, then that is not a problem of the technology itself.

So people can do good things or bad things with it. But on the web, we have the freedom to choose where we go.

And if we don't like ads, for example, we should be aware that website creators still want money for their work, so maybe we should focus on and support a different funding model. I like the pay-what-you-want or donation model the most; Wikipedia shows that this is possible on a large scale...

[+] vruiz|10 years ago|reply
I want to agree with him; I'd like to see a future where WebAssembly closes the gap between native apps and the web. For better or worse, browsers are the new OSes, and I dream of a future where all vendors come up with the equivalent of a POSIX standard, where any web application can access all (or a wide common subset) of any device's capabilities, from the filesystem to native UI elements.
[+] icedchai|10 years ago|reply
WebAssembly... Wow, if we keep going, we'll re-invent what Sun achieved 20 years ago with Java. If only they hadn't f-ed it up...
[+] protomyth|10 years ago|reply
The JVM's problem was that it had applets but not JavaScript's DOM integration. I do often wonder what would have happened if, instead of JavaScript, we had gotten WebAssembly and WebSockets in 1995.
[+] nadam|10 years ago|reply
A question to WebAssembly experts: how easy is it to use WebAssembly as a sandboxed, embedded scripting mechanism in my own native (C++) application? I am writing a native real-time system (a distributed 3D engine for VR) in which I send scripts over the wire between machines, and I need to call an update() method on these scripts something like 90 times a second. I need complete sandboxing, because my trust model is that what is trusted on machine A may be absolutely untrusted on machine B: not only restricting the scripts to the functions I explicitly let them call, but also placing hard limits on their memory usage and execution time. Preferably they should execute in-process, so they can reach the memory I allow and be called from the thread I want. Currently I go with Lua, but to get really good performance I will need to research this topic more deeply later.
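
For concreteness, here is a sketch of the in-process shape this could take, using the wasmtime crate's fuel metering. Note that wasmtime postdates this thread and the API names here are from a recent release, so treat this as illustrative rather than definitive:

    // Sketch: running an untrusted guest's update() with a hard
    // instruction budget. Memory caps would be layered on via
    // wasmtime's ResourceLimiter; omitted here for brevity.
    use wasmtime::{Config, Engine, Instance, Module, Store};

    fn run_untrusted(wasm_bytes: &[u8], dt: f32) -> wasmtime::Result<()> {
        let mut config = Config::new();
        config.consume_fuel(true); // meter execution deterministically
        let engine = Engine::new(&config)?;
        let module = Module::new(&engine, wasm_bytes)?;

        let mut store = Store::new(&engine, ());
        store.set_fuel(100_000)?; // hard cap per call; exhaustion traps

        // No imports passed in: the guest can only touch what we expose.
        let instance = Instance::new(&mut store, &module, &[])?;
        let update = instance.get_typed_func::<f32, ()>(&mut store, "update")?;
        update.call(&mut store, dt) // traps if the fuel budget runs out
    }
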
[+] n00b101|10 years ago|reply
What is the upgrade path for Emscripten users? I understand that LLVM will have a WebAssembly backend, but how will OpenGL-to-WebGL translation work, for example?
[+] bcoates|10 years ago|reply
If you think WebAssembly (or asm.js) is a good idea, I would very much like you to do the thought experiment of what design decisions something like WebAssembly would have made 15 or 25 years ago, and what consequences those would have today.

Helpful research keywords: Itanium RISC Alpha WAP Power EPIC Java ARM Pentium4 X.25

[+] Executor|10 years ago|reply
I'm conflicted. On one hand, I support open data and raw documents. But that precludes native-like, real-time applications, and it forces developers to work in JavaScript, which is a terrible language.

On the other hand we have lock-in ecosystems, closed silos, that are detrimental to the commons.

The only consolation I have is that if WebAssembly provides a bytecode instead of machine code, then we still retain the ability to reverse engineer it.

In the end, we ALL have to do the hard task of informing every single person why Apple/FB/MS/Google are harmful to us and why we should boycott their programs/services.