
Porting WebGL Shaders to WebGPU

88 points | AshleysBrain | 4 years ago | construct.net | reply

63 comments

[+] Jasper_|4 years ago|reply
I took a very different approach for porting my WebGPU shaders over: I have far too many shaders to port (sometimes pretty massive ones! [0]), so I used Naga [1] at to runtime-translate my GLSL shaders to WGSL. I had to get involved a bit upstream to fix quite a few different bugs in the implementation, but once I got it working I was really happy with the result. It's quite fast, it compiles to a pretty light-weight WebAssembly module, and it works surprisingly well.

[0] https://github.com/magcius/noclip.website/blob/e15f8045cf262...
[1] https://github.com/gfx-rs/naga/
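Roughly what such a translation has to produce — the GLSL input is sketched in the comments, with a hand-written WGSL equivalent below (illustrative only, not Naga's literal output; binding numbers and names are made up):

```wgsl
// GLSL 300 es input (as comments):
//   uniform sampler2D u_tex;
//   in vec2 v_uv;
//   out vec4 fragColor;
//   void main() { fragColor = texture(u_tex, v_uv); }

// WGSL splits GLSL's combined sampler2D into a texture and a sampler binding:
@group(0) @binding(0) var u_tex: texture_2d<f32>;
@group(0) @binding(1) var u_samp: sampler;

@fragment
fn main(@location(0) v_uv: vec2<f32>) -> @location(0) vec4<f32> {
  return textureSample(u_tex, u_samp, v_uv);
}
```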

[+] saurik|4 years ago|reply
Why do you do this translation at runtime--causing every user to have to re-execute it--instead of at compile-time? Is there some kind of per-user optimization this makes possible?
[+] astlouis44|4 years ago|reply
Our team is working to bring WebGPU support to Unreal Engine 4 and Unreal Engine 5, so game developers and real-time 3D developers can export their creations to the web at near-native performance and access features like compute shaders.

Long term goal is to disrupt Steam and the App Stores.

We're also working on WebXR support to enable UE VR apps on the web. If anyone is interested and wants to learn more, you can join our Discord here: https://discord.gg/zUSZ3T8

[+] Impossible|4 years ago|reply
It seems like you aren't associated with Epic? What's the long term outcome here? Have Epic take your PRs? Get hired/acquihired (it's unclear if you are a company or a group of hobbyists)? It seems like when WebGPU is production ready Epic will support it as well as WebXR...

Epic's goal is very obviously to disrupt Steam and app stores as well... Ignore all of this if you actually are an Epic employee of course :)

[+] ianlevesque|4 years ago|reply
As a gamer though, why would I want to play any of these developers' games through the browser instead of Steam?
[+] jabl|4 years ago|reply
As a non-web-developer, I'm kinda excited about WebGPU. Specifically WebGPU native, that has potential to be a portable modern 3D graphics API without the difficulty of using Vulkan or DX12.
[+] pjmlp|4 years ago|reply
It can only expose a subset of their capabilities.

Don't expect using mesh shaders on WebGPU for example.

[+] astlouis44|4 years ago|reply
What game engine do you use? UE4 or Unity?
[+] slx26|4 years ago|reply
My main worry with WebGPU is the dependence on JavaScript. If I understand correctly, you can't use web APIs, or things like WebGPU, directly from WebAssembly anyway, so there's always a JavaScript glue layer involved.

Now, I know some languages like Rust have decent bindings that hide all that, but I wonder about the performance cost of that glue. Has anyone measured it, or estimated how much better we could do without it? Or am I completely wrong about this? Or are there clear plans to make this better/disappear? (Though in my brief research I didn't find anything like it.)

[+] modeless|4 years ago|reply
WebGPU is a less chatty API than WebGL. You can kick off a huge amount of GPU computation with just a couple of function calls. The overhead of the function calls themselves is usually not a big issue. That said, I know there has been work in Chrome to make these calls more efficient for WebGL and I think the overhead is actually small.
[+] wrnr|4 years ago|reply
The same WebGPU API can be used as a cross platform layer to do graphics, no js needed. For example rust's wgpu can just be used without a browser.
[+] lainga|4 years ago|reply
The `let` and `var` split does seem confusing from a JS perspective, but it seems to be so that you can have the same mental model between WGSL and SPIR-V:

https://github.com/gpuweb/gpuweb/issues/2207
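A minimal WGSL illustration of the distinction:

```wgsl
fn demo() {
  let a = 1.0;  // `let`: an immutable named value, like a SPIR-V SSA result
  var b = 1.0;  // `var`: addressable, mutable storage, like SPIR-V's OpVariable
  b = b + a;    // ok: `b` is storage; `a = 2.0;` would be a compile error
}
```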

[+] devit|4 years ago|reply
It seems to be a syntax mishmash between Rust, which uses `let` and `let mut`, and JavaScript, which uses `const` and `var`, giving WGSL's `let` and `var`, probably chosen because they are the shortest.
[+] jeroenhd|4 years ago|reply
As someone who's never used this API: is there some kind of permission prompt for this before websites start trying to use my GPU for crypto mining? Most CPU crypto miners were basically worthless, but GPUs are a whole lot more capable.
[+] NiekvdMaas|4 years ago|reply
Is there any advantage at this point in time to port a game to WebGPU? Is it going to run faster, or have broader browser support in the future?
[+] klodolph|4 years ago|reply
It's like a web version of Vulkan / Metal / DX12, in that it's designed to more directly represent how modern GPUs work. If you are happy with your WebGL performance, by all means, continue using it. However, it can sometimes be a bit hard to understand what is going wrong (performance-wise) in an OpenGL or WebGL application... the underlying implementation will sometimes paper over its limitations. You can accidentally stray from the high-performance paths.

There are some feature differences, like how WebGPU has compute shaders.

[+] modeless|4 years ago|reply
At this time I would recommend WebGL 2. WebGL 2 is finally supported in all major browsers now that Safari 15 shipped. WebGL 2 has some of the benefits mentioned in the article for WebGPU, such as a reasonable minimum texture size (2048 vs 64 in WebGL 1) and the built-in ability to get the size of a texture in a shader.

WebGPU will not ship in all browsers for some time, and when it does it will not have wider hardware support than WebGL 2. It has the potential to be faster in some cases, but the difference is unlikely to matter for most. The big draw of WebGPU should be compute shader support, though it is definitely possible to do compute work in WebGL 2, either with transform feedback or just regular shaders.
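On the texture-size point, WGSL has a built-in for this too — a small sketch (the binding is illustrative; the exact return type of `textureDimensions` has varied between spec drafts):

```wgsl
@group(0) @binding(0) var tex: texture_2d<f32>;

fn texel_size() -> vec2<f32> {
  // textureDimensions is WGSL's analogue of GLSL ES 3.00's textureSize()
  let dims = textureDimensions(tex);
  return vec2<f32>(1.0) / vec2<f32>(dims);
}
```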

[+] gnarbarian|4 years ago|reply
Assuming you are comparing it to WebGL: WebGPU has better compute support. In practice this means you can more effectively use shaders to perform things like physics, math-heavy computation, or algorithms that benefit from massive concurrency. This puts the advantages way beyond just games.
[+] pjmlp|4 years ago|reply
Not at all, it took 10 years for WebGL 2.0 to be generally available, and WebGPU is with luck reaching MVP 1.0 in 2022 on Chrome, let alone anywhere else.
[+] fulafel|4 years ago|reply
No to the first question, unless you count learning, as it's not enabled in browsers yet. Maybe to the second (depending on your game).
[+] skinner_|4 years ago|reply
> GLSL supports the ternary ?: operator. WGSL does not support this, but provides the built-in function select(falseValue, trueValue, condition) which does much the same thing (although mind that parameter order!).

I'd really like to hear a justification for that parameter order. To me it seems like the 4th best option out of 6.
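For anyone porting: the conversion looks like this (a minimal sketch; the function and parameter names are made up):

```wgsl
fn pick(x: f32, y: f32, cond: bool) -> f32 {
  // GLSL: return cond ? x : y;
  // WGSL: select(falseValue, trueValue, condition) -- false-value comes first
  return select(y, x, cond);
}
```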

[+] jamienicol|4 years ago|reply
It could be because it matches the argument order of GLSL's `mix()`. Not sure why GLSL chose that order, though perhaps it's less unintuitive for an interpolation than for a select. (`mix` acts as a select rather than an interpolation when the third argument is a boolean.)
[+] kevingadd|4 years ago|reply
lerp(zero, one, t) -> select(falseValue, trueValue, condition ? 1.0 : 0.0)
[+] RantyDave|4 years ago|reply
Semi-related: does anyone have WebGPU running on an M1 Mac? Pretty sure it "can be done" but the appropriate experimental options are not even there on either Safari or Chrome...
[+] kangz|4 years ago|reply
WebGPU is enabled by default in Chrome on Mac as part of the WebGPU Origin Trial (https://web.dev/gpu). We didn't really test on M1 Macs but several people reported it just works.
[+] AshleysBrain|4 years ago|reply
Sorry about the formatting at the top. An unexpected problem with the blog system on our website! The rest of it looks OK though.
[+] jak6jak|4 years ago|reply
Really glad you made this article. Do you guys use Rust for WebGPU?