
qz_kb|2 years ago

This is usually done with shaders and a ring of buffers that maintain state.
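A minimal sketch of that pattern in WGSL (buffer and function names are illustrative, not from any particular project): one compute pass reads last frame's state from one buffer and writes the next state to another, and the host application swaps which buffer is bound to which slot every frame.

```wgsl
// Ping-pong state buffers: read last frame's state, write the next.
// The host application swaps the two bindings every frame.
@group(0) @binding(0) var<storage, read> stateIn : array<f32>;
@group(0) @binding(1) var<storage, read_write> stateOut : array<f32>;

@compute @workgroup_size(64)
fn step(@builtin(global_invocation_id) id : vec3<u32>) {
  let i = id.x;
  if (i >= arrayLength(&stateIn)) {
    return;
  }
  // Illustrative update rule: exponential decay of each cell.
  stateOut[i] = stateIn[i] * 0.99;
}
```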


akomtu|2 years ago

Fragment shaders can only go so far. Can you do something like this in WebGPU?

outputTexture2d[inputTexture2d[inputPosition]]++

In other words, if you have a large texture with (x,y) coordinates of points, can you draw another texture that shows a density cloud of these points? In WebGL2 it becomes a PhD-level problem.
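In WebGPU this kind of scatter is a single compute pass using atomics (a sketch; binding layout, names, and the assumption that point coordinates are normalized to [0, 1) are all mine). Each invocation reads one point, converts it to a bin index, and atomically increments that bin; the resulting buffer is the density cloud.

```wgsl
struct Params {
  width : u32,
  height : u32,
}

@group(0) @binding(0) var<uniform> params : Params;
@group(0) @binding(1) var<storage, read> points : array<vec2<f32>>;  // (x, y) in [0, 1)
@group(0) @binding(2) var<storage, read_write> bins : array<atomic<u32>>;

@compute @workgroup_size(64)
fn scatter(@builtin(global_invocation_id) id : vec3<u32>) {
  if (id.x >= arrayLength(&points)) {
    return;
  }
  let p = points[id.x];
  let x = min(u32(p.x * f32(params.width)), params.width - 1u);
  let y = min(u32(p.y * f32(params.height)), params.height - 1u);
  // The operation fragment shaders can't express: a write to an
  // arbitrary, data-dependent location.
  atomicAdd(&bins[y * params.width + x], 1u);
}
```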

pjmlp|2 years ago

One needs to do multiple passes; I'm not sure that counts as a PhD-level problem.

superzamp|2 years ago

Total graphics / shaders / GPU noob here. Does that mean you'll essentially get free visualisations (albeit nonsensical ones) as a byproduct of your computations?

AgentME|2 years ago

If you wanted to do compute in a shader before WebGPU, using WebGL instead, then I think the answer is kind of yes. It wasn't "free" in the sense of requiring no code, but producing a texture was unavoidable, since that was the only output a shader had. Now that WebGPU supports compute shaders properly, you no longer have to do compute in a shader that produces textures.

sva_|2 years ago

No. In WebGPU it's an @compute shader rather than a combination of @vertex and @fragment shaders (which is what does graphics).

You could certainly visualize it, but not as a side effect.
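The distinction in WGSL terms (skeletal and illustrative, not a complete program): the entry-point attribute decides which kind of pipeline the shader can be used in.

```wgsl
// Compute pipeline: no rasterizer involved; results go to storage
// buffers or storage textures.
@compute @workgroup_size(64)
fn main_cs(@builtin(global_invocation_id) id : vec3<u32>) {
  // ... arbitrary computation ...
}

// Render pipeline: vertex + fragment stages feed the rasterizer,
// and outputs land in a render target.
@vertex
fn main_vs(@builtin(vertex_index) i : u32) -> @builtin(position) vec4<f32> {
  return vec4<f32>(0.0, 0.0, 0.0, 1.0);
}

@fragment
fn main_fs() -> @location(0) vec4<f32> {
  return vec4<f32>(1.0, 0.0, 0.0, 1.0);
}
```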

jesse__|2 years ago

I think it depends, but given an arbitrary compute pipeline, you should be able to write the results (or intermediate results) to the screen with minimal effort.
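One common minimal-effort route (a sketch; the texture name `computeOutput` and the assumption that the compute pass wrote to a sampleable texture are mine): draw a single fullscreen triangle with no vertex buffer and sample the compute output in the fragment shader.

```wgsl
@group(0) @binding(0) var computeOutput : texture_2d<f32>;
@group(0) @binding(1) var samp : sampler;

// Fullscreen triangle generated from vertex_index alone:
// vertices at (-1,-1), (-1,3), (3,-1) cover the whole screen.
@vertex
fn vs(@builtin(vertex_index) i : u32) -> @builtin(position) vec4<f32> {
  let x = f32(i32(i) / 2) * 4.0 - 1.0;
  let y = f32(i32(i) % 2) * 4.0 - 1.0;
  return vec4<f32>(x, y, 0.0, 1.0);
}

@fragment
fn fs(@builtin(position) pos : vec4<f32>) -> @location(0) vec4<f32> {
  // Map the fragment's screen position into [0, 1) texture coordinates;
  // assumes the texture matches the render-target size.
  let dims = vec2<f32>(textureDimensions(computeOutput));
  return textureSample(computeOutput, samp, pos.xy / dims);
}
```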