DaleCurtis | 2 years ago

> Ideally I'd like to be able to set the CBR / VBR bitrate

What's wrong with the existing VBR/CBR modes? https://developer.mozilla.org/en-US/docs/Web/API/VideoEncode...
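For reference, the existing knobs look roughly like this (a minimal sketch: the codec string, resolution, and bitrate values are illustrative, and the `VideoEncoder` construction is browser-only, so it's guarded here):

```javascript
// WebCodecs already exposes CBR vs. VBR through VideoEncoderConfig:
// bitrateMode is 'constant' (CBR) or 'variable' (VBR), paired with a
// target bitrate. Values below are just examples.
const config = {
  codec: 'avc1.42001E',   // H.264 Baseline, level 3.0 (illustrative)
  width: 1280,
  height: 720,
  bitrate: 2_000_000,     // target bits per second
  bitrateMode: 'constant', // 'constant' for CBR, 'variable' for VBR
  framerate: 30,
};

// In a browser context, the config is handed to a VideoEncoder instance.
// (Guarded because VideoEncoder only exists in browsers / workers.)
if (typeof VideoEncoder !== 'undefined') {
  const encoder = new VideoEncoder({
    output: (chunk, metadata) => { /* mux the encoded chunk */ },
    error: (e) => console.error(e),
  });
  encoder.configure(config);
}
```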

> I don't think you need a shader...

Ah, I see what you mean. It'd probably be hard for us to standardize this in a way that worked across platforms, which likely precludes us from doing anything quickly here. The things easiest to standardize for WebCodecs are things that are already standardized as part of the relevant codec spec (e.g., AVC, AV1) and well supported across a significant range of hardware.

> ... instead of round-tripping into a CPU buffer

We're working on optimizing this in 2024; we already avoid CPU buffers in some cases, but not in as many as we could.

vlovich123 | 2 years ago

> It'd probably be hard for us to standardize this in a way that worked across platforms which likely precludes us from doing anything quickly here. The stuff easiest to standardize for WebCodecs is stuff that's already standardized as part of the relevant codec spec (e.g, AVC, AV1, etc) and well supported on a significant range of hardware.

As I said, Oculus Link worked with off-the-shelf encoders. Only the Nvidia one needed some special work, and even that's no longer needed since they raised the limit on concurrent encoder sessions (and the amount of work was really trivial: just adjusting some header information in the H.264 framing). I think all you really need is either the ability to slice a VideoFrame into strips at zero cost and have the user feed them into separate encoders, OR the ability to request sliced encoding, implemented under the hood however makes sense (multiple encoder sessions, or the NVENC slice API when using Nvidia hardware). You could even make support for sliced encoding optional and implement it only for the backends where it's doable.
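The slicing idea could be sketched roughly like this. This is a hypothetical illustration, not an existing API: `stripRects` and `encodeInStrips` are made-up names, and whether the `visibleRect` crop is actually zero-cost depends on the browser's implementation. Only the strip geometry is plain arithmetic; the `VideoFrame`/`VideoEncoder` calls are browser-only WebCodecs APIs.

```javascript
// Compute N horizontal strip rectangles covering a width x height frame.
// The last strip absorbs any remainder rows so the strips tile the frame.
function stripRects(width, height, n) {
  const stripH = Math.floor(height / n);
  return Array.from({ length: n }, (_, i) => ({
    x: 0,
    y: i * stripH,
    width,
    height: i === n - 1 ? height - i * stripH : stripH,
  }));
}

// Hypothetical user-side slicing: crop the frame into strip views via
// visibleRect and feed each strip to its own encoder session.
function encodeInStrips(frame, encoders) {
  const rects = stripRects(frame.codedWidth, frame.codedHeight, encoders.length);
  rects.forEach((rect, i) => {
    // A VideoFrame constructed from another frame with a visibleRect is a
    // cropped view; whether this avoids a copy is implementation-defined.
    const strip = new VideoFrame(frame, { visibleRect: rect });
    encoders[i].encode(strip);
    strip.close();
  });
  frame.close();
}
```

The alternative the comment describes (requesting sliced encoding directly) would instead hide this loop inside the encoder, mapping it to multiple sessions or a native slice API per backend.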