Great article. The description of how they handle shaders is just bonkers to me.
Is that really what you’d have to go through to have a working system with plugin shaders from 3rd parties on multiple backends? Or is it mostly the result of time and trying to keep backwards compatibility with existing plugins?
Telling external devs “Write a copy in every shader language” would certainly be easier for the core team but that’s obviously undesirable.
Transpiling shaders is what most game engines have done for a decade now. Everybody thinks it's stupid in that field as well, but there is no viable alternative.
Vulkan is not supported on game consoles, with the exception of Switch, and even there you should use NVN instead.
It is not officially supported on Windows; it works because the GPU vendors use the Installable Client Driver API to bring their own driver stack. This was initially created for OpenGL and nowadays sits on top of the DirectX runtime.
In the embedded space, many of the OSes that support graphical output are still focused on OpenGL ES.
Vulkan capture support on Windows was introduced in v25 (on Linux you need to use a plugin). There is no Vulkan renderer support, which the post clearly states...
I’m more excited about the upcoming support for VST3, but this is still welcome news. It is far easier than getting hardware encoding working with Rockchip SoCs on Linux.
I'm no expert on the topic, so I maybe understood only 5% of what I read, but I wish we had more posts like this. Announcements without any technical details sound like marketing pieces.
> Metal takes Direct3D's object-oriented approach one step further by combining it with the more "verbal" API design common in Objective-C and Swift in an attempt to provide a more intuitive and easier API for app developers to use (and not just game developers) and to further motivate those to integrate more 3D and general GPU functionality into their apps.
slightly off-topic perhaps, but i find it amazing that an os-level 3d graphics api can be built in such a dynamic language as objective-c; i think it really goes to show how much optimization went into `objc_msgSend()`... it does a lot of heavy lifting in the whole os.
Modern graphics APIs minimize the number of graphics API calls vs. OpenGL and similar. Vulkan/Metal/DirectX 12 will have you pass command buffers with many commands in them instead of separate API calls for everything.
In the early 2000's there was a book on using Direct3D from C# that was pretty influential as far as changing people's assumption that you couldn't do high performance graphics in a GC'd language. In the end a lot of the ideas overlap with what c/c++ gamedevs do, like structuring everything around fixed sized tables allocated at load time and then minimal dynamic memory usage within the frame loop. The same concepts can apply at the graphics API level. Minimize any dynamic language overhead by dispatching work in batches that reference preallocated buffers. That gets the language runtime largely out of the way.
No, it doesn't. You won't find it used much, if at all, at these levels of the OS. Once you get past Cocoa and friends it's restricted subsets of C++ (IOKit, for example).
I hope modern GPU APIs are just a stepping stone to something simpler. OpenGL is loved and hated, and I have grown to love it after using the new stuff.
It says in passing, “As the Metal backend is only supported on Apple Silicon devices, GPU and CPU share the same memory”, in the part talking about the differences between the Direct3D and Metal render pipelines.
Not sure why though, because Metal 3 is still supported on a bunch of Intel Macs...
Streaming video from a camera? The newer Mac Minis were fine already just because the M-series chips are very fast, but hopefully this should make it much more efficient.
Not all streamers are game streamers, and not all OBS users are streamers. I installed it on all of my workstations for its screen capture and virtual camera features.
Does anyone know if an AMD 8845HS with 780M graphics (running Fedora) can handle this? Ideally it would use very low system resources (I only have 16 GB of RAM) and very little storage space; one or two frames per second is enough. Ideally it should also compress even more if nothing on the screen has changed for a while, and create a new file every eight hours or so.
NVIDIA has a "lower overhead" screen recorder, no? It's Alt+F9 or something. AFAIK it's supposed to be optimized, because they own the stack and all. It's probably only on Windows though.
JSR_FDED | 3 months ago
“OBS Studio Gets A New Renderer: How OBS Adopted Metal”
robert_foss | 3 months ago
Vulkan support was introduced in OBS Studio 25.0 in March 2020, 5.5 years ago.
keyle | 3 months ago
For AAA titles with newer graphics, you can always send a capture of the screen of the PC with the NVIDIA card through a capture card.
Back in my days of streaming, circa 2017, macOS was not an option. Today I'd do it with any M-processor Mac without a second thought.
snvzz | 3 months ago
I hope the next version actually works in some facility.
daviddever23box | 3 months ago
Turning off nearly everything iCloud- or Spotlight-related is a pretty good start; disable network access and you may find even more pearls of wisdom.
29athrowaway | 3 months ago
If you are:
- recording your screen but not streaming
- not customizing what goes into your screen
then use something else. GPU Screen Recorder has lower overhead and produces much smoother recordings: https://git.dec05eba.com/gpu-screen-recorder/about/
jasonlotito | 3 months ago
Edit: I think you might have skipped reading the post. It's about OBS on macOS, where QuickTime exists. Your suggestion seems geared toward Linux.