Great to see Elixir gaining traction in mission-critical broadcast systems! I wonder how much of Cyanview's reliability comes from Elixir specifically versus just a good implementation of MQTT? And were there any specific Elixir features that were essential and couldn't be replicated in other languages?
ghislainle|11 months ago
We use MQTT a lot; it is really a central piece of our architecture. But Elixir brings a lot of benefits for handling many processes which are often loosely coupled. The BEAM and OTP offer a sane approach to concurrency, and Elixir is a nice language on top. Here are what I find to be the most important benefits:
- Good process isolation: even the heap is per-process. This allows us to have robust, mature code running alongside more experimental features without the fear of everything going down. And you still have easy communication between processes.
- Supervision trees allow easy process management. I also created a special supervisor with different restart strategies; the language allows this, and it then integrates like any other supervisor. With network connections being broken and later reconnected, the resilience of our system is tested regularly, like a physical chaos monkey.
- Immutability as implemented by the BEAM greatly simplifies writing concurrent code. Inside a process, you don't need to worry about data changing under you; no other process can change your state. So no more mutexes/critical sections (or very little need). You can still have deadlocks, though, so it is not a silver bullet.
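For readers unfamiliar with OTP, the restart behaviour described in the second bullet can be caricatured in a few lines. This is a deliberately naive Python sketch of the "restart the child when it fails" idea; real OTP supervisors are far richer (supervision trees, `one_for_one`/`one_for_all` strategies, restart-intensity limits), and the function and variable names here are invented for illustration:

```python
def supervise(child, max_restarts=3):
    """Toy restart loop: rerun `child` when it raises an exception,
    giving up (re-raising) after max_restarts failed attempts.
    An OTP supervisor would also track restart intensity over time."""
    restarts = 0
    while True:
        try:
            return child()
        except Exception:
            restarts += 1
            if restarts > max_restarts:
                raise

# A child that fails twice before succeeding, e.g. a link that
# comes back after a couple of reconnect attempts.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("link down")
    return "connected"
```

Here `supervise(flaky)` silently absorbs the first two crashes and returns `"connected"` on the third attempt, which is the "physical chaos monkey" scenario: broken connections are just restarts, not outages.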
somethingsome|11 months ago
I work at a university and we build acquisition systems with exotic cameras and screens. Do you think we could meet sometime to discuss possible (commercial and research) projects?
jerf|11 months ago
As much as I am a critic of the system, if this is your use case, it is, out of the box, a very strong foundation for what you need to get done.
davidbou|11 months ago
For anyone interested in the video stream itself, here's a summary. On-site, everything is still SDI (HD-SDI, 3G-SDI, or 12G-SDI), which is a serial stream ranging from 1.5Gbps (HD) to 12Gbps (UHD) over coax or fiber, with no delay. Wireless transmission is typically managed via COFDM with ultra-low latency H.264/H.265 encoders/decoders, achieving less than 20ms glass-to-glass latency and converting from/to SDI at both ends, making it seamless.
SMPTE 2110 is gaining traction as a new standard for transmitting SDI data over IP, uncompressed, with timing comparable to SDI, except that video and audio are transmitted as separate independent streams. To work with HD, you need at least 10G network ports, and for UHD, 25G is required. Currently, only a few companies can handle this using off-the-shelf IT servers.
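The port-speed requirements above follow from simple arithmetic. A back-of-the-envelope sketch (assuming 60 fps and 10-bit 4:2:2 sampling, i.e. 20 bits per pixel, counting active video only; the SDI line rates mentioned earlier also carry blanking, so they come out somewhat higher):

```python
def uncompressed_gbps(width, height, fps, bits_per_pixel=20):
    """Active-video bandwidth in Gbps for 10-bit 4:2:2 sampling
    (2 luma + 2 chroma samples per 2 pixels = 20 bits/pixel)."""
    return width * height * fps * bits_per_pixel / 1e9

hd = uncompressed_gbps(1920, 1080, 60)   # ~2.5 Gbps: fits a 10G port
uhd = uncompressed_gbps(3840, 2160, 60)  # ~10 Gbps: hence 25G for UHD
```

With those assumptions, HD lands around 2.5 Gbps (comfortably inside 10G) while UHD is just under 10 Gbps, which is why a 10G port is not enough once packet overhead and additional streams are added.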
Anything streamed over the public internet is compressed below 10 Mbps and comes with multiple seconds of latency. Most cameras output SDI, though some now offer direct streaming. However, SDI is still widely used at the end of the chain for integration with video mixers, replay servers, and other production equipment.
zaik|11 months ago
All programming languages can do any task. It's about how easy they make that task for you.
thibaut_barrere|11 months ago
For instance, Elixir supports compilation targeting GPUs (within exactly the same language, not a fork).
Most languages do not allow that (and for most it would be fairly hard to implement).
dorian-graph|11 months ago
From the article:
> “Yes. We’ve seen what the Erlang VM can do, and it has been very well-suited to our needs. You don’t appreciate all the things Elixir offers out of the box until you have to try to implement them yourself.”