The author of the talk here. I am excited about the Multicore OCaml upstreaming plan.
We're going to do it in three distinct phases. First, we will upstream the multicore runtime so that it co-exists with the current runtime as a non-default option. This will give us a chance to test and tune the new GC in the wild. Once we are happy with it, it will be made the default.
The second PR would add support for fibers -- linear delimited continuations without the syntax extensions (exposed as primitives in the Obj module). This will let us test the runtime support for fibers and prototype programs against it.
The last PR would be effect handlers with an effect system. Any unhandled user-defined effects would be caught statically by the type system. The effect system is useful not just for writing correct programs, but also for targeting JavaScript with the js_of_ocaml compiler, where we can selectively CPS-translate the effectful bits à la Koka. As a result, we retain the performance of direct-style code and pay extra cost only for code that uses effect handlers. In the future, we will extend it to track built-in OCaml effects such as IO and mutable refs. At that point, we can statically distinguish pure functions (in the Haskell sense) from impure ones without having to use monad transformers for code that mixes multiple effects. I'd recommend Leo White's talk [0] on effect handlers for a preview of what this system would look like.
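To make the effect-handlers idea concrete, here is a minimal sketch in the style of the API that eventually shipped in OCaml 5's `Effect` module. The plan described above predates that release, so treat the names below as illustrative of the eventual design, not of these specific PRs:

```ocaml
open Effect
open Effect.Deep

(* A user-defined effect: ask the handler for an int. *)
type _ Effect.t += Ask : int Effect.t

(* Effectful computation written in direct style. *)
let comp () = perform Ask + perform Ask

(* A deep handler answers each Ask by resuming the
   captured continuation with 21. *)
let run () =
  match_with comp ()
    { retc = (fun v -> v);
      exnc = raise;
      effc = (fun (type a) (eff : a Effect.t) ->
        match eff with
        | Ask -> Some (fun (k : (a, _) continuation) -> continue k 21)
        | _ -> None) }

let () = Printf.printf "%d\n" (run ())  (* prints 42 *)
```

Because the handler is deep, it is reinstalled after each resumption, so both `perform Ask` sites are answered by the same handler. Note that OCaml 5's shipped version has no effect system yet, so unhandled effects there fail at runtime rather than being caught by the type checker as described above.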
What really surprises me is that Facebook, despite its active use of ReasonML, doesn't push or sponsor Multicore OCaml. Surely it would improve performance in their high-load setups.
ReasonML makes only very superficial changes to ML, using the ppx AST extension framework. Multicore GCs are not really what Facebook planned on contributing, or at least it has shown no signs of contributing one yet. It is also a bit outside the scope of their project, since they mainly market Reason for compiling to JavaScript VMs.
The Rust community is beginning to understand that it needs more stability and LTS versions, and libraries are blossoming nicely.
OCaml already has a very mature module ecosystem and is now becoming safe and modern.
I think Rust will still have an edge in adoption due to its portrayal as "C++ unfucked", but OCaml is definitely the easier tool to work with, imo. And maybe that will change.
I don't think even linear types and multicore would be enough for OCaml to make any significant dent in the systems programming world. Rust and C/C++/D/Zig all make memory management too convenient, and they open doors too close to the metal for OCaml to keep up.
Any OCaml hackers: would you want to write system drivers in OCaml? Why or why not?
I feel like OCaml is a language with a lot of potential, but it is hampered by somewhat gnarly syntax, a toolchain that feels utterly antique (for example, the last time I worked with it, the REPL had no Readline support and had to be run with rlwrap), and the lack of a good killer app.
In some ways, OCaml's problems mirror those of Erlang. The gnarly syntax is largely being addressed by Reason (much like Elixir does for Erlang), but I don't see it catching on as much as I'd like, and I think it still has some warts.
As for killer apps, distributed data processing is something OCaml could be great at, given that it marries Erlang's functional style with a rich type system. There was a minor wave of libraries for concurrent, distributed actor programming way back around 2010, but those libraries are now dead and nothing really came of them. Meanwhile, Scala largely leads that story (Akka is very popular, and Scala also seems to be the language of choice for Spark, Beam, etc.), and Haskell now has Cloud Haskell, which is modeled on Erlang. Also, we kind of need multicore for this.
(Distributed data is an area where I hope the Java/Scala world gets competition soon. A lot of people, I suspect, would welcome this, so there's an opportunity to rapidly gain mindshare, yet I don't see much happening there. Pony is promising (e.g. Wallaroo), and some people have had success with Go (Pachyderm). I don't know Pony, but without generics, Go is a pretty awkward fit for data pipelines; witness the number of hoops jumped through, and the resulting limitations, in the Apache Beam SDK for Go. Spark et al. rely on distributing bytecode to worker nodes, something you just can't do in Go. Not sure about OCaml.)
What should I learn if I want to develop desktop apps and I like functional programming? I feel like most of the cool 'new' programming languages are OOP (Rust, Go, Scala).
I saw a discussion once (can’t remember where) about the possibility of adding opt-in GC functionality to Rust. If that ever comes to fruition, I wonder if that’d cover many of the current use cases for OCaml?
scott_s | 7 years ago
kcsrk | 7 years ago
[0] https://www.janestreet.com/tech-talks/effective-programming/
laylomo2 | 7 years ago
xvilka | 7 years ago
thomasjames | 7 years ago
wbl | 7 years ago
elcritch | 7 years ago
Or perhaps port or create a system similar to OTP using effects, fibers, and threads.
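A toy illustration of that direction, assuming the effect-handler API that eventually shipped in OCaml 5 (the `Receive` effect and `run_with` function below are hypothetical names, not part of any OTP port): an Erlang-style receive can be expressed as an effect, with the handler playing the role of the mailbox and scheduler.

```ocaml
open Effect
open Effect.Deep

(* Hypothetical effect: block until a message arrives. *)
type _ Effect.t += Receive : string Effect.t

(* An "actor" written in direct style. *)
let actor () =
  let msg = perform Receive in
  "got: " ^ msg

(* The handler drains a pre-filled mailbox, resuming the
   actor's continuation with each message it asks for. *)
let run_with mailbox body =
  let q = Queue.of_seq (List.to_seq mailbox) in
  match_with body ()
    { retc = (fun v -> v);
      exnc = raise;
      effc = (fun (type a) (eff : a Effect.t) ->
        match eff with
        | Receive -> Some (fun (k : (a, _) continuation) ->
            continue k (Queue.pop q))
        | _ -> None) }

let () = print_endline (run_with ["hello"] actor)  (* got: hello *)
```

A real OTP-like system would replace the pre-filled queue with a scheduler multiplexing many such fibers over threads, plus supervision trees on top; this sketch only shows how receive-as-an-effect keeps actor code in direct style.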
qop | 7 years ago
atombender | 7 years ago
isakkeyten | 7 years ago
naasking | 7 years ago
It just needs a Cargo equivalent. The existing tools weren't sufficient the last time I checked.
bambataa | 7 years ago
Do you have any resources on this? I am learning OCaml and am interested to see what is changing.
profquail | 7 years ago
abiox | 7 years ago
I thought Rust has been stable since 1.0. Am I wrong?
alde | 7 years ago