vchuravy | 3 years ago
- Pytorch/Tensorflow: There are several ML frameworks written in Julia (as well as Julia bindings to existing ML frameworks); the biggest Julia-native one is likely Flux.jl.
Regarding HF Transformers, a quick Google search points to https://github.com/chengchingwen/Transformers.jl, but I have no personal experience with it.
All of this is built by the community, and your mileage may vary.
In my rather biased opinion, the strength of Julia is that the various ML libraries can share implementations, whereas PyTorch and TensorFlow each ship their own separate NumPy derivative. One could say that in Julia you write the ML framework itself, instead of writing a DSL in Python on top of a C++ ML library. As an example, Julia has a GPU compiler, so you can write your own layer directly in Julia and integrate it into your pipeline.
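To make the "write your own layer directly in Julia" point concrete, here is a minimal sketch using Flux.jl. This assumes a reasonably recent Flux version where `Flux.@functor` marks a struct's fields as trainable; the `Scale` layer itself is a made-up example, not a Flux built-in.

```julia
using Flux

# A custom layer is just a plain Julia struct...
struct Scale
    s::Vector{Float32}
end
Scale(n::Integer) = Scale(ones(Float32, n))

# ...made callable with ordinary Julia code (no C++ kernel, no DSL).
(l::Scale)(x) = l.s .* x

# Tell Flux which fields hold trainable parameters.
Flux.@functor Scale

# The custom layer composes with built-in layers like any other.
model = Chain(Dense(4 => 8, relu), Scale(8), Dense(8 => 2))
y = model(rand(Float32, 4))  # forward pass; gradients work via Zygote
```

Because the layer is plain Julia, the same code can (in principle) be moved to the GPU with CUDA.jl's array types, which is where Julia's GPU compiler comes in.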