BadInformatics | 4 years ago
> motivation behind this is unclear.
Julia currently has two main DL libraries: Flux, which sits somewhere between PyTorch and (tf.)Keras abstraction-wise, and Knet, which is a little lower level (think just below PyTorch, around where MXNet Gluon sits). Frameworks like fastai, PyTorch Lightning and Keras demonstrate that there's a desire for higher-level, more batteries-included libraries. FastAI.jl is looking to fill that gap in Julia.
> Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.
This is correct. That said, FastAI.jl is not and does not plan to be a copy of the Python API (hence "inspired by"). One consequence of this is that integration with other libraries is much easier, e.g. https://github.com/chengchingwen/Transformers.jl for NLP tasks.
> What is the timeline for FastAI.jl to achieve parity?
> When should I choose FastAI.jl vs fastai?
This depends on your use cases and how comfortable you are with a) Julia and b) having to roll some of your own code. For the first, I'd recommend poking around with the language beforehand, as well as asking in the dev channel linked in TFA, to get an informed opinion.
FastAI.jl itself is composed of multiple constituent packages that can be, and are, used independently, so there's also the option of mixing and matching. For example, https://github.com/lorenzoh/DataLoaders.jl is completely library-agnostic.
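To give a feel for what "library-agnostic" means here, a minimal sketch of batching plain arrays with DataLoaders.jl, with no deep learning framework loaded at all. This assumes DataLoaders.jl's exported `DataLoader(data, batchsize)` constructor and the convention that observations live along an array's last dimension; exact API details may differ between versions, so treat this as an illustration rather than gospel.

```julia
# Sketch only: assumes the DataLoader(data, batchsize) API from DataLoaders.jl
using DataLoaders

# 100 "observations" of 28x28 data along the last dimension (random, for illustration)
data = rand(Float32, 28, 28, 100)

# Iterate in batches of 16 -- no Flux, Knet, or FastAI.jl required
for (i, batch) in enumerate(DataLoader(data, 16))
    println("batch $i has size ", size(batch))
end
```

The same loader could feed a Flux training loop, a Knet one, or plain data-processing code, which is the mix-and-match point above.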