top | item 27975519

FastAI.jl: FastAI for Julia

196 points | dklend122 | 4 years ago | forums.fast.ai

23 comments


oxinabox|4 years ago

I think a particularly nice thing about this is that it is a bundle of good libraries integrated together well, with nice docs. Those libraries in turn break down into other nice libraries and so forth (though many don't have docs quite this nice), because that is how Julia is.

I can't see myself ever using FastAI.jl (though I am sure many will). But I can absolutely see myself using Flux + FluxTraining.jl, which nicely brings together TensorBoardLogger, EarlyStopping, and several other things (https://github.com/FluxML/FluxTraining.jl). And I can well imagine many will use DataLoaders.jl + Flux.
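For a sense of what that combination looks like, here is a rough FluxTraining.jl sketch. The `Learner` constructor keywords and the callback names here are assumptions based on the package's docs at the time and may differ across versions:

```julia
using Flux, FluxTraining

model = Chain(Dense(784, 32, relu), Dense(32, 10))

# Learner bundles the model, loss, optimiser, and callbacks
# (metrics, logging, early stopping, ...) into one object.
learner = Learner(
    model,
    Flux.Losses.logitcrossentropy;
    optimizer = Flux.ADAM(),
    callbacks = [Metrics(accuracy)],
)

# trainiter/validiter would be batch iterators, e.g. from DataLoaders.jl:
# fit!(learner, 10, (trainiter, validiter))
```

The point of the design is that TensorBoard logging, checkpointing, and stopping criteria all plug in as callbacks instead of being hand-rolled into every project's training loop.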

I feel like this project has nicely rounded out the ecosystem, making standard tools where before there were individual per-project solutions. (For example, I currently use TensorBoardLogger + Flux directly with my own custom training loop.)
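The kind of hand-rolled loop being replaced looks roughly like this; a minimal sketch using the implicit-params Flux API of that era, with toy data standing in for a real dataset:

```julia
using Flux, TensorBoardLogger

model = Chain(Dense(2, 16, relu), Dense(16, 1))
opt = ADAM()
lg = TBLogger("tb_logs/run")  # writes event files TensorBoard can read

# toy data: a vector of (x, y) mini-batches
data = [(rand(Float32, 2, 32), rand(Float32, 1, 32)) for _ in 1:10]
loss(x, y) = Flux.mse(model(x), y)

for epoch in 1:5
    Flux.train!(loss, Flux.params(model), data, opt)
    # log the epoch loss under the "loss" tag in TensorBoard
    log_value(lg, "loss", sum(loss(x, y) for (x, y) in data); step = epoch)
end
```

Everything here (logging cadence, metrics, stopping) has to be wired up manually, which is exactly the boilerplate FluxTraining.jl's callbacks factor out.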

ellisv|4 years ago

This is interesting to me, but the motivation behind it is unclear. Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.

What does this mean for the development of fastai?

What is the timeline for FastAI.jl to achieve parity?

When should I choose FastAI.jl vs fastai?

BadInformatics|4 years ago

Having tried fastai for a "serious" research project and helped (just a bit) towards FastAI.jl development, here's my take:

> motivation behind this is unclear.

Julia currently has two main DL libraries: Flux, which sits somewhere between PyTorch and (tf.)Keras abstraction-wise, and Knet, which is a little lower level (think just below PyTorch, around where MXNet Gluon sits). Frameworks like fastai, PyTorch Lightning, and Keras demonstrate that there's a desire for higher-level, more batteries-included libraries. FastAI.jl is looking to fill that gap in Julia.

> Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.

This is correct. That said, FastAI.jl is not and does not plan to be a copy of the Python API (hence "inspired by"). One consequence of this is that integration with other libraries is much easier, e.g. https://github.com/chengchingwen/Transformers.jl for NLP tasks.

> What is the timeline for FastAI.jl to achieve parity?

> When should I choose FastAI.jl vs fastai?

This depends on your use cases and how comfortable you are with a) Julia and b) having to roll some of your own code. For the first, I'd recommend poking around with the language beforehand, as well as using the linked dev channel in TFA, to get an informed opinion.

FastAI.jl itself is composed of multiple constituent packages that can be, and are, used independently, so there's also the option of mixing and matching. For example, https://github.com/lorenzoh/DataLoaders.jl is completely library agnostic.
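"Library agnostic" here means DataLoaders.jl only needs a container that supports the `nobs`/`getobs` observation interface; plain arrays work out of the box. A minimal sketch, with the array shapes and batch size as illustrative assumptions:

```julia
using DataLoaders

# observations along the last dimension, as is conventional
x = rand(Float32, 28, 28, 10_000)   # 10_000 "images"
y = rand(1:10, 10_000)              # 10_000 labels

# iterate batches of 64, collated and loaded on background threads
for (xs, ys) in DataLoader((x, y), 64)
    # xs is a 28×28×64 array, ys a 64-element vector;
    # feed them to Flux, Knet, or anything else
end
```

Because the loader only yields plain arrays, nothing ties it to Flux or to FastAI.jl.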

darsnack|4 years ago

I’m not the main dev on FastAI.jl, but I work on the Julia ML community team that supported this project.

> Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented.

We are looking to offer a high level API for ML in Julia similar to fastai for PyTorch. The goal is to enrich the Flux ecosystem, so just calling into Python fastai wouldn’t be appropriate. FastAI.jl is built on top of several lower level packages that can be used separately from FastAI.jl. These packages help build out the ecosystem not just for FastAI.jl, but any ML framework or workflow in Julia.

> What does this mean for the development of fastai?

FastAI.jl is “unofficial” in that Jeremy and the fastai team did not develop it. But Jeremy knows about the project, and we have kept in touch with the fastai team for feedback. FastAI.jl doesn’t affect the development of Python fastai in any way.

> FastAI.jl has vision support but no text support yet.

> What is the timeline for FastAI.jl to achieve parity?

We’re working to add more out-of-the-box support for other learning tasks. Currently, we have tabular support on the way, but the timeline for text is not decided.

Note that the framework itself could already support a text learning method, but you'd have to implement the high-level interface functions for it yourself; we just don't have built-in defaults like we do for vision. You can check out https://fluxml.ai/FastAI.jl/dev/docs/learning_methods.md.htm... for a bit more on what I mean.
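To illustrate the shape of that interface, here is a hypothetical text-classification method. The struct name and the helpers `tokenize`/`onehot` are placeholders I'm inventing for illustration, and the interface function names follow the pattern in the linked docs of that era, so treat them as assumptions:

```julia
using FastAI

# Hypothetical learning method for classifying documents into classes.
struct TextClassification <: FastAI.LearningMethod
    classes::Vector{String}
end

# Turn a raw (document, label) sample into model-ready arrays.
function FastAI.encode(method::TextClassification, context, sample)
    doc, label = sample
    return (tokenize(doc), onehot(label, method.classes))  # placeholders
end

# Turn a model output back into a human-readable class label.
function FastAI.decodeŷ(method::TextClassification, context, ŷ)
    return method.classes[argmax(ŷ)]
end
```

Once those functions exist, the generic training and inference machinery can drive the method the same way it drives the built-in vision ones.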

> When should I choose FastAI.jl vs fastai?

It depends on what you need. PyTorch and fastai are more mature, but Julia and Flux tend to be more flexible to non-standard problems in my experience. If you’re interested, then give Julia/Flux/FastAI.jl a try. If we’re missing a mission critical feature for you, then please let us know so we can prioritize it.

oxinabox|4 years ago

> Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented.

Yes, and? That is what a port is.

When FastAI for Swift was a thing (is it still a thing?), it was (is?) using Swift for TensorFlow, not PyTorch. https://www.fast.ai/2019/03/06/fastai-swift/

NegatioN|4 years ago

It does seem to actually be an unofficial implementation in Julia, so it shouldn't have any impact on the actual development?

I'm kinda wondering if Jeremy Howard has OK'ed using their name on a library that they're not in charge of? I didn't find a clear answer to that. Particularly troubling since it seems like the FluxML organization is behind this, not some random dude.

jstx1|4 years ago

> When should I choose FastAI.jl vs fastai?

Unless you're following the course, you probably shouldn't use either.

dklend122|4 years ago

Note: This is a sanctioned adaptation by members of the Julia community.

losvedir|4 years ago

Wasn't there a Swift version of FastAI, too? Are they trying to have libraries in multiple ecosystems, or did that one peter out?

adamnemecek|4 years ago

Swift for Tensorflow was cancelled.

cbkeller|4 years ago

Looking forward to trying this out!