
Keras Core: Keras for TensorFlow, JAX, and PyTorch

191 points | dewitt | 2 years ago | keras.io

69 comments


dkga|2 years ago

I think this is pretty cool - literally made me scream "Yes!" when I saw it, and I don't do that for your everyday framework.

I think the beauty of Keras was the perfect balance between simplicity/abstraction and flexibility. I eventually moved to PyTorch, but that balance was the one thing I always missed. And now, to have it leapfrog the current fragmentation and achieve what seems to be true multi-backend support is pretty awesome.

Looking forward to the next steps!

Narew|2 years ago

Keras was already that some years ago. It supported TensorFlow, Theano, and MXNet, if my memory is right. And then they ditched everything for TensorFlow. At the time it was really hard to use Keras without calling the backend directly for lots of optimizations, unsupported features in the API, etc. That made the use of Keras not agnostic at all. What's different now?

minimaxir|2 years ago

> What's different now ?

PyTorch adoption: back when Keras went hard into TensorFlow in 2018, TF and PyTorch adoption were about the same, with TF a bit more popular. Now, most of the papers and models released are PyTorch-first.

kerasteam|2 years ago

I worked on the project, happy to answer any questions!

__rito__|2 years ago

How does the future look for TFLite and Edge AI/TinyML in general?

Will Keras Core support direct deployment to edge devices like RPi or Arduino?

Will the experience of defining and training a model in JAX/PyTorch and then deploying to edge devices be seamless?

Anything related on the roadmap?

dbish|2 years ago

Great to see this, but I’m curious, does this mean we’ll get fewer fchollet tweets that talk up TF and down PyTorch? Is the rivalry done?

yazanobeidi|2 years ago

Hi, first off, thank you for your contributions, and that goes for the entire team. Keras is a wonderful tool and this was definitely the right move. No other package nails the “progressive disclosure” philosophy like Keras.

This caught my eye:

> “Right now, we use tf.nest (a Python data structure processing utility) extensively across the codebase, which requires the TensorFlow package. In the near future, we intend to turn tf.nest into a standalone package, so that you could use Keras Core without installing TensorFlow.”

I recently migrated a TF project to PyTorch (would have been great to have keras_core at the time) and used torch.nested. Could this not be an option?
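
(For context on tf.nest, since it's central to the quote above: it maps a function over arbitrarily nested Python structures, much like a pytree utility. A tiny illustration:)

    import tensorflow as tf

    # tf.nest applies a function to every leaf of a nested structure.
    nested = {"a": [1, 2], "b": {"c": 3}}
    doubled = tf.nest.map_structure(lambda v: v * 2, nested)
    print(doubled)  # {'a': [2, 4], 'b': {'c': 6}}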

A second question, about “customizing what happens in fit()”: must this be written in TF/PyTorch/JAX only, or can it be done with keras_core.ops, similar to the example shown for custom components? The idea is that you could reuse the same training loop logic across frameworks, as with custom components.
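
(For reference, the custom-components example in the announcement suggests that the forward computation can indeed be written once against keras_core.ops. A minimal sketch of that pattern; the layer itself is illustrative, not taken from the announcement:)

    import keras_core as keras
    from keras_core import ops

    class MyDense(keras.layers.Layer):
        def __init__(self, units):
            super().__init__()
            self.units = units

        def build(self, input_shape):
            self.w = self.add_weight(
                shape=(input_shape[-1], self.units),
                initializer="glorot_uniform",
                trainable=True,
            )
            self.b = self.add_weight(
                shape=(self.units,), initializer="zeros", trainable=True
            )

        def call(self, inputs):
            # ops.* dispatches to whichever backend is active.
            return ops.relu(ops.matmul(inputs, self.w) + self.b)

Gradient computation is the part that stays backend-specific (tf.GradientTape, torch autograd, jax.grad), which is presumably why customizing fit() is documented per backend.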

bradhilton|2 years ago

Just want to say I love Keras. Thank you for your work!

sbrother|2 years ago

This looks awesome; I was a big fan of Keras back when it had pluggable backends and a much cleaner API than TensorFlow.

Fast forward to now, and my biggest pain point is that all the new models are released on PyTorch, but the PyTorch serving story is still far behind TF Serving. Can this help convert a PyTorch model into a servable SavedModel?

binarymax|2 years ago

Congrats on the launch! I learned Keras back when I first got into ML, so I'm really happy to see it making a comeback. Are there example architectures available/planned that are somewhat complex, and not just a couple of layers (BERT, ResNet, etc.)?

ayhanfuat|2 years ago

This is an amazing contribution to the NN world. Thank you to all the team members.

seanhunter|2 years ago

Firstly, thanks to the whole team for everything you have done, and congrats on this. It must have been a ton of work, and I am excited to get my hands on it.

johnhenning|2 years ago

Do you foresee any compatibility or integration issues with higher-level frameworks, e.g. Lightning, Transformers, etc.?

hashtag-til|2 years ago

Thanks for helping to de-fragment the AI ecosystem! I'll look to get involved, test, and contribute patches!

m_ke|2 years ago

As someone who has dealt with countless breaking changes in Keras and wasted days of my life attempting to upgrade, no thank you.

My PyTorch code from years ago still works with no issues; my old Keras code would break all the time, even in minor releases.

ipunchghosts|2 years ago

Agreed. This will only break things, especially research code.

dewitt|2 years ago

From the announcement:

"We're excited to share with you a new library called Keras Core, a preview version of the future of Keras. In Fall 2023, this library will become Keras 3.0. Keras Core is a full rewrite of the Keras codebase that rebases it on top of a modular backend architecture. It makes it possible to run Keras workflows on top of arbitrary frameworks — starting with TensorFlow, JAX, and PyTorch."

Excited about this one. Please let us know if you have any questions.
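
(For anyone who wants to try it: backend selection happens via the KERAS_BACKEND environment variable, read before the first import. A minimal sketch:)

    import os
    os.environ["KERAS_BACKEND"] = "jax"  # or "tensorflow", "torch"

    import keras_core as keras

    # The exact same model definition now runs on the JAX backend.
    model = keras.Sequential([
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")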

albertzeyer|2 years ago

That looks very interesting.

I have actually developed (and am still developing) something very similar: what we call the RETURNN frontend, a new frontend + new backends for our RETURNN framework. The new frontend supports Python code very similar to what you see in PyTorch or Keras to define models, i.e. a core Tensor class, a base Module class you can derive from, a Parameter class, and then a core functional API to perform all the computations. It supports multiple backends, currently mostly TensorFlow (graph-based) and PyTorch, but JAX was something I also planned. Some details here: https://github.com/rwth-i6/returnn/issues/1120

(Note that we went a bit further and made named dimensions a core principle of the framework.)

(Example beam search implementation: https://github.com/rwth-i6/i6_experiments/blob/14b66c4dc74c0...)

One difficulty I found was how to design the API in a way that works well both for eager-mode frameworks (PyTorch, TF eager mode) and graph-based frameworks (TF graph mode, JAX). That mostly involves everything where there is some state, or code which should not just execute in the inner training loop but, e.g., only at initialization, or after each epoch, or whatever. So for example:

- Parameter initialization.

- Anything involving buffers, e.g. batch normalization.

- Other custom training loops? Or an outer loop and an inner loop (e.g. GAN training)?

- How to implement something like weight normalization? In PyTorch, module.param is renamed, and then a pre-forward hook calculates module.param on the fly for each forward call. So, just follow the same logic for both eager mode and graph mode?

- How to deal with control flow contexts, accessing values outside the loop which came from inside, etc.? Those things are naturally possible in eager mode, where you would get the most recent value, and where there is no real control flow context.

- Device logic: should the device be defined explicitly for each tensor (like PyTorch), or should tensors be moved to the GPU automatically and eagerly (like TensorFlow)? Should moving from one device to another (or to CPU) be automatic, or must it be explicit?

- How do you allow easy interop, e.g. mixing torch.nn.Module and Keras layers? (See the sketch after this comment.)

I see that you have keras_core.callbacks.LambdaCallback which is maybe similar, but can you effectively update the logic of the module in there?
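
(On the interop bullet above: the announcement describes using Keras Core components in PyTorch-native workflows. If, as the multi-backend design implies, Keras layers behave as torch modules under the torch backend, mixing could look roughly like this; the sketch is illustrative, not taken from the docs:)

    import os
    os.environ["KERAS_BACKEND"] = "torch"  # must be set before import

    import torch
    import keras_core as keras

    class Hybrid(torch.nn.Module):
        def __init__(self):
            super().__init__()
            # A Keras Core layer next to a native torch layer.
            self.encoder = keras.layers.Dense(64, activation="relu")
            self.head = torch.nn.Linear(64, 10)

        def forward(self, x):
            return self.head(self.encoder(x))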

jszymborski|2 years ago

Keras and PyTorch! I thought I'd never see the day! Glad to see the two communities bury the hatchet.

p1esk|2 years ago

I don’t get it - why would you want Keras if you already use Pytorch?

syntaxing|2 years ago

Does this mean the weights output can be backend agnostic?

Also, are there any examples using this for the coral TPU?

kerasteam|2 years ago

Yes, model weights saved with Keras Core are backend-agnostic. You can train a model in one backend and reload it in another.

Coral TPU could be used with Keras Core, but via the TensorFlow backend only.
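
(A minimal sketch of that round trip; the builder function and filename are illustrative, with the .weights.h5 suffix assumed from Keras conventions:)

    # Run 1: KERAS_BACKEND=jax python train.py
    import keras_core as keras

    def build_model():
        return keras.Sequential([
            keras.layers.Dense(64, activation="relu"),
            keras.layers.Dense(10),
        ])

    model = build_model()
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    # model.fit(x_train, y_train)  # training data omitted here
    model.save_weights("model.weights.h5")

    # Run 2: KERAS_BACKEND=torch python serve.py
    model = build_model()                   # same architecture
    model.load_weights("model.weights.h5")  # weights load across backends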

kaycebasques|2 years ago

Can someone ELI5 the relationship between Keras and TensorFlow/JAX/PyTorch/etc.? I kinda get the idea that Keras is the "frontend" and TF/JAX/PyTorch are the "backend", but I'm looking to solidify my understanding of the relationship. It might help to also comment on the key differences between TF/JAX/PyTorch/etc. Thank you.

_Wintermute|2 years ago

Keras was a high-level wrapper around Theano or Tensorflow.

The creator of Keras was then employed by Google to work on Keras, and promised everyone that Keras would remain backend agnostic.

Keras became part of TensorFlow as a high-level API and did not remain backend agnostic. There was a lot of questionable Twitter beef about PyTorch from the Keras creator.

Keras is now once again backend agnostic, as a high-level API for TensorFlow/PyTorch/JAX, likely because they see TensorFlow losing traction.

dharmeshkakadia|2 years ago

Supporting multiple backends (especially JAX) is nice! It makes experimenting with and migrating between them so much more approachable. Any timeline on when we can expect support for distributed JAX training? The docs currently seem to indicate that only TF is supported for distributed training.

martin-gorner|2 years ago

Support for distributed JAX training is demoed here: bit.ly/keras-on-jax-demo (you have to write a custom training loop for now, but it works).
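
(The underlying pattern, as I understand it from the Keras Core JAX guide: pull the variables out as explicit state and write a pure step function that JAX transformations can handle. Roughly, with details illustrative:)

    import jax
    import keras_core as keras

    # Illustrative setup: a built model, loss, and optimizer.
    model = keras.Sequential([keras.layers.Dense(10)])
    model.build((None, 20))
    loss_fn = keras.losses.MeanSquaredError()
    optimizer = keras.optimizers.Adam()
    optimizer.build(model.trainable_variables)

    def compute_loss(trainable_vars, non_trainable_vars, x, y):
        # stateless_call threads variable values through explicitly,
        # keeping the step function pure for jit/pmap.
        y_pred, non_trainable_vars = model.stateless_call(
            trainable_vars, non_trainable_vars, x
        )
        return loss_fn(y, y_pred), non_trainable_vars

    grad_fn = jax.value_and_grad(compute_loss, has_aux=True)

    @jax.jit  # jax.pmap (with replicated state) gives data parallelism
    def train_step(trainable_vars, non_trainable_vars, opt_vars, x, y):
        (loss, non_trainable_vars), grads = grad_fn(
            trainable_vars, non_trainable_vars, x, y
        )
        trainable_vars, opt_vars = optimizer.stateless_apply(
            opt_vars, grads, trainable_vars
        )
        return loss, trainable_vars, non_trainable_vars, opt_vars

    # Initial state pulled from the model and optimizer:
    state = (model.trainable_variables,
             model.non_trainable_variables,
             optimizer.variables)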

math_dandy|2 years ago

IIRC, Keras was officially added to TensorFlow as part of the 2.0 release. With Keras reverting to its backend-agnostic state, will it be removed from TensorFlow? Is this a divorce, or are TF and Keras just opening up their relationship?

adolph|2 years ago

Wouldn't a multi-framework wrapper be limited to the subset of features common to all supported frameworks?

Additionally, wouldn't it always be at least a step behind any given framework, depending on the wrapper's release cycle?

boredumb|2 years ago

I must admit I've never actually used Keras, but it's interesting to see how they are implementing it with JAX. Definitely worth digging into one of these days.

ipunchghosts|2 years ago

Like everything Keras, this will promise a lot and only deliver on the use cases deemed worthy by the Keras team.

riku_iki|2 years ago

Will Keras be backward compatible, or will the Google/TF ecosystem now, as always, have three generations of frameworks: TF1, TF2 + Keras, and Keras Core 3?