revskill|2 years ago

minihat|2 years ago
Cross-platform differences between the behavior of tf.linalg and torch.linalg have cost me a lot of time over the years.

kerasteam|2 years ago
We don't have a separate `ops.linalg` package, but we do include `numpy.linalg` ops as part of `keras.ops`. For now only two ops are supported: `qr` and `solve`. We're open to adding any `numpy.linalg` op that turns out to be useful (or you could open a PR for any op you need).

daturkel|2 years ago
Hey Francois, congrats to you and the team on the launch! I've generally chosen PyTorch over TensorFlow for my day-to-day work, but now that Keras is framework-agnostic I'm excited to revisit it.
One thing I'm wondering about is whether it's possible (or necessary?) to use Keras in concert with PyTorch Lightning. In some ways, Lightning evolved to be "Keras for PyTorch," so what is the path forward in a world where both exist as options for PyTorch users: do they interoperate, or are they competitors/alternatives to each other?

kerasteam|2 years ago
Both Keras models/layers (with the PyTorch backend) and Lightning Modules are PyTorch Modules, so they should be able to interoperate with each other in a PyTorch workflow. We have not tried this with Lightning, but we've had a good experience with custom PyTorch Modules.
More broadly, it's feasible to use Keras components with any framework built on PyTorch or JAX, in the sense that it's always possible to write "adapter layers" that wrap a Keras layer and make it usable by another framework, or the other way around. We have folks doing this to use Flax components (from JAX) as Keras layers, and conversely, to use Keras layers as Flax Modules.

srvmshr|2 years ago

kerasteam|2 years ago
You can use this migration guide to identify and fix each of these issues (and, further, make your code run on JAX or PyTorch): https://keras.io/guides/migrating_to_keras_3/

esafak|2 years ago

__rito__|2 years ago
Were any improvements made?

kerasteam|2 years ago
We made sure that TFLite workflows would run smoothly with Keras 3 models. We did not come up with any TFLite-specific improvements; the focus was on the multi-backend architecture, distribution, and training performance.

dave_sullivan|2 years ago
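As a minimal sketch of the two supported ops discussed in the thread: `keras.ops` mirrors `numpy.linalg` semantics for `qr` and `solve`, so plain NumPy is used here as a stand-in that runs without any deep-learning backend (the corresponding `keras.ops` function names are an assumption based on the thread, not verified against the API).

```python
import numpy as np

# A small symmetric 2x2 matrix to decompose and solve against.
a = np.array([[3.0, 1.0],
              [1.0, 2.0]])

# QR decomposition: a == q @ r, with q orthonormal and r upper-triangular.
q, r = np.linalg.qr(a)
assert np.allclose(q @ r, a)

# Solve the linear system a @ x = b for x.
b = np.array([9.0, 8.0])
x = np.linalg.solve(a, b)
assert np.allclose(a @ x, b)

print(x)  # [2. 3.]  (3*2 + 1*3 = 9, 1*2 + 2*3 = 8)
```

With the PyTorch or JAX backend, the equivalent `keras.ops` calls would dispatch to that backend's linear-algebra kernels rather than NumPy.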