top | item 38454112


HalfCrimp | 2 years ago

I'm a bit of a novice in the space. If Keras isn't for inference, what's the intended workflow?

Train with Keras and then?


kerasteam | 2 years ago

You can absolutely serve with Keras if your inference server is in Python. For instance, if you're looking for a basic solution, you can just set up a Flask app that calls `predict()` on a Keras model.
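A minimal sketch of what that looks like. The tiny untrained model here is a stand-in; in practice you'd load your trained model with `keras.models.load_model(...)`, and the `/predict` route and JSON shape are arbitrary choices for illustration.

```python
import numpy as np
from flask import Flask, request, jsonify
from tensorflow import keras

# Stand-in for a trained model; normally you'd do
# model = keras.models.load_model("my_model.keras")
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(3, activation="softmax"),
])

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect JSON like {"inputs": [[f1, f2, f3, f4], ...]}
    inputs = np.array(request.get_json()["inputs"], dtype="float32")
    probs = model.predict(inputs)
    return jsonify({"outputs": probs.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

Fine for low traffic; for anything serious you'd at least put it behind gunicorn and batch requests.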

If you're looking for a high-performance solution that is entirely Python-free, then you can simply export your Keras model as a TF SavedModel and serve it via TFServing. TFServing is C++ based and works on both CPU and GPU.
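The export step is a one-liner in TF2-era Keras (a sketch; the directory layout with a numeric version subfolder is the convention TF Serving expects, and the server flags in the comment are from `tensorflow_model_server`'s CLI):

```python
from tensorflow import keras

# Stand-in for a trained model
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(3, activation="softmax"),
])

# Save in the TF SavedModel format (a directory, not a single file).
# TF Serving watches model_base_path for numbered version subdirectories.
model.save("export/my_model/1", save_format="tf")

# Then serve it without any Python in the loop:
#   tensorflow_model_server \
#       --model_name=my_model \
#       --model_base_path=/abs/path/to/export/my_model \
#       --rest_api_port=8501
```

(In Keras 3 the equivalent is `model.export("export/my_model/1")`.)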

aldanor | 2 years ago

Then there are standard formats like ONNX, which let you run inference on whatever platform, language, or hardware you prefer.