top | item 38454353

kerasteam | 2 years ago

You can absolutely serve with Keras if your inference server is in Python. For instance, if you're looking for a basic solution, you can just set up a Flask app that calls `predict()` on a Keras model.
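A minimal sketch of that basic Flask setup. The route name, the request payload shape (`{"instances": [...]}`), and the toy model are illustrative assumptions; in practice you'd load a trained model from disk.

```python
import numpy as np
from flask import Flask, request, jsonify
from tensorflow import keras

app = Flask(__name__)

# Toy stand-in model; in a real deployment you would load a trained one, e.g.
#   model = keras.models.load_model("my_model.keras")
model = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(2, activation="softmax"),
])

@app.route("/predict", methods=["POST"])
def predict():
    # Expect JSON like {"instances": [[f1, f2, f3, f4], ...]}
    batch = np.asarray(request.get_json()["instances"], dtype="float32")
    probs = model.predict(batch)
    return jsonify({"predictions": probs.tolist()})

# To run the server:
#   app.run(host="0.0.0.0", port=5000)
```

Fine for low traffic; note that `predict()` here runs synchronously per request, so you'd batch or queue requests for anything heavier.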

If you're looking for a high-performance solution that is entirely Python-free, you can export your Keras model as a TF SavedModel and serve it via TFServing. TFServing is C++-based and works on both CPU and GPU.
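A sketch of the export side of that path, assuming a recent TF/Keras version where `model.export()` writes a TF SavedModel. The model name and paths are illustrative; TFServing expects numbered version subdirectories under the model's base path.

```python
import os
import tempfile
from tensorflow import keras

# Toy model standing in for a trained one
model = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(2),
])

# TFServing watches <model_base_path>/<version>/ for SavedModels
base_path = os.path.join(tempfile.mkdtemp(), "my_model")
export_dir = os.path.join(base_path, "1")
model.export(export_dir)  # writes saved_model.pb plus variables/

# Then serve it from a shell (binary name as documented by TFServing):
#   tensorflow_model_server --rest_api_port=8501 \
#       --model_name=my_model --model_base_path=/path/to/my_model
```

After that, inference is a plain REST (or gRPC) call against the server, with no Python in the serving path.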
