top | item 44154414
salamo | 9 months ago

Really happy to see additional solutions for on-device ML.

That said, I probably wouldn't use this unless mine were one of the specific supported use cases [0]. I have no idea how hard it would be to add a new model with arbitrary inputs and outputs.

For running inference cross-device I have used ONNX, which is low-level enough to support whatever weights I need. For a good number of tasks you can also use transformers.js, which wraps ONNX and handles things like decoding (unless you really enjoy implementing beam search on your own). I believe the equivalent link to the above would be [1], which is just much more comprehensive.

[0] https://ai.google.dev/edge/mediapipe/solutions/guide

[1] https://github.com/huggingface/transformers.js-examples