yehors | 9 months ago

Are you using a local Whisper? If yes, what do you use for inference, candle/ort?
lukaesch | 9 months ago

Not local. Inference is the only part not written in Rust so far.

I am using Replicate to run Docker images with a pipeline based on faster-whisper, VAD, pyannote, and a custom LLM enhancement flow.

Thanks for sharing candle/ort. Interesting to see the WASM in-browser opportunities.