Show HN: I built vector search for COSS podcasts & livestreams
2 points | zcesur | 1 year ago | tv.algora.io
I transcribed the VODs using Whisper and embedded fixed-size segments of the transcripts with MPNet on Replicate GPUs. I made the segments overlap slightly so that semantic meaning isn't lost at the boundaries between segments.
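The overlapping-chunk idea can be sketched like this (a minimal illustration; the segment size and overlap here are made-up numbers, not the values the actual pipeline uses, and the real input is Whisper's timestamped transcript rather than a bare word list):

```python
def chunk_with_overlap(words, size=128, overlap=32):
    """Split a word list into fixed-size segments that overlap by
    `overlap` words, so a phrase spanning a boundary still appears
    intact in at least one segment."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than the segment size")
    step = size - overlap  # how far the window advances each iteration
    segments = []
    for start in range(0, len(words), step):
        segments.append(" ".join(words[start:start + size]))
        if start + size >= len(words):
            break  # the final window already covers the tail
    return segments
```

Each segment string would then be embedded independently; because consecutive segments share `overlap` words, a query matching a phrase near a boundary can still score well against one of them.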
Then I indexed the vectors with HNSWLib [4] as an in-memory vector store and persisted the entire store to Tigris object storage [5], which caches the multimedia and vectors across all Fly.io regions.
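For intuition, here's what a nearest-neighbor query over those vectors computes, written as an exact brute-force search in plain Python. HNSWLib answers the same question approximately in sub-linear time via an HNSW graph; this is just a sketch of the query semantics, not the library's API:

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def knn(query, vectors, k=3):
    """Return indices of the k vectors most similar to `query`.
    An HNSW index returns (approximately) the same answer without
    scoring every vector."""
    ranked = sorted(range(len(vectors)),
                    key=lambda i: cosine_sim(query, vectors[i]),
                    reverse=True)
    return ranked[:k]
```

At query time the user's search text is embedded with the same MPNet model and the top-k segment vectors point back to timestamps in the original VODs.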
I built the app in Elixir, almost entirely server-side rendered with minimal diffs sent to the client over WebSockets using Phoenix LiveView. I also used Livebook [6] a ton when I was building the multimedia processing & ML pipeline. I'm super bullish on Elixir for building webapps and/or MLops!
Let me know what you think :) If you're curious you can find the code at https://github.com/algora-io/tv
[1]: https://algora.io/podcast
[2]: https://tv.algora.io/peerrich
[3]: https://tv.algora.io/rfc
[4]: https://github.com/nmslib/hnswlib
[5]: https://tigrisdata.com
[6]: https://github.com/algora-io/tv/blob/2586950/scripts/cossgpt...