chrisischris | 3 months ago
So I'm building a distributed AI inference platform where you can run models on your own hardware and access them privately from anywhere. Keep your data on infrastructure you control, but tap into a credit system for more powerful compute when you need it: your idle GPU time earns credits you can spend on bigger models. The goal is making it dead simple to use your home hardware from wherever you're working.
It's for anyone who wants infrastructure optionality: developers who don't want vendor lock-in, businesses with compliance requirements, or just people who don't want their data sent to third parties.
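To make the credit idea concrete, here's a rough sketch of how a scheduler might choose between your own hardware and the credit pool. Everything here (node names, the per-call credit cost, the fit-by-model-size heuristic) is my own assumption for illustration, not how the project actually works:

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    max_model_size_gb: float
    cost_per_call: int  # credits per request; 0 for hardware you own

def pick_node(nodes, model_size_gb, credits):
    """Prefer owned hardware (zero credit cost); fall back to the
    credit pool only when the model won't fit locally or is too
    expensive. Returns None if nothing fits within the budget."""
    affordable = [n for n in nodes
                  if n.max_model_size_gb >= model_size_gb
                  and n.cost_per_call <= credits]
    if not affordable:
        return None
    return min(affordable, key=lambda n: n.cost_per_call)

# Hypothetical setup: a home GPU plus a shared credit pool.
home = Node("home-3090", max_model_size_gb=24, cost_per_call=0)
pool = Node("pool-h100", max_model_size_gb=80, cost_per_call=5)

# A 14 GB model fits on the home GPU, so no credits are spent.
print(pick_node([home, pool], 14, credits=10).name)  # home-3090
# A 70 GB model only fits in the pool and costs credits.
print(pick_node([home, pool], 70, credits=10).name)  # pool-h100
```

The interesting design questions are in what this glosses over: how credits earned from idle time are priced against pool usage, and how requests are routed to your home node without exposing it publicly.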
Get notified when we launch: https://sporeintel.com
geeksinthewoods | 3 months ago
https://www.youtube.com/watch?v=2jmWnPfgtik