photochemsyn | 2 months ago
Really just confirmed to me that, long term, the best option for inference is running an open-source model on your own hardware, even if that's still expensive and the output quality is lower.