huydotnet | 1 month ago

Yup, I've been using llama.cpp for that on my PC, but on my Mac I've found some cases where MLX models work best. I haven't tried MLX with llama.cpp, so I'm not sure how that would work out (or if it's even supported yet).
