top | item 37288725


yangcheng | 2 years ago

But for a code LLM to be useful, the local machine needs a very powerful GPU.


jasonjmcghee | 2 years ago

Not true anymore. Give codellama-7b-instruct a try; just install Ollama. The performance is pretty mind-blowing for the RAM it uses and how fast it runs. It's in the ballpark of chatgpt-3.5-turbo for code-related tasks.
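For anyone curious, a minimal sketch of what this looks like in practice: once Ollama is installed and running on its default port (11434), you can pull the model with `ollama pull codellama:7b-instruct` and query it over the local HTTP API. The model tag and prompt below are illustrative, and this assumes the server is up locally.

```python
import json
import urllib.request

# Ollama's default local generate endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="codellama:7b-instruct"):
    """Build the JSON payload for a non-streaming generate call."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="codellama:7b-instruct"):
    """Send a prompt to the local Ollama server and return the completion text."""
    payload = json.dumps(build_request(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

No GPU is strictly required: the 7B model runs (slowly but usably) on CPU with quantized weights, which is why it fits in modest amounts of RAM.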