But how good are these models compared to GPT-4o? My last experience with llama2-8b was not great at all. Are there really models that good that would fit on average consumer hardware (mine already has 32GB RAM and 16GB VRAM)?
The post you're replying to couldn't have made it any easier to answer these questions yourself. No, it won't be as good as the state of the art with massive cloud infrastructure behind an HTTP API.
noman-land|1 year ago