
rdescartes|1 year ago

If anyone needs a more powerful way to constrain outputs, llama.cpp supports GBNF:

https://github.com/ggerganov/llama.cpp/blob/master/grammars/...
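For anyone who hasn't seen the format, a minimal sketch of what a GBNF grammar looks like (a hypothetical example, not taken from the linked repo) — it constrains the model so it can only emit "yes" or "no":

```
# Hypothetical minimal GBNF grammar: output must be exactly "yes" or "no".
root ::= ("yes" | "no")
```

The `root` rule is the entry point; every token the model samples must keep the output parseable by the grammar.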


jimmySixDOF|1 year ago

That's exactly what they're using.

lolinder|1 year ago

Have you found the output for arbitrary grammars to be satisfactory? My naive assumption has been that these models will produce better JSON than other formats simply by virtue of having seen so much of it.

rdescartes|1 year ago

To get good results, the grammar should match the expected output from the prompt, especially if you use a small model. Normally I first manually fine-tune the prompt until it produces the grammar's format on its own, and then apply the grammar in production.

throwaway314155|1 year ago

Who would downvote this perfectly reasonable question?

edit: Nm

sa-code|1 year ago

This is amazing, thank you for the link

dcreater|1 year ago

How is it more powerful?

evilduck|1 year ago

Grammars don't have to just be JSON, which means you could have it format responses as anything with a formal grammar. XML, HTTP responses, SQL, algebraic notation of math, etc.
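As a sketch of a non-JSON use case (my own illustrative example, assuming GBNF's repetition and character-class syntax), a grammar that restricts output to simple infix arithmetic expressions like `12+34*5`:

```
# Sketch: constrain output to arithmetic over integers, e.g. "12+34*5".
root ::= expr
expr ::= term (("+" | "-") term)*
term ::= num (("*" | "/") num)*
num  ::= [0-9]+
```

The same approach works for any format you can write a formal grammar for.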