top | item 39620221

mattew | 2 years ago

Good stuff. How does this compare to Instructor? I've been using this extensively:

https://jxnl.github.io/instructor/


edunteman | 2 years ago

Answered in a different thread. TL;DR: not that different for now. We're likely to do some server-side optimizations, especially given our GPU inference history.

yaj54 | 2 years ago

I like your UX a lot more. Modeling the LLM calls as actual Python functions lets them mesh well with existing code organization and dev tooling. And using a decorator to "implement" a function just feels like a special kind of magic. I'd need more ability to use my own prompt templates before adopting this as a lib, but I'm definitely going to try this general pattern.
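The decorator pattern described above can be sketched roughly as follows. This is a hypothetical illustration, not the library's actual API: the decorator name `ai_fn` and the `fake_llm` stub are assumptions, with the stub standing in for a real model call so the mechanics (signature inspection, prompt construction, type coercion of the reply) are visible:

```python
import inspect
from typing import Callable, get_type_hints


def fake_llm(prompt: str) -> str:
    # Stand-in for a real model call; returns a canned answer for the demo.
    return "4"


def ai_fn(func: Callable) -> Callable:
    """Implement a function body by prompting a model with its signature and docstring."""
    hints = get_type_hints(func)
    return_type = hints.pop("return", str)

    def wrapper(*args, **kwargs):
        # Bind the caller's arguments to the declared parameter names.
        bound = inspect.signature(func).bind(*args, **kwargs)
        bound.apply_defaults()
        prompt = (
            f"Function: {func.__name__}\n"
            f"Docstring: {inspect.getdoc(func)}\n"
            f"Arguments: {dict(bound.arguments)}\n"
            f"Reply with only the {return_type.__name__} result."
        )
        raw = fake_llm(prompt)
        return return_type(raw)  # coerce the text reply to the annotated return type

    return wrapper


@ai_fn
def add(a: int, b: int) -> int:
    """Add two numbers."""


print(add(2, 2))  # 4 (from the stubbed model)
```

The function body stays empty; the decorator derives the prompt from the name, docstring, and type hints, which is why this pattern meshes so cleanly with ordinary Python tooling.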