item 40024033

trolan | 1 year ago

For a few uni/personal projects I noticed the same about LangChain: it's good at helping you use up tokens. Its other use case, quickly switching between models, is still a valid reason to use it. However, I've recently started playing with OpenRouter, which seems to abstract the model away nicely.
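
For what it's worth, the abstraction OpenRouter offers is the OpenAI chat-completions wire format behind a single URL, so switching models is a one-string change. A minimal stdlib-only sketch (the model slugs and `YOUR_KEY` are illustrative placeholders, not anything the comment specifies):

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(model, prompt, api_key):
    """Build an OpenAI-style chat completion request for OpenRouter.

    Swapping models is just a different `model` string; the rest of
    the payload stays the same.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode(),
        headers=headers,
    )

# Same prompt, two different providers; only the model string changes.
req_a = build_chat_request("openai/gpt-4o-mini", "Say hi", api_key="YOUR_KEY")
req_b = build_chat_request("mistralai/mistral-7b-instruct", "Say hi", api_key="YOUR_KEY")
# urllib.request.urlopen(req_a) would actually send it (needs a real key).
```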

sroussey | 1 year ago

If someone were to create something new, a blank slate approach, what would you find valuable and why?

lordofmoria | 1 year ago

This is a great question!

I think we now know, collectively, a lot more about what’s annoying/hard about building LLM features than we did when LangChain was being furiously developed.

And some things we thought would be important and not easy turned out to be very easy, like getting GPT to give back well-formed JSON.
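
The well-formed-JSON point is now a single request parameter: OpenAI's chat API has a JSON mode that constrains output to valid JSON when `response_format` is set (the docs also require the prompt itself to mention JSON). A sketch of the payload, with the model name as a placeholder:

```python
def json_mode_payload(prompt):
    """Chat payload using OpenAI's JSON mode.

    With response_format set to json_object, the model is constrained
    to emit syntactically valid JSON, so no parsing-and-retrying layer
    is needed for well-formedness.
    """
    return {
        "model": "gpt-4o-mini",  # placeholder model name
        "response_format": {"type": "json_object"},
        "messages": [
            # JSON mode requires the word "JSON" somewhere in the prompt.
            {"role": "user", "content": prompt + " Respond in JSON."}
        ],
    }

payload = json_mode_payload("List three primary colors.")
```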

So I think there’s lots of room.

One thing LangChain is doing now that solves something that IS very hard/annoying is testing. I spent 30 minutes yesterday re-running a slow prompt because 1 in 5 runs would produce weird output. After each tweak to the prompt, I had to run it at least 10 times to be reasonably sure the change was an improvement.
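
The manual loop described above can be sketched as a tiny pass-rate harness. `flaky_model` and `returns_json` are stand-ins I've made up for illustration; in practice `run_prompt` would be the real (slow) API call and `is_valid` whatever check defines "weird output":

```python
import json
import random

def pass_rate(run_prompt, is_valid, n=10):
    """Run a prompt n times and return the fraction of valid outputs.

    A single success on a flaky prompt proves little; the pass rate
    over many runs is a rough but more honest signal that a prompt
    tweak actually helped.
    """
    ok = sum(1 for _ in range(n) if is_valid(run_prompt()))
    return ok / n

# Stand-in for the real model call: like the prompt described above,
# it produces weird output roughly 1 run in 5.
def flaky_model():
    return '{"answer": 42}' if random.random() > 0.2 else "Sure! Here you go:"

def returns_json(text):
    """Validity check for this prompt: the output must parse as JSON."""
    try:
        json.loads(text)
        return True
    except ValueError:
        return False

random.seed(0)
print(f"pass rate: {pass_rate(flaky_model, returns_json, n=50):.0%}")
```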

jsemrau | 1 year ago

Use a local model. For most tasks they are good enough; Mistral 0.2 Instruct, for example, is quite solid by now.