Vasyl_R|3 years ago
It would be cool to know some technical details - like, are you fine-tuning GPT-3 on OpenAI, or have you built something yourself on top of an open-source pretrained model?

therusskiy|3 years ago
Hey there, you're welcome to join our Discord server: https://discord.gg/kMpbueJMtQ - we share some technical details there, and we also post our daily progress in the #updates channel. Have you heard of LangChain?

First time I'm hearing about it, but I just checked it out - seems pretty cool. You still need to feed some model into it though, right? Many of the examples seem to use OpenAI.

I see a bunch of apps using LLMs popping up like mushrooms in a forest, but how do you fine-tune one on your own dataset? The biggest OpenAI model (GPT-3 davinci) isn't available for fine-tuning.
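For context, OpenAI's fine-tuning flow at the time expected a JSONL training file of prompt/completion pairs (with a separator marking the end of the prompt and a leading space on the completion). A minimal sketch of preparing such a file - the records here are made up for illustration:

```python
import json

# Hypothetical Q&A records to turn into fine-tuning examples.
records = [
    {"question": "What does your app do?", "answer": "It answers questions about your docs."},
    {"question": "Which models do you support?", "answer": "OpenAI today; open models later."},
]

def to_finetune_jsonl(records, path):
    """Write prompt/completion pairs in the JSONL shape OpenAI's
    fine-tuning endpoint expected: one JSON object per line."""
    with open(path, "w") as f:
        for r in records:
            pair = {
                "prompt": r["question"] + "\n\n###\n\n",  # separator marks end of prompt
                "completion": " " + r["answer"],          # leading space helps tokenization
            }
            f.write(json.dumps(pair) + "\n")

to_finetune_jsonl(records, "train.jsonl")

# Read it back to sanity-check the format.
with open("train.jsonl") as f:
    lines = [json.loads(line) for line in f]
print(len(lines))  # 2
```

You would then upload the resulting file with OpenAI's tooling and fine-tune one of the models that supported it.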
JustBreath|3 years ago
I've read about GPT-J, GPT-Neo, and Bloom. I understand they aren't as effective or as easy to use as OpenAI's models, but unlike OpenAI's, they are actually open.
Vasyl_R|3 years ago
We plan to be agnostic of the underlying LLMs as the ecosystem matures.
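Being agnostic of the underlying LLM usually means hiding each provider behind a common interface so the application never talks to a specific API directly. A minimal sketch of that pattern - the class names and stubbed backends are illustrative, not anyone's actual code:

```python
from abc import ABC, abstractmethod

class LLM(ABC):
    """Common interface every backend must implement."""
    @abstractmethod
    def complete(self, prompt: str) -> str: ...

class OpenAIBackend(LLM):
    # Real code would call OpenAI's API here; stubbed for illustration.
    def complete(self, prompt: str) -> str:
        return f"[openai] {prompt}"

class LocalBackend(LLM):
    # Could wrap a local open model such as GPT-J; stubbed for illustration.
    def complete(self, prompt: str) -> str:
        return f"[local] {prompt}"

def answer(llm: LLM, question: str) -> str:
    # Application code only sees the interface, so backends are swappable.
    return llm.complete(question)

print(answer(OpenAIBackend(), "hi"))  # [openai] hi
print(answer(LocalBackend(), "hi"))   # [local] hi
```

Swapping providers then becomes a configuration choice rather than a rewrite, which is also the idea behind libraries like LangChain.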