therusskiy | 3 years ago

Would be cool to know some technical details, like: are you fine-tuning GPT-3 via OpenAI, or did you build something yourself on top of an open-source pretrained model?

Vasyl_R | 3 years ago

Hey there, you're welcome to join our Discord server: https://discord.gg/kMpbueJMtQ - we share some technical details there, and we post our daily progress in the #updates channel. Have you heard of LangChain?

therusskiy | 3 years ago

First time I'm hearing about it, but I just checked it out - seems pretty cool, but you still need to feed some model into it, right? Many examples seem to use OpenAI.
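For context, the core LangChain pattern of that era was exactly this: a prompt template plus a pluggable LLM. A minimal sketch of the templating half, with a tiny stand-in formatter so it runs without the library (the template text and variable names are invented; the real library calls are shown only in comments):

```python
# Toy sketch of LangChain-style prompt templating. The template and its
# variables are made up; format_prompt mimics what the library's
# PromptTemplate.format does, so this runs without LangChain installed.
TEMPLATE = (
    "Answer using the context.\n"
    "Context: {context}\n"
    "Question: {question}\n"
    "Answer:"
)

def format_prompt(template: str, **kwargs) -> str:
    # Substitute the template variables into the prompt.
    return template.format(**kwargs)

prompt = format_prompt(TEMPLATE, context="Docs say X.", question="What is X?")
print(prompt)

# With the real library (and an OpenAI API key), the rough equivalent in
# early LangChain versions looked like:
#   from langchain import PromptTemplate, OpenAI, LLMChain
#   chain = LLMChain(llm=OpenAI(temperature=0),
#                    prompt=PromptTemplate.from_template(TEMPLATE))
#   chain.run(context="Docs say X.", question="What is X?")
```

The point of the commenter's question holds: the chain is just plumbing, and the `llm=` argument is where you plug in OpenAI or any other model.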

I see a bunch of apps using LLMs popping up like mushrooms in a forest, but how do you fine-tune one for your own dataset? The biggest OpenAI model (GPT-3 davinci) is not available for fine-tuning.
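For the models OpenAI did let you fine-tune at the time, the workflow started with a JSONL training file of `prompt`/`completion` records. A sketch of preparing one (the Q/A pairs, file name, and separator/stop strings are illustrative; OpenAI's docs recommended a fixed separator ending each prompt and a stop sequence ending each completion):

```python
import json

# Invented example Q/A pairs standing in for a real product dataset.
qa_pairs = [
    ("What plan am I on?", "You are on the Pro plan."),
    ("How do I reset my password?", "Use the 'Forgot password' link on the login page."),
]

def to_finetune_records(pairs):
    # Each record gets a fixed separator at the end of the prompt and a
    # leading space plus a stop marker in the completion, per OpenAI's
    # data-preparation guidance of the time.
    return [
        {"prompt": q + "\n\n###\n\n", "completion": " " + a + " END"}
        for q, a in pairs
    ]

def write_jsonl(records, path):
    # One JSON object per line, as the fine-tuning API expects.
    with open(path, "w") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")

records = to_finetune_records(qa_pairs)
write_jsonl(records, "train.jsonl")
# A fine-tune on a smaller base model would then be kicked off with the CLI,
# roughly: openai api fine_tunes.create -t train.jsonl -m curie
```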

JustBreath | 3 years ago

Anyone care to recommend some open source options?

I've read about GPT-J, GPT-Neo, and BLOOM. I understand they aren't as effective/intuitive as OpenAI's stuff, but unlike OpenAI's models, they are actually open.
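All three families are on the Hugging Face Hub and load the same way. A sketch of trying one locally (the Hub IDs are real checkpoints; the parameter counts are approximate and the ~4 bytes/param RAM rule of thumb is a rough heuristic for fp32 weights, not an official figure):

```python
# Approximate sizes, in billions of parameters, of some open checkpoints.
OPEN_MODELS = {
    "EleutherAI/gpt-neo-125M": 0.125,
    "EleutherAI/gpt-neo-1.3B": 1.3,
    "EleutherAI/gpt-j-6B": 6.0,
    "bigscience/bloom-7b1": 7.1,
}

def pick_model(ram_gb, models=OPEN_MODELS):
    """Pick the largest model whose fp32 weights (~4 bytes/param) fit in RAM."""
    fitting = {name: b for name, b in models.items() if b * 4 <= ram_gb}
    if not fitting:
        raise ValueError(f"no listed model fits in {ram_gb} GB")
    return max(fitting, key=fitting.get)

def generate(prompt, model_name):
    # Lazy import: transformers downloads the checkpoint on first use.
    from transformers import pipeline
    gen = pipeline("text-generation", model=model_name)
    return gen(prompt, max_new_tokens=40)[0]["generated_text"]

if __name__ == "__main__":
    name = pick_model(ram_gb=8)  # picks a GPT-Neo checkpoint on an 8 GB box
    print(name)
    # print(generate("Open-source LLMs are", name))  # uncomment to run locally
```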

Vasyl_R | 3 years ago

That's a good point.

We plan to be agnostic of the underlying LLM as the ecosystem matures.
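A toy sketch of what "agnostic of the underlying LLM" can look like in practice: the app talks to one small interface, and concrete backends are swapped in behind it. Everything here is invented for illustration; the backends are stubs where real code would call OpenAI's API or run a local checkpoint like GPT-J:

```python
from typing import Protocol

class LLM(Protocol):
    # The one method the application depends on.
    def complete(self, prompt: str) -> str: ...

class OpenAIBackend:
    def complete(self, prompt: str) -> str:
        # Stub: real code would call OpenAI's completion API here.
        return f"[openai] {prompt}"

class LocalBackend:
    def complete(self, prompt: str) -> str:
        # Stub: real code would run a local open-source model here.
        return f"[local] {prompt}"

def answer(llm: LLM, question: str) -> str:
    # Application logic depends only on the interface, not the vendor.
    return llm.complete(f"Q: {question}\nA:")

print(answer(OpenAIBackend(), "hello"))
print(answer(LocalBackend(), "hello"))
```

Swapping vendors then means constructing a different backend, with no changes to the application code.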