lqcfcjx | 2 years ago
I totally agree with this direction. I think it's the wrong decision for some companies to try to make their product friendly to a wider age range, when in fact action and adult roleplay are what users really want. Who needs a virtual partner that only does safe chatting?
But honestly, you need to make your service more performant. Responses are way too slow, which makes it hard for me to get a sense of the quality.
I'm also curious what underlying models you're using. Do you train / fine-tune your own LLMs, or is it built on top of OpenAI?
enjamet | 2 years ago
Frankly, because of resource constraints, the site reserves faster speeds for supporters (we're community supported, given the financial complexity of adult content), but I've been seriously thinking lately about how that might be hurting the first-time user experience. Currently toying with some changes.
The site uses a fine-tune of llama-2-13b, specifically pygmalion-2-13b. I've found it to be particularly engaging and creative in its storytelling, but as you can guess from the size, it has some tradeoffs. I'm currently experimenting with new models and would love to hear suggestions.
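For anyone curious about wiring this model up themselves, here's a minimal sketch of assembling a roleplay prompt. It assumes the `<|system|>` / `<|user|>` / `<|model|>` role tokens described in the pygmalion-2-13b model card; the helper name and example strings are my own, not anything from the site.

```python
def build_prompt(system: str, history: list[tuple[str, str]], user_msg: str) -> str:
    """Assemble a single prompt string from a system persona,
    prior (user, model) turns, and the new user message.

    Role tokens follow the Pygmalion-2 convention (assumption):
    <|system|> sets the persona, <|user|> and <|model|> mark turns.
    """
    parts = [f"<|system|>{system}"]
    for user_turn, model_turn in history:
        parts.append(f"<|user|>{user_turn}")
        parts.append(f"<|model|>{model_turn}")
    parts.append(f"<|user|>{user_msg}")
    # End with an open <|model|> tag so generation continues as the character.
    parts.append("<|model|>")
    return "".join(parts)

# Hypothetical usage: the resulting string is what you'd feed to the model.
prompt = build_prompt("Enter RP mode. You are a friendly tavern keeper.", [], "Hello!")
```

The key design point is that the prompt always ends with an open `<|model|>` tag, so the model's completion is the character's next reply.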