top | item 44577329

requilence | 7 months ago

No, definitely not the empty string hallucination bug. These are clearly real user conversations. They start like proper replies to requests, sometimes reference the original question, and appear in different languages.

jonrouach | 7 months ago

I had the exact same behavior back in 2023. It seemed clearly like leakage of user conversations, but it turned out to be just a bug with API calls in the software I was using.

https://snipboard.io/FXOkdK.jpg

postalcoder | 7 months ago

There was an issue with conversation leakage, though. It involved some bug with Redis.

It felt like a huge deal at the time, but it’s surprisingly hard to find with a quick Google search.

JyB | 7 months ago

I don’t see anything here that would prevent an LLM from generating these. Right?

requilence | 7 months ago

In one of the responses, it provided a financial analysis of a little-known company with a non-Latin name, located in a small country. I found this company; it is real, and the numbers in the response are real. When I asked my ChatGPT to provide a financial report for this company without using web tools, it responded: `Unfortunately, I don’t have specific financial statements for “xxx” for 2021 and 2022 in my training data, and since you’ve asked not to use web search, I can’t pull them live.`

addandsubtract | 7 months ago

New Touring Test unlocked! Differentiate between real and fake hallucinations.

DANmode | 7 months ago

So THAT'S what the "GT" means on all of these GPU model names!