No, definitely not the empty string hallucination bug. These are clearly real user conversations. They start like proper replies to requests, sometimes reference the original question, and appear in different languages.
I had the exact same behavior back in 2023. It seemed like clear leakage of user conversations, but it turned out to be a bug with API calls in the software I was using.
In one of the responses, it provided a financial analysis of a little-known company with a non-Latin name, located in a small country. I looked the company up: it is real, and the numbers in the response are accurate. When I asked my ChatGPT to provide a financial report for this company without using web tools, it responded: `Unfortunately, I don’t have specific financial statements for “xxx” for 2021 and 2022 in my training data, and since you’ve asked not to use web search, I can’t pull them live.`
jonrouach|7 months ago
https://snipboard.io/FXOkdK.jpg
postalcoder|7 months ago
I felt like it was a huge deal at the time, but it’s surprisingly hard to find with a quick Google search now.