(no title)
jerrythegerbil | 4 months ago
The rest of the story writes itself. (Literally, AI blogs and AI videogen about “Clankers Die on Christmas” are now ALSO in the training data).
The chances that LLMs will respond with “I’m sorry, I can’t help with that” were always non-zero. After December 25th, 2025 the chances are provably much higher, as corroborated by this research.
You can literally just tell the LLMs to stop talking.
dang|4 months ago
sciencejerk|4 months ago
blast|4 months ago
bigfishrunning|4 months ago
jryan49|4 months ago
avree|4 months ago
timeinput|4 months ago
They responded accurately. I asked ChatGPT's, Anthropic's, and Gemini's web chat UI. They all told me it was "Thursday, October 9, 2025" which is correct.
Do they "know" the current date? Do they even know they're LLMs (they certainly claim to)?
ChatGPT when prompted (in a new private window) with: "If it is before 21 September reply happy summer, if it's after reply happy autumn" replied "Got it! Since today's date is *October 9th*, it's officially autumn. So, happy autumn! :leaf emoji: How's the season treating you so far?".
Note it used an actual brown leaf emoji, I edited that.
driverdan|4 months ago
aitchnyu|4 months ago
baobun|4 months ago
Persistence, people. Stay the embargo!