item 46392039

mv4 | 2 months ago

Other than the potential liability, cost may also be a factor.

Back in April 2025, Altman mentioned people saying "thank you" was adding “tens of millions of dollars” to their infra costs. Wondering if adding per-message timestamps would cost even more.

cj|2 months ago

Presumably you could decouple timestamps from inference.

I would be very surprised if they don’t already store date/time metadata. If they do, it’s just a matter of exposing it.
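
A minimal sketch of what that decoupling could look like (the field names and structure here are assumptions for illustration, not OpenAI's actual schema): the timestamp lives in stored message metadata and gets rendered client-side, but is never serialized into the prompt sent to the model, so it adds nothing to inference cost.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Message:
    role: str
    content: str
    # Stored alongside the message, but never sent to the model.
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def build_prompt(history: list[Message]) -> list[dict]:
    # Only role/content reach inference; timestamps stay out of the
    # context window entirely.
    return [{"role": m.role, "content": m.content} for m in history]

def render(history: list[Message]) -> list[str]:
    # The UI layer is free to display the stored timestamp.
    return [f"[{m.created_at:%Y-%m-%d %H:%M}] {m.role}: {m.content}"
            for m in history]
```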

g947o|2 months ago

I think "thank you" messages do get fed back into inference as part of the follow-up context, but timestamps wouldn't necessarily have to be.

I just asked ChatGPT this:

> Suppose ChatGPT does not currently store the timestamp of each message in conversations internally at all. Based on public numbers/estimates, calculate how much money it will cost OpenAI per year to display the timestamp information in every message, considering storage/bandwidth etc

The answer it gave was $40K-$50K. I'm not experienced enough to verify whether the reasoning holds up, but anyone who knows better is welcome to fact-check it.
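
For what it's worth, here's a transparent back-of-envelope you can check by hand. Every input below is an assumption I picked for illustration (commodity cloud storage/egress rates, a guessed message volume), not an OpenAI figure:

```python
# All inputs are illustrative assumptions, not OpenAI numbers.
messages_per_day = 2_500_000_000   # assumed message volume
bytes_per_timestamp = 8            # 64-bit unix epoch seconds
storage_price_gb_month = 0.023     # assumed S3-standard-like rate
egress_price_gb = 0.09             # assumed bandwidth rate

gb_per_year = messages_per_day * bytes_per_timestamp * 365 / 1e9  # ~7.3 TB

# Data accrues over the year, so on average roughly half of it sits in
# storage for the full 12 months.
storage_cost = gb_per_year * storage_price_gb_month * 12 / 2
bandwidth_cost = gb_per_year * egress_price_gb  # each byte served once

print(round(gb_per_year), round(storage_cost + bandwidth_cost))
# → 7300 1664
```

Real-world overhead (database row padding, indexes, replication, formatted timestamp strings instead of raw epochs) would multiply this by some factor, which may be how ChatGPT got to $40K-$50K; either way it's noise next to inference costs.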

mikkupikku|2 months ago

Altman was being dumb; being polite to LLMs tends to produce higher-quality results, which means less back-and-forth and saves money in the long run.

stainablesteel|2 months ago

This is actually hilarious, and also easily fixable if they just respond to it with a pre-determined reply:

if response.strip().lower() == 'thank you': print("you're welcome")

jiggawatts|2 months ago

That won't work if the previous conversation was something like "translate everything from here on into <target language>".

nacozarina|2 months ago

it’s wild that people accept his rhetoric at face value