item 47160973

Forgeties79 | 4 days ago

> Sam Altman once joked (?) he wouldn't know how to raise his child without ChatGPT. Maybe he should ask ChatGPT how to behave more like a human? Or at least fake it?

Not to mention that was at a time when all kinds of wild suggestions, like glue on pizza, were turning up in sloppy AI outputs. There are so many little things that quickly become big things with kids, and exhausted parents should absolutely not use LLMs for sussing those things out.

I could easily see well-meaning parents looking for healthy snacks for their kids and accidentally feeding their baby honey (a botulism risk under age one), for instance. Or asking how much water to give their infant and not realizing the answer is absolutely none, unless they are severely dehydrated from an illness or something.

There are a lot of hazards for kids under one in particular that make me incredibly nervous to ever suggest exhausted parents use LLMs to answer kid-related questions. Recommendations also change relatively frequently, so who knows if it's even drawing on the most recent best practices.
