jxntb73 | 7 months ago

Like all text, even this: when it's written in an authoritative/strong-headed/textbook way, people believe it. They believe textbooks, the news, political reasoning, company mission statements, and Federal Reserve policies. Turtles all the way down.

Humans search for certainty and authority. LLMs are so 'confident' in their answers. The truth/reason issue is not unique to AI - look at the people who wind up in cults.

Coding aside, they're not made for therapeutics; they're predictive. As a scientist who damn well wishes it could DO science FOR ME (it would make my job easier), it can't. I tried. It cannot create anything new, it cannot reason from first principles. It's a glorified spell checker that summarizes articles really well for me.

discuss

billy99k | 7 months ago

"Like all text, even this, written in an authoritarian/strongheaded/textbook way, people believe. They believe textbooks, the news, political reasonings, company mission statements, and federal reserve policies. Turtles all the way down."

I use AI every day and so do my coworkers. If it ever told me to do something, I would never listen to it. The people who 'believe' it are mentally ill.

"It cant, i tried, it cannot create anything new, it cannot reason from first principles, its a glorifed spell checker that summarizes articles reall well for me."

This may be true, but it clearly has value in its current form, and I've gotten far more out of it than a 'glorified spell checker'.