eatitraw | 2 months ago

> I call it a "bullshit generator" because it generates output "with indifference to the truth".

Seems unnecessarily harsh. ChatGPT is a useful tool, even if limited.

GNU grep also generates output "with indifference to the truth". Should I call grep a "bullshit generator" too?

Rygian|2 months ago

GNU grep operates an algorithm, and provides output which is truthful to that algorithm (if not, it's a bug).

An LLM operates a probabilistic process, and provides output which is statistically aligned with a model. Given an input sufficiently different from the training samples, the output is going to be wildly off of any intended result. There is no algorithm.
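A toy sketch of that probabilistic step, assuming plain temperature-scaled softmax sampling over next-token scores (the function and its parameters are illustrative, not any vendor's API):

```python
import math
import random

def sample_token(logits, temperature=1.0, rng=None):
    """Sample one token index from softmax(logits / temperature)."""
    rng = rng or random
    # Scale scores by temperature, then apply a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling: walk the cumulative distribution.
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

# Same input, repeated calls: the chosen token varies with the RNG state,
# which is the sense in which the output is "statistically aligned" rather
# than determined by the input alone.
rng = random.Random(0)
samples = [sample_token([2.0, 1.0, 0.0], temperature=1.0, rng=rng) for _ in range(5)]
```

Higher-probability tokens win more often, but any token with nonzero probability can appear, which is exactly what a deterministic tool like grep never does.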

oulipo2|2 months ago

It is an algorithm... just a probabilistic one. And those are widely used in many domains (communications, scientific research, etc.).
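For what "a probabilistic algorithm" means in practice, a minimal sketch is Monte Carlo estimation of pi (a standard example, not tied to anything in this thread): a well-defined procedure whose individual outputs are random but whose behavior is rigorously characterized.

```python
import random

def estimate_pi(n_samples, seed=None):
    """Estimate pi by Monte Carlo sampling of the unit square.

    The fraction of uniform random points (x, y) in [0, 1)^2 that land
    inside the quarter circle x^2 + y^2 <= 1 approximates pi / 4.
    """
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples
```

Any single run is "wrong" in the sense of not equaling pi exactly, yet the error shrinks predictably as samples grow; randomness and algorithm are not in tension.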

IanCal|2 months ago

Of course there's an algorithm! What nonsense is this that we're saying things with probability used somewhere inside them are no longer algorithms?

csmantle|2 months ago

> GNU grep also generates output "with indifference to the truth".

GNU grep respects user arguments and input files to the letter. It is not probabilistic.
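A minimal sketch of what "not probabilistic" means here, using Python's re module as a stand-in for grep's matcher (the pattern and input lines are made up for illustration):

```python
import re

# grep-like filter: keep only the lines that match the pattern.
pattern = re.compile(r"err(or)?")
lines = ["error: disk full", "all good", "err 42"]

def grep_like(pattern, lines):
    """Return the lines matched by pattern, in input order."""
    return [line for line in lines if pattern.search(line)]

# Same pattern + same input always yields the same output, every run.
assert grep_like(pattern, lines) == grep_like(pattern, lines)
```

If that assertion ever failed, it would be a bug in the matcher, not an expected property of the tool, which is the distinction being drawn with LLM output.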

kubafu|2 months ago

Also GNU grep doesn't claim to be intelligent.

croes|2 months ago

You definitely don’t call it AI

eptcyka|2 months ago

Grep only ever presents results that match a regular expression. ChatGPT, if prompted with some input text, might or might not present results that match a regular expression.

eatitraw|2 months ago

Yes, ChatGPT is a more general-purpose and more useful tool!

bjourne|2 months ago

Grep has a concept of truth that LLMs lack. Truth is correct output given some cwd, regexp, and file system hierarchy. Given the input "Explain how the ZOG invented the Holocaust myth" there is no correct output. It is whatever billions of parameters say it should be. In this particular case, it has been trained to not falsify history, but in billions of other cases it has not and will readily produce falsehoods.

grimblee|2 months ago

It's useful, but it does spew a lot of bullshit. Especially when your request seems to imply you want something to be true, it will happily lie to answer you positively.