top | item 42212691

benedictevans | 1 year ago

I tried to capture this on the last slide before the conclusion - maybe all AI questions have one of two answers - "no-one knows" or "it will be the same as the last time"

this is one of the "no-one knows" questions

discuss

Animats | 1 year ago

The question I'm asking isn't whether hallucinations can be fixed. It's: if they are not fixed, what are the economic consequences for the industry? How necessary is it that LLMs become trustworthy? How much valuation assumes that they will?

Sateeshm | 1 year ago

And is it even fixable?