WingNews


hydershykh | 1 year ago

That's a fair point, but models like GPT-4 don't hallucinate much when it comes to summarization, so I don't think these applications contribute anything negative.



trekkie1024 | 1 year ago

Surprisingly, they hallucinate more than you might think.

https://x.com/lefthanddraft/status/1777495120910426436?s=46

powered by hn/api // news.ycombinator.com