sh1mmer | 1 year ago
Of course, the incorrect Gemini answer was still listed above that.
[1] https://news.okstate.edu/articles/agriculture/2020/gedon_hom...
hn_throwaway_99 | 1 year ago
I think this "failure mode" really highlights how LLMs aren't "thinking", just mashing up statistically probable tokens. For example, there was an HN article recently about how law-focused LLMs made tons of mistakes. A big reason for this is that the law itself is full of contradictory text: laws get passed that are later found unconstitutional, some legal decisions are overturned by higher courts, etc. When you're just "mashing this text together", which is basically what LLMs do, the model has no way of knowing which piece of text is now legally controlling.
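To make "statistically probable tokens" concrete, here's a toy sketch of next-token sampling. All the probabilities and tokens are made up for illustration; a real LLM conditions a neural network on the full context, but the core point is the same: the output is drawn by probability, with nothing marking which source statement is currently authoritative.

    import random

    # Hypothetical next-token probabilities for "The statute is ...",
    # as if learned from a corpus containing both the original statute
    # and a later ruling striking it down. Numbers are invented.
    next_token_probs = {
        "enforceable": 0.55,       # frequent in the original statute text
        "unconstitutional": 0.45,  # frequent in the overruling decision
    }

    tokens = list(next_token_probs)
    weights = list(next_token_probs.values())

    # Sampling just follows the statistics; nothing tells the model
    # which of the two continuations is legally controlling today.
    print(random.choices(tokens, weights=weights, k=1)[0])

Run it a few times and you get both answers, in roughly the learned proportions, which is exactly the contradictory-training-text problem described above.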
pclmulqdq | 1 year ago