
grepex | 2 months ago

The way I think about it is that LLMs are just a tool, and if you trust the tool too much it can backfire. It reminds me of a video [1] that Louis Rossmann posted about a police officer essentially trusting his AI tool (Flock cameras) too much and falsely accusing a woman of a crime, claiming "you can't take a breath of fresh air without us knowing about it".

[1] https://youtu.be/AoEQg1M92_E?si=A-XNXP_smH2I3hWj
