top | item 46579115

akkad33 | 1 month ago

Couldn't this backfire if they put LLMs on safety-critical data? Or even if someone asks LLMs for medical advice and dies?


nxpnsv|1 month ago

I guess the point is that doing so already isn't safe?

bigstrat2003|1 month ago

You already shouldn't be using LLMs for either of those things. Doing so is tremendously foolish, given how stupid and unreliable the models are.

akkad33|1 month ago

I don't think that would stop people

awkward|1 month ago

There are several humans who need to make decisions between bad training data going in and life-or-death decisions coming out of an LLM.