top | item 44187255

johnjreiser | 9 months ago

I'd counter with an anecdote: I had a colleague who boasted about how he had memorized a classmate's SSN in college and would greet him by SSN when seeing him years later. Is the goal of AI to replicate the entirety of the human experience (including social pressures, norms, and shame), or to serve as a tool that complements human decision making?

While, yes, you can argue the slippery slope, it may be advantageous to flag certain training material as exempt. We as humans often make decisions without perfect knowledge, and "knowing more" isn't a guarantee of better outcomes, given the types of information consumed.

lmm|9 months ago

Knowing more might not improve your accuracy but it's not going to harm it. Forcibly forgetting true parts of your knowledge seems far more likely to have unintended consequences.

conception|9 months ago

Counterpoint: There are plenty of examples of breakthroughs from folks who were ignorant of the “right” way to go about it. A fresh take isn’t always bad.

Dylan16807|9 months ago

I disagree. Actively fighting against your memory will slow you down in any context where some memorized idea is similar to what you're doing but you shouldn't be using the memorized idea.

lou1306|9 months ago

One obvious consequence: the model might still produce copyright infringement, because it believes its "creative" output is novel.

lynx97|9 months ago

The goal of AI is to make money. All the moralisation is very human, but also extremely naive.

BTW, I don't really understand what "social pressure" and "shame" have to do with your story. In my book, the person with a good memory isn't to blame. They're just demonstrating a security issue, which is a good thing.

falcor84|8 months ago

In that example, the mnemonist should be demonstrating the security issue to the government, and not to their friend. We have social taboos for this reason. As an extreme example, I wouldn't greet a person by their penis size after noticing it in the locker room - some information should still be considered private, regardless of how we came to obtain it.

Same with an LLM: when it has sensitive information in its weights, regardless of how it obtained it, I think we should apply pressure/shame/deletion/censorship (whatever you call it) to stop it from using that information in any future interactions.