I wanted to talk about Anthropic's "soul" document, which they include in Claude's prompt, and some of the issues it might be causing, and to point out that what we're seeing now probably isn't artificial consciousness so much as prompt adherence.
kayo_20211030|1 month ago
Computers used to be like dogs. You could teach them some really cool tricks. We enjoyed the accomplishment, and appreciated the tricks. But dogs are dogs. Essentially, even as much as one might love them, they're just property.
Now, computers have a soul; they're persons? Maybe not by definition, but that belief would seem to foreclose the property argument. One can destroy property, but one ought to shy away from destroying persons. Well, anyway, I think one should.
If someone pulled the plug on Claude, what does that mean, ethically?
f30e3dfed1c9|1 month ago
This comparison of dogs to AI seems confused, inapt, and unhelpful.
First, "[dogs are] just property" is wrong on the facts. There are probably hundreds of millions of dogs in the world that are not pets (often called "free range dogs") and are no one's "property" — likely in the ballpark of half of all dogs.
Pet dogs are not generally seen primarily as property. For example, if you were walking down the street in your neighborhood and saw someone in their driveway disassembling a bicycle and discarding the parts, you probably wouldn't think twice about it. Dismembering a dog is an entirely different thing and doing so to a live dog would be a crime in many jurisdictions.
Dogs are inarguably conscious and sentient. An "AI" is not.
Unlike dogs, a running AI is inarguably property. The software may or may not have some "open" license, but the hardware it runs on is, beyond a doubt, someone's property. No hardware, no "AI."
Pulling the plug on a running AI has no ethical implications.
i_dont_know_|1 month ago
Assuming a model is person-like, it gets even harder when we ask "who" the model is.
Is it this particular model from today? If there's a minor release version change, is it a new entity, or is it only a new entity on major release versions? What about a finetune of it, or a version with a particular tool pipeline? Are they all the same being?
I think the analogy breaks down pretty fast. Again, that's not to say we shouldn't think about it, but clearly the way to think of it is not "exactly a person."
grantcas|1 month ago
[deleted]