top | item 47207169

toss1 | 12 hours ago

If each of us individually or as corporations should not be in the business of deciding what is "evil", who should be in that business?

Everyone SHOULD continuously consider, decide on, and live by moral judgements and codes they internalize and use to make choices in life.

This aspect of life should NEVER be outsourced — of course, learn from and use codes others have developed and lived by — but ALWAYS consider deeply how it works in your situation and life.

(And no, I do NOT mean situational ethics; I mean each person considering, choosing, and internalizing the codes by which they live.)

So, yes, Anthropic and anyone else building products absolutely should be deciding for themselves what they will build, for what purposes it is fit to use, and telling others about those purposes. For products like AI, this absolutely includes deciding what is "evil" and preventing such uses.

If the customer finds such restrictions are not what they want, they ARE FREE to not use the product.


thunky | 7 hours ago

> If each of us individually or as corporations should not be in the business of deciding what is "evil", who should be in that business?

This is easy imo. Two methods:

1. The law. It should not be legal for the US Govt to murder people at will. If it is legal, then of course they'll use tools to make it easier. Maybe AI, maybe Clippy. If they can't use AI then they'll fall back to using some other way of doing it like they've already been doing for several years.

2. Voting. For representatives that actually represent us and have our interest in mind rather than their own corrupt interests. And voting with our wallet against companies that do legal but morally bankrupt things.

Of course we're failing both of these hard right now. But imo the answer is not to give up and let corporations make the rules.

In other words, if it were legal for a normal citizen to murder anyone they wanted, of course they'll use Google Maps to help them do that. We don't put restrictions on how people can use Google Maps. Instead we've made murder illegal. We should be doing the same thing here.

toss1 | 3 hours ago

Exactly zero of those account for an individual's or company's ability to live by their own moral code.

And this AI software is not a mere static object like a hammer that can be handed off to a customer, after which what it is used for, building a house or bashing a living skull, is the customer's business.

This is a system that must be constantly maintained by its builders.

Moreover, even if we use your standard, the law, it has already been decided in Anthropic's favor.

What you require is that Anthropic actively participate in activities it considers abhorrent and/or unwise. SCOTUS has already ruled that a business cannot even be required to sell a cake to someone if it objects to the intended purpose (in that case, a celebration of a gay wedding).