top | item 47182828


moozooh | 2 days ago

It may appear simpler on the surface but it's very easy to find that market forces that don't have any checks and balances on them eventually converge on increasingly aggressive and dehumanizing behavior—not unlike your example with women. I have many such well-documented behaviors to list as examples, and I guarantee you have encountered them regularly and been upset at them.

The way we organize a society is by having governments—usually elected ones, to represent what "most people in a society" actually think—serve as arbiters of applied morals in our interactions, including business. To that end, we codify most of those morals in laws with clear definitions to prevent things like unfettered monopolies, corporate espionage, poor working conditions, abusive hiring practices, etc. This generally works, though it depends on how well a given government and its constituent parts do their jobs, and whether they use their power to serve the entire society's interests or only those of the elites driving decisions. We can watch it fail in real time right now, for example.

Morals don't have to be evaluated "objectively" (whatever that means) every time in order to be observed. Humanity has already agreed on many of the things that make up the UDHR, international law, and other related documents. That's not the hard part. Making independent actors conduct their business in accordance with these codes is the hard part. Somehow, even making them follow their own self-imposed principles is crazy hard for some reason. When Amodei claims Anthropic develops Claude for the benefit of all humanity but greenlights its use for surveillance of non-Americans, that's scummy. When Amodei claims to be terrified of authoritarian regimes gaining access to powerful AI but seeks investment from them, that's scummy. The deal with Palantir, the mass-surveillance business, is scummy. Framing autonomous weapons as disagreeable only insofar as the underlying capabilities aren't reliable enough is scummy. You don't need a PhD in ethics to notice that.


vladms | 1 day ago

The initial quote I responded to was:

> market incentives pretty much always go opposite of moral incentives because morals put breaks on decisions that multiply value for the company

Yes, both markets and morals have to be defined and are subject to rules and conventions, as you correctly note in your reply. What I think needs more qualification is the claim that market and moral incentives "always go opposite".

Even today, in many countries, the market ensures a lot of necessary things for much of the population. Not every domain can be managed as a market (for example, I don't think healthcare or basic infrastructure fit), and not all countries have such frameworks in place, but given the successful examples, I think the failures are more about misusing the tool than about the tool itself.

Regarding your examples (Palantir, Claude—weapons/surveillance), the same things happened in places where market incentives are or were not a driving force: communist Eastern Europe and China for surveillance, and quite probably China for autonomous weapons.

Honestly, I wish I could propose or explain what would help. But simply blaming the generic tools we have (markets, AI, the press) for the bad outcomes of their misuse worries me, as it can lead to not using them even where they would work.