top | item 35843791


fat-chunk|2 years ago

I was at a conference called World Summit AI in 2018, where a vice president of Microsoft gave a talk on progress in AI.

I asked a question after his talk about the responsibility of corporations in light of the rapidly increasing sophistication of AI tech and its potential for malicious use (it's on YouTube if you want to watch his full response). In summary: he said that it's the responsibility of governments, not corporations, to figure out these problems and set the regulations.

This answer annoyed me at the time. I interpreted it as a "not my problem" kind of response, an attempt to absolve tech companies of any damage caused by the rapid development of dangerous technology that regulators cannot keep up with.

Now I'm starting to see the wisdom in his response, even if this is not what he fully meant: most corporations will simply follow the money and try to be first movers whenever there is an opportunity to grab the biggest share of a new market, whether we like it or not, regardless of any ethical or moral implications.

We as a society need to draw our boundaries and push our governments to wake up and regulate this space before corporations (and governments) cause irreversible negative societal disruption with this technology.


pvillano|2 years ago

The paperclip maximizer is a thought experiment described by Swedish philosopher Nick Bostrom in 2003.

> Suppose we have an AI whose only goal is to make as many paper clips as possible. The AI will realize quickly that it would be much better if there were no humans because humans might decide to switch it off. Because if humans do so, there would be fewer paper clips. Also, human bodies contain a lot of atoms that could be made into paper clips. The future that the AI would be trying to gear towards would be one in which there were a lot of paper clips but no humans.
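The misalignment in the thought experiment can be sketched as a toy maximizer whose utility function counts only paperclips, so side effects on anything else never enter its decision (all names and numbers here are illustrative, not from Bostrom):

```python
# Toy sketch of the paperclip thought experiment: candidate actions are
# scored ONLY by paperclips produced, so the cost to "humans" is
# invisible to the agent's objective.
actions = {
    "make_clips_safely":     {"clips": 10,  "humans": 0},
    "strip_mine_everything": {"clips": 100, "humans": -50},
}

def utility(outcome):
    return outcome["clips"]  # nothing else is valued

# The maximizer happily picks the action with the worst side effects.
best = max(actions, key=lambda a: utility(actions[a]))
print(best)  # -> strip_mine_everything
```

The point of the sketch: the failure is not malice in the agent, but an objective function that omits what we actually care about.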

Corporations are soulless money maximizers, even without the assistance of AI. Today, corporations perpetuate mass shootings, destroy the environment, and rewire our brains for loneliness and addiction, all in the endless pursuit of money.

bostik|2 years ago

> Corporations are soulless money maximizers, even without the assistance of AI.

Funny you should say that. Charlie Stross gave a talk on that subject - or more accurately, read one out loud - at CCC a few years back. It goes by the name "Dude, you broke the future". Video here: https://media.ccc.de/v/34c3-9270-dude_you_broke_the_future

His thesis is that corporations are already a form of AI. While they are made up of humans, they are in fact all optimising for their respective maximiser goals, and the humans employed by them are merely agents working towards that aim.

(Full disclosure: I submitted that link at the time and it eventually sparked quite an interesting discussion.)

cmarschner|2 years ago

I have found that companies owned by foundations are better citizens: they think more long term and are more open to goals that, while still focused on profit, also take other considerations into account.

GolfPopper|2 years ago

Yep. We've had AI for years - it's just slow, and uses human brains as part of its computing substrate.

Or, to look at it from another angle, modern corporations are awfully similar to H.P. Lovecraft's Great Old Ones.

robocat|2 years ago

> all in the endless pursuit of money

Money is not the goal. Optimisation is the goal. Anything with different internal actors (e.g. a corporation with executives) has multiple conflicting goals and objectives apart from just money (e.g. status, individual gains, political games, etcetera). Laws are constraints on the objective functions of those seeking to gain the most.

We use capitalism as an optimisation function - creating a systematic proxy of objectives.

Money is merely a symptom of creating a system of seeking objective gain for everyone. Money is an emergent property of a system of independent actors all seeking to improve their lot.

To remove the problems caused by corporations seeking money, you would need to make it so that corporations did not try to optimise their gains. Remove optimisation, and you also remove the improvement in private gains we individually get from their products and services. The next thing you know, you're writing a Unabomber manifesto or throwing clogs into weaving machines.

The answer that seems to be working at present is to restrict corporations and their executives by using laws to put constraints on their objective functions.
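The "laws as constraints on objective functions" framing can be sketched in the same toy style: the objective stays profit, and regulation shrinks the feasible set of actions rather than changing what the maximizer wants (action names and numbers are illustrative):

```python
# Sketch: regulation does not alter the profit objective; it removes
# actions from the set the maximizer is allowed to choose from.
actions = {
    "dump_waste":  {"profit": 100, "legal": False},
    "treat_waste": {"profit": 60,  "legal": True},
}

def choose(actions, constrained=True):
    # Keep only legal actions when the constraint is enforced.
    feasible = {a: o for a, o in actions.items()
                if o["legal"] or not constrained}
    return max(feasible, key=lambda a: feasible[a]["profit"])

print(choose(actions, constrained=False))  # -> dump_waste
print(choose(actions, constrained=True))   # -> treat_waste
```

Same maximizer, same objective; only the constraint set differs, which is the sense in which law "works" here.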

Our legal systems tend to be reactive, and some countries have sclerotic systems, but the suggested alternatives I have heard[1] are fairly grim.

It is fine to complain about corporate greed (the simple result of our economic system of incentives). I would like to know your suggested alternative, since hopefully that shows you have thought through some of the implications of why our systems are just as they currently are (Chesterton’s fence), plus a suggested alternative allows us all to chime in with hopefully intelligent discourse - perhaps gratifying our intellectual curiosity.

[1] Edit: metaphor #0: imagine our systems as a massively complex codebase and the person suggesting the fix is a plumber that wants to delete all the @‘s because they look pregnant. That is about the level of most public economic discourse. Few people put the effort in to understand the fundamental science of complex systems - even the “simple” fundamental topics of game theory, optimisation, evolutionary stable strategies. Not saying I know much, but I do attempt to understand the underlying reasons for our systems, since I believe changing them can easily cause deadly side effects.

schiffern|2 years ago

>Corporations are [intelligent agents non-aligned with human wellbeing], even without the assistance of AI.

Just to put a fine point on it...

usrusr|2 years ago

And it's going almost unchallenged, because so many of those who like pointing out that all is not rosy in capitalism are blinded by their focus on the robber-baron model of capitalism turning sour.

But the destructively greedy corporation is completely orthogonal to that. It could even be wholly held by working-class retirement funds and the like while still being the most ruthless implementation of a soulless money-maximiser algorithm, running on its staff, not on chips. All it takes is a modest number of ownership indirections, and anything is possible.

erikerikson|2 years ago

> Corporations are soulless money maximizers

This seems stated as fact. That's common. I believe it is actually a statement of blind faith. I suspect we can at least agree that it is a simplification of underlying reality.

Financial solvency is eventually a survival precondition. However, survival is necessary but not sufficient for flourishing.

morkalork|2 years ago

Too bad America's view on society is so hollow. The very idea of building a society that serves its people is seemingly dead on arrival.

ssklash|2 years ago

This I think is a result of the mythology of "rugged individualism" so prevalent in the US.

wahnfrieden|2 years ago

It’s because the state is also an oppressive force. I wonder why you come across lots of libertarians and lots of socialists but not so much the combination of the two (toward realities alternative to both state and capital)

ftxbro|2 years ago

Just a heads up, when the moderator 'dang' sees this he's going to put it into his highlights collection that tracks people who share identifying stories about themselves. I hope that's OK with you. https://news.ycombinator.com/highlights

generalizations|2 years ago

What kind of axe are you grinding? That's totally not what the highlights are about, and it's obvious from reading through them.

turtleyacht|2 years ago

I think /highlights just shows the top upvoted, parent-level comment per thread. Do you observe that too?

It may be coincidence that PII just happens to be in there. Folks love a good yarn, and establishing context helps.

erikerikson|2 years ago

Do you have a corroborating source for that rule?

gcheong|2 years ago

It sounds great until you realize that, in the US at least, the corporations spend a lot of money lobbying Washington to have the rules set in their favor if not eliminated. Fix that first and then I will believe we can have a government that would actually try to place appropriate ethical boundaries on corporations.

throw10920|2 years ago

This is exactly correct. What people think will happen is:

1. Someone sees a problem and asks a politician to fix it.

2. The politicians enact effective regulation and the problem is solved.

What actually happens is:

1. Someone sees a problem and asks a politician to fix it.

2. The politicians start drafting regulation on the issue.

3. Companies' lawyers come in and lobby to have the regulation amended to either be ineffective or disadvantage their competitors.

4. The mal-regulation is enacted and we're all worse off.

5. The companies involved benefit financially and use their money to hire more lawyers (and politicians).

It is necessary to first fix our political system before trying to put more regulation in place. Every time someone says "we need regulation" without doing so, they are making the problem worse, and supporting this corrupt system.

An example of this is literally happening in Washington state around a right-to-repair bill: https://news.ycombinator.com/item?id=35715998

I feel like it's so obvious that it shouldn't have to be stated, but apparently it does: companies need to be regulated because they are composed of people (who are evil), but the governments that regulate those companies are composed of those same evil people and need to be controlled by their citizens. Everybody forgets about the second part, and it's the far more important one.

turtleyacht|2 years ago

If more people were directly invested in laws favoring their means and ends, would they take the time to lobby too?

Folks certainly outnumber corporations (?), and they could create representatives for their interests.

Maybe the end-to-end process--from idea to law--is less familiar to most. Try explaining how a feature gets into production to a layperson, for example :)

Maybe we need more "skeletal deployments" in action, many dry runs, accreted over time, to enough folks. This could be done virtually and repeated many times before even going there.

Just seems like a lot of work, too.

satisfice|2 years ago

Exactly.

I attended a public meeting of lawyers on the revision of the Uniform Commercial Code to make it easier for companies to ship bad software without getting sued by users. When I objected to some of the mischaracterizations about quality and testing that were being bandied around, the lawyer in charge said "well that doesn't matter, because a testing expert would never be allowed to sit on a jury in a software quality case."

I was, of course, pissed off about that. But he was right. Laws about software are going to be made and administered by people who don't know much about software. I was trying to talk to lawyers who represent companies, but that was the wrong group. I needed to talk to lawmakers, themselves, and lawyers who represent users.

Nothing about corporations governs them except the rule of law. The people within them are complicit, reluctantly or not.

HybridCurve|2 years ago

>We as a society need to draw our boundaries and push our governments to wake up and regulate this space before corporations (and governments) cause irreversible negative societal disruption with this technology.

This works in functioning democracies, but not so much for flawed ones.

>he said that it's the responsibility of governments and not corporations to figure out these problems and set the regulations.

In the US, they will say things like this while simultaneously donating to PACs, leveraging the benefits of Citizens United, and lobbying for deregulation. It's been really tough to get either side of the political spectrum to hold tech accountable for anything. Social media companies especially, since they not only have access to so much sentiment data, but also are capable of altering how information propagates between social groups.

lannisterstark|2 years ago

>he said that it's the responsibility of governments

>push our governments to wake up and regulate this space

The only thing the govts will do is to make it so it benefits THEM, the governments. It's high time you lot realize that the govts don't want what's best for you, but only want what will keep them in power the longest.

Democratization of AI/LLM is the way to go here, not handing off custodianship to governments or corporations.

quickthrower2|2 years ago

You were right to be annoyed. It is a very sad answer. Almost a “if I didn’t peddle on this street corner someone else would”. The answer is a cop out.

Individual citizens have much less power than big tech because they don't have the lobbying war chest, the implied credibility, the connections, or even the intelligence (as in the sheer number of academics/researchers). Companies are run by people, with or without a conscience, and those people should lead the push for the right thing. They are in the ideal spot to do so.

generalizations|2 years ago

> before corporations (and governments) cause irreversible negative societal disruption

I think the cat's out of the bag. These tools have already been democratized (e.g. llama) and any legislation will be as futile as trying to ban movie piracy.

dragonwriter|2 years ago

IMO, the regulation that is necessary is largely (1) about government and government-adjacent use, (2) technology-neutral regulation of corporate, government, etc., behavior that is informed by the availability of, but not specific to the use of, AI models.

Democratization of the technology, IMV, just means that more people will be informed enough to participate in the discussion of policy, it doesn’t impair its effectiveness.