top | item 39874059

nonoesp | 1 year ago

> When the product officially rolls out there will be a “no-go voice list” that detects and prevents AI-generated speakers that are too similar to prominent figures.

Should they then let anyone register their "voice id" to prevent others from generating similar voices?

What if your voice happens to resemble the voice of a "prominent figure?"

testacpwoek|1 year ago

It's insane that this protection exists for "prominent figures" but nobody else. The same damage that can be done to prominent figures can be done to regular people. They acknowledge the damage can be done, and yet they're still rolling this out? The more OpenAI does, the clearer it is that they don't give a single damn about the consequences of their technology.

xandrius|1 year ago

That's what for-profit and investors do to morality.

MrNeon|1 year ago

It is very obvious that the amount and type of damage is not the same if you clone any random person's voice or clone a political leader's voice.

It is not the same damage.

zoky|1 year ago

The “Open” in OpenAI refers to, uh, a certain part of your anatomy…

Shawnj2|1 year ago

It’s a little tricky, because if someone cloned your voice first they could pretend to be you, block you from using “their” voice, and then do nasty stuff with it. Under this system they can still do the nasty stuff, but they can’t stop you from using your own voice, unless registration required something like uploading an ID along with your voice. It’s much easier to enforce this for celebrities and other well-known people than for the general public.

mtillman|1 year ago

It reminds me of the new EU law being implemented, in that it bans the use of AI by private companies but explicitly allows it for law enforcement. Both are methods of consolidating power. It's safer to be open and unrestricted for everyone than to explicitly protect the few.