yoavz | 1 year ago

Most interesting example to me: "Digitally altering audio to make it sound as if a popular singer missed a note in their live performance".

This seems oddly specific: it's the inverse of what happened with Alicia Keys at the recent Super Bowl. As Robert Komaniecki pointed out on X [1], Keys hit a "sour note" that the NFL silently edited to fix.

[1] https://twitter.com/Komaniecki_R/status/1757074365102084464

elpocko | 1 year ago

Digitally altering audio to make it sound as if a popular singer hit a lot of notes is still fine, though.

yoavz | 1 year ago

Correct, it's the inverse that requires disclosure by YouTube.

Still, I find it interesting. If you can't synthetically alter someone's performance to be "worse", is it OK that the NFL synthetically altered Alicia Keys' performance to be "better"?

For a more consequential example, imagine Biden's marketing team "cleaning up" his speech after he mumbles or trails off on a word, misleading the US public during an election year. Should that be disclosed?

frays | 1 year ago

This is a great example as a discussion point, thank you for sharing.

I will be coming back to this video in several months' time to check whether the "Altered or synthetic content" tag has actually been applied to it. If not, I will report it to YouTube.

ryandrake | 1 year ago

Yeah, it’s a really super example!

However, Auto-Tune has existed for decades. Would it have been better if artists were required to disclose when they used it to correct their singing? I say yes, but reasonable people can disagree!
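For what it's worth, the technical core of Auto-Tune-style correction is easy to sketch: estimate the sung pitch, then quantize it to the nearest note before resynthesizing. Here's a minimal Python illustration of just the quantization step (a hypothetical sketch assuming an equal-tempered scale and an already-detected pitch; real tools also do pitch tracking and phase-coherent resynthesis):

    import math

    def snap_to_nearest_semitone(freq_hz: float, a4_hz: float = 440.0) -> float:
        # Convert the frequency to a fractional MIDI note number,
        # round to the nearest semitone, and convert back to Hz.
        midi = 69 + 12 * math.log2(freq_hz / a4_hz)
        return a4_hz * 2 ** ((round(midi) - 69) / 12)

    # A singer lands slightly flat of A4 (440 Hz):
    sung_hz = 433.0
    print(f"sung {sung_hz} Hz -> corrected {snap_to_nearest_semitone(sung_hz):.1f} Hz")
    # sung 433.0 Hz -> corrected 440.0 Hz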

I wonder if we are going to settle on an AI regime where it’s OK to use AI to deceptively make someone seem “better” but not to deceptively make someone seem “worse.” We are entering a wild decade.