item 16756147


AmIFirstToThink | 8 years ago

I reject any and all violence. The law of the land must take its course.

This woman, who talked about free speech and then committed this crime against YouTube, has seriously set back the arguments people were making for free speech on social media.

I have given this a lot of thought in the past couple of years.

Current situation:

A) A video can get demonetized for any of the following reasons:

1. An advertiser complained about it.

2. A user complained about it.

3. The system/AI flagged it as inappropriate.

4. A human auditor found it inappropriate.

B) Media companies (print/television/online news & press) have targeted ad-tech companies like YouTube, Facebook and others. These media companies have a new way to wage war against YouTube/Facebook: drumming up "ads shown on objectionable content" stories.

C) Groups with an agenda trying to silence opposing viewpoints can do so by reporting content as inappropriate.

D) Truly illegal, criminal content

E) Lack of powerful parental controls on content and on ads

Problem with current situation:

1. An advertiser finding a video objectionable for their ad doesn't mean every advertiser will find the content objectionable. E.g. an insurance company may not want to advertise on a video that glorifies extreme jumping over walls and buildings, but a helmet company or an action-cam company would find it an attractive market to advertise in. Yet when YouTube demonetizes a video, it's demonetized, period.

2. It is ridiculously easy to stop speech on social media by abusing the inappropriate-content flag. Groups with an agenda and a decent enough membership can run a very effective silencing campaign by abusing the report-content button. That someone doesn't like some legal content doesn't mean they should be allowed to keep everyone else from viewing it.

3. YouTube and Facebook don't have or offer a way to fight the "ads shown on objectionable content" attacks. This is especially confusing given that these are tech giants with the might of all the technology and AI they wield.

4. No human recourse for the content creator to plead their case.

5. Demonetizing political commentary is astonishing, given how much advertising CNN, MSNBC, Fox, and the political print media get.

6. Would it be OK if a Conoco gas station stopped hybrid cars from refueling? Would it be OK if Conoco said, why doesn't the hybrid-car owner start her own oil well and refinery? Why is it acceptable for Google to delete accounts with thousands of subscribers? What if you had a Conoco points card with hundreds of dollars on it? Isn't that equivalent to the channel membership that YouTube yanks away from you?

Solution:

1. Allow advertisers to select channels to advertise on. Advertisers opt-in to select channels.

2. Allow advertisers to select channels to specifically not advertise on. Advertisers opt-out of select channels.

3. Protect creators from harassment by targeted campaigns run by people trying to de-platform or silence them. Embrace bubbles: let people opt out of certain tags. E.g. if a user says they don't like meat-preparation videos, then don't show those videos to them. But don't make it easy for people who don't like meat products to deplatform those who do.

4. Apple can put people in front of its customers to serve them. Facebook and Google make enough money to have a customer-facing team for their content creators. They have invested in hiring auditors, but there is no recourse for creators. A channel being banned after gaining thousands of subscribers is akin to PayPal locking your account with thousands of dollars in it.

5. Removing content should depend on the legality (law of the land) of the content alone. Removal of an account must be because of a lawful request from the authorities.

6. "Demonetization" can't be a blanket action. Each advertiser is unique. One advertiser not liking content doesn't make it inappropriate for all advertisers. Protect creators from activist advertisers.

7. These are tech giants we are talking about, with the might of engineering and AI behind them. Letting advertisers opt in to or out of channels should be possible for them. The fact that they haven't done so suggests that they like the power they wield by removing content they themselves don't like, for whatever (monetary/ideological) reasons. They need to prove that they can be trusted with this power. Conoco shouldn't be allowed to refuse gas to a lawyer on her way to a courthouse for an EPA trial, and Google shouldn't be allowed to remove accounts it doesn't like, especially without recourse. A regulatory framework may be needed, as Zuckerberg conceded recently.

8. Allow parents to whitelist channels for kids. Show videos to kids only after 1000+ views from parents who use parental controls. Allow parents to restrict the videos kids can see: parents set the video playlist, and kids can pick only from that playlist. Parents could subscribe to a trusted playlist for kids, e.g. from NPR, NASA, or their local school, activity group or church. The absence of these tools shows YouTube is willing to run videos on autoplay to maximize ads without much regard for an age-appropriate, parent-controlled, safe video experience for kids. A parent should also be able to say they don't want to see ads for a Disney Cruise because they can't afford it.

9. Subscriber portability, just like phone-number portability from carrier to carrier. A content creator should be able to take her subscribers with her to a different service. The social network should show a forwarding address to the subscribers when a content creator leaves one platform for another. This would bring real competition to social platforms.
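The per-advertiser opt-in/opt-out idea in points 1, 2, and 6 above can be sketched in a few lines of code. This is only an illustration of the proposal, not how YouTube actually works; all names (Advertiser, eligible_advertisers, the channel tags) are hypothetical:

```python
# Hypothetical sketch of per-advertiser monetization (points 1, 2, 6):
# instead of a single global "demonetized" flag, each advertiser keeps
# its own opt-in and opt-out channel lists, and a video stays monetized
# as long as at least one advertiser still wants to appear on it.

from dataclasses import dataclass, field

@dataclass
class Advertiser:
    name: str
    opted_in: set = field(default_factory=set)   # channels explicitly chosen
    opted_out: set = field(default_factory=set)  # channels explicitly excluded

def eligible_advertisers(channel: str, advertisers: list) -> list:
    """Advertisers willing to run ads on this channel."""
    return [a.name for a in advertisers
            if channel in a.opted_in and channel not in a.opted_out]

advertisers = [
    Advertiser("InsuranceCo", opted_in={"cooking", "finance"}),
    Advertiser("HelmetCo", opted_in={"parkour", "cycling"}),
    Advertiser("ActionCamCo", opted_in={"parkour"}, opted_out={"cycling"}),
]

print(eligible_advertisers("parkour", advertisers))  # ['HelmetCo', 'ActionCamCo']
print(eligible_advertisers("cooking", advertisers))  # ['InsuranceCo']
# An empty result means no ads run on that channel, but only because no
# advertiser opted in, not because of a blanket demonetization flag.
```

Under this scheme the parkour video from point 1 of the problem list keeps its helmet and action-cam advertisers even after the insurance company opts out.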

Disclaimer: I pay for YouTube Red; I like the ad-free experience and am glad the option exists to buy it. I don't make, and never have made, any money from social media ad platforms. I am a consumer of content, not a creator. As a paying consumer, I want complete freedom to see any legal content that others have created, without an arbitrary filter of policy/guideline/appropriateness that has no basis in the law of the land. I should have the ability to apply filters myself if I choose to; just as there is a marketplace of ideas, there should be a marketplace of filters (whitelists/blacklists).
