YouTube Bans "Hateful" Videos From Platform

Article:

YouTube said it is stepping up efforts to scrub hateful content from its platform, including videos that deny historical events like the Holocaust, taking on more of the task of judging the validity of information on its popular video-streaming site.

The unit of Alphabet Inc.’s Google will bar videos claiming that any group is superior to others to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status, YouTube said in a blog post Wednesday. It said this would include videos that promote or glorify Nazi ideology, for example, and could affect thousands of channels.

The company also said it would remove content that falsely denied the existence of factual violent incidents, like the shooting at Sandy Hook Elementary School in Connecticut and the Nazis’ mass murder of Jews and other groups in World War II.

YouTube didn’t list the names of specific channels or videos that would be affected by the new policy.

“We recognize some of this content has value to researchers and NGOs looking to understand hate in order to combat it, and we are exploring options to make it available to them in the future,” the company said.

This announcement is the latest step by YouTube to respond to extremism and misinformation on its platform. Those efforts have thrust it further into controversial decisions over which content is acceptable, but haven’t inoculated it against criticism that the video site abounds with damaging information.

For years, YouTube, along with social-media companies Facebook Inc. and Twitter Inc., sought to skirt the task of adjudicating content, justifying the hands-off approach with the view that their sites and apps are content-neutral technology platforms.

But mounting public anger over the prevalence of toxic content online and its effect on society has spurred calls for more action from the companies.

A recent survey from the Anti-Defamation League found that 37% of Americans experienced online hate and harassment in 2018. Some 17% of all users encountered hate and harassment on YouTube, specifically, according to the survey.

ADL Chief Executive Jonathan Greenblatt complimented the latest move, but called it insufficient. In a statement, he encouraged “many more changes from YouTube and other tech companies to adequately counter the scourge of online hate and extremism.”

Previous steps YouTube has taken to address these issues include removing content that is spammy or promotes violence, reducing the spread of “borderline” content that nearly violates its rules, and promoting videos from authoritative creators.

Facebook and Twitter also have taken steps to remove and reduce the spread of spammy and abusive posts, although they have stopped short of removing content for being inaccurate about well-documented events.

Facebook has focused on marking suspected fake information with additional context and stopping the spread of certain kinds of fake information, while Twitter avoids making determinations about the veracity of statements on its platform.

YouTube’s new rules won’t be absolute. Some videos could remain up, YouTube said, if they discuss topics like pending legislation, aim to condemn or expose hate, or provide analysis of current events.

The company said it would begin enforcing the updated policy Wednesday, but that it would take time for its systems to fully ramp up.

Policing content brings its own problems for the companies. Any steps to remove material that some deem harmful risk triggering complaints of bias by others. Some conservative lawmakers claim that efforts to moderate content disproportionately affect them, while some experts who study the spread of toxic content say a disproportionate share of it on these sites has a right-wing bent.

More aggressive moderating also reinforces users’ expectations that they won’t face hate and harassment on the platforms, leading to additional questions about why some apparently abusive content is allowed and other material isn’t.

The control that the big internet platforms have over what content they carry takes on added significance amid growing concern from regulators and lawmakers that the tech giants broadly wield too much power.

YouTube also has to factor in the concerns of advertisers, who covet the huge and devoted audience that the platform brings but don’t want their brands to be associated with hate or extremism.

Getting the balance right is crucial for Google’s growth. Executives at the online-advertising behemoth have said new ads on YouTube videos are critical to the conglomerate’s future. On the most recent earnings call, they warned that growth was decelerating by some metrics. Alphabet’s stock took its biggest dive in nearly seven years after that report.
