Google-owned YouTube ramps up monitoring of disturbing videos

Gladys Abbott
December 6, 2017

This latest announcement by the company comes in the wake of criticism from British Prime Minister Theresa May, who has repeatedly pressed social media companies to regulate content responsibly following a series of terrorist attacks in the United Kingdom. By training its algorithms to flag other types of videos, such as questionable uploads targeting children, the platform will be able to take them down far faster than it currently can.

Amid what amounts to the second wave of the YouTube 'Adpocalypse', whereby inappropriate content has been discovered to target young children, CEO Susan Wojcicki is personally addressing the video giant's expanded efforts to wipe out policy-violating videos.

The reports led several big brands, including Mars and Adidas, to pull advertising from the site, and YouTube has turned off comments on more than 625,000 videos targeted by alleged child predators. Wojcicki also noted that additional reviewers would help YouTube carry out moderation more effectively and avoid inaccurate demonetizations, giving creators more stability on the revenue front.

She said YouTube would be speaking with advertisers and creators "over the next few weeks" to hone its approach.


In May, Facebook said it would hire 3,000 more people to review videos and posts, and, later, another 1,000 to review ads after discovering Russian ads meant to influence the US presidential election.

"Human reviewers remain essential to both removing content and training machine learning systems because human judgement is critical to making contextualised decisions on content", she said. While some publications are reporting that YouTube will add 10,000 staff, that's not quite right: the post didn't say how many people already work on moderation at YouTube, or how many will be transitioned from other positions. In addition, algorithms flagged about 98% of the videos removed for violent extremism.

In June, YouTube announced that it would take steps to address the problem by improving its use of machine learning to remove controversial videos and by working with 15 new expert groups, including the No Hate Speech Movement, the Anti-Defamation League, and the Institute for Strategic Dialogue.
