
YouTube will have over 10,000 moderators in 2018

Will hiring more people solve YouTube's problems?

Throughout 2017, YouTube has been experiencing what many now call the "adpocalypse", prompting the company to invest in a machine learning system to moderate the millions of videos that are uploaded to the platform every year. 

YouTube implemented this system to appease advertisers, helping to prevent ads from being shown beside inappropriate content. Sadly, this system has also caused many prominent YouTubers to have their content demonetised or their monetisation limited for seemingly no reason, leaving many in the position where they need to reach out to fans for revenue using websites like Patreon. 

The machine learning approach used by YouTube has been very successful (from YouTube's perspective), with the company stating that the system flagged 98% of all the videos that were removed for violent extremism. This change has led YouTube's opponents in the mainstream media to drum up more complaints against the company, highlighting inappropriate text in the comments sections of "family friendly" content and a small number of disturbing videos that are disguised as family friendly content.  

To combat these charges, YouTube has decided to increase the size of their moderation team to over 10,000 employees within the next year. These moderators will manually review videos and feed their findings back into YouTube's machine learning algorithm, training the automated system while also allowing more content to be manually reviewed at a time. 

  

   Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content. Since June, our trust and safety teams have manually reviewed nearly 2 million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future.

We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether. In the last few weeks we’ve used machine learning to help human reviewers find and terminate hundreds of accounts and shut down hundreds of thousands of comments. Our teams also work closely with NCMEC, the IWF, and other child safety organizations around the world to report predatory behavior and accounts to the correct law enforcement agencies.


We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.

At the same time, we are expanding the network of academics, industry groups and subject matter experts who we can learn from and support to help us better understand emerging issues.




YouTube has also pledged to improve their system so that fewer videos from creators are demonetised by mistake. The company plans to increase the amount of manual curation it does and apply stricter criteria for monetisation, which should mean fewer videos are demonetised by mistake and fewer videos remain monetised when they shouldn't be. 

  In the past year, we saw a significant increase in bad actors seeking to exploit our platform, from sharing extremist content and disseminating misinformation, to impersonating creators, to spamming our platform with videos that masquerade as family-friendly content, but are not. These actions harm our community by undermining trust in our platform and hurting the revenue that helps creators like you thrive.

In light of this, we’ve just announced new actions to protect our community from inappropriate content. We want to give creators confidence that their revenue won’t be harmed by bad actors while giving advertisers assurances that their ads are running alongside content that reflects their brand’s values.

To do that, we need an approach that does a better job determining which channels and videos should be eligible for advertising. We’ve heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don’t demonetize videos (apply a “yellow icon”) by mistake. We are planning to apply stricter criteria and conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should. This will help limit inaccurate demonetizations while giving creators more stability around their revenue. We will be talking to creators over the next few weeks to hone this new approach.

As I’ve said many times over the years, creators are the lifeblood of YouTube’s community. It is their originality, authenticity, talent and dedication that attracts a global audience to YouTube. It is their passion and presence that turns casual viewers into devoted fans who are eager to learn, share and come together. And it is their inclusiveness that gives over a billion people who visit YouTube every month a place to belong.

  
Given the sheer number of videos on YouTube, it is clear that an all-manual approach to moderation simply cannot work, though this increase in YouTube's moderation staff will undoubtedly help the video platform support their creators while also keeping the platform safe for both users and advertisers. 

You can join the discussion on YouTube's pledge to increase their moderation staff to over 10,000 on the OC3D Forums.


Most Recent Comments

05-12-2017, 11:01:27

VonBlade
Yeah because having that many moderators is guaranteed not to turn into a storm of effluence. I'm sure all 10000 will be vetted and promise not to promote their mates' content nor delete their rivals'.

05-12-2017, 13:09:40

NeverBackDown
It wouldn't be so bad in the first place if everybody didn't get offended so easily

05-12-2017, 13:21:42

Dicehunter
Quote:
Originally Posted by NeverBackDown View Post
It wouldn't be so bad in the first place if everybody didn't get offended so easily
Perfectly put !

Problem is a frighteningly large amount of people now have the mindset of children, Man children and fem babies as I like to call them, They can't handle reality or someone else's opinion so try to block it out or get people with differing opinions shutdown so to protect their own egos.

05-12-2017, 13:25:45

looz
I find likely that more people are gaming the report system to troll / make life difficult for people they don't like, rather than using the system due to being actually offended.

That, and attempts at censorship which luckily demonstrates Streisand effect every so often.

05-12-2017, 15:27:34

wozza365
Anything has surely got to be better than the current system which demonetises videos before they're even uploaded. Most YouTubers have taken it into their routine to appeal their video for monetisation. Of course it being accepted after that appeal.