YouTube will have over 10,000 moderators in 2018
YouTube implemented its automated moderation system to appease advertisers, helping prevent ads from being shown alongside inappropriate content. Sadly, this method has also caused many prominent YouTubers to have their content demonetised or their monetisation limited for seemingly no reason, leaving many in the position where they need to reach out to fans for revenue using websites like Patreon.
The Machine Learning approach used by YouTube has been very successful (from YouTube’s perspective), with the company stating that the system flagged 98% of all the videos that were removed for violent extremism. This change has led YouTube’s opponents in the mainstream media to drum up more complaints against the company, highlighting inappropriate text in the comment sections of “family friendly” content and a small number of disturbing videos disguised as family-friendly content.
To combat these charges, YouTube has decided to increase the size of its moderation team to over 10,000 employees within the next year. These moderators will manually review videos and feed their findings back into YouTube’s Machine Learning algorithm, training the automated system while also allowing more content to be reviewed manually.
  Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content. Since June, our trust and safety teams have manually reviewed nearly 2 million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future.
We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether. In the last few weeks we’ve used machine learning to help human reviewers find and terminate hundreds of accounts and shut down hundreds of thousands of comments. Our teams also work closely with NCMEC, the IWF, and other child safety organizations around the world to report predatory behavior and accounts to the correct law enforcement agencies.
We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.
At the same time, we are expanding the network of academics, industry groups and subject matter experts who we can learn from and support to help us better understand emerging issues.
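For readers curious what this human-in-the-loop process could look like in practice, below is a minimal, purely illustrative Python sketch; it is not YouTube’s actual system, and every name, number and feature in it is hypothetical. It demonstrates the idea described above: a classifier flags the uploads it is least confident about, human reviewers label them, and those labels are fed back to incrementally retrain the model.

```python
# Illustrative sketch only -- NOT YouTube's real pipeline. A model flags uncertain
# "videos", human reviewers label them, and those labels retrain the model.
# Features are faked with random vectors; all names here are hypothetical.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
CLASSES = np.array([0, 1])                 # 0 = acceptable, 1 = policy-violating

# Seed the classifier with a small set of already-labelled examples.
model = SGDClassifier(loss="log_loss")
X_seed = rng.normal(size=(100, 16))
y_seed = rng.integers(0, 2, size=100)
model.partial_fit(X_seed, y_seed, classes=CLASSES)

def human_review(video_features):
    """Stand-in for a manual reviewer's verdict (randomised here)."""
    return int(rng.integers(0, 2))

# New uploads arrive; flag the videos the model is least sure about,
# have reviewers label them, then use the labels for an incremental update.
X_new = rng.normal(size=(500, 16))
uncertainty = np.abs(model.predict_proba(X_new)[:, 1] - 0.5)
flagged = np.argsort(uncertainty)[:50]     # 50 most uncertain videos
labels = np.array([human_review(X_new[i]) for i in flagged])
model.partial_fit(X_new[flagged], labels)  # reviewer findings retrain the model
```

In a real system the features would come from video, audio and metadata analysis rather than random numbers, and reviewer verdicts would be stored for auditing; the point is simply that each manual decision becomes another training example.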
YouTube has also pledged to improve its system so that fewer videos from creators are demonetised by mistake. The company plans to increase the amount of manual curation it does and apply stricter criteria for monetisation, which should mean fewer videos are demonetised in error and fewer videos remain monetised when they shouldn’t be.
 In the past year, we saw a significant increase in bad actors seeking to exploit our platform, from sharing extremist content and disseminating misinformation, to impersonating creators, to spamming our platform with videos that masquerade as family-friendly content, but are not. These actions harm our community by undermining trust in our platform and hurting the revenue that helps creators like you thrive.
In light of this, we’ve just announced new actions to protect our community from inappropriate content. We want to give creators confidence that their revenue won’t be harmed by bad actors while giving advertisers assurances that their ads are running alongside content that reflects their brand’s values.
To do that, we need an approach that does a better job determining which channels and videos should be eligible for advertising. We’ve heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don’t demonetize videos (apply a “yellow icon”) by mistake. We are planning to apply stricter criteria and conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should. This will help limit inaccurate demonetizations while giving creators more stability around their revenue. We will be talking to creators over the next few weeks to hone this new approach.
As I’ve said many times over the years, creators are the lifeblood of YouTube’s community. It is their originality, authenticity, talent and dedication that attracts a global audience to YouTube. It is their passion and presence that turns casual viewers into devoted fans who are eager to learn, share and come together. And it is their inclusiveness that gives over a billion people who visit YouTube every month a place to belong.
Given the sheer number of videos on YouTube, it is clear that an all-manual approach to moderation simply cannot work, though this increase in YouTube’s moderation staff will undoubtedly help the video platform support its creators while also keeping the platform safe for both users and advertisers.
You can join the discussion on YouTube’s pledge to increase their moderation staff to over 10,000 on the OC3D Forums.