CALIFORNIA, U.S. - Mired in scandal over the spread of disturbing videos aimed at children, YouTube has said that it is hiring over 10,000 people to address the issue.
The new moderators will try to stop the spread of bizarre, potentially damaging videos aimed at children across its site.
The site has been increasingly criticized in recent weeks for hosting videos that seem to target children but in fact show graphic, extremist and violent content.
The videos, believed to contain footage from TV shows for children, focus on well-known characters or actively claim to show things aimed at kids.
However, when viewers click through, these videos show content that may not even be suitable for adults, including footage of children being forced to pretend to be sick.
According to reports, many of the videos appear to be generated by bots that pick out heavily searched terms and create new videos.
Such videos often turn out to include extreme violence or other disturbing content.
Many others are reportedly made by people who have been accused of abusing their children on video for clicks.
YouTube has said that it is proud of its success in developing software that can identify extremist videos and those linked to terrorism.
Now, the website claims that it will use that same technology to search out other problem videos, like those that target children.
It has said that once such videos are found, it will stop their uploaders from being paid ad money and may even remove them from the service.
The new staff being hired will help the website identify problem videos when they are posted; these videos will then be fed into machine learning algorithms so that the software can learn to pick them out on its own.
YouTube CEO Susan Wojcicki said in one of a pair of blog posts that the goal is to bring the total number of people across Google working to address content that might violate its policies to over 10,000 in 2018.
Wojcicki said, "We need an approach that does a better job determining which channels and videos should be eligible for advertising. We've heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don't demonetize videos by mistake."
Wojcicki added that the company would take "aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether."
The moves come after advertisers, regulators and advocacy groups expressed concern over whether YouTube's policing of its service is sufficient.
In response, YouTube said that it was reviewing its advertising offerings, and it hinted that its next efforts could include further changes to the requirements for sharing in ad revenue.