YouTube is a valuable resource, providing thousands of hours of educational and entertaining content, but in recent months a darker side of the platform has become more visible. In response, the company is introducing tougher restrictions on advertising and making greater use of smart technology, though it has not said how many people now monitor YouTube for offensive videos.
In an interview with The Daily Telegraph on Monday, Susan Wojcicki, chief executive of the Google-owned video-sharing site, said that "bad actors are exploiting" YouTube to "mislead, manipulate, harass or even harm".
"We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether", Wojcicki said.
YouTube has grappled with a series of controversies this year concerning videos available on its platform.
The company announced plans to widen the use of technology that identifies extremist- and terrorist-related videos, bring more experts into its programme for identifying problematic videos, toughen its rules on content that does not clearly violate YouTube's policies, and expand its role in countering radical movements.
To combat this issue, the video-hosting platform intends to "apply stricter criteria and conduct more manual curation", while simultaneously boosting its team of human reviewers "to ensure ads are only running where they should".
The company isn't just banking on more human intervention, however.
According to the statement, 98% of the videos that YouTube removes for violent extremism are flagged by its machine-learning algorithms, and almost 70% of these are removed within eight hours of upload.
Human review is also helping to train the company's machine-learning technology to identify similar videos, enabling staff to remove almost five times as many videos as before, she said.
Wojcicki said it would have taken 180,000 people working 40-hour weeks to assess the same amount of content.