YouTube removed more than eight million videos over a three-month period in response to complaints about inappropriate content.
The removals took place between October and December 2017 and totaled an estimated 8.3 million videos.
A Google-built machine-learning system handles much of the work of detecting objectionable videos; the majority of the deleted clips were spam or attempts to upload adult content.
According to Google, which owns YouTube, the process of taking down videos is getting quicker, and the system hunts for dodgy videos in a number of ways.
Google reported that of the 6.7 million videos flagged by machines, 75.9 percent had zero views at the time they were reviewed.
Google says the system has become more accurate over time at detecting terrorist videos: the machines were trained to recognise violent extremism using two million videos that had first been reviewed by hand.
Once a bad video is reviewed and taken down, Google saves the clip's details in the form of a mathematical code, so that the next time someone attempts to upload the same video it is either blocked automatically or flagged to YouTube's content reviewers, who take it down.
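The fingerprint-and-blocklist idea described above can be sketched loosely. This is purely illustrative: YouTube's real system uses proprietary perceptual fingerprinting that survives re-encoding and trimming, but the basic mechanism of remembering a removed clip's code and checking new uploads against it looks like this:

```python
import hashlib

# Fingerprints of videos already reviewed and removed.
# (Illustrative only: the real system uses perceptual fingerprints,
# not an exact byte-for-byte hash like SHA-256.)
blocked_fingerprints = set()

def fingerprint(video_bytes: bytes) -> str:
    """Reduce a video file to a short code identifying its content."""
    return hashlib.sha256(video_bytes).hexdigest()

def take_down(video_bytes: bytes) -> None:
    """After human review, remember the clip so re-uploads are caught."""
    blocked_fingerprints.add(fingerprint(video_bytes))

def on_upload(video_bytes: bytes) -> str:
    """Block known bad videos automatically at upload time."""
    if fingerprint(video_bytes) in blocked_fingerprints:
        return "blocked"
    return "accepted"

take_down(b"previously removed clip")
print(on_upload(b"previously removed clip"))  # blocked
print(on_upload(b"a brand-new clip"))         # accepted
```

In practice an exact hash is trivially defeated by changing a single byte, which is why production systems fingerprint the perceptual content of the video rather than its raw bytes.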