YouTube removed more than eight million videos over a three-month period in response to complaints about inappropriate content.
The figure covers October through December 2017, during which an estimated 8.3 million videos were taken down, most for containing adult content.
A machine-learning system built by Google detects such objectionable videos; the majority of deleted videos were spam or attempts to upload adult content.
According to Google, which owns YouTube, takedowns are getting faster, and the system hunts down suspect videos in a number of ways.
Google reported that of 6.7 million machine-detected videos, 75.9 percent had zero views at the time they were reviewed.
Google has improved the system's long-term accuracy at detecting terrorist videos by training the machines to recognize violent extremism, using roughly two million videos that were first reviewed by hand.
Once a bad video is reviewed and taken down, Google saves the clip's details in a mathematical code form, a digital fingerprint, so that the next time someone attempts to upload the same video it is automatically blocked and flagged to YouTube's content reviewers, who will take it down.
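YouTube's actual fingerprinting system is proprietary, but the mechanism described above can be sketched in a few lines. This is a minimal illustration only: it uses a plain SHA-256 hash as a stand-in for the real perceptual fingerprint, and the function and variable names (`fingerprint`, `register_takedown`, `check_upload`, `blocked_fingerprints`) are hypothetical, not part of any Google API.

```python
import hashlib

# Hypothetical store of fingerprints for videos that reviewers removed.
# The real system uses perceptual fingerprints that also match re-encoded
# copies; a cryptographic hash here only catches byte-identical re-uploads.
blocked_fingerprints = set()

def fingerprint(video_bytes: bytes) -> str:
    """Reduce a clip to a fixed-size code (SHA-256 as a stand-in)."""
    return hashlib.sha256(video_bytes).hexdigest()

def register_takedown(video_bytes: bytes) -> None:
    """Record a clip's fingerprint after a reviewer confirms removal."""
    blocked_fingerprints.add(fingerprint(video_bytes))

def check_upload(video_bytes: bytes) -> str:
    """Automatically block a known-bad re-upload; accept anything else."""
    if fingerprint(video_bytes) in blocked_fingerprints:
        return "blocked"
    return "accepted"

# A removed clip re-uploaded byte-for-byte is blocked without review.
register_takedown(b"example removed clip")
print(check_upload(b"example removed clip"))  # blocked
print(check_upload(b"a brand-new clip"))      # accepted
```

The design point is that the expensive step (human review) happens once per video, while every later upload is checked against the stored fingerprints in constant time.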