Google-owned YouTube removed about 1.67 million channels for hosting problematic videos, spam, scams or other misleading content in the previous quarter, the company said in its latest transparency report on Thursday.
In the first such report to include information about channel removals, YouTube said 79.6 percent of these channels were taken down for spam, misleading content or scams between July and September, while 12.6 percent were removed for nudity or sexual content.
YouTube had previously released three such reports that contained only statistics about individual videos, without specifying the number of channels taken down from its platform.
It said these channels were terminated for multiple violations of its policies "within a three month period."
During the same period, the world's largest video-sharing network removed 7.8 million videos for similar reasons, and 81 percent of them were first detected by machines. About 74.5 percent of these machine-detected videos were not viewed by the public, YouTube said.
"We terminate entire channels if they are dedicated to posting content prohibited by our Community Guidelines or contain a single egregious violation, like child sexual exploitation," said the company in a statement released on Thursday.
YouTube said it has increasingly turned to artificial intelligence (AI) technology such as machine learning to crack down on pornographic or illegal content on its service.
"We've always used a mix of human reviewers and technology to address violative content on our platform, and in 2017 we started applying more advanced machine learning technology to flag content for review by our teams," it said.
YouTube also disclosed for the first time that over 224 million comments were removed as "likely spam" in the third quarter of 2018, and that 99.5 percent of them were intercepted by the company's automated systems.
YouTube has faced scrutiny over "bad actors" exploiting its service to spread extremist information, radical campaigns or politically motivated content.