YouTube to Rely on Humans with AI to Filter Videos
How would you like a job where you sat all day watching flagged videos and had to decide whether or not to delete them?
Cool, huh? Now how about if your work schedule included 20 or 30 million videos to go through, or even more? Still sound cool? It's a good thing, then, that YouTube, the company that has to sift through that huge number of flagged videos, has developed AI programs to do that very tedious work.
YouTube said it took down 8.28 million videos during the fourth quarter of 2017, and about 80 percent of those videos had initially been flagged by artificially intelligent computer systems.
The new data highlighted the significant role machines play in overseeing the service as it faces criticism over the spread of conspiracy videos, fake news and violent content from extremist organizations.
That data comes from a New York Times article by Daisuke Wakabayashi, who points out that this is a new approach for YouTube: it is the first time the company has publicly disclosed the number of videos it removed in a quarter, which had made it hard to judge how aggressively the platform previously removed content, or the extent to which computers played a part in those decisions.
And after the grilling that Mark Zuckerberg of Facebook went through, it's clear that free speech and privacy are two top subjects on any internet company's agenda for Monday-morning office meetings. Figuring out how to remove unwanted videos, while balancing that with free speech, is a major challenge for the future of YouTube, said Eileen Donahoe, executive director at Stanford University's Global Digital Policy Incubator.
But if predictions are correct, AI will help solve the problem. Facebook has said it expects AI tools to detect fake accounts and fake news on its platform. Critics, however, have warned against depending too heavily on computers to replace human judgment.
In December, Google said it was hiring 10,000 people in 2018 to address policy violations across its platforms. On Monday, YouTube, which Google owns, said it had filled the majority of those jobs, including specialists with expertise in violent extremism, counterterrorism and human rights.