Communications of the ACM

ACM TechNews

Can YouTube Quiet Its Conspiracy Theorists?


Screenshots from conspiracy videos that have appeared on YouTube.

Credit: The New York Times

Researchers at the University of California, Berkeley (UC Berkeley) found that while YouTube has reduced how often its algorithm recommends conspiracy-theory videos, its progress in dealing with conspiracy theories has been uneven, and the service still promotes certain types of fictional stories.

The study examined 8 million recommendations by the video-sharing platform over a 15-month period and found that while YouTube has almost completely removed some conspiracy theories from its recommendations, other falsehoods continue to flourish.

Said UC Berkeley's Hany Farid, "It is a technological problem, but it is really at the end of the day also a policy problem. ... If you have the ability to essentially drive some of the particularly problematic content close to zero, well then you can do more on lots of things."

From The New York Times

Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA

