Communications of the ACM

ACM News

Researchers Have Already Tested YouTube’s Algorithms for Political Bias


Google logo seen during Google Developer Days in Shanghai, China, September 2019.

Motivated by the long-running argument in Washington, DC, over bias in social media, computer scientists at Northeastern University investigated political bias in YouTube's comment moderation.

Credit: Getty Images

In August 2018, President Donald Trump claimed that social media was "totally discriminating against Republican/Conservative voices." There was little new in this: for years, conservatives have accused tech companies of political bias. Just last July, Senator Ted Cruz (R-Texas) asked the FTC to investigate the content moderation policies of tech companies such as Google. A day after Google's vice president insisted that YouTube was apolitical, Cruz claimed that political bias on YouTube was "massive."

But the data doesn't back Cruz up, and it has been available for a while. While the actual policies and procedures for moderating content are often opaque, it is possible to examine the outcomes of moderation and determine whether they show any indication of bias. Last year, computer scientists decided to do exactly that.

From Ars Technica