Facebook is a political battleground where Russian operatives work to influence elections, fake news runs rampant, and political hopefuls use ad targeting to reach swing voters. We have no idea what goes on inside Facebook's insidious black box algorithm, which controls the all-powerful News Feed. Are politicians playing by the rules? Can we trust Facebook to police them? Do we really have any choice?
One emerging way to hold tech companies like Facebook accountable is to turn similar technology back on them: probing that black box from the outside, gathering data and testing hypotheses about what might be going on inside, much as early astronomers inferred the workings of the solar system from observation alone.
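The probing described above can be sketched as a simple black-box audit: you cannot read the algorithm's code, but you can feed it controlled inputs and measure how its outputs change. Everything below is hypothetical, purely for illustration: `opaque_score` stands in for a system whose internals are hidden, and the attribute names and values are invented.

```python
import random

def opaque_score(profile):
    """Stand-in for an algorithm we cannot inspect (hypothetical)."""
    base = 500
    base += profile["age"] * 2
    if profile["zip_code"].startswith("9"):
        base -= 40  # a hidden disparity we hope to detect from the outside
    return base

def audit(attribute, value_a, value_b, trials=100):
    """Black-box probe: vary one attribute, hold the rest fixed, compare outputs."""
    rng = random.Random(0)  # fixed seed so the probe is reproducible
    diffs = []
    for _ in range(trials):
        base = {"age": rng.randint(18, 80), "zip_code": "10001"}
        profile_a = dict(base, **{attribute: value_a})
        profile_b = dict(base, **{attribute: value_b})
        diffs.append(opaque_score(profile_a) - opaque_score(profile_b))
    return sum(diffs) / len(diffs)

gap = audit("zip_code", "90210", "10001")
print(f"average score gap by zip code: {gap:+.1f}")
```

Real audits of this kind face far messier conditions (noisy outputs, rate limits, personalization), but the core idea is the same: paired inputs that differ in exactly one respect, and statistics over many trials to separate signal from noise.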
It's a tactic being pioneered at the nonprofit news organization ProPublica by a team of reporters, programmers, and researchers led by Pulitzer Prize-winning reporter Julia Angwin. Angwin's team specializes in investigating algorithms that affect people's lives, from the Facebook News Feed to Amazon's pricing models to the software that sets people's car insurance rates and even helps decide who goes to prison and for how long. To investigate these algorithms, they've had to develop a new approach to investigative reporting, one that uses technology like machine learning and chatbots.
"The one thing that's been so interesting about the algorithms project that I would never have guessed is that we've ended up having to build algorithms all the time," says Angwin, who has been writing about data and surveillance for more than a decade. It's a resource-intensive, deeply challenging task in a media landscape where few are willing to invest in large projects, but Angwin views her team's reporting as essential to holding big tech companies accountable and providing lawmakers with concrete evidence of wrongdoing. "We're going to get police hats for our New Year's presents," she jokes.