For most of us, our experience on Facebook is a benign — even banal — one. A status update about a colleague's commute. A "friend" request from someone we haven't seen for years. A picture of another friend's baby, barely distinguishable from the dozen posted the day before.
Some four billion pieces of content are shared every day by 845 million users. And while most are harmless, it has recently come to light that the site is brimming with pedophilia, pornography, racism and violence — all moderated by outsourced, poorly vetted workers in third-world countries paid just $1 an hour.
Beyond the questionable morality of a company that is about to create 1,000 millionaires when it goes public while paying such paltry sums, there are significant privacy concerns for the rest of us. Although this invisible army of moderators receives basic training, its members work from home, do not appear to undergo criminal background checks, and have worrying access to users' personal details.
From The Telegraph