
Software Aims to Ensure Fairness in Crowdsourcing Projects

The debate rages on about whether crowdsourcing is a win-win for workers, as well as for employers.

Are workers who participate in the highly distributed microlabor online system known as crowdsourcing treated fairly? And what about the crowdsourcing employers?

The debate is not new, but a paper presented in April at the ACM SIGCHI Conference on Human Factors in Computing Systems in Paris is likely to heat up the discussion considerably. One of the paper’s two authors urges computing professionals to take a harder look at crowdsourcing, a market estimated to reach tens of millions of people annually, and to think not just about the technology that makes it possible, but also about the human workers and how they are affected.

That author, Lilly Irani, a researcher and Ph.D. candidate in the Department of Informatics at the University of California, Irvine, also developed Turkopticon, a software tool she released in February 2009, designed, she says, "as an ethically motivated response to crowdsourcing workers’ invisibility."

She recalls being troubled by what she perceived as the plight of workers paid just a few dollars an hour to perform tasks on Amazon Mechanical Turk (MTurk), the online crowdsourcing marketplace Amazon.com launched in November 2005 to connect "requesters" (employers) holding large volumes of microtasks, or HITs (Human Intelligence Tasks), with workers willing to complete them. Indeed, one complaint heard over and over again is that requesters can walk away with submitted work without paying for it, because Amazon leaves payment entirely to the discretion of the employers, who can simply claim they are unhappy with the quality of the work.

"There is no process for the employer to justify its decisions to workers or to Amazon," she says.

And so Irani created Turkopticon, a software tool cheekily named after the panopticon, a prison design featuring a guard tower in which there may or may not be a guard. The mere possibility of surveillance induces prisoners to discipline themselves, says Irani.


"Our hope is that Turkopticon will not only hold employers accountable, but also induce better behavior."


Functionally, Turkopticon is a browser extension. When workers search MTurk for HITs, it "scrapes the requester ID that is within the HITs market page list and inserts any reviews that other workers have written about that particular requester, so that when a worker is deciding whether they want to take the assignment or not, they can also review a quick summary of what other workers have said. Our hope is that Turkopticon will not only hold employers accountable, but also induce better behavior," she explains.
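
In rough outline, that mechanism can be sketched as a small piece of browser-extension code. The sketch below is a hypothetical illustration of the idea Irani describes, not Turkopticon’s actual code; the CSS selectors, the ReviewSummary shape, and the reviews endpoint are placeholders invented for the example.

```typescript
// Hypothetical content-script sketch of the idea described above; the
// selectors, data shape, and URL are placeholders, not Turkopticon's code.

interface ReviewSummary {
  requesterId: string;
  averageRating: number; // aggregate of worker-submitted ratings
  reviewCount: number;
}

// Hypothetical endpoint holding aggregated worker reviews per requester.
async function fetchReviews(requesterId: string): Promise<ReviewSummary | null> {
  const resp = await fetch(`https://reviews.example.org/requesters/${requesterId}`);
  return resp.ok ? ((await resp.json()) as ReviewSummary) : null;
}

// Walk the HIT listing, read each requester ID, and inject a review summary
// next to it so a worker can see it before accepting the assignment.
async function annotateHitList(): Promise<void> {
  const idCells = document.querySelectorAll<HTMLElement>(".hit-row .requester-id");
  for (const cell of Array.from(idCells)) {
    const id = cell.textContent?.trim();
    if (!id) continue;
    const summary = await fetchReviews(id);
    if (!summary) continue;
    const badge = document.createElement("span");
    badge.textContent = ` [${summary.averageRating.toFixed(1)}/5 from ${summary.reviewCount} reviews]`;
    cell.insertAdjacentElement("afterend", badge);
  }
}

void annotateHitList();
```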

Turkopticon reportedly gets 120,000 page views per month and has been installed almost 10,000 times. But has it made a difference?

"I can’t say that crowdsource wages have gone up," Irani says. "But I’ve heard requesters say at crowdsourcing conferences that it’s become important for them to establish good reputations with workers. If they put out tasks that aren’t clear or if they upset workers by not paying them, they sometimes need to create a new account and start from scratch. They find that treating workers badly can certainly raise the cost of doing business."

Luis von Ahn, a professor of computer science at Carnegie Mellon University and a crowdsourcing expert, says Irani is one of the few people trying to stop abuse on the part of the employers. Much more frequently, the abuse that takes place is on the part of the workers who try to game the system, hoping to get paid for little or no effort, he observes.

"Imagine a crowdsource task that pays people to look at images and tag each one with a description," he explains. "A worker can just, say, hit the ‘F’ key a few times and hope to get paid for that useless input."

Employers use various techniques to limit such shenanigans—either by refusing to pay for a response unless at least one other worker agrees with that response, or by testing each worker to make sure they are capable of doing the task.
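
The first of those techniques, agreement checking, amounts to a simple counting rule. The sketch below is a generic illustration of that rule under the simplifying assumption that answers are exact labels; it is not drawn from any particular platform’s code.

```typescript
// Generic agreement check (illustrative only): a worker's answer is accepted
// for payment only if at least one other worker gave the same label for the
// same item, i.e., at least two workers agree in total.

type Answer = { workerId: string; itemId: string; label: string };

function acceptedAnswers(answers: Answer[], minAgreement = 2): Answer[] {
  // Count how many workers gave each (item, label) combination.
  const counts = new Map<string, number>();
  for (const a of answers) {
    const key = `${a.itemId}::${a.label}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  // Keep only answers that reached the agreement threshold.
  return answers.filter(
    (a) => (counts.get(`${a.itemId}::${a.label}`) ?? 0) >= minAgreement
  );
}
```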

Von Ahn estimates that, on Mechanical Turk, 10%–20% of the workers try to cheat in some form or another. Far fewer employers cheat, he maintains.

"We’re talking about workers who may be in some other country, often India, who can easily remain anonymous, and so it’s easy for them to cheat," von Ahn explains. "On the other hand, the employer is usually a university or a large company like Google and is much less prone to cheating because they have a lot more to lose."

Von Ahn does agree with Irani on one count: "She built Turkopticon to protect the little guy, and that’s a good thing, I think," he says. "When the little guy, the worker, gets cheated out of the buck or two per hour that he makes—and that is what most of these crowdsource workers do make—that’s a lot nastier than if the employer loses the money. Because a few bucks mean a lot more to the guy who lives in India than to the employer."

Anand Kulkarni agrees. He is the CEO of MobileWorks, a competitor to MTurk that bills itself as an online labor platform designed to put the workers’ interests first.

"MobileWorks is predicated on the idea that if we pay workers more and let them work under their real names in a collaborative environment that is more similar to a real-world workplace, they will perform better and deliver better-quality results compared to anonymous online work systems that are balanced against the workers," he explains.

MobileWorks charges its small-business and corporate clients directly for its services and for use of the crowd, says Kulkarni. "We shield them from the complexity of interacting with the crowd themselves and we charge for doing so."

He compares elements of what MobileWorks does to "a digital union of sorts." When a task is posted, his team ensures it carries a "meaningful price based on the prevailing wages in the location of the workers wanted," and it makes sure the workers get paid once the work has been completed.

Workers who join the platform are trained, receive certifications for certain skills, and then MobileWorks vouches for them and helps them graduate into the broader economy where they can find jobs, either through MobileWorks or on other sites.

As with Turkopticon, MobileWorks workers are encouraged to provide feedback on requesters.

Kulkarni believes that, despite efforts to eliminate abuses on MTurk, they continue to exist. On the other hand, he says, more and more alternative crowdsourcing systems are making efforts to develop solutions to these problems.

"On MobileWorks, we explicitly prevent these kinds of abuses by employers," he says, "while others—like Elance.com, Freelancer.com, and oDesk.com—have built-in mechanisms to mediate between employers and employees."


"Suddenly, when you need a whole bunch of people to do something useful, platforms like ours let you organize them very well."


Still, not everyone in the crowdsourcing space believes abuses are a major concern.

At CrowdFlower, founder and CEO Lukas Biewald observes that crowdsourcing gives people the opportunity to choose those tasks they want to perform when they want to perform them, and earn money or coupons or in-game currency for completing those tasks.

"That seems like a good deal to me," he says. "We have over four million people working on CrowdFlower syndicated tasks, and I log into our partner sites all the time to make sure that it’s a good experience for everyone. There are bound to be some complaints and some issues, but we work as hard as we can to resolve the issues as quickly as possible."

CrowdFlower’s business model is different from that of MobileWorks; instead of posting jobs on its own site, it uses an API to syndicate tasks from customers and then farm them out to thousands of partner sites, one of them being MTurk.

Like Turkopticon and MobileWorks, CrowdFlower uses technology—but mainly to guarantee the quality of the work for the employer.

Biewald explains that his background in AI enabled him to write the statistical algorithms that predict the likelihood a task will be done correctly, and how many checks and redundancies need to be built into the tasks to produce optimal results.

"We have the work history of the people who sign on," he says, "so many of our decisions are based on their historical performance. For instance, if we think someone is going to be 90% accurate and the customer requires 99% accuracy, we know we need to have a second person check the work."
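
The arithmetic behind that decision can be illustrated with a toy model. The sketch below assumes, unrealistically, that worker errors are independent; it is a back-of-the-envelope illustration of why a 90%-accurate workforce needs a second reviewer to hit a 99% target, not CrowdFlower’s actual algorithm.

```typescript
// Illustrative redundancy estimate (not CrowdFlower's algorithm): assuming
// worker errors are independent, the chance that n workers are all wrong on
// the same item is (1 - accuracy)^n, so reviewers are added until that falls
// below the customer's error budget.

function reviewersNeeded(workerAccuracy: number, requiredAccuracy: number): number {
  const errorBudget = 1 - requiredAccuracy; // e.g., 0.01 for a 99% target
  let n = 1;
  let allWrong = 1 - workerAccuracy;        // chance a single worker errs
  while (allWrong > errorBudget) {
    n += 1;
    allWrong *= 1 - workerAccuracy;         // chance all n workers err
  }
  return n;
}

// 90%-accurate workers and a 99% accuracy requirement yield 2 reviewers,
// matching the "second person checks the work" example above.
console.log(reviewersNeeded(0.9, 0.99)); // 2
```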

Also, he says, there are other indicators—if a person works too quickly or has never done a similar task before, the software sends up a flag.

Biewald believes few workers are dissatisfied, and the focus of those exploring crowdsourcing should be on the good it does, not just for businesses but also for the general population.

"I’m sure you can find someone, somewhere, who didn’t have a good experience," he says, "but if you interview workers—and I do that all the time—they are mostly really happy. I mean, the system wouldn’t work if people didn’t want to work for us."

He relates one task that began shortly after the 2010 earthquake in Haiti. The U.S. State Department had set up a number to which Haitians could send text messages to report emergencies. As it turned out, the messages were in Creole and, to make matters worse, were written in text message slang that prevented Google from translating them. Because the State Department did not have enough bilingual people to read through the huge volume of messages it was receiving, it contracted with CrowdFlower to seek out crowdworkers who spoke Creole and could translate—and they did so on a voluntary basis.

"Suddenly, when you need a whole bunch of people to do something useful," says Biewald, "platforms like ours let you organize them very well."

So the debate continues—between those who believe the crowdsourcing technology is enabling employers to take advantage of workers…and those, like Biewald, who see crowdsourcing opening up new opportunities for employers and workers alike.

Indeed, says Biewald, the future of crowdsourcing is generally trending away from very simple tasks and toward higher-level specialized tasks.

Carnegie Mellon’s von Ahn agrees, citing the Haitian scenario as just one example of what crowdsourcing can do, beyond having people tag photos and check websites for duplicate images. Another example, he says, is how crowdsourcing was used to translate tweets during the Arab Spring.

"Much of the discussion in the crowdsourcing space is by computer professionals and employers who are trying to make the work more efficient by making the workers more efficient and by making the tasks more exact," he says. "Perhaps there should be more focus on that, and less on how to get around the minimum wage laws."


Further Reading

von Ahn, L.,
"Human Computation," (video), August 22, 2012, http://www.youtube.com/watch?v=tx082gDwGcM

Cushing, E.,
"Dawn Of The Digital Sweatshop," East Bay Express, August 1, 2012, http://www.eastbayexpress.com/oakland/dawn-of-the-digital-sweatshop/Content?oid=3301022

Kittur, A., Nickerson, J., Bernstein, M., Gerber, E., Shaw, A., Zimmerman, J., Lease, M., and Horton, J.,
"The Future of Crowd Work," published February 2013 at the 16th ACM Conference on Computer Supported Cooperative Work, http://hci.stanford.edu/publications/2013/CrowdWork/futureofcrowdwork-cscw2013.pdf

Neumann, E.,
"Tech Company CrowdFlower Denies Labor Violation," MissionLocal, December 4, 2012, http://missionlocal.org/2012/12/tech-company-crowdflower-denies-underpaying-workers/

Ipeirotis, P.,
"Mechanical Turk: Now with 40.92% spam," December 16, 2010, http://www.behind-the-enemy-lines.com/2010/12/mechanical-turk-now-with-4092-spam.html

Irani, L., and Silberman, M.,
"Turkopticon: Interrupting Worker Invisibility in Amazon Mechanical Turk," April 2013, ACM SIGCHI Conference on Human Factors in Computing Systems, http://www.ics.uci.edu/~lirani/Irani-Silberman-Turkopticon-camready.pdf

Van Pelt, C., and Sorkin, A.,
"Designing a Scalable Crowdsourcing Platform," 2012 ACM SIGMOD International Conference on Management of Data, http://dl.acm.org/citation.cfm?id=2213951

Biewald, L.,
"Massive Multiplayer Human Computation for Fun, Money, and Survival," ICWE 2011, http://link.springer.com/chapter/10.1007%2F978-3-642-27997-3_18?LI=true#

Hester, V., Shaw, A., and Biewald, L.,
"Scalable crisis relief: Crowdsourced SMS translation and categorization with Mission 4636," ACM Symposium on Computing for Development 2010, http://dl.acm.org/citation.cfm?id=1926199

