Humans can perform many tasks with ease that remain difficult or impossible for computers. Crowdsourcing platforms like Amazon Mechanical Turk make it possible to harness human-based computational power at an unprecedented scale, but their utility as a general-purpose computational platform remains limited. The lack of complete automation makes it difficult to orchestrate complex or interrelated tasks. Recruiting more human workers to reduce latency costs real money, and jobs must be monitored and rescheduled when workers fail to complete their tasks. Furthermore, it is often difficult to predict the length of time and payment that should be budgeted for a given task. Finally, the results of human-based computations are not necessarily reliable, both because human skills and accuracy vary widely, and because workers have a financial incentive to minimize their effort.
We introduce AUTOMAN, the first fully automatic crowdprogramming system. AUTOMAN integrates human-based computations into a standard programming language as ordinary function calls that can be intermixed freely with traditional functions. This abstraction lets AUTOMAN programmers focus on their programming logic. An AUTOMAN program specifies a confidence level for the overall computation and a budget. The AUTOMAN runtime system then transparently manages all details necessary for scheduling, pricing, and quality control. AUTOMAN automatically schedules human tasks for each computation until it achieves the desired confidence level; monitors, reprices, and restarts human tasks as necessary; and maximizes parallelism across human workers while staying under budget.
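The scheduling, pricing, and quality-control loop described above can be illustrated with a toy model. All names here are hypothetical, and the chance-agreement rule is a simplification; AutoMan itself is a Scala DSL whose actual API and statistical machinery differ:

```python
from collections import Counter

def crowd_compute(ask, options, confidence=0.95, budget=1.00, price=0.06):
    """Schedule one 'human task' at a time until the most common answer is
    unlikely, at the given confidence, to be the product of random guessing,
    or until the budget is exhausted. Returns (answer, spent); the answer is
    None if no answer reached the confidence level within budget."""
    alpha = 1.0 - confidence
    k = len(options)
    counts = Counter()
    spent = 0.0
    while spent + price <= budget:
        counts[ask()] += 1          # one (simulated) worker's answer
        spent += price
        answer, m = counts.most_common(1)[0]
        # chance that m independent random guessers all pick the same
        # of k options: k * (1/k)^m = k^(1-m)
        if k ** (1 - m) <= alpha:
            return answer, spent
    return None, spent
```

With a perfectly reliable "worker" (`ask=lambda: "B"`) and four options, four agreeing answers suffice at 95% confidence, so the call returns after spending four times the per-task price. The point of the abstraction is that the caller supplies only the question, confidence, and budget; the loop handles everything else.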
The following letter was published in Letters to the Editor in the November 2016 issue of CACM (http://cacm.acm.org/magazines/2016/11/209131).
Crowdwork promises to help integrate human and computational processes while also providing a source of paid work for those who might otherwise be excluded from the global economy. Daniel W. Barowy et al.'s Research Highlight "AutoMan: A Platform for Integrating Human-Based and Digital Computation" (June 2016) explored a programming language called AutoMan designed to integrate human workers recruited through crowdwork markets like Amazon Mechanical Turk alongside conventional computing resources. The language breaks new ground in how to automate the complicated work of scheduling, pricing, and managing crowdwork.
While the attempt to automate this managerial responsibility is clearly of value, we were dismayed by the authors' lack of concern for those who carry out the actual work. Humans and computers are not interchangeable: minimizing wages is quite different from minimizing execution time. For example, the AutoMan language is designed to minimize crowdwork requesters' costs by iteratively running rounds of recruitment, offering tasks at increasing wages. What counts as optimal, however, looks quite different from the workers' perspective than from the requesters'; the process is clearly not optimized for economic fairness. Systems that minimize payments could exert downward pressure on crowdworker wages, failing to account for the complexities of, say, Mechanical Turk as a global labor market.
Recent research published in the proceedings of the Computer-Human Interaction and Computer-Supported Cooperative Work conferences by Lilly Irani, David Martin, Jacki O'Neill, Mary L. Gray, Aniket Kittur, and others shows that crowdworkers are not interchangeable cogs in a machine but real humans, many of whom depend on crowdwork to make ends meet. Designing for workers as active, intelligent partners in the functioning of crowdwork systems has great potential. Two examples of researchers collaborating with crowdworkers are the Turkopticon system, introduced by Irani and Silberman,(1) which allows crowdworkers to review crowdwork requesters, and Dynamo, presented by Salehi et al.,(2) which supports discussion and collective action among crowdworkers. Both projects demonstrate how crowdworkers can be treated as active partners in improving crowdwork marketplaces.
We hope future coverage of crowdwork in Communications will include research that incorporates workers' perspectives into the design of such systems. This would help counteract the risk of creating programming languages that actively, even if unintentionally, accentuate inequality and poverty. At a time when technology increasingly influences political debate, social responsibility is more important than ever in computer science.
Barry Brown and Airi Lampinen
(1.) Irani, L.C. and Silberman, M.S. Turkopticon: Interrupting worker invisibility in Amazon Mechanical Turk. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (Paris, France, Apr. 27–May 2). ACM Press, New York, 2013, 611–620.
(2.) Salehi, N., Irani, L.C., Bernstein, M.S. et al. We are Dynamo: Overcoming stalling and friction in collective action for crowd workers. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (Seoul, Republic of Korea, Apr. 18–23). ACM Press, New York, 2015, 1621–1630.
We share the concerns Brown and Lampinen raise about crowdworker rights. In fact, AutoMan by design automatically addresses four of the five issues raised by workers, as described by Irani and Silberman in the letter's Reference 1: AutoMan never arbitrarily rejects work; pays workers as soon as their work is completed; pays the U.S. minimum wage by default; and automatically raises pay for tasks until enough workers agree to take them. Our experience suggests that workers appreciate this: they consistently rate AutoMan-generated tasks highly on Turkopticon, the requester-reputation site.
Daniel W. Barowy, Charles Curtsinger, Emery D. Berger, and Andrew McGregor