Communications of the ACM

ACM Careers

Faster Robots Demoralize Co-Workers


[Illustration: a robot with more winnings in a human-robot competition]

It's not whether you win or lose; it's how hard the robot is working.

A Cornell-led team has found that when robots are beating humans in contests for cash prizes, people consider themselves less competent and expend slightly less effort—and they tend to dislike the robots.

The study, "Monetary-Incentive Competition Between Humans and Robots: Experimental Results," brought together behavioral economists and roboticists to explore how a robot's performance affects humans' behavior and reactions when they're competing against each other simultaneously.

The findings validate behavioral economists' theories about loss aversion, which predict that people won't try as hard when their competitors are doing better, and suggest how workplaces might optimize teams of people and robots working together.

"Humans and machines already share many workplaces, sometimes working on similar or even identical tasks," says Guy Hoffman, assistant professor in Cornell's Sibley School of Mechanical and Aerospace Engineering. Hoffman and Ori Heffetz, associate professor of economics in Cornell's Samuel Curtis Johnson Graduate School of Management, are senior authors of the study, which was presented at HRI 2019, the 14th Annual ACM/IEEE International Conference on Human-Robot Interaction.

"Think about a cashier working side-by-side with an automatic check-out machine, or someone operating a forklift in a warehouse, which also employs delivery robots driving right next to them," Hoffman says. "While it may be tempting to design such robots for optimal productivity, engineers and managers need to take into consideration how the robots' performance may affect the human workers' effort and attitudes toward the robot and even toward themselves. Our research is the first that specifically sheds light on these effects."

Alap Kshirsagar, a doctoral student in mechanical engineering, is the HRI 2019 paper's first author. Other authors are Bnaya Dreyfuss and Guy Ishai, economics graduate students at the Hebrew University of Jerusalem.

In the study, humans competed against a robot in a tedious task—counting the number of times the letter G appears in a string of characters, and then placing a block in the bin corresponding to the number of occurrences. The person's chance of winning each round was determined by a lottery based on the difference between the human's and the robot's scores: if their scores were equal, the human had a 50 percent chance of winning the prize, and that likelihood rose or fell depending on which competitor was doing better.

To make sure competitors were aware of the stakes, the screen indicated their chance of winning at each moment.

For the behavioral economists, the study offered an opportunity to test theories about loss aversion in a controlled setting; the effort of two humans in competition can't be controlled, but a robot's effort can. It also showed how loss aversion might affect humans' effort in a simultaneous competition, which had not been previously studied.

"The beauty of this project is that it is the birth of a true collaboration across engineering and economics—one of the things Cornell is good at," says Heffetz, who is also an associate professor of economics at the Hebrew University. "We tried to find questions that interest both crowds, and then we tried to design an experiment that gets the economics right, and is feasible from a human-robot interaction point of view."

After each round, participants filled out a questionnaire rating the robot's competence, their own competence, and the robot's likability. The researchers found that as the robot performed better, people rated its competence higher, its likability lower, and their own competence lower.

"We were surprised that people found themselves less competent against a fast, competitive robot, even though there's no direct interaction," Kshirsagar says. "The robot is doing its own work, you're doing your own work."

Most participants did not seem to anthropomorphize the robot, with comments including, "I sort of realized, I am just competing with an idea of mechanization, and the arm is just a prop to signify it." Another participant wrote, "It was obvious when the robot was going easy on me." In fact, the robot's efforts varied by round but did not change within each round.

Researchers were surprised that the value of the cash prize did not appear to significantly influence people's efforts, though previous experiments suggested people would try harder as the value rose.

The researchers plan to explore the reason for that in future work, but say participants may have been so focused on winning they didn't care about the actual prize value.

The research was partly supported by the Israel Science Foundation.


 
