Robots can encourage humans to take greater risks in a simulated gambling scenario than they would if the robot is silent, research shows. Increasing understanding of whether robots can affect risk-taking could have clear ethical, practical, and policy implications, according to "The Robot Made Me Do It: Human-Robot Interaction and Risk-Taking Behavior," published in Cyberpsychology, Behavior, and Social Networking.
The research involved 180 undergraduate students taking the Balloon Analogue Risk Task (BART), a computer assessment that asks participants to press the spacebar on a keyboard to inflate a balloon displayed on the screen. With each press of the spacebar, the balloon inflates slightly, and 1 penny is added to the player's "temporary money bank." A balloon can explode at random, in which case the player loses any money earned for that balloon; alternatively, the player can "cash in" before this happens and move on to the next balloon.
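The mechanics described above can be sketched in a few lines of Python. This is an illustrative simulation only, not the researchers' implementation; the explosion threshold and maximum pump count are hypothetical parameters, since the article does not specify them.

```python
import random

def play_balloon(pumps, max_pumps=32, rng=None):
    """Simulate one BART balloon: each pump adds 1 cent to the temporary
    bank, but the balloon may pop at a hidden random threshold.
    Returns the cents banked (0 if the balloon exploded)."""
    rng = rng or random.Random()
    pop_point = rng.randint(1, max_pumps)  # hidden explosion threshold
    for pump in range(1, pumps + 1):
        if pump >= pop_point:
            return 0       # balloon exploded: temporary bank is lost
    return pumps           # cashed in before the balloon popped

# A risk strategy is simply how many pumps to attempt per balloon.
rng = random.Random(42)
total = sum(play_balloon(pumps=10, rng=rng) for _ in range(30))
```

Pumping more per balloon raises the potential payout but also the chance of losing that balloon's bank, which is the risk trade-off the study measured.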
One-third of the participants took the test in a room on their own, one-third took the test alongside a robot that provided only the instructions and was silent the rest of the time, and the final third took the test with a robot that provided the instructions and also spoke encouraging statements such as, "Why did you stop pumping?"
The results showed that the group encouraged by the robot took more risks, blowing up their balloons significantly more frequently than participants in the other groups did. They also earned more money overall.
"On the one hand, our results might raise alarms about the prospect of robots causing harm by increasing risky behavior," says Yaniv Hanoch, associate professor in risk management at the University of Southampton, who led the study. "On the other hand, our data point to the possibility of using robots, and AI, in preventive programs such as anti-smoking campaigns in schools, and with hard-to-reach populations, such as addicts."
From University of Southampton