Many technical and psychological challenges make it difficult to design machines that effectively cooperate with people. To better understand these challenges, we conducted a series of studies investigating human-human, robot-robot, and human-robot cooperation in a strategically rich resource-sharing scenario, which required players to balance efficiency, fairness, and risk. In these studies, both human-human and robot-robot dyads typically learned efficient and risky cooperative solutions when they could communicate. In the absence of communication, robot dyads still often learned the same efficient solution, but human dyads achieved a less efficient (less risky) form of cooperation. This difference in how people and machines treat risk appeared to discourage human-robot cooperation, as human-robot dyads frequently failed to cooperate without communication. These results indicate that machine behavior should better align with human behavior, promoting efficiency while simultaneously considering human tendencies toward risk and fairness.

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7797565
DOI: http://dx.doi.org/10.1016/j.isci.2020.101963
