Automation reliability and transparency are key factors in trust calibration and, as such, can have distinct effects on human reliance behaviour and mission performance. One question that remains unexplored is: what are the implications of reliability and transparency on trust calibration in human-swarm interaction? We investigate this question in the context of human-swarm interaction, as swarm systems are becoming increasingly popular for their robustness and versatility. Thirty-two participants performed swarm-based tasks under different reliability and transparency conditions. The results indicate that trust, whether reliability- or transparency-based, is associated with higher reliance rates and shorter response times. Reliability-based trust is negatively correlated with correct rejection rates, while transparency-based trust is positively correlated with these rates. We conclude that reliability and transparency have distinct effects on trust calibration. Findings from our human experiments suggest that transparency is a necessary design requirement if and when humans need to be involved in the decision loop of human-swarm systems, especially when swarm reliability is high.

Abbreviations: HRI: human-robot interaction; IOS: inter-organisational systems; LMM: linear mixed models; MANOVA: multivariate analysis of variance; UxV: heterogeneous unmanned vehicles; UAV: unmanned aerial vehicle.

Source DOI: http://dx.doi.org/10.1080/00140139.2020.1764112
