Morality on the road: Should machine drivers be more utilitarian than human drivers?

Cognition

The Uehiro Oxford Institute, University of Oxford, Oxford, UK; Department of Economics, University of Exeter, Exeter, UK; Center for Humans and Machines, Max Planck Institute for Human Development, Berlin, Germany.

Published: January 2025

Machines powered by artificial intelligence have the potential to replace or collaborate with human decision-makers in moral settings. In these roles, machines would face moral tradeoffs, such as automated vehicles (AVs) distributing inevitable risks among road users. Do people believe that machines should make moral decisions differently from humans? If so, why? To address these questions, we conducted six studies (N = 6805) to examine how people, as observers, believe human drivers and AVs should act in similar moral dilemmas and how they judge their moral decisions. In pedestrian-only dilemmas where the two agents had to sacrifice one pedestrian to save more pedestrians, participants held them to similar utilitarian norms (Study 1). In occupant dilemmas where the agents needed to weigh the in-vehicle occupant against more pedestrians, participants were less accepting of AVs sacrificing their passenger compared to human drivers sacrificing themselves (Studies 1-3) or another passenger (Studies 5-6). The difference was not driven by reduced occupant agency in AVs (Study 4) or by non-voluntary occupant sacrifice in AVs (Study 5), but rather by the perceived social relationship between AVs and their users (Study 6). Thus, even when people adopt an impartial stance as observers, they are more likely to believe that AVs should prioritize serving their users in moral dilemmas. We discuss the theoretical and practical implications for AV morality.

Source: http://dx.doi.org/10.1016/j.cognition.2024.106011

Publication Analysis

Top Keywords (frequency):

moral decisions (8)
human drivers (8)
moral dilemmas (8)
dilemmas agents (8)
pedestrians participants (8)
avs study (8)
avs (7)
moral (6)
morality road (4)
road machine (4)
