Machine learning (ML) is increasingly prevalent in mental health care, with contemporary initiatives leveraging these technologies, sometimes in combination with wearable devices, toward novel interventions. This paper investigates the development of one such system, using a conversation analysis approach to better understand the work of "guides," a form of labor involved in the trial's implementation, and how people are trained for this role. Guides are assigned participants with whom they meet one-on-one over the course of the behavioral modification intervention. Guides are described in advance as an easily replaceable component of the trial. While their work appears sophisticated and valuable in ethnographic observation, in training sessions it is described and enacted as a narrow communicative task of adequately resolving participant queries, even when these queries raise questions about environmental factors or the trial protocol. This paper demonstrates how this occurs in guidance training interactions, offering an empirical account of how new forms of human labor required by a machine learning-driven intervention are constituted in the interactional practice of training, a process that contributes both to the minimization of the new human labor required for ML-based interventions and to the conceptualization of digital mental health interventions as neutral, portable, and not contingent on environmental factors. As digital mental health initiatives move from small pilot studies into broader implementation, an understanding of the interactional processes by which new human roles are established is key to specifying the new kinds of human labor involved in digital health interventions and to leveraging these new roles for adapting interventions to the particular circumstances of diverse participants and patients.
DOI: http://dx.doi.org/10.1016/j.socscimed.2024.117586