Crowd control: Effectively utilizing unscreened crowd workers for biomedical data annotation.

J Biomed Inform

Department of Biomedical and Health Informatics, The Children's Hospital of Philadelphia, United States.

Published: May 2017

Annotating unstructured text in Electronic Health Record (EHR) data is usually a necessary step for conducting machine learning research on such datasets. Manual annotation by domain experts provides data of the best quality, but it has become increasingly impractical given the rapid growth in the volume of EHR data. In this article, we examine the effectiveness of crowdsourcing with unscreened online workers as an alternative for transforming unstructured text in EHRs into annotated data that are directly usable in supervised learning models. We find the crowdsourced annotations to be just as effective as expert annotations for training a sentence classification model to detect mentions of abnormal ear anatomy in audiology radiology reports. Furthermore, we find that letting workers self-report a confidence level with each annotation helps researchers pinpoint the less accurate annotations that require expert scrutiny. Our findings suggest that even crowd workers without specific domain knowledge can contribute effectively to the task of annotating unstructured EHR datasets.
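
As a concrete illustration of the workflow the abstract describes, here is a minimal Python sketch (not the authors' code) of one plausible pipeline step: aggregating redundant crowd labels per sentence and using the workers' self-reported confidence to flag annotations for expert review. The confidence-weighted voting rule and the 0.6 threshold are illustrative assumptions, not values taken from the paper.

    # Sketch: aggregate crowd labels and route low-confidence items to experts.
    # Assumptions (not from the paper): confidence-weighted majority voting,
    # and a 0.6 mean-confidence cutoff for expert review.
    from collections import defaultdict

    CONFIDENCE_THRESHOLD = 0.6  # assumed cutoff for routing to an expert

    def aggregate_annotations(annotations):
        """annotations: list of (sentence_id, label, confidence) tuples,
        one per crowd-worker judgment. Returns per-sentence consensus
        labels and a list of sentence_ids needing expert review."""
        by_sentence = defaultdict(list)
        for sid, label, conf in annotations:
            by_sentence[sid].append((label, conf))

        consensus, needs_review = {}, []
        for sid, votes in by_sentence.items():
            # Confidence-weighted vote: each worker's label counts in
            # proportion to the confidence the worker reported.
            weights = defaultdict(float)
            for label, conf in votes:
                weights[label] += conf
            consensus[sid] = max(weights, key=weights.get)
            # Flag sentences whose average reported confidence is low.
            mean_conf = sum(c for _, c in votes) / len(votes)
            if mean_conf < CONFIDENCE_THRESHOLD:
                needs_review.append(sid)
        return consensus, needs_review

    # Toy usage: three workers label two sentences as "abnormal"/"normal".
    raw = [
        ("s1", "abnormal", 0.9), ("s1", "abnormal", 0.8), ("s1", "normal", 0.4),
        ("s2", "normal", 0.5),   ("s2", "abnormal", 0.5), ("s2", "normal", 0.4),
    ]
    labels, review_queue = aggregate_annotations(raw)
    print(labels)        # {'s1': 'abnormal', 's2': 'normal'}
    print(review_queue)  # ['s2']  (low average confidence)

The aggregated consensus labels would then feed a standard supervised sentence classifier; the review queue is where expert scrutiny is concentrated, which is the cost-saving idea the abstract highlights.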

Source
http://dx.doi.org/10.1016/j.jbi.2017.04.003

Publication Analysis

Top Keywords

crowd workers: 8
annotating unstructured: 8
unstructured texts: 8
data: 7
crowd control: 4
control effectively: 4
effectively utilizing: 4
utilizing unscreened: 4
unscreened crowd: 4
workers: 4
