Crowd control: Effectively utilizing unscreened crowd workers for biomedical data annotation

Anne Cocos, Ting Qian, Chris Callison-Burch, Aaron J. Masino

Research output: Contribution to journal › Article › peer-review


Abstract

Annotating unstructured text in Electronic Health Record (EHR) data is usually a necessary step for conducting machine learning research on such datasets. Manual annotation by domain experts provides data of the best quality, but it has become increasingly impractical given the rapid growth in the volume of EHR data. In this article, we examine the effectiveness of crowdsourcing with unscreened online workers as an alternative for transforming unstructured text in EHRs into annotated data that is directly usable in supervised learning models. We find the crowdsourced annotations to be just as effective as expert annotations for training a sentence classification model to detect mentions of abnormal ear anatomy in audiology-related radiology reports. Furthermore, we find that allowing workers to self-report a confidence level for each annotation helps researchers pinpoint the less accurate annotations that require expert scrutiny. Our findings suggest that even crowd workers without specific domain knowledge can contribute effectively to the task of annotating unstructured EHR datasets.
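To make the workflow described in the abstract concrete, the sketch below illustrates one way such a pipeline could look: a logistic regression sentence classifier (matching the article's keywords) trained on crowd-labeled sentences, with low-confidence annotations routed to expert review. This is not the authors' code; the TF-IDF features, the 1-5 confidence scale, the threshold, and the example sentences are all illustrative assumptions.

```python
# Minimal sketch, assuming scikit-learn: train a logistic regression sentence
# classifier on crowd-annotated sentences and flag low-confidence annotations
# for expert review. Data, features, and the confidence cutoff are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical crowd-annotated data: (sentence, label, worker confidence 1-5),
# where label 1 means the sentence mentions abnormal ear anatomy.
crowd_data = [
    ("The right cochlea appears malformed.", 1, 5),
    ("No acute intracranial abnormality.", 0, 4),
    ("Possible dysplasia of the left semicircular canals.", 1, 2),
    ("The mastoid air cells are well aerated.", 0, 5),
]

# Route low-confidence annotations to expert scrutiny; train on the rest.
CONFIDENCE_THRESHOLD = 3  # illustrative cutoff, not from the paper
needs_expert_review = [(s, y) for s, y, c in crowd_data if c < CONFIDENCE_THRESHOLD]
train = [(s, y) for s, y, c in crowd_data if c >= CONFIDENCE_THRESHOLD]

sentences, labels = zip(*train)
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(sentences, labels)

# Classify a new sentence from an unseen report.
print(model.predict(["There is an abnormality of the ossicular chain."]))
print(f"{len(needs_expert_review)} annotation(s) flagged for expert review")
```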

Original language: English (US)
Pages (from-to): 86-92
Number of pages: 7
Journal: Journal of Biomedical Informatics
Volume: 69
DOIs
State: Published - May 1 2017
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Health Informatics
  • Computer Science Applications

Keywords

  • Crowdsourcing
  • EHR data
  • Logistic regression
  • Sentence classification
  • Text annotations
