Comparing Bayesian Models of Annotation

Paun, S.; Carpenter, B.; Chamberlain, J. D.; Hovy, D.; Kruschwitz, Udo; Poesio, Massimo
License: Creative Commons Attribution 4.0
PDF - Published Version
Date of publication of this fulltext: 11 Jun 2019 09:37


Crowdsourcing has revolutionised the way tasks can be completed, but the process is frequently inefficient, costing practitioners time and money. This research investigates whether crowdsourcing can be optimised with a validation process, as measured by four criteria: quality, cost, noise, and speed. A validation model is described, simulated, and tested on real data from an online crowdsourcing ...

