Comparing Bayesian Models of Annotation

URN to cite this document:
urn:nbn:de:bvb:355-epub-403048
DOI to cite this document:
10.5283/epub.40304
Paun, S.; Carpenter, B.; Chamberlain, J. D.; Hovy, D.; Kruschwitz, U.; Poesio, M.
License: Creative Commons Attribution 4.0
PDF - Published Version
(751kB)
Date of publication of this fulltext: 11 Jun 2019 09:37


Abstract

Crowdsourcing has revolutionised the way tasks can be completed, but the process is frequently inefficient, costing practitioners time and money. This research investigates whether crowdsourcing can be optimised with a validation process, as measured by four criteria: quality, cost, noise, and speed. A validation model is described, simulated and tested on real data from an online crowdsourcing ...


