Matches in Nanopublications for { <http://purl.org/np/RAF6IU1wwVTKGSj1NYWQ2JRQm0FBgANVTh8RY9l5U_vuE#paragraph> ?p ?o ?g. }
Showing items 1 to 3 of 3, with 100 items per page.
- paragraph type Paragraph assertion.
- paragraph hasContent "In order to replicate the approach followed in the contest, in the Find stage, we crowdsourced all the triples associated with resources that were explored by the LD experts. In total, we submitted to the crowd 30,658 RDF triples. The microtasks were resolved by 187 distinct workers who identified 26,835 triples as erroneous in 14 days, and classified them into the three quality issues studied in this work. Then, we selected samples from triples identified as erroneous in the Find stage by at least two workers from the crowd. This allowed us to fairly compare the outcome of the Verify stage from both workflows. Each sample contains the exact same number of triples that were crowdsourced in the Verify stage in the first workflow, i.e., 509 triples with object value issues, 341 with data type or language tag issues, and 223 with interlink issues. All triples crowdsourced in the Verify stage were assessed by 141 workers in seven days. A summary of these results and further details are presented in Table 5." assertion.
- paragraph contains table assertion.
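For reference, the quad pattern in the heading corresponds to a SPARQL query of roughly the following form. This is a minimal sketch only: the actual query endpoint, prefixes, and paging mechanism of the nanopublication service are not given in this listing, and wrapping the pattern in `GRAPH ?g` is an assumption that `?g` ranges over the nanopublication's named graphs (such as its assertion graph).

```sparql
# Sketch of the quad pattern shown in the heading above.
# Endpoint, prefixes, and the LIMIT used for paging are assumptions,
# not part of this listing.
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    # the #paragraph resource of the nanopublication above
    <http://purl.org/np/RAF6IU1wwVTKGSj1NYWQ2JRQm0FBgANVTh8RY9l5U_vuE#paragraph> ?p ?o .
  }
}
LIMIT 100
```

Run against a store holding this nanopublication, such a query would return the three property–value pairs listed above, each bound to the graph (here labelled "assertion") in which the statement appears.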