Matches in DBpedia 2014 for { <http://dbpedia.org/resource/Cohen's_kappa> ?p ?o. }
Showing items 1 to 43 of 43, with 100 items per page.
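The triple pattern in the header can be re-run against the live DBpedia SPARQL endpoint. The sketch below is a minimal illustration, assuming the public endpoint at https://dbpedia.org/sparql and the Python `requests` package; the live data will not exactly match the 2014 snapshot listed here.

```python
import requests

# Same triple pattern as in the header: every predicate/object pair
# attached to the Cohen's kappa resource.
QUERY = """
SELECT ?p ?o WHERE {
  <http://dbpedia.org/resource/Cohen's_kappa> ?p ?o .
}
"""

resp = requests.get(
    "https://dbpedia.org/sparql",          # assumed public endpoint, not the 2014 snapshot
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Print each predicate/object binding, mirroring the list below.
for binding in resp.json()["results"]["bindings"]:
    print(binding["p"]["value"], "->", binding["o"]["value"])
```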
- Cohen's_kappa abstract "Cohen's kappa coefficient is a statistical measure of inter-rater agreement or inter-annotator agreement for qualitative (categorical) items. It is generally thought to be a more robust measure than simple percent agreement calculation, since κ takes into account the agreement occurring by chance. Some researchers[citation needed] have expressed concern over κ's tendency to take the observed categories' frequencies as givens, which can have the effect of underestimating agreement for a category that is also commonly used; for this reason, κ is considered an overly conservative measure of agreement. Others[citation needed] contest the assertion that kappa "takes into account" chance agreement. To do this effectively would require an explicit model of how chance affects rater decisions. The so-called chance adjustment of kappa statistics supposes that, when not completely certain, raters simply guess—a very unrealistic scenario." (a worked computation of the chance correction is sketched after this list).
- Cohen's_kappa wikiPageExternalLink weighted-kappa-example-in-php.
- Cohen's_kappa wikiPageExternalLink 201209-eacl2012-Kappa.pdf.
- Cohen's_kappa wikiPageExternalLink kappa.html.
- Cohen's_kappa wikiPageExternalLink kappa.
- Cohen's_kappa wikiPageExternalLink research_papers.html.
- Cohen's_kappa wikiPageExternalLink bjmsp2008_interrater.pdf.
- Cohen's_kappa wikiPageExternalLink psychometrika2008_irr_random_raters.pdf.
- Cohen's_kappa wikiPageExternalLink wiley_encyclopedia2008_eoct631.pdf.
- Cohen's_kappa wikiPageExternalLink Kappa.
- Cohen's_kappa wikiPageExternalLink ComKappa2.zip.
- Cohen's_kappa wikiPageExternalLink procon.
- Cohen's_kappa wikiPageExternalLink kappa.
- Cohen's_kappa wikiPageExternalLink ira.
- Cohen's_kappa wikiPageID "1701650".
- Cohen's_kappa wikiPageRevisionID "602577676".
- Cohen's_kappa hasPhotoCollection Cohen's_kappa.
- Cohen's_kappa subject Category:Categorical_data.
- Cohen's_kappa subject Category:Inter-rater_reliability.
- Cohen's_kappa subject Category:Non-parametric_statistics.
- Cohen's_kappa type Abstraction100002137.
- Cohen's_kappa type CategoricalData.
- Cohen's_kappa type Cognition100023271.
- Cohen's_kappa type Datum105816622.
- Cohen's_kappa type Information105816287.
- Cohen's_kappa type PsychologicalFeature100023100.
- Cohen's_kappa comment "Cohen's kappa coefficient is a statistical measure of inter-rater agreement or inter-annotator agreement for qualitative (categorical) items. It is generally thought to be a more robust measure than simple percent agreement calculation since κ takes into account the agreement occurring by chance.".
- Cohen's_kappa label "Coeficiente kappa de Cohen".
- Cohen's_kappa label "Cohen's kappa".
- Cohen's_kappa label "Cohens Kappa".
- Cohen's_kappa label "Kappa de Cohen".
- Cohen's_kappa label "Kappa di Cohen".
- Cohen's_kappa sameAs Cohens_Kappa.
- Cohen's_kappa sameAs Coeficiente_kappa_de_Cohen.
- Cohen's_kappa sameAs Cohenen_kappa.
- Cohen's_kappa sameAs Kappa_de_Cohen.
- Cohen's_kappa sameAs Kappa_di_Cohen.
- Cohen's_kappa sameAs m.05pkgl.
- Cohen's_kappa sameAs Q1107106.
- Cohen's_kappa sameAs Cohen's_kappa.
- Cohen's_kappa wasDerivedFrom Cohen's_kappa?oldid=602577676.
- Cohen's_kappa isPrimaryTopicOf Cohen's_kappa.
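For context on the abstract and comment above: the chance correction they describe is the standard two-rater formula κ = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected from each rater's marginal category frequencies. The sketch below is a minimal illustration in plain Python with made-up ratings; it is not part of the DBpedia data.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters labelling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    proportion of agreement and p_e is the agreement expected by
    chance from the raters' marginal category frequencies.
    """
    if len(ratings_a) != len(ratings_b) or not ratings_a:
        raise ValueError("need two equal-length, non-empty rating lists")
    n = len(ratings_a)

    # Observed agreement: fraction of items on which the raters match.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Chance agreement: product of the raters' marginal proportions,
    # summed over all categories used by either rater.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in freq_a.keys() | freq_b.keys())

    if p_e == 1.0:  # degenerate case: chance agreement is already total
        return 1.0 if p_o == 1.0 else 0.0
    return (p_o - p_e) / (1 - p_e)


# Toy example (hypothetical data): two annotators label ten items yes/no.
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no", "yes", "no"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "no", "yes", "no"]
print(round(cohens_kappa(a, b), 3))
```

On the toy lists this prints 0.6: the raters agree on 80% of items, but half of that agreement would be expected by chance under the raters' marginal frequencies, which is exactly the adjustment the abstract's critics are debating.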