Matches in DBpedia 2014 for { <http://dbpedia.org/resource/Inter-rater_reliability> ?p ?o. }
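The match pattern above can be reproduced against the public DBpedia SPARQL endpoint. Below is a minimal sketch, assuming the SPARQLWrapper Python package and the endpoint at http://dbpedia.org/sparql; the property/value pairs it prints correspond to the triples listed below, though the live endpoint may no longer match the DBpedia 2014 snapshot exactly.

```python
# Minimal sketch: fetch all ?p ?o pairs for the Inter-rater_reliability resource.
# Assumes the SPARQLWrapper package (pip install SPARQLWrapper) and the public
# DBpedia endpoint; results from the live endpoint may differ from the 2014 dump.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("http://dbpedia.org/sparql")
sparql.setQuery("""
    SELECT ?p ?o WHERE {
        <http://dbpedia.org/resource/Inter-rater_reliability> ?p ?o .
    }
""")
sparql.setReturnFormat(JSON)

for binding in sparql.query().convert()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```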
- Inter-rater_reliability abstract "In statistics, inter-rater reliability, inter-rater agreement, or concordance is the degree of agreement among raters. It gives a score of how much homogeneity, or consensus, there is in the ratings given by judges. It is useful in refining the tools given to human judges, for example by determining if a particular scale is appropriate for measuring a particular variable. If various raters do not agree, either the scale is defective or the raters need to be re-trained. There are a number of statistics which can be used to determine inter-rater reliability. Different statistics are appropriate for different types of measurement. Some options are: joint-probability of agreement, Cohen's kappa and the related Fleiss' kappa, inter-rater correlation, concordance correlation coefficient and intra-class correlation." (see the Cohen's kappa sketch after this listing).
- Inter-rater_reliability wikiPageExternalLink kappa.
- Inter-rater_reliability wikiPageExternalLink full.
- Inter-rater_reliability wikiPageExternalLink agreestat.html.
- Inter-rater_reliability wikiPageExternalLink chance_agreement_correction.html.
- Inter-rater_reliability wikiPageExternalLink book3.
- Inter-rater_reliability wikiPageExternalLink book_excerpts.html.
- Inter-rater_reliability wikiPageExternalLink bjmsp2008_interrater.pdf.
- Inter-rater_reliability wikiPageExternalLink 9781439810804.
- Inter-rater_reliability wikiPageExternalLink reliability.html.
- Inter-rater_reliability wikiPageExternalLink www.rateragreement.com.
- Inter-rater_reliability wikiPageExternalLink ira.
- Inter-rater_reliability wikiPageID "7837393".
- Inter-rater_reliability wikiPageRevisionID "597968627".
- Inter-rater_reliability hasPhotoCollection Inter-rater_reliability.
- Inter-rater_reliability subject Category:Inter-rater_reliability.
- Inter-rater_reliability subject Category:Statistical_data_types.
- Inter-rater_reliability comment "In statistics, inter-rater reliability, inter-rater agreement, or concordance is the degree of agreement among raters. It gives a score of how much homogeneity, or consensus, there is in the ratings given by judges. It is useful in refining the tools given to human judges, for example by determining if a particular scale is appropriate for measuring a particular variable.".
- Inter-rater_reliability label "Inter-rater reliability".
- Inter-rater_reliability label "Interrater-Reliabilität".
- Inter-rater_reliability sameAs Interrater-Reliabilität.
- Inter-rater_reliability sameAs Adostasun_neurri.
- Inter-rater_reliability sameAs m.026fs2y.
- Inter-rater_reliability sameAs Q470749.
- Inter-rater_reliability wasDerivedFrom Inter-rater_reliability?oldid=597968627.
- Inter-rater_reliability isPrimaryTopicOf Inter-rater_reliability.
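The abstract above names several agreement statistics. As a concrete illustration of one of them, here is a small sketch computing Cohen's kappa for two raters from first principles: kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance. The rater labels and the helper name are invented for the example and are not part of the DBpedia data.

```python
# Illustrative sketch of Cohen's kappa for two raters (example values only;
# not derived from the DBpedia triples above).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """kappa = (p_o - p_e) / (1 - p_e): observed vs. chance-expected agreement."""
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over labels of the product of the raters' marginals.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two raters labelling ten items as "yes"/"no".
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "no", "no"]
print(round(cohens_kappa(a, b), 3))  # 0.4 for this example
```

For the two-rater, unweighted case, scikit-learn's sklearn.metrics.cohen_kappa_score computes the same quantity and can serve as a cross-check.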