Some measures of interannotator agreement, such as Cohen's kappa and Krippendorff's alpha, are not meaningful for cases where all annotators assigned the same label. In this case the expected disagreement is 0 and, were the measure's usual formula to be calculated, it would produce a division by 0.
When KrippendorffAlphaAgreement is asked to calculate Krippendorff's alpha for such cases, calculateAgreement() returns 0.0. I think this is misleading: a value of 0.0 implies that the agreement was successfully calculated and that it was indistinguishable from a random labelling. I think that calculateAgreement() should instead have signalled to the user that it could not return a meaningful result for the data it was given. CohenKappaAgreement handles this case correctly by throwing an exception of type InsufficientDataException; maybe KrippendorffAlphaAgreement should do likewise…?
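To make the degenerate case concrete, here is a minimal standalone sketch of nominal Krippendorff's alpha (my own illustration, not the library's implementation; the class and method names are made up). It shows that when every annotator assigns the same label, the expected disagreement is 0, so the usual formula `alpha = 1 - D_o / D_e` divides by zero; the sketch returns NaN there to mark the result as undefined rather than reporting 0.0:

```java
import java.util.HashMap;
import java.util.Map;

public class AlphaSketch {
    /**
     * Nominal Krippendorff's alpha for items with no missing values.
     * Each row of {@code items} holds the labels all annotators gave one item.
     * Returns NaN when the expected disagreement is 0 (every annotator used
     * the same label), i.e. the coefficient is undefined.
     */
    static double nominalAlpha(String[][] items) {
        Map<String, Integer> counts = new HashMap<>(); // label -> total uses
        int n = 0;                 // total number of label assignments
        double observed = 0.0;     // sum of off-diagonal coincidences

        for (String[] item : items) {
            int m = item.length;   // annotators for this item
            for (String v : item) { counts.merge(v, 1, Integer::sum); n++; }
            // Each ordered pair of differing labels within an item
            // contributes 1/(m-1) to the observed disagreement.
            for (int i = 0; i < m; i++)
                for (int j = 0; j < m; j++)
                    if (i != j && !item[i].equals(item[j]))
                        observed += 1.0 / (m - 1);
        }

        // Expected disagreement: sum over label pairs c != k of
        // n_c * n_k / (n - 1), computed as (n^2 - sum n_c^2) / (n - 1).
        double sumSquares = 0.0;
        for (int nc : counts.values()) sumSquares += (double) nc * nc;
        double expected = ((double) n * n - sumSquares) / (n - 1);

        if (expected == 0.0)
            return Double.NaN;     // all annotators agree on one label: undefined
        return 1.0 - observed / expected;
    }

    public static void main(String[] args) {
        // Mixed labels: a well-defined coefficient.
        System.out.println(nominalAlpha(new String[][] {
            {"A", "A"}, {"A", "A"}, {"B", "B"}, {"A", "B"}}));
        // Everyone assigned the same label: prints NaN.
        System.out.println(nominalAlpha(new String[][] {
            {"A", "A"}, {"A", "A"}}));
    }
}
```

Returning NaN is one way to signal the undefined case; throwing an exception, as CodingAgreementMeasure does, is another, and has the advantage that callers cannot silently propagate the sentinel value into further arithmetic.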
Hmm… I see now that CohenKappaAgreement is throwing the exception via CodingAgreementMeasure. So the problematic behaviour described above is probably a consequence of KrippendorffAlphaAgreement not implementing ICodingAgreementMeasure, as mentioned in Issue #17.