In addition, Cohen’s kappa coefficient [56] was reported to provide a measure of agreement, corrected for chance agreement, between the scores on the BRIEF-C and BRIEF-A (≤ 0 = poor agreement, 0.01–0.20 = slight, 0.21–0.40 = fair, 0.41–0.60 = moderate, 0.61–0.80 = substantial, 0.81–1.00 = almost perfect agreement).
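To illustrate the chance correction mentioned above, the following sketch (not taken from the article; the example ratings and the helper function name are hypothetical) computes Cohen’s kappa as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e the agreement expected by chance from the raters’ marginal distributions:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same cases (nominal categories)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed proportion of exact agreements.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal proportions, summed.
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two measures classifying 10 cases as
# "elevated" (1) or "not elevated" (0).
a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
b = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0]
print(round(cohens_kappa(a, b), 2))  # → 0.6 (moderate agreement)
```

Here the raters agree on 8 of 10 cases (p_o = 0.8), but with balanced marginals half of that agreement is expected by chance (p_e = 0.5), yielding κ = 0.6.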