Interobserver reliability and intraobserver reproducibility of Neer classification system
10.3760/cma.j.issn.1671-7600.2012.07.004
- VernacularTitle:Neer分型的可靠性研究 (Reliability study of the Neer classification)
- Author:
Chunyan JIANG; Qiang ZHANG
- Publication Type:Journal Article
- Keywords:
Shoulder;
Fractures, bone;
Reproducibility of results;
Neer classification
- From:
Chinese Journal of Orthopaedic Trauma
2012;14(7):566-570
- Country:China
- Language:Chinese
- Abstract:
Objective To assess the interobserver reliability and intraobserver reproducibility of the Neer classification system and the factors influencing them.
Methods The present study reviewed the serial preoperative radiographs (scapular anteroposterior view, scapular lateral view and modified Velpeau axillary view, plus an axial CT scan) of 40 patients who had been treated in our department for proximal humeral fractures from January 2010 to December 2010. The radiographs were assessed by 12 individual observers on 2 separate occasions at an interval of at least 3 months. Half of the observers (the professional group; n = 6) had received shoulder fellowship training and the other half (the control group; n = 6) had not. All the observers were asked to categorize the radiographs according to the 16-type Neer classification system in the same process. The reliability and reproducibility of the system were assessed with Kappa statistics. Classification agreement was compared between the professional group and the control group. We also evaluated a simplified Neer system of only 6 fracture types using the recombined data.
Results The interobserver reliability coefficients were 0.534 and 0.473 for the first and second assessments, with an intraobserver reproducibility coefficient of 0.669. The agreement level in the professional group was significantly higher than that in the control group (P < 0.05). The interobserver reliability coefficients of the simplified Neer system were 0.581 and 0.502, with an intraobserver reproducibility coefficient of 0.680. Use of the simplified Neer system did not elevate the agreement level beyond the moderate range. The classification was agreed on by all the observers in 17.5% of the fractures during the first assessment and in 15.0% during the second assessment.
Conclusions The Neer classification may have fair interobserver reliability and moderate intraobserver reproducibility. Shoulder fellowship training is an important factor influencing the reliability of the Neer system. Simplification of the system may not help increase its reliability.
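The agreement coefficients reported above are Kappa statistics. As an illustrative sketch only (the ratings matrix, random seed, and category count below are hypothetical, not the study's data), multi-rater interobserver agreement of this kind could be computed with Fleiss' kappa along the following lines:

```python
import numpy as np

def fleiss_kappa(ratings, n_categories):
    """Fleiss' kappa for a (subjects x raters) matrix of integer category labels."""
    n_subjects, n_raters = ratings.shape
    # counts[i, k] = number of raters assigning category k to subject i
    counts = np.zeros((n_subjects, n_categories))
    for k in range(n_categories):
        counts[:, k] = (ratings == k).sum(axis=1)
    # overall proportion of assignments falling in each category
    p_k = counts.sum(axis=0) / (n_subjects * n_raters)
    # per-subject observed agreement
    P_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar, P_e = P_i.mean(), np.square(p_k).sum()
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical example: 40 fractures, 12 observers, 16 Neer fracture types
rng = np.random.default_rng(0)
ratings = rng.integers(0, 16, size=(40, 12))
print(f"Fleiss' kappa = {fleiss_kappa(ratings, 16):.3f}")
```

For the intraobserver comparison (one observer's first versus second reading of the same radiographs), the analogous pairwise statistic would be Cohen's kappa on the two label vectors.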