Objective: This study evaluated the intraobserver and interobserver reliability of the AO classification of wrist fractures on standard radiographs.

Methods: Thirty observers, divided into three groups (orthopedic surgery senior residents, orthopedic surgeons, and hand surgeons), classified 52 wrist fractures using only plain radiographs. After four weeks, the same observers re-evaluated the 52 radiographs in randomized order. Interobserver agreement, overall and within each group, and intraobserver agreement were assessed with the kappa index; kappa values were interpreted as proposed by Landis and Koch.
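For readers who want to reproduce this type of agreement analysis, the following is a minimal sketch. The abstract does not specify which kappa variant was used, so this example assumes Cohen's kappa for paired intraobserver readings and Fleiss' kappa for group-level interobserver agreement; the observer counts, AO category labels, and ratings shown are hypothetical.

```python
# Sketch of the kappa-based agreement analysis described above.
# Assumptions (not from the abstract): Cohen's kappa for intraobserver
# comparisons, Fleiss' kappa for interobserver agreement; all data are
# hypothetical placeholders, not the study's ratings.
import numpy as np
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Intraobserver agreement: one observer's two readings, four weeks apart.
first_reading = ["A2", "B1", "C1", "A3", "C2"]
second_reading = ["A2", "B2", "C1", "A3", "C3"]
intra_kappa = cohen_kappa_score(first_reading, second_reading)

# Interobserver agreement: rows = fractures, columns = observers.
ratings = np.array([
    ["A2", "A2", "B1"],
    ["B1", "B1", "B1"],
    ["C1", "C2", "C1"],
    ["A3", "A3", "A3"],
    ["C2", "C3", "C2"],
])
counts, _ = aggregate_raters(ratings)  # fracture-by-category count table
inter_kappa = fleiss_kappa(counts)

# Landis and Koch bands: 0.21-0.40 fair, 0.41-0.60 moderate.
print(f"intraobserver kappa = {intra_kappa:.2f}")
print(f"interobserver kappa = {inter_kappa:.2f}")
```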

Results: The overall interobserver agreement for the AO classification was fair (0.30). All three groups showed fair interobserver agreement (residents, 0.27; orthopedic surgeons, 0.30; hand surgeons, 0.33). The overall intraobserver agreement was moderate. The hand surgeon group obtained the highest intraobserver agreement, although only moderate (0.50); the resident group (0.30) and the orthopedic surgeon group (0.33) obtained fair levels.

Conclusion: The data suggest fair interobserver agreement and moderate intraobserver agreement for the AO classification of wrist fractures.

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6204541 (PMC)
http://dx.doi.org/10.1016/j.rboe.2017.08.024 (DOI)
