
The Neer classification system for proximal humeral fractures. An assessment of interobserver reliability and intraobserver reproducibility.

The radiographs of fifty fractures of the proximal part of the humerus were used to assess the interobserver reliability and intraobserver reproducibility of the Neer classification system. A trauma series consisting of scapular anteroposterior, scapular lateral, and axillary radiographs was available for each fracture. The radiographs were reviewed by an orthopaedic shoulder specialist, an orthopaedic traumatologist, a skeletal radiologist, and two orthopaedic residents, in their fifth and second years of postgraduate training. The radiographs were reviewed on two different occasions, six months apart. Interobserver reliability was assessed by comparison of the fracture classifications determined by the five observers. Intraobserver reproducibility was evaluated by comparison of the classifications determined by each observer on the first and second viewings. Kappa (κ) reliability coefficients were used. All five observers agreed on the final classification for 32 and 30 per cent of the fractures on the first and second viewings, respectively. Paired comparisons between the five observers showed a mean reliability coefficient of 0.48 (range, 0.43 to 0.58) for the first viewing and 0.52 (range, 0.37 to 0.62) for the second viewing. The attending physicians obtained a slightly higher kappa value than the orthopaedic residents (0.52 compared with 0.48). Reproducibility ranged from 0.83 (the shoulder specialist) to 0.50 (the skeletal radiologist), with a mean of 0.66. Simplification of the Neer classification system, from sixteen categories to six more general categories based on fracture type, did not significantly improve either interobserver reliability or intraobserver reproducibility.
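The reliability and reproducibility figures above are kappa (κ) statistics. As a purely illustrative sketch, the snippet below shows how pairwise interobserver kappa values and an intraobserver kappa could be computed on synthetic ratings; the observer data, category labels, and use of scikit-learn are assumptions for illustration and are not taken from the study.

```python
# Illustrative only: synthetic ratings, not the study's data.
from itertools import combinations

import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
n_fractures, n_observers, n_categories = 50, 5, 16

# Synthetic Neer-style category assignments for five hypothetical observers.
ratings = rng.integers(0, n_categories, size=(n_observers, n_fractures))

# Interobserver reliability: mean kappa over all pairs of observers.
pair_kappas = [
    cohen_kappa_score(ratings[i], ratings[j])
    for i, j in combinations(range(n_observers), 2)
]
print(f"mean pairwise kappa: {np.mean(pair_kappas):.2f}")

# Intraobserver reproducibility: kappa between a first and second viewing
# by the same observer (the second viewing is simulated here).
second_viewing = rng.integers(0, n_categories, size=n_fractures)
print(f"intraobserver kappa: {cohen_kappa_score(ratings[0], second_viewing):.2f}")
```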


Source: http://dx.doi.org/10.2106/00004623-199312000-00002

Publication Analysis

Top Keywords

Keyword (frequency):
interobserver reliability (16)
intraobserver reproducibility (16)
neer classification (12)
classification system (12)
reliability intraobserver (12)
radiographs reviewed (8)
shoulder specialist (8)
skeletal radiologist (8)
orthopaedic residents (8)
classifications determined (8)
