
Interobserver reliability of a CT-based fracture classification system.

J Orthop Trauma

Department of Orthopaedics and Rehabilitation, Oregon Health and Science University, Portland, OR, USA.

Published: October 2005

Objectives: This study was designed to determine whether the interobserver reliability of a fracture classification scheme based on a single, carefully defined computed tomography (CT) cut is greater than that previously reported for systems designed for use with plain radiographs.

Design: Observer review of selected cases.

Setting: Four level I trauma centers.

Patients: Pretreatment CT scans of patients with calcaneus fractures were screened by the authors. Thirty cases were selected that had an appropriate semicoronal CT image. Ten orthopaedic traumatologists who were members of the Orthopaedic Trauma Association and had a minimum of 5 years postresidency experience were selected as reviewers.

Intervention: The reviewers were provided with a digital CT image for each case as well as written and diagrammatic representations of the Sanders classification system. The observers then classified each fracture according to the Sanders classification.

Results: The mean kappa value for interobserver reliability for fracture types I-IV was 0.41 ± 0.02 (mean ± standard error of the mean; range, 0.07-0.64). Observers disagreed by more than 1 fracture type (i.e., I vs. III or II vs. IV) in 10% of the cases. Observers agreed on the location of the fracture lines (A, B, C) in 90% of type II fractures and 52% of type III fractures.
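For readers unfamiliar with the statistic: kappa measures agreement corrected for chance. The abstract does not state which multi-rater variant was computed, but every kappa-type statistic shares the same general form (a minimal sketch; the symbols p_o and p_e are notation introduced here, not taken from the paper):

\kappa = \frac{p_o - p_e}{1 - p_e}

Here p_o is the observed proportion of agreement among the reviewers and p_e is the proportion of agreement expected by chance alone. A value of 1 indicates perfect agreement and 0 indicates agreement no better than chance, so the reported mean of 0.41 sits at the lower edge of the "moderate agreement" band (0.41-0.60) on the commonly cited Landis and Koch scale.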

Conclusions: The results indicate that in a carefully controlled paradigm, the interobserver reliability with a classification system based on interpretation of a single, carefully defined CT image was no better than the results reported for the same classification system used with full CT data or for other classification systems used for various fractures in the skeleton. Agreement in identifying the location of the fracture lines was very good for simple fractures but much worse for complex injuries. Additional study may determine whether the use of a full complement of CT images can improve reliability in classification of complex injuries.

Source: http://dx.doi.org/10.1097/01.bot.0000177107.30837.61

Publication Analysis

Top Keywords

interobserver reliability (16)
classification system (16)
fracture classification (8)
reliability fracture (8)
single carefully (8)
carefully defined (8)
type iii (8)
location fracture (8)
fracture lines (8)
reliability classification (8)
