Computational Metrics Can Provide Quantitative Values to Characterize Arthroscopic Field of View.

Purpose: The purpose of this study was to determine the inter-rater reliability of arthroscopic video quality ratings, to determine the correlation between surgeon ratings and computational image metrics, and to facilitate a quantitative methodology for assessing video quality.

Methods: Five orthopaedic surgeons reviewed 60 clips from deidentified arthroscopic shoulder videos and rated each on a four-point Likert scale from poor to excellent view. The videos were randomized, and the process was completed a total of three times. The ratings were averaged to give a single user rating per clip. Each video frame was processed to calculate brightness, local contrast, redness (used to represent bleeding), and image entropy, and each metric was then averaged over all frames of a clip, yielding four image-quality metrics per clip.
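A minimal Python sketch of the per-frame metrics described above, assuming OpenCV and NumPy: mean grayscale intensity for brightness, mean local standard deviation for contrast, mean red-channel excess over the green/blue average for redness, and Shannon entropy of the grayscale histogram, each averaged over all frames of a clip. The exact formulas, the window size, and the function names here are assumptions, not the authors' published implementation.

import cv2
import numpy as np

def frame_metrics(frame_bgr):
    """Return (brightness, local_contrast, redness, entropy) for one BGR frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float64)

    # Brightness: mean grayscale intensity (assumed definition).
    brightness = gray.mean()

    # Local contrast: standard deviation inside a sliding 9 x 9 window,
    # averaged over the frame (window size is an illustrative choice).
    mean = cv2.blur(gray, (9, 9))
    mean_sq = cv2.blur(gray * gray, (9, 9))
    local_contrast = np.sqrt(np.clip(mean_sq - mean * mean, 0, None)).mean()

    # Redness (proxy for bleeding): how far the red channel exceeds the
    # average of the green and blue channels.
    b, g, r = cv2.split(frame_bgr.astype(np.float64))
    redness = (r - (g + b) / 2.0).mean()

    # Image entropy: Shannon entropy of the 8-bit grayscale histogram.
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = float(-(p * np.log2(p)).sum())

    return brightness, local_contrast, redness, entropy

def clip_metrics(video_path):
    """Average the four per-frame metrics over every frame of one clip."""
    cap = cv2.VideoCapture(video_path)
    rows = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rows.append(frame_metrics(frame))
    cap.release()
    return np.mean(rows, axis=0)  # [brightness, contrast, redness, entropy]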

Results: Inter-rater reliability for grading video quality had an intraclass correlation of .974. Improved image quality rating was positively correlated with increased entropy (.8142; P < .001), contrast (.8013; P < .001), and brightness (.6120; P < .001), and negatively correlated with redness (-.8626; P < .001). A multiple linear regression model was calculated with the image metrics used as predictors for the image quality ranking, with an R-squared value of .775 and a root mean square error of .42.
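An illustrative sketch of the regression step, assuming scikit-learn: the four clip-level metrics serve as predictors of the averaged surgeon rating, and the fit is summarized by R-squared and root mean square error (the study reports .775 and .42, respectively). The variable names and tooling are assumptions; the paper does not specify its implementation.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

def fit_quality_model(X, y):
    """X: n_clips x 4 array of (brightness, contrast, redness, entropy) per clip.
    y: n_clips vector of averaged surgeon ratings on the four-point scale."""
    model = LinearRegression().fit(X, y)
    pred = model.predict(X)
    r_squared = r2_score(y, pred)
    rmse = float(np.sqrt(mean_squared_error(y, pred)))
    return model, r_squared, rmse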

Conclusions: Our study demonstrates strong inter-rater reliability between surgeons when describing image quality and strong correlations between image quality and the computed image metrics. A model based on these metrics enables automatic quantification of image quality.

Clinical Relevance: Video quality during arthroscopic cases can affect the ease and duration of the case, which could in turn contribute to swelling and complication risk. This pilot study provides a quantitative method to assess video quality. Future work can objectively determine the factors that affect visualization during arthroscopy and identify options for improvement.

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9042744
DOI: http://dx.doi.org/10.1016/j.asmr.2021.10.017

Publication Analysis

Top Keywords

image quality (20), video quality (16), inter-rater reliability (12), image metrics (12), image (10), quality (9), user rating (8), video (7), metrics (5), computational metrics (4)
