Background: Surgical videos are now being used for performance review and educational purposes; however, broad use is still limited due to time constraints. To make video review more efficient, we implemented Artificial Intelligence (AI) algorithms to detect surgical workflow and technical approaches.
Methods: Participants (N = 200) performed a simulated open bowel repair. The operation included two major phases: (1) Injury Identification and (2) Suture Repair. Accordingly, a phase detection algorithm (MobileNetV2+GRU) was implemented to automatically detect the two phases using video data. In addition, participants were noted to use three different technical approaches when running the bowel: (1) use of both hands, (2) use of one hand and one tool, or (3) use of two tools. To discern the three technical approaches, an object detection (YOLOv3) algorithm was implemented to recognize objects that were commonly used during the Injury Identification phase (hands versus tools).
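The abstract gives no implementation details, but the described phase detection architecture can be illustrated with a minimal sketch: a MobileNetV2 backbone extracts per-frame features, a GRU aggregates them over time, and a linear head assigns each frame to one of the two phases. The PyTorch module below is a hypothetical illustration, not the authors' code; the class name, hidden size, clip length, and pretrained-weights choice are all assumptions.

# A minimal, hypothetical sketch (not the authors' code): a MobileNetV2 backbone
# extracts per-frame features, a GRU adds temporal context, and a linear head
# classifies each frame as Injury Identification or Suture Repair.
import torch
import torch.nn as nn
from torchvision import models

class PhaseDetector(nn.Module):
    def __init__(self, num_phases=2, hidden_size=256):
        super().__init__()
        # ImageNet-pretrained MobileNetV2 (torchvision >= 0.13); classifier head discarded.
        backbone = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
        self.features = backbone.features             # convolutional feature extractor
        self.pool = nn.AdaptiveAvgPool2d(1)            # -> 1280-dim vector per frame
        self.gru = nn.GRU(input_size=1280, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_phases)

    def forward(self, clips):
        # clips: (batch, time, channels, height, width)
        b, t, c, h, w = clips.shape
        x = self.features(clips.view(b * t, c, h, w))
        x = self.pool(x).flatten(1).view(b, t, -1)     # (batch, time, 1280)
        x, _ = self.gru(x)                             # temporal features per frame
        return self.head(x)                            # (batch, time, num_phases)

# Usage with a dummy 16-frame clip; 0 = Injury Identification, 1 = Suture Repair.
model = PhaseDetector().eval()
with torch.no_grad():
    logits = model(torch.randn(1, 16, 3, 224, 224))
phase_per_frame = logits.argmax(dim=-1)

The object detection component (YOLOv3 recognizing hands versus tools) would run independently on frames from the Injury Identification phase; it is not sketched here.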
Results: The phase detection algorithm achieved high precision and recall when segmenting the two phases: Injury Identification (precision 86 ± 9%, recall 81 ± 12%) and Suture Repair (precision 81 ± 6%, recall 81 ± 16%). When evaluating the three technical approaches to running the bowel, the object detection algorithm achieved high average precisions (Hands: 99.32%; Tools: 94.47%). The three technical approaches showed no difference in execution time (Kruskal-Wallis test: P = 0.062) or in injury identification (not missing an injury) (Chi-squared: P = 0.998).
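For reference, the two reported hypothesis tests can be outlined with SciPy. The values below are invented placeholders (the study's per-participant data are not reproduced here), so the printed P values will not match those reported above.

# Placeholder illustration of the reported statistical tests; data are hypothetical.
from scipy.stats import kruskal, chi2_contingency

# Execution times (hypothetical) for the three approaches to running the bowel.
times_both_hands = [112, 98, 130, 105, 121]
times_hand_tool  = [120, 101, 95, 140, 108]
times_two_tools  = [118, 99, 125, 110, 131]
h_stat, p_time = kruskal(times_both_hands, times_hand_tool, times_two_tools)

# Injury identified vs. missed per approach (hypothetical 2 x 3 contingency table).
contingency = [[60, 55, 58],   # injury identified
               [ 5,  6,  6]]   # injury missed
chi2, p_injury, dof, expected = chi2_contingency(contingency)

print(f"Kruskal-Wallis P = {p_time:.3f}; Chi-squared P = {p_injury:.3f}")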
Conclusions: The AI algorithms showed high precision when segmenting surgical workflow and identifying technical approaches. Automation of these techniques for surgical video databases has great potential to facilitate efficient performance review.
DOI: http://dx.doi.org/10.1016/j.jss.2021.07.003