Rationale And Objectives: Despite their increasing prevalence, online textbooks, question banks, and digital references focus primarily on explicit knowledge. Implicit skills such as abnormality detection require repeated practice on clinical service and have few digital substitutes. Using mechanics traditionally deployed in video games such as clearly defined goals, rapid-fire levels, and narrow time constraints may be an effective way to teach implicit skills.
Materials And Methods: We created a freely available, online module to evaluate the ability of individuals to differentiate between normal and abnormal chest radiographs by implementing mechanics including instantaneous feedback, rapid-fire cases, and 15-second timers. Volunteer subjects completed the module and were separated based on formal experience with chest radiography. Performance between the training and testing sets was measured for each group, and a survey was administered after each session.
Results: The module contained 74 cases and took approximately 20 minutes to complete. Thirty-two cases were normal radiographs and 56 cases were abnormal. Of the 60 volunteers recruited, 25 were "never trained" and 35 were "previously trained." "Never trained" users scored 21.9 out of 37 during training and 24.0 out of 37 during testing (59.1% vs 64.9%, P < .001). "Previously trained" users scored 28.0 out of 37 during training and 28.3 out of 37 during testing (75.6% vs 76.4%, P = .56). Survey results showed that 87% of all subjects agreed that the module is an efficient way of learning, and 83% agreed that the rapid-fire module is valuable for medical students.
Conclusions: A gamified online module may improve the abnormality detection rates of novice interpreters of chest radiography, although experienced interpreters are less likely to derive similar benefits. Users reviewed the educational module favorably.
DOI: http://dx.doi.org/10.1016/j.acra.2017.05.005