Predicting and enhancing American Board of Surgery In-Training Examination performance: does writing questions really help?

AI Article Synopsis

  • The study investigates whether generating questions as a study method improves performance on the American Board of Surgery In-Training Examination (ABSITE) compared to traditional studying.
  • A total of 206 residents from six general surgery programs participated, with one group focused on writing questions and all taking two practice exams.
  • Results showed no significant improvement in ABSITE scores from writing questions, but practice test scores and other factors were effective in predicting ABSITE performance.

Article Abstract

Background: The generative learning model posits that individuals remember content they have generated better than materials created by others. The goals of this study were to evaluate question generation as a study method for the American Board of Surgery In-Training Examination (ABSITE) and determine whether practice test scores and other data predict ABSITE performance.

Methods: Residents (n = 206) from 6 general surgery programs were randomly assigned to one of two study conditions. One group wrote questions for practice examinations. All residents took 2 practice examinations.

Results: There was not a significant effect of writing questions on ABSITE score. Practice test scores, United States Medical Licensing Examination Step 1 scores, and previous ABSITE scores were significantly correlated with ABSITE performance.

Conclusions: The generative learning model was not supported. Performance on practice tests and other data can be used for early identification of residents at risk of performing poorly on the ABSITE.

Source: http://dx.doi.org/10.1016/j.amjsurg.2015.08.033

Publication Analysis

Top Keywords

american board (8)
board surgery (8)
surgery in-training (8)
in-training examination (8)
writing questions (8)
generative learning (8)
learning model (8)
practice test (8)
test scores (8)
absite (6)
