Objective: To examine the relationship between medical school applicants' performance in the Graduate Australian Medical School Admissions Test (GAMSAT) and in structured interviews, and their subsequent performance in medical school.
Design: Students in Years 2-4 of two graduate-entry medical programs were invited to complete two previously validated tests of clinical reasoning. These results and their Year 2 examination results were compared with their previous performance in GAMSAT and at interview.
Setting: The graduate-entry programs at the Universities of Queensland and Sydney.
Participants: 189 student volunteers (13.6% response rate).
Main Outcome Measures: Students' scores on a set of Clinical Reasoning Problems (CRPs) and on the Diagnostic Thinking Inventory (DTI), and their Year 2 examination results.
Results: There was no association between performance in GAMSAT and performance in the CRPs; there was a weak negative correlation between GAMSAT performance and the DTI (−0.31 < r < −0.05, P = 0.03). The correlation between GAMSAT and examination results was weak (r < 0.24, P = 0.02). The correlation between GAMSAT and interview scores was weakly negative at the University of Queensland (r = −0.34, P < 0.01) and weakly positive at the University of Sydney (r = 0.11), with a combined significance level of P < 0.01.
Conclusions: We did not find evidence that GAMSAT and structured interviews are good predictors of performance in medical school. Our study highlights a need for more rigorous evaluation of Australian medical school admissions tests.
DOI: http://dx.doi.org/10.5694/j.1326-5377.2007.tb01228.x