Publications by authors named "Richard Shavelson"

Critical reasoning (CR) when confronted with contradictory information from multiple sources is a crucial ability in a knowledge-based society and digital world. Using information without critically reflecting on its content and quality may lead to the acceptance of information based on unwarranted claims. Prior personal beliefs are assumed to play a decisive role in critically distinguishing mere assertions and claims from warranted knowledge and facts.

Background: A holistic approach to performance assessment recognizes the theoretical complexity of multifaceted critical thinking (CT), a key objective of higher education. However, issues related to reliability, interpretation, and use arise with this approach.

Aims and Method: Therefore, we take an analytic approach to scoring students' written responses on a performance assessment.
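
As a hedged illustration of the contrast between holistic and analytic scoring, the sketch below assigns a separate rating to each facet of critical thinking and aggregates the ratings into one score. The facet names and weights are hypothetical assumptions for illustration, not the rubric used in the study.

```python
# Minimal sketch of analytic scoring: each facet of critical thinking is rated
# separately and the ratings are aggregated, instead of one holistic judgment.
# The facet names and weights below are hypothetical, not the study's rubric.
ANALYTIC_RUBRIC = {
    "evaluating_evidence": 0.4,       # assumed weight
    "recognizing_assumptions": 0.3,   # assumed weight
    "drawing_conclusions": 0.3,       # assumed weight
}

def analytic_score(facet_ratings):
    """Combine per-facet ratings (e.g., 0-4 each) into one weighted total."""
    return sum(ANALYTIC_RUBRIC[facet] * rating
               for facet, rating in facet_ratings.items())

# Example: one student's written response, rated facet by facet.
print(analytic_score({"evaluating_evidence": 3,
                      "recognizing_assumptions": 2,
                      "drawing_conclusions": 4}))  # 3.0
```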

The shift from cookbook to authentic, research-based lab courses in undergraduate biology creates a need for evaluation and assessment of these novel courses. Although the biology education community has made progress in this area, it is important that we interpret the effectiveness of these courses with caution and remain mindful of inherent limitations in our study designs that may affect internal and external validity. The specific context of a research study can have a dramatic impact on the conclusions.

Purpose: To test the reliability of concept map assessment, which can be used to assess an individual's "knowledge structure," in a medical education setting.

Method: In 2004, 52 senior residents (pediatrics and internal medicine) and fourth-year medical students at the University of California-Davis School of Medicine created separate concept maps about two different subject domains (asthma and diabetes) on two separate occasions each (four total maps). Maps were rated using four different scoring systems: structural (S; counting propositions), quality (Q; rating the quality of propositions), importance/quality (I/Q; rating importance and quality of propositions), and a hybrid system (H; combining elements of S with I/Q).
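
As a rough illustration of how these scoring systems differ, the sketch below treats each scored proposition as a concept-link-concept triple with rater-assigned quality and importance ratings: the structural (S) score counts valid propositions, the quality (Q) score sums quality ratings, and the importance/quality (I/Q) score weights quality by importance. The data structure and rating scales are assumptions for illustration, not the study's rubrics, and the hybrid (H) system is omitted.

```python
from dataclasses import dataclass

# Hypothetical representation of one rated concept-map proposition; the rating
# scales below are assumed, not the scales used in the study.
@dataclass
class Proposition:
    source: str
    link: str
    target: str
    quality: int      # assumed scale, e.g., 0 (wrong) to 3 (excellent)
    importance: int   # assumed scale, e.g., 1 (peripheral) to 3 (central)

def structural_score(props):
    """S system: count valid propositions."""
    return sum(1 for p in props if p.quality > 0)

def quality_score(props):
    """Q system: sum the rater-assigned quality of each proposition."""
    return sum(p.quality for p in props)

def importance_quality_score(props):
    """I/Q system: weight each proposition's quality by its importance."""
    return sum(p.quality * p.importance for p in props)

# Example: a tiny, invented asthma map scored three ways.
asthma_map = [
    Proposition("asthma", "involves", "airway inflammation", quality=3, importance=3),
    Proposition("inflammation", "treated with", "inhaled steroids", quality=2, importance=2),
    Proposition("asthma", "associated with", "allergies", quality=1, importance=1),
]
print(structural_score(asthma_map),          # 3
      quality_score(asthma_map),             # 6
      importance_quality_score(asthma_map))  # 14
```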

The Collegiate Learning Assessment (CLA) program measures value added in colleges and universities by testing the ability of freshmen and seniors to think logically and write clearly. The program has become popular enough to attract critics. In this paper, we outline the methods used by the CLA to determine value added.
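
The abstract does not spell out the computation, but value-added models of this kind are commonly residual-based: an institution's observed senior performance is compared with the performance expected from its students' entering ability. The sketch below is a generic residual illustration of that idea only, with invented numbers; it is not the CLA's actual formula.

```python
import numpy as np

# Generic residual-based value-added illustration (not the CLA's actual model).
# Each entry is one institution's mean entering-ability score and mean senior
# assessment score; all numbers are invented.
entering_ability = np.array([1000.0, 1100.0, 1200.0, 1300.0])
senior_score = np.array([1050.0, 1180.0, 1230.0, 1340.0])

# Regress senior scores on entering ability across institutions.
slope, intercept = np.polyfit(entering_ability, senior_score, deg=1)

# "Value added": how far seniors score above or below the level expected
# from their entering ability.
expected = intercept + slope * entering_ability
value_added = senior_score - expected
print(value_added.round(1))
```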

Objective: In our effort to establish criterion-based skills training for surgeons, we assessed the performance of 17 experienced laparoscopic surgeons on basic technical surgical skills, recorded electronically in 26 modules selected from 5 commercially available, computer-based simulators.

Methods: Performance data were derived from the selected surgeons, who were randomly assigned to simulator stations and practiced repetitively on the 5 simulators during one-and-one-half-day sessions. We measured surgeon proficiency, defined as efficient, error-free performance, and developed proficiency score formulas for each module.
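
As a hedged sketch of what a proficiency score combining efficiency and error-free performance might look like, the example below standardizes completion times and error counts within a module and averages the negated standardized values; the standardization and equal weighting are assumptions, not the formulas developed in the study.

```python
from statistics import mean, pstdev

def proficiency_scores(times_sec, errors):
    """Hypothetical module-level proficiency score: faster, lower-error
    performance scores higher. Standardization and equal weighting are
    assumptions, not the study's formulas."""
    def z(values):
        mu, sigma = mean(values), pstdev(values)
        return [(v - mu) / sigma if sigma else 0.0 for v in values]

    time_z = z(times_sec)   # higher = slower
    error_z = z(errors)     # higher = more errors
    # Negate so that efficient, error-free performance yields a high score.
    return [-(t + e) / 2 for t, e in zip(time_z, error_z)]

# Example: three repetitions of one simulator module by the same surgeon.
print([round(s, 2) for s in proficiency_scores([95.0, 80.0, 70.0], [3, 1, 0])])
```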

The Collegiate Learning Assessment (CLA) is a computer-administered, open-ended (as opposed to multiple-choice) test of analytic reasoning, critical thinking, problem solving, and written communication skills. Because the CLA has been endorsed by several national higher education commissions, it has come under intense scrutiny by faculty members, college administrators, testing experts, legislators, and others. This article describes the CLA's measures and what they do and do not assess, how dependably they measure what they claim to measure, and how CLA scores differ from those on other direct and indirect measures of college student learning.
