Aim: This study determined inter-rater agreement between skill assessments provided by on-site PALS evaluators and ratings from evaluators at a remote site who viewed the same skill performances over a videoconferencing network. Judgments about the feasibility of remote evaluation were also obtained from the evaluators and from PALS course participants.
Methods: Two remote and two on-site instructors independently rated the performance of 27 course participants who completed cardiac and shock/respiratory emergency core cases; the core cases were conducted under the direction of the remote evaluators. Inter-rater reliability was assessed with the intraclass correlation coefficient (ICC). Feasibility was assessed with surveys of the evaluators and course participants.
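The abstract does not state which ICC model was applied; as an illustrative sketch only, a commonly used form for absolute agreement among multiple raters is the two-way random-effects, single-rating ICC of Shrout and Fleiss, ICC(2,1):

\[
\mathrm{ICC}(2,1) \;=\; \frac{MS_R - MS_E}{MS_R + (k-1)\,MS_E + \dfrac{k\,(MS_C - MS_E)}{n}}
\]

where $MS_R$ is the between-subjects mean square, $MS_C$ the between-raters mean square, $MS_E$ the residual mean square, $k$ the number of raters (four evaluators in this study), and $n$ the number of subjects rated (27 participants). Values near 1, such as those reported below, indicate near-perfect agreement.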
Results: The ICC for overall agreement on pass/fail decisions was 0.997 for the cardiac cases and 0.998 for the shock/respiratory cases. Perfect agreement was reached on 52 of 54 pass/fail decisions. Across all evaluators, core cases, and participants, 2584 ratings of individual skill criteria were provided, of which 21 (0.8%) were ratings in which a single evaluator disagreed with the other three. No trend emerged in the location of the disagreeing evaluator. Survey responses indicated that remote evaluation was acceptable and feasible to both course participants and evaluators.
Conclusions: Videoconferencing technology provided adequate spatial and temporal resolution for PALS evaluators at a distance from course participants to produce ratings that agreed with those of on-site evaluators.
DOI: http://dx.doi.org/10.1016/j.resuscitation.2008.11.025