Using the Triangulated OSCE to Assess Student Performance in Simulation

Saturday, 21 April 2018

Heather Johnson, DNP, FNP-BC, FAANP1
Catherine G. Ling, PhD, FNP-BC, FAANP2
Andrea Fuller, DNP, FNP-BC3
Laura Taylor, PhD, RN3
(1)Graduate School of Nursing, Uniformed Services University of the Health Sciences, Bethesda, MD, USA
(2)FNP Concentration Director College of Nursing, University of South Florida, Tampa, FL, USA
(3)Daniel K Inouye Graduate School of Nursing, The Uniformed Services University of the Health Sciences, Bethesda, MD, USA

Simulation is widely used in health education to improve interviewing and clinical skills. The Objective Structured Clinical Examination (OSCE) is a method of assessing clinical competence by rotating students through a variety of standardized patient (SP) scenarios or skills stations. There is at present no widely circulated gold-standard evaluation method for OSCE performance. Variability in psychometric properties, vague instructions for participants, inconsistency in SP responses, poorly defined outcomes, and a mismatch between the intent of the evaluation and the type of data collected are long-standing critiques of OSCEs. Directly observed simulation encounters are labor intensive and represent a significant strain on faculty time. Challenges associated with inter-rater reliability and outcomes can be minimized by adopting a standardized checklist. The checklist itself must be closely examined, however, as it can steer the faculty observer toward evaluating skills performance over clinical synthesis or decision making.

The purpose of this presentation is to describe how two programs collaborated to develop an evaluation procedure that provides a more complete perspective of APRN student performance in the OSCE. Faculty determined that three data points were required: faculty observation, student experience, and SP feedback. A standardized checklist rubric, tailored to each case and developmental year, was developed for use by faculty. The student experience component captured the essential information gathered by the student during the encounter. The final data point was the Essential Elements of Communication rubric completed by SPs following each encounter.

The triangulated approach had high inter-rater reliability and internal consistency. The project demonstrated that tailored rubrics, evaluation of student experience, and SP feedback are strongly associated with demonstration (or lack thereof) of clinical skills progression and provided a means of developing tailored goals and remediation plans for students who performed below expectations. At a programmatic level, students who were struggling clinically were identified much earlier, allowing for more intensive instruction and remediation. The observation form has brought uniformity to feedback and served as a positive training instrument regarding expectations of student performance. The OSCE evaluation method is flexible enough to support different stages of learning and types of assessment: formative, summative, and high stakes.