Using Competency Testing to Close the Practice Gap With Undergraduate Baccalaureate Nursing Students

Saturday, 29 July 2017: 9:30 AM

Patricia A. Sharpnack, DNP, RN, CNE, NEA-BC, ANEF
Kimberly Dillon-Bleich, MSN, RN
Lauren Patton, MSN, RN, CCRN, CHSE
Breen School of Nursing, Ursuline College, Pepper Pike, OH, USA

Background: Nurse educators are continually exploring innovative methods to evaluate the transfer of course content to clinical application and competence. Assessment of clinical competence is a critical requisite of professional nursing education, yet research has shown that new graduates are not prepared for the transition to practice. Systematic use of competency testing throughout the curriculum can facilitate the development of clinical decision-making skills in undergraduate BSN students (Salem, Ramadan, El-Guenidy & Gaifer, 2012). Attempts to unify the definition of competency and to link competency testing to safe practice have been initiated in Canada, the United Kingdom, and Australia; however, evidence of this work has not been broadly presented in the United States (McWilliam & Botwinski, 2012). The objective structured clinical examination (OSCE) is defined as “an approach to the assessment of clinical competence in which the components of competence are assessed in a well-planned or structured way with attention being paid to objectivity” (Najjar, Docherty & Miehl, 2016). An OSCE requires each student to demonstrate particular skills and behaviors in a simulated situation or with standardized patients.

Purpose: To explore the use of competency testing through objective structured clinical examinations (OSCEs) in facilitating the transition to professional nursing practice.

Conceptual Framework: Ericsson’s Theory of Deliberate Practice was used to frame the study. The design aligns with the statewide nurse competency model (USA).

Method: Exploratory study. Nurse educators designed a series of OSCEs as a final-semester summative assessment for students in a baccalaureate nursing program. Clinical practice partners provided input and guidance on station design and evaluated testing criteria and processes. Station design was aligned with the state action coalition's nurse competency model, which was developed by nurse leaders in education and practice. Clinical scenarios were included that gave students an opportunity to demonstrate competency in patient management skills and in the identification of quality and safety concerns. Students were required to make clinical judgments based on assessments, initiate interventions, and demonstrate a professional, therapeutic relationship with the patient and/or family. Faculty evaluated each student's achievement of competencies using an objective evaluation tool; inter-rater reliability was maintained through the consistent use of trained evaluators and the use of Panopto technology to record all stations. Student demographics, competency scores, participant feedback, and NCLEX-RN results from more than 65 students were obtained.

Results: Chi-square tests were performed to assess the relationship between the competency testing stations, the ATI Pharmacology standardized assessment, and NCLEX-RN passage. Results were significant for the ATI standardized assessment, χ²(1, N = 65) = 6.08, p < .05; the clinical decision-making competency station, χ²(1, N = 65) = 4.4, p < .05; and the quality and safety station, χ²(1, N = 65) = 4.69, p < .05. No significant relationship was found for the delegation, patient assessment, or medication administration stations. Student and faculty feedback indicated that the OSCE effectively and fairly evaluated clinical competencies and judgment skills. Students suggested that OSCEs be integrated early in the curriculum to reduce stress levels and to promote greater accountability for best practice and the maintenance of clinical competency. The lack of a reliable and valid tool for competency assessment was a limitation of the project.
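For readers unfamiliar with the statistic, the following sketch illustrates how a chi-square test of independence with df = 1 and N = 65 (for example, competency station pass/fail versus NCLEX-RN pass/fail) might be computed. The contingency counts and variable names are hypothetical and are not drawn from the study data.

    # Illustrative sketch only; the counts below are hypothetical, not study data.
    from scipy.stats import chi2_contingency

    # Hypothetical 2x2 table: rows = competency station result (pass, fail),
    # columns = NCLEX-RN result (pass, fail); counts sum to N = 65.
    observed = [[50, 5],
                [6, 4]]

    chi2, p, dof, expected = chi2_contingency(observed, correction=False)
    print(f"chi-square({dof}, N = 65) = {chi2:.2f}, p = {p:.3f}")

With a table of this shape, the test evaluates whether station performance and NCLEX-RN outcome are independent; a small p-value indicates an association between the two.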

Conclusion: The association between preparedness for practice and competency development has implications for nursing. Including competency testing throughout the curriculum, specifically testing that requires clinical decision-making, is vital for a safe transition to practice. The use of OSCEs at key points in the educational process can assist in evaluating student performance, identifying the need for remediation before graduation, and preparing students for the transition to practice. Panopto video recordings of student testing provided opportunities for student reflection and self-assessment. Evidence-based strategies that promote the use of competency testing and the integration of technology are essential for the transfer of knowledge into professional practice. Further research to evaluate student outcomes and to develop a valid and reliable assessment tool is essential to this process.