Using Web-Based Video and Data Capture to Improve Student and Faculty Clinical Skills Evaluation

Saturday, March 28, 2020: 10:55 AM

Gretchel Gealogo Brown, PhD, RN, CMSRN
Braulio Amezaga, MA
School of Nursing, University of Texas Health Science Center San Antonio, San Antonio, TX, USA

Purpose:

Nursing students have limited opportunities to practice clinical competencies in patient-care settings, so skills are increasingly learned and mastered in simulated learning lab settings (Ross, 2015; Galbraith, 2004; Kolb, 2015). Faculty in nursing programs traditionally evaluate student performance in person using paper competency checklists. For faculty, this method limits performance review to the instructor's visual and written just-in-time evaluation of the skill. For students, the traditional check-off method limits self-reflection and provides no objective record of the evaluation. Current research indicates that video recording of clinical skills augments self-assessment, psychomotor performance, and discovery learning (Strand, Gulbrandsen, Slettebø, & Naden, 2016). However, electronic data capture has not been consistently incorporated into performance review of clinical skills video recordings. The purpose of this program evaluation project was to use web-based video and electronic data capture to improve student and faculty clinical skills evaluation.

Methods:

A cohort of 116 first-semester BSN students was rated on indwelling catheter insertion on male or female manikins in a simulated inpatient room. Students self-recorded the competency using a SWIVL robot that automatically uploaded the recordings to web-based cloud storage. Students then evaluated their own performance using a REDCap-based competency checklist, with the option to re-record their video based on their review. Faculty then accessed the student self-evaluations and videos, reviewed the videos, typed comments in the SWIVL video feature as needed, and completed the same REDCap competency checklist.

Results:

Seventy-seven students passed the competency on the first attempt. The four most frequently missed procedural steps were initial perineal assessment; aseptic perineal care; placing the patient bed in the lowest position post-procedure; and patient/family teaching related to catheter-associated urinary tract infection (CAUTI) prevention. Thirty students rated themselves as having passed the competency although faculty rated them as having failed. Analysis of faculty comments yielded themes (positioning the sterile field and breaking/crossing the sterile field) that supported the quantitative findings.

Conclusion:

Student feedback mirrored previous findings in the literature that self-recording was less stressful than traditional in-person skills evaluation (Cernusca, Thompson, & Riggins, 2018). Students also stated that reviewing their videos with faculty improved their understanding of the clinical competency. Faculty reported satisfaction with the flexibility of web-based skills evaluation and with the ability to use electronically captured data to identify trends and highlight opportunities for improvement. Next steps include validation sessions of the competency checklist, using a sample video of the procedure, to improve student and faculty inter-rater reliability. Feedback from these sessions will be used to refine the competency checklist. Customized user manuals for SWIVL and REDCap will be developed for faculty and student training sessions. Use of SWIVL and REDCap will be extended to learner/teacher evaluation of clinical competencies in other pre-licensure courses.
