Paper
Sunday, November 13, 2005
This presentation is part of: Technology Improving Patient Care
Development and Testing of a Bedside PC Clinical Care Classification System (CCCS©) for Nursing Students Using Microsoft Access®
Veronica Feeg, PhD, RN, FAAN, College of Nursing and Health Science, George Mason University, Fairfax, VA, USA and Virginia Saba, EdD, Honorary, PhD, RN, FAAN, FACMI, LL, School of Nursing, Georgetown University, Washington, DC, USA.
Learning Objective #1: Discuss the utility of a mobile personal computer-based bedside documentation system to teach students the nursing process using the Clinical Care Classification System©
Learning Objective #2: Describe the costs and benefits of integrating nursing informatics into students' experiences with clinical care documentation

The study had two aims: to develop a Microsoft Access® software application of the CCCS (Clinical Care Classification System) to record care planning, and to evaluate nursing students' electronic charting using a randomized design on simulated bedside computers in the clinical lab. The CCCS nursing taxonomy was developed by Saba and colleagues (1991); the PC version was developed by the investigators in Microsoft Access® and mounted on laptop PCs at each bedside in the clinical lab.

Nursing students were invited to sign up for the evaluation project. Students who agreed to participate (n=60) were invited to two sessions. Students who returned for both sessions (n=15) each interviewed two simulated patient actors with assigned conditions (congestive heart failure and pneumonia) and were randomly assigned to record their encounters in one of two PC electronic care plan charting methods: the CCCS version or a text-only version. At the end of the simulation sessions, 14 students had completed the study (n=7 per group) and printed their care plans for evaluation (final n=28 care plans). Each encounter was timed from the end of the interview until the care plan was printed. Each student evaluated the electronic care planning system with a 7-item instrument developed for the study. The investigators assessed and scored all care plans using the Evaluation of Documentation Performance instrument, also developed for the study (coefficient alpha = .913). Each measure yielded a score used to compare differences in students' evaluations of the system, the length of time required to record the care plan following an interview with a simulated patient, and the quality of the students' care plans.

Evaluations of the student care plans showed significantly higher documentation performance scores for students who used the bedside laptop CCCS Microsoft Access® system than for those who used the text-based system.
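The reliability figure reported above (coefficient alpha = .913) is Cronbach's coefficient alpha for the Evaluation of Documentation Performance instrument. As a minimal sketch of how that internal-consistency statistic is computed, the following Python function applies the standard formula; the rating matrix shown is illustrative made-up data, not the study's actual scores.

```python
# Hedged sketch: Cronbach's coefficient alpha from a matrix of scores,
# where rows are respondents (or rated care plans) and columns are items.
# alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
from statistics import pvariance


def cronbach_alpha(scores):
    """Compute Cronbach's alpha; scores is a list of per-respondent item lists."""
    k = len(scores[0])                                # number of items
    item_vars = [pvariance(col) for col in zip(*scores)]   # variance per item
    total_var = pvariance([sum(row) for row in scores])    # variance of totals
    return k / (k - 1) * (1 - sum(item_vars) / total_var)


# Illustrative ratings: four respondents, three items (hypothetical data).
ratings = [
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 5],
    [2, 3, 2],
]
alpha = cronbach_alpha(ratings)
```

A value near .90 or above, such as the .913 reported for the study instrument, is conventionally taken to indicate high internal consistency among the items.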