A tool to assess the clinical judgment ability of students was developed by means of a methodological study. Sixteen existing tools were identified and appraised. A thematic analysis was conducted on the nine instruments that met the inclusion criteria, and a draft questionnaire was developed. Review by an expert panel strengthened the content and face validity of the tool.
Ninety nursing students demonstrated their competence during a standardized patient simulation experience. Fifteen video recordings were randomly selected and given to 20 assessors. The assessors were trained in applying the tool and then used it to rate the competence of each of the 15 students captured in the recordings.
Cronbach's alpha coefficient, the intraclass correlation coefficient (ICC), and Kendall's coefficient of concordance (W) were used to determine the reliability of the developed assessment instrument. A Cronbach's alpha coefficient of .90 indicates good internal consistency and supports the reliability of the instrument. The ICC value of .85 indicates excellent inter-rater reliability across all raters and further contributes to the reliability of the instrument. However, the per-item W values were low, ranging from .04 to .40. The low W values were attributable to some raters assessing students inconsistently, to raters being unable to validate the students' reasoning, and to the large number of assessors (20) compared with other inter-rater studies, which typically use at most three assessors.
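For readers who wish to apply the same reliability measures to their own rating data, the sketch below (a minimal Python illustration using randomly generated data and illustrative variable names, not the study's data or analysis scripts) shows one common way to compute Cronbach's alpha, a single-measure ICC(2,1), and Kendall's W.

```python
# Illustrative sketch only: random data stand in for the study's ratings.
import numpy as np
from scipy.stats import rankdata


def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_subjects, k_items) score matrix."""
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


def icc_2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater,
    for an (n_subjects, k_raters) matrix (Shrout & Fleiss)."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)            # subject means
    col_means = ratings.mean(axis=0)            # rater means
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)   # between-subjects MS
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # between-raters MS
    sse = ((ratings - grand) ** 2).sum() - (n - 1) * msr - (k - 1) * msc
    mse = sse / ((n - 1) * (k - 1))                         # residual MS
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)


def kendalls_w(ratings: np.ndarray) -> float:
    """Kendall's W for an (m_raters, n_items) matrix (no tie correction)."""
    m, n = ratings.shape
    ranks = np.apply_along_axis(rankdata, 1, ratings)  # rank items per rater
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12 * s / (m ** 2 * (n ** 3 - n))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # e.g. 15 students scored by 20 raters, and 90 students on 10 tool items.
    scores = rng.integers(1, 5, size=(15, 20)).astype(float)
    items = rng.integers(1, 5, size=(90, 10)).astype(float)
    print("Cronbach's alpha:", round(cronbach_alpha(items), 2))
    print("ICC(2,1):", round(icc_2_1(scores), 2))
    print("Kendall's W:", round(kendalls_w(scores.T), 2))
```

With real data, the choice of ICC form (single vs. average measure, consistency vs. absolute agreement) should match the intended use of the scores; the single-measure, absolute-agreement form shown here is only one reasonable option.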
The value of the study is that nurse educators and preceptors can use the tool to determine a student's competence and identify that student's learning needs. It is recommended that the tool be evaluated in real-life practice and that exploratory and confirmatory factor analyses be conducted.