Nursing Student's Evaluation of Objective Structured Clinical Examination at Taibah University, Saudi Arabia
- Corresponding Author:
- Garcia Base Paul Reinald
Assistant Professor, College of Nursing
Taibah University, Medina 42353, Saudi Arabia
Tel: 966507605230
E-mail: aljohani.khalid@gmail.com
Abstract
The aim of this study was to assess current OSCE applications. This cross-sectional study was undertaken at the College of Nursing, Taibah University. The main outcome measures were students’ views of examination attributes, quality of performance testing, validity and reliability, and the usefulness of the OSCE compared with other assessment formats. Although students accepted the OSCE with respect to comprehensiveness, clarity of instructions and sequence, fairness, practicality and usefulness, they also found it a strongly anxiety-provoking and intimidating experience. Further evaluation of OSCE implementation and continuous quality-improvement initiatives should be undertaken to address students’ concerns and improve OSCE application.
Keywords
OSCE, Clinical assessment, Performance testing, Assessment format
Introduction
The assessment of students’ clinical competence is of great importance and necessitates several measures for evaluating student performance. The objective structured clinical examination (OSCE) is one such approach, in which components of clinical competence reflected in realistic clinical scenarios are evaluated in a comprehensive, consistent and structured manner in accordance with the learning outcomes specified in the curriculum. Although there are various methods for evaluating students’ competencies, the OSCE has been recognized as the most reliable method for the evaluation of clinical skills [1]. Its reliability as an effective tool for assessing nursing students’ clinical skills has been supported by numerous studies [2-4], and it has been widely used in the assessment of students’ clinical performance [5,6].
The stressful nature of OSCEs and the potential for unsuccessful outcomes can have negative consequences for students; better planning in conducting the OSCE is therefore critically important. On the other hand, the various advantages of the OSCE, such as improving students’ clinical performance, preparing highly qualified and competent graduates, increasing decision-making capabilities and enhancing teaching, make its acceptance, effectiveness, comprehensiveness and accuracy an essential concern for all nursing faculty and clinical instructors [7]. In this vein, this educational approach requires a robust design based on sound pedagogy to assure the practice and assessment of holistic nursing care [8].
The College of Nursing at Taibah University introduced the OSCE in 2014 as part of all clinical courses. Generally, OSCEs are undertaken as students’ midterm and final examinations in the practical part of nursing courses and are completed before the theoretical final examination. The number of stations and their content vary among courses. Although the OSCE is instituted in the university, there is no evidence that current OSCE practices at the College of Nursing are meeting students’ clinical competency needs. In this regard, students’ insights are important for understanding possible lapses in the development, organization and actual conduct of the OSCE, which in turn would provide guidelines for faculty development initiatives and curricular reform [9]. Moreover, student feedback on these attributes is one of the key elements for enhancing the OSCE as an evaluation tool. The aim of this study was to assess current OSCE applications from the students’ perspective.
Materials and Methods
▪ Design
This study used a cross-sectional design to capture students’ evaluation of OSCE attributes and of the quality of performance testing.
▪ Study sample
The respondents were 3rd- and 4th-year students in the Bachelor of Science in Nursing program enrolled during the 2017-2018 school year in the College of Nursing at Taibah University, Kingdom of Saudi Arabia. These students were enrolled in courses that employed the OSCE in their practical examinations. Total enumeration was used to ensure representativeness of the sample.
▪ Instrument
An adapted questionnaire from Pierre et al. [10] was the primary tool used to gather the relevant data. Only minimal amendment of the original questionnaire was undertaken. However, since it was to be used with participants whose culture and language differed from those for whom it was originally developed, the researchers translated its content into Arabic. Early face validity was established by administering the instrument to seventeen (17) students. The reliability of the instrument, tested using Cronbach’s alpha, was 0.88 for OSCE evaluation and 0.70 for quality of performance testing, showing good and acceptable reliability, respectively.
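For readers unfamiliar with the statistic, the internal-consistency check reported above can be sketched as follows. This is a minimal illustration of how Cronbach’s alpha is computed from an item-response matrix, not the authors’ actual SPSS procedure, and the response matrix shown is hypothetical.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) response matrix:
    (k / (k - 1)) * (1 - sum of item variances / variance of total scores)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of each respondent's total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert-type responses (rows = respondents, columns = items)
responses = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```

Values around 0.7 are conventionally read as acceptable and around 0.9 as good, which matches how the 0.70 and 0.88 coefficients are interpreted in the text.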
▪ Ethical consideration
The standard ethical guidelines of the Declaration of Helsinki (as revised in Brazil in 2013) were followed. The study was approved by the Ethical Review Board of Taibah University. The study’s aim and objectives were explained to all students preparing for their OSCE. The questionnaire was distributed to all nursing students before each OSCE, and students were asked to drop their responses into a collection box if they wished to take part. Returning the questionnaire to the collection box was considered consent to participate. No personal or otherwise identifiable data were collected. Participation was voluntary, and students were assured that declining involvement in the survey would not be penalized. Data gathering was carried out from September 2017 to January 2018.
▪ Inclusion and exclusion criteria
The inclusion criteria were that respondents (1) had an OSCE in their courses; (2) were willing to participate; and (3) were 3rd- or 4th-year nursing students. Respondents whose courses did not include an OSCE were excluded.
▪ Data analyses
Data analysis was performed using SPSS version 23. The demographic profile was summarized using frequency counts and percentages. The same statistics were used to describe the students’ evaluation of OSCE attributes, validity and reliability, and quality of performance testing. To examine differences in students’ responses when grouped according to profile variables, the F-test (one-way ANOVA) was used.
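The group-comparison step can be illustrated with a hand-computed one-way ANOVA F statistic. This is a generic sketch of the test the authors ran in SPSS, using invented group scores, not the study’s data.

```python
import numpy as np

def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: between-group mean square
    divided by within-group mean square."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    all_data = np.concatenate(groups)
    grand_mean = all_data.mean()
    k = len(groups)            # number of groups
    n = all_data.size          # total observations
    ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical mean OSCE-evaluation scores for two programs (e.g. NBP vs RNP)
nbp = [3.2, 3.8, 4.1, 2.9, 3.5]
rnp = [4.0, 4.4, 3.9, 4.2, 4.6]
print(f"F = {one_way_anova_f(nbp, rnp):.3f}")
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that between-group variation outweighs within-group variation, which is how the significant program and education-level differences in Table 3 should be read.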
Results
Most of the respondents were female (65.6%) (Table 1). The majority of participants were in the Nursing Bridging Program (NBP; 56.4%). Mean age and GPA were 26.5 (SD ± 5.6) and 4.26 (SD ± 0.45), respectively. Mean years of experience for NBP students was 5.8 (SD ± 5.2). In terms of educational level, most participants were in level 4 (49.3%), level 5 (27.8%) or level 8 (14.3%). In addition, the study included student nurses who took the OSCE in the clinical midterm and final examinations of the eight (8) professional nursing courses offered in the first semester of the 2017-2018 school year.
 | F | (%) |
---|---|---|
Gender | ||
Male | 172 | (34.4) |
Female | 328 | (65.6) |
Program | ||
NBP | 282 | (56.4) |
RNP | 218 | (43.6) |
Course | ||
Fundamentals of Nursing | 24 | (4.8) |
Health Assessment | 21 | (4.2) |
Medical Surgical 2 | 71 | (14.3) |
Medical Surgical 3 | 18 | (3.6) |
Child Bearing Family | 19 | (18.1) |
Critical Care Nursing | 145 | (29.2) |
Emergency Nursing | 37 | (7.5) |
Geriatric Nursing | 89 | (17.9) |
Table 1: Respondent profiles.
▪ OSCE evaluation and quality of performance testing
The majority of respondents agreed that the OSCE was comprehensive, covering a wide area of knowledge (61.2%) and a broad range of clinical skills (56.6%), yet only 37.4% believed that the examination stations permitted them to compensate for some areas of weakness (Table 2). Further, fewer than half (41.4%) agreed that the assessment helped them identify weaknesses and gaps in their clinical skill sets. Concerning the quality of performance testing, only about half of the participants felt well oriented to the nature of the exam and agreed that the required tasks accorded with those taught in clinical teaching (53.2% and 52.8%, respectively). A similar proportion agreed that the skills they were required to demonstrate were fair and consistent with the actual course curriculum (53.4%).
Item | Agree | Neutral | Disagree | |||
---|---|---|---|---|---|---|
F | % | F | % | F | % | |
Exam was fair | 277 | 55.4 | 161 | 32.2 | 62 | 12.4 |
Wide knowledge area covered | 306 | 61.2 | 139 | 27.8 | 54 | 10.8 |
Needed more time at stations | 285 | 57.0 | 125 | 25.0 | 90 | 18.0 |
Exams well administered | 270 | 54.0 | 179 | 35.8 | 51 | 10.2 |
Exams very stressful | 323 | 64.6 | 125 | 25.0 | 51 | 10.2 |
Exams well-structured and sequenced | 202 | 40.4 | 176 | 35.2 | 122 | 24.4 |
Exams minimized chance of failing | 196 | 39.2 | 230 | 46.0 | 74 | 14.8 |
OSCE less stressful than other exams | 208 | 41.6 | 160 | 32.0 | 132 | 26.4 |
Allowed students to compensate in some areas | 187 | 37.4 | 143 | 28.6 | 170 | 34.0 |
Highlighted areas of weakness | 207 | 41.4 | 184 | 36.8 | 109 | 21.8 |
Exams were intimidating | 261 | 52.2 | 147 | 29.4 | 92 | 18.4 |
Students were aware of level of information needed | 239 | 47.8 | 165 | 33.0 | 96 | 19.2 |
Wide range of clinical skills covered. | 283 | 56.6 | 162 | 32.4 | 55 | 11.0 |
Fully aware of the nature of exam | 266 | 53.2 | 183 | 36.6 | 51 | 10.2 |
Tasks reflected those taught | 264 | 52.8 | 163 | 32.6 | 72 | 14.4 |
Time at each station was adequate | 185 | 37.0 | 169 | 33.8 | 146 | 29.2 |
Setting and context at each station felt authentic | 197 | 39.4 | 181 | 36.2 | 122 | 24.4 |
Instructions were clear and unambiguous | 289 | 57.8 | 162 | 32.4 | 49 | 9.8 |
Tasks asked to perform were fair | 267 | 53.4 | 183 | 36.6 | 50 | 10.0 |
Sequence of stations logical and appropriate | 279 | 55.8 | 166 | 33.2 | 55 | 11.0 |
Exams provided opportunities to learn | 244 | 48.8 | 170 | 34.0 | 86 | 17.2 |
Table 2: Nursing students’ evaluation of attributes of the OSCE.
Regarding OSCE validity and reliability, 52.2% of the student nurses believed that it was a practical and useful experience. However, many were uncertain whether their scores were an accurate reflection of their essential clinical nursing skills (45.8%). A similarly modest percentage agreed that the marking criteria were standardized (46%). Nevertheless, almost two-thirds felt that personality, ethnicity and gender did not affect OSCE scores. With respect to preferred assessment formats, the majority of student nurses perceived the MCQ and clerkship as the easiest (57% and 21%, respectively). Most respondents agreed that they learned from clerkship (34.4%) and MCQs (32.4%). Clerkship (33.2%) and MCQs (32.8%) were the preferred formats for the clinical years, with a smaller proportion agreeing that it should be the OSCE (23.8%).
▪ Difference of responses when grouped according to profile variables
Table 3 presents the comparison of respondents’ scores on OSCE evaluation, perceived validity and reliability, and quality of performance testing. There was a significant difference in OSCE evaluation when respondents were grouped according to education level (p < 0.001) and program (p = 0.001): students in level 5 and RNP students rated the OSCE higher than NBP students did. Similarly, students’ perceptions of validity and reliability varied by education level and program, observed among level 4 and RNP students. The computed r-values relating age, GPA and experience to the outcome scales indicated negligible to moderate correlations; however, only experience (p = 0.029) was significantly related to OSCE evaluation, and age (p = 0.001) and experience (p = 0.008) to perceived validity and reliability. Thus, the less the experience and the younger the age, the more positively students assessed the OSCE. The regression model that was run was not statistically significant (p = 0.190).
OSCE Evaluation | OSCE validity and reliability | Quality of performance testing | ||||
---|---|---|---|---|---|---|
F/r- value | p-value | F/r- value | p-value | F/r- value | p-value | |
Gender | 0.533 | 0.594 | 1.655 | 0.099 | 0.064 | 0.949 |
Education level | 4.694 | 0.000 | 4.235 | 0.000 | 3.704 | 0.001 |
Program | 3.300 | 0.001 | 4.293 | 0.000 | 0.697 | 0.486 |
Workplace | 0.706 | 0.494 | 2.464 | 0.086 | 0.042 | 0.959 |
Age | -0.078 | 0.105 | -0.154 | 0.001 | 0.048 | 0.322 |
GPA | 0.042 | 0.459 | -0.058 | 0.311 | 0.074 | 0.194 |
Experience | -0.105* | 0.029 | -0.126 | 0.008 | 0.043 | 0.366 |
Table 3: Difference of student nurses’ responses when grouped according to profile variables.
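The negative r-values for age and experience in Table 3 can be read as in the following sketch, which computes Pearson’s r for a pair of hypothetical variables; the numbers are invented for illustration and are not the study’s data.

```python
import numpy as np

# Hypothetical paired observations: years of experience vs. a mean
# OSCE-evaluation score per respondent.
experience = np.array([1.0, 3.0, 5.0, 8.0, 12.0])
osce_score = np.array([4.5, 4.2, 4.0, 3.6, 3.1])

# Pearson's r from the 2x2 correlation matrix
r = np.corrcoef(experience, osce_score)[0, 1]
print(f"r = {r:.3f}")
```

A negative r here means higher-experience respondents gave lower evaluations, matching the paper’s interpretation that less experienced and younger students assessed the OSCE more positively.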
Discussion
Participants in the study perceived that the OSCE, as an approach to assessing clinical practice, had acceptable construct validity. This was evidenced by affirmative responses regarding the fairness, comprehensiveness, authenticity and objectivity of the assessment process. These findings are similar to those of past studies of OSCEs in the medical literature [11,12]. The respondents, however, conveyed concerns about whether the exam minimized the chance of failing and whether the results precisely reflected their clinical skills. Since the OSCE is relatively new to the nursing college, these concerns are important to address in future OSCE preparation workshops.
Many student nurses felt that the assessment was a stressful and intimidating experience. Students’ perceptions of gaps in the structuring and sequencing of stations, their awareness of the level of information needed and the time allowed to complete skills at each station may have intensified their anxiety. These indicators point to the need to re-examine the examination logistics, particularly the sections describing the roles and responsibilities of the OSCE committee; the orientation and debriefing of examinees; the preparation of exam materials, including the station profile, the opening statement (scenario or preface) and the candidate instructions; and the recommended standards and guidelines for delivering the exam, including the time allowed to complete skills at each station [13]. The current study accords with earlier international studies revealing that the OSCE is a powerful anxiety-provoking experience and that the degree of anxiety lessens as learners advance through the examination [14-16]. Further, anxiety and lack of confidence were related to insufficient preparation for the assessment, which may have influenced perceptions of the exam, especially for examinees who answered the questionnaire after their OSCE [15]. Therefore, stress-management and fatigue-relieving techniques should be included in the OSCE preparation workshop.
In terms of the quality of performance testing, almost half of the participants were convinced that the tasks at each station were fair and consistent with the actual curriculum. A possible explanation for those who did not consider the OSCE to reflect the real curriculum is that not all students have the opportunity to practice new skills in the nursing lab, so the accumulation of required skills results in a skills gap. Nevertheless, these parameters can serve as cornerstones for measures to enhance the assessment tool. In this regard, a test matrix, which the university has applied only to theoretical exams, can be an essential activity in OSCE preparation to ensure that the examination is maximally valid and based on a blueprint [17]. In addition, students’ skills logbooks should be examined by the OSCE panel before the exam date to ensure that all students have practiced their acquired skills.
Scores for other indicators, such as the authenticity of each station in reflecting real clinical settings or life situations and the adequacy of the time allocated to complete each task, were considerably low. Gormley et al. [18] described techniques and principles that the OSCE team can use to enhance the realism of the assessment, such as standardized patients (SPs), video clips of real patients to recreate clinical scenarios, hybrid simulations using a wide range of manikin devices, modification of the environment surrounding the SP or manikin, and moulage kits. These approaches can help tailor a more realistic clinical experience. Conversely, owing to the large number of students per course, the length of each station in the college’s OSCE was 3-5 minutes. This differs from the Saudi Commission for Health Specialties (SCFHS) guideline of 5-10 minutes per station when examining undergraduate students [13]. Furthermore, Khan et al. [19] asserted that an appropriate and realistic time allocation for the task at each station, generally between 5 and 10 minutes, improves test validity. Correspondingly, Humayun and Haider [20] propounded that reliability and validity are both influenced by the number of stations and the total length of the examination. Therefore, examination length is one of the critical indicators of the OSCE that should be augmented.
The findings concerning participants’ ratings of the various assessment formats to which they had been exposed indicated that MCQs and clerkship were their preferred types of clinical examination. Interestingly, these modes of examination received considerably high scores on the indicators describing an acceptable level of difficulty, fairness and opportunities to learn. Moreover, the majority asserted that MCQs and clerkship should be employed more often in the clinical years of the program than other assessment methods. These results are consistent with earlier studies in which students perceived MCQs as the easiest and fairest assessment format [15]. Similarly, student participants in the current study obtained higher grades in the clinical area than in theoretical examinations. This may be attributed to their preference, since it was easier for them to obtain scores through these test methods than through essays/SAQs and the OSCE. This context warrants scientific inquiry to determine the gaps or differences in outcome among these assessment methods.
It was observed in this study that RNP students rated the OSCE more highly, and that younger students assessed it more positively. This implies that the previous experiences of NBP students, in both academic institutions and hospitals, may have influenced their perceptions of the college’s OSCE. In this context, it is worth asserting the urgency of aligning the college’s OSCE with the evidence-based practices of local and international higher academic institutions and university hospitals.
Conclusion
This study found that the OSCE was accepted by the majority of the nursing students, who acknowledged its positive impact on their learning achievements. On the other hand, a considerable proportion disputed aspects of its implementation. In this regard, the OSCE for the Bachelor program of the College of Nursing has been challenging, yet the student participants’ acceptance and participation were motivating. Their feedback was considered by the faculty as a key indicator of successful implementation of the assessment tool and also provided a basis for enhancement measures.
Recommendation
The study recommends: continuous implementation of the OSCE evaluation process; aligning the college’s OSCE with the evidence-based practices of local and international higher academic institutions and university hospitals; and promoting students’ awareness of OSCE examinations. Further, quality-improvement initiatives should be established to address students’ concerns and enhance the application of the OSCE within the Nursing College’s evaluation system.
Conflict of Interest
The authors have no conflict of interest to declare.
References
- Hosseini SA, Fatehi N, Eslamian J, et al. Reviewing the nursing students’ views toward OSCE test. Ira. J. Nursing. Midwi. Res 16(4), 318 (2011).
- Johnston AN, Weeks B, Shuker MA, et al. Nursing students' perceptions of the objective structured clinical examination: An integrative review. Clin. Simu. Nursing 13(3), 127-142 (2017).
- Massey D, Byrne J, Higgins N, et al. Enhancing OSCE preparedness with video exemplars in undergraduate nursing students. A mixed method study. Nurse. Edu. Today 54(1), 56-61 (2014).
- Mompoint-Williams D, Brooks A, Lee L, et al. Using high-fidelity simulation to prepare advanced practice nursing students. Clin. Simu. Nursing 10(1), e5-e10 (2014).
- Aronowitz T, Aronowitz S, Mardin-Small J, et al. Using Objective Structured Clinical Examination (OSCE) as education in advanced practice registered nursing education. J. Profes. Nursing 33(2), 119-125 (2017).
- El-Nemer A, Kandeel N. Using OSCE as an assessment tool for clinical skills: nursing students' feedback. Aus. J. Basic. App. Sci 3(3), 2465-2472 (2009).
- Eldarir SA, Nagwa A, Hamid A. Objective Structured Clinical Evaluation (OSCE) versus traditional clinical students achievement at maternity nursing: A comparative approach. IOSR. J. Dental. Med. Sci 4(3), 63-68 (2013).
- Kelly MA, Mitchell ML, Henderson A, et al. OSCE best practice guidelines—applicability for nursing simulations. Adv. Simul 1(1), 10 (2016).
- Siddiqui FG. Final year MBBS students' perception for observed structured clinical examination. J. Coll. Physicians. Surg. Pak 23(1), 20-24 (2013).
- Pierre RB, Wierenga A, Barton M, et al. Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica. BMC. Med. Edu 4(1), 22 (2004).
- Alaidarous S, Mohamed TA, Masuadi E, et al. Saudi Internal Medicine Residents' Perceptions of the Objective Structured Clinical Examination as a Formative Assessment Tool. Health. Profes. Edu 2(2), 121-129 (2016).
- Selim AA, Ramadan FH, El-Gueneidy MM, et al. Using Objective Structured Clinical Examination (OSCE) in undergraduate psychiatric nursing education: Is it reliable and valid? Nurse. Edu. Today 32(3), 283-288 (2012).
- Ware J, Abdelmoniem EL, Hamza A, et al. Objective Structured Clinical Exam Manual (2014).
- Al-Zeftawy AM, Khaton SE. Student evaluation of an OSCE in community health nursing clinical course at faculty of nursing, Tanta University. IOSR. J. Nurs. Health. Sci. (IOSR-JNHS) 5(4), 68-76 (2016).
- Al Nazzawi AA. Dental students' perception of the Objective Structured Clinical Examination (OSCE): The Taibah University experience, Almadinah Almunawwarah, KSA. J. Taibah. Univ. Med. Sci 13(1), 64-69 (2018).
- Nasir AA, Yusuf AS, Abdur-Rahman LO, et al. Medical Students’ Perception of Objective Structured Clinical Examination: A Feedback for Process Improvement. J. Surg. Edu 71(5), 701-706 (2014).
- Mema B, Park YS, Kotsakis A. Validity and feasibility evidence of objective structured clinical examination to assess competencies of pediatric critical care trainees. Crit. Care Med 44(5), 948-953 (2016).
- Gormley G, Sterling M, Menary A, et al. Keeping it real! Enhancing realism in standardised patient OSCE stations. Clini. Teacher 9(6), 382-386 (2012).
- Khan KZ, Ramachandran S, Gaunt K, et al. The objective structured clinical examination (OSCE): AMEE guide no. 81. Part I: An historical and theoretical perspective. Med. Teacher 35(9), e1437-e1446 (2013).
- Humayun M, Haider I. Objective structured clinical examination (OSCE): Still needs improvement. J. Med. Sci 24(3), 113-113 (2016).