
The Impact of Test Blueprint Transparency of Hematology Course on Students’ Evaluation and Final Grades

Corresponding Author:
Fahad AS Aleidan
College of Medicine, King Saud Bin
Abdulaziz University for Health Sciences
Riyadh, Saudi Arabia
E-mail: faleidan@gmail.com

Abstract

Background: A test blueprint, sometimes referred to as a table of specifications, is rarely used in the construction of traditional summative assessments in medical schools.

Objectives: To assess the impact of releasing test blueprints on students’ evaluation of the hematology block and on their final grades.

Methods: The test blueprints for the mid-block and final written examinations were released and explained in detail at the start of the hematology block by the block chief coordinator in 2016. Students from the 2016 cohort serve as the “after blueprint released” group, while students from the 2015 cohort serve as the “before blueprint released” group. The end-of-block evaluation of the mid-block and final examinations used a 5-point Likert scale.

Results: In the mid-block examination, there was a significant difference in the number (%) of students rating the examination as having comprehensive coverage of the content taught [119 (82.1) versus 83 (62.9), p<0.001] and as overall fair [120 (82.8) versus 90 (68.2), p=0.005] between the after and before blueprint released groups. Similarly, in the final examination, there was a significant difference in the number (%) of students rating the examination as having comprehensive coverage of the content taught [127 (87.6) versus 80 (60.6), p<0.001] and as overall fair [123 (84.8) versus 95 (72.0), p=0.009]. The mean (± SD) mark of the after blueprint released group was significantly higher than that of the before blueprint released group [88.7 (± 8.50) versus 82.95 (± 9.75), p<0.0001].

Conclusion: The release of the test blueprint at the beginning of the hematology block improved the students’ perception of the fairness and comprehensiveness of the final examination and also led to their higher mean level of performance.

Keywords

Evaluation; Test blueprint; Medical education; Student; Grade

Introduction

At the King Saud Bin Abdulaziz University for Health Sciences (KSAU-HS) College of Medicine (COM), the problem-based learning (PBL) model is adapted from the University of Sydney PBL model and implemented to suit the Saudi Arabian socio-cultural context. Hematology is the fourth block in the first year of basic sciences (phase II). The block runs for six weeks and comprises five cases (anemia, chronic lymphocytic leukemia, bleeding disorder, thalassemia, and thrombosis), each with specific learning objectives. Mid-block and end-of-block written examinations consisting of multiple-choice questions (MCQs) are used as the summative assessment that determines students’ final grades. Other assessment methods are used for other forms of instruction, including the Objective Structured Clinical Examination (OSCE) and the Objective Structured Practical Examination (OSPE) [1].

A mismatch between the content taught during a course and the material assessed at its end is frequently perceived. This lack of coherence produces an assessment that fails to provide evidence from which instructors can make valid judgements about students’ progress and grades [2]. The development and use of a constructed test blueprint is one strategy that can be adopted to mitigate this mismatch. A test blueprint helps instructors align the amount of class time spent delivering each learning objective with the cognitive level at which it is assessed, thereby allowing them to identify the types and numbers of items to include in their examinations [3]. However, the influence of test blueprint transparency on students’ evaluations and grades has not been extensively studied. The aim of this study was to assess the influence of releasing test blueprints on students’ evaluation and grades.
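To illustrate how a blueprint ties item counts to teaching time, the sketch below allocates MCQs to topics in proportion to the number of sessions devoted to each. It is a minimal illustration in Python, not the authors’ actual blueprinting procedure; the allocate_mcqs function and the session counts (loosely echoing the mid-block figures in Table 1) are assumptions introduced here.

```python
# A minimal sketch (not the authors' actual procedure) of how a blueprint
# might allocate MCQ counts in proportion to teaching time. The session
# counts below are illustrative, loosely based on Table 1.

def allocate_mcqs(sessions_per_topic, total_items):
    """Distribute total_items across topics proportionally to session counts,
    using largest-remainder rounding so the allocation sums to total_items."""
    total_sessions = sum(sessions_per_topic.values())
    raw = {t: total_items * n / total_sessions for t, n in sessions_per_topic.items()}
    alloc = {t: int(r) for t, r in raw.items()}
    # Hand out any remaining items to the topics with the largest remainders.
    leftovers = total_items - sum(alloc.values())
    for t in sorted(raw, key=lambda t: raw[t] - alloc[t], reverse=True)[:leftovers]:
        alloc[t] += 1
    return alloc

if __name__ == "__main__":
    # Hypothetical mid-block teaching load (lectures per discipline).
    sessions = {"Hematology": 7, "Immunology": 1, "Physiology": 1, "Oncology/pharmacology": 1}
    print(allocate_mcqs(sessions, total_items=30))  # e.g. {'Hematology': 21, ...}
```

Largest-remainder rounding keeps the allocation summing exactly to the intended number of items, which is what lets the blueprint fix both the proportions and the absolute item counts in advance.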

Methods

At the end of the block, a self-administered questionnaire was used to gather students’ feedback on and attitudes towards the instructional effectiveness, stimulation, subject relevance, and amount learned from the cases, as well as the quality and difficulty of the mid-block and final written examinations. Students rated items anonymously on a 5-point Likert-type scale, where 1 = poor and 5 = excellent. Space was also provided for open-ended comments in response to the questions “What did you like most about this block?” and “What would you like to be changed in this block?” The questionnaire was developed by the Department of Medical Education, and a group of medical educators assured its content validity. Reliability, measured using an alpha coefficient, was 0.914.
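For readers unfamiliar with the reliability coefficient quoted above, the following sketch shows one common way to compute Cronbach’s alpha from a students-by-items matrix of Likert responses. The cronbach_alpha function and the demo matrix are hypothetical and do not reproduce the study’s questionnaire data.

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for a (students x items) matrix of Likert responses:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                         # number of items
    item_var = responses.var(axis=0, ddof=1)       # variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_var.sum() / total_var)

# Hypothetical responses from 6 students to 4 five-point items (not study data).
demo = [[5, 4, 5, 4],
        [4, 4, 4, 3],
        [3, 3, 2, 3],
        [5, 5, 5, 5],
        [2, 2, 3, 2],
        [4, 3, 4, 4]]
print(round(cronbach_alpha(demo), 3))
```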

The test blueprints for the mid-block and final written examinations were released and explained in detail at the start of the hematology block by the block chief coordinator in 2016 (Tables 1 and 2). These students serve as the “after blueprint released” group, while students from the 2015 cohort serve as the “before blueprint released” group.

Discipline \ Delivery method | B and CS theme | CD | CDS/CS | Lecture | Learning objectives | PPD | PS
Hematology | 3 | - | 1 | 7 | - | - | -
Geriatric medicine | - | 1 | - | - | - | - | -
Immunology | - | - | - | 1 | - | - | -
Medicine | - | - | 2 | - | - | - | 1
Molecular medicine | 1 | - | - | - | - | - | -
Oncology/pharmacology | - | - | - | 1 | - | - | -
Physiology | - | - | - | 1 | - | - | -
PBL | - | - | - | - | 16 | - | -
Assessment methods | OSPE/MCQ | OSCE | OSCE | MCQ | PBL rating/MCQ | OSCE | OSCE
No of MCQs (%)* | 2 (5) | - | - | 30 (75) | 8 (20) | - | -

Table 1: The assessment blueprint for the mid-block written examination of hematology block.

Discipline \ Delivery method | B and CS theme | CD | CDS/CS | Lecture | Learning objectives | PPD | PS
EBM | - | - | - | 2 | - | - | -
Ethics | - | 1 | - | - | - | - | -
Hematology | 1 | - | - | 4 | - | - | -
Hematology, behavioural sciences | - | - | - | 1 | - | - | -
Hematology, public health | 1 | - | - | 1 | - | - | -
Hematology, surgery | - | - | - | 1 | - | - | -
Infectious diseases | - | - | - | 3 | - | - | -
Medicine | 1 | 1 | 3 | - | - | 1 | 1
Molecular medicine | - | - | - | 1 | - | - | -
Pharmacology | - | - | - | 3 | - | - | -
Pharmacology, Obs/Gyn | 1 | - | - | - | - | - | -
Paediatrics | - | - | - | - | - | 1 | -
PBL | - | - | - | - | 23 | - | -
Radiology | 1 | - | - | - | - | - | -
Assessment methods | OSPE/MCQ | OSCE | OSCE | MCQ | PBL rating/MCQ | OSCE | OSCE
No of MCQs (%)* | 2 (2.5) | - | - | 48 (60) | 18 (22.5) | - | -

Table 2: The assessment blueprint for the final written examination of hematology block.

At the end of the block, students were asked to evaluate the mid-block and final examinations on a 5-point Likert scale comprising the following items: comprehensive coverage of the content taught, quality of the MCQs, difficulty level of the examinations, and overall fairness of the examinations.

Data analysis

Students’ feedback data were entered into an Excel spreadsheet. Mean responses were calculated for all evaluation questions. Open-ended comments were analyzed qualitatively to explore their content and to compare and contrast the strengths and weaknesses of the block and its PBL cases in terms of relevance, stimulation, and amount of knowledge learned, as perceived by the students and, more importantly, the tutors. The end-of-block questionnaire responses for the mid-block and final examinations before and after the test blueprint was released were compared using the Z score test for two population proportions. Students’ grades, expressed as mean mark (± SD), were compared using the unpaired t-test. All tests were two-sided and a P value <0.05 was considered significant.

The Statistical Package for the Social Sciences (SPSS) software (USA, version 24) was used to perform the statistical analysis.
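The analyses were run in SPSS, but the two-proportion comparison can be reproduced from the published counts. The sketch below is a Python stand-in rather than the authors’ SPSS procedure; it applies a pooled-variance Z test to one of the comparisons later reported in Table 4.

```python
from math import sqrt
from scipy.stats import norm

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided Z test for two population proportions (pooled standard error)."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2 * norm.sf(abs(z))

# Figures from Table 4: 'comprehensive coverage' ratings for the mid-block exam,
# after release (119 of 145) versus before release (83 of 132).
z, p = two_proportion_ztest(119, 145, 83, 132)
print(f"z = {z:.2f}, p = {p:.4f}")   # p falls below 0.001, consistent with the reported value
```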

Results

Out of 298 students in the two cohorts, 277 answered and returned the questionnaires, an overall response rate of 93%. Before the blueprint was released, 132 out of 140 students answered and returned the questionnaires (response rate 94%). After the blueprint was released, 145 out of 158 students did so (response rate 92%). The high response rate supports the representativeness of the results presented in this study. Students’ end-of-hematology-block evaluations in terms of block organization, PBL cases, and performance of PBL members, before and after the test blueprint was released, are presented in Table 3.

Evaluated items | Before blueprint released (n=132) | After blueprint released (n=145)
Block organisation, mean
  Quality of the block content | 4.68 | 4.81
  Quality of the block clarity | 4.24 | 4.55
  Sequence of activities | 4.33 | 4.95
  Schedule maintenance | 4.41 | 4.40
  Block duration | 4.05 | 4.00
  Function of chief coordinator | 4.76 | 4.81
  Function of coordinator | 4.32 | 4.43
PBL ‘five cases’, mean
  Amount of knowledge learned | 3.96 | 4.11
  Relevance to KSA socio-culture | 4.25 | 4.25
  Stimulation of student thinking | 4.50 | 4.25
PBL members’ performance, mean
  Cooperation | 4.26 | 4.54
  Critical thinking | 4.24 | 4.34
  Participation | 4.15 | 4.39
  Function of tutor | 3.78 | 4.19
  Function of chairmen | 4.11 | 4.39
  Function of secretaries | 3.98 | 4.35

Table 3: Students’ end of hematology block evaluation before and after releasing blueprint.

In the mid-block examination, there was a significant difference in the number (%) of students rating the examination as having comprehensive coverage of the content taught [119 (82.1) versus 83 (62.9), p<0.001] and as overall fair [120 (82.8) versus 90 (68.2), p=0.005] between the after and before blueprint released groups. In the final examination, there was likewise a significant difference in the number (%) of students rating the examination as having comprehensive coverage of the content taught [127 (87.6) versus 80 (60.6), p<0.001] and as overall fair [123 (84.8) versus 95 (72.0), p=0.009] (Table 4).

Written exams | Before blueprint released (n=132) | After blueprint released (n=145) | P-value*
Mid-block exam, No. (%)
  Comprehensive coverage of the content taught | 83 (62.9) | 119 (82.1) | <0.001
  Quality of MCQ items rated ‘good’ | 92 (69.7) | 107 (73.8) | 0.447
  Level of difficulty rated ‘hard’ | 101 (76.5) | 110 (75.9) | 0.896
  Overall, a fair exam | 90 (68.2) | 120 (82.8) | 0.005
Final exam, No. (%)
  Comprehensive coverage of the content taught | 80 (60.6) | 127 (87.6) | <0.001
  Quality of MCQ items rated ‘good’ | 98 (74.2) | 111 (76.6) | 0.653
  Level of difficulty rated ‘hard’ | 108 (81.8) | 119 (82.1) | 0.96
  Overall, a fair exam | 95 (72.0) | 123 (84.8) | 0.009

Table 4: Students’ evaluation of the mid-block and final examinations before and after the blueprint was released.

The after blueprint released group achieved a higher average grade than the before blueprint released group (B+ versus B). The mean (± SD) mark of the after blueprint released group was significantly higher than that of the before blueprint released group [88.7 (± 8.50) versus 82.95 (± 9.75), p<0.0001] (Table 5).

Grades | Before blueprint released, n=140 (%) | After blueprint released, n=158 (%)
A+ | 6 (4.29) | 10 (6.33)
A | 18 (12.86) | 21 (13.29)
B+ | 20 (14.29) | 25 (15.82)
B | 27 (19.29) | 30 (18.99)
C+ | 34 (24.29) | 37 (23.42)
C | 19 (13.57) | 20 (12.66)
D+ | 10 (7.14) | 10 (6.33)
D | 5 (3.57) | 4 (2.53)
F | 1 (0.71) | 1 (0.63)
Average grade | B | B+
Average mark (± SD)* | 82.95 (± 9.75) | 88.70 (± 8.50)

Table 5: Students’ final grades before and after releasing hematology block blueprint.
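As a rough consistency check on the grade comparison, the summary statistics in Table 5 can be fed into a standard unpaired t-test. The sketch below uses SciPy rather than SPSS, and it assumes the group sizes equal the full cohorts (158 after release, 140 before), which the paper does not state explicitly.

```python
from scipy.stats import ttest_ind_from_stats

# Summary statistics from Table 5 (mean mark ± SD); group sizes are assumed to be
# the full cohorts (158 after release, 140 before).
t, p = ttest_ind_from_stats(mean1=88.70, std1=8.50, nobs1=158,
                            mean2=82.95, std2=9.75, nobs2=140,
                            equal_var=True)
print(f"t = {t:.2f}, p = {p:.2e}")   # p comes out well below 0.0001, as reported
```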

Discussion

This study has shown that releasing the test blueprint at the beginning of the block, together with an in-depth discussion of the course and its objectives with the course director, had a positive impact on how students perceived the fairness of the mid- and end-block examinations. It also positively affected students’ perception of the comprehensiveness of the material taught and learned in the block. The release group also performed better on the mid- and end-block examinations than the pre-release cohort of students.

A perceived weakness of the study is the lack of a true control group other than the cohort of students who took the block in the year before the test blueprint was released, although little had changed in the block between the two years [4].

Another difference from a previous study at the University of Calgary, in which test blueprints were made available to students, is that the hematology block test blueprint listed only the types of instruction from which the test questions would be drawn, rather than a more precise listing of the actual subject matter of each question [5]. Nevertheless, the students in the release group not only felt more positive towards the mid- and end-block examinations, but their mean level of performance was also higher than that of the pre-release cohort, as noted above.

A well-constructed test blueprint lends content validity to the evaluation process [6]. In addition to this psychometric benefit, a well-constructed blueprint has practical advantages for everyone involved in the educational experience, including the course chair, the evaluation coordinator, and course instructors. While these advantages are generally accepted, the practice of releasing test blueprints to students is less well accepted. The argument against providing students with a test blueprint on the grounds that it may make the evaluation easier for them appears to be unfounded; another argument, however, is that it may drive students towards ‘strategic learning’ [7], a consequence of which is the risk that their knowledge base becomes less ‘rounded’.

The limitations of our study include its design, which compares the current hematology block with the previous year’s block and may therefore introduce selection bias. As mentioned above, blueprint release may also drive students towards strategic learning. A further limitation is the lack of previous experience with blueprint release in the curriculum employed at our college of medicine.

Conclusion

The release of the test blueprint at the beginning of the hematology block improved the students’ perception of the fairness and comprehensiveness of the final examination and also led to their higher mean level of performance.

References