Computer Based Assessment (CBA): Perception of residents at Dow University of Health Sciences
Masood Jawaid,1 Foad Ali Moosa,2 Farhat Jaleel,3 and Junaid Ashraf4
1Dr. Masood Jawaid, MCPS, MRCS, FCPS, Assistant Professor Surgery, Incharge e-Learning at Dow University of Health Sciences, Dow University Hospital and Dow International Medical College, Dow University of Health Sciences, Karachi, Pakistan.
2Prof. Foad Ali Moosa, FRCS, Professor of Surgery, Dow University Hospital and Dow International Medical College, Dow University of Health Sciences, Karachi, Pakistan.
3Dr. Farhat Jaleel, FCPS, Associate Professor of Surgery, Dow University Hospital and Dow International Medical College, Dow University of Health Sciences, Karachi, Pakistan.
4Prof. Junaid Ashraf, FRCS, Chairman Postgraduate Committee, Principal and Professor of Neurosurgery, Dow Medical College, Dow University of Health Sciences, Karachi, Pakistan.
Correspondence: Dr. Masood Jawaid, Department of Surgery, Dow International Medical College, Karachi, Pakistan. E-mail: email@example.com
Received 2014 Apr 12; Revised 2014 May 15; Accepted 2014 May 28.
Pak J Med Sci. 2014 Jul-Aug; 30(4): 688–691.
Background and objective: During the past few years, computer-based assessment (CBA) has gained popularity as a testing modality. This modality offers several advantages over paper-based assessment (PBA). The objective of this study was to find out residents' perceptions of this method of assessment.
Methods: The postgraduate residents of Dow University of Health Sciences in the fields of Surgery, Medicine, and Gynecology and Obstetrics experienced their first formative computer-based assessment (CBA) in 2013. Immediately after the formative CBA, an anonymous paper-based questionnaire was distributed among the residents, seeking their self-perceived computer competence before starting residency, their perceptions of the CBA method, and their preference for PBA or CBA in future assessments.
Results: A total of 173 residents completed the questionnaire. More than half of the residents (56.1%) had no prior experience of CBA. Three-fourths (76.4%) were less than confident before sitting the CBA, while after completing it, 64.8% were confident or extremely confident. The most common problem encountered was logging in (28.9%). More residents (53.2%) believed that paper-based assessment took longer to complete than CBA. A majority (61.8%) rated CBA as better than PBA despite experiencing it for the first time.
Conclusion: Residents' perception of CBA is good, and they recommend its use in future assessments as well. However, to take maximal advantage of this technology, faculty should be trained to develop questions not only with text and pictures but also with audio and video support.
Key Words: Computer based assessment, Formative assessment, e-assessment
During the past few years, computer-based assessment (CBA) has gained popularity as an assessment modality.1 Tests can be taken independent of time and physical place, and a unique test with the same difficulty index can be generated for each individual candidate. Tests can be scored immediately, providing instant, individualized, and detailed feedback for each candidate. Another major advantage of CBA is that it can make use of multimedia materials (video, audio) in test items,2 thus increasing the validity of the test. With this modality, the individual performance of candidates can be compared against group performance. All these features make CBA highly suitable for use in official high-stakes examinations.3
However, there are also disadvantages, among which are the very high initial cost of a CBA setup and the fact that this mode is not suitable for every type of assessment (for example, extended-response questions or performance assessment). Other disadvantages are computer glitches, content errors, computer and server crashes, and data-security lapses.4 Some students are used to taking notes on the question paper and are in the habit of marking questions and/or answers for later review, which this system does not support well. Some students also read more quickly and easily on paper than on a glaring computer screen.
From the students' perspective, perceptions of CBA have varied. One study reported that more students anticipated problems with CBA than actually experienced them.5 Before completing the assessment, fewer students were confident about CBA; however, more students stated a preference for CBA after their first experience.3 Studies have shown a trend of preference for CBA over PBA.6 Some studies reported the main disadvantage as increased anxiety among those inexperienced with computers,3,5 who were regarded as "technophobic".
However, the transition to CBA is neither easy nor cheap. Dow University of Health Sciences (DUHS), one of the largest public-sector universities in Pakistan, is working on innovations in the field of medical education.7 Currently, DUHS uses the pen-and-paper method for the assessment of students' knowledge. In the past few years, the number of programs offered by the University, as well as the number of students, has increased significantly, and the conventional testing method has become time-consuming and difficult to administer. We are taking advantage of the fact that all institutes of the University are well equipped with Information Technology (IT) departments and have state-of-the-art digital libraries. Moreover, an e-Learning Cell is now functional in the University. Utilizing these resources, CBA was explored as a solution for examining large classes of students.
Before introducing CBA in high-stakes tests, it is important to find out students' perceptions and how they compare CBA and PBA. Experience gained from this exercise will help the administration, faculty, and students work toward high-stakes summative assessment with this modality.
Every year, postgraduate residents of DUHS are required to sit a formative assessment by the pen-and-paper method. During 2013, all residents working in Surgery, Medicine, and Gynecology and Obstetrics appearing for the formative assessment were assessed for the first time with the CBA modality in the Digital Library.
Dow University of Health Sciences was already equipped with a locally hosted learning management system, Moodle, so the quiz activity of the software8 was used to administer the assessment, which included 100 one-best-answer multiple-choice questions. The questions were the same for all students, but the software shuffled both the sequence of questions and the sequence of options for every student. The test time was two hours. As this modality was used for the first time, students were given a demonstration of the software for approximately five minutes before the test. Students then went through a practice test so that they could familiarize themselves with the workflow of the software, editing answers once marked, and the navigation features.
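The per-student randomization described above (same item bank, different question and option order for each candidate) can be illustrated with a short Python sketch. This is not Moodle's actual implementation; the function name and the use of a student ID as a random seed are assumptions for illustration.

```python
import random

def shuffled_paper(questions, student_id):
    """Return the same question bank in a per-student order, with the
    options of each question shuffled as well. `questions` is a list of
    (stem, options) tuples; seeding by student_id makes the shuffle
    reproducible for that student."""
    rng = random.Random(student_id)  # deterministic per student
    paper = []
    for stem, options in rng.sample(questions, len(questions)):
        opts = list(options)
        rng.shuffle(opts)  # option order also varies per student
        paper.append((stem, opts))
    return paper

bank = [("Q1 stem", ["A", "B", "C", "D"]),
        ("Q2 stem", ["A", "B", "C", "D"]),
        ("Q3 stem", ["A", "B", "C", "D"])]

p1 = shuffled_paper(bank, student_id=101)
p2 = shuffled_paper(bank, student_id=102)
# Every student answers the same items, just in a different order.
assert sorted(s for s, _ in p1) == sorted(s for s, _ in bank)
```

Because the items themselves are identical across students, this keeps the difficulty index the same for every candidate while making answer-copying between neighbours harder.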
The assessment was taken by 206 residents. Immediately after the test, residents' perceptions of the assessment methods were sought through a paper-based, self-administered questionnaire. This method was used intentionally so that students who were not satisfied with the assessment method, or who had difficulty with computers during the test, could still give feedback easily. The questionnaire also gathered information about the demographic profile, residents' self-rated competence in using computers, and their comparative opinions of CBA and PBA. Data were analyzed with SPSS 17; descriptive statistics were used to present the data.
A total of 206 residents took the test. Of these, 173 residents completed and returned the questionnaire, a response rate of 83.9%. They included 58 (33.5%) Surgery, 67 (38.7%) Gynecology and Obstetrics, and 48 (27.7%) Medicine residents. There were 114 females (65.9%) and 59 males (34.1%). These included 60 first-year residents (34.7%), 50 second-year residents (28.9%), 31 third-year residents (17.9%), and 28 final-year residents (16.2%).
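The descriptive percentages reported above follow directly from the raw counts. A minimal Python sketch (standing in for the SPSS descriptive statistics the study used) shows the conversion; the function name is illustrative only.

```python
def percentages(counts):
    """Convert a dict of category counts to percentages of the total,
    rounded to one decimal place as reported in the paper."""
    total = sum(counts.values())
    return {k: round(100 * v / total, 1) for k, v in counts.items()}

# Specialty breakdown of the 173 questionnaire respondents.
specialty = {"Surgery": 58, "Gyn/Obs": 67, "Medicine": 48}
print(percentages(specialty))
# {'Surgery': 33.5, 'Gyn/Obs': 38.7, 'Medicine': 27.7}
```

Applying the same function to the gender counts (114 female, 59 male) reproduces the reported 65.9% and 34.1%.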
Responding about their competence in using computers prior to entering the residency program, more than 60% rated themselves as at least "competent" in using e-mail and the internet. One-fourth of the students had no prior experience of spreadsheets. It is also important to note that 6.4% had never used e-mail before starting residency (Table-I). A majority (68.2%) rated their overall computer competence as "adequate" or "more than adequate" before starting residency training.
Competence in using computers prior to entering residency
More than half of the residents (56.1%) had no prior experience of CBA. The training (demonstration of software usage) was rated at least "adequate" by more than 80% of residents. Three-fourths (76.4%) of the residents were less than confident before sitting the CBA, while after finishing it, 64.8% were confident or extremely confident. The most common problem experienced by the residents was logging in (28.9%), followed by saving and editing answers (9.2%). Most residents (64.3%) reported no problems (Table-II).
Feedback of residents about Computer Based Assessment (n = 173).
More than half of the residents (53.2%) said that PBA takes longer to complete than CBA, whereas 15% thought the opposite. A majority (61.8%) rated CBA as better than PBA, even though most had attempted it for the first time. When asked which method of assessment they would prefer in the future, 67.1% responded with CBA, 9.8% preferred PBA, and 16.8% had no preference (Table-III).
Residents' comparison of Computer Based Assessment and Paper Based Assessment
A few representative open remarks from residents were:
“It was nice exposure to have online test first time for postgraduate trainees. Hope we will have same in future too”
“I appreciate this attempt of examination and recommend it to be taken half-yearly rather annually. I also recommend that DUHS should provide awards/certificates/prizes to the candidates who score the highest marks in order to have feeling of appreciation and have serious and worthy response from trainees.”
The findings of this study show that CBA was very well received by the residents, despite the fact that, for most of them, this was their first experience of its kind. No major technical problems were encountered apart from a few, which were rectified on the spot. Residents preferred CBA over PBA and requested that this method of assessment be used more frequently, rather than only on a yearly basis.
The results of this study are consistent with those of other studies on CBA. A study among physiology students reported that students were satisfied with CBA.9 More than half of the students (52%) rated CBA as better than PBA and said this method takes less time to complete. When asked which mode of assessment they would prefer in the future, 53% opted for CBA. Another study, among final-year undergraduate students, reported that 79.8% preferred CBA over PBA.3 Reasons for the preference included being able to proceed at their own pace, the examination being fun and convenient, and equality in marking.
CBA is not a method of fixing weak items: validity and good construction of items are always prerequisites for a good assessment. This is highlighted by a study among undergraduate chemistry students, in which only 29.2% were in favor of CBA.10 When reasons were probed, the finding was that this was due to flawed chemical formulas, equations, and structures in the test items.10
The implementation of CBA is not just a matter of transferring traditional testing models from paper to screen. The main purpose of using technology in assessment is to improve the quality of assessment in terms of validity and efficiency.11 Incorporating the latest technological developments into testing requires significant research and investment in fields such as multimedia design, network issues, item quality and security, psychometrics, and information technology. All of these need to be addressed before embarking on CBA as a regular feature in institutions.12
Future work is required to assess the psychometrics of CBA tests and compare them with PBA, and to study faculty perceptions, especially after faculty have developed test items in multimedia formats, as we have currently used only a few pictures in the test. Good formative assessments should result in improved student performance in summative assessments, and these areas need to be explored in the future.
Residents rated CBA highly despite the fact that, for most of them, it was their first experience. A few technical issues were encountered during the assessment, but all were resolved without any major problem. CBA offers a range of advantages: place-independent formative assessment, the use of multimedia, automatic processing of results, automatic statistical analysis of exams, and the reusability of questions. This makes it a highly efficient mode of assessment for large numbers of students.
Conflict of interest: None
Source of funding: None.
1. Hassanien MA, Al-Hayani A, Abu-Kamer R, Almazrooa A. A six step approach for developing computer based assessment in medical education. Med Teach. 2013;35(s1):S15–S19.
2. Hartog R, Draaijer S, Rietveld L. Practical aspects of task allocation in design and development of digital closed questions in higher education. Pract Assess Res Eval. 2008;13(2):2.
3. Lim EC, Ong BK, Wilder-Smith EP, Seet RC. Computer-based versus pen-and-paper testing: Students' perception. Ann Acad Med Singapore. 2006;35(9):599.
4. Aisbitt S, Sangster A. Using internet-based on-line assessment: A case study. Account Edu. 2005;14(4):383–394.
5. Darrell LB. The impact of computer-based testing on student attitudes and behaviour. Technol Source. 2003. Available from: http://ts.mivu.org/default.asp?show=article&id=1034.
6. Fyfe G, Meyer J, Fyfe S, Ziman M, Sanders K, Hill J. Self-evaluation of assessment performance can enhance student's perception of feedback on computer-generated tests. Available from: http://www.iaea2008.cambridgeassessment.org.uk/ca/digitalAssets/180505_Fyfe.pdf.
7. Jawaid M, Ashraf J. Initial experience of eLearning research module in undergraduate medical curriculum of Dow University of Health Sciences: Development and students perceptions. Pak J Med Sci. 2012;28(4)
8. Gómez-Muro A, Blanco M. Creating Moodle quizzes for the subject of mathematics in secondary education. EDULEARN13 Proceedings. 2013:5145–5152.
9. Sheader E, Gouldsborough I, Grady R. Staff and student perceptions of computer-assisted assessment for physiology practical classes. Adv Physiol Educ. 2006;30(4):174–180.
10. Jimoh R, Kawu A, Kola Y. Students' perception of Computer Based Test (CBT) for examining undergraduate chemistry courses. J Emerg Trends Comp Info Sci. 2012;3(2):125–134.
11. Dindar M, Yurdakul IK, Dönmez FI. Multimedia in test items: Animated questions vs. static graphics questions. Procedia Soc Behav Sci. 2013;106:1876–1882.
12. Dennick R, Wilkinson S, Purcell N. Online eAssessment: AMEE guide no. 39. Med Teach. 2009;31(3):192–206.