Comparative study of the psychometric properties of multiple-choice tests using confidence and number right scoring techniques

Authors

  • Jimoh Kasali, Obafemi Awolowo University
  • Adediwura Alaba Adeyemi, Obafemi Awolowo University

DOI:

https://doi.org/10.21580/jagc.2022.3.1.11276

Keywords:

Confidence scoring, number right, multiple-choice items, item discrimination, item difficulty, distractor performance

Abstract

Purpose - This study aims to ascertain the impact of the number of options on distractor performance when confidence scoring is used, and to establish the impact of the number of options in multiple-choice test items on the reliability coefficients.

Method - This research used a descriptive survey design. Two sampling methods were used: simple random sampling and purposive sampling. The instrument was an adapted version of the 2015 West African School Certificate Examination (WASCE) Economics test items. Data collected were analyzed using ANOVA, the Kuder-Richardson Formula 20 (KR-20), and Fisher's Z-test with the aid of the FZT Computator.
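
The two statistics named above follow standard forms; as a point of reference, here is a minimal sketch in LaTeX notation (the symbols are ours, not the article's). For a k-item test with item difficulty p_i (where q_i = 1 - p_i) and total-score variance \sigma_X^2, KR-20 is

r_{KR20} = \frac{k}{k-1} \left( 1 - \frac{\sum_{i=1}^{k} p_i q_i}{\sigma_X^2} \right)

and Fisher's Z-test compares two independent reliability coefficients r_1 and r_2 from samples of sizes n_1 and n_2 via

z_j = \frac{1}{2} \ln \frac{1 + r_j}{1 - r_j}, \qquad z_{obt} = \frac{z_1 - z_2}{\sqrt{\frac{1}{n_1 - 3} + \frac{1}{n_2 - 3}}}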

Result - The results showed that the number of options had a significant impact on distractor performance when items were scored using confidence scoring (F = 6.679, p < 0.05). The results also showed that for each pairwise comparison of the 3/4-option (z_obt = 0.640), 3/5-option (z_obt = 0.837), and 4/5-option (z_obt = 0.196) forms, the differences in the reliability coefficients were not significant at p < 0.05.
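
To illustrate how pairwise z_obt values of this kind are obtained, here is a minimal Python sketch of Fisher's Z-test applied to two independent reliability coefficients; the coefficients and sample sizes below are placeholders for illustration, not the study's data.

    import math

    def fisher_z(r):
        # Fisher r-to-z transformation of a coefficient r in (-1, 1)
        return 0.5 * math.log((1 + r) / (1 - r))

    def compare_reliabilities(r1, n1, r2, n2):
        # z statistic for the difference between two independent coefficients
        z1, z2 = fisher_z(r1), fisher_z(r2)
        se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
        return (z1 - z2) / se

    # Placeholder coefficients and sample sizes (illustration only)
    z_obt = compare_reliabilities(0.82, 200, 0.79, 200)
    print(f"z_obt = {z_obt:.3f}")  # not significant if |z_obt| < 1.96 at p < 0.05

Under this convention, an obtained |z_obt| below the 1.96 critical value at p < 0.05, as for all three pairwise comparisons reported above, indicates no significant difference between the reliability coefficients.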

Implication - This study suggests that the confidence scoring procedure should be encouraged and used in schools because it is effective in reducing the contribution of random guessing to testees' total scores and in rewarding testees' partial knowledge on multiple-choice tests. Furthermore, confidence scoring significantly reduces the 'craze' to treat passing an examination as a do-or-die affair to be achieved at all costs, and should therefore be used in all schools.

Originality - This research contributes an improved scoring procedure that should be encouraged and used in schools.

Author Biographies

Jimoh Kasali, Obafemi Awolowo University

Department of Educational Foundations and Counseling

Adediwura Alaba Adeyemi, Obafemi Awolowo University

Department of Educational Foundations and Counseling

Published

2022-06-29
