Assessing Item Reliability, Differential Item Functioning (DIF), and Wright Map Analysis of the GSP122 ICT Test at a Public University in Nigeria

Abubakar, Rabiu Uba¹ and Hussaini, Aminu²

¹Department of Education, Sule Lamido University, Kafin Hausa, Nigeria

²Department of Guidance and Counselling, Shehu Shagari University of Education, Sokoto, Nigeria

Abstract

This study examines the psychometric properties of the GSP122 test, an Information and Communication Technology (ICT) knowledge assessment administered at a public university in Nigeria. Despite the test’s importance in evaluating students’ ICT knowledge, no prior attempt has been made to investigate its psychometric qualities. The research focuses on three key aspects: item reliability, Differential Item Functioning (DIF), and Wright Map analysis, all evaluated through Rasch analysis. A sample of 600 GSP122 test scripts was randomly selected from undergraduate students across various departments to ensure a representative assessment. Findings reveal that the test possesses strong item reliability, indicating consistency in measuring ICT knowledge. Furthermore, all items are found to be DIF-free, suggesting fairness across different subgroups of test-takers. The Wright Map analysis, however, indicates that the test does not adequately target the abilities of students at the extreme top and bottom of the proficiency spectrum: some items are too difficult, and others too easy, relative to the students’ ability levels. These results provide valuable insights into the GSP122 test’s strengths and areas for improvement, contribute to its validation, and offer a foundation for evidence-based refinements in ICT assessment practice in Nigerian higher education.

Keywords: ICT test, technology education, psychometric properties, Rasch model.
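
For readers who wish to reproduce this style of analysis, the sketch below illustrates, in Python, the core computations behind the three procedures named in the abstract: Rasch calibration of person abilities and item difficulties, item separation reliability, and a two-group separate-calibration DIF screen. It is a minimal sketch under simplifying assumptions (dichotomous items, complete data, no zero or perfect raw scores), not the software used in the study; the data, the `group` split, and all variable names are simulated and hypothetical rather than drawn from the actual GSP122 administration.

```python
import numpy as np

def rasch_jmle(X, n_iter=200, tol=1e-6):
    """Joint maximum-likelihood calibration of the dichotomous Rasch model.

    X: persons-by-items 0/1 array with no zero or perfect raw scores
    (such rows/columns have infinite logit estimates and must be dropped).
    Returns person abilities (theta) and item difficulties (beta) in logits.
    """
    p_person, p_item = X.mean(axis=1), X.mean(axis=0)
    theta = np.log(p_person / (1 - p_person))   # raw-score logits as start values
    beta = np.log((1 - p_item) / p_item)
    for _ in range(n_iter):
        P = 1.0 / (1.0 + np.exp(beta[None, :] - theta[:, None]))  # P(correct)
        theta_new = theta + (X - P).sum(axis=1) / (P * (1 - P)).sum(axis=1)
        P = 1.0 / (1.0 + np.exp(beta[None, :] - theta_new[:, None]))
        beta_new = beta - (X - P).sum(axis=0) / (P * (1 - P)).sum(axis=0)
        beta_new -= beta_new.mean()             # fix the scale: mean item difficulty = 0
        done = max(np.abs(theta_new - theta).max(), np.abs(beta_new - beta).max()) < tol
        theta, beta = theta_new, beta_new
        if done:
            break
    return theta, beta

def item_reliability(theta, beta):
    """Item separation reliability: the share of observed difficulty variance
    that is not measurement error (analogous to the item reliability index
    reported by Rasch software such as Winsteps)."""
    P = 1.0 / (1.0 + np.exp(beta[None, :] - theta[:, None]))
    err_var = 1.0 / (P * (1 - P)).sum(axis=0)   # squared SE of each item difficulty
    return (beta.var() - err_var.mean()) / beta.var()

def dif_contrast(X, group):
    """Two-group DIF screen by separate calibration: per-item difficulty
    contrast and an approximate t-ratio; |contrast| > 0.5 logits with
    |t| > 2 is a common flagging rule."""
    stats = []
    for g in np.unique(group):
        th, b = rasch_jmle(X[group == g])
        P = 1.0 / (1.0 + np.exp(b[None, :] - th[:, None]))
        se = 1.0 / np.sqrt((P * (1 - P)).sum(axis=0))
        stats.append((b, se))
    (b0, s0), (b1, s1) = stats
    contrast = b0 - b1
    return contrast, contrast / np.sqrt(s0**2 + s1**2)

# Illustrative run on simulated data shaped like the study (600 persons):
rng = np.random.default_rng(0)
true_theta = rng.normal(0.0, 1.0, 600)
true_beta = rng.normal(0.0, 1.0, 40)
X = (rng.random((600, 40)) < 1.0 / (1.0 + np.exp(true_beta - true_theta[:, None]))).astype(int)
group = rng.integers(0, 2, 600)                 # e.g., a hypothetical male/female split
theta, beta = rasch_jmle(X)
print("item reliability:", round(item_reliability(theta, beta), 2))
```

A Wright map then requires no further estimation: it is simply the `theta` and `beta` estimates plotted on one shared logit scale (for instance, as back-to-back histograms), which is what makes visible any items that are too easy or too hard for the cohort, as reported for GSP122 above.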

Impact Statement

Both the literature reviewed and the authors’ own experience indicate that studies of this kind are scarce in the developing world. This research is particularly valuable for identifying mismatches between the test’s difficulty and the ability range of the student population, offering insights into the test’s overall effectiveness and areas for potential improvement. The study has significant implications for ICT education and assessment practices in Nigerian higher education: by providing a detailed understanding of the GSP122 test’s psychometric properties, it lays the groundwork for evidence-based improvements in test design and administration. It can also contribute to the broader discourse on the importance of rigorous psychometric evaluation in educational assessment, particularly in rapidly evolving fields such as ICT, and it has the potential to influence policy decisions, improve teaching practices, and ultimately enhance the quality of ICT education.

About the Authors

The lead author, Mr. Abubakar Rabiu Uba, is a lecturer in the Department of Education, Sule Lamido University, Kafin Hausa, Jigawa State, Nigeria. Mr. Uba is an experienced scholar who has authored many articles in national and international journals and supervised many undergraduate projects. He is interested in teaching and research. For his part, the co-author, Mr. Aminu Hussaini, is a member of the academic staff serving as Examination Officer in the Department of Guidance and Counselling, and as Secretary of the Centre for Disaster, Risk Management and Humanitarian Studies, Shehu Shagari University of Education, Sokoto, Nigeria. Mr. Hussaini has authored or co-authored nearly forty research articles and published about twenty in national and international journals. He has supervised numerous student projects, vetted and reviewed several articles, and validated many instruments. He is passionate about teaching, research, academic writing, counselling vulnerable persons, and traveling for academic activities.

