Development of the Digital Competency Test for Bachelor’s Degree Using the Multidimensional Item Response Theory
Abstract
This research aimed to: 1) develop a digital competency model for bachelor’s degree students, 2) develop a digital competency assessment based on multidimensional item response theory (MIRT), and 3) examine the digital competency levels of undergraduate students. The study employed a research and development methodology. The sample consisted of 1,200 undergraduate students from higher education institutions, selected through a multi-stage sampling technique. The research instrument was a four-option multiple-choice digital competency test comprising 30 indicators and 240 items. Data were analyzed using descriptive statistics, MIRT-based item quality analysis, and confirmatory factor analysis (CFA). The findings revealed that the digital competency model for bachelor’s degree students consisted of three components: 1) Media, Information, and Digital Literacy; 2) Skills in Using, Developing, and Solving Problems with Digital Tools; and 3) Adaptive Digital Transformation. The test was consistent with the empirical data under a between-item multidimensional three-parameter model. A total of 211 items (87.92%) met the quality criteria, with an average difficulty of 1.44, an average discrimination of 0.24, and an average guessing parameter of 0.13. Item-fit indices ranged from 0.62 to 1.05 for OUTFIT MNSQ and from 0.74 to 1.14 for INFIT MNSQ. The test demonstrated high reliability, with an EAP reliability coefficient of 0.865. The digital competency model showed good fit with the empirical data (χ² = 233.011, p = .217, χ²/df = 1.074, AGFI = 0.975, RMSEA = 0.008). Results of the competency assessment indicated that students scored highest in Digital Communication Skills and lowest in Computational Thinking (M = 5.99 and M = 2.40, respectively).
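To make the reported statistics concrete, the sketch below illustrates how the quantities mentioned in the abstract relate to one another: the response probability of a between-item multidimensional 3-parameter (3PL) item, the residual-based OUTFIT/INFIT MNSQ item-fit statistics, and the χ²/df and RMSEA model-fit indices. This is a minimal illustration only, assuming simulated responses, hypothetical item parameters, and a degrees-of-freedom value (df = 217) inferred from the reported χ² and χ²/df; it is not the authors’ analysis code.

```python
# Minimal sketch (not the authors' analysis code): simulated data,
# hypothetical item parameters, and NumPy only.
import numpy as np

rng = np.random.default_rng(0)

n_persons, n_dims = 1200, 3                    # sample size and number of components
theta = rng.normal(size=(n_persons, n_dims))   # latent digital-competency abilities

# Between-item multidimensionality: each item loads on exactly one dimension.
a = np.array([1.2, 0.0, 0.0])                  # discrimination vector (dimension 1 only)
b = 0.5                                        # multidimensional difficulty
c = 0.13                                       # guessing (pseudo-chance) parameter

def p_3pl(theta, a, b, c):
    """Probability of a correct response under a multidimensional 3PL model."""
    z = theta @ a - b * np.linalg.norm(a)      # logit; difficulty scaled by the length of a
    return c + (1.0 - c) / (1.0 + np.exp(-z))

p = p_3pl(theta, a, b, c)
x = (rng.random(n_persons) < p).astype(float)  # simulated 0/1 item responses

# Residual-based item-fit statistics (values near 1.0 indicate good fit).
resid = x - p
w = p * (1.0 - p)                              # model variance of each response
outfit_mnsq = np.mean(resid**2 / w)            # unweighted, outlier-sensitive mean square
infit_mnsq = np.sum(resid**2) / np.sum(w)      # information-weighted mean square
print(f"OUTFIT MNSQ = {outfit_mnsq:.2f}, INFIT MNSQ = {infit_mnsq:.2f}")

# CFA model-fit check: df = 217 is inferred from chi2 = 233.011 and chi2/df = 1.074.
chi2, df, n = 233.011, 217, 1200
rmsea = np.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
print(f"chi2/df = {chi2 / df:.3f}, RMSEA = {rmsea:.3f}")   # about 1.074 and 0.008
```

With these assumed values, the last two lines reproduce the reported χ²/df ≈ 1.074 and RMSEA ≈ 0.008, and MNSQ values close to 1.0 correspond to the 0.62–1.14 range reported for the retained items.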
Article Details

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
The content and information contained in articles published in the Journal of Educational Measurement Mahasarakham University are the opinions and sole responsibility of the authors. The journal’s editorial board does not necessarily agree with, and is not responsible for, any of that content.
The articles, data, content, images, etc. that have been published in the Journal of Educational Measurement Mahasarakham University are copyrighted by the journal. If any individual or organization wishes to reproduce or perform any actions involving the entirety or any part of the content, they must obtain written permission from the Journal of Educational Measurement Mahasarakham University.
References
Baker, F. B. (2017). The basics of item response theory using R. Springer Nature.
Briggs, D. C., & Wilson, M. (2003). An introduction to multidimensional measurement using Rasch models. Journal of Applied Measurement, 4(1), 87-100.
Brossman, B. G., & Guille, R. A. (2014). A comparison of multi-stage and linear test designs for medium-size licensure and certification examinations. Journal of Computerized Adaptive Testing, 2(2), 18-36.
Comrey, A. L., & Lee, H. B. (2013). A first course in factor analysis. Psychology Press.
Lord, F. M. (1980). Applications of item response theory to practical testing problems. Erlbaum.
Lunz, M. E., Wright, B. D., & Linacre, J. M. (1990). Measuring the impact of judge severity on examination scores. Applied Measurement in Education, 3(4), 331-345.
Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory. McGraw-Hill.
Rovinelli, R. J., & Hambleton, R. K. (1977). On the use of content specialists in the assessment of criterion-referenced test item validity. Tijdschrift voor Onderwijsresearch, 2(2), 49–60.
Schumacker, R. E., & Lomax, R. G. (2010). A beginner’s guide to structural equation modeling (3rd ed.). Routledge.
Thorndike, R. L. (Ed.). (1971). Educational measurement (2nd ed.). American Council on Education.
Wilson, L. O. (2016). Anderson and Krathwohl - Bloom’s taxonomy revised: Understanding the new version of Bloom’s taxonomy. The Second Principle, 1(1), 1-8.
Yao, L., & Schwarz, R. D. (2006). A multidimensional partial credit model with associated item and test statistics: An application to mixed-format tests. Applied Psychological Measurement, 30(6), 469-492.
Busabong, C. (2023). Digital Competency: A Sustainable Learning Skill in APEC Education Goals 2022. ECT Education and Communication Technology Journal, 18(24), 70-85. (in Thai)
Chianchana, C. (2009). Multidimensional Analysis. Journal of Education Khon Kaen University, 32(4), 13-22. (in Thai)
Duangprakesa, N. (2018). Learning Management within the Framework of Bloom’s Taxonomy Questioning Method. Academic Journal of Phetchaburi Rajabhat University, 8(3), 130-138. (in Thai)
Jarupoom, A. (2016). A Study of Information Technology Competency for Government Readiness on Digital Economy: Case Study at Information and Communication Technology Centre, Office of the Permanent Secretary, Ministry of Finance [Master’s independent study]. Thammasat University. (in Thai)
Kanjanawasee, S. (2012). New Test Theory (4th ed.). Chulalongkorn University Printing House. (in Thai)
Ministry of Higher Education, Science, Research and Innovation. (2023). Download additional higher education statistics. https://info.mhesi.go.th/ (in Thai)
National Digital Economy and Society Commission. (2019). 25 Elements Digital Competency. https://www.dlbaseline.org/digital competency (in Thai)
Office of the Higher Education Commission. (2022). Announcement of higher education standards on details of learning outcomes of educational qualifications, B.E. 2565. Government Gazette, 139(Special Issue 212), 35–36. (in Thai)
Puechsing, Y. (2021). The development of Computational Thinking Skills Using Problem Based Learning and Social Network for Eighth Grade Students [Master’s thesis]. Mahasarakham University. (in Thai)
Samae, N., Guadamuz, T., & Warachwarawan, W. (2021). Digital resilience: immunity in the digital world. Mahidol University. https://ebookservicepro.com/showcase/DigitalResilience/ (in Thai)
Sirisak, K. (2016). Curriculum Research on Teacher Education Program for Developing Digital Competence Enhancement Guidance [Master’s thesis]. Chulalongkorn University. (in Thai)
Suwanroj, T., LeeKejwattana, P., Saeung, O., & Siripan, A. (2020). The Essential digital competency for undergraduate students in Thai higher education institutions: Academic documents analysis. Narkkhabut Paritat Journal, 12(2), 88-106. (in Thai)
Techataweewan, W., & Prasertsin, U. (2016). Digital Literacy Assessment of the Undergraduate Students to the Universities in Bangkok and Its Vicinity. Journal of Information Science Research and Practice, 34(4), 1-28. (in Thai)