Detecting Differential Item Functioning of Grade 9 Science Literacy Test
Abstract
The aims of this research were 1) to construct and develop a science literacy test for grade 9 students, and 2) to analyze differential item functioning (DIF) in the test by gender, using the Multiple Indicators and Multiple Causes (MIMIC) model estimated in the Mplus program. The sample consisted of 689 grade 9 students in the 2018 academic year from schools in Chaiyaphum Province, obtained through multistage random sampling. The research instrument was a 54-item science literacy test covering three content areas: physical systems (18 items), living systems (18 items), and Earth and space systems (18 items). The research findings were as follows:
1. The construction and quality analysis of the 54-item science literacy test yielded consistency indices ranging from 0.60 to 1.00 (a sketch of this index follows the findings). After item quality was examined, 45 items met the selection criteria.
1.1 According to Classical Test Theory (CTT), the quality of the science literacy test was as follows: item difficulty (p) ranged from 0.22 to 0.79, item discrimination (r) ranged from 0.21 to 0.77, and the reliability of the whole test was 0.92 (an illustrative computation is sketched below).
1.2 According to Item Response Theory (IRT), using the two-parameter logistic (2PL) model, the quality of the test was as follows: item difficulty (b-parameter) ranged from -0.537 to 4.000 and item discrimination (a-parameter) ranged from 0.313 to 1.382 (the 2PL model is given below).
2. The DIF analysis of the science literacy test by gender, using the MIMIC model in the Mplus program, revealed that males had a higher probability of answering correctly than females on 18 items, and females had a higher probability of answering correctly than males on 7 items (a sketch of the MIMIC DIF specification follows the findings).
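The consistency index in finding 1 is not named in the abstract; assuming it is the index of item-objective congruence (IOC) routinely used in Thai test development, it is computed from expert ratings as

\mathrm{IOC}_i = \frac{1}{N} \sum_{k=1}^{N} R_{ik}

where R_{ik} \in \{-1, 0, +1\} is expert k's judgement of whether item i matches its objective and N is the number of experts; values of at least 0.50 are conventionally regarded as acceptable.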
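Finding 1.1 reports CTT statistics that are conventionally computed as proportion-correct difficulty, item-rest point-biserial discrimination, and an internal-consistency reliability such as KR-20. The Python sketch below illustrates these conventional computations under those assumptions; it is not the authors' actual analysis script, and the study may have used different formulas or software.

```python
import numpy as np

def ctt_item_stats(scores: np.ndarray):
    """Illustrative CTT item statistics for a persons-by-items 0/1 score matrix."""
    n_persons, n_items = scores.shape
    total = scores.sum(axis=1)

    # Item difficulty: proportion of examinees answering each item correctly.
    p = scores.mean(axis=0)

    # Item discrimination: corrected (item-rest) point-biserial correlation.
    r = np.empty(n_items)
    for i in range(n_items):
        rest = total - scores[:, i]
        r[i] = np.corrcoef(scores[:, i], rest)[0, 1]

    # Test reliability: KR-20 for dichotomously scored items.
    q = 1.0 - p
    kr20 = (n_items / (n_items - 1)) * (1.0 - (p * q).sum() / total.var(ddof=1))
    return p, r, kr20

# Illustration with simulated responses only (689 students x 54 items).
rng = np.random.default_rng(0)
sim = (rng.random((689, 54)) < 0.6).astype(int)
p, r, kr20 = ctt_item_stats(sim)
print(p.round(2), r.round(2), round(kr20, 2))
```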
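Finding 1.2 reports parameters of the two-parameter logistic (2PL) IRT model. For reference, that model gives the probability that student j with latent science literacy \theta_j answers item i correctly as

P(X_{ij} = 1 \mid \theta_j) = \frac{1}{1 + \exp[-a_i(\theta_j - b_i)]}

where a_i is the item discrimination and b_i the item difficulty (some calibrations also multiply the exponent by a scaling constant D \approx 1.7).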
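Finding 2 uses the MIMIC approach to DIF. In one common parameterization (an assumption here, since the exact Mplus specification is not given in the abstract), the latent trait is regressed on a gender covariate and each studied item also receives a direct effect of gender:

\theta_j = \gamma\, z_j + \zeta_j, \qquad P(X_{ij} = 1 \mid \theta_j, z_j) = F\big(a_i \theta_j - b_i + \beta_i z_j\big)

where z_j codes gender, F is a logit or probit link, and a nonzero direct effect \beta_i indicates uniform DIF: at the same level of science literacy, one gender has a higher probability of answering item i correctly.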