Designing a Diagnostic Report of Mathematical Proficiency Levels Through a Real-Time Digital Learning Platform

Sineenart Phaengkham
Patcharee Junpeng
Samruan Chinjunthuk
Prapawadee Suwannatrai
Metta Marwiang
Chaiwat Tawarungruang
Jaruwan Thuanman

Abstract

The study aimed to analyze students’ multidimensional response patterns in order to establish transition points between mathematical proficiency levels, and to design a real-time diagnostic report of students’ mathematical proficiency. The respondents were 1,559 grade 7 students. The research instrument was a four-choice objective test covering three content strands, namely number and algebra, measurement and geometry, and statistics and probability, administered through a package of diagnostic tools in the online testing system "eMAT-Testing."
The results are presented below.
1. The transition points between mathematical proficiency levels, derived from the analysis of students’ response patterns and used to design the student report, covered two dimensions, mathematical procedures and conceptual structures, across the three content strands. Within each strand, a student’s proficiency in each dimension could be classified into five levels separated by four transition points (a mapping sketched in code after this list).
2. The student diagnostic report of mathematical proficiency levels on the real-time digital learning platform was composed of two parts: 1) an individual report for students and parents, displaying personal information, earned scores, and feedback indicating the student’s current proficiency level, areas for improvement, and channels for additional learning; and 2) a report for teachers, educational institutions, and educational service areas, showing students’ overall results (a possible structure is sketched in the second example below).
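As a rough illustration of result 1, the sketch below shows how an estimated proficiency value could be mapped to one of the five levels using four ordered transition points. It is a minimal sketch, assuming a logit-scale estimate and placeholder cut values; the function name and the numbers are hypothetical, not the study's actual transition points.

```python
from bisect import bisect_right

# Hypothetical transition points (placeholders, not the study's values)
# separating the five proficiency levels of one dimension in one strand.
TRANSITION_POINTS = [-1.5, -0.5, 0.5, 1.5]

def proficiency_level(theta: float, cuts=TRANSITION_POINTS) -> int:
    """Map an estimated proficiency to a level from 1 to 5: four ordered
    transition points partition the scale into five intervals."""
    return bisect_right(cuts, theta) + 1

# A student estimated just above the third transition point sits at level 4.
print(proficiency_level(0.7))  # -> 4
```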
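The two-part report described in result 2 could be carried by a structure like the following sketch. All class and field names are hypothetical, chosen only to mirror the elements listed in the abstract (personal information, earned scores, feedback, improvement areas, learning channels, and aggregate results).

```python
from dataclasses import dataclass, field

@dataclass
class DimensionFeedback:
    strand: str            # e.g., "Number and Algebra"
    dimension: str         # "Mathematical Procedures" or "Conceptual Structures"
    earned_score: float
    current_level: int     # 1-5, from the transition-point mapping above
    improvement_areas: list[str] = field(default_factory=list)
    learning_channels: list[str] = field(default_factory=list)  # extra material

@dataclass
class IndividualReport:
    """Part 1: the real-time report for a student and parents."""
    student_id: str
    student_name: str
    feedback: list[DimensionFeedback] = field(default_factory=list)

@dataclass
class CohortReport:
    """Part 2: aggregate view for teachers, schools, and service areas."""
    group_name: str
    # Counts of students at each level, keyed by strand; e.g.,
    # {"Number and Algebra": {1: 12, 2: 30, 3: 41, 4: 20, 5: 8}}
    level_counts: dict[str, dict[int, int]] = field(default_factory=dict)
```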
