RESEARCH PAPER
Interactive computer assessment and analysis of students’ ability in scientific modeling
AFFILIATIONS
1. Collaborative Innovation Center of Assessment for Basic Education Quality, Beijing Normal University, Beijing, CHINA
2. Department of Science Education, National Taipei University of Education, Taipei City, TAIWAN
3. Science Education Center, Graduate Institute of Science Education and the Department of Earth Sciences, National Taiwan Normal University, Taipei City, TAIWAN
4. Department of Biology, Universitas Negeri Malang, INDONESIA
 
Publication date: 2022-12-05
 
 
EURASIA J. Math. Sci. Tech. Ed. 2022;18(12):em2194
 
ABSTRACT
Scientific modeling (SM) is a core scientific practice and critical for students’ scientific literacy. Previous research has not used interactive computer assessment to investigate students’ SM ability. This study aimed to explore an effective human-computer interaction approach for revealing the challenges students face in the four-element process of constructing, using, evaluating, and revising models. Eleven interactive tasks, contextualized in the solar system, were used to assess 419 students in grades 4, 7, and 10. Results indicated that “model evaluation” and “model revision” were more difficult for students than “model construction” and “model use.” Grade significantly predicted students’ SM ability (p<.001). Allowing students to re-answer in response to feedback promoted in-depth reflection and improved SM performance. The findings may provide a basis for improving students’ SM ability.
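The abstract reports that grade significantly predicted SM ability (p<.001) but does not specify the statistical model used. A minimal sketch of one plausible analysis, assuming a simple ordinary-least-squares regression of a total SM score on grade level, is given below; the column names (grade, sm_score) and the data file are hypothetical, and the authors’ actual method may differ.

    # Hypothetical illustration: regressing scientific-modeling (SM) scores on grade.
    # The file name and column names are assumptions, not the authors' dataset or analysis.
    import pandas as pd
    import statsmodels.formula.api as smf

    # Each row: one student's grade level (4, 7, or 10) and total SM score.
    df = pd.read_csv("sm_scores.csv")  # hypothetical file

    # OLS regression: SM score predicted by grade.
    model = smf.ols("sm_score ~ grade", data=df).fit()

    # The summary reports the coefficient for grade and its p-value,
    # analogous to the "grade significantly predicted SM ability" result.
    print(model.summary())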
 
 
eISSN: 1305-8223
ISSN: 1305-8215