Preservice elementary teachers (PSTs) prepare for various standardized assessments, such as the Praxis® licensure assessment. However, there is little research on test-taking behavior and test-taking strategies for this examinee population. A common belief, reinforced in some test preparation materials, is that examinees should stick to their initial answer choice. Decades of research have debunked this belief, finding that examinees generally benefit from answer-changing behavior. However, there is minimal research on answer-changing behavior among PSTs. Moreover, there is little research examining answer-changing behavior on tests assessing constructs that integrate content and practice, or across different technology-enhanced item types. We use an online Content Knowledge for Teaching (CKT) assessment that measures PSTs’ CKT in one science area: matter and its interactions. In this study, we analyzed process data from administering the online CKT matter assessment to 822 PSTs from across the US to better understand PSTs’ behaviors and interactions on this computer-based science assessment. Consistent with prior research findings, this study showed that examinees who changed their responses were benefited more often than they were harmed by doing so, with higher-performing examinees benefiting more than lower-performing examinees, on average. These findings were also consistent across item types. Implications for computer-based CKT science assessment design and delivery are discussed.
Castellano, K. E., Mikeska, J. N., Moon, J. A., Holtzman, S., Gao, J., & Jiang, Y. (2022). Examining preservice elementary teachers’ answer changing behavior on a content knowledge for teaching science assessment. Journal of Science Education and Technology, 31, 528–541.