
Examining Preservice Elementary Teachers’ Answer Changing Behavior on a Content Knowledge for Teaching Science Assessment

  • Original Research
  • Published in: Journal of Science Education and Technology

Abstract

Preservice elementary teachers (PSTs) prepare for various standardized assessments, such as the Praxis® licensure assessment. However, there is little research on test-taking behavior and test-taking strategies for this examinee population. A common belief, and an instruction given in some test preparation materials, is that examinees should stick with their initial answer choice. Decades of research have debunked this belief, finding that examinees generally benefit from changing answers. However, there is minimal research on answer changing behavior among PSTs. Moreover, there is little research examining answer changing behavior on tests assessing constructs that integrate content and practice, or across different technology-enhanced item types. We use an online Content Knowledge for Teaching (CKT) assessment that measures PSTs’ CKT in one science area: matter and its interactions. In this study, we analyzed process data from administering the online CKT matter assessment to 822 PSTs from across the US to better understand PSTs’ behaviors and interactions on this computer-based science assessment. Consistent with prior research, this study showed that examinees who changed their responses were benefited more often than harmed by doing so, with higher-performing examinees benefiting more than lower-performing examinees, on average. These findings were also consistent across item types. Implications for computer-based CKT science assessment design and delivery are discussed.
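The kind of analysis described above can be illustrated with a minimal sketch (not the authors' code): each recorded answer change in the process data is tagged as wrong-to-right, right-to-wrong, or wrong-to-wrong against the answer key, and the net benefit of changing is the excess of wrong-to-right over right-to-wrong changes. The event format, item names, and answer key below are hypothetical.

```python
# Minimal sketch of tallying answer-change outcomes from process data.
# All names and the event format are hypothetical illustrations.
from collections import Counter

def classify_changes(events, key):
    """Tag each answer change as wrong-to-right, right-to-wrong,
    or wrong-to-wrong, given the answer key."""
    counts = Counter()
    for item_id, old, new in events:
        correct = key[item_id]
        if old != correct and new == correct:
            counts["wrong_to_right"] += 1
        elif old == correct and new != correct:
            counts["right_to_wrong"] += 1
        else:
            counts["wrong_to_wrong"] += 1
    return counts

# Hypothetical log: (item, initial answer, final answer) per change.
events = [("q1", "A", "B"), ("q2", "B", "D"), ("q3", "C", "A")]
key = {"q1": "B", "q2": "A", "q3": "A"}
counts = classify_changes(events, key)
# Net benefit of changing = wrong-to-right minus right-to-wrong changes.
net_gain = counts["wrong_to_right"] - counts["right_to_wrong"]
```

A positive `net_gain` for most examinees would mirror the study's finding that changing answers helps more often than it hurts.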


[Figures 1–5 omitted; see the full article.]


Notes

  1. We excluded any Praxis® Elementary Science test-takers who participated in our earlier CKT matter item pilots.

  2. The sample was 92% female (pop = 93%), 80% White (pop = 80%), 4% Midwest (pop = 4%), 22% Northeast (pop = 22%), 46% South (pop = 49%), 28% West (pop = 25%), 22% in the first quartile (Q1) of the Praxis® Science score distribution (pop = 26%), 27% in Q2 (pop = 27%), 24% in Q3 (pop = 23%), and 27% in Q4 (pop = 24%).

  3. Pilot recruitment occurred at the beginning of 2018 and thus included only Praxis® Elementary Science test-takers from the 2018 calendar year, whereas field test recruitment occurred in July 2019 and included Praxis® Elementary Science examinees from January 2018 to June 2019. PSTs in the pilot were ineligible to participate in the field test.


Funding

This study was supported by a grant from the National Science Foundation (Award No. 1813254).

Author information

Corresponding author

Correspondence to Jamie N. Mikeska.

Ethics declarations

Consent Statement

The data collection for this study was approved by our institutional review board. All participants completed a written consent form to have their data used for research purposes.

Disclaimer

The opinions expressed herein are those of the authors and not the funding agency.


Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file1 (DOCX 94.8 KB)


Cite this article

Castellano, K.E., Mikeska, J.N., Moon, J.A. et al. Examining Preservice Elementary Teachers’ Answer Changing Behavior on a Content Knowledge for Teaching Science Assessment. J Sci Educ Technol 31, 528–541 (2022). https://doi.org/10.1007/s10956-022-09971-2

