Designing Assessments in Physical Science Across Three Dimensions (Collaborative Research: Pellegrino)
This is a collaborative proposal among the University of Illinois at Chicago, Michigan State University, and SRI International to develop, test, and analyze sets of technology-supported diagnostic classroom assessments for middle school (grades 6-8) physical science. Assessments are aligned with the performance assessment and evidence-centered design methodologies suggested in the Framework for K-12 Science Education (NRC, 2012). The study focuses on the development of new measures of learning that take into account the interdependence of science content and practice. Two disciplinary core ideas--Matter and its Interactions, and Energy--and two scientific and engineering practices--Constructing Explanations and Designing Solutions, and Developing and Using Models--are used for this purpose.
The research questions are:
(1) What are the characteristic features of science assessments based on systematic application of the Evidence-Centered Design (ECD) assessment process?
(2) To what extent can assessment designs incorporate critical core idea, crosscutting concept, and science/engineering practice dimensions in ways that both separate and integrate these dimensions as part of the design architecture?
(3) What is the evidence that the multiple dimensions of science learning (e.g., content, practices, and crosscutting concepts) are separable and recoverable in the performance of students who respond to these assessments?
(4) How instructionally sensitive are these assessments? That is, do they show differential and appropriate sensitivity to students' opportunity to learn science in ways consistent with the vision contained in the NRC Framework?
(5) What forms of evidence can be provided for the validity of these assessments using a multifaceted validity framework that takes into account both the interpretive and evidentiary components of a validity argument?
(6) What are the characteristics of assessments that best serve the needs of classroom teachers in a formative assessment process, and in what ways do such assessments and scoring processes need to be designed to support effective teacher implementation?
(7) What are the unique affordances and opportunities provided by technology in designing and implementing assessments focused on merging content and practices performance expectations?
Assessments are iteratively designed and administered in three school districts and a laboratory school in Florida and one school district in Wisconsin using the "Investigating and Questioning our World through Science and Technology" curriculum. The three Florida school districts also include classrooms using a typical curriculum, and the assessments will be administered and tested with students in those classrooms as well. To address the research questions, the project conducts five major tasks: (1) development of assessment items using the ECD process to document and guide the coherence of items; (2) an alignment study to review design patterns and task templates; (3) a cognitive analysis study to empirically investigate the extent to which the items elicit the intended performances; (4) three empirical studies, including (a) early-stage testing with teachers (n=6) and students (n=180) in Year 1, (b) pilot testing in Year 2 with teachers (n=12) and students (n=360), and (c) a main study in Year 3 with teachers (n=30) and students (n=900); and (5) a study investigating the formative use of the assessment items, drawing on teacher focus group feedback and analysis of student performance data from the earlier studies.
Project outcomes are: (a) research-informed and field-tested assessment prototypes that measure students' thinking around the two physical science core ideas and the two scientific and engineering practices; (b) relevant data and procedures used in the studies; and (c) a framework for the formative use of the assessments, including guidelines, scoring rubrics, and criteria for assessment design decisions.
Publications & Presentations
Alozie, N., Fujii, R., Leones, T., Cheng, B., Pennock, P. H. & Damery, K. (2017, May). An equity framework for the design and development of NGSS aligned formative assessment tasks. Paper in N. Alozie (Chair), Using NGSS to Inform and Provide Equitable Instruction, Learning, and Assessments to Diverse Students. Structured poster session at the annual meeting of the American Educational Research Association, San Antonio, TX.
Alozie, N., Madden, K., Zaidi, S., Haugabook Pennock, P., & Harris, C. J. (2018, April). Challenges in designing instructionally supportive science assessments using culturally relevant principles for diverse students. Paper presented at the American Educational Research Association Annual Meeting, New York, NY.
Alozie, N., Pennock, P.H., Madden, K., Zaidi, S., Harris, C. J., & Krajcik, J. (2018, March). Designing and developing NGSS-aligned formative assessment tasks to promote equity. Paper presented at the annual conference of National Association for Research in Science Teaching, Atlanta, GA.
Dahsah, C., Lee, J., DeBarger, A., Damelin, D., & Krajcik, J. (2015, April). Involving teachers in developing assessments aligned with NGSS using a 7-step process. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Chicago, IL.
Damelin, D. (2014). Developing Assessments for the NGSS. @Concord, 18(2), 12-13.
Damelin, D. (2017). Using Technology to Enhance NGSS-Aligned Assessment Tasks for Classroom Formative Use. @Concord, 21(1), 8-9.
DeBarger, A. H., Harris, C. J., D’Angelo, C., Krajcik, J., Dahsah, C., Lee, J., & Beauvineau, Y. (2014). Constructing assessment items that blend core ideas and science practices. In J. L. Polman, E. A. Kyza, D. K. O'Neill, I. Tabak, W. R. Penuel, A. S. Jurow, K. O’Connor, T. Lee, & L. D’Amico (Eds.), Learning and becoming in practice: The International Conference of the Learning Sciences (ICLS) 2014, Vol. 3. Boulder, CO: International Society of the Learning Sciences.
Gane, B. D., McElhaney, K. W., Zaidi, S. Z., & Pellegrino, J. W. (2018, March). Analysis of student and item performance on three-dimensional constructed response assessment tasks. Paper presented at the 2018 NARST Annual International Conference, Atlanta, GA.
Gane, B. D., Zaidi, S. Z., & Pellegrino, J. W. (2018). Measuring what matters: Using technology to assess multidimensional learning. European Journal of Education, 53, 176–187. https://doi.org/10.1111/ejed.12269
Harris, C. J., Krajcik, J., Pellegrino, J. & DeBarger, A. H. (in press, 2019). Designing knowledge-in-use assessments to promote deeper learning. Educational Measurement: Issues and Practice.
Harris, C. J., Krajcik, J. S., Pellegrino, J. W., & McElhaney, K. W. (2016). Constructing assessment tasks that blend disciplinary core ideas, crosscutting concepts, and science practices for classroom formative applications. Menlo Park, CA: SRI International.
Harris, C. J., Krajcik, J. S., Pellegrino, J. W., McElhaney, K. M., Pennock, P., H., & Gane, B. (2018, March). Designing classroom-based assessments for supporting three-dimensional teaching and learning. Paper presented at the NARST 2018 Annual International Conference, Atlanta, GA.
McElhaney, K., D’Angelo, C., Harris, C. J., Seeratan, K., Stanford, T., & DeBarger, A. (2015, April). Integrating crosscutting concepts into 3-dimensional scoring rubrics. Poster and paper presented at the annual meeting of the National Association for Research in Science Teaching, Chicago, IL.
McElhaney, K. W., Gane, B. D., DiBello, L. V., Fujii, R., Pennock, P. H., Vaishampayan, G., & Pellegrino, J. W. (2017, May). Designing scoring rubrics to support NGSS-aligned, classroom-based formative assessment. Paper and poster presented at the American Educational Research Association Annual Meeting, San Antonio, TX.
McElhaney, K., Gane, B. D., Harris, C. J., Pellegrino, J. W., DiBello, L. V., & Krajcik, J. S. (2016, April). Using learning performances to design three-dimensional assessments of science proficiency. Paper presented at the annual conference of National Association for Research in Science Teaching, Baltimore, MD.
McElhaney, K.W., Zaidi, S., Gane, B. D., Alozie, N., & Harris, C.J. (2018, March). Designing NGSS-aligned assessment tasks and rubrics to support classroom-based formative assessment. Paper presented at the NARST Annual International Conference, Atlanta, GA.
McElhaney, K., Vaishampayan, G., D’Angelo, C., Harris, C. J., Pellegrino, J. W., & Krajcik, J. (2016, June). Using learning performances to design science assessments that measure knowledge-in-use. In C. K. Looi, J. L. Polman, U. Cress, & P. Reiman (Eds.), Transforming learning, empowering learners: Proceedings of the 12th international conference of the learning sciences (ICLS) 2016, Vol. 2 (pp. 1211–1212). Singapore: International Society of the Learning Sciences.
Pellegrino, J. W. (2015, August). Measuring what matters: Challenges and opportunities in assessing science proficiency. In Proceedings of the Learning Assessments Research Conference (pp. 54–58). Melbourne, Australia: Australian Council for Educational Research.
Pellegrino, J. W., DiBello, L. V., & Goldman, S. R. (2016). A framework for defining and evaluating the validity of instructionally relevant assessments. Educational Psychologist, 51(1), 59-81.
Pellegrino, J. W., Gane, B. D., Zaidi, S. Z., Harris, C. J., McElhaney, K. W., Alozie, N., Haugabook Pennock, P., Severance, S., Neumann, K., Fortus, D., Krajcik, J., Nordine, J., Furtak, E. M., Briggs, D., Chattergoon, R., Penuel, B., Wingert, K., & Van Horne, K. (2018). The challenge of assessing “knowledge in use”: Examples from three-dimensional science learning and instruction. In J. Kay & R. Luckin (Eds.), Rethinking Learning in the Digital Age: Making the Learning Sciences Count, Proceedings of the 13th International Conference of the Learning Sciences (ICLS) 2018, Vol. 2 (pp. 1211–1218). London, UK: International Society of the Learning Sciences.
Pellegrino, J. W., Harris, C. J., Krajcik, J., Gane, B. D., McElhaney, K. W., Pennock, P. H., Alozie, N., & Zaidi, S. Z. (2018). Design of next generation science assessments: Measuring what matters. In J. Kay & R. Luckin (Eds.), Rethinking Learning in the Digital Age: Making the Learning Sciences Count, Proceedings of the 13th International Conference of the Learning Sciences (ICLS) 2018, Vol. 2 (pp. 1212–1213). London, UK: International Society of the Learning Sciences.
Pellegrino, J. W., Krajcik, J., Harris, C., & Damelin, D. (2016). Constructing science assessment tasks that integrate disciplinary core ideas, science practices and crosscutting concepts. Paper presented at the EARLI SIG on Assessment and Evaluation Conference, Munich, Germany.
Pennock, P. H., Alozie, N., & Morales, C. (2018, March). Assessing 3-D learning with instructionally supportive tasks and rubrics. Paper presented at the 2018 Annual National Conference of the National Science Teachers Association (NSTA), Atlanta, GA.
Pennock, P. H., & Severance, S. (2018, March). Comparative analysis of three-dimensional research-based and classroom based rubrics for formative assessment. Paper presented at the NARST Annual International Conference, Atlanta, GA.
Zaidi, S. Z., Ko, M., Gane, B. D., Madden, K., Gaur, D., & Pellegrino, J. W. (2018, March). Portraits of teachers using three-dimensional assessment tasks to inform instruction. Paper presented at the NARST Annual International Conference, Atlanta, GA.