
Next Generation STEM Learning for All: Envisioning Advances Based on NSF-Supported Research

 
On November 9, 2015, STELAR (the ITEST resource network) and CADRE (the DR K-12 resource network) convened an NSF-supported STEM forum. This report grew out of the discussion at that forum.
 
About the Report
How can research-based findings and advances help society to re-envision STEM learning and education?
 
Author/Presenter

Carrie Parker

Sarita Pillai

Jeremy Roschelle

Year
2016
Short Description

How can research-based findings and advances help society to re-envision STEM learning and education? This report captures key takeaways, strategies, and challenges identified during the November 2015 workshop, including: research-based advances for STEM learning; multiple stakeholder communities around STEM schools; social justice, equity, and excellence in STEM schools and communities; and scale and sustainability.

Constructing Assessment Tasks that Blend Disciplinary Core Ideas, Crosscutting Concepts, and Science Practices for Classroom Formative Applications

How do we measure knowledge in use? In this paper we describe how we use principles of evidence-centered design to develop classroom-based science assessments that integrate three dimensions of science proficiency—disciplinary core ideas, science practices, and crosscutting concepts. In our design process, we first elaborate on, or “unpack”, the assessable components of the three dimensions.

Author/Presenter

Christopher J. Harris

Joseph S. Krajcik

James W. Pellegrino

Kevin W. McElhaney

Year
2016
Short Description

How do we measure knowledge in use? In this paper we describe how we use principles of evidence-centered design to develop classroom-based science assessments that integrate three dimensions of science proficiency—disciplinary core ideas, science practices, and crosscutting concepts.

Broadening Participation — Making STEM Learning Relevant and Rigorous for All Students

This CADRE brief explores factors that contribute to opportunity gaps in STEM education based on race, ethnicity, gender, ability, and socioeconomic status. It showcases the work of several DR K-12 projects and describes promising approaches for removing barriers for underrepresented groups and enhancing the STEM learning of all students.

Author/Presenter

CADRE

Year
2015
Short Description

This brief explores factors that contribute to opportunity gaps in STEM education based on race, ethnicity, gender, ability, and socioeconomic status. It showcases the work of several DR K-12 projects and describes promising approaches for removing barriers for underrepresented groups and enhancing the STEM learning of all students.

The Impact of Information and Communication Technology (ICT) Usage on Psychological Well-Being among Urban Youth

Coleman, L. O., Hale, T. M., Cotten, S. R., & Gibson, P. (2015). The impact of information and communication technology (ICT) usage on psychological well-being among urban youth. In S. L. Blair, P. N. Claster, & S. M. Claster (Eds.), Technology and Youth: Growing Up in a Digital World (Sociological Studies of Children and Youth, Vol. 19, pp. 267-291). Emerald Group Publishing Limited.

Author/Presenter

LaToya O’Neal Coleman

Timothy M. Hale

Shelia R. Cotten

Philip Gibson

Year
2015
Short Description

Information and communication technology (ICT) usage is pervasive among present-day youth, with about 95% of youth ages 12-17 reporting use of the Internet. Given the proliferation of ICT use in this generation, it is important to understand its impacts on well-being. The goal of this study was to determine the impact of ICT usage on psychological well-being among a sample of urban, predominantly African American youth.

Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models

Pallant, A., & Lee, H.-S. (2015). Constructing scientific arguments using evidence from dynamic computational climate models. Journal of Science Education and Technology, 24(2-3), 378-395. https://doi.org/10.1007/s10956-014-9499-3

Author/Presenter

Amy Pallant

Hee-Sun Lee

Year
2014
Short Description

Modeling and argumentation are two important scientific practices students need to develop throughout their school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: a multiple-choice claim, an open-ended explanation, a five-point Likert-scale uncertainty rating, and an open-ended uncertainty rationale.
We coded 1,294 scientific arguments in terms of a claim’s consistency with the current scientific consensus, whether explanations were model-based or knowledge-based, and the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students’ dependence on model results and their uncertainty ratings diminished as the dynamic climate models became increasingly complex, (4) some students’ misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students’ uncertainty sources reflected their assessment of personal knowledge or abilities related to the tasks more frequently than their critical examination of the scientific evidence resulting from the models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.
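To make the statistical step above concrete, the following minimal Python sketch runs a chi-square test of independence between climate model level and explanation type. The counts are invented for illustration only; they are not the study's coded data.

# Illustrative only: chi-square test of independence between climate model
# (rows) and explanation type (columns). Counts are hypothetical, standing in
# for the 1,294 coded arguments described in the abstract.
from scipy.stats import chi2_contingency

observed = [
    [180, 120],  # Model 1: model-based vs. knowledge-based explanations
    [200, 150],  # Model 2
    [160, 190],  # Model 3
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")

A significant result in this layout would indicate that explanation type is not independent of model complexity, which is the kind of pattern the study's chi-square tests were used to detect.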

Assessment of Uncertainty-Infused Scientific Argumentation

Lee, H.-S., Liu, O. L., Pallant, A., Roohr, K. C., Pryputniewicz, S., & Buck, Z. (2014). Assessment of uncertainty-infused scientific argumentation. Journal of Research in Science Teaching, 51(5), 581-605.

Author/Presenter

Hee-Sun Lee

Lydia Liu

Amy Pallant

Katrina Crotts Roohr

Sarah Pryputniewicz

Zoë E. Buck

Year
2014
Short Description

Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students’ scientific argumentation. In this study, we initially defined a scientific argumentation construct with four structural elements: claim, justification, uncertainty qualifier, and uncertainty rationale. We consulted the literature to characterize and score different levels of student performance on each of these four argumentation elements. We designed a test composed of nine scientific argumentation tasks addressing climate change, the search for life in space, and fresh water availability, and administered it to 473 students from 9 high schools in the United States. After testing the local dependence and unidimensionality assumptions, we found that the uncertainty qualifier element was not aligned with the other three. After removing the items related to the uncertainty qualifier, we applied a Rasch analysis based on a Partial Credit Model. Results indicate that (1) claim, justification, and uncertainty rationale items form a unidimensional scale, (2) justification and uncertainty rationale items contribute the most to the unidimensional scientific argumentation scale, as they cover much wider ranges of the scale than claim items, (3) average item difficulties increase in the order of claim, justification, and uncertainty rationale, (4) students’ elaboration of uncertainty exhibits dual characteristics: self-assessment of their own knowledge and ability versus scientific assessment of conceptual and empirical errors embedded in investigations, and (5) students who can make warrants between theory and evidence are more likely to think about uncertainty from scientific sources than those who cannot. We identified limitations of this study in terms of science topic coverage and sample selection and made suggestions about how these limitations might have affected the results and interpretations.
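As a rough illustration of the Partial Credit Model underlying the Rasch analysis described above, the short Python sketch below computes category probabilities for a single polytomous item. The ability value and step difficulties are hypothetical and are not estimates from the study.

# Conceptual sketch of the Partial Credit Model (Masters, 1982) category
# response function. Values below are invented for illustration.
import numpy as np

def pcm_category_probs(theta, step_difficulties):
    # P(score = k), k = 0..m, for one item under the Partial Credit Model.
    # theta: person ability in logits; step_difficulties: delta_1..delta_m in logits.
    # Cumulative sums of (theta - delta_j); the score-0 category contributes 0.
    cumulative = np.concatenate(([0.0], np.cumsum(theta - np.asarray(step_difficulties))))
    numerators = np.exp(cumulative)
    return numerators / numerators.sum()

# Example: a three-category item (scores 0-2) with hypothetical step difficulties.
print(pcm_category_probs(theta=0.5, step_difficulties=[-1.0, 1.2]))

In this parameterization, an item whose steps span a wide range of the logit scale covers a wider range of the construct, which matches the abstract's observation that justification and uncertainty rationale items cover wider ranges of the scale than claim items.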