Assessment

On the Design and Implementation of Practical Measures to Support Instructional Improvement at Scale

Day
Thu

Learn about two efforts to design and implement practical measures of science and mathematics teaching to inform school and district instructional improvement efforts.


In contrast to evaluative research that relies on accountability measures, improvement science research (Bryk, Gomez, Grunow, & LeMahieu, 2015) uses practical measures designed to provide practitioners with frequent, rapid feedback so that they can assess and adjust instruction during implementation. The resulting data are potentially useful to multiple stakeholders. For example, practical measures can orient teachers to key aspects of the classroom that might otherwise be invisible to them.

References

Bryk, A. S., Gomez, L. M., Grunow, A., & LeMahieu, P. (2015). Learning to improve: How America's schools can get better at getting better. Cambridge, MA: Harvard Education Press.
Yeager, D., Bryk, A. S., Muhich, J., Hausman, H., & Morales, L. (2013). Practical measurement. Stanford, CA: Carnegie Foundation for the Advancement of Teaching.

Kara Jackson, Jessica Thompson

Game-Based Learning Assessments: Using Data from Digital Games to Understand Learning

Day
Thu

Discover how digital games can inform classroom teaching using data from innovative formative assessments from three different game-based projects.


This session aims to open up a conversation about how games can be used for formative assessment and how data from digital games can inform classroom teaching.


Data-Intensive Research in Education: New Opportunities for Making an Impact

Day
Thu

Join a facilitated discussion about the application of data science to education, drawing on a recent NSF-sponsored report. Participants share insights from DR K–12 projects.


The Computing Research Association’s report from an NSF-sponsored workshop describes seven next steps for data-intensive research in education:


STEM Smart Brief: Teaching and Learning Under the Next Generation Science Standards

This brief gives an overview—and by no means a comprehensive one—of several NGSS-aligned projects in the areas of curriculum, instruction, assessment, and professional development.

Author/Presenter

CADRE

Year
2016
Short Description

This brief gives an overview—and by no means a comprehensive one—of several NGSS-aligned projects in the areas of curriculum, instruction, assessment, and professional development.

ScratchJr: A Coding Language for Kindergarten

Computer programming for young children has grown in popularity among both education researchers and product developers, but relatively little is still known about how to assess and track young children's learning through coding. This study presents an assessment tool for tracking kindergarten through second-grade students' learning after they engage in a programming curriculum. Researchers worked with 57 kindergarten through second-grade students over seven weeks to implement a ScratchJr curriculum that introduced sequencing concepts through the creation of animated stories, collages, and games.

Author/Presenter

Amanda Strawhacker

Dylan Portelance

Marina Bers

Year
2015
Short Description

A paper on the prototype evolution of the ScratchJr programming environment.

Towards Domain-Independent Assessment of Elementary Students’ Science Competency using Soft Cardinality

Automated assessment of student learning has become the subject of increasing attention. Students’ textual responses to short answer questions offer a rich source of data for assessment. However, automatically analyzing textual constructed responses poses significant computational challenges, exacerbated by the disfluencies that occur prominently in elementary students’ writing. With robust text analytics, there is the potential to analyze a student’s text responses and accurately predict his or her future success.

Author/Presenter

Samuel P. Leeman-Munk

Angela Shelton

Eric N. Wiebe

James C. Lester

Year
2014
Short Description

This paper presents a novel application of the soft cardinality text analytics method to support assessment of students' textual responses.
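
The paper's specific scoring pipeline is not reproduced here, but the core idea behind soft cardinality can be illustrated with a short sketch: near-duplicate tokens (for example, a misspelled word and its correctly spelled form) count as less than two full set elements, which makes set-overlap scores tolerant of the disfluencies common in young students' writing. The sketch below is a minimal illustration only, assuming a character n-gram Dice coefficient as the word-level similarity; the function names, the parameter p, and the example sentences are illustrative choices, not taken from the paper.

from typing import Iterable

def char_ngrams(token: str, n: int = 3) -> set:
    # Character n-grams with '#' padding so short or misspelled words still share features.
    padded = "#" + token.lower() + "#"
    if len(padded) <= n:
        return {padded}
    return {padded[i:i + n] for i in range(len(padded) - n + 1)}

def token_sim(a: str, b: str, n: int = 3) -> float:
    # Dice overlap of character n-grams; any word-level similarity in [0, 1] would do here.
    ga, gb = char_ngrams(a, n), char_ngrams(b, n)
    return 2.0 * len(ga & gb) / (len(ga) + len(gb))

def soft_cardinality(tokens: list, p: float = 2.0) -> float:
    # |A|' = sum over a of 1 / sum over b of sim(a, b)**p.
    # Reduces to ordinary set cardinality when sim is exact string match.
    if not tokens:
        return 0.0
    return sum(1.0 / sum(token_sim(a, b) ** p for b in tokens) for a in tokens)

def soft_jaccard(response: Iterable, reference: Iterable, p: float = 2.0) -> float:
    # Jaccard similarity computed on soft cardinalities of the two token sets.
    a, b = list(set(response)), list(set(reference))
    union = list(set(a) | set(b))
    card_a, card_b, card_u = soft_cardinality(a, p), soft_cardinality(b, p), soft_cardinality(union, p)
    if card_u == 0.0:
        return 0.0
    intersection = card_a + card_b - card_u  # soft estimate of |A intersect B|
    return intersection / card_u

# A misspelled student answer still scores close to the reference wording.
reference_answer = "plants need sunlight and water to grow".split()
student_answer = "plnts need sunlite and watr to grow".split()
print(round(soft_jaccard(student_answer, reference_answer), 2))

In an assessment setting, a score like this toy comparison could be thresholded or fed to a downstream model; the paper itself applies soft cardinality to elementary students' constructed responses rather than to hand-picked sentence pairs.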

Assessing Elementary Students' Science Competency with Text Analytics

Real-time formative assessment of student learning has become the subject of increasing attention. Students’ textual responses to short answer questions offer a rich source of data for formative assessment. However, automatically analyzing textual constructed responses poses significant computational challenges, and the difficulty of generating accurate assessments is exacerbated by the disfluencies that occur prominently in elementary students’ writing. With robust text analytics, there is the potential to accurately analyze students’ text responses and predict students’ future success.

Author/Presenter

Samuel P. Leeman-Munk

Eric N. Wiebe

James C. Lester

Year
2014
Short Description

This paper presents WriteEval, a hybrid text analytics method for analyzing student-constructed responses.

Swimming Upstream in a Torrent of Assessment

Growing attention to preK mathematics and increased focus on standards in the US may be leading policy makers, administrators, and practitioners down the wrong path when it comes to assessing young children. The temptation to rely on standardised assessment practices may result in misguided understandings about what children actually know about mathematics.

Author/Presenter

Anita A. Wager

M. Elizabeth Graue

Kelly Harrigan

Year
2015