Reliability

Examining Formative Assessment Practices for English Language Learners in Science Classrooms (Collaborative Research: Li)

This is an exploratory study to identify critical aspects of effective science formative assessment (FA) practices for English Language Learners (ELLs), and the contextual factors influencing such practices. FA, in the context of the study, is viewed as a process contributing to the science learning of ELLs, as opposed to the administration of discrete sets of instruments to collect data from students. The study targets Spanish-speaking elementary and middle school students.

Lead Organization(s): 
Partner Organization(s): 
Award Number: 
1118951
Funding Period: 
09/01/2011 to 08/31/2013
Project Evaluator: 
Advisory board members
Full Description: 

This is a two-year exploratory study to identify critical aspects of effective science formative assessment (FA) practices for English Language Learners (ELLs), and the contextual factors influencing such practices. Three institutions join efforts for this purpose: University of Colorado at Boulder, University of Colorado at Denver, and University of Washington. FA, in the context of the study, is viewed as a process contributing to the science learning of ELLs, as opposed to the administration of discrete sets of instruments to collect data from students. The study targets Spanish-speaking elementary and middle school students. Findings from this study contribute to advancing knowledge and understanding of FA as an inherent component of the science learning process in linguistically diverse classrooms, and to defining a research agenda aimed at enhancing science teachers' ability to enact equitable and effective assessment practices for this student subpopulation.

Three research questions guide the work: (1) What FA practices are occurring in science classrooms that serve predominantly mainstream students and in those serving predominantly ELLs? (2) How are teachers' FA practices for mainstream students different from or similar to those used with ELLs? and (3) How do contextual factors and teachers' cultural and linguistic competencies influence FA practices? To address these questions, two conceptual frameworks are used--one for characterizing FA events, the other for examining FA events as a communication process. The study employs a mixed-methods research approach with emphasis on case studies. The sample consists of three school districts in Colorado and Washington, 16 classrooms (8 elementary, 8 middle school), 16 teachers, and 96 ELLs. Classrooms are selected to represent a particular combination of four factors: (a) teacher ethnicity, (b) teacher formal academic preparation in teaching ELLs, (c) type of linguistic student background, and (d) grade level. Students are selected through a stratified random sample, identified by achievement level (i.e., low, medium, high) and linguistic background (i.e., mainstream, ELL). Data collection strategies to document the implementation of FA at the beginning, during, and at the end of a science unit include: (a) classroom observation protocols, (b) classroom video-recording, (c) video/artifact simulated recall, (d) assessment artifacts, (e) student interviews, (f) teacher questionnaires, (g) teacher interviews, (h) school principal interviews, and (i) school observations. Reliability and validity of most of the data-gathering instruments are determined through pilot studies. Data interpretation strategies include: (a) coding based on the two conceptual frameworks, (b) scoring rubrics to identify levels of effectiveness, and (c) narratives and profiles to describe FA patterns. Publications and the development of a website constitute the main dissemination strategies.
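The stratified selection described above (achievement level crossed with linguistic background) can be sketched as follows. This is a minimal illustration only; the field names, strata labels, and sampling routine are assumptions, not the project's actual data model or procedure.

```python
import random

def stratified_sample(students, per_stratum, seed=0):
    """Draw an equal-size random sample from each stratum, where a
    stratum is an (achievement level, linguistic background) pair.
    Field names ('achievement', 'background') are illustrative."""
    rng = random.Random(seed)
    strata = {}
    for s in students:
        key = (s["achievement"], s["background"])
        strata.setdefault(key, []).append(s)
    # Sample independently within each stratum so that every
    # achievement x background combination is represented.
    return {key: rng.sample(group, per_stratum)
            for key, group in strata.items()}
```

Sampling within strata rather than from the pooled roster guarantees that low-achieving ELLs, for example, appear in the sample even if they are a small fraction of the classroom.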
A technical advisory board is responsible for formative and summative evaluation. Key evaluation questions are: (1) To what extent does the project enhance research on ELL FA practices through case studies? and (2) How effectively do the project's dissemination activities facilitate understanding of FA practices?

Major project outcomes include: (1) a description of the patterns of formal and informal FA practices for ELLs; (2) a comparison of the FA practices observed in classrooms that vary on the dimensions of teacher characteristics and linguistic diversity; and (3) an empirically and theoretically informed set of findings and strategies for supporting teachers to enact and enhance FA practices sensitive to cultural and linguistic diversity. Three main products are developed: (1) a monograph describing the FA practices observed across the different classrooms with concrete examples; (2) a description of possible professional development strategies to improve in-service FA practices for linguistically diverse students; and (3) a research-informed approach for analyzing FA practices. Besides filling the existing research gap on FA with ELLs, outcomes and products serve as a foundation for a future research agenda and a comprehensive project aimed at ensuring equitable science learning for all students, including ELLs.

Levels of Conceptual Understanding in Statistics (LOCUS)

LOCUS (Levels of Conceptual Understanding in Statistics) is an NSF-funded DRK-12 project (NSF #1118168) focused on developing assessments of statistical understanding. These assessments will measure students’ understanding across levels of development as identified in the Guidelines for Assessment and Instruction in Statistics Education (GAISE). The intent of these assessments is to provide teachers and researchers with a valid and reliable assessment of conceptual understanding in statistics consistent with the Common Core State Standards (CCSS).

Lead Organization(s): 
Partner Organization(s): 
Award Number: 
1118168
Funding Period: 
09/01/2011 to 08/31/2012
Project Evaluator: 
TERC, Jim Hammerman
Full Description: 

The goal of this project is to develop two tests (instruments) to assess conceptual understanding of statistics. The instruments are based on levels A/B and on level C of statistical understanding development as described in the American Statistical Association Guidelines for Assessment and Instruction in Statistics Education (GAISE) framework. These instruments will be used to assess knowledge of statistics by grades 6-12 students. The instruments will have multiple-choice and constructed response (CR) items. The CR items will have scoring rubrics. The assessments will be pilot tested in school districts in six states. The instruments will be used by teachers to analyze students' growth in understanding of statistics and will be usable for both formative and summative purposes. An assessment blueprint will be developed based on the GAISE framework for selecting and constructing both fixed-choice and open-ended items. An evidence-based design process will be used to develop the assessments. The blueprint will be used by the test development committee to develop items. These items will be reviewed by the advisory board, considering the main statistics topics to be included on the assessments. Through a layering process, the assessments will be piloted, revised, and field tested with students in grades 6-12 in six states. A three-parameter IRT model will be used in analyzing the items. The work will be done by researchers at the University of Florida with the support of those at the University of Minnesota, the Educational Testing Service, and Kenyon College. Researchers from TERC will conduct a process evaluation with several feedback and redesign cycles.
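The three-parameter (3PL) IRT model mentioned above expresses the probability of a correct response as a function of examinee ability and three item parameters: discrimination, difficulty, and pseudo-guessing. A minimal sketch of the standard 3PL formula (illustrative code, not the project's analysis software):

```python
import math

def p_correct(theta, a, b, c):
    """Three-parameter logistic (3PL) model: probability that an
    examinee with ability `theta` answers an item correctly, given
    discrimination `a`, difficulty `b`, and pseudo-guessing `c`."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))
```

At theta = b the probability sits midway between the guessing floor c and 1, which is why b is read as the item's difficulty; the guessing parameter c matters for the fixed-choice items, where even low-ability students answer correctly at better-than-zero rates.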

The assessments will be aligned with the Common Core State Standards for mathematics (CCSSM) and made available as open-source to teachers through a website. The research team will interact with the state consortia developing assessments to measure students' attainment of the CCSSM. As such, the assessments have the potential to be used by a large proportion of students in the country. The more conceptually based items will provide teachers with concrete examples of what statistics students in grades 6-12 should know.

Modeling Engineered Levers for the 21st Century Teaching of STEM (Collaborative Research: Schunn)

This project will develop three replacement units for biology and refine them through classroom testing. The units will be models of STEM integration by using the important concepts of proportional reasoning and algebraic thinking and engineering re-design to address big ideas in science while also promoting the learning of 21st century skills. The materials will be educative for teachers, and the teacher materials and professional development methods will work at scale and distance.

Project Email: 
Lead Organization(s): 
Partner Organization(s): 
Award Number: 
1027629
Funding Period: 
09/01/2010 to 08/31/2014
Project Evaluator: 
Bill Bickel
Full Description: 

Research in biology has become increasingly mathematical, but high school courses in biology use little mathematics. To address this concern, this project will develop three replacement units for biology and refine them through classroom testing. The units will be models of STEM integration by using the important concepts of proportional reasoning and algebraic thinking and engineering re-design to address big ideas in science while also promoting the learning of 21st century skills. The materials build on existing work on the use of model-eliciting activities and focus science and technology instruction on high-stakes weaknesses in mathematics and science. They address the scaling issue as part of the core design work by developing small units of curriculum that can be applied by early adopters in each context. The materials will undergo many rounds of testing and revision in the early design process with at least ten teachers each time. The materials will be educative for teachers, and the teacher materials and professional development methods will work at scale and distance.

Learning of science content will be measured through existing instruments in wide use. Existing scales of task values, achievement goals, and interest will be used to measure student motivation. The work is guided by a content team; a scaling materials team; a scaling research team; the PI team of a cognitive scientist, a robotics educator, and a mathematics educator specializing in educational reform at scale; and the summative evaluation team led by an external evaluator.

There is great interest in understanding whether integrated STEM education can interest more students in STEM disciplines. The focus on mathematics integrated with engineering in the context of a science topic is interesting and novel and could contribute to our understanding of integrating mathematics, engineering and science. The development team includes a cognitive scientist, a mathematics educator, teachers and scientists. The issues and challenges of interdisciplinary instruction will be investigated.

Using Rule Space and Poset-Based Adaptive Testing Methodologies to Identify Ability Patterns in Early Mathematics and Create a Comprehensive Mathematics Ability Test

This project will develop a new assessment for children ages 3-7 to provide teachers with diagnostic information on a child's development of mathematics facility on ten domains such as counting, sequencing, adding/subtracting, and measurement. The Comprehensive Research-based Mathematics Assessment (CREMA) is being developed using innovative psychometric models to reveal information about children on specific attributes for each of the 10 domains.

Project Email: 
Lead Organization(s): 
Partner Organization(s): 
Award Number: 
1313695
Funding Period: 
09/01/2012 to 02/28/2018
Full Description: 

A new assessment for children ages 3-7 is being developed to provide teachers with diagnostic information on a child's development of mathematics facility on ten domains such as counting, sequencing, adding/subtracting, and measurement. The Comprehensive Research-based Mathematics Assessment (CREMA) is being developed using innovative psychometric models to reveal information about children on specific attributes for each of the 10 domains. The CREMA will produce information based on carefully developed learning trajectories in a relatively short period of time by using computer adaptive testing. The project is guided by two goals: 1) to produce a cognitively diagnostic adaptive assessment that will yield more useful and detailed information about students' knowledge of mathematics than previously possible, and 2) to subject the developmental progressions to close cognitive diagnosis using cutting-edge psychometric approaches. An item pool of about 350 items is being developed that can be used to identify the level of understanding children ages 3-7 have on the 10 domains that have been identified as foundational to further learning in mathematics. A research team headed by Dr. Douglas Clements at the University at Buffalo is conducting the development work while being assisted by Dr. Curtis Tatsuoka, a statistician at Case Western Reserve University.

The CREMA is being developed using leading-edge psychometric models based on Q-matrix theory, rule-space models, and posets. The initial item pool includes items from the REMA, a previously developed instrument based on unidimensional IRT models. New items are being piloted with at least 200 students from a group of a total of 800 students evenly distributed among pre-K to grade 2. The successful items are then used to create the new CREMA. The new assessment is being field tested with 300 children, pre-K to grade 2. A random sample of 50 students (at least 10 from each grade) is being videotaped as they work through the items. Specific criteria of convergence are being used for feedback on how specific items are performing to meet the required specifications. An external evaluator is auditing the process and doing spot checks of item codings and other analyses performed.
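The Q-matrix idea underlying the rule-space approach can be illustrated with a deliberately simplified sketch: each row of the Q-matrix flags the attributes an item requires, and in a noise-free, conjunctive model a child answers an item correctly exactly when every required attribute is mastered. (The project's actual rule-space and poset models are far richer, handling slips, guesses, and partial orderings among knowledge states.)

```python
def ideal_response(q_matrix, mastered):
    """Noise-free conjunctive model: item j is answered correctly
    iff every attribute flagged in row j of the Q-matrix is mastered.
    `q_matrix` is items x attributes (0/1); `mastered` is a boolean
    vector over attributes."""
    return [all(mastered[k] for k, required in enumerate(row) if required)
            for row in q_matrix]
```

Comparing a child's observed response pattern against the ideal patterns of candidate knowledge states is what lets the adaptive test report mastery of specific attributes, not just a single ability score.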

The main product will be the CREMA, which will be made widely available. This computer-adaptive instrument will provide teachers with ready information on young children's understanding of critical mathematical ideas. The new psychometric models used and developed to process multiple attributes from individual items will make large strides in advancing the field of mathematics assessment of young children. A publisher has expressed interest in making the assessment widely available, which increases the likelihood that the assessment will have a large impact on early childhood mathematics learning.

This project was previously funded under award # 1019925.

Chemistry Facets: Formative Assessment to Improve Student Understanding in Chemistry

This project implemented a facets-of-thinking perspective to design tools and practices to improve high school chemistry teachers' formative assessment practices. Goals are to identify and develop clusters of facets related to key chemistry concepts; develop assessment items; enhance the assessment system for administering items, reporting results, and providing teacher resource materials; develop teacher professional development and resource materials; and examine whether student learning in chemistry improves in classes that incorporate a facet-based assessment system.

Partner Organization(s): 
Award Number: 
0733169
Funding Period: 
09/15/2007 to 08/31/2011
Project Evaluator: 
Heller Research Associates
Full Description: 

Supported by research on students' preconceptions, particularly in chemistry, and the need to build on the knowledge and skills that students bring to the classroom, this project implements a facets-of-thinking perspective for the improvement of formative assessment, learning, and instruction in high school chemistry. Its goals are: to identify and develop clusters of facets (students' ideas and understandings) related to key high school chemistry concepts; to develop assessment items that diagnose facets within each cluster; to enhance the existing web-based Diagnoser assessment system for administering items, reporting results, and providing teacher resource materials for interpreting and using the assessment data; to develop teacher professional development and resource materials to support teachers' use of facet-based approaches in chemistry; and to examine whether student learning in chemistry improves in classes that incorporate a facet-based assessment system.

The proposed work builds on two previously NSF-funded projects focused on designing Diagnoser (ESI-0435727) in the area of physics and on assessment development to support the transition to complex science learning (REC-0129406). The work plan is organized in three strands: (1) Assessment Development, consisting of the development and validation of facet clusters related to the Atomic Structure of Matter and Changes in Matter and the development and validation of question sets related to each facet cluster, including their administration to chemistry classes; (2) Professional Development, through which materials will be produced for a teacher workshop focused on the assessment-for-learning cycle; and (3) Technology Development, to upgrade the Diagnoser authoring system and to include chemistry facets and assessments.

Anticipated products include: (1) 8-10 validated facet clusters related to the Atomic Structure of Matter and Changes in Matter; (2) 12-20 items per facet cluster that provide diagnostic information about student understanding in relation to the facet clusters; (3) additional instructional materials related to each facet cluster, including 1-3 questions to elicit initial student ideas, a developmental lesson to encourage students' exploration of new concepts, and 3-5 prescriptive lessons to address persistent problematic ideas; and (4) a publicly available web-based Diagnoser for chemistry (www.Diagnoser.com), including student assessments and instructional materials.

Science Literacy through Science Journalism (SciJourn)

This project aims to develop, pilot, and evaluate a model of instruction that advances the scientific literacy of high school students by involving them in science journalism, and to develop research tools for assessing scientific literacy and engagement. We view scientific literacy as public understanding of and engagement with science and technology, better enabling people to make informed science-related decisions in their personal lives, and participate in science-related democratic debates in public life.


Project Email: 
Lead Organization(s): 
Award Number: 
0822354
Funding Period: 
09/01/2008 to 08/31/2012
Project Evaluator: 
Brian Hand, University of Iowa
Full Description: 

For a more in-depth look at SciJourn, visit the project spotlight.
