Westat will perform a multi-year evaluation of the DR K-12 program. The evaluation will examine the results and effects of the DR K-12 program from its inception in 2006 through 2011, including all grants active in those years. In addition, some analysis of themes that reach back into the legacy programs of DR K-12 (i.e., the IMD, CLT, and TPC programs) will be conducted as warranted to understand trends and impacts.
Five broad evaluation questions guide the evaluation, though others may be forthcoming:

1. What does the portfolio of funded projects look like?
2. What percentage of development-intensive projects funded by the DR K-12 program employ appropriate methods to evaluate efficacy, and apply those methods rigorously? What methods were used to study these projects? What have been the results of these studies? What is the quality of non-development projects?
3. What percentage of resources (instructional programs, models, or interventions) developed by the DR K-12 program are found to be effective and ready for adoption at scale?
4. What are the combined effects of the DR K-12 projects that have been evaluated with rigorous methods? What do these combined effects contribute to the knowledge base about innovative approaches to improving STEM learning and teaching? What are the effects of non-development projects?
5. Do the resources, models, tools, and technologies developed and/or studied in DR K-12 projects lead to significant improvement in student learning? To significant improvement in teacher STEM competency?

The evaluation will:

1. Collect and analyze secondary data, including a portfolio analysis of all funded projects and a meta-analysis of all development-intensive projects, to examine both the portfolio and its effects on students and teachers.
2. Administer a web survey of principal investigators about the perceived impacts of their projects and the evidence supporting those claims.
3. Perform bibliometric analyses of publications and citations to examine impacts that extend beyond the program to others in the STEM research field.
4. Conduct follow-up studies.
Examples may include expert panel reviews to judge the quality and usefulness of potentially promising products or exploratory projects that have been identified, as well as case studies of selected projects or products that are either important but under-studied or unique from a methodological or substantive perspective.
5. Establish an advisory group with substantive expertise in STEM content areas, STEM education, and evaluation research to provide advice and constructive criticism throughout the duration of the project.
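The combined-effects question and the planned meta-analysis both depend on pooling per-project effect sizes into a single estimate. A minimal sketch of the standard inverse-variance (fixed-effect) pooling such an analysis might use follows; the function name and the effect sizes and standard errors are illustrative assumptions, not DR K-12 data.

```python
import math

def pooled_effect(effects, std_errors):
    """Combine per-study effect sizes using inverse-variance weights.

    Each study is weighted by 1 / SE^2, so more precise studies
    contribute more to the pooled estimate.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Three hypothetical studies reporting standardized mean differences.
effects = [0.30, 0.45, 0.10]
std_errors = [0.10, 0.15, 0.12]
d, se = pooled_effect(effects, std_errors)
print(round(d, 3), round(se, 3))  # -> 0.266 0.068
```

A fixed-effect model assumes the projects estimate one common effect; a real synthesis across heterogeneous DR K-12 projects would more likely use a random-effects model, which adds a between-study variance component to the weights.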