Reporting Findings to Decision-Makers and Project Participants

Christopher Wilson, Research Division Director, BSCS Science Learning

As DRK-12 researchers conducting empirical studies of interventions in science education, we produce findings that matter to multiple audiences. While the dissemination plan might be one of the last sections we write in our proposals, and one of the last pieces we consider during the timeline of a project, it is probably the most important activity we engage in. I’ll always remember the advice my wife’s PhD advisor gave her during her studies on adolescence and animal behavior: “If you’re not publishing, you’re not doing science, you’re just watching hamsters mating in a basement.” The former is presumably more justifiable than the latter.

At BSCS Science Learning, we’re finding that the results from our research studies matter to an increasingly broad range of audiences. In the past we might have begun projects with the expectation that in the final year we’d start the often-endless process of publishing papers in research journals and presenting findings at national research conferences. Remember those? In recent years, as the evidence base for the efficacy of instructional materials and professional development programs has become more established, we’ve become more involved in scaling up these effective interventions. Successful scale-up requires that all elements of a program be communicated effectively to decision-makers at multiple levels, such as teachers, principals, district science leads, and state science supervisors. That includes the structure of the program, the learning theory behind it, and, importantly, the research on its impact. Demonstrated evidence of effectiveness, particularly on student achievement, is an increasingly important consideration for decision-makers tasked with choosing among interventions and investing scarce district resources.

Needless to say, research findings need to be presented differently for different audiences. Most science education researchers shudder a little when presented with hierarchical linear models, never mind those who work closer to the classroom. A series of Greek letters with multiple subscripts rarely signals much concern for making findings accessible to a wide range of audiences. Graphing data to show differences in means between groups can make findings far more digestible. The same goes for measures of statistical significance and effect sizes, which can be quite abstract. Instead of p-values and Hedges’ g, we often strive to express impacts in more meaningful units (a brief sketch of this kind of translation follows the list), such as:

  • the number of students now above a defined and relevant proficiency level,
  • the types of cognitive measures or assessment items that students are now able to perform well on,
  • impacts on achievement gaps between different groups of students, or impacts on specific groups,
  • growth compared against the expected one-year gain students would achieve in the absence of the intervention, or
  • teacher testimonials and quotes that illustrate the richness of the impact on teaching and learning.
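
To make that kind of translation concrete, here is a minimal sketch in Python. The numbers are illustrative rather than drawn from any real study, and the normality assumption used to estimate the share of students above a cutoff is ours; the point is simply how a standardized effect size can be re-expressed in units a decision-maker can act on.

```python
import math
from statistics import NormalDist

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference with the small-sample correction."""
    # Pooled standard deviation across treatment and comparison groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp
    # Hedges' correction factor for small samples
    j = 1 - 3 / (4 * (n_t + n_c) - 9)
    return d * j

def pct_above_cutoff(mean, sd, cutoff):
    """Share of students above a proficiency cutoff, assuming normal scores."""
    return 1 - NormalDist(mean, sd).cdf(cutoff)

# Illustrative numbers -- not data from a real study
g = hedges_g(mean_t=78, mean_c=72, sd_t=12, sd_c=13, n_t=410, n_c=395)
comparison = pct_above_cutoff(72, 13, cutoff=75)
treatment = pct_above_cutoff(78, 12, cutoff=75)
print(f"Hedges' g = {g:.2f}")
print(f"Proficient: {comparison:.0%} (comparison) vs. {treatment:.0%} (treatment)")
```

The same effect that reads as an abstract g of about 0.5 becomes, under these assumptions, roughly twenty more proficient students per hundred — a number that means something to a principal.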

It is important to note that any effort to make data more accessible must remain consistent with the research design and the types of claims it supports. We must be careful to avoid unscientific terms like “proven,” to describe important limitations, and to be clear about the extent to which findings generalize to different populations or contexts.

Dissemination to Project Participants

In addition to communicating findings to support the scaling up of an intervention, participants in our research studies frequently request data and reports to see how they or their students benefited from participating. The requester might be a district, a school, or an individual teacher. These requests often come toward the end of a project, and while entirely reasonable, they can be problematic for researchers when they are inconsistent with analysis timelines and create a significant, unexpected reporting burden just when project resources are running low.

Having at times struggled with these requests, we’ve found the best solution is to anticipate them and plan for them early in a project. That means clearly communicating with project participants and stakeholders during recruitment about what data will be shared, when it will be shared, and at what grain size (student, class, teacher, school, etc.). Careful coordination with the IRB and human-subjects permissions is important here: data-sharing plans must be consistent with all consent documents, especially to protect the anonymity of individuals in the study.
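
One concrete way to operationalize the grain-size conversation is to build participant reports from a single aggregation routine that enforces a minimum cell size. The sketch below is ours, not a prescribed standard: the column names, the pandas-based approach, and the threshold of ten are all illustrative choices, and the actual threshold should come from your IRB agreement and consent documents.

```python
import pandas as pd

MIN_CELL = 10  # illustrative suppression threshold; set per your IRB agreement

def report_at_grain(df: pd.DataFrame, grain: str) -> pd.DataFrame:
    """Aggregate student-level scores to a grain size (e.g., 'teacher_id'
    or 'school_id'), suppressing cells too small to protect anonymity."""
    out = (df.groupby(grain)["score"]
             .agg(n="count", mean_score="mean")
             .reset_index())
    # Report cell sizes, but withhold means for small cells rather than
    # risk identifying individual students or teachers
    out.loc[out["n"] < MIN_CELL, "mean_score"] = None
    return out

# Illustrative data -- not from a real study
students = pd.DataFrame({
    "school_id":  ["A", "A", "A", "B", "B"],
    "teacher_id": ["t1", "t1", "t2", "t3", "t3"],
    "score":      [81, 74, 90, 68, 77],
})
print(report_at_grain(students, "school_id"))
print(report_at_grain(students, "teacher_id"))
```

Committing to a routine like this at recruitment time makes it easy to tell a district exactly what its report will and will not contain before anyone signs a consent form.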

All of this speaks to elevating the dissemination plan to a critical project component that requires attention from the beginning to the end of any study. Otherwise, we run the risk of neglecting a key practice of science and failing to attend to the interests of our research participants, be they teachers, students, or hamsters.

---

Any opinions, findings, and conclusions or recommendations expressed are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.