Building Flexible Modality Layers to Support Inclusive Educational Technologies

Brianna Tomlinson, 2017-18 CADRE Fellow

Educational technologies provide an avenue for engaging students in learning through computer-facilitated experiences. These tools offer learners opportunities to explore science, technology, engineering, and mathematics (STEM) content in ways they may not otherwise have available (D’Angelo et al., 2014). Simulations are one type of technology that can support interaction while isolating specific concepts as the focus of exploration (Edelson, Gordin, & Pea, 1999). They allow learners to explore hypothetical cases in ways that may not be possible in real life (van Berkum & de Jong, 1991).

Despite these potential benefits, many barriers prevent learners from fully engaging with educational technologies. One of the largest barriers to accessibility is a lack of flexibility in the inputs and outputs available during interaction. For instance, many simulations have historically relied solely on visual representations, so learners with visual impairments are often excluded from using these tools (Levy & Lahav, 2012). Building flexible tools in which learners can choose the modalities that best fit their preferences and needs will lead to a more inclusive learning environment (Ayotte, Vass, Mitchell, & Treviranus, 2014).

One practical example of this comes from the PhET Interactive Simulations Project (PhET). PhET simulations (sims) center learner-driven exploration, interactivity, and dynamic feedback as key design goals (Lancaster, Moore, Parson, & Perkins, 2013). Sims leverage visuals to build initial understanding (e.g., Ohm’s Law uses the number of batteries to convey the amount of voltage), the visuals update immediately with user interaction, and learners can repeatedly and methodically explore how individual changes affect the overall concept. The sims undergo thorough evaluation of their visual concepts during design and development, and their success is well-documented (Adams et al., 2008; Keller, Finkelstein, Perkins, & Pollock, 2006; Moore, Chamberlain, Parson, & Perkins, 2014). However, reliance on a single representation modality (i.e., visuals) limits the ability of learners who need alternative or additional depictions to autonomously explore the sim content.
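As a concrete illustration of the kind of relationship such a sim isolates, the sketch below models Ohm’s law directly. It is a simplified stand-in, not PhET code: changing one variable, as a learner would with a slider or by adding a battery, produces an immediately observable change in another.

```python
# Simplified model of the relationship an Ohm's Law-style sim lets
# learners probe: vary one quantity, observe the effect on another.

def current_amps(voltage_volts, resistance_ohms):
    """Ohm's law: I = V / R."""
    return voltage_volts / resistance_ohms

# Adding a battery (raising voltage) visibly raises the current;
# doubling the voltage doubles the current when resistance is fixed.
print(current_amps(4.5, 500))
print(current_amps(9.0, 500))
```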

To address these accessibility challenges, particularly for learners with visual impairments, PhET began to design and implement alternative representations through speech and non-speech feedback, and to support additional input controls for interaction (Moore, Smith, & Greenberg, 2018; Smith & Moore, 2020; Tomlinson, Kaini, et al., 2018; Tomlinson, Walker, & Moore, 2020). These representation modalities were chosen to support the main goals of the sims: using implicit scaffolding to encourage exploration without guided instructions. This meant providing timely, concise feedback throughout interaction, helping learners explore the phenomena in an enjoyable manner. Spoken description, delivered through screen reader software, conveys static and dynamic information as well as just-in-time updates about the state of the sim. State descriptions are always available and update to incorporate any user-driven changes; they include a scene summary and a description of each object in the interactive area. Responsive descriptions trigger when a controllable object has focus and provide immediate updates about changes occurring within the sim during interaction.
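The two tiers of description can be sketched as a small class. This is an illustrative sketch only; the class and its method names are hypothetical, and PhET’s actual implementation (delivered through the browser accessibility tree to screen readers) differs.

```python
# Hypothetical sketch of a two-tier description layer: persistent state
# descriptions plus concise responsive updates on interaction.

class DescribedSlider:
    """A controllable sim variable exposing spoken-description strings."""

    def __init__(self, name, value, unit):
        self.name = name
        self.value = value
        self.unit = unit

    def state_description(self):
        # Always-available description, read when the learner surveys
        # the scene; reflects any user-driven changes so far.
        return f"{self.name} is {self.value} {self.unit}."

    def set_value(self, new_value):
        # Responsive description: a just-in-time update during interaction.
        direction = "increases" if new_value > self.value else "decreases"
        self.value = new_value
        return f"{self.name} {direction} to {new_value} {self.unit}."


voltage = DescribedSlider("Voltage", 4.5, "volts")
print(voltage.state_description())  # scene-summary style description
print(voltage.set_value(6.0))       # responsive update on change
```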

While spoken description is necessary to set the scene and give specific details, it can easily encumber learners and slow their interaction. Non-speech auditory representations (sonification) provide an avenue for immediate feedback without overwhelming working memory with complex description. Different layers of sound can convey feedback for directly interactive variables (controllable pieces of the sim) and indirectly interactive variables (parts of the sim that change because of other relationships). Sounds can also augment the experience by leveraging real-world mappings and metaphors to aid comprehension.
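One common sonification strategy is to map a sim variable onto pitch. The mapping below (a linear value-to-frequency mapping over a fixed audible range) is illustrative only; it is not PhET’s actual parameter mapping, and the function name and range are assumptions.

```python
# Hedged sketch: map a sim variable's value onto a tone frequency, the
# kind of continuous mapping a directly interactive variable might drive.

def value_to_frequency(value, v_min, v_max, f_min=220.0, f_max=880.0):
    """Linearly map a variable's value onto a frequency range (Hz)."""
    t = (value - v_min) / (v_max - v_min)  # normalize to [0, 1]
    t = min(max(t, 0.0), 1.0)              # clamp out-of-range input
    return f_min + t * (f_max - f_min)

# An indirectly interactive variable could feed a second, softer layer
# using the same mapping with a different range or timbre.
print(value_to_frequency(0.5, 0.0, 1.0))  # midpoint of the range -> 550.0 Hz
```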

To ensure the alternative modalities successfully support learners with diverse needs, evaluations should be incorporated throughout the design process. For these sound-enhanced sims, numerous rounds of evaluation were conducted, ranging from convergent narrowing after brainstorming (Tomlinson, Walker, & Moore, In Press; Winters, Tomlinson, Walker, & Moore, 2019) to open-ended interviews with learners (Tomlinson, Batterman, Kaini, Walker, & Moore, 2018; Tomlinson, Kaini, Harden, Walker, & Moore, 2019) to more structured evaluations with knowledgeable screen-reader users (Tomlinson et al., 2020). Large-scale evaluations of educational technology with specialized learner populations are difficult to achieve in practice (Brulé et al., 2020), so this combination allowed the PhET team and their collaborators to comprehensively evaluate the design and impact of these alternative modalities.

Though the auditory representation layers were created as a means for learners with visual impairments to explore the sims, they have provided other opportunities for impact. The sounds were designed to be enjoyable and aesthetically pleasing as well as informative, so they can be used alongside the visual representations to make the experience more immersive for other learners. For example, evaluations with learners with intellectual and developmental disabilities (Tomlinson, Batterman, et al., 2018) demonstrated how sound layers could emphasize visually displayed relationships. Some of these learners struggled to interpret label changes, something speech description could mitigate if made available more generally (instead of only through screen readers). Allowing learners to select the representation layers through which they receive feedback would let them customize the experience based on their own needs.
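Such learner-selected layering amounts to a small routing decision at each sim event. The sketch below is hypothetical (the layer names and event strings are assumptions, not PhET’s API): each output modality is an independently toggleable channel.

```python
# Illustrative sketch: per-learner toggles over independent output layers.

DEFAULT_LAYERS = {"visual": True, "description": False, "sonification": False}

def active_feedback(layers, event):
    """Return the feedback channels to fire for a sim event."""
    channels = []
    if layers.get("visual"):
        channels.append(f"redraw:{event}")
    if layers.get("description"):
        channels.append(f"speak:{event}")
    if layers.get("sonification"):
        channels.append(f"sound:{event}")
    return channels

# A learner who wants speech and sound alongside the visuals:
prefs = dict(DEFAULT_LAYERS, description=True, sonification=True)
print(active_feedback(prefs, "voltage-changed"))
```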

Understanding learner needs and developing modalities that match those needs is necessary to reduce barriers to access between learners with impairments and educational technologies. The extra development effort is often worthwhile, as other learners may also benefit from the additional modalities. This is also true for modalities not discussed here, including haptic output and touch or gesture input controls. Such flexibility will become even more important as advancements are made in augmented, mixed, and virtual reality technologies. Designing flexibility into input and output modalities will help ensure students with impairments are able to benefit from these new educational experiences as they become available to a larger audience.


Adams, W. K., Reid, S., Lemaster, R., McKagan, S. B., Perkins, K. K., Dubson, M., & Wieman, C. E. (2008). A study of educational simulations Part 1 - Engagement and learning. Journal of Interactive Learning Research, 19(3), 397–419.

Ayotte, D., Vass, J., Mitchell, J., & Treviranus, J. (2014). Personalizing interfaces using an inclusive design approach. International Conference on Universal Access in Human-Computer Interaction, 8513 LNCS(Part 1), 191–202.

Brulé, E., Tomlinson, B. J., Metatla, O., Jouffrais, C., & Serrano, M. (2020). Review of Quantitative Empirical Evaluations of Technology for People with Visual Impairments. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–14.

D’Angelo, C., Rutstein, D., Harris, C., Haertel, G., Bernard, R., & Borokhovski, E. (2014). Simulations for STEM Learning: Systematic Review and Meta-Analysis Report Overview. Menlo Park, CA: SRI International.

Edelson, D. C., Gordin, D. N., & Pea, R. D. (1999). Addressing the Challenges of Inquiry-Based Learning Through Technology and Curriculum Design. Journal of the Learning Sciences, 8(3–4), 391–450.

van Berkum, J. J. A., & de Jong, T. (1991). Instructional environments for simulations. Education and Computing, 6(3–4), 305–358.

Keller, C. J., Finkelstein, N. D., Perkins, K. K., & Pollock, S. J. (2006). Assessing The Effectiveness Of A Computer Simulation In Conjunction With Tutorials In Introductory Physics In Undergraduate Physics Recitations. 109–112.

Lancaster, K., Moore, E. B., Parson, R., & Perkins, K. K. (2013). Insights from Using PhET’s Design Principles for Interactive Chemistry Simulations. ACS Symposium Series, 1142, 97–126.

Levy, S. T., & Lahav, O. (2012). Enabling people who are blind to experience science inquiry learning through sound-based mediation. Journal of Computer Assisted Learning, 28(6), 499–513.

Moore, E. B., Chamberlain, J. M., Parson, R., & Perkins, K. K. (2014). PhET interactive simulations: Transformative tools for teaching chemistry. Journal of Chemical Education, 91(8), 1191–1197.

Moore, E. B., Smith, T. L., & Greenberg, J. (2018). Keyboard and Screen Reader Accessibility in Complex Interactive Science Simulations: Design Challenges and Elegant Solutions. International Conference on Universal Access in Human-Computer Interaction, 385–400.

Smith, T. L., & Moore, E. B. (2020). Storytelling to Sensemaking: A Systematic Framework for Designing Auditory Description Display for Interactives. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems.

Tomlinson, B. J., Batterman, J. M., Kaini, P., Walker, B. N., & Moore, E. B. (2018). Supporting Simulation Use for Students with Intellectual and Developmental Disabilities. Journal on Technology and Persons with Disabilities, 6, 202–218.

Tomlinson, B. J., Kaini, P., Harden, E. L., Walker, B. N., & Moore, E. B. (2019). A Multimodal Physics Simulation: Design and Evaluation with Diverse Learners. Journal on Technology and Persons with Disabilities.

Tomlinson, B. J., Kaini, P., Zhou, S., Smith, T. L., Moore, E. B., & Walker, B. N. (2018). Design and Evaluation of a Multimodal Science Simulation. Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility - ASSETS ’18, 438–440.

Tomlinson, B. J., Walker, B. N., & Moore, E. B. (In Press). Identifying and Evaluating Conceptual Representations for Auditory-enhanced Interactive Physics Simulations. Journal of Multimodal User Interfaces (JMUI).

Tomlinson, B. J., Walker, B. N., & Moore, E. B. (2020). Auditory Display in Interactive Science Simulations: Description and Sonification Support Interaction and Enhance Opportunities for Learning. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems.

Winters, R. M., Tomlinson, B. J., Walker, B. N., & Moore, E. B. (2019). Sonic Interaction Design for Science Education. Ergonomics in Design: The Quarterly of Human Factors Applications, 27(1), 5–10.