Expanding Access to Data Visualization for the Visually Impaired
Extracting insights from massive datasets to answer questions and make decisions is the primary objective of data science. These insights often involve the production of data visualizations that offer a better understanding of the subject matter by shifting the focus from numerical values to graphical elements.
However, not everyone has equal access to these data visualizations. This barrier often impedes blind or visually challenged students seeking to pursue the field of data science.
Niklas Elmqvist, a professor in the College of Information Studies (iSchool) with an appointment in the University of Maryland Institute for Advanced Computer Studies (UMIACS), is addressing this problem head-on by developing and evaluating new tools that can provide visually challenged high schoolers in Maryland with better access to data visualizations. His work is supported by a $2 million contract with the state's Department of Education.
Computer scientists Andreas Stefik and Sina Bahram, representing the companies Data11y and Prime Access Consulting, respectively, are tasked with creating the new data visualization materials and coursework, while Elmqvist and the iSchool's fourth-year doctoral student Pramod Chundury are in charge of evaluating the developed materials.
Elmqvist’s research in this area dates to 2018 when a blind student registered for his data visualization class. “It was illuminating, and took me completely by surprise,” he says. “Only then did I see the elephant in the room, and it had always been there.”
There are many data science materials for sighted users that simply cannot be easily replicated for blind users, Elmqvist says. For the coursework to be accessible, a solution must afford the ability to read the data in charts, interpret and communicate workflows, facilitate interactions with the data, and describe the visualizations.
While screen readers—software programs that read text aloud with a speech synthesizer—can navigate tables, they do not scale to large datasets. Braille displays are not well suited to this task either.
Sensory substitution is a general assistive technology approach that replaces one sense with another; it could be used to substitute visual information with tactile or auditory information.
For example, using sound representation, a pitch might represent the height of a bar in a bar chart—the higher the pitch, the higher the bar. Yet many of the current sensory substitution techniques are not scalable because they take too much time to create, are too costly, or are not widely available.
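The pitch mapping described above can be sketched in a few lines. This is a minimal illustration of the general idea, not code from any of the tools mentioned in the article; the function name, linear mapping, and frequency range are all illustrative assumptions.

```python
# Minimal sketch of audio sensory substitution for a bar chart:
# map each bar's height to a pitch (higher value -> higher frequency).
# The linear mapping and the 220-880 Hz range are assumptions for
# illustration, not details of any specific tool.

def value_to_pitch(value, vmin, vmax, f_low=220.0, f_high=880.0):
    """Linearly map a data value onto a frequency range (Hz)."""
    if vmax == vmin:
        return f_low
    t = (value - vmin) / (vmax - vmin)
    return f_low + t * (f_high - f_low)

bars = [3, 7, 12, 5]
pitches = [value_to_pitch(v, min(bars), max(bars)) for v in bars]
# The tallest bar maps to the highest pitch; an audio library would
# then play each tone in sequence as the user sweeps across the chart.
```

Playing the resulting tones (for instance with a sound synthesis library) would let a listener hear the shape of the chart from left to right.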
In 2020, Elmqvist covered this topic extensively in a community TEDx Talk at Montgomery Blair High School in Silver Spring, Maryland.
Now his team is building upon other work that was undertaken more than a decade ago by UMD faculty members Ben Shneiderman, Catherine Plaisant, Jonathan Lazar, and others. A tool they built, called iSonic, creates an audio version of a map that allows a user to execute sweeps from left to right to hear various pitches that represent the data.
The new project will scale this idea to mobile devices and include more types of data representations. One focus of their work will be facilitating the interaction between the user and the data. For example, sighted users interact with charts and other representations of data in a variety of ways, such as zooming in, selecting points, and filtering the data.
Their work seeks to identify these types of interactions and then make them accessible to blind users through other means, including both sound and haptic feedback, allowing for inputs via keyboards and touchscreens.
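One way to picture making such interactions non-visual is to drive navigation, selection, and filtering from the keyboard and have each action produce a short description that a screen reader or speech synthesizer could announce. The sketch below is a hypothetical illustration of that pattern; the class, method names, and announcement format are assumptions, not part of the project's actual design.

```python
# Hedged sketch: exposing chart interactions (navigate, select, filter)
# through keyboard-style commands, each returning announcement text
# that assistive technology could speak. All names are illustrative.

class AccessibleChart:
    def __init__(self, points):
        self.points = points          # list of (label, value) pairs
        self.index = 0                # current cursor position

    def move(self, step):
        """Arrow-key navigation: clamp the cursor to the data range."""
        self.index = max(0, min(len(self.points) - 1, self.index + step))
        return self.describe()

    def filter(self, predicate):
        """Keep only points matching the predicate, then re-announce."""
        self.points = [p for p in self.points if predicate(p)]
        self.index = 0
        return self.describe()

    def describe(self):
        """Announce the focused point and its position in the data."""
        label, value = self.points[self.index]
        return f"{label}: {value} ({self.index + 1} of {len(self.points)})"

chart = AccessibleChart([("Jan", 3), ("Feb", 7), ("Mar", 12)])
```

For example, `chart.move(1)` would announce the February value along with its position, and a filter keeping values above 5 would re-announce from the first remaining point. In a real tool the returned text could also be paired with a tone or haptic pulse.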
Although this project's scope is to develop and evaluate new data visualization materials for high school students in Maryland, the team hopes their work will eventually be scaled up to the national level.
—This story was written by Maria Herd and adapted from an iSchool news release by Liz Zogby.