HCIL Researchers Present Eight Papers at VIS 2019
A team of researchers from the University of Maryland’s Human-Computer Interaction Lab (HCIL) is in Vancouver, British Columbia this week, showcasing its work at a major international conference focused on data visualization.
Faculty, students and staff associated with the lab are presenting eight papers at VIS 2019, an annual gathering where researchers and practitioners from academia, government and industry exchange recent findings on the design and use of visualization tools.
Now in its 24th year, the conference—organized by IEEE—is considered the premier forum for the latest advances in theory, methods and applications related to data visualization.
The Maryland contingent is highlighting its work on a wide array of visualization topics. Half of the papers being presented investigate ways that data can be viewed and organized—whether on mobile displays, in virtual reality, or on personal computers.
Three of the papers focus on factors that affect how people perceive and solve visual tasks, and the final paper describes how natural cognitive biases influence data science and visualization.
“I am especially impressed with the efforts of some of the younger faculty who are making important contributions to the conference and are getting their names known,” says Niklas Elmqvist, a professor in the College of Information Studies (iSchool) and the director of HCIL.
Elmqvist—who also has an appointment in the University of Maryland Institute for Advanced Computer Studies (UMIACS)—collaborated on five of the papers, working with doctoral students from the iSchool, former HCIL members, and other faculty on the Maryland campus.
The eight papers are:
“Sherpa: Leveraging User Attention for Computational Steering in Visual Analytics” was a team effort between HCIL and researchers in the Center for Bioinformatics and Computational Biology. The paper was written by two doctoral students—Zhe Cui and Jayaram Kancherla—and their advisers, Elmqvist and Héctor Corrada Bravo, an associate professor of computer science with an appointment in UMIACS. The researchers introduce a method called Sherpa, which organizes and prioritizes data based on the user’s navigational patterns. It lets analysts steer large-scale computations toward the data they are most interested in, without having to organize it manually. The researchers tested Sherpa on gene regulation applications in biology and on a stock market analysis. In an “expert review with bioinformaticians, [they] felt that the Sherpa model was more empowering and efficient than merely seeing progressive visual updates,” the researchers concluded.
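To make the idea of attention-driven computational steering more concrete, here is a minimal, hypothetical sketch in Python: pending computations are reordered so that the data regions closest to the user’s current viewport are processed first. The Region class, steer() function, and genome-track example are illustrative assumptions for this article, not code from the Sherpa system itself.

    import heapq
    from dataclasses import dataclass, field

    # Hypothetical sketch of attention-driven steering: data regions nearest
    # the user's current viewport are computed first. Not from the Sherpa paper.

    @dataclass(order=True)
    class Region:
        priority: float               # distance from the viewport; lower runs sooner
        name: str = field(compare=False)

    def steer(pending, viewport_center):
        """Yield region names ordered so those nearest the viewport come first."""
        queue = []
        for name, position in pending.items():
            distance = abs(position - viewport_center)
            heapq.heappush(queue, Region(priority=distance, name=name))
        while queue:
            yield heapq.heappop(queue).name

    # Example: the user pans a genome track to coordinate 120, so the region at
    # 130 is refined first, then 100, then the far-away region at 500.
    pending = {"geneA": 100, "geneB": 500, "geneC": 130}
    print(list(steer(pending, viewport_center=120)))  # ['geneC', 'geneA', 'geneB']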
Computer science doctoral student Brian Ondov and Elmqvist helped write “The Perceptual Proxies of Visual Comparison,” which won a best paper honorable mention award at the conference and builds on work they presented at VIS 2018. The researchers define perceptual proxies as “features of visualization that are actually used by the human visual system to solve a given task.” They studied which proxies participants rely on when comparing the mean and range of data. “The general idea of perceptual proxies is interesting, as it may suggest ways to model perception in basically a computational manner,” explains Elmqvist.
Catherine Plaisant, a senior research scientist at UMIACS and associate director of HCIL, contributed to “A Task-based Taxonomy of Cognitive Biases for Information Visualization.” The paper classifies 154 types of cognitive biases into seven categories that apply to data visualization. The authors hope their taxonomy will help visualization researchers connect their designs to the biases those designs may trigger, and lead to new research on detecting and addressing biased judgment in data visualization.
“The Role of Latency in Predicting Visual Search Behavior” was led by Leilani Battle, an assistant professor of computer science with an appointment in UMIACS. The paper explores how factors such as task type and task complexity can modulate the effect of latency, to the point that latency becomes a statistically insignificant predictor of user behavior.
“There Is No Spoon: Evaluating Performance, Space Use, and Presence with Expert Domain Users in Immersive Analytics” examines how professional economists organize and analyze data in immersive virtual reality. The research was led by Andrea Batch, an information studies doctoral student who collaborated with Elmqvist and researchers in Australia.
Batch and Elmqvist also analyzed video footage from that study to write “All Right, Mr. DeMille, I’m Ready for My Closeup: Adding Meaning to User Actions from Video for Immersive Analytics.” The researchers describe a new method for inferring participants’ behavior from how they interacted with the data. “This could open the door to future experiments based solely on video capture of participants, which is significantly cheaper and easier than most other methods,” explains Elmqvist.
In addition, Elmqvist contributed to “Common Fate for Animated Transitions in Visualization,” which expands on traditional laws of perceptual psychology by showing that humans perceive objects as part of the same group not only when they move along the same trajectory, but also when they simultaneously change in size and color.
Finally, HCIL researchers are also presenting “A Comparative Evaluation of Animation and Small Multiples for Trend Visualization on Mobile Phones.” For that paper, the research team studied how best to show data that changes over time on a mobile display.
—Story by Maria Herd