New data science method makes charts easier to read at a glance

Doctors reading EEGs in emergency rooms, first responders watching multiple screens of live sensor data from a disaster area, and brokers buying and selling financial instruments all need to make informed decisions very quickly. But a visualization’s complexity can get in the way of decision making when looking at data on a chart. When timing is critical, it is essential that a chart be easy to read and interpret.

To help decision makers in scenarios like these, computer scientists at Columbia Engineering and Tufts University have developed a new method – “Pixel Approximate Entropy” – that measures the complexity of a data visualization and can be used to develop visualizations that are easier to read. Eugene Wu, assistant professor of computer science, and Gabriel Ryan, then a master’s student and now a doctoral student at Columbia, will present their paper at the IEEE VIS 2018 conference on Thursday, October 25, in Berlin, Germany.

“This is a whole new approach to working with line graphs, with many different potential applications,” says Ryan, the paper’s lead author. “Our method gives visualization systems a way to measure how difficult a line graph is to read, so we can now design these systems to automatically simplify or summarize charts that would be difficult to read on their own.”

Beyond visually inspecting a chart, there have been few ways to automatically quantify the complexity of a data visualization. To address this, Wu’s group created Pixel Approximate Entropy, which provides a “visual complexity score” that can automatically identify hard-to-read charts. They adapted an entropy measure designed for low-dimensional time series to work on line charts, then conducted a series of user studies demonstrating that the measure can predict how well users perceive the charts.
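To make the idea concrete, here is a minimal Python sketch of the general approach described above: quantize a data series to the pixel resolution of the chart it would be drawn on, then compute approximate entropy over the resulting pixel sequence. This is an illustration only, not the authors’ open-source implementation; the function names, chart dimensions, and tolerance parameter are assumptions chosen for readability.

```python
import numpy as np

def approximate_entropy(u, m=2, r_frac=0.2):
    """Approximate entropy (ApEn) of a 1-D sequence.

    m      : embedding dimension (length of compared windows)
    r_frac : tolerance, as a fraction of the sequence's standard deviation
    """
    u = np.asarray(u, dtype=float)
    N = len(u)
    r = r_frac * u.std()

    def phi(m):
        # All overlapping windows of length m.
        x = np.array([u[i:i + m] for i in range(N - m + 1)])
        # Chebyshev distance between every pair of windows.
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        # Fraction of windows within tolerance r of each window (self-matches included).
        C = (d <= r).mean(axis=1)
        return np.log(C).mean()

    return phi(m) - phi(m + 1)


def pixel_approximate_entropy(values, width=200, height=100, m=2, r_frac=0.2):
    """Score a line chart's visual complexity by applying ApEn to the
    series after it has been quantized to chart pixel coordinates."""
    values = np.asarray(values, dtype=float)
    # Resample to one sample per horizontal pixel.
    xs = np.linspace(0, len(values) - 1, width)
    resampled = np.interp(xs, np.arange(len(values)), values)
    # Quantize the y-axis to integer pixel rows.
    lo, hi = resampled.min(), resampled.max()
    pixels = np.round((resampled - lo) / (hi - lo + 1e-12) * (height - 1))
    return approximate_entropy(pixels, m=m, r_frac=r_frac)


if __name__ == "__main__":
    t = np.linspace(0, 4 * np.pi, 500)
    smooth = np.sin(t)                                   # easy to read at a glance
    noisy = np.sin(t) + 0.5 * np.random.randn(len(t))    # harder to read
    print(pixel_approximate_entropy(smooth))   # lower complexity score
    print(pixel_approximate_entropy(noisy))    # higher complexity score
```

A smooth sine wave yields a low score, while the same signal with added noise yields a noticeably higher one, which is the kind of separation a visualization system would use to flag difficult charts.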

“In fast-paced environments, it’s important to know whether a visualization is going to be so complex that the signals may be obscured,” says Wu, who is also co-chair of the Data, Media and Society Center at Columbia’s Data Science Institute. “The ability to quantify complexity is the first step toward automatically doing something about it.”

The team expects their system, which is open source, to be particularly useful for data scientists and engineers who are developing AI-based data science systems. By giving a system a way to better understand the visualizations it displays, Pixel Approximate Entropy will help drive the development of smarter data science systems.

“For example, in industrial control, an operator may need to observe and respond to trends in readings from a variety of system monitors over time, such as in a chemical or power plant,” adds Ryan. “A system aware of the complexity of the charts could tailor the readings to ensure that the operator can identify important trends and reduce fatigue from attempting to interpret potentially noisy signals.”
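As a rough sketch of the behavior Ryan describes, a monitoring dashboard could compare the complexity score against a threshold and fall back to a smoothed view when a reading becomes too noisy to read at a glance. The threshold and smoothing window below are placeholder values, not figures from the paper, and the sketch reuses the pixel_approximate_entropy function from the example above.

```python
import numpy as np

def moving_average(values, window=15):
    """Simple smoother: centered moving average."""
    kernel = np.ones(window) / window
    return np.convolve(values, kernel, mode="same")

def chart_series(values, complexity_threshold=0.4):
    """Return the series to plot: the raw reading if it is readable,
    otherwise a smoothed version of it."""
    if pixel_approximate_entropy(values) > complexity_threshold:
        return moving_average(values)
    return values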

Wu’s group plans to extend this work by using these models to automatically alert users and designers when visualizations may be too complex, to suggest smoothing techniques, and to develop other quantitative perception models that can assist in the design of data processing and visualization systems.

Story Source:

Materials provided by Columbia University School of Engineering and Applied Science. Note: Content may be edited for style and length.

