Join the conversation: #sxswibm #accessdata
Susann Keohane and Brian Cragun, consultants for IBM Research’s Human Ability & Accessibility Center (referred to as the AbilityLab), will present Beyond a Thousand Words: Accessible Complex Data – a discussion about the accessibility challenges of analyzing, visualizing, and using today’s big data – on Tuesday, March 13.
Q: What kinds of solutions does the IBM AbilityLab develop? What is a recent example?
[Photo: Susann Keohane]
One recent example is our Access My City application, which delivers real-time transit data, geo-location and mapping technologies, and publicly available accessibility information to mobile devices, helping residents and visitors with disabilities navigate a city.
Check out some of our other research projects here.
Q: At SXSW, you will be discussing "Accessible Complex Data." What kinds of new accessibility challenges are being posed by complex data?
[Photo: Brian Cragun]
But as generally implemented, these complex visualizations are inaccessible to blind users. The question we are working to answer is: how can blind users approximate the high-bandwidth understanding and autonomous discovery of key information that sighted users gain from complex visualizations, such as stock market history, census trends, or scientific data?
Q: What about smart devices – phones, televisions, etc. – that access the data? How are they a part of making information accessible (or preventing accessibility)?
Smart devices make information available anywhere, at any time. When users move to a smart device, many will be affected by what we call "situational" disability: outside light, a tiny screen, using one hand, riding on a bumpy road, or needing to access information without touching or looking at the device while driving.
More than ever, these situations emphasize the need for inclusive design. The research we work on for core disability types (deaf, blind, mobility impaired) will benefit all users of smart devices.
Q: How is IBM making today's flood of data, and the way it's analyzed and shown, more accessible?
This is a great question – and the core of our presentation.
In current products, we provide user interaction with graphs, allowing the user to sift, sort, scale, and filter the information. These capabilities are already available to the visually impaired. Now, research is looking at navigating graphs with audible cues, so users can explore a visualization for themselves.
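The interview doesn't detail how that audible navigation works, but the core idea, mapping data values to pitch so a user can step through a series point by point, can be sketched in a few lines of Python. Everything here is illustrative: the sample prices, the frequency range, and the value_to_pitch helper are assumptions, not IBM's implementation.

```python
import math

# Hypothetical monthly closing prices, standing in for "stock market history".
prices = [142.0, 138.5, 151.2, 149.8, 160.3, 155.0, 171.4]

LOW_HZ, HIGH_HZ = 220.0, 880.0  # map the data range onto an audible pitch span

def value_to_pitch(value, lo, hi):
    """Linearly map a data value onto the chosen frequency range."""
    if hi == lo:
        return (LOW_HZ + HIGH_HZ) / 2
    frac = (value - lo) / (hi - lo)
    return LOW_HZ + frac * (HIGH_HZ - LOW_HZ)

lo, hi = min(prices), max(prices)

# "Navigate" the series point by point, the way a screen-reader user might
# step through it with arrow keys; each step announces position and pitch.
for i, price in enumerate(prices):
    pitch = value_to_pitch(price, lo, hi)
    print(f"point {i + 1}/{len(prices)}: value={price:.1f}, tone={pitch:.0f} Hz")
```

A rising melody then conveys an upward trend without any vision at all; a real system would synthesize the tones rather than print them.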
We're also looking at how to convert visualizations into descriptive text, so any user needing information in a hands-free or eyes-free environment can benefit. Technologies on the horizon, such as electrostatic screens, electrical sensations, and other tactile feedback tools, will provide other forms of sensory exploration, so users can effectively utilize complex data.
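The underlying research isn't public, but the first step of such a conversion, turning a data series into a plain-language summary, might look like this minimal sketch. The describe_series function and the sample data are hypothetical.

```python
def describe_series(label, values):
    """Produce a plain-language summary of a data series, suitable for
    a screen reader or an eyes-free environment."""
    lo, hi = min(values), max(values)
    direction = ("rose" if values[-1] > values[0]
                 else "fell" if values[-1] < values[0]
                 else "was flat")
    return (f"{label} {direction} from {values[0]:.1f} to {values[-1]:.1f} "
            f"across {len(values)} points, ranging between {lo:.1f} "
            f"(point {values.index(lo) + 1}) and {hi:.1f} "
            f"(point {values.index(hi) + 1}).")

print(describe_series("Monthly closing price",
                      [142.0, 138.5, 151.2, 149.8, 160.3, 155.0, 171.4]))
# -> "Monthly closing price rose from 142.0 to 171.4 across 7 points,
#     ranging between 138.5 (point 2) and 171.4 (point 7)."
```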
Q: What needs to happen to make accessibility an automatic part of the process in expressing data?

A taxonomy of graphs and content, with corresponding navigation and audible output, can standardize interactions and provide a foundation for new graphs in the future.
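As a rough illustration of what such a taxonomy could look like in practice, the sketch below pairs each chart type with a traversal model and an audible rendering. The graph classes, navigation schemes, and audio mappings are all assumptions for illustration, not a published standard.

```python
from dataclasses import dataclass

@dataclass
class GraphClass:
    """One entry in a (hypothetical) taxonomy of graph types: each class
    pairs a visualization with a navigation scheme and an audible rendering."""
    chart_type: str
    navigation: str      # how a keyboard/screen-reader user traverses it
    audible_output: str  # how the data is rendered as sound

TAXONOMY = [
    GraphClass("line chart", "step left/right through time points",
               "pitch rises and falls with the value"),
    GraphClass("bar chart", "step across categories",
               "one tone per bar; duration encodes magnitude"),
    GraphClass("scatter plot", "sweep a region or jump between clusters",
               "stereo pan encodes x, pitch encodes y"),
    GraphClass("pie chart", "cycle through slices",
               "tone length proportional to slice share"),
]

for entry in TAXONOMY:
    print(f"{entry.chart_type}: navigate by {entry.navigation}; "
          f"audio: {entry.audible_output}")
```

Once a new graph is assigned to a class, it inherits the class's navigation and audio behavior, which is what makes the interaction standardizable.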
Attending SXSW? Add Susann and Brian’s presentation to your schedule.