Editor’s note: This brief Q&A series will feature IBM researchers making presentations at the 2012 South-by-Southwest Interactive Conference in Austin, Texas.
Join the conversation: #sxswibm #accessdata
Susann Keohane and Brian Cragun, consultants for IBM Research’s Human
Ability & Accessibility Center (referred to as the AbilityLab), will present Beyond a Thousand Words: Accessible Complex Data – a discussion about the accessibility challenges of analyzing, visualizing, and using today’s big data – on Tuesday, March 13.
Q: What kinds of solutions does the IBM AbilityLab develop? What is a recent example?
Our lab develops solutions to help everyone participate in technology. For example, Accessible Workplace Connections is a web-based application that helps employees with disabilities request, change, and maintain the workplace accommodations they need, effectively and efficiently.
And our Access My City application delivers real-time transit data, geo-location and mapping technologies, and publicly available accessibility information to mobile devices, helping residents and visitors with disabilities navigate a city.
Check out some of our other research projects here.
Q: At SXSW, you will be discussing "Accessible Complex Data." What kinds of new accessibility challenges are being posed by complex data?
We all struggle to find pearls in the ocean of complex data. Well-chosen graphical visualizations can communicate key information quickly.
But as generally implemented, these complex visualizations are inaccessible to the blind. The question we are working to answer is: how can we give blind users the same high-bandwidth understanding and autonomous discovery of key information that sighted users gain from complex visualizations, such as stock market history, census trends, or scientific data?
Q: What about smart devices – phones, televisions, etc. – that access the data? How are they a part of making information accessible (or preventing accessibility)?
Smart devices make information available anywhere, at any time. When users move to a smart device, many will be affected by what we call "situational" disability: outside light, a tiny screen, using one hand, riding on a bumpy road, or needing to access information without touching or looking at the device while driving.
More than ever, these situations emphasize the need for inclusive design. The research we do for core disability types (deaf, blind, mobility-impaired) will benefit all users of smart devices.
Q: How is IBM making today's flood of data, and the way it's analyzed and shown, more accessible?
This is a great question – and the core of our presentation.
In current products, we provide user interaction with graphs, allowing users to sift, sort, scale, and filter the information. These capabilities are already available to the visually impaired. Now our research is looking at navigating graphs with audible cues, so users can explore a visualization themselves.
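As a rough illustration of the audible-cue idea (a generic sonification sketch, not IBM's implementation), a data series can be mapped to pitch so that rising data literally sounds like a rising tone. The function name and frequency range below are illustrative assumptions:

```python
def series_to_pitches(values, low_hz=220.0, high_hz=880.0):
    """Linearly map each data point to a frequency in [low_hz, high_hz].

    Rising data produces rising pitch; a tone generator or audio
    library would then play each frequency in sequence so a listener
    can "hear" the shape of the curve.
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on a flat series
    return [low_hz + (v - lo) / span * (high_hz - low_hz) for v in values]

# Example: a stock-price history maps to a sequence of pitches,
# with the lowest price at 220 Hz and the highest at 880 Hz.
prices = [10, 12, 11, 15, 18]
pitches = series_to_pitches(prices)
```

The key design choice is a consistent mapping: once users learn that pitch tracks the y-axis, any line graph becomes explorable by ear.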
We're also looking at how to convert visualizations into descriptive text, so any user needing information in a hands-free or eyes-free environment can benefit. Technologies on the horizon, such as electrostatic screens, electrical sensations, and other tactile feedback tools, will provide additional sensory channels for exploring complex data effectively.
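A minimal sketch of that visualization-to-text idea, under simplified assumptions (a single numeric series; real systems would handle much richer charts), might summarize the trend and extremes of the data:

```python
def describe_series(name, values):
    """Produce a one-sentence description of a numeric series,
    suitable for a screen reader or an eyes-free environment."""
    first, last = values[0], values[-1]
    if last > first:
        trend = "rose"
    elif last < first:
        trend = "fell"
    else:
        trend = "held steady"
    return (f"{name}: {len(values)} points, {trend} from {first} to {last}; "
            f"minimum {min(values)}, maximum {max(values)}.")

print(describe_series("Quarterly revenue", [4.1, 4.6, 4.4, 5.2]))
# prints "Quarterly revenue: 4 points, rose from 4.1 to 5.2; minimum 4.1, maximum 5.2."
```

Even this crude summary conveys the headline information a sighted user gets from a glance at the chart.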
Q: What needs to happen to make accessibility an automatic part of the process in expressing data?
Better mappings of visual information to other sensory modes need to be researched and proven.
A taxonomy of graphs and content, with corresponding navigation and audible output, can standardize interactions and provide a foundation for new graphs in the future.
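To make the taxonomy idea concrete, here is a toy sketch of what such a mapping might look like: each graph type paired with a standard navigation scheme and audible output. The entries are illustrative assumptions, not a proposed standard:

```python
# Illustrative taxonomy: graph type -> standardized interaction.
# The specific conventions below are hypothetical examples only.
GRAPH_TAXONOMY = {
    "line":    {"navigate": "left/right arrows step through points",
                "audio":    "pitch tracks the y-value"},
    "bar":     {"navigate": "arrows move between categories",
                "audio":    "tone duration encodes bar height"},
    "scatter": {"navigate": "grid cells swept row by row",
                "audio":    "click density encodes point count"},
}

def interaction_for(graph_type):
    """Look up the standardized interaction for a graph type,
    returning None for types not yet in the taxonomy."""
    return GRAPH_TAXONOMY.get(graph_type)
```

The point of such a table is consistency: once the conventions are learned for one chart, they transfer to every other chart of the same type, including chart types invented later.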
Attending SXSW? Add Susann and Brian’s presentation to your schedule.
Labels: abilitylab, accessibility, big data, complex data, sxsw