Reading by touch: New software predicts direction of user’s fingers

Francis Quek, professor of visualization at Texas A&M, and Yasmine N. El-Glaly, assistant professor of computer science at Port Said University in Egypt, have developed two key refinements that improve the experience of blind or visually impaired people who use iPads as touch-based reading devices.

A research paper detailing the refinements, which improve how accurately the software responds to a user’s touch, earned an outstanding paper award at the International Conference on Multimodal Interaction, held Nov. 12-16, 2014, at Bogazici University in Istanbul, Turkey.

In their paper, “Digital Reading Support for The Blind by Multimodal Interaction,” Quek and El-Glaly describe how blind or visually impaired readers drag their fingertips along virtual lines of text on the tablet’s screen, or on a physical overlay, to hear the tablet “speak” the text of a book or article.
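The paper itself does not include code, but the interaction can be illustrated with a minimal sketch in Swift, the iPad’s native language: a fingertip position is resolved to the word beneath it on a virtual line, and that word is spoken aloud. The type names, the even word spacing, and the repeat-suppression logic below are hypothetical assumptions for illustration, not details from the paper.

```swift
import UIKit
import AVFoundation

// Hypothetical sketch of the touch-to-speech interaction described above:
// the screen is divided into virtual lines of text, a fingertip position is
// resolved to the word beneath it, and that word is spoken aloud.

struct VirtualLine {
    let frame: CGRect      // screen region occupied by this line of text
    let words: [String]    // words laid out left to right along the line
}

final class TouchReader {
    private let lines: [VirtualLine]
    private let synthesizer = AVSpeechSynthesizer()
    private var lastSpokenWord: String?

    init(lines: [VirtualLine]) {
        self.lines = lines
    }

    /// Resolve a fingertip location to the word under it, if any.
    func word(at point: CGPoint) -> String? {
        guard let line = lines.first(where: { $0.frame.contains(point) }),
              !line.words.isEmpty else { return nil }
        // Assume, for this sketch, that words are evenly spaced on the line.
        let fraction = (point.x - line.frame.minX) / line.frame.width
        let index = min(Int(fraction * CGFloat(line.words.count)),
                        line.words.count - 1)
        return line.words[index]
    }

    /// Speak the word under the finger, suppressing immediate repeats so the
    /// reader is not bombarded while the finger lingers on one word.
    func handleTouch(at point: CGPoint) {
        guard let word = word(at: point), word != lastSpokenWord else { return }
        lastSpokenWord = word
        synthesizer.speak(AVSpeechUtterance(string: word))
    }
}
```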

“Existing applications force the user to be slow,” said Quek, director of the Texas A&M Embodied Interaction Laboratory. “If the user runs her finger too quickly on the virtual lines of text or changes the software’s access mode to read bigger chunks of words, she can easily lose her place or wander between virtual lines of text without realizing it,” he said.

To address these issues, Quek and El-Glaly developed software for the iPad that predicts the direction of a user’s finger on a tablet overlay, rendering words audibly in sequence and alerting the reader if she strays from the reading line. Their work was supported by a $302,000 grant from the National Science Foundation.
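One plausible way to realize that idea, again as a hedged sketch rather than the authors’ published method, is to keep a small window of recent fingertip samples, estimate the finger’s direction of travel from them, and raise an audible warning when the path drifts too far from the current line. The window size, drift threshold, and spoken alert below are illustrative choices.

```swift
import CoreGraphics
import AVFoundation

// Hypothetical sketch of direction prediction and off-line alerting: recent
// fingertip samples are kept in a small window, the finger's direction of
// travel is estimated from them, and an audible warning is raised when the
// path drifts too far from the current reading line. The window size and
// drift threshold are illustrative values, not the authors' parameters.

final class FingerDirectionTracker {
    private var samples: [CGPoint] = []
    private let windowSize = 8                // recent samples kept for the estimate
    private let driftThreshold: CGFloat = 12  // vertical drift (points) before alerting
    private let synthesizer = AVSpeechSynthesizer()

    /// Record a new touch sample and warn if the finger has left the line.
    func addSample(_ point: CGPoint, lineCenterY: CGFloat) {
        samples.append(point)
        if samples.count > windowSize { samples.removeFirst() }
        if abs(point.y - lineCenterY) > driftThreshold {
            alertOffLine()
        }
    }

    /// Estimate the finger's direction as the average per-sample displacement
    /// across the window; a positive dx indicates left-to-right motion, which
    /// tells the reader which words come next in reading order.
    func estimatedDirection() -> CGVector? {
        guard samples.count >= 2,
              let first = samples.first, let last = samples.last else { return nil }
        let steps = CGFloat(samples.count - 1)
        return CGVector(dx: (last.x - first.x) / steps,
                        dy: (last.y - first.y) / steps)
    }

    private func alertOffLine() {
        // Speak a brief warning so the reader can recenter on the line; the
        // isSpeaking guard keeps the alert from stuttering on every sample.
        guard !synthesizer.isSpeaking else { return }
        synthesizer.speak(AVSpeechUtterance(string: "Off the line"))
    }
}
```

In a full reader, the direction estimate would drive which words are queued for speech next, and the warning might be a nonspeech cue so it is not confused with the text itself; both choices here are placeholders.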
