Haptic technology, already present on some tablets and smart phones, is emerging as an innovative way to augment visual information with effects such as vibration or “feelable” textures. The implications for accessibility are exciting, and this project explored using vibrotactile feedback to augment visual information in EPUB and HTML content. A sample EPUB with haptic-enabled SVG shapes was developed and is available for download. Additionally, instructions and source code for adding haptic effects to EPUB and HTML content are provided. Finally, usability issues with haptic interfaces are discussed, along with the results of a usability study with students with visual impairments.
Consumer tablet devices, such as the Apple iPad and Google Android products, are making significant inroads in education as a platform for delivering eBooks, instructional material, and assessments. Accessibility of these devices, and of the content presented on them, is an important consideration, especially for students with visual impairments, for whom access to the graphical and spatially presented information essential to the study of science, technology, engineering, and mathematics (STEM) can pose significant challenges.
A new class of technology – tablet-based haptics – may provide an effective mechanism for presenting graphical information to students with visual impairments. As tablets with haptic capability increase their foothold in the classroom, and in the hands of students at school (and home), there is the potential for significant advances in how students with visual impairments are enabled to independently interact with STEM content. This DIAGRAM Center funded research project examines the use of widely available vibrotactile feedback as a means to provide access to graphical information in HTML and EPUB. Sample content, source code, and a discussion of usability issues are included.
What are Haptics?
Haptics, by definition, involves the sense of touch. Today, the term applies to a number of technologies used to enable touch-based feedback in computer-based applications and devices. Vibrotactile, or vibration-based, feedback is the most common form and is widely available on mobile phones and many tablets. Haptic feedback based on vibration first appeared on mobile phones to enable silent alerts via programmatic modulation of vibration generated by a small motor embedded within the device. The technology is universally implemented on mobile phones from multiple vendors, including touch screen smart phones. Vibrotactile capabilities on tablet devices are a different story: our research indicates that many tablets running the Google Android platform incorporate vibrotactile capability, while the Apple iPad and iPod Touch devices do not.
End-user acceptance of haptic feedback on mobile phones is clear, whether used as an alternative to an audio ringtone in silent mode, or as an alert that a text message has arrived. Some phones implement haptic feedback to enhance interaction with onscreen buttons or keyboards, and some games incorporate vibration effects to enhance gameplay. On tablets, while vibrotactile feedback is supported on a number of Android devices, the applications are generally limited to gaming or enhancements to touch feedback (e.g., a vibration occurs when an on-screen button is pressed). One interesting application with accessibility benefits is the inclusion of vibration feedback in Google’s TalkBack screen reader for Android.
Beyond Silent Mode and Games
A number of researchers (Avada, et al., 2013; Gorlewicz, 2013; Toennie, et al., 2011; Guidice, et al., 2012; Hakkinen, et al., 2013; Liimatainen, et al., 2014; Hansen, et al., 2014) have been exploring consumer tablets with vibration support for representation of graphical data. Toennie et al., in a study involving 10 sighted students, presented graphical items using a haptic-enabled display and explored both haptic and audio feedback. The students, who were blindfolded for the study, were presented with both X,Y coordinate location and shape identification tasks and were able to correctly locate Cartesian points and accurately distinguish lines from shapes. Though the findings are difficult to generalize, due to the population studied and the sample size, the use of off-the-shelf technologies for research of this kind shows significant potential. Gorlewicz, et al. (2014) subsequently explored the same task with visually impaired students and observed similarly high rates of point location on a grid. Guidice, et al. studied three tasks: intelligibility and comprehension of bar graphs, pattern recognition using letter shapes, and orientation discrimination of geographic shapes using an off-the-shelf Android tablet. The study, involving 12 sighted (blindfolded) and 3 blind participants, demonstrated high accuracy rates for both sighted and blind participants across all tasks. Hakkinen developed a prototype task utilizing vibration feedback as a method to enhance accessibility of the Proportional Punch Task, in which students learn about ratios and proportion (Cayton-Hodges, et al., 2012). Ease of authoring haptic-enabled graphics motivated Hakkinen, et al. (2013) and Liimatainen, et al. (2014) to explore a rapid means of implementing haptic effects using the widely adopted Scalable Vector Graphics (SVG) standard and an emerging standards-based vibration interface. This latter research led to the current project.
Standards-based Approach to Using Vibration
A promising next step to simplify authoring, and potentially support additional haptic technologies beyond vibration, is to extend haptics into the W3C Cascading Style Sheets (CSS) standard. Using this approach, it would be possible for content authors to apply haptic styles, such as vibration patterns or textures, to web-based content, much as colors or fill patterns are applied to visual content today. While a draft proposal for haptic CSS was created in 2010 by Nokia, it is no longer active. Liimatainen, et al. (2014) modeled their implementation after Haptic CSS. A W3C Community Group, Haptic Interaction on the Web, has been created with a goal to continue this work.
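To make the style-sheet idea concrete, the following is a minimal JavaScript sketch of how named haptic styles could be emulated today on top of the standard W3C Vibration API (`navigator.vibrate`). The `data-haptic` attribute and the style names (`solid`, `pulse`, `rough`) are invented for illustration; they are not part of any standard or of the draft Haptic CSS proposal.

```javascript
// Map a named haptic style to a Vibration API pattern: alternating
// milliseconds of vibration and pause. Style names are hypothetical.
function hapticPattern(styleName) {
  const styles = {
    solid: [400],                 // one continuous buzz
    pulse: [100, 50, 100],        // two short pulses with a gap
    rough: [20, 20, 20, 20, 20],  // rapid ticks, felt as a coarse texture
  };
  return styles[styleName] || []; // empty pattern for unknown styles
}

// Wire the patterns to touch, for any element carrying the
// (hypothetical) data-haptic attribute. The navigator object is passed
// in so the logic is easy to exercise outside a browser.
function attachHapticStyles(root, nav) {
  root.querySelectorAll('[data-haptic]').forEach((el) => {
    el.addEventListener('touchstart', () => {
      if (nav.vibrate) nav.vibrate(hapticPattern(el.dataset.haptic));
    });
  });
}
```

An author would then write, say, `<rect data-haptic="rough" …>` instead of styling the vibration in script, which is the separation of content and presentation that a real Haptic CSS would provide natively.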
The rapid development of new technologies can quickly make older hardware obsolete, which places a burden on content publishers to regularly update their tactile content for new devices. Most haptic research to date has involved proprietary or one-off approaches to creating stimuli. A focus on a standards-based approach enables a technology-neutral method of specifying haptic characteristics. Haptic CSS may provide a means to future-proof graphical content that insulates the author from the low-level implementation of a specific haptic technology.
Android Tablets Supporting Vibration
To experiment with the haptic samples developed in this project, or to develop your own, you will need an Android tablet supporting vibration. We have tested a number of devices from different vendors, including Lenovo, Samsung, Asus, and Toshiba. While we cannot endorse any specific vendor, it is strongly recommended to verify that a device you currently own, or are considering for purchase, incorporates vibration feedback. There are two methods, described below, to verify vibration support.
- Depending upon your Android version, you will find, in Settings, a menu option for Sound and Notification (or just Sound). Select this option and you will see further options, including Vibrate on Touch. Enable this option and you should be able to feel vibration effects when touching the screen. If no vibrations are felt, the device most likely does not support vibration.
- Ensure the device has the latest version of either the Google Chrome or Firefox browser. Go to the Haptic Test site, using either Chrome or Firefox, and push the button labeled “Vibrate!”. If your device supports vibration, you will feel the device vibrate. For easy access, you can print the QR code shown below and take it with you if shopping for a tablet. The QR code will launch the test site. Click on the QR code image below to display a larger version suitable for printing.
Note: we have observed variation in the strength of the vibration motors used by different tablet vendors. Some haptic effects are very pronounced on one device, and subtle on another. If you have the opportunity to try several tablets before purchasing, you may wish to compare the vibration effect to see which you may prefer.
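The browser-based check above relies on the standard `navigator.vibrate` method. A minimal handler for a “Vibrate!” test button of this kind might look like the following sketch (the button id is hypothetical; the actual Haptic Test site may be implemented differently):

```javascript
// Trigger a half-second test buzz; returns whether the Vibration API
// is available at all. Passing navigator in as a parameter keeps the
// check easy to exercise outside a browser.
function testVibration(nav) {
  if (typeof nav.vibrate !== 'function') {
    return false; // unsupported (e.g., iPad, most desktop browsers)
  }
  nav.vibrate(500); // a single 500 ms vibration
  return true;
}

// In a page, wire it to the test button, e.g.:
// document.querySelector('#vibrate-btn')
//   .addEventListener('click', () => testVibration(navigator));
```

Note that a `true` return only means the API exists; some devices expose `navigator.vibrate` but have a weak or absent motor, which is why the hands-on comparison suggested above is still worthwhile.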
How Vibration Haptic Works
In Android tablets (and most smart phones), vibration is generated by a small motor, usually piezoelectric, embedded within the device. Two styles of vibration motors are shown in Figure 1.
Getting Started with the Samples
There are two categories of sample content available. Assuming you have a device supporting vibration, and one of the supported browsers, you will be able to explore standard HTML pages containing haptic content, as well as an EPUB sample book containing haptic geometric shapes. It is important to note that because of the implementation approach utilized for the samples, it is recommended that you disable the TalkBack screen reader when exploring the haptic samples. If TalkBack is active, you will not be able to explore content and receive immediate haptic feedback. We expect that this is a limitation that will be eliminated as we pursue standards and support of haptic interaction on the web.
Developing Haptic Content
As part of this project, we are releasing sample code and documentation to enable web developers to create their own haptic-enabled content. You can learn more about implementing haptics in HTML and EPUB as part of this report. All sample code and documentation is available on github.com/haptica11y, and we urge those interested in experimenting with and enhancing the code to contribute or fork, as desired.
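To give a flavor of what the samples do, here is a sketch of one way haptic-enabled SVG shapes can be driven from touch events: vibrate while the finger is over a shape and stop when it leaves, so tracing produces a “feelable” outline. This is an illustrative approach under our own assumptions, not a copy of the code at github.com/haptica11y, which may differ.

```javascript
// Decide what vibration to issue given whether the finger was on a
// shape before this touch sample and whether it is on one now.
function vibrationFor(wasOnShape, isOnShape) {
  if (isOnShape && !wasOnShape) return [50, 30]; // entering: pulse the edge
  if (isOnShape) return 50;                      // still on: sustain
  if (wasOnShape) return 0;                      // leaving: cancel (vibrate(0))
  return null;                                   // off-shape: nothing to do
}

// Wire the decision to touch tracking over an inline SVG.
function enableHapticShapes(svgRoot, nav) {
  let onShape = false;
  svgRoot.addEventListener('touchmove', (ev) => {
    ev.preventDefault(); // keep the page from scrolling while exploring
    const t = ev.touches[0];
    const el = document.elementFromPoint(t.clientX, t.clientY);
    const hit = !!(el && el.closest('circle, rect, line, polygon, path'));
    const pattern = vibrationFor(onShape, hit);
    if (pattern !== null && nav.vibrate) nav.vibrate(pattern);
    onShape = hit;
  });
  svgRoot.addEventListener('touchend', () => {
    onShape = false;
    if (nav.vibrate) nav.vibrate(0);
  });
}
```

The key design choice is hit-testing with `document.elementFromPoint` rather than per-shape listeners, so a single handler covers every shape in the drawing; the pulse-on-entry pattern is one possible effect among many.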
Haptics and Usability
While off-the-shelf mainstream tablet technologies with vibrotactile feedback, combined with standards-based approaches to content creation, offer an attractive and economical platform for haptic interaction, the approach is not without challenges. A limitation of current tablet-based haptic technologies is the lack of support for multi-touch interaction. In our own observations of how blind students explore physical embossed tactiles, two-handed exploration is common and can provide a frame of reference when exploring images (e.g., Heller et al. 2005). In our own studies, we have instructed the participants to utilize only one finger, and while we observed initial attempts to use more than one finger, the participants appear to adapt to the single-finger approach within a single session. While Wijntjes, et al. (2008) suggest that two-finger exploration should be encouraged, and Morash, et al. (2014) have demonstrated the advantages of multiple hands and fingers on haptic performance by the blind, it will be important to understand how the technical limitations of a specific technology may affect acceptance and/or usability of haptic presentation. Further, sonification combined with haptic presentation may be an effective way to augment single-touch exploration, and emerging technologies such as electrostatics may offer a method to enable two-finger exploration, where one finger is stationary and serves as a frame of reference.
In the current study, we instructed the participants, six students with visual impairments (ages 14-17), to explore the haptic images with only one finger. All participants had prior experience with either iPad or Android tablets. After an introduction to basic geometric shapes (circle, square, triangle) on the tablet, the students were next presented with 10 haptic stimuli and asked to describe the shape, if any, represented. In addition, each student was asked to complete a System Usability Scale (SUS) survey at the conclusion of the session.
Accuracy of identification was mixed, and not dissimilar to our earlier pilot studies. All six participants correctly identified a horizontal line, which was not one of the initial familiarization shapes presented. The circle and square shapes were correctly identified by four out of six participants, and the triangle by five out of six participants. The diamond (a square rotated 45 degrees) was correctly identified by only one participant, with four participants identifying it as a square or rectangle. The right triangle was identified as a triangle on its side by one participant, and as a triangle by two others. A five-pointed star was not correctly identified by any participant, but one participant indicated that the shape had five corners, and asked if it was a pentagon. A second participant counted out four corners and then said, “pentagon”. A “noise” shape introduced among the stimuli, a random pattern of dots, resulted in no identification by four out of six students, and was reported as a square and as a circle by the remaining two participants.
The identification rate of approximately 66% to 88% for the square, circle, and triangle shapes and shape variants in this study is promising, given the short training and duration of the exposure (30 minutes). The star posed the most difficulty, and we feel that with present technology, complex shapes of that nature will prove difficult to interpret. Though we did not set out to explore lines in this study, the 100% identification rate suggests, along with our earlier work, that line graphs are a promising area for exploration. The result of the SUS survey indicates that usability of the haptic interface has room for improvement, scoring just above average (mean score 57.5 out of 100, standard deviation 7.35).
Based on our past and current usability pilot studies, we believe individual differences in how haptic effects are perceived are an area requiring further research. Such differences may be exacerbated by variability in haptic effect intensity across different vendors’ devices.
References and Further Reading
We have compiled a list of references and resources for further reading.