The study of haptic perception and cognition requires data about how humans interact with tactile surfaces in the context of performing cognitive tasks. MIDAS is a set of three tools for the digital capture, coding, analysis, and interpretation of time-series, multitouch, interactive behaviors on a tactile surface. The MIDAS-logger uses the current screen technology of tablet computers to capture touches (up to ten fingers at high spatial and temporal resolution) through conventional tactile graphics that are overlaid on the screen. The MIDAS-analyser is a software program for the qualitative and quantitative analysis of MIDAS-logger touch data, which includes a fully interactive visualization of the data and a yoked display of a conventional simultaneous video recording made of the interactions. MIDAS-tactile protocol analysis (TPA) provides a scheme and a method to enable the rich coding and interpretation of tactile behaviors over multiple spatial and temporal scales. To demonstrate the functions of MIDAS, its three components were used to capture, analyze, code, and interpret the behavior of an experienced user and an inexperienced user of tactile graphics as they performed a shape-matching task. The efficacy of MIDAS was assessed against a set of criteria drawn from the successes and limitations of prior approaches to the study of tactile interactions.

Tactile reading of braille, and tactile interaction more generally, involves fingers and hands moving across the surface of a display to obtain information, such as messages encoded in a linear sequence of raised alphanumeric symbols or the spatial configuration of other raised elements. Interest in technology and techniques to capture, code, and analyze tactile interactions with (nearly flat) 2-D materials, such as braille and raised-line graphics, persists for several interrelated reasons. One may be interested in our basic perceptual abilities in touch, such as the appropriateness of certain tactile features for discrimination (Jehoel, McCallum, Rowell, & Ungar, 2006; McCallum, Ungar, & Jehoel, 2006), the role of tactile gestalts in perception (Gallace & Spence, 2011), or perceptual aspects of reading a tactile script (Millar, 2003). Some studies have probed the perceptual differences between sighted participants who are blindfolded and people with visual impairment (e.g., Alary et al., 2009; Heller, 2002; Jehoel et al., 2006; McCallum et al., 2006). Beyond perception, there is interest in the strategies that readers employ to read braille (Bertelson, Mousty, & D’Alimonte, 1985; Breidegard et al., 2008; Hughes, McClelland, & Henare, 2014; Millar, 2003; Mousty & Bertelson, 1992; Symmons & Richardson, 2000) and whether lateralization occurs in such ability (Mousty & Bertelson, 1985). From the perspective of display technology for people with visual impairment, research has been conducted to assess the efficacy of novel devices for delivering tactile stimuli to fingers (Blazie & Cranmer, 1976; Bliss, Katcher, Rogers, & Shepard, 1970; Kaczmarek, Tyler, & Bach-y-Rita, 1997) and the back (Geldard & Russel, 1957; White, Saunders, Scadden, Bach-y-Rita, & Collins, 1970). Additionally, from a computer science angle, there is growing interest in the potential of gesture-based human-computer interaction (e.g., Wobbrock, Morris, & Wilson, 2009). Our interest is in methods to capture, code, and analyze tactile behaviors related to the cognitive science of tactile graphics: the human information processing involved in the reading of, and reasoning with, tactile pictures and diagrams.
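The actual MIDAS log format is not given here, but the kind of time-series, multitouch record a logger of this sort captures, and one simple quantitative measure an analyser might derive from it, can be sketched as follows. This is a minimal illustration under stated assumptions: `TouchSample` and `path_lengths` are hypothetical names for this sketch, not part of MIDAS.

```python
from dataclasses import dataclass
from collections import defaultdict
from math import hypot

@dataclass
class TouchSample:
    """One logged multitouch sample: which finger, where, and when.
    Hypothetical record layout, not the actual MIDAS format."""
    t_ms: int    # timestamp in milliseconds since logging started
    finger: int  # touch identifier, 0-9 (up to ten simultaneous fingers)
    x: float     # horizontal screen position in pixels
    y: float     # vertical screen position in pixels

def path_lengths(samples):
    """Per-finger distance travelled across the surface: the kind of
    basic quantity an analyser could compute from logged touches."""
    last = {}                    # most recent sample seen per finger
    dist = defaultdict(float)    # accumulated path length per finger
    for s in sorted(samples, key=lambda s: s.t_ms):
        if s.finger in last:
            p = last[s.finger]
            dist[s.finger] += hypot(s.x - p.x, s.y - p.y)
        last[s.finger] = s
    return dict(dist)

# Two fingers tracing short strokes across the display:
log = [TouchSample(0, 0, 0.0, 0.0), TouchSample(16, 0, 3.0, 4.0),
       TouchSample(0, 1, 10.0, 10.0), TouchSample(16, 1, 10.0, 12.0)]
print(path_lengths(log))  # → {0: 5.0, 1: 2.0}
```

A real capture stream would carry many such samples per second per finger; richer measures (dwell time, zigzag scanning, two-handed coordination) reduce to similar per-finger groupings of the same time-series records.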