Exploring environments and objects through haptics has always been a natural and intuitive part of the learning process. When we interact with people, animals, or inanimate objects, we sense tactile information such as forces or temperature. This information helps us build a mental model of the objects we interact with. Two-dimensional interfaces, such as touchscreens, prevent us from exploring new environments with all our senses, forcing us to rely solely on visual feedback. This makes it almost impossible to develop a somatic understanding. Could interfaces be easier to learn and work with if they weren't purely visual?
In my project, I strip down conventional two-dimensional interfaces by removing the visual feedback and replacing it with vibrotactile feedback. I use this "artificial blindness" to explore new ways of interaction that do not rely on any visual cues.
In order to learn about the nature of haptic feedback interfaces, my approach was built upon trial-and-error testing. This approach resulted in various prototypes for evaluation. My main focus was the tactile rendering of geometric forms using vibrations. The prototypes were developed using Arduino, Processing, and Unity3D.
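The basic rendering principle can be sketched as follows: a touch position on the blank screen is mapped to a vibration intensity that peaks on the outline of a shape and fades with distance, so a sweeping finger can "feel" the form. This is a minimal Python illustration, not the actual prototype code (which used Arduino, Processing, and Unity3D); the circle parameters and falloff distance are assumed values for demonstration:

```python
import math

def circle_edge_distance(x, y, cx, cy, r):
    """Distance from touch point (x, y) to the edge of a circle
    centered at (cx, cy) with radius r."""
    return abs(math.hypot(x - cx, y - cy) - r)

def vibration_intensity(x, y, cx=160.0, cy=120.0, r=80.0, falloff=40.0):
    """Map a touch position to a vibration strength in [0, 1].

    Intensity is maximal on the circle's outline and decreases
    linearly to zero over `falloff` pixels, so the user perceives
    the geometric form as a vibrating contour on the blank screen.
    """
    d = circle_edge_distance(x, y, cx, cy, r)
    return max(0.0, 1.0 - d / falloff)

# Touching the outline yields full vibration; the center yields none.
print(vibration_intensity(240.0, 120.0))  # on the edge -> 1.0
print(vibration_intensity(160.0, 120.0))  # at the center -> 0.0
```

On a microcontroller, the resulting intensity would typically be scaled to a PWM duty cycle driving a vibration motor; the linear falloff here is one simple choice among several possible distance-to-intensity mappings.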
The final output of my project consisted of three different tactile feedback interfaces: one for exploring patterns, one for exploring text, and one for exploring a complex 3D surface. For input, I used blank touchscreens, leaving users to rely solely on their tactile senses. The only aids provided were three printed illustrations of the content to be explored.