Manufacturers have been falling over each other to present us with the most innovative sensory input methods for some time, from voice-control software to gesture-recognition boxes. But surely output is lagging behind? We have plenty of visual and audio products: HD displays, 3D displays, multiple displays, and speakers. Yet our biggest organ, the skin, is touch-sensitive, and touch-screens still don’t feel up to scratch.
There’s nothing like the satisfying feeling of pressing a button on an actual keyboard. Touch-screens lack direct tactile feedback – something our emotions are programmed to respond to – meaning we continue to rely on vision alone. You’re already familiar with vibrating game controllers, but imagine a navigation system that physically pushes you in the right direction, or a tablet screen that lets you feel the fur on a picture of a cat and hug a friend on the other side of the world. While tablet computers on the market today can provide an evocative, immersive experience, they have not yet been designed to fully integrate multi-sensory feedback.
A number of researchers have already focused on ways to use physical objects to manipulate the digital world. MIT Tether gloves, for example, enable users to handle digital objects by gesturing, pinching, stretching and drawing, but they are unable to give tangible feedback. The same applies to Portico: a portable system that uses two cameras to detect physical objects in the vicinity of a tablet PC. The real challenge is making virtual objects not only behave, but also feel, physical. And this is where haptic feedback comes in.
There are two main categories of haptic feedback: texture feedback and force feedback. New technologies have made texture feedback relatively easy. One of the main researchers in this field is Ivan Poupyrev, who explains on his personal website how multiple piezoceramic layers that bend under the influence of an electric current can be used to create and vary textures on a surface. Another technology, applied in Senseg touch-screens, generates a ‘Coulomb force’ between the screen and the user’s finger, producing tactile sensations that feel like vibrations and even edges. This method is completely silent and delivers a larger variety of textures than alternatives. In March, rumours surfaced that this technology was going to be used in the iPad 3.
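To give a feel for how such electrostatic texture rendering works, here is a minimal sketch. It assumes a hypothetical interface where the software sets a drive voltage as the finger moves; the function names and the simple grating pattern are illustrative, not Senseg’s actual API. The key physical idea is that the electrostatic attraction between screen and finger grows roughly with the square of the applied voltage, so a texture is rendered by modulating voltage with finger position.

```python
import math

def grating_voltage(x_mm: float, period_mm: float = 2.0, v_max: float = 100.0) -> float:
    """Drive voltage for a simple periodic 'ridge' texture at finger position x.

    Because perceived friction scales roughly with voltage squared, we take
    the square root of the desired friction profile so the felt modulation
    follows the intended texture shape.
    """
    # Desired friction profile: one smooth ridge per texture period, 0..1.
    friction = 0.5 * (1.0 + math.sin(2.0 * math.pi * x_mm / period_mm))
    return v_max * math.sqrt(friction)

# As the finger sweeps across the surface, the voltage rises and falls once
# per period, which the finger perceives as alternating ridges and grooves.
samples = [round(grating_voltage(x * 0.25), 1) for x in range(9)]
```

A real driver would also have to track finger velocity and cap the voltage for safety; this sketch only shows the position-to-voltage mapping at the heart of the technique.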
As for force feedback, we recognize a primitive version of this in our vibrating phones and game controllers. More subtle movement is much harder to accomplish. However, there has been one successful attempt in the form of surround haptics, where a grid of actuators vibrating at high resolution was used to generate tactile illusions. The immersive experience that this creates is likely to be applied in the gaming and movie industries, but also to improve social connectedness – allowing people to ‘touch’ each other over a distance.
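One of the tactile illusions such actuator grids exploit is the ‘phantom actuator’ (or funneling) effect: drive two real actuators at the right relative amplitudes and the user feels a single vibration at a point between them. The sketch below illustrates the idea under a common energy-based model from the tactile-illusion literature; the function names and the linear two-actuator layout are illustrative assumptions, not the actual surround-haptics implementation.

```python
import math

def phantom_amplitudes(pos: float, intensity: float = 1.0) -> tuple[float, float]:
    """Amplitudes for two neighbouring actuators that place a phantom
    sensation at 'pos' (0.0 = left actuator, 1.0 = right actuator).

    Uses the energy-summation model: a_left^2 + a_right^2 stays constant,
    so the overall felt intensity does not change as the phantom moves.
    """
    a_left = intensity * math.sqrt(1.0 - pos)
    a_right = intensity * math.sqrt(pos)
    return a_left, a_right

def sweep(steps: int = 5) -> list[tuple[float, float]]:
    """Sweep the phantom from left to right, creating apparent continuous
    motion from only two physical actuators."""
    return [phantom_amplitudes(i / (steps - 1)) for i in range(steps)]
```

Chaining such sweeps across a whole grid is what lets a coarse array of vibration motors paint smooth moving sensations across the skin.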
A further step in virtual tangibility development could conceivably be connecting people in ‘the haptic cloud’. Cloud hosting services would provide a space where virtual tactile objects could be created using a combination of pressure and texture feedback. Cloud computing already provides the necessary processing power and storage space for the large amounts of data and complex computations needed to enable sensors and actuators to respond to physical interaction in real time. Users could touch objects, animals, and each other in the same virtual world while being physically miles apart. If you look at it like this, haptic feedback has the potential to be so much more than a novel smartphone function. Let’s hope that developers agree this is where the future lies.