Touch and go: fondling the digital world

Published on June 11, 2012 by Melissae Fellet.

What would digital bits and graphics feel like if we could grasp them with our hands? Thanks to tactile illusions on touchscreens, we’re about to find out

IF YOU could touch it, what would the internet feel like? If you are picturing a big blob of information, perhaps you’d find it squishy, or bumpy. Then again, some corners might be rough like grass, whereas other parts would feel like cloth or polished wood.

It may seem like a silly idea, but the internet – along with the rest of the digital world – is poised to become a thing we can actually feel with our hands and fingers. Games, emails and apps will feature tangible graphics that you can poke, fondle or caress, and eventually, web pages might have the texture of leather, say, or sandpaper.

This is all thanks to a new generation of super touchscreens on the cusp of entering widespread use. These use tactile illusions to tickle our fingertips and trick our brain into feeling a texture, or employ morphing materials that rise from a flat surface to form graspable shapes. Nokia and Microsoft are thought to have screens with tangible graphics in the works. And Apple – which is rumoured to be launching a new product at its World Wide Developers Conference this month – has been applying for patents covering the technology since 2009.

The evidence, then, suggests that the next wave of smartphones and tablets will have tactile features. And by allowing us to sense bits and bytes with more than just our eyes and ears, these screens have the potential to transform how we interact with the digital world. Welcome to the age of supertouch.

Touchscreens were first deployed in British air traffic control towers in 1967, according to Bill Buxton of Microsoft Research, who maintains an extensive collection of touchscreens and historical consumer gadgets. The airport screens used capacitive sensing to register touch – a technology still employed today on most phones and tablets. It essentially works by detecting how a finger distorts an electrostatic field within the screen.
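To make the principle concrete, here is a minimal sketch of capacitive sensing as described above: a controller keeps a baseline reading for each sensing electrode and reports a touch when a finger shifts the measured capacitance by more than a threshold. The names and numbers are illustrative, not any real controller’s interface.

```python
# Minimal sketch of the capacitive-sensing principle: a finger near an
# electrode changes its measured capacitance; a shift beyond a noise
# threshold is treated as a touch. All values are illustrative.

BASELINE = 100.0       # idle capacitance reading for one electrode (arbitrary units)
TOUCH_THRESHOLD = 8.0  # minimum change treated as a finger rather than noise

def is_touched(reading: float, baseline: float = BASELINE,
               threshold: float = TOUCH_THRESHOLD) -> bool:
    """Return True if the electrode reading has shifted enough to count as a touch."""
    return abs(reading - baseline) > threshold

# A finger near the electrode distorts the field and raises the reading:
print(is_touched(101.2))  # False - within noise
print(is_touched(112.5))  # True  - finger detected
```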

For the next few decades, touchscreens were mainly found on large, static computer terminals in public places like schools and museum information kiosks, and on ATMs. The first smartphone – IBM’s Simon, released in 1993 – featured a touchscreen number pad and basic apps, but it failed to sell. It took the launch of Apple’s iPhone in 2007 for touchscreens to become widespread on portable devices. Although its multi-touch screen wasn’t the first of its kind, it made touch mainstream and sparked countless imitations. According to some predictions, 97 per cent of smartphones will have touchscreens by 2016. In 2006, it was only 7 per cent. A few of the latest devices even have more than one touch interface: video games on Sony’s PlayStation Vita, for example, can be played using a touchscreen on its front and a touchpad on the back.

Even though the ability to swipe, pinch and drag websites, apps and photos with our fingers has proved a highly intuitive way of navigating the digital world, many designers, companies and researchers argue that something is still missing from touchscreens: they all feel like glass. We evolved to interact with the world using our whole bodies and all our senses, they point out, and surfing the web, playing video games or messing about with apps cannot match the rich tactile sensations of handling and manipulating real objects. “I think we as human beings want that,” says interaction designer Ivan Poupyrev, who has built several tactile screens in recent years. The question is: how can digital information displayed on a screen be made tangible? You just need the right tricks, says Poupyrev.

Some smartphones and tablets made by LG, Nokia and Samsung already offer enhanced touch sensations. The simplest are devices that buzz, creating the feeling that you have clicked a button on the screen. The vibrations are often powered by tiny motors programmed by a company called Immersion Corporation, based in San Jose, California. The feedback they provide makes it easier to type on a virtual keyboard, the company says.

That’s not all vibration can do. For example, Immersion has built smartphones with precision motors that recreate the twang of guitar strings, kickback from a virtual gun, or the sensation of a ball rolling about inside, bashing against the sides of the phone. All these effects add sensation to mobile gaming, the company says.

Poupyrev started out using vibration effects too. In 2005, at Sony, he helped develop the first commercial device with a vibrating touchscreen – it was a remote control for entertainment centres. Piezoelectric motors underneath the glass bowed when electrified, pressing the glass into your finger so that you felt the sensation of clicking a button. The technology may have been ahead of its time, says Poupyrev, but it never took off. Part of the reason may be that screens featuring such strong vibration effects need to be relatively thick, and in the smartphone and tablet market, thin is in. That’s why Poupyrev and other researchers have been searching for other ways to trick the finger into feeling sensations and textures (see diagram).

After Sony, Poupyrev moved to Disney Research Laboratory in Pittsburgh, Pennsylvania. This R&D arm of the Disney media empire quietly develops consumer technology, alongside researchers at universities in the US and Europe. Poupyrev’s challenge was to find a way to create touchscreen sensations – but without any mechanical motion.

He decided to turn to a phenomenon called electrovibration, which had been discovered by chance in 1953. One day, a chemist called Edward Mallinckrodt touched a brass light socket and noticed that the surface felt rougher when the light was on. Mallinckrodt realised that a mild current was flowing through the socket, possibly because of faulty insulation. He felt the same sensation after applying an alternating current to all sorts of conductive surfaces – even a freshly cooked egg (Science, vol 118, p 277).

Electrickery

After further experiments, Mallinckrodt worked out why. An alternating current oscillates by definition, its voltage rising and falling at a particular frequency. When you place your finger on an electrified surface, opposite charges build up in the outermost layer of your skin, about 10 micrometres deep. Because the charge on the surface oscillates with the current, so does the electrostatic attraction between your finger and the surface. Yet you don’t actually feel anything until you move your finger across the screen, when friction combines with the oscillating attraction. The result is a vibration in your fingertip, picked up by three different types of touch receptor. Because these vibrations are essentially the same as those generated by running your finger over a ridged surface such as wood, corduroy or sandpaper, your brain interprets the sensation in the same way.
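A small numerical sketch shows why the felt friction ripples as the voltage alternates. It assumes, purely for illustration, that finger and screen behave like a parallel-plate capacitor, so the attraction scales with the square of the drive voltage; every number below is made up.

```python
import numpy as np

# Sketch of electrovibration: an alternating voltage produces an oscillating
# electrostatic pull on the fingertip (assumed here to scale with voltage
# squared, as for a parallel-plate capacitor), so the friction on a sliding
# finger oscillates too. All numbers are illustrative.

freq = 80.0                      # drive frequency in hertz (80 Hz felt like wood or leather)
t = np.linspace(0, 0.05, 2000)   # 50 milliseconds of sliding
voltage = 100.0 * np.sin(2 * np.pi * freq * t)

finger_force = 0.5               # newtons the finger presses with
k = 2e-5                         # lumped capacitor constant (illustrative)
attraction = k * voltage**2      # electrostatic pull, always toward the screen

mu = 0.6                         # rough friction coefficient of skin on glass
friction = mu * (finger_force + attraction)

# The ripple in this friction force is the vibration the touch receptors
# pick up and the brain reads as texture.
print(f"friction varies between {friction.min():.3f} N and {friction.max():.3f} N")
```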

In 2010, Poupyrev and his colleagues used this electrovibration perceptual illusion to make TeslaTouch, a touch panel with artificial textures that could eventually be installed in ATMs, mobile phones or even interactive walls. They had found that, in general, a high-frequency current makes a screen feel smoother than one at lower frequency. The sensations that TeslaTouch can mimic still feel somewhat artificial, Poupyrev says, but testers described the screen as feeling like paper when the current alternated at 400 hertz and wood or bumpy leather at 80 hertz. In principle, designers could use the effect to program apps or websites with different textures, says Poupyrev. E-books could have the roughness of an actual page, for example.
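In practice, a designer could tag interface elements with textures through something like the lookup sketched below, which uses only the two data points reported for TeslaTouch in the text (roughly 400 hertz for a papery feel, 80 hertz for wood or bumpy leather). The function names are hypothetical, not TeslaTouch’s actual interface.

```python
# Hypothetical texture-to-drive-frequency lookup for an electrovibration
# screen, based on the frequencies reported for TeslaTouch. Not a real API.

TEXTURE_FREQUENCIES_HZ = {
    "paper": 400,    # higher frequency feels smoother
    "wood": 80,      # lower frequency feels rougher
    "leather": 80,
}

def drive_frequency(texture: str) -> int:
    """Return the electrovibration drive frequency for a named texture."""
    return TEXTURE_FREQUENCIES_HZ[texture]

# An e-book page could be given the roughness of real paper:
print(f"page region -> drive at {drive_frequency('paper')} Hz")
```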

For now, Poupyrev’s screen is a prototype, but he’s far from the only one exploiting the electrovibration illusion. Senseg, a company based in Espoo, Finland, has created a similar screen that could be incorporated into some upcoming Android smartphones – and perhaps even Apple devices if a few industry rumours are to be believed. Meanwhile, Nokia is developing a screen of its own based on the same effect.

To simulate a wider range of tactile sensations, however, you need more than texture alone: you also need force. Turn a knob or flip a switch in the physical world and you can feel resistance pushing back on your finger. Ed Colgate at Northwestern University in Evanston, Illinois, and his colleagues wanted to find a way to recreate such forces on flat glass – in particular, those felt when sliding a finger across physical buttons. So they built a screen, called the LateralPad, which vibrates at 22,000 hertz – movements too fast for us to hear or feel, unlike the stronger vibrations used in existing phones.

The vibrations move the screen both towards and away from the finger and from side to side. Colgate realised that by changing the timing between the lateral and vertical vibrations, the screen can generate a force that pushes a finger to the left or right across the glass. For example, if the sideways motion is offset by 90 degrees – a quarter of a cycle – so that the glass is moving left just as it presses hardest against the finger, it delivers a push of about 70 millinewtons in that direction.
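A toy simulation makes the timing argument concrete. It assumes a sinusoidally modulated contact force and kinetic friction that drags the finger in the direction the glass is moving; these assumptions and all the numbers are illustrative, not Colgate’s actual design parameters.

```python
import numpy as np

# Toy model of the LateralPad effect: vertical vibration modulates how hard
# the glass presses on the finger, lateral vibration slides the glass
# sideways, and the phase between them decides whether the friction pulses
# add up to a net sideways push. Illustrative assumptions throughout.

def net_lateral_force(phase_deg, mu=0.6, n0=0.5, cycles=50, samples=20000):
    """Average sideways friction force on the finger over many vibration cycles."""
    t = np.linspace(0, cycles * 2 * np.pi, samples)
    normal = n0 * (1 + np.sin(t))                          # contact force, modulated vertically
    lateral_velocity = np.cos(t + np.radians(phase_deg))   # glass velocity, left/right
    friction = mu * normal * np.sign(lateral_velocity)     # friction drags finger with the glass
    return friction.mean()

for phase in (0, 45, 90):
    print(f"phase {phase:3d} deg -> net force {net_lateral_force(phase):+.3f} (arbitrary units)")

# In this toy model the push averages to essentially zero at 0 degrees and is
# strongest near 90 degrees, matching the quarter-cycle offset described above.
```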

By combining this with sensors that track where a finger is placed, the screen simulates the feeling of physical buttons and switches. For example, Colgate can simulate the feel of a concave button – like the iPhone’s home button – on a flat screen by applying vibrations from the button’s edge. These push a finger towards the centre so that it feels like it is entering a dip.
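One way to picture that is as a position-dependent force field: inside the button’s radius the finger is nudged towards the centre, more strongly near the rim, so it feels like sliding into a dip. The profile and numbers below are assumptions for illustration only.

```python
# Illustrative force field for a virtual concave button: push the finger
# toward the button's centre, strongest near the rim, fading at the centre.
# The profile shape and all numbers are invented for illustration.

def concave_button_force(x, y, cx=0.0, cy=0.0, radius=5.0, max_force_mn=70.0):
    """Return an (fx, fy) push in millinewtons steering the finger toward (cx, cy)."""
    dx, dy = cx - x, cy - y
    dist = (dx * dx + dy * dy) ** 0.5
    if dist == 0 or dist > radius:
        return (0.0, 0.0)                      # outside the button, or dead centre: no push
    strength = max_force_mn * (dist / radius)  # stronger near the rim
    return (strength * dx / dist, strength * dy / dist)

# A finger landing near the button's right edge gets nudged left, into the dip:
print(concave_button_force(4.0, 0.0))  # roughly (-56.0, 0.0) millinewtons
```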

Another force pattern gives you the feeling of pushing a light switch from side to side. As soon as Colgate felt such a virtual switch, he found the illusion addictive. “It was like those monkeys that keep pressing a bar for apple juice because it excites the dopamine,” he says, referring to the brain’s reward chemical. “I genuinely hope one day others can feel a similar level of excitement.”

Eventually, such screens could also enhance gaming. If you were playing Angry Birds on your phone, say, the elastic slingshot you use to launch the birds could tug on your finger as you stretch the rubber back. The screens could also give apps and files tangible properties that reflect what’s inside them. As a folder of photos fills up, for instance, it could be programmed to “weigh” more, requiring a harder tug to drag it across the screen.
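A hypothetical mapping for that “heavier folder” idea might look like the sketch below, which converts how full a folder is into the opposing force the screen applies while you drag its icon. The mapping and the force ceiling are invented for illustration.

```python
# Hypothetical mapping from a folder's contents to the drag resistance a
# haptic screen applies while its icon is moved. Invented for illustration.

def drag_resistance_mn(photo_count, per_photo_mn=0.5, max_mn=60.0):
    """Opposing force (millinewtons) to apply while dragging a folder icon."""
    return min(max_mn, photo_count * per_photo_mn)

for count in (4, 40, 400):
    print(f"{count:3d} photos -> resist drag with {drag_resistance_mn(count):.1f} mN")
```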

Realistically, Colgate’s technology is unlikely to appear on smartphones and tablets for a few years, not least because cheaply engineering the complex vibration required is a huge challenge. Still, he has set up a company called Tangible Haptics to develop the technology. He believes that such screens are bound to be adopted eventually because they give app developers and interaction designers an almost unlimited number of sensations to play with.

Shape-shifting screens

But there is a rival approach vying to become the touch technology of the future – and it dispenses with the idea of flat screens altogether.

One example is a screen built by Chris Harrison and Scott Hudson at Carnegie Mellon University in Pittsburgh, Pennsylvania, which features shapes and buttons that rise out of it in 3D. A motor inflates cavities underneath a latex screen so that buttons appropriate to the task at hand emerge from the surface. It could produce all sorts of controls: a number pad, dials on a car dashboard or buttons on a stereo. If the technology were applied to an ATM touchscreen, buttons could rise and fall depending on which actions were available, such as entering a PIN or selecting how much cash to withdraw.
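The ATM example boils down to a lookup from screen state to the set of cavities to inflate, as in the sketch below. The layouts and the inflate/deflate behaviour are invented for illustration, not taken from Harrison and Hudson’s hardware.

```python
# Hypothetical context-dependent layouts for a shape-changing ATM screen:
# each state lists the buttons to raise; everything else stays flat.

ATM_LAYOUTS = {
    "enter_pin":     [str(d) for d in range(10)] + ["cancel", "enter"],
    "choose_amount": ["20", "50", "100", "other", "cancel"],
    "idle":          [],                      # flat screen, nothing raised
}

def set_screen_state(state):
    """Raise the buttons needed for this state and flatten everything else."""
    raised = ATM_LAYOUTS[state]
    print(f"inflating cavities: {raised or 'none'} (all others deflated)")

set_screen_state("enter_pin")
set_screen_state("choose_amount")
```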

Harrison and Hudson’s technology is not ready for commercialisation, but a start-up called Tactus Technology, based in Fremont, California, is a bit closer to making raised screens real. Its screens have buttons, such as a keyboard or number pad, that rise up from a flat surface, pushed by fluid or gas.

All the shapes on these screens must be defined during manufacture. But in 2010, Apple applied for a patent for a more exotic – and far more flexible – morphing 3D screen. The patent describes an elastic touchscreen with “shape change elements” beneath it, which vibrate, bend or deform into whatever shape or position a designer cares to program. Naturally, Apple isn’t talking about it, but if the idea is real, a future iPhone screen could morph into all sorts of shapes. Apart from making for more tactile buttons and keys, one use might be to add topography to a digital map of your surroundings.

All in all, the touchscreens around the corner are likely to transform the ease with which we navigate the digital world. Various studies have shown that tactile sensations are crucial for dexterity and performance when we interact with machines. For instance, people make more errors when typing if their fingers are deprived of tactile feedback. So adding the sense of touch to screens will be more than just an aesthetic addition.

“Sometimes we need to remember what we’ve forgotten,” says Sile O’Modhrain, a tangible interfaces researcher at the University of Michigan in Ann Arbor. “Before computers came along, everybody’s world was very tactile. All of the things we used were mechanical.”

Indeed, the history of personal computers up until the early 21st century may strike future generations as a period when we were forced into a sort of digital sensory deprivation. When the virtual world can be poked, pushed and fiddled with in all its myriad textures and hefts, it may be difficult to imagine how we lived so long without feeling it.