Ultrasound makes for palm-based computer displays you can feel
12/04/2016
From buzzing phones to quivering console controllers, haptic feedback has become indispensable in modern computing, and developers are already wondering how it will be felt in systems of the future. Sending ultrasound waves through the back of the hand to deliver tactile sensations to the front might sound a little far-fetched, but by achieving just that, UK scientists claim to have cleared the way for computers that use our palms as advanced interactive displays.

For years now, scientists have been chipping away at the idea of using human skin as a computer display. It sounds unlikely, but with technology becoming ever more miniaturized, wearable devices gaining ground, and more of our time spent gazing at screens, it seems almost natural to use our most readily available surfaces as gateways to the digital realm.

While we're not expecting the very next Fitbit to project your calories burned onto your forearm, some promising prototypes have emerged in this area. The Skinput display system from 2010 used a bio-acoustic sensing array to translate finger taps on the palm into input commands, while the Cicret wristband concept from 2014 envisioned beaming an Android interface onto the arm and would use proximity sensors to follow finger movements.

Researchers at the University of Sussex are working to improve palm-based displays by adding tactile sensations to the mix. Importantly, they are aiming to do so without using vibrations or pins, approaches they say have hampered previous efforts because they require contact with the palm and therefore obstruct the display.

So they are looking to sneak in through the back door. Their SkinHaptics system relies on an array of ultrasound transmitters that, when applied to the back of the hand, send sensations through to the palm, which can therefore be left exposed to serve as the display.

The team says it was able to achieve this through something it calls time-reversal processing. As the ultrasound waves enter through the back of the hand, they begin as broad pulses that become progressively more focused as they travel through to the other side, converging on a specific point on the palm. The researchers liken it to water ripples working in reverse.
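For readers curious about the signal processing, the Python sketch below illustrates the focusing principle in miniature. It is an idealized toy model, not the Sussex team's method: it assumes a homogeneous medium with a textbook speed of sound and purely geometric delays, whereas real time-reversal processing measures the actual impulse responses through tissue and replays them reversed. All names and parameter values here are illustrative.

```python
import numpy as np

# Toy model of time-reversal-style focusing: fire each emitter early by its
# own propagation delay so every wavefront arrives at the focal point at the
# same instant, adding coherently there and incoherently elsewhere.
# Assumptions (not from the article): homogeneous tissue, geometric delays.

C = 1540.0    # speed of sound in soft tissue, m/s (textbook value)
F0 = 2e6      # illustrative ultrasound frequency, Hz
BURST = 2e-6  # tone-burst length, s

# 16 emitters across the back of the hand (y = 0) and a focal point
# on the palm side, roughly 20 mm of tissue away.
emitters = np.stack([np.linspace(-0.02, 0.02, 16), np.zeros(16)], axis=1)
focus = np.array([0.0, 0.02])

def delays_to(point):
    """One-way propagation delay from every emitter to `point`, in seconds."""
    return np.linalg.norm(emitters - point, axis=1) / C

# "Time reversal" step: the emitter farthest from the focus fires first.
tau = delays_to(focus)
fire_times = tau.max() - tau

def field_at(point, t):
    """Superposed pressure of all unit-amplitude tone bursts at `point`, time `t`."""
    arrival = fire_times + delays_to(point)
    active = (t >= arrival) & (t <= arrival + BURST)
    return float(np.sum(np.sin(2 * np.pi * F0 * (t - arrival)) * active))

t_peak = tau.max() + 1 / (4 * F0)  # a quarter-cycle after the fronts converge
print(f"at focus:  {field_at(focus, t_peak):.2f}")                # 16.00, coherent
print(f"5 mm away: {field_at(focus + [0.005, 0], t_peak):.2f}")   # small, incoherent
```

Running the sketch shows all 16 bursts arriving in phase at the focal point (summing to the full array amplitude) while a point just 5 mm away receives a weak, scrambled signal, which is the essence of the "ripples in reverse" picture the researchers describe.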

"Wearables are already big business and will only get bigger," says Professor Sriram Subramanian, who led the research. "But as we wear technology more, it gets smaller and we look at it less, and therefore multisensory capabilities become much more important. If you imagine you are on your bike and want to change the volume control on your smartwatch, the interaction space on the watch is very small. So companies are looking at how to extend this space to the hand of the user. What we offer people is the ability to feel their actions when they are interacting with the hand."

