
Hands up for touch-free control

06 March 2014

Scientists at SINTEF, the largest independent research organisation in Scandinavia, are working to develop interactions with mobile devices that do not require touching the display. The basic technology has been around for several years and was used initially for scrolling through document pages; the researchers are now working on selecting and moving objects, or saying stop by raising a hand.

Originally, the focus was on touch-free interaction with stationary installations such as PCs and laptops. Now the focus has shifted to mobile phones and tablets. The technology could also be very useful in shop-floor automation environments.

The researchers see this as an extended service, not something that will replace the functions of a touch-display. "Imagine that you're baking bread. Your fingers are sticky and you want to check the tablet to see how much flour it said in the recipe. In situations like this it would be great not to have to touch the screen," says Petter Risholm at SINTEF ICT. "Or you're working in your workshop, your hands are oily, and you want to check a phone number on your mobile. Many people also suffer from 'mouse arm', and it would be of great help to them to interact with the screen using more expansive hand and arm movements."



Many companies around the world are focusing on the field of touch-free interaction, and there are many competing technologies. A field in which touch-free interplay is widely used is the games industry, and Risholm mentions Microsoft's Kinect sensor as the best known example. It can recognise body postures and expansive arm movements. However, touch-free interaction is not yet widely used elsewhere or for mobile devices, although some systems have been launched onto the market. For example, it's possible on Samsung's newest mobile phone to scroll through an e-book by moving your fingers in front of the integrated infra-red sensor.

But according to the researchers, the system has limited options and a limited field of vision. "To use it, you have to perform the command directly over the sensor, which tends to be at the top of the PC/mobile/iPad. If you move your fingers or hand away just a little, nothing happens," says Risholm.

The Norwegian researchers have therefore chosen to focus on ultrasound. This technology enables the whole screen to be used – which means a larger working surface. "The system picks up on what you're doing both in front of and beside the screen. This creates a large interaction area, and means that it's also possible to control the device without having to screen off the display. The mobile or tablet can also detect what you're doing as long as you move between 2 and 30cm from the screen," says Risholm.

No room for misunderstanding

Much of the work focuses on making the system so robust that the command you make will always be understood with no room for misunderstanding. This becomes particularly challenging as researchers try to expand the vocabulary to include gestures for more advanced commands, such as drag and drop.

The technology for future touch-free interaction must be cheap and power-efficient. The system must be small and uncomplicated, so that it can be integrated into mobile devices, and it must have sufficient processing capacity for the interaction to be relatively detailed.

"Cameras and infra-red technologies are being used on a global scale, but we have seen that our ultrasound technology has properties that fulfil many of these criteria," says Tom Kavli of commercial partner Elliptic Labs. The first project for Elliptic Labs ran from 2008–2012. It was here that much of the basic technology for touch-free interaction with PCs or desktop devices was developed. A new project, called Multigest (2012–2016), is now up and running, and is extending this work to include mobile phones and tablets. Both projects are BIA projects (User-driven Research-based Innovation) funded by the Research Council of Norway.

"The main advantage is that our technology enables a larger space from which a device can detect and recognise the user's gestures. You are not limited to the top of the device, but can operate in front of the whole screen, which is also best in terms of ergonomics."

In October 2013, Elliptic Labs launched touchless gesturing capability on Android smartphones using ultrasound. "Touchless gesturing in a natural way all around the screen of a smart phone or tablet, using the movements you use in daily life, gives smartphone manufacturers a way to easily and cost effectively include consumer-friendly touchless gesturing in their phones,” said Laila Danielsen, CEO, Elliptic Labs. "Our technology is also great for playing games on smartphones and running applications that require high relative accuracy and speed. It uses little power and has high resolution.”

By equipping handsets with tiny microphones, transducers, and proprietary software, Android smartphone OEMs can now use the ultrasound spectrum (above 20kHz) to enable touchless gesturing. Sound waves sent from the device interact with a user's hand, and it's this interaction that moves objects on a device screen. Accurate time-of-flight measurement and distributed sensing (capturing movement from multiple angles) enable true 3D interaction above, below and to the side of the screen at 180 degrees.
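The two principles mentioned here can be sketched in a few lines of code. This is an illustrative simplification, not Elliptic Labs' proprietary implementation: `tof_distance` converts a round-trip echo time into a one-way distance using the speed of sound, and `locate_2d` shows how distance readings from a hypothetical pair of microphones at known positions can be intersected to recover a 2D hand position (a real system would use more sensors for full 3D).

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def tof_distance(echo_time_s):
    """Convert a round-trip time of flight into a one-way
    distance to the reflecting hand."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

def locate_2d(mic_a, mic_b, dist_a, dist_b):
    """Estimate the hand's (x, y) position from the distances
    measured at two microphones (hypothetical layout: both mics
    on the y = 0 edge of the screen, hand on the positive-y side).
    This is the standard circle-circle intersection."""
    ax, ay = mic_a
    bx, by = mic_b
    d = math.hypot(bx - ax, by - ay)  # baseline between the mics
    # Distance from mic_a, along the baseline, to the foot of the
    # perpendicular through the intersection points.
    a = (dist_a**2 - dist_b**2 + d**2) / (2 * d)
    h_sq = dist_a**2 - a**2
    if h_sq < 0:
        raise ValueError("inconsistent measurements: circles do not intersect")
    h = math.sqrt(h_sq)
    # Foot of the perpendicular on the baseline.
    mx = ax + a * (bx - ax) / d
    my = ay + a * (by - ay) / d
    # Of the two mirror-image solutions, keep the one in front of
    # the screen (positive-y side for this layout).
    x = mx - h * (by - ay) / d
    y = my + h * (bx - ax) / d
    return (x, y)
```

With more than two sensors the same idea over-determines the position, which is what allows the "capturing movement from multiple angles" described above to improve accuracy.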

Kavli says that the field is developing at a furious pace, and that new, exciting applications will open up as the technology continues to develop. "We’ve had an extremely good response internationally from all mobile phone manufacturers, and have just been awarded an international innovation award in Japan," he says. Now the company is working to commercialise the technology, and estimates that it will be rolled out in a few years’ time.

* The abbreviation SINTEF stands for The Foundation for Scientific and Industrial Research at the Norwegian Institute of Technology (NTH). Every year, SINTEF supports the development of some 2000 Norwegian and overseas companies through its research and development activity.

Key Points

Widely used in the games industry, touch-free interaction is not yet significantly used elsewhere or for mobile devices

Many companies are focusing on the field of touch-free interaction, and there are many competing technologies

Ultrasound technology enables the whole screen to be used, which means a larger working surface