Sean Gustafson unlocks his iPhone by swishing a finger across its screen and pecking out a four-digit PIN on its keypad. It's unremarkable save for one small thing: there is no phone in his hand. Instead, he's pressing invisible "buttons" on his palm – to operate an imaginary cellphone. And, astonishingly, it works.
Imagine you cannot find or use your phone when it starts ringing. Perhaps it's fallen into the depths of your sofa, or your hands are wet from washing up or greasy from baking. To decline the call and send it to your messaging service, you press the area of your palm corresponding to the position of the relevant button, as if the phone were in your hand. Or you could press the "buttons" to answer the call and turn on the speakerphone.
The idea is certainly strange, but Gustafson and his colleagues Patrick Baudisch and Christian Holz at the Hasso Plattner Institute at Potsdam University in Germany think there is a gap in the market for phones and TV remotes like this that don't actually exist.
For it to work, they reason, you'd need two things: people who know precisely where the apps are on their physical phone, and a technology that can sense where they are pressing on their hand, so a computer can respond and send commands to your phone – wherever it is.
From iPhone to iPalm
To find out how well people know their modern touchscreen phones, the Potsdam trio recruited 12 volunteers from among the iPhone users they spotted in their cafeteria and tested how well they knew the position of their favoured apps without their phone. "We found 68 per cent of iPhone users can locate the majority of their home screen apps on their hand. This means that iPhone use inadvertently prepares users for the use of an imaginary version," says Baudisch.
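If users really do remember their home screen as a grid, the imaginary phone only needs to map a touch point on the palm to a grid slot. The sketch below is purely illustrative – the 4 x 5 layout matches a classic iPhone home screen, but the normalised coordinates and function names are assumptions, not the researchers' actual system.

```python
# Toy sketch: map a touch at normalised palm coordinates to an
# imagined home-screen slot. The 4x5 grid mirrors a classic iPhone
# layout; everything else here is an illustrative assumption.

GRID_COLS, GRID_ROWS = 4, 5

def palm_point_to_slot(x: float, y: float) -> tuple:
    """Map a touch at normalised palm coordinates (0..1, 0..1)
    to a (column, row) slot on the remembered home screen."""
    col = min(int(x * GRID_COLS), GRID_COLS - 1)
    row = min(int(y * GRID_ROWS), GRID_ROWS - 1)
    return col, row

print(palm_point_to_slot(0.1, 0.1))   # (0, 0): top-left app
print(palm_point_to_slot(0.9, 0.95))  # (3, 4): bottom-right app
```

The 68 per cent figure suggests most users could hit the right slot from memory alone, which is all such a lookup needs.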
Having established a reasonable chance of successfully finding an app's position on someone's palm, they then decided to use "depth cameras" – similar to those at the heart of Microsoft's Kinect motion-sensing gaming system – to detect where someone is pressing on their palm.
The depth camera they used in their tests is a "time-of-flight" device that flashes an invisible infrared pattern onto the scene and uses ultrafast receiver circuitry to time how long the light bathing different parts of the scene takes to return to a sensor. That way, it knows how far every object in the scene is from the camera – so when a user's finger presses on their palm, it registers where and when it does so. The signal is sent to a computer, which processes it and then sends the relevant command to your cellphone.
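The time-of-flight principle above reduces to simple arithmetic: light travels out and back, so distance is half the round-trip time multiplied by the speed of light, and a "press" can be inferred when the fingertip's depth matches the palm's. This is a minimal sketch of that idea; the function names, the 1 cm tolerance and the single-point comparison are illustrative assumptions, not the team's actual processing pipeline.

```python
# Minimal sketch of time-of-flight ranging and a naive palm-touch
# check. Names and thresholds are illustrative assumptions.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(delta_t_seconds: float) -> float:
    """Convert the measured round-trip time of the infrared flash
    into a distance: the light travels out and back, so halve it."""
    return SPEED_OF_LIGHT * delta_t_seconds / 2.0

def is_touching(finger_depth_m: float, palm_depth_m: float,
                tolerance_m: float = 0.01) -> bool:
    """Treat the fingertip as 'pressing' the palm when the two
    depth readings agree to within a small tolerance (1 cm here,
    an arbitrary choice for illustration)."""
    return abs(finger_depth_m - palm_depth_m) <= tolerance_m

# A round trip of ~3.34 nanoseconds corresponds to roughly half a metre.
print(distance_from_round_trip(3.34e-9))  # about 0.5 m
print(is_touching(0.502, 0.500))          # True: fingertip on palm
```

The tiny timescales involved are why the article mentions "ultrafast receiver circuitry": resolving centimetres requires timing light to within tens of picoseconds.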
In their tests, the depth camera was a clunky head-mounted device. "But ultimately, we envision the camera becoming so small that it integrates into clothing, such as the button of a shirt, a brooch, or a pendant. So people would not even notice if someone carries an imaginary phone," Baudisch told New Scientist.
"We envision that users will initially use imaginary phones as a shortcut to operate the physical phones in their pockets. As users get more experienced, it might even become possible to leave the device at home and spend the day 'all-imaginary'."
Answering calls on the phone would still require the physical device – but it would be possible to access apps and forward calls to voicemail with the imaginary version.
It's not all about phones, however: Gustafson is now working out how a TV remote control could be replaced by an imaginary zapper. The team hope to present their work at a conference on user interfaces later this year.
"For quite simple interactions this is probably going to work well," says Nick Bryan-Kinns of Queen Mary, University of London. "But for more complicated functions it's difficult to know how you'd do it without using audio feedback from the device, telling you which function you've activated."