Carnegie Mellon University (CMU) researchers have developed a wearable system that lets users turn almost any surface into a touch interface.
The kit pairs a laser “pico” projector, which delivers the interface image, with a Kinect-like depth-sensing camera; both sit on the user’s shoulder, and the camera detects interactions with a surface, which could be a table, a wall or a hand.
The system, OmniTouch, which was jointly developed with Microsoft Research, will be unveiled by the CMU natural interaction research group at the ACM Symposium on User Interface Software and Technology (UIST) this week.
The researchers chose fingers as the primary input tool and first tackled how the system would detect a click.
“In this case, we're detecting proximity at a very fine level,” researcher Hrvoje Benko said.
“In practice, a finger is seen as 'clicked' when its hover distance drops to one centimeter or less above a surface, and we even manage to maintain the clicked state for dragging operations.”
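A minimal sketch of how such proximity-based click detection might work is below, assuming the depth camera yields a per-frame hover-distance estimate for a tracked finger; the class, thresholds and event names are illustrative assumptions, not details from the OmniTouch paper:

```python
# Sketch of depth-based "click" detection with hysteresis, so a finger
# stays "clicked" while dragging. All thresholds are assumed values.

CLICK_DOWN_CM = 1.0  # finger counts as clicked at or below ~1 cm hover
CLICK_UP_CM = 2.0    # release only above a higher threshold (hysteresis)

class FingerClickTracker:
    def __init__(self):
        self.clicked = False

    def update(self, hover_distance_cm: float) -> str:
        """Return the event for this frame: 'down', 'drag', 'up' or 'hover'."""
        if not self.clicked and hover_distance_cm <= CLICK_DOWN_CM:
            self.clicked = True
            return "down"
        if self.clicked and hover_distance_cm <= CLICK_UP_CM:
            return "drag"  # still pressed: supports dragging operations
        if self.clicked:
            self.clicked = False
            return "up"
        return "hover"

# Example: a finger approaching, pressing, dragging, then lifting off.
tracker = FingerClickTracker()
for distance in [3.0, 1.5, 0.8, 0.9, 1.4, 2.5]:
    print(distance, tracker.update(distance))
```

The two-threshold hysteresis keeps a noisy depth reading from flickering between clicked and released mid-drag, which is one plausible way to “maintain the clicked state for dragging operations.”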
Users can draw out the touch interface’s boundaries on a chosen surface, and the system tracks that surface so the projected interface follows it when it moves.
“You can tap on your hand or drag your interface out to specify the top left and bottom right border. All this stems from the main idea that if everything around you is a potential interface, then the first action has to be defining an interface area,” explained Benko.
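The corner-dragging idea reduces to building a rectangle in surface coordinates from two taps. The sketch below shows one way this could look, assuming finger positions come from the depth camera’s tracking; the `Rect` type and function names are hypothetical, for illustration only:

```python
# Sketch: defining an interface region from two taps, the first marking
# the top-left corner and the second the bottom-right corner.

from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        """True if a touch at (x, y) falls inside the defined area."""
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def define_interface(first_tap: tuple[float, float],
                     second_tap: tuple[float, float]) -> Rect:
    """Build the interface area from two corner taps, in surface coordinates."""
    (x1, y1), (x2, y2) = first_tap, second_tap
    # Normalise so the rectangle is valid even if taps arrive out of order.
    return Rect(min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

ui = define_interface((10.0, 5.0), (30.0, 20.0))
print(ui.contains(15.0, 10.0))  # True: this touch lands inside the area
```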
The prototype is not as small as the researchers had hoped, but they believe it can be miniaturised in the future.