Japanese IT company NEC is touting a new user interface technology that could solve one of the major challenges of gesture control – how to see what you’re doing when you’re interacting with thin air.
The ‘Ambient Interface by Real World Interaction’ system, which debuted at Mobile World Congress (MWC) in February, combines a motion-sensing camera with a compact image projector. At MWC it was used to project an image being transferred between multiple devices (see video below), but NEC says it can also project images for input, such as a keyboard.
The camera senses in three dimensions, so it could conceivably respond to a user pressing down on the projected image.
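NEC hasn’t published details of how its system interprets a ‘press’, but the general idea behind depth-based touch detection can be sketched simply: calibrate the distance to the projection surface, then flag any pixel where an object sits within a few millimetres of it. The function, threshold, and depth values below are purely illustrative assumptions, not NEC’s implementation.

```python
import numpy as np

# Illustrative sketch only: assumes a depth camera returning per-pixel
# distances in millimetres. Threshold and names are hypothetical.
TOUCH_THRESHOLD_MM = 10  # count a fingertip as "touching" within 10 mm


def detect_press(surface_depth: np.ndarray, frame_depth: np.ndarray) -> np.ndarray:
    """Return (row, col) pixels where an object hovers within
    TOUCH_THRESHOLD_MM of the calibrated projection surface."""
    # Height of any object above the surface, per pixel.
    height = surface_depth - frame_depth
    # A press: something is present (height > 0) but close to the surface.
    pressed = (height > 0) & (height <= TOUCH_THRESHOLD_MM)
    return np.argwhere(pressed)


# Example: a flat surface 800 mm from the camera, with a fingertip
# 5 mm above it at pixel (2, 3), and a hand hovering 200 mm up at (0, 0).
surface = np.full((5, 5), 800.0)
frame = surface.copy()
frame[2, 3] = 795.0   # near the surface: registers as a press
frame[0, 0] = 600.0   # hovering well above: ignored
print(detect_press(surface, frame))  # → [[2 3]]
```

A real system would also need to segment the hand, track fingertips over time, and debounce noisy depth readings, but the surface-distance comparison is the core of the idea.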
Images can be projected anywhere they are needed, such as onto a desk or even a user’s hand.
Initial applications are in interactive digital signage, although the technology could also serve as an interface for tablets and smartphones.
Check out the demo video from MWC and this extra info, then post your thoughts in the comments – do you think this technology could improve the gesture interface space?