OmniTouch, a wearable projection system developed by researchers at Microsoft Research and Carnegie Mellon University, lets you turn pads of paper, walls, or even your own hands, arms, and legs into graphical, interactive surfaces.
OmniTouch uses a depth-sensing camera, similar to the Microsoft Kinect, to track your fingers on everyday surfaces. You control interactive applications by tapping or dragging your fingers. The projector can superimpose keyboards, keypads, and other controls onto any surface, automatically adjusting for the surface’s shape and orientation to minimize distortion of the projected images.
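The core sensing idea described above can be sketched in a few lines: a fingertip counts as "touching" when its depth reading nearly matches the depth of the surrounding surface, and as "hovering" when it is measurably closer to the camera. A minimal illustration, with the window size, threshold, and depth values chosen purely for demonstration (they are not the parameters of the actual OmniTouch system):

```python
import numpy as np

def is_touch(depth_map, tip_xy, window=15, thresh_mm=10.0):
    """Classify a fingertip as touching when its depth is within
    thresh_mm of the surrounding surface depth.

    The patch size and 10 mm threshold are illustrative guesses,
    not values from the OmniTouch system.
    """
    x, y = tip_xy
    tip_d = depth_map[y, x]
    # Estimate the surface depth from a patch around the fingertip;
    # the median is robust to the finger pixels inside the patch.
    patch = depth_map[max(0, y - window):y + window + 1,
                      max(0, x - window):x + window + 1]
    surface_d = np.median(patch)
    return (surface_d - tip_d) < thresh_mm

# Synthetic depth frame: a flat surface 500 mm from the camera,
# with a 5x5-pixel "finger" blob closer to the camera.
frame = np.full((100, 100), 500.0)
frame[48:53, 48:53] = 495.0            # finger 5 mm above the surface
touching = is_touch(frame, (50, 50))   # finger is close enough: touch

frame[48:53, 48:53] = 470.0            # finger 30 mm above the surface
hovering = is_touch(frame, (50, 50))   # too far from the surface: hover
```

The same depth comparison, applied frame by frame, distinguishes a tap (brief touch) from a drag (sustained touch while the fingertip moves).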
You can use the palm of your hand as a phone keypad, or as a tablet for jotting down brief notes. Maps projected onto a wall can be panned and zoomed with the same finger motions that work with a conventional multitouch screen.
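Those pan and zoom motions follow the standard two-finger multitouch convention: the midpoint of the two fingertips drives panning, and the change in their spacing drives zooming. A small sketch of that mapping (the function and its inputs are illustrative, not code from the OmniTouch paper):

```python
import math

def pan_and_zoom(prev_pts, cur_pts):
    """Derive pan and zoom from two tracked fingertips.

    prev_pts / cur_pts are [(x, y), (x, y)] pixel positions in the
    previous and current frames. This is the conventional multitouch
    mapping, not the OmniTouch implementation itself.
    """
    (x0, y0), (x1, y1) = prev_pts
    (u0, v0), (u1, v1) = cur_pts
    # Pan: how far the midpoint between the two fingers moved.
    pan = ((u0 + u1) / 2 - (x0 + x1) / 2,
           (v0 + v1) / 2 - (y0 + y1) / 2)
    # Zoom: ratio of current finger spacing to previous spacing.
    zoom = math.dist(cur_pts[0], cur_pts[1]) / math.dist(prev_pts[0], prev_pts[1])
    return pan, zoom

# Fingers move from (0,0)/(10,0) to (5,5)/(25,5):
# midpoint shifts by (10, 5) and spacing doubles.
pan, zoom = pan_and_zoom([(0, 0), (10, 0)], [(5, 5), (25, 5)])
# pan == (10.0, 5.0), zoom == 2.0
```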
“It’s conceivable that anything you can do on today’s mobile devices, you will be able to do on your hand using OmniTouch,” said Chris Harrison, a Ph.D. student in Carnegie Mellon’s Human-Computer Interaction Institute.
The OmniTouch prototype combines a short-range depth camera with a laser pico-projector and is worn on the shoulder. But Harrison said the device ultimately could be the size of a deck of cards, or even a matchbox, so that it could fit in a pocket, be easily wearable, or be integrated into future handheld devices.
Harrison previously worked with Microsoft Research to develop Skinput, a technology that used bioacoustic sensors to detect finger taps on a person’s hands or forearm to control smartphones or other compact computing devices.
Harrison was an intern at Microsoft Research when he developed OmniTouch in collaboration with Microsoft Research’s Hrvoje Benko and Andrew D. Wilson. Harrison will describe the technology on Wednesday (Oct. 19) at the Association for Computing Machinery’s Symposium on User Interface Software and Technology (UIST) in Santa Barbara, Calif.
Courtesy: NewScientist.com