Leap Motion Hand and Stylus Tracking for Calibration and Interaction within Optical See-Through Augmented Reality
Kenneth R. Moser, Sujan Anreddy, and J. Edward Swan II. Leap Motion Hand and Stylus Tracking for Calibration and Interaction within Optical See-Through Augmented Reality. In Research Demonstrations, IEEE International Conference on Virtual Reality (IEEE VR 2016), Mar 2016.
https://youtu.be/FCblJACs7sQ
Abstract
Highly anticipated consumer-level optical see-through head-mounted display offerings, such as the Microsoft HoloLens and Epson Moverio Pro BT-2000, not only include the standard IMU and GPS sensors common to modern mobile devices, but also feature additional depth-sensing and hand-tracking cameras intended to support and promote the development of innovative user interaction experiences. Through this demonstration, we showcase the potential of these technologies to facilitate not only interaction, but also intuitive, user-centric calibration for optical see-through augmented reality. Additionally, our hardware configuration provides a straightforward example of combining consumer-level sensors, such as the Leap Motion controller, with existing head-mounted displays and secondary tracking devices to ease the development and deployment of immersive stereoscopic experiences. We believe that the methodologies presented in our demonstration not only illustrate the potential for ubiquitous calibration across next-generation consumer devices, but will also inspire and encourage further development efforts for optical see-through augmented reality from the community at large.
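To give a concrete sense of the kind of user-centric calibration the abstract refers to, the sketch below estimates a 3x4 display projection matrix from correspondences between tracked 3D points (for example, a Leap Motion stylus or fingertip position in the tracker frame) and 2D on-screen alignment targets, using the Direct Linear Transform as in alignment-based methods such as SPAAM. This is a minimal illustration only: the function names, the use of NumPy, and the choice of DLT are assumptions for exposition, not the authors' published implementation.

    # Minimal sketch (assumed approach): alignment-based OST-HMD calibration via DLT.
    import numpy as np

    def estimate_projection(points_3d, points_2d):
        """Estimate a 3x4 projection matrix from >= 6 correspondences between
        tracked 3D points (e.g., stylus-tip positions reported by the Leap
        Motion, in millimeters) and the 2D screen positions (in pixels) of the
        reticle the user aligned them with."""
        assert len(points_3d) == len(points_2d) >= 6
        rows = []
        for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
            # Each correspondence contributes two linear constraints on the
            # 12 entries of the projection matrix (defined up to scale).
            rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
            rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
        A = np.asarray(rows, dtype=float)
        # The solution is the right singular vector of A with the smallest
        # singular value, reshaped into a 3x4 matrix.
        _, _, Vt = np.linalg.svd(A)
        return Vt[-1].reshape(3, 4)

    def project(P, point_3d):
        """Project a tracked 3D point into screen pixels with the estimated matrix."""
        X = np.append(np.asarray(point_3d, dtype=float), 1.0)
        u, v, w = P @ X
        return u / w, v / w

In use, the 3D inputs would be gathered per eye while the user repeatedly aligns the tracked stylus tip with an on-screen crosshair; the resulting matrix then renders tracker-space content registered to the user's view.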
BibTeX
@InProceedings{IEEEVR16-lmd,
  author    = {Kenneth R. Moser and Sujan Anreddy and J. Edward {Swan~II}},
  title     = {Leap Motion Hand and Stylus Tracking for Calibration and Interaction within Optical See-Through Augmented Reality},
  booktitle = {Research Demonstrations, IEEE International Conference on Virtual Reality (IEEE VR 2016)},
  location  = {Clemson, South Carolina, USA},
  date      = {March 19--23},
  month     = {Mar},
  year      = 2016,
  abstract  = {Highly anticipated consumer-level optical see-through head-mounted display offerings, such as the Microsoft HoloLens and Epson Moverio Pro BT-2000, not only include the standard IMU and GPS sensors common to modern mobile devices, but also feature additional depth-sensing and hand-tracking cameras intended to support and promote the development of innovative user interaction experiences. Through this demonstration, we showcase the potential of these technologies to facilitate not only interaction, but also intuitive, user-centric calibration for optical see-through augmented reality. Additionally, our hardware configuration provides a straightforward example of combining consumer-level sensors, such as the Leap Motion controller, with existing head-mounted displays and secondary tracking devices to ease the development and deployment of immersive stereoscopic experiences. We believe that the methodologies presented in our demonstration not only illustrate the potential for ubiquitous calibration across next-generation consumer devices, but will also inspire and encourage further development efforts for optical see-through augmented reality from the community at large.},
  wwwnote   = {<a target="_blank" href="https://youtu.be/FCblJACs7sQ">https://youtu.be/FCblJACs7sQ</a>},
}