Authors: Sung-A Jang, Hyung-il Kim, Woontack Woo, Graham Wakefield
Abstract: In this paper, we present a new kind of wearable augmented reality (AR) 3D sculpting system called AiRSculpt, in which users can directly translate fluid finger movements in the air into expressive sculptural forms and use hand gestures to navigate the interface. In AiRSculpt, as opposed to VR-based systems, users can quickly create and manipulate 3D virtual content directly with their bare hands in a real-world setting, and can use both hands simultaneously, either in tandem or as separate tools, to sculpt and manipulate their virtual creations. Our system uses a head-mounted display and a head-mounted RGB-D camera to detect the 3D locations of hands and fingertips, and then renders virtual content calibrated to real-world coordinates.
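The abstract's final step, recovering 3D fingertip locations from a head-mounted RGB-D camera, typically relies on back-projecting a detected depth pixel through the pinhole camera model. The sketch below illustrates that standard computation; the function name and the intrinsic parameters (`fx`, `fy`, `cx`, `cy`) are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: back-project a fingertip's depth pixel into 3D
# camera-space coordinates using the standard pinhole model, as an RGB-D
# hand-tracking pipeline like AiRSculpt's would need to do. Intrinsic
# values below are made-up examples for a 640x480 sensor, not from the paper.

def backproject(u, v, depth, fx, fy, cx, cy):
    """Map pixel (u, v) with depth in metres to a 3D camera-space point."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Example: fingertip detected at pixel (320, 240) at 0.45 m depth.
point = backproject(320, 240, 0.45, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
```

The resulting camera-space point would then be transformed by the head pose to place virtual content in world coordinates, matching the calibration step the abstract describes.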
Jang SA., Kim H., Woo W., Wakefield G. (2014) AiRSculpt: A Wearable Augmented Reality 3D Sculpting System. In: Streitz N., Markopoulos P. (eds) Distributed, Ambient, and Pervasive Interactions. DAPI 2014. Lecture Notes in Computer Science, vol 8530. Springer, Cham. https://doi.org/10.1007/978-3-319-07788-8_13