Users within a virtual environment often need support in designing the environment around them, which requires finding relevant content while remaining immersed. We focus on familiar sketch-based interaction to support the process of placing content and specifically investigate how interactions from a tablet or desktop translate into the virtual environment. To understand sketching interaction within a virtual environment, we compare different methods of sketch interaction: 3D mid-air sketching, 2D sketching on a virtual tablet, 2D sketching on a fixed virtual whiteboard, and 2D sketching on a real tablet. The user remains immersed within the environment, queries a database containing detailed 3D models, and places the retrieved models into the virtual environment. Our results show that 3D mid-air sketching is considered the more intuitive method for searching a collection of models, while the addition of physical devices creates confusion due to the complications of including them within a virtual environment. Although we pose our work as a retrieval problem for 3D models of chairs, our results extend to other sketching tasks for virtual environments.
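The abstract does not specify the retrieval back end, so the following is only an illustrative sketch, assuming each chair model in the database has a precomputed feature descriptor and that the user's sketch is encoded into the same feature space; the encoder, descriptor size, and cosine-similarity ranking are assumptions, not the authors' method.

    import numpy as np

    # Assumed setup: every 3D chair model is represented by a precomputed
    # descriptor; a sketch encoder (not specified in the paper) maps the
    # user's sketch into the same descriptor space.

    def cosine_similarity(query_vec, db_matrix):
        """Cosine similarity between one query descriptor and all database descriptors."""
        query_norm = query_vec / np.linalg.norm(query_vec)
        db_norms = db_matrix / np.linalg.norm(db_matrix, axis=1, keepdims=True)
        return db_norms @ query_norm

    def retrieve_models(sketch_descriptor, model_descriptors, model_ids, top_k=5):
        """Return the ids of the top_k models most similar to the sketch descriptor."""
        scores = cosine_similarity(sketch_descriptor, model_descriptors)
        best = np.argsort(scores)[::-1][:top_k]
        return [model_ids[i] for i in best]

    # Toy usage with random descriptors standing in for a real sketch encoder.
    rng = np.random.default_rng(0)
    model_ids = [f"chair_{i:03d}" for i in range(100)]
    model_descriptors = rng.normal(size=(100, 128))
    sketch_descriptor = rng.normal(size=128)
    print(retrieve_models(sketch_descriptor, model_descriptors, model_ids))

The returned model ids would then drive placement of the corresponding 3D assets in the scene, mirroring the query-then-place loop described in the abstract.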
Mixing realities for sketch retrieval in Virtual Reality
2019-11-01
Presented at: 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry (VRCAI), Brisbane, Australia. (2019)
Image
Electronic Resource
English
DDC: 629
Explorative Study on Asymmetric Sketch Interactions for Object Retrieval in Virtual Reality
BASE | 2022
Sketch Based Image Retrieval using Transfer Learning
IEEE | 2019
A comparison of methods for sketch-based 3D shape retrieval
British Library Online Contents | 2014