Interacting with data visualizations without an instrument or touch surface is typically characterized by the use of mid-air hand gestures. While mid-air expressions can be quite intuitive for interacting with digital content at a distance, they often lack precision and require a different way of expressing users' data-related intentions. In this work, we aim to identify new designs for mid-air hand gesture manipulations that can facilitate instrument-free, touch-free, and embedded interactions with visualizations, while exploiting the three-dimensional (3D) interaction space that mid-air gestures afford. We explore mid-air hand gestures for data visualization by searching for natural means of interacting with content. We conduct three studies (an Elicitation Study, a User Study, and an Expert Study) to provide insight into users' mental models, explore the design space, and suggest considerations for future mid-air hand gesture design. In addition to forming strong associations with physical manipulations, we found that mid-air hand gestures can: promote space-multiplexed interaction, which allows for a greater degree of expression; play a functional role in visual cognition and comprehension; and enhance creativity and engagement. We further highlight the challenges that designers in this field may face, helping set the stage for developing effective gestures for a wide range of touchless interactions with visualizations.
DOI: http://dx.doi.org/10.1109/TVCG.2023.3332647