What if magic could serve as an effective metaphor for performing data visualization and analysis using speech and gestures while mobile and on the go? In this paper, we introduce WIZUALIZATION, a visual analytics system for eXtended Reality (XR) that enables an analyst to author and interact with visualizations using such a magic system through gestures, speech commands, and touch interaction. Wizualization is a rendering system for current XR headsets that comprises several components: a cross-device (or ARCANE FOCUSES) infrastructure for signalling and view control (WEAVE), a code notebook (SPELLBOOK), and a grammar of graphics for XR (OPTOMANCY). The system offers users three modes of input: gestures, spoken commands, and materials. We demonstrate Wizualization and its components using a motivating scenario on collaborative data analysis of pandemic data across time and space.
DOI: http://dx.doi.org/10.1109/TVCG.2023.3326580