Version 2 2024-06-04, 07:14
Version 1 2019-01-07, 12:17
journal contribution
Posted on 2018-01-01, 00:00. Authored by Jules Moloney, Branka Spehar, Anastasia Globa, Rui Wang
Using the theory of affordance from perceptual psychology, and drawing on the visual data mining and immersive analytics literature, we develop a position for the multi-sensory representation of big data using virtual reality (VR). Although it may seem counterintuitive, information-dense virtual environments are theoretically easier to process than simplified graphic encodings, provided they align with human ecological perception of natural environments. Potentially, VR affords insight into patterns and anomalies through the dynamic experience of data representations within interactive, kinaesthetic, audio-visual virtual environments. To this end, we articulate principles that can inform the development of VR applications for immersive analytics: a mimetic approach to data mapping that aligns spatial, aural and kinaesthetic attributes with abstractions of natural environments, layered with constructed features that complement natural structures; the use of cross-modal sensory mapping; a focus on intermediate levels of contrast; and the adaptation of naturally occurring distribution patterns for the granularity and distribution of data. While it appears problematic to translate visual data mining techniques directly to VR, the ecological approach to human perception discussed in this article provides a new framework for big data visualization researchers to consider.
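As a purely illustrative sketch of the cross-modal and intermediate-contrast principles named above (this code is not from the article, and all function names, ranges and parameters are hypothetical), the following Python fragment redundantly maps a normalised data value onto visual, aural and spatial channels, and lays points out in clusters rather than uniformly, loosely echoing naturally occurring distribution patterns.

```python
"""Hypothetical sketch of cross-modal encoding for immersive analytics.

Not from the article: channel choices, ranges and names are assumptions
made only to illustrate the stated principles.
"""
import random


def cross_modal_encode(value: float) -> dict:
    """Map a value in [0, 1] onto several sensory channels at once."""
    v = min(max(value, 0.0), 1.0)

    # Visual channel: luminance kept in an intermediate band (0.3-0.7)
    # rather than spanning the full 0-1 range.
    luminance = 0.3 + 0.4 * v

    # Aural channel: pitch interpolated over roughly two octaves (220-880 Hz).
    pitch_hz = 220.0 * (2.0 ** (2.0 * v))

    # Spatial channel: height above the ground plane (metres), so the same
    # value can also be explored kinaesthetically by moving through the scene.
    height_m = 0.5 + 2.0 * v

    return {"luminance": luminance, "pitch_hz": pitch_hz, "height_m": height_m}


def clustered_positions(n: int, clusters: int = 5, spread: float = 2.0):
    """Place n points using a clustered rather than uniform layout."""
    centres = [(random.uniform(-20.0, 20.0), random.uniform(-20.0, 20.0))
               for _ in range(clusters)]
    return [(random.gauss(cx, spread), random.gauss(cy, spread))
            for cx, cy in (random.choice(centres) for _ in range(n))]


if __name__ == "__main__":
    for sample in (0.0, 0.5, 1.0):
        print(sample, cross_modal_encode(sample))
    print(clustered_positions(3))
```

The redundant mapping in this sketch is one possible reading of cross-modal sensory mapping: the same variable is perceivable by eye, by ear and by bodily movement, rather than being carried by a single graphic channel.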