Immersive Systems

SoS Newsletter


Immersive systems, commonly known as "virtual reality", are used for a variety of functions such as gaming, rehabilitation, and training. Because these systems blend the virtual with the physical, they carry cybersecurity implications: an attack may cross from the virtual environment into the physical systems it connects to. The research cited here was presented between January and August of 2014.

  • Gebhardt, S.; Pick, S.; Oster, T.; Hentschel, B.; Kuhlen, T., "An Evaluation Of A Smart-Phone-Based Menu System For Immersive Virtual Environments," 3D User Interfaces (3DUI), 2014 IEEE Symposium on, pp. 31-34, 29-30 March 2014. doi: 10.1109/3DUI.2014.6798837. System control is a crucial task for many virtual reality applications and can be realized in a broad variety of ways, of which graphical menus are the most common. These are often implemented as part of the virtual environment, but can also be displayed on mobile devices. Many systems and studies have been published on using mobile devices such as personal digital assistants (PDAs) to realize such menu systems. However, most of these systems were proposed well before smartphones existed and evolved into everyday companions for many people. It is therefore worthwhile to evaluate the applicability of modern smartphones as carriers of menu systems for immersive virtual environments. To do so, we implemented a platform-independent menu system for smartphones and evaluated it in two ways. First, we performed an expert review to identify potential design flaws and to test the applicability of the approach for demonstrations of VR applications from a demonstrator's point of view. Second, we conducted a user study with 21 participants to test user acceptance of the menu system. The results of the two studies were contradictory: while experts appreciated the system very much, user acceptance was lower than expected. From these results we draw conclusions on how smartphones should be used to realize system control in virtual environments and identify connecting factors for future research on the topic.
    Keywords: human computer interaction; mobile computing; smart phones; user interfaces; virtual reality; VR applications; immersive virtual environments; platform-independent menu system; smart-phone-based menu system; system control; user acceptance; virtual reality; Control systems; Mobile communication; Navigation; Smart phones; Usability; Virtual environments (ID#:14-2939)
  • Hansen, N.T.; Hald, K.; Stenholt, R., "Poster: Amplitude Test For Input Devices For System Control In Immersive Virtual Environment," 3D User Interfaces (3DUI), 2014 IEEE Symposium on, pp. 137-138, 29-30 March 2014. doi: 10.1109/3DUI.2014.6798858. In this study, the amplitudes best suited to comparing four input devices are examined in the context of a pointer-based system control interface for immersive virtual environments. The interfaces are based on a pen and tablet, a touch tablet, hand tracking using Kinect, and a Wii Nunchuk analog stick. This preliminary study lays the groundwork for comparing the interfaces, with the goal of evaluating them in the context of using virtual environments in a class lecture. Five amplitudes are tested for each of the four interfaces by having test participants mark menu elements in an eight-part radial menu using each combination of amplitude and interface. The amplitudes to be used in future experiments were identified. Notably, the movement times for the interfaces do not fit the predictions of Fitts' law.
    Keywords: interactive devices; notebook computers; user interfaces; virtual reality; Kinect; Wii Nunchuk analog stick; amplitude test; class lecture; hand-tracking; immersive virtual environments; input devices; menu elements; pen; pointer-based system control interface; touch tablet; Context; Control systems; Educational institutions; Indexes; Layout; Three-dimensional displays; Virtual environments (ID#:14-2940)
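The Fitts'-law prediction that the abstract above tests against can be sketched briefly. This is a generic illustration of the Shannon formulation of Fitts' law; the coefficients `a` and `b` are placeholder values, not figures from the cited study.

```python
import math

def fitts_movement_time(amplitude, width, a=0.1, b=0.15):
    """Shannon formulation of Fitts' law: MT = a + b * log2(A/W + 1).

    a and b are device-specific regression coefficients fit from data;
    the defaults here are illustrative placeholders only."""
    index_of_difficulty = math.log2(amplitude / width + 1)
    return a + b * index_of_difficulty

# Predicted movement times grow with amplitude for a fixed target width:
times = [fitts_movement_time(amp, width=2.0) for amp in (4, 8, 16, 32)]
assert times == sorted(times)  # monotonically increasing in amplitude
```

A device whose measured times do not follow this log-linear trend, as reported above, would show systematic residuals when regressing time against the index of difficulty.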
  • Khan, N.M.; Kyan, M.; Ling Guan, "ImmerVol: An Immersive Volume Visualization System," Computational Intelligence and Virtual Environments for Measurement Systems and Applications (CIVEMSA), 2014 IEEE International Conference on, pp. 24-29, 5-7 May 2014. doi: 10.1109/CIVEMSA.2014.6841433. Volume visualization is a popular technique for analyzing 3D datasets, especially in the medical domain. An immersive visual environment provides easier navigation through the rendered dataset. However, visualization is only one part of the problem. Finding an appropriate Transfer Function (TF) for mapping color and opacity values in Direct Volume Rendering (DVR) is difficult. This paper combines the benefits of the CAVE Automatic Virtual Environment with a novel approach towards TF generation for DVR, where the traditional low-level color and opacity parameter manipulations are eliminated. The TF generation process is hidden behind a Spherical Self Organizing Map (SSOM). The user interacts with the visual form of the SSOM lattice on a mobile device while viewing the corresponding rendering of the volume dataset in real time in the CAVE. The SSOM lattice is obtained through high-dimensional features extracted from the volume dataset. The color and opacity values of the TF are automatically generated based on the user's perception. Hence, the resulting TF can expose complex structures in the dataset within seconds, which the user can analyze easily and efficiently through complete immersion.
    Keywords: data visualisation; feature extraction; image colour analysis; medical computing; opacity; rendering (computer graphics); self-organising feature maps; transfer functions;vectors;3D datasets analysis; CAVE; DVR; ImmerVol; SSOM; automatic virtual environment; color values; direct volume rendering; feature extraction; immersive volume visualization system; medical domain; navigation; opacity values; rendered dataset; spherical self organizing map; transfer function; Data visualization; Image color analysis; Lattices; Rendering (computer graphics);Three-dimensional displays; Training; Vectors (ID#:14-2941)
  • Basu, A.; Johnsen, K., "Ubiquitous Virtual Reality 'To-Go'," Virtual Reality (VR), 2014 IEEE, pp. 161-162, 29 March-2 April 2014. doi: 10.1109/VR.2014.6802101. We propose to demonstrate a ubiquitous immersive virtual reality system that is highly scalable and accessible to a larger audience. With the advent of handheld and wearable devices, virtual reality has gained considerable popularity among the general public. We present a practical design of such a system that offers the core affordances of immersive virtual reality in a portable and untethered configuration. In addition, we have developed an extensive immersive virtual experience that engages users visually and aurally. This is an effort towards integrating VR into the space and time of user workflows.
    Keywords: notebook computers; ubiquitous computing; virtual reality; wearable computers; handheld devices; immersive virtual experience; portable configuration; ubiquitous immersive virtual reality system; ubiquitous virtual reality; untethered configuration; wearable devices; Educational institutions; Pediatrics; Positron emission tomography; Three-dimensional displays; Training; Virtual environments (ID#:14-2942)
  • Laha, B.; Bowman, D.A.; Socha, J.J., "Effects of VR System Fidelity on Analyzing Isosurface Visualization of Volume Datasets," Visualization and Computer Graphics, IEEE Transactions on, vol.20, no.4, pp. 513-522, April 2014. doi: 10.1109/TVCG.2014.20. Volume visualization is an important technique for analyzing datasets from a variety of different scientific domains. Volume data analysis is inherently difficult because volumes are three-dimensional, dense, and unfamiliar, requiring scientists to precisely control the viewpoint and to make precise spatial judgments. Researchers have proposed that more immersive (higher fidelity) VR systems might improve task performance with volume datasets, and significant results tied to different components of display fidelity have been reported. However, more information is needed to generalize these results to different task types, domains, and rendering styles. We visualized isosurfaces extracted from synchrotron microscopic computed tomography (SR-mCT) scans of beetles, in a CAVE-like display. We ran a controlled experiment evaluating the effects of three components of system fidelity (field of regard, stereoscopy, and head tracking) on a variety of abstract task categories that are applicable to various scientific domains, and also compared our results with those from our prior experiment using 3D texture-based rendering. We report many significant findings. For example, for search and spatial judgment tasks with isosurface visualization, a stereoscopic display provides better performance, but for tasks with 3D texture-based rendering, displays with higher field of regard were more effective, independent of the levels of the other display components. We also found that systems with high field of regard and head tracking improve performance in spatial judgment tasks. Our results extend existing knowledge and produce new guidelines for designing VR systems to improve the effectiveness of volume data analysis.
    Keywords: computerised tomography; data analysis; data visualisation; image texture; rendering (computer graphics); virtual reality; 3D texture-based rendering; CAVE-like display; SR-mCT scans; VR system fidelity; beetles; head tracking; isosurface visualization; synchrotron microscopic computed tomography; volume data analysis; volume datasets; volume visualization ;Abstracts; Computed tomography; Isosurfaces; Measurement; Rendering (computer graphics);Three-dimensional displays; Visualization; Immersion; micro-CT; data analysis; volume visualization; 3D visualization; CAVE; virtual environments; virtual reality (ID#:14-2943)
  • Grechkin, T.Y.; Plumert, J.M.; Kearney, J.K., "Dynamic Affordances in Embodied Interactive Systems: The Role of Display and Mode of Locomotion," Visualization and Computer Graphics, IEEE Transactions on, vol.20, no.4, pp. 596-605, April 2014. doi: 10.1109/TVCG.2014.18. We investigated how the properties of interactive virtual reality systems affect user behavior in full-body embodied interactions. Our experiment compared four interactive virtual reality systems using different display types (CAVE vs. HMD) and modes of locomotion (walking vs. joystick). Participants performed a perceptual-motor coordination task, in which they had to choose among a series of opportunities to pass through a gate that cycled open and closed and then board a moving train. Mode of locomotion, but not type of display, affected how participants chose opportunities for action. Both mode of locomotion and display affected performance when participants acted on their choices. We conclude that the technological properties of a virtual reality system (both display and mode of locomotion) significantly affect the opportunities for action available in the environment (affordances), and we discuss implications for the design and practical applications of immersive interactive systems.
    Keywords: gait analysis; helmet mounted displays; virtual reality; CAVE; HMD; embodied interactive systems; full-body embodied interactions; head-mounted display; interactive virtual reality systems ;locomotion mode; perceptual-motor coordination task; user behavior; Interactive systems; Legged locomotion; Logic gates; Psychology; Tracking; Virtual environments; Virtual reality; embodied interaction; affordances; perceptual-motor coordination; display type; interaction technique; mode of locomotion (ID#:14-2944)
  • Masiero, B.; Vorlander, M., "A Framework for the Calculation of Dynamic Crosstalk Cancellation Filters," Audio, Speech, and Language Processing, IEEE/ACM Transactions on, vol.22, no.9, pp. 1345-1354, Sept. 2014. doi: 10.1109/TASLP.2014.2329184. Dynamic crosstalk cancellation (CTC) systems commonly find use in immersive virtual reality (VR) applications. Such dynamic setups require extremely high filter update rates, so filter calculation is usually performed in the frequency-domain for higher efficiency. This paper proposes a general framework for the calculation of dynamic CTC filters to be used in immersive VR applications. Within this framework, we introduce a causality constraint to the frequency-domain calculation to avoid undesirable wrap-around effects and echo artifacts. Furthermore, when regularization is applied to the CTC filter calculation, in order to limit the output levels at the loudspeakers, noncausal artifacts appear at the CTC filters and the resulting ear signals. We propose a global minimum-phase regularization to convert these anti-causal ringing artifacts into causal artifacts. Finally, an aspect that is especially critical for dynamic CTC systems is the filter switch between active loudspeakers distributed in a surround audio-visual display system with 360 deg of freedom of operator orientation. Within this framework we apply a weighted filter calculation to control the filter switch, which allows the loudspeakers' contribution to be windowed in space, resulting in a smooth filter transition.
    Keywords: acoustic signal processing; crosstalk; filtering theory; frequency-domain analysis; interference suppression; loudspeakers; virtual reality; CTC filter calculation; VR applications; active loudspeakers; anticausal ringing artifacts; dynamic CTC filters; dynamic CTC systems; dynamic crosstalk cancellation filters; dynamic crosstalk cancellation systems; dynamic setups; ear signals; echo artifacts; filter switch; filter transition; filter update rates; frequency-domain calculation; minimum-phase regularization; noncausal artifacts; operator orientation; surround audio-visual display system; virtual reality applications; weighted filter calculation; wrap-around effects; Crosstalk; Ear; Frequency-domain analysis; Loudspeakers; Speech; Speech processing; Time-domain analysis; Binaural technique; causal implementation; dynamic crosstalk cancellation; minimum-phase regularization (ID#:14-2945)
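The regularized frequency-domain inversion underlying CTC filter design can be sketched per frequency bin as C = H^H (H H^H + beta*I)^(-1), a Tikhonov-regularized pseudo-inverse of the 2x2 loudspeaker-to-ear plant matrix H. This is a minimal illustration of that inversion only; the paper's causality constraint, minimum-phase regularization, and weighted filter switching are omitted, and the plant values below are invented for the example.

```python
def conj_transpose(M):
    """Conjugate transpose of a 2x2 complex matrix given as nested lists."""
    return [[M[j][i].conjugate() for j in range(2)] for i in range(2)]

def matmul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [[M[1][1] / det, -M[0][1] / det],
            [-M[1][0] / det, M[0][0] / det]]

def ctc_filter(H, beta):
    """Tikhonov-regularized CTC filter for one frequency bin:
    C = H^H (H H^H + beta * I)^(-1).

    beta trades crosstalk suppression against loudspeaker output level;
    beta -> 0 recovers the exact inverse for an invertible plant."""
    Hh = conj_transpose(H)
    HHh = matmul(H, Hh)
    for i in range(2):
        HHh[i][i] += beta
    return matmul(Hh, inv2(HHh))

# Illustrative plant: direct paths of gain 1, crosstalk paths of gain 0.3.
H = [[1.0, 0.3], [0.3, 1.0]]
C = ctc_filter(H, 1e-6)  # H @ C is then close to the identity
```

In a real system this inversion runs for every frequency bin of the measured HRTF-based plant, and beta is chosen per bin to keep loudspeaker levels bounded.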
  • Yifeng He; Ziyang Zhang; Xiaoming Nan; Ning Zhang; Fei Guo; Rosales, E.; Ling Guan, "vConnect: Connect the Real World To The Virtual World," Computational Intelligence and Virtual Environments for Measurement Systems and Applications (CIVEMSA), 2014 IEEE International Conference on, pp. 30-35, 5-7 May 2014. doi: 10.1109/CIVEMSA.2014.6841434. The Cave Automatic Virtual Environment (CAVE) is a fully immersive Virtual Reality (VR) system. CAVE systems have been widely used in many applications, such as architectural and industrial design, medical training and surgical planning, museums and education. However, one limitation for most of the current CAVE systems is that they are separated from the real world. The user in the CAVE is not able to sense the real world around him or her. In this paper, we propose a vConnect architecture, which aims to establish real-time bidirectional information exchange between the virtual world and the real world. Furthermore, we propose finger interactions which enable the user in the CAVE to manipulate the information in a natural and intuitive way. We implemented a vHealth prototype, a CAVE-based real-time health monitoring system, through which we demonstrated that the user in the CAVE can visualize and manipulate the real-time physiological data of the patient who is being monitored, and interact with the patient.
    Keywords: health care; patient monitoring; physiology; real-time systems; software architecture; virtual reality; CAVE; VR system; cave automatic virtual environment; health monitoring system; patient monitoring; physiological data; real-time bidirectional information exchange; vConnect architecture; vHealth prototype; virtual reality; virtual world; Biomedical monitoring; Computers; Data visualization; Medical services; Prototypes; Real-time systems; Three-dimensional displays (ID#:14-2946)
  • Hodgson, E.; Bachmann, E.; Thrash, T., "Performance of Redirected Walking Algorithms in a Constrained Virtual World," Visualization and Computer Graphics, IEEE Transactions on, vol.20, no.4, pp. 579-587, April 2014. doi: 10.1109/TVCG.2014.34. Redirected walking algorithms imperceptibly rotate a virtual scene about users of immersive virtual environment systems in order to guide them away from tracking area boundaries. Ideally, these distortions permit users to explore large unbounded virtual worlds while walking naturally within a physically limited space. Many potential virtual worlds are composed of corridors, passageways, or aisles. Assuming users are not expected to walk through walls or other objects within the virtual world, these constrained worlds limit the directions of travel as well as the number of opportunities to change direction. The resulting differences in user movement characteristics within the physical world have an impact on redirected walking algorithm performance. This work presents a comparison of generalized RDW algorithm performance within a constrained virtual world. In contrast to previous studies involving unconstrained virtual worlds, experimental results indicate that the steer-to-orbit algorithm keeps users in a smaller area than the steer-to-center algorithm. Moreover, in comparison to steer-to-center, steer-to-orbit is shown to reduce potential wall contacts by over 29%.
    Keywords: virtual reality; aisles; constrained virtual world; corridors; generalized RDW algorithm; immersive virtual reality; passageways; physical world; redirected walking algorithm performance; steer-to-center algorithm; steer-to-orbit algorithm; unbounded virtual worlds; user movement characteristics; Extraterrestrial measurements; Legged locomotion; Navigation; Orbits; Rendering (computer graphics);Tracking; Virtual environments; Virtual environments; redirected walking; navigation; locomotion interface; algorithm comparison (ID#:14-2947)
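The steer-to-center idea compared above can be illustrated with a single update step that injects a small, rate-capped scene rotation nudging the user's path back toward the tracking-space center. This is a generic sketch of injected-rotation redirection, not the cited algorithms' exact formulation; the rate cap and geometry are simplified assumptions.

```python
import math

def steer_to_center_rotation(user_pos, user_heading, max_rate=0.045, dt=1 / 60):
    """One frame of a simplified steer-to-center redirection.

    user_pos: (x, y) in tracking space, with the center at the origin.
    user_heading: current walking direction in radians.
    max_rate: cap on injected rotation (rad/s), chosen here as an
    illustrative value meant to stay below perceptual thresholds.
    Returns the scene rotation (radians) to apply this frame."""
    # Direction from the user back toward the tracking-space center.
    to_center = math.atan2(-user_pos[1], -user_pos[0])
    # Signed smallest angle between the heading and the center direction.
    error = (to_center - user_heading + math.pi) % (2 * math.pi) - math.pi
    # Rotate the scene proportionally to the error, clamped to the rate cap.
    return max(-max_rate * dt, min(max_rate * dt, error * dt))

# A user at (2, 0) walking along +y gets a small positive rotation
# each frame, curving the physical path back toward the origin:
step = steer_to_center_rotation((2.0, 0.0), math.pi / 2)
```

A steer-to-orbit variant would replace the center direction with a tangent to a fixed orbit circle; the clamping against a perceptual rate limit is what keeps the manipulation imperceptible in either case.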


Articles listed on these pages have been found on publicly available internet pages and are cited with links to those pages. Some of the information included herein has been reprinted with permission from the authors or data repositories. Direct any requests via Email to SoS.Project (at) for removal of the links or modifications to specific citations. Please include the ID# of the specific citation in your correspondence.