Paul Milgram defined the well-known Reality-Virtuality Continuum in 1994.
It describes the transition between the physical world on the one hand and a completely digital, computer-generated environment on the other. From a technology point of view, however, a new umbrella term has been introduced: eXtended Reality (XR).
It is the umbrella term for Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR), as well as future realities that immersive technologies might create. XR covers the full spectrum of real and virtual environments. In Figure 1, the Reality-Virtuality Continuum is extended by this new umbrella term. As seen in the figure, a less-known term is also presented, called Augmented Virtuality (AV). It refers to an approach in which elements of reality, e.g. the user's hand, appear in the virtual world.
Augmented Reality (AR): Augmenting the perception of the real environment with virtual elements by mixing spatially-registered digital content into the real world in real time. Pokémon Go and Snapchat filters are commonplace examples of this kind of technology on smartphones and tablets. AR is also widely used in the industrial sector, where workers wear AR glasses for support during assembly, maintenance, or training.
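The spatial registration mentioned above can be sketched as projecting a world-anchored virtual point into the camera image using the tracked camera pose. The following is a minimal illustration with a simplified pinhole camera and a yaw-only rotation; the function name and parameters are hypothetical and do not belong to any specific AR framework, which would use full 6-DoF poses and calibrated intrinsics.

```python
import math

def project_point(point_world, cam_pos, cam_yaw, focal, cx, cy):
    """Project a world-space 3D point into 2D pixel coordinates using a
    simplified pinhole camera model (illustrative sketch only)."""
    # Transform the point into camera space: translate, then rotate by -yaw
    dx = point_world[0] - cam_pos[0]
    dy = point_world[1] - cam_pos[1]
    dz = point_world[2] - cam_pos[2]
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    x = c * dx + s * dz
    z = -s * dx + c * dz
    y = dy
    if z <= 0:
        return None  # point is behind the camera, not visible
    # Pinhole projection onto the image plane, offset by the principal point
    u = focal * x / z + cx
    v = focal * y / z + cy
    return (u, v)

# A virtual anchor 2 m straight ahead projects to the image center
pixel = project_point((0.0, 0.0, 2.0), (0.0, 0.0, 0.0), 0.0,
                      500.0, 320.0, 240.0)
```

As the camera pose updates each frame, re-running this projection keeps the virtual content locked to its real-world position, which is the essence of spatial registration.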
Augmented Virtuality (AV): Augmenting the perception of a virtual environment with real elements. These elements of the real world are generally captured in real time and injected into the virtual environment. A well-known example of AV is capturing the user's body and injecting it into the virtual environment, aimed at improving the feeling of embodiment.
Virtual Reality (VR): Applications use headsets to fully immerse users in a computer-simulated reality. These headsets generate realistic images and sounds, engaging sight and hearing to create an interactive virtual world.
Mixed Reality (MR): This term encompasses both AR and AV. It blends the real and virtual worlds to create complex environments in which physical and digital elements interact in real time. It is defined as the continuum between the real and the virtual environments, excluding the two extremes themselves.
eXtended Reality System
In Figure 2, a simplified schematic diagram of an eXtended Reality system is presented.
Application: On the left-hand side, the user performs a task using an XR application. The most recent XR4ALL Landscape Report provides a complete overview of the relevant application domains, covering advertisement, cultural heritage, education and training, Industry 4.0, health and medicine, security, journalism, social VR, and tourism.
Interaction: The user interacts with the scene, and this interaction is captured by a range of input devices and sensors, which may be visual, audio, motion-based, and of many other types.
Processing and rendering: The acquired data serve as input to the XR hardware, where the necessary processing is performed in the rendering engine; for example, the correct viewpoint is rendered or the desired interaction with the scene is triggered. The same report gives an overview of the major algorithms and approaches. The rendering engine uses not only the captured data but also additional data from other sources, such as edge cloud servers or 3D data available on the device itself.
Feedback: The rendered scene is then fed back to the user, allowing them to sense it. This is achieved by various means, such as XR headsets or other types of displays, along with other sensorial stimuli.
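One iteration of the capture, processing, and feedback stages described above can be sketched as follows. This is a minimal schematic, not a real engine API; all type and function names are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class SensorInput:
    """Interaction stage: data captured from input devices and sensors."""
    head_pose: tuple          # e.g. tracked head position
    controller_button: bool   # e.g. a controller button press

@dataclass
class Frame:
    """A rendered frame: a viewpoint plus any triggered scene events."""
    viewpoint: tuple
    events: list = field(default_factory=list)

def render(sensor: SensorInput, scene_assets: dict) -> Frame:
    """Processing and rendering stage: combine captured data with
    additional data, such as 3D assets available on the device."""
    frame = Frame(viewpoint=sensor.head_pose)
    if sensor.controller_button:
        frame.events.append("interaction_triggered")
    frame.events.append(f"assets_loaded:{len(scene_assets)}")
    return frame

def display(frame: Frame) -> str:
    """Feedback stage: present the rendered scene back to the user."""
    return f"view@{frame.viewpoint} events={frame.events}"

# One pass through the capture -> process/render -> feedback loop
sensor = SensorInput(head_pose=(0.0, 1.6, 0.0), controller_button=True)
output = display(render(sensor, {"room": "mesh", "avatar": "mesh"}))
```

In a real system this loop runs continuously at the display refresh rate, with each stage feeding the next as the schematic in Figure 2 suggests.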