Sensor technology has the potential to radically transform how we interact with the world. Today, smart devices carry sensors that can triangulate the device's location in three dimensions. Producing augmented or mixed reality from these data points allows people to interact with the world in entirely new ways. Avi Horowitz, a futurist entrepreneur, explains the concepts behind XR (extended reality, an umbrella term covering augmented, virtual, and mixed reality) and how they will positively impact people well into the future.
An Explosion of Sensors
As more types of devices incorporate sensors, the possibilities for XR are endless. The sensors most important to XR fall into three groups: accelerometers, gyroscopes, and the depth-sensing combination of ToF cameras and LiDAR used to achieve SLAM.
Accelerometers collect data that helps a smart device understand which way it is being held. A familiar example is a phone switching between portrait and landscape mode depending on its orientation. Accelerometers also sense how quickly a device's motion is changing, that is, its acceleration.
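The portrait/landscape switch described above can be sketched in a few lines. This is a hypothetical illustration, not any platform's real API: the axis conventions, function name, and threshold logic are all assumptions.

```python
# Hypothetical sketch: deciding screen orientation from raw accelerometer
# readings. When the phone is held upright, gravity pulls mostly along the
# y axis; when it is turned on its side, gravity shifts to the x axis.

def screen_orientation(ax: float, ay: float) -> str:
    """Return 'portrait' or 'landscape' from gravity components (m/s^2)."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

print(screen_orientation(0.2, 9.7))   # upright phone -> portrait
print(screen_orientation(9.6, 0.4))   # phone on its side -> landscape
```

A real device would also smooth the readings over time so the screen does not flip on every small jolt.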
Gyroscope sensors allow the device to determine its orientation relative to a reference plane. The MEMS gyroscopes in smart devices measure angular velocity, typically by sensing the Coriolis forces acting on a tiny vibrating structure, and the device integrates these readings into an orientation estimate in real time.
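The integration step mentioned above, turning angular-velocity samples into an orientation, can be sketched as follows. The sample values and the 100 Hz rate are illustrative assumptions, and real systems fuse gyroscope data with other sensors to limit drift.

```python
# Hypothetical sketch: a gyroscope reports angular velocity (deg/s);
# summing rate * dt over time yields an accumulated heading angle.

def integrate_heading(angular_rates, dt):
    """Accumulate angular-velocity samples (deg/s) into a heading in degrees."""
    heading = 0.0
    for rate in angular_rates:
        heading += rate * dt  # each sample contributes rate * elapsed time
    return heading % 360.0

# Ten samples of a steady 90 deg/s turn at 100 Hz: roughly 9 degrees total.
print(integrate_heading([90.0] * 10, dt=0.01))
```

Because each sample's small error is also accumulated, pure gyroscope integration drifts over time, which is one reason devices combine it with accelerometer data.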
SLAM, or Simultaneous Localization and Mapping, is the computational problem of building a map of the surrounding environment while simultaneously tracking the device's precise location within it. An amalgamation of different types of sensors is used to achieve SLAM, says Avi Horowitz.
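The two intertwined halves of SLAM, mapping and localization, can be illustrated with a deliberately tiny one-dimensional toy. Everything here is an assumption for illustration: real SLAM works in three dimensions with probabilistic filters, and the 0.5 blending factor is arbitrary.

```python
# Hypothetical 1-D illustration of the SLAM idea: the device dead-reckons
# its position from motion data, and each range measurement to a wall both
# refines the wall's mapped position and corrects the device's own estimate.

def slam_1d(moves, ranges, wall_guess):
    pose = 0.0          # where the device believes it is
    wall = wall_guess   # where the map says the wall is
    for move, measured_range in zip(moves, ranges):
        pose += move                          # predict: dead reckoning
        seen_wall = pose + measured_range     # where the wall appears to be now
        wall = 0.5 * wall + 0.5 * seen_wall   # mapping: refine the landmark
        pose = wall - measured_range          # localization: trust the map
    return pose, wall

# Device steps 1 m forward twice toward a wall about 10 m away.
pose, wall = slam_1d(moves=[1.0, 1.0], ranges=[9.0, 8.0], wall_guess=10.0)
print(pose, wall)
```

The key point the toy captures is the circularity: the map is built from the pose estimate, and the pose estimate is corrected from the map, which is exactly why the two problems must be solved simultaneously.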
ToF, or Time of Flight, cameras are range imaging systems that resolve distance by measuring the “time of flight” of light signals between the camera and the subject. More complex but similar in principle, LiDAR (light detection and ranging), originally a portmanteau of “light” and “radar,” uses sensors that measure distance by illuminating a target with pulses of laser light that rebound off the surrounding terrain. Differences in laser return times are then used to digitally reconstruct a 3D representation of the environment. LiDAR operates on the same echo-ranging principle as the older SONAR and RADAR technologies, but relies on reflected light rather than sound or radio waves, allowing for increased speed and precision.
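The underlying arithmetic of time-of-flight ranging is simple: the pulse travels to the target and back, so the one-way distance is the speed of light times the round-trip time, divided by two. The timing value below is an illustrative assumption.

```python
# Hedged sketch of the ToF principle: distance recovered from the
# round-trip time of a light pulse.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """One-way distance: the pulse covers the path twice (out and back)."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after ~6.67 nanoseconds implies a target about 1 m away.
print(tof_distance(6.67e-9))
```

The nanosecond scale of the numbers involved is why ToF sensors need extremely precise timing hardware.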
Combined, these multifaceted sensors yield a device capable of creating a real-time view of the world around it while accurately locating itself within that world. This is a critical milestone for XR technology, as it gives a device the complete picture of the surroundings over which new information can be superimposed.
Possible Uses of XR
Smart devices have evolved to collect an incredible range of data. When interpreted, this data helps to enrich the XR experience, creating an immersive reality for the user.
Paired with a digital assistant, a smart device can apply the contextual awareness of its sensors in a vast range of real-world settings. In a retail store, for example, it could give the user an instant resource for prices and details on the items for sale.
XR can also be applied to the smart home. At a glance, homeowners can access data about all of their smart devices. The XR system can help users view security camera footage or monitor children’s entertainment and internet use. It can also facilitate the typical applications of the smart home like control of lights and electrical appliances.
A Technology on the Cusp
Avi Horowitz notes that XR is poised to make a grand impact not only on the technology industry, but also on e-commerce, enterprise, manufacturing, and any other sector already touched by mobile digital technology. Within the next few years, these sensor integrations will become more widespread, making it possible to track more types of data. XR changes the relationship people have with their data and digital environment, making the world more transparent and easier to understand.