VR/AR technologies have traditionally been associated with the entertainment industry, with virtual and immersive environments built for gamers, product designers, and architects. However, rising healthcare expenditure and the need for cutting-edge technologies to aid the development of novel therapies and diagnostics have fuelled demand for this technology in the healthcare industry.
Listed below are the key technology trends impacting VR/AR in the healthcare industry, as identified by GlobalData.
AR as the next big computing platform
Ultimately, AR glasses of some sort may replace the smartphone as the primary connected device that users carry around with them. First-mover advantage in AR technology will be a game-changer. This is why all the large technology companies are taking it so seriously. Apple, the world’s most profitable smartphone maker, is potentially the most vulnerable in the longer term.
AR software or apps could be installed on over 3.5bn smartphones by 2022, according to Digi-Capital. A 2019 survey conducted by the same firm on AR companies and enterprises found that 78% of respondents identified mobile as the most important AR platform. The existing smartphone ecosystem, consumer comfort with smartphones, and the improving computing capabilities of these devices are major drivers of mobile AR platforms. Apple and Google are the biggest potential beneficiaries of the growth of mobile AR, given their strong smartphone ecosystems and well-established AR software development kits (SDKs).
AR cloud refers to a real-time 3D virtual map overlaid onto the real world, allowing multiple users and devices to share AR experiences. It promises persistent content that can be accessed by multiple users, either individually or collectively. Real-time tagging of virtual content to physical locations will propel AR beyond the boundaries of individual devices and make the AR experience more natural and intuitive. Google, Apple, Microsoft, Facebook, Amazon, Magic Leap, and Samsung are all heavy investors in AR cloud.
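The core idea of persistent, shared content tagged to physical locations can be sketched with a toy in-memory anchor store. This is an illustrative Python sketch, not any vendor's actual AR cloud API; the class and method names are invented for the example.

```python
from dataclasses import dataclass


@dataclass
class Anchor:
    """Virtual content pinned to a real-world position (coordinates are illustrative)."""
    position: tuple
    content: str


class ARCloudStore:
    """Toy shared anchor store: any user can place content at a physical
    location, and any other user near that location can retrieve it."""

    def __init__(self):
        self._anchors = {}

    def place(self, anchor_id, position, content):
        # Persist content at a physical location, visible to all users.
        self._anchors[anchor_id] = Anchor(position, content)

    def nearby(self, position, radius):
        # Return content anchored within `radius` of the query position.
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

        return [a.content for a in self._anchors.values()
                if dist(a.position, position) <= radius]
```

A real AR cloud also solves the much harder problems of localisation (working out where the device is) and mapping; this sketch only captures the shared-persistence aspect.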
Smartphones today typically come with virtual assistants integrated into the operating system. AR headsets and smart glasses are beginning to incorporate similar capabilities. Microsoft’s Cortana in HoloLens 2, Amazon’s Alexa in Focals by North, and Google Assistant in Bose AR smart glasses are just a few of the early iterations of voice-capable AR devices. AI-powered, voice-activated virtual assistants enable hands-free operation of the device, which can be critical for some AR apps.
Increasing degrees of freedom
One of the chief issues with using VR and AR technologies is nausea in inexperienced users, caused by the disconnect between the apparent movement the user's eyes perceive and the stillness of the user's body. To tackle this, VR companies have moved from three degrees of freedom (rotational tracking, provided by gyroscopes and accelerometers) to six, adding positional tracking so that the headset also registers the user's movement through space and further advances the fidelity of the VR experience.
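The rotational (3DoF) part of this tracking is typically done by fusing the two sensors mentioned above: the gyroscope gives smooth but drifting rotation rates, while the accelerometer senses gravity and supplies a slow but absolute tilt reference. A minimal sketch of one standard fusion approach, a complementary filter, assuming idealised sensor readings:

```python
import math


def complementary_filter(pitch, gyro_rate, accel, dt, alpha=0.98):
    """One update step of a complementary filter for head pitch (radians).

    gyro_rate: angular velocity about the pitch axis (rad/s)
    accel:     (ax, ay, az) accelerometer reading in g; the gravity
               direction gives an absolute pitch reference that corrects
               the gyroscope's slow drift
    alpha:     weighting between the fast gyro path and the slow accel path
    """
    ax, ay, az = accel
    # Absolute pitch estimate from the direction of gravity.
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    # Fast but drifting estimate from integrating the gyroscope.
    gyro_pitch = pitch + gyro_rate * dt
    # Blend: trust the gyro short-term, the accelerometer long-term.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

Called once per sensor sample, the estimate converges to the true tilt while filtering out accelerometer noise; 6DoF systems add a separate positional tracker (inside-out cameras or external base stations) on top of this.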
Since ‘tool simulators’ are a potentially hugely important part of the future VR space, it is important that the user be able to ‘feel’ the tool they are holding for maximum fidelity. To this end, companies such as ImmersiveTouch have provided surgical tool simulators that provide haptic feedback.
As VR and AR mature as technologies, the cost of purchasing a new set of VR equipment is steadily dropping. This means adoption may increase across the healthcare sector as the technology becomes more mainstream.
As VR and AR become more mature, the level of detail at which the user can interact with the VR 'world' becomes more sophisticated. Valve is trialling its Knuckles EV3 controllers on Steam, which track the position of each of the user's fingers individually. Whereas previously users could only grasp a tool or let go of it, they can now hold a tool with a couple of fingers, pinch it, and so on. The controller can even detect which fingers are gripping more firmly than others.
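Per-finger tracking like this lets an application distinguish grip types rather than just "holding" or "not holding". A minimal illustrative sketch of such a classifier, assuming the controller reports a curl value per finger; the names and thresholds here are invented for the example, not Valve's actual API:

```python
def classify_grip(finger_curl):
    """Classify a hand pose from per-finger curl values in [0, 1]
    (0 = fully extended, 1 = fully curled), in thumb-to-pinky order.

    Thresholds are illustrative; a real system would be tuned per user.
    """
    thumb, index, middle, ring, pinky = finger_curl
    curled = [c > 0.6 for c in (index, middle, ring, pinky)]
    if thumb > 0.6 and all(curled):
        return "full grasp"        # whole hand wrapped around the tool
    if thumb > 0.6 and curled[0] and not any(curled[1:]):
        return "pinch"             # thumb and index only
    if thumb <= 0.6 and not any(curled):
        return "open hand"         # tool released
    return "partial grip"
```

In a surgical simulator, the same per-finger data could also drive how firmly a virtual instrument responds, since the hardware reports relative grip pressure per finger.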
5G can improve many aspects of VR/AR in healthcare, from medical imaging to remote consultations and patient monitoring. Verizon recently teamed up with Medivis at its 5G lab in New York, leveraging its 5G technology with the Medivis team of surgeons, radiologists, and engineers to accelerate the development of the Medivis platform and bring VR/AR to the forefront of healthcare.
Brain-computer interface (BCI) involves devices that enable users to interact with computers solely by means of brain activity, generally measured by electroencephalography (EEG). The fusion of BCI with VR/AR may provide additional communication channels by increasing the bandwidth of the human-VR/AR interaction. This is achieved either explicitly through active BCIs or implicitly using passive BCIs. Active BCIs allow users to issue commands to devices or to enter text without physical involvement, while passive BCIs monitor a user’s state and can be used to proactively adapt the VR/AR interface.
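The active/passive distinction can be made concrete with a toy sketch: an active BCI turns a feature of the EEG signal into an explicit command, while a passive BCI turns it into a user-state estimate the interface can adapt to. The feature choices and thresholds below are illustrative only; real systems use trained classifiers on far richer signals.

```python
def active_bci_command(band_power):
    """Active BCI sketch: map EEG band power (arbitrary units) to an
    explicit command the user intends to issue.

    Hypothetical rule: a strong beta-to-alpha power shift is treated as
    a deliberate 'select' gesture; otherwise no command is issued.
    """
    if band_power["beta"] > 2 * band_power["alpha"]:
        return "select"
    return "idle"


def passive_bci_workload(band_power):
    """Passive BCI sketch: estimate user workload without any deliberate
    action, so the VR/AR interface can adapt proactively (e.g. declutter
    the display when workload is high).

    Uses a simple theta/alpha ratio as an illustrative workload proxy.
    """
    ratio = band_power["theta"] / band_power["alpha"]
    return "high" if ratio > 1.5 else "normal"
```

The key difference is who initiates the interaction: the active path only fires when the user produces a signal on purpose, while the passive path runs continuously in the background.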
BCIs with VR/AR offer the possibility of immersive situations through induced illusions of an artificially perceived reality, which can be utilised not only in basic BCI research but also in therapeutic applications. In September 2019, Facebook acquired neural interface startup CTRL-labs, a company that makes a wristband capable of translating neural signals into computer input.
This is an edited extract from the Virtual/Augmented Reality in Healthcare – Thematic Research report produced by GlobalData Thematic Research.