The metaverse is much talked about, especially with companies like Facebook and Nvidia presenting their visions of the digital future. But what exactly is the metaverse, what are the hardware challenges in enabling it, and what elements are already in place? Analyst firm IDTechEx explored the progress in areas such as displays, image sensors, optics, haptics, augmented reality / virtual reality in some depth recently; we present some of that discussion here.

The endgame for the metaverse is for virtual worlds to exist seamlessly alongside the real, with immersive interactions between the two changing our perception of physical presence. The software for this is nearly there, but the hardware still has many hurdles to overcome.

The metaverse promises a step-change in how society communicates but, without the hardware technology, will ultimately remain a pipe dream. We will still be able to interact with the metaverse through our phones and laptops, but these will become legacy devices. Immersing ourselves requires a virtual reality (VR) headset; true integration between the created and the physical requires augmented reality (AR) devices.

IDTechEx metaverse diagram
The metaverse will require capabilities such as AR, VR, and also enable legacy devices to operate within it. (Source: IDTechEx)

A touchstone in the AR/VR device industry is social acceptability, and without advances in sensing, display, and optics technology, AR and VR headsets will not be sleek enough to achieve it. Despite billions of dollars being poured into development, IDTechEx said true AR glasses are still nowhere near the point of sitting alongside Ray-Bans in appearance or displaying IMAX-quality images. Meta (formerly Facebook) announced its AR glasses project at the same time as its name change, admitting they were years from viability. VR headsets sometimes invite comparisons to RoboCop – only the highest-end devices start to blur the lines between what is real and what is not.

Long term, the goal is a light and comfortable device that you can wear all day, switching between AR and VR whilst enabling natural interactions between yourself and other metaverse users. The technological journey towards the hardware needed to create this hypothetical device presents even more compelling developments and complex challenges than the software.

MicroLED displays are part of the solution

Putting a screen right in front of your eyes reveals things we do not notice whilst looking at a phone or TV. We see the gaps between pixels in a phenomenon called the screen-door effect, and the rule of thumb is that 60 pixels per degree (ppd) of field of view are required for VR or AR to start looking like reality, leading to big demands on resolution. On top of this, optics are required to focus and size these images correctly for our vision. In the case of AR, these optics are very inefficient, leading to brightness demands in the millions of nits – for reference, the iPhone 13 Pro Max screen maxes out at 1,200 nits.
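The resolution arithmetic behind that rule of thumb is straightforward. As a rough sketch (the 100° field-of-view figure is an illustrative assumption for a current-generation headset, not a number from the article):

```python
# Back-of-envelope: horizontal pixels needed per eye to hit a target
# pixels-per-degree (ppd) across a headset's field of view (FoV).
def pixels_needed(fov_deg: float, ppd: float) -> int:
    """Pixels along one axis for a given FoV at a given angular density."""
    return round(fov_deg * ppd)

# The article's "retinal" target is 60 ppd.
print(pixels_needed(100, 60))  # assumed ~100 deg headset: 6000 px per eye
print(pixels_needed(135, 60))  # full 135 deg horizontal FoV: 8100 px per eye
```

At 8,100 horizontal pixels per eye, it is easy to see why both the display and the rendering pipeline become unmanageable without shortcuts.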

MicroLED displays are a promising solution for AR and VR. They do not suffer from burn-in like OLED displays, can reach extreme brightness levels (Jade Bird Display makes one for AR applications with a maximum brightness of 3 million nits), and enable tiny pixel pitches, with Mojo Vision producing what it calls a nanoLED display that is small enough to fit into a contact lens and has a subpixel pitch of 900nm.
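To put that 900nm subpixel pitch in context, a quick conversion to pixel density is useful. This sketch assumes a full pixel pitch of twice the quoted subpixel pitch; the actual subpixel layout of the Mojo Vision display is not stated in the article:

```python
# Rough conversion from pixel pitch to pixel density (pixels per inch).
def pitch_to_ppi(pitch_um: float) -> float:
    """PPI for a display with the given pixel pitch in micrometres."""
    return 25.4e3 / pitch_um  # 25.4 mm per inch, 1000 um per mm

# Assumed pixel pitch: 2 x 900 nm subpixel pitch = 1.8 um.
print(round(pitch_to_ppi(1.8)))  # roughly 14,000 ppi
```

For comparison, a flagship phone screen sits in the region of 400–500 ppi, so pitches at this scale are what make contact-lens-sized displays plausible at all.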

However, there is one major issue: microLED microdisplays are not good at producing full-color images, because blue microLEDs are significantly more efficient than those of other colors. Quantum dot color conversion is the favored solution here, enabling conversion of blue light to red and green, and these quantum dots can be inkjet printed or lithographically patterned. There are still concerns with longevity, especially in very bright microdisplays, as well as reliance on heavy metals in many formulations. The development timeline for microLED displays is outlined in detail in this report on technology, commercialization, opportunity, market and players.

IDTechEx metaverse elements
The hardware requirements for the metaverse. (Source: IDTechEx)

The biggest battleground for AR, in particular, is combiner optics. These devices overlay projected images on a transparent lens. Here, companies fight for the best color rendition, the widest field of view, and the largest eye box to enable a convincing display experience that works for every set of eyes.

Considering the big news this year in this area, surface relief waveguides seem to be the solution the industry is betting on. In May, WaveOptics was acquired by Snap (another social media giant looking towards the metaverse), and in November, investment by such giants as Samsung Electronics in DigiLens led to a valuation of $500 million. Both are fabless waveguide firms. An exciting development from DigiLens is its TREX waveguide, which can double effective display resolution, providing one weapon in the arsenal for reaching the 60 ppd target. Another report on optics and displays in AR, VR, and MR covers the technologies, players, and markets, and delves into optical combiner and display tech in these applications in detail.

Eye-tracking technology

If you tried to build a headset that covered all 135° of each eye's horizontal field of view at 60 ppd resolution, things would quickly get unmanageable – fortunately, only the center of our vision resolves this much detail, with the outer edges being far less demanding. By tracking our eyes, resolution can be maximized in the center of where the user is looking whilst lowering demands elsewhere.
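This technique, known as foveated rendering, can be sketched very simply. The specific ppd tiers and eccentricity thresholds below are illustrative assumptions, not figures from the article; real systems use smoother fall-off curves tuned to human acuity data:

```python
# Minimal sketch of foveated rendering: render full resolution only
# near the tracked gaze point, with coarser tiers further out.
def target_ppd(eccentricity_deg: float) -> int:
    """Rendering resolution (ppd) at a given angle from the gaze centre."""
    if eccentricity_deg <= 10:
        return 60   # foveal region: full 'retinal' resolution
    if eccentricity_deg <= 30:
        return 30   # near periphery: half resolution
    return 15       # far periphery: quarter resolution

print(target_ppd(5), target_ppd(20), target_ppd(50))
```

Because the high-resolution region is small and the peripheral tiers cover most of the field of view, the total pixel (and GPU) budget drops dramatically compared with rendering the full 135° at 60 ppd.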

In the future, this eye-tracking tech may even be used to project AR/VR images directly onto the retina via laser beam scanning, getting around the need for combiner optics and correction for glasses wearers: that is, if consumers can get comfortable with the idea.

Companies in this space are using emerging image sensor technologies to track the eye more efficiently. Event-based vision can help keep processing demands down by natively recording movement instead of a stream of conventional image frames. Using printed image sensors, eye-tracking technology can be squeezed into a more svelte package. Meta Materials Inc. (no relation to Meta) is already embedding microcameras directly into glasses lenses, and this approach will be integrated into combiner or magnifier optics for AR and VR, respectively, as the technology matures.
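The advantage of event-based vision is easiest to see in a toy model. An event camera does the thresholding in analog hardware at each pixel; the frame-differencing sketch below (with an illustrative contrast threshold) only emulates that behaviour in software to show why the output data volume is so small:

```python
import numpy as np

# Toy model of an event-based sensor: instead of emitting full frames,
# emit (x, y, polarity) events only where log-brightness changes by
# more than a contrast threshold.
def events_from_frames(prev, curr, threshold=0.2):
    diff = np.log1p(curr.astype(float)) - np.log1p(prev.astype(float))
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    return [(int(x), int(y), 1 if diff[y, x] > 0 else -1)
            for x, y in zip(xs, ys)]

prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200  # one pixel brightens: one event, not 16 pixel values
print(events_from_frames(prev, curr))  # [(2, 1, 1)]
```

A moving pupil edge produces a sparse trickle of events rather than megapixel frames, which is what keeps the tracking pipeline's power and compute budget low.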

Haptics and sensory experiences

All of what has been discussed so far amounts to little more than a high-end TV strapped to your face if you cannot interact with it in ways resembling the real world. Not only do AR and VR devices need to sense our movements but, for full immersion, haptic (touch feedback) devices are required as well. In November 2021, Meta's Reality Labs (RL) division showed off a prototype haptic glove, including videos of Mark Zuckerberg trialing various demos. The glove uses microfluidic systems to deliver localized haptic feedback to different areas of the hand and to each finger. Although there was some controversy over this prototype's resemblance to a product from HaptX, Meta's IP position is strong here, and the glove represents the efforts being made by metaverse-focused companies to ensure that sensory experiences beyond the audio-visual are delivered, according to IDTechEx.

The analyst said a key winner on the sensing side in recent headsets has been time-of-flight (ToF) cameras for hand tracking, eliminating the need for game-controller-style interactions with VR and AR devices. Apple has been investing in the VR/AR space for years and, in October 2021, industry sources reported that LG Innotek had begun supplying time-of-flight cameras to the firm for a VR headset slated for release in 2022. When Apple adopts a technology, it is usually a strong statement that it is about to become ubiquitous, representing another data point in the strong future for AR and VR.
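The principle behind a ToF camera is simple: each pixel measures how long emitted light takes to bounce back, and distance follows directly from the speed of light. A minimal sketch of the conversion (the 3.3 ns example value is illustrative):

```python
# Time-of-flight depth: light travels out and back, so
# distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to target given the measured round-trip time in seconds."""
    return C * round_trip_s / 2

# A hand ~0.5 m from the headset returns light in roughly 3.3 ns.
print(tof_distance_m(3.336e-9))  # ~0.50 m
```

The nanosecond timescales involved are why ToF sensing is done with dedicated sensor hardware (in practice, often by measuring the phase shift of a modulated signal rather than timing individual pulses) instead of in application code.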

IDTechEx concludes that, in a limited way, the metaverse is already here. The hardware development mountain that prevents its full realization is slowly being climbed, and light, good-looking AR glasses replacing our phones and laptops in the future feels like a near-certainty.

from: https://www.embedded.com/enabling-the-hardware-for-the-metaverse/