In 1992, American science fiction novelist Neal Stephenson first introduced the concept of the "Metaverse" in his novel "Snow Crash". In the novel, Stephenson imagined a three-dimensional digital space closely connected with society and parallel to the real world, in which people use digital avatars to communicate and entertain themselves.

What is Metaverse-driven gesture recognition?

Amid this media attention, the concept of the "Metaverse" has recently become popular. For the virtual world people yearn for, gesture recognition technology has become a hard requirement, and major players have begun to position themselves on this track. For example, global chip giant Qualcomm has publicly confirmed that it acquired the HINS SAS team and its wholly-owned subsidiary Clay, a hand-tracking and gesture-recognition provider, in order to deploy commercially viable hand-tracking and gesture-recognition technology; Huawei has announced long-term R&D on a "photograph gesture recognition preview" patent, which is expected to be applied to its smartphones in the future.

How is gesture recognition realized?

1. Time of flight (ToF), also known as 3D imaging based on the time of flight of light. This technique requires a 3D camera module that emits and receives pulsed light. The module first emits a light pulse; because the fingers are at different distances, the light reflected by each finger takes a different time to return to the receiving module. From these return times, the processing chip can calculate the position of each finger and thus recognize the gesture.
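The core of the ToF calculation is the round-trip time of the pulse multiplied by the speed of light, halved. A minimal sketch of this step (the function name and example timing are illustrative, not taken from any real camera SDK):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to a reflecting point (e.g. a fingertip).

    The pulse travels to the target and back, so the one-way
    distance is half of the total light path."""
    return C * round_trip_time_s / 2.0

# A pulse returning after ~3.34 nanoseconds corresponds to a
# fingertip roughly half a meter from the camera module.
print(tof_distance(3.336e-9))
```

A real depth camera performs this per pixel, producing a depth map from which finger positions are segmented.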

2. Structured light. A laser projector first casts a structured-light pattern onto the surface of the hand or body, and an infrared sensor then receives the pattern reflected back. From the position and degree of deformation of the received pattern, the processing chip computes spatial information about the body, then applies a depth-calculation algorithm to recognize gestures.
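The depth-from-deformation step can be approximated with a standard triangulation model: a projected pattern feature appears shifted in the sensor image by a disparity that is inversely proportional to depth. A simplified sketch, with all parameter values hypothetical:

```python
def structured_light_depth(baseline_m: float, focal_px: float,
                           disparity_px: float) -> float:
    """Depth via triangulation: depth = focal * baseline / disparity.

    baseline_m   -- distance between projector and sensor
    focal_px     -- sensor focal length, in pixels
    disparity_px -- observed shift of a pattern feature, in pixels
    """
    return focal_px * baseline_m / disparity_px

# A 30-pixel pattern shift, with a 5 cm baseline and a 600 px
# focal length, places that point of the hand at 1 m.
print(structured_light_depth(0.05, 600.0, 30.0))
```

Real systems decode many pattern features at once, producing a dense depth map of the hand rather than a single point.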

3. Millimeter-wave radar. The radar transmits radio waves and receives their echoes; from the time difference between transmission and reception, the processing chip calculates the target's position in real time. Comparing finger positions across successive time periods against built-in gesture data then recognizes the gesture.
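Since radio waves also travel at the speed of light, the ranging step works just like ToF; the recognition step then compares tracked positions over time. A toy sketch of both steps (the threshold and the "swipe" labels are invented for illustration, not a real radar algorithm):

```python
C = 299_792_458.0  # propagation speed of radio waves, m/s

def radar_range(echo_delay_s: float) -> float:
    """Target range from the transmit-to-receive time difference."""
    return C * echo_delay_s / 2.0

def classify_swipe(ranges_m: list[float], threshold_m: float = 0.02) -> str:
    """Compare hand ranges across time periods: a net displacement
    beyond the threshold matches a simple 'swipe' template."""
    delta = ranges_m[-1] - ranges_m[0]
    if delta < -threshold_m:
        return "swipe toward sensor"
    if delta > threshold_m:
        return "swipe away from sensor"
    return "no gesture"

# Three successive range measurements of a hand moving closer:
print(classify_swipe([0.30, 0.27, 0.24]))
```

Production systems such as Project Soli use far richer features (velocity, angle, micro-Doppler) and learned classifiers, but the pipeline shape is the same: range the target, then match its motion against known gestures.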

Opportunities for millimeter wave radar

Among the many gesture recognition technologies, the advantages of millimeter-wave radar stand out. As early as 2016, Google launched Project Soli, a gesture recognition solution based on millimeter-wave technology.


Millimeter-wave radar gesture recognition detects in-air gestures through wireless signals; the hand never needs to touch the screen, providing a new dimension of interaction.

Compared with optical gesture recognition, millimeter-wave radar has lower power consumption, is less affected by the environment, and is more reliable. Compared with a camera, millimeter-wave radar better protects user privacy; its algorithms are simpler and do not require GPU-backed neural networks, so it responds faster and consumes less power.

In addition, thanks to its strong penetration, millimeter-wave radar still works effectively in harsh environments: it is unaffected by lighting conditions, can be used at night, and has strong anti-interference ability.

Given these advantages, millimeter-wave radar is expected to become the most popular path to gesture recognition.

On December 1st, the Honor 60 series officially launched its "AI gesture recognition, Vlog camera switching" feature. As the technology advances, once gesture recognition finds an application scenario capable of supporting its breakout, it may replace the touch screen and bring a more natural, more integrated human-computer interaction experience.