A team of researchers from Nanyang Technological University in Singapore has introduced a groundbreaking method for monitoring human movement in the metaverse.
Traditional approaches to capturing human activity in virtual environments rely on device-based sensors or camera systems. However, these methods face limitations. Device-based sensors capture data from only one point on the body, restricting their ability to model complex movements. Camera systems struggle with low-light conditions and physical obstacles.
The researchers turned to WiFi sensing, drawing parallels to radar technology. Like radar, WiFi signals reflect off objects in a space, and those reflections can be used to detect and track human movement, offering a novel alternative to sensors and cameras.
WiFi-based tracking requires sophisticated artificial intelligence (AI) models. Training these models typically demands vast amounts of labeled data, a labor-intensive process.
To address this challenge, the research team developed “MaskFi,” which leverages unsupervised learning. Rather than depending on large labeled datasets, the AI model is pretrained on a smaller amount of unlabeled data and refined over repeated rounds until it produces accurate predictions.
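The article does not spell out MaskFi's architecture, but the name and the description suggest a masked-pretraining setup: hide parts of the WiFi signal and train a model to reconstruct them, so the model learns the signal's structure without labels. The toy sketch below illustrates that idea on synthetic data; the interpolation "model", the signal shapes, and all function names are illustrative assumptions, not the authors' method.

```python
import numpy as np

def mask_frames(csi, mask_ratio=0.3, rng=None):
    """Randomly hide a fraction of time frames in a CSI-like matrix.

    csi: (T, S) array of T time frames x S subcarriers.
    Returns the masked copy and a boolean mask of the hidden frames.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    hidden = rng.random(csi.shape[0]) < mask_ratio
    masked = csi.copy()
    masked[hidden] = 0.0  # zero out the frames the model must reconstruct
    return masked, hidden

def reconstruct(masked, hidden):
    """Stand-in 'model': fill each hidden frame by linear interpolation
    between its nearest visible neighbours along the time axis. A real
    system would learn this reconstruction with a neural network."""
    out = masked.copy()
    visible = np.flatnonzero(~hidden)
    for t in np.flatnonzero(hidden):
        lo = visible[visible < t]
        hi = visible[visible > t]
        if lo.size and hi.size:
            a, b = lo[-1], hi[0]
            w = (t - a) / (b - a)
            out[t] = (1 - w) * out[a] + w * out[b]
        elif lo.size:
            out[t] = out[lo[-1]]
        else:
            out[t] = out[hi[0]]
    return out

# Synthetic "CSI": smooth sinusoids per subcarrier stand in for real WiFi data.
rng = np.random.default_rng(42)
t = np.linspace(0, 4 * np.pi, 200)
csi = np.stack([np.sin(t * f) for f in (1.0, 1.5, 2.0)], axis=1)

masked, hidden = mask_frames(csi, mask_ratio=0.3, rng=rng)
recon = reconstruct(masked, hidden)
err = np.abs(recon[hidden] - csi[hidden]).mean()
print(f"masked {hidden.mean():.0%} of frames, mean abs error {err:.3f}")
```

The reconstruction error on the hidden frames is the self-supervised training signal: driving it down forces the model to capture how the signal evolves over time, after which it can be fine-tuned on a small labeled set for activity recognition.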
MaskFi demonstrated impressive performance, achieving approximately 97% accuracy across benchmark tests. This breakthrough paves the way for a new frontier in the metaverse: real-time, lifelike representations of the physical world.
With further development, MaskFi could revolutionize the metaverse, offering a seamless integration of real-world movements into digital environments. This advancement has the potential to enhance user experiences and unlock new possibilities for immersive virtual interactions.