The new Metaverse firm has developed a haptic solution capable of industry-wide change
Meta AI, formerly Facebook AI, announced in a blog post on Monday it had created a new material and touch sensor capable of boosting the development of the Metaverse.
Artificial intelligence (AI) researchers at the newly-branded Metaverse firm have worked jointly with researchers and scientists at Carnegie Mellon University to build ReSkin: a plastic, 3-millimetre-thick membrane embedded with magnetic particles capable of measuring touch and sensation.
According to the post from Roberto Calandra, Meta AI Research Scientist, and Mike Lambeta, Research Engineer for Meta AI, the new material allows sensors to monitor changes in the magnetic field and relay them to AI-powered software, enabling tactile-sensing robots.
Mark Zuckerberg, Meta’s Chief Executive and Co-Founder, wrote the team had “designed a high-res touch sensor and worked with Carnegie Mellon to create a thin robot skin,” bringing the company “one step closer to realistic virtual objects and physical interactions in the metaverse.”
The new skin was trialled on robots handling soft fruits, and further testing found the AI programme required training on 100 human touches to learn the link between magnetic field changes and touch.
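That calibration step, learning a mapping from magnetic-field changes to contact force from roughly 100 example touches, can be illustrated with a minimal sketch. The code below is a hypothetical toy model, not Meta's actual ReSkin pipeline: the sensor count, the simulated data, and the simple least-squares fit are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration data: each "touch" yields a 15-dimensional
# magnetometer reading (e.g. 5 sensors x 3 axes) plus a measured force.
n_touches = 100                      # the article cites ~100 human touches
true_map = rng.normal(size=(15, 1))  # unknown field-to-force relationship
readings = rng.normal(size=(n_touches, 15))            # field deltas
forces = readings @ true_map + 0.01 * rng.normal(size=(n_touches, 1))

# Fit a simple least-squares model linking field changes to force.
weights, *_ = np.linalg.lstsq(readings, forces, rcond=None)

# Predict the contact force for a new, unseen touch.
new_reading = rng.normal(size=(1, 15))
predicted_force = float(new_reading @ weights)
print(f"predicted contact force: {predicted_force:.3f} N")
```

In practice Meta's researchers describe training learned models rather than a linear fit, but the principle is the same: once the mapping is calibrated, new magnetometer readings can be converted directly into touch measurements.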
The news follows Meta's release of the full open-source design of its DIGIT sensor last year, which the firm said was “significantly cheaper to manufacture and provides hundreds of thousands more contact points” than current commercial tactile sensors, offering greater access to global research teams.
Abhinav Gupta, Research Scientist for Meta, said in a phone conversation that tactile sensing had previously been too expensive or fragile to collect substantial data, adding,
“If you think of how humans or babies learn, rich multimodal data is quite critical for developing an understanding of the world. We are learning from pixels, sound, touch, taste, smell, and so on. But if you look at how AI has advanced in this last decade, we have made huge advances in pixels and we have made advances in sound: audio, speech and so on. But touch has been missing from this advancement even though it is very critical.”
He concluded Meta’s ReSkin could allow robots to measure pressure forces as small as roughly 0.1 newtons on objects less than 1 millimetre wide, adding researchers could finally “have better understanding of the physics behind object” as they worked to build the Metaverse.
Meta AI’s blog post echoed Gupta’s comments, stating the sensors should replicate the functions of the human finger, be compact and miniaturised, “withstand wear and tear” caused by repeated contact with surfaces, and measure key information about objects such as surface features and contact forces.
The announcements come after numerous firms, including SenseGlove and HaptX, boosted efforts to develop XR technologies for robotic teleoperations, with both enterprises launching haptics-based solutions capable of detecting and simulating a spectrum of haptic feedback levels.
A key representative of HaptX, speaking to XR Today in an industry-wide round table on key emerging technologies such as 5G, said low-latency, high-speed connections were vital to allowing haptic solutions to detect, transmit, and function with greater accuracy and responsiveness.