
Nvidia wants Omniverse to be metaverse builders’ glue


The free collaborative 3D graphics and rendering toolkit comes out of public beta, and its cross-app support could be a sign of where all the other VR/AR tools need to go.

The word metaverse has infiltrated every conversation on VR, AR, virtual communities and the future of 3D graphics. It's exhausting. One key promise in most metaverse pitches, though, is some sort of universal interoperability with other platforms and tools. Nvidia sees its Omniverse, an interconnecting online toolkit for 3D creative apps, as a way of making that happen. The software is now out of beta as a free service for anyone with a PC running Nvidia GeForce RTX graphics, the company announced at its virtual CES 2022 press conference, alongside a whole new lineup of graphics chips.

Nvidia's Omniverse platform is a cloud-connected 3D collaboration toolkit that was previously in open public beta; business subscriptions, which offer larger-scale secure connections for companies, became available last year. The free version is limited to two-person collaboration, but the possibilities seem incredibly useful, and the software hooks into (or will hook into) a ton of other applications: Blender, Maya, Autodesk, Adobe and Epic's MetaHuman are some of the many apps that work with it.

Omniverse is clearly a 3D collaborative creative platform, and Nvidia's plans for it cross over into AI, robotics, autonomous vehicle development, VR, AR and gaming. But at the same time, Omniverse's support of connected standards is exactly where the rest of the AR, VR and even crypto landscape is currently trying to head.

Nvidia announced that assets from 3D marketplaces including Sketchfab (owned by Epic) and TurboSquid (owned by Shutterstock) will be able to be dragged and dropped into Omniverse-connected apps. There's also a new tool called Audio2Face that uses AI to generate avatar facial animation from a voice recording and exports it into Epic's ultra-realistic MetaHuman software.

Nvidia's Omniverse VP Richard Kerris sees that cloud-connected processing power, and support for common standards, as exactly what the rest of the metaverse's promises need. "The metaverse is already here to some extent," Kerris said in a conversation with CNET last fall. "We have a lot of the basic technologies available to us."

But Kerris explained that virtual worlds created with platforms like Omniverse are essential for Nvidia’s AI training. “At this moment, one of the things we need to create these AIs, that’s fundamental, is a simulation of virtual worlds,” he said. “If you’re going to create robots that know how to operate inside our world, they need to be trained somewhere safe, and they need to be trained for hours and hours.”

Omniverse looks like a tool to accelerate creative work on the PC side. It may also become a backbone for the next wave of AR and VR creative apps, too.

Photo: Nvidia's new Audio2Face tool, part of Omniverse, automatically converts voice recordings into avatar facial animation, which can be exported to Epic's MetaHuman software. (Nvidia)

Source: https://www.cnet.com/tech/computing/nvidia-wants-omniverse-to-be-metaverse-builders-glue/
