NVIDIA has unveiled a series of new Omniverse tools that let creators and developers build physically accurate digital twins and realistic avatars for virtual worlds.
To ramp up its Omniverse worlds, NVIDIA has introduced new features for Omniverse Kit, Nucleus, and the Audio2Face and Machinima apps, and is moving them to the cloud. With this much-needed shift, users can work from non-RTX systems such as Macs and Chromebooks.
Moreover, the new set of developer tools focuses on metaverse environments, spanning AI, simulation, and other creative assets. The lineup includes the Omniverse Avatar Cloud Engine (ACE).
NVIDIA says ACE will make it easier to build digital humans and virtual assistants. For users, OmniLive adds interaction and flexibility to Universal Scene Description (USD) workflows, bringing more speed and performance when connecting to third-party applications.
OmniLive also enables custom versions of USD to live-sync seamlessly, making Omniverse Connectors much easier to develop. Simply put, live layers give users a seamless experience in any collaborative session within the metaverse.
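To make the layer idea a little more concrete, here is a minimal sketch using the open-source pxr Python bindings for USD. This is plain USD layer composition, not the OmniLive service itself, and the file names and prim paths are hypothetical; the point is simply that each collaborator's edits can live in their own layer and override a shared base scene without touching it.

```python
from pxr import Usd, UsdGeom, Sdf

# Author a shared base scene (hypothetical file names throughout).
base = Usd.Stage.CreateNew("scene_base.usda")
UsdGeom.Xform.Define(base, "/World")
UsdGeom.Cube.Define(base, "/World/Cube")
base.GetRootLayer().Save()

# A separate layer holds one collaborator's edits, similar in spirit
# to a live layer: changes stay isolated until composed.
edits = Sdf.Layer.CreateNew("scene_edits.usda")
edits.Save()

# Compose the layers: the edit layer is stronger than the base layer,
# so its opinions win without modifying the shared scene.
root = Sdf.Layer.CreateNew("scene.usda")
root.subLayerPaths.append("scene_edits.usda")
root.subLayerPaths.append("scene_base.usda")
root.Save()

# Open the composed stage and write an override into the edit layer only.
stage = Usd.Stage.Open("scene.usda")
stage.SetEditTarget(Usd.EditTarget(edits))
cube = UsdGeom.Cube(stage.GetPrimAtPath("/World/Cube"))
cube.GetSizeAttr().Set(2.0)   # only "scene_edits.usda" records this change
edits.Save()
```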
NVIDIA is also planning the next wave of AI and industrial simulation with the latest updates to NVIDIA PhysX, an advanced real-time engine for simulating realistic physics. This means developers can build metaverse interactions that react realistically and obey the laws of physics.
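As a rough illustration of how physical behavior is described in this stack, the sketch below authors rigid-body attributes with the UsdPhysics schemas that ship with recent open-source USD releases. It only writes the description; an engine such as PhysX inside Omniverse would be what actually simulates it. The prim paths and values are hypothetical.

```python
from pxr import Usd, UsdGeom, UsdPhysics, Gf

stage = Usd.Stage.CreateNew("physics_demo.usda")

# A physics scene prim declares gravity for whichever engine runs the simulation.
scene = UsdPhysics.Scene.Define(stage, "/World/PhysicsScene")
scene.CreateGravityDirectionAttr(Gf.Vec3f(0.0, -1.0, 0.0))
scene.CreateGravityMagnitudeAttr(9.81)

# Static ground: it collides with other objects but never moves.
ground = UsdGeom.Cube.Define(stage, "/World/Ground")
ground.AddTranslateOp().Set(Gf.Vec3d(0.0, -0.5, 0.0))
UsdPhysics.CollisionAPI.Apply(ground.GetPrim())

# Dynamic box: a rigid body with mass that falls under gravity and
# reacts to contacts according to the laws of physics.
box = UsdGeom.Cube.Define(stage, "/World/Box")
box.AddTranslateOp().Set(Gf.Vec3d(0.0, 5.0, 0.0))
UsdPhysics.RigidBodyAPI.Apply(box.GetPrim())
UsdPhysics.CollisionAPI.Apply(box.GetPrim())
mass = UsdPhysics.MassAPI.Apply(box.GetPrim())
mass.CreateMassAttr(1.0)

stage.GetRootLayer().Save()
```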
In fact, some analysts predict that the metaverse market will surpass $50 billion within the next four years, signaling that more people will hop on the trend. With so many developments under way, NVIDIA is clearly one of the metaverse companies to watch!