Developer - Nvidia Corporation
developer.nvidia.com/ace
Omniverse, real-time simulation and collaboration platform
In this demo, we see one example of Project Tokkio, a “talking kiosk” reference application. NVIDIA Maxine embodies a photorealistic, lifelike autonomous toy avatar that responds to challenging domain-specific questions. The avatar, rendered in NVIDIA Omniverse, uses NVIDIA Riva for speech AI, the NVIDIA Megatron-Turing NLG 530B large language model for natural language understanding, and NVIDIA Omniverse animation systems to drive facial and body animation from an audio source.
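The interaction loop described here (speech in, spoken and animated response out) can be sketched at a high level. The example below is a minimal, hypothetical illustration only; the helper functions are placeholders standing in for Riva speech recognition and text-to-speech, the Megatron-Turing NLG model, and Omniverse audio-driven animation, not real NVIDIA SDK calls.

```python
# Hypothetical sketch of a Tokkio-style "talking kiosk" turn.
# Every helper below is a stand-in, not an actual NVIDIA API.


def transcribe(audio: bytes) -> str:
    """Stand-in for Riva automatic speech recognition."""
    return "What toppings do you recommend?"  # dummy transcript


def generate_reply(question: str) -> str:
    """Stand-in for a large language model answering domain-specific questions."""
    return f"Here is a suggestion for: {question}"  # dummy answer


def synthesize(text: str) -> bytes:
    """Stand-in for Riva text-to-speech."""
    return text.encode("utf-8")  # dummy audio payload


def animate_from_audio(audio: bytes) -> None:
    """Stand-in for Omniverse facial/body animation driven by an audio source."""
    print(f"animating avatar from {len(audio)} bytes of audio")


def handle_kiosk_turn(mic_audio: bytes) -> str:
    question = transcribe(mic_audio)    # 1. speech AI
    answer = generate_reply(question)   # 2. natural language understanding
    speech = synthesize(answer)         # 3. text-to-speech
    animate_from_audio(speech)          # 4. audio-driven animation
    return answer


if __name__ == "__main__":
    print(handle_kiosk_turn(b"\x00" * 16000))
```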
Collaboration with global audiences improves dramatically when you speak their language. To enable better communication and understanding, Project Maxine integrates Riva’s real-time translation and text-to-speech with Maxine’s “live portrait” photo animation and eye contact features, all in real time.
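A rough sketch of that translation flow follows, under the same caveat: the functions below are illustrative placeholders, not the actual Riva or Maxine interfaces.

```python
# Hypothetical sketch: translate text, synthesize it in the listener's
# language, and drive a live portrait with the resulting audio.
# These helpers are placeholders, not real Riva or Maxine SDK calls.


def translate(text: str, source_lang: str, target_lang: str) -> str:
    """Stand-in for Riva real-time neural machine translation."""
    return f"[{target_lang}] {text}"


def text_to_speech(text: str, lang: str) -> bytes:
    """Stand-in for Riva text-to-speech in the listener's language."""
    return text.encode("utf-8")


def animate_live_portrait(photo: bytes, speech: bytes) -> None:
    """Stand-in for Maxine live portrait animation with eye contact."""
    print(f"animating portrait: {len(photo)} B photo, {len(speech)} B audio")


def speak_translated(text: str, photo: bytes) -> None:
    translated = translate(text, source_lang="en-US", target_lang="es-ES")
    speech = text_to_speech(translated, lang="es-ES")
    animate_live_portrait(photo, speech)


if __name__ == "__main__":
    speak_translated("Welcome to the keynote.", photo=b"\x89PNG-placeholder")
```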
Omniverse Avatar technology enables DRIVE Concierge to serve as everyone’s digital assistant, making recommendations, booking reservations, placing phone calls, and providing alerts, all personalized to each driver and passenger.
Making virtual robots - or avatars - look, sound and behave realistically is a complex process.
NVIDIA Omniverse Avatar is a technology framework built on the Omniverse platform that lets developers quickly build and deploy intelligent virtual robots, or "avatars," for different use cases. It connects NVIDIA AI SDKs for speech and intelligence to Omniverse rendering and animation technology for the final output.
Let Toy Jensen introduce Omniverse Avatar and NVIDIA Tokkio, one of the applications built on top of the Omniverse Avatar platform.
NVIDIA Tokkio brings AI-driven customer service to touchpoints such as retail outlets, quick-service restaurants, and the web.
Omniverse Avatar Cloud Engine offers developers of games, chatbots, digital twins and virtual worlds a suite of cloud-native AI models that make it easier to build and deploy interactive avatars.
Avatar Cloud Engine (ACE) opens a new world of possibilities for bringing digital avatars to life. See how ACE’s cloud-native AI microservices power avatars of all shapes and sizes, from 2D portraits to 3D characters created by partners like Ready Player Me.
Developers and teams building avatars and virtual assistants can now register to join the Omniverse ACE early access program: https://nvda.ws/3BGAJE8
NVIDIA ACE is a suite of digital human technologies, packaged as easy-to-deploy, fully optimized microservices, also known as NVIDIA NIM.
Developers can integrate ACE NIM microservices into their existing frameworks, engines, and digital human experiences: Nemotron LLMs and SLMs to understand intent and orchestrate other models, Riva NIM for interactive speech and translation, and Audio2Face and Audio2Gesture NIM for facial and body animation.
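As a loose sketch of how such a chain might be wired, the example below calls an LLM NIM through the OpenAI-compatible chat completions API that NIM microservices expose. The base URL, port, and model name are assumptions for a hypothetical local Nemotron deployment, and the Riva and Audio2Face steps are placeholders rather than real client calls.

```python
# Sketch of chaining ACE NIM microservices for one dialog turn.
# Assumptions: a Nemotron LLM NIM deployed locally on port 8000 with an
# OpenAI-compatible endpoint; the speech and animation helpers are
# placeholders, not the actual Riva / Audio2Face NIM clients.

import requests

NEMOTRON_NIM_URL = "http://localhost:8000/v1/chat/completions"  # assumed deployment
NEMOTRON_MODEL = "nvidia/nemotron-mini-4b-instruct"             # assumed model id


def ask_nemotron(user_text: str) -> str:
    """Query the LLM NIM through its OpenAI-compatible REST API."""
    payload = {
        "model": NEMOTRON_MODEL,
        "messages": [
            {"role": "system", "content": "You are a helpful digital human."},
            {"role": "user", "content": user_text},
        ],
        "max_tokens": 256,
    }
    resp = requests.post(NEMOTRON_NIM_URL, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]


def riva_tts(text: str) -> bytes:
    """Placeholder for a Riva NIM text-to-speech call."""
    return text.encode("utf-8")


def audio2face(audio: bytes) -> None:
    """Placeholder for an Audio2Face NIM call that animates the avatar."""
    print(f"sending {len(audio)} bytes of audio to the animation service")


def digital_human_turn(user_text: str) -> None:
    reply = ask_nemotron(user_text)   # intent understanding / response generation
    audio = riva_tts(reply)           # interactive speech
    audio2face(audio)                 # facial animation driven by audio


if __name__ == "__main__":
    digital_human_turn("Where can I find the keynote schedule?")
```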
ACE NIM microservices run on NVIDIA GDN, a global network of NVIDIA-accelerated infrastructure that delivers low-latency digital human processing to over 100 regions. And today, ACE also runs on RTX AI PCs.