NVIDIA Unveils “ACE” Model For Next-Gen Games, Integrating AI Into Virtual Characters


NVIDIA unveils Avatar Cloud Engine (ACE), a custom AI model suite that brings AI to Non-Playable Characters (NPCs) for natural-language interactions in games.

With the aid of this tool, software and game developers will be able to create and integrate custom voice, conversation, and motion AI models. The project is being carried out in partnership with Convai, a business developing conversational AI for online gaming environments.

Here is how NVIDIA breaks down the foundations of the model:

  • NVIDIA NeMo — for building, customizing, and deploying language models using proprietary data. The large language models can be customized with lore and character backstories and protected against counterproductive or unsafe conversations via NeMo Guardrails.
  • NVIDIA Riva — for automatic speech recognition and text-to-speech to enable live speech conversation.
  • NVIDIA Omniverse Audio2Face — for instantly creating expressive facial animation of a game character to match any speech track. Audio2Face features Omniverse connectors for Unreal Engine 5, so developers can add facial animation directly to MetaHuman characters.

NVIDIA also showcased the model with a demo of an AI-generated conversation with an NPC. As you can see in the video below, the character answers questions based on its narrative backstory with the help of generative AI.
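The component stack above (Riva for speech, NeMo for language and guardrails, Audio2Face for animation) can be pictured as a single conversational turn. The Python below is a hypothetical illustration only: every function is a plain stand-in stub, and none of the names are actual NVIDIA APIs; real deployments would call the Riva, NeMo, and Audio2Face services instead.

```python
# Hypothetical sketch of an ACE-style NPC dialogue turn.
# All functions are illustrative stubs, not NVIDIA APIs.

BLOCKED_TOPICS = {"politics", "violence"}  # stand-in for NeMo Guardrails


def speech_to_text(audio: bytes) -> str:
    """Stub for Riva automatic speech recognition."""
    return audio.decode("utf-8")  # pretend the audio is already text


def guardrail_ok(text: str) -> bool:
    """Stub topic filter standing in for NeMo Guardrails."""
    return not any(topic in text.lower() for topic in BLOCKED_TOPICS)


def generate_reply(question: str) -> str:
    """Stub for a lore-conditioned NeMo language model."""
    if "who are you" in question.lower():
        return "I'm Jin. I run this ramen shop."
    return "Business has been slow since the trouble started."


def text_to_speech(text: str) -> bytes:
    """Stub for Riva text-to-speech."""
    return text.encode("utf-8")


def animate_face(speech: bytes) -> str:
    """Stub for Audio2Face: derive a facial-animation track from audio."""
    return f"animation({len(speech)} samples)"


def npc_turn(player_audio: bytes) -> tuple[bytes, str]:
    """One conversational turn: ASR -> guardrail -> LLM -> TTS -> animation."""
    question = speech_to_text(player_audio)
    if not guardrail_ok(question):
        reply = "Let's talk about something else."
    else:
        reply = generate_reply(question)
    speech = text_to_speech(reply)
    return speech, animate_face(speech)


speech, anim = npc_turn(b"Who are you?")
```

The point of the sketch is the pipeline shape, not the stubs: each stage is a separate service, so a game can swap the language model or voice without touching the animation step.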

NVIDIA's new tool is a significant advancement for the gaming industry. We're excited to see how AI is incorporated into forthcoming games, since it will significantly change how a title plays, especially when interacting with NPCs.

Game developers and startups are already using NVIDIA's generative AI technologies. For instance, GSC Game World is using Audio2Face in the much-anticipated S.T.A.L.K.E.R. 2: Heart of Chornobyl, and indie developer Fallen Leaf is using Audio2Face for character facial animation in Fort Solis, its third-person sci-fi thriller set on Mars. Additionally, Charisma.ai, a company enabling virtual characters through AI, is leveraging Audio2Face to power the animation in its conversation engine.

via NVIDIA

NVIDIA is leveraging AI across the board, and games are an essential segment. The company is also bringing its neural compression technology, which promises up to 16x more texture detail; training DLSS 3 models on NVIDIA servers for enhanced detail and quality; and developing a new radiance cache technology that can improve the performance of path tracing in titles such as Cyberpunk 2077.
