NVIDIA makes way for generative AI in the creation of digital avatars for video games

NVIDIA has just introduced production microservices for NVIDIA Avatar Cloud Engine (ACE) that enable game, tool and middleware developers to integrate next-generation generative Artificial Intelligence (AI) models into the digital avatars of their games and applications.

The new ACE microservices let developers build interactive avatars using AI models such as NVIDIA Audio2Face™ (A2F), which generates expressive facial animations from an audio source, and NVIDIA Riva Automatic Speech Recognition (ASR), which enables customizable multilingual speech and translation applications powered by generative AI.

Developers who have adopted ACE include Charisma.AI, Convai, Inworld, miHoYo, NetEase Games, Ourpalm, Tencent, Ubisoft and UneeQ.

“Generative AI technologies are transforming virtually everything we do, and that includes game creation and gameplay,” said Keita Iida, vice president of developer relations at NVIDIA. According to Iida, NVIDIA ACE opens up new possibilities for game developers by populating their worlds with realistic digital characters and eliminating the need for pre-scripted dialogue, delivering greater in-game immersion.

NVIDIA ACE

Leading interactive game and avatar developers are pioneering the use of ACE and generative AI technologies to transform interactions between players and non-playable characters (NPCs) in games and applications.

According to Tencent Games, this is a key moment for AI in games: “NVIDIA ACE and Tencent Games will help lay the foundation that will bring digital avatars with individual, realistic personalities and interactions to video games.”

Bringing game characters to life

Historically, NPCs have been designed with predetermined responses and facial animations. This limited interactions with players, which tended to be transactional and short-lived and, as a result, were skipped by most players.

In the words of Purnendu Mukherjee, founder and CEO of Convai: “AI-powered generative characters in virtual worlds unlock various use cases and experiences that were impossible before. Convai is leveraging Riva ASR and A2F to enable realistic NPCs with low-latency response times and high-fidelity natural animation.”

To show how ACE can transform NPC interactions, NVIDIA has worked with Convai to expand the NVIDIA Kairos demo, which debuted at Computex, with a number of new features and the inclusion of ACE microservices.

In the latest version of Kairos, Riva ASR and A2F are used extensively to improve NPC interactivity. The new Convai framework now allows NPCs to converse with each other and gives them awareness of objects, so they can pick up and deliver items to designated areas. Additionally, NPCs can guide players to objectives and traverse worlds.

The Audio2Face and Riva Automatic Speech Recognition microservices are now available, and interactive avatar developers can incorporate each model individually into their development pipelines.

Source: Elcomercio
