Nvidia revealed the latest advancements in its generative AI-powered non-player characters (NPCs) at the Consumer Electronics Show (CES) on Monday. The company showcased automated dialogue between players and computer-generated characters, a capability that could reshape game development. Nvidia's Avatar Cloud Engine (ACE) technology combines speech-to-text recognition, generative AI facial animation, and automated character personas to produce interactions with computer-generated characters.
During the demonstration at Nvidia's CES 2024 location, Seth Schneider, a senior product manager for ACE, showed the system in action. Schneider explained that the technology converts the player's spoken words into text, processes that text through a cloud-based large language model to generate the NPC's response, and uses Omniverse Audio2Face to synchronize the response with facial animation and spoken audio inside the game.
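To make the flow of that pipeline concrete, here is a minimal, self-contained sketch in Python of the three stages Schneider describes: speech recognition, an LLM-generated reply, and a lip-synced animation step. Every function here is an illustrative stub written for this article, not Nvidia's actual ACE, Riva, or Audio2Face API.

```python
# Hypothetical sketch of the speech -> LLM -> facial-animation pipeline
# described above. All functions are stand-in stubs, not real Nvidia APIs.

def transcribe_speech(audio_clip: bytes) -> str:
    """Stand-in for automatic speech recognition (Riva handles this in the demo)."""
    return "What's on the menu tonight?"

def generate_npc_reply(persona: str, history: list[tuple[str, str]]) -> str:
    """Stand-in for the cloud-hosted large language model that writes the NPC's line."""
    player_line = history[-1][1]
    return f"({persona}) You asked: '{player_line}' -- let me check the kitchen."

def animate_reply(reply_text: str) -> dict:
    """Stand-in for text-to-speech plus Audio2Face-style lip-sync output."""
    return {"audio": b"<synthesized speech>", "blendshape_frames": len(reply_text)}

def handle_player_speech(audio_clip: bytes, persona: str,
                         history: list[tuple[str, str]]) -> dict:
    player_text = transcribe_speech(audio_clip)   # 1. spoken words -> text
    history.append(("player", player_text))
    reply = generate_npc_reply(persona, history)  # 2. LLM drafts the NPC response
    history.append(("npc", reply))
    animation = animate_reply(reply)              # 3. voice audio + facial animation
    return {"reply": reply, **animation}

if __name__ == "__main__":
    print(handle_player_speech(b"<mic audio>", "Jin, ramen shop owner", []))
```

In a real deployment the recognition and language-model steps run as cloud services, so the game client would be sending audio and receiving text, audio, and animation data over the network rather than calling local functions.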
The video builds on an earlier version of the technology shown at Computex in 2023, which featured Jin, the proprietor of a ramen shop in a cyberpunk setting. The new demo presented by Nvidia expands on that concept by introducing AI-generated conversations between Jin and another NPC named Nova, with the dialogue adapting to the player's actions.
The video also highlights a new technology from Convai that lets AI-driven NPCs interact not only verbally but also with their surroundings. For instance, when Schneider asks Jin to bring out something special, Jin retrieves a bottle of alcohol, demonstrating the NPC's awareness of its environment. Convai's technology enables NPCs to engage with objects such as pots, bottles, lights, and decorations in the game world.
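How an LLM-driven NPC decides to act on an object, rather than just talk about it, is not detailed in the demo, but one plausible pattern is to have the model emit a structured action string that the game then validates against the objects the scene exposes. The sketch below is purely an assumption made for illustration, not Convai's actual interface.

```python
# Hypothetical illustration of dispatching an LLM's output as an in-game
# action. The action format and object list are assumptions, not Convai's API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class NpcAction:
    verb: str    # e.g. "pick_up", "toggle"
    target: str  # an object the scene exposes to the NPC

# Objects the scene exposes, mirroring the demo's pots, bottles, lights, decorations.
INTERACTABLE = {"pot", "bottle", "light", "decoration"}

def parse_npc_action(llm_output: str) -> Optional[NpcAction]:
    """Map structured model output (e.g. 'pick_up bottle') to a game action."""
    parts = llm_output.strip().split()
    if len(parts) == 2 and parts[1] in INTERACTABLE:
        return NpcAction(verb=parts[0], target=parts[1])
    return None  # anything else is treated as ordinary dialogue

if __name__ == "__main__":
    print(parse_npc_action("pick_up bottle"))         # NpcAction(verb='pick_up', target='bottle')
    print(parse_npc_action("Sure, coming right up!")) # None
```

Keeping the allowed objects and verbs on the game side, as in this sketch, is one way to let a generative model propose actions without letting it do anything the scene was not built to support.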
Nvidia reports that ACE production services, including Audio2Face and Riva automatic speech recognition, are already in use by several game developers. Schneider said that prominent developers such as miHoYo (the studio behind Genshin Impact), NetEase Games, Tencent, and Ubisoft are leveraging AI-powered NPCs in their products.
Both Nvidia and Convai emphasize that their technology integrates with game engines such as Unreal Engine and Unity. However, neither company has disclosed which games will actually ship with these AI-generated NPCs. And despite the advances in producing lifelike conversations, characters like Jin and Nova still speak in a stilted, mechanical way, underscoring the ongoing challenge of achieving truly convincing NPC interactions in games.
One notable implication of Nvidia’s latest showcase is the increasing likelihood that AI, rather than human input, will drive the development of NPC interactions in future games.