NVIDIA shows off tech that lets you speak to NPCs

  • While Hollywood writers strike against AI in writing rooms, NVIDIA introduces tech that allows for conversations with NPCs using AI.
  • The tech allows developers to employ AI models that will respond to voice conversations with players, no human talent needed.
  • The ACE service can also be used to animate faces using AI.

At Computex in Taipei, Taiwan this week, NVIDIA showed off some interesting tech powered by artificial intelligence that lets gamers speak into a headset to hold conversations with NPCs.

This is made possible by a new foundry service called NVIDIA Avatar Cloud Engine, or ACE for short.

ACE itself comprises a multitude of solutions that work together to make it possible to chat with an NPC in a game, even one without a script or a voice actor.

The technology stack is made up of:

  • NVIDIA NeMo – for building, customising and deploying language models using proprietary data. The language model can be customised to a game so that it includes lore and backstory, and it can also be honed to guard against counterproductive or unsafe conversations,
  • NVIDIA Riva – for automatic speech recognition and text-to-speech to enable live conversations,
  • NVIDIA Omniverse Audio2Face – for creating expressive facial animation of a game character to match the speech track.
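For the curious, the flow NVIDIA describes roughly amounts to a loop: the player's voice is transcribed to text, a game-tuned language model drafts a reply, text-to-speech voices it, and a facial-animation pass lip-syncs the character. Below is a minimal sketch of that loop in Python; the functions transcribe, generate_reply, synthesize_speech and animate_face are hypothetical placeholders standing in for the NeMo, Riva and Audio2Face stages, not actual NVIDIA APIs.

```python
# Hypothetical sketch of an ACE-style dialogue loop.
# transcribe(), generate_reply(), synthesize_speech() and animate_face()
# are placeholders for the ASR, LLM, TTS and facial-animation stages --
# they are NOT real NVIDIA NeMo/Riva/Audio2Face calls.

from dataclasses import dataclass, field


@dataclass
class NPC:
    name: str
    lore: str                                     # game-specific backstory fed to the language model
    history: list = field(default_factory=list)   # running conversation transcript


def transcribe(audio: bytes) -> str:
    """Placeholder ASR step (Riva's role in the stack)."""
    raise NotImplementedError


def generate_reply(npc: NPC, player_text: str) -> str:
    """Placeholder LLM step (NeMo's role): condition on lore and history."""
    raise NotImplementedError


def synthesize_speech(text: str) -> bytes:
    """Placeholder TTS step (Riva's role)."""
    raise NotImplementedError


def animate_face(speech_audio: bytes) -> dict:
    """Placeholder facial-animation step (Audio2Face's role)."""
    raise NotImplementedError


def dialogue_turn(npc: NPC, mic_audio: bytes) -> tuple[bytes, dict]:
    """One round trip: player speech in, voiced and animated NPC reply out."""
    player_text = transcribe(mic_audio)
    reply_text = generate_reply(npc, player_text)
    npc.history.append((player_text, reply_text))
    reply_audio = synthesize_speech(reply_text)
    face_animation = animate_face(reply_audio)
    return reply_audio, face_animation
```

Presumably each of those stages runs as a hosted service rather than on the player's machine, given the "Cloud" in Avatar Cloud Engine, though NVIDIA's announcement doesn't spell out the split.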

“Generative AI has the potential to revolutionize the interactivity players can have with game characters and dramatically increase immersion in games,” says John Spitzer, vice president of developer and performance technology at NVIDIA.

“Building on our expertise in AI and decades of experience working with game developers, NVIDIA is spearheading the use of generative AI in games.”

How does it work? Well, in a word, poorly.

The video above highlights a very stilted conversation that may as well have been written in passing by a member of the writing team. Except, instead of a human lending their creativity and expertise in exchange for money, NVIDIA is proposing game studios hand the job of fleshing out a game world to AI.

Right now we don’t see the likes of Activision Blizzard or EA adding this to their premier titles, but as the technology improves, AI could make a sector that is already fraught with worker abuse even worse.

All this while the Writers Guild of America protests against the same technology being abused by big production houses to pay writers less or replace them with AI entirely.

This is not to say that some of the tools in the stack aren’t useful. The Audio2Face technology could save animators some time. According to NVIDIA’s press release, the dev team behind S.T.A.L.K.E.R. 2: Heart of Chornobyl is using Audio2Face, an application of the tech that makes sense, as facial animation can take ages and eat into budgets.

However, this also has the potential to hurt animators as executives point to these tools as a cheaper solution than boots on the ground.

As for having AI-generated conversations with NPCs? It’s a great gimmick to be sure, but it seems like a nightmare from a gameplay perspective. Imagine having to converse with a city of NPCs to find the one piece of information you need to progress a quest, or having the AI hallucinate and needing a patch to solve the problem.

Above all though, can we stop asking robots to make art? It’s not really very good and we’re pushing starving artists to the brink so that a Silicon Valley exec can buy their fourth home.
