Games are evolving all the time, and we’re always hearing about the latest technological “great leap forward”. The industry is constantly trumpeting higher resolutions, more powerful GPUs, and shiny new graphics. That’s all very well, but comparatively few advances in technology fundamentally change the way games are played and created. 3-D modeling and polygon graphics perhaps, or maybe VR, but most of the big new innovations, like ray-tracing, cloth simulation (which models clothing in real time), and hair simulation, do very little to change anything other than the way a game looks. There are some truly stunning games out there, but that doesn’t necessarily make them innovative or interesting.
There is, however, a big change on the horizon. Nvidia has been talking up the capabilities of its new Avatar Cloud Engine (ACE), a custom AI model foundry that promises to bring natural language interactions to games through intelligent NPCs. We could start to see games featuring characters that can react, recall, and converse just like (or a lot like) a real person would. In theory, a player could use some kind of text interface, or even an audio interface, to talk to NPCs and have them respond appropriately.

To understand why this is such a big deal, it’s worth looking at what AI in games has been capable of until now. The vast majority of NPCs in games can do two things: pathfinding and reacting to stimuli, the latter of which is often described as being a “finite state machine”. Pathfinding is simply the ability to navigate obstacles and to know that certain areas are off-limits. Being a finite state machine involves an NPC changing its behaviour in response to something the player, or some other element of the game, does. Think of the guards in Splinter Cell or Metal Gear Solid. They can switch between several states, such as passive, alert, or pursuit, depending on the actions of the player.

The guards in Metal Gear Solid V: The Phantom Pain go one step further. If the guards discover a lot of bodies with bullet holes in their heads, more of them will start to wear helmets, and if key assets start to go missing, they will start to lay mines around important installations. The important thing to recognize is that while this looks like decision-making and mimics a degree of intelligence, everything the enemy soldiers do is still an instruction that the developers have pre-programmed.
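To make that concrete, here is a minimal sketch of how a guard’s finite state machine might be wired up. The state names and detection checks below are hypothetical, invented for illustration rather than taken from any real game, but the shape of the thing (a fixed set of hand-authored transitions) is the point:

from enum import Enum, auto

class GuardState(Enum):
    PASSIVE = auto()
    ALERT = auto()
    PURSUIT = auto()

class Guard:
    """A toy finite state machine: behaviour is nothing more than
    a fixed mapping from (current state, stimulus) to a new state."""

    def __init__(self):
        self.state = GuardState.PASSIVE

    def update(self, sees_player: bool, heard_noise: bool):
        if self.state == GuardState.PASSIVE:
            if sees_player:
                self.state = GuardState.PURSUIT
            elif heard_noise:
                self.state = GuardState.ALERT    # go investigate the sound
        elif self.state == GuardState.ALERT:
            if sees_player:
                self.state = GuardState.PURSUIT
            elif not heard_noise:
                self.state = GuardState.PASSIVE  # nothing found, stand down
        elif self.state == GuardState.PURSUIT:
            if not sees_player:
                self.state = GuardState.ALERT    # lost sight, search the area

guard = Guard()
guard.update(sees_player=False, heard_noise=True)
print(guard.state)  # GuardState.ALERT

Every transition here is authored in advance; the guard can never react in a way the developer didn’t explicitly write down.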
This is where Nvidia’s new tech could make a real difference. Using the various components of the ACE software, developers will be able to create NPCs that can interact with the player far more convincingly than previous technology allowed. NPCs can now be backed by language models, with the ability to generate novel responses to questions or prompts in real time. Nvidia has already shown the technology off with a small demo called Kairos, where players can converse with Jin, a man who owns a noodle stand. The player can ask Jin whatever they like, and his AI will generate an appropriate response, consistent with his backstory. You can get a rough sense of how this feels here.
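In broad strokes, that amounts to feeding a language model the character’s backstory along with the conversation so far, and letting it write the next line. Here’s a rough, hypothetical sketch of that loop; the generate function is a stand-in for whatever model a developer actually calls, and none of the names below come from Nvidia’s real ACE interface:

JIN_BACKSTORY = (
    "You are Jin, the owner of a small ramen stand in a neon-lit city. "
    "You are friendly but worried about rising crime in your neighbourhood. "
    "Stay in character and answer in one or two sentences."
)

def generate(prompt: str) -> str:
    """Placeholder for a call to a large language model. In a real game
    this would hit a local or cloud-hosted model; here it returns a
    canned line so the sketch runs end-to-end."""
    return "Business is slow tonight, friend. What can I get you?"

def npc_reply(history: list[str], player_line: str) -> str:
    # The model sees the backstory plus the running transcript, so its
    # replies stay consistent with who Jin is and what was already said.
    prompt = JIN_BACKSTORY + "\n" + "\n".join(history) + f"\nPlayer: {player_line}\nJin:"
    reply = generate(prompt)
    history.append(f"Player: {player_line}")
    history.append(f"Jin: {reply}")
    return reply

history: list[str] = []
print(npc_reply(history, "Anything strange happen around here lately?"))

Because the backstory and the transcript are re-sent with every turn, the model can stay in character and “remember” what the player said earlier in the scene, which is exactly what makes this feel so different from a finite state machine.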
There is now potential for NPCs that have memories, personalities, and ambitions, in a way analogous to those of human beings. Imagine a game like Shadows of Doubt (a wonderful slice of noir, cyberpunk murder mystery) with this technology incorporated. Not only would you be able to interrogate suspects with total freedom, no longer beholden to text menus, but the NPCs would also all have distinct personalities that affected the game in meaningful ways. Maybe some of them are willing to cooperate, while others do nothing but lie through their teeth. There’s even potential for relationships between NPCs to develop, and someone might deliberately mislead you to protect a loved one.
It’s early days yet, but the potential applications of this new technology are seriously exciting. AMD, Nvidia’s big rival, is also trying to get in on the act, and has recently shown off its new MI300X chips, which are designed specifically for large language models. Microsoft has also been tinkering with AI language models recently, unveiling a new search engine with a chatbot built in, as well as a Minecraft demo where natural language commands were used to control the game.
OpenAI (the company behind ChatGPT) has even been training a computer model to play Minecraft by having it watch more than 70,000 hours of footage of human beings playing the game. If AI models can be trained to play a game the way a human would, that presents all kinds of interesting possibilities. I’d love to see how a horror game like Outlast or Alien Isolation would feel with “human-like” intelligence behind the monster.
If Nvidia and its competitors make good on their promises, we could witness a radical change in how games work. Up until now, NPC AI has been all about deception. Developers have had to use smoke and mirrors to make what are relatively simple programs feel intelligent, and they’ve done that very successfully.
We are approaching a point, however, where the emphasis is on the “I” in AI more than it ever has been before, and the possibility of NPCs with the ability to “think” is becoming more and more real. Who knows where we’ll end up, but a big shift in how we play games could well be on the cards.