For years, the promise of AI-powered game characters went like this: imagine an NPC that doesn't follow a script, that reacts to your actions in genuinely unpredictable ways, that remembers what happened earlier in the session and adapts accordingly. Imagine a companion that behaves less like a puppet and more like a player.
NVIDIA has been building toward that vision with its ACE (Avatar Cloud Engine) platform since 2023. At GDC 2026, the company announced that the technology has moved from demos and tech showcases into actual live game integrations — with PUBG: BATTLEGROUNDS, NARAKA: BLADEPOINT Mobile PC Version, and inZOI all confirmed as launch partners for the expanded ACE platform.
This is worth paying attention to. When technology like this moves from an NVIDIA-hosted GDC demo into a game with tens of millions of active players, the feedback loop changes dramatically. Here is what ACE is actually doing, what the early integrations look like, and what the persistent challenges are.
What NVIDIA ACE does
The original ACE platform, introduced in 2023, focused on conversational NPCs — characters that could respond to open-ended player dialogue using large language models, generate context-aware speech via AI voice synthesis, and produce realistic lip sync animations in real time. The most famous early demo was a bartender NPC in a cyberpunk city who you could actually hold a conversation with.
The 2026 version of ACE expands the scope dramatically. The new focus is autonomous game characters — AI agents that don't just respond to player queries, but actively perceive the game world, make decisions, and act independently to achieve goals.
This shift relies on several components working together:
Small Language Models (SLMs) optimized for games: Rather than running a general-purpose frontier LLM in the cloud, NVIDIA's ACE now uses small, game-specific language models that can run at or near real-time speeds and run on-device on supported hardware. These models are designed to plan and reason at frequencies compatible with actual gameplay — responding to game events in tens of milliseconds rather than seconds.
Multimodal perception: ACE characters can now perceive audio cues from the game world, process visual information about their environment, track game state, and integrate information about player behavior over time. A companion character can hear that a fight just started, see that the player has low health, notice that there's cover nearby, and decide to move to provide support — all without scripted triggers.
Action integration: Instead of just generating text or dialogue, ACE characters now interface directly with game agent systems. They don't just say "I'll go grab supplies" — they actually execute the behaviors in the game engine. This makes them usable as genuine AI teammates, not just chatty decoration.
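Taken together, these components form a perceive–decide–act loop: the character observes game state each tick, a planner chooses a goal-directed action, and the engine executes it. Here is a minimal rule-based sketch of that shape — a toy stand-in for the SLM planner, with all names invented rather than taken from the ACE SDK:

```python
from dataclasses import dataclass

@dataclass
class GameState:
    """Snapshot of what the character perceives each tick (illustrative fields)."""
    enemies_nearby: bool
    player_health: int      # 0-100
    cover_available: bool

def decide_action(state: GameState) -> str:
    """Toy decision policy standing in for the small language model planner."""
    if state.enemies_nearby and state.player_health < 30 and state.cover_available:
        return "move_to_cover_and_support"   # protect a wounded player
    if state.enemies_nearby:
        return "engage"
    return "gather_loot"                     # no threats: work on the squad loadout

# Each tick: perceive the world, decide, then hand the action to the engine.
state = GameState(enemies_nearby=True, player_health=25, cover_available=True)
print(decide_action(state))  # move_to_cover_and_support
```

A real ACE character replaces the `if` ladder with model inference, but the surrounding contract — structured state in, executable action out — is the part that makes the "action integration" step possible.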
The live integrations: what they're actually doing
PUBG: BATTLEGROUNDS
The PUBG integration is arguably the highest-profile real-world test of ACE to date. PUBG has a massive live player base and a brutally competitive environment where competence actually matters.
The ACE characters in PUBG function as AI squad teammates. NVIDIA describes them as capable of:
- Finding and gathering loot independently based on squad loadout priorities
- Swapping and managing gear in response to what they pick up and what the squad needs
- Making tactical plays — moving to flanking positions, calling out enemy positions they've spotted, pushing weak points
- Adapting to player tactics over the course of a match — if the player consistently rotates toward the center circle early, the AI teammate learns this and coordinates accordingly
This is categorically different from the scripted bots PUBG has used to fill lobbies. Those bots follow predetermined patrol patterns and reaction scripts. ACE characters are making moment-to-moment decisions using a planning model that processes current game state.
inZOI
inZOI is a life simulation game from Krafton, the PUBG developer, and it launched in early access in March 2025 to significant attention. Its ACE integration takes a different approach: instead of combat AI, inZOI uses ACE for NPC social behavior.
inZOI's ACE-powered Zois (the game's citizens) can engage in extended conversations, remember what happened in previous interactions, form opinions about the player character based on observed behavior, and make social decisions accordingly. A Zoi who has been ignored repeatedly might stop initiating interactions. A Zoi who witnessed the player doing something generous might bring it up later.
For a life sim where emergent social behavior is the core experience, this is potentially transformative. The gap between The Sims-style scripted social interactions and what ACE offers is significant.
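The memory-driven behavior described above can be approximated with a running disposition score over recent interactions. This is a hypothetical sketch, not Krafton's or NVIDIA's implementation; every name here is invented:

```python
from collections import deque

class SocialMemory:
    """Per-NPC memory of recent interactions, summarized as a disposition score."""

    def __init__(self, window: int = 10):
        # Keep only the most recent interactions; the oldest drop off first.
        self.events: deque[tuple[str, int]] = deque(maxlen=window)

    def observe(self, event: str, weight: int) -> None:
        self.events.append((event, weight))

    def disposition(self) -> int:
        return sum(weight for _, weight in self.events)

    def will_initiate(self) -> bool:
        # A Zoi who has mostly been snubbed stops starting conversations.
        return self.disposition() >= 0

zoi = SocialMemory()
zoi.observe("player_ignored_greeting", -2)
zoi.observe("player_ignored_greeting", -2)
zoi.observe("witnessed_generous_act", +3)
print(zoi.disposition(), zoi.will_initiate())  # -1 False
```

The interesting design choice is the bounded window: memory that decays keeps NPC opinions responsive to recent behavior instead of permanently anchored to a first impression.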
NARAKA: BLADEPOINT Mobile PC Version
NARAKA: BLADEPOINT is a melee-focused battle royale from NetEase. The ACE integration focuses on AI combat companions — characters who can fight alongside the player using melee combat AI that responds to the actual flow of the fight rather than canned attack sequences.
The demo that drew ridicule
Not every reaction to the GDC 2026 ACE demos was positive. One moment went viral: during a live showcase of an NPC in a medieval tavern setting, the AI character refused to cook chicken for the player despite standing directly next to a cooked chicken. The character cited a rule about the tavern not cooking fowl — while the visual contradiction was sitting right in the scene.
NVIDIA's response was that this illustrates an ongoing challenge in live AI NPC systems: the language model driving the character's behavior may not have reliable access to the current visual state of the environment, leading to coherence failures where stated rules don't match what's visible on screen.
This is not a trivial problem. The more autonomous an AI character becomes, the more opportunities there are for its behavior to be incoherent — contradicting itself, misinterpreting game state, or behaving in ways that break immersion precisely because they're almost-but-not-quite plausible. Fully scripted NPCs never do this. They're predictable and limited, but they don't suddenly refuse to acknowledge a chicken that's right there.
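One common mitigation for this class of failure is to ground the model in the scene before it speaks: serialize the currently visible objects into the model's context and mark them as authoritative over static persona rules. The following is a hypothetical sketch of that grounding step; the function name and context format are invented, not part of any ACE API:

```python
def build_npc_context(persona_rules: list[str], visible_objects: list[str]) -> str:
    """Assemble the text context an NPC's language model would condition on.

    The scene description comes first and is flagged as ground truth, so a
    static rule like "we don't cook fowl" is less likely to be asserted
    while a cooked chicken is plainly in view.
    """
    lines = [
        "Scene (ground truth; overrides any conflicting rule): "
        + ", ".join(visible_objects)
    ]
    lines += [f"Persona rule: {rule}" for rule in persona_rules]
    return "\n".join(lines)

context = build_npc_context(
    persona_rules=["this tavern does not cook fowl"],
    visible_objects=["cooked chicken", "tankard of ale", "fireplace"],
)
print(context)
```

Grounding of this kind narrows the gap between what the model "knows" and what the renderer shows, but it does not close it: the perception system still has to surface the right objects at the right time.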
Why this matters for the future of game AI
The shift from scripted AI to model-driven AI in games is not going to happen all at once. But the move of ACE characters into live games with real player populations is a meaningful milestone, because those populations will stress-test these systems in ways no internal QA process can.
A few things are worth watching as this plays out:
Latency and cost at scale: Running SLMs for AI characters in real time, for thousands of concurrent game sessions, is computationally expensive. NVIDIA's pitch is that on-device inference via RTX GPUs keeps costs manageable for the developer. Whether this holds at scale — and whether non-RTX players get meaningfully degraded AI companions — is an open question.
Cheating and manipulation: Once AI characters can be talked to in natural language, players will immediately try to manipulate them. Coaxing an AI NPC into revealing information it shouldn't, behaving in ways that give unfair advantages, or producing outputs that violate community standards is a live concern for any game that deploys conversational AI at scale.
The "uncanny valley" for AI behavior: The chicken incident illustrates a specific version of this: AI characters that are almost coherent are, in some ways, more immersion-breaking than characters that are transparently scripted. When you know something is a scripted bot, its limitations don't feel like failures — they're just limits. When something is presenting as an intelligent agent, incoherence feels like a bug.
What it replaces versus what it adds: The most defensible near-term use case for ACE is not replacing hand-crafted companion AI in story-driven games — it's improving the utility of AI teammates in multiplayer games where players already expect the bots to be, well, bot-like. Shipping PUBG AI teammates that can actually play the game competently is a lower bar than asking ACE to carry the emotional weight of a Dragon Age companion.
What games to watch if you're interested
If autonomous AI NPCs interest you, these are the games worth keeping an eye on in 2026:
- PUBG: BATTLEGROUNDS — The most stress-tested deployment of ACE in a live game. Updates are expected throughout the year.
- inZOI — Still in early access; the ACE social AI is one of its most ambitious differentiators versus The Sims.
- NARAKA: BLADEPOINT — The melee combat AI integration is a technically demanding use case that will show how well ACE handles fast-paced action.
The bottom line
NVIDIA ACE's expansion into live game integrations is the most concrete sign yet that model-driven game AI is moving from a tech demo category into an actual product feature. The PUBG integration in particular puts autonomous AI characters in front of a massive, competitive player base that will test them ruthlessly.
The chicken incident is funny, but it is also instructive: the gap between "impressive demo" and "reliable live feature" is still real. What GDC 2026 showed is that the gap is closing faster than skeptics expected — and that the games most likely to benefit first are multiplayer titles where the AI's job is to play effectively, not to be emotionally compelling.
The more ambitious vision — AI characters that remember your history with them, form genuine opinions, and behave with the kind of coherence that makes you forget they're not human — is probably still a few years away. But PUBG AI teammates that can actually loot and flank? That's available now.