Building Hunted’s AI: Some Fundamental Elements

So Jim has been going on about how dynamic AI combat is the heart of Sir, You Are Being Hunted – let’s look at what that means from the perspective of the development work we are actually doing. I’ve illustrated this with a few shots from the AI test scenes build, which show some of the “scene view” elements of Unity’s editor. Obviously the scenery and so forth is placeholder material, and isn’t representative of the scale or density we’re expecting in the final game.


One of the key features we wanted in Hunted was the ability for enemy AI to behave as autonomously as possible. Rather than groups of AI being introduced at scripted intervals and existing only to provide staged skirmishes for the player, we wanted to see them wandering about the landscape with their own objectives. This adds a significant level of complexity to any AI design.

The enemy NPCs in Hunted come in several varieties. The main class is the mobile hunter – the red-visored chaps you’ve seen in the first few screenshots. These guys patrol from one location in the world to another, and on arrival will rest, interact with one another and perhaps search the local area for trespassers or loot. To facilitate this I designed a procedural generation system that places detailed areas of interest (hubs) across the world, connected by trails across the intervening terrain. The hunters move in squads of different factions and travel along these pathways between the hubs. Navigation is done via a node-based A* network, which is created and stored at the same time as the world is generated. Hubs contain a much higher level of detail in terms of obstacles and waypoints, and they tend to attract both the player and NPCs, as they are usually the location of valuable resources and cover/safety.

The wandering hunters have several states of behaviour, governed by a simple Finite State Machine manager (rough code sketches of both the state manager and the pathfinding follow the list). The states are:


Wander: NPCs move between hub locations based on various internal reasoning algorithms; sometimes they will stop for a while and rest, or even loiter on the roads outside the settlements.

Alert/Search: If an NPC hears a nearby sound, or sees either the player or another NPC from a rival faction, they will switch into this state. The hunter approaches locations close to the last heard sound or last sighting, their viewcone detection range increases, and they will continue exploring nearby areas until they encounter an enemy or eventually get bored and return to wandering.

Combat: If a hunter sees a target and can approach to within combat range (dependent on the range of their weapon, etc.), they will do so. In this state NPCs will reload and move to keep in range of their target, and if they lose sight of the target they will attempt to round the appropriate corners to track it down.

Cover: Every time an NPC is injured they have a chance of switching to a cover-seeking state (more likely the more heavily wounded they are). When in this state they abandon all combat and run to the nearest cover location (essentially chosen from a list of locations that are out of sight of the enemy pursuing them). Once in cover they will constantly look around to spot any chasing enemy. If found and attacked they will run on to another potential safe spot. If safe for long enough they will regain their courage and return to either a combat or search state.
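
To give a feel for how thin this layer really is, here’s a minimal Python sketch of that kind of state manager. It is not our actual code (the game itself is built in Unity), and every method on the hypothetical hunter object – sees_enemy(), follow_trail_to_next_hub() and so on – is a made-up stand-in for the perception and movement systems described above.

```python
from enum import Enum, auto
import random


class State(Enum):
    WANDER = auto()
    ALERT = auto()
    COMBAT = auto()
    COVER = auto()


class HunterFSM:
    """Minimal per-hunter state manager (illustrative sketch only)."""

    def __init__(self, hunter):
        self.hunter = hunter        # assumed to expose the hypothetical helpers used below
        self.state = State.WANDER
        self.last_stimulus = None   # position of the last sound heard or enemy seen

    def update(self):
        # Called once per AI tick; dispatch to the handler for the current state.
        {
            State.WANDER: self.wander,
            State.ALERT: self.alert,
            State.COMBAT: self.combat,
            State.COVER: self.cover,
        }[self.state]()

    def wander(self):
        # Amble between hubs until something is seen or heard.
        if self.hunter.sees_enemy() or self.hunter.hears_sound():
            self.last_stimulus = self.hunter.last_stimulus_position()
            self.state = State.ALERT
        else:
            self.hunter.follow_trail_to_next_hub()

    def alert(self):
        # Widen the viewcone and poke around near the last stimulus.
        self.hunter.set_viewcone(widened=True)
        if self.hunter.sees_enemy():
            self.state = State.COMBAT
        elif self.hunter.bored_of_searching():
            self.hunter.set_viewcone(widened=False)
            self.state = State.WANDER
        else:
            self.hunter.search_near(self.last_stimulus)

    def combat(self):
        # Injury gives a chance of breaking off to find cover
        # (panic_chance is assumed to rise with how wounded the hunter is).
        if self.hunter.is_injured() and random.random() < self.hunter.panic_chance():
            self.state = State.COVER
        elif self.hunter.sees_enemy():
            self.hunter.close_to_weapon_range_and_fire()
        else:
            # Lost sight of the target: round the corner it disappeared behind.
            self.hunter.chase_last_known_position()

    def cover(self):
        # Sit tight, keep scanning, and bolt to a new spot if flushed out.
        if self.hunter.spotted_and_attacked():
            self.hunter.run_to(self.hunter.pick_cover_spot())
        elif self.hunter.safe_long_enough():
            self.state = State.COMBAT if self.hunter.sees_enemy() else State.ALERT
        else:
            self.hunter.scan_surroundings()
```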

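The navigation underneath all of this is more conventional. Here’s a similarly rough sketch of A* over a pre-generated waypoint network of the sort described above – nodes for trail and hub waypoints, weighted edges between them, and straight-line distance as the heuristic. The Node structure is an assumption for illustration, not the shape of our actual data.

```python
import heapq
import math


class Node:
    def __init__(self, position):
        self.position = position   # (x, z) coordinate on the terrain
        self.neighbours = {}       # neighbouring Node -> traversal cost


def heuristic(a, b):
    # Straight-line distance is an admissible estimate on a spatial waypoint graph.
    return math.dist(a.position, b.position)


def a_star(start, goal):
    """Return a list of nodes from start to goal, or None if unreachable."""
    open_set = [(heuristic(start, goal), 0, id(start), start)]
    came_from = {}
    best_cost = {start: 0}

    while open_set:
        _, cost, _, current = heapq.heappop(open_set)
        if current is goal:
            # Walk back through came_from to rebuild the path.
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return list(reversed(path))
        for neighbour, step_cost in current.neighbours.items():
            new_cost = cost + step_cost
            if new_cost < best_cost.get(neighbour, float("inf")):
                best_cost[neighbour] = new_cost
                came_from[neighbour] = current
                priority = new_cost + heuristic(neighbour, goal)
                heapq.heappush(open_set, (priority, new_cost, id(neighbour), neighbour))
    return None
```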

When two opposing squads meet at a hub they react to each other in the same way as they would to the player: a skirmish occurs and one side will usually end up wiped out (though occasionally they may pass safely by each other, or a few stragglers may escape to wander onwards). Debugging this sort of AI is a tricky task, as the character models don’t (currently) easily indicate what state they are in. To remedy this I draw a number of helpful indicators in the scene window: icons above NPCs show their state, their HP and their faction, and viewcones, target lines, shot lines and sound lines are also available. The system is going to require Jim and James to spend a lot of time on balancing, as we have to deal with situations where other players and NPCs also enter the fray (at the moment hunters that are infighting will only switch to the player if the player is significantly closer and they are not already engaged).
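
That last rule – switch to the player only if they are significantly closer and the hunter isn’t already engaged – boils down to a few lines. A hedged sketch, with made-up helper methods and a placeholder threshold of the kind that will need tuning:

```python
def choose_target(hunter, current_target, player, switch_ratio=0.5):
    """Decide whether an infighting hunter should turn on the player.

    switch_ratio is a placeholder tuning value: the player must be at most
    half as far away as the current NPC target to count as "significantly
    closer". All methods on hunter are hypothetical stand-ins.
    """
    if current_target is None:
        return player if hunter.can_see(player) else None
    if hunter.is_engaged_with(current_target):
        return current_target  # already committed to the skirmish
    if (hunter.can_see(player)
            and hunter.distance_to(player) < hunter.distance_to(current_target) * switch_ratio):
        return player
    return current_target
```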


Right now the framework for all this behaviour is in place, and the next job is to make it function as games usually do – with visual feedback that can be understood in the context of the game, and the appropriate behaviours for a player to work out what is happening and successfully interact with the AI. We’re hoping to have a “test village” up soon, and from there we’ll be able to show off some of the behaviour in motion – complete with some more tech bells and whistles that will make the factional battles believable and compelling.

In future we’ll talk in a bit more detail about combat behaviours, and also the way in which we intend to enable the hunters to actually track down the player.

Something to say? Head over to our forum!