Why Can’t I Enter The Buildings In Sir?

This is the question we get asked most frequently, so we thought we’d answer it in some detail.

Firstly, many people seem to imagine that it’s some kind of trivial oversight, as if we are going to say “gosh, yes, we hadn’t thought about that”, and then fix it. We have thought about it, of course. We think that it would be great for the player – and the robots – to be able to enter buildings. We’ve spent a lot of time examining how they might do that, and concluded that, some open and ruined structures aside, they cannot.

Here’s why.

There are two main approaches to letting players enter buildings:

- Skyrim-style instanced buildings, where the interior is essentially another space. Enter through the door and explore that space independently. This is clearly a compromise, and not ideal for an open world game. What happens to the robots chasing you? Do they just wait outside? Do you get an outdoor camera? Do they come inside? It rapidly moves away from the kind of game we were trying to make, and produces a cascade of additional design issues.

- Buildings that are entered within the world space, as in Arma II or Stalker. These can be explored seamlessly and allow you to look out into the world through the windows and doors. This is the ideal solution, and the one we would have to go for.

So we immediately discounted the instanced buildings, because they are not part of the world. They don’t allow you to hide inside and peek outside at your antagonists. They would require interiors to be built, either statically, which would increase our art time and budget, or procedurally, which would increase our art time and budget as well as programming time (proc gen solutions are extremely intensive for the programmer and testers). Doing it procedurally would also mean enormously increasing the generation time for any given world, which makes this approach a drag for the player. Do you increase the generation time at the start of the game by several minutes? Or create a lag on entering any building? Clearly, neither is ideal for a streamlined experience. If you do go for static interiors, then you have to create a large number of them so that you don’t appear to be entering the same building all the time, which again increases our art time and cost, and implementation, and testing.

The points above about creating the building procedurally also count towards the second more ideal solution, with necessary increases to procedural generation complexity and art budget. There are other questions too: do the interiors then have furniture? Where is the loot? Do we create additional 3D objects for the items, or do we create secondary loot containers inside the buildings, rather than the doors as per the current system? Solutions like those in Arma 2 are great, but they add several large burdens to development, not least of which is cost.

If we did decide we had the time and budget to go with that second system – and let me be very clear when I say that we do not have the resources, we are a three-man team with a small Kickstarter budget, and I still had to work another job for the whole of development – there’s another problem, too, which is that we’d have to introduce another complex level of detail structuring system to deal with the increased geometry load from the villages. These areas are already very intensive, even with a bunch of low poly models, and when we’re throwing this many polygons at the screen we start to lose slower machines entirely. The performance hit would be significant, and the polycounts of the buildings would skyrocket. Given that this was supposed to be a lo-fi game about British countryside we’re suddenly spending huge amounts of resources – computational and developmental – on one problem that is solved by saying “you can’t enter the buildings”.

So that’s what we said.

Oh, and then there’s the issue of having to build every model in such a way as to allow the robots to enter the buildings – or redesigning pathfinding entirely. The system that procedurally places a navmesh onto the world can’t cope with interiors, so we’d be looking at yet another solution that needs to be designed and programmed. More time. More resources, and another hit to the game’s performance.

It can be done. Yes, it can be done – other games have done it. But we, Big Robot, the guys making Sir, You Are Being Hunted before the money runs out, cannot do it.

Of course there are structures you can still enter in Sir – ruins, pole barns, and so on – but the interiors of the intact, British-style houses we would need to sell the “British” ideal were simply out of reach. And this is really important: we could have removed buildings entirely. We could have made it so that there was just forest and countryside, but part of the high concept of the game was Britishness, and we couldn’t get that across in the way that we wanted without British villages. And granted, we do lose something by not solving this issue – the capacity to explore and hide inside buildings – but since the point of the game was running across open countryside, rather than poking about in urban spaces, we were happy with that concession. Lots of games do building interiors and urban environments well, but we do open, thematically-coherent countryside well. That was our goal, and we totally nailed it.

Yeah, I am bound to be a bit out of sorts when addressing this problem, so I apologise for that. It’s somewhat exasperating for us to create a procedural generation engine unlike anything else ever made, to have made Herculean efforts to produce a first-person stealth game with a huge range of weapons and equipment, to implement an AI system more complex than what generally appears in first-person games, and still be told that we’re lacking because the player has to use the buildings as loot stashes rather than Wendy houses.

We’re gamers too, of course. We can see the ideal game you all have in your minds. We’d love to have been able to deliver it, but the part about entering buildings is missing.

I hope you can understand why.

Most importantly: once you play the game, that concern will drop away. The game works, and players get along with the systems we’ve created just fine. Trust us, and wait for August 19th.

Big Robot Is Pleased To Announce Pre-Orders Of Sir, You Are Being Hunted, Via Humble Store! Also: Rezzed Appearances.

Lots to come in the next few weeks, but we’re starting with the pre-order, thanks to those lovely folk at Humble. Pre-order folk will get early access in our second round of testing prior to release, along with the base Kickstarter tiers. This will come about a month after the Kickstarter early access tiers get their access. This will all kick off at some point in the summer. It’s so close we can taste it!

In the meantime, please pass this link on to anyone who missed the Kickstarter and who might want to support the development of Sir in the next couple of months.

We are also enormously excited to announce that Sir will be playable at Rezzed 2013 in Birmingham, UK, on 22nd and 23rd of June. Please come along! Not only will the game be there, but Tom is going to be talking about the game’s development and his procedural generation experiments in Unity.

Sir, Unity, Mecanim And The Trials Of Small Studio In-Game Animation

We thought it might be insightful for you guys to be able to read a bit more about the challenges we’ve faced in completely overhauling our animation system since the success of our Kickstarter. As we mentioned in that pitch, animation is one of the key challenges for the game, and we feel we need high quality character animation to really make the experience work. So that’s what we’ve been doing. However, it was not without complications.

During the time when we were working on the initial prototypes for Sir, we were lucky enough to get Frogames’ Christophe Canon to produce some great character art and an early model for the main enemy NPC, the Hunter. At this point we didn’t have the resources or time to commission a full animation set for this character, so we looked around for quick alternative solutions – something that Unity has in abundance.

IK And Smart Locomotion

Rune Johansen’s dynamic locomotion system for Unity was an obvious choice, because it does so much without us having to code very much into the game.

Rune’s system

Rune developed it for his Master’s thesis and it’s a really smart piece of coding. It’s essentially a procedural animation blending system with foot inverse kinematics, or “IK”. This means that characters to which it is applied will walk realistically, placing their feet sensibly on the terrain they traverse. We implemented it with a series of animations from Mixamo.com (an autorigging service that applies generic animations to any humanoid model).

Once we had our initial Hunter model and a set of animations we could use Rune’s system to blend them together. Rune explains how this works better than we can:

“The system uses a set of example motions primarily in the form of keyframed or motion-captured walk and run cycles. The system automatically analyzes each motion at design-time and extracts parameters such as impact and lift-off times for each foot as well as overall velocity. At runtime the system first blends the motions according to the current velocity and rotational velocity of the character, it then adjusts the movements of the bones in the legs by means of inverse kinematics to ensure that the feet step correctly on the ground.”

So for an attacking stance you might have the following animations: walk forwards, run forwards, strafe left, strafe right, walk backwards and so on. Rune’s code examines each of these animations and can then pace and crossfade them dynamically to match the character’s movement. The system is also capable of aligning the character’s feet to the surface beneath them and then cascading the bone changes up the transform chain (IK).
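To give a flavour of the core idea – this is an illustrative Python sketch written for this post, not Rune’s actual code or ours – velocity-based blending boils down to annotating each example clip with the speed it represents, then interpolating weights between the two clips that bracket the character’s current speed:

```python
def blend_weights(clip_speeds, current_speed):
    """Return one weight per clip: linear interpolation between the two
    clips whose reference speeds bracket the current speed, clamped at
    the ends of the sampled range."""
    order = sorted(range(len(clip_speeds)), key=lambda i: clip_speeds[i])
    speeds = [clip_speeds[i] for i in order]
    weights = [0.0] * len(clip_speeds)
    # Outside the sampled range: snap to the slowest/fastest clip
    if current_speed <= speeds[0]:
        weights[order[0]] = 1.0
        return weights
    if current_speed >= speeds[-1]:
        weights[order[-1]] = 1.0
        return weights
    # Find the bracketing pair and interpolate between them
    for a in range(len(speeds) - 1):
        lo, hi = speeds[a], speeds[a + 1]
        if lo <= current_speed <= hi:
            t = (current_speed - lo) / (hi - lo)
            weights[order[a]] = 1.0 - t
            weights[order[a + 1]] = t
            return weights
```

For example, with a walk clip at 1.5 m/s and a run clip at 4 m/s, a character moving at 3 m/s gets a 40/60 walk/run mix – and the real systems do this across several dimensions (turn speed, strafe) at once.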

All of these features require some setting up and tweaking, but can result in a genuinely dynamic and adaptable animation system. The net result is a character whose feet and legs remain solidly and realistically rooted to the ground, bending and blending procedurally to create a really believable connection to the floor, regardless of changes in movement vector. Of course the underlying movement is not root-motion based (where the distance travelled in each animation loop is dictated by the animation loop itself), but constantly variable. The only disadvantage to this system is that it is (understandably) very complex and can be a significant performance hit if you intend to run multiple characters – which for Sir was a big problem…
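For the curious, the foot-placement half of this rests on classic two-bone IK. Here’s a simplified 2D Python sketch (our own illustration for this post, not the actual implementation) that uses the law of cosines to bend a thigh/shin pair so the foot lands exactly on a target point:

```python
import math

def two_bone_ik(hip, target, l1, l2):
    """Given a hip position, a desired foot position, thigh length l1 and
    shin length l2 (2D side view), return (hip_angle, knee_angle) placing
    the foot at the target. Unreachable targets are clamped."""
    dx, dy = target[0] - hip[0], target[1] - hip[1]
    d = math.hypot(dx, dy)
    # Clamp distance to what the leg can physically reach
    d = min(max(d, abs(l1 - l2), 1e-6), l1 + l2)
    # Interior knee angle from the law of cosines
    cos_knee = (l1 ** 2 + l2 ** 2 - d ** 2) / (2 * l1 * l2)
    knee = math.acos(max(-1.0, min(1.0, cos_knee)))
    # Thigh angle: direction to target, offset by the cosine-rule angle
    cos_off = (l1 ** 2 + d ** 2 - l2 ** 2) / (2 * l1 * d)
    offset = math.acos(max(-1.0, min(1.0, cos_off)))
    hip_angle = math.atan2(dy, dx) + offset
    return hip_angle, knee
```

A real foot-IK system does this in 3D per leg, every frame, after raycasting to find the ground height under each foot – which is exactly why running it on many characters at once gets expensive.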


When Unity 4 introduced the Mecanim animation solution we were excited to try out the new system, hoping it would give us something similar to what we had enjoyed with Rune’s work, but with less overhead. However, it took quite a while to readjust our models, animations, and code to get a satisfying result.

Mecanim provides a “blend tree” system where animations can be crossfaded in realtime based on script-sent input values. For instance, this could involve defining a series of blend states to mix between a walk and a run animation based on a passed speed value. This area of functionality is similar to the original blending functionality of the locomotion system we used before – in fact it offers far greater control of the process – but it involves more manual setup via the Mecanim ‘Animator’ tool. The Animator essentially allows you to minutely build and fine-tune blend trees for your character’s animation. Where before Rune’s loco system did this on the fly, it was now down to us to rebuild the rules governing which anims to play with which blends at which time. No small task!

To Root or not to Root

Most Mecanim demonstrations use root motion based movement because blending animations generally works best when the animations are mixed at specific loop points (i.e. moving from a walk to a run works best when the ending of the first animation has the character’s feet in the same position as the start of the run). Root motion approaches often imply that once a character has begun a specific animation they can’t react until it has reached a convenient loop point (an example is in many games where an enemy will continue reloading even if staying in that pose exposes them to a rain of bullets). Our NPCs operate in a procedurally generated world and function much better if they can stop and alter their motion and behaviour mid cycle where necessary. So instead of converting all our code to work with root motion we decided to try and implement a mix of Rune’s procedural system and the new Mecanim blend trees.

As a result, each character in Sir has a script that tracks the velocity, turn speed, strafe speed, heading and other motion values. These values are then sent to whatever current Mecanim layer we have activated on the NPC (we have a layer for each behaviour: attacking, fleeing, hunting, searching, roaming and so on). The blend trees in that layer then resolve the final animation mix that is appropriate for the character’s current behaviour. This can result in a few rare hiccups where an animation crossfade happens at a non-ideal cycle point, but with careful setup of the blend trees and clever animation design such instances can be minimised.
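In outline – heavily simplified into Python with names invented for this post, rather than our real Unity scripts – the plumbing looks something like this: a motion-tracking driver computes per-frame values and forwards them only to whichever behaviour layer is currently active.

```python
class NPCMotionDriver:
    """Toy sketch of feeding per-frame motion values into per-behaviour
    animation layers (mirroring Mecanim layers). Names are hypothetical."""

    def __init__(self):
        self.active_layer = "roaming"
        # One parameter set per behaviour layer
        self.layer_params = {
            name: {"velocity": 0.0, "turn_speed": 0.0, "strafe_speed": 0.0}
            for name in ("attacking", "fleeing", "hunting",
                         "searching", "roaming")
        }

    def set_behaviour(self, name):
        """The AI state machine switches the active layer."""
        self.active_layer = name

    def update(self, velocity, turn_speed, strafe_speed):
        """Called once per frame; only the active layer's blend tree
        receives the fresh motion values."""
        self.layer_params[self.active_layer].update(
            velocity=velocity,
            turn_speed=turn_speed,
            strafe_speed=strafe_speed,
        )
        return self.layer_params[self.active_layer]
```

In the real thing the “parameter set” is a call into the engine’s animator per layer, but the shape of the data flow – AI state picks the layer, motion script feeds it – is the same.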

This approach also gets tricky when a character can execute a wide range of movements. It’s convenient to branch animation blend trees first by velocity, then by turn speed, for example; however, if a character can strafe, turn and walk/reverse, the tree can quickly get complex (this would be true for a root-motion-based animator too). Finally, we can also control the overall animation speed of any layer (attack, flee etc.) from the AI state itself, so we can easily speed up a particular behaviour and adjust the animation speed in the same script.


The loss of the original IK system is a compromise we had to make. It felt like a big loss, but the advantages of the new system were clear. The Mecanim solution is a lot more efficient and allows us to have more active NPCs in view at once, which is clearly more important to the game than exceptional foot placement. The majority of AAA games do without foot IK on models, and however nice the original IK looked, our NPC hunters are often knee-deep in grass or peering over walls to shoot you, which makes perfect leg placement largely invisible anyway. You generally don’t have time to admire the enemy footwork while running for your life through the bracken, either.

Ultimately we think our experience with these two different approaches to implementing animation shows the complexity of the choices faced by developers working in the kind of environments we are using to build Sir. It’s rare that one piece of tech or middleware will do exactly what you want, and that’s why so many larger teams choose to write bespoke engines and code for their games. When you are a small team like us, it’s wiser to appraise the solutions available and then weigh the advantages: the system that is most efficient, and easiest for the designers to tweak, is almost always the winner.

New Year, New Animations!

Hello everybody! We hope that 2013 is treating you well.

It’s already been a busy month at Big Robot, as you can see. We’ve also had a tricky holiday period, with the phone company leaving Jim without internet for a numbing 37 days! Anyway, we’re back to regular updates, and below we’ve posted a new video showcasing some of the new animations for the Hunter character.

They’re being worked on by the very talented James Benson, whose work some of you may have seen in this Team Fortress machinima. We’re really pleased with the work James is doing on the anims: they have a great deal of life and energy, as well as capturing exactly the right feeling of mechanical menace we were after.

But we haven’t just been waiting on anims from Benson!

This month we’ve been hard at work on important housekeeping tasks, but some cool stuff too. The housekeeping has been making sure the save system correctly stores all the island information, so that everything we need is recorded and can be recreated. Yes, saving the game is utterly crucial, and it’s a big task for Tom to program.

Until now we haven’t been able to save all the island data between sessions. Every time we’ve played or tested we’ve had to generate new worlds – a rapid process, but not one that is representative of the finished game. Finally this week we’ve got the save system sorted and it’s now possible to generate a world comprised of a group of five 1x1km islands, save this data off, play for a while and save any changes we’ve made through our actions.

On the face of it that doesn’t sound that interesting or difficult, but when you think that all the buildings in each village on each island have their own contents inventories – which can be changed by the player adding to or removing items from those inventories – as well as storing all the actual procedurally generated island structure data you can see that our save files are actually quite a task! Okay, okay, it’s a mundane thing. But still essential!
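As a toy illustration of why the files grow (the structure below is invented for this post, not our real save format), every island has to carry both its generation data and every mutable building inventory:

```python
import json

def make_save(islands):
    """Serialise the whole world state to a save blob."""
    return json.dumps({"islands": islands})

def load_save(blob):
    """Restore the world state from a save blob."""
    return json.loads(blob)["islands"]

# Hypothetical example data: one 1x1km island with one village
islands = [
    {
        "seed": 1337,          # enough to regenerate the terrain itself
        "size_km": 1,
        "villages": [
            {
                "buildings": [
                    {"id": "cottage_03",
                     "inventory": ["shotgun shells", "tea caddy"]},
                ]
            }
        ],
    }
]

blob = make_save(islands)
restored = load_save(blob)
# The player takes the shells: this delta must survive the next save too
restored[0]["villages"][0]["buildings"][0]["inventory"].remove(
    "shotgun shells")
```

Multiply that inner structure by every building in every village on five islands and the “mundane” save system starts to look like real work.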

Less mundane is our brand new island generation front end. Until now we’ve been using the Unity IDE to set the parameters for and generate the islands, but this month we’ve completed a first pass of the end-user front end for this – the stuff you will actually be using to generate the islands yourselves. Players will use this front end to dictate the physical make-up of each individual island in the world.

Using a set of sliders and toggles players can now create unique, bespoke islands. So you might decide you want one island to be covered in just hills and forests while another is just lakes and villages. Or maybe you want large open fields and hedges, or tiny stone-walled enclosures like parts of Ireland and Wales. With these new tools you can really create the world you want to play in. We actually see this as a huge step forward because it’s one more step away from using the dev tools and one step closer to the game being a self-contained system. The more stuff like this we do, the closer we are to ‘game’ rather than being chained to the Unity IDE.
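Conceptually – with field names invented for this post, not taken from our actual tools – the front end just fills in a parameter set like this before handing it to the generator:

```python
from dataclasses import dataclass

@dataclass
class IslandConfig:
    """Hypothetical slider/toggle values for one island (0..1 sliders)."""
    hills: float = 0.5
    forest: float = 0.5
    lakes: float = 0.5
    villages: float = 0.5
    field_size: float = 0.5  # small walled enclosures .. large open fields
    hedges: bool = True      # toggle

    def clamped(self):
        """Clamp every slider into its 0..1 range before generation."""
        for f in ("hills", "forest", "lakes", "villages", "field_size"):
            setattr(self, f, min(1.0, max(0.0, getattr(self, f))))
        return self
```

So “just lakes and villages” is simply `IslandConfig(hills=0.0, forest=0.0, lakes=1.0, villages=1.0)` fed to the generator alongside a seed.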

At the moment this new world gen front end is using ‘programmer art’ UI textures and text so it looks horrible, but it works.

That’s progress that is!

More soon.