AI budget tool (a 2014 revisit)

Blast from the past

I originally drafted this blog post a year or so after Splinter Cell: Blacklist launched, while I was still at Ubisoft (around 2014)!

I decided to hold off publishing it since there weren’t any active Splinter Cell projects at the time, and I always figured I’d come back to it and hit publish at a later date.
And well… here we are. Very exciting stuff to come from Ubisoft Toronto and their partner studios 🙂

I’ve left the blog post largely as I wrote it back then, and in hindsight it’s pretty funny to think that I was working in Unreal 2, on a game mode inspired by the Gears of War Horde modes, years before I made the move to The Coalition to work on Gears!

Extraction (Charlie’s Missions)

When working on Splinter Cell: Blacklist, we had guidelines for the number of AI that could be spawned.

So a heavy AI might be worth 1.1 bananas, and a dog 0.7 bananas, with a total banana budget of 10. The numbers roughly mapped to CPU budgets in milliseconds, but the important thing was really the ratio of costs between the various archetypes.

It’s a tricky thing to manage AI budgets across all the game modes and maps, and probably something that the Design department and AI programmers lost lots of sleep over.

Where it got particularly tricky was in the CO-OP Extraction game mode.

The game mode has “waves” (a round of AI that is finished only when all of the enemies have been dealt with).

Within waves there are sub-waves, where AI have various probabilities of spawning, and these sub-waves can start either based off the number of takedowns in the previous sub-wave, or based off time.

Sanaa

So the player, for example, could just happen to let the most computationally expensive enemies live in a sub-wave, take out all the computationally cheap enemies (with tactical knock out cuddles, of course), and the next sub-wave could spawn in and we’d blow our AI budgets!

The team building the co-op maps in our Shanghai studio were great at sticking to the budgets, but this variation in AI spawning was obviously going to be very hard to manage.

Having our QC team just test over and over again to see if the budgets were getting blown was obviously not going to be very helpful.

XML and C#/WPF to the rescue

Luckily, one of the engineers who was focused on Extraction, Thuan Ta, put all of the Extraction data in XML. This is not the default setup for data in the Unreal engine (almost all of the source data is in various binary file formats), but his smart choice saved us a lot of pain.

It made it incredibly easy for me to spend a week(ish) bashing together this glorious beast:

AmmanUI

A feat of engineering and icon design, I hear you say!!
Certainly can never be enough Comic Sans in modern UI design, in my opinion…

What is this I don’t even

Each row is an AI wave that contains boxes that represent varying numbers of sub-waves.

Each sub-wave box contains an icon for each of the different AI types it might spawn, assuming the worst case (most expensive random AI) for that sub-wave (heavy, dog, tech, sniper, regular with a helmet, etc.):

5 icons for different AI types: a Heavy, a dog, a tech, a sniper and a regular with a helmet

The number at the top right of each sub-wave box is the worst case AI cost that can occur in that sub-wave, and it can be affected by enemy units that carry over from the previous sub-wave:

Estimated worst case AI cost for a sub-wave, red arrow pointing to the number on the UI screenshot

So, for example, if sub-wave 1 has a chance of spawning 0-2 heavies and 1-3 regulars, but only to a max of 4 enemies, the tool will assume 2 heavies get spawned (because they are more expensive) and 2 regulars get spawned, to estimate the worst-case AI cost for the sub-wave.

If sub-wave 2 then has a trigger condition of “start sub-wave 2 when 1 enemy in sub-wave 1 is taken out” (killed, or convinced to calmly step away and consider their path in life), then the tool would assume that the player chose to remove a regular in sub-wave 1, not a heavy, because regulars are cheaper than heavies.

Following this logic, the cost of each sub-wave is always calculated from the worst case, all the way to the end of the wave.
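For the curious, the worst-case logic boils down to something like the sketch below. This isn’t the shipping tool code, and the types and numbers are made up for illustration, but the idea is the same: fill each sub-wave with its most expensive possible spawns (up to its cap), and assume the player only ever removes the cheapest survivors when satisfying a trigger condition.

using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch only - hypothetical types, not the actual Blacklist tool code.
class SpawnGroup
{
	public float Cost;     // "bananas" per AI of this archetype
	public int MaxCount;   // worst case: assume the maximum always spawns
}

class SubWave
{
	public List<SpawnGroup> Spawns = new List<SpawnGroup>();
	public int MaxEnemies;          // cap on how many AI this sub-wave can spawn
	public int TakedownsToTrigger;  // enemies that must be removed before this sub-wave starts
}

static class BananaBudget
{
	public static float WorstCaseWaveCost(IEnumerable<SubWave> wave)
	{
		var alive = new List<float>();  // costs of the units assumed to still be alive
		float worst = 0f;

		foreach (SubWave sub in wave)
		{
			// Assume the player removes the *cheapest* survivors to satisfy the trigger condition.
			alive.Sort();
			alive.RemoveRange(0, Math.Min(sub.TakedownsToTrigger, alive.Count));

			// Fill the sub-wave with its most expensive possible spawns, up to the cap.
			alive.AddRange(sub.Spawns
				.SelectMany(g => Enumerable.Repeat(g.Cost, g.MaxCount))
				.OrderByDescending(c => c)
				.Take(sub.MaxEnemies));

			worst = Math.Max(worst, alive.Sum());
		}

		return worst;
	}
}

Running the example above through it (0-2 heavies, 1-3 regulars, capped at 4) keeps the 2 heavies and 2 of the regulars, which is the same reasoning behind the number in the corner of each sub-wave box.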

Long lived

Sometimes you’d want to know, at a glance, which enemies in a sub-wave can live on to the next sub-wave.

If you mouse over the header part of a sub-wave (where the orange circle is below), all the units that are created in that sub-wave are highlighted red, and they stay highlighted in the following sub-waves, indicating the longest they can survive based off the trigger conditions for those sub-waves:

WaveHeader

So in the above case, the heavies that spawn in Wave 15, Sub-wave 1 can survive all the way through to sub-wave 3.

This is important, because if sub-wave 3 was over budget, one possible solution would be to change the condition on sub-wave 2 to require the player to take out one additional unit.

Also worth pointing out, the colour of the sub-wave bar headers is an indication of how close to breaking the budget we are, with red being bad. Green, or that yucky browny green, is fine.
The colour of the bar on the far left (on the wave itself) represents the highest cost of any sub-wave belonging to that wave.
So you can see at a glance if any wave is over budget, and then scroll the list box over to find which sub-wave(s) are the culprits.

Listboxes of listboxes of listboxes

There’s about 300 lines of XAML UI for this thing, and most of it is a set of DataTemplates that set up the three nested listboxes: one containing all the waves, a listbox in each wave for the sub-waves, and a listbox in each sub-wave for the AI icons.

Each of the icon blocks has its own DataTemplate, which just made it easier for me to overlay helmets and shields onto the images for the different AI variants:

<DataTemplate x:Key="EAIShieldedHeavyController_Template" DataType="{x:Type local:Enemy}">
	<Grid>
		<Rectangle Fill="Black" Width="30" Height="30" ToolTip="Heavy + Shield">
			<Rectangle.OpacityMask>
				<ImageBrush ImageSource="pack://application:,,,/Icons/Heavy.png" />
			</Rectangle.OpacityMask>
		</Rectangle>
		<Rectangle HorizontalAlignment="Right" VerticalAlignment="Bottom" Fill="Green" Width="15" Height="15">
			<Rectangle.OpacityMask>
				<ImageBrush ImageSource="pack://application:,,,/Icons/Shield.png" />
			</Rectangle.OpacityMask>
		</Rectangle>
	</Grid>
</DataTemplate>

System.Xml.Linq.Awesome

Probably goes without saying, but even used in the horrible, hard-codey, potentially exception-ridden hacky way I was using it in this application, the XDocument functionality in LINQ makes life really easy 🙂

I definitely prefer it to XPath, etc.

Forgive me for the one-line LINQ query without error handling, but sometimes you’ve got to live on the wild side, you know?:

_NPCTemplates = SourceDirInfo.GetFiles("*.ntmp").Select(CurrentFile => XDocument.Load(CurrentFile.FullName)).ToList();

And with those files, pulling out data (again, with error/exception handling stripped out):

foreach (XDocument Current in _NPCTemplates)
{
	// Get a list of valid NPC names
	foreach (XElement CurrentNPC in Current.Descendants("npc"))
	{
		List<XAttribute> NameAttr = CurrentNPC.Attributes("name").ToList();
		if (NameAttr.Count > 0)
		{
			// Do things!!
		}
	}
}

Conclusion

Although it’s nothing particularly fancy, I really do like it when programmers choose XML for source data 🙂

It makes life really, really easy for Tech Art folk, along with frameworks like WPF that minimize the plumbing work between your data models and views, and make very custom (ugly) interfaces possible using composition in XAML.

Beats trying to create custom combo boxes in DataGrids in Borland C++ at any rate 😛

Also, Comic Sans. It’s the future.

Crumbling tiger, hidden canyon

In the Shangri-La world of Far Cry 4, lots of things crumble and dissolve into powder.

For example, my buddy Tiges here:

TigesAnim.gif

Or, as Max Scoville hilariously put it, the tiger turns into cocaine… NSFW I guess… That gave me a huge chuckle when I first saw it.

I was not responsible for the lovely powder effects in the game; we had a crack team (see what I did there) of Tricia Penman, Craig Alguire and John Lee for all that fancy stuff.
The VFX were such a huge part of the visuals of Shangri-La, and the team did an incredible job.

What I needed to work out was a decent enough way of getting the tiger body to dissolve away.
Nothing too fancy, since it happens very quickly for the most part.

Prototype

I threw together a quick prototype in Unity3d using some of the included library content from Modo:

unitydude.gif

I’m just using a painted greyscale mask as the alpha, then thresholding through it (like using the Threshold adjustment layer in Photoshop, basically).

There’s a falloff around the edge of the alpha, and I’m applying a scrolling, tiled fiery texture in that area.

I won’t go into it too much, as it’s a technique as old as time itself, and there are lots of great tutorials out there on how to set it up in Unity / UE4, etc.
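For completeness, the thresholding itself is only a few lines. This isn’t the shipping shader (and in practice it lives in a pixel shader / material, not C#), but written out plainly the idea is roughly:

// Rough sketch of the per-pixel dissolve logic, written as C# purely for readability.
// mask           = value sampled from the painted greyscale mask (0-1)
// dissolveAmount = how far the dissolve has progressed (0 = intact, 1 = gone)
// edgeWidth      = size of the falloff band where the scrolling fiery texture shows
static class Dissolve
{
	public static float Opacity(float mask, float dissolveAmount)
	{
		// Anything below the current threshold is clipped away.
		return mask < dissolveAmount ? 0f : 1f;
	}

	public static float FireEdgeWeight(float mask, float dissolveAmount, float edgeWidth)
	{
		// 1 right at the dissolve front, fading to 0 over edgeWidth.
		if (mask < dissolveAmount)
			return 0f;

		float distance = mask - dissolveAmount;
		return distance < edgeWidth ? 1f - (distance / edgeWidth) : 0f;
	}
}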

As it turns out, there was already some poster-burning tech that I could use, and it worked almost exactly the same way, so I didn’t need to do the shader work in the end.

You mentioned canyons?

I actually used World Machine to create the detail in the maps.
In the end, I needed to make about 30 greyscale maps for the dissolving effects on various assets.

Workflow

I’ll use my Vortigaunt fellow as an example, since I’ve been slack at using him for anything else or finishing him (typical!).

First up, for most of the assets, I painted a very rough greyscale mask in Modo:

VortigauntRoughMask

Then, I take that into World Machine, and use it as a height map.
And run erosion on it:

WMVort

I then take the flow map out of World Machine, and back into Photoshop.
Overlay the flow map on top of the original rough greyscale mask, add a bit of noise to it.
With a quick setup in UE4, I have something like this:

Vortigone

Sure, it doesn’t look amazing, but for ten minutes’ work it is what it is 🙂

You could spend more time painting the mask on some of them (which I did for the more important ones), but in the end you only see it for a few dozen frames, so I left many of them exactly as they are.

Better-er, more automated, etc

Now that I have the Houdini bug, I would probably generate the rough mask in Houdini rather than painting it.

I.e.:

  • Set the colour for the vertices I want the fade to start at
  • Use a solver to spread the values out from these vertices each frame (or do it in texture space, maybe); something like the sketch after this list.
  • Give the spread some variation based off the roughness and normals of the surface (maybe).
  • Maybe do the “erosion” stuff in Houdini as well, since it doesn’t really need to be erosion, just some arbitrary added stringy detail.
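Here’s a minimal version of that spread step outside of Houdini, just plain C# over an adjacency list (in Houdini it would be a SOP solver with a bit of VEX; all the names here are made up):

using System;
using System.Collections.Generic;

// Sketch of the "spread values out from seed vertices each frame" idea - hypothetical data layout.
// neighbours[i] = indices of the vertices connected to vertex i
// value[i]      = mask value for vertex i (seed vertices start at 1, everything else at 0)
static class MaskSpread
{
	public static void SpreadStep(List<int>[] neighbours, float[] value, float falloffPerHop)
	{
		float[] next = (float[])value.Clone();

		for (int i = 0; i < value.Length; i++)
		{
			foreach (int n in neighbours[i])
			{
				// Each step, a vertex picks up a slightly reduced copy of its most-dissolved neighbour.
				next[i] = Math.Max(next[i], value[n] - falloffPerHop);
			}
		}

		Array.Copy(next, value, value.Length);
	}
}

Run that for however many steps you want and bake value[] down into the rough mask, then add the stringy detail on top as before.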

Again, though, not worth spending too much time on it for such a simple effect.
A better thing to explore would be trying to fill the interior of the objects with some sort of volumetric effect, or some such 🙂
(Which is usually where I’d go talk to a graphics programmer)

Other Examples

I ended up doing this for almost all of the characters, with the exception of a few specific ones (SPOILERS), like the giant chicken that you fight.
That one, and a few others, were handled by Nils Meyer and Steve Fabok, from memory.

So aside from those, and my mate Tiges up there, here are a few other examples.

Bell Chains

BellAnim

Hard to see, but the chain links fade out one by one, starting from the bottom.

This was tricky, because the particular material we were using didn’t support 2 UV channels, and the chain links are all mapped to the same texture space (which makes total sense).

Luckily, the material *did* support changing UV tiling for the Mask vs the other textures.

So we could stack all of the UV shells of the links on top of each other in UV space, like so:

ChainUVs

So then the mask fades from 0 → 1 in V.
In the material, if we have 15 links, we need to tile V 15 times for Diffuse, Normal, Roughness, etc., leaving the mask texture tiled once.

Edwin Chan was working on the assets for these, and I could have just made him manually set that up in Max, but it would have been a bit of a pain, and I’d already asked him to do all sorts of annoying setup on the prayer wheels…

There were 3-4 different bell chain setups, and each of those had multiple LODs for each platform, so I wrote a Maxscript that would pack all the UVs into the correct range.
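The packing itself boils down to remapping each link’s shell into its own 1/N-tall strip in V. Here’s a sketch of that remap, in C# rather than Maxscript (made-up names; the real script obviously worked on the Max UV data directly):

// Hypothetical sketch of the UV packing - the actual version was a Maxscript operating on the link meshes.
static class ChainUVPacking
{
	// Remap a vert's V so that link 'linkIndex' occupies its own 1/linkCount-tall strip.
	// The once-tiled mask then fades across the whole stack of links, while Diffuse/Normal/Roughness
	// (tiled linkCount times in V) still see each shell as the full 0-1 texture.
	public static float PackV(float v, int linkIndex, int linkCount)
	{
		return (v + linkIndex) / linkCount;
	}
}

So with 15 links, link 0 ends up in V 0 to 1/15, link 14 in 14/15 to 1, and tiling V by 15 in the material wraps each strip back onto the original 0-1 shell.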

Quite a lot of work for such a quick effect, but originally the timing was a lot slower, so at that point it was worth it 🙂

Bow gems

BowAnim

Although I never really got this as on-concept as I would have liked, I’m pretty happy with how these turned out.

Amusingly, the emissive material didn’t support the alpha thresholding effect.

So there are two layers of mesh: the glowy one and the non-glowy one.
It’s actually the non-glowy layer that fades out!
The glowy stuff is always there, slightly smaller, hidden below the surface.

Dodgy, but got the job done 😛