The devil is in the decals

October 20, 2016


Frequently when talking about mesh decals in UE4, I get comments about them being annoying to maintain, because every time you change your meshes you have to rebuild / adjust layers of decals.

Now, personally, I don’t really care that much, because my projects are all pretty small, and fixing up decals in Modo is generally a very quick job.

But it’s come up enough that I figured I’d make a “2 metres short of Minimum Viable Product” example of how you could address this.


That’s what I’m calling Houdini Engine + UE4 now, just to continue the tradition of me being annoying.

Right. Houdini stuff.
I made a digital asset:


There are two inputs, which will get fed in from UE4 (later).
In the Houdini scene, input #1 is the object I want to generate a decal on, and input #2 is a projection plane.

The stuff on the left is actually all redundant, but what I was planning to do was construct layout patterns in Houdini for different decals on one sheet, and let Houdini just automatically do the UV layout. But procedural UV’ing got super annoying, so I decided not to do that.


Extrude plane, cookie with box:


Delete faces that are on the opposite side of the projection (dot product driven delete sop, basically).

Since I couldn’t really get the UVs working the way I wanted, I created a centre point on the projection plane, got its normal, constructed U and V vectors, and then projected these onto the verts of the decal mesh.

I did that all in VEX, because it seemed like a good idea at the time.
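The planar projection boils down to a couple of dot products. Here's a rough Python sketch of the idea (the function and parameter names are mine, not from the actual VEX):

```python
# Sketch of planar UV projection: for each decal vertex, measure how far
# it sits along the plane's U and V axes, relative to the plane centre.
# Names are illustrative, not from the actual VEX wrangle.

def project_uvs(points, centre, u_axis, v_axis):
    """Return a (u, v) pair for each 3D point."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    uvs = []
    for p in points:
        rel = [p[i] - centre[i] for i in range(3)]
        uvs.append((dot(rel, u_axis), dot(rel, v_axis)))
    return uvs
```

Scaling or rotating the decal in Unreal then just means transforming the U and V axes before projecting.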

I was fairly annoyed with working on it by this point, so I just exposed the rotation and scale of the decal so you can play with it in Unreal🙂


Back in UE4

With that done, and the thing saved as a Houdini Digital Asset, time to load up a shamefully unfinished UE4 project (there are lots of choices here…).

The workflow is:

  • Load the digital asset into the content browser.
  • Drag a copy into the scene.
  • Using “World Outliner Input”, select a plane for the projection, and an object to put decals on:


Bam! A new decal mesh floats over the top of the original object. You can save it out using the Houdini Engine bake functionality, or whatever else you want to do.


I didn’t bother taking this too far, because I don’t really intend to use it myself, but if I thought it was going to be useful there are a bunch of things I’d probably do.

I mean, aside from completely re-building it from scratch, because it’s a whole bunch of broken hack right now…

  • Expose a few different projection types
  • Create a separate Houdini asset that lets you lay out planes on a decal sheet to define regions for different decals (which I started on)
  • Make it work with multiple planes passed into the one asset

With any luck, Epic will just come along with a similar workflow where you can press a button on a projected decal in editor, and it will do this sort of thing for you🙂

(In the meantime, I’ll just stick with manually doing it in Modo, thanks very much…)


City scanner scene – Breakdown pt3

October 15, 2016


Part 3 of the breakdown of my recent Half-Life 2 Scanner scene.

And now for animation! Also known as “Geoff stumbling around blindly for a week when he really should have watched some UE4 animation tutorials”.

So, erm, feel free to use this as a guide on how *not* to approach making an object float down a hallway…

Down the garden path

Early on, I was trying to work out if I wanted to just animate the whole thing from start to finish in Modo, or do something a little more systemic.

For the sake of trying something different, I settled on having the main movement down the tunnel, rotation of the centre wheel, tail and little flippy bits (technical term) through blueprints, and then blend in a few hand animated bits.

There are three main blueprints that do the work: the Scanner blueprint, the Scanner Path blueprint, and the Scanner Attract Point blueprint.


The division of labour between these things ended up being pretty arbitrary, but the initial idea was that an Attract Point can start playing an animation on the Scanner when it reaches the point, and can also modify the max speed of the Scanner when leaving the point.

Here are the parameters that each point has:


So when the Scanner reaches an attract point, it can pause for a while (I use this time to play an animation, generally). The animation doesn’t start when the scanner reaches the point though, it actually starts at a certain percentage of distance along the previous spline segment leading up to this point.

There is also a blend in and out time for the animation, to give a smooth transition from the manually animated idle.
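The trigger logic is roughly "start the animation once the scanner is a given percentage of the way along the segment leading to the point". A Python sketch (the names and structure are my guesses, not the actual blueprint):

```python
# Sketch of the attract-point trigger: the override animation starts at
# a percentage of distance along the previous spline segment, not at the
# point itself. All names here are illustrative.

def animation_start_distance(segment_start, segment_length, start_pct):
    """Distance along the spline at which to start blending the animation."""
    return segment_start + segment_length * (start_pct / 100.0)

def should_start_animation(distance_along_spline, segment_start,
                           segment_length, start_pct):
    trigger = animation_start_distance(segment_start, segment_length, start_pct)
    return distance_along_spline >= trigger
```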

The animation blueprint itself does very little:


Down the bottom left is the Idle animation that happily ticks away all the time, and that blends with an override animation, which is what the Attract Points set.

Each of the rotators is driven by procedural animation in the Scanner blueprint, which I’ll show in a bit.

Improvements in hindsight

The Idle / Override blending part of this is definitely something I would change in hindsight, because blending in and out of the idle is a mess: the scanner could be on an up or down point when I start.

There are a few ways I could deal with it, including changing the up and down sine wave motion to be driven by the blueprint instead, or just restarting the idle animation when I start blending it back in (or probably a dozen better ways that I don’t know about :P).

Also, the “pause” functionality in the Attract Points is not a great way to do things.
Timing the pause and playing animations took a lot of trial and error; I should instead have sent events out from the animations to trigger the pause.

Custom animations in Modo

There are three custom animations that I made in Modo:

  • Idle
  • A searching left-and-right animation halfway down the tunnel
  • The final “do I see something? I think I see something… SQUIRREL!!”

Everything in the mesh is hard rigged (no deformation), so I just parented all the pieces together, and importing into Unreal generated a skeleton for me.

In Modo, I didn’t do anything particularly exciting this time around, i.e. no interesting rig; I just keyframed the bones.

Modo has export presets for UE4 and Unity now, which is pretty ace!
You can also set up your own presets:


It was pretty fun doing some animation again, it’s something I really don’t do very often, and Modo makes it pretty easy to dive in.


Ok, back in UE4, time to have a quick look over the tick event in the Scanner blueprint.


Unlike my normal lazy self, this time I stitched a bunch of things together, so you can see it in full!

Embrace lag

I wanted quite a few of the animations to be driven by the velocity of the scanner.
I’m calculating the velocity myself anyway, based on set max speed and acceleration values, so I could have just used that value, and built some lag into it.

But I found something more fun:


I have two non-colliding “spring arms” attached to the Scanner root node: one that is slow to catch up (used for body tilt), and one that is fairly quick (used for tail rotation).

This was inspired by some amazingly cool animation work I saw Greg Puzniak doing in Unity3D. It’s not up on his site, but he has cool matcap stuff up there that you should check out!🙂

So in a lot of my blueprint functions, I get the distance from the spring arm to the arrow, and use that to drive laggy rotation:


Despite the comment in the blueprint, when I tried to use this for banking, the ATM swallowed my card (har har har).

So most of my procedural animation is driven this way.
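The spring-arm lag trick can be sketched in Python (a rough translation of the blueprint idea; all names and constants here are mine, and the real version uses UE4 spring arm components with built-in lag):

```python
# Sketch of lag-driven procedural animation: a follower position trails
# behind a target, and the offset between them drives rotation.

def update_follower(follower, target, stiffness, dt):
    """Move the lagging follower a fraction of the way toward the target."""
    return [f + (t - f) * min(stiffness * dt, 1.0)
            for f, t in zip(follower, target)]

def tilt_from_lag(follower, target, degrees_per_unit, max_degrees):
    """Horizontal offset between follower and target drives body tilt."""
    offset = target[0] - follower[0]
    return max(-max_degrees, min(max_degrees, offset * degrees_per_unit))
```

A slow follower gives big, floaty offsets (body tilt); a fast one gives small, snappy offsets (tail rotation).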
When I’m not playing the override animations, everything except for the up and down bobbing is procedural. Which is pretty silly, because that’s a sine wave, so it’s not really an Idle animation, and I should have got rid of it.

The rest of the blueprint is about keeping track of how far along the spline we are, how close to the next point we are (so we can start playing an animation, if necessary), and then setting the world location and rotation of the Scanner from the distance along the spline.

Next-gen volumetric fog

Is something I don’t know much about, so I just made a hacky material UV offset thingo😛

I say “made”, but there are some great light beam and fog examples from Epic included in the engine content, so I grabbed most of this thing from those examples, and here:

Fog Sheet and Light Beams

The scanner has a geometry cone on it that is UV mapped 0->1 U along the length.

I don’t think I really changed much from the content example (I honestly can’t remember), but I did add two parameters that adjust the tiling offset of the noise texture:


As the Scanner moves along the path, it increases the FogForwardOffset which pans the U coordinate of the UVs, so that it looks like the cone is moving through a volume. There’s also always a little bit of panning going on anyway, even when the Scanner is stopped.

As the Scanner rotates, I scroll the V coordinate, just so the noise in the beam doesn’t stay fixed. The rotation isn’t very convincing, so I could probably do a better job of that.
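The offset logic amounts to mapping movement and rotation onto UV panning. A Python sketch of the idea (parameter names are made up, not the actual material inputs):

```python
# Sketch of the fog-cone UV trick: distance travelled pans the U
# coordinate, rotation pans V, and a constant drift keeps the beam
# moving even when the Scanner has stopped.

def fog_uv_offset(distance_travelled, yaw_degrees, time,
                  u_scale=0.1, v_scale=1.0 / 360.0, drift_speed=0.02):
    u = distance_travelled * u_scale + time * drift_speed
    v = yaw_degrees * v_scale
    # Texture coordinates wrap, so keep them in 0..1.
    return (u % 1.0, v % 1.0)
```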

There’s not much going on in the blueprint, but I put whatever I could in there rather than in the material:




The idea of the scene was to be almost entirely lit by the scanner, but I do have a bunch of static lights scatter around too, just to give some ambient light.

There are also two stationary lights in the scene to get highlights where I want them (the lights selected in the screenshot above).
One of them is a spotlight, used to hit the puddle and left wall.

There is also a small light at the front of the tunnel that has “Indirect Lighting Intensity” set to 0, so it doesn’t affect the bounced lighting.
This is the light that hits the scanner here:


This light is quite bright, so when the Scanner hits it, the rest of the environment darkens down, which (hopefully) puts the focus all on the Scanner (yay for auto-exposure!).

There are only two shadow casting lights in the scene, and they are both on the Scanner.
One of them is on a giant point light, and is super expensive, which is the main reason I limited shadow casters everywhere else:


Spinning spotlights

There are also two non shadow casting spotlights on the side of the scanner that rotate and project patterns on the wall.

For some reason that I can’t remember, I decided to generate the pattern in a material, rather than do the smart thing and use a texture.


I’m modifying the “VectorToRadialValue” function to generate bands, then fading it out in the middle bit.

Seriously though, unless you have a really good reason, you should probably do this in a texture🙂
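For the curious, the banding idea looks something like this in Python (this mimics the concept of the modified VectorToRadialValue, not the actual material graph):

```python
# Sketch of the spotlight band pattern: convert the angle around the
# centre into alternating stripes, and fade the middle bit out.

import math

def band_pattern(x, y, num_bands=12, inner_fade=0.2):
    radius = math.hypot(x, y)
    angle = math.atan2(y, x) / (2.0 * math.pi) + 0.5   # 0..1 around the circle
    stripe = 1.0 if (angle * num_bands) % 1.0 > 0.5 else 0.0
    fade = min(radius / inner_fade, 1.0)               # kill the centre
    return stripe * fade
```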


So I *think* that’s it!

I’m sure there are things I’ve missed, or glossed over a bit, so feel free to ask questions in the comments and I’ll fill in the gaps.



City scanner scene – Breakdown pt2

October 13, 2016


This is part 2 of the breakdown for my recent scene Half-Life 2 scanner scene (part 1 here).

This time, I’m going to focus on the Houdini web setup.

Although it took me a while to get a very subtle result in the end, it was a fun continuing learning experience, and I’m sure I’ll re-use a bunch of this stuff!

Go go Gadget webs!

I saw a bunch of really great photos of spider webs in tunnels (which you can find yourself by googling “tunnel cobwebs concrete” :)).

I figured it would be a fun time to take my tunnel into Houdini, and generate a bunch of animated hanging webby things, and bring them back into UE4.

This fun time ended up looking like a seahorse:


I will break this mess down a bit🙂

Web starting points


I import the geometry for the tunnel and rails, and scatter a bunch of points over it, setting their colour to red.

On the right hand side of the seahorse is a set of nodes for creating hanging webs, which is just some straight down line primitives, with a few attributes like noise and thickness added to them.
I’ll come back to these later:


In the top middle of the seahorse, I have a point vop apply two layers of noise to the colour attribute, and also blend the colour out aggressively below the rails, because I only wanted webs in the top half of the tunnel.

The web source points look like this:


From these points, I ray cast out back to the original geometry.

Ray casting straight out of these points would be a little boring, though, so I made another point vop that randomizes the normals a little first:


After this, I have a few nodes that delete most of the points generated from the pipe connections: they have a high vertex density, compared to every other bit of mesh, so when I first ran the thing, I had a thousand webs on the pipe connections.
I also delete really small webs, because they look lame.

We are now at seahorse upper left.

Arcy Strangs.


Not sure what I was thinking when naming this network box, but I’m rolling with it.

So anyway, the ray cast created a “dist” attribute for distance from the point to the ray hit, in the direction of the normal.

So my “copy1” node takes a line primitive, copies it onto the ray points, and sets the length of each line to the “dist” attribute (my word, stamping is such a useful tool in Houdini).


Before the copy, I set the vertex red channel from black to red along the length of the line, just for convenience.

Previously, up the chain, I found the longest of all the ray casts and saved it off in a detail attribute. This is very easy to do with Attribute Promote, using Maximum as the Promotion Method.

So, I now define a maximum amount of “droop” I want for the webs, a bit of random droop, and then I use those values to move each point of each web down in Y a bit.


I sample that ramp parameter up there using the web length, and then multiply that over the droop, so that each end of the web remains fastened in place.
And I don’t really care if webs intersect with the rails, because that’s just how I roll…
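The droop calculation might look something like this Python sketch (the ramp shape and all the names are my guesses at the idea, translated out of the VOP network):

```python
# Sketch of the web droop: a ramp that is zero at both ends keeps the
# web fastened in place, and longer webs droop further.

def drooped_point(y, t, web_length, longest_length,
                  max_droop=0.5, random_droop=0.0):
    """Lower a point at parameter t (0..1) along the web."""
    ramp = 4.0 * t * (1.0 - t)          # 0 at both ends, 1 in the middle
    droop = (max_droop * web_length / longest_length) + random_droop
    return y - ramp * droop
```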

Fasten your seatbelts, we are entering seahorse spine.

Cross web connecty things


For each of the webs in the previous section, I create some webs bridging between them.
Here’s the network for that.


I use Connect Adjacent Pieces, using Adjacent Pieces from Points, letting the node connect just about everything up.

I use a carve node to cut the spline up, then randomly sort the primitives.

At this point, I decided that I only wanted two connecting pieces per named web, and I got lazy, so I wrote VEX for this:

string PickedPieces[];
int PieceCount[];

int MaxPerPiece = 2;
int success = 0;

addprimattrib(geoself(), "toDelete", 0, "int");

for (int i = 0; i < nprimitives(geoself()); i++)
{
    string CurrentName = primattrib(geoself(), "name", i, success);

    int FindIndex = find(PickedPieces, CurrentName);
    if (FindIndex < 0)
    {
        // First connection we've seen for this web: remember it.
        push(PickedPieces, CurrentName);
        push(PieceCount, 1);
    }
    else
    {
        // Seen this web before: mark any extra connections for deletion.
        int CurrentPieceCount = PieceCount[FindIndex];
        if (CurrentPieceCount >= MaxPerPiece)
        {
            setprimattrib(geoself(), "toDelete", i, 1, "set");
        }
        PieceCount[FindIndex] = CurrentPieceCount + 1;
    }
}

So that just creates an attribute on a connecting piece called “toDelete”, and you can probably guess what I do with that…

The rest of the network is the same sort of droop calculations I mentioned before.

One thing I haven’t mentioned up to this point, though, is that each web has a “Primitive ID” attribute. This is used to offset the animation on the webs in UE4, and the ID had to get transferred down the chain of webs to make sure they don’t split apart when one web meets another.

At this point, I add a bunch of extra hanging webs off these arcy webs, and here we are:


Then I dump a polywire in, and we’re pretty much good to go!

Well… Ok. There’s the entire seahorse tail section.

For some reason, Polywire didn’t want to generate UVs laid out along the web length.

I ended up using a foreach node on each web, stacking the web sections up vertically in UV space, using a vertex vop, then welding with a threshold:


Since I have the position, 0-1, along the current web, I could use that to shift the UV sections up before welding.

With that done on every web, my UVs look like this:


Which is fine.
When I import the meshes into UE4, I just let the engine pack them.

Seriously, though… These are the sorts of meshes for which I really wish I could just bake lighting to vertex colours in UE4 instead of a lightmap.
It would look better, and would have saved me lots and lots of pain…

And here we are, swing amount in red vertex channel, primitive offset (id) in green:


Web contact meshes

I wanted to stamp some sort of mesh / decal on the wall underneath the hanging meshes.
If you have a look back at the top of the seahorse, you might notice an OUT_WebHits node which contains all the original ray hits.

I’m not going to break this down completely, but I take the scatter points, bring in the tunnel geometry, and use the scatter points to fracture the tunnel.

I take that, copy point colour onto the mesh, and subdivide it:


Delete all the non red bits, push the mesh out along normals with some noise, polyreduce, done🙂


I could have done much more interesting things with this, but then life is full of regrets isn’t it?

Back to UE4

So, export all that stuff out, bring it into UE4.

Fun story, first export I did was accidentally over 1 million vertices, and the mesh still rendered in less than half a millisecond on a GeForce 970.
We are living in the future, people.


Most of this material is setting up the swinging animation for the webs, using World Position Offset.

There are two sets of parameters for everything: one for when the web is “idle”, and one for when it is being affected by the Scanner being near it.

To pass the position of the scanner into the material, I have to set up a Dynamic Material Instance, so this is all handled in the web blueprint (which doesn’t do much else).

It also passes in a neutral wind direction for when the webs are idle, which I set from the forward vector of an arrow component, just to make things easy:


So now that I have the scanner position, for each vertex in each web I get the distance between it and the scanner, and use that to lerp between the idle and the “windy” settings.

All of these values are offset by the position id that I put in the green channel, so that not all of the webs are moving at exactly the same time.
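Put together, the vertex logic is roughly this (a Python sketch of what the World Position Offset material does; the names, constants and falloff shape are all mine):

```python
# Sketch of the web wind: distance to the scanner blends between the
# idle and "windy" amplitudes, and the per-web id offsets the phase so
# the webs don't all move in sync.

import math

def lerp(a, b, t):
    return a + (b - a) * t

def web_sway(time, distance_to_scanner, primitive_id,
             idle_amp=0.02, windy_amp=0.2, influence_radius=300.0):
    # 1.0 when the scanner is on top of the web, 0.0 beyond the radius.
    influence = max(0.0, 1.0 - distance_to_scanner / influence_radius)
    amplitude = lerp(idle_amp, windy_amp, influence)
    phase = primitive_id * 1.7            # arbitrary per-web offset
    return amplitude * math.sin(time + phase)
```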

Still to come…

Animation approach from Modo to blueprints, lighting rig for the scanner, all the fun stuff!🙂

City scanner scene – Breakdown pt1

October 12, 2016


In this post, I’ll go through the construction of the environment for my recently posted Half Life 2 scanner scene.

The point of this project was really just to do a bit of animation on my scanner, and show it off in a simple environment. I can’t remember the last time I did any animation, but my guess would be when I was studying at the AIE over ten years ago🙂

So with that in mind, figuring I was going to struggle with the animation side, I wanted to keep the environment dead simple. It was always going to be dark, anyway, since I wanted the scanner to light the scene!

Modelling / texturing the tunnel

I looked up a bunch of photo reference for cool tunnels in Europe, presumably the sort of thing that the resistance in city 17 would have used🙂

I blocked out basic lighting, camera setup, and created the tunnel out of cubes in UE4.
Once I was happy with the layout, I could then just export the blocked out mesh to FBX to use as a template in Modo:


I also took the time to make a really basic animatic.
I changed the path of the scanner quite a bit, and the timing, etc., but I still found this to be useful:

Anyway, at this point, the scene blockout is in Modo, and I can start building geometry:


The geometry itself is dead simple, so I won’t go into that too much, I just extruded along a spline, then beveled and pushed a few edge loops around🙂

I always use the sculpt tools to push geometry around a little, just to make things feel a bit more natural. Here specifically I was sinking some of the vertices on the side pathways:


Layered vertex painted materials can be expensive, so I wanted to avoid going too far down that path.
In the end, I settled on having two layers: concrete, and moldy damp green stuff:


The green stuff is vertex paint blended on, and the vertex colours for the mask were painted in UE4 rather than in Modo, just because it is quick and easy to see what I’m doing in the editor.

Most of the materials in the scene were made in Substance painter.
And I’m lazy, so they are usually a couple of layers with procedural masks, and one or two hand painted masks🙂


Water plane


For the purposes of this scene, I could get away with a pretty low tech / low quality water plane. As long as it had some movement, and is reflective, then it would do!

The engine provides flow map samples and functions in the content examples, so I just used those. I’ve written my own before (and by that, I mean I copied what they were doing in the Portal 2 Water Flow presentation from SIGGRAPH 2010), but the UE4 implementation does exactly what I wanted🙂

And seriously, if you haven’t looked at that presentation, go do it.
They used Houdini to generate water flow, but I’m lazy and ain’t got time for that! (Not for this scene, at any rate).

I just generated mine in Photoshop, using this page as a guide:

Photoshop generated flow maps

At some point, I’d like to see if I can set up the same workflow in Substance Painter and/or Houdini.

Anyway, the material is a bit messy (sorry):


I’m passing the flowmap texture and some timing parameters into the flowmaps material function, and getting a new normal map out of it.

The only other thing going on here is that I have a mask for the edges of the water, where it is interacting with the walls. I blend in different subsurface colour, normal strength and roughness at the edges.

Fog planes


I’ve got a few overlapping fog planes in the scene, with a simple noisy texture, offset by world position (having a different offset on each makes it feel a little more volumetric).

Much like the water, the fog plane has a subtle flow map on it, to fake a bit of turbulence, and the material uses depth fade on opacity to help it blend with the surrounding geometry:


UE4 4.13 mesh decals

I was going to use a bunch of the new 4.13 features originally, but in the end I think the only one I used was “mesh decals”.

These are decals in the old school sense, not the projected decals that UE4 users have probably come to love. In the back of my mind, I had thought I might turn this into a VR scene at some point, and the cost of projected decals is a somewhat unknown quantity for me at the moment.

The main advantage of mesh decals, vs floating bits of geometry with Masked materials, is that mesh decals support full alpha blending.

In these shots, the water puddle, stain and concrete edge damage are all on part of the same decal sheet:

The decals are all using Diffuse, Normals, Roughness, Metallic and Occlusion (the last three packed together):

I built the decals one at a time, without much planning, basically guessing at how much texture space I thought I was going to need (I didn’t bother setting a “texels per metre” type of limit for my project, but that probably would have been sensible).

Each time I wanted a new mesh decal, I’d work out in Modo how big I want it first:


Then I’d copy it into a separate Modo scene just for Decal Layout which I take into Substance Painter.
I just did this so I could keep all the mesh together in one space, to keep it easy for painting:


And then here is the scene in Substance:


And here is the scene with and without decals:


What’s great about this, is that mesh decals don’t show up in Shader Complexity, so the tech artists on the project will never know… (I kid, I kid. They will find them in PIX, and will hunt you down and yell at you).

I really like this approach to building wear and tear into materials. The first time I saw this approach was when I was working at Visceral Games in Melbourne, and the engine was very well optimized to handle a pretty huge amount of decals. I didn’t embrace it as much as I should have, back then.


A few years back, I made a blueprint for pipes that allowed joining sections, etc.
So I knocked together a model in Modo for the connection pieces:


Edge-weighted sub-d, of course, because I can’t help myself🙂
I even started sculpting in some heavy rust, but had to have a stern word to myself about not spending too much time on stuff that isn’t even going to be lit…

Textured in Substance Painter:


Same dealio with the pipe segments:


Then I just built the spline in the editor, and set it up like in my old blog post.

Much like I did with the original blockout geometry, I also exported the final pipes back out to Modo so that I could use them to work out where I wanted to put some decals.

The only other thing that was a pain, was that the pipes need lightmaps, but I couldn’t work out a way to generate unique UVs for the final pipe mesh.

In the end, I just used the merge actors function in the editor, so that they all became a single static mesh, and let Unreal generate lightmap UVs.


Did you notice that there were hanging spider webs in the scene?
No? Good, because I don’t like them much😛

I probably spent 10-20 hours just messing about with these silly things, but at least I got some fun gifs out of them:


Next up…

I’ll break down the construction of those web things, might be useful for a scene full of badly animated vines, I suppose…

I’ll also go through all of the silly things I did on the animation / blueprint / lighting side.

City Scanner scene

October 10, 2016

Had a bit of fun making a scene for my Half Life 2 city scanner.
Will do some break downs of the scene here on my blog at some point🙂

City Scanner

August 27, 2016

Since I had so much fun with the last Modo / Substance project I did, thought I’d do another one🙂

This time, I decided to make a City Scanner, from Half Life 2.
It’s a work in progress, and I’ll keep posting regular screenshots up on my twitter, but here’s where I’m currently at:


I could have been smart, and just grabbed the model from Source and built around it, but I need to practice building things from scratch, so I built it based off a bunch of screenshots I grabbed out of the game.

It has quite a few differences from the original, which I’m going to pretend are due to creative license, rather than me screwing up proportions, etc. (I particularly hate the green side panel, and some of the rear details, but I’m not going to fix the modelling on those at this point)…

Building the model

As with everything I do, this was built as an edge-weighted Catmull-Clark subdivision surface, in Modo 10.

Whenever working on these things, I tend to throw some basic Modo procedural materials and render them out, so here’s where I was at by the end of the high poly process:


Once I was happy with the model (read: sick of working on it :P), I created the low poly mesh for it, and unwrapped the thing.

Unwrapping aside, this didn’t take a huge amount of time, because I just used the base sub-d cage, and stripped out a bunch of loops.
It’s pretty heavy still, at about 7000 vertices, but it’ll do!

Painter work

I could have baked the procedural materials out of Modo, and painted over the top of them, etc (Modo actually has some great baking and painting tools these days), but I need to keep using painter more.

Probably the largest amount of time I spent from this point on was splitting the high and low poly up into lots of different meshes so that I could bake all the maps I needed in Substance Painter.

Models with lots of floating, yet welded intersecting parts are a little bit of a pain for this sort of thing, but I got there eventually.

From Modo, I baked out a Surface ID mask (actually, I used a Diffuse render output, and had flood fill colours on all my materials, but I use it as a Surface ID mask in Painter):


For each of the colour blocks, I set up a folder in Painter that had a Colour Selection mask on it:


And then I just stack up a bunch of flood fill colour layers with masks until I’m happy.

There’s not a lot of actual painting going on here, at this point, although I do always paint out some parts of the procedural masks, because having even edge wear across the whole model looks pretty silly.

That said, smart masks with flood fill layers aren’t a bad way to flesh out basic wear and tear, etc:


I still need to paint out more of the wear and tear on my model, and put more colour variation in, it looks a little like it has been in a sandstorm, then thrown down some stairs🙂



Aside from some issues with Reflection Capture Actors (having emissive materials in a scene can mess them up a bit), I really didn’t do much except throw the exported textures from Substance onto the mesh, and put a few lights in.

I did mess about with the texels per pixel, min and fade resolutions, and radius thresholds of the shadow casters a bit, because the default settings for shadows in UE4 are pretty low quality for some reason, even on Epic settings.

The material is really boring at the moment, the only thing it exposes is a multiplier for emissive:


Next steps

I will probably animate this in UE4 at some point, and have it floating around, flashing lights, etc.
And it will end up as a minor piece in an environment at some point, hopefully :)

For now, though, I want to continue down the fun path of Modo sub-d/Substance, so I will probably start working on a new model.

Watch this space, and/or twitter 🙂



Modo 10 on the move

June 19, 2016

A month ago, I had a fun adventure taking a train across Canada (which I can highly recommend, by the way).

I’ve moved from Toronto to Vancouver, so I’ve been sans PC for a few months.

Never fear, though, I could still run Modo on my trusty Surface Pro 1 :)


One of the stops along the way was in Winnipeg.
I had two tasks while there, getting some t-shirts, and finding something to model in Modo (well ok, three, if you include a milkshake at VJ’s).

I decided on this auto-sprinkler thing:


The plan was to do most of the modelling work with standard Pixar sub-d stuff in Modo 901 while on the train.

After I arrived in Vancouver, though, I upgraded to Modo 10, which gave me some fun new tools to play with!

Procedural Text

Non-destructive modelling along the lines of Max’s stack and/or Maya’s history is something that has been discussed for a long time in Modo circles, and it has landed in Modo 10!

So, once the main mesh was finished, I could select an edge loop in the sub-d mesh, use Edges to Curves to create a curve to place the text around.

Then, in a new procedural text item, I reference in the curve, and use it with a Path generator and a path segment generator to wrap the text around the sprinkler base plate:


I couldn’t work out a procedural way to get those letters rotated correctly, so I just fixed that up manually afterwards.

Fusey fuse

Since I wanted the text to be extruded from the surface and to look like it is all one piece, I decided to use Modo’s Mesh Fusion to Boolean the text on:


Since the mesh was a sub-d mesh, I didn’t really need to make a low poly, I just used the cage.
Well… Technically I should probably still make a low poly (the cage is 3500 vertices, which is pretty heavy), but it’s amazing what you can get away with these days, and soon we will have edge weighted sub-d in all major engines anyway (we won’t… But if I say it enough times, maybe it will happen??):


At this point, I unwrapped the cage, to get the thing ready for texturing.

Substance Painter time

I won’t go too much into the process here, because my approach is generally along the lines of: stack up dozens of procedural layers, and mess about with numbers for a few hours…

Since I could not be bothered rendering out a Surface ID map from Modo, I quickly created some base folders with masks created from the UV Chunk Fill mode in the Polygon Fill tool.

So in about 10 minutes I had a base set of folders to work with, and some materials applied:


Hello weird bronze liney thing.
Looks like someone forgot to generate world space normal maps…

Anyway, I went with a fairly standard corroded bronze material for the main body, and tweaked it a little.
Then added a bunch more procedural layers, occasionally adding paint masks to break them up here and there when I didn’t feel like adding too many more procedural controls.

There’s about 30 layers all in all, some on pretty low opacity:


And here’s what I ended up with in Painter:


Pretty happy with that🙂
Could do with some more saturation variation on the pink bits, and the dirt and wear is a bit heavy, but near enough is good enough!

Giant hover sprinkler of doom

And here it is in UE4, really large, and floating in the air, and with a lower resolution texture on it (because 2048 is super greedy :P):


Speaking of UE4: Modo 10 has materials that are compatible with the base materials in Unreal and Unity now, so you can have assets look almost identical between the two bits of software.

Which is pretty neat. I haven’t played with that feature, but I feel like it will be particularly nice for game artists who want to take Unreal assets into Modo, and render them out for folio pieces, etc.

Factory – pt 4 – (Trimming the flowers)

April 10, 2016

Part 4 of


Alpha card objects

In most games, you have some objects that have on/off alpha transparency, generally for objects that you wouldn’t model all the detail for (leaves, flowers, etc).


^ Exactly like that, beautiful isn’t it?
Years of art training not wasted at all…

Also referred to as punch-through / 1-bit / masked materials, btw.
So, you can see that the see-through portion of that polygon is pretty large.

When rendering these types of assets, you are still paying some of the cost for rendering all of those invisible pixels. If you are rendering a lot of these on screen, and they are all overlapping, that can lead to a lot of overdraw, so it’s fairly common to cut around the shape to reduce this, something like this:


What does this have to do with the factory?
Aren’t you supposed to be building a factory?

I get distracted easily…

I’m not really planning to have a lot of unique vegetation in my factory scene, but I am planning on generating a bunch of stuff out of Houdini.

When I create LODs, that will be in Houdini too, and the LODs will probably be alpha cards, or a combination of meshes and alpha cards.

When I get around to doing that, I probably don’t want to cut around the alpha manually, because… Well, because that sounds like a lot of work, and automating it sounds like a fun task🙂

Houdini mesh cutting tool

The basic idea is to get my image plane, use voronoi fracture to split up the plane, delete any polygons that are completely see-through, export to UE4, dance a happy dance, etc.

For the sake of experiment, I want to try a bunch of different levels of accuracy with the cutting, so I can find a good balance between vertex count, and overdraw cost.

Here’s the results of running the tool with various levels of cutting:


Here’s what the network looks like, conveniently just low resolution enough so as to be totally no use… (Don’t worry, I’ll break it down :))


The first part is the voronoi fracture part:


I’m subdividing the input mesh (so that I end up with roughly a polygon per pixel), then use an Attribute VOP to copy the alpha values from the texture onto the mesh, then blur it a bunch:


I scatter points on that, using the alpha for density, then I join it with another scatter that is just even across the plane. This makes sure that there are enough cuts outside the shape, and I don’t get weird pointy polygons on the outside of the shape.
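As a rough plain-Python sketch of that two-part scatter: density-driven points via rejection sampling against the alpha, plus an even scatter across the whole plane so the fracture also gets cut sites outside the shape. The alpha function and point counts here are made up for illustration.

```python
import random

def scatter_points(alpha_at, n_density, n_uniform, seed=0):
    """Scatter 2D points in the unit square: n_density points biased by an
    alpha/density function (rejection sampling), plus n_uniform points
    spread evenly, so cuts also land outside the shape."""
    rng = random.Random(seed)
    points = []
    # Density-driven scatter: keep a candidate with probability alpha(x, y).
    while len(points) < n_density:
        x, y = rng.random(), rng.random()
        if rng.random() < alpha_at(x, y):
            points.append((x, y))
    # Even scatter across the whole plane, ignoring alpha.
    for _ in range(n_uniform):
        points.append((rng.random(), rng.random()))
    return points

# Hypothetical alpha: a solid disc in the middle of the plane.
disc = lambda x, y: 1.0 if (x - 0.5) ** 2 + (y - 0.5) ** 2 < 0.1 else 0.0
pts = scatter_points(disc, 50, 20)
```

Tuning the two counts against each other is exactly the “even spread points quite low” experiment shown below.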

Here is an example where I’ve deliberately set the even spread points quite low, so you can see the difference in polygon density around the edges of the shape vs inside the shape:


Counting up the alpha

So, earlier, I mentioned that I subdivided the input mesh and copied the alpha onto it?
I’ll call this the pixelated alpha mesh, and here’s what that looks like:



Next, I created a sub network that takes the pixelated alpha mesh, pushes it out along its normals (which in this case, is just up), ray casts it back to the voronoi mesh, and then counts up how many “hits” there are on each voronoi polygon.

Then we can just delete any polygon that has any “hits”.

Here is that network:


After the ray sop, each point in the pixelated alpha mesh has a “hitprim”, which will be set to the primitive id that it hit in the voronoi mesh.

I’m using an Attribute SOP to write into an integer array detail attribute on the voronoi mesh for each “hitprim” on the pixelated alpha mesh points, and here’s that code:

int success = 0;
int primId = pointattrib(0, "hitprim", @ptnum, success);

// Read the current detail array, mark the prim this point hit,
// then accumulate it back with the "add" merge mode.
int primhits[] = detail(0, "primhits");

if (primId >= 0)
{
    setcomp(primhits, 1, primId);
    setdetailattrib(0, "primhits", primhits, "add");
}
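The same accumulate-and-count idea, sketched in plain Python (the per-point “hitprim” data here is hypothetical; -1 stands in for a ray that missed):

```python
from collections import Counter

def count_prim_hits(hitprims):
    """Accumulate hits per voronoi primitive, like the VEX
    detail-attribute "add" trick. hitprims holds one 'hitprim'
    value per point of the pixelated alpha mesh; -1 means the
    ray missed."""
    hits = Counter()
    for prim_id in hitprims:
        if prim_id >= 0:
            hits[prim_id] += 1
    return hits

def prims_with_hits(hitprims):
    """Primitive ids that were hit at least once -- the ones the
    tool then deletes."""
    return sorted(count_prim_hits(hitprims))

# Hypothetical ray-cast results for 8 points.
hitprims = [0, 0, -1, 2, 2, 2, -1, 5]
hit_ids = prims_with_hits(hitprims)
```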

After all that stuff, I drop down a “remesh” node, which simplifies the mesh a lot.

And back to UE4…

So, with all the above networks packaged into a Digital Asset, I could play with the parameters (the two scatter values), and try out a few different levels of cutting detail, as I showed before:


I’m creating a rather exaggerated setup in UE4 with an excessive amount of overdraw, just for the purposes of this blog post.

For now, I’ve made the alpha cards huuuuuuuuge, and placed them where I used to have the flowers in my scene:


Then, all I need to do is swap in each different version of my alpha card, and then GPU profile!


The camera shot, without any alpha plane cutting optimization, took about 12 ms.

Test1, which is 27 vertices, seemed to be the best optimization. This came in at about 10.2 ms, so a saving of about 1.8 ms, which is pretty great!

I was actually expecting Test2 to be the cheapest, since it chops quite a bit more off the shape, and at 84 vertices I didn’t think the extra vertex cost would even register on a GTX 970. Turns out I was wrong, Test2 was marginally more expensive!

This just goes to show, never trust someone about optimization unless they’ve profiled something😛

Test3, at 291 vertices, costs about another 0.3 ms.
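To sanity-check those numbers with some quick arithmetic (baseline and Test1 timings quoted from the profiler captures above):

```python
# Back-of-envelope math on the profiler captures.
baseline_ms = 12.0   # uncut alpha planes
test1_ms    = 10.2   # the 27-vertex cut mesh

saving_ms   = baseline_ms - test1_ms   # ~1.8 ms back
frame_share = saving_ms / baseline_ms  # ~15% of this (exaggerated) capture
```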



Of course, the savings are all quite exaggerated, but in a “real world” scenario I would probably expect to have a lot more instances, all a lot smaller. In which case, going with the lower vertex count mesh seems like it would still make sense (although I will, of course, re-profile when I have proper meshes).

Lots of more fun things to do with this: get it working on arbitrary meshes (mostly working), see if I can use Houdini engine and integrate it into UE4, etc.
Still not sure how much vegetation I’ll have in my factory scene, but I think this will still be useful🙂




Factory – pt 3 (early optimisation is something something)

March 3, 2016

Part 3 of

Optimizing art early, before you have a good sense of where the actual expense of rendering your scene is, can be a pretty bad idea.

So let’s do it!!


I’ll do it #procedurally.
Sort of.

20 gallons of evil per pixel

My ground shader is pretty expensive. It’s blending all sorts of things together, currently, and I still have things to add to it.

I don’t want to optimize the actual material yet, because it’s not done, but it looks like this and invokes shame:


As a side note here, this material network looks a bit like the Utah teapot, which is unintentionally awesome.

Every pixel on this material is calculating water and dirt blending.

But many of those pixels have no water or dirt on them:


So why pay the cost for all of that blending across the whole ground plane?
What can I do about it?

Probably use something like the built in UE4 terrain, you fool

Is probably what you were thinking.
I’m sure that UE4 does some nice optimization for areas of terrain that are using differing numbers of layers, etc.

So you’ve caught me out: The technique I’m going to show off here, I also want to use on the walls of my factory, I just haven’t built that content yet, and I thought the ground plane would be fun to test on🙂

Back to basics

First up, I want to see exactly how much all of the fancy blending is costing.

So I made a version of the material that doesn’t do the water or the dirt, ran the level and profiled them side by side:


^ Simple version of my material vs the water and dirt blending one.


So, you can see above that the material that has no dirt/water blending is 1.6 milliseconds cheaper.

Now, if I can put that material on the areas that don’t need the blending, I can’t expect to get that full 1.6 milliseconds back, but I might get 1 millisecond back.

That might not sound like much, but for a 60 fps game, that’s about 1/16th of the entire scene time.

Every little bit helps: clawing that much time back by cutting content alone can take many hours🙂
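That “1/16th” figure is just the 60 fps frame budget:

```python
# What a ~1 ms saving means against a 60 fps frame budget.
frame_budget_ms = 1000.0 / 60.0   # ~16.67 ms per frame at 60 fps
share = 1.0 / frame_budget_ms     # 1 ms is roughly 1/16th of the frame
```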

Splitting the mesh

To put my cheap material onto the non-blending sections, I’ll split the mesh around the areas where the vertex colour masks have a value of 0.

Luckily, the ground plane is subdivided quite highly so that it plays nice with UE4 tessellation and my vertex painting, so I don’t need to do anything fancy with the mesh.

Back to Houdini we go!


So, anything that has > 0 sum vertex colour is being lifted up in this shot, just to make it obvious where the mesh split is happening.

Here’s the network:


The new nodes start at “Attribcreate”, etc.

The basic flow is:

  • “Colour value max” is set as max(@Cd.r, @Cd.g), per point, so it will be set to some value if either dirt or water are present.
  • Two new Max and Min attributes per polygon are created by promoting Colour Value Max from Point –> Polygon, using Min and Max promotion methods (so if one vertex in the polygon has some dirt/water on it, then the Max value will be non-zero, etc.)
  • The polygons are divided into three groups: Polygons that have no vertices with any blending, Polygons that have some blending, Polygons that have all verts that are 100% blending.
  • NOTE: For the purposes of this blog post, all I really care about is if the Polygon has no dirt/water or if it has some, but having the three groups described above will come in handy in a later blog post, you’ll just have to trust me🙂
  • The two groups of polygons I care about get two different materials applied to them in Houdini.
    When I export them to UE4, they maintain the split, and I can apply my cheaper material.
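A plain-Python sketch of that grouping logic, for the curious (the per-vertex blend values, where blend = max(dirt, water), are hypothetical):

```python
def classify_polygons(polys):
    """polys: one list of per-vertex blend values per polygon,
    where blend = max(dirt, water) for that vertex.
    Returns three lists of polygon indices: no blending anywhere,
    partial blending, and fully blended."""
    none, partial, full = [], [], []
    for i, blends in enumerate(polys):
        lo, hi = min(blends), max(blends)
        if hi == 0.0:
            none.append(i)      # gets the cheap material
        elif lo >= 1.0:
            full.append(i)      # could get a water-only material later
        else:
            partial.append(i)   # needs the full blending material
    return none, partial, full

# Hypothetical per-vertex blend values for four quads.
quads = [[0, 0, 0, 0], [0, 0.3, 0, 0], [1, 1, 1, 1], [0.2, 1, 1, 0.9]]
groups = classify_polygons(quads)
```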

So, re-exported, here it is:

Looks the same?

Great, mission successful! Or is it…

Checking the numbers

Back to the GPU profiler!


Ok, so the column on the right is with my two materials, the column in the middle is using the expensive material across the whole ground plane.

So my saving was a bit under one millisecond in this case.
For an hour or two of work that I can re-use in lots of places, I’m willing to call that a success🙂

Getting more back

Before cleaning up my shader, there’s a few more areas I can/might expand this, and some notes on where I expect to get more savings:

  • I’ll have smaller blending areas on my final ground plane (less water and dirt) and also on my walls. So the savings will be higher.
  • I might mask out displacement using vertex colours, so that I’m not paying for displacement across all of my ground plane and walls.
    Flat wall sections that aren’t on a corner of the building, and/or are more than a few metres from the ground, can probably go without displacement.
  • The centre of the water puddles is all water: I can create a third material that just does the water stuff, and split the mesh an extra time.
    This means that the blending part of the material will be just the edges of the puddles, saving quite a lot more.

So all in all, I expect I can claw back a few more milliseconds in some cases in the final scene.

One final note, the ground plane is now three draw calls instead of one.
And I don’t care.
So there.🙂







Factory – pt 2 (magical placeholder land)

February 17, 2016

Part 2 of:


I had to split this post up, so I want to get this out of the way:
You’re going to see a lot of ugly in the post. #Procedural #Placeholder ugly🙂

This post is mostly about early pipeline setup in Houdini Python, and UE4 c++.

Placeholder plants

For testing purposes, I made 4 instances of #procedural plants using l-systems:


When I say “made”, I mean ripped from my Shangri-La tribute scene, just heavily modified:

Like I mention in that post, if you want to learn lots about Houdini, go buy tutorials from Rohan Dalvi.
He has some free ones you can have a run through, but the floating islands series is just fantastic, so just buy it😛

These plants I exported as FBX, imported into UE4, and gave them a flat vertex colour material, ’cause I ain’t gonna bother with unwrapping placeholder stuff:


The placeholder meshes are 4000 triangles each.
Amusingly, when I first brought them in, I hadn’t bothered checking the density, and they were 80,000+ triangles, and the frame rate was at a horrible 25 fps😛

Houdini –> UE4

So, the 4 unique plants are in UE4. Yay!

But, I want to place thousands of them. It would be smart to use the in-built vegetation tools in UE4, but my purpose behind this post is to find some nice generic ways to get placement data from Houdini to UE4, something that I’ve been planning to do in my old Half Life scene for ages.
So I’m going to use Instanced Static Meshes🙂

Generating the placements

For now, I’ve gone with a very simple method of placing vegetation: around the edges of my puddles.
It will do for the sake of example. So here’s the puddle and vegetation masks in Houdini (vegetation mask on the left, puddle mask on the right):


A couple of layers of noise, and a fit range applied to vertex colours.

I then just scatter a bunch of points on the mask on the left, and then copy flowers onto them, creating a range of random scales and rotations:


The node network for that looks like this:


Not shown here, off to the left, is all the flower setup stuff.
I’ll leave that alone for now, since I don’t know if I’ll be keeping any of that🙂

The right hand side is the scattering, which can be summarized as:

  • Read ground plane
  • Subdivide and cache out the super high poly plane
  • Move colour into Vertex data (because I use UVs in the next part, although I don’t really have to do it this way)
  • Read the brick texture as a mask (more on that below)
  • Move mask back to Point data
  • Scatter points on the mask
  • Add ID, Rotation and Scale data to each point
  • Flip YZ axis to match UE4 (could probably do this in Houdini prefs instead)
  • Python all the things out (more on that later)
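The axis flip in that list is just a component swap; a minimal sketch (scale/handedness tweaks, if any, are left out):

```python
def houdini_to_ue4_position(p):
    """Swap the Y and Z axes to go from Houdini's Y-up coordinate
    system to UE4's Z-up one."""
    x, y, z = p
    return (x, z, y)

pt = houdini_to_ue4_position((1.0, 2.0, 3.0))
```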

Brick mask

I mentioned quickly that I read the brick mask as a texture in the above section.
I wanted the plants to mostly grow out of cracks, so I multiplied the mask by the inverted height of the bricks, clamped to a range, using a Point VOP:


And here’s the network, but I won’t explain that node for node, it’s just a bunch of clamps and fits which I eyeballed until it did what I wanted:


Python all the things out, huh?

Python and I have a special relationship.
It’s my favourite language to use when there aren’t other languages available.

Anyway… I’ve gone with dumping my instance data to XML.
More on that decision later.

Now for some horrible hackyness:

import hou
from lxml import etree as ET

node = hou.pwd()
geo = node.geometry()

root = ET.Element("ObjectInstances")

for point in geo.points():
    pos       = point.position()
    scale     = point.attribValue('Scale')
    rotation  = point.attribValue('Rotation')
    scatterID = "Flower" + repr(point.attribValue('ScatterID') + 1)

    PosString   = repr(pos[0]) + ", " + repr(pos[1]) + ", " + repr(pos[2])
    RotString   = repr(rotation)
    ScaleString = repr(scale) + ", " + repr(scale) + ", " + repr(scale)

    # One XML element per point, named after the flower variant.
    ET.SubElement(root, scatterID,
                  Location=PosString, Rotation=RotString, Scale=ScaleString)

# Do the export
tree = ET.ElementTree(root)
tree.write("D:/MyDocuments/Unreal Projects/Warehouse/Content/Scenes/HoudiniVegetationPlacement.xml",
           pretty_print=True)

NOTE: Not sure if it will post this way, but in Preview the tabbing seems to be screwed up, no matter what I do. Luckily, programming languages have block start and end syntax, so this would never be a prob… Oh. Python. Right.

Also, all hail the ugly hard coded path right at the end there🙂
(Trust me, I’ll dump that into the interface for the node or something, would I lie to you?)

Very simply, this code exports an XML element for each Point.
I’m being very lazy for now, and only exporting Y rotation. I’ll probably fix that later.

This pumps out an XML file that looks like this:

<Flower1 Location="-236.48265075683594, -51.096923828125, -0.755022406578064" Rotation="(0.0, 230.97622680664062, 0.0)" Scale="0.6577988862991333, 0.6577988862991333, 0.6577988862991333"/>


Reading the XML in UE4

In the spirit of slapping things together, I decided to make a plugin that would read the XML file, and then add all the instances to my InstancedStaticMesh components.

First up, I put 4 StaticMeshActors in the scene, and gave each of them an InstancedStaticMesh component. I could have done this in a Blueprint, but I try to keep Blueprints to a minimum if I don’t actually need them:


As stated, I’m a hack, so the StaticMeshActor needs to be named Flower<1..4>, because the code matches the name to what it finds in the XML.

The magic button

I should really implement my code as either a specialized type of Data Table, or perhaps some sort of new thing called an XMLInstancedStaticMesh, or… Something else clever.

Instead, I made a Magic Button(tm):


XML Object Loader. Probably should have put a cat picture on that, in retrospect.

Brief overview of code

I’m not going to post the full code here for a bunch of reasons, including just that it is pretty unexciting, but the basic outline of it is:

  1. Click the button
  2. The plugin gets all InstancedStaticMeshComponents in the scene
  3. Get a list of all of the Parent Actors for those components, and their labels
  4. Process the XML file, and for each Element:
    • Check if the element matches a name found in step 3
    • If the Actor name hasn’t already been visited, clear the instances on the InstancedStaticMesh component, and mark it as visited
    • Get the position, rotation and scale from the XML element, and add a new instance to the InstancedStaticMesh with that data
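The plugin itself is C++, but the XML handling in step 4 can be sketched in Python with the standard library (the sample data here is made up to match the export format):

```python
import xml.etree.ElementTree as ET

def parse_instances(xml_text):
    """Parse the exported instance XML into
    {actor_label: [(position, rotation_string, scale), ...]},
    mirroring what the UE4 plugin does element by element."""
    root = ET.fromstring(xml_text)
    instances = {}
    for elem in root:
        pos   = tuple(float(v) for v in elem.get("Location").split(","))
        rot   = elem.get("Rotation")   # left as a string in this sketch
        scale = tuple(float(v) for v in elem.get("Scale").split(","))
        instances.setdefault(elem.tag, []).append((pos, rot, scale))
    return instances

sample = """<ObjectInstances>
  <Flower1 Location="1.0, 2.0, 3.0" Rotation="(0.0, 90.0, 0.0)" Scale="0.5, 0.5, 0.5"/>
  <Flower1 Location="4.0, 5.0, 6.0" Rotation="(0.0, 45.0, 0.0)" Scale="1.0, 1.0, 1.0"/>
</ObjectInstances>"""
data = parse_instances(sample)
```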

And that’s it! I had a bit of messing around, with originally doing Euler –> Quaternion conversion in Houdini instead of C++, and also not realizing that the rotations were in radians, but all in all it only took an hour or two to throw together, in its current very hacky form🙂

Some useful snippets

The FastXML library in UE4 is great, made life easy:

I just needed to create a new class inheriting from the IFastXmlCallback interface, and implement the Process<x> functions.

I’d create a new instance in ProcessElement, then fill in the actual data in ProcessAttribute.

Adding an instance to an InstancedStaticMeshComponent is as easy as a single AddInstance call with an FTransform:


And then, in shortened form, updating the instance data:

FTransform InstanceTransform;
_currentStaticMeshComp->GetInstanceTransform(_currentInstanceID, InstanceTransform);

// ...


_currentStaticMeshComp->UpdateInstanceTransform(_currentInstanceID, InstanceTransform);

One last dirty detail…

That’s about it for the code side of things.

One thing I didn’t mention earlier: In Houdini, I’m using the placement of the plants to generate out the dirt map mask so I can blend in details around their roots:


So when I export out my ground plane, I am putting the Puddles mask into the blue channel of the vertex colours, and the Dirt mask into the red channel of the vertex colours🙂

Still to come (for vegetation)

So I need to:

  • Make the actual flowers I want
  • Make the roots/dirt/mossy texture that gets blended in under the plants
  • Build more stuff

Why.. O.o

Why not data tables

I’m all about XML.

But a sensible, less code-y way to do this would be to save all your instance data from Houdini into CSV format, bring it in to UE4 as a data table, then use a Construction Script in a blueprint to iterate over the data and add instances to an Instanced Static Mesh.

I like XML as a data format, so I decided it would be more fun to use XML.

Why not Houdini Engine

That’s a good question…

In short:

  • I want to explore similar workflows with Modo replicators at some point, and I should be able to re-use the c++/Python stuff for that
  • Who knows what other DCC tools I’ll want to export instances out of
  • It’s nice to jump into code every now and then. Keeps me honest.
  • I don’t own it currently, and I’ve spent my software budget on Houdini Indie and Modo 901 already :)

If you have any questions, feel free to dump them in the comments; I hurried through this one a little since it’s at a halfway point, without great results to show off yet!