Gears of Washroom – Pt 1

I wanted to do a project that focuses a bit more on simulation work in Houdini, and also rendering, and this is the result. Probably one of the sillier things I’ve done in a while, I suppose 🙂

Originally, I was going to render it in Renderman, but I settled on using Mantra (which I’m really starting to love).
Renderman in Houdini was just a little too much to deal with, on top of all the other things I was learning.


All of the models in the scene are edge weighted sub-d, modelled in Modo.
The water grenade is Gears Of War inspired, cobbled together from whatever concept I could find off the interwebz 🙂

In a previous post, I showed off the water grenade model when I was talking about transferring edge weighted sub-d models from Modo to Houdini.

After posting it on this great Modo / Houdini thread, Pascal Beeckmans (aka PaQ WaK) informed me that Alembic files keep edge weight values.
Using Alembic files is a much better idea than the silly workaround hack I was doing!

Toilet paper dispenser model

Water grenade sub-d model in Modo

Toilet model

The whole base geometry for the scene (room objects + one grenade) comes in under 20000 triangles, such is the joy of sub-d in Modo.

Grenade materials

The material for the grenade was made in Substance Painter:

Grenade - Substance Painter texturing


I tried out a few different colour schemes, but settled on eye searing orange.

Something I hadn’t used much in Substance Painter is the “Hard Surface” stamping feature.
This is a really cool way of adding little details that I couldn’t be bothered modelling:

Hard surface stamps on Grenade model in Substance Painter

Substance Painter comes with quite a few to choose from:


I can imagine that if you built up a library of them, you could detail up models super quick!

Designer fun

I decided to do the walls, floor and wood shelf materials in Substance Designer.

Tiles, wood and plaster wall materials

I won’t go through all the networks node by node, but I’ll do a bit of an overview of the Wood Substance, since it’s the slightly more interesting of the three.

Wood Substance

Substance Designer network for wood material
Click for larger view

I’m taking an anisotropic noise, warping it with a crystal noise, taking a creased noise, warping the rest with that, making some gaussian spots, converting them to a normal, vector warping everything with that.

That gives me a greyscale image, which I gradient map to make a diffuse texture.
In case the graph, and that last sentence weren’t confusing enough, here it is in gif form!


I always find it a bit difficult to talk through Substance Designer networks, because so much of it is fiddling around until you have something you like.
I could probably remake this a lot better, and remove more than half of the nodes!

One really fun part of this was the Gradient ramp right at the end.

In the Gradient node, you can click and drag over anything on your screen (google image search images, in my case) to pick a gradient.
Here’s a great video explaining it:

I ran the picker over a photo of a wood plank that I liked, and then manually cleaned up the points on the gradient a bit:


Setting the scene

Having exported the scene from Modo as Alembic, I’m loading all the parts of the scene separately, and creating Groups for them.


I just noticed that under the Attributes tab in the Alembic node, there is the option to “Add Path Attribute”, so using that for grouping would be the smarter and neater way to go!

The UV layout node in the middle was just me messing around with some of the new packing features in 16.5.
I’d UV’d in Modo already, but I wanted to see how the layout node fills in holes:


Turns out it’s pretty great!

The last section of this network, I’m setting up the length of the chain by copying and rotating the one chain piece, and offsetting the end handle part:


To keep the handle at the end of the chain with a transform node, I can just reference the transform and number of copies properties from the copy1 node.

So the x translation is:

ch("../copy1/ncy") * ch("../copy1/tx")

And to get the handle at the right 90 degree rotation:

(ch("../copy1/ncy") % 2) * 90

It’s nothing exciting, but it’s great how easy it is just to dump expressions into parameters just about anywhere in Houdini.

For the room geometry, the import setup is very similar to the grenade setup.
One thing probably worth pointing out: I’m subdividing the assets at render time.
So although they are not subdivided in the viewport, you’ll just have to trust me that all the edge weighting came in fine 🙂


In the next blog post, I’ll start getting into some of the simulation setup.
From here on, these posts will focus 100% on Houdini.


Subsurface Scattering spherical harmonics – pt 3

Welcome to part 3 of this exciting series on how to beat a dead horse.

By the time I got to the end of the work for the last post, I was just about ready to put this project to bed (and by that, I mean P4 obliterate…).

There was just one thing I wanted to fix: The fact that I couldn’t rotate my models!
If I rotate the object, the lighting rotates with it.


To fix the rotating issue, in the UE4 lighting pass, I need to transform the light vector into the same space that I’m storing the SH data (object space, for example).


To do that, I need to pass through at least two of those object orientation vectors to the lighting pass (for example, the forward and right vectors of the object).
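As a rough sketch of the lighting pass end of that (hedged, with made-up names: ObjectRight and ObjectForward being whatever two axes survive the trip through the gbuffer, with the third axis rebuilt from a cross product):

	// Rebuild the object basis from the two passed-through axes
	float3 ObjectUp = cross(ObjectForward, ObjectRight);
	float3x3 ObjectToWorld = float3x3(ObjectRight, ObjectUp, ObjectForward);

	// Multiplying against the transpose takes the world space light
	// vector L into object space, ready for the SH lookup
	float3 ObjectL = mul(L, transpose(ObjectToWorld));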

So, that’s another 6 floats (if I don’t compress them) that I need to pass through, and if you remember from last time, I’d pushed the limits of MRTs with my 16 spherical harmonics coefficients, so I don’t have any space left!

This forced me to do one of the other changes I talked about: Use 3 band Spherical Harmonics for my depth values instead of 4 band.
That reduces the coefficients from 16 to 9, and gives me room for my vectors.
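For reference, the band/coefficient relationship is quadratic: an n band expansion stores

	$\sum_{l=0}^{n-1} (2l + 1) = n^2$

coefficients, so 4 bands is 16 floats and 3 bands is 9.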

<Insert montage of programming and swearing here>


So yay, now I have 3 band SH, and room for sending more things through to lighting.

Quality didn’t really change much, either, and it helped drop down to 5 uv channels, which became very important a little later…

Going off on a tangent

I figured that since I was solving the problem for object orientation, maybe I could also do something for deforming objects too?
For an object where the depth from one side to the other doesn’t change much when it’s deforming, it should be ok to have baked SH data.

The most obvious way to handle that was to calculate and store the SH depth in Tangent space, similar to how Normal maps are usually stored for games.

I wanted to use the same tangent space that UE4 uses, and although Houdini 15 didn’t have anything native for generating that, there is a plugin!

With that compiled and installed, I could plonk down a Compute Tangents node, and now I have Tangents and Binormals stored on each vertex, yay!

At this point, I create a matrix from the Tangent, Binormal and Normal, and store the transpose of that matrix.
Multiplying a vector against it will give me that vector in Tangent space. I got super lazy, and did this in a vertex wrangle:

matrix3 @worldToTangentSpaceMatrix;
vector UE4Tang;
vector UE4Binormal;
vector UE4Normal;

// Tangent U and V are in houdini coords
UE4Tang         = swizzle(v@tangentu, 0,2,1);
UE4Binormal     = swizzle(v@tangentv, 0,2,1);
UE4Normal       = swizzle(@N, 0,2,1);

@worldToTangentSpaceMatrix = transpose(set(UE4Tang, UE4Binormal, UE4Normal));

The swizzle stuff is just swapping Y and Z (coordinate systems are different between UE4 and Houdini).

Viewing the Tangent space data

To make debugging easier, at this point I made a fun little debug node that displays Tangents, Binormals and Normals the same as the model viewer in UE4.

It runs per vertex, and creates new coloured line primitives:


Haven’t bothered cleaning it up much, but hopefully you get the idea:


And the vectorToPrim subnet:


So, add a point, add some length along the input vector and add another point, create a prim, create two verts from the points, set the colour.
I love how easy it is to do this sort of thing in Houdini 🙂

The next step was to modify the existing depth baking code.

For each vertex in the model, I was sending rays out through the model, and storing the depth when they hit the other side.
That mostly stays the same, except that when storing the rays in the SH coefficients, I need to convert them to tangent space first!


Getting animated

Since most of the point of a Tangent space approach was to show a deforming object not looking horrible, I needed an animated model.

I was going to do a bunch of animation in Modo for this, but I realized that transferring all my Houdini custom data to Modo, and then out to fbx might not be such a great idea.

Time for amazing Houdini animation learningz!!
Here’s a beautiful test that any animator would be proud of, rigged in Houdini and dumped out to UE4:


So, I spent some time re-rigging the Vortigaunt in Houdini, and doing some more fairly horrible animation that you can see at the top of this post.


Although the results aren’t great, I found this weirdly soothing.
Perhaps because it gave me a break from trying to debug shaders.

At some point in the future, I would like to do a bit more animation/rigging/skinning.
Then I can have all the animators at work laugh at my crappy art, in addition to all the other artists…

Data out

Hurrah, per-vertex Tangent space Spherical Harmonic depth data now stored on my animated model!

This was about the part where I realized I couldn’t find a way to get the Tangents and Binormals from the Houdini mesh into Unreal…

When exporting with my custom data, what ends up in the fbx is something like this:

   UserDataArray:  {
    UserDataType: "Float"
    UserDataName: "tangentu_x"
    UserData: *37416 {...

When I import that into UE4, it doesn’t know what that custom data is supposed to be.

If I export a mesh out of Modo, though, UE4 imports the Tangents and Binormals fine.
So I jumped over into Modo, and exported out a model with Tangents and Binormals, and had a look at the fbx.
This showed me I needed something more like this:

LayerElementTangent: 0 {
 Version: 102
 Name: "Texture"  
 MappingInformationType: "ByPolygonVertex"
 ReferenceInformationType: "Direct"
 Tangents: *112248 {...
This is probably around about when I should have set the project on fire, and found something better to do with my time but…

C# to the rescue!!

I wrote an incredibly silly little WPF program that reads in an fbx, and changes tangentu and tangentv user data into the correct layer elements.

Why WPF you ask?
Seriously, what’s with all the questions? What is this, the Spanish inquisition?
Real answer: Almost any time I’ve written any bit of code for myself in the past 7 years, it’s always a WPF program.
80% of them end up looking like this:
The code is horrible, I won’t paste it all, but I build a list of all the vectors then pass them through to a function that re-assembles the text and spits it out:
        public string CreateLayerElementBlock(List<Vector3D> pVectors, string pTypeName)
        {
            string newBlock = "";

            int numVectors  = pVectors.Count;
            int numFloats   = pVectors.Count * 3;

            newBlock += "\t\tLayerElement" + pTypeName + ": 0 {\n";
            newBlock += "\t\t\tVersion: 102\n";
            newBlock += "\t\t\tName: \"Texture\"\n";
            newBlock += "\t\t\tMappingInformationType: \"ByPolygonVertex\"\n";
            newBlock += "\t\t\tReferenceInformationType: \"Direct\"\n";
            newBlock += "\t\t\t" + pTypeName + "s: *" + numFloats + " {\n";
            newBlock += "\t\t\t\ta: ";

Gross. Vomit. That’s an afternoon of my life I’ll never get back.
But hey, it worked, so moving on…

UE4 changes

There weren’t many big changes on the UE4 side, just the switching over to 3 band SH, mostly.

One really fun thing bit me in the arse, though.
I’d been testing everything out on my static mesh version of the model.
When I imported the rigged model, I needed to change the material to support it:
And then the material failed to compile (and UE4 kept crashing)…
So, apparently, skinned meshes use a bunch of the UV coordinate slots for… Stuff!
I needed to switch back to my old approach of storing 6 coefficients in TexCoord1,2 and 3, and the remaining three SH coeffs in vertex colour RGB:
Cropped this down to exclude all the messy stuff I left in for texture based SH data, but those three Appends on the right feed into the material pins I added for SH data in the previous posts.
And yeah, there’s some redundancy in the math at the bottom too, but if you don’t tell anyone, I won’t.

Shader changes

Now to pass the Tangent and Binormal through to the lighting pass.

I ended up compressing these, using Octahedron normal vector encoding, just so I could save a few floats.
The functions to do this ship with UE4, and they allow me to pass 2 floats per vector, rather than x,y,z, and the artifacts are not too bad.
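The encode half looks something like this (paraphrased from memory of the engine helper, so treat the details as approximate rather than the exact engine code):

	// Project the unit vector onto the octahedron |x|+|y|+|z| = 1, then
	// unfold the bottom pyramid out into the corners of the unit square
	float2 UnitVectorToOctahedron(float3 N)
	{
		N.xy /= dot(1, abs(N));
		if (N.z <= 0)
		{
			N.xy = (1 - abs(N.yx)) * (N.xy >= 0 ? float2(1, 1) : float2(-1, -1));
		}
		return N.xy;
	}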
Here’s some more information on how it works:
So now the Tangent and Binormal data is going through to the lighting pass, and I transform the light to tangent space before looking up the SH data:
 float3x3 TangentToWorld = float3x3(
 	GBuffer.WorldTangent,
 	GBuffer.WorldBinormal,
 	cross(GBuffer.WorldTangent, GBuffer.WorldBinormal));

 float3 TangentL = mul(L, transpose(TangentToWorld));

 float DepthFromPixelToLight = saturate(GetSH(SHCoeffs, TangentL));
Probably could do that transposing in BasePassPixelShader I guess, and save paying for it on every pixel for every light, but then there’s a lot of things I probably could do. Treat my fellow human beings nicer, drink less beer, not stress myself out with silly home programming projects like this…


If I were to ever do this for real, on an actual game, I’d probably build the SH generation into the import process, or perhaps when doing stuff like baking lighting or generating distance fields in UE4.

If you happened to have a bunch of gbuffer bandwidth (i.e. you had to add gbuffers for something else), and you have a lot of semi translucent things, and engineering time to burn, and no better ideas, I suppose there could be a use for it.

Subsurface Scattering spherical harmonics – pt 2


This is my 2nd blog post on using spherical harmonics for depth based lighting effects in Unreal 4.

The first blog post focused on generating the spherical harmonics data in Houdini, this post focuses on the Unreal 4 side of things.

I’m going to avoid posting much code here, but I will try to provide enough information to be useful if you choose to do similar things.

SH data to base pass

The goal was to look up the depth of the object from each light in my scene, and see if I could do something neat with it.

In UE4 deferred rendering, that means that I need to pass my 16 coefficients from the material editor -> base pass pixel shader -> the lighting pass.

First up, I read the first two SH coefficients out of the red and green vertex colour channels, and the rest out of my UV sets (remembering that I kept the default UV set 0 for actual UVs):


Vertex colour complications

You notice a nice little hardcoded multiplier up there… This was one of the annoyances with using vertex colours: I needed to scale the value of the coefficients in Houdini to 0-1, because vertex colours are 0-1.

This is different to the normalization part I mentioned in the last blog post, which was scaling the depth values before encoding them in SH. Here, I’m scaling the actual computed coefficients. I only need to do this with the vertex colours, not the UV data, since UVs aren’t restricted to 0-1.

The 4.6 was just a value that worked, using my amazing scientific approach of “calculate SH values for half a dozen models of 1,000-10,000 vertices, find out how high and low the final SH values go, divide through by that number +0.1”. You’d be smarter to use actual math to find the maximum range for coefficients for normalized data sets, though… It’s probably something awesome like 0 -> 1.5 pi.
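So on the UE4 side, that hardcoded multiplier is just undoing the packing on the way back in, something like this (a sketch of what the material nodes do, ignoring whatever sign handling the real setup needs):

	// Vertex colours arrive as 0-1, so rescale back towards the original
	// coefficient range (4.6 being the empirical range from above)
	float Coeff0 = VertexColor.r * 4.6f;
	float Coeff1 = VertexColor.g * 4.6f;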

Material input pins

Anyway, those values just plug into the SH Depth Coeff pins, and we’re done!!

Unreal 4 SH depth material

That was a lie.
Those pins don’t exist usually… And neither does this shading model:


So, that brings me to…

C++ / shader side note

To work out how to add a shading model, I searched the source code for a different shading model (hair I think), and copied and pasted just about everything, and then went through a process of elimination until things worked.
I took very much the same approach to the shader side of things.

This is why I’m a Tech Artist, and not a programmer… Well, one of many reasons 😉
Seriously though, being able to do this is one of the really nice things about having access to engine source code!

The programming side of this project was a bunch of very simple changes across a wide range of engine source files, so I’m not going to post much of it:


There is an awful lot of this code that really should be data instead. But Epic gave me an awesome engine and lets me mess around with source code, so I’m not going to complain too much 😛

Material pins (continued…)

So I added material inputs for the coefficients, plus some absorption parameters.

SH coeffs

The SH Coeffs material pins are new ones, so I had to make a bunch of changes to material engine source files to make that happen.
Be careful when doing this: Consistent ordering of variables matters in many of these files. I found that out the easy way: Epic put comments in the code about it 🙂

Each of the SH coeffs material inputs is a vector with 4 components, so I need 4 of these to send my 16 coefficients through to the base pass.

Custom data (absorption)

The absorption pins you might have noticed from my material screenshot are passed as “custom data”.
Some of the existing lighting models (subsurface, etc) pass additional data to the base pass (and also through to lighting, but more on that later).

These “custom data” pins can be renamed for different shading models. So you can use these if you’d rather not go crazy adding new pins, and you’re happy with passing through just two extra float values.
Have a look at MaterialGraph.cpp, and GetCustomDataPinName if that sounds like a fun time 🙂

Base pass to lighting

At this point, I’d modified enough code that I could start reading and using my SH values in the base pass.

A good method for testing if the data was valid was using the camera vector to look up the SH depth values. I knew things were working when I got similar results to what I was seeing in Houdini when using the same approach:


That’s looking at “Base Color” in the buffer visualizations.

I don’t actually want to do anything with the SH data in the base pass, though, so the next step is to pass the SH data through to the lighting pass.

Crowded Gbuffer

You can have a giant parameter party, and read all sorts of fun data in the base pass.
However, if you want to do per-light stuff, at some point you need to write all that data into a handful of full screen buffers that the lighting pass uses. By the time you get to lighting, you don’t have per object data, just those full screen buffers and your lights.

These gbuffers are lovingly named GBufferA, GBufferB, GBuffer… You get the picture.

You can visualize them in the editor by using the various buffer visualizers, or explicitly using the “vis” command, e.g.: “vis gbuffera”:


There are some other buffers being used (velocity, etc), but these are the ones I care about for now.

I need to pass an extra 16 float values through to lighting, so surely I could just add 4 new gbuffers?

Apparently not: the limit for simultaneous render targets is 8 🙂

I started out by creating 2 new render targets, so that covers half of my SH values, but what to do with the other 8 values?

Attempt 1 – Packing it up

To get this working, there were things that I could sacrifice from the above existing buffers to store my own data.

For example, I rarely use Specular these days, aside from occasionally setting it to a constant, so I could use that for one of my SH values, and just hard code Specular to 1 in my lighting pass.

With this in mind, I overwrote all the things I didn’t think I cared about for stylized translucent meshes:

  • Static lighting
  • Metallic
  • Specular
  • Distance field anything (I think)

Attempt 2 – Go wide!

This wasn’t really ideal. I wasn’t very happy about losing static lighting.

That was about when I realized that although I couldn’t add any more simultaneous render targets, I could change the format of them!

The standard g-buffers are 8 bits per channel, by default. By going 16 bit per channel, I could pack two SH values into each channel, and store all my SH data in my two new g-buffers without the need for overwriting other buffers!
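The packing itself can be as dumb as splitting a channel into an integer part and a fractional part. A sketch, with invented helpers, assuming the coefficients are already scaled to 0-1:

	// One coefficient lives in the integer part (quantized to 0-255),
	// the other in the fraction; fine for high precision channels
	float PackCoeffPair(float a, float b)
	{
		return floor(saturate(a) * 255.0f) + saturate(b) * 0.995f;
	}

	void UnpackCoeffPair(float packed, out float a, out float b)
	{
		a = floor(packed) / 255.0f;
		b = frac(packed);
	}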

Well, I actually went with PF_A32B32G32R32F, so 32 bits per channel because I’m greedy.

It’s probably worth passing out in horror at the cost of all this at this point: 2 * 128-bit buffers is something like 250MB of data. I’m going to talk about this a little later 🙂

Debugging, again

I created a few different procedural test assets in Houdini with low complexity as test cases, including one where I deleted all but one polygon as a final step, so that I could very accurately debug the SH values 🙂

On top of that, I had a hard coded matrix in the shaders that I could use to check, component by component, that I was getting what I expected when passing data from the base pass to lighting, with packing/unpacking, etc:

const static float4x4 shDebugValues =
{
	0.1, 0.2, 0.3, 0.4,
	0.5, 0.6, 0.7, 0.8,
	0.9, 1.0, 1.1, 1.2,
	1.3, 1.4, 1.5, 1.6
};

It seems like an obvious and silly thing to point out, but it saved me some time 🙂

Here are some of my beautiful procedural test assets (one you might recognize from the video at the start of the post):

Houdini procedural test assets (the rock thing, plus testobject1, testobject2 and testobject3)

“PB-nah”, the lazy guide to not getting the most out of my data

Ok, SH data is going through to the lighting pass now!

This is where a really clever graphics programmer could use it for some physically accurate lighting work, proper translucency, etc.

To be honest, I was pleasantly surprised that anything was working at this stage, so I threw in a very un-PBR scattering, and called it a day! 🙂

float3 SubsurfaceSHDepth( FGBufferData GBuffer, float3 L, float3 V, half3 N )
{
	float AbsorptionDistance 	= GBuffer.CustomData.x;
	float AbsorptionPower 		= lerp(4.0f, 16.0f, GBuffer.CustomData.y);

	float DepthFromPixelToLight 	= Get4BandSH(GBuffer.SHCoeffs, L);
	float absorptionClampedDepth 	= saturate(1.0f / AbsorptionDistance * DepthFromPixelToLight);
	float SSSWrap 			= 0.3f;
	float frontFaceFalloff 		= pow(saturate(dot(-N, L) + SSSWrap), 2);

	float Transmittance 		= pow(1 - absorptionClampedDepth, AbsorptionPower);

	Transmittance *= frontFaceFalloff;

	return Transmittance * GBuffer.BaseColor;
}
It’s non view dependent scattering, using the SH depth through the model towards the light, then dampened by the absorption distance.
The effect falls off by face angle away from the light, but I put a wrap factor on that because I like the way it looks.
For all the work I’ve put into this project, probably the least of it went into the actual lighting model, so I’m pretty likely to change that code quite a lot ūüôā
What I like about this is that the scattering stays fairly consistent around the model from different angles:
So as horrible and inaccurate and not PBR as this is, it matches what I see in SSS renders in Modo a little better than what I get from standard UE4 SSS.

The End?

Broken things

  • I can’t rotate my translucent models at the moment 😛
  • Shadows don’t really interact with my model properly

I can hopefully solve both of these things fairly easily (store data in tangent space, look at shadowing in other SSS models in UE4), I just need to find the time.
I could actually rotate the SH data, but apparently that’s hundreds of instructions 🙂

Cost and performance

  • 8 uv channels
  • 2 * 128 bit buffers

Not really ideal from a memory point of view.

The obvious optimization here is to drop down to 3 band spherical harmonics.
The quality probably wouldn’t suffer, and that’s 9 coefficients rather than 16, so I could pack them into one of my 128 bit gbuffers instead of two (with one spare coefficient left over that I’d have to figure out).

That would help kill some UV channels, too.

Also, using 32 bits per channel (so 16 bits per SH coeff) is probably overkill. I could swap over to a uint buffer with 16 bits per channel, pack two coefficients per channel at 8 bits each, and that would halve the memory usage again.
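With a uint format, that packing is just bit shifting (another sketch, made-up helper):

	// Two 8 bit quantized coefficients per 16 bit channel
	uint PackCoeffPair8(float a, float b)
	{
		uint ia = (uint)(saturate(a) * 255.0f + 0.5f);
		uint ib = (uint)(saturate(b) * 255.0f + 0.5f);
		return (ia << 8) | ib;
	}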

As for performance, presumably evaluating 3 band spherical harmonics would be cheaper than 4 band. Well, especially because then I could swap to using the optimized UE4 functions that already exist for 3 band SH 🙂

Render… Differently?

To get away from needing extra buffers and having a constant overhead, I probably should have tried out the new Forward+ renderer:

Since you have access to per object data, presumably passing around sh coefficients would also be less painful.
Rendering is not really my strong point, but my buddy Ben Millwood has been nagging me about Forward+ rendering for years (he’s writing his own renderer).

There are other alternatives to deferred, or hybrid deferred approaches (like Doom 2016’s clustered forward, or Wolfgang Engel’s culled visibility buffers) that might have made this easier too.
I very much look forward to the impending not-entirely-deferred future 🙂


I learnt some things about Houdini and UE4, job done!

Not sure if I’ll keep working on this at all, but it might be fun to at least fix the bugs.


Subsurface Scattering spherical harmonics – pt 1

In this post, I’ll be presenting “SSSSH”, which will be the sound made by any real programmer who happens to accidentally read this…

This has been a side project of mine for the last month or so with a few goals:

  • Play around more with Houdini (I keep paying for it, I should use it more because it’s great)
  • Add more gbuffers to UE4, because that sounds like a useful thing to be able to do and understand.
  • Play around with spherical harmonics (as a black box) to understand the range and limitations of the technique a bit better.
  • Maybe accidentally make something that looks cool.

Spherical harmonics

I won’t go too much into the details on spherical harmonics because:
a) There’s lots of good sites out there explaining them and
b) I haven’t taken the time to understand the math, so I really don’t know how it works, and I’m sort of ok with that for now 😛

But at my basic understanding level, spherical harmonics is a way of representing data using a set of functions that take spherical coordinates as an input, and return a value. Instead of directly storing the data (lighting, depth, whatever), you work out a best fit of these functions to your data, and store the coefficients of the functions.
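In loose math terms (and this is the full extent of my black box understanding), the approximation looks like:

	$f(\theta, \phi) \approx \sum_{l=0}^{n-1} \sum_{m=-l}^{l} c_l^m \, Y_l^m(\theta, \phi)$

The $Y_l^m$ are the fixed basis functions, and the $c_l^m$ coefficients are the only thing you store: fitting them to your data is the encode step, and evaluating the sum in a given direction is the decode.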

Here is a very accurate diagram:


You’re welcome!
Feel free to reuse that amazing diagram.

SH is good for data that varies rather smoothly, so tends to be used for ambient/bounced lighting in a lot of engines.

The function series is infinite, so you can decide how many terms you want to use, which determines how many coefficients you store.

For this blog post, I decided to go with 4-band spherical harmonics, because I’m greedy and irresponsible.
That’s 16 float values.

Houdini SH

Thanks to the great work of Matt Ebb, a great deal of work was already done for me:

I had to do a bit of fiddling to get things working in Houdini 15, but that was a good thing to do anyway, every bit of learning helps!

What I used from Matt were two nodes for reading and writing SH data given the Theta and Phi (polar and azimuthal) angles:


Not only that, but I was able to take the evaluate code and adapt it to shader code in UE4, which saved me a bunch of time there too.

It’s not designed to be used that way, so I’m sure that it isn’t amazingly efficient. If I decide to actually keep any of this work, I’ll drop down to 3 band SH and use the provided UE4 functions 🙂

Depth tracing in Houdini

I’m not going to go through every part of the Houdini networks, just the meat of it, but here’s what the main network looks like:


So all the stuff on the left is for rendering SH coefficients out to textures (more on that later), the middle section is where the work is done, and the right hand side is a handful of debug mode visualizers, including some from the previously mentioned Matt Ebb post.

Hits and misses

I’m doing this in SOPs (geometry operations), because it’s what I know best in Houdini at the moment, as a Houdini noob 🙂
I should try moving it to SHOPs (materials/per pixel) at some point, if that is at all possible.

To cheat, if I need more per-pixel like data, I usually just subdivide my meshes like crazy, and then just do geometry processing anyway 😛

The basic functionality is:

  • For each vertex in the source object:
    • Fire a ray in every direction
    • Collect every hit
    • Store the distance to the furthest away primitive that is facing away from the vertex normal (so back face, essentially)

All the hits are stored in an array, along with the Phi and Theta angles I mentioned before, here’s what that intersection network looks like currently:


I’m also keeping track of the maximum hit length, which I will use later to normalize the depth data. The max length is tracked one level up from the getMaxIntersect network from the previous screenshot:


This method currently doesn’t work very well with objects with lots of gaps in them, because the gaps in the middle of an object will essentially absorb light when they shouldn’t.
It wouldn’t be hard to fix, I just haven’t taken the time yet.


Before storing to SH values, I wanted to move all the depth values into the 0-1 range, since there are various other places where having 0-1 values makes my life easier later.

One interesting thing that came up here: when tracing rays out from a point, there are always more rays that miss than hit.

That’s because surfaces are more likely to be convex than concave, so at least half of the rays are pointing out into space:


Realistically, I don’t really care about spherical data, I probably want to store hemispherical data around the inverse normal.
That might cause data problems in severely concave areas of the mesh, but I don’t think it would be too big a problem.
There are hemispherical basis functions that could be used for that, if I were a bit more math savvy:

A Novel Hemispherical Basis for Accurate and Efficient Rendering

Anyway, having lots of values shooting out to infinity (max hit length) was skewing all of the SH values, and I was losing a lot of accuracy, so I encoded misses as zero length data instead.

Debug fun times!

So now, in theory, I have a representation of object thickness for every vertex in my mesh!

One fun way to debug it (in Houdini) was to read the SH values using the camera forward vector, which basically should give me depth from the camera (like a z buffer):


And, in a different debug mode that Matt Ebb had in his work, each vertex gets a sphere copied onto it, and the sphere is displaced in every direction by the SH value on the corresponding vertex:



This gives a good visual indicator on how deep the object is in every direction, and was super useful once I got used to what I was looking at 🙂

And, just for fun, here is shot from a point where I was doing something really wrong:


Exporting the data

My plans for this were always to bake out the SH data into textures, partially just because I was curious what sort of variation I’d get out of it (I had planned to use displacement maps on the mesh in Houdini to vary the height).

And yes, that’s 4 images worth of SH data, best imported as HDR.
But hey, I like being a bit over the top with my home projects…

One of my very clever workmates, James Sharpe, had the good suggestion of packing the coeffs into UV data as I was whining to him over lunch about the lack of multiple vertex color set support in UE4.
So I decided to run with UVs, and then move back to image based once I was sure everything was working 🙂


Which worked great, and as you can probably see from the shot above, per-vertex (UVs or otherwise) is perfectly adequate 🙂

Actually, I ended up putting coefficients 1-14 into uvs, and the last two into the red and green vertex color channels, so that I could keep a proper UV set in the first channel that I could use for textures.

And then, all the work…

Next blog post coming soon!

In it, I will discuss all the UE4 work, the things I should have done, or done better, or might do in the future, and a few more test shots and scenes from UE4!

To be continued!!

The devil is in the decals


Frequently when talking about mesh decals in UE4, I get comments about them being annoying to maintain, because every time you change your meshes you have to rebuild / adjust layers of decals.

Now, personally, I don’t really care that much, because my projects are all pretty small, and fixing up decals in Modo is generally a very quick job.

But it’s come up enough that I figured I’d make a “2 metres short of Minimum Viable Product” example of how you could address this.


That’s what I’m calling Houdini Engine + UE4 now, just to continue the tradition of me being annoying.

Right. Houdini stuff.
I made a digital asset:


There are two inputs, which will get fed in from UE4 (later).
In the Houdini scene, input #1 is the object I want to generate a decal on, and input #2 is a projection plane.

The stuff on the left is actually all redundant, but what I was planning to do was construct layout patterns in Houdini for different decals on one sheet, and let Houdini just automatically do the UV layout. But procedural UV’ing got super annoying, so I decided not to do that.


Extrude plane, cookie with box:


Delete faces that are on the opposite side of the projection (dot product driven delete sop, basically).

Since I couldn’t really get the UVs working the way I wanted, I created a centre point on the projection plane, grabbed the normal, and constructed U and V vectors, which I then projected onto the verts in the decal mesh.

I did that all in VEX, because it seemed like a good idea at the time.

I was fairly annoyed with working on it by this point, so I just exposed the rotation and scale of the decal so you can play with it in Unreal 🙂


Back in UE4

With that done, and the thing saved as a Houdini Digital Asset, time to load up a shamefully unfinished UE4 project (there are lots of choices here…).

The workflow is:

  • Load the digital asset into the content browser.
  • Drag a copy into the scene.
  • Using “World Outliner Input”, Select a plane for the projection, and an object to put decals on:


Bam! New decal mesh, floating over the top of the original object, you can save it out using the Houdini engine bake stuff, or whatever you want to do.


I didn’t bother taking this too far, because I don’t really intend to use it myself, but if I thought it was going to be useful there are a bunch of things I’d probably do.

I mean, aside from completely re-build it from scratch, because it’s a whole bunch of broken hack right now…

  • Expose a few different projection types
  • Create separate Houdini asset that lets you lay out planes on a decal sheet to define regions for different decals (which I started on)
  • Make it work with multiple planes passed into the one asset

With any luck, Epic will just come along with a similar workflow where you can press a button on a projected decal in editor, and it will do this sort of thing for you 🙂

(In the meantime, I’ll just stick with manually doing it in Modo, thanks very much…)


City scanner scene – Breakdown pt3


Part 3 of the breakdown of my recent Half-Life 2 Scanner scene.

And now for animation! Also known as “Geoff stumbling around blindly for a week when he really should have watched some UE4 animation tutorials”.

So, erm, feel free to use this as a guide on how *not* to approach making an object float down a hallway…

Down the garden path

Early on, I was trying to work out if I wanted to just animate the whole thing from start to finish in Modo, or do something a little more systemic.

For the sake of trying something different, I settled on having the main movement down the tunnel, rotation of the centre wheel, tail and little flippy bits (technical term) through blueprints, and then blend in a few hand animated bits.

There are three main blueprints that do the work: the Scanner blueprint, the Scanner Path blueprint, and the Scanner Attract Point blueprint.


The division of labour between these things ended up being pretty arbitrary, but the initial idea was that an Attract Point can start playing an animation on the Scanner when it reaches the point, and can also modify the max speed of the Scanner when leaving the point.

Here are the parameters that each point has:


So when the Scanner reaches an attract point, it can pause for a while (I use this time to play an animation, generally). The animation doesn’t start when the scanner reaches the point though, it actually starts at a certain percentage of distance along the previous spline segment leading up to this point.

There is also a blend in and out time for the animation, to give transition from the manually animated idle.

The animation blueprint itself does very little:


Down the bottom left is the Idle animation that happily ticks away all the time, and that blends with an override animation, which is what the Attract Points set.

Each of the rotators are driven by procedural animation on the Scanner blueprint, which I’ll show in a bit.

Improvements in hindsight

The Idle / Override blending part of this is definitely something I would change in hindsight, because blending in and out of the idle is a mess: the scanner could be on an up or down point when I start.

There’s a few ways I could deal with it, including just changing the up and down sine wave motion to be driven by blueprint instead, or just restarting the idle animation when I start blending it back in (or probably a dozen better ways to do it that I don’t know about :P).

Also, the “pause” functionality in the Attract Points is not a great way to do things.
Timing the pause and playing animations was a lot of trial and error, I should have sent events out from the animations instead that trigger the pause.

Custom animations in Modo

There are three custom animations that I made in Modo:

  • Idle
  • Searching left and right animation half way down the tunnel
  • The final “do I see something? I think I see something… SQUIRREL!!”

Everything in the mesh is hard rigged (no deformation), so just parenting all the pieces together, and importing into Unreal generates a skeleton for me.

In Modo, I didn’t do anything particularly exciting this time around, i.e. no interesting rig, I just keyframed the bones.

Modo has export presets for UE4 and Unity now, which is pretty ace!
You can also set up your own presets:


It was pretty fun doing some animation again, it’s something I really don’t do very often, and Modo makes it pretty easy to dive in.


Ok, back in UE4, time to have a quick look over the tick event in the Scanner blueprint.


Unlike my normal lazy self, this time I stitched a bunch of things together, so you can see it in full!

Embrace lag

I wanted quite a few of the animations to be driven by the velocity of the scanner.
I’m calculating the velocity myself anyway, based on set max speed and acceleration values, so I could have just used that value, and built some lag into it.

But I found something more fun:


I have 2 non colliding “spring arms” attached to the Scanner root node, one that is slow to catch up (used for body tilt), one that is fairly quick (used for tail rotation).

This was inspired by some amazingly cool animation work I saw Greg Puzniak doing in Unity3D. It’s not up on his site, but he has cool matcap stuff up there that you should check out! 🙂

So in a lot of my blueprint functions, I get the distance from the spring arm to the arrow, and use that to drive laggy rotation:


Despite the comment in the blueprint, when I tried to use this for banking, the ATM swallowed my card (har har har).

So most of my procedural animation is driven this way.
When I’m not playing the override animations, everything except for the up and down bobbing is procedural. Which is pretty silly, because that’s a sine wave, so it’s not really an Idle animation, and I should have got rid of it.

The rest of the blueprint is about keeping track of how far along the spline we are, how close to the next point we are (so we can start playing an animation, if necessary), and then setting the world location and rotation of the Scanner from the distance along the spline.

Next-gen volumetric fog

Is something I don’t know much about, so I just made a hacky material UV offset thingo 😛

I say “made”, but there are some great light beam and fog examples from Epic included in the engine content, so I grabbed most of this thing from those examples, and here:

Fog Sheet and Light Beams

The scanner has a geometry cone on it that is UV mapped 0->1 U along the length.

I don’t think I really changed much from the content example (I honestly can’t remember), but I did add two parameters that adjust the tiling offset of the noise texture:


As the Scanner moves along the path, it increases the FogForwardOffset, which pans the U coordinate of the UVs, so that it looks like the cone is moving through a volume. There’s also always a little bit of panning going on anyway, even when the Scanner is stopped.

As the Scanner rotates, I scroll the V coordinate, just so the noise in the beam doesn’t stay fixed. The rotation isn’t very convincing, so I could probably do a better job of that.
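The UV shuffle itself is nothing fancy. In shader terms it amounts to something like this (parameter names are mine, the real thing is a handful of material nodes):

	// Pan the noise along the cone (U) as the scanner moves, with a bit
	// of constant drift, and scroll across the cone (V) as it rotates
	float2 NoiseUV = BaseUV;
	NoiseUV.x += FogForwardOffset + Time * ConstantPanSpeed;
	NoiseUV.y += FogRotationOffset;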

There’s not much going on in the blueprint, but I put whatever I could in there rather than in the material:




The idea of the scene was to be almost entirely lit by the scanner, but I do have a bunch of static lights scattered around too, just to give some ambient light.

There are also two stationary lights in the scene to get highlights where I want them (the lights selected in the screenshot above).
One of them is a spotlight, used to hit the puddle and left wall.

There is also a small light at the front of the tunnel that has “Indirect Lighting Intensity” set to 0, so it doesn’t affect the bounced lighting.
This is the light that hits the scanner here:


This light is quite bright, so when the Scanner hits it, the rest of the environment darkens down, which (hopefully) puts the focus all on the Scanner (yay for auto-exposure!).

There are only two shadow casting lights in the scene, and they are both on the Scanner.
One of them is a giant point light, and is super expensive, which is the main reason I limited shadow casters everywhere else:


Spinning spotlights

There are also two non shadow casting spotlights on the side of the scanner that rotate and project patterns on the wall.

For some reason that I can’t remember, I decide to generate the pattern in a material, rather than do the smart thing and use a texture.


I’m modifying the “VectorToRadialValue” function to generate bands, then fading it out in the middle bit.

Seriously though, unless you have a really good reason, you should probably do this in a texture 🙂


So I *think* that’s it!

I’m sure there are things I’ve missed, or glossed over a bit, so feel free to ask questions in the comments and I’ll fill in the gaps.



City scanner scene – Breakdown pt1


In this post, I’ll go through the construction of the environment for my recently posted Half Life 2 scanner scene.

The point of this project was really just to do a bit of animation on my scanner, and show it off in a simple environment. I can’t remember the last time I did any animation, but my guess would be when I was studying at the AIE over ten years ago 🙂

So with that in mind, figuring I was going to struggle with the animation side, I wanted to keep the environment dead simple. It was always going to be dark, anyway, since I wanted the scanner to light the scene!

Modelling / texturing the tunnel

I looked up a bunch of photo reference for cool tunnels in Europe, presumably the sort of thing that the resistance in City 17 would have used 🙂

I blocked out basic lighting, camera setup, and created the tunnel out of cubes in UE4.
Once I was happy with the layout, I could then just export the blocked out mesh to FBX to use as a template in Modo:


I also took the time to make a really basic animatic.
I changed the path of the scanner quite a bit, and timing, etc, but I still found this to be useful:

Anyway, at this point, the scene blockout is in Modo, and I can start building geometry:


The geometry itself is dead simple, so I won’t go into that too much: I just extruded along a spline, then beveled and pushed a few edge loops around 🙂

I always use the sculpt tools to push geometry around a little, just to make things feel a bit more natural. Here specifically I was sinking some of the vertices on the side pathways:


Layered vertex painted materials can be expensive, so I wanted to avoid going too far down that path.
In the end, I settled on having two layers: concrete, and moldy damp green stuff:


The green stuff is vertex paint blended on, and the vertex colours for the mask were done in UE4 rather than in Modo, just because it is quick and easy to see what I’m doing in editor.

Most of the materials in the scene were made in Substance Painter.
And I’m lazy, so they are usually a couple of layers with procedural masks, and one or two hand painted masks 🙂


Water plane


For the purposes of this scene, I could get away with a pretty low tech / low quality water plane. As long as it had some movement, and is reflective, then it would do!

The engine provides flow map samples and functions in the content samples, so I just used those. I’ve written my own ones before (and by that, I mean I copied what they were doing in the Portal 2 Water Flow presentation from SIGGRAPH 2010), but the UE4 implementation does exactly what I wanted 🙂

And seriously, if you haven’t looked at that presentation, go do it.
They used Houdini to generate water flow, but I’m lazy and ain’t got time for that! (Not for this scene, at any rate).

I just generated mine in Photoshop, using this page as a guide:

Photoshop generated flow maps

At some point, I’d like to see if I can set up the same workflow in Substance Painter and/or Houdini.

Anyway, the material is a bit messy (sorry):


I’m passing the flowmap texture and some timing parameters into the flowmaps material function, and getting a new normal map out of it.
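If you haven’t seen flow maps before, the guts of the technique amounts to something like this (my paraphrase of the standard approach, with invented names, not the engine function’s exact code):

	// Sample the normal map twice, each phase sliding along the flow vector,
	// and crossfade so the stretching resets while the other phase is hidden
	float2 Flow = Texture2DSample(FlowMap, FlowMapSampler, UV).rg * 2.0f - 1.0f;
	float Phase0 = frac(Time * FlowSpeed);
	float Phase1 = frac(Time * FlowSpeed + 0.5f);
	float3 Normal0 = Texture2DSample(NormalMap, NormalMapSampler, UV - Flow * Phase0).rgb;
	float3 Normal1 = Texture2DSample(NormalMap, NormalMapSampler, UV - Flow * Phase1).rgb;
	float Blend = abs(2.0f * Phase0 - 1.0f); // 1 when a phase is about to reset
	float3 FlowedNormal = lerp(Normal0, Normal1, Blend) * 2.0f - 1.0f;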

The only other thing going on here is that I have a mask for the edges of the water, where it is interacting with the walls. I blend in different subsurface colour, normal strength and roughness at the edges.

Fog planes


I’ve got a few overlapping fog planes in the scene, with a simple noisy texture, offset by world position (having a different offset on each makes it feel a little more volumetric).

Much like the water, the fog plane has a subtle flow map on it, to fake a bit of turbulence, and the material uses depth fade on opacity to help it blend with the surrounding geometry:


UE4 4.13 mesh decals

I was going to use a bunch of the new 4.13 features originally, but in the end I think the only one I used was “mesh decals”.

These are decals in the old school sense, not the projected decals that UE4 users have probably come to love. In the back of my mind, I had thought I might turn this into a VR scene at some point, and the cost of projected decals is a somewhat unknown commodity for me at the moment.

The main advantage of mesh decals, vs floating bits of geometry with Masked materials, is that mesh decals support full alpha blending.

In these shots, the water puddle, stain and concrete edge damage are all on part of the same decal sheet:

The decals are all using Diffuse, Normals, Roughness, Metallic and Occlusion (the last three packed together):

I built the decals one at a time, without much planning, basically guessing at how much texture space I thought I was going to need (I didn’t bother setting a “texels per metre” type of limit for my project, but that probably would have been sensible).

Each time I wanted a new mesh decal, I’d work out in Modo how big I want it first:


Then I’d copy it into a separate Modo scene just for decal layout, which I take into Substance Painter.
I just did this so I could keep all the mesh together in one space, to keep it easy for painting:


And then here is the scene in Substance:


And here is the scene with and without decals:


What’s great about this, is that mesh decals don’t show up in Shader Complexity, so the tech artists on the project will never know… (I kid, I kid. They will find them in PIX, and will hunt you down and yell at you).

I really like this approach to building wear and tear into materials. The first time I saw this approach was when I was working at Visceral Games in Melbourne, and the engine was very well optimized to handle a pretty huge amount of decals. I didn’t embrace it as much as I should have, back then.


A few years back, I made a blueprint for pipes that allowed joining sections, etc.
So I knocked together a model in Modo for the connection pieces:


Edge-weighted sub-d, of course, because I can’t help myself 🙂
I even started sculpting in some heavy rust, but had to have a stern word to myself about not spending too much time on stuff that isn’t even going to be lit…

Textured in Substance Painter:


Same dealio with the pipe segments:


Then I just built the spline in the editor, and set it up like in my old blog post.

Much like I did with the original blockout geometry, I also exported the final pipes back out to Modo so that I could use them to work out where I wanted to put some decals.

The only other thing that was a pain was that the pipes need lightmaps, and I couldn’t work out a way to generate unique UVs for the final pipe mesh.

In the end, I just used the merge actors function in the editor, so that they all became a single static mesh, and let Unreal generate lightmap UVs.


Did you notice that there were hanging spider webs in the scene?
No? Good, because I don’t like them much 😛

I probably spent 10-20 hours just messing about with these silly things, but at least I got some fun gifs out of them:


Next up…

I’ll break down the construction of those web things, might be useful for a scene full of badly animated vines, I suppose…

I’ll also go through all of the silly things I did on the animation / blueprint / lighting side.