Subsurface Scattering spherical harmonics – pt 3

May 13, 2017

Welcome to part 3 of this exciting series on how to beat a dead horse.

By the time I got to the end of the work for the last post, I was just about ready to put this project to bed (and by that, I mean P4 obliterate…).

There was just one thing I wanted to fix: The fact that I couldn’t rotate my models!
If I rotate the object, the lighting rotates with it.

Spaaaaaaace

To fix the rotating issue, in the UE4 lighting pass, I need to transform the light vector into the same space that I’m storing the SH data (object space, for example).

RotateSpace

To do that, I need to pass through at least two of those object orientation vectors to the lighting pass (for example, the forward and right vectors of the object).
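
As a rough sketch of what that would look like in the lighting pass (names made up, and assuming the forward and right vectors survive the trip through the gbuffer), I'd rebuild the object basis and move the light vector into object space before evaluating the SH:

float3 WorldToObjectSpace(float3 L, float3 ObjForward, float3 ObjRight)
{
    // Rebuild the missing axis, then treat the three axes as rows of a basis matrix
    float3 ObjUp = normalize(cross(ObjForward, ObjRight));
    float3x3 ObjectBasis = { ObjRight, ObjUp, ObjForward };

    // Each component of the result is L projected onto one object axis
    return mul(ObjectBasis, L);
}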

So that’s another 6 floats (if I don’t compress them) that I need to pass through, and if you remember from last time, I’d already pushed the limits of MRTs with my 16 spherical harmonic coefficients, so I didn’t have any space left!

This forced me to do one of the other changes I talked about: Use 3 band Spherical Harmonics for my depth values instead of 4 band.
That reduces the coefficients from 16 to 9, and gives me room for my vectors.
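
For reference, evaluating 3 band SH is just a dot product of the 9 stored coefficients against the 9 basis functions for a given direction. Here's a minimal sketch using the standard basis constants (not the actual UE4 functions or the code I adapted from Matt Ebb's nodes, and my coefficient ordering may differ):

float EvalSH3(float Coeffs[9], float3 Dir)
{
    float x = Dir.x, y = Dir.y, z = Dir.z;

    float Basis[9];
    Basis[0] = 0.282095f;                           // band 0
    Basis[1] = 0.488603f * y;                       // band 1
    Basis[2] = 0.488603f * z;
    Basis[3] = 0.488603f * x;
    Basis[4] = 1.092548f * x * y;                   // band 2
    Basis[5] = 1.092548f * y * z;
    Basis[6] = 0.315392f * (3.0f * z * z - 1.0f);
    Basis[7] = 1.092548f * x * z;
    Basis[8] = 0.546274f * (x * x - y * y);

    float Result = 0.0f;
    for (int i = 0; i < 9; i++)
    {
        Result += Coeffs[i] * Basis[i];
    }
    return Result;
}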

<Insert montage of programming and swearing here>

3bandSH

So yay, now I have 3 band SH, and room for sending more things through to lighting.

Quality didn’t really change much, either, and it helped drop down to 5 uv channels, which became very important a little later…

Going off on a tangent

I figured that since I was solving the problem for object orientation, maybe I could also do something for deforming objects too?
For an object where the depth from one side to the other doesn’t change much when it’s deforming, it should be ok to have baked SH data.

The most obvious way to handle that was to calculate and store the SH depth in Tangent space, similar to how Normal maps are usually stored for games.

I wanted to use the same tangent space that UE4 uses, and although Houdini 15 didn’t have anything native for generating that, there is a plugin!

https://github.com/teared/mikktspace-for-houdini

With that compiled and installed, I could plonk down a Compute Tangents node, and now I have Tangents and Binormals stored on each vertex, yay!

At this point, I create a matrix from the Tangent, Binormal and Normal, and store the transpose of that matrix.
Multiplying a vector against it will give me that vector in Tangent space. I got super lazy, and did this in a vertex wrangle:

matrix3 @worldToTangentSpaceMatrix;
vector UE4Tang;
vector UE4Binormal;
vector UE4Normal;

// Tangent U and V are in houdini coords
UE4Tang         = swizzle(v@tangentu, 0,2,1);
UE4Binormal     = swizzle(v@tangentv, 0,2,1);
UE4Normal       = swizzle(@N, 0,2,1);

@worldToTangentSpaceMatrix = transpose(set(UE4Tang, UE4Binormal, UE4Normal));

The swizzle stuff is just swapping Y and Z (coordinate systems are different between UE4 and Houdini).

Viewing the Tangent space data

To make debugging easier, at this point I made a fun little debug node that displays Tangents, Binormals and Normals the same as the model viewer in UE4.

It runs per vertex, and creates new coloured line primitives:

TangentFace

Haven’t bothered cleaning it up much, but hopefully you get the idea:

TangentPrimsVOP.png

And the vectorToPrim subnet:

VectorToPrimsVOP.png

So, add a point, add some length along the input vector and add another point, create a prim, create two verts from the points, set the colour.
I love how easy it is to do this sort of thing in Houdini 🙂

The next step was to modify the existing depth baking code.

For each vertex in the model, I was sending rays out through the model, and storing the depth when they hit the other side.
That mostly stays the same, except that when storing the rays in the SH coefficients, I need to convert them to tangent space first!

HitsToSH.png

Getting animated

Since most of the point of a Tangent space approach was to show a deforming object not looking horrible, I needed an animated model.

I was going to do a bunch of animation in Modo for this, but I realized that transferring all my Houdini custom data to Modo, and then out to fbx might not be such a great idea.

Time for amazing Houdini animation learningz!!
Here’s a beautiful test that any animator would be proud of, rigged in Houdini and dumped out to UE4:

StupidTube.gif

So, I spent some time re-rigging the Vortigaunt in Houdini, and doing some more fairly horrible animation that you can see at the top of this post.

RiggedVort.png

Although the results aren’t great, I found this weirdly soothing.
Perhaps because it gave me a break from trying to debug shaders.

At some point in the future, I would like to do a bit more animation/rigging/skinning.
Then I can have all the animators at work laugh at my crappy art, in addition to all the other artists…

Data out

Hurrah, per-vertex Tangent space Spherical Harmonic depth data now stored on my animated model!

This was about the part where I realized I couldn’t find a way to get the Tangents and Binormals from the Houdini mesh into Unreal…

When exporting with my custom data, what ends up in the fbx is something like this:

   UserDataArray:  {
    UserDataType: "Float"
    UserDataName: "tangentu_x"
    UserData: *37416 {...

When I import that into UE4, it doesn’t know what that custom data is supposed to be.

If I export a mesh out of Modo, though, UE4 imports the Tangents and Binormals fine.
So I jumped over into Modo, and exported out a model with Tangents and Binormals, and had a look at the fbx.
This showed me I needed something more like this:

LayerElementTangent: 0 {
 Version: 102
 Name: "Texture"  
 MappingInformationType: "ByPolygonVertex"
 ReferenceInformationType: "Direct"
 Tangents: *112248 {...

This is probably around about when I should have set the project on fire, and found something better to do with my time but…

C# to the rescue!!

I wrote an incredibly silly little WPF program that reads in an fbx and converts the tangentu and tangentv user data into the correct layer elements.

Why WPF you ask?
Seriously, what’s with all the questions? What is this, the Spanish inquisition?
Real answer: Almost any time I’ve written any bit of code for myself in the past 7 years, it’s always a WPF program.
80% of them end up looking like this:

AmazingUI

The code is horrible, I won’t paste it all, but I build a list of all the vectors, then pass them through to a function that re-assembles the text and spits it out:

        public string CreateLayerElementBlock(List<Vector3D> pVectors, string pTypeName)
        {
            string newBlock = "";

            int numVectors  = pVectors.Count;
            int numFloats   = pVectors.Count * 3;

            newBlock += "\t\tLayerElement" + pTypeName + ": 0 {\n";
            newBlock += "\t\t\tVersion: 102\n";
            newBlock += "\t\t\tName: \"Texture\"\n";
            newBlock += "\t\t\tMappingInformationType: \"ByPolygonVertex\"\n";
            newBlock += "\t\t\tReferenceInformationType: \"Direct\"\n";
            newBlock += "\t\t\t" + pTypeName + "s: *" + numFloats + " {\n";
            newBlock += "\t\t\t\ta: ";
	...

Gross. Vomit. That’s an afternoon of my life I’ll never get back.
But hey, it worked, so moving on…

UE4 changes

There weren’t many big changes on the UE4 side, mostly just switching over to 3 band SH.

One really fun thing bit me in the arse, though.
I’d been testing everything out on my static mesh version of the model.
When I imported the rigged model, I needed to change the material to support it:

UseWithSkeletal

And then the material failed to compile (and UE4 kept crashing)…
So, apparently, skinned meshes use a bunch of the UV coordinate slots for… Stuff!
I needed to switch back to my old approach of storing 6 coefficients in TexCoord1, 2 and 3, and the remaining three SH coeffs in vertex colour RGB:

RiggedMatChanges.png

Cropped this down to exclude all the messy stuff I left in for texture based SH data, but those three Appends on the right feed into the material pins I added for SH data in the previous posts.
And yeah, there’s some redundancy in the math at the bottom too, but if you don’t tell anyone, I won’t.

Shader changes

Now to pass the Tangent and Binormal through to the lighting pass.

I ended up compressing these, using Octahedron normal vector encoding, just so I could save a few floats.
The functions to do this ship with UE4, and they allow me to pass 2 floats per vector, rather than x,y,z, and the artifacts are not too bad.
Here’s some more information on how it works:
OctahedronEncoding.png
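
If you haven't seen it before, here's a minimal sketch of the standard octahedral mapping (the general technique, not the exact functions that ship with UE4):

float2 OctEncode(float3 N)
{
    // Project the unit vector onto the octahedron, then unfold the lower half
    N /= (abs(N.x) + abs(N.y) + abs(N.z));
    if (N.z < 0.0f)
    {
        float2 Signs = float2(N.x >= 0.0f ? 1.0f : -1.0f, N.y >= 0.0f ? 1.0f : -1.0f);
        N.xy = (1.0f - abs(N.yx)) * Signs;
    }
    return N.xy * 0.5f + 0.5f;   // remap to 0-1 for storage
}

float3 OctDecode(float2 Enc)
{
    Enc = Enc * 2.0f - 1.0f;
    float3 N = float3(Enc.x, Enc.y, 1.0f - abs(Enc.x) - abs(Enc.y));
    if (N.z < 0.0f)
    {
        float2 Signs = float2(N.x >= 0.0f ? 1.0f : -1.0f, N.y >= 0.0f ? 1.0f : -1.0f);
        N.xy = (1.0f - abs(N.yx)) * Signs;
    }
    return normalize(N);
}
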
So now the Tangent and Binormal data is going through to the lighting pass, and I transform the light to tangent space before looking up the SH data:

 float3x3 TangentToWorld =
 {
  GBuffer.WorldTangent,
  GBuffer.WorldBinormal,
  cross(GBuffer.WorldTangent, GBuffer.WorldBinormal),
 };

 float3 TangentL = mul(L, transpose(TangentToWorld));

 float DepthFromPixelToLight  = saturate(GetSH(SHCoeffs, TangentL));

I could probably do that transposing in BasePassPixelShader, I guess, and save paying for it on every pixel for every light, but then there are a lot of things I probably could do. Treat my fellow human beings nicer, drink less beer, not stress myself out with silly home programming projects like this…

Conclusion

If I were to ever do this for real, on an actual game, I’d probably build the SH generation into the import process, or perhaps when doing stuff like baking lighting or generating distance fields in UE4.

If you happened to have a bunch of gbuffer bandwidth (i.e, you had to add gbuffers for something else), and you have a lot of semi translucent things, and engineering time to burn, and no better ideas, I suppose there could be a use for it.
Maybe.

Subsurface Scattering spherical harmonics – pt 2

March 22, 2017

 

This is my 2nd blog post on using spherical harmonics for depth based lighting effects in Unreal 4.

The first blog post focused on generating the spherical harmonics data in Houdini, this post focuses on the Unreal 4 side of things.

I’m going to avoid posting much code here, but I will try to provide enough information to be useful if you choose to do similar things.

SH data to base pass

The goal was to look up the depth of the object from each light in my scene, and see if I could do something neat with it.

In UE4 deferred rendering, that means that I need to pass my 16 coefficients from the material editor -> base pass pixel shader -> the lighting pass.

First up, I read the first two SH coefficients out of the red and green vertex colour channels, and the rest out of my UV sets (remembering that I kept the default UV set 0 for actual UVs):

SHBaseMatUVs

Vertex colour complications

You might notice a nice little hardcoded multiplier up there… This was one of the annoyances of using vertex colours: I needed to scale the coefficient values to 0-1 in Houdini, because vertex colours are 0-1.

This is different to the normalization part I mentioned in the last blog post, which was scaling the depth values before encoding them in SH. Here, I’m scaling the actual computed coefficients. I only need to do this with the vertex colours, not the UV data, since UVs aren’t restricted to 0-1.

The 4.6 was just a value that worked, using my amazing scientific approach of “calculate SH values for half a dozen models of 1,000 – 10,000 vertices, find out how high and low the final SH values go, divide through by that number +0.1”. You’d be smarter to use actual math to find the maximum range of coefficients for normalized data sets, though… It’s probably something awesome like 0 -> 1.5 pi.

Material input pins

Anyway, those values just plug into the SH Depth Coeff pins, and we’re done!!

Unreal 4 SH depth material

Ok.
That was a lie.
Those pins don’t exist usually… And neither does this shading model:

SHDepthShadingModel

So, that brings me to…

C++ / shader side note

To work out how to add a shading model, I searched the source code for a different shading model (hair I think), and copied and pasted just about everything, and then went through a process of elimination until things worked.
I took very much the same approach to the shader side of things.

This is why I’m a Tech Artist, and not a programmer… Well, one of many reasons 😉
Seriously though, being able to do this is one of the really nice things about having access to engine source code!

The programming side of this project was a bunch of very simple changes across a wide range of engine source files, so I’m not going to post much of it:

P4Lose

There is an awful lot of this code that really should be data instead. But Epic gave me an awesome engine and lets me mess around with source code, so I’m not going to complain too much 😛

Material pins (continued…)

So I added material inputs for the coefficients, plus some absorption parameters.

Sh coeffs

The SH Coeffs material pins are new ones, so I had to make a bunch of changes to material engine source files to make that happen.
Be careful when doing this: Consistent ordering of variables matters in many of these files. I found that out the easy way: Epic put comments in the code about it 🙂

Each of the SH coeffs material inputs is a vector with 4 components, so I need 4 of these to send my 16 coefficients through to the base pass.

Custom data (absorption)

The absorption pins you might have noticed from my material screenshot are passed as “custom data”.
Some of the existing lighting models (subsurface, etc) pass additional data to the base pass (and also through to lighting, but more on that later).

These “custom data” pins can be renamed for different shading models. So you can use these if you’d rather not go crazy adding new pins, and you’re happy with passing through just two extra float values.
Have a look at MaterialGraph.cpp, and GetCustomDataPinName if that sounds like a fun time 🙂

Base pass to lighting

At this point, I’d modified enough code that I could start reading and using my SH values in the base pass.

A good method for testing if the data was valid was using the camera vector to look up the SH depth values. I knew things were working when I got similar results to what I was seeing in Houdini when using the same approach:

BasePassDebug

That’s looking at “Base Color” in the buffer visualizations.

I don’t actually want to do anything with the SH data in the base pass, though, so the next step is to pass the SH data through to the lighting pass.

Crowded Gbuffer

You can have a giant parameter party, and read all sorts of fun data in the base pass.
However, if you want to do per-light stuff, at some point you need to write all that data into a handful of full screen buffers that the lighting pass uses. By the time you get to lighting, you don’t have per object data, just those full screen buffers and your lights.

These gbuffers are lovingly named GBufferA, GBufferB, GBuffer… You get the picture.

You can visualize them in the editor by using the various buffer visualizers, or explicitly using the “vis” command, e.g: “vis gbuffera”:

visGbuffers

There are some other buffers being used (velocity, etc), but these are the ones I care about for now.

I need to pass an extra 16 float values through to lighting, so surely I could just add 4 new gbuffers?

Apparently not: the limit for simultaneous render targets is 8 🙂

I started out by creating 2 new render targets, so that covers half of my SH values, but what to do with the other 8 values?

Attempt 1 – Packing it up

To get this working, there were things that I could sacrifice from the above existing buffers to store my own data.

For example, I rarely use Specular these days, aside from occasionally setting it to a constant, so I could use that for one of my SH values, and just hard code Specular to 1 in my lighting pass.

With this in mind, I overwrote all the things I didn’t think I cared about for stylized translucent meshes:

  • Static lighting
  • Metallic
  • Specular
  • Distance field anything (I think)

Attempt 2 – Go wide!

This wasn’t really ideal. I wasn’t very happy about losing static lighting.

That was about when I realized that although I couldn’t add any more simultaneous render targets, I could change the format of them!

The standard g-buffers are 8 bits per channel, by default. By going 16 bit per channel, I could pack two SH values into each channel, and store all my SH data in my two new g-buffers without the need for overwriting other buffers!

Well, I actually went with PF_A32B32G32R32F, so 32 bits per channel because I’m greedy.
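
Two 16 bit coefficients per 32 bit channel is easy enough with the f32tof16 / f16tof32 intrinsics. This is just a sketch of the general idea, and it assumes the raw bits survive the render target write untouched (no blending), which I haven't verified against the engine:

float PackTwoCoeffs(float A, float B)
{
    // Two half floats stuffed into the bit pattern of one 32 bit float
    uint Bits = f32tof16(A) | (f32tof16(B) << 16);
    return asfloat(Bits);
}

void UnpackTwoCoeffs(float PackedChannel, out float A, out float B)
{
    uint Bits = asuint(PackedChannel);
    A = f16tof32(Bits & 0xFFFF);
    B = f16tof32(Bits >> 16);
}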

It’s probably worth passing out in horror at the cost of all this at this point: 2 * 128bit buffers is something like 250mb of data. I’m going to talk about this a little later 🙂

Debugging, again

I created a few different low complexity procedural test assets in Houdini, including one where I deleted all but one polygon as a final step, so that I could very accurately debug the SH values 🙂

On top of that, I had a hard coded matrix in the shaders that I could use to check, component by component, that I was getting what I expected when passing data from the base pass to lighting, with packing/unpacking, etc:

const static float4x4 shDebugValues = 
{
	0.1, 0.2, 0.3, 0.4,
	0.5, 0.6, 0.7, 0.8,
	0.9, 1.0, 1.1, 1.2,
	1.3, 1.4, 1.5, 1.6
};

It seems like an obvious and silly thing to point out, but it saved me some time 🙂

Here are some of my beautiful procedural test assets (one you might recognize from the video at the start of the post):

Houdini procedural test asset (rock thing) / testobject3 / testobject2 / testobject1

“PB-nah”, the lazy guide to not getting the most out of my data

Ok, SH data is going through to the lighting pass now!

This is where a really clever graphics programmer could use it for some physically accurate lighting work, proper translucency, etc.

To be honest, I was pleasantly surprised that anything was working at this stage, so I threw in a very un-pbr scattering, and called it a day! 🙂

float3 SubsurfaceSHDepth( FGBufferData GBuffer, float3 L, float3 V, half3 N )
{
	float AbsorptionDistance 	= GBuffer.CustomData.x;
	float AbsorptionPower 		= lerp(4.0f, 16.0f, GBuffer.CustomData.y);

	float DepthFromPixelToLight 	= Get4BandSH(GBuffer.SHCoeffs, L);
	float absorptionClampedDepth 	= saturate(1.0f / AbsorptionDistance * DepthFromPixelToLight);
	float SSSWrap 			= 0.3f;
	float frontFaceFalloff 		= pow(saturate(dot(-N, L) + SSSWrap), 2);

	float Transmittance 		= pow(1 - absorptionClampedDepth, AbsorptionPower);

	Transmittance *= frontFaceFalloff;

	return Transmittance * GBuffer.BaseColor;
}

It’s non view dependent scattering, using the SH depth through the model towards the light, then dampened by the absorption distance.
The effect falls off by face angle away from the light, but I put a wrap factor on that because I like the way it looks.
For all the work I’ve put into this project, probably the least of it went into the actual lighting model, so I’m pretty likely to change that code quite a lot 🙂
What I like about this is that the scattering stays fairly consistent around the model from different angles:

GlowyBitFront / GlowyBitSide

So as horrible and inaccurate and not PBR as this is, it matches what I see in SSS renders in Modo a little better than what I get from standard UE4 SSS.

The End?

Broken things

  • I can’t rotate my translucent models at the moment 😛
  • Shadows don’t really interact with my model properly

I can hopefully solve both of these things fairly easily (store data in tangent space, look at shadowing in other SSS models in UE4), I just need to find the time.
I could actually rotate the SH data, but apparently that’s hundreds of instructions 🙂

Cost and performance

  • 8 uv channels
  • 2 * 128 bit buffers

Not really ideal from a memory point of view.

The obvious optimization here is to drop down to 3 band spherical harmonics.
The quality probably wouldn’t suffer, and that’s 9 coefficients rather than 16, so I could pack them into one of my 128 bit gbuffers instead of two (with one spare coefficient left over that I’d have to figure out).

That would help kill some UV channels, too.

Also, using 32 bits per channel (so 16 bits per SH coeff) is probably overkill. I could swap over to a 16 bit per channel uint buffer, packing two coefficients per channel at 8 bits each, and that would halve the memory usage again.

As for performance, presumably evaluating 3 band spherical harmonics would be cheaper than 4 band. Well, especially because then I could swap to using the optimized UE4 functions that already exist for 3 band SH 🙂

Render… Differently?

To get away from needing extra buffers and having a constant overhead, I probably should have tried out the new Forward+ renderer:

https://docs.unrealengine.com/latest/INT/Engine/Performance/ForwardRenderer/

Since you have access to per object data, presumably passing around sh coefficients would also be less painful.
Rendering is not really my strong point, but my buddy Ben Millwood has been nagging me about Forward+ rendering for years (he’s writing his own renderer http://www.lived3d.com/).

There are other alternatives to deferred, or hybrid deferred approaches (like Doom 2016’s clustered forward, or Wolfgang Engel’s culled visibility buffers) that might have made this easier too.
I very much look forward to the impending not-entirely-deferred future 🙂

Conclusion

I learnt some things about Houdini and UE4, job done!

Not sure if I’ll keep working on this at all, but it might be fun to at least fix the bugs.

 

Subsurface Scattering spherical harmonics – pt 1

March 17, 2017

In this post, I’ll be presenting “SSSSH”, which will be the sound made by any real programmer who happens to accidentally read this…

This has been a side project of mine for the last month or so with a few goals:

  • Play around more with Houdini (I keep paying for it, I should use it more because it’s great)
  • Add more gbuffers to UE4, because that sounds like a useful thing to be able to do and understand.
  • Play around with spherical harmonics (as a black box) to understand the range and limitations of the technique a bit better.
  • Maybe accidentally make something that looks cool.

Spherical harmonics

I won’t go too much into the details on spherical harmonics because:
a) There’s lots of good sites out there explaining them and
b) I haven’t taken the time to understand the math, so I really don’t know how it works, and I’m sort of ok with that for now 😛

But at my basic understanding level, spherical harmonics is a way of representing data using a set of functions that take spherical coordinates as an input, and return a value. Instead of directly storing the data (lighting, depth, whatever), you work out a best fit of these functions to your data, and store the coefficients of the functions.
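
In equation form (to the limited extent I understand it), the reconstruction is just a weighted sum of basis functions over the sphere, where using n bands gives you n^2 coefficients:

f(\theta, \varphi) \approx \sum_{l=0}^{n-1} \sum_{m=-l}^{l} c_l^m \, Y_l^m(\theta, \varphi)

The c_l^m values are the coefficients you store, and the Y_l^m are the fixed basis functions you evaluate for a given direction.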

Here is a very accurate diagram:

DataSphere

You’re welcome!
Feel free to reuse that amazing diagram.

SH is good for data that varies rather smoothly, so it tends to be used for ambient/bounced lighting in a lot of engines.

The function series is infinite, so you can decide how many terms you want to use, which determines how many coefficients you store.

For this blog post, I decided to go with 4-band spherical harmonics, because I’m greedy and irresponsible.
That’s 16 float values.

Houdini SH

Thanks to the great work of Matt Ebb, a great deal of work was already done for me:

http://mattebb.com/weblog/spherical-harmonics-in-vops/

I had to do a bit of fiddling to get things working in Houdini 15, but that was a good thing to do anyway, every bit of learning helps!

What I used from Matt were two nodes for reading and writing SH data given the Theta and Phi (polar and azimuthal) angles:

SHFunctions

Not only that, but I was able to take the evaluate code and adapt it to shader code in UE4, which saved me a bunch of time there too.

It’s not designed to be used that way, so I’m sure that it isn’t amazingly efficient. If I decide to actually keep any of this work, I’ll drop down to 3 band SH and use the provided UE4 functions 🙂

Depth tracing in Houdini

I’m not going to go through every part of the Houdini networks, just the meat of it, but here’s what the main network looks like:

NetworkOverview

So all the stuff on the left is for rendering SH coefficients out to textures (more on that later), the middle section is where the work is done, and the right hand side is a handful of debug mode visualizers, including some from the previously mentioned Matt Ebb post.

Hits and misses

I’m doing this in SOPs (geometry operations), because it’s what I know best in Houdini at the moment, as a Houdini noob 🙂
I should try moving it to SHOPs (materials/per pixel) at some point, if that is at all possible.

To cheat, if I need more per-pixel like data, I usually just subdivide my meshes like crazy, and then just do geometry processing anyway 😛

The basic functionality is:

  • For each vertex in the source object:
    • Fire a ray in every direction
    • Collect every hit
    • Store the distance to the furthest away primitive that is facing away from the vertex normal (so back face, essentially)

All the hits are stored in an array, along with the Phi and Theta angles I mentioned before, here’s what that intersection network looks like currently:

IntersectAll

I’m also keeping track of the maximum hit length, which I will use later to normalize the depth data. The max length is tracked one level up from the getMaxIntersect network from the previous screenshot:

GenerateHits

This method currently doesn’t work very well with objects with lots of gaps in them, because the gaps in the middle of an object will essentially absorb light when they shouldn’t.
It wouldn’t be hard to fix, I just haven’t taken the time yet.

Normalizing

Before storing to SH values, I wanted to move all the depth values into the 0-1 range, since there are various other places where having 0-1 values makes my life easier later.

One interesting thing that came up here: when tracing rays out from a point, there are always more rays that miss than hit.

That’s because surfaces are more likely to be convex than concave, so at least half of the rays are pointing out into space:

FurryPlane

Realistically, I don’t really care about spherical data; I probably want to store hemispherical data around the inverse normal.
That might cause data problems in severely concave areas of the mesh, but I don’t think it would be too big a problem.
There are hemispherical basis functions that could be used for that, if I were a bit more math savvy:

A Novel Hemispherical Basis for Accurate and Efficient Rendering

Anyway, having lots of values shooting out to infinity (max hit length) was skewing all of the SH values, and I was losing a lot of accuracy, so I encoded misses as zero length data instead.

Debug fun times!

So now, in theory, I have a representation of object thickness for every vertex in my mesh!

One fun way to debug it (in Houdini) was to read the SH values using the camera forward vector, which basically should give me depth from the camera (like a z buffer):

SHDepth

And, in a different debug mode that Matt Ebb had in his work, each vertex gets a sphere copied onto it, and the sphere is displaced in every direction by the SH value on the corresponding vertex:

vortigauntBalloons

vortigauntBalloons2

This gives a good visual indicator on how deep the object is in every direction, and was super useful once I got used to what I was looking at 🙂

And, just for fun, here is a shot from a point where I was doing something really wrong:

vortigauntClicker

Exporting the data

My plans for this were always to bake out the SH data into textures, partially just because I was curious what sort of variation I’d get out of it (I had planned to use displacement maps on the mesh in Houdini to vary the height).

SHImages
And yes, that’s 4 images worth of SH data, best imported as HDR.
But hey, I like being a bit over the top with my home projects…

One of my very clever workmates, James Sharpe, had the good suggestion of packing the coeffs into UV data as I was whining to him over lunch about the lack of multiple vertex color set support in UE4.
So I decided to run with UVs, and then move back to image based once I was sure everything was working 🙂

PixelVSVertex

Which worked great, and as you can probably see from the shot above, per-vertex (UVs or otherwise) is perfectly adequate 🙂

Actually, I ended up putting coefficients 1-14 into uvs, and the last two into the red and green vertex color channels, so that I could keep a proper UV set in the first channel that I could use for textures.

And then, all the work…

Next blog post coming soon!

In it, I will discuss all the UE4 work, the things I should have done (or done better) and might do in the future, and a few more test shots and scenes from UE4!

To be continued!!

ArtStation and a mine

December 27, 2016

UE4Textured.png

Just a very quick post to show off a new asset (although I’ve already been spamming Twitter with that a bit).

It’s another Half Life asset: a hopper mine. Was a lot of fun to work on!
The above shot is in UE4, this is another Modo + Substance Painter asset.

Also, I’ve decided to jump on the ArtStation bandwagon, just in case I didn’t already have enough accounts and pages to maintain 🙂

Hope everyone has a great holidays and New Years!

 

The devil is in the decals

October 20, 2016

autodecal

Frequently when talking about mesh decals in UE4, I get comments about them being annoying to maintain, because every time you change your meshes you have to rebuild / adjust layers of decals.

Now, personally, I don’t really care that much, because my projects are all pretty small, and fixing up decals in Modo is generally a very quick job.

But it’s come up enough that I figured I’d make a “2 metres short of Minimum Viable Product” example of how you could address this.

Houe4dengine

That’s what I’m calling Houdini Engine + UE4 now, just to continue the tradition of me being annoying.

Right. Houdini stuff.
I made a digital asset:

Network.png

There are two inputs, which will get fed in from UE4 (later).
In the Houdini scene, #1 input is the object I want to generate a decal on, object #2 is a projection plane.

The stuff on the left is actually all redundant, but what I was planning to do was construct layout patterns in Houdini for different decals on one sheet, and let Houdini just automatically do the UV layout. But procedural UV’ing got super annoying, so I decided not to do that.

Anyway…

Extrude plane, cookie with box:

ExtrudeAndCookie.png

Delete faces that are on the opposite side of the projection (dot product driven delete sop, basically).

Since I couldn’t really get the UVs working the way I wanted, I created a centre point on the projection plane, got the normal, constructed U and V vectors, and then projected those onto the verts in the decal mesh.

I did that all in VEX, because it seemed like a good idea at the time.

I was fairly annoyed with working on it by this point, so I just exposed the rotation and scale of the decal so you can play with it in Unreal 🙂

AutoDecalParams.png

Back in UE4

With that done, and the thing saved as a Houdini Digital Asset, time to load up a shamefully unfinished UE4 project (there are lots of choices here…).

The workflow is:

  • Load the digital asset into the content browser.
  • Drag a copy into the scene.
  • Using “World Outliner Input”, select a plane for the projection, and an object to put decals on:

AutoDecal_outlinerSelect.png

Bam! New decal mesh, floating over the top of the original object. You can save it out using the Houdini Engine bake stuff, or whatever you want to do.

Conclusion

I didn’t bother taking this too far, because I don’t really intend to use it myself, but if I thought it was going to be useful there are a bunch of things I’d probably do.

I mean, aside from completely re-building it from scratch, because it’s a whole bunch of broken hacks right now…

  • Expose a few different projection types
  • Create separate Houdini asset that lets you lay out planes on a decal sheet to define regions for different decals (which I started on)
  • Make it work with multiple planes passed into the one asset

With any luck, Epic will just come along with a similar workflow where you can press a button on a projected decal in editor, and it will do this sort of thing for you 🙂

(In the meantime, I’ll just stick with manually doing it in Modo, thanks very much…)

 

City scanner scene – Breakdown pt3

October 15, 2016

ScannerFloat.gif

Part 3 of the breakdown of my recent Half-Life 2 Scanner scene.

And now for animation! Also known as “Geoff stumbling around blindly for a week when he really should have watched some UE4 animation tutorials”.

So, erm, feel free to use this as a guide on how *not* to approach making an object float down a hallway…

Down the garden path

Early on, I was trying to work out if I wanted to just animate the whole thing from start to finish in Modo, or do something a little more systemic.

For the sake of trying something different, I settled on having the main movement down the tunnel, rotation of the centre wheel, tail and little flippy bits (technical term) through blueprints, and then blend in a few hand animated bits.

There are three main blueprints that do the work: the Scanner blueprint, the Scanner Path blueprint, and the Scanner Attract Point blueprint.

ScannerBlueprints.png

The division of labour between these things ended up being pretty arbitrary, but the initial idea was that an Attract Point can start playing an animation on the Scanner when it reaches the point, and can also modify the max speed of the Scanner when leaving the point.

Here are the parameters that each point has:

AttractPointProperties.png

So when the Scanner reaches an attract point, it can pause for a while (I use this time to play an animation, generally). The animation doesn’t start when the scanner reaches the point though, it actually starts at a certain percentage of distance along the previous spline segment leading up to this point.

There is also a blend in and out time for the animation, to give transition from the manually animated idle.

The animation blueprint itself does very little:

Scanner_AnimBlueprint.png

Down the bottom left is the Idle animation that happily ticks away all the time, and that blends with an override animation, which is what the Attract Points set.

Each of the rotators are driven by procedural animation on the Scanner blueprint, which I’ll show in a bit.

Improvements in hindsight

The Idle / Override blending part of this is definitely something I would change in hindsight, because blending in and out of the idle is a mess: the scanner could be on an up or down point when I start.

There’s a few ways I could deal with it, including just changing the up and down sine wave motion to be driven by blueprint instead, or just restarting the idle animation when I start blending it back in (or probably a dozen better ways to do it that I don’t know about :P).

Also, the “pause” functionality in the Attract Points is not a great way to do things.
Timing the pause and playing animations was a lot of trial and error; I should have sent events out from the animations instead to trigger the pause.

Custom animations in Modo

There’s three custom animations that I made in Modo:

  • Idle
  • Searching left and right animation half way down the tunnel
  • The final “do I see something? I think I see something… SQUIRREL!!”

Everything in the mesh is hard rigged (no deformation), so just parenting all the pieces together, and importing into Unreal generates a skeleton for me.

In Modo, I didn’t do anything particularly exciting this time around, i.e no interesting rig, I just keyframed the bones.

Modo has export presets for UE4 and Unity now, which is pretty ace!
You can also set up your own presets:

ModoExportPresets.png

It was pretty fun doing some animation again, it’s something I really don’t do very often, and Modo makes it pretty easy to dive in.

Tick-tock

Ok, back in UE4, time to have a quick look over the tick event in the Scanner blueprint.

ScannerBlueprint_tick.png

Unlike my normal lazy self, this time I stitched a bunch of things together, so you can see it in full!

Embrace lag

I wanted quite a few of the animations to be driven by the velocity of the scanner.
I’m calculating the velocity myself anyway, based on set max speed and acceleration values, so I could have just used that value, and built some lag into it.

But I found something more fun:

SpringArms.gif

I have 2 non colliding “spring arms” attached to the Scanner root node, one that is slow to catch up (used for body tilt), one that is fairly quick (used for tail rotation).

This was inspired by some amazingly cool animation work I saw Greg Puzniak doing in Unity3D. It’s not up on his site, but he has cool matcap stuff up there that you should check out! 🙂

So in a lot of my blueprint functions, I get the distance from the spring arm to the arrow, and use that to drive laggy rotation:

DaysOfYaw.png

Despite the comment in the blueprint, when I tried to use this for banking, the ATM swallowed my card (har har har).

So most of my procedural animation is driven this way.
When I’m not playing the override animations, everything except for the up and down bobbing is procedural. Which is pretty silly, because that’s a sine wave, so it’s not really an Idle animation, and I should have got rid of it.

The rest of the blueprint is about keeping track of how far along the spline we are, how close to the next point we are (so we can start playing an animation, if necessary), and then setting the world location and rotation of the Scanner from the distance along the spline.

Next-gen volumetric fog

Is something I don’t know much about, so I just made a hacky material UV offset thingo 😛

I say “made”, but there are some great light beam and fog examples from Epic included in the engine content, so I grabbed most of this thing from those examples, and here:

Fog Sheet and Light Beams

The scanner has a geometry cone on it that is UV mapped 0->1 U along the length.
LightConeWire.png

I don’t think I really changed much from the content example (I honestly can’t remember), but I did add two parameters that adjust the tiling offset of the noise texture:

lightbeammat

As the Scanner moves along the path, it increases the FogForwardOffset which pans the U coordinate of the UVs, so that it looks like the cone is moving through a volume. There’s also always a little bit of panning going on anyway, even when the Scanner is stopped.

As the Scanner rotates, I scroll the V coordinate, just so the noise in the beam doesn’t stay fixed. The rotation isn’t very convincing, so I could probably do a better job of that.
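
Written out as HLSL instead of material nodes, the UV shuffle is roughly this (names and constants made up):

float2 ConeFogUV(float2 UV, float FogForwardOffset, float RotationOffset, float Time)
{
    UV.x += FogForwardOffset + Time * 0.02f;   // pan along the cone as the scanner moves, plus a constant drift
    UV.y += RotationOffset;                    // scroll across the cone as the scanner rotates
    return UV;
}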

There’s not much going on in the blueprint, but I put whatever I could in there rather than in the material:

UpdateConeFog.png

Lighting

SceneLighting.png

The idea of the scene was to be almost entirely lit by the scanner, but I do have a bunch of static lights scattered around too, just to give some ambient light.

There are also two stationary lights in the scene to get highlights where I want them (the lights selected in the screenshot above).
One of them is a spotlight, used to hit the puddle and left wall.

There is also a small light at the front of the tunnel that has “Indirect Lighting Intensity” set to 0, so it doesn’t affect the bounced lighting.
This is the light that hits the scanner here:

ScannerTunnelFrontLight.png

This light is quite bright, so when the Scanner hits it, the rest of the environment darkens down, which (hopefully) puts the focus all on the Scanner (yay for auto-exposure!).

There are only two shadow casting lights in the scene, and they are both on the Scanner.
One of them is a giant point light, and is super expensive, which is the main reason I limited shadow casters everywhere else:

scannershadowlights

Spinning spotlights

There are also two non shadow casting spotlights on the side of the scanner that rotate and project patterns on the wall.

LightsOnWalls.png

For some reason that I can’t remember, I decided to generate the pattern in a material, rather than do the smart thing and use a texture.

SpotlightFunction.png

I’m modifying the “VectorToRadialValue” function to generate bands, then fading it out in the middle bit.
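
I don't have the modified function handy, but the general idea is something like this (a made-up sketch in UV space, not the actual material graph):

float RadialBands(float2 UV, float NumBands, float InnerFade)
{
    float2 Centered = UV - 0.5f;

    // Angle around the centre, remapped to 0-1, then chopped into alternating bands
    float Angle = atan2(Centered.y, Centered.x) / 6.2831853f + 0.5f;
    float Bands = step(0.5f, frac(Angle * NumBands));

    // Fade the bands out towards the middle
    float Radius = length(Centered) * 2.0f;
    return Bands * saturate((Radius - InnerFade) / (1.0f - InnerFade));
}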

Seriously though, unless you have a really good reason, you should probably do this in a texture 🙂

Conclusion

So I *think* that’s it!

I’m sure there are things I’ve missed, or glossed over a bit, so feel free to ask questions in the comments and I’ll fill in the gaps.

 

 

City scanner scene – Breakdown pt2

October 13, 2016

Webs.gif

This is part 2 of the breakdown for my recent scene Half-Life 2 scanner scene (part 1 here).

This time, I’m going to focus on the Houdini web setup.

Although it took me a while to get a very subtle result in the end, it was a fun continuing learning experience, and I’m sure I’ll re-use a bunch of this stuff!

Go go Gadget webs!

I saw a bunch of really great photos of spider webs in tunnels (which you can find yourself by googling “tunnel cobwebs concrete” :)).

I figured it would be a fun time to take my tunnel into Houdini, and generate a bunch of animated hanging webby things, and bring them back into UE4.

This fun time ended up looking like a seahorse:

itsaseahorselol.png

I will break this mess down a bit 🙂

Web starting points

PointsAndRaysGraph.png

I import the geometry for the tunnel and rails, and scatter a bunch of points over it, setting their colour to red.

On the right hand side of the seahorse is a set of nodes for creating hanging webs, which is just some straight down line primitives, with a few attributes like noise and thickness added to them.
I’ll come back to these later:

HangingWebs.png

In the top middle of the seahorse, I have a point vop apply two layers of noise to the colour attribute, and also blend the colour out aggressively below the rails, because I only wanted webs in the top half of the tunnel.

The web source points look like this:

WebPoints.png

From these points, I ray cast out back to the original geometry.

Ray casting straight out of these points would be a little boring, though, so I made another point vop that randomizes the normals a little first:

WebNormals.gif

After this, I have a few nodes that delete most of the points generated from the pipe connections: they have a high vertex density, compared to every other bit of mesh, so when I first ran the thing, I had a thousand webs on the pipe connections.
I also delete really small webs, because they look lame.

We are now at seahorse upper left.

Arcy Strangs.

ArcyStrangs.png

Not sure what I was thinking when naming this network box, but I’m rolling with it.

So anyway, the ray cast created a “dist” attribute for distance from the point to the ray hit, in the direction of the normal.

So my “copy1” node takes a line primitive, copies it onto the ray points, sets the length of the line to the “dist” attribute (my word, stamping is such a useful tool in Houdini).

CopyLines.png

Before the copy, I set the vertex red channel from black to red along the length of the line, just for convenience.

Previously up the chain, I found the longest of all the ray casts, and saved it off in a detail attribute. This is very easy to do by just using Attribute Promote, with Maximum as the Promotion Method.

So, I now define a maximum amount of “droop” I want for the webs, a bit of random droop, and then I use those values to move each point of each web down in Y a bit.

WebDroop.png

I sample that ramp parameter up there using the web length, and then multiply that over the droop, so that each end of the web remains fastened in place.
And I don’t really care if webs intersect with the rails, because that’s just how I roll…

Fasten your seatbelts, we are entering seahorse spine.

Cross web connecty things

ConnectingWebStrands.png

For each of the webs in the previous section, I create some webs bridging between them.
Here’s the network for that.

ConnectingStrands.png

I use Connect Adjacent Pieces, using Adjacent Pieces from Points, letting the node connect just about everything up.

I use a carve node to cut the spline up, then randomly sort the primitives.

At this point, I decided that I only want two connecting pieces per named web, and I got lazy so I wrote vex for this:

string CurrentGroupName = "";

string PickedPieces[];
int PieceCount[];

int MaxPerPiece = 2;
int success = 0;

addprimattrib(geoself(), "toDelete", 0, "int");

for (int i = 0; i < nprimitives(geoself()); i ++)
{
    string CurrentName = primattrib(geoself(), "name", i, success);

    int FindIndex = find(PickedPieces, CurrentName);
    
    if (FindIndex < 0)
    {
        push(PickedPieces, CurrentName);        
        push(PieceCount, 1);
    }
    else
    {  
        int CurrentPieceCount = PieceCount[FindIndex];
        
        if (CurrentPieceCount >= MaxPerPiece)
        {
            setprimattrib(geoself(), "toDelete", i, 1, "set");
        }
        else
        {
            PieceCount[FindIndex] = CurrentPieceCount + 1;
        }
    }
    
    setprimattrib(geoself(), "name", i, CurrentName);
}

So that just creates an attribute on a connecting piece called “toDelete”, and you can probably guess what I do with that…

The rest of the network is the same sort of droop calculations I mentioned before.

One thing I haven’t mentioned up to this point, though, is that each web has a “Primitive ID” attribute. This is used to offset the animation on the webs in UE4, and the ID had to get transferred down the chain of webs to make sure they don’t split apart when one web meets another.

At this point, I add a bunch of extra hanging webs off these arcy webs, and here we are:

AllWebWires.png

Then I dump a polywire in, and we’re pretty much good to go!

Well… Ok. There’s the entire seahorse tail section.

For some reason, Polywire didn’t want to generate UVs laid out along the web length.

I ended up using a foreach node on each web, stacking the web sections up vertically in UV space, using a vertex vop, then welding with a threshold:

LayoutUVs.png

Since I have the position, 0-1, along the current web, I could use that to shift the UV sections up before welding.

With that done on every web, my UVs look like this:

UVsHoriz.png

Which is fine.
When I import the meshes into UE4, I just let the engine pack them.

Seriously, though… These are the sorts of meshes that I really wish I could just bake lighting to vertex colours in UE4 instead of a lightmap.
It would look better, and have saved me lots and lots of pain…

And here we are, swing amount in red vertex channel, primitive offset (id) in green:

FinalWebs.png

Web contact meshes

I wanted to stamp some sort of mesh / decal on the wall underneath the hanging meshes.
If you have a look back at the top of the seahorse, you might notice an OUT_WebHits node which contains all the original ray hits.

I’m not going to break this down completely, but I take the scatter points, bring in the tunnel geometry, and use the scatter points to fracture the tunnel.

I take that, copy point colour onto the mesh, and subdivide it:

WallWebsSubd.png

Delete all the non red bits, push the mesh out along normals with some noise, polyreduce, done 🙂

WallWebsFinal.png

I could have done much more interesting things with this, but then life is full of regrets isn’t it?

Back to UE4

So, export all that stuff out, bring it into UE4.

Fun story: the first export I did was accidentally over 1 million vertices, and the mesh still rendered in less than half a millisecond on a GeForce 970.
We are living in the future, people.

CobwebsMaterial.png

Most of this material is setting up the swinging animation for the webs, using World Position Offset.

There’s two sets of parameters for everything: One for when the web is “idle”, one for when it is being affected by the Scanner being near it.

To pass the position of the scanner into the material, I have to set up a Dynamic Material Instance, so this is all handled in the web blueprint (which doesn’t do much else).

It also passes in a neutral wind direction for when the webs are idle, which I set from the forward vector of an arrow component, just to make things easy:

WindDirection.png

So now I have the scanner position, and for each vertex in each web I get the distance between it and the scanner, and use that to lerp between the idle and the “windy” settings.

All of these values are offset by the position id that I put in the green channel, so that not all of the webs are moving at exactly the same time.
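
As a rough HLSL version of what the World Position Offset graph is doing (all names made up, the real thing is material nodes):

float3 WebWindOffset(float3 WorldPos, float3 ScannerPos, float3 IdleWind, float3 ScannerWind,
                     float InfluenceRadius, float SwingAmount, float PrimitiveId, float Time)
{
    // 0 when the scanner is far away, 1 when it's right on top of the web
    float ScannerBlend = 1.0f - saturate(distance(WorldPos, ScannerPos) / InfluenceRadius);

    // Offset the swing phase per web (green vertex channel) so they don't all move in sync
    float Swing = sin(Time * 2.0f + PrimitiveId * 6.2831853f);

    // SwingAmount is the red vertex channel, so the anchored ends of each web stay put
    float3 Wind = lerp(IdleWind, ScannerWind, ScannerBlend);
    return Wind * Swing * SwingAmount;
}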

Still to come…

Animation approach from Modo to blueprints, lighting rig for the scanner, all the fun stuff! 🙂

City scanner scene – Breakdown pt1

October 12, 2016

EnvWideShot.png

In this post, I’ll go through the construction of the environment for my recently posted Half Life 2 scanner scene.

The point of this project was really just to do a bit of animation on my scanner, and show it off in a simple environment. I can’t remember the last time I did any animation, but my guess would be when I was studying at the AIE over ten years ago 🙂

So with that in mind, figuring I was going to struggle with the animation side, I wanted to keep the environment dead simple. It was always going to be dark, anyway, since I wanted the scanner to light the scene!

Modelling / texturing the tunnel

I looked up a bunch of photo reference for cool tunnels in Europe, presumably the sort of thing that the resistance in city 17 would have used 🙂

I blocked out basic lighting, camera setup, and created the tunnel out of cubes in UE4.
Once I was happy with the layout, I could then just export the blocked out mesh to FBX to use as a template in Modo:

WIP_ExportBlockout.png

I also took the time to make a really basic animatic.
I changed the path of the scanner quite a bit, and timing, etc, but I still found this to be useful:

Anyway, at this point, the scene blockout is in Modo, and I can start building geometry:

WIP_SceneBlockoutModo.png

The geometry itself is dead simple, so I won’t go into that too much, I just extruded along a spline, then beveled and pushed a few edge loops around 🙂

I always use the sculpt tools to push geometry around a little, just to make things feel a bit more natural. Here specifically I was sinking some of the vertices on the side pathways:

WIP_PushVertsModo.png

Layered vertex painted materials can be expensive, so I wanted to avoid going too far down that path.
In the end, I settled on having two layers: concrete, and moldy damp green stuff:

WIP_WallMaterial.png

The green stuff is vertex paint blended on, and the vertex colours for the mask were painted in UE4 rather than in Modo, just because it’s quick and easy to see what I’m doing in editor.

Most of the materials in the scene were made in Substance painter.
And I’m lazy, so they are usually a couple of layers with procedural masks, and one or two hand painted masks 🙂

substancepainterconcrete

Water plane

Water.gif

For the purposes of this scene, I could get away with a pretty low tech / low quality water plane. As long as it had some movement, and is reflective, then it would do!

The engine provides flow map samples and functions in the content samples, so I just used those. I’ve written my own ones before (and by that, I mean I copied what they were doing in the Portal 2 Water Flow presentation from siggraph 2010), but the UE4 implementation does exactly what I wanted 🙂

And seriously, if you haven’t looked at that presentation, go do it.
They used Houdini to generate water flow, but I’m lazy and ain’t got time for that! (Not for this scene, at any rate).

I just generated mine in Photoshop, using this page as a guide:

Photoshop generated flow maps

At some point, I’d like to see if I can set up the same workflow in Substance Painter and/or Houdini.

Anyway, the material is a bit messy (sorry):

watermaterial

I’m passing the flowmap texture and some timing parameters into the flowmaps material function, and getting a new normal map out of it.
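
The flow map function itself boils down to something like this (a sketch of the standard technique, not the actual engine material function):

float3 FlowMapNormal(Texture2D NormalTex, SamplerState Samp, float2 UV, float2 FlowDir, float Time)
{
    // Two copies of the scrolling normal map, half a cycle out of phase
    float Phase0 = frac(Time);
    float Phase1 = frac(Time + 0.5f);

    float3 N0 = NormalTex.Sample(Samp, UV - FlowDir * Phase0).xyz * 2.0f - 1.0f;
    float3 N1 = NormalTex.Sample(Samp, UV - FlowDir * Phase1).xyz * 2.0f - 1.0f;

    // Weight each copy most heavily when its distortion is smallest, which hides the reset
    float Blend = abs(2.0f * Phase0 - 1.0f);
    return normalize(lerp(N0, N1, Blend));
}

FlowDir here is the flow map sample remapped from 0-1 to -1..1 and scaled by a strength parameter.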

The only other thing going on here is that I have a mask for the edges of the water, where it is interacting with the walls. I blend in different subsurface colour, normal strength and roughness at the edges.

Fog planes

FogPlanes.png

I’ve got a few overlapping fog planes in the scene, with a simple noisy texture, offset by world position (having a different offset on each makes it feel a little more volumetric).

Much like the water, the fog plane has a subtle flow map on it, to fake a bit of turbulence, and the material uses depth fade on opacity to help it blend with the surrounding geometry:

fog

UE4 4.13 mesh decals

I was going to use a bunch of the new 4.13 features originally, but in the end I think the only one I used was “mesh decals”.

These are decals in the old school sense, not the projected decals that UE4 users have probably come to love. In the back of my mind, I had thought I might turn this into a VR scene at some point, and the cost of projected decals is a somewhat unknown commodity for me at the moment.

The main advantage of mesh decals, vs floating bits of geometry with Masked materials, is that mesh decals support full alpha blending.

In these shots, the water puddle, stain and concrete edge damage are all part of the same decal sheet:

The decals are all using Diffuse, Normals, Roughness, Metallic and Occlusion (the last three packed together):

DecalsTextures.png

I built the decals one at a time, without much planning, basically guessing at how much texture space I thought I was going to need (I didn’t bother setting a “texels per metre” type of limit for my project, but that probably would have been sensible).

Each time I wanted a new mesh decal, I’d work out in Modo how big I want it first:

ModoDecalMeshes.png

Then I’d copy it into a separate Modo scene just for decal layout, which I take into Substance Painter.
I just did this so I could keep all the mesh together in one space, to keep it easy for painting:

ModoDecalScene.png

And then here is the scene in Substance:

SubstancePainterDecalScene.png

And here is the scene with and without decals:

meshdecals

What’s great about this, is that mesh decals don’t show up in Shader Complexity, so the tech artists on the project will never know… (I kid, I kid. They will find them in PIX, and will hunt you down and yell at you).

I really like this approach to building wear and tear into materials. The first time I saw this approach was when I was working at Visceral Games in Melbourne, and the engine was very well optimized to handle a pretty huge amount of decals. I didn’t embrace it as much as I should have, back then.

Rails

A few years back, I made a blueprint for pipes that allowed joining sections, etc.
So I knocked together a model in Modo for the connection pieces:

RailBracketModo.png

Edge-weighted sub-d, of course, because I can’t help myself 🙂
I even started sculpting in some heavy rust, but had to have a stern word to myself about not spending too much time on stuff that isn’t even going to be lit…

Textured in Substance Painter:

railbracketsubstance

Same dealio with the pipe segments:

railsubstance

Then I just built the spline in the editor, and set it up like in my old blog post.

Much like I did with the original blockout geometry, I also exported the final pipes back out to Modo so that I could use them to work out where I wanted to put some decals.

The only other thing that was a pain was that the pipes need lightmaps, but I couldn’t work out a way to generate unique UVs for the final pipe mesh.

In the end, I just used the merge actors function in the editor, so that they all became a single static mesh, and let Unreal generate lightmap UVs.

Webs

Did you notice that there were hanging spider webs in the scene?
No? Good, because I don’t like them much 😛

I probably spent 10-20 hours just messing about with these silly things, but at least I got some fun gifs out of them:

BusySpiders.gif

Next up…

I’ll break down the construction of those web things, might be useful for a scene full of badly animated vines, I suppose…

I’ll also go through all of the silly things I did on the animation / blueprint / lighting side.

City Scanner scene

October 10, 2016

Had a bit of fun making a scene for my Half Life 2 city scanner.
Will do some break downs of the scene here on my blog at some point 🙂

City Scanner

August 27, 2016

Since I had so much fun with the last Modo / Substance project I did, thought I’d do another one 🙂

This time, I decided to make a City Scanner, from Half Life 2.
It’s a work in progress, and I’ll keep posting regular screenshots up on my twitter, but here’s where I’m currently at:

WIP10

I could have been smart, and just grabbed the model from Source and built around it, but I need to practice building things from scratch, so I built it based off a bunch of screenshots I grabbed out of the game.

It has quite a few differences to the original, which I’m going to pretend is due to creative license, rather than me screwing up proportions, etc. (I particularly hate the green side panel, and some of the rear details, but I’m not going to fix the modelling on those at this point)…

Building the model

As with everything I do, this was built as an edge-weighted Catmull-Clark subdivision surface, in Modo 10.

Whenever working on these things, I tend to throw some basic Modo procedural materials on and render them out, so here’s where I was at by the end of the high poly process:

ScannerRender.png

Once I was happy with the model (read: sick of working on it :P), I created the low poly mesh for it, and unwrapped the thing.

WIP_WireLP.png
Unwrapping aside, this didn’t take a huge amount of time, because I just used the base sub-d cage, and stripped out a bunch of loops.
It’s pretty heavy still, at about 7000 vertices, but it’ll do!

Painter work

I could have baked the procedural materials out of Modo, and painted over the top of them, etc (Modo actually has some great baking and painting tools these days), but I need to keep using painter more.

Probably the largest amount of time I spent from this point on was splitting the high and low poly up into lots of different meshes so that I could bake all the maps I needed in Substance Painter.

Models with lots of floating, yet welded intersecting parts are a little bit of a pain for this sort of thing, but I got there eventually.

From Modo, I baked out a Surface ID mask (actually, I used a Diffuse render output, and had flood fill colours on all my materials, but I use it as a Surface ID mask in Painter):

SurfaceIDs

For each of the colour blocks, I set up a folder in Painter that had a Colour Selection mask on it:

WIP_ColourSelection.png

And then I just stack up a bunch of flood fill colour layers with masks until I’m happy.

There’s not a lot of actual painting going on here, at this point, although I do always paint out some parts of the procedural masks, because having even edge wear across the whole model looks pretty silly.

That said, smart masks with flood fill layers aren’t a bad way to flesh out basic wear and tear, etc:

WIP-SmartMask.png

I still need to paint out more of the wear and tear on my model, and put more colour variation in, it looks a little like it has been in a sandstorm, then thrown down some stairs 🙂

UE4

UE4Wip_2.png

Aside from some issues with Reflection Capture Actors (having emissive materials in a scene can mess them up a bit), I really didn’t do much except throw the exported textures from Substance onto the mesh, and put a few lights in.

I did mess about with the texels per pixel, min and fade resolutions, and radius thresholds of the shadow casters a bit, because the default settings for shadows in UE4 are pretty low quality for some reason, even on Epic settings.

The material is really boring at the moment, the only thing it exposes is a multiplier for emissive:

WIP-UE4Material.png

Next steps

I will probably animate this in UE4 at some point, and have it floating around, flashing lights, etc.
And it will end up as a minor piece in an environment at some point, hopefully 🙂

For now, though, I want to continue down the fun path of Modo sub-d/Substance, so I will probably start working on a new model.

Watch this space, and/or twitter 🙂