
Subsurface Scattering spherical harmonics – pt 2

March 22, 2017

 

This is my second blog post on using spherical harmonics for depth-based lighting effects in Unreal 4.

The first blog post focused on generating the spherical harmonics data in Houdini, this post focuses on the Unreal 4 side of things.

I’m going to avoid posting much code here, but I will try to provide enough information to be useful if you choose to do similar things.

SH data to base pass

The goal was to look up the depth of the object from each light in my scene, and see if I could do something neat with it.

In UE4 deferred rendering, that means I need to pass my 16 coefficients from the material editor -> base pass pixel shader -> lighting pass.

First up, I read the first two SH coefficients out of the red and green vertex colour channels, and the rest out of my UV sets (remembering that I kept the default UV set 0 for actual UVs):

SHBaseMatUVs

Vertex colour complications

You’ll notice a nice little hardcoded multiplier up there… This was one of the annoyances of using vertex colours: I needed to scale the value of the coefficients in Houdini to 0-1, because vertex colours are clamped to 0-1.

This is different to the normalization part I mentioned in the last blog post, which was scaling the depth values before encoding them in SH. Here, I’m scaling the actual computed coefficients. I only need to do this with the vertex colours, not the UV data, since UVs aren’t restricted to 0-1.

The 4.6 was just a value that worked, using my amazing scientific approach of “calculate SH values for half a dozen models of 1 000 – 10 000 vertices, find out how high and low the final SH values go, divide through by that number +0.1”. You’d be smarter to use actual math to find the maximum range of coefficients for normalized data sets, though… It’s probably something awesome like 0 –> 1.5 pi.
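Since I’m avoiding posting much code, here’s a rough sketch of what that scale/unscale round trip looks like. The exact mapping isn’t shown in the post, so the signed remap below is my assumption; only the 4.6 range comes from the post:

```python
# Illustrative sketch of the vertex-colour round trip (NOT the actual
# Houdini/UE4 code): squash coefficients into 0-1 for vertex colour storage,
# then scale back up in the material with the hardcoded range.
COEFF_RANGE = 4.6  # the post's empirically chosen multiplier

def encode_for_vertex_colour(coeff):
    """Map a coefficient from roughly [-2.3, 2.3] into the 0-1 colour range."""
    return coeff / COEFF_RANGE + 0.5

def decode_in_material(colour):
    """Undo the encoding in the material; this is where the hardcoded 4.6 lives."""
    return (colour - 0.5) * COEFF_RANGE
```

The UV-set coefficients skip this entirely, since UVs aren’t restricted to 0-1.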

Material input pins

Anyway, those values just plug into the SH Depth Coeff pins, and we’re done!!

Unreal 4 SH depth material

Ok.
That was a lie.
Those pins don’t exist usually… And neither does this shading model:

SHDepthShadingModel

So, that brings me to…

C++ / shader side note

To work out how to add a shading model, I searched the source code for a different shading model (hair, I think), copied and pasted just about everything, and then went through a process of elimination until things worked.
I took very much the same approach to the shader side of things.

This is why I’m a Tech Artist, and not a programmer… Well, one of many reasons 😉
Seriously though, being able to do this is one of the really nice things about having access to engine source code!

The programming side of this project was a bunch of very simple changes across a wide range of engine source files, so I’m not going to post much of it:

P4Lose

There is an awful lot of this code that really should be data instead. But Epic gave me an awesome engine and lets me mess around with source code, so I’m not going to complain too much 😛

Material pins (continued…)

So I added material inputs for the coefficients, plus some absorption parameters.

Sh coeffs

The SH Coeffs material pins are new ones, so I had to make a bunch of changes to material engine source files to make that happen.
Be careful when doing this: Consistent ordering of variables matters in many of these files. I found that out the easy way: Epic put comments in the code about it 🙂

Each of the SH coeffs material inputs is a vector with 4 components, so I need 4 of these to send my 16 coefficients through to the base pass.

Custom data (absorption)

The absorption pins you might have noticed from my material screenshot are passed as “custom data”.
Some of the existing lighting models (subsurface, etc) pass additional data to the base pass (and also through to lighting, but more on that later).

These “custom data” pins can be renamed for different shading models. So you can use these if you’d rather not go crazy adding new pins, and you’re happy with passing through just two extra float values.
Have a look at MaterialGraph.cpp, and GetCustomDataPinName if that sounds like a fun time 🙂

Base pass to lighting

At this point, I’d modified enough code that I could start reading and using my SH values in the base pass.

A good way to test whether the data was valid was to use the camera vector to look up the SH depth values. I knew things were working when I got similar results to what I was seeing in Houdini when using the same approach:

BasePassDebug

That’s looking at “Base Color” in the buffer visualizations.
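In pseudocode, that debug lookup boils down to evaluating the SH basis in the camera direction and dotting it with the stored coefficients. Here’s a sketch using just the first two bands to keep the basis functions readable; the real 4-band lookup (Get4BandSH, later) does the same thing with all 16 basis functions:

```python
# Simplified 2-band SH lookup (4 of the 16 coefficients), illustrating what a
# Get4BandSH-style function does: evaluate the real SH basis in the lookup
# direction and dot it with the coefficients.
def eval_sh_2band(coeffs, direction):
    x, y, z = direction  # assumed normalized
    basis = [
        0.282095,      # Y(0, 0): constant band
        0.488603 * y,  # Y(1,-1)
        0.488603 * z,  # Y(1, 0)
        0.488603 * x,  # Y(1, 1)
    ]
    return sum(c * b for c, b in zip(coeffs, basis))
```

With only the first coefficient set, the result is the same in every direction, which makes a handy sanity check.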

I don’t actually want to do anything with the SH data in the base pass, though, so the next step is to pass the SH data through to the lighting pass.

Crowded Gbuffer

You can have a giant parameter party, and read all sorts of fun data in the base pass.
However, if you want to do per-light stuff, at some point you need to write all that data into a handful of full screen buffers that the lighting pass uses. By the time you get to lighting, you don’t have per object data, just those full screen buffers and your lights.

These gbuffers are lovingly named GBufferA, GBufferB, GBuffer… You get the picture.

You can visualize them in the editor by using the various buffer visualizers, or explicitly using the “vis” command, e.g: “vis gbuffera”:

visGbuffers

There are some other buffers being used (velocity, etc), but these are the ones I care about for now.

I need to pass an extra 16 float values through to lighting, so surely I could just add 4 new gbuffers?

Apparently not: the limit on simultaneous render targets is 8 🙂

I started out by creating 2 new render targets, so that covers half of my SH values, but what to do with the other 8 values?

Attempt 1 – Packing it up

To get this working, there were things that I could sacrifice from the above existing buffers to store my own data.

For example, I rarely use Specular these days, aside from occasionally setting it to a constant, so I could use that for one of my SH values, and just hard code Specular to 1 in my lighting pass.

With this in mind, I overwrote all the things I didn’t think I cared about for stylized translucent meshes:

  • Static lighting
  • Metallic
  • Specular
  • Distance field anything (I think)

Attempt 2 – Go wide!

This wasn’t really ideal. I wasn’t very happy about losing static lighting.

That was about when I realized that although I couldn’t add any more simultaneous render targets, I could change the format of them!

The standard g-buffers are 8 bits per channel, by default. By going 16 bit per channel, I could pack two SH values into each channel, and store all my SH data in my two new g-buffers without the need for overwriting other buffers!

Well, I actually went with PF_A32B32G32R32F, so 32 bits per channel because I’m greedy.

It’s probably worth passing out in horror at the cost of all this at this point: two 128-bit buffers is something like 250 MB of data. I’m going to talk about this a little later 🙂
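For the record, the packing trick itself looks something like the sketch below. My buffers are float targets, so the engine-side details differ, but with a uint target the quantize-and-shift is easy to do by hand (this is an illustration of the idea, not engine code):

```python
# Pack two [0,1] floats into one 32-bit channel at 16 bits each.
def pack_two(a, b):
    a16 = int(round(min(max(a, 0.0), 1.0) * 65535))
    b16 = int(round(min(max(b, 0.0), 1.0) * 65535))
    return (a16 << 16) | b16

def unpack_two(packed):
    """Recover the two values; error is at most half a quantization step."""
    return ((packed >> 16) & 0xFFFF) / 65535.0, (packed & 0xFFFF) / 65535.0
```

Two coefficients per channel, four channels per buffer, so eight coefficients fit in each of the two new gbuffers.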

Debugging, again

I created a few different low-complexity procedural test assets in Houdini as test cases, including one where, as a final step, I deleted all but one polygon, so that I could very accurately debug the SH values 🙂

On top of that, I had a hard coded matrix in the shaders that I could use to check, component by component, that I was getting what I expected when passing data from the base pass to lighting, with packing/unpacking, etc:

const static float4x4 shDebugValues = 
{
	0.1, 0.2, 0.3, 0.4,
	0.5, 0.6, 0.7, 0.8,
	0.9, 1.0, 1.1, 1.2,
	1.3, 1.4, 1.5, 1.6
};

It seems like an obvious and silly thing to point out, but it saved me some time 🙂

Here are some of my beautiful procedural test assets (one you might recognize from the video at the start of the post):

Houdini procedural test asset (rock thing) testobject3 testobject2 testobject1

“PB-nah”, the lazy guide to not getting the most out of my data

Ok, SH data is going through to the lighting pass now!

This is where a really clever graphics programmer could use it for some physically accurate lighting work, proper translucency, etc.

To be honest, I was pleasantly surprised that anything was working at this stage, so I threw in a very un-pbr scattering, and called it a day! 🙂

float3 SubsurfaceSHDepth( FGBufferData GBuffer, float3 L, float3 V, half3 N )
{
    // Absorption controls passed through as "custom data"
    float AbsorptionDistance = GBuffer.CustomData.x;
    float AbsorptionPower    = lerp(4.0f, 16.0f, GBuffer.CustomData.y);

    // Depth through the model from this pixel towards the light
    float DepthFromPixelToLight  = Get4BandSH(GBuffer.SHCoeffs, L);
    float AbsorptionClampedDepth = saturate(DepthFromPixelToLight / AbsorptionDistance);

    // Fall off by face angle away from the light, with a wrap factor
    float SSSWrap          = 0.3f;
    float FrontFaceFalloff = pow(saturate(dot(-N, L) + SSSWrap), 2);

    // Deeper / more absorbent = less light makes it through
    float Transmittance = pow(1 - AbsorptionClampedDepth, AbsorptionPower);
    Transmittance *= FrontFaceFalloff;

    return Transmittance * GBuffer.BaseColor;
}
It’s non-view-dependent scattering, using the SH depth through the model towards the light, then dampened by the absorption distance.

The effect falls off by face angle away from the light, but I put a wrap factor on that because I like the way it looks.

For all the work I’ve put into this project, probably the least of it went into the actual lighting model, so I’m pretty likely to change that code quite a lot 🙂

What I like about this is that the scattering stays fairly consistent around the model from different angles:

GlowyBitFront GlowyBitSide

So as horrible and inaccurate and not-PBR as this is, it matches what I see in SSS renders in Modo a little better than what I get from standard UE4 SSS.

The End?

Broken things

  • I can’t rotate my translucent models at the moment 😛
  • Shadows don’t really interact with my model properly

I can hopefully solve both of these things fairly easily (store data in tangent space, look at shadowing in other SSS models in UE4), I just need to find the time.
I could actually rotate the SH data, but apparently that’s hundreds of instructions 🙂

Cost and performance

  • 8 uv channels
  • 2 * 128 bit buffers

Not really ideal from a memory point of view.

The obvious optimization here is to drop down to 3 band spherical harmonics.
The quality probably wouldn’t suffer, and that’s 9 coefficients rather than 16, so I could pack them into one of my 128 bit gbuffers instead of two (with one spare coefficient left over that I’d have to figure out).

That would help kill some UV channels, too.

Also, using 32 bits per channel (so 16 bits per SH coefficient) is probably overkill. I could swap over to a 16-bits-per-channel uint buffer and pack two coefficients per channel at 8 bits each, which would halve the memory usage again.

As for performance, presumably evaluating 3-band spherical harmonics would be cheaper than 4-band. Well, especially because then I could swap to using the optimized UE4 functions that already exist for 3-band SH 🙂

Render… Differently?

To get away from needing extra buffers and having a constant overhead, I probably should have tried out the new Forward+ renderer:

https://docs.unrealengine.com/latest/INT/Engine/Performance/ForwardRenderer/

Since you have access to per object data, presumably passing around sh coefficients would also be less painful.
Rendering is not really my strong point, but my buddy Ben Millwood has been nagging me about Forward+ rendering for years (he’s writing his own renderer http://www.lived3d.com/).

There are other alternatives to deferred, or hybrid deferred approaches (like Doom 2016’s clustered forward, or Wolfgang Engel’s culled visibility buffers) that might have made this easier too.
I very much look forward to the impending not-entirely-deferred future 🙂

Conclusion

I learnt some things about Houdini and UE4, job done!

Not sure if I’ll keep working on this at all, but it might be fun to at least fix the bugs.

 


City scanner scene – Breakdown pt3

October 15, 2016

ScannerFloat.gif

Part 3 of the breakdown of my recent Half-Life 2 Scanner scene.

And now for animation! Also known as “Geoff stumbling around blindly for a week when he really should have watched some UE4 animation tutorials”.

So, erm, feel free to use this as a guide on how *not* to approach making an object float down a hallway…

Down the garden path

Early on, I was trying to work out if I wanted to just animate the whole thing from start to finish in Modo, or do something a little more systemic.

For the sake of trying something different, I settled on having the main movement down the tunnel, rotation of the centre wheel, tail and little flippy bits (technical term) through blueprints, and then blend in a few hand animated bits.

There are three main blueprints that do the work: the Scanner blueprint, the Scanner Path blueprint, and the Scanner Attract Point blueprint.

ScannerBlueprints.png

The division of labour between these things ended up being pretty arbitrary, but the initial idea was that an Attract Point can start playing an animation on the Scanner when it reaches the point, and can also modify the max speed of the Scanner when leaving the point.

Here are the parameters that each point has:

AttractPointProperties.png

So when the Scanner reaches an attract point, it can pause for a while (I generally use this time to play an animation). The animation doesn’t start when the scanner reaches the point, though; it actually starts at a certain percentage of the distance along the previous spline segment leading up to that point.

There is also a blend in and out time for the animation, to give a transition from the manually animated idle.

The animation blueprint itself does very little:

Scanner_AnimBlueprint.png

Down the bottom left is the Idle animation that happily ticks away all the time, and that blends with an override animation, which is what the Attract Points set.

Each of the rotators are driven by procedural animation on the Scanner blueprint, which I’ll show in a bit.

Improvements in hindsight

The Idle / Override blending part of this is definitely something I would change in hindsight, because blending in and out of the idle is a mess: the scanner could be on an up or down point when I start.

There’s a few ways I could deal with it, including just changing the up and down sine wave motion to be driven by blueprint instead, or just restarting the idle animation when I start blending it back in (or probably a dozen better ways to do it that I don’t know about :P).

Also, the “pause” functionality in the Attract Points is not a great way to do things.
Timing the pause and playing animations was a lot of trial and error, I should have sent events out from the animations instead that trigger the pause.

Custom animations in Modo

There are three custom animations that I made in Modo:

  • Idle
  • Searching left and right animation half way down the tunnel
  • The final “do I see something? I think I see something… SQUIRREL!!”

Everything in the mesh is hard rigged (no deformation), so just parenting all the pieces together, and importing into Unreal generates a skeleton for me.

In Modo, I didn’t do anything particularly exciting this time around, i.e no interesting rig, I just keyframed the bones.

Modo has export presets for UE4 and Unity now, which is pretty ace!
You can also set up your own presets:

ModoExportPresets.png

It was pretty fun doing some animation again, it’s something I really don’t do very often, and Modo makes it pretty easy to dive in.

Tick-tock

Ok, back in UE4, time to have a quick look over the tick event in the Scanner blueprint.

ScannerBlueprint_tick.png

Unlike my normal lazy self, this time I stitched a bunch of things together, so you can see it in full!

Embrace lag

I wanted quite a few of the animations to be driven by the velocity of the scanner.
I’m calculating the velocity myself anyway, based on set max speed and acceleration values, so I could have just used that value, and built some lag into it.

But I found something more fun:

SpringArms.gif

I have 2 non colliding “spring arms” attached to the Scanner root node, one that is slow to catch up (used for body tilt), one that is fairly quick (used for tail rotation).

This was inspired by some amazingly cool animation work I saw Greg Puzniak doing in Unity3D. It’s not up on his site, but he has cool matcap stuff up there that you should check out! 🙂

So in a lot of my blueprint functions, I get the distance from the spring arm to the arrow, and use that to drive laggy rotation:

DaysOfYaw.png

Despite the comment in the blueprint, when I tried to use this for banking, the ATM swallowed my card (har har har).

So most of my procedural animation is driven this way.
When I’m not playing the override animations, everything except for the up and down bobbing is procedural. Which is pretty silly, because that’s a sine wave, so it’s not really an Idle animation, and I should have got rid of it.

The rest of the blueprint is about keeping track of how far along the spline we are, how close to the next point we are (so we can start playing an animation, if necessary), and then setting the world location and rotation of the Scanner from the distance along the spline.
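The spring-arm lag trick can be sketched outside of blueprints like this. It’s an illustration of the idea rather than the actual Scanner blueprint: a lagged follower chases the target each tick, and the catch-up offset between them drives the tilt or tail rotation (the `degrees_per_unit` scale is a made-up parameter):

```python
import math

def step_follower(follower, target, lag_speed, dt):
    """Move the follower toward the target; lower lag_speed = laggier arm."""
    alpha = 1.0 - math.exp(-lag_speed * dt)  # framerate-independent smoothing
    return follower + (target - follower) * alpha

def rotation_from_offset(follower, target, degrees_per_unit=5.0):
    """Turn the catch-up distance into a rotation amount for the body/tail."""
    return (target - follower) * degrees_per_unit
```

Fast movement stretches the offset (lots of tilt), and the offset decays back to zero when the scanner stops, which is exactly the overshoot-and-settle feel the spring arms give for free.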

Next-gen volumetric fog

Is something I don’t know much about, so I just made a hacky material UV offset thingo 😛

I say “made”, but there are some great light beam and fog examples from Epic included in the engine content, so I grabbed most of this thing from those examples, and here:

Fog Sheet and Light Beams

The scanner has a geometry cone on it that is UV mapped 0->1 U along the length.

LightConeWire.png

I don’t think I really changed much from the content example (I honestly can’t remember), but I did add two parameters that adjust the tiling offset of the noise texture:

lightbeammat

As the Scanner moves along the path, it increases the FogForwardOffset which pans the U coordinate of the UVs, so that it looks like the cone is moving through a volume. There’s also always a little bit of panning going on anyway, even when the Scanner is stopped.

As the Scanner rotates, I scroll the V coordinate, just so the noise in the beam doesn’t stay fixed. The rotation isn’t very convincing, so I could probably do a better job of that.

There’s not much going on in the blueprint, but I put whatever I could in there rather than in the material:

UpdateConeFog.png

Lighting

SceneLighting.png

The idea of the scene was for it to be almost entirely lit by the scanner, but I do have a bunch of static lights scattered around too, just to give some ambient light.

There are also two stationary lights in the scene to get highlights where I want them (the lights selected in the screenshot above).
One of them is a spotlight, used to hit the puddle and left wall.

There is also a small light at the front of the tunnel that has “Indirect Lighting Intensity” set to 0, so it doesn’t affect the bounced lighting.
This is the light that hits the scanner here:

ScannerTunnelFrontLight.png

This light is quite bright, so when the Scanner hits it, the rest of the environment darkens down, which (hopefully) puts the focus all on the Scanner (yay for auto-exposure!).

There are only two shadow casting lights in the scene, and they are both on the Scanner.
One of them is a giant point light, and is super expensive, which is the main reason I limited shadow casters everywhere else:

scannershadowlights

Spinning spotlights

There are also two non shadow casting spotlights on the side of the scanner that rotate and project patterns on the wall.

LightsOnWalls.png
For some reason that I can’t remember, I decided to generate the pattern in a material, rather than do the smart thing and use a texture.

SpotlightFunction.png

I’m modifying the “VectorToRadialValue” function to generate bands, then fading it out in the middle bit.

Seriously though, unless you have a really good reason, you should probably do this in a texture 🙂
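The band pattern boils down to something like the sketch below. This is my guess at the idea, not the actual material graph: bucket the angle around the light axis into alternating wedges, and kill the middle below an inner radius (both parameter names are made up):

```python
import math

def radial_band_mask(x, y, num_bands=12, inner_radius=0.2):
    """1 inside a band, 0 in the gaps, faded out in the centre."""
    angle = math.atan2(y, x)                # -pi..pi around the light axis
    t = (angle / (2.0 * math.pi)) % 1.0     # 0..1 around the circle
    in_band = (t * num_bands) % 1.0 < 0.5   # alternating on/off wedges
    in_middle = math.hypot(x, y) < inner_radius
    return 1.0 if (in_band and not in_middle) else 0.0
```

A texture lookup replaces all of that math with one sample, which is the point of the advice above.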

Conclusion

So I *think* that’s it!

I’m sure there are things I’ve missed, or glossed over a bit, so feel free to ask questions in the comments and I’ll fill in the gaps.

 

 

Checkmate, checker plate

August 28, 2014

UE4 Materials

I want to make a fairly generic material that blends a bunch of layers together for painted, worn, dirty metal, since that just about describes most of the materials in my Half Life scene!

There’s a good 10 years’ worth of Unreal Engine 3 materials to do this sort of thing, so it’s not going to be anything particularly new to you Unreal-ers out there, not to mention for people who have been writing shaders for a while.

That said, it’s been about 5 years since I’ve done a significant amount of shader work (not including a handful of Unity projects at home, and prototyping things here and there), so it’s seriously fun to be jumping back in 🙂

One thing I can’t recommend enough, especially for people who haven’t done a lot of shader/material work, is to prototype your materials in Photoshop.

Quick and dirty checker plate

The first material instance I’ll be making to test out a generic-ish material is a checkerplate metal, with paint, dirt and dents.
I’ll focus on the dirt part first.

So here are the basic elements of the masking I wanted to use, mocked up in Photoshop:

Vertex colour
Mockup_VertColours

Ambient Occlusion (tiled up to the scale of the other maps, to make prototyping easy in Photoshop)
Mockup_AO

Large dirt map
Mockup_Dirt

Photoshop fun

So now to settle on how I want to build my overall “dirt mask” for the material.

I want the vertex colour to overpower the AO, i.e:

  • When the vertex colour is white I want 100% dirt
  • When the vertex colour is black I want 0% dirt

And by black and white, I really mean Red, because I’m using the red channel, but… Details 🙂

A multiply doesn’t give me what I want:

Mockup_Multiply

So what I actually want is an overlay:

Mockup_Overlay

UE4 has nodes for all of the Photoshop blend modes, so this is very simple to do (and note, order of operations is important: swapping your Base and Blend layers will give you different results).
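The overlay blend itself is simple enough to sketch, and it gives exactly the behaviour in the bullet points above, which a multiply can’t. One assumption here: the vertex colour is the base layer and the AO is the blend layer (this is the ordering that matches those bullets):

```python
def overlay(base, blend):
    """Photoshop Overlay: multiply in the darks, screen in the lights.
    Base/blend order matters, unlike a plain multiply."""
    if base < 0.5:
        return 2.0 * base * blend
    return 1.0 - 2.0 * (1.0 - base) * (1.0 - blend)
```

With the vertex colour as base: base of 1 forces the result to 1 (100% dirt), base of 0 forces it to 0, and a base of 0.5 passes the AO through unchanged.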

So here’s what it looks like in UE4, with some quick vertex colouring and the overlay mask:

MaterialDirt_1

So far, so good.

Actually, worth mentioning, I’m using the dirt mask to change the Metallic and Roughness parameters of the material, and probably will use it to subtly blend in a different normal and diffuse map at some point.

You could happily just feed this mask into the UE4 layered material system instead, which is just a wrapper around manually blending all the parameters together anyway…

Ok, with some more Photoshop tooling around, I’ve decided I want another overlay, this time using my existing map as a base, and my large scale dirt mask as the overlay:

Mockup_Overlay2

And, same as before, hook the layers up in UE4:

MaterialDirt_2

Ok, so not too bad!

At this point, it could be worth adding a mask sharpness amount that you could paint per vertex, which is a pretty typical approach (scale the mask value around 0.5), but I’m actually pretty happy with the mask part of this now.

Next up, I’m going to put in a wear / dent amount. This will vary the surface normal, and also remove paint.

I’ve sculpted some dents and scratches in Modo, and I’m overdoing it quite severely because I intend to drop it back with vertex colours. Also, my sculpting is a bit lame, so I’ll probably want to redo it later.

Here is the metal with the checker plate normal and the dent normal blended together (without masking):

MaterialDirt_Dent

Looks denty!
It’s a little beyond the scope of this post, but worth pointing out that just regular blends of normal maps (lerp-ing, overlay, etc), are all a bit lame.
I’m using the reoriented normal mapping approach as outlined by Colin Barré-Brisebois and Stephen Hill:

http://blog.selfshadow.com/publications/blending-in-detail/
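The reoriented blend from that post can be sketched like this; it’s a straight transcription of their formula, taking raw [0,1] texture values and returning a unit normal:

```python
import math

def rnm_blend(base_tex, detail_tex):
    """Reoriented normal mapping (Barre-Brisebois & Hill): rotate the detail
    normal into the basis of the base normal. Inputs are [0,1] texture values."""
    t = (base_tex[0] * 2.0 - 1.0,
         base_tex[1] * 2.0 - 1.0,
         base_tex[2] * 2.0)              # z is offset rather than remapped
    u = (detail_tex[0] * -2.0 + 1.0,
         detail_tex[1] * -2.0 + 1.0,
         detail_tex[2] * 2.0 - 1.0)
    d = sum(a * b for a, b in zip(t, u))
    r = tuple(ti * d - ui * t[2] for ti, ui in zip(t, u))
    length = math.sqrt(sum(c * c for c in r))
    return tuple(c / length for c in r)
```

A nice property, and an easy sanity check: blending any base normal with a flat detail normal (0.5, 0.5, 1.0) returns the base normal unchanged.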

Ok, next up I want to blend in the dents based off the green vertex channel.

I should probably work out how to build an “amount” parameter into the reoriented normal blend function, but for now I’m just going to modify the strength of the dent normal by blending it with a straight up vector (0,0,1) based on how much green channel there is:
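That strength trick is just a lerp towards a flat normal before the blend, driven by the green channel. Again, a sketch of the idea rather than the actual graph:

```python
import math

def fade_dent_normal(dent_normal, green):
    """Lerp the dent normal toward straight up (0,0,1) as green goes to 0,
    then renormalize so it's still a unit normal."""
    up = (0.0, 0.0, 1.0)
    blended = [u + (n - u) * green for n, u in zip(dent_normal, up)]
    length = math.sqrt(sum(c * c for c in blended))
    return tuple(c / length for c in blended)
```

Green of 0 gives a perfectly flat normal (no dents), green of 1 leaves the dent normal untouched.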

MaterialDirt_DentMasked

So you can see that some clumsy button pushing scientist has dropped a large barrel off the ladder, and there are various other dents and scratches around.

It’s not shown off well here, but I’ve tried to paint the majority of scratches and bumps in the middle of the walkable area, and near the steps, where I think it makes sense.

Getting rough

I’ve been talking about adding a paint layer, but now I’m thinking that’s overkill.
There are already enough instructions in this thing to raise an army of angry graphics programmers against me…

Forgetting paint, I think it would be nice to have some large scale roughness variation. To justify this to myself, I like to think that the roughness is the micro-surface detail, and the damage normal is the macro-surface detail, so it makes sense that the large tiling thing would have an effect on the micro scale as well as macro.

To side track further, if every texture in my game was 64 000 * 64 000, and the game resolution was large enough, I wouldn’t need roughness maps because I’d just have a lot of micro variation in the normal map, and it would serve the same purpose 😛

Annnnyway, I’ve dumped a large scale roughness map into the Normal map’s Alpha channel, and I’m using that to overlay the original small tiling roughness (still blending to a different roughness for the dirt):

MaterialDirt_DirtFinal_1

MaterialDirt_DirtFinal_2

MaterialDirt_DirtFinal_3

So yeah, that’s it for now. Things I might think about addressing:

  • Decide if I want to add a paint layer or not (I could probably just expand the dirt functionality, and make the whole thing a generic “2 layer material with additional blendy normals”)
  • Create AO for the large denty map, combine it with the tiley AO, and use the combined map to blend in the dirt so that dents get more dirt
  • Get better at texturing 🙂

Performance (briefly)

The shader is 105 instructions with dynamic lighting only.
That’s pretty heavy (at least for PS3 / Xbox 360, or other hardware of similar age).

That’s a cost incurred for every pixel of the material, so that might be perfectly fine for a character’s face, or something that is unlikely to take up the whole screen during gameplay, etc.

That said, it’s always a balance.
Shaders like this reduce the need for a tonne of decals, lots of additional textures, etc.

I’m making a bunch of assumptions about how the UE4 renderer works, too.
At this point, I haven’t looked over the shader generation code at all, so I really don’t know much about what UE4 does with these materials.

tl;dr

A multi-purpose blending material, particularly one using overlays, may not be your best friend.
If you end up using it on a giant surface that has large sections that don’t use some of the functionality of the material (for example, a wall that has dirt on the bottom metre of it, and is ten metres tall with no other dirt on it), you might actually be better off:

  • Using decals
  • Creating a cheaper material for the top part that can seamlessly blend with the expensive multi-layer material at the bottom 🙂

Also, don’t just try and guess if things are expensive like I’m trying to here. Just profile it 🙂

Oh my. Nodes everywhere

I would absolutely not have a material this messy in a game, btw, I’d create functions for half this stuff, I promise 😛

Well, the ReorientedNormalBlend is in a function, so that’s a start…

Maybe next post I’ll clean this up and re-post it, so that it doesn’t look like a dog’s breakfast.

Material

As a final note, here’s a good fun video on texture / vertex painting / material blends in UE4 from Uppercut games:

Submerged dev diary #1: Texture painting