Faking Catmull-Clark creased sub-d in UE4

Vector displaced sub-d wheel

I’ve been hoping for edge creased Catmull-Clark subdivision in game engines ever since I started using Modo about 10 years ago.
I previously made a tool to build LODs from sub-d surfaces in Unity, just to have some way of getting a sub-d like mesh in engine. Super expensive LOD meshes…
This was not a very useful thing to do.

There are a few games out there using real-time sub-d, including the folks at Activision who have demonstrated CC sub-d with creases in a game engine:

Efficient GPU Rendering of Subdivision Surfaces using Adaptive Quadtrees

It is unclear whether they shipped edge creasing in any of their released games, but they definitely use CC subdivision surfaces.

And sure, technically there has been real-time subdivision in games going back to TruForm on ATI cards in the early 2000s, and probably before then for all I know, but I’m specifically interested in Catmull-Clark sub-d, and specifically edge creasing 🙂

Why creases?

Creasing gives you control over the sharpness of edges, without having to manually bevel all of your edge loops.
This is nice for keeping a low poly mesh, but also allows you a little more flexibility.
For example, if you come back and change the model, you don’t have to un-bevel and re-bevel edges.
If you bake out a normal map, and decide the bevels aren’t quite wide enough, you can just change the crease value and re-bake.

Here are some loops on my wheel model that are heavily creased:

Wire frame sub-d

If I were to bevel those edges instead, my base model would go from 3924 vertices to 4392.

If I removed creases across the whole model, and beveled all the edges to get the same end result, I’d need a base mesh of around 6000 vertices (2000 vertices more than the creased version).

For the sake of showing how much work the creasing is doing for me, here is the base model vs Sub-d vs Creased Sub-d:

Comparison between sub-d and creased sub-d in Modo

Vector Displacement approach

I’m not likely to be able to implement the Call Of Duty approach myself, so I’ve done something far more hacky, but slightly less gross than my previous Unity attempt 🙂

My new method is:

  • In Houdini, tessellate the model completely flat
  • Also tessellate it using Catmull Clark creased sub-d
  • Bake the difference in positions of the vertices between these two meshes into a vector displacement map and normal map
  • In UE4 (or engine of choice) flat tessellate the model
  • Apply the vector displacement map to push the vertices into their sub-d positions

It’s very expensive from a memory point of view (and probably performance for that matter), so this is not something you’d want to do for a game, but it does show off how nice creased sub-d would be in UE4 🙂

Houdini Vector displacement map generation

First up, here’s the un-subdivided model in Modo:

Low poly wheel in Modo

And this is the edge weighting view mode in Modo, so you can see which edges are being creased:

Modo edge weight visualisation

There are two things I want to bake out of Houdini: A vector displacement map and a normal map.
I’m not baking this data by projecting a high poly model onto a low poly. I don’t need to, because the high poly model is generated from the low poly, so it has valid UVs, and I can bake textures straight out from the high poly.

Here’s the main network:

Houdini network for generating vector displacement

On the right side of the graph, there are two Subdivide nodes.
The Subdivide on the left uses “OpenSubdiv Bilinear”, and the one on the right uses “OpenSubdiv Catmull-Clark”. Both are subdivided to a level of 5 (each level quadruples the face count), so that the meshes have roughly more vertices than there are pixels to bake out:

Bilinear vs Catmull-Clark sub-d

The “bilinear” subdivision is pretty close to what you get in UE4 when you use “flat tessellation”. So what we want to do is work out how to push the vertices from the left model to match the right model.
This is very easily done in a Point Wrangle, since the point numbers match in both models 🙂

// Displacement from the flat (input 0) to the creased (input 1) point positions
v@vDisp = @P - @opinput1_P;
// Copy the creased model's normals onto the flat model
@N = @opinput1_N;
// Largest absolute displacement component, used later for normalization
f@maxDimen = max(abs(v@vDisp.x), abs(v@vDisp.y), abs(v@vDisp.z));

Or if you’d prefer, as a Point VOP:

Vector displacement wrangle as VOP

Vector displacement (vDisp) is the flat model point position minus the creased model point position.
I am also setting the normals of the flat model to match the creased model.

When I save out the vector displacement, I want it in the 0-1 value range, just to make my life easier.
So in the above Wrangle/VOP I’m also working out for each Point what the largest dimension is (maxDimen).
After the Wrangle, I promote that to a Detail attribute (@globalMaxDimen) using the Max setting in the Attribute Promote SOP, so that I know the maximum displacement value across the model, then use another Wrangle to bring all displacement values into the 0-1 range:

// Remap the displacement from -globalMaxDimen..globalMaxDimen into the 0-1 range
v@vDisp = ((v@vDisp / f@globalMaxDimen) + 1) / 2;
// Store it in point colour, ready to be baked to texture
@Cd = v@vDisp;
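
As an aside, if you wanted to stay in VEX, a detail wrangle could do the same max promotion. A minimal sketch (hypothetical, running over Detail, with the flat model in input 0):

// Hypothetical alternative to the Attribute Promote SOP (run over Detail):
// take the maximum per-point maxDimen across the whole mesh.
float globalMax = 0;
for (int pt = 0; pt < npoints(0); pt++)
    globalMax = max(globalMax, point(0, "maxDimen", pt));
f@globalMaxDimen = globalMax;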

The displacement is now stored in Point colour, in the 0-1 range, and looks like this:

Vector displacement displayed on model

Bake it to the limit!.. surface

You might have noticed that the Normals and Displacement are in World space (since that’s the default for those attributes in Houdini).

I could have baked them out in Tangent space, but for the sake of this test I decided I’d rather not deal with tangent space in Houdini. It’s worth mentioning, though, since it’s something I need to handle later in UE4.

To bake the textures out, I’m using two Bake Texture nodes in a ROP network in Houdini.

Bake texture nodes in ROP

I’ve only changed a few settings on the Bake Texture nodes:

  • Using “UV Object”, and no cage or High Res objects for baking
  • Turned on “Surface Unlit Base Color” as an output
  • Set the output format for Vector Displacement as EXR
  • Set the output format for Normal map as PNG
  • Unwrap method to “UV Match” (since I’m not tracing from one surface to another)
  • UDIM Post Process to Border Expansion

And what I end up with is these two textures:

Baked vector displacement map

Baked normal map

I bake them out as 4k, but LOD bias them down to 2k in UE4, because 4k is a bit silly.
Well, 2k is also silly, but the unwrap on my model is terrible so 2k it is!

Testing in Houdini

If you look back at the main network, there is a section on the left for testing:

Houdini network for generating vector displacement

I created this test part of the network before I jumped into UE4, so I could at least validate that the vector displacement map might give me the precision and resolution of data that I would need.
And also because it’s easier to debug dumb things I’ve done in a Houdini network vs a material in UE4 (I can see the values of every attribute on a vertex, for example) 🙂

I’m taking the flat tessellated model, loading the texture using attribvop_loadTexture, copying the globalMaxDimen onto the model, and then the attribwrangle_expectedResult does the vector displacement.

The attribvop_loadTexture is a Vertex VOP that looks like this:

Vertex VOP used for loading the vector displacement texture

This uses the Vertex UVs to look up the vector displacement map texture, and stores the displacement in vertex colour (@Cd). It also loads the object space normal map, moves it from the 0-1 range to -1 to 1, and binds it to a temporary loadedNormals attribute (copied into @N later).
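
If VOPs aren’t your thing, the same lookup can be sketched as a Vertex Wrangle. This is hypothetical (the texture paths are assumed to live in two string parameters on the node), but it shows the idea:

// Hypothetical wrangle version of attribvop_loadTexture (run over Vertices).
// "vdisp_map" and "normal_map" are assumed string parameters on the node.
// The displacement stays in its baked 0-1 range, matching the VOP.
v@Cd = texture(chs("vdisp_map"), v@uv.x, v@uv.y);
// Expand the object space normal map from 0-1 to the -1 to 1 range.
v@loadedNormals = texture(chs("normal_map"), v@uv.x, v@uv.y) * 2 - 1;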

Then at the end of the network, the expectedResult wrangle displaces the position by the displacement vector in colour, using the globalMaxDimen:

// Undo the 0-1 encoding, then push the flat vertices into their sub-d positions
@P -= ((@Cd * 2) - 1) * f@globalMaxDimen;

If you’re wondering why I’m doing the (0 –> 1) to (-1 –> 1) conversion in this Wrangle, instead of in the VOP (where I did the same to the normal), it’s because it made it easier to put the reimportUvsTest switch in.
This (badly named) switch allows me to quickly swap between the tessellated model with the displacement values in vertex colour (before bake), and the tessellated model that has had that data reloaded from texture (after bake), so I can see where the texture related errors are:

Animated difference between texture loaded displacement and pre bake

There are some errors, mostly around UV seams and very stretched polygons.
The differences are not severe enough to bother me, so I haven’t spent much time looking into what is causing them (bake errors, not enough precision, the sampling I’m using for the texture, etc.).

That’s enough proof in Houdini that I should be able to get something working in engine, so onwards to UE4!

UE4 setup

In UE4, I import the textures, setting the Compression on the Vector Displacement Map to VectorDisplacementmap (RGBA8), and turning off sRGB.
Yay, 21 MB texture!

I can almost get away with this map being 1024*1024, but there is some seam splitting going on:

Low res vector displacement broken seams

That might also be solved through more aggressive texture Border Expansion when baking, though.

Here is what the material setup looks like (apologies for the rather crappy Photoshop stitching job on the screenshots, but you can click on the image to see the details larger):

Tessellation material in UE4

The value for the DisplaceHeight parameter is the @globalMaxDimen that I worked out in Houdini.

Since both textures are Local (Object) space, I need to shift them into the right range (from 0-1 to -1 to 1), then transform them into World space (i.e., take into account the object’s rotation and scale in the scene, etc.).

The Transform node works fine for converting local to world for the Normal map.
I also needed to set the material to expect world space normals by unchecking Tangent Space Normal:

Checkbox for disabling tangent space normals in UE4

The Transform node works fine for normal maps, but does not work for things that are plugged into World Displacement.
Tessellation takes place in a hull / domain shader and the Local -> world transformation matrix is not a thing it has access to.
To solve this properly in code, I think you’d probably need to add the LocalToWorld matrix into the FMaterialTessellationParameters struct in MaterialTemplate.usf, and I imagine you’d need to make other changes for it to work in the material editor, or you could use a custom node to access the matrix.

If you look back at my material, you can see I didn’t do that: I’m constructing the LocalToWorld matrix from vectors passed in as material parameters.
Those parameters are set in the construction script of the blueprint for the object:

Wheel construction script

I’m creating a dynamic material instance of the material that is on the object, applying this new instance to the object, and setting the Up, Right and Forward vector parameters from the Actor. These vectors are used in the material to build the local to world space matrix.
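
Stripped of the node spaghetti, the rotation the material is doing amounts to multiplying the displacement against those basis vectors. Written as VEX purely for illustration (the parameter names are hypothetical, and the exact axis mapping depends on how Houdini’s axes line up with UE4’s):

// Illustrative only: rotate the object space displacement into world space
// using the actor basis vectors passed in as material parameters.
vector disp  = v@vDisp; // already expanded to the -1 to 1 range
vector right = chv("RightVector");
vector fwd   = chv("ForwardVector");
vector up    = chv("UpVector");
vector worldDisp = disp.x * right + disp.y * fwd + disp.z * up;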

If I wanted the object to be animated, I’d either need to do the proper engine fix, or do something nasty like update those parameters in blueprint tick 🙂

Results in UE4

Please ignore the albedo texture stretching: I painted it on a medium-divided high poly mesh in Substance Painter, when I probably should have used the low poly (something for me to play with more at a later date).

Close shot of the wheel

Toggle between sub-d and not in UE4

Close up toggle of sub-d toggle on wheel in UE4

This is with a directional light and a point light without shadows.
As a side note, Point Light shadows don’t seem to work at all with tessellated objects in UE4.

Spotlight and directional light shadows work ok, with a bit of a caveat.
They use the same tessellated mesh that is used for the other render passes, so if the object is off screen, the shadows will look blocky (i.e., it seems like tessellation is not run again in the shadow pass from the view of the light, which probably makes sense from an optimization point of view):

Spotlight shadow issues with tessellated meshes

And that’s about it!

Seems like a lot of work for something that is not super useful, but it’s better than my last attempt, and being built in Houdini it would be very easy to turn this into a pipeline tool.

For lots of reasons, I’m skeptical that I’ll ever work on a project that has Catmull-Clark creased sub-d, but at least I have a slightly better way of playing around with it now 🙂


City scanner scene – Breakdown pt2

Webs.gif

This is part 2 of the breakdown for my recent scene Half-Life 2 scanner scene (part 1 here).

This time, I’m going to focus on the Houdini web setup.

Although it took me a while to get a very subtle result in the end, it was a fun continuing learning experience, and I’m sure I’ll re-use a bunch of this stuff!

Go go Gadget webs!

I saw a bunch of really great photos of spider webs in tunnels (which you can find yourself by googling “tunnel cobwebs concrete” :)).

I figured it would be a fun time to take my tunnel into Houdini, and generate a bunch of animated hanging webby things, and bring them back into UE4.

This fun time ended up looking like a seahorse:

itsaseahorselol.png

I will break this mess down a bit 🙂

Web starting points

PointsAndRaysGraph.png

I import the geometry for the tunnel and rails, and scatter a bunch of points over it, setting their colour to red.

On the right hand side of the seahorse is a set of nodes for creating hanging webs, which is just some straight down line primitives, with a few attributes like noise and thickness added to them.
I’ll come back to these later:

HangingWebs.png

In the top middle of the seahorse, I have a point vop apply two layers of noise to the colour attribute, and also blend the colour out aggressively below the rails, because I only wanted webs in the top half of the tunnel.
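
As a rough idea of what that point vop is doing, here it is sketched as a wrangle (hypothetical parameter names):

// Hypothetical wrangle version of the colour noise (run over Points):
// two layers of noise on the red web mask, faded out below the rails.
float mask = @Cd.r;
mask *= noise(@P * ch("freq1"));
mask *= noise(@P * ch("freq2") + 123.4);
// Aggressively fade the mask to zero below the rail height
mask *= smooth(ch("rail_height") - 0.1, ch("rail_height"), @P.y);
@Cd = set(mask, 0, 0);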

The web source points look like this:

WebPoints.png

From these points, I ray cast out back to the original geometry.

Ray casting straight out of these points would be a little boring, though, so I made another point vop that randomizes the normals a little first:

WebNormals.gif
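
The jitter itself could be sketched in a wrangle like this (hypothetical, but it shows the idea):

// Hypothetical point wrangle for the jitter: blend each normal toward a
// random direction before feeding the points to the Ray SOP.
vector2 u = rand(@ptnum + ch("seed"));
vector randDir = sample_direction_uniform(u);
@N = normalize(lerp(@N, randDir, ch("jitter")));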

After this, I have a few nodes that delete most of the points generated from the pipe connections: they have a high vertex density, compared to every other bit of mesh, so when I first ran the thing, I had a thousand webs on the pipe connections.
I also delete really small webs, because they look lame.

We are now at seahorse upper left.

Arcy Strangs.

ArcyStrangs.png

Not sure what I was thinking when naming this network box, but I’m rolling with it.

So anyway, the ray cast created a “dist” attribute for distance from the point to the ray hit, in the direction of the normal.

So my “copy1” node takes a line primitive, copies it onto the ray points, and sets the length of each line to the “dist” attribute (my word, stamping is such a useful tool in Houdini).

CopyLines.png

Before the copy, I set the vertex red channel from black to red along the length of the line, just for convenience.

Earlier up the chain, I found the longest of all the ray casts, and saved it off in a detail attribute. This is very easy to do by just using Attribute Promote, with Maximum as the Promotion Method.

So, I now define a maximum amount of “droop” I want for the webs, a bit of random droop, and then I use those values to move each point of each web down in Y a bit.

WebDroop.png

I sample that ramp parameter up there using the web length, and then multiply that over the droop, so that each end of the web remains fastened in place.
And I don’t really care if webs intersect with the rails, because that’s just how I roll…
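
In wrangle form, the droop boils down to something like this (a sketch, assuming a 0-1 curveu attribute along each web and a per-web id):

// Hypothetical droop wrangle (run over Points).
float u = f@curveu;
// The ramp is authored to be 0 at both ends, so the fastened ends stay put
float profile = chramp("droop_profile", u);
float droop = ch("max_droop") + rand(i@id) * ch("random_droop");
@P.y -= profile * droop;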

Fasten your seatbelts, we are entering seahorse spine.

Cross web connecty things

ConnectingWebStrands.png

For each of the webs in the previous section, I create some webs bridging between them.
Here’s the network for that.

ConnectingStrands.png

I use Connect Adjacent Pieces, using Adjacent Pieces from Points, letting the node connect just about everything up.

I use a carve node to cut the spline up, then randomly sort the primitives.

At this point, I decided that I only wanted two connecting pieces per named web, and I got lazy, so I wrote VEX for this:

// Limit each named web to at most MaxPerPiece connecting pieces,
// flagging any extras with a "toDelete" primitive attribute.
string PickedPieces[]; // web names seen so far
int PieceCount[];      // number of connections kept for each name

int MaxPerPiece = 2;
int success = 0;

addprimattrib(geoself(), "toDelete", 0);

for (int i = 0; i < nprimitives(geoself()); i++)
{
    string CurrentName = primattrib(geoself(), "name", i, success);

    int FindIndex = find(PickedPieces, CurrentName);

    if (FindIndex < 0)
    {
        // First connection for this web: remember the name
        push(PickedPieces, CurrentName);
        push(PieceCount, 1);
    }
    else
    {
        int CurrentPieceCount = PieceCount[FindIndex];

        if (CurrentPieceCount >= MaxPerPiece)
        {
            // This web already has enough connections, flag the extra
            setprimattrib(geoself(), "toDelete", i, 1, "set");
        }
        else
        {
            PieceCount[FindIndex] = CurrentPieceCount + 1;
        }
    }
}

So that just creates an attribute on a connecting piece called “toDelete”, and you can probably guess what I do with that…

The rest of the network is the same sort of droop calculations I mentioned before.

One thing I haven’t mentioned up to this point, though, is that each web has a “Primitive ID” attribute. This is used to offset the animation on the webs in UE4, and the ID had to get transferred down the chain of webs to make sure they don’t split apart when one web meets another.

At this point, I add a bunch of extra hanging webs off these arcy webs, and here we are:

AllWebWires.png

Then I dump a polywire in, and we’re pretty much good to go!

Well… Ok. There’s the entire seahorse tail section.

For some reason, Polywire didn’t want to generate UVs laid out along the web length.

I ended up using a foreach node on each web, stacking the web sections up vertically in UV space, using a vertex vop, then welding with a threshold:

LayoutUVs.png

Since I have the position, 0-1, along the current web, I could use that to shift the UV sections up before welding.
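
The shift itself is tiny. In wrangle terms, inside the foreach, it is essentially this (assuming the 0-1 position lives in a hypothetical curveu attribute):

// Shift each web section up in V by its 0-1 position along the web,
// so the sections stack end to end before the weld.
v@uv.y += f@curveu;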

With that done on every web, my UVs look like this:

UVsHoriz.png

Which is fine.
When I import the meshes into UE4, I just let the engine pack them.

Seriously, though… These are the sorts of meshes that I really wish I could just bake lighting to vertex colours for in UE4, instead of a lightmap.
It would look better, and would have saved me lots and lots of pain…

And here we are, swing amount in red vertex channel, primitive offset (id) in green:

FinalWebs.png

Web contact meshes

I wanted to stamp some sort of mesh / decal on the wall underneath the hanging meshes.
If you have a look back at the top of the seahorse, you might notice an OUT_WebHits node which contains all the original ray hits.

I’m not going to break this down completely, but I take the scatter points, bring in the tunnel geometry, and use the scatter points to fracture the tunnel.

I take that, copy point colour onto the mesh, and subdivide it:

WallWebsSubd.png

Delete all the non red bits, push the mesh out along normals with some noise, polyreduce, done 🙂

WallWebsFinal.png

I could have done much more interesting things with this, but then life is full of regrets, isn’t it?

Back to UE4

So, export all that stuff out, bring it into UE4.

Fun story: the first export I did was accidentally over 1 million vertices, and the mesh still rendered in less than half a millisecond on a GeForce 970.
We are living in the future, people.

CobwebsMaterial.png

Most of this material is setting up the swinging animation for the webs, using World Position Offset.

There are two sets of parameters for everything: one for when the web is “idle”, and one for when it is being affected by the Scanner being near it.

To pass the position of the scanner into the material, I have to set up a Dynamic Material Instance, so this is all handled in the web blueprint (which doesn’t do much else).

It also passes in a neutral wind direction for when the webs are idle, which I set from the forward vector of an arrow component, just to make things easy:

WindDirection.png

So now that I have the scanner position, for each vertex in each web I get the distance between it and the scanner, and use that to lerp between the idle and the “windy” settings.

All of these values are offset by the position id that I put in the green channel, so that not all of the webs are moving at exactly the same time.
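
Written out as pseudo-VEX rather than material nodes, the per-vertex logic is roughly this (all parameter names hypothetical):

// Rough sketch of the World Position Offset logic, as pseudo-VEX.
vector scannerPos = chv("ScannerPosition"); // set via the dynamic material instance
float nearness = 1 - smooth(0, ch("influence_radius"), distance(@P, scannerPos));
// The green channel offsets the phase so the webs don't all swing in sync
float phase = @Time * ch("speed") + @Cd.g * ch("id_offset");
float swing = sin(phase) * lerp(ch("idle_amount"), ch("windy_amount"), nearness);
// The red channel masks how much each vertex is allowed to move
@P += chv("WindDirection") * swing * @Cd.r;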

Still to come…

Animation approach from Modo to blueprints, lighting rig for the scanner, all the fun stuff! 🙂

Crumbling tiger, hidden canyon

In the world of Shangri-La, Far Cry 4, lots of things crumble and dissolve into powder.

For example, my buddy Tiges here:

TigesAnim.gif

Or, as Max Scoville hilariously put it, the tiger turns into cocaine… NSFW I guess… That gave me a huge chuckle when I first saw it.

I was not responsible for the lovely powder effects in the game; we had a crack team (see what I did there) of Tricia Penman, Craig Alguire and John Lee for all that fancy stuff.
The VFX were such a huge part of the visuals of Shangri-La, and the team did an incredible job.

What I needed to work out was a decent enough way of getting the tiger body to dissolve away.
Nothing too fancy, since it happens very quickly for the most part.

Prototype

I threw together a quick prototype in Unity3d using some of the included library content from Modo:

unitydude.gif

I’m just using a painted greyscale mask as the alpha, then thresholding through it (like using the Threshold adjustment layer in Photoshop, basically).

There’s a falloff around the edge of the alpha, and I’m applying a scrolling, tiled fiery texture in that area.

I won’t go into it too much, as it’s a technique as old as time itself, and there are lots of great tutorials out there on how to set it up in Unity / UE4, etc.

As it turns out, there was already some poster burning tech that I could use, and it worked almost exactly the same way, so I didn’t need to do the shader work in the end.

You mentioned canyons?

I actually used World Machine to create the detail in the maps.
In the end, I needed to make about 30 greyscale maps for the dissolving effects on various assets.

Workflow

I’ll use my Vortigaunt fellow as an example, since I’ve been slack at using him for anything else or finishing him (typical!).

First up, for most of the assets, I painted a very rough greyscale mask in Modo:

VortigauntRoughMask

Then I take that into World Machine, use it as a height map, and run erosion on it:

WMVort

I then take the flow map out of World Machine and back into Photoshop, overlay it on top of the original rough greyscale mask, and add a bit of noise.
With a quick setup in UE4, I have something like this:

Vortigone

Sure, it doesn’t look amazing, but for ten minutes’ work, it is what it is 🙂

You could spend more time painting the mask on some of them (which I did for the more important ones), but in the end, you only see the effect for a few dozen frames, so many of them I left exactly as they were.

Better-er, more automated, etc

Now that I have the Houdini bug, I would probably generate the rough mask in Houdini rather than painting it.

I.e.:

  • Set the colour for the vertices I want the fade to start at
  • Use a solver to spread the values out from these vertices each frame, or do it in texture space, maybe (see the sketch after this list).
  • Give the spread some variation based off the roughness and normals of the surface (maybe).
  • Maybe do the “erosion” stuff in Houdini as well, since it doesn’t really need to be erosion, just some arbitrary added stringy detail.
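
For the solver idea, the per-frame step could be as simple as this (a sketch, with a hypothetical fade attribute seeded on the starting vertices, run as a point wrangle inside a Solver SOP):

// Each frame, every point picks up a decayed copy of its neighbours'
// "fade" values, spreading the mask outward from the seed vertices.
int ns[] = neighbours(0, @ptnum);
float fade = f@fade;
foreach (int n; ns)
    fade = max(fade, point(0, "fade", n) * ch("decay"));
f@fade = fade;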

Again, though, not worth spending too much time on it for such a simple effect.
A better thing to explore would be trying to fill the interior of the objects with some sort of volumetric effect, or some such 🙂
(Which is usually where I’d go talk to a graphics programmer)

Other Examples

I ended up doing this for almost all of the characters, with the exception of a few specific ones (SPOILERS), like the giant chicken that you fight.
That one, and a few others, were handled by Nils Meyer and Steve Fabok, from memory.

So aside from those, and my mate Tiges up there, here’s a few other examples.

Bell Chains

BellAnim

Hard to see, but the chain links fade out one by one, starting from the bottom.

This was tricky, because the particular material we were using didn’t support 2 UV channels, and the chain links are all mapped to the same texture space (which makes total sense).

Luckily, the material *did* support changing UV tiling for the Mask vs the other textures.

So we could stack all of the UV shells of the links on top of each other in UV space, like so:

ChainUVs

So then the mask fades from 0 –> 1 in V.
In the material, if we had 15 links, then we need to tile V 15 times for Diffuse, Normal, Roughness, etc, leaving the mask texture tiled once.

Edwin Chan was working on the assets for these, and I could have just made him manually set that up in Max, but it would have been a bit of a pain, and I’d already asked him to do all sorts of annoying setup on the prayer wheels…

There were 3-4 different bell chain setups, and each of those had multiple LODs for each platform, so I wrote a Maxscript that would pack all the UVs into the correct range.

Quite a lot of work for such a quick effect, but originally the timing was a lot slower, so at that point it was worth it 🙂

Bow gems

BowAnim

Although I never really got this as on-concept as I would have liked, I’m pretty happy with how these turned out.

Amusingly, the emissive material didn’t support the alpha thresholding effect.

So there are two layers of mesh: the glowy one and the non-glowy one.
It’s actually the non-glowy layer that fades out!
The glowy stuff is always there, slightly smaller, hidden below the surface.

Dodgy, but got the job done 😛