Faking Catmull-Clark creased sub-d in UE4

Vector displaced sub-d wheel

I’ve been hoping for edge creased Catmull-Clark subdivision in game engines ever since I started using Modo about 10 years ago.
I previously made a tool to build LODs from sub-d surfaces in Unity, just to have some way of getting a sub-d like mesh in engine. Super expensive LOD meshes…
This was not a very useful thing to do.

There are a few games out there using real-time sub-d, including the folks at Activision who have demonstrated CC sub-d with creases in a game engine:

Efficient GPU Rendering of Subdivision Surfaces using Adaptive Quadtrees

It’s unclear whether they shipped edge creasing in any of their released games, but they definitely use CC subdivision surfaces.

And sure, technically there has been real-time subdivision in games going back to TruForm on ATI cards in the early 2000s, and probably before then for all I know, but I’m specifically interested in Catmull-Clark sub-d, and specifically edge creasing 🙂

Why creases?

Creasing gives you control over the sharpness of edges, without having to manually bevel all of your edge loops.
This is nice for keeping a low poly mesh, but also allows you a little more flexibility.
For example, if you come back and change the model, you don’t have to un-bevel and re-bevel edges.
If you bake out a normal map, and decide the bevels aren’t quite wide enough, you can just change the crease value and re-bake.

Here are some loops on my wheel model that are heavily creased:

Wire frame sub-d

If I were to bevel those edges instead, my base model would go from 3924 vertices to 4392.

If I removed creases across the whole model and beveled all the edges to get the same end result, I’d need a base mesh of around 6000 vertices (2000 vertices more than the creased version).

For the sake of showing how much work the creasing is doing for me, here is the base model vs Sub-d vs Creased Sub-d:

Comparison between sub-d and creased sub-d in Modo

Vector Displacement approach

I’m not likely to be able to implement the Call Of Duty approach myself, so I’ve done something far more hacky, but slightly less gross than my previous Unity attempt 🙂

My new method is:

  • In Houdini, tessellate the model completely flat
  • Also tessellate it using Catmull Clark creased sub-d
  • Bake the difference in positions of the vertices between these two meshes into a vector displacement map and normal map
  • In UE4 (or engine of choice) flat tessellate the model
  • Apply the vector displacement map to push the vertices into their sub-d positions
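Boiled down, all the bake and the in-engine displacement are doing is storing per-vertex offsets and applying them again later. Here’s a tiny Python sketch of that round trip (the positions are made up for illustration; the method relies on the point order matching between the two tessellations):

```python
# Flat-tessellated positions and their creased sub-d counterparts
# (made-up values; point i in one list corresponds to point i in the other).
flat = [(0.0, 0.0, 0.0), (1.0, 0.2, -0.3)]
creased = [(0.1, -0.05, 0.0), (0.9, 0.25, -0.2)]

# Bake step: vector displacement = flat position - creased position
disp = [tuple(f - c for f, c in zip(fp, cp)) for fp, cp in zip(flat, creased)]

# Engine step: push each flat vertex by its displacement to recover
# the creased sub-d position
restored = [tuple(f - d for f, d in zip(fp, dp)) for fp, dp in zip(flat, disp)]

for rp, cp in zip(restored, creased):
    assert all(abs(r - c) < 1e-9 for r, c in zip(rp, cp))
```

In practice the displacement goes through a texture (and a 0-1 remap) on the way, which is where the precision and seam issues come from.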

It’s very expensive from a memory point of view (and probably performance for that matter), so this is not something you’d want to do for a game, but it does show off how nice creased sub-d would be in UE4 🙂

Houdini Vector displacement map generation

First up, here’s the un-subdivided model in Modo:

Low poly wheel in Modo

And this is the edge weighting view mode in Modo, so you can see which edges are being creased:

Modo edge weight visualisation

There are two things I want to bake out of Houdini: A vector displacement map and a normal map.
I’m not baking this data by projecting a high poly model onto a low poly. I don’t need to: the high poly model is generated from the low poly, so it has valid UVs, and I can just bake textures straight out from the high poly.

Here’s the main network:

Houdini network for generating vector displacement

On the right side of the graph, there are two Subdivide nodes.
The Subdivide on the left uses “OpenSubdiv Bilinear”, and the one on the right uses “OpenSubdiv Catmull-Clark”. Both are subdivided to a level of 5, so that the meshes have roughly more vertices than there are pixels in the textures that will get baked out:

Bilinear vs Catmull-Clark sub-d

The “bilinear” subdivision is pretty close to what you get in UE4 when you use “flat tessellation”. So what we want to do is work out how to push the vertices from the left model to match the right model.
This is very easily done in a Point Wrangle, since the point numbers match in both models 🙂

// Displacement from the creased mesh (second input) to the flat mesh (first input)
v@vDisp = @P - @opinput1_P;
// Copy the creased mesh normals onto the flat mesh
@N = @opinput1_N;
// Largest absolute component of the displacement, used later for normalisation
f@maxDimen = max(abs(v@vDisp.x), abs(v@vDisp.y), abs(v@vDisp.z));

Or if you’d prefer, as a Point VOP:

Vector displacement wrangle as VOP

Vector displacement (vDisp) is the flat model point position minus the creased model point position.
I am also setting the normals of the flat model to match the creased model.

When I save out the vector displacement, I want it in the 0-1 value range, just to make my life easier.
So in the above Wrangle/VOP I’m also working out for each Point what the largest dimension is (maxDimen).
After the Wrangle, I promote that to a Detail attribute (@globalMaxDimen) using the Max setting in the Attribute Promote SOP, so that I know the maximum displacement value across the model, then use another Wrangle to bring all displacement values into the 0-1 range:

// Remap the displacement from [-globalMaxDimen, globalMaxDimen] into [0, 1]
v@vDisp = ((v@vDisp / f@globalMaxDimen) + 1) / 2;
// Store it in point colour so it can be baked out to texture
@Cd = v@vDisp;
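If you want to sanity check that remap outside Houdini, here’s the same math in Python, together with the inverse that the engine side needs (the displacement and max values here are made up):

```python
# Stand-ins for f@globalMaxDimen and one point's v@vDisp
global_max_dimen = 0.25
v_disp = (-0.1, 0.25, 0.04)

# Encode: [-global_max_dimen, global_max_dimen] -> [0, 1], as in the wrangle
encoded = tuple(((d / global_max_dimen) + 1) / 2 for d in v_disp)
assert all(0.0 <= e <= 1.0 for e in encoded)

# Decode: [0, 1] -> [-global_max_dimen, global_max_dimen], recovering the
# original displacement (this is what the material has to do later)
decoded = tuple(((e * 2) - 1) * global_max_dimen for e in encoded)
assert all(abs(a - b) < 1e-9 for a, b in zip(v_disp, decoded))
```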

The displacement is now stored in Point colour, in the 0-1 range, and looks like this:

Vector displacement displayed on model

Bake it to the limit!.. surface

You might have noticed that the Normals and Displacement are in World space (since that’s the default for those attributes in Houdini).

I could have baked them out in Tangent space, but for the sake of this test I decided I’d rather not deal with tangent space in Houdini. It’s worth mentioning, though, since it’s something I need to handle later in UE4.

To bake the textures out, I’m using two Bake Texture nodes in a ROP network in Houdini.

Bake texture nodes in ROP

I’ve only changed a few settings on the Bake Texture nodes:

  • Using “UV Object”, and no cage or High Res objects for baking
  • Turned on “Surface Unlit Base Color” as an output
  • Set the output format for Vector Displacement as EXR
  • Set the output format for Normal map as PNG
  • Unwrap method to “UV Match” (since I’m not tracing from one surface to another)
  • UDIM Post Process to Border Expansion

And what I end up with is these two textures:

Baked vector displacement map

Baked normal map

I bake them out as 4k, but lodbias them down to 2k in UE4 because 4k is a bit silly.
Well, 2k is also silly, but the unwrap on my model is terrible so 2k it is!

Testing in Houdini

If you look back at the main network, there is a section on the left for testing:

Houdini network for generating vector displacement

I created this test part of the network before I jumped into UE4, so I could at least validate that the vector displacement map might give me the precision and resolution of data that I would need.
And also because it’s easier to debug dumb things I’ve done in a Houdini network vs a material in UE4 (I can see the values of every attribute on a vertex, for example) 🙂

I’m taking the flat tessellated model, loading the texture using attribvop_loadTexture, copying the globalMaxDimen onto the model, and then the attribwrangle_expectedResult does the vector displacement.

The attribvop_loadTexture is a Vertex VOP that looks like this:

Vertex VOP used for loading the vector displacement texture

This uses the Vertex UVs to look up the vector displacement map texture, and stores the displacement in vertex colour (@Cd). It also loads the object space normal map, remaps it from the 0-1 range to the -1 to 1 range, and binds it to a temporary loadedNormals attribute (copied into @N later).

Then at the end of the network, the expectedResult wrangle displaces the position by the displacement vector in colour, using the globalMaxDimen:

// Undo the 0-1 remap, then push the flat point back to its creased position
@P -= ((@Cd * 2) - 1) * f@globalMaxDimen;

If you’re wondering why I’m doing the (0 -> 1) to (-1 -> 1) conversion in this Wrangle, instead of in the VOP (where I did the same to the normal), it’s because it made it easier to put the reimportUvsTest switch in.
This (badly named) switch allows me to quickly swap between the tessellated model with the displacement values in vertex colour (before bake), and the tessellated model that has had that data reloaded from texture (after bake), so I can see where the texture-related errors are:

Animated difference between texture loaded displacement and pre bake

There are some errors, and they are mostly around UV seams and very stretched polygons.
The differences are not severe enough to bother me, so I haven’t spent much time looking into what is causing them (bake errors, not enough precision, the sampling I’m using for the texture, etc.).

That’s enough proof in Houdini that I should be able to get something working in engine, so onwards to UE4!

UE4 setup

In UE4, I import the textures, setting the Compression on the Vector Displacement Map to be VectorDisplacementmap(RGBA8), and turn off sRGB.
Yay, 21 MB texture!

I can almost get away with this map being 1024*1024, but there is some seam splitting going on:

Low res vector displacement broken seams

That might also be solved through more aggressive texture Border Expansion when baking, though.

Here is what the material setup looks like (apologies for the rather crappy Photoshop stitching job on the screenshots, but you can click on the image to see the details larger):

Tessellation material in UE4

The value for the DisplaceHeight parameter is the @globalMaxDimen that I worked out in Houdini.

Since both textures are Local (Object) space, I need to shift them into the right range (from 0-1 to -1 to 1), then transform them into World space (i.e., take into account the object’s rotation and scale in the scene, etc.).

The Transform node works fine for converting local to world for the Normal map.
I also needed to set the material to expect world space normals by unchecking Tangent Space Normal:

Checkbox for disabling tangent space normals in UE4

The Transform node works fine for normal maps, but does not work for things that are plugged into World Displacement.
Tessellation takes place in a hull / domain shader and the Local -> world transformation matrix is not a thing it has access to.
To solve this properly in code, I think you’d probably need to add the LocalToWorld matrix into the FMaterialTessellationParameters struct in MaterialTemplate.usf, and I imagine you’d need to make other changes for it to work in the material editor, or you could use a custom node to access the matrix.

If you look back at my material, you can see I didn’t do that: I’m constructing the LocalToWorld matrix from vectors passed in as material parameters.
Those parameters are set in the blueprint construction script for the object:

Wheel construction script

I’m creating a dynamic material instance of the material that is on the object, applying this new instance to the object, and setting the Up, Right and Forward vector parameters from the Actor. These vectors are used in the material to build the local to world space matrix.
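As a sketch of what those three vectors are doing in the material: treated as the columns of a matrix, they rotate a local-space vector into world space. Here’s a hedged Python example (the vectors below are for a made-up 90 degree yaw; the names are mine, not UE4’s):

```python
# World-space basis vectors of an actor yawed 90 degrees around Z
# (stand-ins for the Forward/Right/Up material parameters)
forward = (0.0, 1.0, 0.0)   # local +X now points along world +Y
right = (-1.0, 0.0, 0.0)    # local +Y now points along world -X
up = (0.0, 0.0, 1.0)        # local +Z is unchanged

def local_to_world(v):
    # v.x * forward + v.y * right + v.z * up, i.e. multiplying by the
    # matrix whose columns are the actor's basis vectors
    return tuple(v[0] * f + v[1] * r + v[2] * u
                 for f, r, u in zip(forward, right, up))

# A displacement along local +X ends up along world +Y
assert local_to_world((1.0, 0.0, 0.0)) == (0.0, 1.0, 0.0)
```

Note that the forward/right/up vectors you get from an actor are typically unit length, so a matrix built this way handles rotation but not scale.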

If I wanted the object to be animated, I’d either need to do the proper engine fix, or do something nasty like update those parameters in blueprint tick 🙂

Results in UE4

Please ignore the albedo texture stretching: I painted it on a medium-divided high poly mesh in Substance Painter, and probably should have used the low poly (something for me to play with more at a later date).

Close shot of the wheel

Toggle between sub-d and not in UE4

Close up toggle of sub-d toggle on wheel in UE4

This is with a directional light and a point light without shadows.
As a side note, Point Light shadows don’t seem to work at all with tessellated objects in UE4.

Spotlight and directional light shadows work ok, with a bit of a caveat.
They use the same tessellated mesh that is used for the other render passes, so if the object is off screen, the shadows will look blocky (i.e., it seems like tessellation is not run again in the shadow pass from the view of the light, which probably makes sense from an optimization point of view):

Spotlight shadow issues with tessellated meshes

And that’s about it!

Seems like a lot of work for something that is not super useful, but it’s better than my last attempt, and being built in Houdini it would be very easy to turn this into a pipeline tool.

For lots of reasons, I’m skeptical that I’ll ever work on a project that has Catmull-Clark creased sub-d, but at least I have a slightly better way of playing around with it now 🙂


11 thoughts on “Faking Catmull-Clark creased sub-d in UE4”

  1. Why do the creases need to be processed in real time, though? One could have the varying crease strength control in their modeling software, but have that mesh be converted automagically so the creases are beveled out through verts. Sure, that makes it higher poly, but maybe a higher poly mesh with simpler subdivision is faster to process than a low poly mesh with complex subdiv. It’s not like current games don’t already bruteforce the shit out of every curved surface with ever more verts instead of tackling the problem intelligently.

    1. Yup, that’s a totally valid option, might even be do-able with Houdini.

      It’s pretty hard to know what would be faster / better without having both implemented and profiling them to compare, and it all really heavily depends on the use case.
      If you wanted to drop out all those bevels at a distance, then you’d probably need to build out LODs for it still, whereas with real time creasing you perhaps wouldn’t (the base cage mesh might be low enough to use in the distance).

    1. Nice! The textures are a bit memory heavy, so it’s definitely not great on that front, but I’m curious to see if you find it good for something!

  2. hello. I am Japanese.
    Amazing technology!
    I was looking for a way to achieve catmull-clark in UE, so thank you very much for writing the article.

    I have one question.
    What you did in Houdini, can you do the same in Blender?
    I’ve never used Houdini, so I didn’t understand baking in Houdini.

    (I wrote it using GoogleTranslation. I’m sorry if the English is strange.)

      1. Thank you very much for your reply and references!

        I will look at your material and try it. If I get stuck with Blender, I’ll try Houdini and try your finished node.

        Before I got your comment reply, I tried looking for a method of baking with blender from youtube, but it didn’t work.
        (I don’t have enough knowledge of how to use nodes, calculation formulas, and English, so I can only imitate the completed node. I wish I could find a completed node that matches your method with blender. ….)

        By the way, I’m thinking of trying your method on a rigged human mesh.

        Do you think your method would work on a rigged mesh?
        (I thought maybe your method wouldn’t work because a rigged mesh changes vertex positions when I move the bones.)

      2. Hmmmmm I am not sure if it would work for characters! I will have to think about it a bit 🙂

        There is a “pre skinned position” node in Unreal that you might be able to use to displace instead of the positions after skinning.
