Posts Tagged ‘Half-life’

ArtStation and a mine

December 27, 2016

UE4Textured.png

Just a very quick post to show off a new asset (although I’ve already been spamming Twitter with that a bit).

It’s another Half Life asset: a hopper mine. Was a lot of fun to work on!
The above shot is in UE4; this is another Modo + Substance Painter asset.

Also, I’ve decided to jump on the ArtStation bandwagon, just in case I didn’t already have enough accounts and pages to maintain 🙂

Hope everyone has a great holiday season and New Year!

 

City scanner scene – Breakdown pt3

October 15, 2016

ScannerFloat.gif

Part 3 of the breakdown of my recent Half-Life 2 Scanner scene.

And now for animation! Also known as “Geoff stumbling around blindly for a week when he really should have watched some UE4 animation tutorials”.

So, erm, feel free to use this as a guide on how *not* to approach making an object float down a hallway…

Down the garden path

Early on, I was trying to work out if I wanted to just animate the whole thing from start to finish in Modo, or do something a little more systemic.

For the sake of trying something different, I settled on having the main movement down the tunnel, rotation of the centre wheel, tail and little flippy bits (technical term) through blueprints, and then blend in a few hand animated bits.

There are three main blueprints that do the work: the Scanner blueprint, the Scanner Path blueprint, and the Scanner Attract Point blueprint.

ScannerBlueprints.png

The division of labour between these things ended up being pretty arbitrary, but the initial idea was that an Attract Point can start playing an animation on the Scanner when it reaches the point, and can also modify the max speed of the Scanner when leaving the point.

Here are the parameters that each point has:

AttractPointProperties.png

So when the Scanner reaches an attract point, it can pause for a while (I generally use this time to play an animation). The animation doesn't start when the scanner reaches the point, though; it actually starts at a certain percentage of the distance along the previous spline segment leading up to the point.

There is also a blend in and out time for the animation, to give a smooth transition from the manually animated idle.
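The trigger check itself is simple enough to sketch. Here's a rough Python illustration of the idea (the function and parameter names are mine, not the actual blueprint's):

```python
def should_start_animation(dist_along_segment, segment_length, anim_start_pct):
    """The attract point's animation starts partway along the previous
    spline segment, not at the point itself."""
    return dist_along_segment / segment_length >= anim_start_pct

# e.g. with anim_start_pct = 0.8, the animation kicks in once the
# scanner is 80% of the way along the segment leading to the point
```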

The animation blueprint itself does very little:

Scanner_AnimBlueprint.png

Down the bottom left is the Idle animation that happily ticks away all the time, and that blends with an override animation, which is what the Attract Points set.

Each of the rotators is driven by procedural animation on the Scanner blueprint, which I’ll show in a bit.

Improvements in hindsight

The Idle / Override blending part of this is definitely something I would change in hindsight, because blending in and out of the idle is a mess: the scanner could be at an up or down point in the bob when the blend starts.

There are a few ways I could deal with it, including changing the up and down sine wave motion to be driven by blueprint instead, or just restarting the idle animation when I start blending it back in (or probably a dozen better ways to do it that I don’t know about :P).

Also, the “pause” functionality in the Attract Points is not a great way to do things.
Timing the pause and playing animations was a lot of trial and error, I should have sent events out from the animations instead that trigger the pause.

Custom animations in Modo

There are three custom animations that I made in Modo:

  • Idle
  • Searching left and right animation halfway down the tunnel
  • The final “do I see something? I think I see something… SQUIRREL!!”

Everything in the mesh is hard rigged (no deformation), so I just parented all the pieces together, and importing into Unreal generates a skeleton for me.

In Modo, I didn’t do anything particularly exciting this time around, i.e. no interesting rig; I just keyframed the bones.

Modo has export presets for UE4 and Unity now, which is pretty ace!
You can also set up your own presets:

ModoExportPresets.png

It was pretty fun doing some animation again, it’s something I really don’t do very often, and Modo makes it pretty easy to dive in.

Tick-tock

Ok, back in UE4, time to have a quick look over the tick event in the Scanner blueprint.

ScannerBlueprint_tick.png

Unlike my normal lazy self, this time I stitched a bunch of things together, so you can see it in full!

Embrace lag

I wanted quite a few of the animations to be driven by the velocity of the scanner.
I’m calculating the velocity myself anyway, based on set max speed and acceleration values, so I could have just used that value, and built some lag into it.

But I found something more fun:

SpringArms.gif

I have two non-colliding “spring arms” attached to the Scanner root node: one that is slow to catch up (used for body tilt), and one that is fairly quick (used for tail rotation).

This was inspired by some amazingly cool animation work I saw Greg Puzniak doing in Unity3D. It’s not up on his site, but he has cool matcap stuff up there that you should check out! 🙂

So in a lot of my blueprint functions, I get the distance from the spring arm to the arrow, and use that to drive laggy rotation:

DaysOfYaw.png

Despite the comment in the blueprint, when I tried to use this for banking, the ATM swallowed my card (har har har).
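The spring-arm lag trick boils down to a follower point that catches up to a target over time, with the gap between them driving a rotation. Here's a rough Python sketch of the idea (names and constants are mine, not from the actual blueprint):

```python
def update_follower(follower_pos, target_pos, stiffness, dt):
    # Exponential catch-up: the lower the stiffness, the laggier
    # the follower (like a slow spring arm)
    alpha = min(1.0, stiffness * dt)
    return [f + (t - f) * alpha for f, t in zip(follower_pos, target_pos)]

def tilt_from_lag(follower_pos, target_pos, degrees_per_unit):
    # The sideways gap between the lagged point and the real position
    # becomes a tilt angle for the body (or tail rotation)
    return (target_pos[0] - follower_pos[0]) * degrees_per_unit
```

A slow-stiffness follower gives the big lazy body tilt; a quick one gives the snappier tail rotation.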

So most of my procedural animation is driven this way.
When I’m not playing the override animations, everything except for the up and down bobbing is procedural. Which is pretty silly, because that’s a sine wave, so it’s not really an Idle animation, and I should have got rid of it.

The rest of the blueprint is about keeping track of how far along the spline we are, how close to the next point we are (so we can start playing an animation, if necessary), and then setting the world location and rotation of the Scanner from the distance along the spline.
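The movement part of the tick can be sketched like this (a Python illustration of the logic, not the actual blueprint; names are mine):

```python
def tick_scanner(dist, speed, max_speed, accel, dt, spline_length):
    """Per-tick sketch: accelerate toward the current max speed,
    advance the distance along the spline, and clamp at the end.
    The world location/rotation would then be sampled from the
    spline at 'dist'."""
    speed = min(max_speed, speed + accel * dt)
    dist = min(spline_length, dist + speed * dt)
    return dist, speed
```

The attract points then just change `max_speed` (and pause the whole thing) as the scanner passes them.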

Next-gen volumetric fog

Is something I don’t know much about, so I just made a hacky material UV offset thingo 😛

I say “made”, but there are some great light beam and fog examples from Epic included in the engine content, so I grabbed most of this thing from those examples, and here:

Fog Sheet and Light Beams

The scanner has a geometry cone on it that is UV mapped 0->1 in U along the length.

LightConeWire.png

I don’t think I really changed much from the content example (I honestly can’t remember), but I did add two parameters that adjust the tiling offset of the noise texture:

lightbeammat

As the Scanner moves along the path, it increases the FogForwardOffset which pans the U coordinate of the UVs, so that it looks like the cone is moving through a volume. There’s also always a little bit of panning going on anyway, even when the Scanner is stopped.

As the Scanner rotates, I scroll the V coordinate, just so the noise in the beam doesn’t stay fixed. The rotation isn’t very convincing, so I could probably do a better job of that.
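The offset logic amounts to something like this (Python sketch; the parameter names and scales are made up, the real values live in the material instance):

```python
def cone_uv(u, v, dist_travelled, yaw, t,
            u_scale=0.25, v_scale=0.1, base_pan=0.05):
    # U pans with forward motion (plus a constant slow pan so the
    # noise never fully stops), V scrolls with rotation
    u_out = (u + dist_travelled * u_scale + base_pan * t) % 1.0
    v_out = (v + yaw * v_scale) % 1.0
    return u_out, v_out
```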

There’s not much going on in the blueprint, but I put whatever I could in there rather than in the material:

UpdateConeFog.png

Lighting

SceneLighting.png

The idea of the scene was for it to be almost entirely lit by the scanner, but I do have a bunch of static lights scattered around too, just to give some ambient light.

There are also two stationary lights in the scene to get highlights where I want them (the lights selected in the screenshot above).
One of them is a spotlight, used to hit the puddle and left wall.

There is also a small light at the front of the tunnel that has “Indirect Lighting Intensity” set to 0, so it doesn’t affect the bounced lighting.
This is the light that hits the scanner here:

ScannerTunnelFrontLight.png

This light is quite bright, so when the Scanner hits it, the rest of the environment darkens down, which (hopefully) puts the focus all on the Scanner (yay for auto-exposure!).

There are only two shadow casting lights in the scene, and they are both on the Scanner.
One of them is a giant point light, and is super expensive, which is the main reason I limited shadow casters everywhere else:

scannershadowlights

Spinning spotlights

There are also two non-shadow-casting spotlights on the side of the scanner that rotate and project patterns on the wall.

LightsOnWalls.png

For some reason that I can’t remember, I decided to generate the pattern in a material, rather than do the smart thing and use a texture.

SpotlightFunction.png

I’m modifying the “VectorToRadialValue” function to generate bands, then fading it out in the middle bit.

Seriously though, unless you have a really good reason, you should probably do this in a texture 🙂
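The banding idea boils down to something like this (Python sketch of the radial maths; the constants are made up):

```python
import math

def radial_band_intensity(x, y, band_count=8, inner_fade=0.2):
    # Angle around the beam axis drives a repeating on/off stripe,
    # faded out toward the centre so there's no pinched point
    angle = (math.atan2(y, x) / (2.0 * math.pi)) % 1.0
    band = 1.0 if (angle * band_count) % 1.0 < 0.5 else 0.0
    fade = min(1.0, math.hypot(x, y) / inner_fade)
    return band * fade
```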

Conclusion

So I *think* that’s it!

I’m sure there are things I’ve missed, or glossed over a bit, so feel free to ask questions in the comments and I’ll fill in the gaps.

 

 

City scanner scene – Breakdown pt2

October 13, 2016

Webs.gif

This is part 2 of the breakdown of my recent Half-Life 2 scanner scene (part 1 here).

This time, I’m going to focus on the Houdini web setup.

Although it took me a while to get a very subtle result in the end, it was a fun continuing learning experience, and I’m sure I’ll re-use a bunch of this stuff!

Go go Gadget webs!

I saw a bunch of really great photos of spider webs in tunnels (which you can find yourself by googling “tunnel cobwebs concrete” :)).

I figured it would be a fun time to take my tunnel into Houdini, and generate a bunch of animated hanging webby things, and bring them back into UE4.

This fun time ended up looking like a seahorse:

itsaseahorselol.png

I will break this mess down a bit 🙂

Web starting points

PointsAndRaysGraph.png

I import the geometry for the tunnel and rails, and scatter a bunch of points over it, setting their colour to red.

On the right hand side of the seahorse is a set of nodes for creating hanging webs, which is just some straight down line primitives, with a few attributes like noise and thickness added to them.
I’ll come back to these later:

HangingWebs.png

In the top middle of the seahorse, I have a point vop that applies two layers of noise to the colour attribute, and also blends the colour out aggressively below the rails, because I only wanted webs in the top half of the tunnel.

The web source points look like this:

WebPoints.png

From these points, I ray cast out back to the original geometry.

Ray casting straight out of these points would be a little boring, though, so I made another point vop that randomizes the normals a little first:

WebNormals.gif

After this, I have a few nodes that delete most of the points generated from the pipe connections: they have a high vertex density, compared to every other bit of mesh, so when I first ran the thing, I had a thousand webs on the pipe connections.
I also delete really small webs, because they look lame.

We are now at seahorse upper left.

Arcy Strangs.

ArcyStrangs.png

Not sure what I was thinking when naming this network box, but I’m rolling with it.

So anyway, the ray cast created a “dist” attribute for distance from the point to the ray hit, in the direction of the normal.

So my “copy1” node takes a line primitive, copies it onto the ray points, and sets the length of each line to the “dist” attribute (my word, stamping is such a useful tool in Houdini).

CopyLines.png

Before the copy, I set the vertex red channel from black to red along the length of the line, just for convenience.

Earlier up the chain, I found the longest of all the ray casts, and saved it off in a detail attribute. This is very easy to do with Attribute Promote, using Maximum as the Promotion Method.

So, I now define a maximum amount of “droop” I want for the webs, a bit of random droop, and then I use those values to move each point of each web down in Y a bit.

WebDroop.png

I sample that ramp parameter up there using the web length, and then multiply that over the droop, so that each end of the web remains fastened in place.
And I don’t really care if webs intersect with the rails, because that’s just how I roll…
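The droop step can be sketched like so (Python; the parabolic ramp here is just a stand-in for my hand-edited ramp parameter, and the names are mine):

```python
def droop_offset(t_along_web, web_length, max_web_length,
                 max_droop, random_droop):
    # Ramp is zero at both ends and 1.0 in the middle, so the
    # fastened ends of the web stay put
    ramp = 4.0 * t_along_web * (1.0 - t_along_web)
    # Longer webs droop more, relative to the longest ray cast
    # (the detail attribute from Attribute Promote)
    droop = (max_droop + random_droop) * (web_length / max_web_length)
    return -droop * ramp  # offset in Y
```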

Fasten your seatbelts, we are entering seahorse spine.

Cross web connecty things

ConnectingWebStrands.png

For each of the webs in the previous section, I create some webs bridging between them.
Here’s the network for that.

ConnectingStrands.png

I use Connect Adjacent Pieces, using Adjacent Pieces from Points, letting the node connect just about everything up.

I use a carve node to cut the spline up, then randomly sort the primitives.

At this point, I decided that I only wanted two connecting pieces per named web, and I got lazy, so I wrote VEX for this:

int MaxPerPiece = 2;

string PickedPieces[];  // web names seen so far
int PieceCount[];       // connecting strands kept per web
int success = 0;

// Mark surplus connecting strands for deletion in a later node.
addprimattrib(geoself(), "toDelete", 0, "int");

for (int i = 0; i < nprimitives(geoself()); i++)
{
    string CurrentName = primattrib(geoself(), "name", i, success);

    int FindIndex = find(PickedPieces, CurrentName);

    if (FindIndex < 0)
    {
        // First strand for this web: remember it.
        push(PickedPieces, CurrentName);
        push(PieceCount, 1);
    }
    else if (PieceCount[FindIndex] >= MaxPerPiece)
    {
        // Already have enough strands on this web.
        setprimattrib(geoself(), "toDelete", i, 1, "set");
    }
    else
    {
        PieceCount[FindIndex] = PieceCount[FindIndex] + 1;
    }
}

So that just creates an attribute on a connecting piece called “toDelete”, and you can probably guess what I do with that…

The rest of the network is the same sort of droop calculations I mentioned before.

One thing I haven’t mentioned up to this point, though, is that each web has a “Primitive ID” attribute. This is used to offset the animation on the webs in UE4, and the ID had to get transferred down the chain of webs to make sure they don’t split apart when one web meets another.

At this point, I add a bunch of extra hanging webs off these arcy webs, and here we are:

AllWebWires.png

Then I dump a polywire in, and we’re pretty much good to go!

Well… Ok. There’s the entire seahorse tail section.

For some reason, Polywire didn’t want to generate UVs laid out along the web length.

I ended up using a foreach node on each web, stacking the web sections up vertically in UV space, using a vertex vop, then welding with a threshold:

LayoutUVs.png

Since I have the position, 0-1, along the current web, I could use that to shift the UV sections up before welding.
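The weld step amounts to snapping UV points that land within a small threshold of each other, after the sections have been shifted up in V. A rough Python sketch of that idea (not Houdini code; names are mine):

```python
def weld_uvs(uvs, threshold=1e-3):
    """Snap later UV points onto the first point found within the
    threshold, so stacked section boundaries merge into one strip."""
    welded = []
    for u, v in uvs:
        for wu, wv in welded:
            if abs(u - wu) < threshold and abs(v - wv) < threshold:
                u, v = wu, wv
                break
        welded.append((u, v))
    return welded
```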

With that done on every web, my UVs look like this:

UVsHoriz.png

Which is fine.
When I import the meshes into UE4, I just let the engine pack them.

Seriously, though… These are the sorts of meshes where I really wish I could just bake lighting to vertex colours in UE4, instead of a lightmap.
It would look better, and would have saved me lots and lots of pain…

And here we are, swing amount in red vertex channel, primitive offset (id) in green:

FinalWebs.png

Web contact meshes

I wanted to stamp some sort of mesh / decal on the wall underneath the hanging meshes.
If you have a look back at the top of the seahorse, you might notice an OUT_WebHits node which contains all the original ray hits.

I’m not going to break this down completely, but I take the scatter points, bring in the tunnel geometry, and use the scatter points to fracture the tunnel.

I take that, copy point colour onto the mesh, and subdivide it:

WallWebsSubd.png

Delete all the non red bits, push the mesh out along normals with some noise, polyreduce, done 🙂

WallWebsFinal.png

I could have done much more interesting things with this, but then life is full of regrets isn’t it?

Back to UE4

So, export all that stuff out, bring it into UE4.

Fun story, first export I did was accidentally over 1 million vertices, and the mesh still rendered in less than half a millisecond on a GeForce 970.
We are living in the future, people.

CobwebsMaterial.png

Most of this material is setting up the swinging animation for the webs, using World Position Offset.

There are two sets of parameters for everything: one for when the web is “idle”, and one for when it is being affected by the Scanner being near it.

To pass the position of the scanner into the material, I have to set up a Dynamic Material Instance, so this is all handled in the web blueprint (which doesn’t do much else).

It also passes in a neutral wind direction for when the webs are idle, which I set from the forward vector of an arrow component, just to make things easy:

WindDirection.png

So now that I have the scanner position, for each vertex in each web I get the distance between it and the scanner, and use that to lerp between the idle and the “windy” settings.

All of these values are offset by the position id that I put in the green channel, so that not all of the webs are moving at exactly the same time.
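Put together, the per-vertex offset amounts to something like this (a Python sketch of the material logic; the names and the phase scale are mine, not the actual World Position Offset graph):

```python
import math

def web_wpo(dist_to_scanner, influence_radius,
            idle_amp, windy_amp, swing_mask, phase_id, t):
    # Lerp between "idle" and "scanner nearby" swing amplitudes by
    # distance, then offset the phase by the per-web id (green
    # channel) so the webs don't all swing in sync.
    # swing_mask is the red vertex channel (0 at the fastened end).
    near = max(0.0, 1.0 - dist_to_scanner / influence_radius)
    amp = idle_amp + (windy_amp - idle_amp) * near
    return amp * swing_mask * math.sin(t + phase_id * 2.0 * math.pi)
```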

Still to come…

Animation approach from Modo to blueprints, lighting rig for the scanner, all the fun stuff! 🙂

City scanner scene – Breakdown pt1

October 12, 2016

EnvWideShot.png

In this post, I’ll go through the construction of the environment for my recently posted Half Life 2 scanner scene.

The point of this project was really just to do a bit of animation on my scanner, and show it off in a simple environment. I can’t remember the last time I did any animation, but my guess would be when I was studying at the AIE over ten years ago 🙂

So with that in mind, figuring I was going to struggle with the animation side, I wanted to keep the environment dead simple. It was always going to be dark, anyway, since I wanted the scanner to light the scene!

Modelling / texturing the tunnel

I looked up a bunch of photo reference for cool tunnels in Europe, presumably the sort of thing that the resistance in City 17 would have used 🙂

I blocked out basic lighting, camera setup, and created the tunnel out of cubes in UE4.
Once I was happy with the layout, I could then just export the blocked out mesh to FBX to use as a template in Modo:

WIP_ExportBlockout.png

I also took the time to make a really basic animatic.
I changed the path and timing of the scanner quite a bit, but I still found this to be useful:

Anyway, at this point, the scene blockout is in Modo, and I can start building geometry:

WIP_SceneBlockoutModo.png

The geometry itself is dead simple, so I won’t go into that too much, I just extruded along a spline, then beveled and pushed a few edge loops around 🙂

I always use the sculpt tools to push geometry around a little, just to make things feel a bit more natural. Here specifically I was sinking some of the vertices on the side pathways:

WIP_PushVertsModo.png

Layered vertex painted materials can be expensive, so I wanted to avoid going too far down that path.
In the end, I settled on having two layers: concrete, and moldy damp green stuff:

WIP_WallMaterial.png

The green stuff is vertex paint blended on, and the vertex colours for the mask were painted in UE4 rather than in Modo, just because it is quick and easy to see what I’m doing in the editor.
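The blend itself is just a lerp driven by the painted vertex value. A Python sketch of the idea (the real material may sharpen the mask with height/grunge textures, which I'm leaving out; names are mine):

```python
def blend_wall_layers(concrete, green, vtx_paint):
    # vtx_paint is the painted vertex value in [0, 1]; a plain lerp
    # between the concrete layer and the moldy green layer
    w = min(1.0, max(0.0, vtx_paint))
    return tuple(c + (g - c) * w for c, g in zip(concrete, green))
```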

Most of the materials in the scene were made in Substance Painter.
And I’m lazy, so they are usually a couple of layers with procedural masks, and one or two hand-painted masks 🙂

substancepainterconcrete

Water plane

Water.gif

For the purposes of this scene, I could get away with a pretty low tech / low quality water plane. As long as it had some movement and was reflective, it would do!

The engine provides flow map samples and functions in the content examples, so I just used those. I’ve written my own before (and by that, I mean I copied what they were doing in the Portal 2 Water Flow presentation from SIGGRAPH 2010), but the UE4 implementation does exactly what I wanted 🙂

And seriously, if you haven’t looked at that presentation, go do it.
They used Houdini to generate water flow, but I’m lazy and ain’t got time for that! (Not for this scene, at any rate).

I just generated mine in Photoshop, using this page as a guide:

Photoshop generated flow maps

At some point, I’d like to see if I can set up the same workflow in Substance Painter and/or Houdini.
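For reference, the standard two-phase flow map trick looks roughly like this (a Python sketch of what the material function computes; parameter names are mine):

```python
def flow_uv(u, v, flow_x, flow_y, time, flow_strength=1.0):
    """Two-phase flow-map advection: pan the UVs along the flow
    vector in two half-offset cycles, and blend between the two
    samples to hide each cycle's reset."""
    phase0 = time % 1.0
    phase1 = (time + 0.5) % 1.0
    uv0 = (u + flow_x * phase0 * flow_strength,
           v + flow_y * phase0 * flow_strength)
    uv1 = (u + flow_x * phase1 * flow_strength,
           v + flow_y * phase1 * flow_strength)
    blend = abs(phase0 * 2.0 - 1.0)  # 1.0 at the reset, 0.0 mid-cycle
    # Sample the normal map at uv0 and uv1, then lerp by blend
    return uv0, uv1, blend
```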

Anyway, the material is a bit messy (sorry):

watermaterial

I’m passing the flowmap texture and some timing parameters into the flowmaps material function, and getting a new normal map out of it.

The only other thing going on here is that I have a mask for the edges of the water, where it is interacting with the walls. I blend in different subsurface colour, normal strength and roughness at the edges.

Fog planes

FogPlanes.png

I’ve got a few overlapping fog planes in the scene, with a simple noisy texture, offset by world position (having a different offset on each makes them feel a little more volumetric).

Much like the water, the fog plane has a subtle flow map on it, to fake a bit of turbulence, and the material uses depth fade on opacity to help it blend with the surrounding geometry:

fog
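Depth fade itself is simple; here's a Python sketch of what the node computes (parameter names are mine):

```python
def depth_fade_opacity(pixel_depth, scene_depth, fade_distance, base_opacity):
    # Opacity ramps to zero as the fog plane approaches the geometry
    # behind it, hiding the hard intersection line
    fade = min(1.0, max(0.0, (scene_depth - pixel_depth) / fade_distance))
    return base_opacity * fade
```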

UE4 4.13 mesh decals

I was going to use a bunch of the new 4.13 features originally, but in the end I think the only one I used was “mesh decals”.

These are decals in the old school sense, not the projected decals that UE4 users have probably come to love. In the back of my mind, I had thought I might turn this into a VR scene at some point, and the cost of projected decals is a somewhat unknown quantity for me at the moment.

The main advantage of mesh decals, vs floating bits of geometry with Masked materials, is that mesh decals support full alpha blending.

In these shots, the water puddle, stain and concrete edge damage are all part of the same decal sheet:

The decals are all using Diffuse, Normals, Roughness, Metallic and Occlusion (the last three packed together):

DecalsTextures.png

I built the decals one at a time, without much planning, basically guessing at how much texture space I thought I was going to need (I didn’t bother setting a “texels per metre” type of limit for my project, but that probably would have been sensible).

Each time I wanted a new mesh decal, I’d work out in Modo how big I wanted it first:

ModoDecalMeshes.png

Then I’d copy it into a separate Modo scene just for decal layout, which I take into Substance Painter.
I did this so I could keep all the meshes together in one place, to make painting easier:

ModoDecalScene.png

And then here is the scene in Substance:

SubstancePainterDecalScene.png

And here is the scene with and without decals:

meshdecals

What’s great about this, is that mesh decals don’t show up in Shader Complexity, so the tech artists on the project will never know… (I kid, I kid. They will find them in PIX, and will hunt you down and yell at you).

I really like this approach to building wear and tear into materials. The first time I saw this approach was when I was working at Visceral Games in Melbourne, and the engine was very well optimized to handle a pretty huge amount of decals. I didn’t embrace it as much as I should have, back then.

Rails

A few years back, I made a blueprint for pipes that allowed joining sections, etc.
So I knocked together a model in Modo for the connection pieces:

RailBracketModo.png

Edge-weighted sub-d, of course, because I can’t help myself 🙂
I even started sculpting in some heavy rust, but had to have a stern word to myself about not spending too much time on stuff that isn’t even going to be lit…

Textured in Substance Painter:

railbracketsubstance

Same dealio with the pipe segments:

railsubstance

Then I just built the spline in the editor, and set it up like in my old blog post.

Much like I did with the original blockout geometry, I also exported the final pipes back out to Modo so that I could use them to work out where I wanted to put some decals.

The only other thing that was a pain was that the pipes need lightmaps, and I couldn’t work out a way to generate unique UVs for the final pipe mesh.

In the end, I just used the merge actors function in the editor, so that they all became a single static mesh, and let Unreal generate lightmap UVs.

Webs

Did you notice that there were hanging spider webs in the scene?
No? Good, because I don’t like them much 😛

I probably spent 10-20 hours just messing about with these silly things, but at least I got some fun gifs out of them:

BusySpiders.gif

Next up…

I’ll break down the construction of those web things, might be useful for a scene full of badly animated vines, I suppose…

I’ll also go through all of the silly things I did on the animation / blueprint / lighting side.

City Scanner

August 27, 2016

Since I had so much fun with the last Modo / Substance project I did, I thought I’d do another one 🙂

This time, I decided to make a City Scanner, from Half Life 2.
It’s a work in progress, and I’ll keep posting regular screenshots up on my twitter, but here’s where I’m currently at:

WIP10

I could have been smart, and just grabbed the model from Source and built around it, but I need to practice building things from scratch, so I built it based off a bunch of screenshots I grabbed out of the game.

It has quite a few differences from the original, which I’m going to pretend was due to creative license, rather than me screwing up proportions, etc (I particularly hate the green side panel, and some of the rear details, but I’m not going to fix the modelling on those at this point)…

Building the model

As with everything I do, this was built as an edge-weighted Catmull-Clark subdivision surface, in Modo 10.

Whenever I’m working on these things, I tend to throw some basic Modo procedural materials on and render them out, so here’s where I was at by the end of the high poly process:

ScannerRender.png

Once I was happy with the model (read: sick of working on it :P), I created the low poly mesh for it, and unwrapped the thing.

WIP_WireLP.png

Unwrapping aside, this didn’t take a huge amount of time, because I just used the base sub-d cage and stripped out a bunch of loops.
It’s still pretty heavy, at about 7000 vertices, but it’ll do!

Painter work

I could have baked the procedural materials out of Modo, and painted over the top of them, etc (Modo actually has some great baking and painting tools these days), but I need to keep using painter more.

Probably the largest amount of time I spent from this point on was splitting the high and low poly up into lots of different meshes so that I could bake all the maps I needed in Substance Painter.

Models with lots of floating, yet welded intersecting parts are a little bit of a pain for this sort of thing, but I got there eventually.

From Modo, I baked out a Surface ID mask (actually, I used a Diffuse render output, and had flood fill colours on all my materials, but I use it as a Surface ID mask in Painter):

SurfaceIDs

For each of the colour blocks, I set up a folder in Painter that had a Colour Selection mask on it:

WIP_ColourSelection.png

And then I just stack up a bunch of flood fill colour layers with masks until I’m happy.

There’s not a lot of actual painting going on here, at this point, although I do always paint out some parts of the procedural masks, because having even edge wear across the whole model looks pretty silly.

That said, smart masks with flood fill layers aren’t a bad way to flesh out basic wear and tear, etc:

WIP-SmartMask.png

I still need to paint out more of the wear and tear on my model, and put more colour variation in; it looks a little like it has been in a sandstorm, then thrown down some stairs 🙂

UE4

UE4Wip_2.png

Aside from some issues with Reflection Capture Actors (having emissive materials in a scene can mess them up a bit), I really didn’t do much except throw the exported textures from Substance onto the mesh, and put a few lights in.

I did mess about with the texels per pixel, min and fade resolutions, and radius thresholds of the shadow casters a bit, because the default settings for shadows in UE4 are pretty low quality for some reason, even on Epic settings.

The material is really boring at the moment, the only thing it exposes is a multiplier for emissive:

WIP-UE4Material.png

Next steps

I will probably animate this in UE4 at some point, and have it floating around, flashing lights, etc.
And it will end up as a minor piece in an environment at some point, hopefully 🙂

For now, though, I want to continue down the fun path of Modo sub-d/Substance, so I will probably start working on a new model.

Watch this space, and/or twitter 🙂

 

 

Shopping for masks in Houdini

January 20, 2016

Houdini pun there, don’t worry if you don’t get it, because it’s pretty much the worst…

In my last post, I talked about the masking effects in Shangri-La, Far Cry 4.

I mentioned that it would be interesting to try out generating the rough masks in Houdini, instead of painting them in Modo.

So here’s an example of a mask made in Houdini, being used in Unreal 4:

VortUE4Houdini.gif

Not horrible.
Since it moves along the model pretty evenly, you can see that the hands are pretty late to dissolve, which is a bit weird.

I could paint those out, but then the more I paint, the less value I’m getting out of Houdini for the process.

This is probably a good enough starting point before World Machine, so I’ll talk about the setup.

Masky mask and the function bunch

I exported the Vortigaunt out of Modo as an Alembic file, and brought it into Houdini.
Everything is pretty much done inside a single geometry node:

MaskGen_all

The interesting bit here is “point_spread_solver”. This is where all the work happens.

Each frame, the solver carries data from one vertex to another, and I just manually stop and bake out the texture when the values stop spreading.

I made the un-calculated points green to illustrate:

VortGreen

A note on “colour_selected_white”: I should really do this bit procedurally. I’m always starting the effect from holes in the mesh, so I could pick the edge vertices that way, instead of manually selecting them in the viewport.

The solver

MaskGen_point_spread_solver

Yay. Attribwrangle1. Such naming, wow.

Nodes are fun, right up until they aren’t, so you’ll often see me do large slabs of functionality in VEX. Sorry about that, but life is pain, and all that…

Here’s what the attrib wrangle is doing:

int MinDist = -1;

// Only points that haven't been reached yet look at their neighbours.
if (@DistanceFromMask == 0)
{
    int NeighbourPoints[] = neighbours(0, @ptnum);

    foreach (int NeighbourPointNum; NeighbourPoints)
    {
        int success           = 0;
        // Read the neighbour's value from the previous frame (input 1).
        int NeighbourDistance = pointattrib(
                                    1,
                                    "DistanceFromMask",
                                    NeighbourPointNum,
                                    success);

        // Track the smallest non-zero distance among the neighbours.
        if (NeighbourDistance > 0)
        {
            if (MinDist == -1)
            {
                MinDist = NeighbourDistance;
            }

            MinDist = min(MinDist, NeighbourDistance);
        }
    }
}

// This point sits one step further from the mask source than its
// closest calculated neighbour.
if (MinDist > 0)
    @DistanceFromMask = MinDist + 1;

Not a very nuanced way of spreading out the values.

For each point, assuming the point has a zero “distance” value, I check the neighboring points.
If a neighbor has a non-zero integer “distance” value, then I take the lowest of all the neighbors, add one to it, and that becomes my “distance” value.

This causes the numbers to spread out over the surface, with the lowest value at the source points, highest value at the furthest distance.

Integers -> Colours

So, the vertices now all have integer distance values on them.
Back up in the mask image, the solver promotes the Distance value up to a Detail attribute, getting the Max Distance of all the points.

In the wrangle node under that, I just loop through all the points and divide each point’s Distance by the Max Distance, and use that to set the colour, or I set it as green if there’s no distance value:

if (@DistanceFromMask > 0)
{
    @Cd = float(@DistanceFromMask - 1) / float(@DistanceFromMaskMax);
}
else
{
    @Cd = {0,1,0};
}

So that produces the gif I showed earlier with the green on it.

Colours -> Textures

Time to jump into SHOPs. See? This is where my awesome title pun comes in.

As simple as it gets, vertex Colour data straight into the surface output:

Material

In my “Out”, I’m using a BakeTexture node to bake the material into a texture, and I end up with this:

vortigaunt_mask_houdini

Conclusion

Bam! Work is done.
There still wouldn't have been much point in doing this on Shangri-La, because painting masks in Modo is super quick anyway, but it's fun to jump back into Houdini every now and then and try new things.

Has led to some other interesting thoughts, though.

  • For Shangri-La, we could have done that at runtime in a compute shader, and generated the mask-out effect from wherever you actually shot an arrow into an enemy.
    That would have been cool.
  • You could probably use Houdini Engine to put the network into UE4 itself, so you could paint the vertex colours and generate the masks all inside UE4.
  • You could do the “erosion” part in Houdini as well, even if you just subdivide the model up and do it using points rather than run it in image space (to avoid seams). Might be hard to get a great resolution out of it.
  • You could do an actual pressure simulation, something along the lines of what this Ben Millwood guy did here. He's a buddy of mine; it's a cool approach, and it's better than my hacky min-values thing.

Unforeseen circumstances

February 14, 2015

About once every year or two I decide to work on a character, and there's generally a lot of crying and swearing, and it ends with a character model that looks like it ran full speed into a brick wall.

Well, it’s that time again, and I may actually get a character done for a change!
If I do, it will be the first in about 8 years, and I’m not sure I’m willing to drag that old one out any time soon 🙂

Anyway, here ’tis:

Vortigaunt_wip1

He's a Vortigaunt from the Half-Life series, for those unfamiliar with it (or in case he doesn't look much like one :)).
I modelled him off screenshots from Half-Life 2 this time, rather than Half-Life 1 like most of my other reference.

So he's rigged and skinned, but just in a pretty temporary way so I could pose him in that pensive state above.
Once I’m happy with the rig, I’ll unwrap the guy, and hopefully get to sculpting and texturing etc 🙂

Flickery lights

January 4, 2015

It’s been a while since I’ve worked on this scene, so I thought I’d ease back in by playing around with materials again!

When the big machine in the roof of my scene turns on, I wanted a bunch of lights to turn on with it.
Here it is!

I’m doing this with a mix of actual lights, and an animated dynamic material.

Each piece of the geometry that is lit has vertex colour data on it, and the vertex colour is used to offset the flickering (which I’ll explain a bit later).

Here’s what the vertex colours look like (I’m only using the red channel at the moment, but I might use green and blue for other data later when I want to damage the lights):

ModoVertColours

And here’s the material:

FlickerEmissiveMaterial

The stuff outside of the "Flicker" group is just regular Normals, Roughness, Metallic, etc.
I'm not using a separate emissive texture; I'm just masking the parts of the Albedo that are glowy and using those in the material emissive slot. The mask is stored in the Albedo alpha channel.
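In pseudologic, that masking boils down to something like this (plain Python tuples standing in for shader types; the function name is mine):

```python
# The albedo alpha channel decides which texels feed the emissive slot.
def emissive_from_albedo(albedo_rgba, flicker_strength):
    r, g, b, mask = albedo_rgba
    return (r * mask * flicker_strength,
            g * mask * flicker_strength,
            b * mask * flicker_strength)

# A fully-masked orange texel at full flicker strength glows orange:
print(emissive_from_albedo((1.0, 0.5, 0.0, 1.0), 1.0))  # -> (1.0, 0.5, 0.0)
# A non-glowy texel (mask = 0) emits nothing:
print(emissive_from_albedo((1.0, 0.5, 0.0, 0.0), 1.0))  # -> (0.0, 0.0, 0.0)
```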

Now, for the flickering part…

I’m using a texture to determine the brightness of the flicker, each pixel representing a different light intensity at a different point in time (I stretched the image to make it obvious, but it’s actually a 256 * 1 texture):

LightFlickerPattern2

The vertex colour I mentioned before is the offset into the above texture. Each light has a different offset into the flicker pattern, so they all go through the same cycle but start at different points in the texture.
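As a rough sketch of that lookup (the pattern data and names here are made up, not the actual texture):

```python
# 256-entry flicker pattern; stand-in data, the real one is an authored texture.
FLICKER_PATTERN = [(i * 37) % 256 / 255.0 for i in range(256)]

def flicker_brightness(time_frac, vertex_offset):
    """time_frac: 0..1 position in the flicker cycle; vertex_offset: red vertex colour."""
    index = int((time_frac + vertex_offset) * 256) % 256
    return FLICKER_PATTERN[index]

# Two lights at the same moment, with different vertex colours,
# sample different points in the same cycle:
a = flicker_brightness(0.25, 0.0)
b = flicker_brightness(0.25, 0.5)
```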

There are parameters for the strength of the flicker, minimum and maximum emissive strength, etc.
These parameters are controlled by a dynamic material instance, so that I can play with their values in the RoofMachine blueprint, like so:

DynamicMaterialParamsBlueprint

And, finally, I just set up some curves to control these parameters over the “power up” phase, which I just played around with until I was reasonably happy:

LightPowerTimeline

And that’s about it!

I’ve also done a little tweaking of the lighting, etc, and although it’s a bit too gloomy at the moment, it hides my nasty wall material that currently looks like cheese 🙂

LightTestAndWalls

Random fields

October 30, 2014

Just a quick update on the random functionality I implemented for the Vector Fields in the last post:


https://geofflester.wordpress.com/2014/10/27/swirly-vector-fields-of-doom/

The random data I was generating was previously overwriting the values imported for the vector field.
I've refactored it so that "Random" is a new "Modifier" component, which means it can be blended in with existing vector field data, e.g. flow data baked out of Maya, or other Modifier components I've implemented.

I’ve also set up a really basic blending between two random data seeds, to have a simple animated random vector field (rather than the previous approach).
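The seed blending is conceptually just a lerp between two fields; here's a Python sketch of the idea (my guess at the approach, not the actual plugin code):

```python
import random

def random_field(seed, size):
    """Generate a deterministic field of random 3d vectors from a seed."""
    rng = random.Random(seed)
    return [(rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
            for _ in range(size)]

def blend(field_a, field_b, t):
    """Linear blend per vector; animate t from 0 to 1 over time."""
    return [tuple(a + (b - a) * t for a, b in zip(va, vb))
            for va, vb in zip(field_a, field_b)]

field_a = random_field(seed=1, size=8)
field_b = random_field(seed=2, size=8)
halfway = blend(field_a, field_b, 0.5)  # midway between the two seeds
```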

Less words, more video!!

Towards the end of the video, I turn on the original Motor component, which starts blending with the randomness.

Probably worth mentioning that all components can be controlled by blueprints by default (because UE4 is awesome), so it would probably only take a non-programmer a few hours to:

  • Make a gun that shoots grenades that generate vortices in vector fields
  • Add a button that randomises particle movement in a room
  • Make fans that blow particles around, and turn on and off with player interaction
  • Have players generate swirling particles around them as they move
  • Etc.

Oh, also, I bumped the number of particles up to 48 000, for giggles.
I can actually push it up to about 100 000 before it struggles, which is pretty neat for a 4-year-old graphics card (560 Ti)! 🙂

Swirly Vector Fields of doom

October 27, 2014

C++ and I are friends again, but to be honest UE4 makes that pretty easy.

I've been playing around with the Vector Fields in UE4, which are essentially a 3d grid of forces that can be applied per frame to particles.
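Conceptually, per particle per frame, it's something like this (a toy Python version with nearest-cell sampling; the real thing runs on the GPU):

```python
def sample_field(field, size, pos):
    """Nearest-cell lookup into a size*size*size grid; pos is in [0, 1)^3."""
    ix, iy, iz = (min(int(p * size), size - 1) for p in pos)
    return field[(ix * size + iy) * size + iz]

def step_particle(field, size, pos, vel, dt):
    """Apply the field's force at the particle's position, then integrate."""
    force = sample_field(field, size, pos)
    vel = tuple(v + f * dt for v, f in zip(vel, force))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

# A 2x2x2 field pushing everything along +X:
size = 2
field = [(1.0, 0.0, 0.0)] * (size ** 3)
pos, vel = step_particle(field, size, (0.1, 0.5, 0.5), (0.0, 0.0, 0.0), dt=0.5)
```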

You can't really author them in UE4, so I wrote some plugin code that lets me randomly generate them (please excuse the crappy VFX):

Next up, I wanted to be able to change data in Vector Fields on the fly.
I have plans for a whole bunch of different "Modifiers", including spherical impulses (explosions, etc).

For now, I have a basic motor / vortex type thing going on, that can be turned on and off through blueprints, blended in and out over the top of the random data, etc:
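The motor force itself is basically a cross product with the vortex axis; something like this sketch (axis hard-coded to Z, falloff omitted, names mine):

```python
def vortex_force(cell_pos, center, strength):
    """Force perpendicular to the cell's offset from a Z-aligned vortex axis."""
    dx = cell_pos[0] - center[0]
    dy = cell_pos[1] - center[1]
    # Cross product of the +Z axis with the radial offset (dx, dy, 0):
    return (-dy * strength, dx * strength, 0.0)

# A cell directly +X of the centre gets pushed in +Y (counter-clockwise swirl):
f = vortex_force((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), strength=2.0)
```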

Places I might take this (if I can escape Civilization: Beyond Earth :)):

  • Sample the vector field data in the Material system, so that I can have particle effects and material flow effects tied together somewhat (water on a surface follows the direction of water particles flowing near it, explosions cause materials and particles to react together, etc)
  • Run a really simple inflow/outflow pressure simulation through the grid to replace the random initialisation I have now. Kinda like what my rather clever buddy Ben Millwood did recently for 2d water flood:
    Ben Millwood’s amazing flow map tool of glory
  • Move the Modifier functionality to GPU (animated vector fields in UE4 already implement some of this).
    I’m still getting pretty decent frame rates in debug builds of the editor, so I’m not sure how much I care about it, except to tick the “yay, compute shaders in UE4” box
  • Rather than just grabbing the vector at the position in world space, interpolate between vectors using the wavelet turbulence technique to add high-detail flow (Wavelet turbulence). Currently well beyond my understanding, so it would take me months, but you never know 😉 At the very least, this would be a cool thing to do when sampling the Vector Field data in a material
  • Honestly, they could probably have a bunch of engineers work on this tech for years and years, and we wouldn't see an end to the cool things that could be done with it.

Things I would also like to see, but probably wouldn't attempt myself:

  • The data structures don’t lend themselves to having a grid across a whole world (octrees instead of a flat array, maybe).
  • Vector Fields don't currently work with anything but GPU particles (I think, unless I'm missing something). It would be nice to be able to use them with ribbon particles (for smoke, bullet trails, etc), and make them work with dynamic objects, cloth, hair, confetti…
  • Extending them to have additional arbitrary channels added to them (pressure, temperature, etc) could be neat.
  • I could go on, but I’ll leave it there for now!

If nothing else, this has been a relatively painless exercise in implementing custom SceneObjects and Components in UE4 🙂