Last post I went through all the setup for the bubble sim; now for lighting, rendering and materials, the fun stuff!
I talked about the texture creation in the first post, but there are also quite a lot of materials in the scene that are just procedural Houdini PBR materials.
Most of these are not very exciting: they are either straight out of the material palette, or only modified slightly from those samples.
The top four are a little more interesting, though (purplePaint, whiteWalls, wood and floorTiles), because they have some material effects that are driven from the simulation data in the scene.
If you squint, you might notice that the walls and wood shelf get wet after the grenades explode, and there are scorch marks left on the walls as well.
Here is a shot with the smoke turned off, to make these effects obvious:
To create the scorch marks in a material, I first needed some volume data to feed it.
I could read the current temperature of the simulation, but that dissipates over a few frames, so the scorch marks would also disappear.
The solution I came up with was to generate a new low resolution volume that keeps track of the maximum value of temperature per voxel, over the life of the simulation.
To start out with, I import the temperature field from the full Pyro sim. Here is a visualization of that from about two thirds of the way through the sim:
I only need the back half of that, and I’m happy for it to be low resolution, so I resample and blur it:
Great! That’s one frame of temperature data, but I want the maximum temperature that we’ve had in each voxel so far.
The easiest way I could think of to do this was to use a solver, merging the current frame's volume with the previous frame's volume, using a Volume Merge set to "Maximum":
And the result I get from this:
So that’s the accumulated max temperature of the volume from the current frame, and all the frames before it!
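The solver's job can be sketched in plain Python (the function and variable names here are mine, not Houdini's): each frame, take the per-voxel maximum of the accumulated field and the current frame's temperature field.

```python
def accumulate_max(previous, current):
    """Per-voxel running maximum: merge this frame's temperature
    field into the accumulated field, keeping the larger value
    in each voxel (what Volume Merge "Maximum" does)."""
    return [max(p, c) for p, c in zip(previous, current)]

# Flattened voxel grids for three frames of a tiny example sim.
frames = [
    [0.0, 0.2, 0.9, 0.1],
    [0.0, 0.5, 0.4, 0.1],
    [0.3, 0.1, 0.0, 0.0],
]

accumulated = frames[0]
for frame in frames[1:]:
    accumulated = accumulate_max(accumulated, frame)

print(accumulated)  # [0.3, 0.5, 0.9, 0.1]
```

Note that the hot voxel from frame one survives to the end, which is exactly why the scorch marks don't fade as the sim's temperature dissipates.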
Scorch in material
Back in the whiteWalls material, I need to read in this volume data, and use it to create the scorch mark.
Here is an overview of the white walls material:
Both the wetness and scorch effects modify only two parameters: Roughness and Base Colour. Both effects darken the base colour of the material, but the scorch makes the material rougher while the wetness makes it less rough.
For example, the material has a roughness of 0.55 when not modified, 0.92 when scorched and 0.043 when fully wet.
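Using those numbers from the post, the blending can be sketched as two linear interpolations driven by 0..1 scorch and wetness masks. The blend order (scorch first, then wetness on top) is my assumption; the actual network may combine them differently.

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by t in 0..1."""
    return a + (b - a) * t

# Roughness values quoted in the post for the white walls material.
BASE_ROUGH, SCORCH_ROUGH, WET_ROUGH = 0.55, 0.92, 0.043

def wall_roughness(scorch, wet):
    """scorch and wet are 0..1 masks sampled from the sim data.
    Scorch pushes roughness up, wetness pulls it down on top."""
    rough = lerp(BASE_ROUGH, SCORCH_ROUGH, scorch)
    return lerp(rough, WET_ROUGH, wet)

print(wall_roughness(0.0, 0.0))  # unmodified, 0.55
print(wall_roughness(1.0, 0.0))  # fully scorched, ~0.92
print(wall_roughness(0.0, 1.0))  # fully wet, ~0.043
```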
The burnScorch subnet over on the left exposes a few different outputs; these are all just different types of noise that get blended together. I probably could have output a single value instead, and kept the Scorch network box in the above screenshot a lot simpler.
Anyway, diving in to the burnScorch subnet:
One thing I should mention straight up: you’ll notice that the filename for the volume sample is exposed as a subnet input. I was getting errors if I didn’t do that, and I’m not entirely sure why!
The position attribute in the Material context is not in world space, so you’ll notice I’m doing a Transform on it, which transforms from “Current” to “World”.
If you don’t do that, and just use the volume sample straight up, you’ll have noise that crawls across the scene as the camera moves.
I found that out the hard way, 10 hours of rendering later.
Anyway, I’m sampling the maximum temperature volume that I saved out previously, fitting it into a few different value ranges, then feeding those values into the Position (and in one case the Frequency) of some turbulence noise nodes.
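The "fitting" step is a remap from one value range to another, which can be sketched like this (the specific input and output ranges below are illustrative, not the ones used in the scene):

```python
def fit(value, old_min, old_max, new_min, new_max):
    """Remap value from [old_min, old_max] to [new_min, new_max],
    clamping outside the input range, in the spirit of a Fit
    Range node."""
    t = (value - old_min) / (old_max - old_min)
    t = max(0.0, min(1.0, t))
    return new_min + t * (new_max - new_min)

# A max-temperature sample of 0.6 remapped into a noise
# frequency range of 1..8 (made-up ranges for illustration).
print(fit(0.6, 0.2, 1.0, 1.0, 8.0))  # ~4.5
```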
The frequency one is interesting, because it was totally a mistake, but it gave me a cool swirly pattern:
When combined with all the other noise, I really liked the swirls, so it was a happy accident 🙂
That’s really it for the scorch marks! Just messing about with different noise combinations until I liked the look.
I made it work for the white walls first, then copied it into the purple walls and wood materials.
This is a similar concept to what I did for the temperature: I wanted to work out which surfaces had come into contact with water, and save that information out for use in the material.
On the left side, I import the scene geometry, and scatter points on it (density didn’t matter to me too much, because I’m breaking up the data with noise in the material anyway):
The points are coloured black.
On the right side, I import the fluid, and colour the points white:
Then I transfer the colour from the fluid points onto the scatter points, and that gives me the points in the current frame that are wet!
As before, I’m using a solver to get the wetness from the previous frame, and max it with the current frame.
In this case, I’m doing it just on the red channel, which means wetness from the current frame shows up as white, and wetness accumulated from previous frames as red. It just makes it nice to visualize:
I delete all the points that are black, and then cache out the remaining points, ready to use in the material!
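The per-point accumulation described above can be sketched like so (a plain-Python stand-in for the solver; the channel handling is my reading of the setup):

```python
def accumulate_wetness(prev, current):
    """prev/current are per-point (r, g, b) colours. Max only the
    red channel against the previous frame, so current-frame
    wetness reads as white and accumulated wetness as red."""
    return [(max(pr, cr), cg, cb)
            for (pr, _, _), (cr, cg, cb) in zip(prev, current)]

prev    = [(1.0, 0.0, 0.0), (0.0, 0.0, 0.0)]  # point 0 was wet before
current = [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)]  # point 1 is wet now

print(accumulate_wetness(prev, current))
# [(1.0, 0.0, 0.0), (1.0, 1.0, 1.0)]
```

Point 0 stays red (wet on an earlier frame), point 1 is white (wet right now), and any point that is still pure black afterwards gets deleted before caching.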
Wetness in material
I showed the high level material with the wetness before; here are the internals of the subnet_wetness:
So I’m opening the wetness point file and finding all the points around the current shading point (which has been transformed into world space, like before).
For each wetness point within a radius of 7 centimetres, I get the distance between it and the shading point, and use that to weight the point’s red channel.
I then average this over all the points found in the search radius.
In the loop, you’ll notice I’m adding up a count variable, but I worked out later that I could have used Point Cloud Num Found instead of doing my own count. Oh well 🙂
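The whole lookup loop can be sketched in plain Python (a stand-in for the point cloud open/iterate nodes in the material; the linear distance falloff is my assumption):

```python
import math

def sample_wetness(shade_p, points, radius=0.07):
    """Average the red channel of all wetness points within
    `radius` of the shading point, weighting each point by its
    distance so closer points contribute more."""
    total, count = 0.0, 0
    for p, red in points:
        d = math.dist(shade_p, p)
        if d <= radius:
            total += red * (1.0 - d / radius)  # linear falloff
            count += 1
    return total / count if count else 0.0

# Two wet points near the shading point, one far outside the radius.
points = [
    ((0.00, 0.0, 0.0), 1.0),
    ((0.05, 0.0, 0.0), 1.0),
    ((1.00, 0.0, 0.0), 1.0),
]
print(sample_wetness((0.0, 0.0, 0.0), points))  # ~0.643
```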
I take the sampled wetness, and feed it into a noise node, and then I’m basically done!
If you want an idea of what the point-sampled wetness looks like before it goes through the noise, here is the result of bypassing the noise and feeding it straight into baseColour for the white walls (white is wet, black is dry):
Next up is the Mantra rendering setup and lighting, which should be a rather short post to wrap up with 🙂