Spline Distance Fields

December 31st, 2024

Problem

Earlier this year I decided to put my main project, Tangerine, on hold indefinitely and began prototyping a new renderer in an unfamiliar language using a totally new graphics API, with the goal of overcoming all of Tangerine's technical shortcomings. The new renderer is an eccentric CPU ray tracer called Star Machine that boasts the ability to effortlessly push 4K frames at 120 Hz or better irrespective of scene complexity, a custom coordinate system that limits the play space to roughly 940.7 astronomical units with a constant world resolution of 15 micrometers, and a truly unique visual style. To prove out the underlying theories and help me prioritize what to work on, I am also developing a time trial racing game called Rainy Road.

At the time of writing, Rainy Road lacks a way to render terrain. These are my requirements for its terrain rendering system:

  1. Some terrain features (roads specifically) must be defined as splines.
  2. Terrain features must be composable, allowing for map sections to be rapidly swapped out at runtime.
  3. The system must be useful for procedural object placement.
  4. Terrain data must be fast to process into rendering intermediaries.
  5. Terrain data must be very compact. An implicit representation is preferred if possible.
  6. Iterating on designing level sections must be quick and easy.
  7. I'd like to use existing tools where possible.
  8. I'm not interested in using proprietary tools for this project.

A quick survey of existing terrain editing tools revealed that virtually all general purpose terrain editors violate at least one of these constraints, and unsurprisingly I did not find any terrain tools available targeting my particular niche.

The next most obvious course of action is to use low resolution heightmaps with a good noise function to give them the illusion of being continuous. I've seen games (which are now 20+ years old) use this technique to great effect, so I consider this to be pretty low risk. Additionally, I figure I can use Blender to rapidly iterate on variations of this technique, and once I am satisfied it can function as the level editor.

My experimentation in Blender (and lots of great feedback from internet people experienced on the subject) quickly revealed that reasoning about landscapes and roads in isolation from one another "adds skill" a bit more than I would like, and so I see this as a good opportunity to develop structured workflows and supporting tools to take me to the places I want to go.

Solution

This particular conceptual brick in the washing machine has led me to deeply internalize what I now know to be a fundamental law of design and composition that applies universally:

Everything Affects Everything

In the real world roads are built along whatever happens to be the most circumstantially optimal route through the world they are to exist within. The landscape (among other things) affects the decision of where to put a road, and in turn the road changes the landscape.

If we know where some things must be, then we can infer what surrounds those things. We can eschew the input height map entirely, and instead generate the terrain from only points and splines, possibly in real time.

It turns out there's a whole field of math about this stuff already that I didn't even know existed until after developing the spline distance fields technique and posting about it on Mastodon, and at the time of writing I have not yet explored it in any significant depth.

According to my journal, on September 20th of 2024 I realized I could generate plausible terrain surfaces entirely from splines that describe the important terrain features (rivers, roads, rails, cliffs, etc). In its simplest form, each point in space has a corresponding closest point on the closest spline. That closest-point-on-closest-spline (along with its corresponding binormal vector) defines a plane that determines the local elevation relative to your original arbitrary point in space. This is effectively a method of extruding splines within a constraining volume of space that suspiciously resembles a Voronoi diagram (because it secretly is one, but don't worry about that).
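
To make that concrete, here's the core evaluation reduced to a few lines of throwaway Python. The numbers and names are made up, and the closest point on the closest spline is assumed to already be known:

```python
import numpy as np

# Made-up example inputs: a query point on the flat ground plane, plus the
# closest point on the closest spline and that point's binormal.
query = np.array([4.0, 2.0, 0.0])
closest_on_spline = np.array([3.0, 2.5, 1.75])
binormal = np.array([0.1, 0.0, 0.995])   # roughly "up", like a ground normal
binormal /= np.linalg.norm(binormal)

# The closest point and its binormal define a plane; the signed distance from
# the query point to that plane becomes the local elevation. Whether the
# leading minus is needed depends on which way your binormals point.
elevation = -np.dot(query - closest_on_spline, binormal)   # roughly 1.64 here
```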

Things get a bit dicey for sampling points that are close to the boundary of a spline field, but as I wrote in my journal, these in-between spaces should simply be "interpolated somehow". I call this the "liminality problem". This can appear similar to subduction and obduction in real life, so this quality may reasonably be considered a useful feature if it can be handled intentionally. However, in most cases you will want your spline zones to flow together seamlessly, and so this article only describes a method that sweeps the problem under the rug.

The next day I put this idea to the test, and this is what I got:

Applying what I learned from making that first test, I produce this second attempt right away:

The following day I pushed the technique further to experiment with procedural object placement and test out road generation. This is what I came up with:

This is exactly the sort of thing I've been searching for.

This technique is still an area of active experimentation and research for me:

Unfortunately due to my yearly struggle with Father Winter and a very stressful ongoing dispute with my health insurance, I've stalled out on this research project for the time being.

Until the rains of spring heal my soul and wash away the pain,

Here's How the Basic Version Works
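
Here's the whole technique expressed as pseudo code, written as plain Python with NumPy standing in for Geometry Nodes. The surfel inputs (spline sample positions plus their binormals) and every name here are just for illustration; the real node graph is walked through further down.

```python
import numpy as np

def spline_distance_field_heightmap(surfel_positions, surfel_binormals,
                                    grid_size=128, extent=10.0, blur_iterations=8):
    """Deform a flat grid into terrain from spline samples ("surfels")."""
    # 1. Start with a flat, subdivided grid on the XY plane.
    axis = np.linspace(-extent, extent, grid_size)
    xs, ys = np.meshgrid(axis, axis)
    verts = np.stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)], axis=1)

    # 2. For every grid vertex, find the index of the nearest surfel (XY only).
    deltas = verts[:, None, :2] - surfel_positions[None, :, :2]
    nearest = np.argmin((deltas ** 2).sum(axis=2), axis=1)

    # 3. Signed distance from each vertex to its surfel's plane, where the
    #    plane passes through the surfel and uses the binormal as its normal
    #    (the overall sign depends on how the binormals are oriented).
    offsets = verts - surfel_positions[nearest]
    field = -np.einsum("ij,ij->i", offsets, surfel_binormals[nearest])
    field = field.reshape(grid_size, grid_size)

    # 4. Blur the distance field for a few iterations to hide seams between
    #    regions that belong to different splines.
    for _ in range(blur_iterations):
        padded = np.pad(field, 1, mode="edge")
        field = (padded[1:-1, 1:-1] + padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 5.0

    # 5. Use the blurred value as a Z offset to deform the grid.
    verts[:, 2] = field.ravel()
    return verts
```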

The above pseudo code outlines the entire technique. If this tells you everything you need to know to implement it, great! Be sure to at least take a quick look through the pretty pictures at the end of this post before you close the browser tab. Or don't! Nobody is paying anyone to hold on to your attention. You are free! I don't even know if anyone is reading this, because I don't collect any analytics at all.

Now, for everyone who wasn't born with perfect knowledge of everything, the next section of this post walks through how the normal, tangent, and binormal vectors work for splines in Blender; and the section after that steps through a real working implementation of this technique. Example source files are also provided for you to use in any way you like.

Relevant Spline Math

Blender's Geometry Nodes system provides a variety of useful high level functions for working with Blender's curve and point cloud primitives, so we're going to use those where possible. This leaves a few bits of math to review that are important to our implementation. If you already know how to calculate a binormal vector, breeze on over to the next section.

The first thing we need is the ability to define a plane for any given point on a spline. For our purposes we'll use a point in space and the direction that is perpendicular to the plane (aka "the surface normal" of the plane). In terms of spline parameters, these vectors are called the position and the binormal. Blender does not provide the spline's binormal, but it is very easy to calculate it from the spline's normal and tangent vectors which Blender does provide.

I cannot stress this enough: the thing we want here is the thing you're probably used to calling "the surface normal", but (for reasons I am not responsible for) the thing we want is instead called the "binormal" here, and the thing that is called the "normal" is instead a different thing. Why did the mathematicians do this to us?!

Blender's splines always define a tangent and normal vector for all control points and all interpolated points. These are "unit vectors", which means they always have a length of one, and they encode a direction. The binormal is the cross product of these two vectors, which I'll illustrate in a little bit.

The Tangent Vector

The tangent vector is a unit vector that grazes the point on the curve. This vector rests on the surface of the plane we want to describe. This is what it looks like:

The Normal Vector

The normal vector is perpendicular to the tangent vector, and also rests on the surface of the plane we want to describe. This is what it looks like:

Calculating the Binormal

To calculate the binormal, take the cross product of the normal and tangent, like so:
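
Inside Geometry Nodes this is a single Vector Math node set to Cross Product. In code form it looks something like this (NumPy, with example values):

```python
import numpy as np

# The tangent and normal as provided for a point along the spline (both unit length).
tangent = np.array([0.0, 1.0, 0.0])
normal = np.array([1.0, 0.0, 0.0])

# The binormal is perpendicular to both, and is the "surface normal" of the
# plane we actually care about.
binormal = np.cross(normal, tangent)            # -> [0, 0, 1] in this example
binormal = binormal / np.linalg.norm(binormal)  # guard against numerical drift
```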

Generating the Heightmap

With the spline's binormal in hand, now it is time to generate the height map. Here's the eagle eye view:

I find it works best to keep the splines in their own collection. This way the geometry node graph can read all of the splines from it in one go, and you don't need to change any code to add more splines to your terrain. Only one geometry nodes modifier is required to implement this technique, and it is placed on an empty mesh.

Let's examine the spaghetti program:

Curves to Surfels

You may have already noticed this, but the curves are just a useful editing interface that is immediately discarded.

What we're actually doing is using the curves to generate a set of primitives called surfels, and then using the surfels to extrapolate the surface of the terrain. In this particular case, a surfel is defined as a point in space with an associated binormal vector. Maybe I should have named this technique "surfel distance fields".

Since our heightmap starts as a flat subdivided plane, we'll get the best results if we project the surfels onto the XY plane before sampling them so that Blender's sampling functions behave the way we expect them to. We still need the original position, so we just capture that attribute to snapshot it before flattening.
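
Here's roughly the same step outside of the node graph, assuming the curves have already been resampled into per-point position, tangent, and normal arrays (all of the names here are made up for illustration):

```python
import numpy as np

def curves_to_surfels(positions, tangents, normals):
    """Build surfels from per-sample curve data: flattened XY positions for
    nearest lookups, plus captured original positions and binormals."""
    # Binormal per sample: cross product of the curve's normal and tangent.
    binormals = np.cross(normals, tangents)
    binormals /= np.linalg.norm(binormals, axis=1, keepdims=True)

    # Capture the original positions first (the Capture Attribute step), then
    # flatten the working copy onto the XY plane for the 2D nearest lookups.
    original_positions = positions.copy()
    flattened = positions.copy()
    flattened[:, 2] = 0.0
    return flattened, original_positions, binormals
```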

Surfel Sampling

Next we create the geometry that will be deformed into our terrain. For this we use the Grid node. The density of the grid affects interpolation behavior. The mesh generation method described on this page takes the "sweep the error under the rug" strategy: if your vertex density is too high, you will get abrupt changes in elevation between regions that belong to different curves. Conversely, if your vertex density is too low, your landscape will be soft and featureless.

To sample the nearest surfel parameters, use the Sample Nearest node to find the index of the nearest surfel to a given vertex on the grid, then use the Sample Index node to translate that index into the parameters we want. If you're not familiar with the Geometry Nodes fields concept, the following graph section may look confusing.

The Capture Attribute at the end of this part of the program is not strictly necessary, but I've added it in hopes of making the data flow a little clearer.
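
Outside of the node graph, the Sample Nearest / Sample Index pair boils down to "find an index, then fetch attributes at that index". A NumPy sketch, with made-up names:

```python
import numpy as np

def sample_nearest_surfel(grid_verts, flattened_surfels, original_positions, binormals):
    """For each grid vertex, find the nearest flattened surfel, then fetch
    that surfel's captured position and binormal."""
    # "Sample Nearest": index of the closest flattened surfel per vertex.
    deltas = grid_verts[:, None, :2] - flattened_surfels[None, :, :2]
    nearest_index = np.argmin((deltas ** 2).sum(axis=2), axis=1)

    # "Sample Index": translate those indices into the attributes we want.
    return original_positions[nearest_index], binormals[nearest_index]
```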

Generate the Heightmap

I've mentioned a plane function many times now, and here it finally is! The graph section below starts by finding the distance of each point on the undeformed heightmap to the closest surfel. The function for this is the signed distance from an arbitrary point in space to the nearest point on a plane. The term "signed distance" means the distance can be a negative value, which usually indicates that the evaluated point was under the surface of the plane.

However, that would imply that a negation is needed in my code below, which is curiously absent. The reason is that I screwed up the math somewhere in a way that inverts the polarity of the result, which removes the need for the negation. The result appears to be correct though. Please pretend that I am smart and meant to do this. Someone smart once told me that every shipped game contains an even number of sign errors.
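
For reference, the plane function itself is just a dot product, and flipping the plane normal (or swapping the arguments of the cross product that produced it) flips the sign, which is all it takes for a negation to appear or cancel out. A small sketch:

```python
import numpy as np

def plane_signed_distance(point, plane_origin, plane_normal):
    """Signed distance from a point to the plane through plane_origin with
    unit normal plane_normal; negative usually means "below" the plane."""
    return np.dot(point - plane_origin, plane_normal)

# Flipping the plane normal negates the result, which is one easy way for a
# sign error to appear or cancel out.
point = np.array([0.0, 0.0, 0.0])
origin = np.array([0.0, 0.0, 2.0])
up = np.array([0.0, 0.0, 1.0])
assert plane_signed_distance(point, origin, up) == -plane_signed_distance(point, origin, -up)
```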

Moving on, once you have the planar distance in hand, I find that blurring it for a few iterations helps to smooth out any liminality problems, much like adjusting the vertex density. This is not an exact science, but generally if you adjust either the number of blur iterations or the number of vertices, you will also have to adjust the other.

Finally, we use the blurred distance field value as a position offset along the Z axis, and that gets us the deformed terrain mesh.
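
In NumPy terms, those last two steps (blur, then displace along Z) boil down to something like the sketch below; the only names that correspond to real nodes are Blur Attribute and Set Position, everything else is made up:

```python
import numpy as np

def blur_and_displace(grid_verts, distance_field, grid_size, iterations=8):
    """Blur the per-vertex planar distances, then apply them as Z offsets.
    If the grid gets denser, the iteration count usually needs to go up too."""
    field = distance_field.reshape(grid_size, grid_size)
    for _ in range(iterations):
        # Simple neighbor average, standing in for the Blur Attribute node.
        padded = np.pad(field, 1, mode="edge")
        field = (padded[1:-1, 1:-1] + padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 5.0

    deformed = grid_verts.copy()
    deformed[:, 2] = field.ravel()  # the Set Position / offset-along-Z step
    return deformed
```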

Tada!

Appendix A: Show Me the Source!

Here are the blend files for the examples shown above:

These example files are made available to you to use for whatever you like via your choice of CC0 or CC-BY.

If you do choose to provide attribution for some reason, credit me as "Aeva Palecek" and link to this article if it is reasonable to do so.

Appendix B: Examples in the Wild

Promising Prototype Provokes Preproduction Perturbation

I put one of my early prototypes into the hands of my dear friend Bitmap and 30 minutes later she sent me this message:

And the next day she sent me this:


The stripes on the vertical surfaces here are isolines added by a shader to show changes in elevation.

This caused quite a stir for the game production she's part of, as they ended up completely throwing out the Godot terrain tool they were using in favor of switching to a new workflow built around the prototype I gave her. I'm equal parts honored and deeply terrified by this development.

Bitmap tends to bounce off of conventional digital sculpting tools like ZBrush and Blender's sculpt mode. Her artistic background is primarily as a 2D illustrator, and she's found that the way she reasons about form and space is generally incompatible with those kinds of tools.

I'm very eager to see what things she will create now that she can simply draw the terrain.

I asked Bitmap and The Director if there were any recent(ish) screenshots that would also be ok for me to share in this blog post, and they provided me with a treasure trove of gorgeous preproduction progress photographs which I've included all of below except for the one The Director asked me not to:

You may be wondering where all of that nice multitexturing in these screenshots fits into this technique. Bitmap provides a set of flat subdivided quads with UVs already prepared, and the tool I provided deforms them. Weight painting is used to control how the textures are combined.

Sculptor: Non-Destructive 3D Modeling in Godot

My colleague Dbat has been hard at work building a procedural modeling tool in Godot called Sculptor. Sculptor is a non-destructive 3D sculpting tool that works by deforming meshes that have been generated via CSG.

To do this, Dbat has developed an advanced version of the spline distance fields technique that enables you to create freeform meshes instead of height maps. Their technique uses ray tracing to iteratively deform meshes towards the implicit surface described by the input curves. Sculptor exposes parameters that allow you to control the projected shape and influence of each curve on the mesh deformation process. Way cool!

Dbat's technique would pair very nicely with an adaptive tessellation system.

Inverse Distance Weighting Variant

Math wizard Danpiker wisely pointed out to me that the spline distance fields technique is very compatible with Inverse Distance Weighting. Shortly after, Danpiker put together this awesome animated demo of exactly that. Here's a still frame from the demo:

Inverse Distance Weighting solves the liminality problem, and produces a very slick look. You can vary the degree of influence by adjusting the exponents in the equation.
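
The gist of the variant is to blend every surfel's plane distance with weights that fall off as a power of distance, instead of snapping to the single nearest surfel. Here's a NumPy sketch of that general idea (my own reading of it, not necessarily Danpiker's exact formulation):

```python
import numpy as np

def idw_elevation(grid_verts, surfel_positions, surfel_binormals, power=2.0):
    """Blend every surfel's plane distance, weighted by 1 / distance**power,
    instead of snapping to the single nearest surfel."""
    # Per-vertex, per-surfel distances in the XY plane; the small epsilon
    # avoids a division by zero when a vertex sits exactly on a surfel.
    deltas = grid_verts[:, None, :2] - surfel_positions[None, :, :2]
    distances = np.sqrt((deltas ** 2).sum(axis=2)) + 1e-6
    weights = 1.0 / distances ** power
    weights /= weights.sum(axis=1, keepdims=True)

    # Per-vertex, per-surfel signed plane distances, then the weighted blend.
    offsets = grid_verts[:, None, :] - surfel_positions[None, :, :]
    plane_distances = np.einsum("vsj,sj->vs", offsets, surfel_binormals)
    return -(weights * plane_distances).sum(axis=1)
```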

I've found inverse distance weighting to be a bit tricky to implement efficiently in geometry nodes; however, the introduction of the for-each nodes in Blender 4.3 improves things a bit.

Personally, I find the smoothing effect is maybe a bit too effective, as it tends to erase the earthy look I'm going for. However, I suspect this is easily overcome with a good displacement map.

Closing Thoughts

I like splines!

Future Work

I do not know what the future holds. Let us brave it together.