Near Field Landscape Decoration

A challenge for large-scale terrain engines is providing enough detail in the near field of view: grass, vegetation, rocks and other ground cover. In a brute-force implementation these would all be represented by individual meshes (perhaps using billboards or impostors for distant objects) in a series of 2D rectangular areas stored in a quadtree, which are then rendered when inside the view frustum and within the feature-specific far clipping plane.
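As a rough sketch of that brute-force store, each quadtree node could hold its area of coverage and, at the leaves, the individually placed decoration instances. The names below (DecoNode, DecoInstance, nodeContains) are illustrative, not taken from any particular engine:

```c
#include <stdlib.h>

/* One placed decoration mesh: position plus which mesh to draw. */
typedef struct { float x, z; int meshId; } DecoInstance;

/* Quadtree node covering a 2D rectangle of the terrain. */
typedef struct DecoNode {
    float minX, minZ, maxX, maxZ;   /* area of coverage */
    struct DecoNode *child[4];      /* all NULL for a leaf */
    DecoInstance *items;            /* instances stored at a leaf */
    int itemCount;
} DecoNode;

/* Does this node's rectangle contain the query point? */
int nodeContains(const DecoNode *n, float x, float z) {
    return x >= n->minX && x < n->maxX && z >= n->minZ && z < n->maxZ;
}
```

Traversal descends into whichever child rectangle contains the camera, which is what makes retrieval fast even though the full instance set is enormous.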


Individually placed trees in a landscape

A problem with this approach is the vast amount of storage required to hold every bush, rock and grass clump. Even though retrieval via the quadtree is fast, this is still a huge data set.

A more scalable and fast method of achieving this is to store the above, but with only a small spatial sample of the required decoration. The sample can be randomly scattered in a 1.0 × 1.0 square at a density appropriate to the decoration type. It is then stored in the quadtree along with its area of coverage, but crucially the sample vegetation covers only a fraction of that area – the perception of continuous coverage is generated within the vertex shader.
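Generating that sample is just a uniform scatter over the unit square. A minimal sketch, using a small LCG in place of whatever PRNG the engine actually uses (scatterSample and lcgNext are hypothetical names; count is the per-type density):

```c
/* Tiny LCG so the scatter is deterministic and values stay in [0, 1). */
static unsigned lcgNext(unsigned *state) {
    *state = *state * 1664525u + 1013904223u;  /* Numerical Recipes constants */
    return *state;
}

/* Scatter `count` decoration positions uniformly in a 1.0 x 1.0
   sample square; the same seed always yields the same layout. */
void scatterSample(float *xs, float *zs, int count, unsigned seed) {
    unsigned state = seed;
    for (int i = 0; i < count; i++) {
        xs[i] = (float)((lcgNext(&state) >> 16) & 0xFFFFu) / 65536.0f; /* [0, 1) */
        zs[i] = (float)((lcgNext(&state) >> 16) & 0xFFFFu) / 65536.0f;
    }
}
```

Because the layout is seed-deterministic, the same sample can be rebuilt on load rather than stored per coverage area.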

In the example above, individual trees do require fixed geolocation, but smaller features, visible only at shorter ranges, can get away with a repeating wrap of a small set of meshes.

Assuming a maximum visible range of 100 world units for a feature, and a scatter of sample meshes inside a 1 × 1 world-unit square, the engine can issue the DrawIndexedPrimitives() call when the camera position enters the large area of coverage.
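The trigger itself reduces to a rectangle test: draw only when the camera is inside the coverage area grown by the feature's maximum visible range. A sketch under those assumptions (Rect and shouldDrawDecoration are illustrative names):

```c
/* Axis-aligned coverage rectangle in world X,Z. */
typedef struct { float minX, minZ, maxX, maxZ; } Rect;

/* Issue the decoration draw call only when the camera lies within the
   coverage rectangle expanded by the feature's maximum visible range
   (100 world units in the article's example). */
int shouldDrawDecoration(Rect coverage, float camX, float camZ, float maxRange) {
    return camX >= coverage.minX - maxRange && camX <= coverage.maxX + maxRange
        && camZ >= coverage.minZ - maxRange && camZ <= coverage.maxZ + maxRange;
}
```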

Inside the vertex shader, the position passed in can be scaled and wrapped on a 100 × 100 scale to produce a repeating field of vegetation that appears fixed in space to the viewer, but is in reality being wrapped in the same way that repeating textures are wrapped by a texture sampler.

float wrap(float value, float lower, float upper)
{
    float dist = upper - lower;
    float times = floor((value - lower) / dist);

    return value - (times * dist);
}

VertexShaderOutput VertexShaderFunction_Decoration(VertexShaderInput input)
{
    VertexShaderOutput output;

    // Wrap the coordinates based on the camera position.
    input.Position.x = wrap(input.Position.x - frac(Param_CameraPosition.x / 100), -0.5, 0.5);
    input.Position.z = wrap(input.Position.z - frac(Param_CameraPosition.z / 100), -0.5, 0.5);

    // Scale the unit sample square up to the 100 x 100 field.
    input.Position.x *= 100;
    input.Position.z *= 100;

    float4 worldPosition = mul(input.Position, Param_WorldMatrix);

    // ... remainder of the transform and output assignment omitted ...

Example vertex shader fragment for wrapped surface features


The view above of grass clumps continues for 1000 world-space units, wrapping the visible 100 world units of geometry continuously as the camera moves and giving the impression of endless grass.


In the above, the trees are placed specifically in the landscape, while the grass is a wrapped, randomised set of billboards.


This shot shows more clearly that the grass coverage is thicker nearer to the camera. This is done by rendering the same vertex buffer of grass, centred on the camera, at two different X,Z scales – the first pass at a world scale of 150 and a second at 30, a 5:1 scale ratio that concentrates the same number of grass instances into a much smaller area near the camera. This technique reuses the existing vertex buffer and effect, changing only the World matrix.
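The density gain from the 150/30 split follows from the instance count being fixed: areal density is N / scale², so the ratio between the two passes depends only on the scales – 5× per axis, 25× per unit area. A tiny check of that arithmetic:

```c
/* Areal density ratio of the same vertex buffer (fixed instance count)
   drawn at two world scales: density is N / (scale * scale), so the
   instance count cancels out of the ratio. */
float densityRatio(float farScale, float nearScale) {
    return (farScale * farScale) / (nearScale * nearScale);
}
```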

