Contribution Overview

  • Octree and frustum culling

  • Instanced models

  • Instanced lights

  • Debug line drawer

  • HDR

  • Point- and spotlight shadows

  • Additive animation blending

  • Glass shader


 

Debug Drawer

To make debugging invisible things easier, I developed a line drawer system. It's simple to use: you give the drawer a start point, an end point, and a colour, and it draws the line. The lines are also instanced to minimize the performance cost.
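A minimal sketch of what such an interface could look like; the class and member names here are hypothetical, not the actual ones from the project:

```cpp
#include <vector>

struct Vec3 { float x, y, z; };
struct Color { float r, g, b, a; };

// One queued line; in the real system each of these maps to an
// entry in the line instance buffer.
struct LineInstance {
    Vec3 from;
    Vec3 to;
    Color color;
};

class DebugLineDrawer {
public:
    // Queue a line for rendering this frame.
    void AddLine(const Vec3& from, const Vec3& to, const Color& color) {
        myLines.push_back({ from, to, color });
    }
    // Called after rendering so lines only live for one frame.
    void Clear() { myLines.clear(); }
    const std::vector<LineInstance>& GetLines() const { return myLines; }
private:
    std::vector<LineInstance> myLines;
};
```

The queued lines can then be uploaded to a single instance buffer and drawn in one call.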

Octree and Frustum Culling

To save some more frame time, I implemented an octree culling system. The octree stores static model data, such as buildings and other world meshes, and is queried with the view frustum to retrieve the model data to be rendered. My implementation is, however, a bit more memory intensive than it strictly has to be: I store one instance buffer per model in each node, instead of retrieving all visible models and their instances from the octree and then dynamically building a single instance buffer per model. My version results in quite a few more draw calls (~10k) than the latter technique, but my attempts to implement that approach always resulted in longer frame times. This led me to believe that the most expensive part of the rendering pipeline isn't necessarily the rendering itself, but the binding and creation of buffers.
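A sketch of the node layout and the query walk described above; names are hypothetical, the frustum test is passed in as a predicate to keep the example independent of the intersection maths, and plain integers stand in for GPU buffer handles:

```cpp
#include <functional>
#include <vector>

struct AABB { float min[3]; float max[3]; };

// Each node keeps its own per-model instance buffers, so a visible
// node can be drawn directly without rebuilding any buffers.
struct OctreeNode {
    AABB bounds;
    std::vector<int> modelInstanceBuffers; // stand-in for GPU buffer handles
    OctreeNode* children[8] = {};
};

// Walk the tree and collect every instance buffer whose node
// intersects the view frustum.
void Query(const OctreeNode* node,
           const std::function<bool(const AABB&)>& intersectsFrustum,
           std::vector<int>& outBuffers) {
    if (!node || !intersectsFrustum(node->bounds))
        return;
    outBuffers.insert(outBuffers.end(),
                      node->modelInstanceBuffers.begin(),
                      node->modelInstanceBuffers.end());
    for (const OctreeNode* child : node->children)
        Query(child, intersectsFrustum, outBuffers);
}
```

Every buffer collected this way corresponds to one draw call, which is where the ~10k calls mentioned above come from.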

 
 

Instanced Lights

Before I implemented instanced lights, every light required its own fullscreen pass, which was a superfluous waste of frame time. Since I had already created a general class for light sources, creating the buffer for light instance data was a pretty straightforward task. The light instances are run through a vertex shader, much like normal model vertices, to determine their positions on screen. The vertex data is then sent to a geometry shader that creates a screen-space sprite, which is in turn run through a pixel shader where I calculate the light as usual. The result is that any number of lights can now be rendered with a single draw call without touching too many redundant pixels. I only implemented light instancing for lights that don't cast shadows. From this change alone, the total frame time at the time of implementation went from ~25ms to ~12ms. Light sprite bounds are visualized in the image.
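The CPU side of this boils down to gathering the eligible lights into one instance array each frame, which is then uploaded and drawn with a single instanced call. A sketch, with hypothetical struct and field names:

```cpp
#include <vector>

// Per-instance data for one point light; this is what the vertex
// shader receives and the geometry shader expands into a sprite.
struct PointLight {
    float position[3];
    float radius;
    float color[3];
    float intensity;
    bool castsShadows;
};

// Collect every non-shadow-casting light into one array; shadow
// casters are excluded because only shadowless lights are instanced.
std::vector<PointLight> GatherLightInstances(const std::vector<PointLight>& lights) {
    std::vector<PointLight> instances;
    for (const auto& light : lights)
        if (!light.castsShadows)
            instances.push_back(light);
    return instances;
}
```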

 
 

Instanced Models

Early into development, we realized that our way of rendering models wasn't sustainable. Our levels had grown too large, and since they were built from a modular model kit, a lot of buffers were being bound every frame. Instead of restricting our level designers, I decided to implement instanced rendering.
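The core of instanced rendering with a modular kit is grouping every placed instance by the model it uses, so each unique model needs only one buffer bind and one draw call. A sketch of that grouping step, with hypothetical names and a single float standing in for a full 4x4 world transform:

```cpp
#include <map>
#include <vector>

// One placed model in the level; the real instance buffer would
// hold a 4x4 world matrix instead of a single float.
struct Instance {
    int modelId;
    float transform;
};

// Group all placements by model, so each unique model can be drawn
// with one instanced call instead of one call per placement.
std::map<int, std::vector<float>> GroupInstancesByModel(
        const std::vector<Instance>& placed) {
    std::map<int, std::vector<float>> perModel;
    for (const auto& instance : placed)
        perModel[instance.modelId].push_back(instance.transform);
    return perModel;
}
```

Each entry in the resulting map becomes one instance buffer and one draw call.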

HDR

To make our lights a bit more realistic, I converted our rendering pipeline to HDR, complete with colour correction, bloom and the works.
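An HDR pipeline needs a tonemapping step to bring unbounded light values back into displayable range. The text doesn't say which operator the project used, so the simple Reinhard operator below is purely illustrative:

```cpp
// Reinhard tonemapping: maps an HDR value in [0, inf) to [0, 1).
// Bright values compress smoothly instead of clipping to white.
float Tonemap(float hdr) {
    return hdr / (1.0f + hdr);
}
```

In practice this runs per channel (or on luminance) in the final fullscreen pass, after bloom and before colour correction or vice versa depending on the pipeline.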

Point- and Spotlight Shadows

To enhance the mood and setting of the game, we wanted shadows for point- and spotlights. In my first iteration of the shadow system, every light instance kept its own set of 4096x4096 shadow textures (one for spotlights, six for point lights). This, I realized later, was a bad idea, as gigabytes upon gigabytes of memory were occupied by shadow textures once more lights were added to the levels. I later changed this so that six lower-resolution shadow maps in total are shared amongst all light instances. After I had implemented this system, I heard about a technique where every shadow-casting light shares one shadow map atlas, with each light's "priority" dictating how large an area of the atlas it is allotted. I did not, however, have enough time to implement it.
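Sharing six shadow maps amongst all lights means deciding each frame which lights get one. The text doesn't describe the selection criterion, so this sketch assumes distance to the camera as a stand-in for priority; all names are hypothetical:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct ShadowLight {
    int id;
    float distanceToCamera;
};

// Pick which lights get one of the shared shadow maps this frame:
// sort by distance and hand out the maps to the closest lights.
std::vector<int> AssignShadowMaps(std::vector<ShadowLight> lights,
                                  std::size_t mapCount = 6) {
    std::sort(lights.begin(), lights.end(),
              [](const ShadowLight& a, const ShadowLight& b) {
                  return a.distanceToCamera < b.distanceToCamera;
              });
    std::vector<int> assigned;
    for (std::size_t i = 0; i < lights.size() && i < mapCount; ++i)
        assigned.push_back(lights[i].id);
    return assigned;
}
```

The atlas technique mentioned above generalizes this: instead of a fixed map per selected light, priority would dictate how large a region of one shared atlas each light receives.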

 
 

Additive Animation Blending

To invigorate the movement and animations of the player, we decided to enhance our animation system with support for additive animations. To play an animation additively, you call the play function with a bool marking it as such. A reference pose can be set by passing an animation to the SetReference function; the first frame of that animation becomes the reference. The reference is initialized to the bind pose. In the final game, the only animation to utilize this system is the firing animation. Some, then, would call the development of this system a waste of time, but time spent learning is not time wasted.
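The maths behind additive blending is simple: the additive animation's delta from the reference pose is applied on top of the base pose. A per-channel sketch (a single float standing in for one component of a joint transform; the weight parameter is my addition for illustration):

```cpp
// Additive blending for one pose channel: apply the additive
// animation's offset from the reference pose on top of the base
// pose. With the firing animation, the recoil offset from the
// reference frame is layered on top of whatever is playing.
float BlendAdditive(float basePose, float additivePose,
                    float referencePose, float weight = 1.0f) {
    return basePose + weight * (additivePose - referencePose);
}
```

When the additive pose equals the reference, the delta is zero and the base pose passes through unchanged, which is why the first frame of the animation passed to SetReference makes a natural reference.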

 
 

Glass Shader

In the final days of the project, when the bugs had been squashed and the features polished, I decided to add some juice to the weapon's light bulb. I make use of my model effect shader system to assign and render the glass effect. First, the glass mesh is rendered and lit as a glossy, metal-like object. This is the base of the glass effect before refraction is taken into account. The HLSL code below is entirely original, as far as I know.

        // specular
        color.rgb = FullscreenTexture2.Sample(defaultSampler, input.myUV).rgb; // glass mesh PBR texture
        float3 normal = normalTexture2.Sample(defaultSampler, input.myUV).rgb; // glass mesh normal texture
        float4 pos = positionTexture2.Sample(defaultSampler, input.myUV).rgba; // glass mesh position texture
        float3 viewDir = normalize(pos.xyz - FBD_cameraPosition.xyz);
        float d = dot(normal, viewDir);
        color.a = (d + 1) * .5f; // higher alpha on edges
        float luminance = color.r * 0.3 + color.g * 0.59 + color.b * 0.11;
        color.a += luminance; // higher alpha on specular highlights
        color.a = clamp(color.a, 0., 1.); // better safe than sorry
        color.rgb *= color.a;
        // refraction
        float ior = PPBD_refractionIndex / 1000.; // divided to give the ImGui slider a nicer range
        float str = PPBD_refractionStrength / 100.; // divided to give the ImGui slider a nicer range
        float4 viewPos = mul(FBD_toCamera, pos);
        float4 viewNormal = mul(FBD_toCamera, float4(normal, 0));
        viewNormal.z = (-viewNormal.z / ior); // flipped on the xy plane in view space
        viewNormal = normalize(viewNormal);
        viewNormal *= str * (1 - dot(normalize(-viewPos), viewNormal)); // length to the sample point
        float4 refractionPos = viewPos + viewNormal; // new view position
        float4 pixelProjectionPosition = mul(FBD_toProjection, refractionPos);
        float2 refractionUV; // convert to texture uv coordinates
        refractionUV.x = ((pixelProjectionPosition.x / pixelProjectionPosition.w) / 2.f + .5f);
        refractionUV.y = ((-pixelProjectionPosition.y / pixelProjectionPosition.w) / 2.f + .5f);
        float3 refractionColor = FullscreenTexture.Sample(wrapSampler, refractionUV).rgb;
        color.rgb += refractionColor;
        color.a = 1; // alpha set to one because the blending has already been done
 
