About

My first Vulkan game engine. With it, I learned the fundamentals of physics simulation, input management, ImGui, and deferred rendering, along with many valuable lessons about general coding principles and conventional wisdom.

I implemented a Descent clone to test my understanding of quaternions, a bowling game to exercise real-time physics, and a surface-stable fractal dithering effect, but I have since moved on to other projects.

Editor

For this first engine I did what we all do when discovering ImGui: I tried to replicate every general game engine feature I could.

With ImGuizmo integration, users can move, scale, and rotate scene actors. I implemented a scene tree backed by an actor-component system: actors can carry physics properties, meshes, cameras, and custom components. I also wrote an input manager that lets users create named button and axis inputs and bind those names, through an event dispatcher, to functions in the code. The game and the editor each expose their own input manager for this purpose, and its state is saved and loaded with tinyxml, like everything else in the scene.
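The name-to-function binding described above can be sketched as a minimal event dispatcher. This is a sketch under assumed names; the class, methods, and input names below are illustrative, not the engine's actual API.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

// Minimal sketch of a name-based input dispatcher (illustrative API).
class InputDispatcher {
public:
    // Buttons dispatch 1.0f; axis inputs dispatch their current value.
    using Callback = std::function<void(float)>;

    // Bind a callback to a named input ("thrust_forward", "fire", ...).
    void bind(const std::string& name, Callback cb) {
        bindings_[name].push_back(std::move(cb));
    }

    // Called by the platform layer whenever the named input fires.
    void dispatch(const std::string& name, float value) const {
        auto it = bindings_.find(name);
        if (it == bindings_.end()) return;  // unbound inputs are ignored
        for (const auto& cb : it->second) cb(value);
    }

private:
    std::unordered_map<std::string, std::vector<Callback>> bindings_;
};
```

Because the bindings are keyed by plain strings, the name-to-input table is straightforward to serialize alongside the rest of the scene.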

One of the games I wanted to reimplement in this engine was Descent, in which you fly a spaceship through tunnels and shoot enemies. I quickly realized that world-space UVs, much like those I was used to in the Hammer editor, would be essential for level design. This required recalculating tangents for every surface, but the end result worked rather well.
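World-space UVs of this kind can be sketched as a planar projection along the normal's dominant axis, similar in spirit to Hammer-style world alignment. The types, function name, and scale factor here are assumptions for illustration, not the engine's actual code.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

// Project a world position onto the plane perpendicular to the normal's
// dominant axis, so coplanar surfaces share continuous, world-locked UVs.
inline Vec2 worldSpaceUV(const Vec3& pos, const Vec3& normal, float texelsPerUnit) {
    float ax = std::fabs(normal.x);
    float ay = std::fabs(normal.y);
    float az = std::fabs(normal.z);
    if (ax >= ay && ax >= az)       // wall facing +/-X: project onto YZ
        return {pos.y * texelsPerUnit, pos.z * texelsPerUnit};
    if (ay >= ax && ay >= az)       // wall facing +/-Y: project onto XZ
        return {pos.x * texelsPerUnit, pos.z * texelsPerUnit};
    return {pos.x * texelsPerUnit,  // floor/ceiling: project onto XY
            pos.y * texelsPerUnit};
}
```

Since these UVs no longer match the mesh's authored texture coordinates, the tangent basis has to be recomputed from them, which is the tangent recalculation mentioned above.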

Rendering

Still learning Vulkan at the time, this was my first use of a G-buffer for deferred rendering. I render the scene in several passes to retrieve albedo, normal, depth, and shadow buffers, then combine them after the fact. Unfortunately, I make little use of the many-light optimizations this setup enables: the G-buffer mainly lets me build a clean outline shader and SSAO, both of which a forward+ setup is arguably better suited for. I later added a parallax shader as well, but by then I already knew I wanted to start fresh with a new engine.
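The combine step can be sketched as the per-pixel math it performs: each G-buffer texel's albedo and normal are shaded by a directional light, with the shadow buffer attenuating the diffuse term. A CPU-side sketch under assumed names (the real work happens in a fragment shader):

```cpp
#include <algorithm>
#include <cassert>

struct Vec3 { float x, y, z; };

static float dot3(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// One pixel of the deferred combine pass: inputs come straight from the
// G-buffer (albedo, world-space normal) and the shadow buffer (0 = shadowed).
Vec3 combinePixel(const Vec3& albedo, const Vec3& normal, float shadow,
                  const Vec3& lightDir, float ambient) {
    float ndotl = std::max(0.0f, dot3(normal, lightDir));
    float lit = ambient + ndotl * shadow;  // diffuse term scaled by shadow
    return {albedo.x * lit, albedo.y * lit, albedo.z * lit};
}
```

The appeal of deferred shading is that this function runs once per screen pixel regardless of scene complexity, and looping it over many lights is where the setup would really pay off.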

As hinted above, the engine also uses shadow maps. The implementation is somewhat naive, like the Phong lighting used elsewhere, but it taught me plenty. Above all, these features were built so I could reimplement surface-stable fractal dithering as a material.
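A naive shadow-map lookup of this kind amounts to a single depth comparison against the light's depth buffer, with a small bias to avoid shadow acne. The buffer layout, function name, and bias value below are assumptions for illustration.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Naive shadow test: compare the fragment's light-space depth against the
// depth stored in the shadow map. Returns 0.0f when in shadow, 1.0f when lit.
float shadowFactor(const std::vector<float>& shadowMap, int width, int height,
                   float u, float v, float fragDepth, float bias = 0.005f) {
    int x = std::clamp(static_cast<int>(u * width), 0, width - 1);
    int y = std::clamp(static_cast<int>(v * height), 0, height - 1);
    float closest = shadowMap[static_cast<size_t>(y) * width + x];
    // The bias prevents a surface from shadowing itself ("shadow acne").
    return (fragDepth - bias > closest) ? 0.0f : 1.0f;
}
```

A single hard comparison like this is what makes the implementation naive; softer shadows would average several such tests (PCF) instead.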

Post Mortem

Dependencies

Keeping dependencies remote leaves the program vulnerable to upstream updates. Code that built and ran on both Windows and Linux months ago now struggles to boot on either. I now understand the urge to 'vendor' everything, or at least to pin dependencies to specific versions.
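For a CMake project, that pinning can look like fetching dependencies by exact tag rather than by branch. A minimal sketch; the dependency and tag shown are illustrative, not this engine's actual dependency list.

```cmake
include(FetchContent)

# Pin to an exact release tag instead of a moving branch, so an
# upstream update cannot silently change what the build pulls in.
FetchContent_Declare(
  glfw
  GIT_REPOSITORY https://github.com/glfw/glfw.git
  GIT_TAG        3.3.8  # exact tag, never a branch name
)
FetchContent_MakeAvailable(glfw)
```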

"Clean Code...

...Bad performance." Inheritance, protected variables, single responsibility, RAII: all feel neat to implement, but they haven't necessarily led me to better or more extensible code. The actor-component paradigm makes sense in a generalized engine like Unity or Unreal, but little sense elsewhere. The many layers of abstraction so often recommended usually end up leaking complexity.

I now take better care to profile my projects and let that, not SOLID principles, guide my architectural decisions. Above all, I found that treating SOLID principles as a bible can lead you to solve problems you haven't faced yet. You feel diligent, but you are really trapping your future self in a solution that may not fit the problems they actually face. I now prefer solving only the problems in front of me; if a solution later proves inapplicable to a larger task, at least there is less code to delete.