Category Archives: Devlogs

Satellites

Even with two shipped games under its belt, my entity-component system (ECS) is still one of the youngest features of my engine, and it should come as no surprise that I’ve discussed changes to it many times throughout the development of the Gunmetal Arcadia titles.

A few weeks back, just before I resumed my normal weekly blogging schedule, I tweeted about some ECS refactoring I was doing to better support dynamically modifying enemies and other entities at creation time. Then just last week, I made yet another change to my entity serialization, so I’ll be covering both of those today.

Entity creation from multiple definitions

Last August, I added a feature to my editor to allow me to split entity definition markup across multiple hierarchical templates, giving birth to a paradigm that I compared, perhaps inadvisably, to multiple inheritance.

multidefs

This solved a problem of having a large amount of copied and pasted markup across entities that were similar in some ways but which couldn’t be easily generalized to a single common ancestor.

This proved to be a huge help in shipping Gunmetal Arcadia Zero, but I also relied on the ability to add markup to specific instances of entities which had been placed in levels. I often used this to flag certain enemies as being counted towards challenge room locking and unlocking, or to spawn rewards when bosses died, or other situation-dependent events. It would not have been appropriate to create a new entity template for each of these since each one would only be instantiated once; keeping that markup associated with the instance that needed it was and is the most logical option.

instancedef

Unfortunately, the procedurally generated nature of Gunmetal Arcadia precludes the use of instance-specific markup. With a small number of exceptions, entities are not placed in the editor at all, but are instead spawned at runtime at known-safe locations given information about spawn “opportunities.” As a result, I was forced to find an alternative that would play nice with procedural generation.

My solution was to extend the concept of aggregating multiple data definition sources from the editor to the game. Historically, every entity has been spawned from a single definition file containing all the markup it would ever need. The notions of an entity hierarchy and instance-specific markup were strictly editor conceits; all this data would eventually get concatenated together into a single file for each and every entity.

As of a recent change, I can now specify an arbitrarily large list of data definition files from which to build an entity, rather than just one. This is conceptually similar to concatenating markup into a single file in the editor, but rather than iterating through markup that was assembled offline, I spin through all the markup in one file, then proceed to the next, and continue until all files have been exhausted. The end result is identical, provided the source is otherwise the same aside from being split across multiple files.

In this way, I’ve been able to solve the problem of marking up entities based on situation, with my first use case being to count the number of living enemies for locking and unlocking challenge rooms, as I mentioned briefly in Episode 40. I took the same instance-specific markup that I used in Gunmetal Arcadia Zero for flagging enemies as counted for challenge rooms, and I moved it to a new “mixin” entity template. Then I simply need to augment the list of entity definitions with this one when spawning enemies within challenge rooms, and they automatically get counted without the need to create challenge-room-specific variations of every single enemy type.
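
To make that concrete, here’s a rough sketch of what building an entity from an ordered list of definitions might look like at the call site. The function names and file paths below are hypothetical stand-ins rather than my engine’s actual interface:

// Hypothetical sketch: building an entity from an ordered list of
// definition files, appending a "mixin" definition when spawning inside
// a challenge room. Names, signatures, and paths are illustrative only.
#include <string>
#include <vector>

struct Entity;  // engine entity type (placeholder)

// Assumed factory entry point: iterates each file's markup in order,
// exactly as if the markup had been concatenated offline.
Entity* SpawnEntityFromDefs(const std::vector<std::string>& defFiles);

Entity* SpawnEnemy(const std::string& enemyDef, bool inChallengeRoom)
{
    std::vector<std::string> defs;
    defs.push_back(enemyDef);  // e.g. "entities/bat.xml"
    if (inChallengeRoom)
    {
        // The mixin carries only the challenge-room counting markup, so no
        // enemy-specific variant templates are needed.
        defs.push_back("entities/mixins/challenge_counted.xml");
    }
    return SpawnEntityFromDefs(defs);
}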

Since solving that first use case, I also found a use for this when working on spawning doors between the level hub and subrooms. Historically, doors have been assigned a GUID by virtue of being placed in the editor. Entities require a GUID to be serialized (saved and loaded to disk, but also saved to memory to be recalled during the same session), so if we want to preserve the state of a locked or unlocked door, we need one. Now that doors are spawned dynamically, they require dynamically assigned GUIDs. I could have edited the primary door template in the editor and added some markup to accomplish this, on the assumption that all doors in this game will be dynamically spawned and will all require a dynamically assigned GUID.

assignedguid

However, in light of this new tool, it feels more correct to split the markup for dynamically assigning a GUID out into its own “mixin” template that can be applied at runtime. In this way, I can preserve the way doors have worked in the past in case I ever need to rely on that behavior again. I like this too because it feels truer; there’s nothing intrinsically “dynamically spawned” about the concept of a door, so putting that markup in the door template feels kludgey. Splitting it out into its own thing that can augment a door, effectively turning it into a different kind of door, makes more sense.

Additional component serialization

Continuing with the development of doors, I ran into a couple of unusual problems (not pictured below) in which entering the same door multiple times could, depending on the circumstance,

  • Drop the player in front of a different door in the same room
  • Drop the player in the middle of the screen rather than a door
  • Crash the game

At least one of these turned out to be a fairly boring bug caused by failure to correctly account for the position of the door within a large scrolling room. But the more interesting issue, and the one I’m going to talk about here, dealt with some assumptions I’ve made in the past regarding data definition markup and serialized entity-component state.

GunArc 2016-07-24 22-18-45-652

In the past, I’ve typically assumed that the data definition markup I provide when constructing an entity represents its whole. Regardless of whether that markup is contained within a single file or split across multiple, it has always represented the complete, fully constructed state of the entity and all its components. Recently, I broke that assumption.

The situation was this: when placing usable doors on either side of a hub/subroom transition, each needs to know about the other in order to specify a destination. In the past, this has been handled by the editor. I mark one door as “Side A,” the other as “Side B,” and the editor automatically adds some markup to each letting it know the other exists and identifying itself to the other. But once again, I can’t rely on editor tools, as these are now spawned dynamically at runtime.

Now, I might not have the actual doors available to me in the editor, but what I do have, as I’ve mentioned before, are entities representing known-safe spawn points. I have a set of these for enemies, and I have another set for doors. And my solution to this particular problem was to add a new component to the door spawn points called a “door proxy.” This component is responsible for making the link between the two doors on either side of the transition; by virtue of the fact that it doesn’t get constructed until the level has been procedurally generated, it can trust that that information will be available. In fact, it is this proxy component that is responsible for spawning the door itself, which may be an invisible bounding box that prompts for the “up” input, or may be a visible locked door requiring a key to open.

So the proxy spawns the actual door, and it also maintains the information on how the two sides of the door are connected. But there’s a missing piece, and that’s how the door knows that it needs to utilize this information. I’ve accomplished that with the use of yet another new component, a “teleport helper.” This component exists on the door itself and looks for incoming information from a door proxy. If found, it does essentially exactly what the editor tool did in Zero: it assigns a unique name to this door, and it sets up information for teleporting to the other door, which is assumed to have been assigned a name in the same fashion and can therefore be located when necessary.
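
To make the relationship a little more concrete, here’s a rough sketch of the proxy/helper handoff. The structures and function names are hypothetical, not my actual ECS interface:

// Hypothetical sketch of the proxy/helper handoff. The proxy knows both
// sides of the transition once the level has been generated; the helper
// on the spawned door turns that into a name and a teleport destination,
// mirroring what the editor tool did for Zero.
#include <string>

struct Entity;

struct DoorProxyInfo
{
    std::string thisDoorName;   // e.g. "door_hub_3_A"
    std::string otherDoorName;  // e.g. "door_sub_3_B"
};

// Assumed helpers for attaching components at runtime.
void AddNameComponent(Entity* entity, const std::string& name);
void AddTeleportComponent(Entity* entity, const std::string& destinationName);

void TeleportHelper_OnSpawned(Entity* door, const DoorProxyInfo* proxyInfo)
{
    if (proxyInfo == nullptr)
        return;  // no proxy data; behave like an editor-placed door

    // Assign a unique name so the other side can find this door...
    AddNameComponent(door, proxyInfo->thisDoorName);

    // ...and point this door at the other side by name.
    AddTeleportComponent(door, proxyInfo->otherDoorName);
}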

That’s all well and good, and entering a door constructed in this manner works correctly. The trouble starts when we go back the other way. Rather than returning through the door we came from, we spawn in the middle of the room (default fallback behavior when no appropriate spawn point can be found), and if we enter it a second time, the game crashes.

The problem stems from what the teleport helper component does. It assigns a name to the entity (via a name component) and provides information about its destination (via a teleport component, which is different from a teleport helper component). That in itself is fine, except that the data definition markup for the door does not explicitly specify either of these components; they are created at runtime as they are needed.

I mentioned at the start of this latter wall of text that I’ve historically assumed the data definition markup represented the whole of an entity, and as I’ll explain, that meant specifically that no additional components were expected to be added after the entity was constructed from its definition.

My ECS code has no qualms about adding components post-factory-build. In fact, in both You Have to Win the Game and Super Win the Game, I frequently added components by hand in code, as the notion of a complete data definition for the player entity in particular simply didn’t exist at the time. So that wasn’t a problem. And my serialization code didn’t care if new components had been added when it went to save the entity to disk; it would spin through whatever components were present and write out all their data.

The problem arose when loading this serialized data back up, as in the case of returning to the first room. After spawning the door, the spawn point with the proxy component deletes itself, as it assumes the door is capable of sustaining itself after creation (and with the assigned GUID discussed in the first half of this post, it should be). But the way serialization works in my engine is, the entity first gets rebuilt from its data definition file(s), and then the serialized data (which is assumed to be a delta from this default state) gets applied a component at a time. And here’s the catch that’s taken me far too many paragraphs to reach: if a particular component is not specified in the data definition file(s), it will not be constructed; if it’s not constructed, it will not attempt to load previously serialized data.

Functional doors require a name and a teleport destination, but specify neither component in their data definition markup, relying instead on the teleport helper component to construct these. Therefore, when reloading a door that was previously serialized, it will be missing these components, and the serial data for them — which does exist — is simply ignored.

My solution was simple enough: when rebuilding an entity from serial data, I now spin through that data, constructing any missing components for which serial data exists. In this manner, the name component and teleport component for the door will be constructed when their previously saved data is found, despite not being explicitly specified in the data definition markup for a door.
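
The fix boils down to one extra check in the loop over serialized component blocks. A simplified sketch, with assumed names standing in for my actual serialization interface:

// Sketch of the serialization fix: when loading an entity, walk the
// serialized per-component blocks and construct any component the data
// definition didn't already create, so its saved state isn't dropped.
// Types and function names are assumptions, not the engine's real API.
#include <string>
#include <vector>

struct Entity;
struct ComponentBlock { std::string componentType; /* ...serialized payload... */ };

bool HasComponent(Entity* entity, const std::string& type);
void ConstructComponent(Entity* entity, const std::string& type);
void LoadComponentState(Entity* entity, const ComponentBlock& block);

void ApplySerializedState(Entity* entity, const std::vector<ComponentBlock>& blocks)
{
    for (const ComponentBlock& block : blocks)
    {
        // Previously, a block with no matching component was silently
        // skipped; now the missing component gets built first, then
        // loads its previously saved state as usual.
        if (!HasComponent(entity, block.componentType))
            ConstructComponent(entity, block.componentType);

        LoadComponentState(entity, block);
    }
}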

In truth, there was a much simpler fix for this issue, which would have been to stub out placeholder data definition markup for these two components to ensure their creation. This would have been all of two XML tags:

<name />
<teleport />

With that, the entity factory would know to construct those components. It wouldn’t set them up with any defaults, leaving that to the teleport helper, but it would guarantee they existed prior to loading serial data.

So why didn’t I do that? Mostly it came down to the fact that my code has always allowed for components to be added after factory construction, and even though I almost never do that, and a workaround did exist in this case, it felt better to continue to support that path.

This does raise an interesting question though. If I support adding new components at runtime, should I also support removing them? Historically, I’ve never removed components from an entity, ever, and I can’t even imagine what sort of use case would necessitate that behavior, but there is a sort of irritating asymmetry in supporting one but not the other.

Oh well.

Occluders

The hardware limitations of the NES necessitated certain aspects of its games’ visual aesthetics. Sprites and backgrounds were both composed of 8×8 tiles; in action platformers, these were nearly universally grouped into 16×16 blocks. In conjunction with the console’s limited color palette, this led to a recognizable uniformity across its library. Additionally, certain artistic conventions arose throughout the NES’s life cycle, among them the notion of placing drop shadows beneath platforms in action games to provide clarity of spatial positioning and a greater sense of depth.

dropshadows2

dropshadows3   dropshadows4   dropshadows1

As I’ve noted before, Faxanadu was one of my primary points of reference in developing the look of Gunmetal Arcadia, and drop shadows were an important part of this. The exact nature of these shadows varies from tile set to tile set, but going back to the earliest implementation of the catacombs set, shadows have been an integral part of the look.

dropshadows5

In Gunmetal Arcadia Zero, these shadows were all placed by hand. For the roguelike Gunmetal Arcadia, that is not an option. The quarter-scale prefabs that compose the world don’t (and shouldn’t) have knowledge of their neighbors, so drawing shadows that extend correctly across prefab boundaries cannot be done reliably without introducing severe restrictions to how prefabs can be drawn.

My solution has been to procedurally generate background tiles based on the collision type (solid, walkable, ladder, etc.) of adjacent tiles. After generating a level using the “card-laying” algorithm I discussed previously, I go back through each room and, on a tile-by-tile basis, see whether I need to substitute in a replacement.

In my editor, tiles that need this substitution are implemented as “custom tiles,” which is a sort of catch-all feature that allows me to associate some XML metadata with a tile to provide clues to the game as to how it should function. Currently these contain only stubs flagging them as walkable background tiles or as solid collidable walls and floors, but in the future, these will likely be extended to include specific replacement rules.

placeholders

In code, I look at the collision type of adjacent tiles in certain positions relative to the current one, and based on these, I choose substitutions. Since shadows drop down and to the right, I have to search up and to the left for occluding tiles. The exact substitution I choose depends on which tiles are occluders and how far they are from the current tile.

shadow_region
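
In rough pseudocode terms, the per-tile pass looks something like this. The collision queries and substitution table are stand-ins, and the real rules also account for how far away the occluder is, which this simplified version ignores:

// Simplified sketch of the shadow-substitution pass: since shadows fall
// down and to the right, each background tile looks up and to the left
// for solid occluders and picks a replacement tile accordingly. Only
// immediate neighbors are checked here; the real rules also consider
// distance. Helper functions are illustrative only.
enum class ShadowShape { None, FromAbove, FromLeft, FromCorner };

bool IsSolidAt(int x, int y);                          // collision lookup (assumed)
int  PickShadowTile(int baseTile, ShadowShape shape);  // substitution table (assumed)

int SubstituteBackgroundTile(int baseTile, int x, int y)
{
    const bool occludedAbove = IsSolidAt(x, y - 1);
    const bool occludedLeft  = IsSolidAt(x - 1, y);
    const bool occludedDiag  = IsSolidAt(x - 1, y - 1);

    ShadowShape shape = ShadowShape::None;
    if (occludedAbove && occludedLeft) shape = ShadowShape::FromCorner;
    else if (occludedAbove)            shape = ShadowShape::FromAbove;
    else if (occludedLeft)             shape = ShadowShape::FromLeft;
    else if (occludedDiag)             shape = ShadowShape::FromCorner;

    return PickShadowTile(baseTile, shape);
}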

While prototyping this scheme for background tiles, it became apparent it could be useful to apply a similar approach to solid tiles as well. I’ve typically assembled patterns like these overlapping bricks from smaller sets of general purpose tiles, as shown below. Just like the shadows, it would be impossible to match these up perfectly at the edges of prefabs if I were to draw these by hand, so what I’ve done instead is to choose substitutes in this overlapping brick pattern based on location, and to apply the appropriate edges when adjacent tiles are non-solid, so bricks don’t get cut off halfway.

bricks

These prototypes are strictly tied to this particular tile set, but understanding how to solve these problems for one set will put me in a good place to understand how to generalize these sorts of rules to any set. In particular, I’m thinking about the tiles for the third and fifth levels of Gunmetal Arcadia Zero and how they have a top layer of grass. These rules will undoubtedly be similar to the ones I’m using for detecting edges of bricks in the catacombs set, but I expect there will be some additional edge cases to consider based on my experiences with hand-painting these tiles.


I talked a little bit in Tuesday’s video about some input remapping changes I’ve been making recently in order to better support a wider range of gamepads. This feature grew out of a two-hour development stream I did last week and consumed the majority of my week, but it also clued me in to some existing problems in my input system that I otherwise wouldn’t have found.

This solves a problem that I’ve known about for years but had never attempted to address before. The problem is, there are no consistent standards or conventions for the assignment of buttons and axes on DirectInput devices. I’ve historically followed Logitech’s conventions, as their gamepads seem to be among the most popular non-XInput devices, but not all devices adhere to these same rules. In particular, I’ve had trouble with NES- and Super NES-like gamepads, as well as DualShock 2 gamepads running through USB adapters. The face buttons are often assigned differently; for instance, the bottom face button, the one that would be labeled “A” on an Xbox controller or “X” on a PlayStation one, will be identified as “Button 2” on some devices and “Button 3” on others.

This creates two problems. The first is in assigning default control bindings. Let’s say I assign Button 2 to the “Jump” control. This will work correctly on Logitech devices and some others, requiring the player to press the bottom face button to jump, but on other devices, “Button 2” might be the right face button (B or Circle by Xbox or PlayStation standards), so the player would have to press that button to jump by default. Of course, I support arbitrary control bindings, so this can be corrected, but that’s only part of the issue.

The second problem is that the button glyphs displayed by the game are coupled with the input (Button 2, Button 3, etc.), so even if you rebind the correct button in terms of physical placement, the game might display the wrong glyph for that button. (For instance, if the player were using a device for which the bottom face button were Button 3, the game would show “B” or “Circle” as the glyph for that button, when it would be appropriate to show “A” or “X.”)

This problem is easiest to understand in terms of face buttons, but it extends to axes and POV hats, as well. The naming of “POV hats” dates back to their origins on flight sticks, but nearly every gamepad I’ve ever tested reports the d-pad as a POV hat, so that is how they are most commonly used today. Axis naming is even more bizarre; DirectInput supports 24 axes, named for X, Y, and Z; position, velocity, acceleration, and force; translation and rotation. Typically the left analog stick of a gamepad will be mapped to XY/position/translation, but the right analog stick and sometimes triggers (if reported as axes rather than buttons) are less standardized.

The DualShock 4 is my favorite controller of the current generation and I want to support it as well as possible, but it brings a number of additional concerns to the table. Chief among these is that its triggers are reported as both buttons and axes. This creates problems when awaiting user input to bind a control. My control binding code watches for changes to any input on any device; when it sees a change, it assigns that input to the specified control. Because the triggers activate two inputs at once, it will only catch the first one it sees, which will often, but not always, be the axis rather than the button, because it checks for the axis first. However, if the trigger is only slightly depressed, such that it is still within the dead zone, the axis will be ignored and the button will be selected instead. My solution for this, against my better judgement, was to introduce a device-specific hack to simply mute the button inputs associated with the DS4 triggers when that device is used (as detected by vendor and product ID).
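
The hack itself is about as blunt as it sounds; something along these lines, with the commonly published Sony IDs and placeholder button indices standing in for the exact values:

// Sketch of the device-specific workaround: if the connected device
// reports the DualShock 4's vendor/product IDs, ignore the button inputs
// that mirror the trigger axes so that binding-by-input doesn't latch
// onto the wrong one. The IDs are the commonly published Sony values;
// the button indices are placeholders.
const unsigned short kSonyVendorId = 0x054C;
const unsigned short kDS4ProductId = 0x05C4;  // original model; later revisions report different IDs

bool IsDualShock4(unsigned short vendorId, unsigned short productId)
{
    return vendorId == kSonyVendorId && productId == kDS4ProductId;
}

bool ShouldIgnoreButton(unsigned short vendorId, unsigned short productId, int buttonIndex)
{
    // Placeholder indices for the L2/R2 "buttons" that duplicate the trigger axes.
    const int kDS4TriggerButtonL2 = 6;
    const int kDS4TriggerButtonR2 = 7;

    return IsDualShock4(vendorId, productId) &&
           (buttonIndex == kDS4TriggerButtonL2 || buttonIndex == kDS4TriggerButtonR2);
}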

Additionally, the DualShock 4’s trigger axes rest at the minimum position (-32768) rather than the conventional center (zero). On an ordinary axis, I can watch for deviations from zero in either direction to see whether the input is being pushed, and in which direction. That code fails for these triggers, because changes across the zero boundary will happen both as they’re being pressed and released.

It’s conceivable that my code is simply naive and should account for an arbitrary resting position. However, I would argue that axes are intended to rest at zero and that this is an unusual outlier, as evidenced by the fact that DirectInput has no mechanism for reporting axes’ resting values, and in fact, this exact issue led Microsoft to combine the Xbox 360 controller’s triggers into a single axis that does rest at zero (with the unfortunate and surely intended side effect that the triggers cannot be reliably polled with DirectInput at all, and XInput must be used instead).

My initial solution was to cache off the first values reported by polling the device immediately after it was initialized or selected within the game menu. This value (clamped to the nearest of zero, -32768, or 32767) would be treated as the resting value. This would introduce problems of its own when using flight sticks with dials that don’t automatically center themselves, but until I make a flight sim, I think that’s a risk I can take.
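
The clamping step itself is trivial. A sketch of it, along with how the cached resting value would be used afterward:

// Sketch of snapping the first polled axis value to the nearest of the
// three plausible resting positions: centered, or pinned at either end
// of the signed 16-bit range.
#include <cstdlib>

int SnapToRestingValue(int firstPolledValue)
{
    const int candidates[] = { 0, -32768, 32767 };

    int best = candidates[0];
    for (int candidate : candidates)
    {
        if (std::abs(firstPolledValue - candidate) < std::abs(firstPolledValue - best))
            best = candidate;
    }
    return best;
}

// Deviation from the cached resting value, rather than from zero, is
// what indicates the axis is actually being pushed.
bool IsAxisPushed(int value, int restingValue, int deadZone)
{
    return std::abs(value - restingValue) > deadZone;
}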

However, that plan fell apart on account of getting unpredictable results back when polling a device immediately after it was acquired by DirectInput. Sometimes it would return the expected values; other times it would return a completely zeroed-out state. As this state could conceivably be a valid result for some devices, I couldn’t assume it had failed and needed to be retried on the next tick, and with no error or warning result I could catch, this plan seemed to have hit a wall. So, I added another hardcoded hack, forcing the resting values of these axes to the expected values when the DualShock 4 vendor and product IDs were seen.

These sorts of major changes to established systems always make me a little skittish, but so far the results have been promising. I have a little more testing to do on Mac and Linux builds, but both my DirectX and OpenGL/SDL Windows builds are behaving correctly for all my dozen or so test devices, so that’s reassuring.

Echolocation

Hello again!

Five weeks ago, I paused this blog and the accompanying video series on account of not being able to openly discuss the work I was doing. I’ve wrapped up those tasks, and after spending a little while on a demo build of Gunmetal Arcadia Zero for events, I’m back to full-time development on the roguelike Gunmetal Arcadia. (In fact, if memory serves, this is the first time since last April that I’m not splitting my time among multiple projects, so that’s cool.) I’ve covered a decent amount of ground in the last few weeks, so let’s get going.

That console thing

The console port work I’ve been awkwardly not talking about was investigative in nature. My intent was not to have a shippable game at this time, but to prove that it would be feasible to port my engine to a home console at some point in the future. As of the time I wrapped development on this task, I had a functional version of Gunmetal Arcadia Zero running on the console, with rendering, audio, input and more all working correctly. This implementation has a number of known memory leaks, particularly in the new rendering code, but nothing that couldn’t be solved in a day or two. That’s not to say this build was anywhere near cert-compliant, and I would estimate another month or two of work to bring it from this first playable state to a shipping build, but it’s definitely within the realm of possibility, and I consider this investigation a success.

Recent work

I’ve been making some incremental improvements to the procedural level generation scheme I documented back in May. The levels it builds are now somewhat more interesting and plausible as gameplay spaces and offer more room for tweaking and tuning as I continue making progress. Fundamentally the concept is the same: construct a path through a region of a predetermined size and then choose prefabs that meet the appropriate conditions; the difference is in how I construct that path. I now differentiate between one-way and two-way vertical traversals, an important concept in a 2D platformer, as it turns out. It’s also more tolerant of empty spaces, resulting in fewer narrow, twisty paths. I expect it will still require a great deal more work before shipping, but it’s getting better all the time.

Besides constructing the physical space of levels, I’ve been doing some work on specifying the intent of individual rooms within a level, which in turn will notify other systems of how they are to operate. For instance, rooms can be flagged as generic combat spaces, treasure rooms, shops, and so forth. This affects the types of entities that can be spawned in those rooms, so for instance, enemies can spawn in combat spaces but not in shops.

I’ve also been stubbing out a framework to support one of the tentpole features of Gunmetal Arcadia, the notion that each session affects the next in clear ways. This has involved tearing apart the saved game code from Zero, which is fine; Zero exists in a separate source control branch so its code is safe, and the saved game code I wrote for it was never intended to be general-purpose enough to encompass my needs for the roguelike game as well.

Oh yeah, I finally started writing some new music for the first time since finalizing the Zero soundtrack in early March. It’s not much yet, but with some better instrumentation, this could be something.

A new schedule

I’ll have a new video up on Wednesday as per my typical schedule, but starting next week, I’m shaking things up a bit. Videos will go up on Tuesday, with the written blog following on Thursday. This should fit my schedule a little better and hopefully eliminate some of the awkwardness of filming the video before the blog and having to guess at what I’ll be covering in each.

Speaking of schedule, it’s probably worth mentioning that my pace has necessarily slowed a bit. I can’t really work evenings and weekends anymore, so I’m on a rigid 40-hour work week now.

That being said, I still feel like I can believably hit a Q1 2017 launch window. I had originally hoped to have both of the Gunmetal Arcadia games out by the end of 2016, but launching in the holiday season against the AAA heavy hitters feels optimistic at best and willfully irresponsible at worst. Pushing it back to early 2017 makes more sense and still falls within my budget.

Something new

If you follow me on Twitter, you may have seen me talking about “the next game after Gunmetal Arcadia.” This project, codenamed “Banon” and tentatively titled Oaks, is the “other idea” I mentioned a couple months back. I’ve poked around at some concepts for rendering tech for that game over the last few weeks, but I don’t intend to start doing any serious development on that one until Gunmetal Arcadia is done. My time is too limited at the moment to split among multiple projects. But it’s an exciting concept, and I’m taking lots of notes and assembling reference materials, and some of that will undoubtedly spill over onto Twitter.

I don’t have any more events this summer, at least until October and Retropalooza, so hopefully the next few months will be a return to the sort of productive schedule I’ve maintained in the past, or at least a reasonable facsimile thereof in the more limited time I have.

Absentia

As I mentioned in last Wednesday’s video, I’m putting this blog (and the video series, and my Patreon) on hold temporarily while I pursue console port work. Given the confidential nature of that work, trying to fill two weekly slots with what would effectively be vague filler information for the foreseeable future sounded exhausting.

That being said, the port is coming along well; I have audio functioning fully (for my relatively simple use cases) and have moved on to rendering. I expect that will take me a while longer, as I’m having to learn an unfamiliar API and retrofit my existing engine code and shader code to work with it, but fortunately, I’ve been able to leverage some of the work I did a few years back to automatically generate custom, portable effects-like data from my Direct3D 9 Effect (.fx) code. I think if I were to start from scratch and write an all-new engine from the ground up today, I’d probably start by defining my own effect system, writing my shader code in pure HLSL, and making it more portable to any number of platforms and APIs. But this engine having grown the way it has, I’m tied to D3D9 and D3DX, and, you know, that’s probably fine. I don’t expect those to be made totally obsolete any time soon, but if they are…well, I have options.


Once more, in case I don’t post another update before then, here’s where I’ll be in the coming weeks:

Let’s Play Gaming Expo
June 18-19, 2016
Plano Centre
Plano, TX

RTX 2016
July 1-3, 2016
Austin Convention Center
Austin, TX

Retropalooza IV
October 1-2, 2016
Arlington Convention Center
Arlington, TX

Radial

So, this week’s blog and video are gonna be a little weird. I spent most of the week doing console port work, and the nature of that sort of work is that it’s confidential, which necessarily puts it at odds with my usual transparency and makes it difficult to blog about.

However, since the work closely mirrors the efforts I’ve made in the past to port my engine to Mac and Linux, I should be able to write in general terms about my porting process without divulging any of the specifics of what I’m doing. Bear in mind too that this is investigative work, and I’m not announcing or promising ports of any game to any platform at this time.


My first step when approaching any new platform is to get a minimal test program running on it, a “Hello World” sort of thing that proves I can compile and execute code on the device. Most of the sample code I have access to is a little on the bloated side compared to what I was looking for. Imagine an application that renders a 3D, spinning “Hello World” with bloom and antialiasing when all you want to do is printf("Hello World"); that’s sort of the scenario I was in. Those samples did serve the purpose of verifying that the tool chain was working correctly, but they would have been impractical as a base on which to do my own development. Eventually I managed to pare down a small sample to what I felt was a more representative minimum, and from there, I began introducing my own engine code.

This is where things get a little trickier. I’m suddenly going from a few lines of “Hello World” code to tens of thousands of lines of code written over nearly a decade. It’s not going to just work. But having gone through the Mac and Linux ports already, I had a better understanding of where to start and what to expect.

The first goal is to get the code to compile by any means necessary. Once it compiles, the second goal is to get it to run without crashing. Once it’s running, the third goal is to bring subsystems (graphics, audio, input, file handling, etc.) online one at a time until it’s playable.

So I start by making it compile. Normally this means stripping out lots of code that the compiler doesn’t like. Often this is platform-specific or API-specific code that will need to be rewritten for the new target. Depending on the scenario, it may be best to separate code into #ifdef blocks for each platform. (I’m currently working in a branch and simply commenting out or deleting code with the intent to deal with merging these changes back into the trunk at some later time if this port proves successful.) Most of the time, this code is in the subsystems mentioned above, and in removing it, I’m effectively removing the ability to draw anything, play any sounds, recognize any input, and so on — everything that makes a game interactive. These will have to be rebuilt in the future.
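
For reference, the #ifdef approach would look like the usual thing; the platform defines below are made up for illustration:

// Illustrative only: keeping platform-specific code behind per-platform
// defines rather than deleting it outright. The PLATFORM_* macros here
// are stand-ins, not my actual build configuration.
void CreateGraphicsDevice()
{
#if defined(PLATFORM_WINDOWS)
    // Direct3D 9 / OpenGL initialization lives here.
#elif defined(PLATFORM_NEW_CONSOLE)
    // The new platform's graphics API initialization goes here; stubbed
    // out for now so the rest of the engine can come up.
#else
    // Unknown platform; nothing to initialize in this sketch.
#endif
}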

There’s usually some amount of futzing with project settings before everything will compile nicely. This varies with platform, compiler, and IDE, and in my experience has been a huge wildcard. This new platform hasn’t given me too much trouble in this regard, but I’m also not entirely convinced I have all my projects set up correctly despite being able to compile and execute code. It’s something I’ll have to keep an eye on as I move forward.

So, once I have a build compiling and running, the next step is to see where it crashes and fix it. Historically, file handling issues are often among the first culprits, for instance because a path is incorrect and a content package file can’t be opened for read or a config file opened for write. Indeed, that was one of the first problems I encountered this time as well, and although I’m not certain my solution is shippable (in fact, I’m nearly certain it’s not), I did manage to get some initial file reads and writes working as expected.

The next initialization crash happened while trying to construct renderable assets. Since I stripped out all my Direct3D 9 and OpenGL code, my renderer is currently returning NULL pointers in response to asset creation requests, and dereferencing these is causing a crash. So the next step, and the one I’m currently in the middle of, is to stub out a good-enough renderer implementation to at least get a majority of cases handled. Having written side-by-side implementations of D3D9 and GL, I’m in a pretty good place to know what to expect, but nevertheless, this new platform’s API is one I haven’t used before, and there will be a bit of a learning curve as I wrap my head around how this API wants to work and reconcile that with how my code has historically wanted to work.


So that’s where I am now, knee deep in renderer refactoring, with audio, input, and more looming over the horizon. All things considered, this has been a fairly smooth experience so far, compared both to my previous porting experiences on Mac and Linux and to my previous console development experiences in AAA. A surprising number of things have just worked when I expected to encounter roadblocks. Hopefully that trend will continue as I venture further into this port.

Bridge

I’ve been plugging away at procedural level generation in Gunmetal Arcadia this week, slowly getting more interesting results on the screen. Last week, I started by copying and pasting whole screens’ worth of tiles around, following a hardcoded layout in which individual rooms that met the conditions could be selected and swapped in. I then reduced the scale of the prefabs to a quarter of the screen and assembled them into larger scrolling regions.

This week, I’ve been working on generating a layout for the internal structure of each scrolling region and for the structure of the level as a whole. I’ve decoupled these tasks such that I can tinker with each independent of the other. Today I’ll be looking at the internals of each room, and I’ll cover level structure in Wednesday’s video.

When thinking about how to construct a room from quarter-scale prefabs, I started with a few assumptions. I wanted to find a solution that would facilitate prefabs larger than a quarter of a room (to allow for distinct, recognizable set pieces and landmarks) and could handle these without needing to know in advance whether they would exist or what size they might be. The solution I’ve developed accounts for this and should also give me some room to grow and adapt it as my needs shift throughout this project.

board3
Cards representing prefabs and a board representing the level layout we want to construct

I like to think of this solution as laying cards on a board. Given a number of types of cards (prefabs) of various sizes (and theoretically different shapes), each marked with open or closed paths to adjacent cells, we want to pick cards at random that adhere to some predetermined path that runs throughout the board (scrolling region). For the purposes of this algorithm, I assume I have an infinite number of each type of card, but it would be easy to limit these as well, e.g., to prevent large set pieces from being seen more than once per level.

board1
A board of a predetermined size with no layout information

The first step in this process is to determine the path through the board. This must exist before we can start picking cards that align with it. I create this path by choosing a cell on the board at random and then proceeding to perform a random walk to unvisited adjacent cells, carving a path as I go. If no adjacent cell is available, I step back until one is and branch in that direction. I repeat this process until the entire board is connected. This creates a connected graph with no loops, which could also be represented as a tree, with any cell on the board arbitrarily chosen as the root.

board2
The board after doing a random walk to create non-looping paths
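
For the curious, the random walk itself is nothing exotic. A minimal sketch, with assumed board types, might look like this; stepping back when no unvisited neighbor is available happens implicitly as the recursion unwinds:

// Minimal sketch of the random-walk carving step: visit cells in a
// random depth-first order, opening a path between each newly visited
// cell and the cell we came from. This yields a fully connected,
// loop-free board. The Board type and OpenPath are assumptions.
#include <algorithm>
#include <iterator>
#include <random>
#include <utility>
#include <vector>

struct Board
{
    int width = 0, height = 0;
    std::vector<char> visited;  // width * height entries (0 or 1)

    bool IsVisited(int x, int y) const { return visited[y * width + x] != 0; }
    void MarkVisited(int x, int y)     { visited[y * width + x] = 1; }
    bool InBounds(int x, int y) const  { return x >= 0 && x < width && y >= 0 && y < height; }
    void OpenPath(int x0, int y0, int x1, int y1);  // mark the edge between two cells open (assumed)
};

void CarvePaths(Board& board, int x, int y, std::mt19937& rng)
{
    board.MarkVisited(x, y);

    // Try the four neighbor directions in a random order; backtracking
    // happens as the recursion unwinds.
    std::pair<int, int> dirs[4] = { { 1, 0 }, { -1, 0 }, { 0, 1 }, { 0, -1 } };
    std::shuffle(std::begin(dirs), std::end(dirs), rng);

    for (const auto& d : dirs)
    {
        const int nx = x + d.first;
        const int ny = y + d.second;
        if (board.InBounds(nx, ny) && !board.IsVisited(nx, ny))
        {
            board.OpenPath(x, y, nx, ny);
            CarvePaths(board, nx, ny, rng);
        }
    }
}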

At the scale I’m working at, these paths tend to be fairly nondescript, but this could conceivably scale up to larger boards and more intricate paths. In the future, I’ll likely also look into introducing loops, tagging paths as one-way (especially for vertical drops), and anything else that could facilitate more interesting gameplay.

Once the path exists, the next step is to pick cards that fit it. (To reiterate, “cards” is a synonym for “prefabs” in this context.) I don’t want to give preference to any particular cards, large or small; I want each card to have an equal chance of being selected.

For each cell on the board, I create a list of every card that could possibly fit that cell, given its paths to adjacent cells. For cards that are larger than a single cell, each part of the card must match the board cell it would overlay, else the card as a whole is rejected. At this point, I can verify that at least one solution exists, by virtue of the fact that every cell has at least one corresponding card.

board4
A potential match for a card

Once I’ve compiled the lists of applicable cards for each cell, I begin playing cards on the board at random. I select a random cell on the board, choose a random card from its list of options, and place it. (If that card is larger than one cell, I remove any cells that it overlaps from further solving.) I continue this process until cards have been placed over every cell on the board. The room is then fully constructed.

board5
A potential match for a larger card spanning multiple cells
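
Here’s a sketch of that card-laying loop, again with assumed types standing in for my actual data structures. One detail glossed over here is that a chosen card would also have to be rejected if its footprint overlapped a cell that had already been covered:

// Sketch of the card-laying loop: every cell starts with a precomputed
// list of cards that fit its paths; repeatedly pick an unsolved cell at
// random, play one of its candidate cards, and mark every cell the card
// covers (including the anchor cell) as solved. Types and helpers are
// assumptions, not my actual implementation.
#include <algorithm>
#include <random>
#include <vector>

struct Card;  // a prefab plus its footprint

struct Cell
{
    bool solved = false;
    std::vector<const Card*> candidates;  // cards that fit this cell's paths
};

std::vector<int> CellsCoveredBy(const Card& card, int cellIndex);  // footprint query (assumed)
void PlaceCard(const Card& card, int cellIndex);                   // copy tiles/entities in (assumed)

void LayCards(std::vector<Cell>& cells, std::mt19937& rng)
{
    std::vector<int> open;
    for (int i = 0; i < (int)cells.size(); ++i)
        open.push_back(i);

    while (!open.empty())
    {
        // Pick a random unsolved cell and a random card from its list.
        // (Every cell is guaranteed at least one candidate; that was
        // verified when the lists were built.)
        int cellIndex = open[std::uniform_int_distribution<int>(0, (int)open.size() - 1)(rng)];
        Cell& cell = cells[cellIndex];
        const Card* card = cell.candidates[
            std::uniform_int_distribution<int>(0, (int)cell.candidates.size() - 1)(rng)];

        PlaceCard(*card, cellIndex);

        // Remove every cell this card covers from further solving.
        for (int covered : CellsCoveredBy(*card, cellIndex))
            cells[covered].solved = true;

        open.erase(std::remove_if(open.begin(), open.end(),
                                  [&](int i) { return cells[i].solved; }),
                   open.end());
    }
}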

Each prefab may also have entities placed within it, and these are concatenated and assigned correct positions within the room as part of the card-laying process. At this point, the fully assembled room can be treated exactly as if it had been built by hand in my editor.


Thanks to the support of my generous Patreon contributors, I’ve begun a regular development livestream. You can find the archives of last week’s stream below.

Since that stream, I’ve wrapped up the work on CRT presets, and the feature is mostly ready to ship. Some of the code could still use some cleaning up, but I’m pretty happy with the implementation otherwise.

I’ve been debating voluntarily increasing the frequency of these streams (and updating the Patreon incentive tiers accordingly) simply because one stream a month is too infrequent to really make a part of my normal schedule. It’s going to feel like an oddity every time; I’m not really going to be able to get comfortable with the format at that pace. (The flip side of that is that I’ll risk running out of good, discrete, streamable tasks too quickly, but that’s something I’ll have to deal with in any case.)


As a reminder, I will be making the rounds at a number of events this summer and beyond:

Let’s Play Gaming Expo
June 18-19, 2016
Plano Centre
Plano, TX

RTX 2016
July 1-3, 2016
Austin Convention Center
Austin, TX

Retropalooza IV
October 1-2, 2016
Arlington Convention Center
Arlington, TX

I’ve applied for space with the Indie Megabooth at PAX Prime/West, but those slots are limited and in high demand, and it’s too early to say whether I’ll be there. I’ll also be looking at demoing one or both of the Gunmetals Arcadia at PAX South 2017 early next year, so stay tuned on that one.

Shuffle

Last week was my first week of full-time development back on Gunmetal Arcadia following Zero‘s launch. It was…slow. I’m still kind of in that “I just shipped a game and I want to turn off my brain and play video games for a change” mode. But I did get some stuff done, so let’s take a look at that.

I’ve been stewing over random level generation for some time. It was an unsolved problem as of last August when I introduced Zero to solve my ongoing production crisis, and I’ve only just recently begun to give serious thought to how it should work. Fortunately, inspiration struck early Tuesday morning, just when I was planning to start implementing some of those systems. I realized I could partially decouple the level data exported from my editor from the level data the player traverses such that I could still sample into the prefab data at will to build playable rooms on the fly. It’s hard to express exactly what was revelatory about this; fundamentally I’m still building prefabs in the editor and pasting them around in the game to construct full levels, but there was a subtle way of thinking about it that hadn’t quite clicked until then. Once I got over that hump, I’ve been able to actually start prototyping a very rough form of randomly generated levels.

So this week has mostly been spent getting my ducks in a row to deal with randomly generated levels in Gunmetal Arcadia. My first order of business was to load level data from the editor and paste rooms together in random ways while respecting the shape of the level which had been hardcoded in advance. (This is what I captured in the preceding video.) This quickly revealed a big problem I hadn’t predicted, which was how to deal with GUIDs of hand-placed entities in prefab rooms.

As I’ve mentioned before, I tag all my entities with a GUID. For all entities placed by hand in the editor, this GUID has historically been assigned by the editor. But now I’m entering a world where the data authored in the editor is only a template for what may appear in the game, and I can no longer trust this editor-assigned GUID to be unique. If the same prefab appears twice at runtime, and if each prefab spawns a copy of the same hand-placed entity, and if each of those entities shares the same GUID, then the game can’t distinguish between them. In practice, this issue manifested itself as a bug in which killing an entity in one room would have the side effect of killing its pair in another room built from the same prefab. Clearly I needed a new approach here.

My current solution is to assign GUIDs at runtime. I’ve done this in the past for dynamically-spawned entities, but that’s a subtly different case when it comes to entity serialization. (In particular, entity death must be serialized for hand-placed entities, while dynamically spawned entities can simply be forgotten altogether when they die.) In this case, I want to treat these things as if they had been assigned a GUID by the editor, but also ensure that their GUIDs are in fact unique even when the same prefabs appear multiple times.

To this end, I no longer assign GUIDs through the editor. GUIDs are strictly assigned by the game at runtime, and to ensure their uniqueness, they’re constructed bit by bit using data that will be unique to the context in which they appear.

I use 32-bit unsigned integers for my GUIDs. The high bit is reserved to indicate dynamically spawned entities. This leaves 31 bits to work with. I convert the coordinates of the room to signed bytes (effectively limiting the coordinates of levels to [-128, 127]) and write these into the GUID along with an unsigned byte representing the index of the entity as exported from the editor. In 24 bits, this gives me a value which is guaranteed to be unique and is valid for use as a GUID. With this substitution in place, I can reuse prefabs as many times as I like, and the game will view the entities within each as being totally unique, separate instances of the same template.
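
Concretely, the packing looks roughly like this. The exact ordering of the bytes within the GUID is arbitrary as long as it stays consistent, so treat the layout below as illustrative:

// Sketch of packing a runtime GUID from context: the high bit is
// reserved for dynamically spawned entities, and the low 24 bits hold
// the room's X and Y coordinates (as signed bytes) plus the entity's
// export index (as an unsigned byte). Byte ordering here is illustrative.
#include <cstdint>

uint32_t MakeRoomEntityGuid(int8_t roomX, int8_t roomY, uint8_t entityIndex)
{
    const uint32_t x = (uint8_t)roomX;  // reinterpret the signed byte as raw bits
    const uint32_t y = (uint8_t)roomY;

    return (x << 16) | (y << 8) | entityIndex;  // high bit (0x80000000) stays clear
}

uint32_t MakeDynamicGuid(uint32_t serialCounter)
{
    return 0x80000000u | serialCounter;  // dynamically spawned entities set the high bit
}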


Since recording that video, I’ve begun toying with the idea of decreasing the size of prefabs to a quarter of a room. I’ll be talking a little more in Wednesday’s video about my reasoning for this and the problems it causes (and hopefully my solutions to those problems, assuming I’ve found any by then).

Liftoff

If you missed my tweets about it over the weekend, I released a game last Friday! And if you’ve been following this blog, it might not be exactly what you were expecting.

Gunmetal Arcadia Zero launched on May 6, 2016 as a Humble Original title, an exclusive bonus for subscribers of the Humble Monthly plan. (With their blessing, it was also made available to applicable supporters of my Patreon).

I also published the soundtrack over the weekend. You can find that on my Bandcamp page and also on YouTube.

Now, you might be saying to yourself, “Hey, I’d like to play Thing and I don’t subscribe to Thing. What about me?” So here’s my official statement on that:

“While we hope to bring this game to more players in the future, it is currently exclusive to Humble Monthly subscribers.”

If that sounds like I’m being deliberately vague, that’s only because I have to be deliberately vague.


Let’s talk about the future.

The eponymous roguelike Gunmetal Arcadia is up next. I haven’t done any work on it since my last blog. I thought I might try to get a jump start on some of the procedural content generation problems, but ultimately the need to decompress won out. It was good to take some time off. The danger in that is that I’ll lose sight of the spark. The danger is that I’ll get swept up in some other idea.

I got swept up in some other idea.

Whether I’ll pursue it, that’s still up in the air. I’ve been thinking about using evenings and weekends to tinker with this other idea, time that in the past has been spent working on GDC slides or crunching towards a specific deadline. I don’t have those obligations this time.

I’ve been investigating console development recently. By necessity, I have to be vague about that stuff as well, but I can say I’ve reached out to platform holders and am taking some initial steps towards actual development. It’s way too early to announce any specific plans — consider how much of an ordeal it was to port my engine from Windows to Mac and Linux, and then consider that I would have to do essentially the same thing again for each additional platform I want to ship on — but it’s something I’ve been wanting to attempt for a while, and it feels like the time is right.

I have a few events lined up this year. Next month, I’ll be at the Let’s Play Gaming Expo here in Plano. In early July, I’ll be headed down to Austin for RTX, and in October, I’ll be at Retropalooza in Arlington for the third year in a row. I’ll be demoing Gunmetal Arcadia Zero at each of these, and I may have some merch sales as well. Can’t make any promises yet.

Bookends

A couple of the very first blogs on this site were dedicated to source control management for this project, so I guess a nice way to bookend development on Gunmetal Arcadia Zero would be to return to that topic.

My general-purpose game engine/framework exists in a Subversion repository of its own, which is included as an external of the derived game’s repository. Starting with Super Win the Game, I’ve been making branches of the engine for shipping. This serves two purposes: it preserves the state of the engine as it was when I launched a particular game such that future engine work won’t unexpectedly break that game, and it allows me to make one-off changes to the engine for that game specifically when needed.

branch1

Last week, I branched my engine for shipping Gunmetal Arcadia Zero. I’ve already made a handful of checkins to that branch to address issues like the exploit mentioned in Episode 34. Some of these changes may get merged back into the main branch (or “trunk” in the SVN vernacular), but I can feel safe in the knowledge that, should I need to make awful hardcoded hacks to the engine to fix Zero bugs after launch, I can do so without upsetting other projects.

I don’t recall whether I mentioned it at the time, but last August, when I spun Zero off into its own separate game, I branched the Gunmetal Arcadia Subversion repository for that game. Since then, all Zero development has occurred in its own game branch, while also touching main engine source.

branch2

Now that Zero is effectively complete and I’m returning to the roguelike Gunmetal Arcadia, I’m faced with how to handle merging these changes back in. Truth be told, there’s very little work that’s gone into Zero that I think I absolutely can’t reuse for the roguelike, so for starters, I’ve taken everything. Where conflicts arose (mostly in binary content of test maps), I’ve taken changes from the Zero branch. In essence, I’m picking up development on the roguelike Gunmetal Arcadia not where I left it off last August, but at the very end of Zero development today.


I’ve been focused on development the last few months, so I haven’t had a whole lot of time to think about event presence this year, but as of yesterday, I’ve signed up for a booth at the local Let’s Play Gaming Expo June 18-19. I missed this one last year, as I already had obligations around the same time, but it’s now in its second year, and it sounds like it’s gonna be a lot of fun! I’ll be showing Gunmetal Arcadia Zero, of course, and I may try to expand the merch selection this year with t-shirts or soundtrack CDs. We’ll see how it goes!

After Wednesday’s video, I’m taking the next week off from development and blogging. I’ll return the week of May 9.

Segue

I made a build of Gunmetal Arcadia Zero on Friday that I’m calling “RC0.” (I’m not sure how widely used that term is, so for those unfamiliar, that’s “Release Candidate 0,” or the first believable shipping version of a product.) In truth, that build had a number of must-fix issues that I’ve addressed over the weekend, but it’s indicative of the state the game is in nevertheless. This thing is almost done. I’ll be spending the next couple of weeks tuning and polishing, fixing bugs, and preparing promotional media, and then I’m kicking this game out the door.

I’m long past the point of developing exciting new features, and I already talked a little bit about my concerns in continuing to document my development as I transitioned to a world of content production. Now I’ve reached the end of content production, and I find myself wondering how best to write about shipping a game.

As I’m not targeting consoles, my shipping process for Gunmetal Arcadia Zero is somewhat different than games I’ve worked on in the past. There’s no certification criteria to meet, and that’s nice, but on the other hand, I’m targeting three separate operating systems and an unknowable number of hardware configurations, and without the means for wider compatibility testing, the best I can do is make sure the game runs on my personal devices and try to fix bugs reported in early builds. On the bright side, I’ve refined my engine with each release, fixing what bugs I can reproduce, and though I’ll probably never have complete peace of mind with regard to the Mac and Linux versions, I feel like I’ve minimized the likelihood of complete and total incompatibility.

So Gunmetal Arcadia Zero will ship soon, and then development on Gunmetal Arcadia proper can resume. Remember that game? The eponymous Gunmetal Arcadia, the one I started blogging about way back in October 2014? The vision I had of that game prior to the introduction of Zero had a number of unsolved problems, and some of those have been occupying my mental bandwidth recently. (Importantly, I still don’t have a concrete plan for procedural level generation.) But in returning to that game in the context of a completed Zero, I’ll also have to ask questions of that existing vision. Does it make sense to keep plugging away at those same goals? What can I take away from the experience of shipping Zero that might affect the roguelike game’s design? Should I alter its design to further differentiate it from Zero? To accentuate the things that work about Zero and mitigate those that don’t? Do I even know which is which yet? Zero’s not even out and I’m already starting to think of it in postmortem terms.

In On Writing, Stephen King advises setting finished manuscripts to the side for weeks or months after their completion, returning to make edits only after that time has elapsed. I’m not sure that’s entirely applicable to game development, but I do think there is something to be said for stepping away from a finished work for some time before making final alterations and sending it out into the wild. To that end, I’m thinking about taking a short break from this game after this week to work on something silly and fun. Maybe I’ll even revisit that standalone CRT sim I was toying with last year. I just missed Ludum Dare, but maybe I’ll do something jam-scale on my own. As I was nearing the end of development on Super Win the Game, I participated in Ludum Dare to blow off some steam, which turned out to be an unexpected win when I was later able to reuse some of that work for a prototype of another game.

Upcoming tasks for the week of April 18, 2016:

  • Monday: Promotional materials, day 1
  • Tuesday: Promotional materials, day 2
  • Wednesday: Playtesting and tuning, day 1
  • Thursday: Playtesting and tuning, day 2
  • Friday: Record Ep. 35, write blog, addl. work as time allows

I don’t have any tasks scheduled for this weekend. I guess I’m done with crunch? Maybe I’ll read a book or something.