So, this week’s blog and video are gonna be a little weird. I spent most of the week doing console port work, and the nature of that sort of work is that it’s confidential, which necessarily puts it at odds with my usual transparency and makes it difficult to blog about.

However, since the work closely mirrors the efforts I’ve made in the past to port my engine to Mac and Linux, I should be able to write in general terms about my porting process without divulging any of the specifics of what I’m doing. Bear in mind too that this is investigative work, and I’m not announcing or promising ports of any game to any platform at this time.

My first step when approaching any new platform is to get a minimal test program running on it, a “Hello World” sort of thing that proves I can compile and execute code on the device. Most of the sample code I had access to was a little on the bloated side compared to what I was looking for. Imagine an application that renders a 3D, spinning “Hello World” with bloom and antialiasing when all you want to do is printf(“Hello World”); that’s sort of the scenario I was in. Those samples did serve the purpose of verifying that the toolchain was working correctly, but they would have been impractical as a base on which to do my own development. Eventually I managed to pare down a small sample to what I felt was a more representative minimum, and from there, I began introducing my own engine code.

This is where things get a little trickier. I’m suddenly going from a few lines of “Hello World” code to tens of thousands of lines of code written over nearly a decade. It’s not going to just work. But having gone through the Mac and Linux ports already, I had a better understanding of where to start and what to expect.

The first goal is to get the code to compile by any means necessary. Once it compiles, the second goal is to get it to run without crashing. Once it’s running, the third goal is to bring subsystems (graphics, audio, input, file handling, etc.) online one at a time until it’s playable.

So I start by making it compile. Normally this means stripping out lots of code that the compiler doesn’t like. Often this is platform-specific or API-specific code that will need to be rewritten for the new target. Depending on the scenario, it may be best to separate code into #ifdef blocks for each platform. (I’m currently working in a branch and simply commenting out or deleting code with the intent to deal with merging these changes back into the trunk at some later time if this port proves successful.) Most of the time, this code is in the subsystems mentioned above, and in removing it, I’m effectively removing the ability to draw anything, play any sounds, recognize any input, and so on — everything that makes a game interactive. These will have to be rebuilt in the future.
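To illustrate the #ifdef approach, here's a minimal sketch of how platform-specific code might be fenced off per target. The platform defines and the function itself are hypothetical placeholders, not actual engine code; the point is just that each platform gets its own block, and an unported target falls through to a stub the rest of the code must tolerate:

```cpp
// Hypothetical platform-specific file open, separated into #ifdef blocks.
// PLATFORM_WINDOWS / PLATFORM_MAC are illustrative defines, not real ones.
#include <cstdio>

FILE* OpenContentFile(const char* name)
{
#if defined(PLATFORM_WINDOWS)
    // Windows build: content lives next to the executable.
    char path[260];
    snprintf(path, sizeof(path), "data\\%s", name);
    return fopen(path, "rb");
#elif defined(PLATFORM_MAC)
    // Mac build: content lives inside the app bundle.
    char path[260];
    snprintf(path, sizeof(path), "Game.app/Contents/Resources/%s", name);
    return fopen(path, "rb");
#else
    // New target: not implemented yet; callers must handle a null return.
    (void)name;
    return nullptr;
#endif
}
```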

There’s usually some amount of futzing with project settings before everything will compile nicely. This varies with platform, compiler, and IDE, and in my experience has been a huge wildcard. This new platform hasn’t given me too much trouble in this regard, but I’m also not entirely convinced I have all my projects set up correctly despite being able to compile and execute code. It’s something I’ll have to keep an eye on as I move forward.

So, once I have a build compiling and running, the next step is to see where it crashes and fix it. Historically, file handling issues are often among the first culprits, for instance because a path is incorrect and a content package file can’t be opened for read or a config file opened for write. Indeed, that was one of the first problems I encountered this time as well, and although I’m not certain my solution is shippable (in fact, I’m nearly certain it’s not), I did manage to get some initial file reads and writes working as expected.

The next initialization crash happened while trying to construct renderable assets. Since I stripped out all my Direct3D 9 and OpenGL code, my renderer is currently returning NULL pointers in response to asset creation requests, and dereferencing these is causing a crash. So the next step, and the one I’m currently in the middle of, is to stub out a good-enough renderer implementation to at least get a majority of cases handled. Having written side-by-side implementations of D3D9 and GL, I’m in a pretty good place to know what to expect, but nevertheless, this new platform’s API is one I haven’t used before, and there will be a bit of a learning curve there as I wrap my head around how this API wants to work and reconcile that with how my code has historically wanted to work.
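One common way to stub a renderer like this is the null-object pattern: instead of returning NULL from asset creation (and crashing callers that dereference the result), hand back an inert object that safely no-ops. The class and method names below are invented for illustration; they don't reflect the actual engine's interfaces:

```cpp
// Hypothetical null-object renderer stub: asset creation returns a
// do-nothing object rather than NULL, so game code that dereferences
// the result can run before the real renderer implementation exists.
struct ITexture
{
    virtual ~ITexture() {}
    virtual int GetWidth() const = 0;
};

struct NullTexture : ITexture
{
    // Report a harmless placeholder size so dependent code keeps working.
    int GetWidth() const override { return 1; }
};

struct Renderer
{
    // Return a shared inert instance instead of a NULL pointer.
    ITexture* CreateTexture(const char* /*path*/)
    {
        static NullTexture s_null;
        return &s_null;
    }
};
```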

So that’s where I am now, knee deep in renderer refactoring, with audio, input, and more looming over the horizon. All things considered, this has been a fairly smooth experience so far, compared to both my previous porting experiences to Mac and Linux and to my previous console development experiences in AAA. There’s been a surprising number of things just working when I expected to encounter roadblocks. Hopefully that trend will continue as I venture further into this port.


I’ve been plugging away at procedural level generation in Gunmetal Arcadia this week, slowly getting more interesting results on the screen. Last week, I started by copying and pasting whole screens’ worth of tiles around, following a hardcoded layout in which individual rooms that met certain conditions could be selected and swapped in. I then reduced the scale of the prefabs to a quarter of the screen and assembled them into larger scrolling regions.

This week, I’ve been working on generating a layout for the internal structure of each scrolling region and for the structure of the level as a whole. I’ve decoupled these tasks such that I can tinker with each independent of the other. Today I’ll be looking at the internals of each room, and I’ll cover level structure in Wednesday’s video.

When thinking about how to construct a room from quarter-scale prefabs, I started with a few assumptions. I wanted to find a solution that would facilitate prefabs larger than a quarter of a room (to allow for distinct, recognizable set pieces and landmarks) and could handle these without needing to know in advance whether they would exist or what size they might be. The solution I’ve developed accounts for this and should also give me some room to grow and adapt it as my needs shift throughout this project.

Cards representing prefabs and a board representing the level layout we want to construct

I like to think of this solution as laying cards on a board. Given a number of types of cards (prefabs) of various sizes (and theoretically different shapes), each marked with open or closed paths to adjacent cells, we want to pick cards at random that adhere to some predetermined path that runs throughout the board (scrolling region). For the purposes of this algorithm, I assume I have an infinite number of each type of card, but it would be easy to limit these as well, e.g., to prevent large setpieces from being seen more than once per level.

A board of a predetermined size with no layout information

The first step in this process is to determine the path through the board. This must exist before we can start picking cards that align with it. I create this path by choosing a cell on the board at random and then proceeding to perform a random walk to unvisited adjacent cells, carving a path as I go. If no adjacent cell is available, I step back until one is and branch in that direction. I repeat this process until the entire board is connected. This creates a connected graph with no loops, which could also be represented as a tree, with any cell on the board arbitrarily chosen as the root.

The board after doing a random walk to create non-looping paths
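The random walk above can be sketched in code. This is an illustrative reconstruction, not the actual engine implementation: each cell tracks which of its four edges are open, and the walk carves edges to unvisited neighbors, stepping back along its own path whenever it hits a dead end, until every cell is connected:

```cpp
// Sketch of random-walk path carving on a small grid "board". Carves
// open edges between cells until every cell is visited, backtracking
// when no unvisited neighbor is available. Illustrative code only.
#include <cstdlib>
#include <vector>

struct Cell { bool visited = false; bool open[4] = {false, false, false, false}; };

// Directions: 0=left, 1=right, 2=up, 3=down (d ^ 1 is the opposite).
static const int DX[4] = {-1, 1, 0, 0};
static const int DY[4] = {0, 0, -1, 1};

void CarvePaths(std::vector<Cell>& board, int w, int h, unsigned seed)
{
    srand(seed);
    std::vector<int> stack;                   // path back to the start
    int cur = rand() % (w * h);               // random starting cell
    board[cur].visited = true;
    int remaining = w * h - 1;
    while (remaining > 0)
    {
        int x = cur % w, y = cur / w;
        int dirs[4], n = 0;
        for (int d = 0; d < 4; ++d)           // collect unvisited neighbors
        {
            int nx = x + DX[d], ny = y + DY[d];
            if (nx >= 0 && nx < w && ny >= 0 && ny < h && !board[ny * w + nx].visited)
                dirs[n++] = d;
        }
        if (n > 0)
        {
            int d = dirs[rand() % n];
            int next = (y + DY[d]) * w + (x + DX[d]);
            board[cur].open[d] = true;        // carve the edge both ways
            board[next].open[d ^ 1] = true;
            board[next].visited = true;
            stack.push_back(cur);
            cur = next;
            --remaining;
        }
        else
        {
            cur = stack.back();               // dead end: step back and branch
            stack.pop_back();
        }
    }
}
```

Because every carved edge connects a visited cell to a previously unvisited one, the result is exactly the loop-free connected graph described above: a spanning tree with one fewer edge than there are cells.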

At the scale I’m working at, these paths tend to be fairly nondescript, but this could conceivably scale up to larger boards and more intricate paths. In the future, I’ll likely also look into introducing loops, tagging paths as one-way (especially for vertical drops), and anything else that could facilitate more interesting gameplay.

Once the path exists, the next step is to pick cards that fit it. (To reiterate, “cards” is a synonym for “prefabs” in this context.) I don’t want to give preference to any particular cards, large or small; I want each card to have an equal chance of being selected.

For each cell on the board, I create a list of every card that could possibly fit that cell, given its paths to adjacent cells. For cards that are larger than a single cell, each part of the card must match the board cell it would overlay, else the card as a whole is rejected. At this point, I can verify that at least one solution exists, by virtue of the fact that every cell has at least one corresponding card.

A potential match for a card

Once I’ve compiled the lists of applicable cards for each cell, I begin playing cards on the board at random. I select a random cell on the board, choose a random card from its list of options, and place it. (If that card is larger than one cell, I remove any cells that it overlaps from further solving.) I continue this process until cards have been placed over every cell on the board. The room is then fully constructed.

A potential match for a larger card spanning multiple cells
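The candidate-list and card-laying steps can be sketched as follows. To keep the sketch short, this version handles only 1x1 cards, matching each card's open-edge mask against the cell's carved paths; multi-cell cards would additionally have to match every board cell they overlap, as described above. All names here are invented for illustration:

```cpp
// Simplified sketch of the card-laying step for 1x1 cards: build a list
// of candidate cards per cell (cards whose open edges exactly match the
// cell's carved paths), then place a random candidate on each cell.
// Illustrative code only, not the actual engine implementation.
#include <cstdlib>
#include <vector>

struct Card { unsigned pathMask; int prefabId; };  // 4-bit mask: open edges

// Returns one placed prefab id per cell, or -1 where no card fits.
std::vector<int> LayCards(const std::vector<unsigned>& cellMasks,
                          const std::vector<Card>& deck, unsigned seed)
{
    srand(seed);
    std::vector<int> placed(cellMasks.size(), -1);
    for (size_t i = 0; i < cellMasks.size(); ++i)
    {
        // Gather every card whose open edges match this cell's paths.
        std::vector<int> options;
        for (size_t c = 0; c < deck.size(); ++c)
            if (deck[c].pathMask == cellMasks[i])
                options.push_back((int)c);
        // Pick uniformly among the candidates so no card is preferred.
        if (!options.empty())
            placed[i] = deck[options[rand() % options.size()]].prefabId;
    }
    return placed;
}
```

Building the full candidate lists up front also gives the solvability check for free: if any cell's list comes back empty, no solution exists for that board.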

Each prefab may also have entities placed within it, and these are concatenated and assigned correct positions within the room as part of the card-laying process. At this point, the fully assembled room can be treated exactly as if it had been built by hand in my editor.

Thanks to the support of my generous Patreon contributors, I’ve begun a regular development livestream. You can find the archives of last week’s stream below.

Since that stream, I’ve wrapped up the work on CRT presets, and the feature is mostly ready to ship. Some of the code could still use some cleaning up, but I’m pretty happy with the implementation otherwise.

I’ve been debating voluntarily increasing the frequency of these streams (and updating the Patreon incentive tiers accordingly) simply because one stream a month is too infrequent to really make a part of my normal schedule. It’s going to feel like an oddity every time; I’m not really going to be able to get comfortable with the format at that pace. (The flip side of that is that I’ll risk running out of good, discrete, streamable tasks too quickly, but that’s something I’ll have to deal with in any case.)

As a reminder, I will be making the rounds at a number of events this summer and beyond:

Let’s Play Gaming Expo
June 18-19, 2016
Plano Centre
Plano, TX

RTX 2016
July 1-3, 2016
Austin Convention Center
Austin, TX

Retropalooza IV
October 1-2, 2016
Arlington Convention Center
Arlington, TX

I’ve applied for space with the Indie Megabooth at PAX Prime/West, but those slots are limited and in high demand, and it’s too early to say whether I’ll be there. I’ll also be looking at demoing one or both of the Gunmetals Arcadia at PAX South 2017 early next year, so stay tuned on that one.


Last week was my first week of full-time development back on Gunmetal Arcadia following Zero‘s launch. It was…slow. I’m still kind of in that “I just shipped a game and I want to turn off my brain and play video games for a change” mode. But I did get some stuff done, so let’s take a look at that.

I’ve been stewing over random level generation for some time. It was an unsolved problem as of last August when I introduced Zero to solve my ongoing production crisis, and I’ve only just recently begun to give serious thought to how it should work. Fortunately, inspiration struck early Tuesday morning, just when I was planning to start implementing some of those systems. I realized I could partially decouple the level data exported from my editor from the level data the player traverses such that I could still sample into the prefab data at will to build playable rooms on the fly. It’s hard to express exactly what was revelatory about this; fundamentally I’m still building prefabs in the editor and pasting them around in the game to construct full levels, but there was a subtle way of thinking about it that hadn’t quite clicked until then. Once I got over that hump, I’ve been able to actually start prototyping a very rough form of randomly generated levels.

So this week has mostly been spent getting my ducks in a row to deal with randomly generated levels in Gunmetal Arcadia. My first order of business was to load level data from the editor and paste rooms together in random ways while respecting the shape of the level, which had been hardcoded in advance. (This is what I captured in the preceding video.) This quickly revealed a big problem I hadn’t predicted, which was how to deal with GUIDs of hand-placed entities in prefab rooms.

As I’ve mentioned before, I tag all my entities with a GUID. For all entities placed by hand in the editor, this GUID has historically been assigned by the editor. But now I’m entering a world where the data authored in the editor is only a template for what may appear in the game, and I can no longer trust this editor-assigned GUID to be unique. If the same prefab appears twice at runtime, and if each prefab spawns a copy of the same hand-placed entity, and if each of those entities shares the same GUID, then the game can’t distinguish between them. In practice, this issue manifested itself as a bug in which killing an entity in one room would have the side effect of killing its pair in another room built from the same prefab. Clearly I needed a new approach here.

My current solution is to assign GUIDs at runtime. I’ve done this in the past for dynamically-spawned entities, but that’s a subtly different case when it comes to entity serialization. (In particular, entity death must be serialized for hand-placed entities, while dynamically spawned entities can simply be forgotten altogether when they die.) In this case, I want to treat these things as if they had been assigned a GUID by the editor, but also ensure that their GUIDs are in fact unique even when the same prefabs appear multiple times.

To this end, I no longer assign GUIDs through the editor. GUIDs are strictly assigned by the game at runtime, and to ensure their uniqueness, they’re constructed bit by bit using data that will be unique to the context in which they appear.

I use 32-bit unsigned integers for my GUIDs. The high bit is reserved to indicate dynamically spawned entities. This leaves 31 bits to work with. I convert the coordinates of the room to signed bytes (effectively limiting the coordinates of levels to [-128, 127]) and write these into the GUID along with an unsigned byte representing the index of the entity as exported from the editor. In 24 bits, this gives me a value which is guaranteed to be unique and is valid for use as a GUID. With this substitution in place, I can reuse prefabs as many times as I like, and the game will view the entities within each as being totally unique, separate instances of the same template.
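The bit layout described above can be sketched like this. The exact field positions are my own illustrative choice; only the scheme (high bit reserved for dynamic spawns, two signed room-coordinate bytes, one unsigned entity-index byte) comes from the description:

```cpp
// Sketch of runtime GUID construction: the high bit is reserved to mark
// dynamically spawned entities, leaving 31 bits; room coordinates are
// packed as signed bytes and the entity's editor-export index as an
// unsigned byte. Field ordering here is illustrative only.
#include <cstdint>

uint32_t MakeEntityGuid(int8_t roomX, int8_t roomY, uint8_t entityIndex)
{
    // High bit clear: a hand-placed entity, not a dynamically spawned one.
    uint32_t guid = 0;
    guid |= (uint32_t)(uint8_t)roomX << 16;  // room X in [-128, 127]
    guid |= (uint32_t)(uint8_t)roomY << 8;   // room Y in [-128, 127]
    guid |= (uint32_t)entityIndex;           // index from the editor export
    return guid;
}
```

Since no two hand-placed entities can share the same room coordinates and export index, the packed value is unique by construction, no matter how many times a prefab is reused.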

Since recording that video, I’ve begun toying with the idea of decreasing the size of prefabs to a quarter of a room. I’ll be talking a little more in Wednesday’s video about my reasons for this and the problems it causes (and hopefully my solutions to those problems, assuming I’ve found any by then).


If you missed my tweets about it over the weekend, I released a game last Friday! And if you’ve been following this blog, it might not be exactly what you were expecting.

Gunmetal Arcadia Zero launched on May 6, 2016 as a Humble Original title, an exclusive bonus for subscribers of the Humble Monthly plan. (With their blessing, it was also made available to applicable supporters of my Patreon.)

I also published the soundtrack over the weekend. You can find that on my Bandcamp page and also on YouTube.

Now, you might be saying to yourself, “Hey, I’d like to play Thing and I don’t subscribe to Thing. What about me?” So here’s my official statement on that:

“While we hope to bring this game to more players in the future, it is currently exclusive to Humble Monthly subscribers.”

If that sounds like I’m being deliberately vague, that’s only because I have to be deliberately vague.

Let’s talk about the future.

The eponymous roguelike Gunmetal Arcadia is up next. I haven’t done any work on it since my last blog. I thought I might try to get a jump start on some of the procedural content generation problems, but ultimately the need to decompress won out. It was good to take some time off. The danger in that is that I’ll lose sight of the spark. The danger is that I’ll get swept up in some other idea.

I got swept up in some other idea.

Whether I’ll pursue it, that’s still up in the air. I’ve been thinking about using evenings and weekends to tinker with this other idea, time that in the past has been spent working on GDC slides or crunching towards a specific deadline. I don’t have those obligations this time.

I’ve been investigating console development recently. By necessity, I have to be vague about that stuff as well, but I can say I’ve reached out to platform holders and am taking some initial steps towards actual development. It’s way too early to announce any specific plans — consider how much of an ordeal it was to port my engine from Windows to Mac and Linux, and then consider that I would have to do essentially the same thing again for each additional platform I want to ship on — but it’s something I’ve been wanting to attempt for a while, and it feels like the time is right.

I have a few events lined up this year. Next month, I’ll be at the Let’s Play Gaming Expo here in Plano. In early July, I’ll be headed down to Austin for RTX, and in October, I’ll be at Retropalooza in Arlington for the third year in a row. I’ll be demoing Gunmetal Arcadia Zero at each of these, and I may have some merch sales as well. Can’t make any promises yet.