Even in the midst of Turkey Day and Consumerism Weekend, last week managed to be highly productive, so I have a lot of ground to cover in today’s blog.
I kicked off the week with a task to help facilitate these devlogs. As I mentioned in an earlier blog, I keep each of my projects in a separate Subversion repository for legacy reasons. This can make it difficult to review commit logs for the previous week, as I have to check each repository separately. What I really need is a way to view a list of all commits made to all repositories for the entire week, sorted by commit time. So I wrote a script to do exactly that. You can see the output here. This script runs every morning and prints out messages from all commits made in the last week for easy review. As this is publicly visible, it’s also a good reason to step up the quality of my commit messages. “Yurp,” “whatever,” and “this” are probably not terribly useful messages.
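The post doesn’t show the script itself, but the shape of it is easy to imagine. As a hypothetical sketch in Python (the function names and structure here are mine, not the actual script’s): each repository is asked for its recent history via `svn log --xml`, and the resulting entries are merged into one list sorted by commit timestamp.

```python
import datetime
import subprocess
import xml.etree.ElementTree as ET

def fetch_log_xml(repo_url, days=7):
    """Ask one Subversion repository for its recent history in XML form."""
    since = (datetime.date.today() - datetime.timedelta(days=days)).isoformat()
    result = subprocess.run(
        ["svn", "log", "--xml", "-r", "{%s}:HEAD" % since, repo_url],
        capture_output=True, text=True, check=True)
    return result.stdout

def merge_logs(xml_blobs):
    """Parse several `svn log --xml` outputs and merge commits by timestamp."""
    commits = []
    for blob in xml_blobs:
        # Each blob is a <log> root containing <logentry> nodes.
        for entry in ET.fromstring(blob).iter("logentry"):
            commits.append({
                "revision": entry.get("revision"),
                "date": entry.findtext("date"),  # ISO 8601, so it sorts lexically
                "author": entry.findtext("author"),
                "msg": (entry.findtext("msg") or "").strip(),
            })
    return sorted(commits, key=lambda c: c["date"])
```

From there it’s just a matter of printing the merged list once a morning, e.g. from a cron job.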
I’ve talked a little bit in the past about how visuals can affect game feel. Animation can have a huge impact on how we perceive a player character’s weight and movement. Art can also provide visual cues that subconsciously affect our expectations of how characters should move, and developers can harness this phenomenon in order to better tune game feel.
Way back when I was starting to write the jumping physics code for You Have to Win the Game (code which I’ve continued to improve and reuse to this day), I found that by replacing my placeholder sprite with a picture of Mario from Super Mario Bros., I immediately recognized the failings of my physics settings. My foot speed was too slow, my jump height was too low, and gravity was too light. Of course, my goal was not to recreate Mario’s physics exactly, but these observations helped me move in the right direction.
I recently had an opportunity to exploit this phenomenon again, and to a much greater extent. I’ve been making some big changes to my animation system recently, and I had the realization that I now had all the tools necessary to drop in an existing sprite sheet from a different game, and — fingers crossed — it should just work. I didn’t mention it explicitly, but when I was initially tuning the physics settings for Gunmetal Arcadia a few weeks ago, I was modeling it after Zelda II and The Battle of Olympus. Naturally, I chose to use a Link sprite from Zelda II as my test content. This time, however, instead of a single static image, I could imitate an entire suite of animations.
I haven’t changed any of the physics settings in Gunmetal Arcadia since adding this test content. I haven’t felt the need. A side-by-side comparison will reveal some clear, if subtle, differences between Zelda II and Gunmetal Arcadia, and that’s a good thing; I’m not trying to remake Zelda II here, but I do want to evoke it. This puts me in a really good position to understand other needs beyond physics. Now that I have Link running around in my test map, I have expectations related to melee combat, which is the next big system I’ll be tackling. I’ve been deconstructing the intersection of movement and melee combat in a number of games, not just Zelda II, but also The Battle of Olympus, Faxanadu, Castlevania, and others. There are some interesting subtleties that arise when you look closely at how melee attacks work in conjunction with walking, jumping, and crouching, and I want to be sure I capture those subtleties to the best of my ability, maximizing good game feel as well as retro authenticity.
I woke up early Thanksgiving morning with the sudden desire to implement input recording and playback. I’ve had that feature at the back of my mind for a while now, most recently after demoing Super Win the Game. I noticed that the game tended to attract a crowd while it was being played. While it was idling at the title screen, most people would pass it by. This is an old problem and a solved one; arcade games have used attract modes for years for exactly this reason. An attract loop in which the game plays itself for a short time is an easy solution to this problem and is relatively low cost, at least in theory. You load a particular level and provide faked player input to move the character as if they were being controlled by a human player. Where this gets interesting is in making sure it behaves reliably when played back on every machine, as even a small change in the game state can dramatically throw off the rest of the simulation. Fortunately, the confluence of multiple existing systems helped solve many of these problems for me.
About two years ago, I implemented an option for using a fixed delta time for updating the game state in my engine as explained by Glenn Fiedler here. Historically, I’d always used a variable delta time, and this is still the default behavior in my engine, but I recognized that in certain situations, the determinism provided by a fixed delta is desirable. My use case at the time was integrating shots in a bullet hell shooter; I never made that game, but I did get as far as refactoring my entire engine ticking process to allow the game state’s delta time to be fixed.
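The heart of Fiedler’s approach is an accumulator that converts variable frame times into zero or more fixed-size simulation steps. A minimal sketch in Python (the names `tick_frame` and `max_steps`, and the safety cap itself, are my own, not the engine’s):

```python
FIXED_DT = 1.0 / 60.0  # fixed simulation step: 60 Hz

def tick_frame(accumulator, frame_time, update, max_steps=8):
    """Run zero or more fixed-dt simulation steps for one rendered frame.

    `update` stands in for the game-state tick function. The `max_steps`
    cap prevents a spiral of death after a long hitch. Returns the
    leftover accumulated time to carry into the next frame."""
    accumulator += frame_time
    steps = 0
    while accumulator >= FIXED_DT and steps < max_steps:
        update(FIXED_DT)          # simulation always sees the same dt
        accumulator -= FIXED_DT
        steps += 1
    return accumulator
```

Because `update` only ever sees `FIXED_DT`, the simulation advances identically regardless of how fast frames are actually rendered, which is exactly the determinism input playback depends on.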
My input system handles control bindings, which allows me to ignore details about individual input devices and focus on high-level controls as they would appear to the player. Consider the case of polling for input to move the player character. Rather than needing to explicitly query the state of individual arrow keys and joystick axes from game code, my control binding system abstracts these specific devices away so I can just ask, e.g., “Is the ‘move left’ control active?” Additionally, controls may be flagged as relevant to the game simulation or the engine, which affects when and how they are polled.
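A binding layer like this might look something like the following sketch; all the names here are illustrative, since the post doesn’t show the engine’s actual API.

```python
class ControlBindings:
    """Map device-level inputs onto named high-level controls (illustrative)."""

    def __init__(self):
        # control name -> {"inputs": [(device, input_id), ...], "simulation": bool}
        self._controls = {}

    def bind(self, control, device, input_id, simulation=True):
        """Bind a device input to a control; `simulation` flags whether the
        control is relevant to the game simulation or only the engine."""
        entry = self._controls.setdefault(
            control, {"inputs": [], "simulation": simulation})
        entry["inputs"].append((device, input_id))

    def is_active(self, control, device_state):
        """Poll a control: active if any bound device input is active.
        `device_state` maps (device, input_id) pairs to booleans."""
        entry = self._controls.get(control, {"inputs": []})
        return any(device_state.get(pair, False) for pair in entry["inputs"])

    def simulation_controls(self):
        """The controls whose states feed the game simulation."""
        return [name for name, e in self._controls.items() if e["simulation"]]
```

Game code asking `is_active("move_left", ...)` never needs to know whether the player pressed an arrow key or pushed a stick, and the simulation flag marks exactly the set of controls that a replay would need to capture.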
Having these features at my disposal made input recording a fairly simple process. Rather than having to record the complete input state of every device connected to the PC, I could reduce my data set to the states of the high-level controls defined by the game. I could further reduce the data set by only recording control states when they actually changed, as opposed to every single tick. Finally, I could ensure that the playback would match the recording exactly by dynamically toggling a fixed 60 Hz tick in both cases. (I should note I’m making the assumption here that the initial state is the same for playback.)
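Put together, recording reduces to logging (tick, control, state) triples only on ticks where a control changes, and playback replays those triples at the same fixed-tick indices. A hypothetical sketch (class and method names are mine):

```python
class InputRecorder:
    """Record high-level control states only on ticks where they change."""

    def __init__(self):
        self.events = []   # list of (tick, control, state) triples
        self._last = {}    # last recorded state per control

    def record(self, tick, controls):
        """`controls` maps control names to their state this tick."""
        for name, state in controls.items():
            if self._last.get(name) != state:
                self.events.append((tick, name, state))
                self._last[name] = state

class InputPlayback:
    """Replay recorded events; between events, control states hold steady."""

    def __init__(self, events):
        self.events = list(events)
        self._next = 0
        self._controls = {}

    def advance(self, tick):
        """Apply all events up to and including this tick; return the states."""
        while (self._next < len(self.events)
               and self.events[self._next][0] <= tick):
            _, name, state = self.events[self._next]
            self._controls[name] = state
            self._next += 1
        return dict(self._controls)
```

Given a fixed 60 Hz tick and an identical initial state, feeding `advance(tick)` back into the simulation in place of live polling reproduces the original session.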
Eventually, this will become a real attract mode for Gunmetal Arcadia. There will be other problems to solve along the way, including making random level generation and events deterministic (another solved problem, thanks to the Mersenne Twister). Another potential use case, and one that I’m extremely excited about, is the ability to automatically record and upload whole replayable sessions from playtest builds (or Early Access builds, should I choose to go that route). I learned so much and made so many improvements to Super Win the Game just from watching people play it at events, and being able to collect that information from any number of players across the internet would be absolutely awesome.
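On the determinism point: seeding a Mersenne Twister guarantees the same sequence of draws on every machine, so any generation driven by a seeded, isolated generator is reproducible by construction. A tiny illustration in Python, whose `random` module happens to use the Mersenne Twister internally (the `generate_level` function is a made-up stand-in, not Gunmetal Arcadia’s generator):

```python
import random

def generate_level(seed, width=16):
    """Deterministic toy 'level': same seed, same layout, on any machine."""
    rng = random.Random(seed)  # isolated generator; global state untouched
    return [rng.randrange(4) for _ in range(width)]  # e.g. four tile types
```

Record the seed alongside the input events, and a replay can regenerate the level before feeding back the controls.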