Remaining Cross-Platform

Last February, I wrote on my personal blog about the difficulties I had encountered in porting my engine from Windows to Mac and Linux. Since then, I’ve continued to iterate on these versions, and I thought I’d share some of the experiences I’ve had in that time.

When I completed the first iteration of this work last year, I was only building a 64-bit Linux binary. The reason for this was simple: I had installed the recommended Ubuntu distro for my PC. This was a 64-bit distro, which meant that by default, I was building 64-bit applications. Based on what limited data I had available from, e.g., Steam and Unity surveys, this seemed like a safe bet, but I very quickly heard from 32-bit Linux users who were unhappy with this decision. At the time, the only cross-platform game I had shipped was freeware, so I wasn’t too concerned about it, but I knew that I would probably have to cross that bridge before shipping Super Win the Game.

In June, after four or five months of Windows-only development, I began compiling Super Win on Mac and Linux. Right off the bat, I ran into a pair of bugs in my OpenGL implementation that had never appeared in either my Windows OpenGL build or in any version of YHtWtG. Both of these were my own fault, and one was easily fixed (I was calculating the size of vertex elements incorrectly in certain cases), but the other was a little more sinister. I had some shader code containing a parameter which was intended to be a float but had been declared as a float4. Outside of its declaration, all the code both within the shader and in the game’s code treated it as a one-component float. Despite being technically incorrect, this worked just fine on Windows under both DirectX and OpenGL. It was only when I switched to Linux and ran the game that the GLSL compiler on that platform threw a fit about it. In retrospect, it’s possible that testing with debug D3D and higher warning levels might have caught this far earlier, but I rarely remember to test in those conditions unless I have an obvious rendering bug in Direct3D. The scary part is that, even having found and fixed this one particular bug, I don’t feel like I can be entirely certain that my auto-generated GLSL code will compile on all users’ machines, and in fact, evidence has suggested that it will not, for reasons that aren’t yet clear to me.
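To illustrate the shape of that second bug, here's a minimal reconstruction (the uniform name and shader body are hypothetical, not the actual engine code):

```glsl
// Illustrative sketch of the declaration mismatch described above.
uniform vec4 u_Fade;            // declared as a four-component vector...

void main()
{
    float fade = u_Fade;        // ...but used as a scalar everywhere else.
    // HLSL and some lenient desktop GL drivers implicitly truncate the
    // vector to its first component with at most a warning; a strict
    // GLSL compiler rejects this line outright. The fix is to make the
    // declaration match the usage: "uniform float u_Fade;".
    gl_FragColor = vec4(vec3(fade), 1.0);
}
```

The same source compiling on one vendor's GLSL front end and failing on another's is exactly why a shader that works on the development machine is no guarantee it works everywhere.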

It was around this time that I made the first of three attempts to build 32-bit binaries on Linux. Each of these deserves its own paragraph.

My first attempt involved building a 32-bit binary on my existing 64-bit distro. It sounded simple enough: add the “-m32” flag when compiling. What could possibly go wrong? Of course, this also meant recompiling 32-bit versions of all the libraries I was linking (SDL, Steam, GLEW), and of course each of these had its own dependencies. This led to what I described on Twitter as a Mandelbrot fractal of dependency issues. A zillion sudo apt-get installs later, I reached a point where (1) I still didn’t have all the dependencies I would need to compile 32-bit binaries, and (2) Code::Blocks would no longer open because some of the dependencies I installed had broken compatibility. With no clear path to revert these changes, I began manually removing these dependencies and reinstalling the 64-bit versions where I could. I finally reached a point where Code::Blocks would open again and I could compile the code, but it would immediately crash. Then I restarted my PC, and Ubuntu simply would not boot up. I had killed my first Linux distro.

This was June 2014. I had first installed Ubuntu on my PC eight months earlier, and I didn’t really remember how the process had worked. Did I use Wubi? Did I create a partition? There seemed to be several ways this could work, and none of them wanted to play nice for reasons I couldn’t fathom. Sometimes my PC just wouldn’t boot to Ubuntu. At least once, the Ubuntu install process just spun indefinitely until I had to power my machine down. I think I restarted the entire process at least five or six times that night. Eventually, I ended up getting a 32-bit distro installed, in the hopes that I could build a 32-bit binary natively and it would just work on 64-bit distros, too. I was able to produce a 32-bit binary, but it didn’t run on a 64-bit distro right off the bat. I started installing dependencies to see whether I could make it work, and…yep. I killed that distro, too. Two Linux distros dead in two days.

Now here’s the really fun part. At some point when I was trying to get any version of Ubuntu working again, I ended up with my PC booting to GNU GRUB by default rather than the Windows Boot Manager that I was familiar with. And now that my Linux install was dead, GRUB wasn’t working, either. My PC wouldn’t boot at all, not even to Windows. For all intents and purposes, it appeared I had just bricked my PC trying to install compatibility packages on Linux. In an era before smartphones, that probably would have been true. Fortunately, I found a StackExchange post that described the exact problem I was having and the solution. I had a brief moment of panic when I wasn’t sure if I even still had my Windows discs, but I did eventually find them and was able to repair the damage. I had lost three days and could still only build 64-bit Linux binaries with any reliability, but I didn’t lose my PC completely. I chose to ignore the problem and get back to game dev work for a while.

A month later, barely a day after wrapping up three weekends in a row presenting the game at various expos and conventions, I returned to the wonderful world of Linux development with a new plan of attack. Since my desktop PC was too scarred from previous failures and I didn’t trust it to stand up to a fresh install, I decided to turn my laptop into my primary Linux development environment. This time, I went with a Wubi install, despite recommendations to the contrary — it was, I had eventually learned, how I had installed Ubuntu the first time, when I had had the least trouble with stability. I installed a 32-bit distro, rebuilt all my external libraries, compiled the game, and tested it on my 64-bit desktop install, and…this time it worked.

I can’t explain that. I don’t know why my results were different this time. To my recollection, I had followed the exact same steps once before with different results. I tested it on my Steam Machine and it worked there, too. To date, I don’t believe I’ve had any complaints about 32-bit/64-bit compatibility since I started building in this environment. I’m happy it finally worked, and it was strange how smoothly the process went this time compared to all my previous attempts, but I can’t explain it. In all likelihood, if this environment ever fails me for any reason, it will be the death knell for my support of Linux.

I’ve been using various versions of Visual Studio as my primary IDE since 2005. It took me months, if not years, to feel totally comfortable with it, so it’s not surprising that working in other environments would present difficulties.

I use Code::Blocks on Linux and Xcode on Mac. Code::Blocks seems like it was designed to be familiar to VS users. It has some oddities, but there are usually pretty clear analogues between the two environments. Importing projects and configurations from VS was fairly uneventful; it’s been the little unexpected things that I’ve had to learn to deal with in Code::Blocks. For instance, the order in which library dependencies are specified matters in C::B where it doesn’t in VS. This gave me endless linker errors, and for many months before I understood the source of the problem, I just worked around it by making throwaway calls to various library functions in game code so that the linker would understand it needed to link those libraries. I had another issue for a while where Code::Blocks would eventually just sort of choke and slow to a crawl after compiling the game once or twice. This turned out to be related to the inordinate number of warnings being displayed in the output window. Simply lowering the warning level fixed this problem. (Yes, a pedant might insist that the real solution here would be to address the warnings, but I always compile with /W4 in Visual Studio and produce no warnings at all, so I say gcc is just too persnickety.) The Code::Blocks debugger has been fairly hit-or-miss for me, too, often ignoring breakpoints entirely and requiring me to break at other places and step through code to get to where I want to be. I suspect, but haven’t proven, that this may be related to spaces in filename paths, which is one of those fun “I’ll know better the next time I write an engine from scratch” issues that crops up from time to time.
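The link-order issue isn't really Code::Blocks' fault; it's how GNU ld resolves symbols from static archives, left to right. Here's a self-contained demonstration (throwaway file names, obviously):

```shell
# Build two static archives where liba depends on libb.
cat > b.c <<'EOF'
int helper(void) { return 42; }
EOF
cat > a.c <<'EOF'
int helper(void);
int entry(void) { return helper(); }
EOF
cat > main.c <<'EOF'
#include <stdio.h>
int entry(void);
int main(void) { printf("%d\n", entry()); return 0; }
EOF
gcc -c a.c b.c main.c
ar rcs liba.a a.o
ar rcs libb.a b.o

# Correct order: the dependent archive comes first, so when ld reaches
# libb.a it already knows it needs helper().
gcc main.o -L. -la -lb -o ok && ./ok    # prints 42

# Wrong order: ld scans libb.a before anything needs helper(), skips it,
# then hits the unresolved symbol in liba.a with nothing left to scan.
gcc main.o -L. -lb -la -o bad 2>/dev/null || echo "link failed"
```

Visual Studio's linker iterates until symbols settle, which is why order never mattered there; with gcc, listing dependents before their dependencies (or resorting to throwaway calls, as above) is the workaround.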

Xcode is a totally different ball of wax. It wants to work its own way with no regard for what others are doing. In my experience, it’s also a little too eager to happily build and run projects that really aren’t configured correctly, which has made it difficult to know for sure whether the applications I’m shipping will actually work on other users’ machines. It also likes to put intermediate and product files in bizarrely named folders buried somewhere inaccessible, and the options to relocate these paths to something more intuitive aren’t readily apparent. Shipping a product on Mac is a strange thing, too; Xcode really, really wants every application to be built for and deployed to the App Store, and that’s completely orthogonal to what I’m doing. As a result, the builds I produce outside of Steam are shipped in compressed disk images, which seems to be de rigueur despite the fact that OS X will spit out warnings about downloadable apps shipped this way. It’s a strange situation, where Apple seems to want all apps to go through the App Store with no regard for cases in which the App Store simply doesn’t apply.

As I’ve been wrapping up the upcoming Super Win update, I’ve found myself in a familiar scenario of having to make new builds fairly frequently as I find and fix bugs. On Windows, deploying a new build is fast. I have a single script that rebuilds all game content, compiles Steam and non-Steam builds, builds an installer, and automatically commits all changes to source control. Historically, rebuilding on Mac and Linux has always been a little more involved, as I have not had comparable scripts, and have instead had to manually open each IDE, choose the appropriate configurations or schemes, build the game twice (once for Steam and once for non-Steam), and manually package the executable and content into a .tar.gz or .dmg.

Yesterday, after repeating this process two or three times in as many days, I finally decided it was time to automate this process. As it turns out, both Code::Blocks and Xcode have fairly simple command line support, as do tar and hdiutil for assembling the compressed packages. A few hours’ work later, I have scripts that will quickly and easily produce iterative builds on each platform. There’s still a little more work I could do here to achieve parity with my Windows build script (e.g., automatically committing to source control), but it’s a good start that should save me some time in the long run.
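The Linux half of such a script might look something like the sketch below. The project and path names are hypothetical stand-ins, and the build step is shown as a comment with a placeholder binary staged in its place so the packaging half runs as-is:

```shell
# Code::Blocks supports headless builds from the command line; in a
# real script the build step would be:
#   codeblocks --rebuild --target="Release" SuperWin.cbp
# Stage a stand-in binary so the packaging steps below are runnable:
mkdir -p bin/Release Content dist/SuperWin
touch bin/Release/SuperWin

# Package the executable and content directory into a tarball.
cp bin/Release/SuperWin dist/SuperWin/
cp -r Content dist/SuperWin/
tar -czf SuperWin-linux.tar.gz -C dist SuperWin
tar -tzf SuperWin-linux.tar.gz
```

On the Mac side, the analogous pieces would be `xcodebuild -configuration Release` for the headless build and `hdiutil create -srcfolder <app> -format UDZO <name>.dmg` to produce the compressed disk image.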

What would be really cool, and I’ll have to give this one some thought to work out the details, would be if I could automatically update the Code::Blocks and Xcode projects whenever my Visual Studio projects change. Most of the time, this happens when I’ve added, removed, moved, or renamed files. Assuming C::B and XC both have support for modifying projects from the command line (and that is admittedly a big assumption to make without researching it at all), I can imagine I could write a script to watch for changes to the VS project and make the corresponding changes in the other IDEs.
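A .vcxproj file is plain MSBuild XML, so detecting file adds, removes, and renames is at least straightforward to prototype. Here's a hypothetical first step (the demo project fragment and file names are invented for illustration):

```shell
# Create a minimal .vcxproj fragment to demonstrate against.
cat > demo.vcxproj <<'EOF'
<ItemGroup>
  <ClCompile Include="Src\Game.cpp" />
  <ClCompile Include="Src\Render.cpp" />
</ItemGroup>
EOF

# Extract the list of files the VS project references; diffing this
# list between saves would reveal what changed.
grep -o 'Include="[^"]*"' demo.vcxproj | sed 's/^Include="//; s/"$//'
```

Applying the diff on the other end is the harder part: Code::Blocks' .cbp files are also XML and look amenable to the same treatment, but Xcode's project.pbxproj uses an old-style plist format that would need more careful handling.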

At the moment, I only build on Mac and Linux every once in a while during core development, so keeping the projects in sync isn’t a huge hassle to do by hand. One of my potential goals for the future, however, is to start producing more frequent in-dev builds on all platforms, and at that point, this might become a bigger deal.