Js13kGames retrospective (part 1)
29 September 2013
A common problem with software projects is that deployment doesn’t get sufficient thought until near the end. It’s something that should be considered throughout the project (and ideally practised throughout, via Continuous Deployment), so that by the time you get to the end it’s as close to a one-button process as possible.
The very first thing I did after deciding to enter the competition was to pester the organiser, Andrzej, with some questions about exactly how the game should be packaged for submission (the rules for the server category were slightly complicated). I then created a single grunt task that took care of everything (including bundling, minification, creating a server package with no dev dependencies, zipping up client and server code), and produced the exact things I needed to be able to submit the game:
- An output directory containing just what was needed to deploy the game, with a single command, to nodejitsu
- Zipped client and server files for submission to the competition website (with the console output including the size of each file, so I could quickly check that I was still within the limit)
Admittedly, the challenge was probably easier here than on most full-scale software projects, but the point still stands: it’s well worth putting in the effort up front to avoid a last-minute panic about delivery.
In the end I stayed up pretty late polishing my game entry, and was rather tired when it came to submitting it. Such was my state of mind that I managed to get the captcha on the submission form (which was something like “9 + 5 = ?”) wrong on my first attempt. It was a good job that I’d made sure it was pretty much impossible to get any other part of the submission process wrong.
TDD isn’t the first thing that springs to mind when you think of time-boxed coding competitions, and I can’t honestly say that I wrote the entire codebase test-first (or even test-last in places), but this approach was invaluable for parts of it…
Co-ordinating between shared models running on clients and a server is frightfully complex (see below), and it was only by sketching out the nasty edge cases and concocting tests for them that I had any hope of producing something robust. Having tests in place before trying to write the production code saved a lot of fumbling around in the dark.
The test coverage level across the codebase is somewhat inconsistent, but it’s close to 100% for all of the shared model code, and has been so since quite early on in the project. Working under tight time constraints for the competition made this testing, if anything, more worthwhile. As with packaging, having a one-button setup for running tests and static analysis (JSHint caught a few genuine bugs) was very helpful indeed.
I actually started writing this game as a Ludum Dare entry (the weekend of the 27th Ludum Dare fell within the month that Js13kGames was running). I didn’t manage to finish something presentable in time for the 48-hour Ludum Dare competition. This was partly due to other time commitments, but also due to failing to prioritise work on the project well enough and getting distracted by things that weren’t on the critical path.
I didn’t find out about Js13kGames until about halfway through its run, so I was still a bit pressed for time. It was only by maintaining a backlog of features, shaping it regularly, and being disciplined about adding new ideas to the backlog rather than letting them sidetrack me that I managed to finish something sufficiently presentable by the end of the competition.
One of the first things I had to do when re-purposing my game for Js13kGames rather than Ludum Dare was to tear out all of those pesky third-party libraries. Getting rid of jQuery and Twitter Bootstrap was frankly a bit of a chore, as they had been useful productivity boosters. However, removing the 2D graphics library, pixi.js, turned out to be a big benefit to the project.
Although I had familiarised myself with pixi.js’s API docs, I didn’t really understand what the library was doing underneath, and so ended up doing some obviously dumb stuff (like effectively redrawing everything each frame). When I removed pixi.js, I spent a modest amount of time researching and reading a few great articles on Canvas (including those from Mozilla, HTML5 Rocks and Dive Into HTML5), and ended up producing something much better.
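Redrawing the whole canvas every frame is the classic Canvas performance mistake. A common alternative is to track the “dirty” regions that changed and clear/redraw only their bounding box; the helper below is a hypothetical sketch of that bookkeeping (the game’s actual drawing code differed):

```javascript
// Hypothetical dirty-rectangle bookkeeping: instead of clearing the
// whole canvas each frame, accumulate the regions that changed and
// compute the single bounding box that needs clearing and redrawing.
function boundingBox(rects) {
  if (rects.length === 0) return null; // nothing changed: skip the redraw
  let minX = Infinity, minY = Infinity, maxX = -Infinity, maxY = -Infinity;
  for (const r of rects) {
    minX = Math.min(minX, r.x);
    minY = Math.min(minY, r.y);
    maxX = Math.max(maxX, r.x + r.w);
    maxY = Math.max(maxY, r.y + r.h);
  }
  return { x: minX, y: minY, w: maxX - minX, h: maxY - minY };
}

// Two small sprites moved; only a 30x30 region needs clearing,
// e.g. via ctx.clearRect(dirty.x, dirty.y, dirty.w, dirty.h).
const dirty = boundingBox([
  { x: 10, y: 10, w: 10, h: 10 },
  { x: 30, y: 30, w: 10, h: 10 }
]);
```

Understanding a trick like this is only possible once you know what the canvas is actually doing each frame, which is precisely the point of the next paragraph.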
Of course, if you do already have a good understanding of Canvas or WebGL, then pixi.js is a great library for working with these technologies and could save you a lot of time. I would probably use it again now that I know more.
This was an important reminder for me that you should typically know how things work one level of abstraction below where you’re operating (there’s a great blog post on this by Scott Hanselman). The highest-level libraries you use should be a productivity tool, not a crutch or an excuse for not understanding what’s going on at a slightly lower level.