
    TDD and the Half-Life of Information

    by Daniel Pritchett

    When one programmer questions another's proposed feature idea with "YAGNI!" she's just trying to help everyone stay honest with themselves: the design path being followed is drifting away from the observable needs of the project. Information half-life may not be a concept we discuss often in software, but there are plenty of related tropes we do revisit day after day: Just-in-time compilers. Lazy evaluation. Continuous integration. Premature optimization. Lean startups. Iterative development. YAGNI.

    Each of these concepts is an application of the same basic idea: the older your plans and your intel, the farther removed they are from reality. This is why waterfall projects and eighteen-month Gantt charts devolve into a painful kabuki where most participants go through the motions without any real expectation of success. This is why I like having a test suite that runs itself continuously while I code.

    Mission-critical applications

    At Coroutine we build mission-critical applications that other companies can build their businesses around. This means we have to build them well, in a timely fashion, and within budgetary expectations. Since old specs and old budgets are inherently imperfect, we have to realign our expectations continuously: regular client communications, daily stand-ups, and test suites that run throughout the work day, continuously verifying that our applications are growing in the way we think they are without sacrificing existing features and expectations.

    Ruby on Rails has proven to be a solid choice for this type of central business-driving application. It allows us to get the rough outline of a system up and running relatively quickly. From there we work with clients to reshape plans and features and fill in the gaps and deliver the bespoke solutions our clients' businesses need.

    Since we're on Rails, I like a certain set of tools to keep my tests running smoothly. I'll be the first to admit that each of these tools was recommended by my coworkers and I'm not yet branching out to develop my own tastes, but this is what I'm using to get things done:

    Tests that run automatically

    Guard. Every time I save a file, the test suite runs from the current file outwards to the entire test suite (depending on my configuration and whether any tests have failed recently). This means I'm never more than a few seconds away from learning whether my current bit of code is doing what I think it's doing. This is me minimizing my feedback loops. Accepting the half-life of my information.
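    The post doesn't show a configuration, but a minimal Guardfile sketch (assuming the guard-rspec plugin and a standard Rails layout) illustrates the "current file outwards" behavior described above:

    ```ruby
    # Guardfile -- a minimal sketch, assuming guard-rspec and conventional
    # Rails paths; adjust the patterns to match your project layout.
    guard 'rspec' do
      # A changed spec file reruns just that spec
      watch(%r{^spec/.+_spec\.rb$})

      # A changed app file reruns its matching spec
      watch(%r{^app/(.+)\.rb$}) { |m| "spec/#{m[1]}_spec.rb" }

      # A changed spec helper reruns the whole suite
      watch('spec/spec_helper.rb') { 'spec' }
    end
    ```

    The `watch` blocks map a saved file to the specs Guard should run, which is what keeps the feedback loop down to a few seconds.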

    Tests that run quickly

    Spork. Ruby tests are sent to an always-on daemon that keeps my application's environment loaded and up to date. No more ten-second Rails environment load time before a thirty-second test run.
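    The split between slow, load-once setup and fast, per-run setup is the whole trick. A sketch of a Spork-aware `spec/spec_helper.rb` (assuming the spork gem in a Rails project) looks like this:

    ```ruby
    # spec/spec_helper.rb -- a minimal sketch of Spork's prefork/each_run split
    require 'spork'

    Spork.prefork do
      # Runs once when the Spork server boots: put the slow environment
      # loading here so individual test runs don't pay for it.
      ENV['RAILS_ENV'] ||= 'test'
      require File.expand_path('../../config/environment', __FILE__)
      require 'rspec/rails'
    end

    Spork.each_run do
      # Runs before every test run: keep this block as fast as possible,
      # e.g. resetting state that must be fresh for each run.
    end
    ```

    With the server running (`spork` in one terminal), `rspec --drb` sends specs to the preloaded environment instead of booting Rails from scratch.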

    Tests that do exactly what they say on the tin

    RSpec. Shoulda. These descriptive testing tools speak for themselves.

    describe "A blog post about Ruby TDD with Guard, Spork, and RSpec" do
      context "Guard keeps the tests running automatically" do
        before { raise NotImplementedError unless Guard.present? }

        context "Spork preloads the environment for faster test runs" do
          before { raise RuntimeError unless Spork.present? }

          it "should use RSpec" do
            expect { RSpec.configure }.to_not raise_error
          end

          it "should use Shoulda" do
            (2 + 2).should == 4
          end
        end
      end
    end
    Software that works

    I realize this post covers well-trodden ground for TDD devotees, but I really enjoy the work we do here. The constant feedback from my test suite is empowering: I get to make visible progress every day and share it with clients who are just as happy about it as I am.