I spent the majority of my early career testing in a siloed waterfall team. I didn’t know there were other ways to develop software; I assumed this was how everybody made software. Some product team on high writes PRDs (product requirements documents) and throws them over the wall to engineering. Development managers read those requirements and dole them out to teams, where a project manager assigns the tasks to developers. Our testing team was always testing: since quality was never a gating factor for release, we were perpetually busy testing hotfixes and critical escalations.
Once the development team felt the new version was stable enough to hand to QA, we would begin formal testing of our respective functional areas of the product. There were no stories and almost no functional specifications that were still valid for testing, and in a development cycle lasting several months we’d have only a few weeks to test. During those weeks more requirements docs would sneak over the wall, adding instability and uncertainty to our release. Since the schedule was often immovable, massive amounts of crunch mode would ensue.
I had never heard of acceptance test-driven development (ATDD) back then, but I frequently suggested that a tester join the design phase of development. I thought that if a tester could be there and know what the product team expected, we could write test cases and prepare for the waterfall. At the time I thought the solution to our problems lay in more communication. I had the wrong idea of what could be done with that communication (writing test plans and test cases), but I was definitely onto something.
Many years later I got the chance to spend a day learning from Elisabeth Hendrickson at her SASQAG $99 day in Bellevue, WA. She taught us ways to bring testing to the beginning of a project. We could meet with product managers and coders to have a conversation about the story being proposed for development. During that meeting we could ask questions and seek not only answers but concrete examples of what the product manager (stakeholder, or product owner) actually wanted the software to do.
This was a revelation to me. Here was what I had been seeking all those years: a way to include everybody in the process. I started to realize that in those “good old days,” when I was trying to get my QA team access to the product team, the development team was suffering the same problems we were. The reason the functional specs so often weren’t valid for testing was that during development the coders would realize some requirement couldn’t be met, or was ambiguous, so they’d code it in a way that would work, invalidating the specification for testers. The document no longer represented the expected behavior of the system.
If our developers could have that conversation with the product team, and the testing team could be a part of it, we could collaborate on implementation and test the customer’s expectations up front, instead of wasting time writing code that diverged greatly from the original vision, or reworking code because of serious functional problems that weren’t uncovered until the pre-release crunch.
The notion of a story workshop as a means to gain shared understanding blew my mind. This was what I had been seeking all along. Let’s behave as though we’re all on the same team with the same goal: delivering customer value. Let’s get together at the beginning of the project to reduce wasteful rework. Instead of scrubbing and de-prioritizing a mountain of bugs found late, let’s use communication to reduce the number of bugs early.
The story workshop was deceptively simple. The story Elisabeth used in class was: “As an administrator I want users to be required to choose secure passwords so that their accounts cannot be hacked by someone using a password-guessing program. And if the user provides an insecure password, display an error message.” We wrote it on the whiteboard.
Then we have a conversation. The context-driven tester might ask, “What does ‘secure’ mean to you?” The product owner clarifies: “a minimum of 6 characters with at least one letter, a symbol, and a number.” With this information we can come up with a set of concrete examples. We can put those examples in a Cucumber-compatible “Given – When – Then” format, or express them in a table listing valid and invalid passwords with their expected results. The whiteboard quickly becomes a great visual radiator of expectations. Somebody might ask whether the user should be able to enter a long encrypted password. The coder might uncover some implementation detail that could impact the way the story works. Sometimes even the product owner realizes there is something else they’d really like. Perhaps they decide they want to allow a single sign-on feature so that the user is authenticated directly from their local network.
The single sign-on might seem like a tangential rabbit hole, but we solve that problem by agreeing to break it into another story for another workshop later. For this workshop we focus only on concrete examples of basic acceptance for the story at hand, like the sketch below. These examples keep the story within a defined scope, and with the shared understanding from the workshop we can go into development confident that we’ll have fewer difficulties building the feature.
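To make that concrete, here is a sketch of how those whiteboard examples might look in Cucumber’s Gherkin format. The specific passwords, step wording, and table layout are illustrative assumptions, not what any real workshop produced:

```gherkin
Feature: Secure passwords
  As an administrator
  I want users to be required to choose secure passwords
  So that their accounts cannot be hacked by a password-guessing program

  Scenario Outline: Choosing a new password
    Given the user is creating an account
    When they choose the password "<password>"
    Then <result>

    Examples: Secure passwords (at least 6 characters, with a letter, a number, and a symbol)
      | password  | result                   |
      | abc12!    | the password is accepted |
      | P4ss&word | the password is accepted |

    Examples: Insecure passwords
      | password | result                        |
      | abc123   | an error message is displayed |
      | ab1!     | an error message is displayed |
      | password | an error message is displayed |
```

Each table row is one concrete example from the conversation, so the whole whiteboard fits on a single page anyone can read.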
When we began to discuss tools like Cucumber and FitNesse for turning those specifications into executable acceptance tests, things got really exciting. The idea that the results of these conversations could live on as living documentation, and even be merged into a release branch to act as basic acceptance-level regression checks, added great value to a process I already felt would impact software quality immensely.
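As an illustration of what “executable” means here, below is a minimal sketch of the glue code that could run the scenario above. It uses Python’s behave library, one Cucumber-style framework among many (not necessarily the tooling described in this post), and the `is_secure_password` validator is a hypothetical stand-in for the real system under test:

```python
# Minimal step definitions for the Gherkin sketch above, using the
# "behave" library. Run with the `behave` command; each row of the
# Examples tables becomes one test.
import re
from behave import given, when, then

def is_secure_password(password):
    """Encodes the product owner's rule from the workshop:
    at least 6 characters, with a letter, a number, and a symbol."""
    return (len(password) >= 6
            and re.search(r"[A-Za-z]", password) is not None
            and re.search(r"[0-9]", password) is not None
            and re.search(r"[^A-Za-z0-9]", password) is not None)

@given('the user is creating an account')
def step_creating_account(context):
    # Start each scenario with no error recorded.
    context.error = None

@when('they choose the password "{password}"')
def step_choose_password(context, password):
    if not is_secure_password(password):
        context.error = ("Password must be at least 6 characters and "
                         "include a letter, a number, and a symbol.")

@then('the password is accepted')
def step_accepted(context):
    assert context.error is None

@then('an error message is displayed')
def step_rejected(context):
    assert context.error is not None
```

The point is that the examples from the conversation run unchanged; only this thin layer of glue connects them to the code.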
Since that day with Elisabeth I’ve worked to implement this process with the team at VolunteerMatch. My experience so far has been incredibly positive. Our product managers were excited to have story workshops where we could learn what they wanted and offer our input early in the project. Story workshops are still very new for us, and while we’ve managed to automate some of the acceptance criteria we’ve identified, we’ve still got a long way to go.
One thing I’ve learned is that there is peril in the automation aspect of this process. The power of ATDD is in the story workshop; it’s the collaboration and the shared understanding that give us the most value. The automated executable specifications and acceptance tests are a delightful byproduct of that shared understanding. The point of ATDD is collaborating to create customer value. If we threw away the automation piece, we’d still get great value from the shared understanding of the story workshop.
One obstacle to story workshops can be the perceived overhead associated with having meetings to discuss stories before development starts. My answer to those concerns is to point out that as an organization we often pay for a lack of shared understanding. When a bug is filed days before the release deadline and a conversation happens late about whether the specification is wrong or the programmer’s interpretation is wrong, it costs far more than the story workshop that would have prevented that confusion. An ounce of prevention may be worth more than a pound of cure, but the prevention is a cost you choose and the cure is a cost that is thrust upon you.
ATDD is just one powerful part of a larger whole-team approach to software development. Writing unit tests, practicing test-driven development, pairing coders with testers, product managers with coders, and testers with product managers, and doing extensive customer development are all good practices for product development teams to apply to their process.
Ultimately I believe that people working together is the most important aspect of this process. In asking product managers, testers, and coders to work together, we give ownership of the whole process to the whole team, and in doing so we significantly improve the odds of achieving our shared goal: delivering customer delight.
Adam Yuret: After 8 years at WebTrends testing an enterprise-level SaaS data-warehousing product, which included building and maintaining a large-scale testing environment, Adam now works as an “army of one” tester for VolunteerMatch, a national nonprofit organization dedicated to strengthening communities by making it easier for good people and good causes to connect. He is currently building a testing process for a project transitioning to an agile development methodology, helping build a collaborative product development team.