How does Agile Methodology Relate to Continuous Testing?
Doing just about anything takes some degree of planning. How do you get from A to B? If you’re talking a simple trip, you look at a map, gas up your car, and set off. If you’re building a picnic table, you grab a blueprint, make sure your tools are sharp, and start sawing. Or if you’re making soup, you look at a recipe book and get cooking (or ignore it and add what you wanted in the first place).
But what if there’s no map, no blueprint, and no recipe? How do you plan to build something that’s never been built before?
There are two options:
1) The charge-in method. You say, “I’m confident I’ve done enough similar things that I can write out all the steps in advance. We’ll plan it, we’ll do it, we’ll launch it. All the pieces will fall into place as we go. Everything will work out fine.”
2) The analytical method. You say, “Hey, I haven’t done this before. Nobody has. So it’s probably best if we leave a little early, and keep our eyes open. We’ll likely run into traffic, bad weather, or detours along the route… so even though we know where we want to end up, we’ll have to adapt and make decisions as we go.”
Sound familiar? That’s because option one is Waterfall Methodology and option two is Agile Methodology.
Mapping Your Course
Imagine planning a road trip using the Waterfall method, with no adaptations allowed. “Go two miles then take a hard right” may seem like a fine plan in theory, but it won’t get you anywhere when you take that corner and discover a roadblock.
A plan is only as good as the information you have at any given moment. To arrive at your destination, you have to be aware of your surroundings, and adaptable to new insights gained along the way.
With few exceptions, software projects take unknown routes and build things that have never been built before. The fastest, lowest-risk way to success? Take a step, look around, figure out the next best step, and repeat.
Staying on Track
Of course, building unique software with constant adaptations and a fresh team of people at the wheel comes with a healthy dose of challenges – and that’s where Continuous Testing comes in. Testing is the best way to find out what’s working, what’s not, and how fast you’re traveling. It also keeps your eyes on the road and ensures you never stray too far off-track.
Is it close to being done? Does it work? Will it break in the future? Where can it be improved? Not only does testing answer these questions, but it helps ensure bugs are spotted and fixed before the software is shipped to your customers.
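The questions above are exactly what automated tests answer on every change. As a minimal sketch (the `apply_discount` function and its behavior are invented purely for illustration), a unit test like this can run automatically on each commit, catching regressions long before the software ships:

```python
# A minimal, hypothetical example: a small pure function plus a test
# that answers "does it work?" automatically on every change.

def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# In a real project this would live in a test file executed by a CI
# pipeline (e.g. via pytest) on every push.
def test_apply_discount():
    assert apply_discount(100.0, 25) == 75.0
    assert apply_discount(19.99, 0) == 19.99

test_apply_discount()
```

Run continuously, a suite of checks like this is what keeps "will it break in the future?" from becoming a question your customers answer for you.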
From a business perspective, testing is one of your highest-ROI activities. By focusing on Continuous Testing, you’ll be able to make better decisions quickly and gain far more confidence in your product (just as your customers will gain confidence in you). And this is why we strongly promote testing and feedback as part of any Agile process.
Where Agile Methodology Came From
Software development as a profession is relatively new. It’s a field pioneered by people like Alan Turing, John Backus and Grace Hopper – pioneers recent enough that Grace Hopper appeared on Late Night with David Letterman.
As a new field, everyone had to learn on the job, regardless of their expertise and experience in other areas. Early developers were adults in their 30s or 40s before they even saw a computer; they had no schools or training manuals, and had to take inspiration from everywhere and everything, from ancient logic-driven academics and philosophers to radio operators and tradesmen.
As such, the never-been-solved problems faced by the first generation of computer scientists required an organic approach. Building, testing and feedback all blurred together in a maelstrom of creativity, destruction and progress. One idea could take them from nothing to something, invalidating promising paths and past achievements – and the right idea could come from anyone.
Everything was explored and learned as it happened; and the unprecedented nature of these projects, coupled with their importance, meant time (and budgets) were of little consequence.
A New Generation, Over A Generation Ago
Within a few short years, the computer science landscape became unrecognizable. Hardware and software had split into two separate disciplines, and machines could be reprogrammed in countless ways to adapt to our ever-changing needs.
Then came the biggest change, with computers becoming something you could buy. These were complex and often highly-customized machines, and every mainframe required a small army of software developers to prop it up and feed it meticulously hand-crafted code. To grow, businesses wanted these computers – and to run them, they needed more software engineers than they had access to.
By the 1970s, universities were attempting to churn out computer science graduates to keep up with this new demand. But there simply weren’t enough teachers, which led to the mass hiring of new faculty to teach the next generation of computer programmers. The fact that many of them had only a rudimentary knowledge of computers was inconsequential; something was better than nothing. A few years later, tens of thousands of newly-minted software developers hit the workforce.
These young developers of the 1970s had what every fresh college hire has: an abundance of energy. But between poor teaching and rapidly evolving technology, they often lacked the abilities and discipline of the generation who came before them.
As a solution, managers implemented rigid linear thinking, with regimented planning and building coming before testing. Often with lucrative government contracts in mind (and matching 1950s haircuts), these managers enforced early Waterfall Methodology and businesses celebrated solving the problem of how to properly develop software in a way that met deadlines and stuck to budgets.
Waterfall Was a Band-Aid
Except, projects still failed, still ran over budget and still didn’t live up to scoping expectations. The reason for this failure is the fundamental flaw of Waterfall Methodology – we don’t know, and can’t predict, everything up front.
No amount of planning results in perfectly accurate estimates, and few (if any) plans are so perfect they can’t be improved as you go. But the regimented nature of Waterfall doesn’t allow for either of these facts.
Here’s a very simplified example. Imagine your goal is to count 1,000,000 cans, and the original plan is to count them one-by-one. The sales team sells the client on this approach, and you’re told to get a move on: the client has aggressive due dates! Then, at around can 10, you realize all the cans weigh the same – so instead of counting, you can weigh. Weigh 10 cans to get a reference, then weigh larger batches against it: blocks of 10, then blocks of 100, and so on. You’ve found a great shortcut. It deviates from the original plan, but it’ll get you there faster. So what do you do?
If you’re a Waterfall developer in this scenario, you can’t just adapt to the newly discovered counting method. Instead, you raise your idea to your manager, who then writes a change order. The client then asks for an impact assessment, and that assessment then needs to be completed, reviewed and approved by everyone in the chain of command. And you’re told, “Keep counting the cans one-by-one, just in case we have to deliver on the original contract.” Eventually, in a few weeks or months, the change is approved through legal and becomes the new contract that you’re building towards. You can finally stop counting cans one-by-one! Unfortunately, by that point, you’ve either counted all the cans already (at a much higher cost), or you’ve become frustrated and left the company, leaving them back at square one.
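The shortcut in the can example is just arithmetic: once you know every can weighs the same, a single weighing replaces a million individual counts. A sketch, with made-up weights:

```python
# Sketch of the "count by weighing" shortcut from the can example.
# All numbers here are invented for illustration.

CAN_WEIGHT_KG = 0.4  # weight of one can, measured once as a reference

def count_by_weighing(total_weight_kg: float) -> int:
    """Estimate how many identical cans make up a pile of a given weight."""
    return round(total_weight_kg / CAN_WEIGHT_KG)

# One weighing of the whole pile replaces 1,000,000 individual counts.
print(count_by_weighing(400_000.0))  # 1,000,000 cans
```

The point isn’t the code, of course – it’s that the better method was only discoverable after the work began, which is exactly the kind of mid-course insight Waterfall makes so costly to act on.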
Unfortunately, there’s more to success than just having a plan.
Avoid The Domino Effect
The next real flaw in Waterfall is that it’s a system designed to pass failure forward. The planners and designers fail the developers; the developers fail the testers; and the testers fail the customers. But how does this happen?
With a number of siloed teams working in a rigid environment, incremental flaws soon add up – and when you add inflexible deadlines and budget pressures into the mix, it’s a perfect breeding ground for small issues to grow into serious problems.
For example, the sales team and solution architects cut a few corners to come up with a plan that’s delivered on time and looks great on paper; it’s then flung to the development team who soon uncover a snag caused by incomplete or mismatched requirements that were glossed over in planning. But due to the time pressure on the developers, a few more corners are cut, another layer of gloss is applied, and the project comes in almost on budget. By the time the product reaches testing, its flaws are clear; endless meetings begin, fingers get pointed, budgets get blown and customers get let down.
Sounds like the stuff of nightmares, right? But it’s very much a reality. A lot of projects fail; in fact, some estimates put the number at 70%. Luckily, all of this is avoidable if smaller steps are taken, if the right feedback is given along the way, and if everyone is allowed to learn and adapt requirements to fit the current best understanding at any given point.
The Agile Manifesto
After spending their careers watching their best efforts ruined by poor processes, a group of second-generation developers got together in 2001 to come up with a set of principles they felt would help the industry. They were determined to leave things better than they found them; and so the Agile Manifesto was born, in an attempt to harness a collective desire for teamwork, quality and discipline.
“We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
- Individuals and interactions over processes and tools
- Working software over comprehensive documentation
- Customer collaboration over contract negotiation
- Responding to change over following a plan
That is, while there is value in the items on the right, we value the items on the left more.”
“[Agile] is often thought of as a process, but it’s not a process. It requires attention to detail. You have to work in fixed-time boxes, you have to collaborate with the customer, you have to do continuous integration. These are disciplines, not process steps. These are promises you make, they are not tasks to follow.”
This is a method that openly acknowledges where you are right now in your journey, takes purposeful small steps towards defined goals, and gains confidence along the way by constantly seeking feedback.
Agile Methodology isn’t something that was coined by Amazon, Google, Facebook or any other modern behemoth. It was a veteran group of technology experts reflecting on their careers and aligning on what made the difference between failure and success. It was an act of rebellion against the hegemonic Waterfall structure that forced them to work inefficiently on projects that were doomed from the start. And it’s a return to honest, results-focused, creative problem-solving.
About the Continuous Testing Advocate Series
This post is part of an ongoing series dedicated to helping turn all managers, business analysts, designers, creatives, developers, support staff, and testers into Continuous Testing Advocates. Continuous Testing is an integral, but historically under-utilized, part of any modern software development life cycle. You can help your team by promoting concepts, processes, and technology. To learn more, check out the Continuous Testing Advocate Series page on ContinuousTesting.com.