We labor under the invalid premise that what works in a traditional factory setting also works in software application development. We have chosen to believe that building software is a defined process in which we first identify the product requirements and the system requirements and then move forward to analyze, design, code, and (finally) test -- a method historically called the "waterfall" approach.

The roots of this approach reach all the way back to gentlemen like Frederick Taylor (1856-1915), who championed workflow engineering, and Henry Ford (1863-1947), who perfected assembly-line concepts -- approaches that were quite successful during the 20th century and, understandably, were then adapted and applied to software application development.

As software application development became more and more prevalent in the 1970s and 1980s, the adaptation of these ideas was to be expected, giving birth to the concept of the waterfall, where development begins with requirements gathering, continues through analysis, design, and coding, and finally ends with testing. As early as 1970, however, Dr. Winston Royce cautioned, "the testing phase...is the first event for which [the software is] experienced as distinguished from analyzed. Yet, if [the experience fails] to satisfy the various external constraints, then invariably a major redesign is required."¹

Agile Development embraces this thinking and strongly suggests that all aspects of software development, from requirements gathering to testing, are continuous and constant, rather than done in phases. Thus, when we look at the activity of testing in an Agile organization, we realize that testing is NOT a separate activity done after coding, but rather a perspective on software application development that is applied continuously throughout the development process. In other words,

  1. As we gather product (system) requirements, we should begin to consider how we would test those requirements and, conversely, their practical limitations, so that we test the product properly and reasonably exhaustively.
  2. As we determine software requirements, we must consider the testability of the application as well as how the application is to be constructed. If we build the application in a way that hampers testability, we increase the overall costs of ongoing proper testing and very likely decrease the attainable level of quality of the application. We also consider the test infrastructure with the same attention we pay to the application infrastructure.
  3. As we analyze features, we consider the acceptance criteria and identify the ways the application should work as well as the many ways it might fail, so that we can ensure it is protected against them (test case analysis).
  4. As we design features, we design tests that work within the defined test infrastructure, satisfy the identified test cases, and even explore conditions that have not yet been identified.
  5. As we code and test features, we build test scripts and test data beds, along with ways to create product environments that allow for clean execution of test scripts and minimize false-positive and false-negative results. We explore how the product works with the newly added or modified code, and if we find aberrations in the code, we fix the code and firm up the tests to ensure that the same issue cannot occur again. We run our tests again and again and again, making sure that every feature previously completed continues to behave as expected.
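The test-alongside-code practice in step 5 can be sketched with a minimal example; the function, its behavior, and the test names here are hypothetical illustrations, not code from any particular product:

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical feature code: apply a percentage discount to a price."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    """Tests built as PART of the feature, rerun every time the code changes."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_no_discount(self):
        self.assertEqual(apply_discount(80.0, 0), 80.0)

    def test_invalid_percent_is_rejected(self):
        # A test "firmed up" after finding an aberration: out-of-range input
        # must be rejected rather than silently producing a wrong price.
        with self.assertRaises(ValueError):
            apply_discount(50.0, 150)

if __name__ == "__main__":
    unittest.main()
```

Because the tests live beside the feature code, any later change that breaks the discount behavior is caught the next time the suite runs, rather than at the end of the development effort.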

Testing, therefore, is not a phase or an activity, but rather a way of looking at the application you are building. How will you, efficiently and effectively, ensure that the application continues to work from day to day, even though tens or perhaps hundreds of people are adding to and changing your product code daily? In Agile Development, you answer that question with testing and tests that are built as PART of the application, rather than at the END of the application development effort.

¹ Royce, Winston (1970), "Managing the Development of Large Software Systems", Proceedings of IEEE WESCON 26 (August): 1–9.
