Storytesting


Capture a story with acceptance criteria that validate its implementation

Storytesting is the process of providing input data, initiating the process that corresponds to the story being tested, and comparing the actual output with the expected output at the end of that process.

After developers implement a story, they run storytests to verify that their implementation does what the customers expect. Only after storytests pass will customers mark a story as complete. All of the storytests for a system represent a contract, written by customers, that the system must obey.

Storytests are most useful when automated, as this empowers customers and developers to launch them at the press of a button and discover the system's state. Coming up with the input data and expected output data for automatable storytests typically involves domain knowledge. Finding the right combination of tests is also important, and the contribution of professional testers can be substantial in this regard.
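As a sketch of what such an automated storytest might look like: each test supplies input data, runs the process the story describes, and compares the actual output with the customer's expected output. The discount rule and the `apply_discount` function here are hypothetical, invented only to illustrate the input/process/expected-output shape.

```python
def apply_discount(order_total):
    """Hypothetical system under test: orders of $100 or more get 10% off."""
    return order_total * 0.9 if order_total >= 100 else order_total

# Each row is one storytest: (input data, expected output), written from
# the customer's point of view.
storytests = [
    (99.99, 99.99),    # just below the discount boundary
    (100.00, 90.00),   # at the boundary
    (250.00, 225.00),  # general case
]

for order_total, expected in storytests:
    actual = apply_discount(order_total)
    assert round(actual, 2) == expected, (
        f"{order_total}: got {actual}, expected {expected}")
print("all storytests passed")
```

Because the tests are just data plus a button to push, customers can read the table of inputs and expected outputs, and anyone on the team can run the suite on demand.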

Storytesters are usually:
  • Domain Experts (DE)
  • Subject Matter Experts (SME)
  • Quality Assurance Experts (QA)
We have found the combination of SMEs/DEs and QA folks to be highly effective in producing good storytests.


The best time for writing storytests is usually after Iteration Planning, when the stories to be worked on are identified. The storytests should cover the scope of the stories being implemented in that iteration. They may be written initially on a whiteboard while the test is being researched.

Thereafter, they are captured in electronic form, and checked into version control. At this point, the storytests should be accessible to the entire project community in an executable form, so as to provide Executable Documentation of the system.

Don't Dial Every Number

To test whether a mobile phone works correctly, you don't need to dial every number in the telephone book. Dialing a couple is sufficient to show that the phone works for good numbers. Add a few wrong numbers into the mix to verify that you hear an error message, and you have covered two major boundaries of the system. Other boundaries will merit individual tests, such as emergency numbers (like 911).

Storytesting involves identifying the minimal tests that will cover all boundary conditions by exercising all logic pathways.

e.g. The table below describes the monthly access plan of a mobile phone provider.

Monthly Access | Included Anytime Minutes | Bonus Anytime Minutes | Home Airtime Per-Minute Rate After Allowance
To test that a system accurately implements this plan, we'd need storytests to compute the bill amount for a customer who has used up:

  1. regular minutes lower boundary – 0 minutes
  2. regular minutes general – 1–499 minutes
  3. regular minutes upper boundary – 500 minutes
  4. bonus minutes lower boundary – 501 minutes
  5. bonus minutes general – 502–799 minutes
  6. bonus minutes upper boundary – 800 minutes
  7. chargeable minutes lower boundary – 801 minutes
  8. chargeable minutes general – 802 minutes to …

Eight tests are sufficient to prove that the system computes the bill accurately. More tests could further strengthen the customer's confidence, but they should be recognized as redundant. Redundant tests can slow the team down by making the suite take longer to run. In addition, they can reduce the effectiveness of storytests as Executable Documentation by obscuring the system's logic.
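The eight boundary tests above could be sketched as a small automated suite. Note the assumptions: the 500-minute and 800-minute boundaries come from the plan, but the monthly fee ($39.99) and the per-minute rate after allowance ($0.45) are invented here purely for illustration.

```python
MONTHLY_FEE = 39.99     # assumed monthly access fee (illustrative)
INCLUDED_MINUTES = 500  # included anytime minutes
BONUS_MINUTES = 300     # bonus anytime minutes (minutes 501-800)
OVERAGE_RATE = 0.45     # assumed per-minute rate after allowance (illustrative)

def monthly_bill(minutes_used):
    """Bill = monthly fee plus overage for minutes beyond 800 (500 + 300)."""
    chargeable = max(0, minutes_used - (INCLUDED_MINUTES + BONUS_MINUTES))
    return round(MONTHLY_FEE + chargeable * OVERAGE_RATE, 2)

# One storytest per boundary identified above: (minutes used, expected bill).
storytests = [
    (0,   39.99),   # 1. regular minutes lower boundary
    (250, 39.99),   # 2. regular minutes general
    (500, 39.99),   # 3. regular minutes upper boundary
    (501, 39.99),   # 4. bonus minutes lower boundary
    (650, 39.99),   # 5. bonus minutes general
    (800, 39.99),   # 6. bonus minutes upper boundary
    (801, 40.44),   # 7. chargeable minutes lower boundary: 1 min x $0.45
    (950, 107.49),  # 8. chargeable minutes general: 150 min x $0.45
]

for minutes, expected in storytests:
    actual = monthly_bill(minutes)
    assert actual == expected, f"{minutes} min: got {actual}, expected {expected}"
print("all 8 boundary storytests passed")
```

Adding a ninth test at, say, 700 minutes would exercise no logic pathway that test 5 does not already cover; it would be redundant in exactly the sense described above.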

If you must have redundant tests, consider moving them to your regression test suite. Regression tests exercise your system exhaustively. They take longer to run, and hence are not run as often as unit, smoke, or storytests.


  1. Storytests should cover all logic pathways of a system.
  2. If you have too many test cases, examine whether all of them are needed. Should some of them be in regression test suites instead?
  3. If a storytest is too hard to implement, check that you are not trying to do too many things at once. Breaking up a test into smaller tests can be helpful.
  4. When a test is too hard to implement, hold brainstorming sessions with all parties involved (QA, Development) and think out of the box to simplify your test.



Copyright © 2004 Industrial Logic, Inc. All Rights Reserved.