Don’t Let “Undone” Documentation Delay Software Project Delivery

The Agile blogosphere talks about the importance of coming up with a common understanding of what is meant by “done” within an Agile project. This “Definition of Done” should define all steps necessary to deliver a finished increment with the best quality possible at the end of a sprint. Typically, team progress toward achieving “doneness” is described in terms of sprints, story points, velocity, work (done, undone and remaining), burn-down/burn-up charts, and delivery/release dates. It is easy to inadvertently assume that “doneness” applies only to the target software functionality. However, “doneness” also applies to internal project/process documentation and any external, deliverable documentation. Negative impacts of “undone” documentation typically include extra project costs, incomplete/delayed delivery, and likely unmet expectations with respect to quality, completeness, usability, and accuracy.

“Definition of Done”

Be sure that your software project’s “Definition of Done” includes entries to address both internal and external documentation. Internal documentation should include those artifacts necessary to define, describe, and manage the project, such as User Stories, Tasks, Acceptance Criteria, and Test Cases. External documentation would include artifacts that are project-required deliverables, such as User Documentation. You may modify your definition of “doneness” as your project evolves from sprint to sprint, building on valuable lessons learned in one sprint and applying them to subsequent sprints.

User Stories

User Stories contain the user’s requirements for what he/she wants the project to accomplish. They represent the user’s perception and expectation of the world-to-be. To be most effective, a User Story should focus on the user, facilitate a conversation, be simple and concise, have actionable Acceptance Criteria, and be testable. The story should be written from the user’s perspective and employ personas or actors such as “account holder”, “applicant”, or “approver”. To maintain consistency and convey the same focus, the story should follow a simple, consistent, concise format such as:

“As an <actor>, I want <goal> so that <benefit>”.
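
For example, a hypothetical story in this format might read: “As an account holder, I want to view my recent transactions so that I can track my spending.”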

You want to avoid creating ineffective User Stories that may lead to gaps between what the user is looking for or expecting and what is ultimately provided. Expectation gaps, whether real or perceived, may lead to delayed acceptance or, worse, non-acceptance of the functionality described by the User Story.

An incompletely-stated User Story is clearly “not done”.

Acceptance Criteria

Once you have expressed the “what and why” of the User Story, you will need to provide actionable, testable Acceptance Criteria to demonstrate that the User Story has been successfully achieved. These criteria are the measures of whether the requirement represented by the User Story has been satisfied. They may express functional and non-functional requirements, expected acceptable quality, or required performance.

Effective Acceptance Criteria should be expressed clearly and simply, leaving no room for misinterpretation. They should have clear pass/fail outcomes with respect to what is considered minimal marketable functionality or minimal quality required by the User Story. Acceptance Criteria represent “conditions of satisfaction.” There is no partial acceptance; either a criterion is met or it is not.
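
For the hypothetical transaction-history story above, a criterion in the common “given/when/then” form might read: “Given an account with transactions both newer and older than 30 days, when the account holder opens the transaction view, then only transactions from the last 30 days are displayed.” The outcome is binary: either the view shows exactly those transactions or the criterion fails.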

Acceptance Criteria are likely to lead to incomplete/ineffective testing of the User Story, and a subsequent expectation gap regarding satisfaction of its expressed requirements, if they:

  • are incomplete, vague, untestable, or contradictory
  • do not adequately address possible alternatives
  • contain or imply unverified assumptions
  • do not relate to the customer and their involvement with the requirement

A User Story with incompletely or ineffectively stated Acceptance Criteria is clearly “not done”.

Test Cases

Test Cases provide the mechanism to demonstrate whether a User Story, as measured by its Acceptance Criteria, not only works as coded but, more importantly, works as intended. For each User Story, you will need to design effective Test Cases that clearly demonstrate achievement or non-achievement of the conditions of satisfaction represented by the Acceptance Criteria.

A well-written Test Case should be clear and concise, have a specific goal, and clearly state its assumptions. It should not assume any specific knowledge on the part of the reader. A Test Case should identify its source, be categorized (for subsequent analysis), have a status (for workflow management), and have an estimate of required testing time.
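
For instance, a Test Case for the hypothetical transaction-history story might record (all values illustrative):

  • Goal: verify that only transactions from the last 30 days are displayed
  • Source: the transaction-history User Story, Acceptance Criterion 1
  • Category: functional/UI
  • Status: ready for test
  • Estimate: 0.5 hours
  • Assumptions: the test account is seeded with both recent and older transactions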

Test Cases are likely to lead to downstream quality and expectation gaps, costly rework, delays, and diminished customer confidence and satisfaction if they:

  • are incomplete
  • do not adequately address alternate logic paths
  • do not provide adequate coverage
  • contain implicit or overlooked assumptions
  • do not address relevant pre- and post-conditions

A User Story with incompletely or ineffectively stated Test Cases is clearly “not done”.
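
If your team automates its testing (the example “Definition of Done” below requires passing unit and regression tests), a Test Case’s pass/fail outcome can be captured directly in code. Here is a minimal sketch using Python and pytest; the function, data, and story are hypothetical illustrations, not an actual project artifact:

    import datetime

    def recent_transactions(transactions, today, days=30):
        """Hypothetical system under test: keep only recent transactions."""
        return [t for t in transactions if (today - t["date"]).days <= days]

    def test_only_last_30_days_are_shown():
        # Goal: demonstrate the acceptance criterion that only
        # transactions from the last 30 days are displayed.
        today = datetime.date(2015, 6, 30)
        recent = {"date": datetime.date(2015, 6, 15), "amount": 10.00}
        stale = {"date": datetime.date(2015, 1, 1), "amount": 99.00}
        # Pass/fail is unambiguous: the stale transaction must be excluded.
        assert recent_transactions([recent, stale], today) == [recent]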

User Documentation

User documentation must focus on the needs of the actual target users of the documents. It should be written from the user’s perspective, whether the user is the end-consumer or a process-maintainer. Best practices for documentation typically call for simplification: keep the documentation just simple enough, but not too simple. Like the functionality it supports, user documentation is a requirement and should be estimated, prioritized and appropriately integrated into the project work queue.

Documentation that does not adequately address the following areas may be perceived as incomplete or substandard:

  • available features and options
  • user choices and outcomes
  • functional and operational constraints
  • available resources for assistance

This may diminish the perceived value of the delivered functionality and become a source of user disaffection. User documentation with incomplete or ineffective content is clearly “not done”.

An Example

The following “Definition of Done” is from a first-time Agile/scrum project for my company, Segue Technologies. It represented the collaborative inputs from all of the project’s stakeholders: developers, testers, the Product Owner, the Scrum Master, and the customer. It reflects what was expected by, and from, all who were involved and is expressed in terms of the processes and tools that were used.

ProcureLinx Lite

Sprint 1 “Definition of Done” (Done Done)

  • A User Story ticket exists for every user requirement
  • Each User Story is assigned Story Points
  • Each User Story includes Acceptance Criteria
  • Task tickets exist for every User Story to represent work to complete/completed
  • Task tickets must have original estimate, completed work, and remaining work hours
  • Tasks must be related (e.g., linked) to a User Story
  • Test Cases exist to test all Acceptance Criteria in a User Story
  • Test Cases must be related (e.g., linked) to a User Story
  • Code must be checked in and attached to Task and User Story tickets
  • All code has been reviewed by another developer
  • Formal “release ready” Build has been generated
  • All unit tests passed
  • All regression tests passed
  • All Acceptance Criteria in User Story passed testing
  • Each User Story has been reviewed and accepted by the Product Owner
  • “Goodies” are provided at the Retrospective

As the project evolved across its multiple-sprint life, this definition was reviewed, critiqued and tweaked to better serve the needs and goals of the project. Your project’s “Definition of Done” may look like this or look totally different, but it should clearly reflect your project and what matters most to it and to you.