INVESTing in Done and Debt-Free User Stories
Investing in Done and Debt-Free User Stories is the use of a combination of Agile techniques that help to define User Stories that will be accepted by the Product Owner without incurring technical debt. To understand this further, consider the following component parts:

What is INVEST?
The acronym INVEST comes from a 2003 article by Bill Wake, in which he describes the characteristics of a good user story. It stands for the following:


 * I – Independent


 * N – Negotiable


 * V – Valuable


 * E – Estimable


 * S – Small


 * T – Testable

Wake goes on to provide the following definitions of INVEST:

Independent
Stories are easiest to work with if they are independent. That is, we'd like them to not overlap in concept, and we'd like to be able to schedule and implement them in any order.

Negotiable… and Negotiated
A good story is negotiable. It is not an explicit contract for features; rather, details will be co-created by the customer and programmer during development. A good story captures the essence, not the details.

Valuable
A story needs to be valuable. We don't care about value to just anybody; it needs to be valuable to the customer. Developers may have (legitimate) concerns, but these must be framed in a way that makes the customer perceive them as important.

Estimable
A good story can be estimated. We don't need an exact estimate, but just enough to help the customer rank and schedule the story's implementation. Being estimable is partly a function of being negotiated, as it's hard to estimate a story we don't understand. It is also a function of size: bigger stories are harder to estimate. Finally, it's a function of the team: what's easy to estimate will vary depending on the team's experience.

Small
Good stories tend to be small. Stories typically represent at most a few person-weeks worth of work. Above that size, it seems to be too hard to know what's in the story's scope. Smaller stories tend to get more accurate estimates. Story descriptions can be small too (and putting them on an index card helps make that happen). Alistair Cockburn described the cards as tokens promising a future conversation.

Testable
A good story is testable. Writing a story card carries an implicit promise: "I understand what I want well enough that I could write a test for it." If a customer doesn't know how to test something, this may indicate that the story isn't clear enough, or that it doesn't reflect something valuable to them, or that the customer just needs help in testing.

Good and poor examples of user stories based on INVEST attributes
A good example: A bank customer can change his or her PIN.

Positives: small, valuable to the bank customer, testable (can the customer change the PIN or not?), independent, negotiable (how exactly does the customer do this?), and estimable.

A poor example: Write game rules.

Drawbacks: not independent, no business value, not small.
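The comparison above can be sketched as a simple checklist evaluation. This is a minimal illustration (the `UserStory` class and its fields are hypothetical, not part of any standard Agile tooling), assessing each story against the six INVEST attributes:

```python
# Hypothetical sketch: record which INVEST attributes a user story
# satisfies, and report the ones it is still missing.
from dataclasses import dataclass, field

INVEST = ("independent", "negotiable", "valuable", "estimable", "small", "testable")

@dataclass
class UserStory:
    text: str
    attributes: dict = field(default_factory=dict)  # attribute name -> bool

    def missing(self):
        """Return the INVEST attributes this story does not yet satisfy."""
        return [a for a in INVEST if not self.attributes.get(a, False)]

# The good example from above satisfies every attribute:
good = UserStory("A bank customer can change his or her PIN.",
                 {a: True for a in INVEST})
assert good.missing() == []

# The poor example fails on independence, value, and size:
poor = UserStory("Write game rules.",
                 {"negotiable": True, "estimable": True, "testable": True})
print(poor.missing())  # ['independent', 'valuable', 'small']
```

A story with a non-empty `missing()` list is a candidate for splitting, renegotiating, or rewriting before it enters the backlog.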

What is the DONE criterion?
A team's definition of done is an agreed-upon set of things that must be true before any product backlog item is considered complete. This set of things is targeted at measuring the quality of a product backlog item, not whether it has met its functional specifications. An example of this set of items could be:


 * Code Complete


 * Unit tests written and executed


 * Integration tested


 * Performance tested


 * Documented (just enough)


 * Approved by Product Owner
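The list above can be made operational as an explicit gate: a backlog item counts as done only when every agreed criterion is true. The following is a minimal sketch of that idea (the function and criterion strings are illustrative, not from the source):

```python
# Hypothetical sketch: a team's Definition of Done as an explicit
# checklist, using the example criteria listed above.
DEFINITION_OF_DONE = [
    "code complete",
    "unit tests written and executed",
    "integration tested",
    "performance tested",
    "documented (just enough)",
    "approved by Product Owner",
]

def is_done(item_status):
    """True only if every criterion in the Definition of Done is met."""
    return all(item_status.get(c, False) for c in DEFINITION_OF_DONE)

status = {c: True for c in DEFINITION_OF_DONE}
assert is_done(status)

status["performance tested"] = False  # one unmet criterion...
assert not is_done(status)            # ...and the item is not done
```

The point of the `all(...)` gate is that the criteria are conjunctive: "mostly done" is not done.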

History
2002: an early article by Bill Wake calls attention to possible inconsistencies arising from terms commonly used within teams, such as "done"

2003: early Scrum training materials hint at the future importance of the "Definition of Done", initially only in the form of a slide title: "The story of Done"

2005: the first exercises inviting Scrum trainees to reflect on their (local) "definition of done" appear in later iterations of Scrum training materials

2007: by that point the "Definition of Done" as a full-fledged practice, and as a textual checklist displayed in the team room, has become widespread

Good and poor examples of DONE criterion
Here is a good example of the DONE criterion


 * Code produced (all ‘to do’ items in code completed)


 * Code commented, checked in and run against current version in source control


 * Peer reviewed (or produced with pair programming) and meeting development standards


 * Builds without errors


 * Unit tests written and passing


 * Deployed to system test environment and passed system tests


 * Passed UAT (User Acceptance Testing) and signed off as meeting requirements


 * Any build/deployment/configuration changes implemented/documented/communicated


 * Relevant documentation/diagrams produced and/or updated


 * Remaining hours for task set to zero and task closed

Here is a poor example of the DONE criterion:


 * Code Complete


 * Test Complete

The DONE criterion is applied across all user stories to ensure that quality is consistent in order to limit rework and reduce the risk of misunderstanding and conflict between the development team and the customer or product owner.

What is Technical Debt?
Technical debt is best described in Martin Fowler's 2003 article "TechnicalDebt". In the article, Fowler illustrates the concept with an example:

“You have a piece of functionality that you need to add to your system. You see two ways to do it, one is quick to do but is messy - you are sure that it will make further changes harder in the future. The other results in a cleaner design, but will take longer to put in place.

Technical Debt is a wonderful metaphor developed by Ward Cunningham to help us think about this problem. In this metaphor, doing things the quick and dirty way sets us up with a technical debt, which is similar to a financial debt. Like a financial debt, the technical debt incurs interest payments, which come in the form of the extra effort that we have to do in future development because of the quick and dirty design choice. We can choose to continue paying the interest, or we can pay down the principal by refactoring the quick and dirty design into the better design. Although it costs to pay down the principal, we gain by reduced interest payments in the future.”
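Fowler's interest-and-principal metaphor can be made concrete with some simple arithmetic. The numbers below (days of effort, "interest" per change, refactoring cost) are invented purely for illustration; they do not come from the source:

```python
# Illustrative arithmetic for the technical-debt metaphor.
# All numbers are invented for this example.
QUICK_COST = 2     # days: quick-and-dirty implementation
CLEAN_COST = 5     # days: clean design up front
INTEREST = 1       # extra day per future change caused by the messy design
REFACTOR_COST = 4  # days to "pay down the principal" later

def quick_design_cost(changes, refactor_after=None):
    """Total effort of the quick design over `changes` future changes,
    optionally refactoring after `refactor_after` of them."""
    if refactor_after is None:
        return QUICK_COST + INTEREST * changes      # keep paying interest
    return QUICK_COST + INTEREST * refactor_after + REFACTOR_COST

print(quick_design_cost(2), "vs", CLEAN_COST)    # 4 vs 5: quick wins early
print(quick_design_cost(10), "vs", CLEAN_COST)   # 12 vs 5: interest dominates
print(quick_design_cost(10, refactor_after=2))   # 8: paying down caps the cost
```

The crossover is the essence of the metaphor: the quick design is cheaper only while the number of future changes stays small, and refactoring ("paying down the principal") converts an open-ended interest stream into a one-off cost.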

History
Ward Cunningham first drew the comparison between technical complexity and debt in his 1992 experience report, referenced in the Wikipedia article "Technical debt":

"Shipping first time code is like going into debt. A little debt speeds development so long as it is paid back promptly with a rewrite... The danger occurs when the debt is not repaid. Every minute spent on not-quite-right code counts as interest on that debt. Entire engineering organizations can be brought to a stand-still under the debt load of an unconsolidated implementation, object-oriented or otherwise." – Ward Cunningham

How is it used?
In order to avoid technical debt, the design of a user story must evaluate whether technical debt will be incurred and balance this against the story's estimates. The result might be a spectrum from heavy debt to light debt, with a corresponding change in estimates. If teams always estimate against a design that incurs as little technical debt as possible, that cost is built into whether the user story can be delivered within the sprint. Problems arise when teams estimate without accounting for technical debt and then must either revise and renegotiate their estimates or live with a debt-laden design in order to meet the estimates they gave originally.

How is incurring technical debt related to DONE?
Technical debt must be considered as part of the DONE criterion. If it is not considered then there will not be a way to measure whether technical debt has been addressed as part of the design of the user story. A way to add technical debt considerations to the DONE criterion is the following:


 * Identify any rework that needs to be done


 * Identify the reasons why it wasn't done properly the first time


 * Identify what measures can be put in place to stop similar rework from occurring


 * Add these measures to the Definition of Done (DoD)
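The four steps above can be sketched as a small feedback loop on the checklist itself. Everything here (the function, the example rework, and the criterion strings) is hypothetical and for illustration only:

```python
# Hypothetical sketch of the steps above: when rework is identified,
# record its cause and fold a preventive measure into the team's
# Definition of Done so the same debt is not incurred again.
definition_of_done = [
    "unit tests written and passing",
    "peer reviewed",
]

def add_debt_measure(dod, rework, cause, measure):
    """Steps 1-4: identify the rework and why it happened, then add
    the preventive measure to the Definition of Done."""
    print(f"Rework identified: {rework} (cause: {cause})")
    if measure not in dod:  # avoid duplicate criteria
        dod.append(measure)
    return dod

add_debt_measure(
    definition_of_done,
    rework="rewrite of the PIN-change validation logic",
    cause="no error-handling standard was agreed before coding",
    measure="error handling reviewed against the team standard",
)
assert "error handling reviewed against the team standard" in definition_of_done
```

Because the Definition of Done is checked on every subsequent backlog item, each measure added this way turns a one-off piece of rework into a standing quality gate.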

Conclusion
Using the INVEST paradigm contributes to writing a good user story. However, that is not enough: there must also be an understanding of the quality criteria that must be met for the user story to be considered finished in implementation (the DONE criterion, or Definition of Done (DoD)). In addition, a good user story could be implemented with high quality yet poorly designed; if so, there will be technical debt to repay. As such, technical debt must also be considered in both the construction of the user story and the design of the DONE criterion. If all of these techniques are employed, user stories will be developed in a way that creates high-quality product backlog items that do not incur technical debt.