Tuesday, November 29, 2011

Agile Definition of Done Starter Kit

I find it amusing that the definition of ‘done’ in Agile is sometimes called ‘done-done’. This is meant to imply that we are not just done with development (the first ‘done’) but also done with testing (the second ‘done’). However, if you think about all of the activities needed to get the stories in a sprint backlog into shape to be potentially shippable, you should probably call it “done-done-done-done” and possibly more (LOL).
So why are done criteria important? First, as mentioned, they help the team understand what it takes to get a story (or the functionality therein) into shape to be potentially shippable. Second, they help identify the activities and expectations that must occur to build a quality product. Third, all activities in the done criteria are considered when the team sizes the work during Sprint Planning and therefore have a direct impact on the sizing of stories. When the team sizes a story, they need to ensure it includes all of the work described in the team's "done criteria".
I usually bring a starter kit of typical tasks to get a story to "done". This helps initiate an active discussion among the team prior to sprint 1 so that each team member understands the various elements of the done criteria and which elements we are agreeing to as a team. Here is my done criteria (aka, definition of done) starter kit:
  • Incremental designing (and what type of design type(s) the team will use)
  • Incremental development (per the development programming techniques, and this includes developing documentation such as user guides and non-functional requirements associated with the story)
  • Incremental building/evolving the unit tests
  • Consideration for incrementally building out automation for regression testing, etc.
  • Applying appropriate source control, checkout/checkin, and branching/merging
  • Applying an approach for incremental local builds (in a private workspace)
  • Applying code review (or pair programming if being applied) as appropriate
  • Incremental testing (per the testing types, e.g., functional, system, integration, etc., pending how much automation there is)
  • Meeting acceptance criteria shared by the Product Owner
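To make the team's agreement concrete, some teams track these criteria per story in their tooling and gate "done" on all of them being satisfied. Here is a minimal illustrative sketch in Python; the criteria names and the `is_done` helper are hypothetical examples, not part of any standard Agile tool:

```python
# A team's agreed Definition of Done, expressed as a checklist.
# These entries paraphrase the starter kit above; adapt them to your team.
DONE_CRITERIA = [
    "incremental design applied",
    "incremental development done (incl. docs)",
    "unit tests built/evolved",
    "regression automation considered",
    "source control / branching applied",
    "local build passes",
    "code review (or pairing) done",
    "incremental testing done",
    "acceptance criteria met",
]

def is_done(story_checks: dict) -> bool:
    """A story is 'done' only when every agreed criterion is satisfied."""
    return all(story_checks.get(criterion, False) for criterion in DONE_CRITERIA)

# Usage: a story with every criterion checked off is done...
story = {criterion: True for criterion in DONE_CRITERIA}
print(is_done(story))                      # True
# ...but one outstanding criterion means it is not done (no partial credit).
story["acceptance criteria met"] = False
print(is_done(story))                      # False
```

The key design point matches the article: "done" is all-or-nothing, so a single unmet criterion keeps the story out of the potentially shippable increment.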
At this point, the team discusses these elements and establishes a common definition of done for the stories and the sprint. Keep in mind that this is the team's common done criteria and it should be flexible depending on the type of work. Also, once the team agrees to the done criteria, expect it to evolve over time; it may become a Retrospective discussion if it needs improvement. Some of the effort associated with your definition of done depends on what tools, infrastructure, and automation currently exist and where you want to go, so keep this in mind.

Finally, if your definition of done has nine key activities, then you can call it the “done-done-done-done-done-done-done-done-done” criteria (LOL).  Maybe just one "Done" is enough.  Once you establish the done criteria for the team, don’t forget to evolve it over time to get you to a quality and releasable product!  


  1. Mario - I find it helpful to ask the team what it will take for the business to make money from their work. That usually surfaces performance testing, memory-leak testing, training being created, and salespeople being trained.

    In addition, I stick with "Done Done" as a reminder that it's more than just my work that has to be done.

    Mark Levison
    Agile Pain Relief Consulting

  2. I agree with Mario. You need to be flexible and open, as it depends on who your user is.
    Lately I came across an article that summarized the essential points I had been contemplating but did not know how to express.
    Its main point, which I totally agree with, was: "Develop a culture of trust by empowering your teams; let them decide how they will deliver the expected results and how to meet the DoD. Doing so increases productivity and motivation.
    Nowadays team members are closer to the marketplace and the customers' needs (they know what matters to them and factor this into their DoD). They are also well aware of which processes are not working effectively and efficiently and, most importantly, they know how they can fix them."
    This tells me that the answers are in your organization; ask your teams to find the solutions (the real Scrum spirit, isn't it?).
    TRUST, EMPOWER, and TRANSPARENCY are the keys to success.

  3. Thank you all for sharing your thoughts!

  4. I agree completely and like your starter set. I tend to get the PM & PO to describe what deliverables they need, and software is usually just one. There will always be various automated and scripted tests. Then there will be some sign-offs, for example data protection, financial regs, language translations, legal, etc. On large-scale projects there will certainly be architectural and deployment models/docs, training, help, security, etc. Kind of done to the power n.
    Then I get the team to look at each deliverable and say how they will achieve it. Sometimes we have no choice, e.g., financial regs must be tested and signed off in a controlled way. But sometimes the team can come up with improvements and negotiate new ways of getting to done.
    It is also worth noting that there are really three levels of done: the sprint, the release, and the project. So you may, for example, do performance testing at the release level, which covers several sprints'/teams' worth of stories (I try to get as much brought forward into the sprint as possible, but there are constraints).

  5. Each user story is expected to yield, once implemented, a contribution to the value of the overall product, irrespective of the order of implementation; these and other assumptions about the nature of user stories are captured by the INVEST formula. Nice post on agile user stories; thanks for sharing.

  6. Wow - another DoD article from the way-back-machine - this is a grand-daddy of DoD articles... I'm linking to another grand-daddy article. I like the "starter-kit" notion and your list of items.

    Exercise: Definition of Ready & Done

    In the winter of 2018 Luke Hohmann & the Scrum Alliance hosted a webinar on Def. of Done and Ready: http://agilecomplexificationinverter.blogspot.com/2018/02/webinar-collaboration-at-scale-defining.html