How is it defined?

The definition of done is a practice that sets criteria for the activities performed in a project, making it possible to check whether each activity has been completed. It can also be described as a checklist of product deliverables and a way to report the status of those deliverables to other team members.


Definition of Done – ‘Checklist of Product Deliverables’

According to Mike Cohn, one of the founders of the Scrum Alliance, having a “definition of done” has become a near-standard practice for every Scrum team. A clear definition of done makes progress tracking easy and allows the team to focus on completing the tasks needed to build the product, eliminating unimportant activities that add waste. Team members do not spend time and effort figuring out exactly what each user story requires, which avoids confusion.

Importance in a framework (methodology): Scrum

The Scrum framework sets a very high bar: the team must deliver “potentially shippable software” at the end of every sprint. Without a clear definition of done, work can linger in an ‘in progress’ state, creating confusion among team members. They can pick up tasks from the product backlog but cannot mark them as done even after completing them.

Beyond this, other issues follow: tasks cannot be correctly marked as ‘done’, the count of unmet goals grows, the project’s velocity cannot be calculated, visual tracking becomes misleading, and the team overcommits to work that cannot be finished, accumulating technical debt. None of these issues is affordable in a project where accuracy, effort tracking, and progress reporting are critical requirements.

To avoid these problems, it is important to establish a clear definition of done at the beginning of each sprint, one that encompasses the user acceptance criteria and confirms not only the completion of each task’s functionality but also the quality of the associated features.
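As an illustration, a definition of done can be encoded as a simple checklist that a user story must fully satisfy before it is marked complete. This is a minimal sketch; the checklist items below are hypothetical examples, not a prescribed standard:

```python
# Hypothetical definition-of-done checklist for a user story.
DEFINITION_OF_DONE = [
    "code merged",
    "unit tests passed",
    "code reviewed",
    "acceptance criteria verified",
]

def is_done(completed_items):
    """A story is done only when every checklist item is satisfied."""
    return all(item in completed_items for item in DEFINITION_OF_DONE)

# A story with unfinished review and acceptance work is not done.
print(is_done({"code merged", "unit tests passed"}))
# Only a story meeting every criterion counts as done.
print(is_done({"code merged", "unit tests passed",
               "code reviewed", "acceptance criteria verified"}))
```

The all-or-nothing check is the point: a partially satisfied checklist keeps the story out of the ‘done’ column, which is exactly what prevents the misleading tracking described above.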

Metrics to measure Definition of Done

Requirement Analysis and Environment setup

  • Goal: To set up all initial requirements with no pending tasks
  • Hypothesis: The team has received all the requirements from the customer for developing the initial version of the product, and has the skills required to develop it.
    • Question 1: Are all tasks prioritized by the product owner?
      • Metric: Count and match the number of tasks/features in Product Backlog.
    • Question 2: Are user stories assigned and estimated by team members?
      • Metric 1: Date, time and results of Sprint Planning meeting
      • Metric 2: Item lists in Sprint Backlog
    • Question 3: Is the development environment ready?
      • Metric 1: Software installation list
      • Metric 2: Third-party tool installation list
    • Question 4: Is design completed?
      • Metric 1: UML diagrams for each user story
      • Metric 2: List of approved Prototypes/Mockups/Wireframes
    • Question 5: Are unit test cases written?
      • Metric: List of unit test cases for each user story
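The backlog-coverage metric in Question 1 can be checked mechanically: every customer requirement should appear in the product backlog before development starts. A minimal sketch, with hypothetical feature names:

```python
# Hypothetical sketch of the Question 1 metric: match customer
# requirements against the tasks/features in the product backlog.
requirements = {"login", "search", "checkout"}
product_backlog = {"login", "search", "checkout", "wishlist"}

# Requirements absent from the backlog indicate pending setup work.
missing = requirements - product_backlog
print(f"Backlog covers all requirements: {not missing}")
```

A non-empty `missing` set would mean the requirement-analysis goal is not yet met.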

Development & Testing activities

  • Goal: To complete development and testing tasks with minimal/no issues
  • Hypothesis: The team started building the software after receiving approval from the customer and managers, once the pre-development checklist was verified
    • Question 1: Is code working for all features?
      • Metric: List of checked in and merged source code  
    • Question 2: Is unit testing done?
      • Metric: No. of unit tests passed
    • Question 3: Is code refactoring done?
      • Metric 1: Comments in source code
      • Metric 2: Reusable source code
    • Question 4: Are code reviews done?
      • Metric: Code passing in automatic and manual reviews
    • Question 5: Is progress updated daily?
      • Metric: Daily updates to the burndown chart
    • Question 6: Is release built and integrated?
      • Metric 1: List of builds in integration environment
      • Metric 2: List of manual builds
    • Question 7: Is functional, performance, and acceptance testing done?
      • Metric: List of tests passed
    • Question 8: Are all issues resolved?
      • Metric: Total no. of issues left = no. of issues recorded − no. of issues solved
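The Question 8 metric is a straightforward subtraction, and keeping it as a named function makes the sprint report reproducible. A minimal sketch with hypothetical counts:

```python
def issues_left(recorded, solved):
    """Question 8 metric: total issues left = issues recorded - issues solved."""
    return recorded - solved

# Hypothetical sprint counts: 42 issues recorded, 37 solved.
print(issues_left(42, 37))  # 5 issues remaining
```

A result of zero here is what allows the development and testing goal to be declared met.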


Correctness of Definition of Done

  • Goal: Minimize the rate of rejection from the customer for the delivered product
  • Hypothesis: Team has completed the development and testing of the product. It is ready to be shipped.
    • Question 1: Has it been approved by the customer?
      • Metric: Rate of acceptance = features passed by customer / total features delivered
    • Question 2: Has it been rejected by the customer?
      • Metric: Rate of rejection = features rejected by customer / total features delivered
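The two ratios above can be computed directly from the sprint review outcome. A minimal sketch, with hypothetical feature counts:

```python
def acceptance_rate(passed, delivered):
    """Features passed by customer / total features delivered."""
    return passed / delivered

def rejection_rate(rejected, delivered):
    """Features rejected by customer / total features delivered."""
    return rejected / delivered

# Hypothetical review outcome: 20 features delivered, 18 passed, 2 rejected.
passed, rejected, delivered = 18, 2, 20
print(f"acceptance: {acceptance_rate(passed, delivered):.0%}")
print(f"rejection:  {rejection_rate(rejected, delivered):.0%}")
```

Tracking both rates over successive sprints shows whether the definition of done is actually aligned with what the customer accepts.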

