Definition of Done vs Acceptance Criteria: Your Guide to Agile Testing Standards
In the fast-paced world of Agile development, delivering high-quality software consistently is the ultimate goal. Yet, without clear standards, teams can fall into the trap of "it's done when I say it's done," leading to buggy releases and frustrated stakeholders. Two of the most critical tools for preventing this are the Definition of Done (DoD) and Acceptance Criteria (AC). While they sound similar, confusing them can derail your project's quality.
This guide will demystify these essential Agile testing standards. You'll learn what they are, how they differ, and how to use them together to build a robust quality gate. Whether you're a new QA engineer, a developer, or a product owner, mastering these concepts is fundamental to succeeding in modern software projects and aligns perfectly with industry-standard frameworks like the ISTQB Foundation Level syllabus.
Key Takeaway
Acceptance Criteria define what needs to be built for a single user story to be correct. The Definition of Done defines how every user story must be completed to be considered shippable. Think of AC as the "feature spec" and DoD as the "project-wide quality checklist."
What Are Acceptance Criteria? The "What" of a User Story
Acceptance Criteria (AC) are a set of conditions that a software product must satisfy to be accepted by a user, customer, or other stakeholder. They are the functional boundaries of a single User Story or feature. In Agile QA, AC are your primary source of truth for creating test cases.
From a testing perspective, well-written AC answer the question: "How will I test this?" They transform vague requirements into concrete, verifiable statements.
How to Write Effective Acceptance Criteria
Good AC are clear, testable, and concise. A popular format is the "Given-When-Then" (GWT) structure, which is excellent for behavior-driven scenarios.
- Given a certain precondition or context...
- When a specific action is taken...
- Then a verifiable outcome is observed.
Example: Manual Testing Context
User Story: As a registered user, I want to reset my password so I can regain access to my account.
Acceptance Criteria (GWT Format):
- Given I am on the login page and click "Forgot Password"
- When I enter my registered email address and submit the form
- Then I receive a success message stating "Password reset instructions have been sent to your email."
- Given I have received the password reset email
- When I click the unique reset link within 24 hours
- Then I am taken to a page where I can enter a new password twice for confirmation.
- Given I am on the new password page
- When I submit a new password that meets the security policy (e.g., min 8 chars, one number)
- Then I am logged in automatically and see a confirmation message.
A manual tester would use these exact steps to create and execute test cases, checking each "Then" statement.
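The same GWT structure also maps cleanly onto automated checks. The sketch below is illustrative only: `PasswordResetService` and its methods are hypothetical stand-ins for the real system under test, and the assertion simply mirrors the first scenario's "Then" statement.

```python
# Hypothetical stand-in for the system under test; in a real project this
# would be a page object, API client, or service wrapper.
class PasswordResetService:
    def __init__(self, registered_emails):
        self.registered_emails = set(registered_emails)

    def request_reset(self, email):
        # When: the user submits the forgot-password form
        if email in self.registered_emails:
            # Then: the success message from the AC is returned
            return "Password reset instructions have been sent to your email."
        return "If this address is registered, instructions will be sent."


def test_reset_request_shows_success_message():
    # Given: a registered user who has reached the "Forgot Password" form
    service = PasswordResetService(registered_emails=["user@example.com"])
    # When: they submit their registered email address
    message = service.request_reset("user@example.com")
    # Then: the exact success message from the AC is displayed
    assert message == "Password reset instructions have been sent to your email."
```

Notice how each comment traces back to a Given/When/Then line: the AC double as the skeleton of the test, whether it is executed manually or by a test runner like pytest.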
What is the Definition of Done? The "How" for Every Story
The Definition of Done (DoD) is a shared, formal checklist of all the activities required to get a product increment to a potentially shippable state. It is a team agreement and applies to every user story in a sprint. While AC define *what* is built, the DoD defines *how well* it is built.
The DoD acts as the project's ultimate quality gate. A story is not "Done" until every item on the DoD checklist is satisfied, regardless of how well it meets its AC.
Typical Components of a Definition of Done
A robust DoD covers multiple dimensions of quality. Here’s what it often includes:
- Code Complete: Code is written, reviewed, and merged to the main branch.
- Unit Tests: All unit tests are written, passed, and code coverage meets the agreed threshold.
- Integration Tests: Relevant integration tests are passed.
- QA Verification: Feature is tested against all AC; no critical/open bugs remain.
- UI/UX Review: Design match is verified (for front-end stories).
- Performance Check: No significant performance regression is introduced.
- Documentation Updated: User guides, API docs, or release notes are updated.
- Product Owner Acceptance: The PO has reviewed and accepted the story.
DoD vs AC: The Critical Differences Side-by-Side
Understanding the distinction is crucial for effective Agile testing.
| Aspect | Acceptance Criteria (AC) | Definition of Done (DoD) |
|---|---|---|
| Scope | Specific to a single User Story or feature. | Universal; applies to every User Story in the project/sprint. |
| Purpose | Define what the feature should do (functional correctness). | Define how the work should be completed (overall quality and completeness). |
| Ownership | Primarily owned by the Product Owner/Business Analyst, with QA input. | Owned and agreed upon by the entire Agile team (Dev, QA, PO, Scrum Master). |
| Nature | Variable; changes with each story. | Stable; consistent across all stories (evolves slowly as a team standard). |
| Testing Focus | Basis for functional and system test cases (validating behavior). | Basis for non-functional checks, process adherence, and quality gates. |
Practical Insight: A story can pass all its AC (it works as specified) but fail the DoD (e.g., no code review was done, or documentation is missing). In a robust Agile QA process, that story is not done.
The ISTQB Foundation Level Perspective
The ISTQB Foundation Level syllabus provides a standardized framework for understanding these concepts within the broader context of software testing.
How this topic is covered in ISTQB Foundation Level
ISTQB formally defines acceptance criteria as "criteria that a component or system must satisfy in order to be accepted by a user, customer, or other authorized entity." It links them directly to acceptance testing. The Definition of Done is discussed within the Agile testing framework as a key practice for establishing a shared understanding of quality and completion, often acting as a test oracle and an exit criterion for testing at the story level.
The syllabus emphasizes that testers must be involved in creating both AC and DoD to ensure testability and quality are baked in from the start—a principle called "early testing."
How this is applied in real projects (beyond ISTQB theory)
While ISTQB provides the theory, real-world application requires nuance. In practice, a DoD is often a living document on the team's wiki. Teams might have a "hard" DoD (mandatory for every story) and a "soft" DoD for the sprint (e.g., "run performance benchmark"). Furthermore, in manual testing-heavy environments, the DoD might explicitly include steps like "tested on browsers X, Y, Z" or "verified on mobile device resolutions." The key is that the team lives by it; it's not just a theoretical list.
For those looking to bridge this theory-practice gap, an ISTQB-aligned Manual Testing Course that focuses on real project workflows can be invaluable.
Building Your Quality Gates: Integrating DoD and AC
Together, AC and DoD create a powerful, multi-layered defense against poor quality. Here’s how they work in sequence during a sprint:
- Story Refinement: The team, including QA, reviews the User Story and its AC for clarity and testability.
- Development & Unit Testing: Developer completes code, ensuring it meets AC and passes unit tests (part of DoD).
- Code Review & Merge: Another developer reviews the code (DoD item).
- QA Verification: The tester executes tests derived from the AC. This is the primary functional validation.
- DoD Checklist Run: Beyond functional testing, QA or the team verifies all other DoD items: cross-browser check, documentation update, PO demo, etc.
- Done: Only when both the AC are satisfied AND the DoD checklist is complete is the story moved to "Done."
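The final gate in the sequence above reduces to a simple AND of the two layers. In this hypothetical sketch, the scenario names and DoD item names are examples, not part of any real tool:

```python
def story_is_done(ac_results, dod_completed, dod_checklist):
    """A story is Done only when every AC scenario passes AND every DoD item is complete."""
    ac_pass = all(ac_results.values())                    # functional correctness
    dod_pass = set(dod_checklist) <= set(dod_completed)   # process/quality completeness
    return ac_pass and dod_pass

# A story that works as specified but skipped code review is still not Done:
result = story_is_done(
    ac_results={"reset_email_sent": True, "new_password_accepted": True},
    dod_completed={"unit_tests_pass", "qa_verified"},
    dod_checklist=["unit_tests_pass", "qa_verified", "code_reviewed"],
)
```

This is exactly the "Practical Insight" from the comparison section: passing all AC while failing one DoD item still yields not-Done.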
Common Pitfalls and How to Avoid Them
- Pitfall 1: Vague Acceptance Criteria. AC like "user can log in" are untestable.
  Solution: Use the GWT format to enforce specificity.
- Pitfall 2: Treating DoD as Optional. Teams skip "non-essential" DoD items under pressure.
  Solution: The DoD is binary. If it's not met, the story is not done. Period.
- Pitfall 3: QA Not Involved Early. Testers receive stories with pre-written, poor AC.
  Solution: Insist on tester participation in all refinement sessions to ask clarifying questions.
- Pitfall 4: Static DoD. The DoD never evolves as the team and product mature.
  Solution: Review and update the DoD during sprint retrospectives.
Mastering the creation and enforcement of AC and DoD is a core skill for any Agile professional. It moves quality from an afterthought to a built-in feature of your development process. For a hands-on deep dive into applying these and other quality standards in real testing scenarios, consider exploring practical training that goes beyond theory.