What 'Done' Really Means: The Definition That Changes Everything

A story moves across the board. Development is complete. Tests pass. The code is merged. The team marks it done and moves to the next item. Later, someone discovers that the feature breaks under load, or exposes user data, or fails when inputs are malformed. The story was marked done, but it was not actually finished. And now the team is fixing problems that should have been prevented, whilst new work waits.

This pattern is common enough that most teams have learned to live with it. Stories marked done are not always shippable. Quality issues surface after the fact. The team knows this is happening, but the mechanism that would prevent it — a clear, shared, enforced Definition of Done — either does not exist or exists only on paper, ignored in practice because meeting it would slow the team down.

The irony is that ignoring quality to maintain velocity does not actually make teams faster. It makes them feel faster in the short term, whilst accruing debt that will slow them down later. A Definition of Done that evolves to raise the quality bar is not a constraint on productivity. It is a mechanism that makes unsustainable velocity visible, and forces the team to address the practices that are preventing them from going fast sustainably.

When ‘Done’ Means Different Things

The first symptom of a broken Definition of Done is misalignment. Different people on the team are using different standards. A developer marks a story done because the code works in their environment. The QA engineer marks it done because it passes functional tests. The product owner marks it done because it meets acceptance criteria. And then someone discovers that it was never tested for security vulnerabilities, or that it has no error handling, or that it performs acceptably with ten users but collapses under production load.

Each person thought they were doing their job correctly. The problem is that “done” was never defined clearly enough for everyone to be working toward the same standard. And in the absence of clarity, people default to the standard that feels achievable within the constraints they face. If the sprint is ending and the velocity needs to be maintained, “done” starts to mean “functionally complete,” and everything else gets deferred.

That deferral is invisible at first. The velocity chart looks healthy. The team is delivering story points consistently. But underneath, defects are accumulating. Non-functional requirements are being ignored. Technical debt is compounding. And by the time the cost becomes visible, it is far more expensive to address than it would have been to prevent.

The Quality Ratchet

I use the Definition of Done differently from most teams. Not as a static checklist that the team tries to meet, but as a progressive quality ratchet that raises the bar over time. The mechanism is simple. You establish a baseline Definition of Done that the team can meet consistently. Then, once they have demonstrated that they can meet it, you raise the bar.

For example, a team struggling with defects might start with a Definition of Done that says “no high-priority defects are left open for the next sprint.” That is achievable. The team focuses on addressing the most critical issues before marking work done. Over time, they get better at preventing high-priority defects in the first place — better unit tests, better code review, better attention to edge cases. Once they can consistently meet that standard, the Definition of Done evolves: “no medium-priority defects are left open for the next sprint.”
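The ratchet above can be sketched as a simple gate. This is a minimal illustration, not a real tool: the priority scale, the `Defect` type, and the `meets_definition_of_done` function are all hypothetical. The point is that the bar itself is a parameter the team tightens deliberately, rather than a fixed rule.

```python
from dataclasses import dataclass

# Hypothetical priority scale: lower number = more severe.
HIGH, MEDIUM, LOW = 1, 2, 3

@dataclass
class Defect:
    summary: str
    priority: int  # HIGH, MEDIUM, or LOW

def meets_definition_of_done(open_defects, max_open_priority):
    """Return True if no open defect is at or above the current bar.

    max_open_priority is the ratchet: start by forbidding only
    high-priority defects, then tighten it to MEDIUM once the team
    meets that bar consistently.
    """
    return all(d.priority > max_open_priority for d in open_defects)

defects = [Defect("timeout on slow networks", MEDIUM)]

# Baseline bar: only high-priority defects block "done".
assert meets_definition_of_done(defects, max_open_priority=HIGH)

# Ratcheted bar: medium-priority defects now block "done" too,
# so the same story no longer passes.
assert not meets_definition_of_done(defects, max_open_priority=MEDIUM)
```

Notice that nothing about the story changed between the two checks; only the bar did. That is the artificial crisis in miniature.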

The same mechanism works for non-functional requirements. A team working on a web application might start with “defensive programming and unit test cases for SQL injection.” Once that becomes routine, the standard expands: “SQL injection and data exposure vulnerabilities addressed.” Then “injection, data exposure, and authentication failures.” The progression follows something like the OWASP Top 10, raising the bar incrementally as the team builds the capability to meet higher standards.
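A “unit test cases for SQL injection” item on the Definition of Done can be very concrete. Here is a small sketch using Python's standard-library `sqlite3`; the schema and the `find_user` helper are made up for illustration, but the technique — binding user input as a parameter rather than concatenating it into the query string — is the standard defence.

```python
import sqlite3

def find_user(conn, username):
    """Look up a user with a parameterised query.

    The ? placeholder binds the input as data, so an injection
    payload cannot change the structure of the query.
    """
    cur = conn.execute("SELECT name FROM users WHERE name = ?", (username,))
    return [row[0] for row in cur.fetchall()]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.executemany("INSERT INTO users VALUES (?)", [("alice",), ("bob",)])

# Legitimate lookup behaves as expected.
assert find_user(conn, "alice") == ["alice"]

# A classic injection payload: if the query were built by string
# concatenation, this would match every row. Bound as a parameter,
# it matches nothing.
assert find_user(conn, "' OR '1'='1") == []
```

A story touching database access is not done, under this standard, until tests like the last assertion exist and pass.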

The Artificial Crisis

When you raise the Definition of Done, velocity usually suffers. Stories that would have been marked done under the previous standard now remain in progress because they do not meet the new requirements. The team’s throughput drops. That drop feels like a crisis. It is not.

What the drop reveals is that the previous velocity was not sustainable. The team was going fast by deferring quality work. They were marking stories done that were not actually shippable, and the cost of that deferral was invisible in the velocity chart. Raising the Definition of Done makes the trade-off visible. The velocity you were achieving was an illusion. The velocity you can achieve under the new standard is the truth.

That truth creates productive discomfort. The team cannot meet their sprint commitment under the new Definition of Done with their current practices. Something has to change. They improve their upstream processes — better testing, pair programming, more rigorous code review, automated checks. Over time, the velocity recovers. But now the team is delivering at the new quality level. They are sustainably faster and higher quality, not just faster.

The artificial crisis is the mechanism that drives improvement. Without it, teams have no forcing function to address the practices that are preventing them from going fast sustainably. They feel busy, they feel productive, and they remain stuck at a quality level that will eventually become untenable.

What Belongs in a Definition of Done

A functional Definition of Done specifies what must be true before a story can be considered complete. That includes the obvious things — code is written, tests pass, acceptance criteria are met — but it also includes the things teams often defer: non-functional requirements, documentation, deployment readiness, and defect resolution.

The specific components will vary depending on the team’s context. A team building a consumer-facing web application will have different non-functional requirements than a team building backend infrastructure. A team in a highly regulated industry will have compliance requirements that other teams do not. The Definition of Done is not a template to copy. It is an agreement the team makes about what standard they will hold themselves to.
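One way to make that agreement enforceable rather than aspirational is to express it as a named list of checks a story must satisfy before it can transition to done. The items below are purely illustrative, for the reasons given above: your team's list will differ, and that is the point.

```python
# A Definition of Done expressed as named checks. The items here
# are illustrative examples, not a template to copy.
DEFINITION_OF_DONE = [
    "code reviewed and merged",
    "unit and integration tests pass",
    "acceptance criteria verified",
    "current security checks pass",
    "documentation updated",
    "deployable without manual steps",
]

def can_mark_done(completed_checks):
    """A story is done only when every item on the list is satisfied.

    Returns (ok, missing) so the team sees exactly which items
    are blocking the transition, not just a yes/no answer.
    """
    missing = [item for item in DEFINITION_OF_DONE
               if item not in completed_checks]
    return (not missing), missing

ok, missing = can_mark_done({
    "code reviewed and merged",
    "unit and integration tests pass",
    "acceptance criteria verified",
})
assert not ok
assert "documentation updated" in missing
```

Raising the bar then means editing one list the whole team can see, which keeps the evolving standard explicit instead of tribal.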

That agreement is owned by the team, not imposed from outside. As with all engineering decisions, the people doing the work are best positioned to define what “done” means in their context. But ownership does not mean the standard is static. The Definition of Done should evolve as the team’s capability improves. What was challenging six months ago should be routine now. And if it is routine, the bar should rise.

The Conversations It Enables

A clear Definition of Done changes the character of the conversations teams have. When a story is marked done but later discovered to have issues, the conversation often becomes adversarial. The developer says it worked in their environment. The QA engineer says it was not tested properly. The product owner says it does not meet expectations. Everyone is defending their part of the process, and nobody is taking responsibility for the gap.

When the Definition of Done is clear and enforced, that conversation shifts. If a story is marked done and later has issues, the question is not “whose fault is this?” The question is “why did our Definition of Done not catch this?” That is a systems question, not a blame question. It prompts the team to examine whether their standard is rigorous enough, whether it is being enforced consistently, and whether they have the processes in place to meet it.

That shift — from blame to systems thinking — is one of the most valuable outcomes of a well-maintained Definition of Done. It focuses the team’s energy on improving the process rather than defending individual decisions. And it makes quality a shared responsibility rather than something that gets delegated to QA or deferred to later.

The Long Game

An evolving Definition of Done is not a constraint. It is a forcing function. It creates the conditions for improvement by making unsustainable practices visible and giving the team a clear target to work toward. The velocity may drop in the short term. That drop is not failure. It is honesty.

Over time, as the team’s practices improve and the new standard becomes routine, the velocity recovers. But now the team is delivering work that is actually done — shippable, maintainable, resilient under real-world conditions. They are not just faster. They are sustainably faster. And that sustainability is what separates teams that improve over time from teams that plateau at a level of quality they can never escape.

The Definition of Done is not the end goal. It is the mechanism that makes the end goal achievable. And like all good mechanisms, it works best when it evolves in response to the team’s growing capability. Static standards produce static results. Evolving standards produce improvement. The difference is not in the words on the page. It is in the commitment to raising the bar when the team is ready, and then supporting them as they rise to meet it.


Establishing and evolving a Definition of Done that drives genuine improvement requires seeing where the quality gaps are and building the capability to close them. Let’s explore how making ‘done’ meaningful can work for your team. Book a free consultation to discuss your current standards and where they could evolve next.
