Can the definition of done potentially be a bad thing?


Obviously having a DoD is a requirement for any Scrum team, and it fosters good practices as well. But thinking we are “done” just because we met acceptance criteria doesn’t seem… I don’t know… Agile?

As we know, a software product is never actually done, and any feature or story we get to a “done” state could potentially be improved on through inspection and adaptation by the team, stakeholders, and end users. My question is: how can we shift the mindset many orgs have from “This has met the DoD, so it’s done” to “Let’s inspect this increment of work and see if it can bring more value or a better experience for our customers”?

An observation:
The sprint review is treated by some as a demo of “done” work to be signed off on by stakeholders, instead of what it’s supposed to be: a meeting to review potentially shippable software and inspect and adapt. The output of the sprint review is supposed to be a “REVISED product backlog,” not “These things have met the DoD, and now we are 35% complete toward our release.”



I’m probably just being dense, but this seems like a behavior/culture concern, not a problem with the artifact/practice (the DoD).

In other words:

“We did X and it didn’t work.”
“X didn’t work because…”


A great topic!

I suppose anything can be a bad thing, if misunderstood or abused.

I like to guide my teams to develop their DoDs based on an idea I’ve taken from Rawsthorne, Dan; Shimp, Doug (2011-08-14). Exploring Scrum: The Fundamentals.

The underlying principle is: If we don’t know what Done looks like we will never finish anything. So let’s have a DoD for the various building blocks that make up what we are producing as a team.

Each element has its own DoD (which is itself always being reviewed, updated, and adapted).

Yeah, there’s a paradox here, as you point out: with Agile we’re never really finished in the larger sense. So perhaps the “Product” is yet another layer in Shimp’s diagram, and there’s a DoD for “Product” too.

Spotify has a great model for a Product’s “doneness”… They call it “a local maximum”

When a product hits that “done state,” it’s time to rethink it.

re: the sprint review - yeah, all too often it is as you describe it, a demo, not a review.


Andy, thanks for passing on the info on local maximums. I’m going to ponder that some more and for sure introduce it to some teams.

At the same time, Troy, I have to agree with Zach on the concept. There’s definitely a difference between increments of work being done and the concept of our work as a whole being “done.” At some point, we have to freeze the requirements for a piece of work or we will never ship anything.

Perfect example: I once had a client that would redo wireframes for a story in the middle of an iteration. Funny enough, they would also become frustrated that stories were taking too long to complete. When I reminded them of the changing requirements, they started to see the light.

Same with the DoD. If you don’t agree on what “done” means for a small increment of work, it’s impossible to really agree on when it’s done. The person testing the work might say the dev didn’t finish, while the pair might agree they did.

Make sense?


Thanks, Chris. I definitely agree a DoD is needed. I guess I’m just raising a question about using the DoD culturally as a way to drive incremental development instead of thinking iteratively. But I do agree with you.


Thanks, Andy, this is great information.


Yeah, using the tool organizationally seems a bit fuzzy in terms of its usefulness. Improving the culture of how work gets done is better handled from an outcomes perspective: what outcomes do we want from a new way of working? From that perspective, a level of “doneness” is absolutely necessary, but it should be determined on an outcome-by-outcome basis as opposed to a broad definition.

Sorry if I’m preaching to the choir. This is just something I talk about almost every day LOL.


Go to the authoritative document, The Scrum Guide. It makes no mention of acceptance criteria, as that is a technique which can be used (or not) within the framework. If the statement was made with more of a generic “checklist” connotation, my apologies.

What is the purpose of the Definition of “Done” (DoD)? It is “used to assess when work is complete on the product Increment.” This inspection occurs within the Sprint.

What is the purpose of the Sprint? “Each Sprint has a definition of what is to be built.” It provides a time-box in which to focus on working to the DoD.

Prior to the Sprint Review event, what has been “Done” and not “Done” is known. During the event, this “Done”-ness is shared and collaboration occurs. The Product Backlog is updated if necessary to address any improvements that can be made to the product and its Backlog.

It would be incorrect to use the DoD as a contract against revisiting a “Done” item. Release Planning was removed from the Scrum framework several years ago; perhaps misunderstandings like the ones in this discussion are part of the reason why. Creating additional layers of planning, and tiers of DoDs, is a move backward toward classical project management.


Man, sounds so simple, @ALarimer. Makes you wonder, right? :slight_smile:

This is why making software with us pesky “humans” is hard: it’s NEVER that straightforward, right? I don’t think it’s “incorrect” to add layers to things like planning and done. It just depends on how many teams, how many streams of work, and how safe teams feel. Often, you have to build to that level of agility over time.

Curious to know which misunderstandings we have had in the discussion, and more of your perspective. Seems like you have a ton of great experience to lend to our conversation.

Thanks for participating!


@chrismurman, humans attempt to create a sense of security through the perception of control. In striving for control, things are often made more difficult than needed. When a cognitive activity, such as creating software, is involved, the complications are often multiplied.

Planning (i.e. releases, road maps, big design up front) is often a manifestation of the desire to feel in control. That’s what waterfall has sown for years… and continues to do so too often. Though not necessarily incorrect, these added layers often serve as a comfort blanket that is never shed, resulting in continued poor project management practices.

Back to the OP. A sense of being finished (never looking back) after completing a Product Backlog item (perhaps a user story, perhaps based on acceptance criteria) would be a mistake. If that is the practice, then the reality is possibly (probably) a group simply executing a project plan in Scrum terms. This relates directly to the “observation” in the OP: the Sprint Review event as a sign-off.

Just because an item generates feedforward, with an opportunity to modify the product already created, doesn’t mean that it cannot ship. The Sprint and Definition of “Done” provide the small requirements freeze needed to focus on working toward the next opportunity to inspect and adapt. These new desires need to be placed on the Product Backlog and prioritized; repeatedly revisiting the same items can certainly be of questionable value.

The idea of scaling (many teams, many streams of work) is a different topic. Briefly, though, it is really helpful to think in terms of good software engineering, i.e. SOLID. Avoid the tightly coupled monolith by breaking (slicing, if you will) the product into smaller products, with API items in the appropriate Product Backlogs. The micro-services approach is a step in the right direction, but it still often results in big design up front.


I think it was meant to be something organically grown between the team, PO, and stakeholders. Is it a list? Probably, but I would not rely on the list; eventually that list can become large as quality and value increase. I think if things are done right and the team is close to its stakeholders, it’s more organic: we can see or signal when something is missing rather than checking off a list. Or maybe I am just tired and making stuff up :)


A team’s Definition of Done should, ironically, never be done.

It needs to evolve over time, informed by reality.


DoD’s: Don’t let perfect be the enemy of the good (enough). You can always iterate…
(But don’t mistake “the good enough” for lacking quality!)


Some great points made in this thread. From a practical perspective, I’ve found DoD to be helpful in these two areas:

  1. Convenience in defining work: “use it as AC that applies to all stories.” This allows units of work (e.g. stories, if that’s how you’re tracking) to be concise.

  2. Focusing the discussion on delivery. When the goal is not black and white, teams can easily get caught in debates about whether missing functionality is a bug in the current work or a new request to be tracked as a new user story. Having a clear DoD short-circuits that conversation so that the missing work is quickly logged and fed through implementation to delivery.
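To make point 2 concrete, here is a minimal sketch in Python of treating the DoD as one shared checklist applied to every story. All of the names and criteria (`DOD`, `is_done`, `missing`) are my own illustrative assumptions, not taken from any real tracker or tool:

```python
# Illustrative sketch: the DoD as one shared checklist applied to every story.
# The criteria and names here are hypothetical examples, not a real tool's API.

DOD = {
    "code reviewed",
    "unit tests pass",
    "deployed to staging",
}

def is_done(completed_checks: set) -> bool:
    """A story is 'done' only when every DoD criterion is satisfied."""
    return DOD <= completed_checks  # subset test: nothing on the DoD is missing

def missing(completed_checks: set) -> set:
    """Whatever is still missing becomes logged work, not a debate."""
    return DOD - completed_checks

story = {"code reviewed", "unit tests pass"}
print(is_done(story))   # False: "deployed to staging" is still open
print(missing(story))   # the gap to log and feed through to delivery
```

With an agreed checklist, the “is it a bug or a new story?” debate collapses into a set difference: whatever `missing()` returns simply gets logged and worked.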

Totally agree with the point that the DoD is never done. It should evolve as needed over time.


Great points made by Chris, Zach, and Andy. In my experience, the DoD is one artifact I often ask my teams to review and update as we progress through multiple sprints. It’s like adding more weight at the gym compared to my last visit. I have recently blogged on this topic in my blog [] , please let me know your feedback on DoD or any other topics. Thank you!


Perfect. Although I’m not sure I completely get the metaphor. I don’t want the DoD or DoR to just gradually get bigger or more robust. I’m sure you were just referring to becoming more mature, though, which I completely agree with. :slight_smile:


Yes! I was referring to maturing teams. As the teams mature, the DoD is revised, but not just to make it bigger and more robust.


I don’t follow. Can you give an example of “more mature” and an example of “more robust” in order to elucidate the distinction?


Great question. I was a little vague and confusing.

The DoD can and must evolve over time because, as teams mature, so must the ways they work. I prefer to start with a minimal DoD and DoR and add fidelity as needed. You might start with “pass QA” and then mature the criteria to include unit tests and code coverage. Or something like that.
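As a hypothetical illustration of “clarify and refine” versus “keep adding more and more” (the sprint numbers and criteria below are made up for the example, not anyone’s real DoD):

```python
# Hypothetical sketch: a DoD maturing by refining a vague criterion into
# stringent, testable ones -- not by growing without bound.

dod_sprint_1 = ["passes QA"]  # minimal starting point

# A few sprints later, the vague "passes QA" has been replaced with
# specific, checkable criteria.
dod_sprint_6 = [
    "all unit tests pass",
    "line coverage >= 80%",
    "exploratory QA session held",
]

# The vague item was refined away, not kept alongside the new ones.
assert "passes QA" not in dod_sprint_6
print(len(dod_sprint_6))  # still a short, focused list
```

The point of the sketch is only the shape of the change: the later DoD is stricter and more testable, but it stays a short list rather than an ever-growing one.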

When @rajanikasturi first posted, I wasn’t sure if he just meant “keep adding more and more” as opposed to “clarify and refine.” That’s where the robust part came from.

Does that help or did I just confuse you further? :slight_smile:


@chrismurman, you are right: the key words are clarify and refine, so that the team’s focus is to further enhance quality. Like you mentioned, you might start with “pass QA,” but as the teams mature, they find opportunities to add the other criteria you mentioned.

@jsampson, the Definition of Done section in the Scrum Guide says: “As Scrum Teams mature, it is expected that their definitions of ‘Done’ will expand to include more stringent criteria for higher quality.”

I have a few details and examples in one of my blog posts on DoD. Please feel free to review and let me know your comments as well.

– Raj Kasturi (does anyone know where I can change my display name? My full name appears as rajanikasturi on this blog.)