Definitely, Maybe Agile
Ep. 148: The Definition of Done
In this episode of Definitely Maybe Agile, Peter Maddison and David Sharrock dive into the often misunderstood concept of "Definition of Done" and its crucial role in measuring and maintaining quality in agile teams. They explore how this simple yet powerful tool can significantly impact team discipline, product quality, and overall project success. The hosts discuss common challenges teams face in adhering to their Definition of Done and offer insights on effectively implementing and maintaining this practice.
This week's takeaways:
- The true value lies in developing team discipline to consistently meet the agreed-upon criteria, not just in creating the list itself.
- The Definition of Done should be viewed as a "tax on delivery" - a necessary investment in quality that may slightly reduce immediate output but prevents future rework and technical debt.
- The implementation requires buy-in from the entire team, including product owners, and should be kept to a manageable 5-10 items to ensure consistent adherence.
Whether you're a seasoned agile practitioner or new to the concept, this episode offers valuable lessons on the balance between quality and commitment, ensuring your team's work stands the test of time.
Welcome to Definitely Maybe Agile, the podcast where Peter Maddison and David Sharrock discuss the complexities of adopting new ways of working at scale. Hello, Dave, how are you doing? Excellent, good to see you again, Peter.
Dave:It's been I don't know a week or two since we last talked, so it's good to catch up and see how things are going.
Peter:It is. It is. I mean it's summer. We're enjoying the lovely weather outside.
Dave:You picked the one day in Vancouver that it's raining this summer, but that is not a surprise, of course.
Peter:Of course, I think we sent it all your way. Thank you. You're welcome. So, today's topic: I called this measuring quality, but that kind of gives the game away. You were talking about the definition of done and its relationship to quality, and I think this is an interesting topic. I mean, it comes up quite a bit, so let's dive into it.
Dave:Yeah, and I think this is so much more than measuring quality. I think there are better ways of measuring quality that we can talk about. In my humble opinion, in terms of working with teams, it's actually one of the most impactful artifacts that we can draw people's attention to and use with a team. When you introduce the concept of this definition of done, this sort of checklist of things that we all agree are required in order to consider a piece of work complete, no one has any difficulty agreeing to make that explicit. They'll have a conversation, they mutter a bit about it, but something comes together. That's not the value in the definition of done. The value in the definition of done is the team developing the discipline to always meet their definition of done, and that's much harder.
Peter:Yes, yeah, and because that requires that everybody in the team is aligned and agrees on what it is they need to do and that they've internalized it essentially.
Dave:Well, so it's easy if we say we'll all agree to do X, whatever it is, test automation or some, you know, number of unit tests, or whatever it might be.
Peter:Every piece of code must have an explicit test written for it, or something.
Dave:Well, we can all agree to it when we're not intending to meet it.
Dave:So we can all nod our heads; we're kind of going through the motions of actually agreeing to it. But there's a completely different context when we start saying: this piece of work, we're really close to being finished, we're expected to deliver it, our commitment says we'll have this done by the end of the day, and we can't actually meet that commitment because there are some definition of done criteria that will take longer than the time we have available. That's the point where all of a sudden the team tries to cut corners, because they'd get recognition for meeting a commitment, and because a lot of the time the definition of done is invisible work outside of the team.
Peter:Yes, and I'm glad you put it that way, because I mean, the way you were going there, it's always because there's somebody cracking the whip at the top.
Dave:the recognition we receive as a team is on meeting our commitment and getting work across a line. Um, the definition of done describes this sort of hidden work, this invisible work that is necessary so that it crosses the line once and doesn't kind of come pinging back to us in a future round of work because we missed something, we introduced a defect, whatever it is. So it's a very important piece, but unfortunately it's delayed before we pay the price for skipping the definition of done. So teams can often develop that habit of skipping the definition of done for a while.
Peter:Yeah, especially if it's something where, as you say, there's a knock-on effect that they're not going to see until later. So they're basically building up debt at that point, and at some point someone has to decide to go and pay that down. And especially if it's something like, hey, the documentation needs to be updated, that can be pushed down the road until it becomes a real problem.
Dave:Well, and then, you know, it's nearly impossible to document something once 18 months of work have gone into it. You're never going to document what's really there.
Dave:And I think it's always worth bearing this in mind as you introduce a definition of done. The way I think of a definition of done is that it's a tax on the delivery of the team. The team can deliver a certain amount, and they have to reduce that amount in order to cover the definition of done work, which means we don't want a list of 25 things that the team has to deliver; they can't do that and still get sufficient work done. So it's like the minimum quality standard the team agrees to, and nothing will leave the team without meeting that minimum standard. A rule of thumb we always use is five to ten items, something like that, so it's somewhat lightweight.
Dave:There'll be some things in there, like maybe technical documentation needs updating, that put a little bit more of a load on the team. However, the key is to make it small enough that the team can commit to it and hold themselves accountable to meeting it.
Peter:Yeah, and I think having the buy-in outside of the team, too, is kind of an essential part of that.
Peter:It's that we understand that the team is going to deliver, and this is the expectation of how they're going to deliver. It even becomes part of how you say: okay, this is where we've got to, this isn't quite ready to go yet, and we've got these last bits to go.
Peter:This is what it's going to take to get there. And having that very transparent conversation around it, I think, is a key part of it as well. One of the other pieces that I think is critical is having the conversation so that everybody in the team is well aware of what it is, but also making sure that it's something that stays front of mind. It isn't the thing that gets done at the end of the process (though some things might need to be), but something that is part of the thought process all along. It's like: how are we going to ensure that we've got everything we need as we get to delivery, that we've captured all the things we need for our definition of done? And that adds a lot of value, because it gets people starting to think about what can include a lot of non-functional aspects of the work.
Dave:Yeah, I like that. It's definitely beyond the team, and I also want to see that conversation with the product owner actively asking to make sure things have met the definition of done, interested in what's in the definition of done and understanding why it's important. You know, sometimes that product owner role is frustrated because they want more work done and they don't view the definition of done as valid work in a sense; now we've got a problem. Whereas on the best teams I've had, the product owner is all over the team to say you cannot bring anything into the review that doesn't meet your definition of done, so validate that it meets the definition of done. Maybe there's even a purpose in demonstrating it. There's value in demonstrating the definition of done, definitely not all of it, but that can be useful in helping everybody understand that this is critical to the success of the product, right?
Peter:Exactly, because having these pieces in place means you're starting to think about it. And I think, actually, in talking in the abstract, we've already touched on a couple of the types of things that tend to go into that.
Peter:It might be worth talking through some of the other types of things that we see going into that. I mean, updating technical documentation; ensuring that the code has the proper tests passing, especially if there's a subsequent QA step of some kind; are the test steps written out and understood so that it's ready to be handed to whatever subsequent groups there are; have the automation scripts for deployments been built, tested and validated prior to handover, for organizations operating that way. Again, in an ideal situation the team has the capability and power to do some of these things themselves, but it's not always the case. It depends on the nature of the systems they're interacting with, and, in fact, from that perspective, the definition of done can be a very good way of ensuring that you do have those smooth handoffs into later aspects of the system.
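To make that concrete, here is a purely illustrative sketch of how a team might encode a lightweight definition of done like the one described above and gate work on it. The item wording and the is_done helper are hypothetical examples, not anything prescribed in the episode.

```python
# Hypothetical, lightweight Definition of Done kept to a handful of items,
# along the lines discussed above. Item wording is illustrative only.
DEFINITION_OF_DONE = [
    "Code reviewed by another team member",
    "Unit tests written and passing in CI",
    "QA test steps written out for the downstream QA group",
    "Deployment automation updated, tested and validated",
    "Technical documentation updated",
]


def is_done(completed: set) -> bool:
    """A piece of work is 'done' only if every agreed item has been met."""
    missing = [item for item in DEFINITION_OF_DONE if item not in completed]
    for item in missing:
        print(f"Not done yet: {item}")
    return not missing


# Example: a story that skipped documentation is not done, whatever the deadline says.
finished = set(DEFINITION_OF_DONE) - {"Technical documentation updated"}
print(is_done(finished))  # prints the missing item, then False
```

The point of keeping the list this short is exactly the "tax on delivery" idea: small enough that the team can genuinely commit to it on every piece of work.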
Dave:I've seen really great experiences with teams that, for example, include code reviews, which is pretty common, but also things like pairing on the design phases of solutions, so that again you're not creating these single points of expertise, these subject matter experts within the team; there's that pairing around it, which has a lot of benefits, both quality-wise and knowledge-creation-wise. As well as getting the work as far down the release and deployment pipeline as you can. Again, you mentioned it varies, but you don't want it sitting on some machine hidden away somewhere within the team's span of influence, where they're working; get it out of that and into pre-production or wherever it might be. People also often forget this, but there are things like documentation. We've talked about technical documentation, but if you're in a regulated environment, there's regulatory documentation that might need to be updated or partially modified as you make change after change after change.
Peter:Yeah, attestations and other things that need to be documented or captured as part of that, and building that out as part of your system can be a key piece for regulated organizations. So having something in your definition of done that asks: have you done the necessary attestations so that this can be pushed forward into the next stage of the release? That can be quite a good way to do it, basically making it part of the system so it isn't something that gets forgotten.
Dave:Well, and that might be, you know, sort of support documentation, so the problems don't come back into the team but are actually being handled somewhere else, or marketing and training material. And again, it isn't the complete training material, but it's, you know, here are some screenshots of what's changed, here's the feature that's been added or modified, so that it can now be included in materials that are potentially being written outside of the team.
Peter:Especially if, I mean, one way that this can get done is: say I'm building a component of a larger system. I'm putting in my pieces, which are modifying the screens I have, but the whole solution isn't going to be ready until the other parts are done. So release is separated from deployment. I'm deploying those pieces, they're hidden behind a feature flag, and I can then make them available at the right point in time. And now, as the solution comes together from a product perspective, some other group, quite likely some other team, is putting what you're providing together into the user documentation and marketing documentation, to make it available on whatever interfaces the customers might have. Rather than saving it all for the online documentation at the end, exactly. Yeah, so how do we wrap it up?
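The deploy-now, release-later pattern Peter describes here could look something like the minimal sketch below. The flag name, the FLAGS dictionary and render_checkout are hypothetical; a real team would typically read flags from a configuration service rather than an in-process dictionary.

```python
# Minimal sketch of separating deployment from release with a feature flag.
# The flag name, FLAGS dictionary and render_checkout are hypothetical examples.
FLAGS = {"new_checkout_screen": False}  # code is deployed, but not yet released


def render_checkout(user_id):
    """Serve the new screen only when the flag is on; otherwise keep the old path."""
    if FLAGS.get("new_checkout_screen", False):
        return f"new checkout screen for {user_id}"
    return f"existing checkout screen for {user_id}"


print(render_checkout("user-123"))   # existing screen: the change ships dark

# When the rest of the solution is ready, flipping the flag "releases" the
# feature at the right point in time, without another deployment.
FLAGS["new_checkout_screen"] = True
print(render_checkout("user-123"))   # new screen
```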
Dave:If our definition of done for this episode is three things to walk away with, what are the three things we want people to walk away with and understand?
Peter:I think, first, ensuring that you've had the conversation around it, to understand what we, as a team, need to insist on before anything we say is ready to go; having that conversation is valuable in and of itself. Keeping it front of mind throughout the process, I think, is one of the key pieces as well, so that you're aware of what you're going to have to make sure is ready by the time you get to delivery. And I liked your concept of it being a tax on delivery, making sure it's understood that this is something that has to happen, so if it reduces your capacity to deliver by that amount, that's a good thing as well. Excellent, three things, nice and smooth. There you go. Awesome. A pleasure, as always, Dave, so we can wrap it up there and we'll chat next time.
Peter:Don't forget to hit subscribe, and you can reach us at feedback@definitelymaybeagile.com. That's good. Thanks again, Peter. Always a pleasure. Always a pleasure. Bye. You've been listening to Definitely Maybe Agile, the podcast where your hosts, Peter Maddison and David Sharrock, focus on the art and science of digital, agile and DevOps at scale.