Matthew Philip

Transcript

Okay, now we can start.
We were just waiting for Klaus.
Okay. Hello, everyone. My name is Mathieu. Matt? Matthew?
Thank you for coming. I'm going to be speaking in English because you can probably tell my French is not very good. But anyway, so thanks for coming. I'm going to be talking about the service delivery review and helping us to think about what that is, demystifying it somewhat, and hopefully I'll have some time to answer questions at the end. So at the end if you have questions, but also as I talk at any time if you'd like to raise your hand and ask a question, that is fine.
Okay, so many of us have seen this, the Agile Manifesto for Agile Software Development, right? Okay, does anybody have a favorite line, a favorite phrase of this?
Which one?
People over process.
Okay, people over process. Okay, very good.
Mine is more responding to change.
Responding to change. That's your personal. Okay, good. Very good.
Anyone else? Favorite phrase?
Okay, good. I kind of like that. All right. Good. Yeah, so this is a very enduring document. It's very helpful in helping us understand how we can do things better. My favorite phrase is actually the very first part of the document, because this, to me, helps us to think about agility as a dynamic thing. It's not just a static one-time definition; it pushes us to always be learning, always be discovering new ways. And one of those new ways for me has been understanding the benefits of the service delivery review. So one of the things that I like to talk about is this idea of becoming continuously fit for purpose. Not that I'm personally fit, you know, but helping organizations and teams become continuously fit. And that's a dynamic process. It's not that you figure out your fitness, what makes you fit for your purpose, once and for all; it's an ongoing discovery, ongoing feedback, ongoing sensing to understand what makes you fit for your purpose. Otherwise, you'll go the way of the dinosaurs, and your company will no longer exist. So it's really important, at a high level, to understand what makes us fit for our purpose, and to always be doing that, because we can miss out.

So what is the problem? You might say I'm proposing a solution without a problem. I'm proposing the service delivery review, but why do I care? If it's important for us to always be understanding our fitness for our purpose, we always need to be getting feedback. And one of the things that I've found working with teams and organizations over the years is that while we have some really good ways of understanding and getting feedback for parts of what we do, like the product (there are lots of good ways to understand product fitness: have we built the right product?), one thing that's missing is understanding how fit for purpose our service is. So if you think about different businesses, think about a restaurant.
So is a restaurant in the business of product or in the business of service or something else? How many people think restaurants are in the business of product?
Okay, yeah, okay. Service? Okay, service, okay. How about, who thinks both?
Okay, right. So, and it's like that for knowledge work as well. Anyone who is in software delivery, software development, is probably producing some kind of product, whether it's for consumers, users, or other developers; if you're developing an API, you're developing a product. But there's also the way in which you deliver that product, the service delivery, the way that people understand how long it takes you to get that thing. So just like in the example of a restaurant, we care about the food and the drink, especially the wine, especially here in France. But we also care about the service: how fast we're seated, how fast we're served, the relationship we have with the people who are serving us. So here's an example. This is from my own organization recently, a request that I made, that my teammates made, to have something designed by our design group. All right, so if you're familiar with the idea of the Kanban lens, the Kanban lens is basically this: if you had special glasses
to be able to see your organization, you'd see that your organization consists of multiple different services, different people providing services either internally, or internally and externally to our customers, our clients. And so in my organization, we have design people. They work for our clients, but they also provide services internally for people like me, who have no ability to make anything pretty at all. Right? And so we had a presentation that we wanted to make look nice, because we had a presentation coming up. And so we requested it from our colleagues, and this is the response: thanks for sending this over. I'm happy to pass this to the design team to get some help on it. Is there any due date? All right, so how would you answer this?
So you're passing off, you need a request from someone, and you'd like it soon, but there's really no due date, but you don't really know when they'll have it ready for you. You've never met these people, maybe.
What would you do?
How would you answer this? Would you say a due date? Yes? Any make up something?
Yeah, right, right. So actually anyone who came to Andreas' talk this morning would know this. He mentioned the idea of giving fake due dates, right?
Yeah, right, yeah, very good. So we didn't have a due date, but we also didn't trust and know how long it would take to deliver this, right, which is important to us. Because I know what happens when someone asks me for something and doesn't put a due date on it.
It goes to the back of the list, right? So here's kind of what happened. The reply said so-and-so can work on it next week; she was kind of busy, had other projects to do. And so we said, oh, now it's becoming close to the time when we actually need it. We do have a due date now. And we still don't understand quite how long it's going to take to have this request completed. Right? And so she said she can work on it early next week, and she's pretty fast. Okay. So what is pretty fast? Pretty fast is a qualitative understanding of speed, but it doesn't help me plan. Pretty fast is just an idea of how long, but it's not a number of days, or even a probabilistic expectation of how long it would take to be finished. So this is one of the reasons, and this is just a simple example, inside my organization, inside an internal service. But again, if we think about the Kanban lens, there are all types of services going on, and we need to understand each service's fitness for its purpose. So in this case, it was important that I had this done by a certain time, or with an expectancy that I could make planning decisions around. So, one reason we want to have a quantitative understanding of the fitness for purpose of our service delivery is this. So I mentioned this. If we don't have any sort of quantitative feedback... if they had said, we have a 90% expectancy that we can finish this in two weeks, that would have been very helpful for me. That's much better than: do I need to make up a fake due date? Or do I need to always be nagging and sending a follow-up email? Are you working on this? Is this going to be done? Right? Anyone have to do that before?
It's because we lack that feedback. We lack an understanding of what makes the service fit for its purpose.
And so we make these artificial boundaries. We make artificial fake due dates, which are actually counterproductive. So if you think back to the example of the restaurant, there's an idea of a product, there's an idea of service. And there's also two viewpoints we can have. One is from the viewpoint of the customer, so you and I who go to the restaurant, but also internally focused inside the restaurant, the people who are running the restaurant have concern about those two things as well. So if you think about these two things as a matrix, as two axes,
So the feedback needs we have are in those categories. So if you think about from, just think about product fitness. That's the easiest one, perhaps. So from the customer. And another way to think about this external viewpoint is the phrase, have we built the right thing? So there's a difference between have we built the right thing and have we built the thing right? So those are the two different viewpoints we can think of, or the two different aspects. And one is maybe team-oriented internally, and one is externally oriented, or we might say customer-oriented. And then we have the service and product parts of this. So is our product fit for its purpose? Have we built the right product? And like I said before, We typically have good ways of assessing that already. Similarly, is our product healthy from an internal standpoint? So the idea of health is related to fitness in the customer's eyes. There's a relationship there. There should be. So we have ways of understanding that. Is our team healthy? So internally facing.
Have we built the team right? But then there's the question about have we built the right team? Have we built the right service for our customer? So feedback loops, what do we have? So let's start with the team. Have we built the thing right from an internal perspective? Service perspective. What kind of feedback loops do we have in the traditional agile cadences or feedback loops? Anyone want to guess?
Retrospective, yes, very good. Retrospective. Is it cognate? Right, okay, very good. All right, let's move on down to product and team. Demos, okay, so think about the, so yes, demos is a helpful thing. Is it giving us feedback on the health internally focused or the external customer view?
External, yes. Okay, good. Okay, so that gives us external view. What about internal?
Quality, okay, how do we assess quality from an internal perspective?
Okay, quality metrics. Yeah.
Okay, Sonar, yeah, okay, good, good, good. And for those of us who don't know what Sonar is, can you tell us what Sonar does?
It's basically a tool to assess the quality of code.
Yeah, very good. So right, exactly right, right. So it's an internal health, yeah, how healthy is our code? How healthy is the product? The customer cares about that, but only so far as it affects how the product functions, right? So there's a relationship there, very good. Okay, good. All right, so we have automated build, automated tests, code metrics. Right, good. Performance monitors. That's giving us an internal viewpoint on the health of our product. Okay, you mentioned demo. Gives us an understanding of, from an external perspective, the customer, what the customer really cares about.
Usage metrics and then just money, right? We know we've built the right thing if we're making money.
Usually that's a good sign. It doesn't necessarily mean we've built the wrong one if we don't make money. Sorry?
Customer satisfaction.
Yeah, customer satisfaction. Exactly right. Very good. Okay. And then finally, what about in the service from the customer's standpoint? What kinds of feedback loops do we have?
Okay.
So, sorry, what was it? Customer survey, yeah, that gives us a sense. Yeah, good, good, good. And so we'll talk about that.
Complaints?
Complaints. Yeah, that's good. That's kind of a lagging metric, isn't it?
Yeah, right. Good. Okay, so I'm proposing to you that one way we can do this, in an organized, regular cadence, with the same kind of discipline that we apply to these other feedback loops, is the service delivery review. This is something I've found to be useful in a variety of different domains, different groups. Anything that is offering a service, whether it's software delivery or an internal group like the one I'm working with now: understanding what our services are, then what makes each service fit for its purpose, and then constantly assessing that, so that we can continuously make sure that we're fit for our purpose. Okay, so what is this? So this is a definition.
This is based on David Anderson's initial definition. I've simplified it a little bit for my simple brain. But the main things are in the boldface: a quantitatively oriented discussion between a customer and a delivery team. So if you think about that, it's anyone who's delivering a service and anyone who's consuming a service; the customer doesn't necessarily need to be the end customer. In the example I was talking about, making a simple design request in my organization, I was the customer. Anyone who's consuming the service, anyone who's providing the service. About the fitness for purpose of its service delivery. So just like in the restaurant example, the cook in the kitchen can't tell you whether the product and the service delivery are fit for their purpose. It has to be from the context of the customer. And that's where our traditional agile feedback loops fall short: in a team retrospective, for example, we understand, oh, we get along okay, we're working well as a team, I'm friends with you, I pair well with you, so good, that's great. But we really are blind to the customer's viewpoint of how fit for purpose that team is. It could be a great team, and I've worked with lots of great teams who get along well, who are very productive, who feel very productive, who feel happy even. And yet they miss the mark, because they don't have a feedback loop through their customer's eyes. And so this is very helpful, because in my career it's been hard to see teams like that, teams who are doing well, and yet because they lack a feedback loop from their customer, they fail. They lose the contract; their customer is dissatisfied for some reason. It's not because they weren't working hard, it's not because they were not talented developers, it's because they just missed something from their customer's perspective. All right, so you may have seen this chart before.
It basically gives you a sense of the various feedback loops that we can use. And so the service delivery review is in the middle here. I actually like it being in the middle, because it really does touch on all the other ones. If you're familiar with the practice of the Kanban meeting, some people call it daily planning, flow planning, the stand-up meeting, that's more of a micro-level planning.
So that can feed into the service delivery review. Service delivery review can then also help us understand at a higher level if we use it in conjunction with an operations review. But this is a way for some of these things to fit in. So I just wanted to give you a sense of where the service delivery review is.
Okay, so fundamentally, I mentioned the idea of having the customer's voice in the meeting. And we're asking this question constantly, is our service fit for its purpose?
So some ideas, some possible topics that you can use in the service delivery review. These are all things that give us feedback on what's important to the customer. But we have to first ask the customer what's important. So, delivery time. A very common concern is: how fast will this be done? When will I have this stuff ready? It's like when you go to a restaurant. You care, depending on what your needs are. If you're getting fast food, you obviously have a threshold for your food being ready. In how long?
What's an acceptable wait time for you when you go to a fast food restaurant?
Five minutes, okay, all right. Yeah, if you had to wait more than 20 minutes, would you go to that restaurant anymore for that purpose? Probably not. Right, so speed is a concern. But if you're going to a nice restaurant, for example, you probably would tolerate a longer wait time. You'd probably also be suspicious if they delivered the food to you quickly. If they got your food out in five minutes, you'd probably wonder if the food was actually fresh or made to your order, right? So speed is obviously a very common concern. How predictable are we? If I said, well, you might have your food in one minute, you might have it in an hour, is that acceptable to you? It depends on what you're looking for. Probably not, though. I've used blocker clustering, and Klaus is here, I learned this from Klaus, so ask him if you want more on blocker clustering, but this is a way to help us quantitatively understand the impediments to our flow. I've used service delivery review meetings to talk about those things, because it's all related to our predictability of delivery, our flow of work. Work type mix. This is an understanding of how much of each type of work we want to be doing on a regular basis. So if we're focused on expedite-type work, is that acceptable to us? Are we focused on bug requests, bug fixes? Is that acceptable? Do we want to have thresholds for the amount of each of those types of things that we want to be working on in any period of time? The nice thing about this is that you can decouple the cadence of these service delivery reviews, just like you can decouple the cadences of your planning meetings. So if you're working in a sprint context, for example, or a Scrum context, you can overlay this onto it. It's not one or the other; you can use this in conjunction. Because, again, we're getting feedback in a different sense. We're getting feedback on our service delivery.
Okay, so I'll show you some examples of these in just a moment. Due date performance: this is really important in some contexts. If you're working on something where you need to make a deadline, you need to know what your performance against it is. And so that's another thing you can look at in your service delivery review. Policy changes. This is a good forum for that discussion, one that I've used, because it's got the right people in the room to have that conversation.
Fitness surveys. So this is something, if you've read or seen Alexei and David's new book, Fit for Purpose, they talk about this, and it's a really helpful way to gauge fitness. Someone mentioned customer surveys. But here you're asking very explicit questions that relate to the service. So it's not so much about the product; it's the service. And so you can see in there, they have something they call the fitness-for-purpose box score. And I've used this actually with teams not even doing software delivery; one team I worked with was a group of facilitators, retrospective facilitators. And the question that management had was: are our facilitators good? Are they doing the work well? Are they helping at all? So the first question we had to ask is, what is the purpose of these facilitators? What is the service that they offer? They offer professional retrospective facilitation, third-party objective facilitation. All right, so that's good. That's the first step, understanding what the service is. And then, what makes that service good or acceptable to the teams that they're serving? And so we actually used a very early version of this fit-for-purpose question, this fit-for-purpose card, to get an idea from the teams that they were serving of how fit for purpose that service was. So again, think broadly, think Kanban lens: for all the different services in your organization, there's fitness for those purposes.
Okay, and then frontline staff reports. Again, these are the people who are going to be interfacing with customers who are going to know and sense, be able to sense the market, sense what's going on the best.
And ultimately, what obstacles stand in the way of us delivering to our service expectations? Right, so if we think about the actual metrics, there are some examples here.
Team health, engagement, and flow efficiency are on our service's internal-facing side, which again relate to the customer's concerns. But ultimately, when I go to a restaurant, for example,
I care about how well the kitchen staff are getting along, but only so far as it affects how well my service is done. All right, and so I'm sure your customers care about your team and whether they're happy or not, but ultimately they care more about these things on the right, from the customer standpoint. So delivery time distribution, throughput, and due date performance are some examples there.
All right, so this is just a worksheet that I have that allows people to get started and gives a template for thinking about what are the kinds of things we want to be bringing into the service delivery review and what are some of the outputs. So we have a definition of the services, what makes those services fit for their purpose. And again, the more you do this, the more you can continuously evaluate if you're fit for your purpose. So you have on the left, how are we doing? And maybe you think about different expedite policy, different service levels that you have for those things, standard urgency requests, and so on.
Improvement drivers and hypotheses. So one of the things that's nice to come out of this meeting with is a quantitative hypothesis or experiment. If we want to improve something about our service delivery, what are some quantitative measurements that we can use to assess whether we've actually improved?
And then perhaps there's policy changes that we need to involve.
Okay, so one thing that's helpful is to review recent work and how things are progressing. So this is, if you're not familiar with scatterplot charts: over time, the cycle time, I call it delivery time, basically how long each work item is taking to complete, plotted over time. And so we can see from this, it gives us a sense of our delivery time expectations in percentiles.
So this is a very helpful way to, at a glance, be able to see the trends as opposed to, say, a frequency chart, which doesn't necessarily give you a sense of time. This gives you a sense of time. So I like to practice what I preach. So anybody use personal Kanban for your own personal work? Okay, so I use personal Kanban for my life, my non-work stuff, and I basically do my own service delivery review. All right, yes, I'm a geek.
But it's helpful because I can see how well I'm doing in the things I want to be doing and what makes my life fit for its purpose and all the different people that I'm trying to make happy and satisfy. And so this is just from my recent service delivery review. I do it every Monday. By myself. But basically, this is giving me a sense of the cycle time for my standard urgency things. So things that don't have a deadline, how fast am I completing them? So if you ask me to do something for you, I could tell you, well, I have a 95% expectancy that I'll finish that in 22 days or less.
85% expectancy that I'll do it in seven days. So about a week. So now I know. And so actually now I can improve on that, right? So you can see over time. That line has gone down a bit. And so I'm completing things a little bit faster. Maybe just luck, but it gives me feedback on my service, my personal service.
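The percentile expectations described here can be sketched with a simple nearest-rank percentile calculation over completed items' delivery times. This is a minimal illustration, not the speaker's actual tooling; the delivery times below are invented sample data, chosen so the 95th and 85th percentiles echo the 22-day and 7-day figures from the talk.

```python
import math

def percentile(sorted_times, p):
    """Nearest-rank percentile: the smallest delivery time such that
    at least p% of completed items took that long or less."""
    k = math.ceil(p / 100 * len(sorted_times))
    return sorted_times[k - 1]

# Invented sample: delivery times (in days) of 20 recently completed items.
delivery_days = sorted([1, 1, 2, 2, 3, 3, 3, 4, 4, 5,
                        5, 5, 6, 6, 7, 7, 7, 12, 22, 30])

for p in (50, 85, 95):
    print(f"{p}% of items finished in {percentile(delivery_days, p)} days or less")
```

With this sample data, the 85th percentile comes out at 7 days and the 95th at 22 days, which is exactly the kind of statement ("95% expectancy in 22 days or less") the review is built around.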
This is how I'm doing with my fixed-date items. So if someone asks me, Matt, I need you to do this by such-and-such date, I can tell you how I'm doing. So you can see, the red line is the last two-week period. And basically I'm trying to keep it above 90. You can see a couple of dips in the seven-day view. And so this allows me to assess what's going on there. And I found that these were weeks that I was traveling; I was either going to conferences or doing some visits out of the country. So I know that I can proactively communicate that to my stakeholders, whether it's my family, my friends, whoever: my service delivery expectation is going to change in those intervals. And if that's not acceptable to them, I can try to make compensating changes in what I'm doing.
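A due-date performance number like the 90%-plus line described here is just the share of fixed-date items delivered on or before their due date over some window. A minimal sketch, with invented dates (not the speaker's data):

```python
from datetime import date

def due_date_performance(items):
    """items: (delivered_on, due_on) date pairs for fixed-date work.
    Returns the percentage delivered on or before the due date."""
    if not items:
        return None
    on_time = sum(1 for delivered, due in items if delivered <= due)
    return 100.0 * on_time / len(items)

# Invented recent fixed-date items.
recent = [
    (date(2019, 6, 3),  date(2019, 6, 4)),   # a day early
    (date(2019, 6, 7),  date(2019, 6, 7)),   # on the due date
    (date(2019, 6, 12), date(2019, 6, 10)),  # late: a travel week
    (date(2019, 6, 20), date(2019, 6, 21)),  # early
]
print(f"{due_date_performance(recent):.0f}% on time")
```

Computing this over a rolling two-week window gives the trend line the speaker keeps above 90.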
Delivery time, this is another way to view our delivery times. This is distribution. So we can also see in the tail how long it's going to be, how much risk we might have.
This is an interesting quality type of metric I've done with teams, basically to assess: are we delivering the right stuff? And so this works in conjunction with the product fitness, but this is really about the value-adding demand, the stuff that adds value to our product: how much, in percentage, are we working on that kind of stuff, as opposed to failure demand, which is defects, things that are coming up that we don't really want to be working on? And so putting a threshold on that is useful here. You might also include technical investments; that could be a third category. And so you can look at those bands and have appropriate thresholds, minimum and maximum amounts for each of those things. So again, this gives you a sense of: are we doing the right things that make our service fit for its purpose?
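The work-type mix bands with minimum and maximum thresholds can be sketched as below. The threshold values and item counts are invented examples for illustration, not recommendations:

```python
# Acceptable bands (min %, max %) per work type -- invented example numbers.
THRESHOLDS = {
    "value demand": (60, 90),
    "failure demand": (0, 20),
    "technical investment": (10, 30),
}

def mix_report(counts):
    """counts: completed items per work type for the review period.
    Returns each type's share of the total and whether that share
    falls inside its threshold band."""
    total = sum(counts.values())
    report = {}
    for work_type, n in counts.items():
        share = 100.0 * n / total
        lo, hi = THRESHOLDS[work_type]
        report[work_type] = (share, lo <= share <= hi)
    return report

completed = {"value demand": 14, "failure demand": 5, "technical investment": 1}
for work_type, (share, ok) in mix_report(completed).items():
    print(f"{work_type}: {share:.0f}% {'OK' if ok else 'OUT OF BAND'}")
```

Here failure demand is 25% of the mix, above its 20% maximum, so the review would flag it for discussion.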
And then finally, doing a forecast. So, moving away from up-front deterministic estimating practices to probabilistic forecasting. We're using the same data, our delivery time data, to make a probabilistic forecast. So this is nice. This is actually from the ActionableAgile tool, and it's also baked into Kanbanize, who are here, if you want to see a demo. Basically, it's giving us a nice little calendar view, putting those delivery time expectations, the 95th percentile, 85th, and 50th, overlaid onto a calendar, which is really helpful, because that's typically how business people think, and how we make requests. We think in terms of calendar time. And so this is giving us a sense of that expectation. So going back to that request that I made of my colleagues doing design work,
If they had given me this, this would have been really satisfying. They would say, well, we have a 50% chance of getting it done in September, but we have a 95% chance of getting it done by the beginning of October, for example. Then I can plan. And then I can also give them feedback, to say: the time it takes for you to turn around design requests is unacceptable to me. And then it allows my organization to make decisions about what's important. Maybe we need to change what we're doing there; we need to invest more heavily in our design group, or make other changes. But only when we have that kind of quantitative feedback can we make proper investments, proper adjustments.
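Mapping delivery-time percentiles onto calendar dates, as in the forecast described above, can be sketched as follows. The percentile values and start date are assumed for illustration; this is not the ActionableAgile or Kanbanize implementation:

```python
from datetime import date, timedelta

def forecast_dates(start, percentile_days):
    """Map delivery-time percentiles (in days) onto calendar dates
    for a request started on `start`."""
    return {p: start + timedelta(days=d) for p, d in percentile_days.items()}

# Assumed percentiles for a design request started on 2 September.
start = date(2019, 9, 2)
expectations = forecast_dates(start, {50: 10, 85: 20, 95: 32})
for p, when in sorted(expectations.items()):
    print(f"{p}% chance of delivery by {when.isoformat()}")
```

With these assumed numbers, the customer hears "50% chance by mid-September, 95% chance by early October", which is exactly the kind of answer the speaker wished the design team had given.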
This is a really nice, more micro-level view of the work, but it's also a very proactive way to think about it. It gives us a sense, based on where work is in progress, so it's not completed yet, of how we can proactively manage the work so that we meet our expectations. So if we have a certain delivery time expectation at 85%, say it's 14 days or less, we can start seeing which items that are in progress have a greater or lesser chance of making it by that time. So this is a proactive way, before things get finished, to actually manage those things.
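The proactive in-progress check described here, comparing each item's current age against the 85th-percentile service level expectation, might look like this sketch (the 14-day SLE, the warning ratio, and the item names are all assumed):

```python
SLE_DAYS = 14   # assumed 85th-percentile service level expectation

def at_risk(wip_ages, sle=SLE_DAYS, warn_ratio=0.75):
    """wip_ages: days each in-progress item has been in progress.
    Flags items already past the SLE and items nearing it."""
    flags = {}
    for item, age in wip_ages.items():
        if age > sle:
            flags[item] = "missed SLE"
        elif age > warn_ratio * sle:
            flags[item] = "at risk"
    return flags

wip = {"request A": 3, "request B": 11, "request C": 16}
print(at_risk(wip))
```

Request B has used 11 of its 14 days, so it gets attention before it blows the expectation; request C is already past it.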
So I mentioned this, the fitness for purpose card.
This is an example. And I just tweeted it out; I made this public. If anyone would like to copy it, it's on Google Docs, and you can use it. It doesn't do a lot of the calculations very nicely, so you have to do some manual calculations, I will tell you that now. But it's useful; you can adapt it to whatever your needs are. And the thing that I like about this, and you can read more about it in the Fit for Purpose book, is that it's basically asking the customer a very simple question: why did you choose our service? List up to three reasons. So it's really interesting, because if we only have the internal view, we have an assumption about why people like us and choose us, but we don't really know unless we ask the customer. And so this is an open-ended way to get that feedback. And then it follows on: how satisfied were you? How did our services fulfill your expectations for these things? So that's a really helpful gauge. And it's very simple. This is not rocket science. Asking the customer, involving the customer, and quantifying our fitness.
Okay, so this is interesting. This is from a few years ago, but these are some of the high-level categories that people are generally concerned about. So, accessibility; and depending on what your business is, you might translate these differently, but these categories are common in lots of ways. Do what I ask promptly; be responsive. So there are some predictability issues in here. Tell me what to expect: if we don't do things like probabilistic forecasts, it's going to be hard for us to tell people what to expect of us. That's a really satisfying thing. My tolerance for my mechanic, for example, for my car: I don't know anything about cars, so I have to trust him. Basically, if my mechanic says it'll take 10 days to finish the work on my car, I don't know, should it take 10 days or should it take one day? But if he misses that, if he says it's going to be late, it's going to be 15 days, that's dissatisfying to me. So telling me what to expect, and being able to quantitatively do that, is really helpful. That's what people care about. Doing it right the first time, so first-time quality issues; meeting your commitments and keeping your promises. So Andreas talked about this earlier, this idea of building trust. This is a really helpful practice for doing that.
Okay.
So this is just an interesting twist. This is, for me, a new way of thinking about this. We typically hear about delighting customers with our product. Okay, that's great. But simply reducing their effort, the work they must do to get their problem solved, does at least as much. And we can understand this in terms of our service delivery.
Okay, so some benefits. Really quickly, why would you do this if I haven't convinced you already? It really focuses you on the customer's needs. And so British Airways, a few years ago, did a survey of their staff.
And then they surveyed their customers to see what they valued and what was important. And the things that the staff thought their customers liked didn't match what the customers valued. So it was very insightful. They did a type of fit-for-purpose survey; I don't think they called it that. But basically, they found that two of the major reasons why customers chose them were not anywhere on the radar for their staff. And so that gave them feedback about what their customers valued. It's really important. I like this too, setting clear standards and achievements. Like I said, I've worked with teams who've worked really hard and thought they were doing the right thing, but they didn't really know how to improve. They were satisfied, they were happy, but they didn't really have clear objectives for improvement. And so this is a helpful way to guide that improvement.
Okay, so I mentioned some of the ideas of improvement: understanding why we fail. So it's interesting. I worked with a company that was doing bespoke, custom software delivery for clients. And so they would basically care about how their clients were thinking about their delivery. And they would occasionally lose a contract. And the CEO of the company, when I introduced the idea of service delivery reviews, said: I care more about coming to the service delivery review, and the outcome of that, than I care about the product demo. You know, the typical invite-the-stakeholders-to-the-product-demo. He was actually getting more from the service delivery review, because it mattered more in terms of the customer's likelihood of staying with them.
And finally, it improves your relationships. Both work relationships and personal relationships. Like I mentioned, I do this on a personal level. My wife recently had a couple of to-do items for me. And so she said, are you going to be able to do those things today? A fixed-date delivery type of request, right? And I said, actually, I can tell you exactly what my chance of doing that would be. I said, I have a 93% chance of doing that, my love. I had just done my personal service delivery review, and she was really satisfied. She said, really? That's amazing. Okay, great. And thankfully, I did actually accomplish those things. But thanks to the review, I knew how fit for purpose my service was in her eyes, right? So I said, is this acceptable to you? She said, yeah, that's fine. So anyway, it can improve your relationship. And by the way, I've actually improved since then. As you saw, I've moved from 93% to 94% on my fixed-date delivery performance. So anyway, if for no other reason, you can improve your relationships. Okay, so you leave the conference; what do you do when you go back to work? Here are some takeaways for you.
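As an aside, a figure like that 93% doesn't require anything fancy; it can come straight from a log of past fixed-date requests. Here's a minimal sketch in Python, using an entirely hypothetical history (the dates and the resulting percentage are made up for illustration):

```python
from datetime import date

# Hypothetical log of fixed-date requests: (promised_date, delivered_date)
history = [
    (date(2019, 5, 1), date(2019, 4, 30)),
    (date(2019, 5, 3), date(2019, 5, 3)),
    (date(2019, 5, 7), date(2019, 5, 9)),   # delivered late
    (date(2019, 5, 10), date(2019, 5, 8)),
]

# A request counts as met if it was delivered on or before the promised date
met = sum(delivered <= promised for promised, delivered in history)
performance = met / len(history)
print(f"Fixed-date delivery performance: {performance:.0%}")  # 75% with this data
```

Reviewing that number at each service delivery review is what lets you answer "what's my chance of hitting the date?" with something better than a guess.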
Identify and involve your customer. This might take some thinking; you might assume you know who your customer is. Basically: who's making requests of you?
It could be internal. Ask the fitness question. Again, this is not complicated, it's very simple: what do you value about the service we provide? And that gives you a sense of how fit for purpose you are. Discovering your services, that's probably in the wrong order.
Establish service delivery expectations. At the beginning, you or your customer might not know exactly what makes you fit for purpose, but that's the whole idea of doing this continually: reviewing and asking, is 93% acceptable to you? Is a 20-day, 85th-percentile expectation acceptable? And so every week, or every two weeks, or every month that you do a service delivery review, you can confirm, yeah, actually, that's fine. You can also identify, and they talk about this in their book too, where you're over-serving the market. You might assume you need to be delivering at a level much higher than what people need, and this allows you to understand the threshold at the lower level as well.
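A percentile expectation like that can be read straight off your lead-time data with a nearest-rank calculation. A minimal sketch, again with made-up lead times, chosen here so the 85th percentile happens to come out at 20 days:

```python
import math

# Hypothetical lead times (in days) for recently completed requests
lead_times = [3, 5, 6, 8, 9, 11, 12, 14, 15, 18, 20, 25]

def percentile(data, pct):
    """Nearest-rank percentile: the smallest observed value such that
    at least pct percent of observations are at or below it."""
    ordered = sorted(data)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

print(percentile(lead_times, 85))  # 20 -> "85% of requests finish within 20 days"
```

The expectation you then review with the customer is the sentence, not the raw number: "85% of the time, we deliver within 20 days. Is that acceptable to you?"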
Also: keep doing the other feedback loops. Again, we need feedback in all four of those categories, so keep doing the things that work. You can also merge this into some of your existing feedback loops, because you typically already have the right stakeholders there.
Andy Carmichael actually talked about how you can use measuring your fitness for purpose as a way to assess your agile adoption or agile transformation effort. Rather than compliance to practices, which are more internal health concerns (are we doing these feedback loops?), becoming progressively more fit for purpose is a business-facing outcome. If you're becoming agile and you're not changing the way your customer views your delivery, then it's kind of a meaningless change, isn't it? The customer doesn't care that you're doing stand-up meetings or sprint planning meetings. The customer cares that you're becoming more fit for purpose in their eyes. So that's a nice way to think about agile adoption or transformation work: think about it in terms of fitness. And if I haven't mentioned the book enough, I'll quote it: "The tighter you make your feedback loops, the greater agility you can exhibit as a business and the faster you can sense and respond." So this idea of continually sensing whether you're fit for purpose: you're not just walking around blind in a room. You're actually using quantitative understanding to become better.
I'll leave you with that. I'll post these slides in a couple of locations; you can find them on Twitter as well. Any questions?