Gabrielle Benefield

Transcript

Hi, I'm Gabrielle Benefield, and today we're going to be talking about outcomes versus outputs, and a lot about metrics.
Have a look at this.
So ironically, I found this while I was on the phone to eBay customer service. It was my fourth time trying to call them, going through what felt like Groundhog Day, the same thing over and over again. Each time they would cut me off halfway through, and it was pretty awful. And I got the feeling they were driven by some sort of interesting metric like this one.
So I looked up metrics — what we're using in Waterfall, Lean, Kanban, Agile — and most of the measures I came across were all about outputs: things like how many features we're delivering, how many function points, or throughput measures, our cycle time, our velocity. Are we reaching our milestones on time? You did hear people start talking about value, but it wasn't really clear how we were going to create value as we went along.
And it sort of made me think about why do we measure a lot of these things? So why do you think we measure things like story points, function points, features delivered, scope, et cetera? Why do you think that is?
Because? Predictability. Predictability. You want to get more predictable? Yeah.
It's easy.
Done?
Nicely put.
I think that's aspirational — that's what we would like to be doing. With a lot of the measures, I think it's more as Joshua put it: it's actually easy. A lot of the things we measure, we measure because they're much easier to measure. When I say to people, well, throughput is good to give us an idea of our flow efficiency — how much we can deliver — I think that's important, but I don't know if that should be our main focus. And things like scope delivered, I don't know if that has much bearing on the value we deliver. Again, we do it because it's what we have and it's what's easy.

To me, just delivering more is like giving a machine gun to your product teams. What they're really doing, often with the Agile and Lean teams I work with, is delivering more features faster. So they take lots of shots, like a machine gun, and you might hit the target — your probability goes up — but you might also kill many innocent bystanders, so you create a lot of waste. It's a bit like the analogy that if you put enough monkeys in a room typing for long enough, you'll eventually get the complete works of Shakespeare. If that's what we're trying to do with products, that's a really big problem. More features are a big issue, because more features create product complexity.

Imagine if I said, I've got this great startup idea: we're going to create this really basic remote control for your television, and we're going to charge people for extra buttons, so they could have an infinite number of buttons.
Who's going to buy that? Who wants to buy lots of features for their remote control? You're scratching your head — I thought you were about to say, me, give it to me. No. What do people really need to do? They want to change the channel. They don't want all the complexity, and that amount of complexity causes a lot of issues.

If you think about it — because you incentivize what you measure — if we're incentivizing output, we're going to get a lot more output. That's the big issue: it increases product complexity. It will take us longer to get our products to market. It will cost us more to maintain them. And here's the big one I'm most concerned about: defects.

I work a lot in the healthcare space at the moment, and every line of code they build can lead to more defects. I worked with one woman in pharmaceuticals; after we'd worked together a while, we managed to radically reduce the number of features they were building. She said they went from the 100% they would plan down to about 42%, and we were able to fairly consistently keep it around 40% of what they'd been doing before. And she said, for us, every line of code could kill people — so this is great for us, we can actually save lives with this. So this is something we should pay attention to.

Who here is going to be flying after the conference?
Right. Do you want lots of extra features in your plane? More things that can go wrong? Yeah, don't think about that. Best not to.
Why do we measure what we measure? It's easy, and it's what we've done for a long time. This is something we need to reassess: where we have these measures, do we really need them, can we remove them, and what should we be measuring instead?
This goes as far as the legal contracts. So I'll tell you a story. We were working with a telco. They were putting work out to bid in a competitive contract environment. They had three suppliers come in and run the bid; two were seriously looked at. One of them had built a really beautiful, elegantly coded, lovely system of pretty high quality.
They tried to minimize what they were building. One of the other main competitors had built a lot of function points, way more than the other team, but the quality was pretty low. Who won the bid?
The second one, right? We're all skeptical; we've worked in this industry long enough. The second one won, and even though people around were saying, but, but, but, this is a terrible idea, they said, no, these guys are great — we can get two or three times as many function points out of them as the other team. Fast forward a couple of years: the product's built, it gets to market. So we said to them, we want to go in and measure how many of these features are actually being used. This is the number one thing I'm getting a lot of my clients to do at the moment: start tracking feature usage — which parts of the system get used — and understand that better. What percentage of that system do you think was actually being used?
30%?
10.
10.
Okay, you're French, you're a little skeptical here. It was actually 20%, right? And we were being generous. So they had paid a lot of money for a lot of function points and were using 20% of them. But it gets better.
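Instrumenting feature usage, as the talk recommends, can start very small. Here is a minimal in-process sketch in Python — all names (`track_feature`, `usage`, the feature names) are hypothetical illustrations, not from any product mentioned; a real system would emit events to an analytics pipeline instead of counting in memory:

```python
from collections import Counter
from functools import wraps

# Hypothetical in-process usage tracker.
usage = Counter()

def track_feature(name):
    """Decorator that records each use of a named feature."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            usage[name] += 1
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@track_feature("export_csv")
def export_csv():
    pass

@track_feature("bulk_edit")
def bulk_edit():
    pass

export_csv()
export_csv()

# Share of shipped features that saw any use at all.
shipped = {"export_csv", "bulk_edit", "legacy_report"}
used = {f for f in shipped if usage[f] > 0}
print(f"{len(used)}/{len(shipped)} features used")  # → 1/3 features used
```

Even a crude counter like this is enough to start the "what percentage is actually used" conversation.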
We started to look into the legal contract that was driving this. The company who built the system had a 60-day warranty period on the software they built.
When that period was up, they actually won the maintenance contract.
They were paid for every defect they fixed in the system — and this system had between 200 and 1,000 defects a day.
Guess how much they were paid? 125 pounds per defect, 200 to 1,000 of them per day. I'm actually kind of impressed with this company — it's a really disruptive business model; whatever the supplier's doing, it's awesome. What they'd done during the warranty period — and here's where the genius lay — was automate the opening of tickets and the closing of them when things got fixed. If a system outage happened and servers went down, guess what happens? All of the subsystems go off, and all of these tickets get automagically generated. Then, even if you'd just tripped over the power cable, plug it back in and everything would come up, and all of those tickets would get automatically closed as fixed. So cool. We were watching this one day — 20,000 of these things from one massive outage. And this supplier was just churning over the money, because no one had happened to look at those contracts. It's a big telco, right? So you've got to be really careful, because these incentives, especially when you have legal contracts, can cause all these aberrant behaviors.
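Back-of-the-envelope, the incentive in that maintenance contract is easy to quantify (assuming, purely for illustration, that defects were paid out every calendar day):

```python
RATE_GBP = 125           # paid per defect "fixed"
LOW, HIGH = 200, 1_000   # defects per day, from the talk

daily_low, daily_high = LOW * RATE_GBP, HIGH * RATE_GBP
print(f"per day: £{daily_low:,} – £{daily_high:,}")
# → per day: £25,000 – £125,000

print(f"per year: £{daily_low * 365:,} – £{daily_high * 365:,}")
# → per year: £9,125,000 – £45,625,000
```

With numbers like that, automating ticket creation is not a bug in the supplier's process; it is the business model the contract paid for.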
So now we get into the hard part.
You know, I spend a lot of time on this — I've read Dom's work, I've got Tom Gilb's work, David's doing a lot of this stuff, there's Lean Startup; you name it, there's a lot of work being done around metrics. And actually, I think we have a good idea that we should be measuring what I call the outcomes. Outcomes are the results, right? We're going to talk about the definition and measure of success: how do we know we have succeeded? What's the end state?
And there's a big difference here. I went in to work with a charity — I spent a couple of hours there one afternoon to look at their legal contracts. They had a supplier coming in, and in their contract they'd set out all the scope and put down all sorts of things that were a little hard to understand, but nowhere had they actually talked about what success meant to them. I flagged it. I said, you're going to have massive problems; this is a really big issue; you need to spend some time clarifying it. To give one example, there was one piece where they said the system needed to be very user-friendly. I always love that one. I'm like, sure, why don't we put in little happy faces and maybe some rainbows and unicorns? It's going to be so friendly, people are going to love it. She's like, no, no, no — she said it needs to be really consistent.
So when I hear consistent, I start thinking, oh, that means we need like a search box in the same place on every page. I'm thinking about usability. When I started to dig into what consistent meant, it actually meant that she wanted, when they ran all the donor contributions, when they added it up, they would be correct.
Right? That's what consistent meant to her. So this is what we're dealing with: there's a really big difference between people saying we need all these outputs and being clear on why they need them. That one actually was a massive disaster — no one had clarified that the real need was transparency across all of the different donation centres, so that got completely lost in the contract. So instead, here's what we've been working with — and when I say we, I work with quite a few different collaborators. One is Ryan Shriver in the US. He was one of the first people whose work I came across where he'd taken Tom Gilb's work, which is very deep and a little complex to understand, drawn some nice pictures, and simplified it. So I started working with him. Susan Atkinson is a lawyer, so we've been working a lot on how to create better legal contracts. And we found that you could talk about this stuff and needing outcomes, but getting people to adopt it was really the biggest challenge.
Getting people to understand why we had the outcomes we did. So none of what I'm going to show you is new. It's a really simple framework that we've found a lot of success adopting — this outcome work, as opposed to outputs and throughputs. It's called Mobius. This is the Mobius loop; German mathematicians came up with the shape. I love it because it means we're never done: it's a continuous cycle. Also notice, David, your Kanban logo — it's kind of a Mobius, right? A continual loop. Yeah, the new triangular one we saw this morning. So let me draw it out for you first.
So what I found, you know, I work with a range of teams that are sort of agile, lean, and everything in between. And what I found is there is a really big focus on this kind of, we're going to deliver something and adapt based on feedback, right? It doesn't matter if you're flow-based, iterative, whatever it is you're doing, there's this basic notion. And I would sometimes turn up to teams and they're like, right, teach me lean and agile. Let's go. And I'm like, okay, great. What are we building? I don't know, but we're kind of agile. So we'll figure it out as we go along. Back to machine gun product development. Wow, that's a good idea.
Not. And on the other end of the spectrum, we had massive analysis and planning, where people didn't actually get things to market to validate and get feedback quickly. So where Mobius comes in is to try to link these together, pulling some of the lean ideas across, where we're always starting — well, typically — with a problem. I tend to see problems everywhere, the world's problems needing to be fixed. It could be an objective, but we start with a problem: what is it that you think is going on?
Then probably the most important part is the deep dive, understanding why. For a lot of the lean people, this is where you're gonna be using your value stream analysis, A3 thinking, any tool you need. We tend to do a lot of data gathering when we're in this deep dive. We also go to the Gemba, we go out and see people, we try to figure out what they're really doing. So that's to give us information that will feed into creating outcomes.
I'm going to get into some examples, so I'll show you different ways we're using outcomes. One really key point I've found: visualize your outcomes and make them into pretty pictures. I work with management a lot. All management reports have to be on one page, with pretty pictures, right? No one reads anything, particularly management. So this is really helpful. Once we understand where we are now — what's really going on — and where we want to go, we try to get rid of the whole idea of requirements. I must say I'm heavily influenced by spending too much time with Chris Matts; we're actually working on a new graphic novel at the moment. But we talk about options: what are the potential things we could do to hit our outcomes?
And adapt. You could be in PDCA land — plan something, do something, check something, act — or you could be Lean Startup: build, measure, learn. Whatever language you want. We found that keeping a pretty open framework means people can put in their own tools, whatever they want. It's not meant to be the one thing that rules everything; it's just a simple framework to help people think it through.
So I'm going to show you — we go through three stages. Ideate: where are we going? Investigate and ideate.
Do you want to do it? Yes.
Create: how will we get there? And validate: what happened? But it's meant to be a continual cycle. We don't do it sequentially with a big upfront design; we're going to be doing this all the time, going round and round. We also use this as a bit of a coaching mechanism. When we go into companies, we're trying to be really clear: what problems are you trying to solve? What are the outcomes you'd like to achieve? What are the different things we could try? Then we try them, validate whether they're making a difference, and continue. Again, continual is the point.
David mentioned tools earlier. I found having some simple tools really does help some adoption. This is just a simple canvas where we've laid it out.
This one's based on a client of mine. They first asked me to come in and said, can you run a user story writing workshop? I said, sure, tell me what's going on. They told me, and I said, okay, I'm going to need two days. They said, that's a lot of time for a user story workshop. And I said, yeah, that part will take about half an hour, because that's not your problem, right? So really it was a bait and switch to be able to go in and start this whole process. While we were in there, we gathered that one of their big problems was that across their whole transaction checkout process, they had farmed the work off to completely separate groups. So they had a lot of sub-optimization in the system, and the groups were never talking together. They had completely different incentives, which again were creating all sorts of bad behaviors that were degrading the customer experience. I've obfuscated it here to a general e-commerce example — I've had a few of these now. What we do is talk about the initial problem. Usually customers will say something like, well, people are abandoning our checkout and we don't know why. Okay. So once we do the deep dive, we start finding some data. It turned out that only 14% of people who started this process ever finished it.
Each 1% of conversion was worth many, many millions more per year. So this was a fun one, because we could actually see a lot of value created.
There were some things we noticed really early. If a page took more than three seconds to load, we saw some drop-off. Over four seconds, people completely left. So we had very simple data quickly. We also found that the people furthest from the data center, which was in Europe, were most affected — and some of our biggest spenders were in places like Australia and New Zealand. So the people spending the most money were the most severely impacted. We found a lot of other things, all sorts of user experience issues; performance and usability were the key factors. So on one hand, we have management with a whole set of new features; meanwhile, we're getting the data and saying, actually, it's all about performance and usability. I find this pattern over and over again. People want the shiny, fun things, but actually it's the bread-and-butter stuff that makes the biggest impact.
So, some of the outcomes we came up with. You want to try to create metrics around this, but even just getting some information about what the customer wants helps. Start off simply — that would be my advice — then layer it and make it deeper and more complex as you go along. They wanted to increase checkout conversions.
We looked at industry averages, we were at 14%. If we got between 25 and 30%, we'd be more in line with other typical kind of businesses. So for us, that was worth over 40 million pounds of revenue a year. So this was a good one to fix.
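A rough sketch of how a figure like that £40 million arises. The value per conversion point below is a hypothetical number chosen only to land in the ballpark the talk describes, not a figure from the engagement:

```python
baseline = 0.14                  # observed checkout conversion rate
value_per_point_gbp = 3_000_000  # hypothetical annual value of one
                                 # percentage point of conversion

for target in (0.25, 0.30):      # industry-typical range from the talk
    points = (target - baseline) * 100
    print(f"at {target:.0%}: ~£{points * value_per_point_gbp:,.0f}/year")
# → at 25%: ~£33,000,000/year
# → at 30%: ~£48,000,000/year
```

The exact per-point value is a guess; the point is that the uplift brackets the "over 40 million pounds a year" the talk cites, which is why this outcome was worth fixing first.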
There's a lot of correlation here, and these outcomes come in layers. At the top, management will say, we want more revenue, more customers, more engagement. At the next layer down, you start asking, how can we achieve that? So we might say, okay, we believe we can do it by increasing checkout conversions. And there's a hypothesis that if we decrease page load times, we'll improve that metric in turn. You do have to keep a line of sight between these metrics.
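One lightweight way to keep that line of sight explicit is to record which lower-level metric is hypothesized to drive which outcome. A sketch with made-up metric names — the links are hypotheses, not guarantees:

```python
# Each strategic outcome lists the metrics believed to drive it.
metric_tree = {
    "increase revenue": ["increase checkout conversions"],
    "increase checkout conversions": ["decrease page load time",
                                      "improve payment-flow usability"],
}

# Invert the tree so any metric can be traced upward.
parents = {child: parent
           for parent, children in metric_tree.items()
           for child in children}

def line_of_sight(metric):
    """Chain from a team-level metric up to the strategic outcome."""
    chain = [metric]
    while chain[-1] in parents:
        chain.append(parents[chain[-1]])
    return chain

print(" -> ".join(line_of_sight("decrease page load time")))
# → decrease page load time -> increase checkout conversions -> increase revenue
```

This is the "two lines of sight above" idea in data form: a team optimizing page load time can always see which outcomes that work is supposed to move.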
A friend of mine does a lot of joint forces training — he's a big military guy. In the lean-agile community we commonly know this as commander's intent, or mission command: the idea that you start with where you want to go. For example, with the troops, there'll be a big strategy, but on the ground the teams can make their own decisions, right? And he always says to me, at the top layer you get this big field of view — your peripheral vision is really wide because you can see everything that's going on. As you get down to the troops in the trenches, they have a very narrow field; they can't see the big picture. And he said, you've always got to have two lines of sight above, so you know what the next command level is doing and what the one above that is doing. And it's the same with this. You cannot get too myopic. If you just look at little metrics, they can be very brittle.
We visualize them. So this is one of what we call a DAMOS — a definition and measurement of success. We had a baseline of 12 seconds, and for these teams we wanted to get it to less than four.
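A DAMOS like this one can also be captured as a small record. This is just an illustrative data structure, not a published Mobius artifact; the field names are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Damos:
    """Definition And Measurement Of Success (fields are illustrative)."""
    outcome: str
    scale: str       # what we measure, and its unit
    baseline: float  # where we are now
    target: float    # where we want to be

    def met(self, current: float) -> bool:
        # Assumes lower is better, as with page load times.
        return current <= self.target

page_load = Damos(
    outcome="Customers can check out without waiting",
    scale="page load time, seconds",
    baseline=12.0,
    target=4.0,
)

print(page_load.met(4.8))  # → False
print(page_load.met(3.2))  # → True
```

Keeping scale, baseline, and target together is the point: "user-friendly" is an opinion, but 12 seconds down to under 4 is testable.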
People tend to jump to solutions very quickly. So in the room, as soon as we brought up this issue, we got five suggestions. First one: well, it's pretty clear that we need a new data center in Australia. Okay.
And they're like, let's go do that. I said, well, hang on, what else is there? Someone said, well, clearly it's a problem in the network — we've got some sort of bottleneck, some issue, so let's go fix that. I said, okay, that's a good idea. More options. Someone else — the 20-year-old kid — went, well, we can compress the images. So that was another option. We could also do a lot around our payment flow usability. We then compare these: there's a lot around value, risk, cost, and impact. We have fancy voodoo spreadsheets to do all this, but honestly, a lot of the time we can do a pretty good job just with stickies on the wall.
Once we get into deliver, there are three states for deliver: we can research something, we can run an experiment to test an idea, and we can actually deliver to market. They're almost three states of maturity. This way we're trying to decrease risk and figure out whether this is worth the investment. For this one, we did some research. We ran a network traceroute to figure out whether there were network problems. We did some usability research. There was a lot more. Part of this was putting the instrumentation in place in the first place — most companies don't have data, so a big part of our job is helping them get that set up.
Instead of doing the whole data center, we would run a test. We'd divert a tiny amount of traffic, get one server in Australia, and start looking at, is this worth doing or not?
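That kind of trial can be as simple as hash-based traffic bucketing. A sketch — the server names and the 2% canary slice are hypothetical, not details from the engagement:

```python
import hashlib

def route(user_id: str, canary_fraction: float = 0.02) -> str:
    """Send a small, stable slice of users to the trial server.

    Hashing the user id keeps each user on one server across
    visits, so their experience is consistent during the test.
    """
    bucket = int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 1000
    return "au-canary" if bucket < canary_fraction * 1000 else "eu-main"

# Roughly 2% of simulated users land on the canary.
servers = [route(f"user-{i}") for i in range(10_000)]
share = servers.count("au-canary") / len(servers)
print(f"{share:.1%} of traffic on canary")
```

Comparing load times and conversions for the canary slice against the rest answers "is the data center worth building?" for the cost of one server.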
Right, we could start prototyping, et cetera.
And we implemented the image compression — that was a no-brainer, low risk. So what we were doing was constantly measuring. These are our leading indicators: instead of asking, did we meet the massive outcome, these are things that help us build up information. And this is the feedback loop that re-informs what we want to invest in next.
We measure both of these layers, because there's no point decreasing page load time if we don't see a corresponding change in the overall checkout conversions. It was kind of funny for this one — it was pretty binary. We found that it had to be under four seconds; it didn't matter if we got it down to 4.8. If it wasn't under four seconds, it didn't really matter. So we were really going after the highest-value things.
In adapt, we keep going until we get to the point where we have to decide — this is simplified, but pretty much — is it worth investing further? Because we're measuring value delivered and measuring cost as we go along building, we can keep checking this in a fairly disciplined way. So we might decide to keep going around the right-hand side of the loop — keep delivering more — or we might want to create more options.
I'll get to that later.
We might implement our network fix, we might design new pages, etc.
This one again — these teams were doing Scrum, they were doing Kanban, they were doing everything, right? They felt like they were doing a pretty good job. After we ran this workshop — I was in there for about three days — we got all the teams started and said, you just need to start implementing; let's get this going. Then I get a message from the product manager, who really took this to heart. Two days later he pings me on Skype and says, hey, guess what? You were so right. We started getting the data back. We've already pushed changes, we're measuring them, and we can clearly see we're going to save 12 million pounds over the next year. He said none of that was anywhere in our queues of work, our backlogs, etc. In fact, he said, now that we've been mapping against the outcomes, we've realized most of that stuff isn't really worth doing.
So I'm in there a couple of days later because the VP wanted to bring in some more coaching for the teams. And he looked at my coaches — people like Chris Matts — and went, wow, you guys are really expensive. I went, well, I can get you really crappy people who are much cheaper if you want; they don't do a good job. And he went, yeah, but how can you prove you're worth that? I said, oh, you know Jamie? And he went, yeah. I said, do you know what happened last week? We just saved you 12 million after being in there for three days. So David, the VP, looks at me, doesn't even look down at the paper, and just signs it. Then came one of my worst moments. I do a lot of outcome-based contracting, and I'm sitting there — I'd charged a fee for the workshop — and I went, oh my God, I should have done an outcome-based contract, taking 10% of the money we saved. That would have been 1.2 million within three days. How awesome would that be? And the VP looked at me and laughed. He went, yeah, and I would have signed that as well. Now I won't.
So this is like consultant's nirvana, right? This is where we all want to be.
There's this big push on metrics, and I always say to people: beware of canned metrics. I look on all the startup forums — Lean Startup — and there's a lot of, you need to measure this, this, this. It's the same with lean metrics, common metrics, agile metrics. It's like, if you measure these things, you'll be fine. And actually,
It's not really the metric that matters; it's the meaning it carries. It's the act of going through this journey that's far more important than any numbers we come up with. That deep dive really is the key to it. Anyone who does a lot of lean work knows it's about understanding root causes — or, as my Japanese friends like to say, root causal loops. Apparently we're not meant to use the phrase root cause anymore; it's all about root causal loops, because it's all about observation and understanding. Margaret Mead, the famous anthropologist, said: what people say, what people do, and what they say they do are entirely different things. So I think the days of going around asking anybody for a requirements list are over. Again, in the contract space: why would you take a customer who is totally clueless about technology, who wants to hire experts, and ask them to write up all the stuff you're meant to build? Why would you do that? They understand the outcomes, but they have no clue what the potential requirements might be. So instead we need to validate this, even in the deep dive — go in and see. I do a lot of guerrilla tactics with teams, because this is all about us teaching people how to do this stuff, not us doing it. That's the important part.
So data and user experience research are two key pieces. I was reading something the other day that said people lie between six and 200 times a day. If you're at 200, you've probably got psychopathic tendencies. But think about it: you see someone in the morning — how are you doing? I get asked that question at least six times a day, and I say, great. Actually, I didn't sleep at all last night; I feel like crap. So we all lie a little bit.
I've talked about canned metrics, so beware of just taking things out of the can. It's the act of people creating an understanding that matters.
Visualize these things. It really helps to put these into very colorful visuals. Teams can draw them up.
I'll send these out later, but usually we'll start a little fluffier. For example — this was a non-profit company; this is one of Ryan's — we want to increase market share, we want to increase monetary donations. It's at that level. As we go on, it gets a lot more sophisticated: we'll start talking about the scale of measure, the target, the failure rates, et cetera. So this is something we build up expertise in as we go along.
One area I'm having a lot of success with this stuff is management.
Management doesn't care if you're using Scrum, Kanban, or your Aunt Marjorie's process. They don't care, because to them it's all about delivery, and there are a lot of terms they don't really understand. What they really care about is investment and risk. So this model can really bring that in: the left side is a lot more strategic, and the right side is a lot more operational. Even at their level, they want to be asking: if you want more revenue, what's preventing it? What are the constraints? If you're losing customers, why is that happening? All of these things get linked — it's a little fractal. So even at that big strategic layer, where people might say, well, I have cost reduction initiatives, I have value creation, you'll get products and services that deliver against the strategic goals.
And the other cool thing is, if you build up your capabilities — let's say we've got Kanban working well, and we can get Kanban to be better — you actually find you go around the loops a lot faster. It's like a little racing car: you speed up that cycle, which is a good thing.
Any questions so far?
Yeah — so remember I mentioned earlier, this is like Plan, Do, Check, Act. I do a lot of A3 work; if anyone's familiar with the Toyota A3 one-page report, it gets into pretty much all the things on the Mobius loop.
I know managers who talk about Plan, Do, Check, Act with their teams, but for some reason they weren't able to follow it visually. So giving them a simple tool — again, remember I said nothing new, right? Nothing new at all — for some reason helps people see where they are.
I also have people who'll start talking about where they are in the cycle. I get some saying, yeah, we're all about delivery, but we don't know what we're building or why. Or they'll say, we're all the way over on the left, in that deep dive, spending so much time planning that we never get anything validated. And because we can create a line of sight, it creates a common language between top-level objectives and the teams. Back to commander's intent, right? The notion that we say, this is where we're going, but then management and everyone else get out of the way so the teams can deliver. Everybody can run their own Mobius loops. They report on outcomes. We're not reporting on things like outputs or throughput measures anymore — it's fine to track those under your capabilities; the team will want to improve — but we don't report on them. All we care about is the outcomes. So again, just a simple visual tool.
I've got one other example, from the health sector. I find e-commerce is so trivial and easy that no one cares about it. This one's actually making a big difference and literally saving lives. Do you want me to go through one more? Maybe a slip?
For some reason, I've ended up in the health vertical, so I have three different clients. One does big healthcare service contracts — they build all sorts of patient records and health equipment. Another one does RFID tracking, so we can start tracking movements in hospitals. Another builds patient record systems — it's one of the biggest companies in the US. I'm also working with healthcare architects. So we're actually driving all this outcome stuff into very non-software worlds as well. I'm going to take one problem that tends to be pervasive across all these industries.
The problem: hospital infections cause a lot of patient deaths. Apparently, what we call HAIs — hospital-acquired infections — are the fourth biggest killer in the United States. They kill more people than AIDS, breast cancer, and car crashes combined. Scary, right? I think it was in Steven Spear's book, originally titled Chasing the Rabbit, where he said that, statistically — if I remember correctly — going to an emergency ward is as dangerous as going base jumping. I know which I'd rather do. So this is really a big problem for all of us. We shouldn't even be worrying about Ebola and things like that, because this is what's killing a lot of people. Something like 267 people per day are dying of these infections. That's a plane full of people. Why do I keep talking about planes crashing?
And it's something like $20 billion in healthcare costs in the US alone. So this is a really big issue. When you start going into the deep dive, it gets kind of interesting, and this is the kind of inquiry style we're actually teaching people, because we've got to work on multiple levels. One is that bacteria are introduced into the hospital by patients. What's kind of interesting is that some hospitals have been opening up to take more outpatients, who just come in for the day, because they get more money for it. However: more patients, more bacteria, and the infection rates go up.
The bacteria then get spread around by the hospital staff, the medical equipment, the waste materials; all of those carry it and keep pushing it around the hospital.
Some of the top causes of infection are dirty hands and poor disinfection of equipment. It was interesting; I was reading a lot of data on this the other week. Even though dirty hands are one of the biggest problems, the research they were citing said only about 40% of hospital staff will actively clean their hands. However, I read something really interesting, which said that figure is from when people were being monitored. When observers went in and watched, they saw 40% compliance. When they put in automated monitoring systems: 10%. Right? So they said the number one thing they're training patients to do is, when they see the doctor or nurse, to ask: have you washed your hands? What a really embarrassing question, but I think I'm going to start doing it.
The outcomes here were interesting. So one was: we want to decrease infection rates.
Not many years ago, the US government pushed all of the costs of these reinfections back onto the hospitals, so it's now costing them a lot of money, which is why you see such a big, dramatic increase in people doing something about it. So decreasing infection is something we want to aim for. There's another outcome: decreasing patient deaths. I put both up there because they're not the same. Outcomes get really interesting here, because some bacteria kill more people, whereas other bacteria give a lot of infections to a lot of people. One is a long tail that costs a lot of money over time; the other costs less money but creates bad publicity, because people die. And things like pre-existing medical conditions can impact all of this. So we have to treat them differently. This is where outcomes get quite interesting.
Monetarily, we want to save billions in care costs. Again, hospitals, no matter how altruistic (and they probably don't want to kill you), it's really the money that will incentivize them to make a change.
So under options, you'll get all these things coming out, and it's getting pretty creative. Obviously, patient testing and isolation: let's do a better job of that. Any Americans in the audience?
Apparently, if you Americans go to a hospital in a lot of Europe, places like the Netherlands or Belgium, as soon as they know you're American, they'll put on the rubber gloves, put you to one side, and test you very specifically. Because apparently America is very bad at testing for some of these bacteria that have really high rates there; Europe's better at it. They think the reason is that the hospitals don't want to know, because they cannot deal with having to isolate people. So just be aware if you go to hospital: it's not you, it's because you're American, right? As if you didn't have enough going against you.
Better waste disposal. The RFID tracking we're doing with one client, tracking how things move around the hospitals, things like the carts the equipment sits on, that's interesting. And obviously hand hygiene, that's a really big one.
Another guy I know, Simon, I wonder if I've got that up at the moment.
Let me just pull this up.
Simon gave a TED talk. I was down at the TED conference in Rio the other day.
He's working on trying to get hospitals and nurses to do their own work, their own diagnostics, to improve things. It includes pizza, and I really hope they wash their hands afterwards. What they found, when they started digging in and asking hospital staff why they weren't washing their hands enough, was kind of interesting. These dispensers hardly ever worked; they were kind of broken. So that was one of the first things that came up. Another thing is that the gel is very high in alcohol, and people won't use it because it dries out their hands. So they got the hospital staff to start making these incremental improvements themselves. This was one of the designs they wanted to improve for rapid use. So things like that are really taking place.
That'll probably make me loop through it again.
Better transport systems. Again, working with the medical architects is kind of interesting. They're starting to put tracks through the ceiling so patients can actually be hoisted and moved along; it both decreases infection and gets people there quicker. We're looking at different ideas. Old hospitals used to use copper for surfaces, and copper is actually a natural antibacterial agent, so you could start building hospitals with that again. Poka-yoke, who's heard of that term? I just love the name of it. Poka-yoke is a Japanese term; the best way to describe it is: make it easy to do the right thing and hard to do the wrong thing. Like a floppy disk, if we remember those, could only go into the drive one way. In the same way, you need to make this stuff very, very easy for people, otherwise they won't do it. So these guys will often do hand hygiene pilots, again, back to experimenting with something. We're doing a lot of research, patient testing and isolation.
One of the things that's coming out lately is that actually a lot of it is down to the patients. People aren't following their post-op instructions, so the bacteria have a chance to take hold again. So things like that are where we have to start looking at how we do this with patients.
So again, we're going to be measuring this stuff as we go along and just looking at the outcomes. It has nothing to do with features, again; it's all about whether we're making a difference to those outcomes.
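To make that concrete, here's a minimal sketch of what tracking progress against an outcome (rather than a feature count) might look like. All the names and numbers are hypothetical, just to illustrate the idea; this isn't from any real client.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """A measurable outcome: where we started, where we want to get to,
    and where we are now."""
    name: str
    baseline: float   # measured before we started
    target: float     # the level we're aiming for
    current: float    # latest measurement

    def progress(self) -> float:
        """Fraction of the gap between baseline and target closed so far."""
        gap = self.target - self.baseline
        if gap == 0:
            return 1.0  # nothing to close: already at target
        return (self.current - self.baseline) / gap

# Hypothetical numbers: infections per 1,000 patient days,
# down from 8.0 toward a target of 4.0, currently at 6.0.
infections = Outcome("HAIs per 1,000 patient days",
                     baseline=8.0, target=4.0, current=6.0)
print(f"{infections.name}: {infections.progress():.0%} of the way to target")
```

The point of the sketch is that nothing in it mentions features or throughput; the only thing reported is movement toward the outcome.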
Any questions on that one so far?
People had trouble remembering this stuff, so one of the things I put together is a little card set. They're little playing-card-sized trigger cards; the idea is they just trigger you to remember things. So they have things like this. The full set is much bigger than that, but it's a way of letting people know you can bring in different tools and equipment. Like, if you want to know more about the user experience design part, there are different methodologies you can bring in.
So I'm just going to finish with two quick stories about outcome thinking and how it's something that's a really big driver across many sectors at the moment.
Alastair Parvin is an architect, and he got called by a school in England that had a very Victorian building, so it had very narrow corridors. The problem was that when the school bell went off, all the kids would rush out, creating massive congestion, and there was a lot of bullying happening. So the school said, you know what, we know we have to redesign the school, we know it's going to cost millions of pounds, and we're kind of resigned to it. So the architects go in and start studying the problem. Now, these guys consider themselves problem solvers, not builders. Architects are problem solvers. And they said: you know what, try a couple of things as an experiment. Instead of having one school bell, have the bells go off at different times. Or better yet, no school bell at all; let the teachers release the kids at the end of the lesson, and they'll never be exactly on time. That way you've removed a big queuing and flow issue for, what, a couple of grand for the architects to come in, instead of millions of pounds. So that's where outcome thinking comes in: finding the shortest path to the outcome rather than building these massive solutions. So I thought that was pretty cool.
This guy, Toby Eccles, has a TED talk up on social impact bonds.
So I've been working with Toby a little bit in the contract space. What they do is go to the government with private sector investors and say: let's take some of your big problems. For example, recidivism, which is prisoners re-offending. I think it's something like 63%, if I remember, of UK prisoners go on to re-offend, and usually the ones in prison have had 43 prior convictions. And they say: if we can get a 10% or greater reduction, we will then get a fee based on that. The investors who put their money in from the private sector get 7.5% to sometimes 20% back on their investment. There's a degree of altruism in there, but there is a money-making side. So these guys only get paid, that's their contract with the government, for the results.
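The pay-for-results mechanic can be sketched in a few lines: the government pays nothing unless the measured reduction clears an agreed threshold. The function, the thresholds, and the investment figures below are all hypothetical illustrations, not the terms of any real contract.

```python
def sib_payout(baseline_rate: float, achieved_rate: float,
               threshold_reduction: float, invested: float,
               return_rate: float) -> float:
    """Social-impact-bond style payment: investors are repaid with a
    return only if the relative reduction clears the agreed threshold."""
    reduction = (baseline_rate - achieved_rate) / baseline_rate
    if reduction < threshold_reduction:
        return 0.0  # outcome not achieved: no payment
    return invested * (1 + return_rate)

# Hypothetical numbers in the spirit of the talk: a 63% baseline
# re-offending rate, a 10% reduction threshold, a 7.5% return.
print(sib_payout(baseline_rate=0.63, achieved_rate=0.55,
                 threshold_reduction=0.10,
                 invested=1_000_000, return_rate=0.075))
```

The design point is that the payment is tied entirely to the outcome measure (the reduction in re-offending), never to outputs like how many training courses were delivered.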
They've been doing this for things like abused children, all sorts of social programs. And it's interesting, because talking to Toby, he said a lot of people say, we're going to get prisoners doing woodworking and learning all these skills, but there's never a correlation back to whether it's bringing down repeat-offender rates. One of the things they found is that when prisoners get out, they have £46 in their pocket and don't know where they're going. If you don't help them into a halfway house and get them going again, these guys will go to their mate's place and stay on the sofa, and the next minute drug deals are going down, petty crimes are being planned, and they're back in that loop. So just having someone meet these guys and help them onto the next step has dramatically reduced that re-offender rate. Very creative solutions are coming out of this, and it will have a big impact.
You'll also want to be looking at Beyond Budgeting and flow-based budgeting, because again, when you move to an outcomes model, you don't need to plan for years in advance. You're constantly re-evaluating where your investments go, so you need to be very flexible with the money. All your incentives will change too. Bjarte might talk about it tomorrow, but I went to his Beyond Budgeting session the other day, and he talked about Handelsbanken's incentives: they're based only on the profit of the whole company, everybody gets the same amount, and it gets invested into a retirement fund they receive when they're 60. Some people have over a million and a half pounds in there, right? So it's a great thing, and it really helps with long-term thinking. Your legal contracts will also change: no longer should you be measuring outputs or throughput. Your charging models will also change: you should be linking things back to outcomes.
If anyone wants to know about the contract stuff, we've created some open-source contracts, like the minimal viable contract, which is very light. They're up at flexiblecontracts.com, and it's all based on this model. So I think that's it.