Arnaud Porterie
Transcript
[00:00:06]
Hi everyone. Thank you for coming. My name is Arnaud and we're gonna talk a bit about what's beyond developer productivity. So, a bit about myself.
[00:00:20]
My name is Arnaud Porterie and, as you can hear, I'm French. I am however living in Amsterdam, so it's good to come back home sometimes. I'm currently the founder of a company called Echoes, and we have a booth here if you'd like to come visit us later to see what we're building. Before that, I was a deputy CTO at a company called Veepee, and before that I was leading the core team at Docker, which I'm sure many of you know, and which was a strong inspiration for me on all the topics that deal with developer experience, open source, and what it's like to talk about developer productivity. All right. So, first, a disclaimer: yes, I'm the founder of a startup, but I'm not gonna pitch my product at all, this is not gonna be a sales pitch. I'm gonna talk about the philosophy behind what we do, and what we think are the right ways to communicate and define engineering success.
[00:01:16]
So what are we gonna talk about? First, we're gonna talk about developer productivity. It's a hot topic, I'm sure you all know that. We're gonna talk about why it's a hot topic, and maybe why it shouldn't be as much of a hot topic.
[00:01:30]
We're gonna talk about engineering success, starting with a parallel with team sports, and what makes success in our field so hard to characterize.
[00:01:42]
And finally, we're gonna talk about a framework for reasoning about engineering success. So what we call the three components of engineering success and how to apply them in practice.
[00:01:54]
So, developer productivity: a hot topic, sure, but for good reasons or not?
[00:02:00]
Measuring developer productivity has been a trending topic. Unless you've been living under a rock for the past couple of months, I'm sure you have seen the super popular research work with DORA and SPACE. I'm sure you have seen some pretty controversial posts, especially from McKinsey. And a flood of overpromising startups and products regarding measuring developer productivity, pushing developer productivity, developer productivity, developer productivity. What's interesting about it being such a hot topic is that today, developers are more productive than ever. There's no time in the past when we have been so productive: we have AI assistants built into our IDEs, we have pretty awesome frameworks and languages and tooling today. So why is it that we have an obsession with developer productivity now, when it seems that we are more productive than ever?
[00:02:58]
Well, my feeling here is that developer productivity is really the tree that hides the forest. Expectations toward tech are only growing.
[00:03:08]
We are unfortunately in a time where we're talking about, you know, downsizing and cost cutting. We have boards and investors that are demanding more visibility into the performance of the teams. And pretty much we're asked constantly to do more with less.
[00:03:25]
The problem is that simultaneously, our accountability as an industry is not really catching up. What I mean by this is that we still suffer, and probably for some good reasons, from a perception of opacity of our function. The reality today is that there are a lot of CEOs who cannot tell for sure if their engineering department is doing well. And the truth is, this is also true for CTOs: we see a lot of CTOs who are not super confident about whether they're doing well, and we don't know how to characterize this.
[00:04:04]
So, the reason why I think developer productivity is such a hot topic right now is this: our inability to measure engineering success has become untenable. Expectations are growing. Accountability has not caught up. Measures of developer productivity fill a gap. It's not the best, it's not great, but it does provide one answer to the question: are we doing well?
[00:04:34]
Even though it's probably not the bottleneck to most organizations. And it only captures one narrow aspect of engineering success.
[00:04:47]
So what do I mean by "it's not the bottleneck"? That's not to say that developer productivity is not important, don't get me wrong on this. But in many cases, improving developer productivity is not gonna correlate with, and show up as, business improvements. I'm sure many of you are familiar with the theory of constraints. Essentially, what that means is that improving developer productivity is often an illusion, because it's most likely not the bottleneck of your organization. And it's really interesting, because we're seeing that a lot of companies believe that the productivity of developers is somehow dragging down the business.
[00:05:20]
While the same companies are often dragging down the productivity of developers with their complexity and the business itself, right? So it seems a bit backward.
[00:05:35]
And it's a narrow aspect of engineering success. When you measure the success of an engineering organization using the productivity of developers, what you're essentially doing is simplifying the mission of engineering to producing code. I think we know in 2024 that the mission of engineering goes beyond writing code. This perpetuates the idea that engineering is not much more than a feature factory. And it's the exact same misguided fallacy that once led people to measure success by lines of code. And I'm saying this as an archaeological artifact, because I don't think anybody's doing it anymore, hopefully. Except maybe at Twitter, apparently, but that's very much it.
[00:06:23]
So, having said that: if developer productivity is not the answer to measuring engineering success, then what is engineering success? We're gonna start with an analogy.
[00:06:35]
There was a talk a couple of years ago at a conference called Tech.Rocks by Fabien Galthié. Now, I don't know anything about sports, so I hope I'm not gonna say something wrong. But Fabien Galthié is a former rugby player, and I think he's now, or at least was at the time, the coach of the French national team. I got this right? Cool. He shared his story, the relationship of his discipline with tech, at this conference, and it was absolutely fascinating. What he told us is that his first encounter with tech in his discipline was as a player: before a game, his coach, I think Bernard at the time, told him, oh, by the way, during this game, we're gonna track your performance. He was like, okay, sure.
[00:07:23]
And it was basically a very early version of the data analytics that we have today, something that was looking at his movements over the field, etc. So he played his game, and at the end of the game, he was told by his coach: hey, you didn't run much. He was like, all right, but I scored. He was incidentally the best player in this particular match, but still the feedback was: you didn't run much. All right.
[00:07:53]
And he drew the parallel with what he's doing today as a coach with data, where they're now using way more complex analytics to see not whether the players are running enough, but whether they're running at the right time, whether they're having the right impact, whether the team dynamic is the right one, etcetera. As you can tell, it's amazing the parallels we can make with our field.
[00:08:18]
They have something in rugby that we don't, which is a very easy success criterion: either we won the game or we lost. We don't have this, of course, in engineering. But when we're looking at measuring performance, we are a bit in the "you didn't run much" era, you know. It's a bit like telling developers: well, yeah, you didn't write so much code, or you didn't close so many tickets. And sure, okay, but maybe they were the best player of the game.
[00:08:49]
So, to come back to software development: why is our field not as easy as rugby in this particular context?
[00:08:58]
Well, first, software development is super complex. We're juggling priorities, we have to coordinate cross-team efforts, we have to manage software rot.
[00:09:08]
All while shipping value to the business. The fact that it's so complex means that we're never gonna get the silver-bullet magic KPI that captures success as a whole. It's just too complex.
[00:09:24]
The other difficulty with our field is that it's extremely contextual. How every team operates is gonna vary based on the sector; different teams are gonna have different compliance requirements, different paces of change.
[00:09:39]
You know, everybody wants to move fast and break things, sure, okay, but there are industries and companies where you cannot move fast and break things, because people's lives are in the balance.
[00:09:49]
And it also depends on the business maturity and trajectory. I think it's something very specific to our field that basically the whole path of the company, all the acquisitions, all the changes of strategy, all the decisions, are somehow baked into the platform. This is very unique to our field, I think, in that we have to manage something that is a piece of the history of the company as a whole. And because it's so contextual, that means that benchmarks as a measure of success are probably not gonna work. We see this constantly: we're talking with CTOs who want to be reassured that they're doing the right things and that their organization is doing well. They'd like to have feedback on that by comparing to others, but the problem is, you can take companies that have the same size, the same kind of business maturity, even in the same space, and they are gonna operate entirely differently, so benchmarks are not gonna mean a thing. One example I often give on this: we have very successful companies that are extremely happy shipping reasonably unfinished products, assuming that they're gonna see what sticks and polish in the end. And who are we to say that this is not the right approach, if it works for them? At the other end of the spectrum, you're gonna have companies that want to ship the perfect product.
[00:11:16]
Before it's shown to the world. And again, who are we to say whether this is the right approach for them or not? There's just too much that comes together to say that benchmarking is gonna be a reasonable way to compare companies and to define engineering success.
[00:11:35]
So the only conclusion we can reach there is that success is pretty much relative. There's not gonna be a universal measure of engineering success. The only thing we can try to measure is the gap between what we are and what we could be, given our constraints, given our context, given our history. In other words, is our organization the best possible version of itself?
[00:12:05]
There are however things that all engineering organizations have in common. And that could give us a way to reason about engineering success as a whole. While success is relative, interestingly, failure is pretty universal.
[00:12:18]
We don't necessarily know where to draw the line for success, what defines success, but we do know how to recognize failure patterns. We know that when a team doesn't ship, or regularly ships the wrong things, that's a red flag.
[00:12:34]
We know when the development pace is low, we know when the quality of outputs is not there. We know when churn is high and when people are just burning out. These are universally wrong, even though we don't necessarily know where to draw the line for what's absolutely good.
[00:12:52]
The other thing which is pretty universal is our mission statement. When you think about it, most engineering organizations out there have one mission, which is about delivering value for their business, frequently and sustainably. This probably gives us enough of a frame to discuss engineering success in a way that applies to all companies.
[00:13:13]
Breaking down this sentence, we can look at what we call the three components of engineering success.
[00:13:20]
The first one is alignment. Are we putting our efforts in the right place?
[00:13:25]
I'm gonna go over all of them in detail afterwards. The second one is delivery performance. Are we delivering software efficiently? And this is, of course, where questions of developer productivity may arise.
[00:13:39]
And the third one is health. Are we creating a context favorable to durable success?
[00:13:50]
None of the components of engineering success can be neglected for too long.
[00:13:55]
If you don't have any alignment, eventually your efforts are not producing value, and that's not a good position to be in.
[00:14:04]
Without delivery performance, progress will eventually halt.
[00:14:08]
Without a healthy environment, the results won't last.
[00:14:12]
What's the value of a team that has the perfect developer productivity metrics, if ultimately they're not working on the right things, or if they do so at the expense of the people on the team? Right?
[00:14:29]
The relative importance of those three components will vary over time and with context. It's often the case, for example, that startups put a lot more value on delivery performance because they want to get to market fast. And we've seen a lot of movement there over time, especially with Covid. Some companies, especially in Silicon Valley, were very much biased toward autonomy; when Covid hit and the economic situation started to get a little tighter, we saw a lot more push toward alignment. Like: sure, you can work remotely, but we have five company pillars, we want you to work on them, and we want to make sure that happens. And this is natural; it's gonna change from one company to the next and over time. So let's talk a bit about alignment. Are we putting our efforts in the right place? Arguably, this is the most important one in the current era. Work in the post-Covid world is often distributed, and that means you want to make sure there's alignment between people, even though communication is not necessarily as good as it was before.
[00:15:39]
We don't have the same ability to hire in the current economic context. That means, again, that we're supposed to be doing more with less, and we don't have as much freedom as before to do things that are not necessarily a priority.
[00:15:54]
And finally, the product engineering model is the de facto standard. What I mean by this is that we're seeing more and more organizations where there's basically one product line collaborating with one engineering line. This collaboration is absolutely key to the success of the organization as a whole, and that means you want to make sure there's a strong alignment. What I mean by strong alignment is making sure that the developers understand what they're pursuing in terms of business objectives, and making sure that the product managers understand that they should also be incentivized on the quality of the platform.
[00:16:30]
It requires some clarity and some reasonable stability of goals, of course, it's never perfect.
[00:16:37]
And again, some shared incentives. It's easier said than done. We've seen it in my past company, where we had teams incentivized together both on reaching business objectives and on the technical quality of the platform. It's sometimes a hard pill to swallow for a PM to be incentivized on the SLA of the platform, or for an engineer to be incentivized on achieving business objectives. But I think that's what alignment is really about: making sure that people are in the same boat and that they share problems together. Because if they don't, it's gonna be a constant struggle to decide what is most important, depending on who you're talking to.
[00:17:27]
How do you measure alignment? So this is a really complicated one.
[00:17:32]
What I think is that you can, by connecting everyday work to its purpose and measuring the allocation of effort across goals and across initiatives. It is way easier said than done. And unfortunately, none of the traditional ways to solve this are really satisfactory.
[00:17:47]
Typically, what we see is manual time sheets. That's probably the worst, of course. Manual time sheets to say: well, this week I spent 40% of my time on this and 20% of my time on that.
[00:17:58]
Everybody hates it. It's manual, so it's imprecise. But, you know, it gives some data about how exactly we are aligned as an organization. The second way, which is extremely common, is mandating Jira tickets for every task.
[00:18:17]
Everybody hates it, second edition. It's not a great developer experience, but again, it does exist, and a lot of companies are doing it.
[00:18:29]
The third way is what we did in a past experience: just collecting rough estimates through discussions, and making sure we understand exactly what it is that we're pursuing and how exactly our engineering efforts map to business goals.
[00:18:46]
The thing is, I would really not underestimate the benefits of bringing everyone to the same level of understanding of tech effort. Just being able to understand, and to show in a large organization, how our engineering efforts are contributing to different OKRs, to our north star, to different business goals, etcetera, entirely changes the nature of the conversations. We're not talking anymore about how many tickets we closed, or how many pull requests, or story points, or whatever. We are talking about the strategic contribution of tech, and this makes a whole lot of difference in the relationship between tech and leadership. It brings, or it can restore, trust with the leadership team. And honestly, I've experienced it firsthand: this is sometimes all it takes to make sure you have a leadership team who understands that we are part of the picture and not just a feature factory.
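To make this concrete, here is a minimal sketch of what measuring the allocation of effort could look like, assuming each work item is tagged with the initiative it serves (the items, tags, and initiative names below are all hypothetical illustrations, not a prescribed tool or schema):

```python
from collections import Counter

# Hypothetical work items, each tagged with the initiative it serves.
# In practice, tags might come from an issue tracker, PR labels, or
# rough estimates collected in planning discussions.
work_items = [
    {"id": 1, "initiative": "checkout-revamp"},
    {"id": 2, "initiative": "checkout-revamp"},
    {"id": 3, "initiative": "platform-reliability"},
    {"id": 4, "initiative": "compliance"},
    {"id": 5, "initiative": "checkout-revamp"},
]

def allocation(items):
    """Return the share of effort per initiative, as fractions of the total."""
    counts = Counter(item["initiative"] for item in items)
    total = sum(counts.values())
    return {name: count / total for name, count in counts.items()}

print(allocation(work_items))
# e.g. {'checkout-revamp': 0.6, 'platform-reliability': 0.2, 'compliance': 0.2}
```

The point is not the arithmetic, which is trivial, but the conversation it enables: "60% of our effort goes to the checkout revamp" is a statement leadership can reason about, unlike a ticket count.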
[00:19:46]
It gives meaning to everyday work, because that's something you're gonna show to the team as well, to show how day-to-day work contributes to the bigger picture. And it also helps close the feedback loop between planning and execution. What I mean by this is that we're seeing a lot of companies where planning and execution are weirdly separated, in a way that the people who decide what we're gonna do are not necessarily sufficiently exposed to the consequences of their decisions in terms of execution. What that means is that it's very easy, as a senior leader, to say: we're gonna start this new project that I believe is the right thing for the business. And it's very easy to move on to the next thing without necessarily understanding how disruptive this is gonna be to the team. The simple ability to show how engineering efforts are contributing to business goals helps close this feedback loop, in terms of being able to express that our focus is being split, and that it's being split because of this and that. And that the only thing we can do as an organization is make different trade-offs; we're not gonna magically create engineering capacity. And I think this is really, really crucial.
[00:20:55]
How do you use the measure of alignment? Well, as I was saying, two things: showing the teams their contribution to the big picture.
[00:21:04]
Showing the leadership the tech contribution to the overall strategy and how budget is being used.
[00:21:11]
Moving on to delivery performance: are we delivering software efficiently? So, we have industry-standard metrics. You all know this book, Accelerate; I'm not going to go over it. It's an industry standard today: we get four metrics that are really solid at measuring the delivery performance of teams. And of course, it has to be done at the team level and include cross-team collaboration, not at the individual level. That being said, I'm not going to talk in detail about the DORA metrics, which you probably already know. I'm just going to talk about how they can easily be misused, and the most common pitfalls we're seeing in a lot of companies.
[00:21:52]
The first one is that delivery performance metrics should belong to the team, as a tool for continuous improvement and operational excellence, not as a measure of success. That's for the same reason I gave earlier: if you believe the DORA metrics are the measure of success of your team, you are essentially reducing your mission to shipping code, which is not what your mission should be as an engineering team. They should not be used as a KPI to communicate with leadership.
[00:22:24]
Why? Simply because it's not actionable in the hands of leadership. What are they going to do with them? I had a discussion recently with a VP of Engineering who told me that he used the number of merged pull requests each month in his reports to the board. And you can imagine what happened. One month he showed that they had done 100 pull requests. The month after, he showed that they had done 90 pull requests. And he was asked: where are the missing 10?
[00:22:57]
Who cares? Is it really that important? And you know, maybe the missing 10 are relevant because they indicate that there was somehow a change of pace, or a change of process, or whatever. But in the hands of the board? I don't think so.
[00:23:18]
Delivery performance metrics, like every metric out there, are only valuable when they drive actions. However, and I think this is a mistake a lot of companies are making right now because of the hot topic of developer productivity and DORA metrics, a lot of people assume that putting numbers on a dashboard, like computing DORA metrics and putting them in a spreadsheet, is somehow going to drive some observable behavior change. And most often it just doesn't. Why? Because sure, you are measured, but then what? Most managers just don't know what to do with them. Again, they don't know what's good. They don't know what they should do differently based on that. And I think this is a very, very common mistake.
[00:24:07]
So how would I recommend using those delivery performance metrics? Well, first and foremost, by exposing them to the teams. This is where they belong and this is where they are actionable.
[00:24:18]
And exposing them with clear expectations. As a VP of Engineering, sure, I'm going to make those metrics available to the teams. Now what? What do I want? Do I have a threshold that I expect every team to reach? Am I looking for some level of improvement? Am I looking for no degradation? What's good? And I think this is the part that is very often missing in the deployment of those kinds of efforts.
[00:24:45]
And how not to use them: as a measure of success for a team, except perhaps in the very specific case of a platform team.
[00:24:52]
For a platform team, or platform efforts, essentially when you are switching platforms or changing the way the company delivers software as a whole, and you want to measure whether this particular effort is successful, then yes, the DORA metrics are probably the best proxy you have, because ultimately they are going to capture the capacity of the teams to ship more often and faster.
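As a rough sketch of what these team-level metrics look like in practice, here is a minimal computation of three of the four DORA metrics, assuming you have deployment records with a commit time, a deploy time, and a failure flag (the records below are invented for illustration; time to restore service would additionally need incident-resolution timestamps):

```python
from datetime import datetime
from statistics import median

# Hypothetical deployment records: when each change was committed,
# when it reached production, and whether it caused a failure there.
deployments = [
    {"committed": datetime(2024, 5, 1, 9), "deployed": datetime(2024, 5, 1, 15), "failed": False},
    {"committed": datetime(2024, 5, 2, 10), "deployed": datetime(2024, 5, 3, 10), "failed": True},
    {"committed": datetime(2024, 5, 4, 8), "deployed": datetime(2024, 5, 4, 12), "failed": False},
    {"committed": datetime(2024, 5, 6, 9), "deployed": datetime(2024, 5, 6, 18), "failed": False},
]

def dora_summary(deps, period_days=7):
    """Deployment frequency, median lead time for changes, and change
    failure rate over a period. (MTTR is omitted: it needs incident data.)"""
    lead_hours = [(d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deps]
    return {
        "deployment_frequency_per_day": len(deps) / period_days,
        "median_lead_time_hours": median(lead_hours),
        "change_failure_rate": sum(d["failed"] for d in deps) / len(deps),
    }

print(dora_summary(deployments))
# {'deployment_frequency_per_day': 0.571..., 'median_lead_time_hours': 7.5,
#  'change_failure_rate': 0.25}
```

Computed over a whole team's pipeline like this, the numbers stay aggregate and trend-oriented, which is exactly what keeps them a continuous-improvement tool rather than an individual scorecard.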
[00:25:19]
Health: is the context favorable to durable success? This is the most difficult one. The best we have are proxies; we cannot measure this accurately because, of course, there's a significant human component to it, and that's not something we can easily measure. The best proxies are churn, satisfaction surveys, or unusual work patterns. So, for example, seeing when somebody has been working all weekends or all nights, something that could lead to burnout.
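As a small illustration of the "unusual work patterns" proxy, here is a sketch that flags the share of activity falling outside weekday working hours, assuming commit timestamps as the activity signal (the data, thresholds, and the idea of using commits at all are arbitrary choices for the example, not a recommendation):

```python
from datetime import datetime

# Hypothetical commit timestamps for one person over a week.
commits = [
    datetime(2024, 5, 6, 10, 30),   # Monday morning
    datetime(2024, 5, 7, 23, 45),   # Tuesday, late night
    datetime(2024, 5, 11, 14, 0),   # Saturday afternoon
    datetime(2024, 5, 12, 22, 15),  # Sunday night
]

def off_hours_share(timestamps, workday_start=8, workday_end=19):
    """Fraction of activity outside weekday working hours: a rough
    proxy for unsustainable work patterns, never a judgment by itself."""
    def off_hours(ts):
        weekend = ts.weekday() >= 5  # Saturday=5, Sunday=6
        late = not (workday_start <= ts.hour < workday_end)
        return weekend or late
    return sum(off_hours(ts) for ts in timestamps) / len(timestamps)

print(off_hours_share(commits))  # 0.75: three of the four commits are off-hours
```

A sustained high value here is a prompt for a conversation, nothing more; as the talk stresses below, day-to-day variation is normal and no metric replaces one-on-one discussions.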
[00:25:54]
How you want to use them is to make sure that your results are sustainable, so essentially that the team is not paying the price of whatever pressure the company is putting on them, or of the level of performance you're trying to achieve. But it's really important not to lose track of the goals.
[00:26:12]
What you aim to understand is what's going to hurt you in the long run, not the day-to-day shocks and variations that can happen. We are humans. We are not perfectly consistent in our mood, we are not consistent in our engagement, in our activities, and that's fine. I insist on this because I'm also seeing companies that think: okay, maybe I'm going to use productivity metrics or activity metrics to notice when somebody is not contributing. Like: oh, they're disengaged, I have to talk to them. Well, first, you should probably be talking to them anyway, and before that point.
[00:26:51]
And second, what does it mean when an activity metric goes down? Okay, maybe that person didn't contribute code that week; that is entirely unrelated to the value they produced. Maybe they were talking with a customer, maybe they were doing something that was way more valuable to the business than actually writing code, but that's not something the metrics will show. And in general, of course, no metric should ever replace one-on-one discussions.
[00:27:19]
Key takeaways of this talk.
[00:27:22]
Our inability to measure engineering success has become untenable. Developer productivity is one answer today that fills a gap, but it is an incomplete measure of success.
[00:27:32]
Engineering success takes alignment, delivery performance, and health. Thank you very much for your attention. I will be happy to take questions, and of course we have a booth if you would like to ask questions afterwards and see what we're building. Thank you.
[00:27:46]
Thank you.
[00:27:55]
Hi. Um, my question is: you say that some of these measures, if we want to adopt some, should live with the team. Is the team then also the one bringing them in? Because usually, the way I know it, it's CTOs or managers who come and say: we're doing DORA metrics now, or: hey, I looked at your velocity and it's looking weird. But I often also see that the teams don't want to be measured at all. So it's hard to bring measurements into the team that are good and valuable for the teams.
[00:28:29]
I completely agree. I would say that, in general, the teams that want to use DORA metrics to measure themselves probably already do. So in this regard, you're absolutely right. Still, I do think there are benefits in measuring, and that's something you can explain to the teams. What we're seeing most often is that in companies there's a defiance regarding metrics, which is not always justified, because they have a decent role to play in terms of continuous improvement, in terms of getting better at delivering software. But again, it has to come with clear expectations. It's not going to work if, as a CTO, I come to the team and say: hey, we're now measuring this. And now what? Am I going to be judged on this? Do I expect this number to grow? What am I doing here? And I think this is really the biggest thing there. But I wouldn't reject them as a whole, because I think there's a reasonable way to use them, especially when you're trying to move an organization to more modern ways of working, including things like shipping software differently.
[00:30:23]
Well, no more questions. Thank you, Arnaud.
[00:30:25]
Thank you. Ah, there's one here.
[00:30:27]
Oh no, you have to speak into the mic.
[00:30:32]
So given that no one else asked a question, I'll ask one. Maybe it's not a good one, but
[00:30:40]
you are talking about engineering productivity, but isn't the point that we produce something when we work together with product and business? So isn't measuring engineering on its own a reflection of a broken view of what we actually do, or not?
[00:30:59]
It is, in a certain way. And the problem is, the way companies are organized today, I think, doesn't allow using customer satisfaction or business impact as the sole measure of success. It's as simple as that. I wish I could say it should be the only measure that matters, and I think it probably should be, but that's not what companies expect today.
[00:31:23]
You know, at a personal level, having been a CTO myself in a pretty large organization, it's embarrassing. Because when we are in an ExCo with other C-levels, everybody has a measure of success. Everybody can tell where they're doing well, in a way that is easy to communicate and in a way that is easily challenged. Engineering doesn't have this. And that, I think, will remain a problem even in a perfect world where we would be incentivized solely on the success of our business.
[00:31:56]
Good food for thought. Thank you.
[00:32:11]
Thank you.
[00:32:21]
And I think now is time for a coffee break. Feel free to meet Arnaud and his team at the Echoes booth with the partner stands, to find out more about their product.