In this episode, we answer an interview question that Brian has started to notice Product Managers stumbling over: "What are some indicators you would look for to know if an agile team is working effectively?"
0:00 Topic Intro
1:49 Velocity
4:39 Delivery and Forecasting
6:21 Flow Metrics (Cycle Time, Lead Time, WIP)
7:50 Say-Do Ratio
10:17 Different Uses for Say-Do
11:31 Retro Improvements
14:48 Impediment Resolution
16:45 Roadmap Categorization & Finance
19:54 Backlog Health
22:51 Customer Satisfaction and Value Confirmation
26:47 Surveys
28:09 A Rant on Steering Feature Factory Culture
29:56 Team Satisfaction (Niko Niko)
33:27 Team or Service NPS
36:29 Bug Leakage / Escaped Defects
40:27 Training (L&D)
43:55 Leads Kanban: People Development
47:01 Meeting Duration & Effectiveness
49:23 Wrap-Up
= = = = = = = = = = = =
Please Subscribe to our YouTube Channel:
https://www.youtube.com/channel/UC8XUSoJPxGPI8EtuUAHOb6g?sub_confirmation=1
Apple Podcasts:
https://podcasts.apple.com/us/podcast/agile-podcast/id1568557596
Spotify:
https://open.spotify.com/show/362QvYORmtZRKAeTAE57v3
Amazon Music:
https://music.amazon.com/podcasts/ee3506fc-38f2-46d1-a301-79681c55ed82/Agile-Podcast
= = = = = = = = = = = =
I want to have a podcast to help somebody who's looking for a product manager job, because I have this question that I ask in all my product manager interviews. It goes a little something like this: what metrics might you look at to know whether a quote-unquote "Agile" team is operating effectively? Yes, a great question. I think it applies to not just product, but even to coaches or Scrum Masters: what metrics do you use to gauge whether your teams are being effective or not? Yeah. Terrific. You might also expand this question to ask whether your teams are on the right track. I almost don't want to use the term effectiveness in this question; what I really mean is, how do you know your teams are doing well? Doing the right thing, and getting better at their craft. Right. We're trying to avoid the "effective" word here. A lot of times people fall into the trap of efficiency versus effectiveness, and that's a slippery slope which we're not going to go down in this podcast, folks. So there's a lot of bad habits out there. That's the main reason I use this question. I've seen a lot of really great agile coaches, but I've also seen a lot of project managers in disguise as either product people or Scrum/agile people — I don't know what to call them. But also, I've seen a lot of people that have no experience in product or team coaching, or even project management, who won the lottery as product managers. Yeah. People for whom about the only tool they have is a hammer. So I would like to identify those people through the funnel, and this question usually helps. Not always, but a lot of times it does. It says a lot about a person whether everything looks like a nail because all you have is a hammer, or whether you have other things you can do. A little finesse. A little finesse, that's right. Goes a long way. Now that we're through the introduction — whoa, like and subscribe, everyone. Every like and subscribe helps the podcast. Alright, so let's pick the easiest thing, the one I would expect to hear, the answer you would often get. Yeah, I would say maybe 50 percent of the time people will say velocity. I'll throw this one over to you: when you're an agile coach and you hear a product person say, "Hey, one of the things I look at to know if my team is on track, to know if my team is doing well and forming and working together, is velocity" — what is the initial thing you think when you hear that? Can I say this on the podcast? The initial thing that crosses my mind right off the bat is that this person has nary a clue what they're speaking about, because velocity is one of those things that is very widely misunderstood by a lot of people and, to a greater degree, even weaponized. I've seen that, where teams are asked to increase their velocity. And increase it they can, very easily, right? Everybody knows how to game the system. Every three-point story becomes a five-point story and boom, voilà, you have increased velocity. Yeah. Except you really have nothing. Yeah. Right. So that's the first thing that comes to mind.
Why are you counting the number of steps it takes to get from the starting pistol to the first milestone? You're running a marathon here; that doesn't matter. You've got to pace yourself. While you were saying that, I was thinking it would be acceptable to be watching velocity if you dig a little deeper and ask, well, what about velocity are you watching? If they tell you, "I just want to see that it's pretty stable — I'm not looking for big swings in either direction, like two sprints ago they did zero, then last sprint they did 50, then the next sprint they did 10; basically there's no pattern to your velocity" — if that was their answer, would you feel better? Well, I would look at those kinds of patterns and try to figure out what happened, right? You can learn from that. Why is there a zero here, and what caused it? For that purpose, yeah, definitely. Those are things you can look at. The other thing is that velocity helps you become predictable over time. You can look at the last three sprints: on average we did 14, let's say, so it's reasonable to assume, with everything else being constant, that we could do around 14, give or take. And if your backlog to release is 140 points — I'm using these numbers for a reason — that's going to take you about 10 sprints, roughly. You can do those sorts of calculations; it's fine for that. But it should be used by the team to make their own predictability better. It should not be any business of management to go in and say, "It was only 14? Now you can do 20." Yeah, because that's what I've seen too. Yeah. In retrospect, we could have broken up this podcast into a three-hour extravaganza where we examine each piece of criteria from the perspective of the single-team product manager or Scrum Master, and then from the multi-team perspective, because we could really ask these questions of an agile coach — we could really dig in, in an interview, and get in depth with these questions. Because when I hear you develop this velocity, you're developing it into a forecast, and that's something I would like to know as a product person. If I'm starting with a new team as a product manager, I would like to know: is your team already delivering a stable forecast? Is your team delivering in a stable, predictable way, so that the product manager can give a forecast? Is that basically what you were getting to? Yeah — I didn't say it as eloquently, but yes, that's exactly right. Yeah, because it's a problem to not be able to give a forecast. I mean, every different category that we're about to weave in and out of — some dark corners we'll go through more quickly than others — but if someone asked me this question, "Hey, what metrics would you look at to know if your team is on track?", and you've got a stable, mature, improving team, I would — and I'm pretty sure I have given this answer before — say: are we delivering valuable software on a regular cadence? Just on a regular cadence. Regular meaning predictable. So predictably, we're delivering software.
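A quick sketch of the forecasting arithmetic described above: average velocity over the last few sprints, divided into the remaining backlog, gives a rough sprint count. The numbers mirror the example in the conversation; everything else here is illustrative and not tied to any particular tool.

```python
# Rough release forecast from average velocity, as described above.
recent_velocities = [13, 14, 15]   # last three sprints' completed points
remaining_backlog = 140            # points left until the release

avg_velocity = sum(recent_velocities) / len(recent_velocities)   # ~14 points/sprint
sprints_needed = remaining_backlog / avg_velocity                # ~10 sprints

print(f"Average velocity: {avg_velocity:.1f} points per sprint")
print(f"Rough forecast: about {sprints_needed:.0f} sprints to the release")
```

As the hosts stress, this is a planning aid for the team, not a target for management to push on.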
It doesn't matter if it's one feature a month or one feature a week. All the "let me dig into the metrics" stuff kind of breaks down when users are getting their software, they know when the next window is, and you don't miss the window. You're showing them what they asked for, and all the rest of these numbers kind of fade away. Users don't care about these things anyway, so why fuss over it ad nauseam, right? Yeah, I agree with you. As long as you're delivering valuable software regularly, and the software is of good quality, you're making it. So when people say velocity, I will question them a little bit, because I'm digging toward things that actually could be useful, like cycle time, lead time — flow metrics, basically. Yeah. I try to dig through velocity to see if they understand what they're trying to get out of it: that what they're really looking for are flow metrics, to understand at what stages of the process they're bottlenecking, and WIP limits — to see if they understand what WIP limits are, if they understand what basic flow metrics are. Take just cycle time and lead time: if they understand what those are and they know to watch them, to look at them on the reports to identify problems, I'm super happy. Yeah, and from a product lens they don't necessarily have to know all of the equations — there's only one. Yeah, Little's Law, things like that. They don't necessarily have to know that; they can partner with the Scrum Master or agile coach. Sure. Right? But they have to know of it. Yeah. And more importantly, not get really down in the weeds with the velocity number in terms of story points or anything like that. Yeah. And a savvy product person will know to look for the teams that have 15 things in progress and never finish anything. I would hope product people will be looking for that, because I'm sure it sticks out to leadership. Yeah, definitely — teams that keep starting and not finishing. Yeah. Speaking of starting and not finishing — that was a slick transition. Yeah, terrible, terrible. Here's one that I have not seen a lot of teams track, but I think is actually pretty valuable: the say-do ratio. You could also look at it — it's a little bit different — as your committed-versus-delivered ratio, but basically your say-do ratio. I love that metric, by the way, the say-do ratio; I often track it. And it's not just committed story points or number of stories; it can be other things too. The team commits to putting two improvement items into practice, for example, from a retrospective, and they don't put any into practice — well, that's not a good say-do ratio. So you can apply a say-do ratio to a lot of things. As a concept, I think it's fantastic. Again, just like most of these measures, don't get hung up on the numbers; look at it and use it as a basis to ask questions. That's what I'm saying. If you don't meet it, you don't meet it; it's not going to suddenly change. Huddle up with your team and ask: why didn't we? What got in the way? And then next time, you do better. How do you track this?
Is it just that you look at what was committed at the beginning of the sprint and what was delivered at the end, and you divide? Yeah, that's the fundamental say-do that you and I know, right, everyone listening? That's easy to start with. Just look at the number of whatever it is — story points, number of stories — that you committed to, and at the end of the sprint look at what you delivered. Look at the delta and talk about it with your team: did we simply take on too much? Or is it because Joe or Mary had to take time out during the sprint? Whatever the reasons are, talk about them, so that everyone is at least on board with what's going on, and then inspect and adapt from there onwards. Right. The other things you can look at for say-do are things like how many times you effectively blocked interruptions to the team — like leadership coming in and saying, stop what you're working on, I need you to work on this. Usually it doesn't happen at the team level; it's usually at the team-member level, a developer or tester or whatever. How many of those did you spot and stop? Yeah — spot and stop — so during a sprint, look at those. And for the say-do, you don't have to have the "say" and the "do" known up front, meaning you can say, for the next sprint we're going to target blocking at least three such interruptions. If you don't get three, that's fine. But if you did get three, did you block all three as a team? Again, I'm not saying it's one person's job to do that — it's the team's job. That's a say-do ratio, right? So there are other things like that you can come up with. Yeah, you could probably think about different categories for your team. One of my teams uses Scrum, for example, and in Scrum all this stuff is kind of baked in — the fact that you're going to meet with customers, through your backlog refinements, through your retros. I know in actuality it's a little different — some people don't — but I'm specifically thinking of a team that uses Kanban and doesn't have those events baked in for them. So with that team, they could have a say-do category like: we met with X number of customers, or representatives of the personas of the work we're working on — some kind of say-do ratio that says we're going to touch all of our personas in a specified period of time. This is stuff I don't think a lot of people think about, because they're too engrossed in the work of the sprint. Right. I like that: number of touch points, period — number of touch points or interactions with a customer, and number of interactions with each persona over two sprints, three sprints. It's contextual, right? You have to tailor it to your context. But I like that; that's also a good one. You brought up — or implied, or maybe it was in my brain — the say-do ratio in terms of retrospective actions. No, I did say that. Okay: action items coming out of a retrospective, and then action items that were actually driven to completion somehow, that were resolved — the improvements that were made, basically. That's sort of like a say-do ratio, but it's different. I would definitely track it differently.
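A minimal sketch of the basic say-do calculation described above: delivered divided by committed at the end of the sprint. As the hosts note, the same ratio applies to anything the team said it would do; the numbers here are invented for illustration.

```python
# Basic say-do ratio: what was delivered divided by what was committed.
committed_points = 34     # committed at sprint planning
delivered_points = 29     # actually done by sprint end

say_do = delivered_points / committed_points
print(f"Say-do ratio: {say_do:.0%}")   # ~85% -- a prompt for a conversation, not a grade

# The same idea works for anything else the team said it would do,
# for example retro improvement items.
retro_actions_committed = 2
retro_actions_done = 1
print(f"Retro-action say-do: {retro_actions_done / retro_actions_committed:.0%}")
```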
Well, actually, maybe I'd track it in swim lanes. I don't know how to track it. Yeah, there are ways to track it. I've seen a whole range of different things people do with it. They would create stories for these, for example, and then assign them zero story points because they don't want to mess with the metrics — people can do that. Before COVID, interestingly, we would use a flip chart to track those. It would just be visible, and then every day we'd look at it at the stand-up: are we doing the thing we said we would? Right, there are ways to do this whether you're hybrid, remote, or in person, but I like some of these outside-the-box types of things. Just around that same subject, you may decide at the retrospective to target one, two, or three experiments you're going to tackle in the next sprint. That's another say-do ratio. We said we would do two — did we just not do them? If so, that's fine, but why? Yeah, that's a really good one. And that's where the value is: having the conversation. I used to do the retro action. I don't like putting it in the backlog, just because I'm a stickler for how I arrange things in the backlog and I don't like a bunch of tasks sitting out there not connected to anything — but that's just me being silly about it. What the team found was we could all commit to doing one takeaway from a retro. So we would vote on what the highest-value takeaway was; we all agreed, and we voted on it. And then the ALM tool we were using at the time — could have been anything, could have been Jira — had the ability to display the sprint goal above the active sprint. So we would put the sprint goal, then a new line that said "retro action this sprint." You'd have the sprint goal, which is the goal for the whole sprint, and then the retro action we committed to for the sprint, right there in line. Every morning when we were walking the board, you'd start by seeing the sprint goal at the top of the board, with the retro action right there next to it. I've also been on teams where we had a card on the board for the retro action. So I've seen it both ways. I kind of prefer it being a banner-level item at the top; it doesn't necessarily need a card to be pushed forward. Now, that team had been together for a couple of years, so to be fair and clear about it, just visualizing stuff like that and having them all commit to it was easy enough for them to do. I've used the hybrid approach there. So with this particular ALM tool, you could conduct your retrospective and the voting any way you want, but once the votes have been cast, I simply gather those stickies in descending order — so five votes is topmost, and then three and two, etc. — and ask the team: how many do we feel like we can actually implement? One or two, right? When they say, "Well, let's do all three" — we've never done that, so let's just start with one. Right. And that one I screen-scrape and put as an image on the team's dashboard. So when people are looking at it, which is every day, it's there front and center, because I believe in that.
I think once you get to this level, if you have a retro and you decide on an improvement item and then it's just not visible, you're going to lose it. So you need to make it visible, keep it there, and then periodically — maybe daily at the stand-up — just say, hey, look, we should be doing this. So we talked about retro actions; whatever you figure out for retros, you can do the same thing, in the same way I think, for impediments. Impediment resolution, yeah — that's another say-do, I guess, right? Yeah. You can figure out how many we actually encountered and how many we knocked away, as a team. Because that's the thing: the team has to work on this together, as opposed to throwing it over the wall at the Scrum Master and going, "Hey, Scrum Master, you're the remover!" No, they're not the human mop. They don't remove impediments; they facilitate removal. That's potentially my favorite line in the Scrum Guide — it's "causes the removal of," exactly what it says, "causes the removal of." So odd. Why not just say it removes them? I mean, I like tracking the rate and the time it takes for each one. If you can label each one appropriately, now you've got some really powerful numbers. It's like: oh, when we have to escalate an impediment to another team, it's a matter of hours or days; when we have to raise something that goes to leadership, now we're talking weeks and months, or potentially never. I feel like if you just track it and throw labels on it, you'll start to realize what your most painful avenues are. Right, patterns will emerge. Absolutely. And that allows you to pivot a little bit and say, hey, look, we need leadership to act faster. I mean, it gives you the information that you need to do that — I don't know if it enables you to do it, but without the information you can't do anything. I agree. So that's the first step, right? I agree. Some roads have roadblocks. Say-do was a good one, though; I like say-do. And you know what, because teams don't really have a graceful way to track it along with their normal work, a lot of people don't use it. You would think that should be front and center in every ALM tool — the amount of stuff you said you were going to do at the start of the sprint and where you're at against it. They have those numbers; then when you finish a sprint, they're gone. Yeah, they're gone. It doesn't clearly tell you. Yeah, I think you have to finagle the tool to do that for you in some way. Yeah. Let's talk about some product-management-specific metrics that I would like to hear as responses to this interview question. How about the percentage of my roadmap features with respect to technical enablers, or features, or things that customers are buying, or things that the sales team wants — some kind of labeling on my roadmap to let me know who we're building things for and why we're building them. Basically, if you're a product manager and a hundred percent of your roadmap is feature delivery, I'm really worried about you. Yeah, I agree. And this can simply be done with tags, right? It doesn't have to be that complex. Yeah. But you need that good mix — that's really what you're saying, and I agree. So oftentimes you get these teams that have — I don't want to say the word. Are we talking about feature factories here?
That's the word, yeah. But you know, sometimes you get these product folks on your team who really are only focused on output. You're not really going to be successful long-term, right? Because all you're doing is just building, building, building. There's no room in there for improvement, no room in there for enablers, no space in there. It's like a highway that's blocked with cars — nothing's going anywhere, but you've got 12 lanes. LA folks will understand what I'm saying. Blocked with something. Yeah, blocked with something. So yeah, I agree with you there. But that mix is contextual too, right? It depends on your industry, your context, the product that you're building. But good product people will know that, and they will look at it, take a stab in the dark and say, let's just save 20% for enablers. Mm-hmm. And if you find yourself doing more, then save 25. If you find yourself doing less, save less. Yeah. I mean, if you're not tracking it at all, you don't have those percentages. Right. And that's the worst. Surprisingly, a lot of teams have no idea how much of their capacity goes where. Like, do you have to be full up every sprint, to the top of your capacity? You know what I mean. And you know what's not even on the list? Anything about utilization or any nonsense numbers like that, because I would just kick somebody out — we're done here, there's really no reason to go on with the interview. I'm not talking about ludicrous stats; I'm talking about usable stats, and where you're spending your money. Hey, let's go back to the software finance podcast that we just did, folks. Roadmap categories — that's a good one. Roadmap features: new features versus maintenance versus technical enablers that get you to the future, versus throwaway work that goes nowhere, versus R&D work. Or exploration. Yeah, exploration, exactly. All these things directly go to a line item in finance. So on one hand, I understand a lot of people don't even bother with this because it's invisible to them — somebody else on some other team takes care of it — but it's very valuable information to know where you're tuning that dial of who we're building things for. Yep, absolutely, couldn't agree more. Everything you do has a code in the general ledger, right? People just may not be aware of it, but that's absolutely the case. "Don't worry about that, Om. Let the accountants take care of that." There's only one colored slip you're going to get from that company, okay? I'm just going to say that. Oh, no. I think it's not my favorite color. The realest of real talk on the old Agile Podcast. Next: backlog health. Backlog health — and this is another one where, again, I was talking about weak product ownership or leadership. This is another one where people say, "I've got this backlog, and oh, you taught me it should be deep — I use the acronym DEEP, right — and look at my backlog, isn't it wonderful? I've got 300 items on there." Yeah, it's very deep. Okay, so now I'm going to talk to that person and say: guess what, kale's on sale and so are tomatoes. Go buy like 30 pounds of each, stuff them in your fridge, and see how that goes. At some point they will become stale. At some point you have to throw them out, right? We're not going to talk about my college days here, but yes, you do have to throw them out.
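To make the "stale produce" idea concrete, here is a rough sketch of the kind of backlog-aging report the hosts mention a little further on: flag anything untouched past a cutoff you pick (90 days here). The item names, dates, and field layout are hypothetical and not tied to any particular ALM tool.

```python
from datetime import date, timedelta

# Hypothetical backlog export: (item id, title, last time anyone touched it).
backlog = [
    ("PBI-101", "Export to CSV",         date(2024, 1, 15)),
    ("PBI-204", "Dark mode",             date(2023, 6, 2)),
    ("PBI-310", "Retry failed webhooks", date(2024, 2, 20)),
]

STALE_AFTER = timedelta(days=90)   # pick a number, draw a line
today = date(2024, 3, 1)

stale = [(item_id, title) for item_id, title, touched in backlog
         if today - touched > STALE_AFTER]

for item_id, title in stale:
    print(f"{item_id}: '{title}' untouched for 90+ days -- close it or consciously keep it?")
```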
So if your backlog is so deep that you haven't looked at stuff in — pick a number: a month? A month and a half? Three months? Just pick a number, draw a line, and burn everything underneath it. That's right. Burn. Don't archive. Archiving stuff is like taking it out of the fridge and putting it in a drawer somewhere — that's archiving to me. If it's going to stink in the fridge, why would you put it in a drawer? You don't have any use for it; get rid of it. It's easy to write a report that goes across all your different backlogs and just gives you the stale things, the items aging over 90 days or whatever — pick your time frame. That's not the problem. The problem is people's mentality here. They say, "Well, I don't want to lose it. I might need it one day." Okay, well, export it then. Put it in an Excel spreadsheet and do what you want. And no — it's still there. If you really do need it and you've closed it, it's still there with everything you wrote. You're not losing anything. That's right. Yeah, it's just in a different state. You can go to the graveyard and dig it up, is what I was going to say. It's still going to smell the same, my friend. No, you know exactly where it is — you're the one that buried it. It's dead. It's an ex-work-item. I'm not going to do the parrot sketch. No, seriously, a lot of people just have this phobia of losing stuff. Put it in a spreadsheet, right? Where does that fear come from? Where does that fear come from? I suspect these people were burned by bad management at some point. Is this the same mentality as when you go into a house and stuff is stacked up to the ceiling because they can't get rid of anything? The hoarders. Yeah, the hoarders. Hoarders: Product Backlog Edition. Again, I have aging and staleness reports. They're automated. Sure. They don't automatically go and close things, but they do fire off an alert that says, hey, here are the work items you have. They don't fire every week, because I don't have that much old stuff, but whenever they do fire, it's a list of here are the greatest hits of the seventies or whatever, and I look through it and say, oh, I'd like to keep this around another month or so — maybe I'll get to it. And then a quarter goes by and I'm like, no, I didn't get to it. Next time the report runs, I'm like, you know what? I'm just going to close it. Yeah, it doesn't have to be binary, right? You can keep some and lose others. That's fine. It's a human making the decision at some point. But don't keep it in the backlog. Yeah. I want to transition us, because I think we've beaten this dead horse, buried it, unburied it, beaten it again, and then buried it again. Lucky horse. Customer satisfaction — ooh, metrics for customer satisfaction. And I'm not talking about just customer surveys and NPS scores and stuff like that; I'm talking about feature validation. I'm talking about it as part of your work process: having a column that comes after the done column that says, I validated that the feature we put out solves the customer's problem and they like it. That's what I'm talking about. Absolutely. This is the flip side of what we were saying earlier, right?
The number of touch points a team has with the customers — this is looking at it from the other side of the mirror: the number of times you meet with a customer to show them what you've got. And no, don't wait for your review. You don't have to, right? The minute you have something complete — and when I say complete, I mean developed and tested — by all means reach out to the customer and say, hey, here's what we've done, how do you like this? It's far better to pivot two, three, or five days into the sprint than to wait until the end of it and then say, oh, the customer doesn't like this, we'll get to it next sprint. Imagine that four or five days in you get feedback, good, bad, or ugly — maybe by the end of that very sprint you can turn it around. So there is that as well. But you started by saying customer satisfaction, which is almost a loaded term these days, and people come up with all sorts of metrics. It's just like "MVP" — it means so many things. That's why I wanted to start with just validation of what we're working on: hey, I'm going to give this to you and ask, is this valuable to you? Customer satisfaction pivots me into my next category, which was going to be value to the customer — is the customer actually getting value from our solution? Because again, in all the years I spent in product management, there are some things I try to stay very flexible about. I try not to have really strong opinions — or I have strong opinions, but they're very loosely held. But one of the things I do have a strong opinion on, maybe for worse, is that customers won't tell you, "I'm not going to use this," or "this is not what I need," or "this is not valuable," or "I don't like this." They will nod and say, "Oh, it's pretty good," and that's all the feedback, and then they'll quietly not use it and go back to whatever they were doing. People want to be nice to strangers. They're not going to tell you this solution is garbage and here's why. You need to have that honest conversation with them and empower them to say when it's garbage, right? You ask: how many times are you going to use this? This is actually quite interesting, because they haven't got it yet — you're just showing it to them in some environment. Maybe it's staging or something like that; it has their data in it, at least. So if they say it's good enough, take their word for it. But then later on, down the road, post-delivery, post-implementation, when it's in prod — is your team following through? Especially the product side, right? Are you following through and doing that post-delivery telemetry to see: is anyone using this? How are they using it? Are they having a good experience or a bad one? How many times are they using it? All of this should be part and parcel of product — and the team too, because product is speaking with the team all the time, hopefully, and they can say, look, you know that stuff we did? It didn't quite land. If you're not doing that, and you just stop at delivery — which is where a lot of teams stop; they say, well, it's done and it's deployed, okay, done, we'll move on to the next one — you're missing it.
That's why I jumped straight into feature validation — basically a post-done step on the team's board that the team is responsible for. Because, this being later in my career, as I come around to, I don't know, being old, I'm starting to shift the way I work: I want to bring sales in, I want to bring leadership in. I want to make sure — like we were talking about with the roadmap before — that we don't get too far from leadership's goals, where it's like, oh, we have our roadmap, and leadership has their roadmap, and everyone's got different roadmaps. Yeah. Suddenly we're Airbnb, where we've all got different roadmaps, and how did we get this way? "I had a dream and I went on a hike — how did I get this way? Oh wait, I've been the CEO the whole time." Two very boring minutes later... You can do this with surveys; you can do this with in-app "hey, do you like this feature?" kinds of things. In the world of mobile development, you do it in the application, and you try to time when you're going to get a good response — like, you don't do it right after a crash. Yeah, because in the mobile development world you have it a little bit easier, because you have the app stores, so generally — I mean, there's a lot of trolls in the world — but you generally know how people are perceiving your application. B2B applications are a little more difficult. You have to bake in the touch points, or you have to be really good about doing surveys on a regular basis, because participation in surveys is garbage. You have to use more than one avenue, I feel; just one isn't enough. Yeah. The other thing, even with the app stores: when people write reviews and provide feedback, it's a known thing that a dissatisfied user is more likely to say something than a satisfied user. Somebody did some research on it and said a dissatisfied user is going to go tell upwards of nine or ten people that your product sucks. A satisfied person might not say anything to anyone unless someone mentions the product by name — "I use it, it's pretty good," right? So all of this has to be taken into account. But if your team isn't even aware of all of this stuff — which, oftentimes, feature factories are like that, right? They're not aware, because they're prized only on finishing things, not on actually making sure that the customer is satisfied. There's a big difference there. Well, this is like the podcast that I would like to have — maybe one day we will — about how I view all the layoffs that I see in our space, specifically the ones affecting agile professionals and whatnot.
That stuff worries me, because exactly what we're talking about now is: you're in a feature factory and there's little you can do to change the culture. Maybe, if you had a few of these metrics and you were connecting directly with customers, that would be enough to steer the culture in a direction where they're looking at the right metrics. People might be listening to this podcast for the first time and not know all the times I've gone on a rant about this category. If you've never heard any of my rants before — first of all, congratulations. But basically: a lot of old-school organizations view all of the quote "IT workers" — product managers included, Scrum Masters (because they can't tell the difference between a Scrum Master and a project manager anyway), and all the developers — as a cost center. They don't generate any revenue. The sales team generates all the revenue, because they book all the sales. And even when customers renew from year to year because of the stellar service they get from customer support and the development teams and new features, it still goes back to the initial salesperson who sold them their contract — even though that person took their commission and has been gone two years. And you're like, we can't fix that. Yeah. We can't fix the way organizations are structured. We can't fix org design, because if we could, we'd have a whole different podcast: Fixing Org Design, Fixing Your Bad Orgs. But what you can do to bridge this gap is connect yourself to value for the customer, and overlap that with customer satisfaction. Definitely. Walk in the customer's shoes. It's quite illuminating. Just do that one time and see how much you learn, right? I agree with you. So what about team effectiveness here? We were just talking about customer satisfaction — what about a more internal-facing metric, your team's satisfaction? Team satisfaction is something that I find is largely disregarded. I was going to say — and I've said this on the podcast before — I've been on exactly one team that measured team satisfaction. One team. Yeah, because people say: you're here, you're trusted, you have skills — go to work, basically. And then when you're done, at the end of whatever it is — the day, the sprint — nobody cares how satisfied you are. You get paid, don't you? That's it. Which is terrible. It's really bad, because I believe this actually leads to a lot of team churn. This is something I try to implement as a practice on the teams I coach: team happiness, team effectiveness. And as soon as you say that, they go, ah, but that's such a fickle subject. It doesn't have to be. I'm a big fan of doing this throughout the sprint, as opposed to on a good day only. If you're doing deployments and the deployments are successful, everyone's happy. On the other hand, if they're not so good, it falls back on you — oh, people aren't happy. So there's a reason for that, but that's not what you're looking for. You're looking for how happy each of the team members is throughout the sprint — which, if you're doing two-week sprints, is 10 days of the sprint. So the easiest thing I can suggest, if you're not already doing this or aware of it, is to implement a happiness calendar, right?
The Japanese came up with this, and like most of the things they come up with, it's very useful in everyday life. It's called the Niko Niko calendar. Niko Niko: N-I-K-O, N-I-K-O. And effectively it's just a calendar. You mark the day your sprint starts and the day your sprint ends, and you have each team member post an emoji — or emoticon, if you want — on there: a smiley, a straight face ("meh"), or a frowny. That's it. Limit it to those three, because you don't want the other 50 million emojis. I was going to say, look what I pulled up. Yeah, perfect — look at that, there it is. Again, this isn't necessarily a great example, because you don't learn an awful lot from those wearing sunglasses versus those that are googly-eyed. Just limit it to three icons — that's what I do: smiley, straight, or frowny. And during your retro, you can scan this and say: who's always unhappy, and why is that? Talk about it. Who's always happy, and why is that? Why are most people not happy on a Thursday, for example? These patterns will emerge and you can deal with them. Yeah. Over time you're going to find more people will do this than will even vote on or create stickies for your retro, because it's so easy. It is really easy. I was going to ask: do they make plugins for any ALM tools? I haven't seen that. No? What a valuable thing, though, right? Do it right there in the ALM tool. Yeah — the first time you log in for the day, it just pops up at the bottom of your screen, you click the face, and you're done for the day. Perfect. Because that's all you want: you're just taking the temperature once a day. Obviously, if possible, you'd want to trigger it when they walk into their daily stand-up — you just take the temperature and you're done. But yeah, it doesn't shock me that nobody has developed useful plugins for the ALM tools. No money in it, I guess. Yeah, I guess. I don't know — I would pay, you know, whatever, a dollar per user per month. I would too. You pay way more for other garbage features, like Jira Product Discovery. It's terrible. The cool thing about tracking team member satisfaction is you can look at it over the last sprint, the highs and the lows. If the whole team dips for a couple of days, you can ask what was happening on those days. You can start putting flags at the top, like milestones on a Gantt chart: on this day we were racing to do a deployment that was mandated; on this day we had a system crash and it was all hands on deck, or whatever. Yeah. As a collaboration tool, you could use Miro, etc., because ALM tools don't have this built in — but it would be ideal if they did. Right? And this is fine; it's not ideal, but so what? Everyone can share a Miro board. Mm-hmm. So that's what I tell my Scrum Masters to do. Some do it very well; some just do it because they feel like they have to, because others are doing it. Nonetheless, it's a very useful thing. Last word on the Niko Niko: those of you listening who are saying, "Oh, but I work on a scaled program" — guess what? You still have teams. You can still do it, and you can do it at scale.
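For teams that want the Niko Niko calendar somewhere other than a flip chart or a Miro board, here is a minimal sketch of the idea: one mood per person per sprint day, limited to three values, then scanned for patterns at the retro. The names, moods, and layout are invented for illustration.

```python
from collections import Counter

# Niko Niko calendar sketch: one mood per team member per sprint day,
# limited to three values (smiley, straight face, frowny), as suggested above.
calendar = {
    "Mon": {"Ana": ":)", "Ben": ":|", "Cho": ":)"},
    "Tue": {"Ana": ":)", "Ben": ":(", "Cho": ":|"},
    "Wed": {"Ana": ":(", "Ben": ":(", "Cho": ":("},   # whole-team dip -- what happened Wednesday?
}

# At the retro, scan for patterns: frowns per day, and moods per person.
per_person = Counter()
for day, moods in calendar.items():
    frowns = sum(1 for mood in moods.values() if mood == ":(")
    print(f"{day}: {frowns}/{len(moods)} frowns")
    for person, mood in moods.items():
        per_person[(person, mood)] += 1

print(dict(per_person))   # who is consistently unhappy, and why?
```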
And here's the other thing you can do: those teams that are dependent on the work of your team, or those your team is dependent upon — ask those people to rate your team as a whole. This is where you have to be careful: not Fred or Rita or whoever. No. This is as a team: Team A, Team B, Team C. And then see how that goes. I think that's a cool concept. I would think that larger organizations that, for example, treat their agile coaches as a service organization — they go to the service organization and say, hey, we think our agile practices are dipping and we need an agile coach to sit with our teams for six months to train them back up or whatever — I think it would be cool to have an NPS-type score for the coach, because that coach is performing a service for the team. Absolutely. But where I had thought about this is what we were just talking about: several teams working together in one program. When you have to interact with another team, the teams each have a net promoter score: hey, how was your interaction with the team? How easy was it to collaborate with them? How effective do you think they were at helping you get to a resolution of your problem? You could probably think of a bunch of questions, and then teams would, over time, end up with net promoter scores. That might be cool. I've never seen an organization that does it like that. It might be interesting. I wouldn't do it at a personal level. I definitely wouldn't do it at the personal level either — I agree with you on that one. Yeah, keep it impersonal when you're doing it at that level, for sure. I mean, Scrum Masters can look at this sort of thing at the team level and notice, for example, that one person is more or less always kind of frowny or neutral, right? Sure. It may not have anything to do with their work; maybe they're having some issues on a personal front. So the Scrum Master, in their one-on-ones, could tease that out. Maybe they're always late and missing the stand-up because they have to drop their kid off at school. Okay, it's good to learn that. Then give them a break and say, yeah, just put your sticky on the wall first or last — doesn't matter. We know not to expect you there, as opposed to "why isn't she always here?" or "why isn't he always here?" It's such an easy tool to use. It's also fun. It is really easy. Yeah. Alright, what have we got left? We have bug leakage, which somehow I've driven around. I didn't want to deal with it — that's why we have it left. Oh, I know why I drove around this: because the last time I brought up bugs at a community event, everyone looked at me like I kicked their dog. I was like, just don't have bugs. That's the easy way: if you have a bug, just deal with it, and then you won't have a bug anymore. "Well, what about my velocity?" I was like, let it impact your velocity. Just tell leadership we're fixing bugs. And if they ask you, "Brian, why do you have so many bugs that you have to impact your velocity and fix bugs all the time?", I'll say: you should have that conversation with engineering. I'll get them and we can all talk about it. Exactly. But maybe you're whipping them too hard, you know, just to produce output, right? So quality dips as soon as you do that. Yeah, but again, I'm a big advocate for a sustainable pace at my work.
So this never becomes a problem, and I haven't used this in my work in the last couple of years. I have been in organizations where this is a top-level metric — a top-five thing that we look at whenever leadership meets and we analyze things. I use this, but I don't use it at the team level, meaning how many bugs the team had in the sprint or anything like that — that's just the nature of the work. I use it to track escaped defects, which means the customer is seeing the impact of your bug. Sure. Right? Yeah. And again, not for the purpose of demanding to know why it happened, but for the purpose of looking at the root causes. We go through the five whys and keep asking why this happened. Well, it happened because we released too early. Why did we release too early? Well, because somebody was really pushing us. Why were they pushing you? So, anyway — if you can whittle it down to an actionable root cause, you can nip it in the bud. Maybe it won't happen again, but you have to track it to see. Customers aren't necessarily going to tell you, right? They will simply throw the whole thing back at you and say, this release doesn't work for us. There's a minor bug; something's off on the screen by two pixels. Okay, you might say, well, so what? It's still usable. Hey, you're not paying for it — they are. Right. So I think it's a valid metric, unless you get to the point where your team is so mature that when they look at something like that, they'll hopefully spot it. There are tools now where you can spot these things, if you're using the automation tools, and they will refuse to release it, even if it's such a minor thing — and then the battle isn't between the team and the customer anymore; the battle is between them and middle or upper management saying, "You're late for this delivery." "Well, we have to fix this." "Throw it out there and see if they notice." I've seen that too, which is a terrible attitude. Yeah, right. We put these tools in to catch all this crud, and now you're saying, well, just ignore that, because my bonus depends on us making that release. I might get someone who used to work for me in a different life on a podcast so we can do QA horror stories. It'd be a fun podcast, because I've had this conversation: the QA person catches a defect in the sprint, and the developer acts very defensive about the bug being caught. Then we have a big blow-up about, well, it's not something that was reported by a customer, so we don't know if it's a real scenario, so we're just going to push it out. And the product person on the team is kind of browbeaten by arbitrary deadlines, because they're under the gun as well, and it's just not a good situation. You get a lot of compartmentalization on your team when that happens, right? It's like the boat where there's a hole on one end and there's a guy just bailing out water, and there are two guys at the other end going, hey, look, the hole is on your side — you're all set. Again, I almost skipped this category, because for me, with the state of the teams I work with, this is not really a big thing. We find bugs, we fix the bugs. Well, that's assuming we don't look at the bug together and decide, this is not worth fixing, we don't care about this, I'll look out for it when it happens.
Or we'll add a monitor into the log and add it to a dashboard that lets us know when it happens, and we'll go deal with it then. I'm not worried about it. Yeah, I think it's worthy of putting in some things that you can actually fish out of the logs and put on a BI-type report or whatever, right? Just to escalate that or make it visible, I guess. And then that becomes the basis of the conversation. Okay, so — two categories left. One is metrics around training. Are people getting trained? You know that old training budget that says, oh, we have money for training, but you've got to go figure out how to spend it, and you've got to go figure out what things you want to learn, and then you've got to put in this approval — like, no one ever just gives you money and says, use this money for training. Yeah, you've got to provide justifications and all of that. And if you dare not use the money, you won't get it next year. So yeah, absolutely. That's an easy one to track, isn't it, really? What type of training, how much, how often, how many hours — there are so many things you can do there. But you don't really see much around that. Well, I covered this in a different podcast, and we did talk about categories of roadmap items earlier — I consider training part of the roadmap for my teams. I've kind of taken ownership, as a product manager, of training for my teams, just because I don't know where else it lives. If we truly are distributed — and this is probably a good conversation for a different podcast, distributed versus centralized — but if you truly are in a distributed model of teams, where the team members, all the developers, roll up to leads, and the leads roll up to something else, some sort of hiring manager (other models call these chapter leads or whatever), then who is responsible in the organization for making sure that everyone in a particular discipline is learning new things? That the product managers continue learning and getting new skills, and the same for developers and so on? If you're in a decentralized model where all your people are out on teams, not really part of one central place, it's more difficult to distribute training — to make sure all my developers on all the teams are getting the same training or whatever. So I sort of have taken this onto my roadmap and baked it in. We were talking about tags for technical enablers, roadmap features, stuff like that — training is a category for work items in my roadmap. Because sometimes developers will want to go put their hands on new tools and see if they're applicable, and it makes a lot of sense to put that on the training budget — as far as where you write off the cost goes, the cost of them trying, quote, cool new things. It goes on a training budget, a budget that's going to get expensed, basically. And it's super easy to do if you're putting it on the roadmap: explore this tool, go learn this skill, that kind of stuff. I understand this is probably pretty alien to a lot of people, because a lot of people are in that quagmire of working in a matrix environment or a PMO type of setup, right, where training is something you have to fight for with the PMO. And then they'll look for justification.
Why do your developers need this? Developers actually have it a little easier because, to your point, they need to learn a new environment, a new tool, new this and that — it's easier to get approved. But product, for example — it's hard to justify that product needs new training in analytics or whatever. That becomes hard to justify, and only because those holding the purse strings don't understand it. They don't understand what this training is for, how it helps product, for example. Soft skills often fall into this category. You say, well, we want to train people on emotional intelligence. "What's that for?" Facilitation — yeah, exactly, that's another one. Facilitation. So that is a shame, that you have to explain it to people who don't get it to begin with. And that's, I think, the challenge in a modern distributed organization — again, where you have hiring managers but the team members are on different teams. I would think that your leads — like your development lead, where you have four or five teams and they all share one development lead, or maybe they call it an architect, I don't know what you would call it; same with product managers, where there's a group product manager or something like that — could have something that resembles a roadmap, or at least a kanban board where the swim lanes are their people. Basically a training kanban board — a personal development type of board that says, here's how I'm trying to develop the people I'm responsible for. Yeah, there's a lot to be said for those kinds of information radiators. You're right, yeah. I like the idea of a training kanban. I'm just hesitating here a little, because if you're going to put people in their swim lanes and say you're doing these things, you have to have the budget for it. You have to get that budget approved, first and foremost, right? And then the other side of it: when Fred or Mary or Joe sees 10 hours of training in the next month and they're pressured to deliver something, training is the first thing they're going to neglect. "Because I'm working on stuff. The real stuff," they call it. Well, this is real, because the training you're getting is directly applicable and is needed for what's coming up next. And the time to get that training is now, not when you have that work on your plate. So as long as you can align all of those things, it's a great idea. Yeah, this is probably a topic we can put a pin in for another podcast, because if everyone's distributed all over the place and you never really have a central team to go to and say, "Hey, this week I want everyone to do this," you really have to say: hey, I'm going to make this resource available to you, per person, and in the next three months — you give a general time frame — I'm going to make this money available to you. Here's a selection of courses I think you should take, depending on the person. Some people might need to learn some project management stuff — here's a course for you. Some need to up their facilitation game — here's a course for you. And you kind of lay out some things.
They know the budget's there, and this person does their reviews, so you're leading them toward where they need to shore up some of their weaknesses. You're not saying drop what you're doing; you're giving them a time range, so the person has to manage it into all the day-to-day work they're doing. The person kind of has to help you, the manager, work it in with their team. But you're right — at some point, if the schedule is just crushing and it's an everything-needed-to-be-done-yesterday kind of culture, the manager is going to have to come to the team and say, look, I'm taking my dude out for two days because you guys can't seem to get your act together. But also, to what you were saying just now — it's on the person, for sure. What I've also seen, when this has been tried, is that they just don't give any slack to that person to actually get trained. So then it's like, well, the budget's there, you can do a course, but you have to do it on the weekends. And it's like, hey, I'm out. That's a metric too. That's a great metric: we quote "got the dollars," we got the training budget, and no one's using it — that's a metric too. The final one — and I guarantee you, the one that nobody is using — is tracking meeting duration and effectiveness. Meeting effectiveness. Oh, Alfonso, if you're listening to this or watching, get that app ready, because I've used it. Alfonso, a friend of ours, created a mobile app where you can plug in a ballpark number of dollars per hour, put in how many participants you have in your meeting, and as people are talking about stuff and go off on tangents, you can run stopwatches on it. So you can say, well, I'll start it here — they just went off on a tangent — and a minute or two later you bring everybody back and go, guys, we just wasted this much money with 47 people in the room going off on a tangent. So don't be doing that, right? The other way is to go in and say: there are 47 people, we're going to meet for half an hour, this is the amount of money we're going to burn in 30 minutes — are we getting value at least worth that amount of money? This can be a metric. I think how you use it is important. Initially, just make it visible — just make visible the amount of dollars that are spent in meetings. Do we really need a meeting in the first place? Can we just chat about this in a Teams chat or Google Chat or something? Do we have to have a meeting? I don't know if I like the idea of tracking this, because I can count probably into the double digits the people I've interacted with across my career who have walked into one of the out-of-the-box Scrum events and said, "Wow, there's a lot of people in here. Do we really need them?" Leave us alone. It's not the Scrum events that you need to worry about; it's all the rest of the garbage meetings. So that's a great point. Right — if you're doing the Scrum events well, you should need very, very few garbage meetings outside of them. Look at it that way: don't look at the durations of the Scrum events; look at the garbage meetings. How many are we having, and why? Why can't these be handled through the regular Scrum events? Now, if that's the metric we're tracking, I'm all for it. I would say meeting duration and effectiveness of non-Scrum-related meetings. That's a great clarification. I agree with that. Actually, I'm down to track it.
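A quick sketch of the meeting-cost arithmetic described above: participants times a ballpark hourly rate times duration. The 47 people and 30 minutes mirror the example in the conversation; the hourly rate is an assumption for illustration.

```python
# Rough cost of a meeting: headcount * ballpark hourly rate * duration.
participants = 47
hourly_rate = 100          # assumed dollars per person-hour, ballpark only
duration_hours = 0.5       # the half-hour meeting from the example

meeting_cost = participants * hourly_rate * duration_hours
print(f"This 30-minute meeting burns about ${meeting_cost:,.0f}")

# Same math for a tangent: 47 people drifting for 2 minutes.
tangent_cost = participants * hourly_rate * (2 / 60)
print(f"A 2-minute tangent costs roughly ${tangent_cost:,.0f}")
```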
Some people just love to be in meetings. And let me pivot it by meeting organizer, so I can see who is taking my team members' time. Yep. And cross-charge your team members' time to them. That's exactly right. Because again, as we learned, there's no time at Brian and Om's software development company that is free. Everything is charged to a line item, so that we can extract maximum dollars for minimal work. Sneezes are free, but only one per day. Oh my goodness. If you're still with us and you're interviewing for a product manager position: when I ask you what metrics you might look at to know if an Agile team is operating effectively, or working well together, hopefully you now have a better response than velocity or utilization — hopes and dreams. Yeah, don't say utilization. I'll kick you out. All right, folks. Thank you for staying with us. Like and subscribe to our podcast, and let us know down below in the comments what else you'd like us to talk about. Maximum utilization. Max Headroom.