Recently, a small audience was shocked to learn that Product Manager Brian Orlando has no team members in the permanent role of QA (or Testing).
On this episode, Enterprise Agility Coach Om Patel and Product Manager Brian Orlando discuss why the audience reacted like Brian just kicked their dog, and what it takes to get a hypothetical team to the same point.
0:00 Topic Intro: We Have No QA
0:26 The Backstory
1:46 It's What We've Always Done
2:43 Handoff Dysfunctions
4:55 Efficiency
7:27 Brian's Programming Story
8:52 Unpacking Brian's Tirade
10:00 What Are Testers Doing?
12:21 Real Example of Collaboration
14:49 QA and Customers
16:57 Process Improvement
19:18 Customer Verification
22:14 Cross-Functionality vs. Penny-Pinching
25:19 Team Collaboration vs Silos
27:02 Brian's Rant on Silos
30:47 Summary & Key Takeaways
33:10 Wrap-Up
= = = = = = = = = = = =
Watch it on YouTube
Please Subscribe to our YouTube Channel:
https://www.youtube.com/@arguingagile
= = = = = = = = = = = =
Apple Podcasts:
https://podcasts.apple.com/us/podcast/agile-podcast/id1568557596
Spotify:
https://open.spotify.com/show/362QvYORmtZRKAeTAE57v3
Amazon Music:
https://music.amazon.com/podcasts/ee3506fc-38f2-46d1-a301-79681c55ed82/Agile-Podcast
= = = = = = = = = = = =
We had a conversation at a networking event recently where I said that I don't have dedicated QA people at my company. And boy, it looked like I kicked someone's dog in front of everyone. It did not go over well, and we're here to talk about why I don't think it went over well. The setup was this: I told people I don't have dedicated QA team members, and I don't do a dedicated testing phase. The developers on my team are fairly senior. They do the full stack of development, and they're in total control from the very beginning of the requirements, talking to the customers, all the way through to deployment and maintenance of the applications. The team is completely cross-functional and has all the skills needed to work on everything I want to work on. And we don't have a backlog of bugs, because when we find a bug, we make a decision: either we fix it, and if we want to fix it, it gets fixed immediately, or it gets deferred, meaning either we choose to work on it as part of a larger enhancement in the future, or we mark it as a known issue and it goes away. So that sounds to me like you have a dream team right there. You have all the ingredients on your team: the experience, the willingness, the ownership of the work, the quality of the work, right? That was the attitude of the group the other night as well. A "dream team," but not a dream team meaning a collection of the best players working under the best processes; a "dream" as in, you've got to be dreaming to have this going on. That was the way it was perceived. And I've got to admit, I was a bit taken aback that a group focused on agile software would have that understanding. I think it's really what people know today, right? They've always had developers and testers on their teams.
So the paradigm is that developers don't test their own stuff? Not necessarily. I'm not talking about a single developer testing their own code; that's not what I'm saying. I'm saying the dev team tests the output of their work. And if there are any defects, they own them, and collectively they discuss them, presumably with the product folks, and decide whether to fix them or not. If it warrants fixing, do it now. If it doesn't, put it in the backlog. But the split model is what people are familiar with, what they're accustomed to, right? "You have to have QA. What do you mean you don't have QA?" I get that. Out of everything we talked about, most people's experience is with QA being a separate team, with things getting handed off to the QA team, and then a QA phase while the QA team does some QA things. Yeah, some QA magic. So you mentioned a couple of words there that I believe are at the crux of all this: the handoff. If a team is brought up to work in a mode where developers develop and testers test, you're going to have a mini-waterfall inside your sprint, knowingly or unknowingly. And some teams that aren't so good inevitably end up saying: well, development's complete, only testing's left. Right? Only. Only testing's left. That's a game I remember being played a lot: developers working up to the last day of the sprint, then throwing their hands in the air and taking a victory lap. We're all done! All done! If only these QA people could get things done. Because the QA people are consistently the bottleneck. They consistently have stuff held up at the end of the sprint, and then, you know, here come the end-of-year reviews. This is what we heard.
"Oh, QA is always the bottleneck." Development works up to the last hour of the sprint, throws it over the fence, and whenever the buzzer rings, whoever's holding the hot potato, they're out. Yeah, you're out. So, in the QA people's defense, they're going to say: you didn't throw things over the wall at us until the last day of the sprint, so that left us no time. How many times have we seen that kind of behavior? I've seen it a lot in teams that aren't quite as mature, that are just formed or whatever. So yes, of course, people blame the QA folks, directly or indirectly: well, development was complete, QA is lagging, and maybe that's because they don't know what they're doing, et cetera. So coming back to the original premise: if you have this mentality of bolting testing and quality on at the end, you're going to have this issue. On the other hand, what you're describing is people taking ownership of the product increment they're creating. And this isn't just development. I think it ultimately comes down to the team's definition of done, not the developer's definition of done. So when do we complete things? Making things complete means baking the cake, decorating it, and making sure it's cooled down enough that people can consume it, right? Oh, I know something else that came up in that conversation. It just came back to me like a vivid dream: if your development team also has to do testing, they'll be less efficient developers. I remember that coming up. And I also remember not dealing with any of these points very well, because I remember saying: what else are your development team members going to do?
Are they not going to do any unit testing? Not going to do any exception handling, no try/catch when interacting with databases and other data sources? Are they just going to assume everything is up all the time? No, they're not going to do that. They're going to handle exceptions, they're going to log things; there are coding standards they already follow, you just may not know about them. So what's the problem with changing your coding practices to integrate tests directly into the code? And before you respond to that point, I want to give you a whole unrelated point to respond to as well, which I probably should have separated out: I've also seen situations where QA people don't have the time it takes to spin up as developers. So they come up with the tests they want automated, and they ask the developer, while they're writing the code, to also develop the automation in line with the code. Hey, hook this into your code as you're writing it. So it's like they're architecting the tests, which to me, quite honestly, is just another requirement. But I agree. So let's take both of those, in that sequence. Developers that just write code, and that's all they do, maybe with a little bit of that try/catch stuff — okay, if that's all they're doing, then I'm afraid you're going to be in the situation we started the podcast with, which is that QA has a heavy burden to carry. If, on the other hand, a developer starts not by writing code first, but by writing a test first — a test that fails, because the code doesn't exist yet — and then writing just the little snippet of code that makes that test pass. Just that little piece, right?
And once that happens, they can move on to the next piece, and the next. If they do that, they've really covered their bases quite well. Of course, everybody has heard about TDD, and that's pretty much where I'm heading with this: test-driven development, so write the test first, and then the tests drive the development, which is what TDD is. Yeah, but in order to do that — and I might be way off base here — when I write something new and I go to interact with other systems, I can't be sure they're going to hand me back valid variables or a valid response. Something that's not garbage, basically. I can't always assume the transaction is going to be successful. Okay? And when I'm writing it for the first time, I'm very diligent about what I'm doing. It's when I come back and punch up the functionality later. Oh, this interacts with this one system, but maybe I'm going to add this new flow where it goes and looks at some other system first to get some additional field or whatever. And, oh, did I get a new subscriber, or are they already in my database? Now I've got to check my database to see if they're already in there before I add them, and if they are, I send them to a different web page that says: hey, you're already here, did you forget your password? Are you lost? Did you forget your keys? Now I have a different path to map. So when I'm doing the enhancement — again, this is just me personally, when I'm programming — I'm more apt to forget about all the exception work I did when it was new. Because a lot of times when I'm starting from scratch, I'm sketching out a little flowchart on a little whiteboard — I'm going to go here, then here — just to help me keep the logic clear in my head.
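The TDD loop described above — write a failing test, then just enough code to make it pass — can be sketched like this. A minimal illustration only: the `SubscriberRegistry` class and its duplicate-subscriber rule are hypothetical, loosely based on the "is this a new subscriber?" example from the conversation.

```python
# Hypothetical system under test: an in-memory subscriber registry.
class SubscriberRegistry:
    def __init__(self):
        self._emails = set()

    def add(self, email):
        """Return True if added; False if the subscriber already exists."""
        if email in self._emails:
            return False  # duplicate: caller can redirect to a login page
        self._emails.add(email)
        return True


# In TDD, these tests are written FIRST. They fail while the class above
# doesn't exist yet; then just enough code is written to make them pass,
# and you move on to the next little piece.
def test_new_subscriber_is_added():
    assert SubscriberRegistry().add("pat@example.com") is True

def test_duplicate_subscriber_is_rejected():
    registry = SubscriberRegistry()
    registry.add("pat@example.com")
    assert registry.add("pat@example.com") is False


test_new_subscriber_is_added()
test_duplicate_subscriber_is_rejected()
```

The duplicate-subscriber test is exactly the kind of boundary condition that tends to get skipped when the check is bolted on during a later enhancement rather than driven by a test up front.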
But if I don't do that same process again when I'm bolting pieces onto the system, as the system gets more complicated, there's a chance of injecting issues because I'm not following those standards. Yeah, so let's unpack that, because I think you've got three or four points in there. I think you're right: if you're starting from scratch, it's a blank slate, and you're more apt at that point to map things out, especially the boundary conditions — boundary meaning you're going out to another subsystem and coming back, and you might look at the quality and format of the response you got back. But going back in and adding to it, it's very easy for us to forget about that. And really, the addition is also brand new, because you're now looking at another subsystem, so why wouldn't you go through the same exercise? But human nature: you don't. And where it gets really muddy is, you talked about going in as the original developer. What if another developer picks that up? Now the problem is much more likely to rear its head, because they don't necessarily go back and look at what the other person did. They just say: we're going to assume everything works so far, I'm just adding this little piece. It's much more likely there that they don't do the due diligence. So if you're a developer picking up somebody else's code, remember that. That's one thread I wanted to pull apart. The other thread I wanted to pull on is that you mentioned QA working with developers. Sometimes what I hear from people is: what do the testers do the first few days of the sprint, maybe the first week of the sprint? They're just sitting there twiddling their thumbs. If you're asking that question, I'm afraid you've got it all wrong.
It's been so long since I've heard somebody say that unironically. It's been so long since I've been in an organization that misunderstood this shift-left concept at such a fundamental level that I don't even consider anymore that I need to address the point of: what are you doing the first couple of days of the sprint? I guess you're doing nothing; maybe you should go work on some other project, or go find some documentation, or whatever — go find some busywork, basically. That's exactly what happens with people who don't understand it. I see this all the time, by the way. People say: well, you don't have anything to test until the second week of the sprint, so why don't you go work with that team over there? That happens all the time, because the focus is really on utilization, not on improving the process. If you're doing this in a way that's effective — let me put it that way — your QA people will be working together with your developers from day one. We hear the term pair programming and we think of two developers looking over each other's shoulders; it doesn't have to be two developers. Quality here is an activity, not a role. So if you say: well, you're QA, and I've got nothing ready yet, I'm a developer, I'm just starting, I know you want to pair program with me, but hold back, I'll let you know when I've got something for you — yeah, that's not as good. Quality people should be working together with developers from the get-go. And — this is the fourth thing now — this is where the quality folks can say: hey, can you put this little piece in there as well? And that's going to hook into their automation tests, right?
So if you do that from day one, what happens when that little piece of code is finished — let's say it's a user story or whatever PBI? The tester is already good to go with a test case, a test scenario, whatever else. They're not crafting one from scratch, which means you're saving time, and that time compression magnifies across all of your user stories in the sprint. Think about that for a minute. Also, think about what gets placed in the logging. Usually development has logging standards that the developers agree on among themselves, sort of like a technical standard. But boy, it would be a lot better with a QA person sitting over the developer's shoulder saying: hey, I see in this transaction you have these variables coming back — can you just dump them all into the log as a regular practice? It'd be a lot easier for me to see all of those items in a log. Whereas I've seen logs with single messages like "transaction successful." Well, what does that mean? I need to know what you passed, and what came back. I need to see it all in the log. Anyway, I'm thinking of something specific. I think you're really touching on the coding standards that should be in place, and those coding standards should not be developed only by developers — by all means, engage everybody on your team. Well, if we're going to talk about shift-left — shift-left meaning do the testing during active software development, and I hate these coined phrases because they sound so ridiculous — I'm talking about shifting left all the way back to talking to the customers. I think of all the QA people who intercept bugs through customer support. You walk into the customer support fishbowl, or the call center with no windows — I don't know what you call that.
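The logging point above — record what you passed and what came back, not just "transaction successful" — combined with the earlier try/except point about not trusting external systems, might look something like this sketch. The `call_billing_service` function, logger name, and field names are all made up for illustration; in a real codebase these would come from your team's agreed logging standard.

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("orders")


def call_billing_service(order_id):
    # Stand-in for a real network call to another subsystem; in reality
    # this could time out, return garbage, or raise.
    return {"order_id": order_id, "status": "charged", "amount_cents": 1299}


def charge_order(order_id):
    request = {"order_id": order_id}
    try:
        response = call_billing_service(order_id)
    except Exception:
        # Log the request that failed, not just "transaction failed":
        # logger.exception also captures the stack trace.
        log.exception("billing call failed: request=%s", json.dumps(request))
        raise
    # Dump the variables QA asked for: what we sent AND what came back.
    log.info("billing call ok: request=%s response=%s",
             json.dumps(request), json.dumps(response))
    return response


charge_order("A-1001")
```

With log lines shaped like this, a tester reading the logs can verify a transaction end to end without asking the developer what "successful" meant.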
Usually when you walk into customer support, you're trying to help those folks, especially if you partner with them on a regular basis, where they're your stakeholders and you're talking about new features, helping them fix bugs, and showing them how bugs got resolved. That's a great place for your QA people to engage as well. So if your QA people talk to customer support more often than your developers do — and we're assuming departments talk to other departments at this point — I know, I know. But that's also a great point: when your QA people are sitting with your developer while the code is being written, they now have insight that bridges not only how the testing should be done, but also how customers use the software, because they've sat with customer support. Or maybe they are customer support, because your company is small. Or, even better, your QA people sit under the product office. Now there's a whole other level of quality they can provide by making sure your solution fits the need — and a million other things that could derail this podcast at a moment's notice. I think if your QA people have a purview into how your customers are actually using the product, you're way ahead of the game, because it isn't simply mechanically looking for code correctness. That's not what we're talking about. We're talking about fitness for purpose of the software. The software is developed not to be mechanically correct; it is developed to satisfy a customer need. So here's where a bunch of QA people, if they're listening to the customers, can, to your point, really help bridge that gap. Sure, absolutely.
And they can even feed back to developers and say: yeah, it's correct, there's nothing wrong — however, that's not how they use it. Well, the other kind of wild part about this to me — and this might sound really crazy — is that nobody would balk if I said: well, we don't have BAs. But think about it: in this example where I'm asking my developers to be QA people, to take on the tester role as well, I also don't have any dedicated BAs. The development team and I share the role of being a BA. Sometimes I will write, quote, requirements, from what I hear customers and other users of the system want. Sometimes the development team will come to me and say: hey, we see the need for this, we're going to write a story about it and bring it to the next refinement. And I say: sounds good to me. So we don't have QA, but we don't have BAs either. Who does that job? Well, we all do. In other places I've worked, where we did have the benefit of QA, the QA people ended up naturally becoming the business analysts in the equation. They vet requirements; they talk to customers, to customer support, to development, to product. They go between all of them and make sure that quality is considered end to end — that basically becomes their job. So they're sort of doing QA, sort of doing development, sort of doing testing, sort of doing the BA job. But in order to ensure quality, you've got to do all those jobs, or at least interact with all of them, you know what I mean? Yeah, I do. So to me, that's the difference between QC and QA. QC folks are simply checking for correctness. And what you're describing is building in quality from the beginning.
I mean, that's got to be the way to go. Well, to really piss people off — to really exacerbate the situation — if they can do what they're doing for the customer, the requirements, the testing, basically the coding and the delivery and all that: boy, think about what they can do when you also inject process and continuous improvement of your work methods into this cycle. Hey, if there's inefficient communication between teams, or inefficient work methods being employed, bring that up and let's deal with it. QA people: go study Kanban, go study different software development methodologies, and look at the work processes the teams are engaged in. What I'm saying is, if you're going to be in the role of quality assurance and you want to assure quality — quality doesn't stop at one phase. You draw a line around everything, from the very first moment the idea of a requirement pops into someone's brain, all the way to when it's delivered to a customer, the customer uses it and gives you feedback on whether they like it or not, and beyond. You have to draw a circle around all of that, which is a very, very expensive circle, and your QA people can't be afraid of getting blocked in any one place in it. And if you build organizational blocks to keep them out of any one place, now you're impacting quality: you're building artificial dams that stop the quality at some point. Yeah, exactly. That's correct. You're basically promoting siloism at some point, and that's not a good thing. So oftentimes it's perception, right? People think of quality as somebody who inspects it at the end.
We're talking about a paradigm where quality isn't inspected for at any point, because you don't have to inspect for it if you build it in, right? So it runs, like you said, the full gamut: even after delivery of the deliverable to the customer, they still watch it for a while. They work with the customer to figure out how it could have been made better. I'll tell you right now, I'll go out on a limb right here — and actually, this is a question I should bring to a future networking event near you. In the columns on your development board, where you have all these columns I've seen over time: to do, development done, begin QA, in the staging environment, in the test environment, done — all these little games I've seen people play, waiting for the other team to throw it over the fence to some environment or whatever. I call them all wastes, but anyway. When you have all those columns — boy, I've seen it before, on exactly one team — a column that says "verifying with the user," and it comes after the done column. Done, meaning development is finished, and now product, when they do the product demo, does a period of follow-up with the customer afterwards to make sure that what was just pushed into production meets the customer's need and the customer is satisfied with it. There's a person-to-person — not asynchronous, not send-it-through-the-mail, write-your-congressman, carrier-pigeon — a face-to-face conversation: what do you like about this feature, where do you think we went wrong, what do you think we can improve? And I've very rarely seen people add that column after the done column, because done-done means the development team washes their hands of it and rides off into the sunset to deal with the next fire.
That's because they're all judged on, quote, efficiency. If we build a million features for our customers and the customers tell us all these features are unusable, they're all crap, and this wasn't my original problem — oh boy, have we missed. Bigly. Absolutely, we missed bigly. And I'm going to use another one we use all the time: keep that resume updated, because you're now circling the drain as a company, let alone your own unit. Absolutely, couldn't agree more. By the way, you're lucky you've seen one team do that; I've not seen any team do it. I have seen teams pay lip service to it: they put people in roles with titles like customer success, and then they'll just contact the customer once in a while. Yeah, exactly. So I think you nailed it when you said face to face. Versus: send an email saying, on a scale of 1 to 10, please rate us on whatever the survey is — SurveyMonkey, whatever it is. Monkey survey is better. Chimp survey. Just: how do you like it? You loved it, or you thought it was great, versus it's okay. There is never a point on the scale for "it sucked bigly." They leave that out on purpose so they don't get the feedback; they only get the feedback they're seeking. I've seen that huge miss, by the way. Huge miss. You're just kidding yourself until another company comes along who really does it right, and then, yep, you're toast. We talked a little about pair programming. And we touched on cross-functionality, but we didn't dig into it. I think one of the reasons — I told the group, well, most of my developers are not junior. I don't think I have junior developers, honestly.
The majority of my team are senior developers, even if their job title isn't senior developer; they're senior full-stack developers, and obviously the company pays for senior full-stack developers. But if you have a staff of junior developers, or you're going to staffing firms and getting warm bodies off the street, or you're offshoring and cutting your offshore firm every six months because you're constantly in search of a cheaper solution, because you've got to save a couple of cents or whatever — I could see why this is perceived as crazy. If that's your team — my team is all junior people, they're not cross-functional, the developers wouldn't know how to test even if you sat them down and tried, that's just not the world in which we live — I totally get it. I totally would understand. And I would say: well, maybe what I'm doing is not for you, given the current way your team is set up. I think the same thing if churn on your team is out of control — you're losing people every other month, for whatever reason. Same problem. If you can't keep a solid team, you can't do what I'm doing. But ultimately, both of those scenarios come down to one thing, and that is economics. You're trying to save money... and you get what you pay for. That's really what it boils down to. Right? You're going to save some money because this other offshore vendor bid a few dollars less per hour for their resources, so you go with them, and you're saving money. And if your company prizes what you're doing — saving them money — and you get promoted, then fine. But you know that's exactly what happens in that scenario.
You're going to have a situation that leads to customers not being happy in the end, because the delivery isn't really up to their expectations. And I would say that whole customer-validation bit after the fact — bringing the customer in on the first step and the last step, so the customer is always involved — that problem gets out of control when you have offshore teams, especially offshore teams of junior team members. You've just doubled down, because now your company is probably saying: oh, well, you just get with the architect, or with the lead developers here in-house, onshore or whatever, and you guys write all the stories and send them over to the development team to do. We'll do all the requirements over here in a corner and then distribute them to the code monkeys. Yeah, right. And the customers are nowhere to be found. Right. Yes. I've seen that way too often. And people say: well, this is what they need. How do you know that's what they need? Well, we know what they need. No, you don't know what they need, because you haven't spoken to them. The old "we know" of product management. Yes, exactly. I want to bring us back to this whole topic of testers. Here's another thing I see all the time: people saying testers are testers, they don't write code; developers don't test, they develop. Testers are writing scripts and stuff, but these guys are writing the actual code. So people think to themselves: what are these guys even doing? This is shooting rockets into space versus actually landing on the moon. It's not like that. The technology they're using is way more similar than it is different. With some of these automation tools, it isn't beyond the realm of developers to learn them. So I think where I'm going with this is:
the testers among you should be open — embrace your developers and say, these are the tools we use, let me show you how we use them, and they can use them too. We use it, not us and them. Yeah, right. If you do that, you have a good chance of a developer saying: well, let me write this, and then I might actually trigger your test scenario immediately through the pipeline or whatever. So there is that, but I don't see it very often. Unfortunately, I see siloism all over the place, largely for budgeting reasons: they say developers are budgeted this way and testers are budgeted that way. And you mentioned earlier the ideal scenario would be that they fall under product — testers at least, maybe even developers. Why not? You're developing a product, not a project, at the end of the day. Again, I haven't seen that all that often. Well, seeing the roadblocks of who people should work for and report through — I'm going to pull the steering wheel and go around those roadblocks, because I want to talk about control. Control over — I don't know if control is the right word, but I don't think you can reach the level of quality you have to reach in order to work the way I'm working now without having total control, start to finish. Because my start-to-finish does not start when somebody hands me requirements that are already fleshed out, and then I just do some development and send it over to another team that deploys it, and then another team does customer support, and another team does whatever. That's not the case. The case is: I talk to customers, and I decide what they need together with them and the development team, everyone in the room. We control all the environments we send code out to and release to. And using CI/CD, we're continually putting things into production, little bits at a time, right?
And the testing is done and wrapped inside of that process. Now compare that to the alternative: the continuous discovery is done by a product team and, like, a lead engineer or something who doesn't even work on my team. Maybe they're my boss's boss or whatever. And they go off in a corner and decide what's going to be done. They write a PRD, a product requirements document. Well, they hand it to a BA to write, or maybe they involve somebody else to write it. And then they hand that down to the team with a list, basically, well, several lists, honestly, of things that need to be done, with diagrams, all architected out, and the creativity is minimized and the beatings will continue until the work is done. And then when you code things, you can play in your little development sandbox, but the minute you go to another environment past that, you've got to get another team involved, and you've got to wait on them. Yeah. So with all these handoffs, with all of these silos, no matter how wide the silo is, it's still a silo. With all these silos happening, I don't think it would be possible to do what I'm doing with even one of them. That's my challenge. Even one of these silos. Maybe one. Maybe, just pick one. But also, the minute you inject one, I'm going to look at that one and say, really? Why that one? Yeah. Yeah. One's one too many in this case, I feel like. Just pick your favorite silo and say, this silo is really important to be silo-fied. Silo-fied? Is that a word? Ooh, I like it. Silo-fied. Why inject even one? So if we agree that the whole point of us working this way, in the agile ways of working, is to deliver customer value as soon as we can, if we broadly agree on that, then every one of these handoffs works against us. Yes. Right. So in the lean way of thinking, each one of these handoffs is a waste, right? Technically.
Now, how can you accelerate value to the customer? Eliminate waste. And what better way to eliminate waste than to go after the low-hanging fruit, right? We've already come across a few of them in this podcast, but there are some others we've only glanced at. Like, the developers are done, and let's say they work with QA and they're both working together. Or maybe you have no QA and the developers test the stuff, but they can't actually deploy it, because they have to wait on a DevOps team, which is a matrixed team somewhere. Hand it off to them. Another handoff. And the DevOps team is very, very willing to help you, but they've got a backlog they have to work through, and so your stuff's going to happen three days from now. Oh, another handoff. Another waste, right? If you have the skill sets on your team, though, to deploy that increment to production, then you could get it to the customer quicker. There are so many instances where I've seen those things, and I point them out and say, this is a waste, this is a waste, this is a waste. And the counterargument is always, yeah, but that's their job. That's their job. It's us-and-them-ism again and again and again. I guess for people who have made it all the way through this podcast, both of you, thank you. Thank you, both of you. And I hope by now you've detected what we're not saying. I don't know what click-baity title I'll put on this podcast, but I'm not saying don't have QA people. I'm saying my team has the testing skill set, and everybody on the team is looking out to make sure that quality is built in at every single step along the way, throughout the whole life cycle of the software.
And because we have discipline, and because we have basically advanced practices, advanced team members, what I think of as advanced, I mean, we could still be doing a lot more. We could be doing a lot more automated testing. We could be a lot more diligent about the QA metrics we're pulling; anybody can pull QA metrics, you know what I mean? Sure. There's a lot more we could be doing, like writing actual TDD. I could be putting it into the requirements so that it flows through the rest of the process. There's a lot of stuff we could be doing. Do we need to do it? Well, we have no bugs, and we don't have any big problems causing issues with regard to product or with regard to technology. So do we need to do it? For my application, probably not. No. For other applications, maybe. You know, if you work in the medical field, if you work on something with more scrutiny, maybe you'd have to go deeper and implement more things. But anyway, it's all about impact at the end of the day, right? It certainly is, as long as you're inspecting what's happening and keeping an eye on the total quality of the entire system, basically, not just one single piece. Then maybe you're headed in the right direction. So hopefully by the end of this podcast people have understood that, you know, again, my background is in QA, so that's why I have a lot to say on this topic. Sure. We're not saying don't have testers. It's quite the opposite, actually. It's: take the time so that everybody on your team has a testing skill set, right? Which is the more expensive option, actually. So it was interesting that people immediately pushed back on me when I brought this up, because what they didn't realize is, oh, you're actually spending more money ensuring that your product has quality than we are over here with a dedicated QA team, where we throw stuff over the fence and, you know what I mean, we have all these silos.
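The "actual TDD" mentioned above, where the test is written first, straight from the requirement, and the code is written to make it pass, can be sketched in miniature. The requirement used here is the bug policy described at the top of the episode (fix it immediately, or mark it a known issue); the function and class names are hypothetical illustrations, not from any real codebase:

```python
# Hypothetical TDD sketch. The test below is written FIRST, directly
# from the stated requirement ("when we find a bug, we either fix it
# immediately or mark it as a known issue"). It fails until triage()
# exists; then we write just enough code to make it pass.

from dataclasses import dataclass


@dataclass
class Bug:
    description: str
    worth_fixing: bool


def triage(bug: Bug) -> str:
    """Minimal implementation written after the test: a bug is either
    fixed immediately or recorded as a known issue, never backlogged."""
    return "fix immediately" if bug.worth_fixing else "known issue"


def test_triage_matches_requirement() -> None:
    # The requirement as an executable test, written before triage().
    assert triage(Bug("crash on save", worth_fixing=True)) == "fix immediately"
    assert triage(Bug("cosmetic glitch", worth_fixing=False)) == "known issue"


if __name__ == "__main__":
    test_triage_matches_requirement()
    print("requirement verified")
```

This is the sense in which a test can "flow through from the requirements": the requirement, the test, and the code are the same statement in three forms, so there is no separate phase where a different team rediscovers what the software was supposed to do.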
But then the quality is subpar at that point. So, yeah, absolutely. So that's this podcast: don't do local optimization is basically what it boils down to. Right. Look at optimizing the whole. Yeah. Take a systems view. Yes. A systems view, a holistic view. Well, if you'd like to take a systems view, like and subscribe. And also let us know what topics you would like us to talk about. We have a form on the website, so just navigate over to arguingagile.com and let us know. And that's a wrap.

