
Definitely, Maybe Agile
There Are No Safe Bets in Business Anymore
In this episode, Dave and Peter explore why "safe bets" in business can be the riskiest moves today. They unpack the shift from long-term plans to fast, testable experiments, and why companies must embrace uncertainty to stay competitive. Topics include digital transformation pitfalls, cultural resistance to change, and the importance of alignment over tech.
Key takeaways:
- Shorter cycles are essential
- There are no safe bets anymore
- Alignment matters more than technology
For more insights, visit definitelymaybeagile.com, subscribe, and follow us on social media. Questions or feedback? Reach out at feedback@definitelymaybeagile.com.
Peter [0:04]: Welcome to Definitely Maybe Agile, the podcast where Peter Maddison and David Sharrock discuss the complexities of adopting new ways of working at scale. Hello Dave, how are you today?
Dave [0:14]: We're doing very well, doing very well. I'm enjoying having got back from Toronto, enjoying that the weather's just a little cooler and that you inherited some rain, if I'm honest.
Peter [0:24]: I'm actually kind of glad for it, because it's been very, very dry.
Dave [0:29]: So what are we talking about today? Taking safe bets? I think this is a really... it's a bit of an oxymoron, isn't it?
Peter [0:34]: Yeah, it's one of these things where, as we often see in organizations, they take a certain approach to problems because it makes it feel like it's going to be a safe bet. Like this is the... we just do this, it's going to be the easy way to do it, it's going to be safe, we're not going to have any problems, we're not going to lose, it's all going to work, we're going to win.
Dave [0:53]: Well, and I think this has so much to do with our very human comfort level with stability: predictable, safe solutions, ways forward, whatever it might be. And we've briefly talked about the explore versus exploit side of things around complex problems, knowing when to go and explore, try things out and find new opportunities, versus taking the opportunities we have in front of us, investing in those and maximizing the return we can get from them. And it feels like there's a sea change away from the exploit side of the equation, much more into the explore, driven by the pace of change. Technology doing what it's doing, geopolitics doing what it's doing... A lot of different things are driving towards the need to explore much more than there is an opportunity just to sit and reap the benefits of the work we've done in the past.
Peter [1:52]: Well, and I think technology has had a big impact on it too. We've now got capabilities we didn't have before that allow us to rapidly create prototypes and experiments that we can put in front of customers, and lots of things like that. We just didn't have the capability to do them as easily as we do now, and I think that has something to do with it too, because exploration becomes much easier when you've got the tools to explore with. So that's, I think, definitely a part of it as well.
Dave [2:24]: Exactly, and to your point: if we have better tools at our fingertips and the environment is shifting, then exploring is more required, let's say, or more beneficial. It's something that's very valuable. However, we don't necessarily have the mental muscle to explore, because it means experiments, it means trying things out and, crucially, it means not everything will work. And this is where that whole safe bet conversation comes in, because so many organizations, so many executives, either interpret things as safe bets (because there's no risk, or very little risk, in this particular direction, we should chase it) or they only feel comfortable if there is a bet and it's a safe bet and that's the way we should go. There's that lack of comfort with uncertainty.
Peter [3:15]: Yeah, and I wonder as well about... Technology has been learning how to experiment for the last couple of decades, with agile methodology bringing in concepts that we know work very well in that exploration space. But quite often, from the perspective of a business delivering a service to a customer, the business side can still be very nervous about the customer, be it advisors or mortgage brokers or whoever might be the recipient of the software you're developing: "Well, we don't want to change everything. If it isn't perfect before we give it to them, then I'm going to have to deal with all the consequences of that."
Dave [3:56]: What if we upset our brokers? What if we upset our customers with something? And the headache with it is... saying it's a solved problem is probably a bit cheeky, but there are certainly some really great practices that allow us to segment that audience, to test with a subset, to limit the potential disruption from annoying a group of brokers or a group of users or whatever it is. So we can start looking at that in a very careful way. However, what we often see is, either on the business side or even on the IT delivery side, there isn't the ability to go and build that continuously. It's a one-off thing, or they'll do it occasionally, but not for everything.
Peter [4:39]: Yeah, and that is a common piece I see: "Well, it's going to take us 12 months to roll out this new version of the portal, and it has to be 12 months because it's going to take us that long to make every one of the changes, do all the testing, validate it works, and it has to be perfect before we give it to the client." Versus, as you were saying, one way to do it is to say, "Okay, we'll find a subset of the customer base, the very engaged customers we can work with very closely. We can introduce an early prototype to them, learn how it should work, and use what we see to shape future growth, or test it with them first," so that we're not waiting 12 months to find out that, actually, it's not what they wanted at all.
Dave [5:40]: Yeah, I mean, if I can pull the conversation back a tad towards how to evaluate experiments and things like that, because that adds to what you're saying... There are a couple of things that jump to mind. One is that the culture you're working in really has to be comfortable with seeing an experiment go wrong, and I think when we talk about safe bets there's a mindset of "we only want safe bets." Another way of saying that is "we only want things to go successfully," and the reality is things don't go successfully all the time, but we need to understand how an organization responds, or what its antibodies are, when something goes wrong.
Peter [6:09]: Yes, yeah, and I think that is a key part of it too, is like, are we really going to be accepting of something that doesn't go well? There's also... like one of the pieces I've seen frequently is the customer segmentation isn't the only way that you can potentially divide up what it is you're going to change about the solution. It partly depends whether there's a solution out there already, but you can start to think about it as "what are the biggest pain points of the customer? What are the things I should be solving for? How do I start to prioritize those? How else might I be able to slice this very large piece of work into smaller pieces based on risks or features or capabilities or other things that I can slowly introduce to the customer, without having to wait for absolutely everything to be finished before I do something?"
Dave [6:57]: And that absolutely makes sense, but it comes... What I've seen a lot is "we're replacing this system, we need all the same functionality in the new system," or another one that I'm coming across quite a lot right now, which is "we're trying to catch up with, or get ahead of, our competitors, so we want feature parity with our competitors." And that's almost considered a safe bet, because if it's good for our competitors, it must be good for our customers. That's a weird investment strategy to take, I think, because what we really want to ask is "what is it we bring to the table that our competitors don't? How do we differentiate?" So, instead of wasting time building like for like with our competitors, so that now we can't differentiate, we should understand who our customer, our target audience, is and how we make their life better.
Peter [8:03]: Yeah, and they could end up diluting their value offering, because they end up chasing after the wrong thing. Your competitor's customer may not look like your customer, at least not exactly. So you've really got to truly understand your customer and that persona to know like, "what are we building? Who are we building this for?" Now, if it turns out when you look at your customer that you've got exactly the same customer, then maybe you want to build some similar features. But there's also the "well, is there something we can do to differentiate? Is there something that that customer might want more than what our competitor thinks they want?"
Dave [8:30]: I think it's really... I kind of love squirreling around this particular topic, because so many organizations right now are chasing that shift to digital customers, digital servicing, that whole digital-first experience, and so much of it is taking what's effectively an offline experience and turning it into a digital experience. But there are customers who love the offline experience, and if we just assume all of them want to go digital, there's some loss there. There's something going on. But also there are whole groups of other customers who might be open to using a digital-first offering, and we don't have those conversations. It seems to be much more about "we have to save costs by moving all of this into a digital experience, and then we've got the value and away we go," and I think it's just harder to get that deeper conversation.
Peter [9:21]: Yeah, and another mistake we see organizations making, of course, is that they move it into the digital experience but totally replicate the physical experience, and it sucks because they miss out on all the opportunities of actually having a digital experience, which is quite a different user experience and a different interface as well.
Dave [9:42]: Well, it can be, right? It can be a really, actually, it can be a very, very powerful experience if we really understand that customer side. But because it's a virtual world, a remote digital experience, we really do need to experiment with it because it's not something that we can just intuitively understand.
Peter [10:00]: Yeah. So this brings us back around to: there are no more safe bets. You can't say, "Okay, I'm going to understand what I need to do to service that customer and provide them what they're looking for. I'm going to do what my competitors are doing. It's going to take me 12 months to do that. Let's kick off and start working towards that. And 12 months from now, I can potentially go back and measure whether or not that worked." Although that's maybe another topic for another time: 12 months later, nobody actually ever goes back and looks to see whether or not we got what we were hoping for. Because that's the other problem too, right? It's too long a time lag.
Dave [10:43]: Yeah, yeah, like after 12 months, everyone's forgotten why we even started on this journey. And because 12 months often turns into 18. At that point, you just... And one of the... I mean just picking up a little bit around that as well, when we talk about experiments and bets, even if we have a safe bet in the sense that it's obvious this is going to work, and we go ahead and make that change and it proves to be correct... One of the things that we're bumping into right now is, if you remember, years ago we talked about business agility and that integration across IT and business and so on, and that's no longer in vogue. However, it's now more important than ever, if that makes sense. So when we're working with an organization, you see this wonderful opportunity—it clearly gives a return—and then you bang into another division or part of that journey, like the business and how they get new features out to their brokers and how they incentivize them, and you find that that whole process takes so long that you're not able to leverage this great opportunity you've seen, because we can only see it in the digital space and we can't actually get it out into the broader organization.
Peter [11:46]: Yeah, and I look at that as something of an alignment problem across the organization, where the business side is working on one cadence of delivery and the technology side on another. Even if you've built the best engine in the world that can build and experiment and learn and do all the things we need to do, if the rest of the organization is operating in a different paradigm, then it doesn't really matter. In fact, in those situations, it might be better off just saying, "Okay, technology, you go off and do your stuff for other customers and we'll just offshore ours to somebody who can operate at the speed we want."
Dave [12:21]: I'm feeling like I should say you can't say that. I'm not recommending that, but I'm just saying... I was going to say drop us an email and we'll talk to them a little bit and figure out, help them find a solution to the problem we're describing.
Peter [12:40]: Oh, I am sure. Well, I know I've spoken to IT leaders over the years and that's definitely a conversation that's come up in the past.
Dave [12:49]: A couple of key takeaways?
Peter [12:50]: I think one is understanding that we do have to experiment, we do have to learn, and we do have to figure out how we measure that. Shorter time cycles are critical, and even when something is labeled as a safe bet, it's not going to be a safe bet. You need to figure out how to break it down into smaller pieces so you can learn whether or not you're going in the right direction as quickly as possible.
Dave [13:14]: I think I'm going to add to that. I feel like the language of seeking safe bets is indicative that the organization isn't comfortable with experiments or bets that don't pay out, and maybe that's one of the key takeaways is how do you get—maybe in a small part of the organization—the opportunity to try things and have them not work and use that as something that can help shift that mindset away from "we need to do this because it's a safe bet."
Peter [13:44]: Awesome. Well, thank you for the conversation, as always, Dave, and if folks would like to reach out, they can at feedback@definitelymaybeagile.com or check out our website.
Dave [13:58]: Peter, until next time. Thanks!
Peter [14:00]: You've been listening to Definitely Maybe Agile, the podcast where your hosts, Peter Maddison and David Sharrock, focus on the art and science of digital, agile and DevOps at scale.