
Definitely, Maybe Agile
Is Your Organization's Approach to Risk Outdated?
When was the last time raising risk in your organization led to anything other than slowing down? Join hosts Peter Maddison and David Sharrock as they challenge conventional thinking about risk management in the age of rapid technological change. This episode reveals why traditional approaches might be putting your organization in greater danger.
Drawing from their battle-tested experience working with financial and technology organizations, Peter and David crack open the uncomfortable truth: many companies still treat risk management as a checkbox exercise rather than a competitive advantage. They reveal how the explosion of data analytics capabilities has rendered old "rule of thumb" approaches obsolete while simultaneously creating entirely new risk landscapes that most organizations are woefully unprepared to navigate.
This week's takeaways:
- Risk management begins with identifying vulnerabilities, understanding potential impacts, and making informed decisions about how to handle them—whether through acceptance, avoidance, transfer, or reduction.
- Effective risk management isn't just about frameworks and committees—it requires a cultural environment of psychological safety where risks can be openly discussed without fear.
- The most effective approaches position oversight functions like architecture as service providers rather than gatekeepers, helping teams move forward safely rather than simply blocking progress.
Subscribe to "Definitely Maybe Agile" to transform how your organization approaches risk, digital transformation, and DevOps at scale.
Welcome to Definitely Maybe Agile, the podcast where Peter Maddison and David Sharrock discuss the complexities of adopting new ways of working at scale. Hello, Dave, how are you today? Very good, good to talk to you again, Peter. It always is, and today we're going to record something on one of my favorite topics.
Dave: This is our topic of risk, and I guess my role is to keep us on the straight and narrow as we go ahead and talk about risk, isn't it?
Peter: Yes, yes, because you have an idea in mind of what you're looking to communicate here, and I think your view of risk, and how you approach and come to risk, might be a little different from mine. Well, let's explore a little and find out where we end up.
Dave: And I think, ultimately, our approach to understanding risk and dealing with risk is the same. I mean, risks are risks. It's maybe more around defining where the need to manage risk is coming from. I think that's the main difference.
Peter:The reason we started this conversation is because we're both of us experiencing lots and lots of conversations around risk, and companies, organizations that we're working with, certainly are having to understand and explore risk in more and more detail yes, and I mean I work a lot with financial organizations and helping organizations figure out the management of risk and certainly a subset of overall organizational risk is a big part of what I do and I have been doing for quite a number of years, so it's certainly a very interesting space, and risk is somewhat low, determined. It occurs at all levels of the organization in a variety of different forms, and so there's always this need to understand what might possibly go through, what are the possible things that might happen and what might we be able to do to mitigate those risks well, and the impact and how big of a concern is it is.
Dave: If, you know, it rains tomorrow and I get wet, that's very different to, you know, if it rains tomorrow, I get wet and I catch a cold, and now I'm knocked out.

Peter: Yes.

Dave: Out of action for two weeks dealing with some sort of illness. So one of the things that you're describing is that risk has been part of the conversation in technology, and in finance in particular, forever, right? And one of the things I always find interesting as you step into those organizations, those sorts of domains, is the understanding of risk. If you just talk generally about risk, there's an immediate hand up at the back of the room saying, hold on, you can't just say "risk" here, you've got to determine what type of risk it is, because there's already a vocabulary and a granularity to understanding that risk in many, many ways. And where I'm coming from on that is that the conversation that needs granularity around what risk really means is broadening way beyond the technology and finance side. It's becoming a topic of conversation in areas that often didn't really worry about risk.
Peter: Yeah, and I wonder how true that is, in that maybe they just didn't call it that, or maybe they didn't structure it in quite as formalized a way as they might now. Because I think there's always been this case, whatever role you're in: if you're looking at the work you have coming up, a lot of the actions you take, you're taking them because you're trying to avoid a potentially bad outcome if a particular circumstance occurs.
Dave: Yes, I agree with where you're coming from on that one. But I'd also say there's a really interesting shift, in that in the past, in certain circumstances, the management of risk has been less of a conversation. Either there were heuristics that were used to manage risks, like "don't spend too much money on this, keep it in scope", some rules of thumb to mitigate that risk, or there was an implicit acceptance of the risk: if you release a product, some of the features will not be valuable to the end customer, and we just live with the fact that there is an implicit rate of failure in the features that we release on a product, as an example. But there's a conversation now where the cost of accepting the risk, or mitigating it using heuristics like rules of thumb, is becoming too great, so there needs to be more and more of a conversation around what the risks really are and how we can review the impacts and manage them.
Peter: I think, and you're going to hate me bringing technology into this, but we also now have a much greater capacity to analyze the data from these circumstances and actually dig into problems where previously we used rules of thumb, because they worked most of the time and the problems were basically unsolvable with the kind of compute capacity or capabilities that we had. So it's the advances in technology, the much greater compute capabilities and the ability to process much larger amounts of data, that have allowed us to potentially move beyond some of those rules of thumb and get into a space where we can be more granular, have more accuracy, and potentially look at risk in different ways.
Dave: Absolutely. So technology has impacted it in a couple of different ways. One is that we now have much more capability to understand that risk: as you said, there's more data, there's more compute capability, and there are things that we can now model and look at mitigating that would have just been way too expensive, way too difficult to do in the past. But there's also the pace. The pace of change in technology brings a lot of changes of its own, and what that's doing is opening up many more kinds of avenues.
Dave: If I just think of a simple thing like DevOps, CI/CD pipelines, and pushing changes live: the pace of change in technology means that now you and I can have a conversation in the afternoon, make some changes to the system, push a button, and out they go, live. Now the risk associated with that has maybe been automated away, or it's implicitly managed. But we've really got to sit back and bring that conversation back in, to say: we used to have mechanisms to manage risk in an intelligent way. Are we still managing it appropriately? Are there situations where we shouldn't push the button to send things live? And that's driven by technology allowing us to do things much, much more quickly and to decentralize decisions, which increases risk in certain contexts.
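The push-button deployment Dave describes can be sketched as an automated "risk gate" in a CI/CD pipeline. This is a toy illustration only; the field names, thresholds, and rules here are invented, not taken from any real pipeline tooling.

```python
# A minimal sketch of an automated deployment risk gate.
# The signals and thresholds are invented for illustration;
# a real pipeline would pull them from test and review tooling.

def risk_gate(change: dict) -> bool:
    """Return True if the change may be deployed automatically.

    Encodes a few decentralized-decision rules so the button-push
    stays safe without a committee in the loop.
    """
    if not change.get("tests_passed", False):
        return False  # never ship a red build
    if change.get("touches_payment_code", False):
        return False  # high-blast-radius areas still need a human
    if change.get("lines_changed", 0) > 500:
        return False  # large changes carry more risk; route to review
    return True

small_fix = {"tests_passed": True, "lines_changed": 12}
print(risk_gate(small_fix))  # True: small, green, low-blast-radius
```

The point of a gate like this is that the risk conversation happens once, up front, when the rules are written, rather than on every deployment.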
Peter: That particular example would send me off on a tangent which we could probably spend the next couple of hours talking about, because there are a lot of organizations that fail to get there. And it isn't just a lack of willingness to accept the risk; it's a lack of understanding that by being able to push a button and immediately push a change out, you reduce your risk. So there's a bit of a double-edged sword on that particular one.
Dave: Well, but this is what I mean about that increasing color around risk. As organizations mature, the technology pulls through, along with various other things that we'll probably touch on. A simple classification of a risk as high or low, there or not there, is just not nuanced enough for us to really understand what the right approaches are and how to have the right conversations around them. Are the right people involved?
Peter: And are we making decisions in the right way, and so on. Yeah, and this gets us into an interesting space. If you look at something like Deepwater Horizon, for example, here's a classic example: a big oil spill, a whole oil rig exploded. From a risk perspective, just a couple of weeks before, the auditors had been there and it had passed a review with flying colors. There was absolutely no problem, no risks whatsoever. Yet it then exploded and caused a massive environmental disaster and a whole bunch of knock-on problems as a consequence.
Peter: When digging into that and starting to look at why the risks that were there were not properly identified, because after the fact, looking back, there were some clear problems that, had they been addressed, could have prevented the incident from occurring, you find that people took those risks and swept them under the rug. Avoiding that requires an environment of psychological safety, where people will talk about risk, and where risk is a first-class citizen alongside the next widget or feature that you're wanting to get out, or the next barrel of oil in the case of an oil rig. You need to be in a situation where it is possible to say, wait a minute, and where the people within the organization will respond appropriately so the right things happen. This is where we start to talk about Safety-I and Safety-II cultures.
Dave: What's really interesting as you're describing that, and again, if we bring it back to technology, because we bump into this all the time: if I raise a risk in most cultures, most organizations, I'm talking digital, technical companies, but it will apply in many different contexts, nearly the only thing that will happen is that whatever we're trying to do will slow down. I'm going to have to go and get approval from, you know, a change board or some sort of architectural review board, so I can't make the changes I'd like to, because our only response is: if there's a risk, it needs mitigating. And you need exceptional, experienced people in the room to determine whether or not the plan we have to manage that risk is the right one. This is where those nuances of understanding come in: there are risks that we can accept, and there are risks that we're going to mitigate in some way, where we transfer or avoid or reduce those risks by taking positive actions around them.
Dave: And the idea is that there isn't a scenario where there's no risk. There are just risks that we either don't know about, or that we know about and are accepting. Even in the description I'm giving, we're already adding layers of nuance to our understanding of risk, so that it stops being basically a handbrake on our ability to make progress. How many of us are in a situation where we can afford to say, well, we need to slow down? That's just not happening in the current context.
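Dave's four options for handling a known risk can be sketched as a toy decision function over a simple likelihood-times-impact score. The thresholds and the `transferable` flag below are invented for illustration; this is not a real risk framework, just the taxonomy made concrete.

```python
from enum import Enum

class Response(Enum):
    ACCEPT = "accept"      # live with the risk knowingly
    REDUCE = "reduce"      # act to lower likelihood or impact
    TRANSFER = "transfer"  # shift it elsewhere, e.g. insurance or a vendor
    AVOID = "avoid"        # don't do the risky thing at all

def choose_response(likelihood: float, impact: float,
                    transferable: bool = False) -> Response:
    """Pick a risk response from a likelihood/impact score (both 0.0-1.0).

    The thresholds are arbitrary illustrations, not a real standard.
    """
    exposure = likelihood * impact
    if exposure < 0.1:
        return Response.ACCEPT    # low exposure: note it and move on
    if transferable:
        return Response.TRANSFER  # someone else can carry it more cheaply
    if exposure < 0.5:
        return Response.REDUCE    # worth mitigating, not worth stopping
    return Response.AVOID         # exposure too high: change the plan

# A low-likelihood, low-impact risk is simply accepted.
print(choose_response(0.2, 0.3).value)  # accept
```

The key point the taxonomy makes is the one Dave raises: "no risk" is never one of the outputs. Every path is an explicit, recorded decision.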
Peter: Yeah, and maybe even more so because we have not just a lot more change coming at us but a lot more threat actors, a lot more people who are threatening to break in, steal, and damage our environment for a variety of different reasons. So there's a lot to respond to and a lot of change in the system just as a consequence of that.
Dave: Even there, I mean, there's always that context, and the environment from a security perspective, or a bad actor's perspective, is definitely moving very, very fast. But even from a simple thing like the perspective of customers using our products, that environment is changing really, really fast as well. We've talked about this many times: the longer it takes us to get into market with new functionality, the more likely we are to miss that market. And if we do come to market with features which don't operate in just the right way, there's reputational risk, there's general churn. All of these things, even from the client side, the customer side, are making it more and more important that we're nuanced about how we discuss risk.
Peter: Yes, and beyond the nuance, it's about the fact that we are discussing the risk at all, which brings me back to psychological safety. It's not enough to simply have the framework in place: to have your GRC, your first, second, and third lines of defense, your audits, your internal reviews, and all the rest of the different pieces that you need. You need all of those; they're invaluable. But you have to do them well, in a way that creates a culture where people are not looking at audit and going, oh no, here comes audit again, this is my tenth audit, I don't see the value in it, and it's painful and agonizing. Instead, they should look at it as a way to validate that what we're doing is effective, that we're doing it well, and that we've captured the right things.
Dave: So there's a very, very large cultural element in appropriately dealing with risk within your organization. And I love the way you're describing this, because it reminds me of working with a group years ago, and this was on the architecture side, but architecture is a great place to look at how an organization manages risk. One of the things that really struck me with this architecture group was that they saw themselves as a service provider to the technology part of the organization that was building products on this architecture. So instead of having a sort of policing, audit, you've-got-to-be-right-in-order-to-go-forward role, their role was much more about: how can we enable you, the technology part of the organization, to deliver on the business goals and objectives in a way which is safe and secure, managing that risk, but also quick and easy and well understood, something teams are able to follow through on? And I find that perspective is rare.
Peter: It is, when it's done well, and I've seen many organizations that do this well too. But where it's done badly, it becomes a series of committees, and you've got to get escalated through every committee until you get to the top committee, where you've got a number of people sitting in an ivory tower telling you whether or not you can move forward. And the people in that ivory tower potentially have little or no insight, by the time it gets to that level, into whether or not this is the right way to approach something.
Peter: I think, to your point, I would add that the best architecture teams take not only a technical view of the environment but a people view too. It's the socio-technical viewpoint, where they don't see themselves purely as architects of a technical solution. They see themselves as architects of how we structure ourselves organizationally around those architectures too. They really live and breathe Conway's law, if you like: the systems we build mirror the communication patterns of the organization. So they think about things on a people level as well as on the technical level.
Dave: I always love it when we come full circle and come back to something that both of us believe in so strongly, which is that change isn't just about process; it's about process and the people side. And if we ignore the people side, all sorts of unfortunate, unintended consequences come swinging through as well.
Peter: So we've run through our usual allotted time here, and I'm happy to talk about this some more because it's a topic near and dear to my heart. But how would you sum this up? What are your main takeaways from this conversation?

Dave: I think one of the first things is the way you were describing risk in terms of understanding where the vulnerabilities are, what the risk is, understanding its impact on the organization, and therefore being able to decide how to handle it: can we accept the risk, do we avoid it, can we transfer it, can we reduce it in some way? That clinical definition of risk is always worth scribbling down and really relating to, because it's the beginning of how we can understand how to handle it, but also how to discuss it and bring it to the table. So that's the first thing.

Dave: The second thing is that risk used to be limited to certain contexts or certain parts of the organization, let's say under the chief risk officer, or in the finance or technology sectors and things like that. More and more, because of the pace of change, because of how rapidly technology is changing, and because we can explore the different risks in much more granularity, much more detail, the conversation about risk is going into areas which aren't used to discussing risk, and we're seeing lots of nuance around high-risk and low-risk environments and decisions which require more risk conversations around them. That would be the second thing. And the third thing is the people and process side: Conway's law, and how risk management and addressing risks is a conversational thing, a people thing, as much as it's a process thing. Having a process by which conversations are had and actions are taken is not sufficient. We also need the cultural shift that allows those conversations to be raised in the first place.
Peter: I would add to your third point that I hope it's your second point that's driving those conversations, because then it would mean that people are actively talking about risk rather than just trying to sweep it under the rug, which, unfortunately, is quite often what you see, especially in organizations where raising a risk plays out for all the reasons you described: I've got to get this feature out because I've got to get it to market; if I go tell the architects about this, I'm going to get stuck in a bunch of committees, and I don't want to do that; I just need to get this thing out, and the VP is yelling at me, and I've got to do it. And then things blow up. So having that continual conversation is essential, and the people element is indeed a critical part of it. And with that, I think we can wrap up this conversation for today. As always, Dave, thank you very much.
Peter: I always enjoy these conversations and look forward to the next one.

Dave: Yeah, well, that was a fun one, so thanks again.

You've been listening to Definitely Maybe Agile, the podcast where your hosts, Peter Maddison and David Sharrock, focus on the art and science of digital, agile, and DevOps at scale.