Security is a guiding principle for DCDR, and protecting user data has been baked in from the start. However, there’s more to data security than restricting access and managing user permissions. I’ve used the INFOSEC abbreviation CIA — confidentiality, integrity, and availability — as a guide to help determine the steps required to protect your data while also ensuring that the system does what it’s supposed to. Overall, the intent is to ensure that your data stays confidential, remains accurate and intact, and is available when you need it.
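To make that concrete, here’s a minimal sketch, in Python, of using the three CIA objectives as a simple review checklist for a data asset. It’s purely illustrative: the class and field names are my own, and this is not how DCDR itself is implemented.

```python
# A minimal, illustrative sketch of using the CIA triad as a review checklist.
# The names here are invented for the example; this is not DCDR's implementation.
from dataclasses import dataclass


@dataclass
class CIAReview:
    """Records whether a single data asset meets each CIA objective."""
    asset: str
    confidentiality: bool = False  # only the right people can read the data
    integrity: bool = False        # the data can't be changed without you knowing
    availability: bool = False     # the data is there when you need it

    def gaps(self):
        """Return the objectives this asset doesn't yet meet."""
        checks = {
            "confidentiality": self.confidentiality,
            "integrity": self.integrity,
            "availability": self.availability,
        }
        return [name for name, ok in checks.items() if not ok]


# Example: user records with access controls and checksums, but no backups yet.
review = CIAReview("user records", confidentiality=True, integrity=True)
print(review.gaps())  # ['availability']
```

Even a tiny structure like this turns ‘is the data protected?’ into three questions you can answer, asset by asset.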
Googling ‘what is a risk manager?’ will get you variations on ‘it’s the person who manages an organization’s risks,’ which is a pretty weak answer. It’s certainly not enough to help anyone who’s just starting in the role understand what they’re supposed to do. Similarly, if someone’s thinking about this as a career, they need a bit more to go on.
So here’s a more detailed answer.
‘A risk manager is a person who helps an organization achieve success by understanding, managing and responding to its risks.’
That’s a lot better, but I want to go deeper and see what that means…
How can you spot the point where a risk — a thing that could occur — becomes an event that is occurring? I’d argue that you don’t need to identify the specific point of change: you’ll only waste valuable time trying to spot the exact moment of transition. Most importantly, if you wait to see the transition point, your response will be on the back foot from the get-go.
A phase transition is the point where a gas turns into a liquid or a liquid into a solid: it’s the point where the state of matter changes. …
I looked back at some of my degree notes the other day and came across something I’ve been meaning to work on for a long time. (By ‘a long time’, I mean about 10 years*.)
It’s based on two concepts. The first is the work that Brian Toft, Simon Reynolds and Barry Turner did on how disasters evolve and how we can learn from them. The second is how to differentiate between emergencies and crises. Bringing these concepts together gives us a model or framework for how risks become events and how those events can become disasters.
There might be bigger…
I realized a while back that it can be too easy to mistake ‘simple’ for ‘easy’, and I’ve been concerned that promoting a simple approach to risk management might lead people to think that this makes everything easy. Unfortunately, even though a KISS approach makes risk management easier, it doesn’t do away with the need for hard work altogether. Worst of all, it can be easy to mistake shortcuts for simplification.
I made the same mistake myself recently with my running.
I’m hoping to tackle a longer race this fall (although to be honest this seems less and less likely…
As risk managers, we spend a lot of time working out how to get things done.
After all, the risk assessment is just the start of the process. Once you’ve identified your risks and worked out how to address them, you need to get down to work: that’s when the actual management begins.
Determining ownership for many risks will be relatively straightforward and departments will often fight very hard to maintain ownership of risks that fall within their remit.
(This is why we also need good governance. Even though the subject matter experts (SMEs) are often best placed to manage…
Many people have a few smoke alarms dotted around their house and, to me, these are some of the most straightforward set-it-and-forget-it risk management tools you can get. You set these up and then…nothing. You can forget about them until that annoying ‘chirp’ sound wakes you up one night, telling you to change the battery.
And most people will never hear their smoke alarm go off, except for those times when their cooking gets a little out of hand.
However, if there were a fire, they’d know about it immediately and be able to react.
If it’s a small fire…
I’m sure you’ve heard people referring to COVID-19 as a ‘Black Swan’ — something that no-one could have seen coming — but is that actually the case?
Terrible though it is, I don’t think it’s accurate to describe the current situation as a Black Swan because we’ve had to deal with highly contagious, deadly diseases before.
Calling this a ‘Black Swan’ is, therefore, a way to excuse a confused response: ‘how could we have prepared for something that no-one could see coming?’
However, genuine Black Swan events do exist and we need to understand these because the consequences can be…
A while back, I wrote something on the need to speak up, even when it’s hard. That’s something we face as risk managers, but it’s also a necessity in other parts of our lives.
I’ve also written about how there are risks that are so big and uncomfortable that they’re left in the corner: we pretend not to see them.
I even wrote a whole piece on Jim Barksdale’s rule about snakes: “The first rule of snakes is, if you see a snake…Just take care of it”. …
I was thinking a while back about the idea of informed intuition: cases when you seem to be trusting your intuition but, in fact, you’re recalling deeper experiences and patterns that help with your risk-based decision-making. As I was building upon this idea, it became clear that I wasn’t onto anything new; instead, this has already been explained in the work of, among others, Gary Klein and his recognition-primed decision (RPD) model.
I was a little bummed out that I wasn’t breaking new ground but, on the plus side, this is a real phenomenon that’s worth exploring because it helps you understand…
I’m an analogue operator in a digital environment who thinks simplification = optimization. I build and share risk management tools at https://andrewsheves.com