StoneTurn Partner Sarah Keeling and Senior Adviser Richard Mackintosh host a discussion on the evolving risk landscape, examining geopolitical shifts, emerging threats in the Middle East, and the impact of AI on security. Their conversation with our guest panelists highlights the nuanced nature of insider risks and how they intersect with broader global challenges. Watch the recording to discover the importance of organisational buy-in, accountability, and a holistic approach to effectively address the multifaceted dimensions of insider threats in a rapidly changing world.
Panelists:
• Helen Gripton, Human Rights and Social Impact Lead, BP
• Jules Parke-Robinson, Global Head Security Risk Management, Philip Morris International
• Sarah Keeling, Partner, StoneTurn
• Richard Mackintosh, Senior Adviser, StoneTurn
Transcript edited for clarity.
The Risk Landscape
Sarah Keeling: The risk landscape has changed dramatically. We have the continuation of the Russia-Ukraine war, aggression in the Middle East, and challenges arising around AI, security, and energy. Today we're going to focus on how the landscape has changed and how our experts are now facing these challenges.
What’s changed? And where are we now?
Richard Mackintosh: What’s changed: a challenging geopolitical situation that has developed rapidly and in ways that weren’t expected, plus the economic consequences of the pandemic and some of its security consequences as well. Things like remote working, and how organizations think about it, still pose quite a lot of challenges. AI has become the dominant conversation; in areas like medical science we can start to see some of the potential benefits, but it’s also raising questions for people and organizations, such as: what does this mean for me? What does it mean for us as an organization?
What hasn’t changed is insider risk, in the sense that it has been around for as long as we’ve had people working in businesses and organizations, but the challenging geopolitical situation is making it much more of a central issue. We see insider cases right across sectors in all different ways. Traditionally, there was quite a narrow view of what insiders were. People now understand it better: it could be fraudulent activity, theft of IP, leaking of data, or sabotage of your systems. So there’s a broader appreciation and therefore a better focus on it.
Cost of Insider Threats
Richard Mackintosh: There are the obvious costs: if someone defrauds you, there’s a cost in pounds or dollars of how much you have lost. But the reality is that some costs are difficult to assess. Reputational damage, for example: how do you manage that or put a cost on it? Or investor confidence: you might be in an organization that’s going through a funding round and you suffer a significant leak of information or loss of IP, which could critically damage investor confidence. There are reports on the cost of insiders, some of which take quite a cyber-centric view. What comes out is that incidents are being reported more, they’re expensive, and they take longer to remediate. As a consequence, organizations are thinking they really ought to have a program or function specifically focused on insider risk.
Jules Parke-Robinson: There’s a recognition that there is a problem, but amongst other competing priorities it’s still not getting the traction and the buy-in from leadership that it really deserves. Therefore, the resources aren’t necessarily allocated to insider risk programs to enable a truly holistic approach. The focus is often very much on data loss prevention, without recognizing that the biggest risk is actually people, and that it’s human behavior that causes the problem.
Helen Gripton: We’re seeing a slow but steady shift in the language around insider risk. We will hopefully move to a place where it is more noteworthy not to have an insider program than to have one. In terms of AI, it’s very important that we distinguish between AI and generative AI, and consider how generative AI can in some ways put people in places where they weren’t, or remove them from places where they were, if we look at the simplest applications on photos and phones. How that can be tackled is going to be a really big topic moving forward.
Sarah Keeling: How do you navigate the paradoxical relationship with countries that might well try to leverage insiders to acquire data, to spy on a country, to get hold of IP and R&D information, or to act as agents of influence? How do we manage the commercial relationship on one hand and the diplomatic relationship on the other, whilst at the same time understanding that there may be a risk of being hurt by that country?
Richard Mackintosh: We’re not just talking about one nation here. The key thing is this is not an either-or choice. The approach that people need to take is to think clearly about the risks of engaging and ask yourselves questions such as, do we know what our crown jewels are and where they are, and what do we genuinely have to protect? So, you can engage in a sensible way, but be mindful of what the risks are. You can understand what the risks are and have appropriate mitigations in place to deal with that risk.
Helen Gripton: It needs to be a realistic conversation. Also, accept that if we’re working in partnership with nation states, as many organizations are, a certain amount of data has realistically already gone out of the door. As such, it needs to be a risk-based approach in terms of protecting your crown jewels, to use Richard’s phrase, but equally not trying to close the door after the horse has bolted by protecting information that either has lesser value or has perhaps already been exfiltrated. That can distract from getting after the rest of the program, and too heavy a focus on trying to protect information that isn’t particularly sensitive can really bleed personnel away from focusing on the risk.
Sarah Keeling: Also, we’re living in a world now where the relationship between governments and businesses is more of an archipelago of relationships. You could be a client, a competitor, and a joint venture partner, and you see that often with very large global businesses. Do sectors behave differently internally? I guess this is partly a cultural question, but it’s not just about the culture of the organization; it’s also about how different companies internally view insider risk and whether they’ve really acknowledged it or not. Jules, you’ve recently been in two very different sorts of companies.
Jules Parke-Robinson: For me, in both recent roles, it’s about actually understanding what the important information is: what’s confidential, what do we actually need to protect? Then it’s understanding what the risk is and what our level of acceptance is. It’s also not static; it’s constantly evolving, so we need that engagement and dialogue at the most senior levels of leadership, and in a holistic way, so that it’s not just about cyber tools. It’s broader than that. It’s about the entire hire-to-retire cycle, from onboarding to offboarding and everything that sits in between.
Sarah Keeling: Risk is adaptive, it moves quickly. We too have to respond to that and accept that that’s the world we live in and we have to be agile in the way we look at the threat picture. Richard, you’ve worked in a number of businesses where you’ve created insider risk programs, what’s your perspective?
Richard Mackintosh: If I think about different sectors: financial services, because they’re a highly regulated environment, will have a lot of technology tools looking for anomalous behavior. But those tools aren’t completely fail-safe, even though they’re very advanced. Yes, technology has a key role to play, but I would always say a holistic approach is the one, because this is where things like a speak-up culture can really help. Whatever sector you’re in, you want technology tools, but they need to be reinforced with everything else you do, such as physical security. We regularly see cases where people working in regulated environments get away with things until it goes horribly wrong, and then someone comes out and says, oh, I knew there was something wrong about what so-and-so was doing, but they hadn’t spoken up about it. So all of these things need to work in concert.
Helen Gripton: Some organizations are more comfortable with digital monitoring because people are more sensitized to it, and as such, introducing a conversation about broader insider risk and potential mitigation measures lands more easily. There is a broader challenge, though, in that those organizations can rely too heavily on their digital monitoring and not take the time to really understand their people, really understand the softer side, like a speak-up culture and making sure that leadership is strong and supportive of any such program.
The majority of cases we’ve seen tend to point to poor leadership being a contributing factor at some point in the journey of an individual who becomes an insider. So make sure there’s a broader understanding of the risk that doesn’t just rely heavily on the digital side, but also looks at the softer, people side as well.
Sarah Keeling: Let’s turn to buy-in, both organizational and senior leadership.
Richard Mackintosh: The biggest challenge around getting buy-in is accountability. If we talk to executives and ask them, for example, who is accountable if you have a major cyber breach, they’ll say, well, that’s the CISO or the CIO. Everybody understands that; it’s very clear. But if an individual carries out an act, for example stealing your IP, who is responsible? It actually falls between a number of different stools. It’s kind of an HR problem, it might be a physical security problem, and if they’ve walked out of the door with it, it could be an IT problem. So the first thing is getting people to understand what the risk is for them specifically, to have a sense of it, and to have that risk shared at a senior level. If someone is accountable for it at the top table, something will get done about it.
Then within the organization itself, communication of this risk is important: what you say to people inside the organization about what you’re doing. There are occasions when it can get a bit heavy-handed and feel like a surveillance society. You need to get the communication absolutely spot on, because nothing kills an effective speak-up program faster than people feeling that they’re being spied upon or that they’re not trusted. One of the things we hear is that people won’t speak up because they don’t believe they’re going to be taken seriously, or, in the worst cases, because it will actually negatively impact their careers. People need to understand why the program exists as well as their role within it. Top-level accountability, and clear leadership demonstrably taking people who raise concerns seriously, are incredibly powerful.
Jules Parke-Robinson: With any insider risk program there has to be one accountable owner at the highest possible level to get everything aligned and to connect the dots across the whole business. It can’t be buried within a single function or somewhere underneath legal and compliance; it has to be visible and owned by the business. That’s where I’ve seen it have the most success. The other thing is getting proactive and understanding the life cycle of the risk. Many people will put a program in after something has happened and put the controls in at the end of the process; we need to start at the beginning. Where is the due diligence on the third-party vendors we work with and on the individuals or potential employees we want to bring on board, and how are we then putting the training protocols in place to make sure they understand their responsibilities and what company information they should be protecting throughout their time?
When you’re transforming a business and going through change, there needs to be communication with the people who are involved in insider risk and security really early.
Helen Gripton: A lot of it comes down to framing. You have to know your organization well enough to know what will resonate as a way of selling the program. If you’re in an organization, for instance, where you have a very strong safety culture, it’s about keeping people first and foremost safe, along with assets and data. That can be your hook to reach the organization’s senior leadership in a language that is more broadly understood. We have to change the way we talk to the business so that we are seen as partners and business enablers. The onus is on us as the program owners to make sure that we frame the risk to different parts of the organization in a way they’ll understand and that resonates with the culture of that particular part of the organization. Explained properly, away from the language of monitoring and being suspicious of the workforce, it can be framed more positively, about proactively keeping people safe and protecting data. It is absolutely key that we frame it in the right way.
Sarah Keeling: Let’s assume it’s bought in, how do you ensure that this narrative about insider risk really does reach different audiences and resonates across different cultures?
Helen Gripton: You need a flexible way of introducing and discussing the program, understanding local sensitivities and cultures well enough to make sure that your style is appropriate. The other key thing, though, is not to take too regional or too varied an approach to the consequences of and reactions to insider risk across different parts of the organization. One of the ways to demonstrate commitment, and for parts of the organization to buy in, is for them to see that, where local laws and employment laws allow, there is a pretty uniform approach to consequences and to how the organization will respond when a risk is found. Nothing undermines it faster than feeling that different people, different levels, or different cultures in the organization are treated differently when an insider risk is surfaced. So: a flexible approach in terms of getting buy-in, and a pretty standardized response in terms of how you react to any risks or people issues you find.
Jules Parke-Robinson: There’s always a natural tendency for us to trust our colleagues. We also need to think about culture and about creating an environment where people feel empowered to speak up, because you can have the most incredible program, but if people feel that when they blow the whistle or make a speak-up allegation nothing’s going to happen, or it’s not going to be investigated comprehensively, they won’t be encouraged to do it. So it’s not about snitching on your colleagues; it’s about everyone having a trusted environment and a trusted culture, so that if you do get the bad apple, people are prepared to actually speak up about it.
Sarah Keeling: Yes, and creating a trusted environment is really behind all of this; it’s a running backbone to dealing with insiders. If you create a trusted environment, you’re less likely to have people wanting to destroy it.
Richard Mackintosh: The reality is that in different cultures you need to approach this slightly differently. In certain countries there are things you can legally do that you can’t do in other countries; you just need to be aware of that. One of the things I’ve found helpful in getting traction is storytelling, which is incredibly powerful. To give an example, we were talking to some people recently about the risks around social media and how it is used by criminals and nation states, and somebody in the chat box came on and said, precisely that has just happened to me. Then everybody in the organization hears that and thinks, this is actually real, this is not theoretical.
The other thing is consequence management: how you manage the consequences when something happens. I’m an advocate of explaining, where possible, the action that has been taken against people who have breached the trust of the organization, whether they’ve been disciplined or exited from the organization. That sends a really powerful message: when you speak up, we listen, action is taken, and now we’re going to tell you what we’ve done and how we’ve improved our processes.
Insider risk is fundamentally about a breach of trust. You’ve trusted someone to come inside your organization, and they’ve breached that trust by doing something unauthorized, whether stealing data or whatever else they decided to do. The way to counteract that is by building a high-trust environment that encourages people to feel confident that they can speak up, and that when they speak up, something will be done about it.
The great thing about high-trust environments is that you get better people, you keep better people, and they’re more profitable.
Sarah Keeling: It can be difficult to roll out an insider risk program, and sometimes there can be tension between the compliance and ethics functions and the insider risk function. How do you manage that tension?
Richard Mackintosh: They should be closely aligned, working absolutely hand in glove. If there’s tension, it may be that your communication isn’t quite right.
Sometimes I see quite a bit of focus on new tools that organizations have, for example, to see how productive people are being and how they’re spending their time. I worry about that, because it says to me, we don’t trust you to get on and do your job properly, and unless you land it in exactly the right way, it can have a really unhelpful impact. I would be absolutely honest with people about the program: it protects you, your colleagues, and the company. And give people channels where they can raise concerns. What I wouldn’t do is over-communicate about the technology controls you’ve got, because some of the people who were carrying out illicit trading activities knew what the controls were and how to bypass them.
Sarah Keeling: I’d now like to look at the challenge of getting the balance right between intrusion, i.e. having sufficient internal controls to spot those red flags, and freedom. How do you get that balance right?
Helen Gripton: Be open; there shouldn’t be a secret program happening in the background. That’s where distrust and discomfort come from, because it will eventually come out. It’s essential to get that message right and landed first. Then it’s important that any controls you put in place to manage the risk are proportionate and necessary. We have to step back every so often and really check ourselves on that, and a huge part of the balance between freedom and intrusion is about communication. There is a skepticism about monitoring, which I think we can see particularly in the younger generation. If your controls are proportionate, you should be able to communicate effectively why they are in place, rather than have people concerned about being watched or monitored in the background in a way that makes them feel uncomfortable at work. Frankly, it would be uncomfortable for anyone to feel they were being monitored the entire time without due cause, or without a desire to mitigate a specific risk that they understand.
Jules Parke-Robinson: We know about ethical fading, and that people will start with one small act which can then spiral into something much bigger. If we can share that through awareness and training, it helps people understand that no one is immune and no organization is immune to the risk.
There’s also a danger that all of the red flags exist but sit in different parts of the organization. They may sit with HR and finance, but how do we join the dots? Putting the program together is about connecting them. It’s about protecting and enabling the business.
My final point would be about how different generations react differently to the idea that they are being or could be being monitored. We have to be open and transparent about what we’re doing and why we’re doing it.
Sarah Keeling: You may have had a big insider risk event that had huge ramifications for the business, and you may not have fully communicated to employees what happened, and then all of a sudden there’s a whole new insider risk program coming in. How do you handle that and get the balance right?
Richard Mackintosh: It’s about proportionality. You need to think carefully about what you’re trying to do, what you’re trying to protect, what tools you need, and how you communicate that.
Do not leave this just in the hands of security people. Create a working group of people from HR, legal, IT, defense, and physical security. Part of that is agreeing the principles you’re operating under, one of which is proportionality. Is it reasonable? Would it be reasonable if we said to someone, we’re going to introduce this kind of control or this kind of monitoring? How do we think people inside this organization would feel about it? Risk sits right across an organization, and the more you involve people in thinking about it and about proportionate controls, the better the outcome will be, because you’ll be testing as you go.
Sarah Keeling: Thank you. To sum up our conversation, I think we all concluded that insider risk programs are no longer just a nice-to-have; we really do need them. Also, through an insider risk program you can better navigate the risks and relationships you have with sectors, competitors, other countries, and even bad actors. You have an opportunity here to manage those risks, and the better you manage these insider risks, the more opportunities you create in terms of the commercial imperatives you have to pursue.
You don’t have to be a huge organization; programs are scalable. It’s the same principle for a large organisation, an SME, or even a very small outfit. At the end of the day, whether it’s your data, your people, or your IP, an insider risk program is probably the best way to mitigate the risk of losing those crown jewels.
Thank you all for your time, this has been a very interesting conversation. If you’d like to discuss any of the issues raised in this videocast, please do get in contact with us.