Description
This on-demand webinar, delivered by accredited third-party auditors and consultants Alastair Parr and Joe Toley, discusses metrics to consider when maturing your third-party risk management program. During the presentation, Alastair and Joe cover:
- The five pillars to analyze in a program maturity assessment, and why each is important
- Top criteria to consider when conducting reviews of your third-party program
- Benchmarking the maturity of your TPRM program against others
- Key considerations for planning program objectives and targets
Watch now to gain a clear picture of why maturity assessments are important, the proper methodology for conducting them, and what a successful and mature program looks like.
Speakers
Peter Schumacher
Host
Joe Toley
Accredited third-party auditor and consultant
Alastair Parr
Accredited third-party auditor and consultant
Transcript
Peter Schumacher: Welcome, and thank you for joining our webinar today: Top Five Metrics to Consider When Building a More Effective Third-Party Risk Management Program, featuring Alastair Parr and Joe Toley. Alastair and Joe are accredited third-party auditors and consultants, and over the years they've helped hundreds of customers mature their TPRM programs. My name is Peter Schumacher, and I'm your webinar host for the day.

I've got a couple of housekeeping items to cover before we get started officially. First of all, a reminder that all attendee lines are muted. We do that in an effort to cut down on background noise, especially with many of us working from our home offices. To keep this session interactive, we invite you to submit your questions using the live Zoom console. Time permitting, at the end of the hour, we'll address those questions in an official Q&A session. Today's webinar is being recorded, and we plan to deliver that recording to your inbox by tomorrow. I know you didn't join to hear my voice, so at this point I'd like to turn things over to Alastair and Joe. Thank you both so much for joining us today. Alastair, please take it away.

Alastair Parr: Thank you very much for driving that introduction, Peter. Very much appreciated. So, thank you all for joining us. Just to briefly recap, we'll cover off a few more housekeeping items; I think Peter has kindly covered most of them for us. We're going to talk a fair bit today about maturity assessments: specifically, what is a maturity assessment for third-party management programs, why do we use the term third party rather than vendor, and then cover some of the key metrics and observations we've seen from enacting that in the field. We will have a Q&A section at the end, so if you have any questions, please feel free to fire them our way via the Q&A option in Zoom and we'll endeavor to answer them as we go along or at the very end of the session.

We do have one hour today, and you do have your hosts. It's myself, Alastair Parr. A brief bit of insight so you understand who I am and why I'm suitable to talk to you about this issue: I've been dealing with third-party risk programs for the best part of 15 years, focused very much on governance and implementation, and I've done a lot of work around ISO 27001 auditing, business continuity and so on. And we also have Joe Toley with us here. Joe?

Joe Toley: Thank you, and hi everyone. My background is mainly around data, and more recently conducting third-party maturity assessments. I'm responsible internally for heading up our services team and, in addition to that, making sure we customize our platform to give clients the best use of the system and some of its features and functionality.

Alastair Parr: Thank you very much, Joe. So, let us begin.
Alastair Parr: Maturity assessment overview. The most common question we naturally get up front is: what is a third-party program maturity assessment? For us, if we just cover off the metrics of what makes up a good third-party maturity assessment, we normally use the CMM model, the Carnegie Mellon Capability Maturity Model, which, for those of you who are familiar with it, rates things between a one and a five: from initial, as in there's not much progress and it's very immature, up to five being optimized. Now, the thing about the Capability Maturity Model is that it's extremely rare, if not near enough impossible, to get to a level five. So when we actually assess organizations, we normally expect them to ambitiously aim for a three or a four, usually within 12 months of actually conducting their program.

Now, what is the maturity assessment actually trying to achieve? It's really looking at the broader capabilities of the third-party program. It looks at governance, and it looks at a couple of key pillar areas which we will cover off, but the objective is really to ascertain, across scope, coverage, governance, audits and depth of detail, how mature the organization is in managing the program and whether it integrates with the broader strategies.

So why should I actually care? What a third-party maturity assessment gives you is a regular checkpoint. It is going to give you a benchmark compared to peers, across industries and verticals, and of course internally, as to how effective your program currently is. Now, it is a measure of success. There isn't a pass or fail, of course, when you're doing a maturity assessment. It's simply a benchmark, and it states what we need to do in order to get within our tolerance level: what do we need to do to actually achieve what we aspirationally want to hit within a set time frame? It's really just an opportunity to identify what we can be doing better.

A common misconception we experience when we're doing these maturity assessments with different organizations is that people try to game the system: they give the answers they expect the assessor or auditor to be looking for, when the reality is that's not going to help anyone. We would always rather people be honest and direct, and help us identify what the challenges are so we can drive that program forwards. And then, of course, when we actually know where we are, it becomes a milestone for moving forward: a milestone for understanding what remedial actions we can take, what net new programs need to be implemented as part of our third-party risk management workflow, and where there may be some additional deficiencies that we can either accept or tolerate based on budget, our risk tolerance, etc.
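The CMM-style roll-up described above can be sketched in a few lines of Python. This is a minimal illustration only: the pillar weights and scores, and the target of moving up roughly one level in 12 months, are assumptions for the example rather than a prescribed methodology.

```python
# Minimal sketch of a CMM-style maturity roll-up across the five pillars
# discussed in this webinar. Pillar weights and scores are illustrative.

CMM_LEVELS = {1: "Initial", 2: "Repeatable", 3: "Defined", 4: "Managed", 5: "Optimized"}

# Hypothetical pillar scores (1-5) and relative weights.
pillar_scores = {"coverage": 2, "content": 3, "roles": 2, "remediation": 2, "governance": 1}
pillar_weights = {"coverage": 0.25, "content": 0.20, "roles": 0.20, "remediation": 0.20, "governance": 0.15}

def overall_maturity(scores, weights):
    """Weighted average of pillar scores, rounded to one decimal place."""
    total_weight = sum(weights.values())
    return round(sum(scores[p] * weights[p] for p in scores) / total_weight, 1)

current = overall_maturity(pillar_scores, pillar_weights)
target = min(5.0, current + 1.0)  # aspirational: roughly +1 level within ~12 months

print(f"Current maturity: {current} ({CMM_LEVELS[round(current)]})")
print(f"12-month target:  {target}")
for pillar, score in sorted(pillar_scores.items(), key=lambda kv: kv[1]):
    print(f"  {pillar:<12} level {score} -> gap to target {max(0, round(target) - score)}")
```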
Alastair Parr: So, as you see on the right-hand side, we normally see organizations hit around the two-to-three mark if they have an existing process, and that two-to-three mark is usually commensurate with the fact that they'll have a technology, they've implemented some processes, they have accountable resources internally for driving third-party risk, etc. But the aspirational view is that you'd be able to go up by around one point in the maturity score within a 12-month period.

We're going to cover off some of the key areas of a maturity assessment shortly, but just to give you a bit of insight up front, our key coverage areas include: content, so what are we actually identifying in our third-party management program? Coverage, so how broadly are we identifying that across our estate? Is it a subset? Is it the entirety? Roles and responsibilities: do we have the right resources doing the right activities as part of that workflow? Remediation: once we've found these issues, what are we actually going to do with them? And then finally, governance: from a governance standpoint, do we educate and inform the right people in the business, and are we articulating the right mechanisms and metrics that help us ascertain if we're moving in the right direction?

Now, the good thing about a maturity assessment is that it doesn't really matter whether you're a Goliath of industry or relatively in your infancy when it comes to a third-party program; there is always something of value. That could be as basic as building the foundational blueprints of a third-party management program or, of course, evolving it and expanding your reach, coverage, etc.

So what are the key factors in the assessment approach before we move on to the broader scope? A third-party maturity assessment needs to be repeatable. Whether you leverage the Prevalent maturity assessment, build your own, or use another technology or provider, it needs to be a repeatable process. It needs to be objective rather than subjective; in that sense, we need to be asking the same questions time over time to ascertain whether things are evolving or not. We of course need to get those honest responses, to reinforce that point. That is something I will hammer home multiple times over the course of today: honesty is a key factor. The number of times we've had to reject maturity assessments and politely guide people to adjust some of their answers based on reality: it happens far more often than we'd like.

We recommend having a consistent audience and scope of response. In this case it's about having the right audience. An interesting fact: quite often we do maturity assessments with two different tiers of resource in the organization. We can do a maturity assessment with the practitioners actively working on the project day to day, and we can do one with the stakeholders and the sponsors. And invariably, we tend to find the results actually vary.
Alastair Parr: The stakeholders and sponsors tend to think the program is performing better than it actually is, while the practitioner is obviously more aware of the various warts that emerge out of the program, and articulates as such. So it's worth noting that a consistent audience means you're going to capture a similar perspective. Whether you want to take the sponsor and stakeholder view or the practitioner view is of course your prerogative, but we'd recommend sticking to one of the two.

Next, a clear and binary approach for identifying weaknesses: again, we remove any interpretation here. When we ask any maturity-based questions, it should be a yes/no, a checkbox, or a definitive statement that they can select. It shouldn't be an interpretive question set. Once we've done that, we can then standardize what the scoring mechanism is, and we can of course prioritize accordingly based on the business appetite. We'll talk about how we can actually prioritize different criteria over the course of today.

So, the next view, which is the assessment scope. I've covered this briefly, but we're just going to go through each of these and highlight what we're looking for in each. Each of these pillars would normally have a rating between one and five, so that's the maturity rating of one to five, and then we take an overall program maturity from this. Now, a common mistake we see when people are building their maturity assessments is that they will weight these areas equally, or they'll weight the questions within those areas equally. That's a bit of a faux pas, and the reason is that you may actually want to focus on particular questions more than others; they may be more heavily weighted. So we recommend, if you are building your own or leveraging one, that you check the questions are weighted appropriately based on your vertical, your risk tolerance or your interpretation.

So again, just to reiterate: coverage. How comprehensive is the scope? Do we cover the entirety of our third-party estate? Do we extend to fourth parties or even fifth parties? Or do we only have our tier ones or priority vendors covered? Content: are we purely looking at information security? Are we looking just at business continuity and resilience? Are we capturing legal requirements? Are procurement involved in the conversation? Do we have privacy involved in the conversations? This broader gamut of requirements you normally wish to capture from a third party is what we view as content. So is it limited to a particular perspective? Do we amalgamate the different requirements of the business together, and then what do we actually do with it? The roles and responsibilities piece is: do we have individuals accountable for different activities in the platform? So do we have, for example, a practitioner responsible for capturing the data?
Alastair Parr: Do we have an analyst looking at the results? Do we have governance defined? From a governance standpoint, do we actually have stakeholders and steering committees set up with the business, and do they drive that remediation process? Do we have risk tolerance levels defined? How do we actually drive that level of remediation? And then how do we report all that back? So the metrics for governance, KPIs and reporting back; we'll touch on some of the common deficiencies and challenges as we go through.

And just to highlight, I didn't lie: I'm going to hammer this point home multiple times. While a maturity assessment is fun to win, it isn't a game. So we do recommend you be honest and fair to yourself, so that as you improve over the course of your program, that's demonstrated in the results. Okay, so we're going to go through some of the pillars now. And Joe, I wondered if you'd be happy to cover off coverage for us, please.

Joe Toley: Of course, thank you very much. So, some of the key considerations for the coverage pillar of maturity. Firstly, getting an understanding of how comprehensive your current scope of the program is. How many of your third parties do you know about, and how confident are you about that particular number? Performing an exercise to actually try and understand what information you have internally about a third party, or about your entire third-party scope, is usually valuable for determining how well you're assessing each of the potential entities that could be exposing risk to your organization.

We often find that the scope of third parties is usually reduced because of poor onboarding processes. So one thing we would highly recommend is investing some time into looking at your onboarding process to make sure that, if there are third parties the business units are engaging with, there is a fit-for-purpose process to inform the program owner that there's a new third party to onboard, so the relevant onboarding gates can be applied. Without that sufficient and mature onboarding process, there's always going to be a risk that there are holes in the program, and obviously third parties that could potentially go unmanaged.

As part of that onboarding process, there are a few things that should be incorporated to ensure that you are acting in the most mature way possible. One of those aspects is profiling. We commonly see that third parties are purely assessed using assessment content, but there's not a sufficient amount of time and investment in actually finding out what that third party does for the organization and logging it so you can keep a track record of it: making sure you're aware of key attributes that might influence the assessment process, like where they're operating from, what type of service they're providing, what the scale of the organization is, and whether they interact with personal data or any form of sensitive or intellectual property, for example.
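A minimal sketch of how the profiling attributes Joe lists might be captured at onboarding and turned into a criticality tier. The attribute names, point values and tier thresholds are hypothetical illustrations, not a standard the speakers prescribe.

```python
from dataclasses import dataclass

@dataclass
class ThirdPartyProfile:
    """Profiling attributes captured at onboarding (hypothetical field names)."""
    name: str
    service_type: str
    operating_regions: list
    handles_personal_data: bool
    handles_intellectual_property: bool
    revenue_impact_if_failed: str       # "low" | "medium" | "high"
    operational_impact_if_failed: str
    reputational_impact_if_failed: str

IMPACT_POINTS = {"low": 1, "medium": 2, "high": 3}

def tier(profile: ThirdPartyProfile) -> int:
    """Classify criticality: 1 = most critical, 3 = least. Thresholds are illustrative."""
    score = (
        IMPACT_POINTS[profile.revenue_impact_if_failed]
        + IMPACT_POINTS[profile.operational_impact_if_failed]
        + IMPACT_POINTS[profile.reputational_impact_if_failed]
        + (2 if profile.handles_personal_data else 0)
        + (1 if profile.handles_intellectual_property else 0)
    )
    if score >= 9:
        return 1
    if score >= 6:
        return 2
    return 3

vendor = ThirdPartyProfile(
    name="Example Hosting Ltd", service_type="data centre",
    operating_regions=["EU"], handles_personal_data=True,
    handles_intellectual_property=False,
    revenue_impact_if_failed="high",
    operational_impact_if_failed="high",
    reputational_impact_if_failed="medium",
)
print(vendor.name, "-> tier", tier(vendor))
```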
Joe Toley: So some of those key aspects around how third parties are delivering a service can be hugely beneficial, not just from a reporting standpoint but also when it comes to maintenance and making sure you're assessing them in the right way. There can be a huge amount of context gained when it comes to the remediation approach because you know that type of information up front. And without trying to get this information up front as part of that onboarding process, it can be a real struggle to pick up that type of data later on.

As part of onboarding a third party, we're often pushing clients towards rolling out a tiering-based approach. This is a way of classifying the criticality of a third party based on some of those key attributes that we just discussed. What we have found is that clients might be tiering third parties based on the type of data they hold. Of course that is one aspect, but what we recommend is looking at a few different areas to understand what the real impact of that third party would be on your organization if they were to fail in the performance or delivery of the service. So, being able to understand some key attributes in order to measure how critical a third party is to your business. Some of those attributes might be data, but it could also be that they expose you to some reputational damage, some financial damage, or even operational damage, which could of course be very significant and is something that's often overlooked.

We often find, and this is probably one of the key areas where third-party programs fall down when it comes to onboarding processes, that the extended third parties, fourth parties, or, as we reference here, Nth parties, are not looked at within the program. These are third parties of your third parties. They're the ones that might be supporting a third-party service that they provide to you; it could be a data center, for example. What we're commonly finding is that if you begin to map out some of these fourth parties that might be providing a service to your third parties, you can in fact find some commonality there. There could be multiple third parties you utilize, which are really critical to you, all being stood up by the same fourth-party provider. That of course exposes another area of potential risk and a potential bottleneck in the event of a failure from that fourth party.

So far we've looked at identifying third parties, making sure we capture the right information, classifying them correctly, and of course finding out any of that additional information around the extended third-party scope. What we haven't touched on yet is third-party maintenance. This is a real big one to hammer home; there are not a lot of clients that we see do this. It's making sure there is a process in place to revisit each third party on a regular basis. And of course, this could be something you determine based on the tiering or classification of a third party to you. But the process should be there to make sure all of the above information, those top four bullet points there, is reviewed. You have a process in place to make sure contact information is the same.
Joe Toley: The type of service hasn't changed. Their criticality to the business hasn't changed. And some of those key attributes that might influence your assessment process have not changed either. And if they have, you're able to make that change to how they're assessed, or how they're reviewed and remediated internally.

Alastair Parr: Great, thank you.

Joe Toley: No problem.

Alastair Parr: So, some common observations we normally see specifically around coverage. A vast percentage here, 79% of the organizations we deal with, do not consider fourth-party risk; they'll stop at that third-party space. Now, that for us is perfectly understandable, because we appreciate that it's very rare for us to actually engage with an organization who has done the necessary due diligence and requirements across their third-party estate. So naturally, why would they have evolved to their fourth parties when they haven't had the bandwidth? And usually, at the point where they have actually assessed everybody and captured the information they need, it's nearly time to start again and repeat that process.

There is a commonality, and Joe did mention this: out of the people that have actually considered fourth-party risk, we have quite often seen concentration risk, where the same fourth party is being used by multiple third parties. Now, I appreciate in some cases this might be an acceptable tolerance, particularly when you're dealing with cloud providers like AWS and Azure, but equally there may be a reliance across some of your critical providers that would have a detrimental impact on the business downstream. So we do recommend, at the very least, that you look at some of your most critical providers, the ones that enable you to generate revenue in whatever it is you may be doing, and just investigate whether there's any concentration risk to consider. Now, that concentration risk may actually be a positive for your third-party estate: when you have different parts of the business using the same vendor and you haven't necessarily consolidated your contracts into a master services agreement, there's an opportunity to, funnily enough, save money and get money back from your third-party program rather than it being seen as an insurance expense. But generally, when looking at concentration risk for fourth parties, we do recommend looking at at least a subset of your tier ones. Joe, would you like to educate us a bit on content, if you don't mind?

Joe Toley: Sure, thank you. One of the common pitfalls we're finding through working with our clients is the lack of a standardized assessment framework when building out an approach to assessing third parties. Commonly what we're finding is there is a survey of some form; it's being sent out to some third parties and capturing data. What clients aren't doing, and certainly what we're pushing them towards doing now to mature this particular area, is building out a playbook of how to assess each third party based on some of the information we learned from the onboarding process.
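Returning briefly to the concentration-risk point Alastair makes above, here is a minimal sketch of that check, assuming each critical third party has declared the fourth parties supporting its service (all names illustrative).

```python
from collections import defaultdict

# Declared fourth parties per critical (tier 1) third party -- illustrative data only.
fourth_parties_by_third_party = {
    "Payments Provider A": ["CloudHost X", "Print Bureau Y"],
    "Claims Processor B": ["CloudHost X"],
    "Analytics Vendor C": ["CloudHost X", "Data Centre Z"],
}

def concentration_risks(mapping, threshold=2):
    """Return fourth parties relied on by `threshold` or more third parties."""
    usage = defaultdict(list)
    for third_party, fourth_parties in mapping.items():
        for fp in set(fourth_parties):
            usage[fp].append(third_party)
    return {fp: users for fp, users in usage.items() if len(users) >= threshold}

for fp, users in concentration_risks(fourth_parties_by_third_party).items():
    print(f"Concentration risk: {fp} supports {len(users)} critical third parties: {', '.join(sorted(users))}")
```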
Joe Toley: So if, for example, we know they handle sensitive data, or we understand that they're a tier-one third party, maybe we can standardize what happens next. If they're a tier one, perhaps there's a requirement for an on-site visit. Maybe we need to see evidence of penetration tests that have taken place. Maybe there's a lower threshold at which we're willing to accept risks. Building out that if-this-then-that type of standardized approach to identifying what a survey or an assessment requirement is will certainly help standardize that process. As I said, it's something we haven't seen incorporated too much to date, but there's certainly some room for defining that sort of logic, provided you have the profiling and onboarding information up front.

Another pitfall we're seeing with third-party programs is the lack of a developed and mature process for managing survey or assessment content. Quite commonly, an assessment or a question set is in place, maybe there have been changes of roles, and throughout that the questionnaire itself hasn't really migrated or been enhanced or kept up to date. As I said, that's usually because there's some historic reason for why it was invented in the first place; it covered all the bases, so there was never a need to review it. We would certainly recommend putting a particular representative in charge of, or accountable for, managing that type of questionnaire content, to ensure it covers the right bases from a regulatory standpoint and a coverage standpoint, based on some of those profiling attributes that we understand.

There are a few techniques as well that we are recommending clients introduce into survey content as it's rolled out. We often see free-text fields within questionnaire content, and yes/no/partial type responses. All of these capture information, but they leave a lot of room for error from an accuracy and detail standpoint. Asking whether, for example, a third party has an information security policy and then having a yes/no/partial type response: the yes could mean many things. Yes, they have one, but what is the quality of that content like? And the partial leaves a lot of ambiguity around the quality, or the level of maturity, of that particular information security policy we're asking about. So what we certainly push clients towards doing now is building out a set of requirements aligned to each of those types of questions, to represent what we mean or what we feel is acceptable or considered good. So rather than asking "Do you have an information security policy? Yes, no, partial," perhaps we could ask which of the following aspects are incorporated in your information security policy, and then we can detail out each of the things that we define as our standard for what's acceptable when looking at that type of information.
Joe Toley: Are we looking for them to have particular content inside the information security policy? Are we looking to understand that it has an owner, that it's communicated, and of course that it's reviewed as well? Being more descriptive about what we actually want from these questions leaves a lot less room for error when it comes to the accuracy of the data we're collecting and the detail we get back within the questionnaire content.

There are a few further techniques we can introduce: making sure that our questions have sufficient guidance text associated with them, making sure that we explain any terminology, making sure that we explain any acronyms that might be present within the question set, and of course making sure there's a sufficient route to go down should they have any problems with responding to particular areas of the questions. Enhancing that questionnaire quality will do a few things to help the end-to-end process for managing your program. One is that it usually takes a third party less time to respond to question sets; they're obviously going to find it easier. You're going to get the data back in better shape: better quality and more accurate. And of course, we're saving time here. The more we can capture up front that's accurate and to the level of detail we need, the fewer times we have to engage with the third party afterwards to follow up on areas where there may be gray areas in how they've responded, blank answers or missed questions. So having that mature process in place around questionnaire quality will save on resource and make the process more efficient later on down the line as well.

The last point there is around scoring and risk register output. It's all well and good having a good-quality question set, but are we able to define some logic for identifying and automatically presenting some of these risk items to us off the back of it? Are there particular questions within your question set that are more important than others? Have you put in the measures to grade or apply some form of criticality to each of those weightings as well? What we would recommend introducing here is an impact-times-likelihood approach: something where we're weighting the questions we're asking and looking at the level of maturity of the responses we get back. Using those in combination, as you can see on the right-hand side of the slide here, we're able to start grading some of these questions and responses automatically. We then see all of the more critical findings further up the list of a more itemized risk register, and it helps us prioritize the review process, which of course improves on more of those efficiency tasks as well.

Alastair Parr: Thank you, Joe. So, we've had a few questions come in while we were talking about this.
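A minimal sketch of the impact-times-likelihood grading Joe describes, producing a prioritized, itemized risk register. The findings, scores and thresholds are illustrative assumptions.

```python
# Simple impact x likelihood grading, kept deliberately simple so it scales.
def grade(impact: int, likelihood: int) -> str:
    """impact and likelihood are 1-5; grading thresholds are illustrative."""
    score = impact * likelihood
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

findings = [
    {"third_party": "Vendor A", "finding": "No documented incident response plan", "impact": 4, "likelihood": 4},
    {"third_party": "Vendor B", "finding": "Policy review overdue", "impact": 2, "likelihood": 3},
    {"third_party": "Vendor C", "finding": "No MFA on admin access", "impact": 5, "likelihood": 4},
]

# Build a prioritized risk register: most critical findings first.
register = sorted(findings, key=lambda f: f["impact"] * f["likelihood"], reverse=True)
for f in register:
    print(f"[{grade(f['impact'], f['likelihood']).upper():6}] {f['third_party']}: {f['finding']}")
```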
Alastair Parr: So I'll just address these briefly, and of course we do have the Q&A section at the very end. We've had a couple of questions about the fourth-party and concentration risks from the previous slides, asking how we actually track that. An assessment is definitely always the preferential route. The reason is that if you use a technology to passively scan and identify the technologies they use from a fourth-party standpoint, it's going to be a limited list. You're going to get insight into the fact they use these SaaS technologies and O365, and who doesn't use O365 or something similar these days? It's a somewhat useful list, but what really adds value is when you actually get the context from them directly. You really need to ask the vendors, usually via a supplementary assessment or something similar, what the fourth parties in scope for your contract or the services they're providing are. Again, you don't want the list of everything they do, because most organizations will have multiple service lines; you want to pinpoint and target the ones that are actually supporting your workflow. Some of that does feed into the privacy elements as well: data protection, data transfers, and understanding the flow of information as it moves around. And tied to that, as I mentioned, usually assessments.

We've had a couple of questions come in as well about how we actually get the third parties and vendors to engage with assessments as we progress, because I appreciate a lot of people can get resistance from them. Quite often the resistance we get back is, "We've answered this a thousand times in the last year for every other customer." I understand: it is a repeated process, and it's probably only changed five to ten percent between the different organizations. So the standard approach that we take on this is: let's try and make this as painless as possible for the suppliers and the vendors. I appreciate we're the customers and they're providing services for us, and you're usually going to get a good response when it comes to contract negotiations or at the procurement stage, because they naturally want your contract and they want your money; but past that stage, they become very reluctant. So, the first stage is making sure that you have contractual obligations for them to actually conduct this assessment in a manner that you're comfortable with. That's by far the stick in the situation. The carrot is to use something that's going to make life more straightforward for them. Don't give them a spreadsheet where they've got to go through and interpret your language and logic; ask them in a manner that is more palatable to them. The way that we approach that personally is that we will use a web interface. We'll use multiple-choice or checkbox-based questions rather than yes/no. So we remove the ambiguity and that mind-numbing process of going through a spreadsheet and going yes, no, yes, no, no, yes, no as you go through 300 rows. And I've done my fair share of those.
Alastair Parr: And make it less of a meditative process for them; make it interactive. Wherever possible, give them information back. Don't just be a black hole where you take that data, tick a box internally and say, "Brilliant, we got their information." Try and make it interactive. Give them results. Give them feedback. Highlight, from a compliance standpoint, if you're able to map against it, what they're doing well and what they're doing badly. If it's approached that way, then they're seeing some value against it. I appreciate they'll be doing similar activities internally anyway, but if it's painless, or as painless as it can be, and you're giving feedback and something that's actually useful to them in return, then you're more likely to get some interaction from them. So that carrot-and-stick approach of contracts, plus making the process seamless, certainly helps.

So what are the observations we normally see from a content standpoint? The most common issue we see: 52% of the organizations we've conducted this maturity assessment against (and just to reiterate, we've done it with Goliaths of industry down to small organizations, across multiple verticals) did not have a standard way to present risk data. Now, this may sound surprising, because most people would think, well, they have a risk register of some description, a spreadsheet or whatever it may be. But a centralized risk register is more than just an information security risk register; it's something that's universal across the organization, a similar model or process that's followed across, say, privacy, business continuity and so on. So it's looking at the gamut of information you're capturing from the vendor and whether we can consistently standardize it in some shape or format. The value in that is that you're all speaking the same language, you're interacting with a vendor in the same way, and that third party is generally going to find that a more palatable approach. So we do strongly recommend, when we speak to customers who have this issue, that we work on building a centralized risk register, consolidating data wherever we can and making sure that it is consistent.

So, moving on to our next sphere: roles and responsibilities. Joe, if you would be so kind.

Joe Toley: Sure. So now we move on to the nuts and bolts, the operational aspect of actually making the program work. We need to align the right resource, the right representatives and of course the right skill set to the task at hand. The first thing to look at here is whether you have a dedicated team responsible for managing the program. I appreciate that if you do not have many third parties, the slice of resource that you're going to need to cater for the program is considerably less. But we do actually find that even organizations with a larger scale of third parties do not have dedicated resource for managing the program. Because of that, there has to be compromise.
Joe Toley: It means you're either reviewing fewer risks, you're assessing fewer third parties, or you're not able to perform some of the more maturity-based tasks, like managing content properly or being able to report the right information up the chain. So ensuring that you have enough resource for the task in hand is obviously quite important. There are a few ways of doing this. We have tools internally that allow us to look at your program goals, how many third parties you would like to assess, and how much time is spent on each of the individual program tasks, and that allows you to forecast exactly what type of resource you're going to need internally and how many hours you have to dedicate to the program on a monthly or yearly basis to meet your program goals. If you don't have enough resource, you can quite easily forecast that there's going to be a problem. At that point, you can weigh up whether you need to invest in more resource for the program, or you have to compromise and reduce some of the scope of your program goals to make them realistic.

We're also commonly finding that not every program has a dedicated owner associated with it. So, ensuring you have someone accountable for making some of those decisions, around defining content, what the scope of third parties is, and what the program goals look like, is of course important and helps the program function and operate.

Surprisingly, operational manuals are a bit of a rarity among the organizations we've assessed so far. This operational manual is a guidebook on how the different aspects of the workflow for managing a program day to day should operate. It includes the process flows for onboarding, the communication approach, the assessment content guidelines, how you would go about reviewing and remediating risk items, all the way through to the types of reporting templates required to support each of those operational workflows. There should be a clear guideline on exactly who is performing each of those tasks and what the level of involvement is in each of the activities. So we recommend using some form of RACI model there to support each of those workflows, so we know where the lines and boundaries are in terms of who's doing what. Of course, there should be an owner dedicated to that type of manual as well: someone to make sure it's up to date.

And touching on the resource point that I referred to during the content pillar overview, the more we define a robust process here, the better a gauge we get on how many third parties we can assess each month or across the course of the year. Standardizing this process, and making sure people are on the right task at the right time, really allows us to build a benchmark on what can be accomplished. Making sure we standardize this as much as possible, communicate it out and train people in the right way will of course make sure that we can start to get some good metrics on how the program is working. As I just touched on there, the operations manual should serve as a real key training point for the rest of the program.
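Before moving on: the capacity forecasting Joe mentions above can be sketched very simply. The task times, assessment goal and analyst hours below are illustrative assumptions, not figures from the webinar or from any particular tool. Comparing hours needed against hours available is what flags a shortfall early enough to either add resource or trim the program goal.

```python
# Illustrative hours per third-party assessment, by activity.
HOURS_PER_ASSESSMENT = {
    "onboarding_and_profiling": 1.0,
    "questionnaire_chasing_and_qa": 2.5,
    "risk_review": 3.0,
    "remediation_follow_up": 2.0,
    "reporting": 0.5,
}

def forecast(assessments_per_year: int, analyst_hours_per_month: float):
    """Compare the hours the program goal demands with the hours available."""
    hours_needed = assessments_per_year * sum(HOURS_PER_ASSESSMENT.values())
    hours_available = analyst_hours_per_month * 12
    shortfall = max(0.0, hours_needed - hours_available)
    return hours_needed, hours_available, shortfall

needed, available, shortfall = forecast(assessments_per_year=400, analyst_hours_per_month=240)
print(f"Needed: {needed:.0f}h  Available: {available:.0f}h  Shortfall: {shortfall:.0f}h")
if shortfall:
    print("Either add resource or reduce the assessment goal to keep the plan realistic.")
```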
Joe Toley: If there is a new starter in whatever role within the program, this operations manual would serve as the guidebook for how to act, how they should engage with third parties, when and how they should communicate with them, and of course, as we touched on, where the boundaries of their roles and responsibilities lie within the program too.

We are also often finding that some of the incorrect roles might be aligned to programs. What we recommend is aligning the operational tasks of a program to the more analyst-based roles within the team, while anyone touching on remediation activities that might require subject matter expertise is aligned to those more subject-matter-expert roles within the program, like risk reviews or remediation. If you're able to segregate who's doing what, you're of course aligning the right resource to the right tasks, and it also becomes a good way of standardizing your approach. It allows you to build up some metrics of what the team's capacity is aligned to, and of course allows you to start resource forecasting for the future. So getting that understanding of how many risks can be reviewed by a subject matter expert per month, and how many hours they're dedicating to the program, all builds up that profile of maturity from a roles and responsibilities standpoint.

Alastair Parr: Fantastic, Joe. Thank you. That was lovely. So, moving on to some of the common observations. Before I highlight these: we've actually had a question come in about equating risks to financial costs, so risk quantification. Touching on the centralized risk register point we made earlier on: risk quantification is certainly useful, but the main challenge we see when it comes to third-party programs is that establishing an accurate or meaningful financial calculation that you can hang your proverbial hat on means you need to have the right level of context. You need to be able to populate enough information about yourselves internally as an organization: where you are in your vertical, what your size is, what your function is, how you are regulated, etc. Now, that's reasonable and understandable for you to manage internally, but the amount of context you need from the third party is substantial. So, as much as we are proponents of risk quantification, and we are, the reality is that, by and large, most organizations we speak to are not in a position to populate the right amount of context against a third party to determine what it is that they're doing, and therefore what the associated potential financial impact of their activities would be, be it tied to revenue-generating activities or otherwise.
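For illustration of the kind of calculation being referred to, here is a minimal Monte Carlo style sketch of expected annual loss for a single third party. Every input here is an assumption; the point being made in the discussion is precisely that most organizations cannot yet populate these inputs with confidence.

```python
import random

random.seed(7)  # reproducible illustration

# Purely illustrative inputs for one third party: the "context" values that,
# as discussed above, most organizations cannot yet populate reliably.
annual_incident_probability = 0.10          # chance of a disruptive incident per year
loss_low, loss_most_likely, loss_high = 50_000, 250_000, 2_000_000  # loss range (USD)

def simulate_annual_loss(trials=100_000):
    """Tiny Monte Carlo estimate of expected annual loss from one third party."""
    total = 0.0
    for _ in range(trials):
        if random.random() < annual_incident_probability:
            total += random.triangular(loss_low, loss_high, loss_most_likely)
    return total / trials

print(f"Expected annual loss: ${simulate_annual_loss():,.0f}")
```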
Alastair Parr: So organizations at the upper echelons of the maturity scale certainly can start looking at risk quantification, and I think that's a valid progression; it's certainly a talking point for governance, when we come on to that and talk about stakeholders of the program and speaking their proverbial language. But the reality is the vast majority of organizations we speak to would not be in a position just yet to accurately calculate financial costs, regardless of the Monte Carlo calculations they may use.

So, tying back to some of the observations we're actually seeing against roles and responsibilities, we've got a couple of metrics for you this time. 62% lack a standardized process. This ties back to the operations manual point that Joe was highlighting earlier on. A standardized process, for us, should be documented, it should have a RACI, and you should have roles defined and the various activities, and who's consulted, etc., mapped out. But a reasonable percentage of people are, so to speak, winging it. They're simply bringing new people on board; they'll evolve as they go along; they'll do their best; people will leave, processes will change, etc. The challenge with that is you haven't got a consistent metric year on year in order to compare results. So we do strongly recommend that the process is standardized, if for nothing else than personnel churn.

52% had resource planning shortfalls. Quite often this is people being overly optimistic about what they're going to be able to achieve. They'll start out the year and say, "Right, we've got X number of personnel. We're going to conduct 4,000 assessments of third parties this year. That's brilliant." But what they're not necessarily considering is all the intricate pieces that contribute to that. Who do they need to involve across the business? Are there subject matter experts they need to involve? Are there business owners who own the relationships with the third party, and so on? These are all time-consuming efforts and in reality take more time than you'd expect, even for a subset. So quite often there are resource planning shortfalls where they're not able to get through the volume. And that's not purely down to the vendor and third-party interactions and the delays we mentioned earlier in getting them to respond, but also those various internal contributors which slow down the process.

59% overspend on TPRM resources. So even though they have resource planning shortfalls, in the sense that they haven't got the right amount of capacity, they are in some cases overspending on resource. What we mean by this is that you may have somebody who is higher skilled and a higher cost to the business working on activities that they wouldn't necessarily need to do; a lower-skilled resource would be able to accommodate it.
Alastair Parr: So if you're putting your infosec team or your CISO or your senior roles in the business out there and they're sitting there trawling through results, as opposed to looking at the output of results, risks and issues, that's not a good use of time. A reasonable percentage of organizations are using senior-tier resources on activities which could either be automated or delegated down to an individual who may be at a cheaper cost center for the business, which we found to be quite interesting. So, moving on to remediation. Joe?

Joe Toley: Thanks. So now that we have defined our content and aligned our resources, we start to get data back into the team for review. The first thing we need to look at is the process around reviewing submissions that come back from third parties. What we're finding is there's not typically a standardized process for reviewing these responses after they're submitted. So what qualifying steps take place to make sure that data is accurate and ready to progress on to a review or remediation phase? Certainly, some of the things that we would recommend introducing are, firstly, a basic review of the quality and completeness of the set of responses that comes back. This would typically be ensuring that the right questions have been responded to, that you've got responses aligned to every single question you've asked, that any notes provided to support the questions are fit for purpose and relevant, and that any attachment files or evidence uploads meet the required guidance as well. Having that standardized for your questionnaire content ensures that's going to be a repeatable process for the team to manage.

Once you've done that aspect of the approach, and you've performed any rejection and repeat processes to get the information to the level you need it to be, you can then move on to looking at the findings from more of a risk standpoint. So, looking at the types of responses you got back, using some of the question scoring logic to define how you should start to prioritize risks for review, reviewing them to work out exactly how they should be handled internally, and of course building out a remediation approach that's standardized as well. One thing we are pushing clients towards doing is not just asking questions, getting the responses back and then leaving those actions down to interpretation, but actually defining, again, some form of standardized logic on what to do if you get a particular type of response. Referring back to my example earlier on: if they said no, they didn't have an information security policy, maybe you can already define up front what the next step in that review and remediation approach should be. What would the guidance be? When would you expect it by, and what sort of quality would you expect as well? Having that type of process defined means that, if we refer back to the previous roles and responsibilities slide, we can start aligning some of these first-pass remediation and review activities to the more operational and analyst-based roles within the business.
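A minimal sketch of the two steps Joe describes: a completeness and quality gate on the returned submission, then a standardized mapping from a specific response to a pre-defined remediation next step. Question IDs, rules, due dates and guidance text are hypothetical.

```python
# Step 1: completeness/quality gate on a returned submission (illustrative rules).
def completeness_issues(questions: list, responses: dict) -> list:
    issues = []
    for q in questions:
        answer = responses.get(q["id"])
        if answer is None:
            issues.append(f"{q['id']}: no response")
        elif q.get("evidence_required") and not answer.get("attachment"):
            issues.append(f"{q['id']}: evidence attachment missing")
    return issues

# Step 2: standardized "if this response, then this remediation step" mapping.
REMEDIATION_PLAYBOOK = {
    ("ISP-01", "no"): {
        "action": "Request an information security policy covering ownership, review cycle and communication",
        "due_days": 90,
    },
}

questions = [{"id": "ISP-01", "evidence_required": True}]
responses = {"ISP-01": {"answer": "no", "attachment": None}}

print(completeness_issues(questions, responses))
answer = responses["ISP-01"]["answer"]
print(REMEDIATION_PLAYBOOK.get(("ISP-01", answer), {"action": "No standard action; route to SME review"}))
```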
Joe Toley: We don't have to have subject matter experts look at all of these findings if we can align some basic logic and a standardized process to give a first pass through that review of the risks. Lastly, when we look at the remediation resource requirements, this ties back into roles and responsibilities slightly, but it's about making sure that we build up some accurate metrics on how long it takes to review the types of risks we're commonly finding, and what the review process looks like end to end, so that we can start forecasting and capacity planning for further remediation efforts. And as I touched on earlier, although we're building out this if-this-then-that type of logic for the questions we're asking, having that type of process defined for remediation is of course very important, not just for the remediation aspect with third parties but also for that submission quality and completeness review approach. Make sure each of these workflows features within the operations manual, has the right steps and gates in place for how to progress through the workflow end to end, and has the correct alignment of resources, supported by a RACI model, to ensure it's carried out in the most mature and repeatable manner.

Alastair Parr: Great, thank you, Joe. So, some of the most common observations we see in the remediation space. 86%, a very significant amount here, had inconsistent remediation guidelines. This is how they're going to respond and react to results as they come in. What I mean here is that they're not actually standardizing it: you may have different parts of the business analyzing different data in different ways, and different roles analyzing the data in different ways. That's fine, as long as it's consistent enough that we can hang our hat on it, review it year on year and make sure it stays consistent; then we will be good.

59% had incomplete risk scoring mechanisms. However you calculate this, whether it's likelihood by impact or using other models, risk itself needs to be mapped in some shape or form. We need to understand what the impact is based on what that vendor or third party is actually doing; it's not going to be universal across the entire estate. And then we determine, based on whatever information we can collect, whether it's monitoring tools, risk assessments, etc., how likely that is to happen. So a surprising percentage don't actually have a normalized and standardized risk scoring mechanism, such as a likelihood-and-impact or similar model.

And what we normally find when it comes to risk remediation is that, while we all strive to build something that can withstand the test of time and looks fantastic, it doesn't really need to be your magnum opus. The most effective programs we've seen from a remediation standpoint are simple, and therefore able to scale. The more complex it is, the more difficult it's going to be to scale, and the more it's going to fall apart as you try to do so. So let's cover off this information in a bit more detail.
Alastair Parr: So we do recommend that when we go through remediation, we make sure we are tracking and establishing recommendations in line with what the business expects to see. From a governance standpoint, what we are looking for is being able to articulate things to the right resources in the business effectively. From an individual reporting standpoint, how are we going to report back on the various roles and responsibilities and the progress, the KPIs, of the day-to-day activities? What I mean by that is: are we doing assessments? If so, how many assessments have we actually distributed? Are we on track to hit our targets? Are we hitting the right amount of the organization and the third parties we deal with on a day-to-day basis, or is it a small subset? Are we reporting back statistics on that small subset, which may be misleading compared to the broader picture? An example would be: I'm assessing 10% of the business's third-party estate, and then I'm reporting back that we have a high level of risk across our third-party estate. Well, if that's only against 10%, we need to articulate that as part of our governance processes.

We need to escalate the right KPIs back to the program sponsors and stakeholders from a program reporting standpoint. I obviously spoke about some of the scope there and the breadth of it, but again, we need to be very succinct in highlighting what we're covering. Is it just information security? Is it business continuity? How often are we conducting these assessments, etc.?

Now, this can feed into a risk committee and steering group. A risk committee here is usually a cross-section of different roles in the business. I've mentioned that some of the more comprehensive third-party programs incorporate business resilience, continuity, IT, information security, privacy, legal and procurement. All of these have a vested interest in managing these third parties, so they should be involved in some shape or form in a risk committee or steering group. We normally recommend quarterly reviews, which could also take in the results of a maturity assessment. The maturity reviews feed into that because they give a broader insight into the program: what are we doing well, what are we not doing well, and where can we improve? And of course, audit is a factor in that. For the organizations that have an audit team, they'll want to see how you've matured against your maturity assessment and what you're doing, and collect demonstrative evidence against that. So those steering groups, risk committee minutes and sessions, and the reporting on the KPIs of the program all feed into that.

Okay. So, one of the observations we tend to see on governance is that 69% were missing strategic reporting opportunities. This is where they're actually capturing information on third parties but not necessarily reporting it back to the business.
Alistair Parr: Now, reporting it back certainly helps, because you’re able to allocate more budget, justify spend, justify resource, and actually fix the issues that we’re finding. A large percentage aren’t able to consolidate the information they’ve got into a cohesive summary they can present. The good question earlier about risk quantification, turning risk into a financial number, can certainly help here, because you’re talking the same language: you can say that if we don’t fix this, it’s going to cost us X dollars, and that has meaning and is justifiable, rather than saying this has a risk score of 20 to the organization, where most senior execs may just shrug their shoulders. If you tell them they have a red risk, they’ll probably be more concerned. That’s not to undermine their capabilities, of course; it’s just that the exposure needs to be framed in a way that’s commensurate with what they’re dealing with in the business on a day-to-day basis.
Alistair Parr: So, what we normally recommend here is that risk quantification is one approach. Alternatively, you can start saying: we cover X% of the organization, this is what we know, this is what we don’t know, and this is what could happen, as everyone sees on the news with the various breaches and third-party events. You’ve got to frame it as a known-versus-unknown question: we know this much, these are the problems we know about, and this is what we don’t know. Are you going to accept that, or are we going to get more budget in order to go and find out what we don’t know?
Alistair Parr: 59% struggle to get an overview of third-party risk. This is a case where they don’t actually have much insight into what’s happening. They look at silos of third parties, which could be a small subset, or just information security or privacy, and so on, and they’re not building a comprehensive, 360-degree view of what’s happening. That view should apply at the third-party level, rolling up into categories of third parties, rolling up further into tiers and business functions, and then rolling up into your overall posture. I appreciate that sounds overwhelming for people who don’t have any of that, but if you’re consistent, and you have clear and simple metrics for calculating what a third party is and what their risk level is, and a nice, simple, scalable mechanism, then it’s more straightforward than it may seem.
Alistair Parr: Now, we’ll come on to some of these questions in just a moment, but a few closing points before we go to our Q&A. When it comes to the prioritization of improvements, what we’re really looking to do is identify the weaknesses on the back of a maturity assessment, associate the impact with the relevant parts of the business, and then prioritize which weaknesses need to be resolved first.
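To illustrate the roll-up Alistair describes just above (third party, up to category, up to tier or business function, up to overall posture), here is a minimal sketch. It assumes every third party already carries a consistent, normalized score; the vendor names, groupings, and the use of a simple average are illustrative assumptions.

```python
# Minimal sketch of rolling third-party risk up into categories, tiers, and an
# overall posture. Assumes each third party already has a consistent,
# normalized score (here 1-25); names and the averaging choice are illustrative.

from collections import defaultdict
from statistics import mean

third_parties = [
    {"name": "VendorA", "category": "SaaS",      "tier": "critical", "score": 20},
    {"name": "VendorB", "category": "SaaS",      "tier": "standard", "score": 8},
    {"name": "VendorC", "category": "Logistics", "tier": "critical", "score": 15},
]

def roll_up(items, key):
    """Group third parties by the given attribute and average their scores."""
    grouped = defaultdict(list)
    for tp in items:
        grouped[tp[key]].append(tp["score"])
    return {k: mean(v) for k, v in grouped.items()}

by_category = roll_up(third_parties, "category")  # e.g. {'SaaS': 14, 'Logistics': 15}
by_tier = roll_up(third_parties, "tier")          # e.g. {'critical': 17.5, 'standard': 8}
overall_posture = mean(tp["score"] for tp in third_parties)  # single headline figure
print(by_category, by_tier, overall_posture)
```

The same consistent per-third-party score feeds every level of the roll-up, which is what keeps the mechanism simple and scalable.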
Alistair Parr: So, what are the foundation layers we need to focus on that are going to improve risk across our third-party program? And, of course, a contributing factor here should be the level of effort to implement. If we prioritize the top few and then find out they take 12 months to implement, yet we have something at number four that takes one month, then naturally you’re going to want to approach those low-hanging-fruit areas first. Once we know what they are, we can look at planning and ownership. We can start establishing realistic time frames and expectations, set our maturity score objectives for what we actually want to achieve, and then define improvement tasks for the various roles and allocate ownership of those. Now, we don’t need to do one component at a time. We quite often set out an objective per area: for each of those pillars, we define a simple objective to achieve per quarter, work through them that way, and then break them down into phases where necessary. As part of our quarterly maturity assessment reviews and sessions, we would normally look at the results and see how we’re progressing against those objectives.
Alistair Parr: So, finally, I appreciate it’s not easy getting everybody in harmony when it comes to third-party management, because there are so many players involved in the process. But if we keep it clean, simple, and straightforward, it can be cohesive and we can get some semblance of a program running.
Alistair Parr: So, for the last five or six minutes, we’ve got a few questions coming in. Please feel free to send over any additional questions that you’ve got; now is the time. As we go through these Q&A points, we do have a couple of poll-based questions that we’d appreciate you answering. Firstly, if you’re looking to augment or establish your third-party risk program in the next several months, please do let us know.
Alistair Parr: In the meantime, we have a question: as a business associate of a covered entity healthcare provider, is there justification to exclude the covered entity from completing a third-party risk assessment? Normally we wouldn’t have a justification for excluding it; there would need to be a justification, of course, but we would normally recommend including those. Now, it could be a limited assessment. It doesn’t need to be taken to the nth degree, and of course you may have a risk tolerance that reflects that. We would recommend at least addressing it partially from a due diligence standpoint, so having a short-form supplementary assessment that addresses the need. Good question.
Alistair Parr: And we’ve got another question here: if an organization has a recent SOC report with high marks, can this substitute for a third-party risk assessment? That’s a very good question, and one that comes up quite often.
Alistair Parr: So, quite often when we send out an assessment to organizations, we’ll get back a SOC 2 report or an ISO 27001 statement of applicability. These are great, but someone needs to sit there and actually interpret them in some shape or form. A lot of the organizations that send these won’t even address your own assessment; they’ll just pass that across and say these are our standard terms, accept it, and I understand that. So we can treat those as acceptable, assuming you have the resource internally to process them. We try to limit the amount we accept in that form to a few percent, say under 5%, just because of the effort involved in interpreting them, and that’s actually quite realistic: it’s a small subset that will push back and not map into your assessment. But there are additional questions we have to ask based on that data. For a SOC 2 report or a 27001 assessment, you want to understand the scope and the statement of applicability: does it cover all the services they provide, and have they got defined and addressable owners? You’d want to dive into the scope and qualify the information you get on the back of it. That’s a very good question.
Alistair Parr: And we’ve got a question here that I’ll ask Joe to address: if I don’t necessarily have all the resources in house, as a small organization, to be able to address third parties, should my operations manual just be focused on myself?
Joe Tolley: Yeah. If there are resource shortfalls, I would recommend building as much efficiency into the program as possible so that you can start to accommodate more of the scope of third parties you might need to cover. As we’ve discussed, there are several ways of improving that efficiency, whether it’s improving the quality and the user experience of actually completing an assessment, or whether you’ve got the ability to roll out some form of automated platform to automate as many of these mechanisms as possible. That allows you to get an easier grasp of the program and, of course, expand its scope in as short a time as possible.
Alistair Parr: Great, thank you, Joe. I’ve got another question here about program pillars: do they have varying weights when you look at a third-party risk program, or are they treated equally? A good question. No, we would not necessarily treat them equally, or at least not the contributing elements that make up each pillar. A good example: your risk remediation program is certainly valid and important, but if your scope is so small that you’re only dealing with 5% of your third-party estate, the fact that you have a fantastic risk remediation program doesn’t really stack up. It’s going to be heavily weighted by your scope, your coverage, and so on. So when we actually build these maturity programs, we do weight the questions that contribute to those pillars differently.
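As a rough sketch of how such weighting might be composed, with scope and coverage carrying more weight than the other pillars, the following is a minimal example. The pillar names, weights, maturity scale, and scores are illustrative assumptions rather than the speakers' actual weighting model.

```python
# Minimal sketch of a weighted pillar maturity score. Pillar names, weights,
# and the 1-5 maturity scale are illustrative assumptions; the point is that
# scope/coverage can carry more weight than the other pillars.

pillar_weights = {
    "scope_and_coverage": 0.30,
    "risk_scoring": 0.20,
    "remediation": 0.20,
    "governance": 0.15,
    "operations": 0.15,
}

pillar_scores = {  # maturity on a 1-5 scale per pillar
    "scope_and_coverage": 2,
    "risk_scoring": 4,
    "remediation": 4,
    "governance": 3,
    "operations": 3,
}

def weighted_maturity(scores: dict, weights: dict) -> float:
    """Weighted sum of pillar maturity scores; weights should sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(scores[p] * weights[p] for p in weights)

print(round(weighted_maturity(pillar_scores, pillar_weights), 2))
# -> 3.1: a strong remediation pillar is pulled down by limited scope/coverage
```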
Alistair Parr: Now, the pillars are all important and all carry some weight, but normally it’s the scope and the depth that we deem to be particularly important.
Alistair Parr: So, before we take a final question, we do have one final poll for you as well. If you would like access to one of our maturity assessments, we can talk to you about how they actually function; we do offer free maturity assessments, so you can go through the process and see how our pillars work. By all means, feel free to respond in the poll and we’ll follow up with you at your leisure. There’s no commitment and no fee associated with doing any of our maturity assessments.
Alistair Parr: So, we’ve got a final question here around governance: who normally owns the third-party management program? It’s a very good question. Normally we see responsibility ultimately falling under the COO’s office more than anything else. If you’ve got a data protection officer, quite often they’d be heavily invested in the third-party management program from a privacy standpoint. But normally it goes to the COO or CIO, depending on the organizational structure, and then to the CISO in some shape or form; it’s very information-led, very information-centric.
Alistair Parr: And one last question here: you’ve obviously mentioned that remediation can be, and should be, kept simple; what do you mean by that? What we mean is that we should have clear-cut guidance and recommendations defined for the third parties so they understand what our risk tolerance is. We should be transparent with them: we should be able to say, this is what we expect from you, and this is what would pass so that we can do business with you. So have clear-cut recommendations as part of your remediation process and establish them upfront, so you can have different tiers of users articulating that information back to the third party. The other aspect of simplifying the remediation process is making sure you have a standard risk interpretation model, whether that’s a likelihood-by-impact score, risk quantification, or something else. Make sure it’s simple enough that you can repeat it, get a calculation, simplify it, and automate it, whether you’re using a platform or spreadsheets, so that you can analyze data from your third parties cleanly and easily: get an assessment back, get monitoring information, and simply turn that into a risk.
Alistair Parr: I’m afraid that is all the time we have for today, but thank you very much, everybody, for joining the webinar, and thank you very much, Joe, for your time and contributions. It’s been very enlightening. If you do have any other questions for us, please feel free to reach out. Again, we’d be happy to provide a maturity assessment for you if you would like us to. Otherwise, we hope you found value in today’s session; we appreciate it, and we hope you have a fantastic rest of your day. Thank you all.
©2025 Mitratech, Inc. All rights reserved.