Juan de Castro: Hello, my name is Juan de Castro and you're listening to Making Risk Flow. Every episode, I sit down with my industry leading guests to demystify digital risk flows, share practical knowledge, and help you use them to unlock scalability in commercial insurance. So today I've got Meghan Hannes with me. Meghan is the Head of the US Cyber and Tech Underwriting Management at Beazley, so one of the most knowledgeable people in the cyber space, so I'm really excited about today's episode. Meghan, thank you so much for joining me, why don't you start with a brief introduction to your role and your background?
Meghan Hannes: Thank you for having me. It's certainly a pleasure to be here. I am Meghan Hannes, US Head of Underwriting Management for Beazley US, which means we look after all things underwriting, appetite, application, pricing, policy forms, and strategy going forward.
Juan de Castro: Fantastic. Obviously, today's episode is going to be focused on cyber. It's a really hot topic. It's a very dynamic market environment. Let's start with a 50,000 feet view of the market, how do you see the market right now? How is it evolving?
Meghan Hannes: The 50,000-foot view of the market as it stands today is really that post-remedial environment, right? What we're seeing is pricing levelling off, I don't like to call it necessarily a softening, but more of a levelling off of the journey the market has been on since ransomware really hit us between the eyes in late 2020 and 2021. We're starting to see, certainly, rates come down and some money going back to our customers after they've seen two and three-year rate increases, and rightfully so because of the losses we are experiencing in the cyber market.
Juan de Castro: That's great. Obviously, in this environment where the rates are shifting again, risk is also shifting. Obviously, one of the most important bits is how do you control the loss ratio. So what are your thoughts on that?
Meghan Hannes: There are obviously a lot of components to that. First and foremost, we have our application process and our very steeped and knowledgeable underwriting community, who are really the first and last line of defence, assessing each application's individual responses. A lesson learned over the last several years is that the basics obviously still matter. So critical open ports, EPP, MFA protocols, those checks are still highly effective. From there, our underwriters would look at different application responses, trends that they are seeing, and then we would certainly put that risk into a bucket of selection, based upon its profile against other insureds within our wider portfolio, to then qualify for certain terms and conditions.
Juan de Castro: Definitely. And in this evolving environment, you're still looking at what other risk controls, what other factors drive that loss ratio? So it would be fantastic if you ran us through how do you identify those additional risk controls and how do you start identifying the correlation with the loss behaviour?
Meghan Hannes: It's a great question. Anytime you want to look at loss ratio impact, it's really two things, right? So it's that individual insured's profile as represented by the application to the insurance company, and then ultimately the underwriting expertise applied through that lens to produce terms and conditions. But then the secondary lens, what fits over that model, is what that risk means to the portfolio, and every insurance company is different; they're going to have a unique business mix that composes their portfolio. And from there, you really need to understand the different elements that feed into how you measure that individual risk profile against the balanced needs of your portfolio to ultimately influence desired loss ratios as an outcome. And what that means is a certain level of relativity, right? There's a lot of discussion in the market of insureds wanting to know, what does my risk profile look like in comparison to my peers? Cyber underwriting management and portfolio management isn't too different. So it's not just about the risk in isolation. It is absolutely about the risk in isolation, comparing and contrasting the claims that we're seeing to the responses that the insured provides, but then also, where does that insured sit in space against the wider book composition? That really is the two-pronged approach to successful portfolio management at a very basic level. It means looking not only at the basics but layering those other security controls on top, meaning backup provisions, reliance on outside vendors, email protocols, threat actor deterrence, all of those different metrics that we look at to inform what good risk selection looks like. And then setting that in a predictable framework such that we understand not only that individual risk profile from an adequacy perspective, but also where that risk profile sits in space within the wider portfolio.
Juan de Castro: If I understood correctly, on one side you're looking at the risk in isolation, so in absolute terms, is this a good or a bad cyber risk? And then, what's the marginal impact of that risk on your portfolio mix? So if we unpack those two areas, the first one is how do you identify whether it's a healthy cyber risk or not? How would you identify the factors that define a good cyber risk? Let's start with an individual risk in isolation.
Meghan Hannes: Sure. Again, the basics matter. So EPP, MFA, any critical open ports. We then move on to those higher technical controls. Again, email controls, what kind of backups, how does the insured manage authentication processes, what vendors are they utilising, third parties, how do they access their system, what are the sign-off provisions, how do they ring fence that exposure? Those are all things that we would look at. And then we compare and contrast those answers, not only for that individual insured, but also asking, how often do we see all yes answers versus all no answers? And are those questions still effective in identifying risk selection against the losses that we're seeing?
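The control checks Meghan lists can be pictured as a simple weighted pass over application responses. This is purely an illustrative sketch: the control names, weights, and scoring approach below are hypothetical assumptions for the example, not Beazley's actual rating logic.

```python
# Illustrative sketch: scoring an application's security-control responses.
# Control names and weights are hypothetical, not any carrier's real model.
CONTROL_WEIGHTS = {
    "mfa_enabled": 3.0,            # multi-factor authentication
    "epp_deployed": 2.0,           # endpoint protection platform
    "no_critical_open_ports": 2.0,
    "offline_backups": 2.5,
    "vendor_access_ring_fenced": 1.5,
}

def score_application(responses: dict[str, bool]) -> float:
    """Return the fraction of weighted controls the insured has in place."""
    total = sum(CONTROL_WEIGHTS.values())
    achieved = sum(w for c, w in CONTROL_WEIGHTS.items() if responses.get(c, False))
    return achieved / total

strong = {c: True for c in CONTROL_WEIGHTS}   # all controls in place
weak = {"mfa_enabled": True}                  # MFA only, everything else missing
print(score_application(strong))  # 1.0
print(score_application(weak))
```

In practice a score like this would only be one input, alongside the portfolio-relativity view Meghan describes, but it captures the idea that basic controls carry heavy weight in selection.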
Juan de Castro: Especially if you look at cyber proposal forms, they're getting lengthier and lengthier and lengthier. Some of them have 50, 60 pages, right? And it gets to a point where can an underwriter actually look at one of those questionnaires and make a judgement call on whether it's a good risk or not? In the end, it will get to a point where you probably need to go back and be more opinionated, or more specific, about what really differentiates a good from a bad risk. Is that something that you're looking at?
Meghan Hannes: It certainly is. We're constantly looking at what application trends are, going back to, do we get application responses that are always answered yes or always answered no? Then identifying, are those effective questions against the actual loss information that we're seeing come in? To answer the direct question about applications, are applications getting longer and longer? I would challenge back that they're actually getting shorter and shorter. Anytime you have a fluctuation in the market, a hardening, a levelling off, a softening, application length moves with it. Eighteen months ago, when some of those basic security controls were perhaps not as prevalent as they are today, we saw a truly remedial phase in the market where the applications got longer and the pricing got higher, and a lot of pricing adjustments were made based upon where we were as a cyber market in how we price risk, the remediation we needed, and where we want to go as a healthy P&L going forward, in how we assess pricing risk and design the applications that derive our underwriting information and bubble up to inform good risk selection. So the point is, application questions, and length for that matter, will come and go. That's fluid with the market. And as underwriting managers, one of our goals is certainly to support the underwriting community through that fluidity. That's one of our top jobs, supporting the underwriting community as they move through the market dynamics they face. Where we support on the back end is that portfolio piece: ensuring that we're constantly giving them the tools at their fingertips to understand what a risk means in isolation, how the technical pricing fits into our wider goals and objectives, and also where it sits in space in the wider portfolio.
And those are the tools that we're constantly, again, driving, updating, and examining to deliver to our underwriting community, so they can go and be that first and last line of defence for the company, both from a production perspective and from a P&L perspective.
Juan de Castro: And on this second part that you just mentioned now, how does this risk sit in space in relation to the broader portfolio? Obviously without getting into the details of the Beazley strategy, but what type of considerations do you look at?
Meghan Hannes: Underwriting managers in general will have their own metrics for what they want to ensure they deliver to their underwriters in terms of what informs good risk selection, and we're no different at Beazley. I can't go much further into detail, but I will leave it at this: it's underwriting management's obligation to the business to ensure a couple of key touch points, individual risk selection and pricing sufficiency, but also the tools to understand where that risk sits in the wider portfolio, so the underwriters and the folks responsible for growth can ensure that they're making informed decisions and have confidence in them.
Juan de Castro: And going back to that tension about obviously willing to better understand the client and the risk profile, but at the same time not ending up with a thousand pages in the proposal form, what's the role of using external data, external providers to understand the risk profile?
Meghan Hannes: There's certainly a place for it. There's no one magic bullet that informs good risk selection. The strategy that will set certain carriers apart from others is how they bring together those internal and external tools. So how are they combining this information with, say, an external scanning tool that many insurance companies use, and how are they boiling that into their proprietary mix for risk selection? It's proprietary information, the external scanning tools, their own internal application mix, and then other portfolio management tools, such as a data science team. How that all comes together to inform selection is, for me, bespoke to each insurance carrier. But certainly, to answer the question, external tools absolutely have a place. It's for each underwriting manager at their respective organisation, and those responsible for the portfolio mix, to calibrate how they want to apply that data and the weight they want to give it.
Juan de Castro: So moving a little bit from the few underwriting considerations into the more operational challenges or considerations, right? So what's the role of capturing data from your submission intake flow to inform your underwriting strategy?
Meghan Hannes: It's a great question. And it's certainly an opportunity for the wider market to solve, because we know a couple of things. We know carriers have their applications, and there are many carriers in the market. To all the points we've previously discussed, they all have their own unique spin on what informs good risk selection, and the outcome is those questions. There happens to be overlap, but it's by no means identical. As a cyber market, we're finding commonality, but there's bespoke treatment not only in each individual insurance carrier's application; a handful of brokers also really revel in creating their own application for their clients, who then submit to the carrier after various risk-control or optimisation exercises, even before they come to the front door. What that means is the carrier is getting a whole lot of information to synthesise, and how do we ingest that in a meaningful yet efficient way? There is certainly a lot of growth and room to potentially automate that process, obviously with checks and balances along the way to make sure that you get very high confidence scores from that automated process. For me, one of the next big hurdles and opportunities in this space is how we continue to synthesise the large amount of information that we're getting in. That's one. But then also, what do we do with that information once it's in the door? So it's not just about automation; it's also about data science on the back end.
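The "checks and balances" Meghan describes for automated intake are often implemented as confidence-gated routing: extracted fields below a threshold are sent to a human rather than accepted straight through. A minimal sketch, where the field names and the 0.9 threshold are hypothetical assumptions for illustration:

```python
# Illustrative sketch: route automatically extracted submission fields either
# to straight-through processing or to manual review, based on the extraction
# model's confidence. Field names and the 0.9 threshold are assumptions.
from dataclasses import dataclass

@dataclass
class ExtractedField:
    name: str
    value: str
    confidence: float  # 0.0 - 1.0, reported by the extraction model

def route_submission(fields: list[ExtractedField], threshold: float = 0.9):
    """Split fields into auto-accepted values and names needing human review."""
    accepted = {f.name: f.value for f in fields if f.confidence >= threshold}
    review = [f.name for f in fields if f.confidence < threshold]
    return accepted, review

fields = [
    ExtractedField("revenue", "25M", 0.97),
    ExtractedField("mfa_enabled", "yes", 0.95),
    ExtractedField("backup_vendor", "Acme", 0.62),  # low confidence → review
]
accepted, review = route_submission(fields)
print(review)  # ['backup_vendor']
```

The design choice here is that automation never silently guesses: anything the model is unsure about is surfaced to the underwriter, preserving the "first and last line of defence" role discussed earlier.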
Juan de Castro: It is a problem that has existed in many lines of business for years, right? Every insurer's competitive advantage, to some extent, is capturing the right information about the risk to enable superior risk selection and underwriting practices, but at the same time, you don't control the applications that you receive on a day-to-day basis. So in the end, every insurer has this challenge of, okay, I am receiving applications from 50, 60 different brokers or competitors, as you said, but how do I map that to the questions I want answered about the insured? Right.
Meghan Hannes: Yeah, and right now that's the largely manual process that really evokes the need for experience, right? So you have to have an underwriter mapping those question sets out to ensure that intent and applicability are consistent through the chain. And then once you have that manual data set, if you will, how do we then automate that in a way that we still have very high confidence in the output? That’s certainly the next opportunity, one of the many opportunities facing underwriting managers today.
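The manual mapping step Meghan describes, aligning each carrier's or broker's question wording to an internal question set, can be sketched as a lookup table of known phrasings with a fallback to human review for anything unrecognised. All question texts and schema keys here are hypothetical examples, not any real carrier's form:

```python
# Illustrative sketch: map heterogeneous application questions onto a single
# internal schema. Question phrasings and schema keys are hypothetical.
QUESTION_MAP = {
    "do you enforce multi-factor authentication?": "mfa_enabled",
    "is mfa required for remote access?": "mfa_enabled",
    "do you maintain offline backups?": "offline_backups",
    "are backups stored off-network?": "offline_backups",
}

def map_questions(submission: dict[str, str]) -> tuple[dict[str, str], list[str]]:
    """Normalise incoming questions to schema keys; unknowns go to review."""
    mapped, unmapped = {}, []
    for question, answer in submission.items():
        key = QUESTION_MAP.get(question.strip().lower())
        if key is not None:
            mapped[key] = answer
        else:
            unmapped.append(question)  # an underwriter maps these by hand
    return mapped, unmapped

incoming = {
    "Is MFA required for remote access?": "Yes",
    "Do you maintain offline backups?": "Yes",
    "Do you use an EDR tool?": "Yes",  # phrasing not in the map yet
}
mapped, unmapped = map_questions(incoming)
print(mapped)    # {'mfa_enabled': 'Yes', 'offline_backups': 'Yes'}
print(unmapped)  # ['Do you use an EDR tool?']
```

Each manually resolved question can be added back into the lookup table, which is one way the "manual data set" Meghan mentions gradually becomes an automated mapping with high confidence in the output.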
Juan de Castro: Let's build on that. You have an environment where you've got a large number of cyber underwriters, each of them receiving those applications in different formats, different risk controls about the client. What is your vision of what an underwriter should receive so that you're removing all that friction or operational friction, and then you're really enabling the underwriter to focus on making an underwriting judgement? So how do you remove the friction of them receiving that variety of information?
Meghan Hannes: Well, that's just it, I don't think you do. I think what we do as good stewards of the business is ensure that the underwriter can accept any variation of information or application, obviously assuming that the information is there and usable. That's one of the opportunities we have, whether you call us portfolio managers, underwriting managers, or product heads: how do we keep that pipeline open for our underwriters to ingest information without having to impose requirements on the front end, meaning a certain carrier requiring just their own application? Because that would certainly, for me, have a downstream effect on goals and objectives, and that's really not the point. The point is to get the amount of information you need to reasonably assess the risk. It's our obligation to ensure that the information we receive is on a par for evaluating risk selection on an equal basis across the portfolio.
Juan de Castro: In this environment, what's your vision for underwriters? How is the role of the cyber underwriter evolving? You mentioned the micro lens again; is that where you think underwriters should be spending their time?
Meghan Hannes: Yeah, for me, the role of the underwriter has only expanded in the last several years. There's certainly no shortage of needing to be very technically proficient in understanding not only the application responses you get, but then having informed second and third line questions, perhaps in a one-on-one exchange with the broker or with the client. But for me, the role of the underwriter doesn't necessarily change with the growth of the market. If anything, it's further empowered with additional datasets and tools, more so at their fingertips, to make optimised decisions as they go and then balance the book mix that presents, whether it be on their desks or in different regions.
Juan de Castro: What is the role of technology in that environment? How can technology support those angles?
Meghan Hannes: Again, it's back to basics. It's automation, optimisation, and, frankly, the potential to speed things up. When we're looking at something manual, how can we automate it for the underwriter? They get the same outcome, but rather than spending their time on what is almost a transcriptive piece, they can focus their efforts on true risk selection, engaging with our clients and our brokers, because we've provided tooling at their fingertips that gives them a snapshot through either the micro or the macro lens we discussed. As a former underwriter of 15 years, I can assure you there was a lot of transcription involved in getting to the place where you're set up for success and can engage the broker and engage the insured. There's a lot of opportunity to automate those processes, and not to slim down the information we provide, but to make it more practical and functional, so underwriters can have increased confidence that what we're supplying is not only a lens on that micro piece, the individual risk, but also on where that risk sits in space in the portfolio. That's a trend that's been emerging. It predates ransomware: in 2017, 2018, 2019, with the maturity of the cyber market, you really started to see the emphasis on understanding the portfolio mix, and that predated remediation in a good way. By the time we got to remediation and really started to understand that not only frequency but severity was increasing at a very fast clip, it allowed us to build on the trend and momentum that was already there in a maturing market. Remediation just accelerated it, because, as a market, we didn't have a choice. We had to understand what pricing levels specifically needed to look like to come out on the other side.
Juan de Castro: Let me play back what I understood of your vision, which I think is quite an exciting one, and let me know if I got it right. It sounds like, for you, the role of technology is, first, about digitising all those submissions that underwriters receive in plenty of different application forms, and ideally automatically mapping those to your schema, to the type of questions you want answered to analyse a risk. By doing that, you remove all the friction from the mapping process, which is a very manual process for underwriters. Then it's about providing decision support insights to the underwriter: okay, almost regardless of what proposal form you receive, these are the risk controls the client has and the health of those risk controls. And on the other side, mapping that to your portfolio strategy and to the rest of your book and saying, okay, this is the impact this new risk will have on our portfolio mix. So with that, you're removing the friction and the manual aspect of those activities and really presenting the underwriter with those two areas. I guess the last bit you mentioned, if I understood correctly, is that if you've got all that data digitised, you don't need to rekey data into rating engines or other systems. Again, that is how you free up your underwriters from the more manual activities and really enable them to focus on risk selection. Is that a fair summary?
Meghan Hannes: Yeah, that's a very fair summary: ensuring that we maximise the underwriters' bandwidth to do what they do best, which is not only to form, build, and grow relationships with our broker and trading partners, to support not only increased submission flow but the quality of the submissions and the type of business mix that we're seeing, but also simply to underwrite the risks. That's certainly one aspect of technology enablement. The flip side of that coin is obviously ingesting all of that data, and then how do we start utilising data lakes and machine learning to optimise our lens on the data we're ingesting, so that we have not only claims data but also what informs risk selection? What are the key data points learned from those types of data repositories?
Juan de Castro: That is quite an exciting vision, right? So what are you most excited about in the cyber space in the next few years?
Meghan Hannes: It's a fun question to answer. The market has come so far in the last, we'll call it 25 years. I often comment, particularly when I'm with folks who've been in this space for upwards of 20 years, like I have, which is hard to believe, that you really have to sit back and think about where we were compared to where we are. Even take the watershed moment of the big retail breaches of 2013 and early 2014 and put that context into present-day terms. That was only 10 years ago. Think about how far we've come as a market: the tooling we have, the machine learning coming online, the rich datasets we now have from an actual loss perspective. And then we've got the ransomware epidemic, if you will, of 2019, 2020, and 2021, where we learned a whole lot about pricing sufficiency and how inflation behaves in those kinds of periods of stress on a market. Those are all things that mature a market and continue to catapult its growth. Looking forward, there are discussions of different reinsurance vehicles and ILS vehicles. It's a really exciting time, and there's only more room for maturity and growth. With increased tooling now at our backs, it's an exciting time to be in cyber insurance, and certainly no two days look the same.
Juan de Castro: As you said, the cyber insurance market has come a very long way. The cyber risk profile is constantly evolving. So definitely, I'm sure there's no room for staying still. So you need to always catch up with how the market is evolving.
Meghan Hannes: Exactly. It's interesting, right? Because it's one of the lines where there's a lot going on in the external threat landscape as well. You can do everything the same internally, but have an external threat vector come out of left field and more or less change the assumptions that you have. As everyone listening knows, that's one of the hallmarks of the cyber insurance line of business. It's not only that it has a rapid internal pace of change, with security controls and technology advancement, but you've also got this human element that can really change the dynamic, in some cases overnight.
Juan de Castro: Definitely. Well, this is absolutely fascinating. I don't think there's any other line of business or risk that is evolving as fast and as much as cyber, so I'm sure we can do probably another episode in a year's time, and I'm sure the conversation will be different. So looking forward to that. But Meghan, thank you so much for joining me. I've thoroughly enjoyed some of your views and the discussion.
Meghan Hannes: My pleasure and thanks for having me.
Juan de Castro: Making Risk Flow is brought to you by Cytora. If you enjoyed this podcast consider subscribing to Making Risk Flow on Apple Podcasts, Spotify or wherever you get your podcasts, so you never miss an episode. To find out more about Cytora, visit cytora.com. Thanks for joining me. See you next time.