P&C insurers need to understand the losses they have incurred (claims) before they renew policies (auto, property, marine, workers' compensation, etc.). The same applies to commercial insurance brokers, who have to understand their customers' losses covered by various carriers.
However, the process of fetching, analyzing and understanding Loss Runs is extremely manual and time-consuming. A commercial broker has customers who are insured through different carriers. When renewal time approaches, the broker has to contact these insurance carriers for loss records for further analysis. A typical process looks like this:
- The system generates the list of policies that are up for renewal within a certain number of days (for most brokers, about 180 days). The list is generated that far in advance because there are many checks and balances to complete before renewal
- Rules are applied to these policies, either manually or through Excel macros, which further narrow down the number of policies for which Loss Runs analysis needs to be done
- Previous years' policies are fetched to understand the lifecycle of each policy from the beginning. This is not easy, as commercial brokers do not always have the entire data available with them; it has to be fetched from systems or requested from processors, who in turn reach out to managers, brokers or even the insurance carrier to get the data.
- Now comes the major part, where Loss Runs records are requested from the carriers. Since most traditional brokers and insurers do not have API capabilities to send the data in real time or through batch jobs, it has to be requested via email or downloaded from the insurer's portal, if available.
- Requesting via email means plenty of to-and-fro between the broker and the insurance carrier
- After receiving the loss records, the major challenge the broker faces is standardizing them, as different carriers use different formats. The obvious solution would be to standardize at the carrier's end, but that is easier said than done. Hence, the broker needs various tools to standardize the records received: converting PDFs to Excel, maintaining a separate standardized file, and copying the data from the carriers' files into it. This is not only time-consuming but error-prone.
- Once the records are standardized, the broker works on them to generate a Loss Summary (a format defined earlier in collaboration with its large and medium-sized brokers, who in turn use it to analyze and renew policies). This work is done either manually or through a tool, which can be a COTS product or a home-grown system (it is generally home-grown). A minimal sketch of this filtering-to-summary pipeline appears after this list.
- Once the Loss Summary is generated, it is sent to the large and medium-sized brokers for their consumption. The process does not end here, as these brokers also fill in additional information sourced from their internal local files or other systems.
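To make the above concrete, here is a minimal sketch, in Python with pandas, of the renewal-filtering, record-standardization and Loss Summary steps. The column names, carrier formats and filter rules are illustrative assumptions, not a description of any particular broker's system.

```python
from datetime import date, timedelta

import pandas as pd

RENEWAL_WINDOW_DAYS = 180  # most brokers start the renewal process ~180 days out

def policies_due_for_renewal(policies: pd.DataFrame) -> pd.DataFrame:
    """List policies renewing within the window, then apply filter rules."""
    cutoff = date.today() + timedelta(days=RENEWAL_WINDOW_DAYS)
    due = policies[pd.to_datetime(policies["renewal_date"]).dt.date <= cutoff]
    # Illustrative rule: keep only lines of business that need Loss Runs analysis.
    return due[due["line_of_business"].isin(["auto", "property", "workers_comp"])]

# Each carrier sends loss runs in its own layout; map each layout to one standard schema.
CARRIER_COLUMN_MAPS = {
    "carrier_a": {"Clm No": "claim_id", "Loss Dt": "loss_date", "Incurred": "incurred"},
    "carrier_b": {"ClaimNumber": "claim_id", "DateOfLoss": "loss_date", "TotalIncurred": "incurred"},
}

def standardize_loss_runs(raw: pd.DataFrame, carrier: str) -> pd.DataFrame:
    """Rename carrier-specific columns to the broker's standard layout."""
    return raw.rename(columns=CARRIER_COLUMN_MAPS[carrier])[["claim_id", "loss_date", "incurred"]]

def loss_summary(claims: pd.DataFrame) -> pd.DataFrame:
    """Aggregate standardized claims into a per-policy-year Loss Summary."""
    claims = claims.assign(policy_year=pd.to_datetime(claims["loss_date"]).dt.year)
    return claims.groupby("policy_year").agg(
        claim_count=("claim_id", "count"),
        total_incurred=("incurred", "sum"),
    ).reset_index()
```

Even a small pipeline like this removes the copy-paste step between carrier files and the broker's own standardized file, which is where most of the errors creep in.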
Hence, the final Loss Summary goes through a tedious process. The basic challenges we see here are:
- No single source of truth where all the records (policy history details, claim records, broker-required information, etc.) are available for analysis
- Limited or no use of technology makes the process manual. (Imagine if the same approach were applied to our online shopping experience - be it for ordering or returns!)
- Limited or no technological collaboration between different stakeholders.
So how do we solve this? After all, Loss Runs are needed as they are the basis for determining the renewal premium. Consider the position of marine insurers after the recent disaster in Baltimore: they will now want access to a broader set of data to analyze potential losses and, thereby, premiums.
Insurers/Brokers should adopt the following:
- Process Reimagination - A complete and thorough understanding of the process needs to be developed. As we can see from the process described above, some steps can be reduced by combining manual tasks into a single automated procedure, others by having the data in one place, and so on.
- Use of Business Process Management Tools - Whenever multiple parties are involved, it is always better to use a tool equipped with features like skill- and availability-based routing, configurable rules, quick deployments, an easy UI and the ability to integrate with other systems. In the above process, if processors, managers, brokers and insurers are connected through a BPM tool, the tasks become easier to manage (a minimal routing sketch follows this list).
- Blockchain - Though a far-fetched solution, if implemented, blockchain can provide that real-time push: it not only connects the stakeholders in one chain, but also makes it possible to define and execute rules and to fetch information and data from anywhere, anytime.
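As an illustration of the BPM idea above, here is a minimal sketch of skill- and availability-based routing. It is not tied to any specific BPM product, and the processor names and skills are made up.

```python
from dataclasses import dataclass

@dataclass
class Processor:
    name: str
    skills: set          # e.g. {"marine", "workers_comp"}
    open_tasks: int = 0  # crude availability measure

@dataclass
class Task:
    policy_id: str
    required_skill: str

def route(task: Task, team: list) -> Processor:
    """Assign the task to the least-loaded processor who has the required skill."""
    eligible = [p for p in team if task.required_skill in p.skills]
    if not eligible:
        raise ValueError(f"No processor has the skill '{task.required_skill}'")
    chosen = min(eligible, key=lambda p: p.open_tasks)
    chosen.open_tasks += 1
    return chosen

team = [Processor("Asha", {"marine", "property"}), Processor("Ben", {"workers_comp"})]
print(route(Task("POL-123", "marine"), team).name)  # -> Asha
```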
Gen AI? - How can any solution be devoid of Gen AI these days? The Loss Runs problem may not be solved entirely by it, but consider these solution snippets:
The processor simply types in --
"Get me the list of policies that are due for renewal in next 180 days. Apply the following rules on them and filter the data for me"
To fetch Policy History Details --
"Fetch me last 5 year policy history details of the policies above and highlight the records that are missing"
To get Loss Records from Insurance carriers --
"Draft an email for me to be sent to the respective insurance carrier requesting the loss runs for X years for the above policies" or "Download the Loss Runs for the above policies by logging in to the insurer's web portal. The login credentials can be taken from the last time"
To generate Loss Summary --
"Based on the Policy History Data and Loss records received, generate a Loss Summary as per the defined criteria"
These may seem a little far-fetched, but they can be implemented if the right systems are in place.
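One way to read the snippets above is as natural-language front-ends over a small set of back-end "tools". The sketch below, with hypothetical tool names and stub implementations, shows how a Gen AI model with function calling might dispatch the processor's requests; it assumes the underlying data pipelines (such as the one sketched earlier) already exist.

```python
# Hypothetical tool registry; a Gen AI model with function calling would translate
# the processor's prompt (e.g. "policies due for renewal in next 180 days")
# into one of these calls. The implementations here are stubs for illustration.
TOOLS = {
    "list_renewals": lambda days=180: f"policies renewing in the next {days} days",
    "fetch_policy_history": lambda years=5: f"policy history for the last {years} years",
    "draft_loss_run_email": lambda carrier, years: f"draft email to {carrier} requesting {years}-year loss runs",
    "generate_loss_summary": lambda: "Loss Summary generated per the defined criteria",
}

def handle(tool_name: str, **kwargs) -> str:
    """Dispatch a tool call produced by the model to the matching back-end stub."""
    return TOOLS[tool_name](**kwargs)

print(handle("list_renewals", days=180))
print(handle("draft_loss_run_email", carrier="Carrier A", years=5))
```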
Loss Runs analysis plays an important role in insurance and is only gaining in importance due to the higher occurrence of claims. Insurers and brokers need to act fast to streamline the process before insurtechs take over.