Are we ready for Data?

Long one...warning.

Way back in the carefree, pre-pandemic summer of 2019, the market was treated to Lloyd's future vision in the shape of Blueprint One.

It was grandly ambitious and rather a lot to take in at 45,341 words, according to my trusty word processor, which I have neither reason to doubt nor the enthusiasm to check.

It should have escaped nobody's attention that data is becoming somewhat important these days and Blueprint One was keen to recognise that.

A quick scan revealed that the word “Data” featured 579 times on 88 of the Blueprint's 145 pages. That's one in every 80 words. And APIs got a mention, on average, once every page or so. Most encouraging.

By the time Blueprint Two was published in November 2020, Lloyd's ambition had shrunk to just 104 pages and 25,514 words, yet the word “Data” held its own and now accounted for a remarkable one in every 48 words. And APIs still averaged a mention once every two pages.
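(Should anyone wish to replicate this rigorous methodology, a minimal sketch in Python is below. It assumes a plain-text export of the Blueprint; the filename is, naturally, hypothetical.)

```python
# Minimal sketch: term frequency in a document.
# Assumes a plain-text export of the Blueprint; the filename is hypothetical.
import re

with open("blueprint_two.txt", encoding="utf-8") as f:
    words = re.findall(r"[a-z']+", f.read().lower())

total = len(words)
mentions = words.count("data")

if mentions:
    print(f"'data' appears {mentions} times in {total} words")
    print(f"That's one in every {total // mentions} words")
```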

From this fantastically comprehensive, insightful and statistically faultless analysis, I'm confident you will agree that Data and APIs were fairly key components of the programme.

And that was exactly what everyone wanted to see. After all, there can be no better example of creating a modernised market than putting data, and the means of exchanging it, at the top of the priorities list. Market infrastructure, after all, doesn't just mean a legendary trading floor, dodgy lifts and a coffee shoppe.

But that was well over £200 million ago now. Or a couple of years, using the Gregorian calendar. So how’s it going?

Well, it's far more a Future of London Market gig these days and that's no bad thing.

And it's been a bumpy ride too. More scope revisions were made, including a virtual underwriting room burned at the stake, a scuppered then re-floated PPL rebuild and a do-or-die deal with DXC to replace the stone-age bureau systems.

But the Core Data Record, CDR to its friends, was a really big feature of Blueprint Two (68 mentions on 33 pages) and has managed to survive.

The CDR started life as a great idea but its first incarnation was, how can one put it delicately? Woefully inadequate? A hotchpotch handful of data element definitions, half of which had already been ennobled with "core" status back in 1991, when the failed EPS (Electronic Placing Support) project defined them as part of its Common Core Record.

Nonetheless, it was a start and, costing an alleged £9 million, a reassuringly expensive one at that. Well, the Big Four have never knowingly undersold.

Anyway, not long after that lead balloon landed, and just when it was looking like it was all over for the CDR, its scope was swiftly extended to encompass 154 data elements, and a market consultation ended on 31st January.

Sensing that Data might just be of importance to everyone, the LMG established a cross-market Data Council to oversee it all. Good stuff so far, and one can only hope that their ambitions will be realised. History is not on their side, mind you. London market committees, regardless of their credentials, their dedication and the importance of their quest, don't always die of old age secure in the knowledge their mission was accomplished.

But let's assume it will be a success and hope they don't stop at just 154 data elements either. Core Data is a good start, but if the market truly wants to be digital, they'll need an awful lot more data surrounding the core to cover all classes of business and their various means of placement.

And they'll need more than just a simple list of data elements too - data needs to be logically and hierarchically organised in a model to be of any material value, and it needs comprehensive rules for governance, validation and exchange. Plenty to do.
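To make that distinction concrete, here is a deliberately toy sketch of a hierarchical model with validation rules attached. Every field name and rule in it is invented for illustration; none is taken from the actual CDR.

```python
# Deliberately toy sketch of a hierarchical risk data model with attached
# validation rules. All field names and rules are invented for illustration
# and are NOT taken from the actual CDR.
from dataclasses import dataclass, field

@dataclass
class InsuredItem:
    description: str
    country: str        # e.g. ISO 3166-1 alpha-2, "GB"
    sum_insured: float
    currency: str       # e.g. ISO 4217, "USD"

@dataclass
class Section:
    class_of_business: str
    items: list[InsuredItem] = field(default_factory=list)

@dataclass
class Placement:
    market_ref: str     # a UMR-style reference; the prefix rule below is illustrative
    insured_name: str
    sections: list[Section] = field(default_factory=list)

    def validate(self) -> list[str]:
        """Return a list of rule violations; an empty list means valid."""
        errors = []
        if not self.market_ref.startswith("B"):
            errors.append("market_ref: expected a broker 'B' prefix")
        for s in self.sections:
            for i in s.items:
                if i.sum_insured <= 0:
                    errors.append(f"{i.description}: sum insured must be positive")
                if len(i.currency) != 3 or not i.currency.isupper():
                    errors.append(f"{i.description}: currency must be a 3-letter ISO code")
        return errors

p = Placement("B0999XYZ2024", "Acme Corp", [
    Section("Property", [InsuredItem("Warehouse, Rotterdam", "NL", 25_000_000, "EUR")])
])
print(p.validate())   # [] -- no violations
```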

Now, I’m not going to get into a debate in this article about what the market intends to do with all that lovely data if and when it arrives. That’s one for another time.

No, what we need to look at is how the market intends to get the data in the first place.

That CDR and its future versions need to be populated by something from someone. And that rather simple matter is, I fear, going to be a far more complex problem to solve than some might think.

Where's it all coming from?

Ultimately, it's the customer who has to provide a good chunk of the data.

And we're getting quite used to that as a society. In the world of personal lines, for example, whenever we use a price comparison site or an insurer's online portal, we do the data entry for them. Their systems then validate and augment it with third-party services and give us a price.

That price is cheaper than it would be if our insurer still paid legions of staff to transcribe our handwritten proposal forms into their systems. And, not entirely by coincidence, there is seldom a retail broker in the loop nowadays, so that's a big chunk of gross premium saved.

So there is a "modernisation dividend" and we, the customer, get a share of it: insurance products that are cheaper, compared more easily and purchased more quickly. And that is the quid pro quo for us doing all that work on the insurer's behalf.

But who is going to do this in the commercial insurance world? And how?

Delegated Authority

In the case of DUA business, it should be relatively easy but for the dreaded Bordereaux.

Bordereaux are an unmitigated disaster as a means of exchanging data. Unquestionably unfit for modern purpose, they are costing the industry a small fortune in delays, management, conversion, validation, cleansing and mapping. Everyone knows this, and few would dispute that they constitute a large proportion of what is wrong and costly about the DUA process between London and its remote distributors and customers.
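To give a flavour of that cost, here is a toy sketch of the mapping and cleansing exercise every bordereau consumer performs today. The column names are invented, but anyone who has wrangled a coverholder spreadsheet will recognise the genre.

```python
# Toy sketch of bordereau ingestion: every coverholder labels the same
# concept differently, so each consumer maintains its own mapping and
# cleansing logic. These column names are invented but representative.

# Three coverholders, three spellings of "gross premium":
COLUMN_MAP = {
    "Gross Prem": "gross_premium",
    "GWP (USD)": "gross_premium",
    "Premium - Gross": "gross_premium",
}

def clean_amount(raw: str) -> float:
    """Strip currency symbols and thousands separators: '$1,234.50' -> 1234.5"""
    return float(raw.replace("$", "").replace("£", "").replace(",", ""))

def normalise_row(row: dict[str, str]) -> dict[str, float]:
    out = {}
    for col, value in row.items():
        if col in COLUMN_MAP:
            out[COLUMN_MAP[col]] = clean_amount(value)
        # ...and the same again for dates, country codes, currencies, etc.
    return out

print(normalise_row({"Gross Prem": "$12,500.00"}))   # {'gross_premium': 12500.0}
print(normalise_row({"GWP (USD)": "12500"}))         # {'gross_premium': 12500.0}
```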

But knowing this and being able to fix it are not quite the same thing.

Of course we can replace bordereaux with a data standard and an exchange mechanism that are fit for a modern market. And we could, theoretically, roll this out to the coverholders, MGAs, technology companies, brokers and underwriters who make up that market.

Theoretically being the operative word. In practice, the task is monumental.

For example, risk bordereaux come, by and large, from quote/bind systems; premium bordereaux from policy administration or accounting systems (which may not be the same as the quote/bind system); and claims bordereaux from TPAs' and DCAs' claims management systems. Lloyd’s alone has around 4,200 coverholders and some 400 approved TPAs. I have no idea how many technology companies are involved in supplying systems to them all, but I know it’s “a lot”, and probably more. And for many smaller coverholders, worse still, Microsoft Excel is the "system".

So that’s a huge number of business-to-business data relationships and affected systems, every one of which will, at some stage, have to change.

If the word “scale” had any absolute meaning at all, this is probably it.

Yes, there are Insurtechs out there with bordereau-free solutions, many of them excellent. But picking off a few binders here, a coverholder or two there, is not going to change the market at anything like the pace we need to get better, more timely data in the short term. This has to be done at a grand scale, and that will require a Herculean market effort, driven from the centre and supported by all. And that is going to be very, very expensive.

So who pays? And, not entirely unrelated to that, who will share in the modernisation dividend if it is successful? The customer, the retail broker, the coverholder/MGA, the London broker, the London carrier or the reinsurer? Or all of them? And how about the technology companies who will be required to invest considerable sums in changing their systems?

I'm absolutely not talking about disintermediation here. But while all of those parties are still an integral part of the distribution chain, any modernisation effort will need to benefit each party by improving their own bottom line. If that's not the case, who will bother to engage with it?

And will it make the policy cheaper for the customer? That sounds desirable, but will a reduced gross premium allow all participants in the distribution chain to retain their earnings from it? I can't see how. And if I'm right, could the gross price come down by a sufficient margin to make London more globally competitive?

And to compound the problem, Lloyd's has historically been reluctant to frighten the horses, and with good reason. Its legion coverholders bring in some 40% of its annual premium, around £12 billion, and Lloyd's is keen to avoid upsetting them. Lloyd's is not without competition either, both abroad and at home, as any dual-platform Managing Agent will remind you.

So who in One Lime Street is going to take the risk of pushing their distributors elsewhere by asking them to invest significant sums in systems and process re-engineering, unless the benefit case for them is clear and unarguable? Those coverholders can trade with Lloyd's now without all the pain, risk and expense of modernisation, so unless the financial argument is firmly on their side, with a decent share of the modernisation dividend accruing to them, will they bother?

Of course bordereaux are not fit for purpose. Of course they belong in the dustbin of history. Of course modern data standards are better. But for whom? If it's not better for the people who bring us the business, it's not happening. And if it's not better for the customer, how will we keep them?

The market must answer these questions and solve these business issues first. Only then can it throw away the bordereaux, and the vastly expensive sub-industry of "bordereaux fixers", and move to straight-through processing of all that lovely data.

Open Market

Let's move on now to the heavy lifting side of the market, where face-to-face deals are supported by spreadsheets and printed documentation.

Like bordereaux in DUA business, the guilty parties in open market and facilitised business are the schedules of values (SOVs), the placing slips and the many other ways of conveying data between the customer and the underwriter. Collectively, they are the cause of significant angst and expense market-wide.

Just like bordereaux, there are no standards governing this data, and what is submitted can be neither properly validated nor easily ingested.

So, every time a broker uploads a submission to PPL and asks a number of underwriters to quote it, multiple underwriting teams are kicked into action, each doing the same thing with the same downloaded data - reorganising it, validating and cleansing it, before they can ingest it into their systems to calculate the quote.

And they will all do it slightly differently…and possibly slightly wrongly…and definitely very expensively…and almost always in a hurry!
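By way of a hypothetical illustration of "slightly differently and possibly slightly wrongly": three imagined underwriting teams, each with their own home-grown parser for the same sum-insured cell.

```python
# Hypothetical illustration: three underwriting teams each parse the same
# SOV value with their own home-grown logic -- and get different answers.

def team_a(value: str) -> float:
    # Assumes plain numbers with separators: "1,000,000"
    return float(value.replace(",", ""))

def team_b(value: str) -> float:
    # Assumes a trailing magnitude letter: "1m" -> 1_000_000
    multipliers = {"k": 1_000, "m": 1_000_000}
    if value[-1].lower() in multipliers:
        return float(value[:-1]) * multipliers[value[-1].lower()]
    return float(value)

def team_c(value: str) -> float:
    # Strips everything non-numeric -- silently turning "1.5m" into 15!
    return float("".join(ch for ch in value if ch.isdigit()))

for raw in ("1,000,000", "1m", "1.5m"):
    results = []
    for parse in (team_a, team_b, team_c):
        try:
            results.append(parse(raw))
        except ValueError:
            results.append("error")
    print(raw, "->", results)
```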

This must surely be simple to change? Just invent a data standard for those slips, another for the SOVs, and make every broker use them with the new PPL MkII to convey all that lovely data instead of those silly PDF documents and spreadsheets. Problem solved.

Except that it can't and won't happen like that.

Firstly, although the slip comes from the broker, who could possibly be incentivised to provide the same information as data in an iMRC or Whitespace format, the SOVs and the supplementary information by and large don't. They come from the customer, the risk manager, or a collaboration between them and their broker.

So who is going to translate all the customer's data into the standards demanded by London, Lloyd's and the underwriters? The brokers? And who is going to pay for that to be done? And who is going to assume the liability for it not being done right?

Perhaps the customers will adopt new systems that store their asset schedules in the format demanded by their insurers? There are hundreds of thousands of customers out there, so perhaps not. And, anyway, I somehow doubt they would ever do so willingly unless there was a considerable financial incentive in the shape of significantly reduced premiums.

And even if one can answer those knotty commercial problems, do we know for sure who owns the data in the first place? And who gets to exploit it for the AI systems and market aggregate statistics? And who gets to keep it after the deal's been done? The one thing that data lake isn't is public property for everyone to swim in.

But never mind all that. Solve them all and you still have one almighty problem to deal with.

What happens when "Computer says No"?

So the broker presents the information to the electronic platform, PPL MkII or Whitespace say, to submit it for quote or bind, and some piece of data isn't right in a schedule or on the electronic incarnation of the slip. The computer says "No". What then?

Will the system stop the submission? Logically speaking, of course it should. That's supposed to be valid data on there, not guff. We've got all the guff we can handle right now, and the whole point of this is that we don't want any more. So it surely must say "No"?

But it can't. What broker would accept that? What broker would ever agree that they cannot place business in the market because the computer won't let them, when exactly the same information presented on a printed slip or schedule would be completely acceptable to any underwriter, and has been since 1688?

What broker would concede defeat at 4pm on a Friday when their customer is about to lose cover on $1 billion of assets?

Of course they won't. They will print it off and walk it into the market, or attach it to an email marked urgent. And they will never use the electronic platform again. Why should they, if it's losing them business or, worse, endangering their customer?
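If there is a technical way out of that bind, it is probably validation that grades its findings rather than issuing a flat refusal. The sketch below is purely hypothetical, with invented rules and severities, but it shows the shape of a platform that flags bad data without ever blocking the 4pm Friday bind.

```python
# Purely hypothetical sketch: tiered validation that records data-quality
# problems as warnings rather than hard-blocking a live submission.
# The rules and severities here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Finding:
    severity: str   # "error" blocks; "warning" lets the deal proceed
    message: str

def validate_submission(slip: dict) -> tuple[bool, list[Finding]]:
    findings = []
    # Structural problems the platform genuinely cannot process:
    if not slip.get("unique_market_ref"):
        findings.append(Finding("error", "Missing unique market reference"))
    # Data-quality problems that should never stop a Friday-4pm bind:
    if not slip.get("sov_attached"):
        findings.append(Finding("warning", "No machine-readable SOV attached"))
    if slip.get("currency") not in {"USD", "GBP", "EUR"}:
        findings.append(Finding("warning", "Unrecognised currency; please confirm"))
    blocked = any(f.severity == "error" for f in findings)
    return (not blocked, findings)

ok, findings = validate_submission(
    {"unique_market_ref": "B0999XYZ", "currency": "ZZZ", "sov_attached": False}
)
print("Proceed:", ok)                      # True -- warnings only
for f in findings:
    print(f"{f.severity}: {f.message}")
```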

This is so much more complicated than simply defining a data standard and trading it on an exchange. And these issues need solving first.

Of course spreadsheet SOVs and PDF slips are not fit for purpose. Of course they belong in the dustbin of history. Of course modern standards are better. But any suggestion that the solution is as simple as putting in a "Data PPL MkII with a CDR" is a dangerous delusion.

Business problems to solve, not just data ones

So, that was all a bit negative, wasn't it?!

And so it will remain until we recognise that what appear to be data problems, solvable by a relatively simple data project like the CDR and whatever hopefully comes after it, are actually contingent upon a wide array of complex business and logistical problems, some of which are not going to be quite so easy to resolve.

But if we fail to do this, we'll be in trouble and the integrated, digital marketplace we aspire to be will never be given the chance to work.

So my message to the Data Council is this: the overriding imperative of our modernisation programme is to reduce costs and ensure that the modernised, integrated, digital London market remains competitive in the global marketplace. To achieve that, we will need to consume more data, consume it more quickly and make better use of it. But before that can happen, we must ensure that every organisation in the chain between the customer and the ultimate insurer is incentivised to provide that data in a format in which it can be readily validated and consumed. Every one of them has a potential stake in the modernisation dividend that we seek. The dividend will not be all ours to spend.

© Jeff Ward, 2022

#data #lloydsoflondon #brokers #insurance #MGA #insurtech #insuretech #innovation #digitaltransformation #datamodel

Helen Wright

Managing Director at Lysander PR

2y

A great article Jeff and well worth a second read

Bent Isachsen

Visionary Operations Director with broad Operations and IT background in the Insurance Industry

3y

Jeff, such a good read. Next time you are up in London let me know, it would be good to catch up. Difficult not to respond to your article with some thoughts... We will never be ready for data at the front end of the underwriting process. A standard can only be applied to a "static" data set like a firm order, i.e. the A&S process in a subscription market. Trying to standardise insurance companies' USPs is a lost cause. How I look at the CDR is that it is only looking at supporting the A&S process and enabling the iMRC, which in itself will be a huge step in the right direction... so looking forward to a reduction in A&S queries and sluggish cash movement. Leaving the DUA and claims improvement comments for another day. Speak soon // BHBI

Ali Dove

An independent insurance expert dedicated to empowering insurance firms with comprehensive change and transformation solutions through excellent communication, stakeholder and relationship management.

3y

Sheer brilliance Mr Ward! Yes, a long read but so worth it!

Tony Moore

Senior Product Manager at PPL Placing Platform Limited

3y

Nice article Jeff. Despite knowing the enormity of the task, I am glad to be one of the army of people doing my bit toward the collective endeavour. We just mustn't lose faith and give up as has happened so many times before. And if our first steps aren't perfect (which they surely won't be), keep on working to edge ever closer and closer.

Stuart Pembery FBCS

Fractional CTO | 30 years in Global Specialty Insurance | AI Automation Expert | Fellow, British Computer Society

3y

Amazing that you are so positively engaged after all these years! I was on that 1991 CCR committee and have decided to think more tactically for the time being. Keep up the good work
