The following is an opinion piece written by James Burton, director of product management, insurance, UK and Ireland, LexisNexis Risk Solutions. The views expressed within the article are not necessarily reflective of those of Insurance Business.
2017 was one of the worst years on record for insurance losses caused by natural catastrophes. If the extreme weather we have experienced in recent years becomes the ‘norm’, it is set to cost the UK billions and put 2.5 million homes at risk.
The launch of Flood Re, on April 4, 2016, as the world’s ‘first’ flood reinsurance scheme, has helped to maintain affordable insurance for homeowners in high-risk areas. Its aim is to keep down the cost of quality insurance cover for flood-risk homes by spreading the risk across the industry, while working to bring about the conditions for a transition back to a free market over a period of 25 years. So far, it seems to be doing just that: the scheme recently announced that, for the first time, it will not pass on to insurers an annual increase in premium thresholds.
Homeowners need to have access to affordable flood insurance and Flood Re is a vital part of futureproofing that provision.
As well as this development, there have been significant advances in perils risk data mapping over the years to give insurance providers an immediate understanding of environmental risks such as flood, for both domestic and commercial properties.
This has helped ensure customers are priced as accurately and as fairly as possible, while enabling the sector to prepare for and protect itself from the losses associated with environmental risks.
For many years, underwriting perils relied on risk data at postcode level. However, detailed analysis of perils risks right down to an individual address is now standard practice, with mapping solutions that deliver a score for the level of risk posed to that one address. Given there are, on average, 15 addresses in one postcode (and often many more), this ability to understand risk at the property level is essential: it is not just fairer for the customer, but it can also improve quotability for the insurance provider.
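The difference between postcode-level and address-level scoring can be sketched in a few lines. This is purely illustrative: the postcode, addresses, and scores below are hypothetical, and `quote_score` is an invented helper, not any vendor's API.

```python
# Hypothetical data: one postcode-level score vs distinct per-address scores.
POSTCODE_FLOOD_SCORE = {"EX2 4AA": 60}  # every address in the postcode shares this

ADDRESS_FLOOD_SCORE = {
    ("EX2 4AA", "1 Riverside Walk"): 95,   # backs onto the river
    ("EX2 4AA", "14 Hilltop Close"): 12,   # on elevated ground
    ("EX2 4AA", "7 Mill Lane"): 48,
}

def quote_score(postcode: str, address: str) -> int:
    """Prefer the address-level score; fall back to the postcode score."""
    return ADDRESS_FLOOD_SCORE.get(
        (postcode, address), POSTCODE_FLOOD_SCORE.get(postcode, 0)
    )

# The hilltop property is no longer priced on its riverside neighbour's risk.
print(quote_score("EX2 4AA", "14 Hilltop Close"))  # 12, not the blanket 60
print(quote_score("EX2 4AA", "1 Riverside Walk"))  # 95
```

Under postcode-only pricing, both properties would receive the same score of 60; address-level data separates the low-risk property from its high-risk neighbour.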
But we are now going beyond this capability, with scoring for flood risk based on the property footprint. This is hugely valuable in the commercial property arena, where an address can cover a large area and the flood risk at the back of a building can be quite different from the risk at the front.
Mapping and risk data analysis tools provide this fine level of detail as well as the broader view of risk accumulations, which are so vital in helping insurance providers understand where they might be overexposed and how they could be impacted by an extreme event.
Of course, flooding is just one of the risks insurance providers need to assess for property cover. So, alongside flood data, using the same mapping solutions, they can overlay climate, subsidence, fire and other risk data for individual properties instantly. Modern approaches to accumulation calculation even factor in the risk of fire spreading between buildings, based on their proximity to one another and their height.
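To make the accumulation idea concrete, here is a minimal sketch of that kind of calculation. Everything here is an assumption for illustration: the `fire_spread_factor` formula, the weights, and the sample buildings are invented, not a real model used by any insurer.

```python
import math

def fire_spread_factor(distance_m: float, neighbour_height_m: float) -> float:
    """Toy adjustment: a taller neighbour standing closer pushes the
    factor further above 1.0 (hypothetical formula for illustration)."""
    return 1.0 + (neighbour_height_m / 10.0) * math.exp(-distance_m / 20.0)

# (sum insured in GBP, distance to nearest neighbour in m, neighbour height in m)
buildings = [
    (2_000_000, 5.0, 30.0),   # dense city block next to a tall building
    (1_500_000, 50.0, 10.0),  # detached unit, low-rise neighbour
]

# Accumulated exposure: each sum insured, scaled by its spread factor.
accumulation = sum(
    sum_insured * fire_spread_factor(dist, height)
    for sum_insured, dist, height in buildings
)
print(f"Adjusted accumulation: £{accumulation:,.0f}")
```

The point of the exercise is that the adjusted accumulation exceeds the plain sum of the sums insured, because proximity and height amplify the exposure of the densely packed property far more than the detached one.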
In the future we may see further datasets incorporated as part of these solutions and delivered at point of quote. Past claims data related to the property pooled into a contributory database, for example, would be very powerful in helping to understand the risk of a future claim. Where commercial property insurance is concerned, there is also the potential to include data on the people behind the business as well as their policy history data.
Fundamentally, mapping technology has taken the understanding of property risk to a whole new level, reducing the complexity of the manual task property insurers would have faced just a few years ago by delivering both a micro and a macro view of risk. This holistic view allows for better management of overall exposure, and we continue to support that evolution.