The insurance industry may be risk-minded, but that hasn’t stopped it from continuing to use technology systems and practices one computer scientist calls “disturbingly” insecure.
Dr. Bill Curtis, chief scientist at CAST Research Labs, is one of the lead authors of “CRASH Report Insurance 2016,” which analyzes the structural quality of IT applications built on different software technologies across 38 insurance companies in eight countries. Among other findings, Curtis and his co-authors determined that insurance firms, particularly those in North America, are among the least secure businesses in the financial services sector.
Built primarily on legacy COBOL technology – a coding language so old Curtis says someone wanting to learn it would “need to enter the department of archaeology” – insurer systems are vast, unwieldy and prone to system outages and privacy breaches.
“Because of the complexity of the systems and the piecemeal, bolt-on solutions [software] engineers often put in place in a hurry, insurers’ applications are disturbingly less secure and more likely to experience outages,” Curtis told Insurance Business America.
Indeed, of the more than 6.8 million violations of unit-level code quality rules detected in the insurance sample, 28% were rated as “high severity.” In-house applications were shown to perform slightly better than outsourced applications, but a lack of resources means many smaller insurance companies may be at a disadvantage on that front.
The security problem is worsened in the US by its regulatory system, which subjects insurers to 50 sets of differing rules that must be addressed within the legacy systems, often with deadlines that approach far too swiftly for software engineers to update the technology in a secure way.
“These systems are huge and not well-documented, so it becomes harder to make changes to them,” Curtis said. “With required regulatory changes, it becomes harder and takes longer to create solutions than it does in the EU, where there is more consistency in regulations.”
Thanks to their size, the older legacy systems are also vulnerable to simple hacks like SQL injections that can gather secure information like credit card data and pull it from the app.
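The injection attack described above exploits applications that splice user input directly into a SQL statement. A minimal sketch of the flaw and its standard fix follows, using Python’s built-in sqlite3 module; the table and column names are hypothetical, not drawn from the report.

```python
import sqlite3

# Hypothetical policy table holding sensitive payment data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (holder TEXT, card_number TEXT)")
conn.execute("INSERT INTO policies VALUES ('alice', '4111-1111-1111-1111')")

# Attacker-controlled input designed to rewrite the WHERE clause.
user_input = "nobody' OR '1'='1"

# Vulnerable: string concatenation lets the input become part of the SQL,
# so the condition is always true and every card number is returned.
leaked = conn.execute(
    "SELECT card_number FROM policies WHERE holder = '" + user_input + "'"
).fetchall()

# Safe: a parameterized query treats the input as a literal value,
# so the malicious string matches no policyholder and nothing leaks.
safe = conn.execute(
    "SELECT card_number FROM policies WHERE holder = ?", (user_input,)
).fetchall()

print(len(leaked), len(safe))  # the concatenated query leaks a row; the parameterized one returns none
```

Parameterized queries (or an equivalent prepared-statement API) are the routine defense, but retrofitting them across a vast, poorly documented legacy codebase is exactly the kind of change Curtis describes as slow and difficult.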
Yet insurers continue to use the older technology, in part because it’s adept at servicing older policies, but also because the systems’ size and complexity make them next to impossible to replicate in a timely and efficient way. Curtis says COBOL technology in particular is also comparatively fast and even comparatively secure, processing a massive number of queries per second without exposure to the web.
“A lot of companies are of the mindset that it does what it’s supposed to do and if it ain’t broke, don’t fix it,” he said.
But that doesn’t mean insurers shouldn’t be mindful of the systems’ vulnerabilities. A thorough analysis of the quality of software being used at the system level is key to increasing security, as is having a company-wide policy of identifying and addressing security problems with the necessary resources – despite the time drain.
“You can’t just build brand new functionality, so it’s got to be about monitoring risk and ensuring you have high-quality, more secure software,” said Curtis. “This problem isn’t going to go away. The biggest risk for businesses used to be bad investment, and now it’s the possibility of getting hacked.”