Insurers Lag in Data Testing

Syed Haider
Insurance Experts' Forum, August 27, 2012

The insurance industry depends on reliable and timely data. That data originates from a variety of systems, such as policy issuance, customer service, billing and claims. Aggregating and integrating large volumes of data from such a plethora of sources is no trivial problem for data architects. Yet what's even more challenging than developing such projects is testing them.

While the application-testing landscape is fertile with fresh ideas and offerings, the terrain of data testing remains surprisingly barren. A quick search on Amazon demonstrates a curious truth: the number of texts dedicated to testing data applications is woefully low compared with those covering software testing. Why is that, one wonders, especially if one is tasked with architecting a data project that has few or no application/user-interface (UI) layers? A few quick explanations come to mind:

First, software engineering as a discipline takes a holistic view of application development and doesn't treat database development as a separate concern deserving special treatment. Consequently, we see a healthy proliferation of testing tools, patterns and best practices at the UI and application layers, with little or no attention paid to data testing. Notions such as continuous integration, test-driven development and code coverage are firmly established in application development communities. Yet the testers and developers of non-UI, data-heavy applications, such as data warehouses and business intelligence (BI), struggle to get even rudimentary automation right.
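To make that concrete, here is a minimal sketch, in Python, of the kind of rudimentary automation in question: a pair of ordinary unit tests asserting that a load produced no duplicate and no missing policy keys. The table and column names are purely illustrative, and an in-memory SQLite database stands in for the real warehouse:

import sqlite3

import pytest


@pytest.fixture
def conn():
    # In practice this connection would point at the warehouse or staging area;
    # SQLite keeps the example self-contained and runnable.
    c = sqlite3.connect(":memory:")
    c.execute("CREATE TABLE staging_policies (policy_id TEXT, premium REAL)")
    c.executemany("INSERT INTO staging_policies VALUES (?, ?)",
                  [("P-1001", 1200.00), ("P-1002", 850.50)])
    yield c
    c.close()


def test_no_duplicate_policy_ids(conn):
    # A duplicate business key usually signals a faulty join or a double load.
    dupes = conn.execute(
        "SELECT policy_id FROM staging_policies "
        "GROUP BY policy_id HAVING COUNT(*) > 1").fetchall()
    assert dupes == []


def test_no_missing_policy_ids(conn):
    nulls = conn.execute(
        "SELECT COUNT(*) FROM staging_policies WHERE policy_id IS NULL").fetchone()[0]
    assert nulls == 0

Checks like these run in seconds under any standard test runner, which is exactly the kind of baseline automation application teams take for granted.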

Second, database vendors have by and large placed little emphasis on enforcing development discipline on their platforms. With almost no built-in support for quality, it can be a challenge to enact even basic quality controls on a project. Consider, for instance, concepts like ‘project’, ‘build’ or ‘unit test’. Such notions were introduced in software development to improve the manageability of complex projects and to establish a quality baseline. (A broken build, for example, isn't considered working software.) No such concepts exist on database platforms, making the task of organizing large code bases and tracking quality an extremely daunting one.
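Nothing prevents a team from approximating that "broken build" signal itself, however. The script below is a minimal, hypothetical sketch that treats the database schema like a build: it applies every DDL script in an assumed ddl/ directory to a scratch SQLite database and exits non-zero if any statement fails, giving a continuous integration job something concrete to break on:

import glob
import sqlite3
import sys


def build_schema(ddl_dir="ddl"):
    """Apply all DDL scripts to a scratch database; return a process exit code."""
    conn = sqlite3.connect(":memory:")  # throwaway database, discarded after the check
    for path in sorted(glob.glob(f"{ddl_dir}/*.sql")):
        with open(path) as script:
            try:
                conn.executescript(script.read())
            except sqlite3.Error as exc:
                print(f"Schema build FAILED in {path}: {exc}")
                return 1
    print("Schema build succeeded.")
    return 0


if __name__ == "__main__":
    sys.exit(build_schema())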

Third, tools and technologies like extract, transform, load (ETL) and BI that are integral to most contemporary data architectures are relatively new and undergoing frequent revision. Built-in support for testing is slowly emerging in this space, though it's nowhere near as comprehensive as the well-entrenched techniques of the application development arena.
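Until that support matures, much of it has to be hand-rolled. A common example is a post-load reconciliation check that compares a row count and a summed measure between the source extract and the loaded target; the sketch below assumes two DB-API-style connections and uses placeholder table and column names:

def _count_and_sum(conn, table, measure):
    # One pass over the table: how many rows, and what do they add up to?
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM({measure}), 0) FROM {table}")
    return cur.fetchone()


def reconcile(source_conn, target_conn,
              source_table="src_claims", target_table="dw_claims",
              measure="paid_amount"):
    src_count, src_sum = _count_and_sum(source_conn, source_table, measure)
    tgt_count, tgt_sum = _count_and_sum(target_conn, target_table, measure)

    problems = []
    if src_count != tgt_count:
        problems.append(f"row count mismatch: {src_count} source vs {tgt_count} target")
    if abs(src_sum - tgt_sum) > 0.01:
        problems.append(f"{measure} total mismatch: {src_sum} vs {tgt_sum}")
    return problems  # an empty list means the load reconciles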

As a result of the observations above, we witness a tendency among IT managers to treat data testing the old-fashioned way: hiring second-class resources and wasting excessive man-hours on repetitive, manual processes, hours that could otherwise have gone into more impactful quality assurance.

Yet those teams start feeling overwhelmed as data volumes and integrations grow and customer demands become more complex. Consequently, data quality suffers, and it's all downhill from there.

Sound familiar? We’ve been there too, and over the years have learned some common-sense remedies that we’ll be sharing in the second part of this blog.

Syed Haider is an architect with X by 2, a technology company in Farmington Hills, Mich., specializing in software and data architecture and transformation projects for the insurance industry.

Readers are encouraged to respond to Syed using the “Add Your Comments” box below. He can also be reached at shaider@xby2.com.

This blog was exclusively written for Insurance Networking News. It may not be reposted or reused without permission from Insurance Networking News.

The opinions of bloggers on www.insurancenetworking.com do not necessarily reflect those of Insurance Networking News.

Comments (1)

I agree; data is the heart of any system and should be at the heart of testing also. This whitepaper discusses some of the practical approaches to the challenge. http://www.origsoft.com/whitepapers/data-strategies-application-testing/

Posted by: George W | August 29, 2012 7:27 AM

