Enterprising Developments

Data Integration as a Repeatable Business Process

Joe McKendrick
Insurance Experts' Forum, May 8, 2013

Launching a new coverage line? Moving accounts off the mainframe? Taking over a smaller company?

Let's not mince words: data integration is hard work. There are a lot of data sources that need to be vetted, cleansed, transformed and brought into the fold.

Perhaps there is a way to establish a system flexible enough to absorb any and all new data that comes its way. In a recent webcast, I had the opportunity to join Dr. Claudia Imhoff and John Schmidt, VP of global integration services at Informatica Corporation, in a discussion of the best practices that will enable more seamless data integration as organizations change.

What's required is an adaptable architecture — something we call a next-generation data integration architecture — one that can grow and change while enabling “one-click integration” as new data sources are brought in.

John Schmidt, who is also co-author of "Lean Integration," laid out his vision of what such an architecture should look like.

For example, data integration becomes a business process, not an IT activity. This is an important step toward getting greater value from integration work; IT still plays a key role as an enabler of the process. The key is that data integration is baked into the business and largely automated, rather than treated as a special, one-off effort that starts from scratch every time a new data set is brought in. Data integration needs to be a repeatable process, one that occurs almost without prompting.
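To make that concrete, here is a minimal sketch, in Python with entirely illustrative names (SourceConfig, run_pipeline and the sample fields are assumptions for this post, not anything from Informatica), of what a config-driven, repeatable pipeline can look like. Every source runs through the same vet, cleanse and transform steps, so onboarding a new feed becomes a registry entry rather than a from-scratch project:

# A sketch of a repeatable, config-driven integration pipeline.
# All names here are illustrative, not from any specific product.

from dataclasses import dataclass
from typing import Callable, Iterable

Record = dict

@dataclass
class SourceConfig:
    name: str
    reader: Callable[[], Iterable[Record]]   # how to pull raw records
    required_fields: list[str]               # vetting rule for this source
    field_map: dict[str, str]                # source field -> canonical field

def vet(record: Record, cfg: SourceConfig) -> bool:
    # Reject records missing any required field.
    return all(f in record and record[f] not in (None, "") for f in cfg.required_fields)

def cleanse(record: Record) -> Record:
    # Trim stray whitespace from string values.
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

def transform(record: Record, cfg: SourceConfig) -> Record:
    # Rename source fields to the canonical model.
    return {canonical: record[src] for src, canonical in cfg.field_map.items() if src in record}

def run_pipeline(cfg: SourceConfig) -> list[Record]:
    # The same vet -> cleanse -> transform steps run for every registered source.
    return [transform(cleanse(r), cfg) for r in cfg.reader() if vet(r, cfg)]

# "One-click integration": onboarding a new coverage line is just a registry entry.
sources = [
    SourceConfig(
        name="new_coverage_line",
        reader=lambda: [{"policy_no": " P-100 ", "holder": "A. Smith"}],
        required_fields=["policy_no"],
        field_map={"policy_no": "policy_id", "holder": "insured_name"},
    ),
]

for cfg in sources:
    print(cfg.name, run_pipeline(cfg))

The design choice doing the work here is that per-source knowledge lives in configuration while the process itself is shared, which is what lets integration repeat almost without prompting as new sources arrive.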

John adapts a term from management guru Jim Collins (whose original phrase is the “Big Hairy Audacious Goal”), describing the elevation of data integration to an enterprise process as a “Big Hairy Audacious Vision.” As he puts it: “data integration is a business process just like sales, marketing, order fulfillment, invoicing, etc. Data Integration encourages business user self-service, where IT is an enabler and provides technical support and validation just like for other business processes.”

Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology.

Joe can be reached at joe@mckendrickresearch.com.

This blog was exclusively written for Insurance Networking News. It may not be reposted or reused without permission from Insurance Networking News.

The opinions of bloggers on www.insurancenetworking.com do not necessarily reflect those of Insurance Networking News.
