
A Lesson from the Financial Crisis: We Need Better Data Governance

Joe McKendrick
Insurance Experts' Forum, September 22, 2009

Last week, the U.S. House Financial Services Committee’s Subcommittee on Oversight and Investigations took up the issue of how technology can be employed to improve TARP and financial services oversight. Technology can always be improved upon, of course. But an additional element is needed to better manage things that get out of control—greater trust in the information our systems are generating.

Testimony by Dilip Krishna, a specialist in risk and financial management for Teradata’s financial services and insurance organization, summed up the challenge facing an industry built on trust: trust breaks down in the absence of transparency. We have plenty of technology for slicing, dicing and parsing data, but that’s not going to prevent another meltdown such as we saw at the end of 2008.

Obviously, we can’t have complete, open transparency to all outsiders; we need to maintain confidentiality of customer records. But the trust factor is key.

“Trust lies at the heart of transparency,” Krishna said. “It is only in unusual circumstances, or at very high cost, that financial information can be demonstrated to be completely authentic.”

Being able to trust the data or information on which we base decision-making is a vital part of this formula. Krishna called this process the “information assembly line,” in which “data needs to be complete and detailed while it is transformed into useful information as it moves from the transaction systems to the point of disclosure. Confidence in the reported information can only be gained when there is confidence in the robustness of the assembly line—for example, via knowledge that all changes during the process of creating the information are fully audited and controlled.”
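To make the idea concrete, here is a minimal sketch in Python of what one step of such an "information assembly line" might look like, where every change applied to a transaction record is logged before the data moves toward disclosure. The names (audited_transform, AuditRecord) and the sample rules are purely illustrative assumptions, not anything prescribed by Krishna or Teradata.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, Callable, Dict, List, Tuple


@dataclass
class AuditRecord:
    """One traceable change made along the information assembly line."""
    field_name: str
    before: Any
    after: Any
    rule: str
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


def audited_transform(
    record: Dict[str, Any],
    rules: Dict[str, Callable[[Any], Any]],
) -> Tuple[Dict[str, Any], List[AuditRecord]]:
    """Apply named transformation rules to a transaction record,
    logging every change so downstream consumers can verify it."""
    result = dict(record)
    audit_trail: List[AuditRecord] = []
    for field_name, rule in rules.items():
        before = result.get(field_name)
        after = rule(before)
        if after != before:
            result[field_name] = after
            audit_trail.append(AuditRecord(field_name, before, after, rule.__name__))
    return result, audit_trail


# Hypothetical cleanup rules for a raw transaction feeding a disclosure report.
def to_usd_cents(amount):
    """Store monetary amounts as integer cents."""
    return round(float(amount) * 100)


def standardize_currency(code):
    """Normalize currency codes to upper case."""
    return str(code).strip().upper()


raw = {"amount": "1052.50", "currency": " usd "}
clean, trail = audited_transform(
    raw, {"amount": to_usd_cents, "currency": standardize_currency}
)
for entry in trail:
    print(entry)
```

The point of the sketch is simply that the transformation and its audit trail are produced together; confidence in the reported figures comes from being able to replay exactly what was changed, by which rule, and when.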

How does an organization go about assuring that its “information assembly line” is delivering the most credible and timely information possible? The answer is data governance.

In a new book, The Data Asset: How Smart Companies Govern Their Data for Business Success, Tony Fisher, president and CEO of DataFlux, a part of SAS, writes that “data governance and data quality should never be considered a one-time project. A quality culture must be established as an ongoing, continuous process.” A strong, sustainable data governance program assures transparency, he adds, yet too few organizations currently approach data governance this way.

Achieving a proactive state of data governance takes time and needs to be approached in incremental stages, Fisher says. He recommends establishing a data governance center of excellence. Additionally, “business analysts, working through data stewards, start to control the data management process, with IT playing a supporting role.”

Organizations striving for effective data governance, he says, “need to think less about functions, less about data, and more about processes.” The mark of a proactive organization is that it makes “information management decisions based on the need to improve the business rather than the need to improve the IT infrastructure.”

As financial organizations continue to repair themselves in the wake of the financial crisis, perhaps it’s time to establish stronger governance over the creation and management of the data that is now driving these businesses.

Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology. He can be reached at joe@mckendrickresearch.com.

The opinions of bloggers on www.insurancenetworking.com do not necessarily reflect those of Insurance Networking News.
