
4 Tips for Defining Your Approach to Big Data

Roger Hartmuller
Insurance Experts' Forum, February 18, 2014

If you’re like me, you’re noticing that everywhere you turn someone is writing about big data. Initially, it was the technology press. Then, business journals and newspapers got into the act. At companies everywhere, senior business leaders and even boards of directors are determining the best ways for their organizations to deal with big data. I like to joke that we don’t need to add the word “big” in front of every utterance of data, but for those who are new to big data, here are four factors to consider:

  1. Strategy. Have an outline of how your company intends to leverage data. Almost any technology organization can download an open-source copy of Hadoop, install it on some commodity boxes and declare itself on the big data bandwagon. Unfortunately, this misses the point. It is critical to understand what roles data and analytics play in your company and how you want to leverage them in the future. A data strategy, then, needs to be informed by the answers to these questions: Does the data strategy support your corporate and IT strategy? Will it be a driver for improving results? Will it help your company make better business decisions?
  2. Change. Just as the real world is not static, know that data is ever-changing. Big data technologies let you work with much larger datasets at far faster processing speeds, and innovations such as in-memory processing break through old speed barriers. Real-time processing shifts the paradigm from a store-then-process mindset to one of acting on data as it arrives. Even as storage costs fall, it will be more effective to process data in real time to get to the true signal than to spend money trying to store the complete data set (a simple illustration follows this list).
  3. There are four Vs. When defining big data, there are four Vs, not just the three the industry talks about: velocity, variety and volume. The fourth is veracity. In other words, it is important that data is accurate and reliable, and that you understand which uses it is fit for. Without veracity you can very quickly end up with broken processes or erroneous insights (the second sketch after this list shows a basic check of this kind).
  4. Good governance. Enterprises investing in big data need to put more emphasis on data governance than ever before. Neglecting it is a recipe for a data-quality disaster, and it can lead to very poor management decisions or ineffective real-time analytics. Careful management of your data is an absolute requirement for getting the most out of your big data efforts.
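To make the store-versus-stream distinction in point 2 concrete, here is a minimal Python sketch. It is not from Roger's post; the incoming_events() feed and the field names are hypothetical stand-ins for a real event source. The batch approach keeps every record before analyzing it, while the real-time approach keeps only the running signal.

```python
# Minimal sketch contrasting "store and process" with processing data as it
# arrives. incoming_events() and its fields are hypothetical stand-ins.

def incoming_events():
    """Stand-in for a real-time feed, e.g. telematics or claims events."""
    yield from [{"policy_id": "P-100", "loss_amount": 1200.0},
                {"policy_id": "P-101", "loss_amount": 450.0},
                {"policy_id": "P-100", "loss_amount": 300.0}]

# Store-and-process mindset: keep everything, analyze later.
stored = list(incoming_events())            # storage grows with the data set
batch_total = sum(e["loss_amount"] for e in stored)

# Real-time mindset: keep only the running aggregate you care about.
running_total = 0.0
event_count = 0
for event in incoming_events():
    running_total += event["loss_amount"]   # update the signal in-stream
    event_count += 1
    # a real system would update a dashboard or trigger an alert here

print(batch_total, running_total, event_count)
```

Both paths arrive at the same total, but only the first requires storing the complete data set along the way.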
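And to illustrate points 3 and 4, here is a similarly hedged sketch of a basic veracity check: records that fail simple quality rules are quarantined before they can feed analytics. Again, the field names and rules are illustrative assumptions, not anything prescribed in the post.

```python
# Minimal sketch of a veracity/governance gate: records failing basic
# quality rules are set aside for remediation instead of entering analytics.

from datetime import date

def is_valid(record):
    """Apply simple data-quality rules: completeness, plausibility, freshness."""
    if not record.get("policy_id"):                          # completeness
        return False
    if record.get("loss_amount", -1) < 0:                    # plausibility
        return False
    if record.get("report_date", date.min) > date.today():   # no future dates
        return False
    return True

records = [
    {"policy_id": "P-100", "loss_amount": 1200.0, "report_date": date(2014, 2, 1)},
    {"policy_id": "",      "loss_amount": 450.0,  "report_date": date(2014, 2, 3)},
    {"policy_id": "P-102", "loss_amount": -50.0,  "report_date": date(2014, 2, 5)},
]

clean = [r for r in records if is_valid(r)]
quarantined = [r for r in records if not is_valid(r)]

print(len(clean), "usable records;", len(quarantined), "sent back for remediation")
```

The rules themselves matter less than the discipline of applying them consistently before the data is used.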

I do believe big data will continue to change how business is conducted, reshaping the way companies interact with consumers. But it will be the businesses with the discipline to stay focused on data quality that reap the most rewards, whether they use the word “big” or not.

Roger Hartmuller is VP of Allstate’s information services group.




Comments (1)

Roger, very nice article on big data. With the explosion of big data, companies face data challenges in three areas. First, you know the type of results you want from your data, but they are computationally difficult to obtain. Second, you know the questions to ask but struggle with the answers and need to do data mining to help find them. Third is data exploration, where you need to reveal the unknowns and look through the data for patterns and hidden relationships. The open-source HPCC Systems big data processing platform can help companies with these challenges by making it quick and simple to derive insights from massive data sets. Designed by data scientists, it is a complete integrated solution, from data ingestion and processing to data delivery. Its built-in Machine Learning Library and matrix processing algorithms can assist with business intelligence and predictive analytics. More at http://hpccsystems.com

Posted by: HAANA M | February 20, 2014 1:51 PM
