Enterprising Developments

Big Data, Cheap Storage

Joe McKendrick
Insurance Experts' Forum, June 13, 2011

When you think back to the Y2K issue, the main reason programmers didn't use four-digit year fields was to save a couple of bytes of valuable disk space. And rightly so – disk space cost about $200 a megabyte in 1980.

Now, many rightly worry about the onslaught of “Big Data” – unstructured files, graphics, audio, video, and social media data – that is threatening to overwhelm our data centers. Many organizations have databases supporting hundreds of terabytes of data.

If they had to pay 1980 prices for, say, 500 TB of disk, that would amount to a $100 billion price tag, just for storage.

But that isn't the case today. The cost of raw storage for a site with 500 TB is about $41,050, or roughly 0.00004% of what it would have been three decades ago.
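For the curious, the arithmetic behind those figures is simple. Here is a back-of-the-envelope sketch in Python that uses only the numbers cited in this post ($200 per megabyte in 1980, roughly eight cents per gigabyte today, 500 TB of capacity, decimal units):

# Back-of-the-envelope storage cost comparison, using the figures cited in the post
capacity_gb = 500 * 1000                  # 500 TB expressed in gigabytes (decimal units)

cost_1980 = capacity_gb * 1000 * 200.00   # 1980: about $200 per megabyte
cost_2010 = capacity_gb * 0.0821          # 2010: roughly eight cents per gigabyte

print(f"1980 price tag: ${cost_1980:,.0f}")                              # ~$100,000,000,000
print(f"2010 price tag: ${cost_2010:,.0f}")                              # ~$41,050
print(f"Today's cost as a share of 1980: {cost_2010 / cost_1980:.5%}")   # ~0.00004%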

This information is culled from a simple but very busy website that documents typical storage costs, based on actual product retail pricing, from 1956 to the present day.

Moore's Law and economies of scale keep making storage cheaper and cheaper, down to about eight cents a gigabyte as of late summer 2010, the last entry on the site.

Of course, many enterprises are just as focused on storage costs with cloud services as they are with buying their own on-site storage. Amazon Web Services, for example, charges just under 10 cents per gigabyte per month for storage, which adds up to more than buying your own disk over the long term – though you also don't have to worry about servers, uptime, provisioning, and maintenance.
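To make that comparison concrete, here is a similarly rough sketch. The $0.095 per gigabyte per month is an assumed stand-in for "just under 10 cents," and the comparison ignores power, staffing, drive replacement, and data-transfer charges on both sides:

# Rough cloud-versus-owned-disk comparison for the same 500 TB
capacity_gb = 500 * 1000

s3_monthly_bill = capacity_gb * 0.095     # assumed "just under 10 cents" per GB per month
raw_disk_once   = capacity_gb * 0.0821    # one-time purchase of raw disk, as above

print(f"Cloud storage, per month:  ${s3_monthly_bill:,.0f}")   # ~$47,500
print(f"Raw disk, one-time outlay: ${raw_disk_once:,.0f}")     # ~$41,050

The recurring bill soon overtakes the one-time purchase; that difference is the premium you pay to hand off servers, uptime, provisioning, and maintenance.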

The bottom line is that storage hardware hardly gets a mention in all the discussions about the challenges and opportunities of Big Data. Yet without dirt-cheap storage, there would be no Big Data – and no cloud, for that matter.

Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology.

Readers are encouraged to respond to Joe; he can be reached at joe@mckendrickresearch.com.


