Enterprising Developments

Freedom From Hardware, One Application at a Time

Joe McKendrick
Insurance Experts' Forum, January 17, 2013

Can software-defined data centers help relieve today's congested and overtaxed data centers? That's the hope of many vendors and analysts, who see this emerging approach as a way to finally wrest control of data center operations from the historical limitations of servers, storage arrays and network equipment.

“Is a software-defined data center just existing physical assets with more virtualized aspects, or is it something revolutionary?” asks Patrick Kerpan in a recent post in Wired.

At their core, software-defined data centers are a key piece of private clouds: they abstract data center operations and functions away from the underlying hardware. “The deployment and management of applications and the virtualized compute, storage and networks they are comprised of should exist only as software,” Kerpan writes.

As Kerpan puts it: “Let someone else own the hardware, the guards, the glass, the gas, the batteries, the generators, and the hundreds of people who service them in the world of enterprise IT. We would even advise considering managing infrastructure IT and application IT as very different organization—and to the extreme, never the two shall meet, except for in the form of APIs and the contractual relationships engendered in their use.”

Many organizations are already on their way to software-defined data centers, whether they are actively pursuing it or not. The bottom line is that no one needs to attempt to move their entire data center to being software defined. The process occurs one application at a time. “Dramatic action doesn’t need to be pursued,” says Kerpan. “The ubiquity of APIs, automation, Internet, and 'fast, flat, and fat' physical resources means software-defined data center can be pursued now and deliver ROI one application at a time, not one physical data center at a time.”

The components of a software-defined data center will include network virtualization, image automation, topology automation and file system virtualization.
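To make the idea concrete, here is a minimal sketch of what “one application at a time” can look like in practice: an application's compute, storage and network needs declared purely as software, with placement handled by an API rather than tied to specific physical gear. The `AppSpec` class and `provision` function are hypothetical illustrations, not any vendor's actual API.

```python
from dataclasses import dataclass


@dataclass
class AppSpec:
    """Hypothetical declarative spec: an application's infrastructure as data."""
    name: str
    vcpus: int
    memory_gb: int
    storage_gb: int
    network: str  # a virtual network name, not a physical switch port


def provision(spec: AppSpec) -> dict:
    """Stand-in for a data center API call; returns a software-defined placement."""
    return {
        "app": spec.name,
        "compute": f"{spec.vcpus} vCPU / {spec.memory_gb} GB RAM",
        "storage": f"{spec.storage_gb} GB virtual volume",
        "network": f"virtual network '{spec.network}'",
    }


# Migrating a single application, independently of the rest of the data center:
claims = provision(
    AppSpec("claims-portal", vcpus=4, memory_gb=16, storage_gb=200,
            network="underwriting-net")
)
print(claims["network"])  # → virtual network 'underwriting-net'
```

Because each application carries its own spec, another workload can be moved next week with a second call, which is the incremental, per-application ROI Kerpan describes.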

Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology.

Readers are encouraged to respond to Joe; he can be reached at joe@mckendrickresearch.com.

This blog was exclusively written for Insurance Networking News. It may not be reposted or reused without permission from Insurance Networking News.

The opinions of bloggers on www.insurancenetworking.com do not necessarily reflect those of Insurance Networking News.

