The Pace of Technology Change: How Fast is Too Fast?

Ara Trembly
Insurance Experts' Forum, October 15, 2009

It is no secret that today's insurance industry is not known for cutting-edge technological inventiveness, or even for adopting the latest technologies embraced by other industries.

Part of that situation is no doubt due to the conservative and risk-averse nature of insurers, but part of it may also be due to the lightning-fast pace at which technology in general is advancing. Even outside the technologically phlegmatic insurance arena, technological advances are happening so quickly that one wonders how any enterprise could adequately plan to respond quickly enough to leverage such changes for competitive advantage.  

During the opening insurance general session of the just-concluded Oracle OpenWorld conference in San Francisco, Oracle Insurance’s SVP and GM Rick Connors offered this very relevant observation: “In the new economy, speed and acceleration have changed the name of the game.” In other words, the ability of any enterprise to ascend to, or remain on top of, the business totem pole rests on how quickly it can act and respond to the market. Technology clearly will be the rocket boost that enables such a response.

That got me thinking about quantum physics (I know that seems a little bizarre, but stick with me here). Specifically, I thought about the Heisenberg Uncertainty Principle, which states—in brief—that the more precisely the position of a particle is determined, the less precisely the momentum of the particle is known at that instant, and vice versa. So we can determine how fast something is traveling, or we can determine its exact position, but we can’t know both at the same time.  
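For readers who like to see the analogy in symbols, the principle is commonly written as an inequality relating the uncertainty in position and the uncertainty in momentum:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

Here $\Delta x$ is the uncertainty in position, $\Delta p$ the uncertainty in momentum, and $\hbar$ the reduced Planck constant. The tighter you pin down one quantity, the larger the floor under the other, which is exactly the trade-off the planning analogy below leans on.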

Applying this to our question of technology advancement, I would propose that we can make our business plans based on where that advancement is at this moment, or we can try to gauge the speed of change and make plans for where we think technology will be at some time in the future. It seems obvious that we can’t do both, or at least that we can’t devote our full energies to both courses at once.

If we adopt the first option (planning based on the current technology environment), we can build a solid plan based on proven technologies that represent the best available solutions—for now. How well these technologies and plans will serve our enterprises in a year or two or five is to some extent an unknown, since speed and acceleration are going to take us to a new place where the “older” technologies may have become unworkable or irrelevant.  

On the other hand, if we plan for a future state that has yet to develop, we run the risk of being wrong and having done our work in vain. If we’re right, however, we’ll likely find ourselves in the catbird seat competitively, light-years ahead of our rivals.

So here is the problem: The technology landscape is changing too rapidly for us to precisely plan for the future, but to base our plans on what we know now may carry even more risk in terms of leaving us badly outclassed by competitors who “guessed right.”  

The best solution, it seems to me, is for companies and enterprises to remain flexible (agile) enough to respond to market changes. To do this, we must make sure we are not too tightly tethered to any particular platform, brand or supplier. That doesn’t mean we shouldn’t use the best-of-breed technologies available today. We must, however, be able to make “guesses” on the fly, and we must be able to survive and keep moving if those guesses turn out to be losers.

To be sure, the balance between risk and reward is a delicate one, and each company and enterprise must determine its own tipping point. I could write much more on this topic, but I have to take this phone call from the Journal of Applied Physics. Meanwhile, it may be useful to remember that, as one physicist quipped, “Even quantum physicists don’t understand quantum physics.”

Ara C. Trembly is the founder of Ara Trembly, The Tech Consultant, and a longtime observer of technology in insurance and financial services.

The opinions posted in this blog do not necessarily reflect those of Insurance Networking News or SourceMedia.

