The Pace of Technology Change: How Fast is Too Fast?

Ara Trembly
Insurance Experts' Forum, October 15, 2009

It is no secret that today's insurance industry is not known for cutting-edge technological inventiveness, or even for utilizing the latest technologies embraced by other industries.

Part of that situation is no doubt due to the conservative and risk-averse nature of insurers, but part of it may also be due to the lightning-fast pace at which technology in general is advancing. Even outside the technologically phlegmatic insurance arena, technological advances are happening so quickly that one wonders how any enterprise could adequately plan to respond quickly enough to leverage such changes for competitive advantage.  

During the opening insurance general session of the just concluded Oracle OpenWorld conference in San Francisco, Oracle Insurance’s SVP and GM Rick Connors offered this very relevant observation: “In the new economy, speed and acceleration have changed the name of the game.” In other words, the ability of any enterprise or company to ascend to, or remain on top of, the business totem pole rests on how quickly it can act and respond to the market. Technology clearly will be the rocket boost that enables such a response.  

That got me thinking about quantum physics (I know that seems a little bizarre, but stick with me here). Specifically, I thought about the Heisenberg Uncertainty Principle, which states—in brief—that the more precisely the position of a particle is determined, the less precisely the momentum of the particle is known at that instant, and vice versa. So we can determine how fast something is traveling, or we can determine its exact position, but we can’t know both at the same time.  

Applying this to our question of technology advancement, I would propose that we can make our business plans based on where that advancement is at this moment, or we can try to gauge the speed of change and make plans for where we think technology will be at some time in the future. It seems obvious that we can’t do both, or at least that we can’t devote our full energies to both courses at once.

If we adopt the first option (planning based on the current technology environment), we can build a solid plan based on proven technologies that represent the best available solutions—for now. How well these technologies and plans will serve our enterprises in a year or two or five is to some extent an unknown, since speed and acceleration are going to take us to a new place where the “older” technologies may have become unworkable or irrelevant.  

On the other hand, if we plan for a future state that has yet to develop, we run the risk of being wrong and having done our work in vain. If we're right, however, we're probably going to be in the catbird seat in terms of competitiveness, light years ahead of our competitors.

So here is the problem: The technology landscape is changing too rapidly for us to precisely plan for the future, but to base our plans on what we know now may carry even more risk in terms of leaving us badly outclassed by competitors who “guessed right.”  

The best solution, it seems to me, is for companies and enterprises to remain flexible enough (agile enough) to respond to market changes. To do this, we must make sure that we are not too tightly tethered to any particular platform, brand or supplier. That doesn't mean we shouldn't use the best-of-breed technologies available today. We must, however, be able to make "guesses" on the fly, and we must be able to survive and keep moving if our guesses turn out to be losers.

To be sure, the balance between risk and reward is a delicate one, and each company and enterprise must determine its own tipping point. I could write much more on this topic, but I have to take this phone call from the Journal of Applied Physics. Meanwhile, it may be useful to remember that, as one physicist quipped, “Even quantum physicists don’t understand quantum physics.”

Ara C. Trembly is the founder of Ara Trembly, The Tech Consultant, and a longtime observer of technology in insurance and financial services.

The opinions posted in this blog do not necessarily reflect those of Insurance Networking News or SourceMedia.

