
Building a Website as a Robust Software Application

Kal Nasser
Insurance Experts' Forum, March 12, 2014

The foundation the Internet is built upon is the same today as it was in the early 1990s: the Hypertext Transfer Protocol (the "HTTP" that begins every web address in your browser) and the Hypertext Markup Language (HTML, the syntax that tells the browser how to display content). These two acronyms were coined at a time when web pages consisted mostly of text, rendered by text-based browsers. The original specification described a one-way medium: a way of navigating a repository of static content.

The content delivered via this medium has grown more sophisticated and interactive over the years, incorporating sound, images and submission forms, but that was just part of its evolution. There's been another, slower transition taking place: the shift to full-fledged software applications. The "cloud" is not just a replacement for the hard drive; it also delivers the software we use. That transition isn't new; it started shortly after the beginning, in the form of sites that mimicked software functionality, such as early webmail.

Mac or PC? The answer to that question seems to have emerged: the browser. The medium that manages our data and runs our software has gained independence from both the data and the software. There are no competing Internets to choose from. There is now really one operating system: the Internet. And any browser we choose has to be a neutral tool through which we interact with the data and the software.

The bit about data independence seems obvious. It would seem strange, for instance, to require a special "ESPN TV" to watch ESPN; ESPN just delivers content that can be viewed on any TV. And it's not a stretch to compare a TV channel to a music file, a streaming video or a document: these, too, are just content being delivered to a device. The less obvious part is that the very nature of Internet content is interactive, incorporating user feedback and input, and changing accordingly.

The consumer of such content shouldn't be expected to manage the content independently from the software that it relies on. Imagine a shopping site's order form that requires the buyer to separately download, install and maintain a software application to collect and submit the order!

Overcoming the HTTP Legacy

Ironically, the best way to deliver a seamless integration of content and software to the user is to architecturally decouple the data from the software. The current thinking on how best to implement a web application, known as the "single-page application," is based on this concept. Rather than sending pages of content to the browser (along with the code that manipulates them), the application itself should be delivered to the browser, along with a data interrogation interface that allows targeted updates to the relevant components on demand. This reduces or eliminates the occasions when the entire website has to refresh just to update certain parts of it.
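The targeted-update cycle described above can be sketched in a few lines. This is a minimal illustration, not a real framework: the component registry, the shape of the data payload and the `renderCart` function are all invented for the example, and in a real browser the render step would update the DOM rather than an in-memory object.

```javascript
// In-memory stand-ins for two independent components on the "page".
const components = {
  cart:   { html: "" },
  header: { html: "Policy Store" },
};

// Re-render only the component whose data changed, leaving the rest of
// the page untouched -- the core promise of the single-page architecture.
function renderCart(state, data) {
  state.cart.html = `Cart: ${data.items.length} item(s), total $${data.total}`;
  return state;
}

// Simulated response from a data interrogation interface (e.g. a JSON API).
const payload = { items: ["policy-renewal"], total: 250 };
renderCart(components, payload);

console.log(components.cart.html);   // the cart reflects the new data
console.log(components.header.html); // the header was never touched
```

The point of the sketch is the division of labor: the application code ships once, and thereafter only small data payloads cross the wire, each one driving a narrow, targeted update.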

One problem facing this approach is caused by the very legacy of the web as a network of linked pages. The more advanced a web application becomes, the harder it gets to keep traditional browser behavior from breaking. For instance, the back and forward buttons in every browser were made with hypertext in mind, where each page links to other pages and the user can move back and forth through the browsing history. Techniques have been proposed to handle this, including allowing the website to manipulate its own history within the browser session and to restore the appropriate state when those buttons are clicked.
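The history-manipulation technique above can be modeled with a simple stack. In an actual browser this maps onto `history.pushState` and the `popstate` event; here the view names are invented for illustration, and the class below is a self-contained stand-in for the browser's session history rather than the real API.

```javascript
// Minimal model of how a single-page application keeps the back and
// forward buttons working: each in-app navigation pushes a state entry,
// and going "back" restores the previous state instead of leaving the site.
class AppHistory {
  constructor(initialView) {
    this.stack = [initialView];
    this.index = 0;
  }
  // Analogue of history.pushState: record a navigation without a page load.
  // Navigating after going back discards the now-stale "forward" entries.
  push(view) {
    this.stack = this.stack.slice(0, this.index + 1);
    this.stack.push(view);
    this.index++;
  }
  // Analogue of the back button firing a popstate event.
  back() {
    if (this.index > 0) this.index--;
    return this.stack[this.index];
  }
  current() {
    return this.stack[this.index];
  }
}

const nav = new AppHistory("quote-form");
nav.push("coverage-options");
nav.push("confirmation");
console.log(nav.back());    // "coverage-options"
console.log(nav.current()); // "coverage-options"
```

The application's job is to treat each of these state entries as a snapshot of what should be on screen, so that the back button changes the view rather than abandoning the application.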

This is not just the way of the future: it's the way of the present. A website is no longer merely a form with a submit button, much less a collection of static pages linked via hypertext. It has to be architected from the ground up as a robust software application that delivers a rich experience and a meaningful service to the user. Such an architecture is particularly crucial in the insurance industry, where there's so much potential for customer self-service.

Kal Nasser is a software developer with X by 2, a technology company in Farmington Hills, Mich. 

He can be reached at knasser@xby2.com.

This blog was exclusively written for Insurance Networking News. It may not be reposted or reused without permission from Insurance Networking News.

The opinions of bloggers on www.insurancenetworking.com do not necessarily reflect those of Insurance Networking News.


