Moore's Law And The Future Of Software…


Moore’s law has been an effective predictor of technology pricing and performance…
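The power of that prediction comes from compounding. A quick sketch of the arithmetic, assuming the commonly cited two-year doubling period (the baseline year and span below are illustrative, not from any specific benchmark):

```python
# Moore's law as commonly stated: transistor counts (and, loosely,
# price/performance at a fixed cost) double roughly every two years.
# This toy calculation just shows how quickly that compounds.

def moores_law_factor(years, doubling_period=2.0):
    """Return the expected improvement multiple after `years`."""
    return 2 ** (years / doubling_period)

# From the 8086 era (~1982) to ~2008 is about 26 years,
# i.e. 13 doublings:
print(f"{moores_law_factor(26):,.0f}x")  # prints "8,192x"
```

Thirteen doublings turn a fixed budget into roughly eight thousand times the computing power, which is why each upgrade described below felt transformational rather than incremental.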

My first real experience with the power of Moore’s law came when I upgraded from my old IBM PC with an Intel 8086 to the new PC AT with a 286 processor. I paid about the same as I did for my earlier computer, but found myself looking at an incredible boost in performance. The improvement wasn’t just significant – it was transformational.

I could see creating programs for the new system that I never would have considered viable on the old. Several years later, this experience repeated itself with the introduction of the Intel 80386 – another remarkable leap in speed and capability. We seemed to be on an incredible performance trajectory.

But then, something happened…

Software started to get more complex.

The ‘old guard’ programmers who grew up writing in assembly language on systems with 1K of memory began to fade away. With new, powerful – and lower cost – systems entering the marketplace, it became ‘bad form’ to focus on counting bits and cycles. This new wave of applications became more ambitious in scope and functionality, and more accessible and creative in interactivity and interface design. Anyone writing commercial or corporate applications started moving their development over to some version of ‘C++’, a language better suited than most to the increasing demands being placed on software developers. DOS – the lightweight operating system that dominated the marketplace – found itself being replaced by Microsoft’s Windows, which offered a more robust framework for delivering graphically based applications. And later, with Windows NT, came the HAL – a hardware abstraction layer that hid the details of the machine a program was running on.

This marked the start of one of the most significant transitions in technology history.

Software replacing hardware as the driving force for industry progress…

With software in the driver’s seat, computer based technology started to appeal to a much wider demographic, resulting in the arrival of new suites of applications. And with hardware prices continuing to drop at the rate predicted by Moore’s Law, the appeal of these newer applications broadened significantly. Devices that once required a corporate level budget (or serious geek commitment) now became accessible to a more mainstream, less technologically literate audience. This deflation in hardware prices created a virtuous loop that fed an explosion of new and more diverse applications which in turn resulted in increasingly pervasive market penetration.

The arrival of the internet in the mid-1990s only accelerated the process, moving software into what has come to be known as a “continual beta” state, and pushing harder for the eventual commoditization of PCs – and hardware platforms in general.

That commoditization has become a double-edged sword…

The upside is that the drop in hardware prices has made technology of all kinds almost universally accessible. We have gigabytes of storage in our phones and cameras, and processing power in our PDAs that would have been enough to run an office 15 years ago.

The downside here is that hardware is now a dwindling component of any investment made in a technology based service or solution. It is no longer the main factor in what we pay for our technology. That distinction now rests with software.

And unfortunately, software doesn’t follow Moore’s Law at all…

Traditional software development continues to be an increasingly complex undertaking, and this complexity makes it resource and time intensive – and thus very expensive. There are two very distinct reasons for this increasing complexity. One reason falls into the ‘Apple’ software camp – it isn’t easy to create exceptionally intuitive software for relatively sophisticated tasks (examples are the iPhone and the iLife suite). It requires a significant investment, innovative thinking, and iterative refinement. The other reason is commercial pressure. In most application genres, all of the basic functionality people need is already implemented in the software they own. To remain viable businesses, application developers are forced to create new ‘must have’ features that people will be willing to pay for – features that their competitors do not have. In effect, complexity is being added in a bid to escape commoditization. A good example of this is Microsoft’s Office 2007 productivity suite, an incredibly complex and expensive piece of software in what is essentially a twenty-year-old product category.

Their biggest competition is often the previous version of their software…

As technology consumers, we have become addicted to the historically declining pricing curve fueled by rapidly falling hardware prices. Getting more for less isn’t something we are willing to give up on, and that reluctance is creating a real upheaval in the industry. Some recent software upgrades – most notably Microsoft’s Vista – have failed to gain any real consumer traction due to their cost and complexity. This marketplace backlash has called into question the traditional way software has been developed and sold, and has many firms scrambling to reassess and refocus.

So how is the industry adjusting?…

Some firms see this as an opportunity, and are starting to extend their franchises with Software as a Service (SaaS) business models. Companies like Google and Yahoo are entering the application arena using web based services to replace traditional thick-client categories. These applications already offer many of the basic functions people expect from them, and are continuing to grow in sophistication. Being web based, they add a new social/collaborative dimension not found in their traditional counterparts, but they also fall well short in disconnected use. If you aren’t online, their capabilities are often limited or non-existent. These services are largely ad supported, and offer consumers both free and subscription based options.

Others like Amazon are providing rich platforms for new applications to develop against. These web service and infrastructure providers are looking to jumpstart the development of new applications and services by bringing efficiency and scale to the more commodity based capabilities most development projects depend on.

Traditional application vendors aren’t sitting like dinosaurs waiting to be hit by the meteor. Some, like Microsoft and Adobe, are experimenting with bringing their ‘franchise applications’ onto the web, while at the same time looking at ways to transition their traditional application base over to a ‘software rental’ model. Their goal is to create a revenue annuity that no longer depends on continually pushing updates into the marketplace, though it is unlikely that they will be able to do that in a revenue neutral way.

The wild card in all of this is open source. There are some fantastic applications in all genres coming out of the open source community. Most people are familiar with Linux and Firefox, but there’s a whole range of tools and applications beyond that – OpenOffice, GIMP, Audacity, VLC, Lucene, MySQL, Miranda, WordPress, PDFCreator, and many more.

This model transition in the software industry is really just beginning and has a long way to go before things settle down again. New players will appear and ascend, and some of the big names today may diminish or disappear completely. Whatever the new equilibrium point for the industry ends up being, it will no doubt involve a combination of all of the initiatives discussed here. And I believe it will also reflect a more distributed approach to code craft and a more federated approach to service delivery.

As a result, the economics going forward will absolutely be different from what’s in place today.

They’ll probably operate a lot closer to Moore’s Law…
