Friday 27 August 2010

What's Wrong With IT? - Part 1


Pretty much everything in IT depends on abstractions. In the same way that the driver of a car doesn't really need to know anything about the workings of the internal combustion engine, the typical user of a modern desktop computer has very little idea how it actually works. The abstraction with which they are presented (the desktop metaphor and the windowing interface) is mostly good enough to hide the unpleasantness beneath. Mostly. The abstraction is still leaky enough that they are occasionally presented with a memory dump - we've all seen the Blue Screen of Death on a plasma screen at an airport or railway station. A computer with no such abstraction would require the user to understand and manipulate individual processor opcodes, which would render it unusable for almost everyone.

The history of computing is really a history of successively higher-level abstractions. In the last thirty years the command line (itself a relatively high-level abstraction) gave way to the windowing GUI (e.g. Mac OS and Windows), and the windowing GUI now seems to be gradually giving way to touch-based interfaces such as Apple's iOS (on the iPad, iPhone and iPod) and Google's Android. The level of abstraction on a computer like the iPad is such that its users typically don't regard it as a computer at all. iOS doesn't include features like a CPU monitor because the user doesn't need to care about CPU utilisation; the OS prevents individual applications from choking the system. Nor does it feature anything as passé as a hard disk access light, firstly (and most obviously) because it uses solid-state storage and secondly because, again, the user doesn't need to care. The abstraction is now so high, and so good, that you can use a modern touch-based interface with no real understanding of how anything beneath it works. There is a cost - such computers are less flexible - but this loss of flexibility is not something the typical user cares about. The abstraction is tuned very well to the typical user.

What does any of this have to do with corporate IT? Well, let's compare the skill-set expected of an engineer working on a computer such as the iPad with the skill-set expected of the iPad's typical user. The gap is vast. iPad engineering involves cutting-edge materials science, radio transceivers, software engineering, information architecture, state-of-the-art touch screens and software development kits. The typical iPad user merely needs to master a few touch gestures and type on an on-screen keyboard. The iPad represents the current state of the art in hardware and software engineering; its typical satisfied user could be a grey-haired old lady with a terror of computers. This is not accidental. The upside of the iPad's lack of flexibility and relatively low feature count is that it is comprehensible to the masses.

Now, let's compare the skill-set required of a developer building a software infrastructure such as J2EE or .NET with the skill-set required of the typical corporate developer who uses J2EE or .NET. The two skill-sets are essentially the same. This is because the big software ecosystems are designed so that they can be used to build *anything* - from a tiny phone-based application to a massively scalable web application serving hundreds of thousands of concurrent users. Flexibility begets complexity. The infrastructure developer building J2EE or .NET and the corporate developer building much less inherently complicated applications probably use the same development tools and the same basic languages. Why? Why can't the big infrastructure vendors produce tools which are built from the bottom up for corporate developers, rather than taking massively complicated software ecosystems and trying (and failing) to paper over their inherent, usually needless, complexity by stripping them down for corporate use? These tools impose inappropriate abstractions. Corporate developers should be able to focus on business problems rather than on massive, ever-changing software infrastructures.
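To make the point concrete, here is a sketch of the kind of ceremony the older EJB 2.x programming model (part of J2EE) imposed on even a trivial piece of business logic. The class and method names are hypothetical, and the ejb-jar.xml deployment descriptor that would also be needed is omitted, but the shape is representative: a one-line pricing rule ends up surrounded by remote interfaces, home interfaces and empty lifecycle callbacks that exist purely to serve the infrastructure.

// A sketch of EJB 2.x-style boilerplate. Names are hypothetical;
// the ejb-jar.xml deployment descriptor is omitted.

// Remote interface: every business method must be declared here
// and must throw RemoteException, purely to satisfy the container.
public interface Pricing extends javax.ejb.EJBObject {
    double discountedPrice(double listPrice) throws java.rmi.RemoteException;
}

// Home interface: exists only so the container can manufacture instances.
public interface PricingHome extends javax.ejb.EJBHome {
    Pricing create() throws javax.ejb.CreateException, java.rmi.RemoteException;
}

// The bean itself: five empty lifecycle callbacks for one line of business logic.
public class PricingBean implements javax.ejb.SessionBean {
    public void ejbCreate() {}
    public void ejbRemove() {}
    public void ejbActivate() {}
    public void ejbPassivate() {}
    public void setSessionContext(javax.ejb.SessionContext ctx) {}

    // The only line the corporate developer actually cares about:
    // apply a 10% discount to the list price.
    public double discountedPrice(double listPrice) {
        return listPrice * 0.9;
    }
}

Later revisions of the platform (EJB 3 onwards) stripped away much of this boilerplate, but the underlying point stands: the abstraction was designed around the needs of the infrastructure, not around the business problem.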

So, the first big thing which is wrong with IT is that the tools are written by very bright hardcore software engineers who have limited insight into the actual requirements of corporate developers. We need to find better alternatives.

Friday 6 August 2010

Hello World!


As the famous saying goes, "Prediction is very difficult, especially about the future". However, it seems obvious that the era of the PC as the principal engine of growth for the IT industry is drawing to a close. In the early twentieth century there was a shift from people owning one general-purpose electric motor with various specialised attachments to people owning many motors, each embedded in a single-use device. A similar thing seems to be occurring now. Computers are becoming both less obviously computers and also more ubiquitous, location-aware and permanently connected. I recently replaced my wife's ailing MacBook with an iPad. For her, as for many others, it's probably all the computer she'll ever really need. Ironically, she doesn't even regard it as being a computer. She's scared of computers. Mostly she's scared that if she does something "wrong" she'll "break" it. Not so with the iPad. As the PC is supplanted by the Really Personal Computer (e.g. tablet, app-phone), much of the burden of computation and storage is shifting from the user's own device into the cloud. And what a cloud. Incredibly rich and diverse services are available for free (well, seemingly free). Many enable their users to maintain and develop networks of friends, relatives and business contacts. Others enable their users to manage their day-to-day lives more easily, listen to virtually any music ever recorded or simply share their family photos and videos.

The world of cloud services is rich and user-focussed. The world of corporate IT is the polar opposite. Most IT departments spend their time patching old systems to coax them into continuing to function, or bending them violently in unnatural ways to meet new business requirements far removed from anything the systems' original developers intended. The vast majority of code written by developers has nothing to do with business process. Instead, developers spend most of their time worrying about the infrastructural context in which their code will run. If the highest abstraction always wins, then why are we still engaged in hand-to-hand combat instead of undertaking surgical strikes with the IT equivalent of laser-guided bombs? Users (rightly) expect better. After all, at home they can get exactly what they want. At the office - not so much.

Corporate IT is broken and the users have noticed. There must be something we can do about this. The ongoing purpose of this blog is to explore the architectural shifts occurring in the IT industry and try to suggest ways in which these changes can be exploited to help mend corporate IT.