Pretty much everything in IT depends on abstractions. In the same way that the driver of a car doesn't really need to know anything about the workings of the internal combustion engine, the typical user of a modern desktop computer has very little idea how it actually works. The abstraction with which they are presented (the desktop metaphor and the windowing interface) is mostly good enough to hide the unpleasantness beneath. Mostly good enough. The abstraction is still leaky enough that they are occasionally presented with a memory dump - we've all seen the Blue Screen of Death on a plasma screen at an airport or railway station. A computer with no such abstraction would require the user to understand and manipulate individual processor opcodes, which would render it unusable for almost everyone.

The history of computers is really a history of successive, higher-level abstractions. Over the last thirty years the command line (itself a relatively high-level abstraction) gave way to the windowing GUI (e.g. Mac OS and Windows). The windowing GUI now seems to be gradually giving way to touch-based interfaces such as Apple's iOS (on the iPad, iPhone and iPod Touch) and Google's Android. The level of abstraction on a computer like the iPad is such that its users typically don't regard it as a computer at all. iOS doesn't include features like a CPU monitor because the user doesn't need to care about CPU utilisation; the OS prevents individual applications from choking the system. Nor does it feature anything as passé as a hard disk access light - most obviously because it uses solid-state storage, but also, again, because the user doesn't need to care.

The abstraction is now so high and so good that you could use a modern touch-based interface without any real understanding of how anything below the interface works. There is a cost involved - such computers are less flexible - but this loss of flexibility is not something the typical user cares about. The abstraction is tuned very well to the typical user.
What does any of this have to do with corporate IT? Well, let's compare the skill-set expected of an engineer working on a computer such as the iPad with the skill-set expected of the iPad's typical user. The gap is vast. iPad engineering involves cutting-edge materials science, radio transceivers, software engineering, information architecture and state-of-the-art touch screens and software development kits. The typical iPad user merely needs to master a few touch gestures and type on an on-screen keyboard. The iPad represents the current state of the art in hardware and software engineering; its typical satisfied user could be a grey-haired old lady with a terror of computers. This is not accidental. The upside of the iPad's lack of flexibility and relatively low feature count is that it is comprehensible to the masses.
Now, let's compare the skill-set required of a developer building a software infrastructure such as J2EE or .NET with that of the typical corporate user of J2EE or .NET. Firstly, the skill-sets are essentially similar. This is because the big software ecosystems are designed so that they can be used to build *anything* - from a tiny phone-based application to a massively scalable web application serving hundreds of thousands of concurrent users. Flexibility begets complexity. The infrastructure developer building J2EE or .NET and the corporate developer building much less inherently complicated applications probably use the same development tools and the same basic languages. Why? Why can't big infrastructure vendors produce tools built from the bottom up for corporate developers, rather than taking massively complicated software ecosystems and trying (and failing) to paper over their inherent (and usually needless) complexity by stripping them down for corporate use? The result is tools with inappropriate abstractions. Corporate developers should be able to focus on business problems rather than on massive, ever-changing software infrastructures.
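To make "flexibility begets complexity" concrete, here is a minimal sketch of the ceremony the classic J2EE (EJB 2.x-era) component model imposed for a single trivial business operation. The names here (CustomerService, getBalance and so on) are hypothetical, and in a real deployment each type would live in its own file alongside an ejb-jar.xml deployment descriptor:

```java
import java.rmi.RemoteException;
import javax.ejb.CreateException;
import javax.ejb.EJBHome;
import javax.ejb.EJBObject;
import javax.ejb.SessionBean;
import javax.ejb.SessionContext;

// The remote interface: what the client actually calls.
interface CustomerService extends EJBObject {
    double getBalance(String customerId) throws RemoteException;
}

// The home interface: needed just to obtain an instance.
interface CustomerServiceHome extends EJBHome {
    CustomerService create() throws CreateException, RemoteException;
}

// The bean class: container-mandated lifecycle callbacks
// wrapped around one line of actual business logic.
public class CustomerServiceBean implements SessionBean {
    public double getBalance(String customerId) {
        return 42.0; // hypothetical lookup - the only business logic here
    }

    // Plumbing the container requires, empty in the common case.
    public void ejbCreate() {}
    public void ejbActivate() {}
    public void ejbPassivate() {}
    public void ejbRemove() {}
    public void setSessionContext(SessionContext ctx) {}
}
```

Three types, two framework interfaces and five empty callbacks to expose a single method. The corporate developer pays for scalability machinery - remote invocation, instance pooling, passivation - whether or not the application needs any of it.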
So, the first big thing wrong with IT is that its tools are written by very bright, hardcore software engineers who have limited insight into the actual requirements of corporate developers. We need to find better alternatives.