Wednesday 16 April 2014

John Boyd, The OODA Loop and IT

The work of John Boyd is becoming better known in the IT community. However, this awareness has so far been based on an oversimplified view of Boyd's theories, notably his so-called OODA loop. This is unfortunate, as both the OODA loop and its wider theoretical underpinning offer valuable insight into why IT systems often fail to meet the ongoing challenges of a fluid operating environment.

John Boyd (1927-1997)

Who was John Boyd?

John Boyd was a military strategist. As a young fighter pilot, Boyd was fascinated by why some planes and some pilots succeeded in dogfights while others did not. In order to make meaningful comparisons between fighter planes Boyd created Energy-Maneuverability (EM) theory. EM theory uses mathematical models of the behaviour and capabilities of different aircraft to compare how each would fare against the other in various circumstances. From his analysis of both US and Soviet fighters Boyd drew two conclusions: firstly, that in many cases Soviet fighters were superior to US fighters; secondly, that the metrics then used when designing and procuring aircraft had little relevance to actual air combat. At the time, fighter aircraft design was driven by maximum speed and the ability to carry huge amounts of ordnance. Boyd's EM theory showed that in any likely combat encounter the winning plane would likely be the one which could change its energy state most quickly. For that reason he concluded that fighter planes should be as light, powerful and manoeuvrable as possible, and cheap enough to buy in large numbers. This thinking subsequently led to the development of the F-16, which was both cheaper and more manoeuvrable than its predecessors.
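The core metric of EM theory is specific excess power, P_s = V(T - D)/W: the rate at which an aircraft can gain or shed energy at a given speed. A minimal sketch of the idea follows; the aircraft figures are invented purely for illustration and are not real F-16 or MiG data.

```python
def specific_excess_power(velocity, thrust, drag, weight):
    """Specific excess power P_s = V * (T - D) / W.

    A positive P_s means the aircraft can climb or accelerate from its
    current state; the fighter with the higher P_s across the flight
    envelope can change its energy state faster - Boyd's key measure
    of combat advantage.
    """
    return velocity * (thrust - drag) / weight

# Two hypothetical fighters at the same speed (all figures invented):
# a light, powerful design versus a heavier one with more drag.
light = specific_excess_power(velocity=250.0, thrust=76_000, drag=30_000, weight=90_000)
heavy = specific_excess_power(velocity=250.0, thrust=110_000, drag=60_000, weight=180_000)
print(light > heavy)  # the lighter design can change state faster here
```

Plotting P_s across the whole envelope of speeds and altitudes, rather than comparing top speeds, was precisely the shift in measurement that EM theory introduced.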

Whilst EM theory was focused on understanding the comparative dogfighting merits of fighter planes, Boyd's other important theory - the OODA loop - was concerned with how individuals and groups can understand their changing environment and then make timely, appropriate decisions about how to respond to it.

The OODA loop has four main phases: 

Observe - gather information on the situation around you. 

Orient - build a model of the world using the information gathered together with your cultural and genetic heritage.

Decide - plan how to react to the external situation.

Act - execute the plan.

Iterate. Iterate. Iterate. The quicker you can iterate through the loop the quicker you can react to your environment and, in combat, the more likely you are to defeat your enemy.
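The four phases can be caricatured in code. This is only a sketch of the simplified cyclical reading, with toy stand-in functions; a real agent's observe/orient/decide/act would be whatever sensing, modelling, planning and execution it actually performs.

```python
def ooda_step(environment, heritage, model):
    """One pass through a (deliberately simplified) OODA loop."""
    observations = observe(environment)            # Observe: gather information
    model = orient(observations, heritage, model)  # Orient: rebuild the world model
    plan = decide(model)                           # Decide: plan a response
    environment = act(plan, environment)           # Act: execute the plan
    return environment, model

# Toy stand-ins so the sketch runs: the 'environment' is a number we are
# trying to track, and the 'model' is our current estimate of it.
def observe(env): return env
def orient(obs, heritage, model): return (model + obs) / 2  # blend old model with new data
def decide(model): return model
def act(plan, env): return env + 1  # the world moves on regardless

env, model = 10.0, 0.0
for _ in range(3):  # Iterate. Iterate. Iterate.
    env, model = ooda_step(env, heritage=None, model=model)
print(round(model, 2))
```

The point of the sketch is the tempo argument: the more passes through the loop per unit time, the faster the model converges on a moving environment.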

The 'OODA loop' is often characterised (i.e. misrepresented) as a very simple cycle. Boyd's actual OODA loop is much richer: the Orient phase dominates the cycle, shaped by cultural traditions, genetic heritage, previous experience and new information, and multiple feedback paths run from Decide and Act back into Observe.



The misinterpretation of Boyd's work (especially the OODA loop) is understandable, since he wrote little of it down in a formal manner. Aside from a few presentations and his earlier work on dogfighting tactics and the modelling of aircraft combat performance, the only written work which exists is his nine-page paper Destruction and Creation. This was Boyd's attempt to understand and document the mechanisms by which creativity can occur in organisations. Creativity mattered to Boyd because much of his war-fighting strategy rested on the hypothesis that the winning side is the one which better maintains an overall understanding of the current situation, and which can therefore operate at a higher tempo - not only being more 'agile' but also degrading its enemy's model of what is going on.

Boyd and IT

For some reason, probably its perceived virile masculinity, military strategy has long been influential in business strategy. Sun Tzu is often read by business people who see themselves as warriors on the battlefield of commerce. This partly explains Boyd's attraction for some. From both a business perspective in general and an IT perspective in particular, Boyd's OODA loop is also attractive because it seems to offer what many are seeking: agility. In IT, the Agile method is becoming the default mechanism for software development throughout the industry. Who would deny that businesses or IT development processes should demonstrate agility?

However, I would argue that in focussing on a (highly simplified) version of the OODA loop much of its value is lost. Additionally, the intellectual and attitudinal underpinnings of the OODA loop are at odds with some of the ways in which people are attempting to implement it in IT. 

So, how should we apply the OODA loop?

The OODA loop can be applied as a general decision making model and as a mechanism for ensuring that an organisation is able to continuously adapt to a changing competitive environment. Clearly, Boyd did not see the adoption of his loop in terms of a fixed set of tools and processes. Since Boyd viewed people and mission as being more important than tools (weapons) it is likely that he would place an emphasis on ensuring that all leaders (at all levels) in an organisation shared a common set of attitudes, approaches and cultural values together with a common understanding of the organisation's mission (Schwerpunkt in Boyd-speak). Additionally, people in leadership positions should be sufficiently creative so as to be able to operate effectively with a high degree of operational autonomy. 

Clearly business is not a zero-sum game in the same sense as a dogfight - for a business to succeed it is not necessary for it to destroy its competitors. Some of what the OODA loop does in conflict is therefore, at worst, unnecessary and, at best, a helpful side-effect. Specifically, one of the effects of the OODA loop when applied in conflict is to create chaos and channel it in the direction of one's opponent, destroying the opponent's model of the rapidly developing situation and stretching his or her decision-making cycle. In business it is clearly advantageous to operate at a faster tempo than your competitors and to behave in ways which do not fit their models of the world.

Beyond the facile interpretation of the OODA loop as a mechanism for making decisions quickly lies Boyd's belief that the key to an organism's survival is its model of the world: its relevance to reality, and the analysis and synthesis which the organism continually performs upon it. In Destruction and Creation Boyd attempted to describe how creative thought based on analysis and synthesis can lead to innovation. He proposed a thought experiment in which four domains were to be considered: a skier on a slope, a speedboat, a bicycle and a toy tank. Each of these domains has its own attributes: chair lifts, skis, people, mountains and chalets for the skier; sun, boat, outboard motor, water skier and water for the speedboat; chain, seat, sidewalk, handlebars, child and wheels for the bicycle; and turret, boy, tank treads, green paint, toy store and cannon for the toy tank. Each domain is therefore associated with a network of interlinked concepts.

Boyd argued that it is by breaking these links, dissociating the attributes from their domains, and then reassembling them in novel ways that new ideas are formed. He illustrated this by picking several of the previously identified attributes - handlebars, outboard motor, tank treads and skis - and combining them to create what he termed a "new reality": in this case, a snowmobile. Boyd wanted the US military to be a culture which encouraged its officers to be creators of snowmobiles - individuals capable of the analysis and synthesis required to think creatively. He went on to argue that having created a snowmobile (or any other concept) it must be measured against external reality and, even if initially a good fit, subsequently destroyed and its conceptual network rebuilt to fit a changing external environment. This cycle of destruction and creation underpins Boyd's OODA loop.

Boyd referred to the work of Gödel, the work of Heisenberg and the second law of thermodynamics in support of his theories. Using Gödel's incompleteness theorems he argued that any system representing reality must necessarily be inaccurate and incomplete. Using Heisenberg's uncertainty principle he argued that in combat the observer effect blurs the neat distinction between protagonists. The second law shows that the entropy of any closed system will increase, making attempts to understand it increasingly difficult.
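Boyd's snowmobile exercise is mechanical enough to sketch directly: pool the attributes of every domain, discard the domain boundaries, then select across them. The domains and attributes below are exactly those from the thought experiment.

```python
# Boyd's four domains and their attributes, as listed in Destruction and Creation.
domains = {
    "skier": {"chair lifts", "skis", "people", "mountains", "chalets"},
    "speedboat": {"sun", "boat", "outboard motor", "water skier", "water"},
    "bicycle": {"chain", "seat", "sidewalk", "handlebars", "child", "wheels"},
    "toy tank": {"turret", "boy", "tank treads", "green paint", "toy store", "cannon"},
}

# Destruction: dissolve the domains into one undifferentiated pool of parts.
parts = set().union(*domains.values())

# Creation: select parts across the old boundaries to form a 'new reality'.
snowmobile = {"handlebars", "outboard motor", "tank treads", "skis"}
print(snowmobile <= parts)  # every part came from somewhere...
print(any(snowmobile <= attrs for attrs in domains.values()))  # ...but no single domain contains the whole
```

The second check is the interesting one: the snowmobile is not a subset of any one domain, which is exactly what makes it a new concept rather than a rearrangement of an old one.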

Where are the lessons for IT here?

There are two main areas where Boyd's work is applicable to IT (and, to an extent, to business itself): organisational culture and concept formation.

Organisational Culture

Underpinning Boyd's belief that manoeuvre warfare is preferable to attrition warfare were his views on the levels of autonomy and creativity which should be displayed by officers at all levels of the hierarchy. To adopt manoeuvre warfare successfully, an army needs to delegate decision-making from the top of the hierarchy down to commanders in the field. For this to be viable, the officers involved need to share both a culture and a common understanding of the mission. Then, even when these officers are exercising a great deal of operational autonomy, they do so in ways which mesh with others and effectively serve the wider mission. This approach enables an army to execute multiple OODA loops simultaneously at different organisational levels.

Does this approach work for IT? It clearly offers merit from a purely business perspective: organisations which adopt a top-down, 'attrition-warfare' approach to decision-making are less able to react to a rapidly changing environment. From an IT perspective, however, the situation is more uncertain. The OODA loop is designed to allow a protagonist to create chaos and then channel it at his or her enemy. Is the creation of chaos the purpose of IT? Despite much evidence to the contrary, it clearly is not. IT is fundamentally concerned with creating useful models of the world. These models get realised as IT systems - executable code, databases, servers, networks and so on. If an organisation were merely to adopt a Boydian approach to the delegation of authority for IT, we could expect a period of rapid change in IT systems followed by a period of head-scratching as the organisation dealt with the ensuing complexity - 'accidental architecture' in IT parlance. The term 'software' is actually a little misleading: software isn't soft. Once implemented, software is relatively expensive and time-consuming to change. So, for a Boyd-like approach to work, lower-level decision makers need to operate within a meaningful governance framework - a structure in which local decisions can be made without compromising higher-level or longer-term goals and strategies. In IT architectural terms this equates to a lightweight implementation of an architectural governance framework such as TOGAF, applied in such a way that the correct balance between top-down governance and local agility is struck.

The second implication for the structure of organisations concerns the people themselves. Boyd argued that the officers of a manoeuvre-warfare army should be 'snowmobile builders' - individuals capable of both analysis and synthesis. To some extent creativity can be taught, or at least fostered. However, organisations wishing to implement Boyd's ideas would be advised to hire individuals who can demonstrate creativity - in this case by being able to disassemble existing concepts and reassemble them in new and useful ways.

Concept formation

The second major lesson for IT from Boyd concerns how concepts are formed, destroyed and reformed. During the orientation phase of the OODA loop (or 'The Big O', as Boyd termed it) the analysis of gathered data and the synthesis of a new or modified view of the world take place. In the heat of battle this model is used to quickly generate plans which can be used against the enemy. In commerce, however, these models get encapsulated as IT systems, and once so encapsulated they are relatively hard to change. This prevents organisations from rapidly iterating their OODA loops. The entropy inherent in the external environment gradually destroys the value of legacy IT systems. As in manoeuvre warfare, the ability to be creative and to react to a situation by modifying our internal model of the world is critical in a commercial environment. Philosophically, Boyd's theories also illustrate that models are inherently simplified, incomplete and inaccurate mechanisms for understanding the world - and they only become more wrong over time.

What is slightly unfortunate is that this lesson from Boyd really only serves to highlight problems and imperatives of which IT folks are already painfully aware - the need for agility, and the barrier which complicated legacy systems present. We are still developing the tools to address them, both financial models which describe the ongoing and opportunity costs of complexity, and software design patterns and practices for dealing with complex systems.

Perhaps what we really need is an IT version of Energy-Maneuverability theory which, instead of quantifying the flight characteristics of warplanes, helps us to understand the qualities and characteristics of computer systems - ideally along with the associated costs.
For anyone with an interest in Boyd's life and theories, the books "Boyd: The Fighter Pilot Who Changed The Art of War" by Robert Coram and "The Mind of War: John Boyd and American Security" by Grant Hammond are both highly recommended. The former focuses on his life, the latter more on his work and theories. Additionally, his original briefings are all available at Air Power Australia.

Wednesday 18 December 2013

Whither Bitcoin?

Gold has preserved its value over millennia. There's a reasonable case to be made that over the last two thousand years an ounce of gold has always been able to buy a good suit of clothes for a man - from a Roman toga to a modern high-quality men's suit. Gold has been particularly popular since the financial crisis began unfolding in 2008. In a world in which nation states are able to print money in order to keep their economies from sinking, the apparent inherent value of gold is very appealing.

Virtual currencies such as Bitcoin are an attempt to create another store of value which nation states cannot debase for political or economic reasons. Bitcoin is portable, non-taxed (so far) and has a finite cap on the number of Bitcoins which can ever exist. It also represents a serious strategic threat to, amongst others, the United States. To explain: the United States is currently living far beyond its means. This is made possible because the US Dollar is the world's reserve currency. Oil is priced in US Dollars, and there will (seemingly) always be a demand for US Dollars from investors and trading institutions. Because the US Dollar is seen as a rock-solid store of value which also underpins the global economy, the USA can borrow money far more cheaply than would otherwise be possible.

But what if other stores of value - such as Bitcoin, were to become mainstream? Wouldn’t this undermine the value of nation state currencies in general and the US Dollar in particular? 

For an example of how seriously the US government takes these things, consider the case of the Liberty Dollar. The Liberty Dollar was an attempt to create a physical alternative to the US Dollar. Its creator, Bernard von NotHaus, was recently convicted of "making, possessing and selling his own coins" and faces up to fifteen years in prison. He has been described as a "domestic terrorist" by US government officials. In that context, what view is the US government likely to hold of Bitcoin?

Now, Bitcoin is essentially a bunch of clever mathematical algorithms. See here for a great description of how Bitcoin actually works. With gold you're trusting people's belief in its intrinsic value, and with traditional currencies you're trusting the economies and political systems which stand behind them. With Bitcoin, however, you're trusting that the mathematician(s) who created it left no chinks in its cryptographic armour. Is it likely that very smart mathematicians and cryptanalysts at the NSA (and every other major nation-state intelligence agency) are trying to find weaknesses in Bitcoin? Highly likely, I'd say.
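To give a flavour of those "clever mathematical algorithms": Bitcoin mining rests on finding a nonce whose double SHA-256 hash falls below a difficulty target. The toy version below uses a trivially small difficulty and an invented block format, so it illustrates only the shape of the idea, not the real protocol.

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Find a nonce whose double SHA-256 hash has `difficulty_bits`
    leading zero bits - a toy sketch of Bitcoin's proof of work."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        payload = block_data + nonce.to_bytes(8, "big")
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Producing the proof requires brute-force search; verifying it takes one hash.
nonce = mine(b"toy block", difficulty_bits=12)
```

The asymmetry is the point: the security of the whole scheme rests on the hash function behaving as advertised, which is exactly the trust assumption discussed above.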

Let's suppose that, for example, the NSA were to find a vulnerability in Bitcoin. It wouldn't necessarily need to be functionally catastrophic - it would merely need to damage Bitcoin's reputation as a safe store of value. Bitcoin would have, in effect, an electronic Sword of Damocles hanging over it. It could potentially be instantaneously and irrevocably destroyed by a keystroke in an NSA data centre. If such a weakness existed, when would you, as a nation state, choose to use it to destroy Bitcoin? If it were up to me I'd wait until Bitcoin had, to an extent, gone mainstream - when holding Bitcoins is considered normal enough that the average person might choose to hold some. That's when I'd pull the rug from under it. Some average folk lose some money; as a result, virtual currencies (all virtual currencies) are rendered toxic for a very long time. People rush back to the seeming safety and security of traditional currencies - and the natural order is preserved.

So, if you hold Bitcoin now then any value which it currently has may only exist on the sufferance of the NSA. Risky, no?

Tuesday 3 December 2013

Book Review - Enterprise Architecture as Strategy





Enterprise Architecture as Strategy by Jeanne W. Ross, Peter Weill, David C. Robertson.

This book attempts to describe how and why enterprise architecture matters to organisations. There are several questions that this book attempts to address:

  1. What is enterprise architecture?
  2. How can organisations justify the investment in developing an enterprise architecture?
  3. What are the business outcomes from doing enterprise architecture well?
  4. In what terms should an overall enterprise architecture be defined?
  5. How can an organisation realise its architectural vision - and in doing so support the realisation of its business vision?
  6. How can the conflict which often occurs between short-term business imperatives and long-term business strategy be managed in terms of its impact on architecture?

The first thing which the book recommends doing when developing an enterprise architecture is to define the organisation's operating model. This represents the organisation's commitment to a way of doing business. The operating model is defined in terms of two dimensions: business process standardisation and systems integration. The book argues that these two dimensions drive how enterprise architecture should be undertaken at an organisation. It then classifies the possible permutations of these dimensions into four operating models:

Diversification - Low Standardisation, Low Integration
Coordination - Low Standardisation, High Integration
Replication - High Standardisation, Low Integration
Unification - High Standardisation, High Integration
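The quadrant above reduces to a simple lookup on the two dimensions, which can be sketched as:

```python
def operating_model(standardisation: str, integration: str) -> str:
    """Map the two dimensions (each 'low' or 'high') to the book's
    four operating models."""
    return {
        ("low", "low"): "Diversification",
        ("low", "high"): "Coordination",
        ("high", "low"): "Replication",
        ("high", "high"): "Unification",
    }[(standardisation, integration)]

print(operating_model("high", "high"))  # Unification
```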




An organisation's position on the quadrant should be driven by a) the extent to which data sharing between business units is necessary and b) the extent to which commonality of business processes across business units is required. The book goes on to give examples of organisations which have applied each of the four operating models. It also describes the impact of each approach on IT architecture strategy and the implications of each for how the organisation can grow. For example, an organisation employing a replication or unification model which acquires another organisation will need to rip and replace the newly acquired entity's existing systems in order to fit the acquiring organisation's operating model. Conversely, an organisation which employs a diversification approach can more easily digest newly acquired organisations - but it will clearly struggle with any data-sharing requirements between business units. It is important to note that an organisation can simultaneously employ different operating models at different levels in its organisation.

The identification of an organisation's operating model is not purely an IT responsibility. It is at least as much (if not more) a senior business leadership decision.

The book's authors see much of EA as being focused on business objectives and outcomes rather than as a purely technical discipline. This makes a lot of sense, especially since, as the book asserts, senior management understanding and sponsorship of EA is critical to its success in an organisation. To that end the book recommends the creation of a Core Enterprise Architecture Diagram. This is a one-page diagram intended to illustrate the key elements of an organisation's IT architecture to all interested parties. The four most commonly shown elements in the diagram are:

1. Core Business Processes - company-wide processes
2. Shared Data Driving Core Processes - data shared between processes 
3. Key Linking and Automation Technologies - e.g. middleware and ERPs
4. Key Customers

The book then helpfully provides real-world example Core Enterprise Architecture diagrams for organisations with differing operating models. The book argues that organisations which in its words 'digitise the core to make it predictable, reusable and reliable' will be successful. Unfortunately the book does not define what digitising the core actually means in practice - beyond the obvious of building or (especially) buying systems to do so. Nevertheless the core diagram approach could well be useful as a mechanism for helping organisations to decide where to focus their IT investment. 

The book then goes on to define and explore various stages of enterprise architecture maturity. It defines these as:

1. Business Silos Architecture
2. Standardised Technology Architecture
3. Optimised Core Architecture
4. Business Modularity Architecture

As an organisation moves through these stages it will spend less and less on local, business unit-specific IT and spend more on developing a common IT architecture which still preserves the ability (where necessary) for diversification. The book discusses the maturity journeys of various organisations. It also argues that it is difficult for any organisation to jump multiple maturity levels in one leap as (when attempting to do so) the rate of change is usually too much for an organisation to bear. This section of the book is very successful in clearly defining the maturity stages, their individual characteristics and also the activities and IT funding priorities which should be associated with each stage. It also describes how an organisation's architecture practice can and should mature along with the architecture itself. 

Time and again the book asserts that in order to increase agility an organisation needs to "digitise its core processes". Unfortunately the book does not define what this means in any meaningful way. The inference which most readers are likely to draw from the phrase is that successful companies implement Enterprise Resource Planning (ERP) software. 

One of the key outcomes for organisations which undertake this journey is that as the flexibility of local business units starts to fall, overall global flexibility begins to rise. For the most architecturally mature organisations, the gains in global flexibility outweigh the loss of flexibility at a local level.



One of the key challenges faced by any enterprise architecture team is how to define KPIs which demonstrate that the investment made in creating and maintaining an architecture practice is worthwhile. The book does provide some helpful material for these efforts by describing the various benefits which should accrue to organisations with a well-functioning enterprise architecture practice. These include reduced IT costs, increased IT responsiveness, improved risk management, increased management satisfaction and enhanced strategic business outcomes. However, actual examples of enterprise architecture-relevant KPIs are not provided.

The book does an excellent job of describing how architecture should align and interlock with both senior IT folks and also project teams in order to ensure that both long-term architecture goals and short-term project needs are met. It does this by describing how the architecture function should be involved in the governance and guidance of projects throughout the entire project life-cycle.

Another key challenge for enterprise architecture efforts is how to turn what is often seen as an ivory tower-type of endeavour into one which has practical, useful effects throughout an organisation. Part of this is typically the question - who pays for the shiny new architecture when there is typically no specific budget to do so?

The book really does not adequately address this critical question. The only relevant example provided is that of Toyota Motor Marketing Europe, where, if projects could not be persuaded to fund the additional costs sometimes required to meet the architectural strategy, the architecture function would pick up the cost of the additional work. Given the way that architecture teams tend to be funded, this does not seem to be an approach which could be widely applied. Additionally, the incentive for project teams to "do the right thing" is much reduced when they realise that someone else will pick up the tab if they choose not to adhere to architectural standards. The book does suggest that business units which pay the upfront infrastructure costs associated with implementing strategic items could subsequently recoup the wider costs involved via some sort of chargeback model. However, this idea is not explored in sufficient depth.
The book then goes on to outline three outsourcing models together with the implications of each. The models described are:

1. Strategic Partnership - responsibility for operational activities outsourced
2. Co-sourcing - project management and implementation outsourced
3. Transaction - narrowly-defined repeatable processes

Unfortunately these models are quite poorly defined by the book. Additionally, the book is now rather out of date with regard to the commercial models offered by outsourcing vendors. The book does attempt to advise how and when to outsource with reference to the previously described architectural maturity model, but since the outsourcing models are themselves poorly defined this is of limited value. It would have been useful if the book had discussed the impact of the commercial models used during outsourcing on the achievement of architectural goals. For example, how can a move from capex- to opex-based expenditure enable strategic projects to become more easily funded?

The book then moves on to describe the different growth opportunities enabled by each of the different operating models (diversification, coordination, replication and unification). It also describes how each copes with a merger or acquisition scenario. This is genuinely useful material which will enable architects to help business managers understand the likely impact of a given merger or acquisition in both cost and systems terms.

Having previously defined four stages of architectural maturity (business silos, standardised technology, optimised core and business modularity), the book also proposes a fifth: dynamic venturing. This seems to involve businesses reaching a level of agility at which they can plug-and-play entire businesses or business units. The notion is predicated (at least in part) on the assumption that software agents and standards for information interchange will reach a level of maturity which allows such seamless integration (both business and IT) to happen. Given the rate at which both meaningful AI and standardised business software component models are improving, this seems quite an optimistic position to adopt.

The book closes with a chapter which first describes how the previous chapters have addressed the problems associated with organisations of low architectural maturity, and how these can be alleviated via a more mature approach. It then offers some useful guidance on how to approach an initiative to plan and implement an enterprise architecture strategy.

In summary, the book is very successful in describing the practical types of operating models which can be applied when defining high-level architectures and business approaches for global organisations. It is also strong on the maturity journey which organisations need to undertake in order to realise a vision of maximised global organisational agility. Where it is weaker is in defining the practicalities involved: what does "digitising the core" actually mean, and how does one do it?

The other major failing of the book is that it does not adequately address the tension between short-term project imperatives and long-term architecture vision - i.e. balancing the tactical with the strategic. 

The book contains much sound advice for enterprise architects and provides a good framework and starting point for them to engage with senior business stakeholders in order to begin or deepen the implementation of an enterprise architecture. 

On balance, the book is a highly worthwhile read for any enterprise architect who wishes to put in place a high-level strategy for enterprise architecture which will exist above the day-to-day imperatives of the role. 

Wednesday 20 November 2013

Why Isn't The iPhone 5c Selling?

It seems to be no secret that sales of the Apple iPhone 5c aren't meeting Apple's expectations. I'm guessing that this is at least in part because many people don't want to be seen using a phone which is clearly the cheaper of the two in Apple's lineup. Apple's approach has always been to offer last year's iPhone as the budget option for those who want a new iPhone. Buyers of last year's model have always been able to at least maintain the pretence that they haven't bought the cheap option - after all, they're using a phone which was once Apple's flagship: "I may be using last year's model, but for all you know I may have had it for a while". This was particularly true during the 3G/3GS and 4/4S transitions. Current potential iPhone buyers, however, face the choice of buying either a brightly coloured phone which is obviously the budget choice or paying a relatively small premium for this year's top model. It's no surprise that so many skip the 5c and go straight to the 5s.

Thursday 16 May 2013

Arsebestos & The Instrumented Self

In his book of essays Some Remarks, the writer Neal Stephenson coined the term Arsebestos to describe the likely epidemic of illnesses which will be attributable to our office-based, sedentary working lives. Research has shown that even if you take regular high-intensity exercise, the long-term effects of the many hours we spend sitting, either at work or at home, are very, very bad indeed. Longer periods of lower-intensity ambulatory activity (i.e. walking) seem to help. In societal terms, however, we are in danger of creating a wholly new class of devastating industrial injuries - in the forms of obesity, diabetes and cancer. What can we do about this? The obvious answer is to sit less and move around more. In the workplace this clearly depends on employers paying heed to the emerging health research which demonstrates the deleterious effects of sitting. In practical terms this might mean standing desks, more standing in meetings and perhaps even treadmills integrated into desks.

The growing realisation that sitting is bad for your health is occurring at the same time as devices and practices are being created which help people to capture and analyse their own biometric data. The movement associated with this emerging trend calls itself The Quantified Self. Some participants in this movement capture the minutiae of their lives in great detail in order to better understand and, essentially, optimise their own lives. Whilst this can veer into the fanatical and wacky there are some devices available which ordinary people can buy, use and get benefit from. One of these is the Nike+ Fuelband.

Nike+ Fuelband

The Fuelband is a rubberised bracelet which attempts to measure the activity level of the wearer. Given its form factor and its sensor type (an accelerometer) it gives only a general picture of the wearer's activity level - walking and running can be measured quite well as they involve arm movement; cycling and weight-lifting are not measured well. The Fuelband allows the wearer to accumulate 'Fuel', a synthetic metric proprietary to Nike. Via both the Nike website and the Fuelband iPhone app the user can set daily Fuel targets, track progress and compete with friends. According to user reports the Fuelband helps to change the wearer's behaviour so that they meet their goals. Typically this involves adjustments such as walking more, climbing stairs rather than using a lift, or even more intense exercise such as running or playing a sport. The Fuelband is merely one of the current crop of activity monitoring devices. Others include the Fitbit Flex and the Jawbone Up, both of which will also monitor your sleeping patterns.
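Nike's Fuel formula is proprietary, so there's no reproducing it here, but the general principle behind wrist-worn accelerometer trackers can be sketched in a few lines. This is a purely hypothetical illustration - the threshold, scoring and sample values are invented for the example, not Nike's:

```python
import math

def activity_score(samples, threshold=0.2):
    """Crude activity score from (x, y, z) accelerometer samples in g.

    Counts samples whose acceleration magnitude deviates from 1g
    (gravity, i.e. a wrist at rest) by more than the threshold.
    Vigorous arm movement scores highly while stillness scores zero,
    which is why a wrist-worn tracker sees walking and running well
    but largely misses cycling, where the wrist stays relatively still.
    """
    score = 0
    for x, y, z in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if abs(magnitude - 1.0) > threshold:
            score += 1
    return score

# At rest the wrist reads roughly (0, 0, 1): magnitude ~1g, no score.
resting = [(0.0, 0.0, 1.0)] * 30
# A swinging arm produces regular spikes well away from 1g.
walking = [(0.9, 0.4, 1.1), (0.1, 0.0, 1.0), (1.2, 0.3, 0.9)] * 10

print(activity_score(resting))  # 0
print(activity_score(walking))  # 20
```

The real devices obviously do far more (filtering, frequency analysis, calibration per activity type), but the sketch shows why arm movement is the fundamental input - and the fundamental limitation.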

So, a device as simple and limited as the Fuelband seems capable of making a positive difference to levels of activity. The next generation of devices will offer much more insight into our physiology and activity patterns. One of these is the Basis monitor, a watch-style device which includes a heart-rate sensor.

Basis

Beyond the obvious ability to monitor the user's heart-rate, such a sensor also makes it possible to measure any exertion - not just activities involving arm movement. This in turn allows the capture of much more accurate calorie burn rates.

As these devices start to become both ubiquitous and increasingly connected we will start to move beyond the vaunted Internet of Things to the Internet of Us. The Internet of Things means that when my neighbour's BMW breaks down and before he has even had a chance to call a breakdown service he is rung by BMW themselves to let him know that they are aware that his car has a fault and that help is already on its way to his location. The Internet of Us means that an ambulance will be dispatched before we are even aware that a heart attack is imminent. We won't just be Quantified, we will be Instrumented. Eventually, these devices will disappear entirely - becoming sub-dermal implants powered by our own electrical systems. Effectively your body will have its own IP address.

The masses of data already being generated by the existing batch of activity monitors raise both opportunities and concerns. Opportunities because the data could provide an invaluable resource for medical researchers. For example, they could look for patterns of behaviour or biometric activity which correlate with health problems. Concerns because of the ownership and privacy issues which clearly arise from the amassing of such data by third parties on our behalf. Ultimately, these concerns will need to be resolved via a combination of both customer pressure and legislation.

In any event these tools (even as they exist today) provide individuals with the information they need to help to start to combat the effects of sedentary living.

Anyway, enough rambling. My Fuelpoints total for today currently stands at a measly 1148. Time to get off my arse...

Saturday 25 August 2012

Amazon Glacier - Game Changer


The problem of cheaply and reliably storing terabytes of data over the long term is one which affects not only businesses but also individuals who generate large amounts of data. With high-definition camcorders and high-megapixel cameras it's easy for even amateur photographers who take snapshots of their kids growing up to generate terabytes of photos.

If you want to do offsite backups of priceless, irreplaceable family photos then there have really only been three options until now: some kind of rotating backup strategy using removable media (high manual effort), syncing data to a friend or relative's house over the Internet (technically difficult) or using a cloud-based file system such as Amazon S3 or Windows Azure storage (too expensive).

A cloud-based strategy has (theoretically) always been my preferred option. However, it's probably too expensive for the average person. For example, at $0.125 per GB per month it would cost $128 per month to store 1TB of data. Ouch.

Enter Amazon Glacier. Amazon Glacier trades accessibility for cost. Retrieving an object from Glacier can take several hours via a scheduled job. However, the cost per GB per month is a mere $0.01. Wow. This means that it would cost just $10.24 per month to store 1TB of data - a greater than 90% reduction in cost compared to S3. Suddenly redundant cloud-based archival of those irreplaceable family photos and videos is affordable for most people.
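The arithmetic behind those numbers can be sanity-checked in a few lines (assuming 1TB = 1024GB and the per-GB prices quoted above, which were current at the time of writing):

```python
# Compare monthly storage costs for 1TB on S3 vs Glacier,
# using the 2012-era per-GB prices quoted above.
S3_PRICE_PER_GB = 0.125   # dollars per GB per month
GLACIER_PRICE_PER_GB = 0.01

gb_stored = 1024  # 1TB

s3_cost = gb_stored * S3_PRICE_PER_GB
glacier_cost = gb_stored * GLACIER_PRICE_PER_GB
saving = 1 - glacier_cost / s3_cost

print(f"S3:      ${s3_cost:.2f}/month")       # $128.00/month
print(f"Glacier: ${glacier_cost:.2f}/month")  # $10.24/month
print(f"Saving:  {saving:.0%}")               # 92%
```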

Saturday 20 August 2011

2011 13" i7 MacBook Air Review

With my 2008 MacBook Pro creaking I decided to replace it with a 13" MacBook Air. The MBP was the last of the non-unibody models and hence in Apple design terms is almost prehistoric (the enclosure design is now 8 years old). I also have a Mac Pro for heavy lifting - running VMs, Lightroom, Visual Studio, Java IDEs etc. I do a fair amount of travelling around Europe on business and the Air is intended to provide me with a lightweight travel machine with good battery life. I seriously considered the 11" Air but ultimately its poorer battery life and smaller text meant I was unable to justify buying one - which is a shame since they are wonderfully portable.

If the MBP appears prehistoric then the Air is like an artefact from the future. The kind of object which is so perfectly engineered as to seem not manufactured at all but instead brought into existence fully formed.

After much deliberation I had ordered the 256GB i7 version. I had been worried about heat - based on reports of the i7 running hot. However although it is warm to the touch it is much much cooler than the MBP which, if performing light duties for an hour or so, would become uncomfortably warm (i.e. hot) to type on. The underside temperature of the Air in general use on my lap seems to stabilise at around 32°C whereas the MBP would generally run at an uncomfortable 39°C.

I set up the Air by performing a migration from the MBP's last Time Machine backup. This was relatively quick (around 1.5 hours for 150GB over USB). It all went well except that some App Store purchased programs needed to be deleted and reinstalled before they'd run.

Compared to the 5400 RPM disk in the MBP the Air's SSD makes everything I/O-related blisteringly quick. The keyboard is lovely. The screen is nice and sharp although its brightness does seem to need to be turned up relatively high in order to be usable - probably a reflection of my tired old eyes.

The SSD really shows its mettle with big bloated Java IDEs like NetBeans, which loads very quickly, and even though the screen is small it is still very usable. It's also transformed my use of virtual machines. Starting or resuming the Windows 7 Fusion VM on my MBP would essentially hang the machine for five to ten minutes. On the Air the VM starts, runs, suspends and resumes very quickly and without unduly impacting the performance of other processes on the machine. It's a shame that the memory is fixed at 4GB but it's enough for me at the moment.

I have a real dislike for glossy screens. They look particularly bad in brightly-lit environments such as, ironically enough, Apple stores. However, the Air's screen is much less glossy than that of the iMac or MacBook Pro. So much so, in fact, that I can't say I've yet noticed a reflection at all. Good stuff. The text is a little small and would probably rule out its use as a primary machine for me.

Although pre-Lion versions of Mac OS X supported encryption of a user's home directory this ability came at the price of limited Time Machine functionality. However, Lion's full disk encryption together with the Air's SSD and the i7's AES-NI hardware accelerated encryption delivers security which is seamless, Time Machine compliant and performant. Benchmarks show that it has minimal impact on system performance and battery life. Full disk encryption is clearly especially useful on a machine which is as eminently nickable as the Air.


The much touted instant-on is not noticeably quicker than on my old MBP - or indeed any of my previous Mac laptops.


I haven't done any formal battery life tests but it lasts much much longer than my MacBook Pro which itself has a relatively new battery.

It's a shame that the Air doesn't include a 3G wireless chipset but I'm guessing that Apple left it out as it would crush battery life.

It's also a shame that Apple/Intel haven't adopted USB3 because, as things stand, there is no fast storage option for the Air - I don't count the Thunderbolt display because a) it's not actually available yet and b) why should I have to spend 800 quid to get a Thunderbolt-to-FW800 convertor?


I ran a quick performance test by encoding a 720p movie using Handbrake. My Mac Pro (2.26GHz octo-core, 16GB RAM) averaged around 60fps whilst the Air averaged around 20fps. A pretty creditable performance from the Air.
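To put those frame rates in context, here's a rough worked example. The length and frame rate of the test movie aren't given above, so a hypothetical two-hour source at 24fps is assumed purely for illustration:

```python
# Rough encode-time estimate from the average encode frame rates above.
# Assumes a hypothetical two-hour source movie at 24fps.
source_frames = 2 * 60 * 60 * 24  # 172,800 frames

for machine, fps in [("Mac Pro", 60), ("MacBook Air", 20)]:
    minutes = source_frames / fps / 60
    print(f"{machine}: {minutes:.0f} minutes")
```

On those assumptions the Mac Pro finishes in around 48 minutes and the Air in around 144 - three times slower, but still perfectly workable for an ultraportable.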

All-in-all a wonderful little machine.

Pros

Light
Good battery life
Super quick
Quiet (mostly)
Cool

Cons

Fixed 4GB RAM
No USB3
No 3G

Verdict: Lovely jubbly.