Lately, on my daily commute, I have been listening to audiobooks. These past couple of days it has been Flash Boys by Michael Lewis. The book is about how high-frequency traders game the stock exchanges – very complex systems – by front-running purchase orders by a millionth of a millisecond.
“Front running” is like tapping the right shoulder of the person in front of you in line at the grocery store when you are buying fish. In the nanosecond before she turns around, you sneak in front of her on the other side, buy her order of a pound of salmon, turn around, and sell it to her for a profit, without her having a clue about what just happened.
All in a nanosecond, because this transaction takes place in cyberspace, where the trading system is a complex web of millions of simultaneous orders.
In the book there is an excellent discussion of how complex systems fail. One of the references it quotes is a short article (more of a geeky PowerPoint, if you ask me) that lists 18 ways complex systems fail. I think they complement very well the discussions on complexity and development that Owen Barder has been having on his blog (I find numbers 7 and 8 particularly compelling):
1. Complex systems are intrinsically hazardous systems.
2. Complex systems are heavily and successfully defended against failure.
3. Catastrophe requires multiple failures – single point failures are not enough.
4. Complex systems contain changing mixtures of failures latent within them.
5. Complex systems run in degraded mode.
6. Catastrophe is always just around the corner.
7. Post-accident attribution to a ‘root cause’ is fundamentally wrong.
8. Hindsight biases post-accident assessments of human performance.
9. Human operators have dual roles: as producers & as defenders against failure.
10. All practitioner actions are gambles.
11. Actions at the sharp end resolve all ambiguity.
12. Human practitioners are the adaptable element of complex systems.
13. Human expertise in complex systems is constantly changing.
14. Change introduces new forms of failure.
15. Views of ‘cause’ limit the effectiveness of defenses against future events.
16. Safety is a characteristic of systems and not of their components.
17. People continuously create safety.
18. Failure-free operations require experience with failure.