This is one of the best real-life, economics-oriented self-improvement books I have ever read. The wealth of information and advice it contains is so vast that it is extremely difficult to review or summarise. It took me a few months just to extract what I thought was important into this rather long and clumsy review. Do not depend on it. Grab the book and read it yourself.
Harford attempts to explain how the real world works. Why leading experts cannot agree amongst themselves and why excellent companies eventually fail: Failure to manage complexity. The solution: Think for yourself, experiment, learn from your failures and successes; adapt and evolve.
“Problem solving in a complicated world is not easy” and “the experts are humbled”. “Good leaders surround themselves with expert advisers, seeking out the smartest specialists with the deepest insights into the problems of the day. But even deep expertise is not enough to solve today’s complex problems…One of Tetlock’s more delicious discoveries was that the more famous experts – those who spent a lot of time as talking heads on television – were especially incompetent…Louis Menand, writing in the New Yorker, enjoyed the notion of bumbling seers, and concluded, ‘the best lesson of Tetlock’s book may be the one that he seems most reluctant to draw: Think for yourself’…Yet there is a reason…: his results clearly show that experts do out-perform non-experts”.
Having studied “the long, tangled history of failure…The lesson seems to be that failure is fundamental to the way the market creates sophisticated and healthy economies…In a market economy, there is usually room for only a few winners in each sector. Not everyone can be one of them…The difference between market-based economies and centrally planned disasters, such as Mao Zedong’s Great Leap Forward, is not that markets avoid failure. It’s that large-scale failures do not seem to have the same consequences for the market as they do for planned economies. (The most obvious exception to this claim is also the most interesting: the financial crisis that began in 2007.)…Trial and error is a tremendously powerful process for solving problems in a complex world, while expert leadership is not. Markets harness this process of trial and error, but that does not mean that we should leave everything to the market. It does mean – in the face of seemingly intractable problems, such as civil war, climate change and financial instability – that we must find a way to use the secret of trial and error beyond the familiar context of the market.”
Palchinsky’s principles for dealing with complexity: “First, seek out new ideas and try new things; second, when trying something new, do it on a scale where failure is survivable; third, seek out feedback and learn from your mistakes as you go along. The first principle could simply be expressed as ‘variation’, the third as ‘selection’. The importance of the middle principle – survivability – is something which will become clear in chapter six, which explores the collapse of the banking system…Above all, feedback is essential for determining which experiments have succeeded and which have failed.”
“Variation is difficult because of two natural tendencies of organisations. One is grandiosity…such flagship projects violate the first Palchinsky Principle, because errors are common and big projects have little room to adapt. The other tendency emerges because we rarely like the idea of standards that are inconsistent and uneven from place to place…uniformly high standards are not only impossible but undesirable…Most of us sugar-coat our opinions whenever we speak to a powerful person…because yes-men tend to be rewarded…telling the unvarnished truth is unlikely to be the best strategy in a bureaucratic hierarchy…Adaptive organisations need to decentralise and become comfortable with the chaos of different local approaches and the awkwardness of dissent from junior staff…Accepting trial and error means accepting error.”
But learning from mistakes can be very hard. “Daniel Kahneman and Amos Tversky summarised the behaviour in their classic analysis of the psychology of risk: ‘a person who has not made peace with his loss is likely to accept gambles that would be unacceptable to him otherwise’…Faced with a mistake or a loss, the right response is to acknowledge the setback and change direction. Yet our instinctive reaction is denial. That is why ‘learn from your mistakes’ is wise advice that is painfully hard to take.”
There are three essential steps for successfully adapting:
1. “To try new things in the expectation that some will fail”
2. “To make failure survivable because it will be common”
3. “To make sure that you know when you’ve failed”
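The three steps read like the loop of an evolutionary search: variation, survivability, selection. As a toy sketch (my own illustration, not from the book — `adapt`, `evaluate` and the hill-shaped objective are invented names), survivability shows up as the small step size, so no single failed experiment costs much:

```python
import random

def adapt(evaluate, seed, rounds=100, n_variants=8, step=0.1):
    """Toy trial-and-error search illustrating the three steps:
    try many new things (variation), keep each change small so a bad
    variant is survivable, and let feedback decide what survives
    (selection)."""
    random.seed(0)  # reproducible for illustration
    best, best_score = seed, evaluate(seed)
    for _ in range(rounds):
        # Variation: several small parallel experiments around the
        # current best idea.
        variants = [best + random.uniform(-step, step) for _ in range(n_variants)]
        # Selection: feedback (the evaluate function) decides which
        # experiment, if any, replaces the incumbent.
        for v in variants:
            score = evaluate(v)
            if score > best_score:
                best, best_score = v, score
    return best

# Example: climb a simple hill whose peak is at x = 3.
peak = adapt(lambda x: -(x - 3) ** 2, seed=0.0)
```

The point of the sketch is that nothing in the loop needs to know where the peak is in advance; the feedback signal alone steers it there.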
“As a Prussian general once put it, ‘No plan survives first contact with the enemy.’ What matters is how quickly the leader is able to adapt”…“The big picture becomes a self-deluding propaganda poster, the unified team retreats into groupthink, and the chain of command becomes a hierarchy of waste baskets, perfectly evolved to prevent feedback reaching the top. What works in reality is a far more unsightly, chaotic and rebellious organisation altogether”.
Unanimity is seldom an indication of a healthy organisation…“a single dissenting voice was enough to liberate the subjects…the basic usefulness of hearing more ideas, better decisions emerge from a diverse group…Even an incompetent adviser with a different perspective would probably have improved Johnson’s decision making…the right decisions are more likely when they emerge from very different perspectives…Galvin told Petraeus that the most important part of the job was to criticise his boss: ‘It’s my job to run the division, and it’s your job to critique me’…it is not enough to tolerate dissent: sometimes you have to demand it”.
“…it is impossible to know in advance what the correct strategy will be…Tactics that had worked yesterday were a liability today…The lesson of the Iraq war was that the US Army should have had much better systems for adapting a failing strategy, and should have paid far more attention to successful local experiments...In the organisation of the future, the decisions that matter won’t be taken in some high tech war room, but on the front line…what really counted was identifying the junior officers who were capable of thinking for themselves.”
“The idea that we can actually predict which technologies will flourish flies in the face of all evidence. The truth is much messier and more difficult to manage…The lesson is variation, achieved through a pluralistic approach to encouraging innovations…In an uncertain world, we need more than just Plan A; and that means finding safe havens for Plans B, C, D and beyond…This need to specialise may be unavoidable, but it is worrying, because past breakthroughs have often depended on the inventor’s sheer breadth of interest…We may have booming universities and armies of knowledge workers, but when it comes to new ideas, we are running to a standstill. This is particularly worrying because we are hoping that new technology will solve so many of our problems…Grants, unlike prizes, are a powerful tool of patronage. Prizes, in contrast, are open to anyone who produces results. This makes them intrinsically threatening to the establishment…But the wonderful thing about prizes is that they don’t cost a penny until success is achieved.”
“The lesson is that pluralism encourages pluralism. If you want to encourage many innovations, combine many strategies…”
“The barrier to change is not too little caring; it is too much complexity” – Bill Gates
“Yunus…returned to his roots to experiment in a local context he understood much better than any foreign adviser would have been able to. Yunus advocates what he calls the ‘worm’s-eye view’.”
“A doctor who wants to run a properly controlled trial to test these two options needs approval from an ethics committee…The alternative to controlled experiments is uncontrolled experiments. These are worse, because they teach us little or nothing”.
“We should not try to design a better world. We should make better feedback loops.”
“In a lecture of 1755, Adam Smith declared that ‘little else is required to carry a state to the highest degree of opulence from the lowest barbarism but peace, easy taxes and a tolerable administration of justice: all the rest being brought by the natural order of things’.”
“The problem seems to be that governments love to back losers: think about big banks or car companies”.
“If we want to learn about dealing with systems that have little room for trial and error, then gas rigs, chemical refineries, and nuclear plants are the place to start…Banking exceeds the complexity of any nuclear plant I ever studied”.
“For Perrow, the dangerous combination is a system that is both complex and tightly coupled. The defining characteristic of a tightly coupled process is that once it starts, it’s difficult or impossible to stop: a domino-toppling display is not especially complex, but it is tightly coupled.”
“James Reason is celebrated in safety-engineering circles for the ‘Swiss cheese’ model of accidents…every additional safety measure also has the potential to introduce an unexpected new way for something to go wrong”.
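The arithmetic behind the Swiss cheese model is worth making explicit (my illustration, not the book's; the 10% hole probabilities are made up): an accident needs the holes in every defensive layer to line up at once, so with independent layers the combined probability multiplies down fast.

```python
def p_accident(layer_hole_probs):
    """Probability that the holes in every independent defensive
    layer line up at once -- the Swiss-cheese accident condition."""
    p = 1.0
    for hole in layer_hole_probs:
        p *= hole
    return p

# Three imperfect layers, each with a 10% chance of a hole, give a
# combined accident probability of 0.1 * 0.1 * 0.1 = 0.001.
three_layers = p_accident([0.1, 0.1, 0.1])
```

Reason's caveat is the catch: the multiplication only works if the layers fail independently, and an extra safety measure that adds its own failure path can undercut the gain.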
“But the real lesson is that it should have been possible to let both Lehman and AIG collapse without systemic damage. Preventing banks from being ‘too big to fail’ is the right kind of sentiment but the wrong way of phrasing it, as the domino analogy shows: it would be absurd to describe a single domino as being too big to fail. What we need are safety gates in the system that ensure any falling domino cannot topple many others.”
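The domino analogy can be made concrete with a toy simulation (my own sketch, not the book's; `cascade` and the gate positions are invented): a tightly coupled chain topples end to end, while a safety gate confines the damage.

```python
def cascade(n_dominoes, gates, start=0):
    """Return the set of dominoes that fall when `start` is pushed.
    Each domino topples its right-hand neighbour unless a safety
    gate sits between them (gate g blocks the fall from g-1 to g)."""
    fallen = {start}
    i = start
    while i + 1 < n_dominoes and i + 1 not in gates:
        i += 1
        fallen.add(i)
    return fallen

# Tightly coupled: with no gates, one push fells all ten dominoes.
no_gates = cascade(10, gates=set())
# A gate before domino 3 confines the damage to the first three.
gated = cascade(10, gates={3})
```

The design point matches the quote: the fix is not to shrink any individual domino but to decouple the chain so a local failure stays local.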
The lessons from Deepwater Horizon:
1. Safety systems often fail.
2. Latent errors can be deadly.
3. Had whistleblowers felt able to speak up, the accident might have been prevented.
4. The rig system was too tightly coupled.
5. Contingency plans would have helped.
6. Accidents will happen, and we must be prepared for the consequences.
Some fast-growing companies claim to embody some of the key principles of adapting, one being organisation around autonomous teams, of which there are over half a dozen on each site. Self-selecting teams can make a difference: “A new recruit is placed with a team for four weeks for a trial period, at which point they can stay only if they win the vote of two thirds of their team members”…“Healthy competition is promoted by a ‘no secrets’ policy of strict transparency: many of the company’s financial statistics are available to employees, and every team knows how every other team is performing – a mechanism that also allows bad ideas to be noticed and nipped in the bud, and good ideas to spread horizontally through the company…Peer monitoring does not always work, of course; peer groups can turn into self-serving or even corrupt cliques (It is no wonder that John Timpson spends most of his working life visiting Timpson stores)…All significant actions, such as flipping a switch in the reactor control room, were double-checked by a colleague.”
“Google’s corporate strategy: have no corporate strategy”.
“Success is the number of experiments that can be crowded into twenty-four hours…When a problem reaches a certain level of complexity, formal theory won’t get you nearly as far as an incredibly rapid, systematic process of trial and error…it isn’t cutting-edge technology that tends to undo the market leaders. It is the totally new approach…Disruptive innovations are disruptive precisely because new technology doesn’t appeal to the traditional customers: it is different and, for their purposes, inferior…A sufficiently disruptive innovation bypasses almost everybody who matters in a company.”
“Corporations exist precisely because we don’t – and shouldn’t – care when abstract legal entities fail. We care about individuals.”
“The ability to adapt requires this sense of security, an inner confidence that the cost of failure is a cost we will be able to bear. Sometimes that takes real courage; at other times all that is needed is the happy self-delusion of a lost three-year-old. Whatever the source, we need that willingness to risk failure. Without it we will never truly succeed”.