
Tim Harford – Adapt (Lecture)

29 Oct

Ever tried. Ever failed. No matter. Try again. Fail again. Fail better.
– Samuel Beckett

I like Tim Harford.

This lecture was an overview of the concepts in his book Adapt. He uses “biology, statistical physics, psychology and of course economics to explore how complex problems are solved, and the crucial role of learning from our apparently endless ability to screw up”.

(You can see him giving the same lecture here.)

Failure Is Productive If You Learn From It
He started by describing the brilliant Toaster Project. A chap called Thomas Thwaites decided to build a toaster – the cheapest £3.99 Argos toaster – from first principles. And I mean first principles. He tried to smelt iron ore, make the plastics, and so on. He found it was really very difficult, but after trying hard, failing a lot, and making use of shortcuts and help from other people, he built something that approximated a toaster. It kind of warmed the bread a bit, and then caught fire.

The point Harford makes from this story is that by admirable trial and error Thwaites got to a solution. He then used it to point out how useful markets are. A producer does not need to think about how its product will be used; the miner doesn’t know whether his iron ore will go on to be used for a toaster or for a fence.

Complex systems like this have evolved over time with trial and error and often nobody has a full view of how they work. As such they’re not going to be perfect. He draws an analogy with evolution.

Harford described another example. Unilever created a detergent that is sold in small capsules. To produce these capsules the raw material is squirted through a special nozzle. To design the nozzle they didn’t bring in experts in fluid dynamics or other disciplines; rather they prototyped different shapes, recorded the outcomes, then refined the shape based on their observations until they had a nozzle that worked. They don’t know how it works, but they have proved that it does.

This example draws out the trial and error aspect of his approach to complex problems. Here the Unilever boffins experimented. We tend to overvalue expertise, and while we shouldn’t ignore it, trial and error is more germane for difficult problems or complex systems.

A key to ensuring trial and error works is the feedback loop: the information learnt from failure must be used to refine the next iteration. If this informational loop is hindered then failure loses its power as a learning process.
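The nozzle story is, in essence, an evolutionary search: generate variations, measure them, keep the best, repeat. A minimal sketch of that loop – where the `quality` function is a toy stand-in I have invented, since Unilever measured real physical output and nobody knows the true objective in closed form:

```python
import random

def quality(shape):
    # Toy stand-in for the real (unknown) nozzle test: best quality when
    # every shape parameter equals 0.5. Unilever measured actual output.
    return -sum((x - 0.5) ** 2 for x in shape)

def evolve_nozzle(generations=50, variants=10, seed=0):
    rng = random.Random(seed)
    # Initial prototype: four arbitrary shape parameters.
    best = [rng.random() for _ in range(4)]
    for _ in range(generations):
        # Trial: several slightly mutated copies of the current best shape.
        trials = [[x + rng.gauss(0, 0.05) for x in best] for _ in range(variants)]
        # Error feedback: keep whichever prototype performs best so far.
        best = max(trials + [best], key=quality)
    return best

final_shape = evolve_nozzle()
```

The point of the sketch is that the loop never needs to understand *why* a shape works – the feedback from each failed trial is enough to steer the next generation.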

Psychology
He discussed the issues that people have adopting this approach. The ability to fail is not always seen as a virtue. Alas this attitude means that innovation can be stunted.

He then spent a lot of time talking about our tendency to conform to the group we’re in. He described some experiments, such as this brilliant video of how people can be made to conform in a lift, and then focussed on the well-known conformity experiments of Solomon Asch.

Asch “asked groups of students to participate in a ‘vision test’. In reality, all but one of the participants were confederates of the experimenter, and the study was really about how the remaining student would react to the confederates’ behaviour.”

Each participant was put into a group with 5 to 7 “confederates” (people who knew the true aims of the experiment, but were introduced as participants to the naive “real” participant). The participants were shown a card with a line on it, followed by another card with 3 lines on it labelled A, B, and C. The participants were then asked to say which line matched the line on the first card in length. Each line question was called a “trial”. The “real” participant answered last or next to last.

Asch hypothesized that the majority of participants would not conform to something obviously wrong; however, when surrounded by individuals all voicing an incorrect answer, participants provided incorrect responses on a high proportion of the questions (32%). Seventy-five percent of the participants gave an incorrect answer to at least one question.

The unanimity of the confederates has also been varied. When the confederates are not unanimous in their judgement, even if only one confederate voices a different opinion, participants are much more likely to resist the urge to conform (only 5–10% conform) than when the confederates all agree. This finding illuminates the power that even a small dissenting minority can have. Interestingly, this finding holds whether or not the dissenting confederate gives the correct answer. As long as the dissenting confederate gives an answer that is different from the majority, participants are more likely to give the correct answer.

So if you’re in a group, rather than do the usual conforming to keep the peace and fit in, it’s worth speaking up to give an alternative opinion. Even if your view is rubbish you’ll at least be emboldening another member to put forward their idea, and the group is therefore more likely to come to a more reasoned conclusion, as the prevailing view will be challenged. All ideas should be tested: if they fail they are not up to the job; if they pass muster they are proved worthy.

Harford then described the cognitive bias of loss aversion, whereby people “strongly prefer avoiding losses to acquiring gains”, which leads them to avoid risks.

At the end of the lecture he was asked whether this approach is worth using, given the cost of failure. How much should we experiment?

It depends on the upside and downside. Someone like Google should experiment a lot (the more experimentation the better), whereas a car manufacturer should experiment less: we need cars that don’t kill people! There is a cost-benefit analysis required here: what is the gain, given the failure rate?
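That cost-benefit analysis can be put in simple expected-value terms. A minimal sketch – the numbers below are invented for illustration, not Harford’s:

```python
def worth_experimenting(p_success, gain, cost_of_failure):
    # Expected value of running one trial: the probable upside
    # minus the expected cost of failure.
    return p_success * gain - (1 - p_success) * cost_of_failure

# A failed web experiment is cheap, so even a long shot is worth running:
cheap = worth_experimenting(0.1, 100, 5)        # 10 - 4.5 = 5.5 > 0
# A failed braking system is catastrophic, so the same odds are not:
deadly = worth_experimenting(0.1, 100, 10_000)  # 10 - 9000 < 0
```

Same success rate in both cases; only the downside changes – which is exactly why Google and a car manufacturer should sit at opposite ends of the experimentation spectrum.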

In Conclusion

Given mistakes are inevitable, it’s worth thinking about the best way to make them. For example, we can make them sooner in the process using trials and pilots. It’s important to manage the risk that comes from failure.

We should have the cojones to ask for honest feedback about ourselves so we can improve. It’s important to face down the ego and figure out why failure happens.

Mistakes are inevitable, in fact desirable. We need to work towards early discovery of failure, get better at managing risk and ensure good feedback loops. We’ll never get a perfect system but we can continue to refine the complex systems we have.

The Logic Of Life by Tim Harford

28 Jun

Tim Harford is a very cool economist. His book describes how we humans are very rational in our choices, it’s just a question of understanding the incentives that drive those decisions. He comes to some interesting, fun and startling conclusions about the human condition through wonderful studies and exciting statistics. His is an evidence-based world from which we can learn much about overcoming our less helpful tendencies and cognitive biases.

I am a fan of his radio series and his blog in which he gets to the bottom of the statistics in the news and what they really say.

I recommend reading his book, however if you want to get a flavour of the main themes, here they are:

On Safety
The more floors the buildings in a street have, the more dangerous that street is. After controlling for the race and class of a building’s residents, it can be seen that for every extra floor you are 2.5% more likely to be mugged or have your car stolen, rising to 25% for a ten-storey building. The theory is that people in tall buildings feel less observed, and are therefore emboldened by the thought that they can get away with it.
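Taken at face value the figure is linear: 2.5% per extra floor, which is where the 25% for ten floors comes from. A toy reading of the arithmetic (the per-floor rate is from the book; the function itself is just my illustration):

```python
def extra_victimisation_risk(extra_floors, per_floor=0.025):
    # Roughly 2.5% extra likelihood of being mugged or having your
    # car stolen per additional floor, applied linearly.
    return per_floor * extra_floors

extra_victimisation_risk(10)  # 0.25, i.e. the quoted 25% for ten floors
```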

On Business
He describes tournament theory: you are rewarded relative to those around you, so you are more likely to make others look bad, and to work only as hard as you need to in order to look better than them.

He describes why CEOs getting paid so much is not necessarily all bad: paying the guy at the top more doesn’t motivate him to work harder; it motivates those under him to work harder so they can reach that position, and this can still add value to a company even if people aren’t rewarded fairly for their work.

The amount of luck involved in achieving a good outcome in a particular kind of work perversely means that bonuses must be significantly higher for that role. If 95% of the outcome of a job is down to hard work and 5% down to luck, motivating a person with a bonus is easy – the bonus doesn’t need to be very high. However, if 95% of a job is luck and 5% is hard work, the worker would just put his feet up and wait for the money to come in. If, however, he were rewarded massively for that extra 5%, he’d buckle down. This clearly has some interesting implications for certain unpopular sectors where very high bonuses are awarded.
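One way to see the argument: if only a fraction of the outcome responds to effort, a bonus tied to the outcome is diluted by that fraction, so the bonus has to scale up to keep the incentive constant. A rough model of my own, not Harford’s:

```python
def bonus_for_constant_incentive(effort_share, target_incentive=1.0):
    # If only `effort_share` of the outcome is driven by effort, the
    # marginal payoff of working harder is effort_share * bonus, so
    # holding the incentive fixed requires bonus = incentive / effort_share.
    return target_incentive / effort_share

bonus_for_constant_incentive(0.95)  # ~1.05: a mostly-skill job needs a modest bonus
bonus_for_constant_incentive(0.05)  # 20.0: a mostly-luck job needs one ~19x bigger
```

Under this toy model the 95%-luck job needs a bonus nineteen times larger than the 95%-skill job to produce the same pull on effort – which is the perverse implication Harford draws.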

We are motivated to work harder if we know a more productive colleague can see what we are doing – if a supermarket worker at a till knows that a faster colleague is behind them they are faster themselves.

On The Hunt For A Mate
The fewer men there are (it works the other way around too), the more likely it is that women will settle for a worse “deal” in terms of what they get out of a relationship, because there will always be another woman willing to settle for less to get her man.

We don’t have absolute values about what we want in a partner; we choose the best from those available in the group. I’ve seen other studies around this from the analysis of speed dating statistics.

And now we get to the sensitive topics:

On Sexual Choices
It may be taboo to discuss sexual orientation in such terms, but the evidence is there. If you are from a family that contains someone with HIV/AIDS, you are less likely to be in a male homosexual relationship and more likely to be in a lesbian relationship – a rational decision based on the first-hand information you have about the disease. Male anal sex increases the likelihood of contracting HIV, while lesbian sex carries a lower risk than heterosexual penetrative intercourse.

On Racism
People may find it uncomfortable to read the phrase “rational racism”, but that doesn’t make it less true. The sooner people find the balls to face facts, understand why they themselves can be racist, and take responsibility for that fact, the sooner we can get better at avoiding these things in our society.

Harford references some surprising studies that show how racist people are. For example, one study found that resumes headed with stereotypically white names received 50% more interview requests than the same resumes with stereotypically black names. Sadly, he shows, this is because there is an economic advantage to some “rational” discrimination in recruitment. Another shows how easy it is to engender a colour bias with a simple game, the implication being that it’s much more of a problem if such biases are entrenched over generations.

I challenge you to take the IAT test. Most people tend to say “not me”; however, I’m humble enough to say that I did it and found that I have racial biases. If you pass with no racial bias, you are part of a very small and special section of society, and I would like to know how you managed it. The rest of us mere mortals need to be aware of our biases in this regard and understand how to motivate and incentivise ourselves to beat them.

The good thing is that Harford presents ways to beat racism that have been shown empirically to work.

On Society And Innovation
If we live in a society where earning money is rewarded rather than punished (with taxes or corruption), and where private property and the rule of law are respected, then people will innovate and come up with great ideas. The economic viability of an idea is paramount. For example, he describes why the industrial revolution did so well in the UK: not because Britain had people who were more inventive or clever than other Europeans, as some historians like to maintain, but because we had easily available energy in the form of coal, and wages were significantly higher than in the rest of Europe, so cotton mills, coke smelting, etc. were much more economically viable due to the resulting wage savings.

He finishes with what he admits is not an iron-clad theory: that the number of ideas produced by humans is directly proportional to the number of humans there are. Hence, as population has increased, so has innovation.

The main theme is that we are entirely rational creatures, making logical choices given the incentives we have. The more we understand our biases and motivations, the better we can be for that knowledge, and the more we can influence ourselves to perform in better ways.
