Documentary: BBC Two, Horizon, How You Really Make Decisions (2014)
Transcript
00:00We like to think that, as a species, we are pretty smart.
00:11We like to think we are wise, rational creatures.
00:14I think we all like to think of ourselves as Mr Spock, to some degree.
00:18You know, we make rational, conscious decisions.
00:21But we may have to think again.
00:23It's mostly delusion, and we should just wake up to that fact.
00:28Oh.
00:31In every decision you make, there's a battle in your mind between intuition and logic.
00:38It's a conflict that plays out in every aspect of your life.
00:42What you eat, what you believe, who you fall in love with,
00:49and most powerfully, in decisions you make about money.
00:54The moment money enters the picture, the rules change.
00:58Scientists now have a new way to understand this battle in your mind.
01:04How it shapes the decisions you take.
01:07What you believe.
01:09And how it has transformed our understanding of human nature itself.
01:13Sitting in the back of this New York cab is Professor Danny Kahneman.
01:38He's regarded as one of the most influential psychologists alive today.
01:48Over the last 40 years, he's developed some extraordinary insights into the way we make decisions.
01:54I think it can't hurt to have a realistic view of human nature and how the mind works.
02:03His insights come largely from puzzles.
02:13Take, for instance, the curious puzzle of New York cab drivers and their highly illogical working habits.
02:20Business varies according to the weather.
02:26On rainy days, everyone wants a cab.
02:30But on sunny days, like today, fares are hard to find.
02:34Logically, they should spend a lot of time driving on rainy days.
02:39Because it's very easy to find passengers on rainy days.
02:43And if they're going to take leisure, it should be on sunny days.
02:47But it turns out this is not what many of them do.
02:52Many do the opposite.
02:54Working long hours on slow, sunny days and knocking off early when it's rainy and busy.
02:59Instead of thinking logically, the cabbies are driven by an urge to earn a set amount of cash each day, come rain or shine.
03:11Once they hit that target, they go home.
03:16They view being below the target as a loss and being above the target as a gain.
03:22And they care more about preventing the loss than about achieving the gain.
03:27So when they reach their goal on a rainy day, they stop.
03:34Which really doesn't make sense.
03:37If they were trying to maximize their income, they would take their leisure on sunny days and they would drive all day on rainy days.
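The cost of that habit is easy to sketch with a toy model. All the numbers below are invented for illustration (the hourly takings on rainy versus sunny days, a £150 daily target, a 12-hour shift cap); the comparison only shows the shape of the argument:

```python
# Toy model: an income-targeting cabbie vs one who shifts the same total
# hours toward busy, rainy days. All rates and targets are invented.

RAINY_RATE = 20.0   # assumed takings per hour on a rainy day (pounds)
SUNNY_RATE = 10.0   # assumed takings per hour on a sunny day (pounds)
TARGET = 150.0      # assumed daily income target
SHIFT_CAP = 12.0    # assumed maximum hours in one shift

# Targeting: quit the moment the target is hit, whatever the weather.
rainy_hours = TARGET / RAINY_RATE          # 7.5 hours
sunny_hours = TARGET / SUNNY_RATE          # 15.0 hours
total_hours = rainy_hours + sunny_hours    # 22.5 hours over the two days
target_income = 2 * TARGET                 # 300 pounds

# Maximizing: the same 22.5 hours, but as many as possible on the rainy day.
shifted_income = SHIFT_CAP * RAINY_RATE + (total_hours - SHIFT_CAP) * SUNNY_RATE

print(f"income target: {total_hours:.1f} hours -> £{target_income:.0f}")
print(f"hours shifted: {total_hours:.1f} hours -> £{shifted_income:.0f}")  # £345
```

Same hours behind the wheel, noticeably more money, simply because the hours land where the fares are.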
03:49It was this kind of glitch in thinking that Kahneman realized could reveal something profound about the inner workings of the mind.
03:58Anyone want to take part in an experiment?
04:00And he began to devise a series of puzzles and questions which have become classic psychological tests.
04:09It's a simple experiment.
04:11It's a very attractive game.
04:13Don't worry, sir. Nothing strenuous.
04:14It's posing problems where you can recognize in yourself that your intuition is going the wrong way.
04:21The type of puzzle where the answer that intuitively springs to mind and that seems obvious is in fact wrong.
04:28Here is one that I think works on just about everybody.
04:34Now I want you to imagine a guy called Steve.
04:36You tell people that Steve, you know, is a meek and tidy soul with a passion for detail and very little interest in people.
04:48He's got a good eye for detail.
04:49And then you tell people he was drawn at random.
04:52From a census of the American population.
04:54What's the probability that he is a farmer or a librarian?
04:59So do you think it's more likely that Steve's going to end up working as a librarian or a farmer?
05:03What's he more likely to be?
05:05Uh, maybe a librarian.
05:08Librarian.
05:09Probably a librarian.
05:10A librarian.
05:11Immediately, you know, a thought pops to mind that it's a librarian because he resembled the prototype of a librarian.
05:20Probably a librarian.
05:22In fact, that's probably the wrong answer because at least in the United States there are 20 times as many male farmers as male librarians.
05:32Librarian.
05:33Librarian.
05:44So there are probably more meek and tidy souls, you know, who are farmers than meek and tidy souls who are librarians.
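The arithmetic behind Kahneman's point is a base-rate calculation. A minimal sketch, using the programme's 20-to-1 ratio of male farmers to male librarians; the two "fits the description" rates are invented assumptions, deliberately tilted in the librarian's favour:

```python
# Base rates vs resemblance in the Steve puzzle.
# The 20:1 farmer-to-librarian ratio is from the programme; the
# description-fit rates are invented, and deliberately favour librarians.

farmers_per_librarian = 20
p_fits_given_librarian = 0.40   # assumed: 40% of librarians are meek and tidy
p_fits_given_farmer = 0.10      # assumed: 10% of farmers are meek and tidy

# Per 1 librarian there are 20 farmers; count expected description-matches.
matching_librarians = 1 * p_fits_given_librarian                # 0.4
matching_farmers = farmers_per_librarian * p_fits_given_farmer  # 2.0

p_librarian = matching_librarians / (matching_librarians + matching_farmers)
print(f"P(librarian | meek and tidy) = {p_librarian:.2f}")  # ~0.17
```

Even with the description fitting librarians four times as often, the base rate wins: a meek, tidy man drawn at random is still roughly five times more likely to be a farmer.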
05:44This type of puzzle seemed to reveal a discrepancy between intuition and logic.
05:51Another example is...
05:52Imagine a dictionary.
05:53I'm going to pull a word out of it at random.
05:55Which is more likely: that a word you pick out at random has the letter R in the first position,
06:03or has the letter R in the third position?
06:06Um, start with the letter R.
06:09Okay.
06:10People think the first position because it's easier to think of examples.
06:15Start with it.
06:16First.
06:16First.
06:17In fact, there are nearly three times as many words with R as the third letter as there are words that begin with R.
06:23But that's not what our intuition tells us.
06:27So we have examples like that.
06:29Like, many of them.
06:31We have no idea.
06:32No doubt about it.
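The claim is easy to check against any machine-readable word list. A minimal sketch, assuming a standard Unix dictionary at /usr/share/dict/words; the exact ratio depends on which word list you use:

```python
# Count words with 'r' first versus 'r' third; assumes a Unix word list.

with open("/usr/share/dict/words") as f:
    words = [w.strip().lower() for w in f if w.strip().isalpha()]

first_r = sum(1 for w in words if w.startswith("r"))
third_r = sum(1 for w in words if len(w) >= 3 and w[2] == "r")

print(f"R in first position: {first_r}")
print(f"R in third position: {third_r}")
print(f"third/first ratio:   {third_r / first_r:.2f}")
```

Availability, not frequency, is what the intuitive answer tracks: words beginning with R are simply easier to call to mind.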
06:34Kahneman's interest in human error was first sparked in the 1970s, when he and his colleague Amos Tversky began looking at their own mistakes.
06:43It was all in ourselves. That is, all the mistakes that we studied were mistakes that we were prone
06:51to make. In my hand here, I've got a hundred pounds. Kahneman and Tversky found a treasure trove of
06:58these puzzles. Which would you prefer? They unveiled a catalog of human error. Would you rather go to
07:05Rome with a free breakfast? ...and opened a Pandora's box of mistakes. A year and a day? 25. But the really
07:14interesting thing about these mistakes is they're not accidents. 75. They have a shape, a structure,
07:22skewing our judgment. 20. What makes them interesting is that they're not random errors,
07:29they're biases. So the difference between a bias and a random error is that the bias is predictable.
07:35It's a systematic error that is predictable.
07:40Kahneman's puzzles prompt the wrong reply, again and again. And more likely... And again. Probably more likely.
07:49It's a pattern of human error that affects every single one of us. On their own, they may seem small.
07:58Ah, that seems to be the right drawer. But by rummaging around in our everyday mistakes...
08:07That's very odd. ...Kahneman started a revolution in our understanding of human thinking.
08:15A revolution so profound and far-reaching that he was awarded a Nobel Prize.
08:20So, if you want to see the medal...
08:27That's what it looks like. That's it.
08:38Psychologists have long strived to pick apart the moments when people make decisions.
08:43Much of the focus has been on our rational mind, our capacity for logic.
08:52But Kahneman saw the mind differently. He saw a much more powerful role for the other side of our minds:
08:59intuition. And at the heart of human thinking, there's a conflict between logic and intuition
09:06that leads to mistakes. Kahneman and Tversky started this trend of seeing the mind differently.
09:15They found these decision-making illusions, these spots where our intuitions just
09:19make us decide these things that just don't make any sense.
09:23The work of Kahneman and Tversky has really been revolutionary.
09:26It kicked off a flurry of experimentation and observation to understand the meaning of these mistakes.
09:37People didn't really appreciate, as recently as 40 years ago,
09:42that the mind didn't really work like a computer. We thought that we were very deliberative, conscious
09:47creatures who weighed up the costs and benefits of action, just like Mr Spock would do.
09:56By now, it's a fairly coherent body of work about ways in which intuition departs from the rules of logic, if you will.
10:11And the body of evidence is growing.
10:16Some of the best clues to the working of our minds come not when we get things right, but when we get things wrong.
10:26In a corner of this otherwise peaceful campus, Professor Chris Chabris is about to start a fight.
10:42All right, so, um, what I want you guys to do is stay in this area over here. The two big guys grab you
10:50and sort of, like, start pretending to punch you. Make some sound effects. All right, this looks good.
10:58All right, that seemed pretty good to me.
11:01It's part of an experiment that shows a pretty shocking mistake that any one of us could make.
11:08A mistake where you don't notice what's happening right in front of your eyes.
11:12As well as a fight, the experiment also involves a chase.
11:26It was inspired by an incident in Boston in 1995, when a young police officer, Kenny Conley, was in hot
11:32pursuit of a murder suspect.
11:34It turned out that this police officer, while he was chasing the suspect, had run right past
11:42some other police officers who were beating up another suspect, which of course police officers are
11:47not supposed to do under any circumstances. When the police tried to investigate this case of police
11:53brutality, he said, I didn't see anything going on there. All I saw was the suspect I was chasing.
11:58And nobody could believe this, and he was prosecuted for perjury and obstruction of justice.
12:05Everyone was convinced that Conley was lying.
12:09We don't want you to be, like, closer than about... Everyone, that is, apart from Chris Chabris.
12:17He wondered if our ability to pay attention is so limited that any one of us could run past a
12:22vicious fight without even noticing. And it's something he's putting to the test. And when you see
12:29someone jogging across the footbridge, then you should get started. Jackie, you can go.
12:40In the experiment, the subjects are asked to focus carefully on a cognitive task.
12:45They must count the number of times the runner taps her head with each hand.
12:56Would they, like the Boston police officer, be so blinded by their limited attention
13:00that they would completely fail to notice the fight?
13:07About 45 seconds or a minute into the run, there was the fight. And they could actually see the fight
13:12from a ways away, and it was about 20 feet away from them when they got closest to it.
13:17The fight is right in their field of view, and at least partially visible from as far back as the footbridge.
13:34It seems incredible that anyone would fail to notice something so apparently obvious.
13:42They completed the three-minute course, and then we said, did you notice anything unusual?
13:46Yes. What was it? It was a fight. Sometimes they would have noticed the fight, and they would say, yeah,
13:52I saw some guys fighting. But a large percentage of people said, we didn't see anything unusual at all.
13:58And when we asked them specifically about whether they saw anybody fighting, they still said no.
14:10In fact, nearly 50 per cent of people in the experiment completely failed to notice the fight.
14:16Did you see anything unusual during the run? No. OK, did you see some people fighting? No.
14:28We did it at night-time, and we did it in the daylight.
14:32Even when we did it in daylight, many people ran right past the fight and didn't notice it at all.
14:37So, did you see anything unusual during the run? Uh, no, not really. OK, did you see some people fighting?
14:47No. You really didn't see anyone fighting? No. Does it surprise you that you could have missed that?
14:52They were about 20 feet off the path. Oh, yeah? You ran right past them. Completely missed that, then? OK.
15:00Maybe what happened to Conley was, when you're really paying attention to one thing and focusing
15:05a lot of mental energy on it, you can miss things that other people are going to think are completely
15:09obvious. And in fact, that's what the jurors said after Conley's trial. They said, we couldn't believe
15:14that he could miss something like that. It didn't make any sense. He had to have been lying.
15:17It's an unsettling phenomenon called inattentional blindness that can affect us all. Some people have
15:27said things like, this shatters my faith in my own mind, or, now I don't know what to believe, or,
15:32I'm going to be confused from now on. But I'm not sure that that feeling really stays with them
15:37very long. They are going to go out from the experiment, you know, walk to the next place
15:41they're going, or something like that. And they're going to have just as much inattentional blindness
15:44when they're walking down the street that afternoon as they did before.
15:48This experiment reveals a powerful quandary about our minds.
15:56We glide through the world blissfully unaware of most of what we do, and how little we really know
16:04our minds. For all its brilliance, the part of our mind we call ourselves is extremely limited.
16:12So how do we manage to navigate our way through the complexity of daily life?
16:19Every day, each one of us makes somewhere between two and ten thousand decisions.
16:35When you think about our daily lives, it's really a long, long sequence of decisions.
16:45We make decisions probably at a frequency that is close to the frequency we breathe.
16:52Every minute, every second, you're deciding where to move your legs and where to move your eyes
16:58and where to move your limbs. And when you're eating a meal, you're making all kinds of decisions.
17:01And yet, the vast majority of these decisions we make without even realizing.
17:07It was Danny Kahneman's insight that we have two systems in the mind for making decisions.
17:18Two ways of thinking: fast and slow.
17:25You know, our mind has really two ways of operating, and one is sort of fast thinking,
17:36an automatic, effortless mode, and that's the one we're in most of the time.
17:41This fast, automatic mode of thinking he called System 1.
17:48System 1 is, you know, that's what happens most of the time. You're there, the world around you
18:03provides all kinds of stimuli, and you respond to them. Everything that you see and that you
18:09understand, you know, this is a tree, that's a helicopter back there, that's the Statue of Liberty.
18:15All of this visual perception, all of this comes through System 1.
18:20The other mode is slow, deliberate, logical and rational.
18:26This is System 2, and it's the bit you think of as you, the voice in your head.
18:33The simplest example of the two systems is really two plus two on one side,
18:38and 17 times 24 on the other.
18:41What is two plus two? Four. Four. Four. Fast System 1 is always in gear, producing instant answers.
18:48And what's two plus two? Four. A number comes to your mind. Four. Four. Four. It is automatic.
18:56You do not intend for it to happen. It just happens to you. It's almost like a reflex.
19:01And what's 22 times 17? That one... But when we have to pay attention to a tricky problem, we engage
19:14slow but logical System 2. If you can do that in your head, you'll have to follow some rules and do
19:21it sequentially, and that is not automatic at all. That involves work, it involves effort, it involves
19:29concentration. 22 times 17.
19:32There will be physiological symptoms. Your heart rate will accelerate, your pupils will dilate. So many
19:42changes will occur while you're performing this computation.
19:47Three...
19:49God...
19:51That's, um...
19:53220... and seven times...
19:5522...
19:5724...
19:59374.
20:01OK. And can I get you to just walk with me, uh, for a second?
20:05Uh, for a second.
20:06OK. Uh, who's the current...? System 2 may be clever, but it's also slow, limited and lazy.
20:13I live in Berkeley during summers and I walk a lot, and when I walk very fast, I cannot think.
20:19Can I get you to count backwards from 100 by seven?
20:23Sure.
20:2493...
20:2680...
20:27It's hard when you're walking.
20:28It takes up, interestingly enough, the same kind of executive function
20:34as...
20:35...as thinking.
20:36Forty-four...
20:41If you are expected to do something that demands a lot of effort, you will stop even
20:47walking.
20:48Eighty... um... six... fifty-one... uh... sixteen... nine... two... five... eight... two... five... two...
21:20System 2 is a minor character who thinks he is the star,
21:23because, in fact, most of what goes on in our mind is automatic.
21:28You know, it's in the domain that I call System 1.
21:31System 1 is an old, evolved bit of our brain, and it's remarkable.
21:35We couldn't survive without it because System 2 would explode.
21:38If Mr Spock had to make every decision for us,
21:41it would be very slow and effortful, and our heads would explode.
21:45And this vast, hidden domain is responsible for far more
21:49than you would possibly believe.
21:52Having an opinion, you have an opinion immediately,
21:55whether you like it or not, whether you like something or not,
21:58whether you're for something or not,
22:00liking someone or not liking them.
22:02That, quite often, is something you have no control over.
22:05Later, when you're asked for reasons, you will invent reasons.
22:08And a lot of what System 2 does is it provides reason,
22:12it provides rationalizations, which are not necessarily
22:16the true reasons for our beliefs and our emotions
22:19and our intentions and what we do.
22:27You have two systems of thinking that steer you through life.
22:31Fast, intuitive, System 1, that is incredibly powerful
22:38and does most of the driving.
22:42And slow, logical, System 2, that is clever, but a little lazy.
22:49Trouble is, there's a bit of a battle between them
22:52as to which one is driving your decisions.
22:54And this is where the mistakes creep in.
23:06When we use the wrong system to make a decision.
23:09I'm just going to ask you a few questions.
23:11We're interested in what you think.
23:12This question concerns this nice bottle of champagne I have here.
23:16Millésime 2005, it's a good year.
23:17A genuinely nice vintage bottle.
23:19These people think they're about to use slow, sensible System 2
23:25to make a rational decision about how much they would pay
23:28for a bottle of champagne.
23:31But what they don't know is that their decision
23:33will actually be taken, totally without their knowledge,
23:37by their hidden, fast autopilot, System 1.
23:42And with the help of a bag of ping-pong balls,
23:45we can influence that decision.
23:46I've got a set of numbered balls here, from 1 to 100, in this bag.
23:52I'd like you to reach in and draw one out at random for me, if you would.
23:56First, they've got to choose a ball.
23:58The number says 10.
23:5910.
23:5910.
24:0010.
24:01They think it's a random number, but in fact it's rigged.
24:05All the balls are marked with the low number 10.
24:08This experiment is all about the thoughtless creation of habits.
24:12It's about how we make one decision and then other decisions follow it
24:18as if the first decision was actually meaningful.
24:21All right.
24:22What we do is purposefully, we give people a first decision
24:26that is clearly meaningless.
24:2810.
24:2810.
24:29OK.
24:29Would you be willing to pay 10 pounds for this nice bottle of vintage champagne?
24:32I would, yes.
24:35Nah.
24:35Yeah, I guess.
24:36OK.
24:38This first decision is meaningless, based as it is on a seemingly random number.
24:44But what it does do is lodge the low number 10 in their heads.
24:49Would you buy it for 10 pounds?
24:51Yes, I would.
24:52You would?
24:52OK.
24:52Now for the real question, where we ask them how much they'd actually pay for the champagne.
25:00What's the maximum amount you think you'd be willing to pay?
25:0320.
25:04OK.
25:057 pounds.
25:067 pounds.
25:06OK.
25:07Probably 10 pounds.
25:08A range of fairly low offers.
25:12But what happens if we prime people with a much higher number, 65 instead of 10?
25:20What does that one say?
25:2165.
25:2165, OK.
25:2365.
25:23OK.
25:23It says 65.
25:27How will this affect the price people are prepared to pay?
25:31What's the maximum you would be willing to pay for this bottle of champagne?
25:3540?
25:3645 pounds.
25:3745, OK.
25:3850.
25:4040 quid?
25:41OK.
25:4150 pounds.
25:4250 pounds?
25:43Yeah, I'd pay between 50 and 80 pounds.
25:46Between 50 and 80?
25:46Yeah.
25:48Logic has gone out the window.
25:51The price people are prepared to pay is influenced by nothing more than a number written on a ping-pong ball.
26:01It suggests that when we come to make decisions, we don't evaluate the decision in itself.
26:06Instead, what we do is we try to look at other similar decisions we've made in the past, and we take those decisions as if they were good decisions.
26:16And we say to ourselves, oh, I've made this decision before.
26:19Clearly, I don't need to go ahead and solve this decision.
26:23Let me just use what I did before and repeat it, maybe with some modifications.
26:27This anchoring effect comes from the conflict between our two systems of thinking.
26:35What happens is I ask you a question, and if the question is difficult, but there is a related question that is somewhat simpler,
26:55you're just going to answer the other question, and not even notice.
27:00So the system takes all kinds of shortcuts to feed us the information faster, so we can act,
27:06and the system accepts some mistakes.
27:08We make decisions using fast System 1, when we really should be using slow System 2.
27:18And this is why we make the mistakes we do.
27:22Systematic mistakes known as cognitive biases.
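The shape of the champagne experiment can be written down as a toy simulation. Every number here is an invented assumption (a £25 "true" valuation, some noise, and a pull of half the distance toward the anchor), not the programme's data; it only illustrates how an irrelevant number separates the two groups:

```python
# Toy simulation of anchoring; all parameters are invented assumptions.
import random

random.seed(1)

def stated_price(anchor, true_value=25.0, pull=0.5, noise_sd=5.0):
    """Willingness to pay drifts a fixed fraction of the way toward the anchor."""
    return true_value + pull * (anchor - true_value) + random.gauss(0, noise_sd)

low_anchor = [stated_price(anchor=10) for _ in range(100)]
high_anchor = [stated_price(anchor=65) for _ in range(100)]

print(f"mean offer after drawing 10: £{sum(low_anchor) / 100:.2f}")   # ~£17.50
print(f"mean offer after drawing 65: £{sum(high_anchor) / 100:.2f}")  # ~£45.00
```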
27:29Nice day.
27:30Since Kahneman first began investigating the glitches in our thinking, more than 150 cognitive biases have been identified.
27:46We are riddled with these systematic mistakes, and they affect every aspect of our daily lives.
27:54Wikipedia has a very big list of biases, and we are finding new ones all the time.
28:00One of the biases that I think is the most important is what's called present bias.
28:04It's the fact that we focus on now, and don't think very much about the future.
28:09And that's the bias that causes things like overeating and smoking, and texting and driving, and having unprotected sex.
28:16Another one is called the halo effect.
28:20And this is the idea that if you like somebody or an organization, you're biased to think that all of its aspects are good.
28:28Not everything is good about it.
28:30If you dislike it, everything is bad.
28:33People really are quite uncomfortable, you know, by the idea that Hitler loved children.
28:39He did.
28:41Now, that doesn't make him a good person, but we feel uncomfortable to see an attractive trait in a person that we consider, you know, the epitome of evil.
28:51We're prone to think that what we like is all good, and what we dislike is all bad.
28:56That's a bias.
28:57Another particular favorite of mine is the bias to get attached to things that we ourselves have created.
29:04We call it the IKEA effect.
29:06Well, you've got loss aversion, risk aversion, present bias.
29:10Spotlight effect.
29:11And the spotlight effect is the idea that we think that other people pay a lot of attention to us, when in fact they don't.
29:17Confirmation bias, overconfidence is a big one.
29:21But what's clear is that there's lots of them.
29:23There's lots of ways for us to get things wrong.
29:25You know, there's one way to do things right and many ways to do things wrong, and we're capable of many of them.
29:30These biases explain so many things that we get wrong.
29:42Our impulsive spending.
29:46Trusting the wrong people.
29:48Not seeing the other person's point of view.
29:52Succumbing to temptation.
29:53We are so riddled with these biases, it's hard to believe we ever make a rational decision.
30:10But it's not just our everyday decisions that are affected.
30:23What happens if you're an expert, trained in making decisions that are a matter of life and death?
30:37Are you still destined to make these systematic mistakes?
30:45On the outskirts of Washington, D.C., Horizon has been granted access to spy on the spooks.
30:53Welcome to analytic exercise number four.
31:01Former intelligence analyst Donald Kretz is running an ultra-realistic spy game.
31:08This exercise will take place in the fictitious city of Vastopolis.
31:15Taking part are a mixture of trained intelligence analysts and some novices.
31:23Due to an emerging threat, a terrorism task force has been stood up.
31:29I will be the terrorism task force lead and I have recruited all of you to be our terrorism analysts.
31:36The challenge facing the analysts is to thwart a terrorist threat against a U.S. city.
31:47The threat at this point has not been determined.
31:50It's up to you to figure out the type of terrorism and who's responsible for planning it.
31:56The analysts face a number of tasks.
32:02They must first investigate any groups who may pose a threat.
32:06Your task is to write a report.
32:08The subject in this case is the Network of Dread.
32:12The mayor has asked for this 15 minutes from now.
32:15Just like in the real world, the analysts have access to a huge amount of data streaming in from government agencies, social media, mobile phones and emergency services.
32:29The Network of Dread turns out to be a well-known international terror group.
32:39They have the track record, the capability and the personnel to carry out an attack.
32:45The scenario that's emerging is a bio-terror event, meaning it's a biological terrorism attack that's going to take place against the city.
32:53If there is an emerging threat, they are the likely candidate.
33:00We need to move on to the next task.
33:04It's now 9 April.
33:06This is another request for information.
33:09This time on something or someone called the Masters of Chaos.
33:13The Masters of Chaos are a group of cyber hackers, a local bunch of misfits with no history of violence.
33:27And while the analysts continue to sift through the incoming data, behind the scenes, Kretz is watching their every move.
33:36In this room, we're able to monitor what the analysts are doing throughout the entire exercise.
33:41We have set up a knowledge base into which we have been inserting data throughout the course of the day.
33:48Some of them are related to our terrorist threat.
33:51Many of them are not.
33:53Amidst the wealth of data on the known terror group, there's also evidence coming in of a theft at a university biology lab.
34:01And someone has hacked into the computers of a local freight firm.
34:05Each of these messages represents essentially a piece of the puzzle.
34:11But it's a puzzle that you don't have the box top to.
34:15So you don't have the picture in advance.
34:19So you don't know what pieces go where.
34:21Furthermore, what we have is a bunch of puzzle pieces that don't even go with this puzzle.
34:24The exercise is part of a series of experiments to investigate whether expert intelligence agents are just as prone to mistakes from cognitive bias as the rest of us.
34:38Or whether their training and expertise makes them immune.
34:43I have a sort of insider's point of view of this problem.
34:48I worked a number of years as an intelligence analyst.
34:51The stakes are incredibly high.
34:54Mistakes can often be life and death.
35:00We roll ahead now.
35:01The date is 21 May.
35:02If the analysts are able to think rationally, they should be able to solve the puzzle.
35:10But the danger is they will fall into the trap set by Kretz and only pay attention to the established terror group, the Network of Dread.
35:21Their judgment may be clouded by a bias called confirmation bias.
35:27Confirmation bias is the most prevalent bias of all.
35:29And that's where we tend to search for information that supports what we already believe.
35:37Confirmation bias can easily lead people to ignore the evidence in front of their eyes.
35:43And Kretz is able to monitor if the bias kicks in.
35:48If we still see that they're searching for the Network of Dread, that's an indication that we may have a confirmation bias operating.
35:57The Network of Dread are the big guys.
36:00They've done it before.
36:01So you would expect they'd do it again.
36:04And I think we're starting to see some biases here.
36:08Analysts desperately want to get to the correct answer.
36:12But they're affected by the same biases as the rest of us.
36:15So far, most of our analysts seem to believe that the Network of Dread is responsible for planning this attack.
36:24And that is completely wrong.
36:30How are we doing?
36:33It's time for the analysts to put themselves on the line and decide who the terrorists are and what they're planning.
36:39So what do you think?
36:43It was a bioterroristic attack.
36:47I had a different theory.
36:48All right, so what's your theory?
36:49Because I may be missing something here, too.
36:51They know that the Network of Dread is a terrorist group.
36:54They know that the Masters of Chaos is a cyber hacking group.
36:59Either to the meat factory or in the water supply.
37:01Abundance of dead fish holding up in the river.
37:06The question is, did any of the analysts manage to dig out the relevant clues and find the true threat?
37:13In this case, the actual threat is due to a cyber group, the Masters of Chaos,
37:20who become increasingly radicalized throughout the scenario and decide to take out their anger on society, essentially.
37:26Who convinced them to switch from cybercrime to bioterrorism?
37:31Or did they succumb to confirmation bias and simply pin the blame on the usual suspects?
37:38Will they make that connection?
37:40Will they process that evidence and assess it accordingly?
37:43Or will their confirmation bias drive them to believe that it's a more traditional type of terrorist group?
37:49I believe that the Masters of Chaos are actually the ones behind it.
37:53It's either a threat or not a threat, but the Network of Dread.
37:56And time's up. Please go ahead and save those reports.
38:02At the end of the exercise, Kretz reveals the true identity of the terrorists.
38:07We have a priority message from City Hall.
38:12The terrorist attack was thwarted.
38:15The planned bioterrorist attack by the Masters of Chaos against Vastopolis was thwarted.
38:22The mayor expresses his thanks for a job well done.
38:25Show of hands, who got it?
38:31Yeah.
38:32Out of 12 subjects, 11 of them got the wrong answer.
38:36The only person to spot the true threat was in fact a novice.
38:44All the trained experts fell prey to confirmation bias.
38:48It is not typically the case that simply being trained as an analyst gives you the tools you need to overcome cognitive bias.
39:00You can learn techniques for memory improvement, you can learn techniques for better focus, but techniques to eliminate cognitive bias just simply don't work.
39:13And for intelligence analysts in the real world, the implications of making mistakes from these biases are drastic.
39:26Government reports and studies over the past decade or so have cited experts as believing that cognitive bias may have played a role in a number of very significant intelligence failures.
39:41And yet, it remains an understudied problem.
39:52Heads.
39:55Heads.
39:55But the area of our lives in which these systematic mistakes have the most explosive impact is in the world of money.
40:07The moment money enters the picture, the rules change.
40:13Many of us think that we're at our most rational when it comes to decisions about money.
40:18We like to think we know how to spot a bargain, to strike a good deal, sell our house at the right time, invest wisely.
40:30Thinking about money the right way is one of the most challenging things for human nature.
40:36But if we're not as rational as we like to think, and there is a hidden force at work shaping our decisions, are we deluding ourselves?
40:48Money brings with it a mode of thinking.
40:50It changes the way we react to the world.
40:54When it comes to money, cognitive biases play havoc with our best intentions.
41:00There are many mistakes that people make when it comes to money.
41:05Kahneman's insights into our mistakes with money were to revolutionize our understanding of economics.
41:10It's all about a crucial difference in how we feel when we win or lose.
41:31And our readiness to take a risk.
41:35Our willingness to take a gamble is very different depending on whether we're faced with a loss or a gain.
41:48Excuse me guys, you spare two minutes to help us with a little experiment.
41:51We have to try and win as much money as you can, okay?
41:53Okay.
41:54In my hands here, I have 20 pounds, okay?
41:57Here are two scenarios.
41:58And I'm going to give you 10.
42:00In the first case, you are given 10 pounds.
42:03That's now yours.
42:04Put it in your pocket.
42:05Take it away.
42:06Perfect.
42:07Spend it on a drink on the south bank later.
42:08Okay, okay.
42:09Okay?
42:10Okay.
42:11Then, you have to make a choice about how much more you could gain.
42:15You can either take the safe option, in which case I give you an additional 5, or you can take a risk.
42:21If you take a risk, I'm going to flip this coin.
42:25If it comes up heads, you win 10.
42:28But if it comes up tails, you're not going to win any more.
42:33Would you choose the safe option, and get an extra 5 pounds?
42:37Or take a risk, and maybe win an extra 10, or nothing?
42:41Which is it going to be?
42:45I'd go safe.
42:46Safe 5?
42:47Yeah.
42:48Take 5.
42:49Take 5?
42:50Yeah, man.
42:51Sure.
42:52There we go.
42:53Most people, presented with this choice, go for the certainty of the extra fiver.
42:57Thank you very much.
42:58I told you it was easy.
42:59In a winning frame of mind, people are naturally rather cautious.
43:03That's yours too.
43:05That was it?
43:06That was it.
43:07Really?
43:08Yes.
43:09Yeah.
43:12But what about losing?
43:13Are we similarly cautious when faced with a potential loss?
43:17In my hands, I've got 20 pounds.
43:19I'm going to give that to you.
43:20That's now yours.
43:21Okay.
43:22You can put it in your handbag.
43:24This time, you are given 20 pounds.
43:29And again, you must make a choice.
43:33Would you choose to accept a safe loss of 5 pounds, or would you take a risk?
43:39If you take a risk, I'm going to flip this coin.
43:42If it comes up heads, you don't lose anything.
43:45But if it comes up tails, then you lose 10 pounds.
43:49In fact, it's exactly the same outcome.
43:53In both cases, you face a choice between ending up with a certain 15 pounds, or tossing a coin to get either 10 or 20.
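The equivalence is worth spelling out, since System 1 hides it so effectively. A minimal check, using the amounts from the programme:

```python
# Gain frame: £10 up front, then +£5 for sure, or a coin flip for +£10 / +£0.
gain_safe = 10 + 5
gain_gamble = sorted([10 + 10, 10 + 0])

# Loss frame: £20 up front, then -£5 for sure, or a coin flip for -£0 / -£10.
loss_safe = 20 - 5
loss_gamble = sorted([20 - 0, 20 - 10])

assert gain_safe == loss_safe == 15            # identical certain outcome
assert gain_gamble == loss_gamble == [10, 20]  # identical gamble
print("expected value of the gamble:", sum(gain_gamble) / 2)  # 15.0, same again
```

Fifteen pounds for certain, or a 50/50 shot at ten or twenty, whichever way the question is put; only the wording of "win" versus "lose" changes, and that wording is what flips people's choices.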
44:02I will risk these internal, don't I?
44:05Okay.
44:06But the crucial surprise here is that when the choice is framed in terms of a loss, most people take a risk.
44:14Take a risk?
44:15Take a risk, okay.
44:16I'll risk it.
44:17You'll risk it, okay.
44:18Our slow System 2 could probably work out that the outcome is the same in both cases.
44:23And that's heads.
44:24You win.
44:25But it's too limited and too lazy.
44:27That's the easiest 20 pounds you'll ever make.
44:29Instead, fast System 1 makes a rough guess, based on change.
44:34And that's all there is to it.
44:35Thank you very much.
44:36Wherever you want.
44:37If you were to lose 10 pounds in the street today and then find 10 pounds tomorrow, you would be financially unchanged.
44:53But actually, we respond to changes.
44:55So, the pain of the loss of 10 pounds looms much larger.
44:59It feels more painful.
45:00In fact, you probably have to find 20 pounds to offset the pain that you feel by losing 10.
45:05Heads.
45:06At the heart of this is a bias called loss aversion.
45:10Heads.
45:11Which affects many of our financial decisions.
45:13People think in terms of gains and losses.
45:18Heads.
45:19It's tails.
45:20Oh.
45:21And in their thinking, typically, losses loom larger than gains.
45:26You lose.
45:27We even have an idea by how much, by roughly a factor of two or a little more than two.
45:34That is loss aversion.
45:36And it certainly was the most important thing that emerged from our work.
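Kahneman and Tversky later put a number on that factor in prospect theory. As a summary of their published 1992 estimates, rather than anything stated in the programme, the value function people seem to apply to gains and losses looks roughly like this:

```latex
% Prospect theory value function, with Tversky & Kahneman's 1992 estimates.
v(x) =
\begin{cases}
  x^{\alpha}, & x \ge 0 \quad \text{(gains)} \\
  -\lambda \, (-x)^{\alpha}, & x < 0 \quad \text{(losses)}
\end{cases}
\qquad \alpha \approx 0.88, \quad \lambda \approx 2.25
```

The loss-aversion coefficient λ is the "factor of two or a little more than two" Kahneman describes: losing £10 feels roughly twice as bad as winning £10 feels good.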
45:43It's a vital insight into human behavior.
45:46So important, it led to a Nobel Prize and the founding of an entirely new branch of economics.
45:55When we think we're winning, we don't take risks.
45:59But when we're faced with a loss, frankly, we're a bit reckless.
46:05But loss aversion doesn't just affect people making casual five-pound bets.
46:20It can affect anyone at any time.
46:24Including those who work in the complex system of high finance, in which trillions of dollars are traded.
46:33In our current complex environment, we now have the means as well as the motive to make very serious mistakes.
46:45The bedrock of economics is that people think rationally.
46:48They calculate risks, rewards, and decide accordingly.
46:53But we're not always rational.
46:57We rarely behave like Mr. Spock.
47:00For most of our decisions, we use fast, intuitive, but occasionally unreliable System 1.
47:07And in a global financial market, that can lead to very serious problems.
47:19I think what the financial crisis did was it simply said, you know what?
47:25People are a lot more vulnerable to psychological pitfalls than we really understood before.
47:33Basically, human psychology is just too flawed to expect that we could avert a crisis.
47:44Understanding these pitfalls has led to a new branch of economics.
47:55Behavioral economics.
47:59Thanks to psychologists like Hersh Shefrin, it's beginning to establish a toehold on Wall Street.
48:05It takes account of the way we actually make decisions, rather than how we say we do.
48:17The financial crisis, I think, was as large a problem as it was,
48:21because certain psychological traits like optimism, overconfidence, and confirmation bias
48:26played a very large role among a part of the economy where serious mistakes could be made and were.
48:36But for as long as our financial system assumes we are rational, our economy will remain vulnerable.
48:52I'm quite certain if the regulators listened to behavioral economists early on,
48:57we would have designed a very different financial system,
49:01and we wouldn't have had the incredible increase in the housing market,
49:07and we wouldn't have this financial catastrophe.
49:11And so when Kahneman collected his Nobel Prize,
49:14it wasn't for psychology, it was for economics.
49:24The big question is, what can we do about these systematic mistakes?
49:29Can we hope to find a way round our fast-thinking biases and make better decisions?
49:39To answer this, we need to know the evolutionary origins of our mistakes.
49:47Just off the coast of Puerto Rico is probably the best place in the world to find out.
49:59The tiny island of Cayo Santiago.
50:05So we're now in the boat heading over to Cayo Santiago.
50:09This is an island filled with a thousand rhesus monkeys.
50:13Once you pull in, it looks a little bit like you're going to Jurassic Park,
50:17you're not sure what you're going to see,
50:18and then you'll see your first monkey and it'll be comfortable,
50:20you're like, ah, the monkeys are here, everything's great.
50:23It's an island devoted to monkey research.
50:34See the guys hanging out on the cliff up there?
50:37Pretty cool.
50:41The really special thing about Cayo Santiago is that the animals here,
50:44because they've grown up over the last 70 years around humans,
50:47they're completely habituated.
50:49And that means we can get up close to them, show them stuff,
50:52look at how they make decisions.
50:54We're able to do this here in a way that we'd never be able to do it anywhere else, really.
50:58It's really unique.
51:02Laurie Santos is here to find out if monkeys make the same mistakes in their decisions that we do.
51:08Most of the work we do is comparing humans and other primates,
51:13trying to ask what's special about humans.
51:17But really what we want to understand is what's the evolutionary origin of some of our dumber strategies,
51:22some of the spots where we get things wrong.
51:24If we can understand where those came from, that's where we'll get some insight.
51:31If Santos can show us that monkeys have the same cognitive biases as us,
51:35it would suggest they evolved a long time ago.
51:44And a mental strategy that old would be almost impossible to change.
51:50We started this work around the time of the financial collapse.
51:56So when we were thinking about what dumb strategies could we look at in monkeys,
52:00it was pretty obvious that some of the human economic strategies which were in the news
52:04might be the first thing to look at.
52:06And one of the particular things we wanted to look at was whether or not the monkeys are loss-averse.
52:11But monkeys, smart as they are, have yet to start using money.
52:17And so that was kind of where we started.
52:19We said, well, how can we even ask this question of if monkeys make financial mistakes?
52:23And so we decided to do it by introducing the monkeys to their own new currency and just let them buy their food.
52:31So I'll show you some of the stuff we've been up to with the monkeys.
52:35Back in her lab at Yale, she introduced a troop of monkeys to their own market, giving them round, shiny tokens they could exchange for food.
52:47So here's Holly. She comes in, hands over a token. You can see she just gets to grab the grape there.
52:53One of the first things we wondered was just can they in some sense learn that a different store sells different food at different prices?
53:02So what we did was we presented the monkeys with situations where they met traders who sold different goods at different rates.
53:10So what you'll see in this clip is the monkeys meeting a new trader.
53:14She's actually selling grapes for three grapes per one token.
53:17And what we found is that in this case the monkeys are pretty rational.
53:24So when they get a choice of a guy who sells, you know, three goods for one token, they actually shop more at that guy.
53:36Having taught the monkeys the value of money, the next step was to see if monkeys, like humans, suffer from that most crucial bias, loss aversion.
53:45And so what we did was we introduced the monkeys to traders who either gave out losses or gains relative to what they showed.
53:56So I could make the monkey think he's getting a bonus simply by having him trade with a trader who's starting with a single grape.
54:04But then when the monkey pays this trader, she actually gives him an extra.
54:07So she gives him a bonus. At the end, the monkey gets two, but he thinks he got that second one as a bonus.
54:12We can then compare what the monkeys do with that guy versus a guy who gives the monkey losses.
54:18This is a guy who shows up, who pretends he's going to sell three grapes, but then when the monkey actually pays this trader, he'll take one of the grapes away and give the monkeys only two.
54:27The big question then is how the monkeys react when faced with a choice between a loss and a gain.
54:34So she'll come in, she's met these two guys before, you can see she goes with the bonus option, even waits patiently for her additional piece to be added here.
54:45And then takes the bonus, avoiding the person who gives her losses.
54:57So monkeys hate losing just as much as people.
55:00And crucially, Santos found that monkeys as well are more likely to take risks when faced with a loss.
55:15This suggests to us that the monkeys seem to frame their decisions in exactly the same way we do.
55:22They're not thinking just about the absolute, they're thinking relative to what they expect.
55:27And when they're getting less than they expect, when they're getting losses, they too become more risk seeking.
55:34The fact that we share this bias with these monkeys suggests it's an ancient strategy etched into our DNA more than 35 million years ago.
55:51And what we learn from the monkeys is that if this bias is really that old, if we really have had this strategy for the last 35 million years,
55:59simply deciding to overcome it is just not going to work.
56:02We need better ways to make ourselves avoid some of these pitfalls.
56:09Making mistakes, it seems, is just part of what it is to be human.
56:20We are stuck with our intuitive inner stranger.
56:23The challenge this poses is profound.
56:32If it's human nature to make these predictable mistakes and we can't change that, what then can we do?
56:40We need to accept ourselves as we are.
56:44The cool thing about being a human versus a monkey is that we have a deliberative self that can reflect on our biases.
56:50System 2 in us has, for the first time, realized there's a System 1.
56:55And with that realization, we can shape the way we set up policies.
57:00We can shape the way we set up situations to allow ourselves to make better decisions.
57:05This is the first time in evolution this has happened.
57:06If we want to avoid mistakes, we have to reshape the environment we've built around us, rather than hope to change ourselves.
57:20We've achieved a lot, despite all of these biases.
57:21If we are aware of them, we can probably do things like design our institutions and our regulations and our own personal environments and working lives to minimize the effect of those biases and help us think about how to overcome them.
57:38We are limited, we are not perfect, we are irrational in all kinds of ways, but we can build a world that is compatible with this and get us to make better decisions rather than worse decisions.
58:03That's my hope.
58:04And by accepting our inner stranger, we may come to a better understanding of our own minds.
58:17I think it is important in general to be aware of where beliefs come from.
58:26And if we think that we have reasons for what we believe, that is often a mistake.
58:33That our beliefs and our wishes and our hopes are not always anchored in reasons, they're anchored in something else that comes from within and is different.