
Week 7

by Linus K.

What do we mean by rational thinking? (Baron, 2000, Thinking and Deciding)

Some clarifications

Emotional ≠ irrational. Only if you think of emotions as a way of making decisions without thinking does it seem reasonable to equate emotional with irrational

But: The seeking of pleasant emotions and the avoidance of unpleasant ones are surely goals that most of us have, and ought to have. Emotions can impair thinking, but they can also help it


Rational thinking is the kind of thinking that helps us achieve our goals, whatever these may be


Rationality need not be self-interested. Goals such as the concern for others are among the goals that many of us have and ought to have

Rational thinking is not too much thinking. In rational thinking, the amount of thinking is, as far as possible, appropriate to the situation


Notes:

The relation between rational thinking and emotion is a complex one. We should not use the word "emotion" as a substitute for the word "irrational". We need to think about emotions in the psychological sense: states with certain causes and effects. Emotions, in this sense, are goals of our decisions. They also affect our decisions in ways that are sometimes desired, and sometimes undesired. They may help us in the long run, even if they hurt us in the short run. They may make us behave more morally, or less morally, than we would without them. They can impair thinking, but they can also help it.

Rational thinking is the kind of thinking that helps us achieve our goals, whatever these may be.

We call our decisions foolish when they subvert our values and wise when they affirm them.

Think about your own life. Have you ever checked your phone while crossing a road? If you weighed up the probabilities and benefits of the possible outcomes, would you really do it? And if you don't think like this, can you really call yourself rational?



Normative theory of choice under uncertainty

The theory of expected utility* (also referred to as rational choice, rational actor, or homo economicus) says that, when faced with a risky decision, actors ought to choose the option that maximises their expected utility

-*Pinker (2021), Chapter 6


In its original form, expected utility theory (EUT) was formalised by von Neumann and Morgenstern (1947)


Utility represents whatever people want to achieve


Utility ≠ happiness or money. It is supposed to be a summary measure of how choices realise our ultimate values or goals


To compute the expected utility of each choice, multiply the utility of each outcome by the probability of the state that leads to it, and sum across the states


Let’s look at an example…..
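As a hedged illustration of that rule (the numbers are invented, and utility is assumed to be linear in money purely for simplicity):

```python
# Expected utility of a choice: sum over states of
# P(state) * utility(outcome in that state).
def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Invented example: a sure 40 vs. a 50/50 gamble on 100 or 0.
safe = [(1.0, 40)]
gamble = [(0.5, 100), (0.5, 0)]

print(expected_utility(safe))    # 40.0
print(expected_utility(gamble))  # 50.0
```

On these (linear-utility) assumptions, EUT says take the gamble, since 50 > 40; with a concave utility function the ranking could reverse.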


Notes:

From Baron

Utility theory began in the 17th century. Its development since then was largely associated with economic theory, where it became part of the theory of the behaviour of buyers and sellers in markets. Psychologists became interested in utility theory in the early 1950s, soon after the publication of von Neumann and Morgenstern's (1947) Theory of Games and Economic Behavior.

Utility theory is a partial theory. It assumes that we have already before us our possible choices, our goals and all the evidence we need. We do not need to search for these. If the decision involves buying a car, for example, we already have a list of possible cars and the strengths and weaknesses of each.

EUT considers only part of the process behind making good decisions; it is not a complete normative model.


The Descriptive question – How do we think?

-> Enter Behavioural economics


Herbert Simon (Bounded rationality and Satisficing)

Herbert Simon is credited with kicking off the research area that we would now call behavioural economics.

Two key concepts: Bounded rationality & Satisficing

Theories of rational choice assume a knower with perfect information and unlimited time and memory. They assume we optimise.

But normal humans have to factor in the time and trouble it takes to get information, and process the odds.

A wealth of information puts pressure on our truly limited resource – our capacity to pay attention

Simon suggests that, given the costs of information, it makes no sense to optimise; instead we must satisfice – that is, settle for the first alternative that exceeds some benchmark of "good enough".
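A minimal sketch of the contrast, with invented options and scores:

```python
# Satisficing: take the first option that clears an aspiration level,
# instead of scanning everything for the maximum (optimising).
def satisfice(options, good_enough, score):
    for opt in options:
        if score(opt) >= good_enough:
            return opt
    return None  # no option met the threshold

# Invented desirability scores (0-10) for candidate apartments.
apartments = [("A", 5), ("B", 7), ("C", 9), ("D", 8)]

chosen = satisfice(apartments, good_enough=7, score=lambda a: a[1])
best = max(apartments, key=lambda a: a[1])

print(chosen)  # ('B', 7): the first option that is "good enough"
print(best)    # ('C', 9): what an unbounded optimiser would pick
```

The satisficer stops after two comparisons; the optimiser must inspect every option, which is exactly the information cost Simon says bounded agents avoid.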


https://www.economist.com/news/2009/03/20/herbert-simon


Notes:

Bounded rationality: HS challenges the ‘heroic abstraction’ – the idealisation of human rationality – embodied in neoclassical economics.

HS was (among other things) a cognitive psychologist, computer scientist and economist. His primary research interest was organisational decision making and he is best known for his theory of bounded rationality and satisficing.

In contrast to the tenets of economics, HS held that people do not seek to maximise their benefit from a particular course of action, as they cannot assimilate and digest all the information that would be needed to do so. Not only can they not get access to this information, but even if they could, their minds would be unable to process it. The human mind necessarily restricts itself; it is, as Simon put it, bounded by cognitive limits.

Hence people tend to seek something that is “good enough”, something that is satisfactory. This real world behaviour is what is called satisficing.

He applied this idea to organisations as well as individuals. He notes that a wealth of information just puts pressure on our truly limited resource – our capacity to pay attention. Much of his work was on decision making in organisations, and he wanted to find the techniques and processes that organisations could use to achieve the best results given the limits on rational decision making.

He was awarded the Nobel Prize for Economics in 1978, to his considerable surprise since by then he had not taught economics for two decades.



Kahneman & Tversky (1974) – Heuristics: How do people assess the probability of an uncertain event?

Q: How do people assess the probability of an uncertain event? (Should you accept that job, invest in that firm, marry that person?)

Kahneman & Tversky: they rely on a number of heuristic principles which can lead to biased decisions


“In general these heuristics are quite useful, but sometimes they lead to severe and systematic errors”

Representativeness

Availability

Adjustment and anchoring


The issue is not that people make mistakes, but that these mistakes are predictable and systematic*. They are ingrained in our human nature


* The error is not random – it is systematic (in a regression it would show up as a variable, not just random error)
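The regression analogy can be sketched with toy numbers (all invented): a systematic bias survives averaging, while random error washes out:

```python
import random

random.seed(0)
truth = 50.0
bias = 8.0  # systematic error: pushes every judgment the same way
# Each judgment = truth + systematic bias + random noise.
judgments = [truth + bias + random.gauss(0, 5) for _ in range(10_000)]

mean_error = sum(j - truth for j in judgments) / len(judgments)
print(round(mean_error, 1))  # close to 8.0: the bias is recoverable,
                             # because the random noise averages away
```

If the errors were purely random, the mean error would shrink towards zero; a stable non-zero mean is the signature of a systematic bias.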


Notes:

Tversky, A. and Kahneman, D., 1974. Judgment under uncertainty: Heuristics and biases. Science, 185(4157), pp. 1124–1131.

This is on the students' reading list – one of the best-known social science papers (over 46 thousand citations).

Heuristic = rule of thumb

How do people assess the probability of an uncertain event? T&K argue that people generally rely on a limited number of heuristic principles which reduce the complex task of assessing probabilities and predicting values to simpler judgemental operations. (As K puts it, when asked a difficult question the mind quite often substitutes an easier question.) In general, these heuristics are quite useful, but sometimes they lead to severe and systematic errors.

One of the key points of K&T's work on biases and heuristics is to show not just that we are irrational, but that we are predictably irrational. The point of BE is to reveal the systematic patterns in our departure from rationality. If there are systematic patterns in the way our decisions depart from rationality, then that is useful information that can help us make better decisions – for example, we can design systems that might help overcome these biases (e.g. how to avoid groupthink).



Prospect theory (Kahneman & Tversky, 1979)



3 Core Characteristics:

  1. We evaluate utility according to a reference point

  2. Loss aversion: Losses loom larger than gains. We are risk averse in the “gain” frame. We are risk seeking when things are framed as losses

  3. Probability - we overweight probability of rare events and underweight probability of common events
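These three characteristics can be sketched numerically. The functional forms and parameter estimates below come from Tversky and Kahneman's later (1992) cumulative version of the theory; the numbers are illustrative only:

```python
# Prospect-theory value and probability-weighting functions
# (Tversky & Kahneman 1992 parameter estimates, for illustration).
ALPHA, LAMBDA, GAMMA = 0.88, 2.25, 0.61

def value(x):
    """Value of a gain/loss x relative to the reference point (not total wealth)."""
    if x >= 0:
        return x ** ALPHA                 # concave for gains
    return -LAMBDA * ((-x) ** ALPHA)      # steeper for losses: loss aversion

def weight(p):
    """Decision weight: overweights small probabilities, underweights large ones."""
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

# Loss aversion: a loss of 100 hurts more than a gain of 100 pleases.
print(value(100), value(-100))   # roughly 57.5 vs roughly -129.5

# Rare events overweighted, near-certain events underweighted.
print(weight(0.01))  # noticeably greater than 0.01
print(weight(0.99))  # noticeably less than 0.99
```

Note the reference-point dependence is built in: `value` takes a change from the reference point, not a final state, which is what makes framing effects possible.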



Notes:

The details are interesting, but the key takeaway is that K&T show many situations where, if you manipulate how the choice is framed (while the actual outcome remains the same), you can change what people choose – from the point of view of EUT this is completely irrational.

Evaluation is relative to a neutral reference point. We evaluate the utility of different choices with reference to a neutral reference point. You can easily see this with an experiment – place three bowls of water in front of you: ice water on the left, warm water on the right, and room-temperature water in the middle. Put one hand in the ice water and the other in the warm water for about a minute, then put both into the room-temperature bowl. You will experience the same temperature as heat in one hand and cold in the other. For financial outcomes the usual reference point is the status quo, but it can also be the outcome you expect, or feel entitled to (for example the raise or the bonus that your colleagues receive).

Loss aversion: When directly compared to each other, losses loom larger than gains. This asymmetry has an evolutionary history – organisms that treat threats as more urgent than gains have a better chance of survival. [If you think on balance you are a merit student, you will get more pain from a pass than pleasure from a distinction.] Moreover, we are risk averse in the "gain" frame and risk seeking when things are framed as losses.

Probability: we overweight the probability of rare events and underweight the probability of common events. [i.e. we don’t worry enough about being knocked over by a car and worry too much about Ebola].

The illusion of control (Soane & Willman)

Fenton-O'Creevy et al. (including Emma Soane) argue that some behaviours that seem irrational, especially in the context of finance and economics, are in fact rational in terms of goal achievement and the maintenance of self-image and self-control. (We need to feel good about ourselves, so it is evolutionarily adaptive that we "fool ourselves".)

They look at one particular bias: The illusion of control. They argue that we are powerfully motivated to restore our sense of control when we feel it is threatened. In order to do so we need to be supported by a framework of self-belief, and to sustain this we need evidence. We do this by taking credit for positive outcomes and attributing failure to third parties.

They ask:

Does trading have a tendency to attract people who are prone to the illusion, or might it attract those who are immune to it?

Is the trading environment especially conducive to the formation of the illusion or is it one where the illusion is difficult to sustain?

Is the illusion of control detrimental to trading performance?

They tested this through a computer-based experiment in which traders played a computer game and were afterwards asked to rank their own performance on four dimensions. In reality the game's outcome, which the traders believed they could influence, was completely random.
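A toy simulation of that kind of set-up (all details invented) makes the mechanism concrete: the players' input never touches the outcome.

```python
# Toy sketch of an illusion-of-control game (details invented):
# players press keys believing they nudge an index, but the series
# is generated without ever reading those key presses.
import random

def run_round(key_presses, steps=50, seed=None):
    _ = key_presses               # input is deliberately ignored
    rng = random.Random(seed)
    index = 100.0
    for _ in range(steps):
        index += rng.choice([-1.0, 1.0])  # purely random walk
    return index

# Two very different "strategies" produce identical outcomes
# once the randomness is fixed.
print(run_round(["up"] * 50, seed=1))
print(run_round([], seed=1))  # same seed, same result: presses do nothing
```

Any performance differences players perceive in such a game can only come from their own attributions, not from the game itself.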

Findings:

Those who gave themselves very high scores on their own performance typically stressed that it was important to believe in yourself.

Those who gave themselves low scores stressed that it was important to honestly appraise one’s own performance.

They found that there was a statistically significant negative association between illusion of control and total remuneration and contribution to profits. Traders with a higher propensity to illusion of control were rated by their managers as less effective. In sum they found clear evidence that the illusion of control is associated with poorer performance and lower earnings.


