Does Loss Aversion Exist?
We’ve long been told that we feel losses more keenly than gains: losing $1000 will make us more unhappy than winning $1000 will make us happy. Many experiments point to the existence of loss aversion, although recent experimental results have led skeptical researchers to question its existence, or at least to argue that loss aversion is more complex than we first thought.
Nassim Taleb criticizes the ideas of risk aversion and loss aversion differently from the skeptical researchers. In his book, Skin in the Game, Taleb says “I believe that risk aversion does not exist: what we observe is, simply, a residual of ergodicity. People are, simply, trying to avoid financial suicide and take a certain attitude to tail risks.”
Taleb also says “Rationality is the avoidance of systemic ruin.” He rejects the idea that we are loss averse; we are simply avoiding things that could lead to financial ruin, death, or other permanent loss. “In a strategy that entails [a possibility of] ruin, benefits never offset risks of ruin.”
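Taleb’s “residual of ergodicity” point can be illustrated with a toy simulation (my construction, not Taleb’s): a repeated multiplicative bet whose ensemble average is favorable, yet whose time average ruins almost every individual player. A bet can look good averaged across many people while still being a path to ruin for any one person who keeps taking it.

```python
import random
from statistics import median

# Each round multiplies wealth by 1.5 (heads) or 0.6 (tails).
# Ensemble average per round: 0.5*1.5 + 0.5*0.6 = 1.05 (favorable),
# but time-average growth per round: sqrt(1.5 * 0.6) ≈ 0.95 (ruinous).

def play(rounds, rng):
    wealth = 1.0
    for _ in range(rounds):
        wealth *= 1.5 if rng.random() < 0.5 else 0.6
    return wealth

rng = random.Random(42)
outcomes = [play(100, rng) for _ in range(10_000)]
print(median(outcomes))  # the typical player ends up nearly wiped out
```

The per-round expectation is +5%, yet after 100 rounds the median player has lost almost everything; the ensemble mean is propped up by a tiny fraction of extreme winners. Avoiding such bets is not “irrational risk aversion.”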
So, let’s apply these ideas to a simple experiment. We offer a subject, Stan, a single toss of a fair coin: if it comes up heads, he wins $200; if tails, he loses $100. Suppose Stan rejects the offer. What are his possible reasons?
- Humans are wired with simple heuristics for avoiding ruin, and this offer triggered one of Stan’s avoidance heuristics.
- Stan doesn’t believe the coin is fair.
- Stan doesn’t believe he’ll be paid if he wins.
- Stan has a moral objection to gambling, or some other reason why gambling even once carries a high cost for him (a gambling addiction, for example).
- Stan has other financial risks in his life that combine to make a $100 loss potentially very painful right now.
- Stan is so poor that the cost of a $100 loss is greater than the gain of a $200 win.
Whether we call this “risk aversion,” “loss aversion,” or something else, it’s clear that the vast majority of subjects who reject a 100/200 coin toss but would accept a 100/300 coin toss are making a mistake in the 100/200 case: the 100/200 toss has a positive expected value of $50, and a one-time $100 loss poses no risk of ruin. Even though mental heuristics for avoiding ruin serve us well and served our ancestors well, Stan applied them in this case to reject a beneficial opportunity.
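The arithmetic behind calling the 100/200 rejection a mistake is simple expected value. A minimal sketch (the helper name is mine, not from the article):

```python
def expected_value(p_win, win, loss):
    """Expected value of a single win-or-lose bet:
    p_win * win - (1 - p_win) * loss."""
    return p_win * win - (1 - p_win) * loss

print(expected_value(0.5, 200, 100))  # 100/200 toss → 50.0
print(expected_value(0.5, 300, 100))  # 100/300 toss → 100.0
```

Both bets have positive expected value, so rejecting the 100/200 toss while accepting the 100/300 toss can’t be explained by expected value alone; something else, such as a ruin-avoidance heuristic, is doing the work.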
Taleb doesn’t like labeling Stan “irrational,” but no matter what label we choose, Stan made a choice against his own interests. This doesn’t mean that Stan isn’t well-served on the whole by his mental heuristics for avoiding ruin; it’s just that they didn’t serve him well in this case.
The cost of this mistake is quite low (less than $50), but similar mistakes have much higher costs. One example is portfolio allocation. Many people live in poverty in old age because of a lifetime of avoiding any investment riskier than bank deposits. It’s certainly possible to achieve better returns on savings without incurring the risk of ruin.
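One illustrative way to take favorable risk without courting ruin (my example, not from the article) is Kelly-style position sizing: stake only a fixed fraction of your bankroll each round. For a fair coin paying 2:1, like the 100/200 bet, the Kelly fraction is f* = p − q/b = 0.5 − 0.5/2 = 0.25.

```python
import random

def kelly_fraction(p_win, payout_ratio):
    """Kelly-optimal fraction of bankroll to stake:
    f* = p - q / b, where b is the payout per unit staked."""
    return p_win - (1 - p_win) / payout_ratio

def simulate(fraction, rounds, seed=0):
    """Repeatedly bet `fraction` of the current bankroll on a
    fair coin paying 2:1 (the 100/200 bet, rescaled)."""
    rng = random.Random(seed)
    bankroll = 1.0
    for _ in range(rounds):
        stake = fraction * bankroll
        if rng.random() < 0.5:
            bankroll += 2 * stake  # win 2x the stake
        else:
            bankroll -= stake      # lose the stake
    return bankroll

f = kelly_fraction(0.5, 2)   # 0.25 for the 100/200 bet
print(simulate(f, 1000))     # fractional betting: bankroll grows
print(simulate(1.0, 1000))   # all-in every round: the first loss is ruin
```

Betting a quarter of the bankroll compounds steadily, while going all-in each round guarantees eventual ruin despite the identical positive expected value per toss. The same logic applies to portfolio allocation: taking sized, diversified risk is very different from avoiding risk altogether.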