
The Righteous Mind by Jonathan Haidt

Thursday, December 13, 2018

At first I wasn't going to write about this book because I wasn't sure I could be objective about it. I have developed some theories about psychology, and this book offers evidence for many of them. I loved the book, but I was unsure whether I could say anything meaningful about it, as confirmation bias made objectivity difficult. However, after repeatedly bringing the book up in conversation, I decided I should write about it.

Ostensibly the book is about moral psychology: how and why people develop their moral beliefs and opinions. To answer these questions we need to go deeper into how the mind works, which is what I found most interesting. The common conception of the conscious vs unconscious mind holds that the conscious mind - the part of everyone that can think things through in words and logic - is in charge and the unconscious mind is mostly irrelevant. It may occasionally bubble some things up to the conscious mind, but it plays little to no role in making decisions.

I have written about my theory before, but basically I believe that the inverse is true. My previous metaphor had been that the conscious mind is like a sports announcer and the unconscious is the sports game. The announcer describes the action, but doesn't control it; that is, the conscious mind is more about explaining and rationalizing the decisions made by the unconscious mind than actually participating in those decisions. Mr. Haidt provides a much better metaphor - he calls it the elephant and the rider. The elephant is the unconscious mind and the conscious is a person riding the elephant. The rider thinks the elephant does what he tells it to, but most of the time the elephant does what it wants. Haidt refers to the rider as a PR agent, who rationalizes the actions of the elephant and explains them to the rest of the world.

Looking at morality from this viewpoint offers a very different perspective. Typically one would say that one's beliefs and morality are arrived at by the conscious mind making decisions based on evidence. Mr. Haidt says that morality is instead decided by the elephant, independently of the conscious mind, which then goes about constructing post hoc explanations and arguments for it. This explains why it is so difficult to change people's opinions with facts, evidence, or discussion. Attempting to convince someone that they are wrong assumes that the rider can change the opinion of the elephant. While this is possible, it tends to be rare - if the elephant doesn't have a strong opinion on a subject then the rider can maybe influence it; the rest of the time the rider has very little say.

The common misconception of a rational mind ruled by the conscious underlies a lot of important assumptions upon which our civilization is based. For example, all modern economic theory is based on the idea that humans are rational actors who make all decisions based on maximizing their self-interest using the available information. If this were true then advertising as we know it wouldn't work - everyone would always choose whatever product offered the best value at the best price. The economic man is so far from reality that there is a joke that economics studies an imaginary species - homo economicus. Despite the fact that homo economicus doesn't behave anything like homo sapiens, all economic theory is based on it, and the entire global neoliberal order is based on those economic theories.

The reason I feel so strongly about this subject is that you can't build theories on top of incorrect assumptions. Because these mistaken assumptions about psychology and how the mind works are so prevalent, they corrupt an enormous amount of the theory our society is based on - from economics to criminal justice to psychology. Unfortunately I think these biases are inherent to all life forms with nervous systems. As I recently wrote about people's desire for immortality, I think that these biases are deeply ingrained in us by evolution. Just as no one can imagine being dead, most people can't imagine that their conscious mind is not in complete control. Because it is the conscious mind that does the imagining, it is difficult for it to imagine itself out of the picture, and most people do not like to realize that they have less control over things than they think they do. However, until these illusions are dispelled, we are not going to be able to create social systems that are based on evidence and reality rather than wishful thinking.

Labels: books, psychology

In my opinion, the work of Kahneman and Tversky is probably the most important work in psychology and economics in the last century. I mention cognitive biases in conversation on a daily basis. Many years ago I was considering doing a PhD in economics, but I gave up the idea when I realized that economic theory is based on a completely invalid assumption - that people are rational and make decisions based on self-interest. If this were true, the entire advertising industry would not exist. The concepts of brands, corporate image - basically everything that goes into modern advertising - are direct proof that humans are not rational.

Daniel Kahneman and Amos Tversky revolutionized psychology with their work on decision making under uncertainty. Kahneman describes the mind as consisting of two systems - System 2 is what people normally think of the mind - the part that thinks in words, analyzes things and makes logical decisions. System 1 is what most people refer to as the "unconscious" mind, the part that reacts and makes judgements without analysis. Their work reveals that System 1 is far more important than most people think.

Far from making logical, rational decisions based on evidence, most decisions are made using heuristics and cognitive biases. A heuristic is a rule of thumb - an easy way to answer difficult questions, usually by substituting simpler questions which are similar. For example, if you are asked how worried you are about terrorist attacks, your reply has nothing to do with the actual rate of attacks. Unconsciously, you recall however many attacks come to mind and answer based on that. So if you see a lot of terrorist attacks on the news, you probably think the chances of being affected by one are far greater than they actually are. This is called the availability heuristic - people judge how common something is by how easily they can remember instances of it.

Cognitive biases are exactly what they sound like - unconscious biases that affect how we process information. My favorite is the confirmation bias, which predisposes people to accept information that is consistent with their opinions and beliefs and reject information that is not. Say, for example, I think that all immigrants are criminals. If I hear about a crime committed by an immigrant I think, "there's proof that I'm right." And if I hear statistics that say that immigrants are less likely to commit crimes, I just ignore them or say that the source must be wrong. If you think about it, this is really quite simple and obvious, but until Kahneman and Tversky wrote about it, no one had formally described it.

Kahneman says that System 2, the thinking part of the brain, is lazy and does as little as possible, relying mostly on snap judgements made by System 1. Personally I would go even further than that. I think that System 2's primary purpose is communicating thoughts and ideas by putting them into language, and its ability to think logically is just a side effect. I think that unless one makes a purposeful effort to think through something logically, what System 2 will do is try to put what System 1 has already decided into words. In other words, when we think we are thinking through a decision, we are really just rationalizing what System 1 wants. Of course there are exceptions to this, and people can and do use System 2 to analyze data and make decisions, but in my opinion, most of what System 2 does is rationalizing.

Since Kahneman and Tversky released their paper on decision making under uncertainty, hundreds of other heuristics and cognitive biases have been discovered and named. While we humans tend to think of ourselves as highly evolved and highly intelligent, all of this work tends to show that we make decisions in a much less thoughtful way than we think. When we make judgements under uncertainty, for the most part we do not analyze the evidence. Instead we use heuristics and biases to "estimate" the answer to the question by substituting answers to somewhat similar questions, or maybe we just rationalize whatever our subconscious mind has already decided.


Labels: books, psychology

The Undoing Project by Michael Lewis

Monday, January 22, 2018

I have always been fascinated by the various ways in which our minds fail us. While we think we are very good at making decisions based on data and evidence, in fact we tend to use heuristics, or rules of thumb, to come to quick and easy decisions when there is uncertainty involved. Daniel Kahneman and Amos Tversky pioneered research into this field and discovered what they call "cognitive biases," which are systematic deviations from rational judgment and decision making. I first heard about these many years ago, specifically about how the anchoring bias was used in marketing. As the years have gone by I've become more and more interested in the subject.

One cognitive bias you may have heard of is the "confirmation bias," whereby you interpret new information in such a way as to uphold your existing beliefs. This is why it is so hard to convince anyone that their opinion is wrong. When we hear information that contradicts our beliefs, we tend to either dismiss it or rationalize it away. This is how Trump supporters can dismiss negative news as "fake news" and Clinton supporters can dismiss any of her scandals as "right wing conspiracy theories." There are many other biases, such as the aforementioned "anchoring bias," whereby you tend to evaluate numbers in relation to other numbers rather than on their own. This is why gas stations in America sell three octanes at three prices. Seeing the cheapest grade at the lowest price and the premium at the highest, people think the one in the middle is the best deal, even though it would often be cheaper to mix the cheapest and most expensive grades yourself. You don't judge the price by itself - you judge it in relation to other prices.
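The octane arithmetic above can be sketched with a few lines of Python. The prices and octane ratings here are purely hypothetical assumptions for illustration, and the sketch assumes octane ratings blend roughly linearly by volume:

```python
# Hypothetical pump prices in USD per gallon (illustrative, not real data).
PRICES = {87: 3.00, 89: 3.40, 91: 3.60}  # regular, mid-grade, premium

def premium_fraction(low, high, target):
    """Volume fraction of the high-octane fuel needed so a linear
    blend of two grades hits the target octane rating."""
    return (target - low) / (high - low)

# A 50/50 blend of 87 and 91 octane yields 89 octane.
frac = premium_fraction(87, 91, 89)
blend_price = (1 - frac) * PRICES[87] + frac * PRICES[91]

print(f"89-octane at the pump:    ${PRICES[89]:.2f}")
print(f"50/50 blend of 87 and 91: ${blend_price:.2f}")
```

With these example numbers the blend costs $3.30 per gallon versus $3.40 for mid-grade at the pump, because mid-grade is typically priced above the midpoint of the other two grades - the anchoring at play.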

The list of biases and heuristics goes on and on - and they apply to PhD-level statisticians as well as everyone else. Our brains are just not designed to process lots of information, so we use shortcuts to make judgments instead. Why are people scared of sharks when the chances of being attacked by a shark are lower than those of being hit by lightning? Because shark attacks are dramatic and memorable, and whenever there is one it is shown all over the TV news, so we tend to remember it more. This is called the "availability heuristic," and it explains why people are more worried about terrorism than about heart attacks.

This book is the story of how Kahneman and Tversky met and conducted their research. Like many of Lewis's books, it provides a good summary for the layperson of a very complicated subject, without going into too much technical detail. The subject is personal for me because the deficits in our ability to analyze and process data are part of what led me into data science. However, this book should be an enjoyable read for anyone.

Labels: books, economics, psychology
