
20 cognitive biases that screw up your decisions.

Status
Not open for further replies.

entremet

Member
I've always enjoyed the study of these. I also find myself still falling for many of them. As the celebrated physicist Richard Feynman said, "The first principle is that you must not fool yourself, and you are the easiest person to fool."

This graphic is a nice summary of the major ones. There are many more, though.

Which ones do you fall prey to the most?

[Image: bi_graphics_20-cognitive-biases-that-screw-up-your-decisions.png]


http://www.businessinsider.com/cognitive-biases-that-affect-decisions-2015-8
 
D

Deleted member 13876

Unconfirmed Member
If anyone is interested in this, read Daniel Kahneman's Thinking, Fast and Slow for more in-depth info.
 

entremet

Member
If anyone is interested in this, read Daniel Kahneman's Thinking, Fast and Slow for more in-depth info.

It's on my to-read list, lol. Looks very meaty; is it worth it? I find most nonfiction very talky and not well edited.
 

lazygecko

Member
I'm not sure what they mean by "experts" in #12. My understanding was that the more informed you are, the less conviction you have. Knowing enough to know how little you know and all.
 
Ah, this is taking me back to Social + Cognitive Psych. I've been guilty of a couple of these (3, 4, 7, and especially 16), though I've been trying to correct that.
 

Diseased Yak

Gold Member
Poor N-Gage

LOL, god how I wish this was the first reply! It's soooo GAF.

I tend to fall victim to the Overconfidence bias, especially in my career. I've been in IT for 20+ years, have seen and done it all (almost... see what I mean?!), and often assume I know how to handle issues that pop up. Once or twice a year, I'm humbled.
 

entremet

Member
Ah, this is taking me back to Social + Cognitive Psych. I've been guilty of a couple of these, though I've been trying to correct that.

Everyone is guilty of these, most of us many times per day. The important thing about knowing them is awareness.

Incidentally, the scientific method is a check on many of these, but it's not applicable to everything and it's too time-consuming for daily decisions.
 
I'm not sure what they mean by "experts" in #12. My understanding was that the more informed you are, the less conviction you have. Knowing enough to know how little you know and all.

The more you know about something, the more skeptical you are about things that don't fall in line with what you've seen for years. The bias is simply that you give radical ideas a harder time than what seems intuitive to you.
 

Harusame

Member
Another bias I would add to the list is the sunk cost fallacy. It occurs when a person invests time or money into something and then comes up with justifications to support that investment. The person's justifications and rationale may not hold up, and if a counter-argument comes along, they may go on the defensive and, in turn, reinforce those justifications.

This fallacy sometimes occurs in console war arguments.
 

entremet

Member
Overconfidence bias is also related to the Dunning-Kruger effect.

Its most common illustration is people rating their own driving abilities, lol.
 

RM8

Member
LOL, god how I wish this was the first reply! It's soooo GAF.

I tend to fall victim to the Overconfidence bias, especially in my career. I've been in IT for 20+ years, have seen and done it all (almost... see what I mean?!), and often assume I know how to handle issues that pop up. Once or twice a year, I'm humbled.
How did GAF react to the reveal of the N-Gage? I know GAF thought DS (AKA the second best selling gaming device ever made) was being sent to die, lol.
 

zou

Member
10 isn't actually true and shouldn't be on the list. If it were the case, people wouldn't sell their stocks at the bottom, and not checking your portfolio often is actually a good thing: less opportunity to screw up.
 

entremet

Member
10 isn't actually true and shouldn't be on the list. If it were the case, people wouldn't sell their stocks at the bottom, and not checking your portfolio often is actually a good thing: less opportunity to screw up.

I've seen this all the time at work. Not the stock example, but the principle.
 

Haly

One day I realized that sadness is just another word for not enough coffee.
I love the irony of 8 when it comes after 7. The "myth of the flat earth" is just one of those things that feeds the confirmation bias of those who would like to believe that medieval society was just so hopelessly backwards and Columbus had to sail around the world to prove that the Earth is spherical, or some similarly theatrical narrative.

A part of me wishes this was intentional, but I very much doubt it is.
 
Thinking, Fast and Slow is a great read; it's the guy's life work and filled with useful information. It's not dry in the least, either: he's had a ton of interesting experiences and uses stories from his life to demonstrate things. Probably the best of... those types of books I've read. Dan Ariely is good too. Sway and The Drunkard's Walk are good reads also.

Whenever I finish one of these books, I convince myself I'll use some of the information because it makes so much sense as I read. I haven't been too successful, really, but I try.
 
I can't think of an example for #9.

I assume it's a roundabout way of explaining the concept of "overthinking" something. The Wikipedia page has an example here:

A female patient is presenting symptoms and a history which both suggest a diagnosis of globoma, with about 80% probability. If it isn't globoma, it's either popitis or flapemia. Each disease has its own treatment which is ineffective against the other two diseases. A test called the ET scan would certainly yield a positive result if the patient had popitis, and a negative result if she has flapemia. If the patient has globoma, a positive and negative result are equally likely. If the ET scan was the only test you could do, should you do it? Why or why not?

Many subjects answered that they would conduct the ET scan even if it were costly, and even if it were the only test that could be done. However, the test in question does not affect the course of action as to which treatment should be given. Because the probability of globoma is so high (80%), the patient would be treated for globoma no matter what the test says. Globoma is the most probable disease before or after the ET scan.
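You can verify that conclusion with a quick Bayes' rule sketch. One assumption here that the quoted text doesn't specify: the remaining 20% is split evenly between popitis and flapemia (any split gives the same qualitative result).

```python
# Priors: 80% globoma; assume the remaining 20% splits evenly
# between popitis and flapemia (the quoted text doesn't say).
priors = {"globoma": 0.8, "popitis": 0.1, "flapemia": 0.1}

# P(positive ET scan | disease), as given in the scenario.
p_positive = {"globoma": 0.5, "popitis": 1.0, "flapemia": 0.0}

def posterior(result):
    """P(disease | scan result) for each disease, via Bayes' rule."""
    likelihood = {d: p_positive[d] if result == "positive" else 1.0 - p_positive[d]
                  for d in priors}
    joint = {d: priors[d] * likelihood[d] for d in priors}
    total = sum(joint.values())
    return {d: joint[d] / total for d in priors}

# Globoma stays at probability 0.8 whether the scan is positive or
# negative, so the scan can never change the treatment decision.
print(posterior("positive"))  # globoma: 0.8
print(posterior("negative"))  # globoma: 0.8
```

Under these assumed priors the scan is literally zero-information with respect to the treatment choice, which is exactly what makes ordering it an instance of information bias.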
 
It's interesting to think of all the ways we fool ourselves, particularly with regard to memory. I read an explanation a while back about a bias (maybe the Von Restorff effect?) that explains why it often feels like certain bad things or choices happen to us disproportionately: you feel that the drivers around you are terrible most of the time, or that when you choose a line in the grocery store, it's almost always the longer one. Basically, the brain doesn't try too hard to remember very common, everyday events, but it does a good job of remembering things that are out of place, leading you to think that these things happen far more often than they actually do.
 

cameron

Member
*laughs nervously at #4*

A dangerous example with #20 -> Zero-risk bias
Many parents would prefer not to vaccinate their children because of the perceived risk of vaccine injury. Although such risks are extremely small (virtually zero in the case of autism), many would prefer to avoid vaccines and those small risks. In reality, infectious diseases pose a large risk to the public, and vaccines reduce those risks significantly; however, they do not reduce them to zero. Many would prefer driving the tiny vaccine-injury risk to exactly zero over the much larger risk reduction that vaccines offer.
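The shape of that trade-off is easy to see with made-up numbers. None of the figures below come from the post; they're illustrative assumptions only, chosen so the disease risk dwarfs the vaccine-injury risk:

```python
# Hypothetical probabilities of serious harm (illustrative only).
p_disease_unvaccinated = 1e-2   # harm from the disease if unvaccinated
p_disease_vaccinated   = 1e-4   # vaccine cuts that risk ~100x, but not to zero
p_vaccine_injury       = 1e-6   # tiny residual risk from the vaccine itself

total_if_vaccinated   = p_disease_vaccinated + p_vaccine_injury
total_if_unvaccinated = p_disease_unvaccinated  # vaccine-injury risk is zero here

# Zero-risk bias: preferring the option that drives ONE risk (vaccine
# injury) to exactly zero, even though total risk ends up far higher.
print(total_if_unvaccinated / total_if_vaccinated)  # roughly 99x riskier overall
```

Under these assumed numbers, eliminating the one-in-a-million risk costs you roughly a hundredfold increase in overall risk, which is the bias in a nutshell.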
 

DaveH

Member
If anyone is interested in this, read Daniel Kahneman's Thinking, Fast and Slow for more in-depth info.
I second that. Additionally: Blink by Malcolm Gladwell (less as science, more about defying your intuitions), Dan Ariely (more psychology), and Richard Thaler (applied psychology).
 

DaveH

Member
It's on my to-read list, lol. Looks very meaty; is it worth it? I find most nonfiction very talky and not well edited.
It's very engaging and interactive because it invites you to take part in a number of the thought experiments the book is based on; in doing so, your own cognitive processes are revealed to you, which is generally very interesting.

As the book gets more nuanced and mathy, you might be less engaged, but if you really like the tools it provides, it shows you how to mix, match, and apply them beyond the more basic experiments so you can understand how more sophisticated systems and choices are made.

I think most people adore the first third of the book, really like the second third, and then tend to taper off.
 

Astral Dog

Member
Just yesterday I was reading this; a lot of these make sense.
It's always interesting how we adapt our own minds to reality for our convenience.
 
D

Deleted member 13876

Unconfirmed Member
I second that. Additionally: Blink by Malcolm Gladwell (less as science, more about defying your intuitions), Dan Ariely (more psychology), and Richard Thaler (applied psychology).

Dan Ariely is also great and a really entertaining speaker. He had a Coursera course on behavioral economics that was super interesting.
 

DaveH

Member
Aren't 8 and 15 the opposites of each other?
Not every bias applies all the time to every situation to every person.

Even so, opposite biases aren't an issue... you can have three people look at a set of data, two with opposite biases and the third with the neutral ideal (taking the data for what it is rather than assuming additional significance based on how established or recent the finding is).
 