Major Flaws of Human Thinking
I don't consider these biases "Major Flaws" or flaws at all. IMO these observations describe heuristics that actually work amazingly well for most everyday situations, and one shouldn't be quick to discard them in favor of some deeper, system-2-esque idea. This is similar to how optical illusions don't show flaws in our vision but rather highlight how well we can correctly infer non-trivial information in "normal" situations.
For example, in the linked example of "Conformity", a person taking a test with several other people doesn't immediately leave the room when observing smoke and hearing a smoke alarm. This is, IMO, a completely legitimate response: if several other people are calmly carrying on with their business, it's much more reasonable to assume they know something you don't than that they are all lemmings sitting oblivious to their own demise (or that they are all goons in on a televised hoax at your expense). Just imagine how intolerable life would be if every single person insisted on investigating every request or stated fact for themselves, only to discover that the request is usually logical and the fact true (consider, for example, a "Dead End" sign with every single driver checking whether it's actually true).
Obviously the edge cases of these heuristics (the "illusions") should be examined, and care taken when appropriate, but it's important to remember that these human tendencies are basically what allows complex societies (with many individuals holding partial knowledge) to function.
It's good to periodically see these articles get traction. There are so many logical fallacies that we need to remain vigilant in guarding against them.
Lately, in response to the relatively intense polarization of beliefs, I've been contemplating open-mindedness, and whether discussions starting with establishing an open-minded context could get people talking again.
To me, open-mindedness means that you're receptive to new ideas and experiences and generally do not reject them outright. When presented with implausible or seemingly impossible information, an open-minded person will listen to the basis of ideas, concepts and information and weigh that basis before making a determination about whether or not the idea is true and worth assimilating. Or they may try a new experience with less pre-judgment of whether they'll appreciate it.
In my view, there is a critical nuance in open-mindedness: It applies whether or not the idea is consistent with what I already believe to be true. As an open-minded person, I feel that I need to regularly reevaluate my beliefs as new information comes to light, even if that new information directly contradicts a basis for a firmly held belief.
I'm curious to hear examples where you've learned new information that contradicted what you already firmly believed and, through open-mindedness, changed your belief accordingly.
This is an interesting thread as I'm seeing a false dichotomy form. Either these phenomena are flaws and should be abandoned or they are often helpful and therefore should be adhered to.
But I think there is a middle ground, or maybe it's not in the middle but off to the side somewhere. We can be aware of these biases, try to be conscious of when we are relying in part or entirely on them, and still choose to abide by them, especially in the absence of better information.
Understanding that one is behaving out of conformity or conservatism or another bias doesn't consign one to interminable mental litigation. You can be aware of this fact and still know you do not, and may never, have sufficient information to challenge and overturn the bias.
However, when we suspect we are acting solely out of one of these biases and we see a compelling reason to challenge it, that awareness becomes the tool that allows us to change. A compelling example is the acceptance of homosexuality in Europe and the U.S. The conformist beliefs that were leading to the suppression of individuals were doing harm. As more people "came out", the conformist foundation of the suppression arguments weakened. For most, I think it has become clear that the risk of overruling conformity was small in the face of the material harm homosexual individuals were enduring.
Speaking for myself, the biggest problems in my thinking are as follows:
1. Thinking you know anything with an even remotely high degree of certainty.
2. Confirmation Bias - thinking you are right and searching for evidence to prove you are right, reaffirming your own beliefs in the process and actively ignoring or providing excuses for counter examples and contradictory evidence.
3. Difficulty/Inability of looking only at the facts without interpreting them as you would like them to be or not to be.
4. Inability/Difficulty of admitting you are wrong/made a mistake, and also remembering it and learning from it.
5. Inability/Difficulty of acknowledging you are an irrational, flawed monkey stumbling around trying to make sense of about a billion things you don't understand and never will understand.
Trying to still make progress given all the above.
I don't agree with the premise that these "flaws" are wrong and must be "removed". If you were to remove them you would probably get a perfectly rational being, like a computer, and computers are really stupid.
Being alive is better than being correct; rationality is correct only if all the premises are correct and you have perfect information. So to manage this world of uncertainty our brain must use heuristics, heuristics that are really good, so good that we somehow survive.
Nature would say "It's a feature, not a bug".
I tried to read past the headline-grabbing title and the first point, but since that point is really three distinct points already, AND wishful thinking can also sometimes be seen as a form of resilience rather than just a terrible thing, AND the focus on conservatism seems more emotional than pragmatic, I checked out before the end.
Perhaps a less self-aggrandizing and hand-wavy way to address the subject would be to actually look at proper research, such as the cognitive distortions identified by CBT in psychology[1], which are incredibly helpful to recognize and identify in oneself in order to correct both the big issues (some aspects of depression, anxiety) and small ones (everyday life for most people).
[1] https://www.psychologytools.com/articles/unhelpful-thinking-...
Ironically the biggest cognitive flaw is believing that these apply to other people but not to yourself.
I saw a comment from someone at the Center for Applied Rationality a while ago saying that one of the top reasons people want to take their workshop is to be able to pinpoint issues in other people's thinking.
I disagree that conservatism is a major flaw of human thinking, or even a flaw at all. I would argue that conservatism is one of the core principles of science. Although the quantity and quality of data sufficient to change one's views can be (and frequently is) debated, a healthy amount of stability and skepticism is required for science to be efficacious.
A similar argument could be made for conservatism's relationship with social structures. Rapidly changing social structures lead to unpredictable futures, which can lead to economic insecurity at the individual level.
We evolved to think the way we do over millions of years. We should not just dismiss a pattern of thought as a flaw because we don't understand its usefulness. It might be a flaw - the world today is different in many ways from the world our thinking evolved for - but at the very least we have the burden to explain why this pattern evolved, and why it's harmful now.
My favorite: FAE - Fundamental Attribution Error
"the tendency for people to under-emphasize situational explanations for an individual's observed behavior while over-emphasizing dispositional and personality-based explanations for their behavior"
e.g.: when you're late, some external force is the cause; when a co-worker is late, he's just "the type of person who is always late".
If you want a much deeper investigation of this topic https://www.lesswrong.com/about has been doing work in this area for over a decade as part of the self-named "rationalist" community.
> conservatism — an insufficient ability to change our common views and beliefs with the new data, new evidence
I've been able to counter this somewhat by adopting a default assumption that my understanding is incomplete. That's an easier self-sell than being wrong.
He's disabled the right-click -> copy option. A flaw in thinking, IMO. The "Absolute Enable Right Click and Copy" extension restores the functionality.
>Another flaw of our thinking is conservatism — an insufficient ability to change our common views and beliefs with the new data, new evidence. This is understandable — changing basis views leads to a reconsideration of all related knowledge. An enormous amount of rebuilding is required. Our biological brains just can’t do that in a short time, also it’s much harder with age. Because of this, we give much more weight to old knowledge rather than new evidence, thus making a conservatism bias.
Painting conservatism as purely a human flaw is poor thinking in itself. The positive side of conservative thinking was neatly summarized by G.K. Chesterton: "Do not remove a fence until you know why it was put up in the first place."
There may be things you don't know, or which have never been known, which the conservative solution to a problem addresses either by accident or design. For examples, see traditional methods of food preparation. Indigenous peoples in South America have passed down numerous traditional recipes for the cassava plant. Every recipe involves a process to remove cyanogenic glycosides [0], which are present in all parts of the cassava plant and lethal if ingested. These recipes were developed without any modern understanding of chemistry or toxicology. Deviation from the old recipes without this knowledge, e.g. to apply a new time-saving cooking technique, could have disastrous consequences. Old knowledge often runs deep.
[0] https://www.theguardian.com/science/blog/2017/jun/22/cassava...
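As an aside, the under-weighting of new evidence that the quoted passage calls the conservatism bias can be made concrete with a toy Bayesian model (all numbers below are made up for illustration): an ideal agent multiplies its prior odds by the full likelihood ratio of each observation, while a "conservative" agent applies only a fraction of that update.

```python
# Toy sketch of the conservatism bias: a rational agent applies Bayes'
# rule in full, while a conservative agent under-adjusts toward new
# evidence. The likelihood ratio and damping factor are hypothetical.

def bayes_update(prior, likelihood_ratio):
    """Exact Bayesian update in odds form: posterior odds = prior odds * LR."""
    odds = prior / (1 - prior)
    odds *= likelihood_ratio
    return odds / (1 + odds)

def conservative_update(prior, likelihood_ratio, damping=0.5):
    """Under-update: apply only a fraction of the likelihood ratio's force."""
    odds = prior / (1 - prior)
    odds *= likelihood_ratio ** damping  # classic under-adjustment
    return odds / (1 + odds)

prior = 0.5
lr = 4.0  # each observation is 4x more likely if the new idea is true
p_rational, p_conservative = prior, prior
for _ in range(3):  # three independent pieces of evidence
    p_rational = bayes_update(p_rational, lr)
    p_conservative = conservative_update(p_conservative, lr)

print(round(p_rational, 3))      # 0.985
print(round(p_conservative, 3))  # 0.889
```

After the same three observations the conservative agent still assigns noticeably less credence to the new idea; more evidence is needed to move it the same distance, which matches the "much more weight to old knowledge" the article describes.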
There are a variety of places you could look further on this topic, but I’ll throw in a recommendation for the book Seeking Wisdom by Peter Bevelin.
I agree with most of the patterns from the article. I've come across them in "Thinking, Fast and Slow."
Daniel Kahneman states that most of the flaws from the article come from "system 1", which is responsible for making quick decisions without "deep thinking".
I'd strongly recommend the read. It helped me act less reactively.
The biggest flaw in human thinking, even bigger than all the ones in the article, is the inability to admit you're wrong on a consistent basis.
We are incapable of doing this, especially when we've already invested years and years of our lives into object oriented programming.
Humans always like to construct logical scaffolds around a biased agenda and they are unable to deconstruct that scaffold when presented with contradictory evidence.
And no one is consciously aware that this is happening. Contradictory evidence can be right in front of their eyes, but if the programmer already has 10 years of Object Oriented Programming under his belt he's not going to flip on a dime and admit that he's been doing it wrong for 10 years; it just doesn't happen. Instead the person unconsciously recreates a logical perspective of the universe that fits his personal agenda.
Another thing to consider is that this and the flaws mentioned in the article exist within our minds because they aided in our survival. These "flaws" were biologically advantageous and that's why we think this way.
Maybe lying to yourself is technically wrong, but in the end you may be better for it, especially when you've already invested so much time reading and practicing Object Oriented Programming.
Does anyone have any examples of their own illogical and illusory scaffolds about world views that they've built to support their biased agenda?
If so please reply! I set this post up so that it will be very easy for you to find your own examples.
Politics in sheep's clothing.
I think all of these flaws stem from cognitive dissonance and our desire to avoid uncomfortable thoughts. The route of fooling yourself is always the easier one.
"people like to", "some people are", "People are", "Humans are". Is this flaw number 1?
Surely at least one of the flaws in human thinking is thinking that one can simply remove the flaws by enumerating and preventing them.
I think this article ironically overgeneralizes certain issues. For example, conformism is good when it comes to vaccination or social distancing. People are more comfortable with a stranger injecting a liquid into their bodies when they know everyone else has let it happen as well. More conforming countries are handling the pandemic better. The difficult question is what heuristics can tell us when it's fine to take a mental shortcut.
I wonder if philosophy can solve the first 4?
The latter ones through science/logic, where possible.
This is 100% unbiased and objective. It is painfully obvious that all those knuckle-dragging conservatives and religious people, who are categorized in the very first part as narrow-minded by this screed, have no business as part of the public discourse. Clearly the work of a deep thinker.
This is basically high-school level "philosophical" musing and has about as much application in real life, which is none.
There are no major flaws in human thinking; there is human thinking, framed. There are thousands of years of human thinking on human thinking that one could refer to, and some do. Ignoring the question of what a human subject, reality, and thought are maps this as just another polemic cultural battle of the Global West. It's super fun. Start maybe at Plato, Kant, Hegel, Foucault, Badiou, the Ljubljana school, Mbembe, Moten/Harney/Undercommons, Agamben, Butler, Spivak, Hall, and on and on. So many people dedicating their lives to this question always seems like a good place to start, if one is seriously asking the question and not just reproducing the status quo. Which, if that is the purpose of this forum, I apologize.