The Great Mental Models Vol 1 by Shane Parrish

The secret to better decision-making is learning things that won’t change. Mastering a small number of versatile concepts with broad applicability enables you to rapidly grasp new areas, identify patterns, and understand how the world works. Don’t waste your time on knowledge with an expiry date – focus on the fundamentals.

Nothing has yet been said that’s not been said before.

Mental models mentioned in the book you may want to look into further:

  1. The Map is Not the Territory
  2. Circle of Competence
  3. Second-order Thinking
  4. Probabilistic Thinking
  5. Correlation Over Causation
  6. Regression to the Mean
  7. Inversion
  8. Occam’s Razor
  9. Hanlon’s Razor

In life and business, the person with the fewest blind spots wins. Removing blind spots means we see, interact with, and move closer to understanding reality. We think better. And thinking better is about finding simple processes that help us work through problems from multiple dimensions and perspectives, allowing us to better choose solutions that fit what matters to us. The skill for finding the right solutions for the right problems is one form of wisdom.

A mental model is simply a representation of how something works. We cannot keep all of the details of the world in our brains, so we use models to simplify the complex into understandable and organizable chunks.

When understanding is separated from reality, we lose our powers. Understanding must constantly be tested against reality and updated accordingly.

Our failures to update from interacting with reality spring primarily from three things: not having the right perspective or vantage point, ego-induced denial, and distance from the consequences of our decisions.

The second flaw is ego. Many of us tend to have too much invested in our opinions of ourselves to see the world’s feedback — the feedback we need to update our beliefs about reality. This creates a profound ignorance that keeps us banging our head against the wall over and over again. Our inability to learn from the world because of our ego happens for many reasons, but two are worth mentioning here.

The third flaw is distance. The further we are from the results of our decisions, the easier it is to keep our current views rather than update them.

It’s much easier to go on thinking what we’ve already been thinking than go through the pain of updating our existing, false beliefs.

We also tend to undervalue the elementary ideas and overvalue the complicated ones.

An engineer will often think in terms of systems by default. A psychologist will think in terms of incentives. A business person might think in terms of opportunity cost and risk-reward. Through their disciplines, each of these people sees part of the situation, the part of the world that makes sense to them. None of them, however, see the entire situation unless they are thinking in a multidisciplinary way.

Charlie Munger summed up this approach to practical wisdom: “Well, the first rule is that you can’t really know anything if you just remember isolated facts and try and bang ‘em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form.”
The Map is not the Territory

What is common to many is taken least care of, for all men have greater regard for what is their own than for what they possess in common with others. – Aristotle

Remember that all models are wrong; the practical question is how wrong do they have to be to not be useful. – George Box

There are three key practices needed in order to build and maintain a circle of competence: curiosity and a desire to learn, monitoring, and feedback.

Learning comes when experience meets reflection. You can learn from your own experiences. Or you can learn from the experience of others, through books, articles, and conversations.

It is extremely difficult to maintain a circle of competence without an outside perspective. We usually have too many biases to solely rely on our own observations. It takes courage to solicit external feedback, so if defensiveness starts to manifest, focus on the result you hope to achieve.
Supporting Idea: Falsifiability

The trend is not destiny. Even if we can derive and understand certain laws of human biological nature, the trends of history itself are dependent on conditions, and conditions change.

First Principles Thinking: Applying the filter of falsifiability helps us sort through which theories are more robust. If they can’t ever be proven false because we have no way of testing them, then the best we can do is try to determine their probability of being true.

Socratic questioning generally follows this process:

  1. Clarifying your thinking and explaining the origins of your ideas. Why do I think this? What exactly do I think?
  2. Challenging assumptions: How do I know this is true? What if I thought the opposite?
  3. Looking for evidence. How can I back this up? What are the sources?
  4. Considering alternative perspectives. What might others think? How do I know I am correct?
  5. Examining consequences and implications. What if I am wrong? What are the consequences if I am?
  6. Questioning the original questions. Why did I think that? Was I correct?
  7. What conclusions can I draw from the reasoning process?

Many people mistakenly believe that creativity is something that only some of us are born with, and either we have it or we don’t. Fortunately, there seems to be ample evidence that this isn’t true. We’re all born rather creative, but during our formative years, it can be beaten out of us by busy parents and teachers. As adults, we rely on convention and what we’re told because that’s easier than breaking things down into first principles and thinking for ourselves. Thinking through first principles is a way of taking off the blinders. Most things suddenly seem more possible.

The truth is, the events that have happened in history are but one realization of the historical process — one possible outcome among a large variety of possible outcomes. They’re like a deck of cards that has been dealt out only one time. All the things that didn’t happen, but could have if some little thing went another way, are invisible to us.

Thought experiments tell you about the limits of what you know and the limits of what you should attempt. In order to improve our decision-making and increase our chances of success, we must be willing to probe all of the possibilities we can think of. Thought experiments are not daydreams. They require both rigor and work. But the more you use them, the more you understand actual cause and effect, and the more knowledge you have of what can really be accomplished.

In mathematics they call these sets. The set of conditions necessary to become successful is a part of the set that is sufficient to become successful. But the sufficient set itself is far larger than the necessary set. Without that distinction, it’s too easy for us to be misled by the wrong stories.
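The necessary-versus-sufficient distinction can be sketched directly with Python sets. The condition names below are illustrative placeholders, not from the book:

```python
# Hypothetical condition names, for illustration only: the conditions
# necessary for success form a subset of any set of conditions that is
# sufficient for it -- but the sufficient set is larger.
necessary = {"hard work", "skill"}
sufficient = {"hard work", "skill", "timing", "luck", "connections"}

# Every necessary condition appears in the sufficient set...
assert necessary <= sufficient  # subset check

# ...but meeting only the necessary conditions does not guarantee success.
# These are the extra conditions that success stories tend to omit:
print(sufficient - necessary)
```

Stories that point only at the necessary conditions ("they worked hard and were skilled") quietly drop the rest of the sufficient set, which is exactly how we get misled.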
Second-Order Thinking

Probabilistic Thinking: Almost everyone can anticipate the immediate results of their actions. This type of first-order thinking is easy and safe but it’s also a way to ensure you get the same results that everyone else gets. Second-order thinking is thinking farther ahead and thinking holistically.

The theory of probability is the only mathematical tool available to help map the unknown and the uncontrollable. It is fortunate that this tool, while tricky, is extraordinarily powerful and convenient. – Benoit Mandelbrot

Occam’s Razor: Think about not only what you could do to solve a problem, but what you could do to make it worse — and then avoid doing that, or eliminate the conditions that perpetuate it. Simpler explanations are more likely to be true than complicated ones. This is the essence of Occam’s Razor.

Why are more complicated explanations less likely to be true? Let’s work it out mathematically. Take two competing explanations, each of which seems to equally explain a given phenomenon. If one of them requires the interaction of three variables and the other the interaction of thirty variables, all of which must have occurred to arrive at the stated conclusion, which of these is more likely to be in error? If each variable has a 99% chance of being correct, the first explanation is only 3% likely to be wrong. The second, more complex explanation is about nine times as likely to be wrong, or 26%. The simpler explanation is more robust in the face of uncertainty.
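The arithmetic above is just compounding probabilities: if every variable must hold and each is independently 99% likely to be correct, the chance the whole explanation survives shrinks with each variable added. A minimal sketch:

```python
# Occam's Razor arithmetic from the passage: the probability that an
# explanation with n independent variables, each 99% likely to be
# correct, contains at least one error.
def chance_of_error(num_variables: int, per_variable_accuracy: float = 0.99) -> float:
    """Probability that at least one variable in the explanation is wrong."""
    return 1 - per_variable_accuracy ** num_variables

print(f"3 variables:  {chance_of_error(3):.0%} chance of error")   # ~3%
print(f"30 variables: {chance_of_error(30):.0%} chance of error")  # ~26%
```

The ratio 26% / 3% is where the book's "about nine times as likely to be wrong" comes from.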

Hanlon’s Razor: “Extraordinary claims require extraordinary proof.”

I need to listen well so that I hear what is not said. – Thuli Madonsela

Hanlon’s Razor states that we should not attribute to malice that which is more easily explained by stupidity.

The explanation most likely to be right is the one that contains the least amount of intent.

Kahneman and Tversky exposed a sort of tic in our mental machinery: we’re deeply affected by vivid, available evidence, to such a degree that we’re willing to make judgments that violate simple logic. We over-conclude based on the available information. We have no trouble packaging in unrelated factors if they happen to occur in proximity to what we already believe.

Buy yourself a copy of the book.

Thanks for reading 🤗
