Tim Urban has a new book, What’s Our Problem?, whose main thesis is that our minds have two modes, a high and a low mind. The high mind more seeks truth, while the low mind more seeks loyalty via confirming sacred beliefs. The high mind thinks like a scientist, especially in an “idea lab”. It realizes that it might be wrong, avoids bias or idea attachment, is open to stumbling and backtracking, and systematically collects hypotheses and data to carefully compare the two.
Finally! You're the only other person I've come across so far who isn't blindly worshipping this book. Tim's strength is simplification, but these topics need complexity and nuance. The book is called "What's Our Problem?" and subtitled "A Self-Help Book for Societies." First, this isn't about global societies—this book is about America (specifically American politics and current events). Regarding the title, the book never actually gets to root cause(s) or generator function(s)—it's stuck in hundreds of pages of muck just pointing at symptoms. The premise of the book is thinking better and increasing wisdom to save ourselves from exponential technology, yet the book never actually gets there. Here are more of my thoughts in a Twitter thread: https://twitter.com/SlowwCo/status/1627909918945869824
But there are clear observables that could be used as the basis for social norms re "high" vs "low" thinking. Conduct of debate!
If you're insulting or denouncing your partner in discussion, that's your "low mind" talking. If you're trying to shut the other person up, that's your "low mind" talking. If you're refusing to answer a relevant question because the answer would be detrimental to your point, that's your "low mind" talking. If you're refusing to agree on objectively obvious common ground with someone you disagree with, that's your "low mind" talking. If you're declaring victory over someone on the main point because you have caught them admitting uncertainty or making a mistake on just a supporting point, that's your "low mind" talking.
All of these detrimental behaviors are fairly clear observables. The problem is not that we can't observe them, but that we socially reward them. If we see someone we agree with being socially dominant over someone else - insulting them, painting them as stupid and evil, refusing to acknowledge or entertain what they're saying - then most people will reward and approve of that behavior.
The solution is to build social systems where these clear observables are clearly punished. These would be moderated communities.
I think Urban's view is a simplification that feels good initially but actually causes a lot of confusion as one thinks more deeply (perhaps good for low minds but not so much for high minds ;)). It's a confusing ontology because we know many intellectual (high minded?) folks who are not really seeking truth. Many claim to seek truth (e.g., via science) when they are doing something else entirely (seeking prestige and power), and often these goals do not align. I came across a recent utility theory of belief that sounds right--people will exchange their (low-minded, inaccurate) beliefs for (high-minded, accurate) ones if the outcomes (good and bad) of accurate beliefs have more utility for the believer. So to become high-minded, we need to make low-mindedness relatively less appealing, in terms of outcomes.
I guess the low mind is a time and energy saving measure that we develop over time. If our experience tells us that many of those who say X are bad people, and that many bad people say X, then we generalize and conclude that X is bad and must be opposed at all costs. We all need quick rules of thumb like this. Seen as an optimization trick, the low mind doesn't seem that bad. We have one because we need one.
I agree that, other factors being equal, we should make more use of the high mind and less use of the low mind. I say other factors being equal because stopping to reflect on whether the wolf that is about to eat you is really dangerous is perhaps high-minded but certainly stupid.
>So what we first need is for people to actually want to displace low minds with high. Not just as a sort of nice thing to have sometimes. But as a deeply sacred thing in itself, more sacred than the political loyalty which encourages low minds. Hence my efforts to study and change the sacred.
A new religion inspired by science?
To overcome bias it will probably help to dismantle the systems of power that coerce us into keeping our silence for fear of being deplatformed, fired, or imprisoned. This is not the “White Patriarchy”, but rather the Big Tech / Big Media / Big Brother industrial complex.
Also, the last paragraph is a great teaser! I hope you'll elaborate in a subsequent post. (And I'm already thankful for the move to Substack, as I was able to read the post on my phone while making dinner! Yay, technology!)
A question: isn't reading a book that teaches you to think useless? As a matter of fact, most of these books teach you WHAT to think, not HOW to think. Right?
Isn't the main point of distinguishing the high from the low mind that you can now try to foster a culture of discussion in a high-minded mode, where all problems (which, as you rightly observe, run much deeper) can be hashed out? In the current context of tribal warfare, none of our problems can be solved.
Sounds a bit like System 1 and 2 from Kahneman.
I agree that the solution sounds a little flimsy (having not read the book myself), but isn't this just the beginning of the solution? Conjecture? For instance, you "speaking your mind" about the need for concretized solutions baked into our institutions.
Karl Popper also said that long-term change must be institutional, but in my view, the problem with building social institutions is that the people most likely to contribute to and shape them by nature have a more definitive outlook that favors rigidity, in order for the institution to survive. Today's world probably calls for one that can sway in the wind a bit. Institutions that survive the longest are adaptable. The "high" mind building institutions has to be able to foresee the low-minded nature of the inhabitants of said institutions and account for their inclusion.
Does not everyone use the "high mind" in daily life? It's hard to survive and take care of loved ones for a week without: realizing that you might be wrong, avoiding bias/idea attachment, systematically (and subconsciously) collecting hypotheses and data to carefully compare the two, etc.
And people feel free to switch to their low mind in situations that aren't immediately relevant to their life, or in moments where a low mind is helpful.
I totally agree that the last part of Tim's book (the "solution") is the weakest. We need social institutions that create stronger incentives for high-rung thinking.
Nevertheless, I would give Tim more credit, because this goal (better social rules and institutions) requires awareness and courage to start with, most urgently from people who have a larger social reach. However, such individual efforts are unlikely to be stable enough in the long term. After all, we are fighting against our own "biology" here.
typo in opening sentence: *and high and a low mind > a
>I suggest that what would really help are better social institutions, such as prediction and decision markets, which by their structure encourage more use of high minds. Alas, I think most people oppose such changes exactly because they realize that they’d have that result.
This theory predicts that Manifold Markets should be unpopular, but it is in fact popular.
Why aren't you using it? What does this tell us about you?
>So what we first need is for people to actually want to displace low minds with high.
That's what Tim's book sets out to achieve.
“the low mind more seeks loyalty via confirming sacred beliefs.”
Uuuuuhhhhh ohhhhhh we’re in trouble in 2023 America 🤣🤣🤣🤣🤣
The Black Snake of Wounded Vanity