Tag Archives: Disagreement

The Puzzle Of Persistent Praise

We often praise and criticize people for the things they do. And while we have many kinds of praise, one very common type (which I focus on in this post) seems to send the message “what you did was good, and it would be good if more of that sort of thing were done.” (Substitute “bad” for “good” to get the matching critical message.)

Now if it would be good to have more of some act, then that act is a good candidate for a larger subsidy. And if most people agreed that this sort of act deserved more subsidy, then politicians should be tempted to run for office on the platform that they will increase the actual subsidy given to that kind of act. After all, if we want more of some kind of act, why don’t we try to better reward those acts? And so good acts shouldn’t long remain with an insufficient subsidy. Or bad acts with an insufficient tax.

But in fact we seem to have big categories of acts which we consistently praise for being good, and where this situation persists for decades or centuries. Think charity, innovation, or artistic or sporting achievement. Our political systems do not generate much political pressure to increase the subsidies for such things. Subsidy-increasing proposals are not even common issues in elections. Similarly, large categories of acts are consistently criticized, yet few politicians run on platforms proposing to increase taxes on such acts.

My best interpretation of this situation is that while our words of praise give the impression that we think that most people would agree that the acts we praise are good, and should be more common, we don’t really believe this. Either we think that the acts signal impressive or praiseworthy features, but shouldn’t be more common, or we think such acts should be more common, but we also see large opposing political coalitions who disagree with our assessment.

That is, my best guess is that when we look like we are praising acts for promoting a commonly accepted good, we are usually really praising impressiveness, or we are joining in a partisan battle on what should be seen as good.

Because my explanation is cynical, many people count it as “extraordinary”, and think powerful extraordinary evidence must be mustered before one can reasonably suggest that it is plausible. In contrast, the usual self-serving idealistic explanations people give for their behavior are ordinary, and so can be accepted at face value without much evidence at all being offered in their defense. People get mad at me for even suggesting cynical theories in short blog posts, where large masses of extraordinary evidence have not been mustered. I greatly disagree with this common stacking of the deck against cynical theories.

Even so, let us consider seven other possible explanations of this puzzle of persistent praise (and criticism). And in the process make what could have been a short blog post considerably longer.

Beware Status Arrogance

Imagine that you are an expert in field A, and a subject in field B comes up at a party. You know that there may be others at the party who are experts in field B. How reluctant does this make you to openly speculate about this topic? Do you clam up and only cautiously express safe opinions, or do you toss out the thoughts that pop into your head as if you knew as much about the subject as anyone?

If you are like most people, the relative status of fields A and B will likely influence your choice. If the other field has higher status than yours, you are more likely to be cautious, while if the other field has lower status than yours, you are more likely to speculate freely. In both cases your subconscious will have made good guesses about the likely status consequences to you if an expert in B were to speak up and challenge your speculations. At some level you would know that others at the party are likely to back whoever has the higher status, even if the subject is within the other person’s area of expertise.

But while you are likely to be relatively safe from status losses, you should know that you are not safe from being wrong. When people from different fields argue about something within one of their areas of expertise, the expert in that area is usually right, even when the other field has higher status. Yes, people from your field may on average be smarter and harder-working, and your field may have contributed more to human progress. Even so, people who’ve studied more about the details of something usually know more about it.

Conflicting Abstractions

My last post seems an example of an interesting general situation: abstractions from different fields conflicting on certain topics. In the case of my last post, the topic was the relative growth rate feasible for a small project hoping to create superintelligence, and the conflicting abstractions are the ones I use, drawn mostly from economics, and the ones used by Bostrom, Yudkowsky, and many other futurists, drawn from computer practice and elsewhere.

What typically happens when it seems that abstractions from field A suggest X, while abstractions from field B suggest not X? Well first, since X and not X can’t both be true, each field would likely see this as a threat to its good reputation. If forced to accept the existence of the conflict, each would likely try to denigrate the other field. If one field has higher status, the other field would expect to lose a reputation fight, and so would be especially eager to reject the claim that a conflict exists.

And in fact, it should usually be possible to reject a claim that a conflict exists. The judgement that a conflict exists would come from specific individuals studying the questions of whether A suggests X and whether B suggests not X. One could just suggest that some of those people were incompetent at analyzing the implications of the abstractions of particular fields. Or that they were talking past each other, misunderstanding what X and not X mean to the other side. So one would need especially impeccable credentials to publicly make these claims and make them stick.

The ideal package of expertise for investigating such an issue would be expertise in both fields A and B. This would position one well to notice that a conflict exists, and to minimize the chance of problems arising from misunderstandings of what X means. Unfortunately, our institutions for crediting expertise don’t do well at encouraging combined expertise. For example, patrons are often interested in the intersection of fields A and B, and sponsor conferences, journal issues, etc. on that intersection. However, seeking maximal prestige, they usually prefer people with the most prestige in each separate field over people who actually know both fields simultaneously. Anticipating this, people usually choose to stay within a single field.

Anticipating this whole scenario, people will usually avoid seeking out or calling attention to such conflicts. To seek out or pursue a conflict, you’d have to be especially confident that your field would back you up in a fight, because your credentials are impeccable and your field thinks it could win a status contest with the other field. And even then you’d have to waste some time studying a field that your field doesn’t respect. Even if you won the fight, you might lose prestige within your field.

This is unfortunate, because such conflicts seem especially useful clues to help us refine our important abstractions. By definition, abstractions draw inferences from reduced descriptions, descriptions which ignore relevant details. Usually that is useful, but sometimes that leads to errors when the dropped details are especially relevant. Intellectual progress would probably be promoted if we could somehow induce more people to pursue apparent conflicts between the abstractions from different fields.

Bias Is A Red Queen Game

It takes all the running you can do, to keep in the same place. The Red Queen.

In my last post I said that as “you must allocate a very limited budget of rationality”, we “must choose where to focus our efforts to attend carefully to avoiding possible biases.” Some objected, seeing the task of overcoming bias as like lifting weights to build muscles. Scott Alexander compared it to developing habits of good posture and lucid dreaming:

If I can train myself to use proper aikido styles of movement even when I’m doing something stupid like opening a door, my body will become so used to them that they will be the style I default to when my mind is otherwise occupied. … Lucid dreamers offer some techniques for realizing you’re in a dream, and suggest you practice them even when you are awake, especially when you are awake. The goal is to make them so natural that you could (and literally will) do them in your sleep. (more)

One might also compare with habits like brushing your teeth regularly, or checking that your fly isn’t unzipped. There are indeed many possible good habits, and some related to rationality. And I encourage you all to develop good habits.

What I object to is letting yourself think that you have sufficiently overcome bias by collecting a few good mental habits. My reason: the task of overcoming bias is a Red Queen game, i.e., a game against a smart, capable, and determined rival, not against a simple dumb obstacle.

There are few smart determined enemies trying to dirty your teeth, pull your fly down, mess up your posture, weaken your muscles, or keep you unaware that you are dreaming. Nature sometimes happens to block your way in such cases, but because it isn’t trying hard to do so, it takes only modest effort to overcome such obstacles. And as these problems are relatively simple and easy, an effective strategy to deal with them doesn’t have to take much context into account.

For a contrast, consider the example of trying to invest to beat the stock market. In that case, it isn’t enough to just be reasonably smart and attentive, and to avoid simple biases, say by not deciding when very emotional. When you speculate in stocks, you are betting against other speculators, and so can only expect to win if you are better than others. If you can’t reasonably expect to have better info and analysis than the average person on the other side of your trades, you shouldn’t bet at all, but instead just take the average stock return, by investing in index funds.
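To make that reference point concrete, here is a minimal simulation sketch. It is my own illustration, not from the post, and the return distribution and cost number are made-up assumptions: a speculator with no informational edge earns the market return minus trading costs, so the index holder wins on average.

```python
import random

# Minimal sketch of the index-fund reference point. The return
# distribution and trading cost below are illustrative assumptions,
# not calibrated to real markets.

random.seed(0)
YEARS = 30
TRADING_COST = 0.01  # assumed 1%/year drag from active trading

def run(trials=20_000):
    index_sum = trader_sum = 0.0
    for _ in range(trials):
        index_w = trader_w = 1.0
        for _ in range(YEARS):
            r = random.gauss(0.07, 0.15)  # assumed market-year return
            index_w *= 1 + r
            trader_w *= 1 + r - TRADING_COST  # no edge, just costs
        index_sum += index_w
        trader_sum += trader_w
    return index_sum / trials, trader_sum / trials

index_avg, trader_avg = run()
print(f"index fund average final wealth:     {index_avg:.2f}")
print(f"no-edge trader average final wealth: {trader_avg:.2f}")
```

The particular numbers don't matter; the point is that without better info than your counterparties, the only systematic difference between you and the index is the cost of trying.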

Trying to beat the stock market is a Red Queen game against a smart determined opponent who is quite plausibly more capable than you. Other examples of Red Queen games are poker, and most competitive contests like trying to win at sports, music, etc. The more competitive a contest, the more energy and attention you have to put in to have a chance at winning, and the more you have to expect to specialize to have a decent chance. You can’t just develop good general athletic habits to win at all sports, you have to pick the focus sport where you are going to try to win. And for all the non-focus sports, you might play them for fun sometimes, but you shouldn’t expect to win against the best.

Overcoming bias is also a Red Queen game. Your mind was built to be hypocritical, with more conscious parts of your mind sincerely believing that they are unbiased, and other less conscious parts systematically distorting those beliefs, in order to achieve the many functional benefits of hypocrisy. This capacity for hypocrisy evolved in the context of conscious minds being aware of bias in others, suspecting it in themselves, and often sincerely trying to overcome such bias. Unconscious minds evolved many effective strategies to thwart such attempts, and they usually handily win such conflicts.

Given this legacy, it is hard to see how your particular conscious mind has much of a chance at all. So if you are going to create a fighting chance, you will need to try very hard. And this trying hard should include focusing a lot, so you can realize gains from specialization. Just as you’d need to pay close attention and focus well to have much of a chance at beating the hedge funds and well-informed expert speculators whom you compete with in stock markets.

In stock markets, the reference point for “good enough” is set by the option to just take the average via an index fund. If using your own judgement will do worse than an index fund, you might as well just take that fund. In overcoming bias, a reference point is set by the option to just accept the estimates of others who are also trying to overcome bias, but who focus on that particular topic.

Yes, you might do better than you otherwise would have if you use a few good habits of rationality. But doing a bit better in a Red Queen game is like bringing a knife to a gunfight. If those good habits make you think “I’m a rationalist,” you might think too highly of yourself, and be reluctant to just take the simple option of relying on the estimates of others who try to overcome their biases and focus on those particular topics. After all, refusing to defer to others is one of our most common biases.

Remember that the processes inside you that bias your beliefs are many, varied, subtle, and complex. They express themselves in different ways on different topics. It is far from sufficient to learn a few simple generic tricks that avoid a few simple symptoms of bias. Your opponent is putting a lot more work into it than that, and you will need to do so as well if you are to have much of a chance. When you play a Red Queen game, go hard or go home.

Disagreement Is Far

Yet more evidence that it is far mental modes that cause disagreement:

Recruiting a sample of Americans via the internet, they polled participants on a set of contentious US policy issues, such as imposing sanctions on Iran, healthcare and approaches to carbon emissions. One group was asked to give their opinion and then provide reasons for why they held that view. This group got the opportunity to put their side of the issue, in the same way anyone in an argument or debate has a chance to argue their case.

Those in the second group did something subtly different. Rather than provide reasons, they were asked to explain how the policy they were advocating would work. They were asked to trace, step by step, from start to finish, the causal path from the policy to the effects it was supposed to have.

The results were clear. People who provided reasons remained as convinced of their positions as they had been before the experiment. Those who were asked to provide explanations softened their views, and reported a correspondingly larger drop in how they rated their understanding of the issues. (more; paper; HT Elliot Olds)

The question “why” evokes a far mode, while the question “how” evokes a near mode.

Reason, Stories Tuned for Contests

Humans have a capacity to reason, i.e., to find and weigh reasons for and against conclusions. While one might expect this capacity to be designed to work well for a wide variety of types of conclusions and situations, our actual capacity seems to be tuned for more specific cases. Mercier and Sperber:

Reasoning is generally seen as a means to improve knowledge and make better decisions. However, much evidence shows that reasoning often leads to epistemic distortions and poor decisions. This suggests that the function of reasoning should be rethought. Our hypothesis is that the function of reasoning is argumentative. It is to devise and evaluate arguments intended to persuade. … Poor performance in standard reasoning tasks is explained by the lack of argumentative context. … People turn out to be skilled arguers (more)

That is, our reasoning abilities are focused on contests where we already have conclusions that we want to support or oppose, and where particular rivals give conflicting reasons. I’d add that such abilities also seem tuned to win over contest audiences by impressing them, and by making them identify more with us than with our rivals. We also seem eager to visibly hear argument contests, in addition to participating in such contests, perhaps to gain exemplars to improve our own abilities, to signal our embrace of social norms, and to exert social influence as part of the audience who decides which arguments win.

Humans also have a capacity to tell stories, i.e., to summarize sets of related events. Such events might be real and past, or possible and future. One might expect this capacity to be designed to summarize well a wide variety of event sets. But, as with reasoning, we might similarly find that our actual story abilities are tuned for the more specific case of contests, where the stories are about ourselves or our rivals, especially where either we or they are suspected of violating social norms. We might also be good at winning over audiences by impressing them and making them identify more with us, and we may also be eager to listen to gain exemplars, signal norms, and exert influence.

Consider some forager examples. You go out to find fire wood, and return two hours later, much later than your spouse expected. During a hunt someone shot an arrow that nearly killed you. You don’t want the band to move to new hunting grounds quite yet, as your mother is sick and hard to move. Someone says something that indirectly suggests that they are a better lover than you.

In such examples, you might want to present an interpretation of related events that persuades others to adopt your favored views, including that you are able and virtuous, and that your rivals are unable and ill-motivated. You might try to do this via direct arguments, or more indirectly via telling a story that includes those events. You might even work more indirectly, by telling a fantasy story where the hero and his rival have suspicious similarities to you and your rival.

This view may help explain some (though hardly all) puzzling features of fiction:

  • Most of our real life events, even the most important ones like marriages, funerals, and choices of jobs or spouses, seem too boring to be told as stories.
  • Compared to real events, even important ones, stories focus far more on direct conscious conflicts between people, and on violations of social norms.
  • Compared to real people, fictional characters have more extreme features, and stronger correlations between their good features.
  • Compared to real events, fictional events are far more easily predicted by character motivations, and by assuming a just world.

Extremists Compete

Extremists hold extreme views, and struggle to persuade others of their views, or even to get them to engage such views. Since most people are not extremists, you might think extremists focus mostly on persuading non-extremists. If so, they should have a common cause in getting ordinary people to think outside the usual boxes. They should want to join together to say that the usual views tend to gain from conformity pressures, and that such views are held overconfidently.

But in fact extremists don’t seem interested in joining together to support extremism. While each individual extremist tends to hold multiple extreme views, extremist groups go out of their way to distance themselves from other extremist groups. Not only do they often hate close cousins whom they see as having betrayed their cause, they are also hostile to extremist groups focused on orthogonal topics.

This all makes sense if, as I’ve suggested, there are extremist personality types. Extremist groups have a better chance of attracting these types to their particular sort of extremism, relative to persuading ordinary folks to adopt extreme views.

Leadership Fantasies

Predictions about leadership in 2030:

The management consulting firm Hay Group worked with the German futurists at Z-Punkt to identify six mega trends such as globalization, technology convergence and the individualization of careers that will shape the kind of leaders companies will need in the future. I spoke with Georg Vielmetter, Hay Group’s regional director of leadership and talent, about the newly released study “Leadership 2030” that he co-authored. …

I think that positional power and hierarchical power will become smaller. Power will shift to stakeholders, reducing the authority of the people who are supposed to lead the organization. … The time of the alpha male — of the dominant, typically male leader who knows everything, who gives direction to everybody and sets the pace, whom everybody follows because this person is so smart and intelligent and clever — this time is over. We need a new kind of leader who focuses much more on relationships and understands that leadership is not about himself. …

Such a leader doesn’t put himself at the very center. He knows he needs to listen to other people. He knows he needs to be intellectually curious and emotionally open. He knows that he needs empathy to do the job, not just in order to be a good person. … We will see a significant decline in physical loyalty between people and organizations. It will be very difficult for leaders to formally bind people to their organizations, so they should not try. This is a battle that leaders can only lose. … What is clear is that leaders in the future need to have a full understanding, and also an emotional understanding, of diversity. That’s for sure. (more)

I call bull. Here’s Jeffrey Pfeffer, in Power:

Most books by well-known executives and most lectures and courses about leadership should be stamped CAUTION: THIS MATERIAL CAN BE HAZARDOUS TO YOUR ORGANIZATIONAL SURVIVAL. That’s because leaders touting their own careers as models to be emulated frequently gloss over the power plays they actually used to get to the top. Meanwhile, the teaching on leadership is filled with prescriptions about following an inner compass, being truthful, letting inner feelings show, being modest and self-effacing, not behaving in a bullying or abusive way— in short, prescriptions about how people wish the world and the powerful behaved. There is no doubt that the world would be a much better, more humane place if people were always authentic, modest, truthful, and consistently concerned for the welfare of others instead of pursuing their own aims. But that world doesn’t exist.

More from Pfeffer last November:

Today’s work world is increasingly populated by millennials with values presumably different from more-senior employees—more egalitarian, less competitive, more meritocratic, less accepting of hierarchy, and more tolerant of all forms of diversity. And if that’s true, surely companies are changing, which means we need new theories about power and influence to reflect these new cultural realities. Strategically expressing anger, building a power base, or eliminating rivals are considered outmoded ways of getting ahead. Certainly, the reasoning goes, in a world where reputations get created and transmitted quickly and anonymously through ubiquitous social networks, people who resort to such bad behavior will suffer swift retribution.

The typical Silicon Valley recruitment pitch, or something to this effect, reinforces this view: “We’re not political here. We’re young, cool, socially networked, hip, high-technology people focused on building and selling great products. We’re family-friendly, have fewer management levels and less hierarchy, and make decisions collegially.”

Unfortunately there’s not much evidence of change but plenty of testimony to the contrary: the power struggles that beset the founding of Twitter (TWTR), the turnover among CEOs at Hewlett-Packard (HPQ), and the experiences of former Stanford MBA students working in the supposedly egalitarian world of high tech who have lost their jobs or been thrown out of companies they founded notwithstanding their intelligence and good job performance. Meanwhile, relationships with bosses still go a long way to predict people’s career success; organizational gossip lives on; and career derailment still awaits those who fail to master political dynamics. (more)

Prefer Contrarian Questions

Many people are attracted to authority. They are eager to defend what authorities say against heretics who say otherwise. This lets them signal a willingness to conform, and gain status by associating with higher status authorities against lower status heretics.

Other people are tempted to be contrarians. My blog readers tend more this way. Contrarians are eager to find authorities with which they disagree, and to associate with similar others. In this way contrarians can affirm standard forager anti-dominance norms, bond more strongly to a group, and hope for glory later if their contrarian positions become standard.

I haven’t posted much on disagreement here lately, but contrarians should be disturbed by the basic result that knowing disagreement is irrational. That is, it is less accurate to knowingly disagree with others, unless you have good reason to think you are more rational than they are, in the sense of listening more to the info implicit in others’ opinions.
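To see the accuracy cost in a toy model (my illustration, with an assumed noise level, not a result from the disagreement literature): suppose you and a peer each hold an equally noisy, unbiased estimate of the same quantity. Sticking with your own estimate is then predictably less accurate than splitting the difference.

```python
import random

# Toy model of knowing disagreement: two equally rational observers
# get independent noisy signals of the same truth. The noise level
# is an arbitrary illustrative assumption.

random.seed(0)
NOISE = 1.0
TRIALS = 200_000

stick_err = avg_err = 0.0
for _ in range(TRIALS):
    truth = random.gauss(0, 1)
    mine = truth + random.gauss(0, NOISE)   # my estimate
    yours = truth + random.gauss(0, NOISE)  # peer's estimate
    stick_err += (mine - truth) ** 2              # ignore the peer
    avg_err += ((mine + yours) / 2 - truth) ** 2  # split the difference

print(f"mean squared error, sticking with my view: {stick_err / TRIALS:.3f}")
print(f"mean squared error, averaging with peer:   {avg_err / TRIALS:.3f}")
```

With independent noise of variance σ², the average has error variance σ²/2, half that of either estimate alone; to rationally do better than averaging, you need a reason to think your signal is less noisy than theirs.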

Today I want to point out a way that contrarians can stay contrarians, taking an authority defying position they can share with like-minded folks and which might later lead to glory, while avoiding most of the accuracy-reducing costs of disagreement: be contrarian on questions, not answers.

Academia has well known biases regarding the topics it studies. Academia is often literature-driven, clumping around a few recently-published topics and neglecting many others. Academia also prefers topics where one can show off careful mastery of difficult and thus impressive methods, and so neglects topics worse suited for such displays.

Of course academia isn’t the only possible audience when selling ideas, but the other possible customers also have known topic biases. For example, popular writings are biased toward topics which are easy to explain to their audience, which flatter that audience, and which pander to their biases.

The existence of these known topic biases suggests how to be a more accurate contrarian: disagree with academia, the popular press, etc. on what questions are worth studying. While individuals may at times disagree with you on the importance of the topics you champion, after some discussion they will usually cave and accept your claim that academia, etc. has these topic biases, and that one should expect your topic to be neglected as a result.

Some academics will argue that only standard difficult academic methods are informative, and all other methods give only random noise. But the whole rest of the world functions pretty well drawing useful conclusions without meeting standard academic standards of method or care. So it must be possible to make progress on topics not best suited for showing off mastery of difficult academic methods.

So if your topic has some initial or surface plausibility as an important topic, and is also plausibly neglected by recent topic fashion and not well suited for showing off difficult academic methods, you have a pretty plausible contrarian case for the importance of your topic. That is, you are less likely to be wrong about this emphasis, even though it is a contrarian emphasis.

Now your being tempted to be contrarian on questions suggests that you are the sort of person who is also tempted to be contrarian on answers. Because of this, for maximum accuracy you should probably bend over backwards to not be contrarian on which answers you favor to your contrarian question. Focus your enjoyment of defying authorities on defying their neglect of your questions, but submit to them on how to answer those questions. Try as far as possible to use very standard assumptions and methods, and be reluctant to disagree with others on answers to your questions. Resist the temptation to too quickly dismiss others who disagree on answers because they have not studied your questions as thoroughly as you. Once you get some others to engage your question in some detail, take what they say very seriously, even if you have studied far more detail than they.

With this approach, the main contrarian answer that you must endorse is a claim about yourself: that you don’t care as much about the rewards that attract others to the usual topics. Most people work on standard topics because those usually give the most reliable paths to academic prestige, popular press popularity, etc. And honestly, most people who think they don’t care much about such things are just wrong. So you’ll need some pretty strong evidence in support of your claim that you actually differ enough in your preferences to act differently. But fortunately, your being deluded about this can’t much infect the accuracy of your conclusions about your contrarian topic. Even if you are mistaken on why you study it, your conclusions are nearly as likely to be right.

This is the approach I’ve tried to use in my recent work on the social implications of brain emulations. This is very contrarian as a topic, in the sense that almost no one else works on it, or seems inclined that way. But it has an initial plausibility as very important, at least if one accepts standard conclusions in some tech and futurist worlds. It is plausibly neglected as having negative associations and being less well suited for impressive methods. And I try to use pretty standard assumptions and methods to infer answers to my contrarian question. Of course none of that protects me from delusions on the rewards I expect to forgo by focusing on this topic.

Added 7Mar: People are already in the habit of pleasantly tolerating a wider range of opinion on which questions are important, both because differing values contribute, and because people tend to strongly overestimate the importance of the questions they work on personally.

Joiners v. Middlers

Kelley … traced the success of conservative churches to their ability to attract and retain an active and committed membership, characteristics that he in turn attributed to their strict demands for complete loyalty, unwavering belief, and rigid adherence to a distinctive lifestyle. … [Such] a group limits and thereby increases the cost of non-group activities, such as socializing with members of other churches or pursuing “secular” pastimes. …

Seemingly unproductive costs … screen out people whose participation would otherwise be low, while at the same time they increase participation among those who do join. As a consequence, apparently unproductive sacrifices can increase the utility of group members. Efficient religions with perfectly rational members may thus embrace stigma, self-sacrifice, and bizarre behavioral standards. …

When we group religions according to the (rated) stringency of their demands, … [we see that] compared to members of other Protestant denominations, [high-demand] sect members are poorer and less educated, contribute more money and attend more services, hold stronger beliefs, belong to more church-related groups, and are less involved in secular organizations. … Data from the 1990 National Jewish Population Survey reveal patterns of interdenominational variation virtually identical to those observed within Protestantism. (more)

I see these tendencies in opinions:

  1. Those with more opinions on some topic categories have more on other categories.
  2. Those with more opinions overall have more extreme opinions on each topic.
  3. Those with more extreme opinions on some topics have more extreme opinions on others.
  4. Those with more extreme opinions are more eager to express their opinions, and vice versa.
  5. Those with more extreme opinions are more eager to join groups and attend their meetings.

(All these could have instead been expressed in terms of less extreme opinions, and “extreme” means noticeably away from the distribution middle.)

One might try to explain these by saying that opinions on a few key topics drive most other opinions. Folks with weak opinions on key topics thus have fewer opinions on other topics, and less interest in expressing opinions or in joining groups to spread the word. Yet there is little evidence that such key opinions exist; most people show little correlation of opinion across topics, or even on the same subject across time.

A more plausible explanation follows the quote above on religion. Religions, ideologies, and other idea-affiliated social groups vary in the level of commitment they ask of members. High commitment groups produce stronger community bonds, and people vary in their taste for such strong bonds. Some folks are “joiners,” with a taste for more strongly bonded groups. Joiners have an induced taste for groups with extreme opinions, and thus an induced taste to have their own more extreme opinions, in order to better fit with stronger groups. Thus joiners tend to let themselves have more opinions and more extreme opinions on many topics.

The opposite group are “middlers,” who prefer to get along mildly well with most everyone, instead of bonding more tightly with a smaller group. Middlers have fewer opinions, fewer extreme opinions, and tend not to join groups that are clearly distinguished by being associated with unusual opinions.
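As a sketch, this story can be written as a tiny generative model. It is my own construction with made-up constants, not fit to any data: a single latent “bonding taste” drives both how many opinions a person holds and how extreme those opinions are, which is enough to produce correlation patterns like tendencies 1-3 above.

```python
import random

# Toy generative model of joiners vs. middlers: one latent "bonding
# taste" per person drives both opinion count and opinion extremity
# across unrelated topics. All constants are illustrative assumptions.

random.seed(0)
TOPICS = 20

def person():
    taste = random.random()  # 0 = middler, 1 = joiner
    # Joiners opine on more topics, and more extremely on each.
    return [abs(random.gauss(0, 0.3 + taste))
            for _ in range(TOPICS)
            if random.random() < 0.2 + 0.6 * taste]

people = [person() for _ in range(10_000)]

# Check tendency 2: those with more opinions hold more extreme ones.
many = [sum(e) / len(e) for e in people if len(e) > 12]
few = [sum(e) / len(e) for e in people if 0 < len(e) <= 8]
print(f"mean extremity, many-opinion people: {sum(many)/len(many):.2f}")
print(f"mean extremity, few-opinion people:  {sum(few)/len(few):.2f}")
```

No key driving opinions are needed here; the correlations come entirely from a social taste for bonding, as the joiners-vs-middlers story suggests.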

The opinion habits of both joiners and middlers come mainly from social preferences, instead of a preference for belief accuracy. While it isn’t obvious which group is more wrong, it is more obviously wrong to embody the opinion correlations described above.
