My impulsive reaction to this post was "maybe this is as it should be". For, if you discuss a belief that I have strong priors against, it only takes a little against it for me to dismiss it, but a lot for it for me to believe it; whereas, if you discuss a belief that I have strong priors for, it only takes a little for it for me to believe it, but a lot against it for me to dismiss it. This is the proper response to arguments as dictated by Bayescraft. I think this observation indicates that society's implicit prior - to the extent it has one - approximately weights each hypothesis by the social status that it affords to its believers (although I'm not exactly sure about the causal relationship between prior weight and social status).
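A quick sketch of that asymmetry, with illustrative numbers of my own choosing (not from the post): in odds form, a Bayesian update just multiplies prior odds by the likelihood ratio, so the same strength of evidence barely moves a strong prior but is decisive against a weak one.

```python
def posterior(prior, likelihood_ratio):
    """Bayes update in odds form: posterior odds = prior odds * likelihood ratio."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

# Strong prior against a claim (5%): evidence favoring it 4:1 still leaves it unlikely.
print(round(posterior(0.05, 4.0), 3))    # 0.174

# Strong prior for a claim (95%): the same 4:1 evidence against it barely dents it.
print(round(posterior(0.95, 1 / 4.0), 3))  # 0.826
```

So "a little against suffices to dismiss, a lot for is needed to believe" is exactly what the arithmetic prescribes; the open question is only whether society's implicit priors are well calibrated.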
This has happened a lot to me. My assumption has been that the elite person had never come across my hypothesis, so their reaction was to dismiss it out of hand. Their imagined thought: "If I, an elite, have never heard of this, it must be wrong."
More of a living-in-a-bubble problem than elitism per se.
We have some decent mechanisms in place ensuring that average elite views are of appreciably higher quality than average contrarian views. So it makes sense to place some sort of tax on contrarian views. I guess my question is: how do you know when taxes on contrarian views are too high?
I see what you did there, Robin!
And this is why we need to break out of our system of social consensus dominated so strongly by elites.
That seems like a bit of a tricky thing to negotiate.
Or to put it another way: when "elites" make low-quality arguments within their tribe, sometimes even really obviously bad claims, these are still often commonly & popularly accepted. (Example: many claims Donald Trump makes.)
Even further, even for claims like "George W Bush is a lizard person", my actual reasoning seems to rely more on "elites think this is crazy" rather than evaluating arguments for & against the idea & I think this may be really common.
I get the impression that elite gatekeeping really functions to maintain a hierarchical structure of truth-claims that keeps these issues navigable. A flatter world of arguments alone seems unintelligible to me, since I can't evaluate every argument personally, and I think I'm smart. For everyone else, elites function to shame poor reasoners away from clearly bad ideas.
This isn't to say you're wrong, or that there aren't clear inefficiencies, or abuses, or anything else. I think you're right on how elites reason. Just that elites are used as a bulwark to dismiss many bad ideas. People say "shut up and listen to the CDC" because the perception is that the next most popular sources of information are dangerous. (Even if the rationalist elites coalesced around Zeynep Tufekci)
And to take an even MORE cynical angle, I think "overthrow the elites" may itself fall into the framework of meta-contrarianism: https://www.lesswrong.com/p... , which is to say that "dumb" people ignore elites, "smart" people obey elites, and "really smart" people seek to toss the elites aside to signal they're too smart to even need them.
I think Garett Jones has written about how intelligent people cooperate more, resulting in more positive-sum interactions. If status is correlated with intelligence, then we might expect high status people to be more cooperative.
> No, but they are better at being nice contingently, in the right situations where niceness is rewarded. And also with being mean contingently, in the situations where that is rewarded.
My strong intuition is that contingent niceness is better rewarded than contingent meanness, at least in the modern world. It's hard to think of many examples of people who saw big career boosts from being mean to the right people. Sure, it happens, but the counterexamples are rare and unusual enough that they practically typify the cliché "the exception proves the rule."
In general, I think that does mean that there's a positive correlation between niceness and elite success. Just because being nice all the time is a stochastically better strategy than being mean all the time. Of course, being nice at exactly the right times is even better, because you can conserve your resources. But that's generally a lot harder to execute well. A good heuristic is when in doubt, be nice.
Status is a very standard concept in social science. If you had never studied the subject, "momentum" might seem loosely defined to you. https://www.overcomingbias....
I'm newer to this blog, so I may have missed some early articles, but what exactly is the definition of an 'elite' or 'higher status'? These terms feel loosely defined.
Sure, I've noticed I'm often sloppier in argument when lecturing to students than when talking to colleagues.
I've heard low-status contrarians complaining about high-status people they argue with before. What has been your experience of rewards when arguing against people lower in status than you? Have you ever gone back and noticed some of your own arguments were below your usual standards but got accepted anyway?