I have always been struck by the starkness of human hypocrisy and its incongruity in the face of avowed beliefs. … Sin is common, and human weakness in the face of contradiction the norm. Men’s hearts are easily divided, and simultaneously sincere in their inclinations. … All this leads to the point that I believe far too many of those of us who wish to comprehend human nature scientifically lack a basic grasp of it intuitively. … Many atheists simply lack a deep understanding of what drives people to be religious, and our psychological model of those who believe in gods is extremely suspect. The "irrationality" and "contradiction" of human behavior may be rendered far more systematically coherent simply by adding more parameters into the model. … When I engage with these sorts of issues with readers of Overcoming Bias or Singularitarians, my suspicions become even stronger, because I see in some individuals an even greater lack of fluency in normal cognition than my own. … My point is that understanding human nature is not a matter of fitting humanity to our expectations and wishes, but modeling it as it is, whether one thinks that nature is irrational or not within one’s normative framework.
This frustrating critique is frustratingly common: "You’re wrong because your model is too simple. But I’m not going to tell you what your model is missing, at least not in a clear enough way to help you improve your model." Yes, of course almost all our models are too simple. We all know that; what we don’t know is exactly what complexities we should be adding to our models. And for the record, I was a teen cultist, and my dad and brother were/are church pastors.
For social scientists I think there is actually an advantage in having a less powerful intuitive understanding of human behavior – it helps us notice things that need explaining. To want to explain particular human behaviors you first need to see them as puzzling, and people with powerful intuitive understandings can predict behavior so well that they often don’t notice behaviors at odds with our best theories.
You do a lot better with religion if you don't treat it as primarily a system of belief, but as a combination of a set of practices and a community that engages in them. Treat the beliefs as a secondary aspect and their inconsistencies and absurdities don't matter so much. That is, if people are making sacrifices to propitiate the gods, it is the act of sacrifice, not whatever beliefs the practitioners hold about the ontological status of their gods, that is the important feature. I'm not sure how this impacts the argument for or against simplicity, except that culturally embedded practices tend to be harder to describe (more complex) than abstract systems of belief.
"Our theory explains why simplicity is so highly desirable. To understand this there is no need for us to assume a 'principle of economy of thought' or anything of the kind. Simple statements, if knowledge is our object are to be prized more highly than less simple ones because they tell us more; because their empirical content is greater; and because they are better testable." -Karl Popper
The reason we want more testable models with greater empirical content is so that we can prove them wrong. If our model is wrong, and we don't yet know how to fix it, what do we gain from continuing to use and build upon it? The statement "You're wrong because your model is too simple. But I'm not going to tell you what your model is missing, at least not in a clear enough way to help you improve your model" is completely valid and useful in that it points out a problem. A solution would be nice too, but that's not always an option and not a necessary component of the scientific method.
Do you think the reason-based choice bias affects our ability to accept a criticism of a model when a better solution isn't included/available?