22 Comments

There are various ways in which people can be held culpable for having unjustified or false beliefs, but, whether or not we want to use the term 'bias' to cover the whole range, I wouldn't, as Nick S suggests, leave out the factors that shape the direction of inquiry and focus only on those that shape the formation of belief given a body of evidence.

Intellectual laziness is one example. If someone has heard about this blog but prefers not to read it because he doesn't care much about having truer beliefs, that doesn't automatically relieve him of responsibility for having biased beliefs.

An even better example is self-deception. Self-deception is surely a form of bias, but self-deception operates not only by leading a person to believe falsely in the face of contrary evidence, but also (perhaps primarily) by causing him to be 'lazy' in gathering certain kinds of information that might force him to correct his beliefs.

Perhaps you'd want to leave intellectual laziness out because it operates across the board. But intellectual laziness may be focused even when there's no self-deceptive motivation at work. Perhaps someone just developed the habit of avoiding reading articles about science -- perhaps such an article bored him a long time ago, although reading such articles wouldn't bore him now or even take a great effort. Won't we say that this is a bias?


I'm not sure if we're disagreeing about the nature of practical rationality (probably) or laziness, but we are certainly disagreeing about what is systematic in bias.

I take laziness to be aversion to work, and aversion to something influences what is practically rational for someone.

General intellectual laziness would be likely to lead to generalised inaccuracy in belief (but cf. Bishop, Michael, "In Praise of Epistemic Irresponsibility: How Lazy and Ignorant Can You Be?", Synthese 122 (2000): 179-208, available at http://www.niu.edu/phil/~bi...). But general inaccuracy of belief is not bias, and it is not what I mean by systematicity when talking about bias. I mean either irrationally skewed belief about some topic of knowledge, or specific kinds of error in the use of specific kinds of evidence.


Nick S, in my example of Bob, I meant that it would be practically rational for him to do more intellectual work, not that his laziness would raise the cost of intellectual work so much that it would not be practically rational for him to do it. I also meant that he systematically errs by doing too little intellectual work.

Now, it might be the case that this means that Bob has a bias against doing intellectual work, although in at least one sense of the word I think he would not be biased if his systematic error did not result "directly" from the influence of some factor whose function is not to maximize accuracy of belief.

More importantly, however, even if Bob had a bias about the rewards of intellectual work, I don't think that would mean that he was also biased about all other topics which are such that his beliefs about these other topics would change if he did more intellectual work. It would seem that on your explication Bob's possible bias about the merits of intellectual work would spill over to make him generally biased about almost everything.

Robin, yes bias does seem closely tied up with "illegitimate considerations" - I think we are playing around with different attempts to develop and clarify this general idea.


On Nick B's interpretation: a belief can also be irrational if it would have been theoretically rational for us to acquire further evidence. Expected cost includes anything we take as a cost, and so theoretical rationality can impinge on acquiring further evidence in two ways: lowering the probability of p increases the expected cost of being wrong; and wanting strongly to know whether p can raise the expected cost of being wrong by raising the cost of being wrong.

Bob is not biased under my definition. Bob is intellectually lazy, and this increases the expected cost to him of acquiring new evidence. So under my definition: (1) he might not be irrational in belief, because laziness makes the expected cost of evidence acquisition outweigh his expected cost of being wrong; or (2) he might be irrational in belief but not biased, because the irrationality is not systematic.


My mind keeps coming back to trying to elaborate a counterfactual where my goals put more weight on reducing belief errors and less weight on things like wanting people to like and respect me. Considerations that would remain just as important in this counterfactual seem legitimate, and considerations that would be less important seem illegitimate. So costs of time and money seem legitimate, while costs of people liking me less seem illegitimate. "Biases" seem to me to be closely tied up with these illegitimate considerations.


I've been away for the day and I see you've all been busy! :) The last concrete proposal I see here is "a factor that systematically shapes our beliefs whose function is neither to maximize epistemic accuracy nor to conserve intellectual resources."

The problem here is clarifying "intellectual resource." If I think better of my friends so they will like me more, and then pass more relevant info to me, is that good resource management or bias? It seems that there is an intuition here of legitimate vs. illegitimate costs one may consider.


Nick S, Guy,

I take your point that conserving intellectual resources is not the only non-rational non-biasing driver. Inserting "directly" as Guy suggests might help.

The idea was that when diagnosing bias we should set aside shapers of our beliefs when those shapers are things like lack of time or effort, areas of interest, computational limitations, etc. These shapers, one might say, are not trying to shape our beliefs in any particular way. They are, in some sense which I'm not yet sure how to express exactly, neutral vis-a-vis the content of our opinions: their function is not to make us believe p rather than not-p, or vice versa, although they may have indirect effects on what we believe by causing us to spend more time acquiring or evaluating evidence on some topics rather than others. We can use Guy's term "directly" as a placeholder for this idea, but I think it would be possible to replace it with a more precise explanation.

Nick S proposes to define bias as systematically irrational belief, where a belief can be irrational if it would have been practically rational for us to have acquired more evidence. (Is this a correct interpretation of what he said?) One worry I have about this is that it might indicate bias too often. Bob is intellectually lazy. It would be practically rational for him to acquire more evidence (and do more thinking) about a wide range of topics. It would then seem, on Nick S's account, that most of Bob's beliefs on these topics are irrational (because his probability assignments would change if he investigated more). Nevertheless, all parts of Bob's brain, insofar as they work on forming beliefs at all, aim exclusively for the truth; and Bob is perfectly calibrated. His intellectual machine uses all the time and fuel it gets with optimal efficiency to generate accurate beliefs. It seems to me that what is wrong with Bob is not that he is biased.


We should probably also distinguish subjective and objective bias. My post was about subjective bias, but objectively biased belief could probably be similarly defined in terms of systematic untruth. You might be unfortunately placed or just not know enough without any reason to know more, and so be in possession of evidence which is systematically misleading.


Oh, I now see that Nick S's comment already contains my main point. Just to prevent misunderstanding: my previous comment was aimed at Nick B!


Nick, this definition still wouldn't do, due to the element I mentioned in my comments on Robin's later post. Our interests and values are presumably factors that can systematically shape our beliefs and whose 'function' is neither to maximise epistemic accuracy nor to conserve intellectual resources. Interests and values systematically shape our beliefs because they determine the direction of inquiry, but this need not involve any bias (whereas wishful thinking does). So a further qualification is needed. (Would adding 'DIRECTLY shapes our beliefs' help? But 'directly' is not the clearest notion.)

(Another point: bias can be a property of different things. When we ascribe bias to a person, I think we are usually also ascribing to that person a measure of epistemic blame. This is not so when we ascribe bias to sub-systems or capacities, e.g. the perceptual system or one's capacity to estimate probabilities. Finally, bias can be a property of one's evidence ('a biased sample') even when one isn't in a position to be aware of this and all of one's relevant sub-systems/capacities are in perfect order.)


I don't think we can adequately characterise bias without saying something about the rationality of belief in terms of believing in accordance with possessed evidence and in the light of the cost of acquiring further evidence, something like:

a rational belief in p is believing p in accordance with possessed theoretical reasons such that the expected cost of being wrong is less than the expected cost of acquiring more evidence.
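(To picture that condition, here is a minimal sketch in my own notation, not Nick S's: write E for the evidence already possessed, C_w for the cost of being wrong about p, and C_e for the cost of acquiring further evidence. Then the condition is roughly

\[
  \Pr(\neg p \mid E)\cdot C_w \;<\; \mathbb{E}[C_e]
\]

i.e. belief in p is rational only if the expected cost of being wrong, on the evidence one already has, is smaller than the expected cost of gathering more.)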

I have some complications in what I would say about a theoretical reason, but I'll omit them for now. The relevance of expected cost is not purely the conservation of intellectual resources; it is a full-bloodedly practical element, and for this reason I don't agree with the other Nick's suggestion. I also think we need to include bias in both causes and effects. So I would go for something like

bias in belief is either a cause of systematically irrational belief or the possession of systematically irrational beliefs

(where the systematicity could itself be either theoretical or practical)


Paul, Eric,

Bias = "something shaping our beliefs whose function is not to maximize epistemic accuracy"?

This seems a little too broad. Time constraints and lack of interest will in some sense shape our beliefs - in the sense that we would believe different things if these factors were not present. The function of these factors is not to maximize epistemic accuracy. Yet I don't think one is biased merely because one does not spend all one's time on epistemic pursuits.

For the influence of such a factor to be a bias, I think it would have to have some "directional" impact on our beliefs. I put the word "directional" in quotation marks because I'm not sure it's exactly the right term. A factor that tended to make our beliefs unjustifiably confident (to make us appear competent) could be a bias even if it were indifferent to the propositional content of the belief, as Paul points out.

How about this:

A bias = a factor that systematically shapes our beliefs whose function is neither to maximize epistemic accuracy nor to conserve intellectual resources


How about: bias is something that can shape our beliefs whose function is not to maximize epistemic accuracy. This is close to Nick's working definition, but allows bias to be a potential and indirect cause of error.

However, a concern I have with this is an old Kuhnian insight: sometimes, to increase accuracy one narrows the domain of inquiry. But we are unlikely to think of this as a form of bias, even if (as I noted in an earlier posting) it can be in some circumstances. So I propose the following: bias is something that can shape our beliefs whose function is not to maximize epistemic accuracy and generality.


I agree with Robin that often there is no need for precise definitions, but it's only rarely that there is no need for clarity. Still, it's better to aim for clarity at the level of the specific claims made, rather than at that of the overarching theme under which these claims are made, as I think it's almost always more fruitful to work with a broad, open-ended and intuitive sense of the subject.

As Nick notes, 'bias' is best understood as an epistemic notion, referring to unjustified rather than to false belief, and there are all sorts of distinctions in epistemology that it might be useful to draw on to make charges of bias clearer -- if only to ensure that different people are really talking about the same thing. (For example, 'justified' can be understood in either an internalist or externalist sense.)

I share Nick's worry about talk of *avoidable* error. An error might be avoidable by the believer himself, and, more importantly, it might be avoidable in a sense in which we can hold him responsible for his unjustified beliefs. But there are many areas where we speak about bias where this isn't true. Or it may be avoidable simply in the sense that it's correctable. But do we have a clear notion of a form of error in belief such that, even after we have been made aware of it, we can't appropriately revise our beliefs? I prefer to interpret Robin's talk of 'avoidable' at a second-order level: given that we are aware that many of our beliefs may be biased, we can start a reflective inquiry into the epistemic standing of our beliefs, and in virtue of having started this inquiry, many of our biases become avoidable even in the narrower sense. (In other words, those consciously engaged in the correction of bias are saddled with greater epistemic responsibilities than ordinary believers.)

Another point I think Nick was raising was about generality. Do we want to think of any widespread false/unjustified belief, or even disposition to such belief, as a bias? The term 'bias' does suggest something wider, something with the causal powers to affect a whole sector of beliefs. But this distinction may not matter all that much when a disposition to form a particular unjustified belief causes a believer to form many other false beliefs.


Nick, Paul has a good point; it seems more straightforward to just define bias as the belief changes due to belief functions other than maximizing epistemic accuracy. This is closer to the proposal I just posted.


Nick, it seems like "something shaping our beliefs whose function is not to maximize epistemic accuracy" is a pretty good definition on its own. Why should a bias have to push beliefs in some specific direction along some dimension? (Suppose there was evidence that I chose beliefs at random after a period of thinking, in order to display my decisiveness. Even though those beliefs wouldn't necessarily be skewed in any particular "direction," wouldn't it still be a bias?)
