Bias is something bad from an epistemological point of view, but what exactly is it and how is it distinct from other kinds of epistemological error?
Let’s start with one remark Robin made in passing: "Bias is just avoidable systematic error." One big question here is what makes a systematic error avoidable. For example, suppose somebody has never heard about overconfidence bias. When such a person (erroneously) rates herself above average on most dimensions without strong specific evidence, is she making an avoidable error?
It seems avoidable only in the sense that there is information she doesn't have, but could get, that would make it possible for her to correct her beliefs about her own abilities. But in this sense, we would be biased whenever we lack some piece of information that bears on some broad domain. Socrates would be biased merely by being ignorant of evolutionary theory, neuroscience, physics, etc. This seems too broad.
Conversely, if she is systematically overestimating her own abilities, it seems she is biased even if these errors are unavoidable. Suppose she does learn about overconfidence bias, but for some deep psychological reasons simply cannot shake off the illusion. The error is systematic and unavoidable, yet I’d say it is a bias.
Here is an alternative explication that I’d like to put forward tentatively for discussion: A bias is a non-rational factor that systematically pushes one’s beliefs in some domain in one direction.
If we go with this, then to show somebody is biased it would be neither sufficient nor necessary to show that she is systematically in error in some domain (although showing that would often be inconclusive evidence in favor of the hypothesis that she is biased).
It is not sufficient, because it would also have to be shown that the systematic error results from the influence of some non-rational factor (as opposed, for instance, to some high-level assumption that rationally seems plausible to the subject, based on her evidence, but happens to be wrong).
It is not necessary, because somebody could be subject to the influence of a non-rational factor that systematically pushes her beliefs in some domain in a direction which fortuitously makes her beliefs more accurate. Somebody might think highly of her own abilities as a result of a psychological mechanism that has evolved for signalling purposes, yet in some cases this might happen to result in correct beliefs (for somebody who happens to be above average on most dimensions) even though she has no evidence for them.
There are various ways in which people can be held culpable for having unjustified or false beliefs, but, whether or not we want to use the term 'bias' to cover the whole range, I wouldn't leave out, as Nick S is suggesting, factors that shape the direction of inquiry, and focus only on those that shape the formation of belief given a body of evidence.
Intellectual laziness is one example. If someone has heard about this blog but prefers not to read it, because he doesn't care much about having truer beliefs, then this doesn't automatically relieve him of responsibility for having biased beliefs.
An even better example is self-deception. Self-deception is surely a form of bias, but self-deception operates not only by leading a person to believe falsely in the face of contrary evidence, but also (perhaps primarily) by causing him to be 'lazy' in gathering certain kinds of information that might force him to correct his beliefs.
Perhaps you'd want to leave intellectual laziness out because it operates across the board. But intellectual laziness may be focused even when there's no self-deceptive motivation at work. Perhaps someone just developed the habit of avoiding reading articles about science -- perhaps such an article bored him a long time ago, although reading such articles wouldn't bore him now or even take a great effort. Won't we say that this is a bias?
I'm not sure if we're disagreeing about the nature of practical rationality (probably) or laziness, but we are certainly disagreeing about what is systematic in bias.
I take laziness to be aversion to work, and aversion to something influences what is practically rational for someone.
General intellectual laziness would be likely to lead to generalised inaccuracy in belief (but cf. Michael Bishop, "In Praise of Epistemic Irresponsibility: How Lazy and Ignorant Can You Be?", Synthese 122 (2000): 179-208, available at http://www.niu.edu/phil/~bi...). But general inaccuracy of belief is not bias and is not what I mean by systematicity when talking about bias. I mean either irrationally skewed belief about some topic of knowledge or specific kinds of error in the use of specific kinds of evidence.