Intuitions are a major source of evidence in philosophy. Intuitions are also a significant source of evidence about the person having the intuitions. In most situations where onlookers are likely to read something into a person’s behavior, people adjust their behavior to look better. If philosophical intuitions are swayed in this way, this could be quite a source of bias.
One first step to judging whether signaling motives change intuitions is to determine whether people read personal characteristics into philosophical intuitions. It seems to me that they do, at least for many intuitions. If you claim to find libertarian arguments intuitive, I think people will expect you to have other libertarian personality traits, even if on consideration you aren't a libertarian. If consciousness doesn't seem intuitively mysterious to you, one can't help wondering if you have a particularly unnoticeable internal life. If it seems intuitively correct to push the fat man in front of the train, you will seem like a cold, calculating sort of person. If it seems intuitively fine to kill children in societies with pro-child-killing norms, but you choose to condemn it for other reasons, you will have all kinds of problems maintaining relationships with people who learn this.
So I think people treat philosophical intuitions as evidence about personality traits. Is there evidence of people responding by changing their intuitions?
People are enthusiastic to show off their better-looking intuitions. They identify with some intuitions and take pleasure in holding them. For instance, in my philosophy of science class the other morning, a classmate proudly dismissed some point, declaring, 'my intuitions are very rigorous'. If his intuitions are different from most, and average intuitions actually indicate truth, then his are especially likely to be inaccurate. Yet he seems particularly keen to talk about them, and chooses positions based much more strongly on them than on others' intuitions.
I see similar urges in myself sometimes. For instance, consistent answers to the Allais paradox are usually so intuitive to me that I forget which way one is supposed to err. This seems good to me. So when folks seek to change normative rationality to fit their more popular intuitions, I'm quick to snort at such a project. Really, they and I have the same evidence from intuitions, assuming we believe one another's introspective reports. My guess is that we don't feel like coming to agreement because they want to cheer for something like 'human reason is complex and nuanced and can't be captured by simplistic axioms' and I want to cheer for something like 'maximize expected utility in the face of all temptations' (I don't mean to endorse such behavior). People identify with their intuitions, so it appears they want their intuitions to be seen and associated with their identity. It is rare to hear a person claim to have an intuition that they are embarrassed by.
So it seems to me that intuitions are seen as a source of evidence about people, and that people respond at least by making their better looking intuitions more salient. Do they go further and change their stated intuitions? Introspection is an indistinct business. If there is room anywhere to unconsciously shade your beliefs one way or another, it’s in intuitions. So it’s hard to imagine there not being manipulation going on, unless you think people never change their beliefs in response to incentives other than accuracy.
Perhaps this isn't so bad. If I say X seems intuitively correct, but only because I guess others will think seeing X as intuitively correct is morally right, then I am doing something like guessing what others find intuitively correct. That might be a noisy way to read intuitions, but at least it isn't obviously biased: if each person shades their answer toward what others think, this shouldn't obviously bias the consensus. But there is a difference between changing your answer toward what others would think is true, and changing your answer to what will cause others to think you are clever, impressive, virile, or moral. The latter will probably lead to bias.
I’ll elaborate on an example, for concreteness. People ask if it’s ok to push a fat man in front of a trolley to stop it from killing some others. What would you think of me if I said that it at least feels intuitively right to push the fat man? Probably you lower your estimation of my kindness a bit, and maybe suspect that I’m some kind of sociopath. So if I do feel that way, I’m less likely to tell you than if I feel the opposite way. So our reported intuitions on this case are presumably biased in the direction of not pushing the fat man. So what we should really do is likely further in the direction of pushing the fat man than we think.
I used to think people deliberately lied about their intuitions, but I now think it's mostly unconscious. People have evolved to actually believe irrational things they would otherwise have to pretend to believe. It takes less effort than lying, and it comes across as more genuine, because it is. This particular kind of cognitive bias seems to correlate negatively with autism, causing some of the social difficulties associated with it.
If you're the kind of person who reads Overcoming Bias, the big lesson to learn here is not that the truth is less socially acceptable than your beliefs. Rather, it's that you need to make more conscious effort to lie about your true beliefs in order to succeed socially among competitors who do this instinctively and unconsciously.
Ask yourself: do you want to say rational things, or say things that it's rational to say? You can't have it both ways.
This is quite interesting and plausible. Yes, reported moral intuitions are likely distorted by the desire to look good. But does this really imply that reported moral intuitions are biased away from "moral truth", in the direction of looking good?
I think there are two distinctions here that it's helpful to spell out. The first distinction is whether signaling affects our reported intuitions consciously or unconsciously:

1. Lying: bias in the reported intuition, in comparison with the actually felt intuition.
2. Self-delusion / socialization: bias in the actually felt intuition, in comparison with what the felt intuition would be in the absence of social pressure.
The second case is the interesting one. As your discussion hints, it's a bit of a judgment call whether the second case is best interpreted as the unconscious idiocy of self-delusion or the unconscious intelligence of taking into account others' opinions.
And I think that highlights the second distinction here, which is whether signaling effects distort or correct the intuitions that would be shaped in their absence:

1. Inner voice: our moral intuition functions mostly accurately outside of social pressures.
2. Collective intelligence: moral intuitions are most accurate when they are shaped by taking into account others' opinions of those intuitions.
What's interesting to me here is that this question seems independent of the first one, since both the inner voice and collective intelligence scenarios are tenable whether the social pressures act consciously (via lying) or unconsciously (via self-delusion/socialization).