Michael Gerson has doubts on doubt:
Without a doubt, doubt is useful and needed at the margins of any ideology. The world is too complex to know completely. Many of our judgments are, by nature, provisional. Those who are immune to evidence, who claim infallibility on debatable matters, are known as bores – or maybe columnists.
Doubt becomes destructive as it reaches the center of a belief and becomes its substitute. A systematic skepticism may keep us from bothering our neighbor. It does not motivate a passion to fight for his or her dignity and rights. How do ambiguity and agnosticism result in dreams of justice, in altruism and honor, in sacrifices for the common good? What great reformers of American history can be explained by their elegant ambivalence? (more)
Ask yourself this simple question: how confident would you need to be on a moral or political conclusion in order to work passionately for it? 99%? 90%? 75%? If you have such an action-threshold, and this threshold is high, well then yes, to let your passion flower, you may need to lie to yourself about your confidence. So that you might actually do something.
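The action-threshold idea can be made concrete with a simple expected-value sketch. This is my own illustration, not anything from the post: under a naive model where acting costs a fixed effort and pays a fixed benefit only if your conclusion is correct, the confidence you need before acting is just the cost-to-benefit ratio.

```python
def action_threshold(cost, benefit):
    """Minimum confidence p at which acting has positive expected value.

    Simple expected-value model: act when p * benefit > cost,
    i.e. when p > cost / benefit. Numbers are illustrative only.
    """
    if benefit <= 0:
        raise ValueError("benefit must be positive")
    return min(cost / benefit, 1.0)

# If the cause pays off ten times the effort, even 10% confidence
# justifies acting; only near-even odds demand high confidence:
print(action_threshold(cost=1.0, benefit=10.0))  # 0.1
print(action_threshold(cost=3.0, benefit=4.0))   # 0.75
```

On this toy model, a 99% action-threshold is only rational when the benefit barely exceeds the cost; for high-leverage causes, passion at modest confidence is exactly what the expected-value arithmetic recommends.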
Would your overconfidence then lead you to do too many things too enthusiastically? If so, perhaps you’d do better to also allow yourself some other more graded psychological reluctance to passion, to counter this bias.
But it would of course be even better if you could see the nobility and glory in doing your best as a limited but well-meaning creature. You shouldn’t need to be absolutely sure of a conclusion to work sincerely and passionately for it.
I think the fundamental premise of this argument is a shortcut: a simplification of a more subtle argument, and one that is often purposefully hijacked.

We should work on issues where our marginal efforts are likely to produce marginal benefits that exceed those marginal efforts.
Our degree of passion should depend on how much the likely benefits exceed those efforts; it also serves as a signal to others that they should work on the same things. The problem is that using someone else's passion as input to your own utility function can be detrimental unless the two utility functions are linked by a positive-sum interaction (where an increase in one utility function produces an increase in the other). If the utility functions are coupled in a zero-sum interaction (or, worse, a negative-sum interaction), then one should do the opposite and work against whatever others work passionately on.
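The coupling rule in this comment can be sketched as a tiny decision function. The function name and numbers are my own illustration, not from the comment:

```python
def respond_to_passion(effect_on_me):
    """How to respond to someone else's passionate effort, given its
    per-unit effect on my own utility.

    Mirrors the comment's rule: join efforts coupled to you positive-sum,
    oppose those coupled zero- or negative-sum (their gain, your loss).
    """
    if effect_on_me > 0:
        return "work with them"
    if effect_on_me < 0:
        return "work against them"
    return "ignore them"

# A future slaveholder and a future slave evaluate the same campaign
# and rationally reach opposite responses (illustrative payoffs):
print(respond_to_passion(+1))  # work with them
print(respond_to_passion(-1))  # work against them
```

The point of the sketch is that the input is the effect on *your* utility, not the intensity of the other party's passion; the same campaign maps to opposite responses depending on which side of the coupling you sit.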
For example, enslaving someone would yield substantial benefits to the owner, because the value generated by the slave would accrue to the owner. But the passion with which an individual seeks to acquire slaves is a poor indicator of whether one should assist in that effort. The critical factor is who becomes the slave and who becomes the owner: future owners should work passionately to institute slavery; future slaves should work passionately to prevent it.
A more realistic example is tax policy. People and corporations should work passionately to reduce their own taxes and, just as passionately, to increase everyone else's. The current political atmosphere is doing exactly that: taxes are being decreased on present taxpayers and increased on future taxpayers, by funding government expenditures through debt that future taxpayers will repay with interest. Politicians supporting such tax cuts benefit because their utility function is increased by campaign contributions from donors who gain from the present cuts.
"Ask yourself this simple question: how confident would you need to be on a moral or political conclusion in order to work passionately for it? 99%? 90%? 75%?"
I realise this may only be a figure of speech, but I think it's a mistake to attach numerical confidence levels to many beliefs, particularly moral ones.
For instance, I may hold a belief that if the laws of physics are such and such, and evolution shows such and such, then there is no God. This is an axiomatic argument: either you accept the premises and the conclusion follows, or you reject them. It's not probabilistic.
This isn't a call for absolutism. We could have an incorrect model of the world, but we can't attach a probability to its being wrong.
I suppose my point is that you can be skeptical about your model being absolutely correct, while being passionate about its implications.