I read Douglas Hofstadter’s new book I Am a Strange Loop, which argues that consciousness arises spontaneously once a system of dynamic patterns becomes sufficiently complex. Strange loops of self-awareness existing on multiple levels (as in Gödel’s famous proof) create a hallucination of a hallucination, and so an "I" forms. Anyway, as I often do when reading nonfiction, I read a little more about the author, and was struck by Hofstadter’s Law: it always takes longer than you expect, even when you take into account Hofstadter’s Law (note that this is recursive and paradoxical, which is Hofstadter’s specialty). The Law turns out to be pretty well known among programmers, where it seems everyone has read Hofstadter’s Gödel, Escher, Bach.
As they say, Hofstadter’s Law is funny because it rings true to many programmers, who often work on complex projects that take years to complete. It is clearly an alternative to the Law of Iterated Expectations, under which a forecast should not be wrong in a predictable direction. Why might people engaged in sufficiently complicated tasks (writing a paper or a book, building a deck) generally underestimate how long they will take? I think the main reason is that goals become self-fulfilling, so any lengthening of a goal time would simply add to the total time, the way bureaucracies spend the limit of their budget, whatever it is. Just like a group of people, people themselves have multiple goals: to watch TV, to get a project done, to be a better golfer. A successful goal needs a bias to compete with your other goals, which probably also have biased homunculi advocating for them in your mind.
On one level an unbiased expectation is optimal because it allows us to allocate our resources more efficiently. But there are many cases where this is not true, where a little too much hope and faith actually makes you a more successful person, and more fun to be around. Just think about how annoying ‘brutally frank’ people are: they are jerks. Or think about the guy who believes he is a better dancer than he really is; his confidence actually makes him a better dancer, because part of good dancing is not being self-conscious. Robert Trivers has pointed out that self-deception is, in moderation, an evolutionary advantage, in that a liar who believes his own lies is a more effective persuader than a liar who knows he is lying, and fundamentally we are social animals trying to convince others to do this or think that.
The answer you give about Hofstadter's Law doesn't seem relevant to the question you're asking. I agree with your explanation that, if I expect it will take 10 weeks to do something but I'm given 15 weeks to do it, it will take me 15 weeks to do it. But Hofstadter's Law says that, given a task that will take 15 weeks, people will estimate it to take them 10 weeks, even when they think they are being generous. This is logically a different issue.
Given that this site is about bias, you could have mentioned superiority bias (i.e., thinking, "This would take the average person 15 weeks, but of course I'm better than average, so it will only take me 10 weeks"), or the beneffectance effect on memory (i.e., egocentric attribution errors: "In the past, this took me 15 weeks, but the delay was due to circumstances beyond my control. When I have control, it will take 10 weeks."), or some variation of the availability heuristic (remembering the days when you powered through the task rather than the days when you were distracted by minutiae).
Personally, I find arrogant, self-inflated people to be jerks and honest, self-critical people to be better company. I don't think I'm alone in that. Neither my preference nor your preference counts as a scientific argument.
As for bias being evolutionarily advantageous (in the context of deception): maybe, but so is violence against out-groups. That doesn't mean it's morally good or socially desirable to have these biases. Your final point seems to be that lying is a more "fundamental" human behaviour than other behaviours. That sounds very arbitrary, to say the least.
I don't understand how a bunch of neurons obeying the laws of physics (through chemistry) generates my consciousness, yet I'm also certain that my consciousness is intrinsically related to my brain and will be extinguished with it.
People have been 'certain' of a great number of things throughout history, and many of them ended up being provably wrong. It is probably a good thing to be 'certain' of as few things as possible, and thereby be open to finding out new information that might modify or expand or even invalidate parts of one's worldview.
I like what Jerry Fodor says on this topic:
Nobody has the slightest idea how anything material could be conscious. Nobody even knows what it would be like to have the slightest idea about how anything material could be conscious. So much for the philosophy of consciousness. . .