Suppose an author exercises due diligence in the researching and writing of a nonfiction book. He has good reason to believe that all of the statements he makes in the book are true. But he is also well aware of human fallibility and that he is no exception to the rule. And so, aware of his fallibility, he has good reason to believe that it is not the case that all of the statements he makes in the book are true. He makes mention of this in the book's preface. Hence 'paradox of the preface.' Thus:
1. It is rational for the author to believe that each statement in his book is true. (Because he has exercised due diligence.)
2. It is rational for the author to believe that some statement in his book is not true. (Because to err is human.)
Therefore
3. It is rational for the author to believe that (each statement in his book is true & some statement in his book is not true).
Therefore
4. There are cases in which it is rational for a person to believe statements of the form (p & ~p).
"What the paradox shows is that we need to give up the claim that it is always irrational to believe statements that are mutually inconsistent." (Michael Clark, Paradoxes From A to Z, Routledge 2002, p. 144)
Is that what the paradox shows? I doubt it. The paradox cannot arise unless the following schema is valid:
a. It is rational for S to believe that p.
b. It is rational for S to believe that ~p.
Ergo
c. It is rational for S to believe that (p & ~p).
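To put the schema in operator notation (the notation is mine, not Clark's): write $R_S(p)$ for 'it is rational for $S$ to believe that $p$'. The schema then reads

\[
R_S(p),\ R_S(\neg p)\ \therefore\ R_S(p \wedge \neg p),
\]

which is just the special case, with $q = \neg p$, of the conjunction (agglomeration) principle licensing the move from $R_S(p)$ and $R_S(q)$ to $R_S(p \wedge q)$.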
It is not clear that the schema is valid. Rational believability, unlike truth, is a relative property. What it is rational to believe is relative to background knowledge, among other things. Relative to the author's knowledge that he exercised due diligence in the researching and writing of his book, it is rational for him to believe that every statement in the book is true. But relative to considerations of human fallibility, it is rational for him to believe that it is not the case that every statement in his book is true. So what (a) and (b) above really amount to is the following, where 'BK' abbreviates 'background knowledge':
a*. It is rational for S to believe relative to BK1 that p.
b*. It is rational for S to believe relative to BK2 that ~p.
From these two premises one cannot arrive at the desired conclusion. So my solution to the paradox is to reject the inference from (1) and (2) to (3).
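In the operator notation used above (again a sketch, not part of the original argument), the relativized premises are

\[
R_S(p \mid BK_1) \quad \text{and} \quad R_S(\neg p \mid BK_2),
\]

while the paradox needs

\[
R_S(p \wedge \neg p \mid BK)
\]

for a single body of background knowledge $BK$. Even if conjunction were valid for rational belief relative to one fixed $BK$, nothing licenses the step unless $BK_1 = BK_2 = BK$, and that is exactly what the relativization denies.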
"But doesn't the author's background knowledge (BK) include both the truth that he exercised due diligence and the truth that human beings are fallible?" Well suppose it does. Then how could it be rational for him to believe that every statement in the book is true? It is rational for him to believe that every statement is true only if he leaves out of consideration that people are fallible. Relative to his total background knowledge, it is not rational for him to believe that every statement in his book is true.
In this way I avoid Clark's draconian conclusion that it is sometimes rational to believe statements that are mutually inconsistent.