I Am Right About Literally Everything

By Kenneth Quek

What would you think if I said I thought that my every belief was correct? I imagine it might sound pretty arrogant. But what if I told you that, on pain of contradiction, I had no choice but to believe my every belief was correct? This is, in essence, the dilemma that Dr. David Makinson calls The Preface Paradox1, which we will be looking at today. Let’s start off with two commonsensical assertions about what everyone believes about themselves.2

(1) If I believe P, then I take P to be true.

(2) I believe that at least some of my beliefs are false. 

Claim (1) is true simply because that is what a belief is – when I say I believe some declarative statement, I am saying that I think that statement is true. Because of this, we can quite safely say that each and every belief we hold is something we take to be true. Claim (2) is something I think most of us would reasonably accept. After all, who in their right mind would think that their current set of beliefs is completely correct?

Now, in order to state explicitly what the problem here is, we’re going to have to first talk about what makes a paradox a paradox. We call something a paradox when, after taking certain intuitive claims to their logical end, we arrive at a statement that would seem quite ridiculous to common opinion.3 One of the most common ways to arrive at a paradox is to find some kind of contradiction in two ‘common-sense’ beliefs to force us to have to reject one, and that’s precisely the angle that Makinson takes here.

Let’s take a look at (1) more closely. Notice how it states that we hold each individual belief we have to be true. This means that, if I only believe three things, I take each of those three things to be true. If I believe thirty things, then I take each of those thirty things to be true, and so on. In reality, I believe a lot of things – but that doesn’t diminish how much (1) affects me. I take every single one of my many beliefs to be true. And, by logical implication, if I take every one of my beliefs to be true, then I take none of my beliefs to be false. So we can present (1) in a different way, as (1*).

(1*) I believe that none of my beliefs are false.

(2) I believe that at least some of my beliefs are false. 

The reason we can perform this little switcheroo is that (1) and (1*) mean the same thing; put more formally, they are logically equivalent.4 But presented this way, the contradiction becomes a lot more apparent. How can I believe that none of my beliefs are false, but also that at least some of my beliefs are false? Here’s the part where things get really weird. There is clearly a contradiction between (1*) and (2), which suggests that, somewhere down the line, we got something wrong.

It might be worth slowing down a bit and considering whether we should really accept the premises being laid out here. Is it really the case that we believe that some of our beliefs are false like (2) would suggest? It might be the case that we believe something slightly different:

(3) I believe that it is possible that at least some of my beliefs are false.5

The difference between (3) and (2) is that (3) only asserts that I think it might be the case that I’m wrong about some things, and not that I actually think that I’m wrong about some things. If we actually believe (3) and not (2), then our problem is solved! (3) does not contradict (1) at all, because we can believe that none of our beliefs are false but also think it’s possible that they might be false. Since there is no contradiction, there is no paradox, and we can all sleep soundly knowing that the preface paradox has been solved.

I’m not entirely sure whether this response would work. I think that, when it comes to what we think of our beliefs as a whole, we believe something closer to (2) than to (3). My thought here is to consider some other cases where (3) is true, and see whether they are similar to what’s going on when we think about whether our beliefs are wrong.

Imagine I am a palaeontologist, and someone asks me if it’s possible that dinosaurs never actually existed, and their remains are just constructions of our ancestors trying to play an elaborate prank on us. I think the question is silly, but indulge them nonetheless – I reply, ‘I suppose it’s possible, but highly unlikely’. 

Here, I am articulating a belief that is similar to what is expressed in (3). I think it’s possible that my belief in the existence of dinosaurs is mistaken, despite also believing (similarly to the thought expressed in (1)) that dinosaurs do in fact exist. There is no contradiction in what I’m saying here.

However, is it really the case that what we think about our beliefs as a whole follows this pattern? At least for me, I think the likelihood that I have some wrong beliefs is far greater than the likelihood that dinosaurs don’t exist. I don’t think it’s a mere possibility that I hold at least some wrong beliefs; I’m almost certain that I do. In other words, my belief that I hold some wrong beliefs is stronger than what is reflected in (3). In fact, it is a lot closer to what is reflected in (2); I do, in fact, actually believe that some of my beliefs are false. 

And so, we’re back at square one. Since we’re stuck with (2), we’re going to have to figure out what to do given that (1*) and (2) are contradictory. Here, we are offered three choices to resolve the paradox6: we either reject (1*), reject (2), or show that (1*) and (2) aren’t actually contradictory. 

Rejecting (1*) seems extraordinarily difficult. After all, it means the exact same thing as (1), and (1) is about as obvious as something can get – we’d be hard pressed to describe how we could have a belief that we take to be false. Just by definition, a belief is something that we hold to be true. Showing that (1*) and (2) aren’t actually contradictory also seems difficult – after all, they are phrased in such a way that they’re pretty convincingly contradictory.

So we’re left with rejecting (2), which brings us full circle back to my clickbait title. It seems that, on the face of it, we can’t ever consistently say that we think some of our beliefs are false. Saying ‘I think I might be wrong’ might be good courtesy, but it would just be straight-up illogical. That’s the true bite of Makinson’s paradox – if it truly did hold, then there would technically be no logical basis for being humble about our beliefs.

Thankfully, we might have a way out of accepting this nasty conclusion. I still don’t think rejecting (1*) is possible, but we might try to think more about whether (1*) and (2) are actually that directly at odds with each other. All this time, we’ve been thinking about truth value as a binary function – we’ve been assuming that things are only either true or false, and nothing in between. 

Enter Dr. Lotfi A. Zadeh, a mathematician who developed the concept of fuzzy logic. He observed that we often have to make decisions based on varying degrees of truth, rather than just an absolute value of ‘true’ or ‘false’, and wanted to create a model that could help us make sense of them.7 Applying his logic to our problem, we might be able to wiggle our way out of the contradiction. Recall the statements in contradiction:

(1*) I believe that none of my beliefs are false.

(2) I believe that at least some of my beliefs are false. 

For simplicity, let’s pretend I only hold three beliefs: A, B, and C. Because of (1*), I hold each of these three beliefs to be true. Normally, if I take A to be true, B to be true, and C to be true, then I take ‘A and B and C’ to be true. This is common sense, but also a logical rule. It is when we commit to this step that the contradiction comes in; if we take our beliefs as a whole to have no errors, then we cannot also take our beliefs as a whole to have at least some errors.
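In ordinary two-valued logic, that step can be made explicit with a trivial sketch (Python here, purely for illustration – the variable names are mine, not part of any formal system):

```python
# Classical (binary) picture: each belief is simply true or false.
# Per (1*), I take each of my three beliefs to be true.
A, B, C = True, True, True

# Conjunction introduction: from A, B, and C, infer 'A and B and C'.
all_beliefs_true = A and B and C

print(all_beliefs_true)  # True – no room left for (2)
```

Binary truth values leave no middle ground: once each conjunct is true, the conjunction is automatically true as well.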

Now let’s slide fuzzy logic into the problem. Say the threshold of certainty I need to say I believe something is 0.7. Let’s say, additionally, that I’m 90% certain of A, 80% certain of B and 75% certain of C. If we check whether we believe they’re all true, something weird happens. We get:

a × b × c = abc8 (For clarity: the 8 is a footnote marker, not a mathematical index.)

0.9 × 0.8 × 0.75 = 0.54

A, B and C individually meet the 0.7 threshold for certainty. In other words, I believe that each of them is true, and (1*) is fulfilled. But if we look at my certainty that all of my beliefs are true, we see it falls below 0.7. Despite believing that each of my three beliefs is true, I also believe that some of my beliefs could be wrong. If we apply fuzzy logic to beliefs, then we can hold both (1*) and (2) to be true without contradiction, and the paradox is solved.9
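The arithmetic above can be sketched in a few lines of Python. This is a toy illustration only: the 0.7 threshold, the three certainty values, and the product rule for ‘and’ are all assumptions carried over from the example in the text, not a standard formalisation.

```python
from math import prod

THRESHOLD = 0.7  # minimum certainty needed to count as 'belief' (assumed from the text)

# My certainty in each individual belief
certainties = {"A": 0.9, "B": 0.8, "C": 0.75}

# (1*): each individual belief clears the threshold, so I believe each one
believes_each = all(c >= THRESHOLD for c in certainties.values())

# Product rule for conjunction: my certainty in 'A and B and C'
conjunction = prod(certainties.values())  # 0.9 * 0.8 * 0.75 = 0.54

# The conjunction falls below the threshold, so I do NOT believe
# that all of my beliefs are true - leaving room for (2)
believes_all = conjunction >= THRESHOLD

print(believes_each)            # True  -> (1*) is fulfilled
print(round(conjunction, 2))    # 0.54
print(believes_all)             # False -> consistent with (2)
```

The key design point is that ‘believing each conjunct’ and ‘believing the conjunction’ are computed separately, so they can come apart once conjunction is a product of sub-certainties rather than a binary AND.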

The reason this works is that fuzzy logic makes us think about our beliefs as probabilities. When we say we believe something, it’s not because we think it’s undoubtedly true, but because it meets a certain threshold for how confident we are that it’s true. And if that’s the case, then we can say that we believe there are some errors in our beliefs as a whole, even if we believe each of our individual beliefs is true.

That’s not the end of the story though! We still have a couple of problems we need to sort out. If you were really observant, you might have noticed that accepting fuzzy logic here forces us to sacrifice another very intuitive principle. In fact, we briefly touched on it earlier: if we take P to be true, and Q to be true, we take P and Q to be true. Those familiar with basic symbolic logic will know this rule as the Conjunction Introduction Rule (though really, like I said, it’s very common-sensical).10 Accepting fuzzy logic’s strategy here would mean rejecting that this rule actually holds in all cases – our application above would be a direct counterexample to it. Are we really prepared to sacrifice this rule to preserve the value of humility?

We might wiggle even further here. It seems obvious that there are some cases where this principle applies. After all, if I am hungry and I am thirsty, it is obvious that I am both hungry and thirsty. This much we’ll be hard pressed to deny. So maybe what’s really going on here is that we sometimes mean things in the normal way and sometimes mean them in the fuzzy logic way. The statements ‘I believe the earth revolves around the sun’ and ‘the earth revolves around the sun’ might both look like they deal with the same thing, but are actually of vastly different structures and employ different kinds of logic – the meaning of the first one would be ‘fuzzy’, while the meaning of the second one would be normal.  

If we think about it, we might have some really good reasons for wanting to make this distinction. What we could say is that thinking the two statements are the same is to conflate ontology with epistemology – it is to assume that statements about ‘what is’ and ‘what is believed’ follow the same logical structure and rules. While statements about ‘what is’ only have two possible truth values11 (the earth either revolves around the sun, or it does not), statements about ‘what is believed’ come in different degrees (I might be sceptical that the earth revolves around the sun, be fairly certain that it does, be conflicted on whether it does, etc.). Given that the very structure of ‘truth’ in these statements is different, it would make sense to use different logical systems to map them out – a system with binary truth values for statements about ‘what is’, and a system with a spectrum of truth values for statements about ‘what is believed’.

Maybe you have thoughts on whether this distinction makes sense. Or maybe you even have your own solution to the paradox. Either way, I think it’s amazing that this seemingly innocuous question about whether intellectual humility is logically valid can lead us down such a rabbit-hole, to the fundamentals of logic and, on a very basic level, how we employ our everyday language. That’s the beauty of philosophy, isn’t it? You try to figure out a problem, you dig deeper and deeper, and after a certain point you take a step back and realise you’re even more confused than when you first started – but now your confusion is more informed than it was at the start, and the things you are confused by are deeper and more interconnected than you could have ever thought. Who knows – maybe some version of this same problem might pop up somewhere we wouldn’t expect, like in ethics or political philosophy. If that ever happens, our discussion here might give us valuable insight on how to begin tackling those problems.


1. Makinson, D. C., 1965. Paradox of the Preface, Analysis (25), pp. 205-207. Oxford: OUP.

2. Makinson originally presented the Preface Paradox by drawing on the analogy of non-fiction writers who ‘preface’ their work with a disclaimer that the facts they present may be inaccurate. However, I think that presenting the problem as a direct problem with belief is easier. 

3. Cantini, A., Bruni, R., 2021. Paradoxes and Contemporary Logic, The Stanford Encyclopedia of Philosophy. Edward N. Zalta (ed.).

4. Quine, W. V., 1953. Reference and modality. From a Logical Point of View, pp. 139–159. Cambridge: HUP.

5. For clarity, ‘possibility’ here is not used in the modal sense – it is not being used to mean ‘I believe there is some possible world where some of my beliefs are false’. Instead, it is used to mean ‘I believe it may be possible that, in this state of affairs that we live in, I hold some false beliefs.’

6. Lycan, W.G., 2010. What, exactly, is a paradox? Analysis, pp. 615-622. Oxford: OUP.

7. Novák, V. et al., 1999. Mathematical Principles of Fuzzy Logic. Dordrecht: Kluwer Academic.

8. For the purposes of illustration, I have simplified fuzzy logic notation into a more easily understandable arithmetic form. Additionally, the interpretation being used is that our confidence in our beliefs as a whole is the product of our confidence in our individual beliefs. For example, if I think a coin has a 50% chance of landing on heads, then I would also think that there is a 0.5 x 0.5, or 25%, chance that two coins will both land on heads. This is how Dr. Zadeh defines statements that use the word ‘and’ in his logic – but this definition can be challenged!

9. We can say that the paradox is solved because we have found a possible situation where both (1*) and (2) are simultaneously true. If this is the case, then there would be nothing wrong with claiming both of them at the same time, and we are not forced to discard either statement.

10. Barker-Plummer, D. et al., 2011. Language, Proof, and Logic: Second Edition, pp. 46-49. Stanford: Center for the Study of Language and Information.

11. As is typical in philosophy, this claim is debatable. While I personally do believe that whether things ‘are the case’ operates on a binary, you could certainly make a case that it too works on a spectrum. George Berkeley, for example, famously believed that some things could be more real (in other words, ‘exist’ to a greater degree) than others. For more on this, see Berkeley, G., 1710. A Treatise Concerning the Principles of Human Knowledge, 1.33. 
