
Everyday philosophy: How to believe in politics

It is time to dispel disbelief and face the grim reality that is unfolding


When we believe something, we take it to be true. Our confidence in our beliefs comes in degrees: some beliefs we hold with greater certainty than others. I believe that it will rain tomorrow, but my belief that the sun will rise is more certain.

Knowledge is a special kind of belief. It is what philosophers call a success term: if you know something, what you know has to be true, whereas your other beliefs might not map onto reality.

What you know is, then, partly determined by your mind and partly by the way the world really is. 

For centuries philosophers were confident that all justified true belief was knowledge and all knowledge justified true belief. In other words, if you know something, it's not enough that what you believe is true; rather, you know it because your belief is both true and well-supported (justified).

I know, for example, that Brexit has been a disaster because I hold that belief, the belief is true, and the belief is supported by a wealth of evidence (much of it published in the New European). 

Or, to take a less contentious example, I know my name is Nigel not just because I believe that and because my name really is Nigel, but because my belief is supported by the evidence of my birth certificate.

That justified true belief picture of knowledge works for most cases. But in 1963 Edmund Gettier wrote a brief journal article that threatened it. He came up with several hypothetical examples of apparently justified true beliefs that nevertheless didn’t amount to knowledge.

These weren't isolated cases, but examples of a type that could generate many more. A whole academic industry of concocting so-called Gettier cases was born.

Here’s one. You’re standing in a field. You see what you take to be a sheep. So you believe that there is a sheep in the field.

In fact you're looking at a dog disguised as a sheep. But, wait a minute, there is a real sheep out of sight behind a hill. So your belief that there's a sheep in the field is true.

Do you know that there is a sheep in the field? Probably not. 

Another example. Henry is driving around the countryside and sees what he thinks is a barn. It really is a barn. But he doesn’t know that around these parts there are many fake barns, facades put up to make it look as if there are more barns than there are (who knows why?). 

If he had been looking at one of these facades, he would also have had a strong belief that he was looking at a barn. Although his belief that he was looking at a barn was both true and justified (because based on his observation), this doesn’t feel like knowledge because of those other barn facades that could so easily have misled him.

Philosophers are still debating what the implications of these awkward cases are, but for most purposes the justified true belief model of knowledge works well enough. 


I want to turn to a related topic that I haven’t seen philosophers discussing: the nature of disbelief. Disbelief, unlike knowledge, involves a false unjustified belief rather than a true justified one. If you’re in a state of disbelief at the US election results, then at some level you’re still entertaining the false belief that Trump didn’t triumph.

You’re not justified in holding that false belief: there is a weight of evidence that the votes went in his favour. He did triumph. And you know that at some level, too. 

You don’t really believe Kamala Harris won, but you can’t bring yourself to believe that Trump did. You’re dazed in the headlights of reality.

Disbelief is a kind of self-deception motivated by a strong desire for what you know to be false to be true (or vice versa). It’s a form of cognitive dissonance, a special variety of wishful thinking.

In the 1950s, the psychologist Leon Festinger infiltrated a religious cult that predicted the end of the earth by flood on a particular date. When that date came and the prophecy failed (there was no flood), rather than falling apart, the cult became even stronger in its belief in a coming apocalypse. 

That’s a common psychological pattern stemming from a reluctance to give up cherished positions. 

But we should try to avoid being like those cult members. It is time to dispel disbelief and face the grim reality that is unfolding. 

