This is Philosophy Bites with me, David Edmonds. And me, Nigel Warburton. Philosophy Bites is available at www.philosophybites.com. Philosophy Bites is made in association with the Institute of Philosophy. The fact that different people have different beliefs about the world is kind of important: that person A believes in Christianity and person B in Islam. Wars are fought over beliefs, politics is divided by them. Disagreement about beliefs can break up friendships and marriages. If people shared more beliefs, the world would be calmer, more peaceful, if also, no doubt, a little duller. So how can people with different beliefs approach consensus about whose beliefs are right and whose wrong?
Jonathan Glover believes he has at least a partial answer. Jonathan Glover, welcome to Philosophy Bites. Thank you. We're going to be talking about systems of belief. Can we just begin by saying what that topic is and why it's of philosophical concern? Well, what I'm struck by is that individual people often have very different beliefs. Some people are religious, some are not. Some are socialists, some are capitalists, some are liberals, some are conservatives.
And also, societies quite often have dominant systems of belief, and sometimes those societies come into conflict with each other: wars are fought over religion or political ideology. I am struck by what happens when people have these belief systems.
This is really what philosophy has been about more or less since Socrates. It's about thinking about whether a belief or a set of beliefs can be shown to be more reasonable or to be true as against other beliefs. And I'm really struck by the fact that although philosophers have been talking about this for thousands of years, when wars are about to break out because of religious or political ideological conflicts,
Philosophers normally don't seem to have anything to say, and I feel they should.
So there are these conflicting belief systems, and people within a belief system characteristically don't see anything wrong with their belief system. And then there's the history of philosophy, which has been all about what it is reasonable to believe. But surely philosophy must have had some impact historically. Well, obviously philosophers quite often have had an influence. People say things like Rousseau had an influence on the French Revolution or
Mill had an influence in the development of British liberalism. People say that sort of thing, and I'm sure it's sometimes true. But it's usually been political philosophy or ethics that have had the public influence. It hasn't been epistemology or the theory of knowledge, the central part of philosophy, which is about what it's reasonable to believe, what we have grounds for thinking to be true.
So if what you're saying is right, now's the time for philosophers to give some kind of insight into the ways in which people form their beliefs, which ways are reasonable and which are not. So how should we change our thinking about these sorts of issues? Well, what I'd like to do, if I may, is approach that by talking about how belief systems actually usually work.
And I want to take a rather simple example to bring out what I consider to be the holistic nature of belief systems. If I go to the doctor and the doctor gives me a prescription, I will expect that the prescription will put right whatever medical problem it is I have. Usually that happens, I'm glad to say. But if it doesn't, something has gone wrong. I have this confident belief, this prediction, and so something in my belief system has gone wrong.
And there is a model, particularly in philosophy of science, particularly under the influence of Karl Popper, where falsification is the key thing. You have some hypothesis, you test it, the prediction turns out to be wrong, and so the hypothesis is falsified. This has been enormously influential in thinking about science and in thinking about rational beliefs about the world generally. But I want to say that in one important respect, this is a simplification.
Because we have a choice about what bit of our belief system to give up when something goes wrong and a prediction is falsified. So that when the medicine doesn't work, I can say, well, the doctor isn't as good as I thought he was. But it may not be that. Perhaps he gave a perfectly good prescription. Maybe the pharmacist didn't make it up properly.
Or again, maybe the pharmacist is perfectly all right. Maybe what's gone wrong is that I am a biochemical freak in some way, so the medicine that works for most people doesn't work for me. Or I can go back more generally. I can say maybe it isn't just this doctor that's gone wrong. Maybe it's the whole system of Western medicine, and I ought to go for some alternative medicine instead.
Or, even more fundamentally and generally, I may say perhaps it's the whole system of Western empirical science that's at fault, and that ought to be given up. I have to make some change when something's gone wrong in my system of beliefs, but I have a large number of choices of varying scope about what to give up.
So what you're saying is that we don't hold beliefs in isolation, they're all interconnected. And when one of our beliefs is apparently falsified, it doesn't mean that that's the belief we have to give up straightforwardly. There are a whole range of other nested beliefs that we might give up. Then the question is, on what basis do we decide which ones to hold on to and which ones to give up?
You put the thing so much better than I did. I agree, we do have this choice, and indeed there are large questions about which beliefs we should give up. But I want to talk about the way in which this flexibility, in systems that are holistic, where beliefs are tested in families rather than individually, gives people the opportunity, if they want, to defend any belief they like.
A very striking real example is Philip Gosse, who in the 19th century was very interested in the fossil evidence that is normally interpreted as showing that evolution is at least highly plausible. And Philip Gosse, as a fundamentalist religious believer, didn't want to change his belief in the Genesis account of the origin of life.
He said, well, what we have to say here is that God decided to test our faith. He planted evidence looking as though evolution had occurred in order to see who were the really true believers in the Genesis story. Now, most of us think that's absurd, but it shows just how far you can go. Presumably, these sorts of beliefs don't just occur within a scientific context, but also within a political one too, for instance.
Oh, certainly. The example which I think is a clear case of this is the case of the British Communist Party in 1939. Many of the people in the British Communist Party in the 1930s had actually joined up, not necessarily because they thought Karl Marx was right about economics or believed in the common ownership of everything or whatever it might be,
Many of them had joined because, for quite a bit of the time in the 1930s, the Communist Party looked like the most serious opponent of Hitler.
When the Hitler-Stalin pact took place, Stalin then sent orders to the British Communist Party, which were conveyed to the Central Committee, who had a meeting. His orders were these. You have to stop opposing Hitler, because Hitler's now my ally. You have to say this is an imperialist war on both sides. And you have to be prepared, if necessary, to undermine your own country's war effort against Hitler.
This presented a huge dilemma to the Central Committee. As it turned out, one or two people on the Central Committee simply flip-flopped: these are the orders, we just change the line. Other people said, we can't do that; our whole raison d'être has been about being anti-Nazi. Eventually they all came round, but the record of the debate shows the writhing that took place.
You take one thing as your fixed point, and for many of them, the fixed point was the Soviet Union is always right. So you have to adjust and adapt other parts of your belief system in order to defend that belief. Some of them said, really there's not much to choose between the Western democracies, Britain and France, and Nazi Germany.
Another thing which would make it easier to support the new line would be if Nazi Germany was not a serious aggressive threat and Britain and France were much more responsible for the war.
And so they started running that line. Some of them said, the plain fact is not the strength of Germany, but its weakness; Germany is desperately seeking peace. Now, in the period from 1936 onwards, Hitler had remilitarised the Rhineland, taken over Austria, bullied Britain and France into handing over parts of Czechoslovakia to him in the Munich Agreement, and then he'd invaded Poland.
It doesn't look like somebody desperately seeking peace, but it shows the extent to which preserving the fixed point requires distortion of the other beliefs in your system.
So what moral should we take from this? We have a tendency to believe things as part of a general holistic worldview and we can adapt any of our beliefs. We can choose which ones we adapt when confronted with difficult evidence. And also that we have these psychological propensities to cling to certain core beliefs. Surely the situation is hopeless.
Well, perhaps it's hopeless, but I rather hope it isn't. Your belief system is something you've perhaps partly constructed yourself, but very largely inherited from your family, the whole society in which you live, the television, the newspapers, the church, your school teachers, and so on. Once you become aware of this, it is likely to lead the more reflective into philosophy.
The distinguished American philosopher Allen Buchanan was brought up in the Deep South of the United States, by lots of authority figures who unquestioningly accepted a racist ideology. As a white person living in the Deep South, he was taught a whole series of falsehoods about black people and, as a result, as a young person held a racist ideology. And it was only as he started to question this
that he came to think about it and ask for evidence and so on. He came to see that his authority figures were questionable and that he'd have to think for himself. He now has utterly different beliefs from the ones he grew up with. Philosophy, in one way, is a natural thing for human beings. Many, many people at the age of roughly 12 to 15 start to realise that their inherited belief system is contingent. People start to think...
Here I am holding this version of Christianity or this version of liberalism or whatever it is. If I lived in China, I'd have a completely different set of beliefs. If I'd lived 400 years ago, I'd have a completely different set of beliefs. Once you start asking these questions, whether you know it or not, you're doing philosophy. You're asking what it's reasonable to believe.
So when you've reached that moment of sudden awareness that what you believe fundamentally may just be there on the basis of what your parents believed and the other people around you happen to believe, the contingency of it all, what do you do next?
Well, there's been one very powerful tradition in philosophy. Descartes quite obviously had this idea. He talked about needing to rebuild his beliefs on what he called secure foundations. It was like rebuilding a house. You live in a house and you discover it's got rotten foundations. You need to rebuild it. And the first thing you do is you have to have certainty, solid truths as the foundations for your system of beliefs.
So he looked for axioms. He was a mathematician, interested in geometry and so on, and so he looked for the equivalent of the axioms in Euclid: the things which you know are so obviously true that you can then derive other things from them. Now, the problem with that in philosophy is that it's not at all clear that there are any certain axioms that nobody could possibly doubt.
And as a result, his project, while it had many merits, is widely held not actually to have succeeded. In fact, one of the striking things about Descartes' project is that he starts off with the characteristic beliefs of a 17th-century Frenchman. He then junks a lot, rebuilds the system, and somehow the system looks extraordinarily like the previous set of beliefs of a 17th-century Frenchman. An alternative model...
put forward by the Austrian philosopher Otto Neurath in the 1930s, is that you can't start from scratch completely. Neurath says the plausible way of looking at what we're doing is not like rebuilding a house. It's like rebuilding a boat, which we happen to be afloat in at sea.
Maybe the whole thing needs rebuilding, but at any point, inevitably, you have to keep enough afloat for you to do the rebuilding. It's an interesting image, that, because there's a sense in which you have to keep it afloat to survive, and discovering truth about the world is part of what we need to do to exist as human beings. If we're fundamentally wrong about the way the world is, we probably won't last very long.
That's a very powerful thought. A number of philosophers since about the mid-20th century have said we can't just carry on rebuilding our belief systems using philosophy alone. We have to take, for instance, the evidence of science. The evidence of science is normally taken to support the theory of evolution. For instance, when philosophers ask, can we trust our senses? How do our senses give us correct information about the world?
A species whose senses systematically misled it about the world wouldn't survive very long. It would be eaten by some rival species with a better set of sense organs.
Now, that's, I think, a pretty powerful argument, but there's still a deep question as to whether that isn't circular. What you're doing is defending the senses, which are the basis of a lot of our scientific knowledge, by appealing to a theory in science that says we have good reason to expect them to work well. But you're kind of justifying the senses by science and justifying science by the senses. And some people would say that's a circularity.
Thinking about the ideological beliefs, the sorts of things which are shaping the century we live in, most of the dramatic conflicts arise from people with what seem to be completely unrevisable beliefs. Their core beliefs are not up for grabs.
I'm not offering some magic wand which will just wave away people's dogmatism. It's pretty clear that quite a lot of people are relatively unreachable. What I'm suggesting is that people should engage in dialogue and that the dialogue will consist in people carrying out the Socratic program of asking each other, spell out precisely what you think. Do you mean this or do you mean that?
What's the basis for this? What's the argument? What's the evidence? Try to tease out the assumptions. The first move is that people who are willing to engage in this will understand each other's beliefs better. They'll at least see what the differences are. Then it'll turn out, perhaps, that there aren't just differences of content, but the differences of content are defended by differences of intellectual strategy.
For instance, a creationist and an evolutionist, it may well be that there's an underlying difference, which is that for the creationist, the fundamental principle for getting at the truth is consult the Bible, whereas for the scientist it's consult the evidence of scientific investigation. At least they'll understand. They've got a bit deeper. They've discovered this deep epistemological difference.
And then you can start asking, what's the defence of it? What's the reason why you think your epistemology is better than the other person's? I'm struck, just as an ordinary humdrum teacher of philosophy, by what happens when students walk into classes with their beliefs. If I disagree with them, they just think I'm doing my job, just doing it to annoy. But when they disagree with each other, they think: wow, somebody I really respect, a fellow student, holds a totally different view from me. So then you get them to talk to each other about what their basis is. And what I find is that they don't normally persuade each other, but they come to hold their beliefs in a different way. People are not quite so confident.
Recently I've been teaching a class in global ethics. We have people who hold fairly traditional Muslim views as well as fairly traditional Christian views and also people who hold the views of scientific empiricism. And at first I start asking the people who hold the religious fundamentalist views how they can justify this and what reason they can give to persuade somebody who doesn't already hold their belief why they should hold it.
and they often start to flounder a bit. And the ones who believe in scientific empiricism look rather pleased and feel things are going rather well. But then I turn to them and say: now, you believe in the scientific method. Why should we think that evidence from the past gives good evidence about what the world will be like in the future? Why exactly is it that we know our senses are reliable and give us an accurate picture of the world?
And I quite enjoy seeing them get a bit discomfited too. So what you're saying is that philosophy can teach us to be a bit more humble. My hope is that we actually may make some progress towards agreement on things, that we might move towards a consensus about what's a reasonable intellectual strategy, by agreeing on what sorts of beliefs are plausible and what aren't in certain contexts and extrapolating from that.
We may not get there. If we don't, at least I think we might get, as it were, a map of what the different belief systems are, so that we have more understanding and are less likely to fight each other. Take a very simple political disagreement: the conflict about private education. Equality of opportunity says it's not fair for some children to have advantages paid for by parental wealth.
On the other hand, liberty says it's an outrage to prevent parents spending their money on giving a good education to their children. Among people I know, quite a few support one side, quite a few the other. But it's not very difficult, on that issue,
to get agreement on what the nature of the disagreement is. And on the whole, since most of us are a bit sympathetic to both liberty and equality, we're not going to go to war about it, because most of us can understand the other person's point of view. My hope is that something like that might be generalised, so that even where we don't get agreement, we might get some kind of mutual understanding and mutual respect. Jonathan Glover, thank you very much.
Thank you. There's now a Philosophy Bites book published by Oxford University Press. For more information, go to www.philosophybites.com. For more information about the Institute, go to www.philosophy.sas.ac.uk.