Sunday, January 29, 2012

Reality is objective. What's true is true for everyone, regardless of what people believe.
You'd think this would be obvious. And yet, it comes up, particularly in debates about the supernatural. People will say things like "Well, maybe ghosts don't exist for you, but they do exist for me." I've never seen anyone apply this sort of reasoning to something mundane, like, say, crossing the street. No one ever says "That bus might exist for you, but it doesn't exist for me." People know that either there is a bus there or there isn't, regardless of what they believe, and if they don't want to get hit by it, they ought to have beliefs that match objective reality.
I think the idea that reality is different for different people might come from the assumption that our senses are infallible. If I see something, that means it's there. If I see something and someone else sees something different, well, our realities must be different. But our senses aren't infallible; we frequently see or hear things that aren't there, or fail to see or hear things that are.
It seems to me much easier to explain the occasional discrepancies in our perceptions under an objective reality than to explain the ubiquitous consistency of our perceptions under a subjective one. How often do people disagree about whether it's day or night, whether they're inside or outside, whether they're sitting or standing, whether there's a bus in the road or not? If reality were subjective, why would our perceptions agree so consistently?
Wednesday, January 4, 2012
Justified True Belief?
Epistemology is the branch of philosophy concerned with the study of knowledge. One of its foundational questions is "What does it mean to know something?" A common answer is that knowledge is justified true belief: in order to know something, you have to believe it, you have to have a good reason for believing it, and it has to actually be true.
That sounds pretty good, but there are problems with it. It's possible to construct scenarios in which all three conditions are apparently met, yet it doesn't seem like anything was "known". (Scenarios of this kind are known as Gettier problems; a classic example is glancing at a stopped clock that happens to display the correct time. Your belief about the time is justified and true, but intuitively you don't know it.)
There's also the question of what counts as justification. But I think there's another, more fundamental problem with this definition, one I don't think I've ever seen anyone else point out.
No definition of knowledge should include the condition of being true.
To do so is to make the term inapplicable in any situation remotely resembling real life. The reason is the answer to another fundamental question of epistemology: "Is it possible to be absolutely certain of something?" The answer is no. (Of course, others disagree; I should probably write another post explaining why I think that.) All we can do is gather more and more evidence for something, getting closer and closer to 100% certainty without ever actually reaching it.
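To put that in the Bayesian terms I'll lean on below (my formalization; the argument above doesn't spell this out): a piece of evidence E updates belief in a hypothesis H via Bayes' theorem,

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}$$

This posterior is strictly less than 1 whenever P(E|¬H)P(¬H) > 0, that is, whenever there is any alternative hypothesis under which the evidence could still have occurred. Each observation can push the probability up, but no finite amount of evidence drives it all the way to 1.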
In toy examples, whether a given fact is true is simply stipulated. So if a hypothetical character has a justified belief, we can say whether that character knows it or not, because we have god-like omniscience over the scenario. But in real life, no one has that omniscience.
Consider the same question - is a belief knowledge or not? - applied to yourself. Do I know the sun will rise tomorrow, or do I just believe it? Well, I certainly think it's true that the sun will come up; I wouldn't believe it if I didn't think that, tautologically. But if that's the standard, then I ought to consider every belief I have to be knowledge. If I didn't believe they were true, I wouldn't believe them.
And if we apply that standard to other people, then our evaluation of whether someone else knows something or merely believes it simply becomes a question of whether they agree with us.
I propose a simpler definition: Knowledge is belief that is held with a high degree of confidence.
This definition fits very well into a Bayesian framework. Degree of confidence is simply probability. If you believe something is true with a probability greater than, say, 99.9%, you can be said to know it. This also handily deals with the question of what counts as justification - that's just Bayesian evidence.
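Here's a toy sketch of what that looks like. This is just my illustration: the update function, the 99.9% cutoff, and the 10-to-1 likelihood ratio are made-up example numbers, not anything canonical.

```python
# A toy sketch of "knowledge as high-confidence belief" in a Bayesian frame.
# Everything here (the function, the threshold, the numbers) is illustrative.

def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """One application of Bayes' theorem: P(H | E) from P(H) and the likelihoods."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

KNOWLEDGE_THRESHOLD = 0.999  # the (arbitrary) 99.9% cutoff from above

belief = 0.5  # start maximally uncertain
for n in range(1, 6):
    # each observation is 10x as likely if the hypothesis is true
    belief = update(belief, p_e_given_h=0.9, p_e_given_not_h=0.09)
    print(f"after observation {n}: P(H) = {belief:.6f}")

print("counts as knowledge?", belief > KNOWLEDGE_THRESHOLD)
```

Note that the posterior climbs toward 1 (crossing the threshold on the third observation) but never reaches it, which is exactly the earlier point about certainty; "knowing" here just means clearing the threshold.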