Epistemology is the branch of philosophy concerned with the study of knowledge. One of the foundational questions of epistemology is "What does it mean to know something?". A common answer is that knowledge is justified true belief. In order to know something, you have to believe it, you have to have a good reason for believing it, and it has to actually be true.
That sounds pretty good, but there are problems with it. It's possible to construct scenarios (the well-known Gettier cases) in which all three conditions are apparently met, but it doesn't seem like anything was "known".
There's also the question of what counts as justification. But I think there's another, more fundamental, problem with this definition. One I don't think I've ever seen anyone else point out.
No definition of knowledge should include the condition of being true.
To do so is to make the term inapplicable in any situation remotely resembling real life. The reason is the answer to another fundamental question of epistemology: "Is it possible to be absolutely certain of something?". The answer is no. (Of course, others disagree, and I should probably write another post explaining why I think that.) All we can do is get more and more evidence for something, getting closer and closer to 100% certainty, but never actually reaching it.
In toy examples, whether a given fact is true or not is simply assumed/given. So if a hypothetical character has a justified belief, we can say whether that character knows it or not, because of our god-like omniscience. But in real life, no one has that omniscience.
Consider the same question, whether a belief is knowledge or not, applied to yourself. Do I know the sun will rise tomorrow, or do I just believe it? Well, I certainly think it's true that the sun will come up. I wouldn't believe it if I didn't think that, tautologically. But if that's the standard, then I ought to consider every belief I have to be knowledge. If I didn't believe they were true, I wouldn't believe them.
And if we apply that standard to other people, then our evaluations of whether someone else knows something or merely believes it simply become a question of whether they agree with us or not.
I propose a simpler definition: Knowledge is belief that is held with a high degree of confidence.
This definition fits very well into a Bayesian framework. Degree of confidence is simply probability. If you believe something is true with a probability greater than, say, 99.9%, you can be said to know it. This also handily deals with the question of what counts as justification - that's just Bayesian evidence.
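To make the idea concrete, here is a minimal sketch of that Bayesian framing in Python. The 99.9% threshold comes from the example above; the specific likelihood values and the `update`/`is_knowledge` names are illustrative assumptions of mine, not part of the argument.

```python
# Illustrative sketch: knowledge as high-confidence belief in Bayesian terms.
# The threshold matches the 99.9% figure used above; the likelihoods below
# are made-up numbers for demonstration.

KNOWLEDGE_THRESHOLD = 0.999  # assumed cutoff for "high degree of confidence"

def update(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Bayes' rule: revise P(belief) after one piece of evidence."""
    numerator = likelihood_if_true * prior
    return numerator / (numerator + likelihood_if_false * (1 - prior))

def is_knowledge(confidence: float) -> bool:
    """A belief counts as knowledge once confidence clears the threshold."""
    return confidence > KNOWLEDGE_THRESHOLD

# Start agnostic about "the sun will rise tomorrow" and accumulate evidence:
# each observed sunrise is much likelier if the belief is true than if not.
p = 0.5
for _ in range(10):
    p = update(p, likelihood_if_true=0.99, likelihood_if_false=0.2)

# Confidence climbs toward 1 but never reaches it, matching the point above
# that absolute certainty is unattainable.
print(p, is_knowledge(p))
```

On this picture, "justification" is just whatever evidence moves the probability: each call to `update` is a piece of Bayesian evidence, and crossing the threshold is what promotes a mere belief to knowledge.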