A problem for rejecting knowledge of lottery propositions

Lots of people find it plausible to deny knowledge in the following type of case. Let there be a fair lottery with n tickets (where n is very high, and where one and only one ticket will win). You receive ticket i and, reasoning that 1/n is so small, you come to believe you will lose. Let's say you are right, and ticket i is a loser. Most people find it eminently reasonable to deny that you know you will lose, for various (seemingly good) reasons.

But if they do, they will also have to deny the following rather plausible principle about the value of knowledge:

Knowledge is Valuable (KIV): A rational person should never prefer to merely truly believe that p, rather than to know that p.*

Now, consider this lottery. Again, there is a lottery with n tickets where one and only one ticket will win. The tickets come with a bar code that encodes whether the ticket won or lost. You receive ticket i.

Now consider the following two situations.  In the first situation, you receive ticket i and you know the value of n.  So you know you have a 1/n chance of winning.  You believe that you will lose, and, in fact, the ticket is a loser.  Do you therefore know that you lose?  Again, most people say no.

In the second situation, you receive ticket i and are given a lottery ticket detector. The detector reads the bar code and outputs “winner” or “loser.” The detector is very, very reliable, indeed more reliable than any other faculty (perceptual or otherwise) you have for gaining knowledge. Let us say that it has a probability p of correctly outputting “winner” or “loser” when reading the bar code, and p is very high (but less than 1-1/n: we are talking about a truly massive lottery here). You don’t know how many tickets are in the lottery. You receive ticket i, your lottery ticket detector outputs “loser,” and you believe that your ticket loses. Do you therefore know that you lose? I find it incredibly implausible that the answer here is no (note that the reasons people standardly give against knowledge in the normal lottery case do not apply here: your belief is (very) sensitive to the truth of what you believe, and so on). At least, I find it very implausible for anyone who isn’t a radical skeptic.

But note that you should find it preferable to be in the first situation instead of the second.  In the first situation, your credence that the ticket loses given that it is an n-ticket fair lottery plausibly should be 1-1/n.  In the second situation, your credence that the ticket loses given a lottery ticket detector reading of “loser” plausibly should be p, which is stipulated to be less than 1-1/n.**   In other words, your rational credence that you lose should be higher in the know-the-number-of-tickets situation than in the lottery-ticket-detector situation.  It certainly seems like you are better informed about your chances of winning in the first situation as opposed to the second, and so you should prefer the first situation to the second.  But if the opponents of knowledge in the normal lottery case are correct, this means you should prefer merely truly believing that you lost to knowing that you lost.  Someone who rejects that you know in the lottery case therefore seems like they have to embrace skepticism or reject KIV.
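The comparison of the two rational credences can be made concrete in a few lines. The particular values of n and p below are my own illustrative choices (the post only stipulates p < 1-1/n); any such pair makes the same point:

```python
# Toy comparison of the rational credences in the two lottery situations.
# n and p are illustrative assumptions; any values with p < 1 - 1/n
# exhibit the same preference ordering.

n = 10**9        # number of tickets in the fair lottery (situation 1)
p = 0.999999     # detector reliability (situation 2); stipulated p < 1 - 1/n

credence_know_n = 1 - 1 / n   # credence that ticket i loses, knowing n
credence_detector = p         # credence that ticket i loses, given "loser"

# The base-rate-derived credence is strictly higher than the detector-derived one.
assert credence_detector < credence_know_n
print(f"knowing n: {credence_know_n}, detector: {credence_detector}")
```

So, on these (assumed) numbers, the merely-truly-believing situation leaves you with the higher rational credence, which is just the preference the argument needs.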

Here is another example: the concerned doctor. The concerned doctor is always worried about each new disease he reads about in the medical literature. To his horror, he reads about a newly discovered disease X. It seems that people always catch X before they are 30, and never show any (macro) symptoms whatsoever until they are 40. But X is quite painful, and the doctor, who is 35, is therefore worried. Now consider the following two situations:

(1) The doctor reads in the article describing the disease that (amazingly) the researchers have also conclusively demonstrated that 1/n people between the ages of 30 and 40 have this disease. The value of 1/n is vanishingly small, so the doctor is relieved, believes he does not have the disease, and (as a matter of fact) he doesn’t.

(2) The doctor reads in the article that there is a test for this disease that tells you whether you have the disease with probability p (and, again, p is very high).  The doctor orders up the test, administers it to himself, and it comes up negative.  The doctor is relieved, believes he does not have the disease, and (as a matter of fact)  he doesn’t.

Now, similarly to before, let 1-1/n be greater than p. It seems that anyone who rejects knowledge in lottery cases will also deny that the doctor knows he is disease free in (1), while granting that, if anyone knows anything, the doctor in (2) knows he is disease free. But shouldn’t the doctor be more relieved in (1) than in (2), since 1-1/n is higher than p? It seems that the doctor has more reason to think he is disease free in (1) than in (2) and hence should prefer (1) to (2). But, of course, this again violates KIV. The lesson here, it appears, is that sometimes it is more useful to know the base rate (alone) than it is to know the result of a test (alone).

Of course, in both cases we should prefer to know both the number of lottery tickets/base rate for the disease as well as the detector result/test result. But if we aren’t in that position, it seems like we should prefer the lottery-ticket/base-rate-derived belief over the detector/test-result belief. So, again, we get a failure of KIV. Maybe that isn’t that big a deal: maybe we have other reasons to reject KIV anyway. But it is hard to see how a weaker version of KIV could be formulated that would avoid the above considerations. If so, then maybe we should be wondering just what is so special about knowledge after all (i.e., whether knowledge is valuable intrinsically, rather than just derivatively), or maybe we need to give up the (fairly reasonable) claim that we don’t know in lottery-type cases.
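Why is having both pieces of information best? Bayes' theorem makes this precise. Here is a minimal sketch for the doctor case, modeling the test (as an assumption of mine, not something the post specifies) as reporting the true status with probability p either way; the particular n and p are also illustrative:

```python
# Sketch: combining the base rate with the test result via Bayes' theorem.
# n and p are illustrative assumptions satisfying 1 - 1/n > p, as stipulated.

n = 1_000_000    # 1/n people aged 30-40 have disease X
p = 0.999        # test reliability; note 1 - 1/n > p

prior = 1 / n    # P(disease), i.e. the base rate

# P(negative | disease) = 1 - p, and P(negative | no disease) = p.
p_negative = (1 - p) * prior + p * (1 - prior)
posterior = (1 - p) * prior / p_negative   # P(disease | negative test)

credence_disease_free = 1 - posterior
# Combining both beats either alone: it exceeds both 1 - 1/n and p.
assert credence_disease_free > 1 - 1/n > p
print(credence_disease_free)
```

On these numbers, the posterior credence that the doctor is disease free given both the base rate and a negative test exceeds what either piece of evidence supports on its own, which is why knowing both is the preferable position.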

*Now, this may be too strong and so not that plausible. Potential counterexamples might involve situations where your knowledge would undermine other inferences you should make, while merely truly believing the same thing wouldn’t undermine them. If so, you can revise the Knowledge is Valuable premise with the proviso that it only applies when your degree of belief when you know is no higher than your degree of belief when you merely truly believe. I think that weakening takes care of any counterexamples I can think of to KIV, and the lottery/detector example still works.

**I say “plausibly” because it may be that we shouldn’t have credences in these cases. A credence of 1-1/n for the claim that we lose seems to follow from the type of principle of indifference defended by Adam Elga in his Sleeping Beauty paper and Nick Bostrom in his Simulation paper. But principles of indifference are often pretty flaky, so maybe we shouldn’t have a credence of 1-1/n. Similarly, maybe we shouldn’t have a credence in the detector case because we don’t know the base rate (the number of tickets) and so can’t really compute the conditional probability of “ticket i loses given that the detector reads ‘loser.’” I’m not sure what I think about either of these objections, but even if we don’t have a proper probability we can compute, there is still an intuitive sense in which knowing the number of tickets is better than knowing the detector result.

