It seems to me that some of the conversations we are having on this blog could benefit from some distinctions. As such, I want to devote a couple of posts simply to drawing distinctions that I think may prove helpful--although in some cases the distinctions are subtle and hard to make, and so I welcome advice in refining them.
The first distinction I want to make has to do with a pair of contrasting epistemic circumstances that, it seems to me, are often inadequately distinguished. I say this because, as I reflect on what I was doing in Is God a Delusion?, I worry that I was blurring together these contrasting epistemic circumstances myself.
The distinction bears on the relationship between two other contrasting concepts, namely agnosticism and what I like to call fallibilism. These are, if you will, contrasting epistemic attitudes. In roughest terms, to be an agnostic is to withhold belief on a matter, whereas to be a fallibilist is to have a belief but recognize that you could be mistaken, that those who disagree with you could have some or all of the truth, and that it is important to comport yourself accordingly.
What I want to suggest is that these contrasting epistemic attitudes may be correlated with contrasting epistemic circumstances--where agnosticism is (all else being equal) the most fitting response to one while fallibilism (again all else being equal) is the most fitting response to the other. It is the distinction between these underlying epistemic circumstances that I want to try to get at.
The contrasting epistemic circumstances I have in mind are ones that might be faced by reasonable people confronted with a body of evidence. For the sake of sketching out these circumstances, I am going to leave the concepts of “evidence” and “reasonable people” largely unanalyzed. All I will say is that when I speak about "presumptive evidence" below, I mean to use the term in a broad sense, so as to include anything that can be propositionally expressed, where that proposition strikes one as clearly ("evidently") true (in the way that propositions expressing what one's senses are immediately delivering strike one as clearly true), and its truth supports the truth of some other proposition(s). But this definition is itself couched in terms that require unpacking and that might be understood in different ways. In short, I am fully aware that these terms are not uncontroversial, but I am going to sidestep those controversies for the sake of focusing on a different issue, while remaining fully conscious that, in order to adequately address the issues raised here, we will likely need to return eventually to these more basic controversies.
Here, then, are the two epistemic circumstances I want to consider:
Epistemic Circumstance 1 (EC1): You confront a body of presumptive evidence that "reasonable people" (however that is to be understood) generally accept, but you recognize that there are different ways of fitting that evidence into a coherent whole—different "stories" we can tell that fit just as well with the given evidence. In other words, we have certain mutually exclusive holistic ways of seeing the evidence, each of which maps onto the evidence just as well. For simplicity, let us assume there are only two such ways of seeing that fit as well onto the evidence, which we will call Worldviews A and B.
Epistemic Circumstance 2 (EC2): You confront a body of presumptive evidence that reasonable people generally accept, as well as certain further “apparent truths,” that is, things you experience as clearly true/self-evident/obvious/hard to deny/intuitively correct. But some of the people you regard as rational don’t find these apparent truths nearly as apparent as you do, and may instead find other things evident which are hardly evident to you. So, within the total body of “evidence” with which you are confronted, some of it is “shared evidence” whereas some of it is “personal evidence.” Now suppose that, as before, Worldviews A and B both map onto the shared evidence (and are the only worldviews you have so far encountered that do this). But now let us suppose, furthermore, that Worldview A maps well onto the conjunction of the shared evidence and your personal evidence, while B doesn’t (accepting B would force you to abandon things that seem clearly right to you). At the same time, Worldview B maps well onto the conjunction of the shared evidence and what is apparently the personal evidence of reasonable people other than you.
These, in brief, are the two contrasting epistemic circumstances I want to consider. They are not meant to be exhaustive. There may even be a kind of continuum between EC1 and EC2--that is, a range of cases that are a bit like both in one way or another, some more like one and some more like the other. As such, they might be seen as ideal types.
Now if you find yourself in EC1, there is a clear sense in which, from your standpoint, A and B are equally plausible on the evidence. Put another way, there is no reason, on the evidence, for you to favor A over B. If A and B are the only ways of seeing that you've so far encountered that offer a good fit with the evidence, you may have some reason to endorse the disjunctive proposition, "A or B." But the evidence as such favors neither.
This doesn't mean that you won't have pragmatic or personal reasons for operating as if Worldview A is true and not Worldview B. You might find A more hopeful. Or you might like who you are better when you live as if A is true. Or perhaps you’ve grown up with a community that embraces A, and you continue to have a sense of solidarity with that community. Or perhaps you’ve tried to see the world through the lens of B and it just doesn’t sit right with you because of what you identify as mere quirks of personality. Or perhaps it is a combination of these factors. In any event, you recognize these factors as personal and idiosyncratic ones, and you see those who choose to operate as if B is true (or who choose neither, assuming that is pragmatically possible) as being motivated by personal idiosyncratic motives that are no better and no worse than the ones that motivate you.
Put another way, whatever it is that you take to be motivating you to adopt A over B, you don't take that something to be evidence for the truth of A. It is, rather, a practical reason for you to engage in the act of behaving as if A is true, even though you don't think the evidence especially favors A over B. In a real sense, you experience A and B as equally plausible, and you see your own choice of A as nothing more than a story you like to tell yourself. As such, on a theoretic or intellectual level your stance is clearly agnostic (although, on a pragmatic level, you qualify as a kind of pragmatic believer in A).
But now suppose you find yourself in EC2. In this case, in addition to whatever pragmatic and personal reasons might move you towards Worldview A in EC1, you also have the further reason that Worldview A fits with the totality of the evidence available to you in a way that B does not. You have available to you a set of apparently compelling truths that you are striving to account for, and A accounts for them well. B, however, would force you to give up on a subset of the things that seem obviously correct to you. So, in terms of everything that strikes you as evident, A is more defensible than B.
But the very subset of apparent truths you would have to give up were you to accept B is a subset that some people—people who otherwise seem reasonable to you—do not find intuitively obvious in the way you do. And this gives you some reason to attach less weight to the “personal” evidence than to the shared evidence, and so to be less confident than you might otherwise have been about A.
Put simply, looked at purely on its own merits, the personal evidence seems as compelling to you as the shared evidence. But the fact that others do not see the personal evidence in the same way that you do--in addition to being the primary reason why it ends up being put into the "personal" rather than the "shared" category--gives you reason (especially in the face of a general awareness of your own fallibility) to have doubts about the reliability of the personal evidence. But you are also aware that those who don't find the evidence compelling are also fallible--and you have no special reason to think that you are the one who is wrong in this case, rather than them. For all you know, the reason you regard as clearly true what others don't is that you are situated so as to be able to immediately intuit truths that these others can't intuit from where they are situated.
Put in somewhat more technical terms, you have so far not encountered any "defeaters" for your personal evidence--that is, nothing that gives you clear reason to believe either that what seems right to you is false, or that the mechanism whereby it comes to seem right is sufficiently suspect to place no credence in its fruits. All you have, at this point, is the immediate intuitive sense that something is the case, along with the clear awareness that other seemingly reasonable people lack this sense.
In EC1, your reasons for favoring A over B are ones that do not appear to you as evidence for the truth of A, and in this sense are seen by you as nothing but pragmatic reasons to operate as if A is true. But in EC2, your reasons for favoring A over B have the "look and feel" of evidence, that is, they seem to be truths that speak in favor of the truth of A. And this makes your epistemic situation clearly different. It means, among other things, that when you endorse A, it is because A seems right to you in a way that B does not. You favor A over B on the basis of considerations that present themselves to you as evidence for the truth of A and against the truth of B. In this sense, you are not an agnostic on the theoretical level (although you may be a kind of pragmatic agnostic, insofar as, for pragmatic reasons, you choose to operate as if A and B were equally plausible).
But the broader features of EC2 may inspire you to adopt an attitude of fallibilism with respect to your endorsement of A. They might instead inspire you to reassess the reasonableness of those you had previously taken to be reasonable, insofar as they do not accept what strikes you as clearly true. But at least in general, fallibilism seems a better "fit" with EC2. If so, then while A just seems right to you in a way that B does not, you also know that you are fallible, and you know that some of the evidence you are using in arriving at A is not regarded as veridical by other people who otherwise seem eminently reasonable. This fact alone does not make the evidence seem less veridical to you, but it does motivate an attitude of due caution, a willingness to investigate, to hear opposing arguments and be open to being moved by them if they do amount to "defeaters" of your presumptive evidence. And it also makes you resistant to condemning those who endorse B.
In effect, you are inclined to say, "This is an issue about which it seems that reasonable people can disagree; but in my judgment, based on the considerations that seem convincing to me, A seems true and B false. That others reach the opposite judgment calls for respect, but it doesn't as such require that I abandon my judgment. Rather, it only requires that I hold to the judgment fallibilistically."
A few final remarks. It seems clearly possible for one person to be in EC1 and another in EC2 with respect to the same body of shared evidence. That is, Amy may confront the shared evidence without any substantive personal evidence to add, and so may see A and B as equally plausible on the evidence--while Ben confronts the shared evidence with a supplement of personal evidence, on the basis of which Ben sees A as clearly more plausible than B. In such a case, Amy might decide for pragmatic reasons to operate in terms of A--in which case we might say that both are pragmatic believers in A but Ben is also a theoretic believer (whereas Amy is a theoretic agnostic). But there are a number of alternative permutations here that need to be kept in mind--permutations which can generate theoretic accord and practical divergence, etc.
I will confess that this is a first run at my thinking on this issue, and so it doubtless needs considerable refinement. Thoughts?
"The children of God should not have any other country here below but the universe itself, with the totality of all the reasoning creatures it ever has contained, contains, or ever will contain. That is the native city to which we owe our love." --Simone Weil