This is the eighth post in my Theories of Knowledge and Reality series. Follow the link for more on the series and for links to other entries as they appear.
One general sort of response to skepticism is pragmatism. Pragmatists simply accept the skeptic's conclusion. We don't really know many of the things we thought we knew. They disagree with the skeptic on the import of such a conclusion. Skeptics think it's really worrisome if we don't know much, and we shouldn't believe things we don't have good reason to believe. Pragmatists generally think it's perfectly fine to believe certain sorts of things that we don't know and don't even have any rational reason to believe.
There are at least four kinds of pragmatism. All four admit that the skeptic is right in some way but insist that it doesn't matter for how we live our lives. They differ in exactly why it doesn't matter. Someone might hold more than one of these, but any one of the motivations is enough to lead to pragmatism, so I'm listing the motivations separately, since each grounds pragmatism in a different way.
A. Forced Pragmatism: We don’t know things, but we don’t care. We have to live as if we know things, because we have no choice about believing them. David Hume took this line about skeptical worries. We can't really live as if we don't know anything. When someone throws a rock at our head, we're going to duck. We can't just pretend we don't believe the rock is there. That doesn't in itself tell us that we do have knowledge. Hume didn't have any argument to counter those of Descartes. He simply said that we're constituted in such a way that we can't help but believe we know things.
B. "Why Not?" Pragmatism: We might as well believe things, because it beats wandering around and falling into ditches. The idea is that we might as well go with what seems to be true, even if we have no idea of whether it really is true. Descartes thought it to be irrational to go on believing something once you had reason to question it. This variety of pragmatism thinks it's the other way around. If we already find ourselves believing something, there's a good chance that it's the kind of thing we might as well continue believing. William James, on one interpretation anyway, defended this line of thought. He outlined a few requirements for which sorts of beliefs are ok to hold when there's no rational reason to believe. We shouldn't just adopt believes we have no already-existing belief about that have no import whatsoever. I have no idea if the number of stars in the universe is even or odd, and it doesn't matter. It isn't ok just to start believing it's odd (if I could even voluntarily give myself such a belief, which I can't). Beliefs you already have that are important enough and that you need some belief about are generally ok as long as there's no strong evidence to favor one way or the other. There's no strong evidence to favor Matrix hypotheses, Berkeley's idealist world, brain-in-vat scenarios, or anything else of the sort, but all the evidence that we could put forward to favor the common opinion of an external world is just as consistent with any of those scenarios. Therefore, since there's no evidence one way or the other, and we already believe in the external world, we might as well keep believing in it.
C. Moral Pragmatism: We don't know things, but it would be wrong to act as if we didn't know things. The proper behavior for us, given that we seem to be living in a society, is to assume that appearance is true and relate to other people in good ways. Otherwise we might really be doing something wrong. Of course, it might well be that continuing to live as normal with what seem to me to be other people is actually causing absolute torture to the only real beings in the universe besides me, but I have no evidence that even suggests such a thing. I at least have perceptions that seem to be about people, and this view takes that to generate a moral obligation to act as if they're real by meeting what would be my moral obligations if what seems to be true is true.
D. "It Works" Pragmatism: We don't know things, but when we act as if we know things it leads to good results. This may not be the moral reason for continuing to believe despite not knowing very much, but it involves some normative component. It's not that ought to believe. It's that it's good to believe. It leads to the results we want. John Dewey's pragmatism seems to me to fit with this motivation, and sometimes William James would speak this way too. It was this element that led James to include the requirement that a belief must have some practical import for it to be ok to believe it without evidence.
Pragmatists in general think that there's no good answer to the sorts of problems raised by Descartes, at least in terms of knowledge. They agree with his methodological assumption that knowledge requires certainty, but they reject his assumption that without this certain kind of knowledge we should give up believing in things we've always believed in. This is highly unsatisfying to someone who really worries about what Descartes worried about, but many people don't seem to be too bothered by such worries, and they might consider pragmatism an innocent attempt to state what most people believe. That would be a mistake. Pragmatism is actually quite radical. Its defining point is that we can believe something without any grounding or rational motivation, with no truth-related justification or warrant, and that's perfectly fine, with certain caveats. This isn't all that common a belief, and it just takes looking at public discourse to see why.
People don't generally think of wishful thinking as a good reason in any sense to believe in things that have no evidence going for them. We constantly raise questions about why people believe things, as if they should have some support for their claims. Pragmatism moves away from that sort of thing, and the different versions might do so in different ways and to different degrees. There is a common sense sort of notion that it's moving away from. Pragmatism isn't, therefore, the straightforward, common sense view put into philosophical terms. It's a lowering of many people's standards for how we should arrive at our beliefs and why we should continue to hold them.