Edited by
Vincent F. Hendricks
Duncan H. Pritchard

December 2007, Palgrave Macmillan

ISBN 0 7546 5335 8

Advance Praise


This book provides a valuable look at the work of up-and-coming epistemologists. The topics covered range from the central issues of mainstream epistemology to the more formal issues in epistemic logic and confirmation theory. This book should be read by anyone interested in seeing where epistemology is currently focused and where it is heading.
- Stewart Cohen, Arizona State University

These are the most exciting times in epistemology in the last 40 years. This group of essays indicates why: there are new developments in formal epistemology, new connections between formal and traditional work, new developments on epistemic paradoxes, and new value-driven approaches. Add to these further discussions of standard topics such as contextualism, coherentism, epistemic luck, externalism, deontologism, and naturalism, to name a few, and the result is a gem of a volume that is a “must read” for every epistemologist and any philosopher wishing to keep abreast of the current issues in epistemology.
- Jonathan Kvanvig, Baylor University

This volume is an excellent collection of new essays on core issues in epistemology, including new work on skepticism, contextualism, coherence, and epistemic deontology. The set of authors comprises a nice mix of new voices and experienced contributors who have already left their mark in the field. This promising book is bound to advance discussion and receive a good deal of attention in the field.
- Matthias Steup, St. Cloud State University


In the past thirty years epistemology has been one of the fastest-moving disciplines in philosophy. This rapid advancement is due partly to the fact that various schools and movements inside epistemology have developed different answers to classical epistemological problems, and partly to the fact that formal methods from logic, probability theory and computability have been applied to many of the same issues and used for applications outside traditional epistemology. New Waves in Epistemology reflects this fast development. The goal of this compilation is to let up-and-coming scholars both describe the current trends in mainstream and formal epistemology and discuss the prospects of epistemology in the decades to come.

Contributing Authors

Michael Bergmann / Reidean Externalism

What distinguishes Reidean externalism from other versions of externalism is its commonsensism and its proper functionalism, both of which are inspired by the 18th-century Scottish philosopher Thomas Reid. In this paper I will explain and defend both of these aspects of my Reidean position while also defending externalism. I will also discuss its ramifications for internalism and scepticism as well as for the “have-your-cake-and-eat-it-too” responses to scepticism proposed by contrastivists, contextualists and deniers of closure.

Tim Black / Defending a Sensitive Neo-Moorean Invariantism

I defend a sensitive neo-Moorean invariantism, an epistemological account with the following characteristic features: (a) it reserves a place for a sensitivity condition on knowledge, according to which, very roughly, S’s belief that p counts as knowledge only if S wouldn’t believe that p if p were false; (b) it maintains that the standards for knowledge are comparatively low; and (c) it maintains that the standards for knowledge are invariant (i.e., that they vary neither with the context of the subject of knowledge nor with the context of the attributor of knowledge). I argue that this sort of account allows us to respond adequately to some difficult puzzles in epistemology, puzzles such as sceptical puzzles and lottery puzzles, as well as puzzles that inspire epistemological contextualism. I also maintain that by utilizing what Keith DeRose calls a warranted assertibility manœuvre, sensitive neo-Moorean invariantism can account for our epistemic judgments in each of these puzzle cases.
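The sensitivity condition in (a) is commonly rendered with a subjunctive conditional; the following is a standard gloss in Nozick-style notation (offered here as background, not taken from the paper itself):

```latex
% Sensitivity: S's belief that p counts as knowledge only if,
% were p false, S would not believe that p.
K_S\,p \;\rightarrow\; \bigl(\lnot p \mathrel{\Box\!\!\rightarrow} \lnot B_S\,p\bigr)
% where \Box\!\!\rightarrow denotes the subjunctive (counterfactual)
% conditional, K_S knowledge, and B_S belief, for subject S.
```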

Boudewijn de Bruin / Epistemic Logic and Epistemology

This paper provides a critical survey of the contemporary contributions of epistemic logic to epistemology, and sketches future challenges. I will begin by offering a self-contained introduction to contemporary epistemic logic, including an examination of some of its key applications in epistemology and a review of some of the traditional criticisms. This sets the stage for a discussion of more recent formal theories in dynamic logic and learning theory, and, again, an evaluation of their key applications to epistemology. In the second part I will discuss some questions in contemporary epistemology and evaluate how epistemic logic could contribute to our understanding and treatment of these problems. Among others, I will discuss non-propositional knowledge, the difference (or not) between ‘knowing that’ and ‘knowing how’, Williamson’s conception of ‘know’ as a factive mental state operator, and connections to the philosophy of science.
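For orientation, the contemporary epistemic logic that such a survey starts from treats knowledge as a normal modal operator; the core textbook principles (assumed here as background, not drawn from the chapter) are:

```latex
% Knowledge K_a as a normal modal operator, for agent a:
\text{(K)}\colon\; K_a(p \rightarrow q) \rightarrow (K_a p \rightarrow K_a q)
  \quad\text{(closure under known implication)}
\text{(T)}\colon\; K_a p \rightarrow p
  \quad\text{(factivity: what is known is true)}
\text{(4)}\colon\; K_a p \rightarrow K_a K_a p
  \quad\text{(positive introspection)}
```

Closure under (K), together with the necessitation rule, is the source of the logical-omniscience worry that traditional critics of epistemic logic press.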

Troy Catterson / Hintikkan Epistemology

In this paper I chart the development, and gauge the prospects of, the turn from the conceptual analysis of knowledge to the formulation of truth conditions for knowledge claims, which came into its own with Jaakko Hintikka’s formulation of a possible worlds semantics for epistemic logic. I shall argue that although such a trajectory avoids many of the common problems that plague traditional accounts of knowledge, such as Gettier’s paradox and scepticism, it does so only at the cost of giving rise to new problems, like logical omniscience and impossible possible worlds. I shall also argue that these problems turn out to be the semantic correlates to the old ones.

Paul Egré / Williamsonian Epistemology

Knowledge, as is notorious since Gettier, is more than justified true belief. A strong and widely shared intuition, on the other hand, is that knowledge is a form of robust or reliable belief. A problem is to find an adequate formalization for this notion of reliability at the logical level. In ordinary epistemic logics, for instance, knowledge is characterized simply as a form of factive belief (knowing that p implies that p is true), but factivity itself can be seen as a consequence of a more general principle of reliability. As Williamson writes (2000, 100): “if one believes p truly in case α, one must avoid false belief in other cases sufficiently similar to α in order to count as reliable enough to know p in α”. In Williamson’s analysis of knowledge, the notion of reliability is cashed out in terms of certain margin for error principles, which somehow generalize the idea that knowledge is factive: not only does knowing p require that p be true actually, but also that p be true in neighbouring cases. Thus for Williamson (2000, 17), “where one has only a limited capacity to discriminate between cases in which p is true and cases in which p is false, knowledge requires a margin for error: cases in which one is in a position to know p must not be too close to cases in which p is false”. In this paper, I investigate the logic of reliability outlined by Williamson, and discuss certain epistemological consequences of the margin for error principles on the unity of knowledge. In the first part of the paper, I present several formulations of the margin for error principle in a modal framework, and examine to what extent they provide an adequate analysis of the notion of reliability for knowledge. Margin for error principles originate in Williamson’s analysis of inexact knowledge, and more specifically of knowledge involving vague predicates. Thus, in Knowledge and its Limits, visual knowledge is taken as a paradigm case of knowledge involving a margin for error. 
Williamson, however, does not make it clear whether margin for error principles hold of all forms of knowledge alike, in particular of knowledge acquired non-perceptually. Despite this, Williamson has resorted to margin for error principles to argue against the idea that knowledge in general is luminous, namely that it obeys the principle of positive introspection. In the second part of the paper, I present a detailed criticism of his arguments, and examine more specifically the interplay of margin for error principles with iterative principles for knowledge. Williamson has propounded a series of epistemic puzzles to show that a reflexive form of the margin for error principle, in combination with further epistemic axioms, is inconsistent with the principle of positive introspection. Gomez-Torrente (1997) and Graff (2002) have argued in the opposite direction that iterative versions of the margin for error principle may not all be true if at least some propositions are luminous (are known, and satisfy the schema of positive introspection). A different line of criticism, based on recent work (Dokic & Egré, 2004), consists in showing that Williamson’s arguments rest on the inappropriate identification of distinct forms of knowledge, namely perceptual and non-perceptual knowledge, which need not bring about the same margins for error. The margin for error principle and the principle of positive introspection can coexist, provided each is restricted in its application to the right sort of knowledge. Correlatively, as I will
show, consistent systems of epistemic logic with distinct knowledge modalities can be defined in which those principles are integrated. The picture of knowledge that emerges is not only richer but also epistemologically more plausible.
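Williamson's quoted margin-for-error requirement admits a simple semantic rendering; the sketch below is a gloss on the passages quoted above (the similarity metric d and threshold ε are illustrative assumptions, not Egré's own notation):

```latex
% Factivity as the limiting case of reliability:
K p \rightarrow p
% Margin for error: knowing p in case \alpha requires the truth of p
% in every case sufficiently similar to \alpha:
\alpha \models K p \;\Longrightarrow\;
  \forall \beta \,\bigl(\, d(\alpha,\beta) \le \varepsilon
  \;\Rightarrow\; \beta \models p \,\bigr)
```

Factivity is then recovered as the special case β = α.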

Jeffrey Helzner / Conditions for Levi’s Concept of Informational Value

Isaac Levi begins his classic ‘The Enterprise of Knowledge’ with a provocative expression of his doubts concerning mainstream epistemology’s preoccupation with the “pedigree of knowledge”. In light of these doubts, Levi has offered a programme that places questions concerning the modification of epistemic states at the centre of epistemology. In Levi’s programme these questions concerning the modification of epistemic states are treated decision-theoretically. One such question involves contractions of an agent’s epistemic state. Assuming that each epistemic state can be represented as a deductively closed set of sentences, an agent that must remove a specific sentence from its current epistemic state faces a decision over a set of possible contractions. Levi (along with Arlo-Costa in a recent paper) has argued that the agent faced with such a problem should select a contraction that minimizes the loss of “informational value”. Little has been said, however, about what conditions should govern the indicated concept of informational value. The purpose of this paper is to examine such conditions.
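In standard belief-revision notation (assumed here as background; the paper itself need not use it), the decision problem Helzner describes can be stated schematically:

```latex
% K: the current epistemic state, a deductively closed set of sentences.
% To remove a sentence p, the agent chooses among the admissible
% contractions \mathcal{C}(K, p), i.e. deductively closed K' \subseteq K
% with p \notin K'. Levi's recommendation: select a contraction that
% minimizes the loss of informational value V:
K \div p \;\in\; \operatorname*{arg\,min}_{K' \in \mathcal{C}(K,\,p)}
  \bigl[\, V(K) - V(K') \,\bigr]
```

The conditions the paper examines are constraints on the function V in this schema.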

Franz Huber / Bayesianism and Scientific Inquiry

The problem addressed in this paper is “the explication of how we compare and evaluate theories [...] in the light of the available evidence”, which, according to van Fraassen, is “the main epistemic problem concerning science”. The first part presents the general, i.e., paradigm-independent, loveliness-likeliness theory of theory assessment. In a nutshell, the message is (1) that there are two values a theory should exhibit: informativeness (loveliness) and truth (likeliness), measured respectively by a strength indicator (loveliness measure) and a truth indicator (likeliness measure); (2) that these two values are conflicting in the sense that the former is an increasing and the latter a decreasing function of the logical strength of the theory to be assessed; and (3) that in assessing a given theory one should weigh these two conflicting aspects against each other in such a way that any surplus in loveliness succeeds, if only the difference in likeliness is small enough. Particular accounts of this general theory arise by inserting particular strength indicators and truth indicators. The theory is spelt out for the Bayesian paradigm; it is then compared with standard (incremental) Bayesian confirmation theory. We close by discussing whether it is likely to be lovely. The second part discusses the question of justification that any theory of theory assessment has to face: why should one stick to theories given high assessment values rather than to any other theories? The answer given by the Bayesian version of the account presented in the first part is that one should stick to theories given high assessment values because, in the medium run (after finitely many steps without necessarily halting), theory assessment almost surely takes one to the most informative among all true theories when presented with separating data. The comparison between the present account and standard (incremental) Bayesian confirmation theory is continued.

Ram Neta / Evidential Contextualism and the Methodology of Epistemology

For the past 35 years, Anglo-American epistemology has been divided into two camps. There are epistemologists who try to figure out a priori the principles that determine the epistemic status of particular doxastic states. And there are epistemologists who try to figure this out on the basis of empirical psychological evidence. Each of these two sides accuses the other of being unable to argue for normative conclusions on the basis of the data to which they restrict themselves. I claim that both parties to this dispute are correct in their critical remarks about the other: the normative results that epistemology should deliver can come neither from a priori intuitions nor from psychological evidence.
Rather, I argue, epistemologists should start with the question “what purpose(s) should our practice of epistemic appraisal serve?” We can then figure out how our practice of epistemic appraisal should be fashioned. I argue that a well-fashioned practice of epistemic appraisal will involve the employment of terms of epistemic appraisal that are semantically context-sensitive, but in a way that is typically hidden from their users. Sceptical puzzles will inevitably arise from our unwitting exploitation of such semantic context-sensitivity. And so, I argue, sceptical puzzles are the inevitable but modest cost of a well-fashioned practice of epistemic appraisal.

Nikolaj Nottelmann / The Present and Future State of Epistemic Deontologism

In the past it has standardly been presumed that a doxastic agent may be held liable in an epistemic sense only for those of her beliefs whose content she could at least hypothetically have influenced by the direct operation of her will. However, in recent decades doxastic voluntarism in its classical form has sunk into a disreputable position, owing its place in the debate mostly to the sometimes doubtful quality of the arguments offered against it. If doxastic voluntarism is indeed false, we face the following trilemma concerning the application of deontic notions like responsibility and blameworthiness to objects of epistemic evaluation such as belief: (1) we may give up as illegitimate the application of such notions; (2) we may claim that the legitimate application of deontic notions does not hinge upon the exercise of doxastic control; or (3) we may look for alternative modes of doxastic control capable of underwriting the legitimate applicability of deontic notions. In order to engage fruitfully with the problem, a basic understanding of epistemically informed deontic notions must be developed. Further, if the applicability of such notions does indeed hinge on the possible exercise of doxastic control, a covering taxonomy of the relevant modes of doxastic control must be provided. Here I pursue both of these projects. A pertinent question is how a deontic notion may be epistemic in a deeper sense than that of applying to standard objects of epistemic evaluation such as beliefs. I defend the view that such notions may indeed be informed by epistemic norms, although it turns out that ascriptions of epistemic blameworthiness, for example, also hinge upon the doxastic agent’s all-things-considered reasons for certain actions and omissions.

Erik Olsson / The Place of Coherence in Epistemology

The current debate on the coherence theory is a striking example of how well-understood formal models, here the theory of probability and Bayesian networks, can fertilize vague epistemological discourse. In this case it has led to clear new answers to age-old questions, such as that of the relationship between coherence and truth. At the same time, these answers are highly provocative, showing as they do that the tie between coherence and truth is much weaker than anyone could have expected. This paper rehearses the latest developments in this field, culminating in some disturbing new impossibility results. This summary is followed by a detailed examination of different proposals for how to assign a positive role to coherence in epistemology in spite of the negative formal results, including proposals from Audi, Thagard, Shogenji, and Bovens and Hartmann.
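As an illustration of the kind of probabilistic explication at issue, Shogenji's well-known coherence measure defines the coherence of a set of beliefs as the ratio of their joint probability to the product of their marginal probabilities (stated here as background; the paper assesses this and rival proposals):

```latex
% Shogenji's coherence measure for beliefs A_1, ..., A_n:
C(A_1,\dots,A_n) \;=\;
  \frac{P(A_1 \wedge \dots \wedge A_n)}{P(A_1)\,P(A_2)\cdots P(A_n)}
% C > 1: the beliefs are mutually supporting;
% C = 1: probabilistically independent;
% C < 1: in mutual tension.
```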

Duncan Pritchard / Anti-Luck Epistemology

It has long been noted that in order to account for our intuitive idea of knowledge as involving a cognitive achievement on the part of the agent, it is essential that we capture the sense in which knowledge involves a true belief that is not gained via luck. Although this intuition informs much of epistemology, there has been very little in the way of direct examination of what this intuition involves. I survey different ways in which one might understand the claim that knowledge is non-lucky and argue that there are in fact several construals available of this brute intuition, with each construal corresponding to a significant position within contemporary epistemology. By identifying the nuances involved in our understanding of the anti-luck intuition, it is argued that light is thrown on such central debates in epistemology as the externalism/internalism distinction, the problem of radical scepticism, and the Gettier puzzles.

Wayne Riggs / The Value Turn in Epistemology

Twentieth-century Anglo-American philosophy famously took a “linguistic turn,” wherein philosophers sought to gain purchase on traditional problems in metaphysics, epistemology, and to some extent ethics, by first settling some fundamental questions in the philosophy of language. While this movement has largely run its course, it did provide us with lasting insights into some of these problems. In an analogous fashion, I think that philosophers can gain insight into the traditional problems of epistemology (at least) by attending first to questions of (epistemic) value. After making the case for a particular constellation of values that I take to be appropriate to the evaluation of our cognitive lives and the products thereof, I will show that surprising consequences follow both for the theory of knowledge and for epistemology more broadly construed.

Finn Spicer / Knowledge and the Heuristics of Folk Epistemology

Epistemologists sometimes try to convince us of the truth of their claims about the nature of knowledge by appeals to our epistemic intuitions. So for example, epistemologists describe a Gettier case to us; we intuitively think that the subject in this case fails to know what he justifiably and truly believes; and by this route the epistemologists convince us that knowledge is not justified true belief. Recently the place of appeals to intuition in epistemology has been challenged: Hilary Kornblith (2003) has argued that the proper task of epistemology is not to gather and systematise our intuitions; Brian Weatherson (2003) has argued that appeals to intuitions about Gettier cases provide a poor reason to reject the justified true belief account of knowledge. In this paper I offer a theory about the nature of our epistemic intuitions—a theory of the cognitive processes that generate them. I claim that we possess a folk theory of knowledge—folk epistemology—and I argue that our intuitive judgements about knowledge are the product of exercising our folk epistemological competence. I show that we can only assess whether our intuitions are a good guide to the nature of knowledge by understanding the function of folk epistemology and how it is realised in our cognitive architecture. I then offer a detailed hypothesis of these matters, in which folk epistemology is viewed as a tacit theory that includes tacitly known principles about knowledge and heuristics for ascribing knowledge. Against the background of this hypothesis, my conclusions about the proper role of intuitions develop Kornblith and Weatherson’s theme: I conclude that epistemic intuitions can carry little weight in the construction and evaluation of epistemological theories. As a case study, I criticise in detail the role that intuitions are given in support of contextualism about the semantics of the predicate ‘knows’.

Jonathan Schaffer / Questioning Knowledge

Knowledge is generally regarded as connected to inquiry. Yet the knowledge relation is standardly treated as holding between a subject and a proposition (‘s knows that p’), in a way that makes no connection to inquiry. I will consider the prospects for connecting the knowledge relation to inquiry, by relativizing knowledge to the question under inquiry, so that knowledge holds between a subject, a proposition (the answer), and a stage of inquiry (the question). On this view, all knowing is knowing the answer.

Jason Stanley / Subject-Sensitive Invariantism

My purpose in this paper is to evaluate the case for contextualism in epistemology and to present an alternative, which I shall call the interest-relative account. Contextualism in epistemology is the doctrine that the proposition expressed by a knowledge attribution relative to a context is determined in part by the standards of justification salient in that context. The (non-sceptical) contextualist allows that in some context c, a speaker may truly attribute knowledge of a proposition p to Hannah, despite her possession of only weak inductive evidence for the truth of that proposition. Relative to another context, someone may make the very same knowledge attribution to Hannah, yet be speaking falsely, because the epistemic standards in that context are higher. The reason this is possible, according to the contextualist, is that the two knowledge attributions express different propositions. The main argument for contextualism is that there is a set of intuitions about ordinary knowledge ascriptions, concerning which a contextualist semantics yields the most satisfactory account. Once the contextualist semantics is in place, it also provides a certain kind of resolution of the problem of scepticism. In the first section of the paper, I explain and motivate this argument for contextualism. In the second section, I argue that there is substantial evidence against the contextualist thesis that knowledge attributions are context-sensitive. In the third section, I discuss an argument, due to David Lewis, against non-contextualist, non-sceptical accounts of knowledge. Rejecting this argument clears the path for the fourth section, in which I introduce just such an account of knowledge, and show how one can use it to give an equally satisfying explanation of the intuitions about ordinary knowledge ascriptions that motivate the contextualist semantics. In the final section, I contrast the two accounts of knowledge.

Fritz Warfield / Knowledge, Scepticism, and Anti-Scepticism

I focus on sceptical and anti-sceptical arguments, arguing that several kinds of anti-sceptical arguments are more powerful than most epistemologists have realized. I also attend to some disputed issues concerning the analysis of knowledge that bear on the overall evaluation of sceptical and anti-sceptical arguments.

Ralph Wedgwood / Contextualism and Moral Beliefs

A version of epistemological contextualism is outlined: primarily, this is contextualism about terms like ‘justified belief’ (or ‘reliable method’), rather than about ‘knowledge’. (It would lead to a form of contextualism about ‘knowledge’ only if justification or reliability is a necessary condition for knowledge—a question that is not addressed here.) Some beliefs are more justified than others. So, how justified does a belief need to be if it is to count as “justified” simpliciter? According to this version of contextualism, there is no context-independent answer to that question. In some contexts in which we ask whether a belief is “justified”, demanding standards apply, so that relatively few beliefs can be truly described as “justified”; in other contexts, more relaxed standards apply, so that many more beliefs can be truly described as “justified”. This form of contextualism can deal, not only with outright beliefs (in which we simply believe a proposition, attaching no credence to any incompatible propositions), but also with partial beliefs (in which we “hedge our bets”, placing some credence in the proposition but also some credence in other incompatible propositions). Which aspects of the context determine whether demanding or relaxed standards apply? It is argued that the only features of the context that do this are the practical considerations, such as the needs, purposes and values, that are salient in the context. It is argued that this version of contextualism is no help in answering the most challenging forms of familiar arguments for scepticism. On the other hand, it is useful for dealing with other problems. In particular, it can help to solve a problem about moral beliefs, which may be called the problem of the “moral evil demons”. It can solve this problem because in many contexts in which we ask whether a moral belief is “justified”, the salient practical considerations dictate that relaxed and undemanding standards apply.
The result is that in those contexts, it is true to describe the moral belief as “justified”—even if the believer is barely more justified in believing the moral proposition than in believing that proposition’s negation.


  • Professor Michael Bergmann teaches Philosophy at Purdue University, USA
  • Professor Tim Black teaches Philosophy at California State University, Northridge, USA
  • Dr. Boudewijn de Bruin teaches Philosophy at the Institute for Logic, Language and Computation at the University of Amsterdam, NL
  • Dr. Troy Catterson teaches Philosophy at Hawaii Pacific University, USA
  • Dr. Paul Egré teaches Philosophy at the University of Paris IV, and is a fellow of the Institut Jean Nicod and the Institut d’Histoire et de Philosophie des Sciences et des Techniques, FR
  • Professor Jeffrey Helzner teaches Philosophy at Columbia University, USA
  • Dr. Franz Huber teaches Philosophy at the University of Konstanz, DE
  • Professor Ram Neta teaches Philosophy at the University of North Carolina, Chapel Hill, USA
  • Dr. Nikolaj Nottelmann is an associate research professor at the University of Copenhagen, Denmark
  • Dr. Erik Olsson teaches Philosophy at the University of Lund, Sweden
  • Dr. Duncan Pritchard teaches Philosophy at the University of Stirling, UK
  • Professor Wayne Riggs teaches Philosophy at the University of Oklahoma, USA
  • Professor Jonathan Schaffer teaches Philosophy at the University of Massachusetts, USA
  • Dr. Finn Spicer is a lecturer at the University of Bristol, UK
  • Professor Fritz Warfield teaches Philosophy at the University of Notre Dame, USA
  • Dr. Ralph Wedgwood teaches Philosophy at Merton College, Oxford, UK