What is it? An afternoon of talks on recent and current work in formal epistemology investigating what it takes to be rational, especially in settings that diverge from the standard perspective of ideal rationality.
Where is it? Tutorial Rooms 1/2, Centre for Applied Anatomy, Southwell Street, Bristol, UK
When is it? Monday 9th December
1.00-2.15 – Kenny Easwaran (Texas A&M) – What is epistemic rationality?
2.15-2.30 – BREAK
2.30-3.45 – Kevin Dorst (Oxford) – “Overconfidence” is rational
3.45-4.00 – BREAK
4.00-4.45 – Richard Pettigrew (Bristol) – Logical ignorance and logical learning
4.45-5.30 – Catrin Campbell-Moore (Bristol) – Probabilistic uncertainty and non-classical logic
5.30-5.45 – BREAK
5.45-6.30 – Jason Konek (Bristol) – Accuracy and holism
Generously funded by the University of Bristol’s Centre for Science and Philosophy.
Kenny Easwaran – What is epistemic rationality?
I consider the concept of rationality for degrees of belief generally. I take a pragmatist stance, arguing that all rationality norms must follow from the central role degree of belief plays (whether pragmatic or alethic), but I argue that different perspectives yield different overall sets of norms. Many disputes about epistemic rationality are thus, I claim, terminological disputes, about which perspective is the “right” one. From one perspective, the only norms of rationality will be probabilism, conditionalization, and logical omniscience, yielding a very permissive conception of rationality. But from other perspectives, there could well be additional norms. Probabilism, conditionalization, and logical omniscience themselves have very different force because they derive from very different sources. These contrasting views of rationality are connected to the distinction between an “internal” and an “ecological” conception of rationality, the question of whether attitudes or policies are the right level of evaluation, and the contrast between ideal and non-ideal theories of rationality. I argue that epistemology, even for a purely solitary agent, needs to consider issues from social and political theory.
Kevin Dorst – “Overconfidence” is rational
Do people tend to be overconfident in their opinions? Psychologists think so. They have run calibration studies in which they ask subjects a variety of questions, and then compare their confidence in their answers to the proportion that were true. Under certain conditions, an “overconfidence effect” is robust—for example, of the answers people are 80% confident in, only 60% are true. Psychologists have inferred that people are irrationally overconfident—that they are more confident than they should be, given their evidence. Although this is not a valid inference, it can be partially vindicated: the inference is warranted when (and because) you should defer to the opinions it would be rational for the subjects to have. But this vindication is only partial, for when you should not defer, the “overconfidence effect” can in fact be evidence that the person is rational. I argue that attention to these epistemological details reveals that the empirical evidence from calibration studies in fact fits well with the hypothesis that people’s degrees of confidence tend to be rational.
Richard Pettigrew – Logical ignorance and logical learning
According to certain normative theories in epistemology, rationality requires us to be logically omniscient. Yet this prescription clashes with our ordinary judgments of rationality. How should we resolve this tension? In this paper, I focus particularly on the logical omniscience requirement in Bayesian epistemology. Building on a key insight by Ian Hacking (1967), I develop a version of Bayesianism that permits logical ignorance. This includes an account of the synchronic norms that govern a logically ignorant individual at any given time, as well as an account of how we reduce our logical ignorance by learning logical facts and how we should update our credences in response to such evidence. At the end, I explain why the requirement of logical omniscience remains true of ideal agents with no computational, processing, or storage limitations.
Catrin Campbell-Moore – Probabilistic uncertainty and non-classical logic
If I know that the right logic is some particular non-classical logic, what should the model of probabilistic uncertainty be? Williams argues that we should assign a particular numerical value to the non-classical truth values, e.g. “neither”, so that our degrees of belief are still given by numerical functions. I put some pressure on this proposal and offer some alternative models. A further question is: what if I’m uncertain about which logic is right?
Jason Konek – Accuracy and holism
Abstract TBA.