What’s Wrong with Moral Deference?
63rd Annual Meeting of the Florida Philosophical Association, 2017
Jonathan Matheson, University of North Florida
I want to begin by thanking those responsible for this great conference. Laurie Shrage, for doing a superb job putting together these talks, and Ron Cooper, for all of his hard work as site coordinator, particularly his work in putting on this great banquet at such a nice location.
Tonight I want to briefly raise and motivate a puzzle and suggest a solution. The puzzle concerns moral deference. Most of our beliefs come by way of testimony. Among these are beliefs we have formed by deferring to others. We can understand an act of deference as follows:
S1 defers to S2 about p when S1 comes to believe p merely because S1 discovers that S2 believes p.
In general, deference to an expert is perfectly appropriate. This can be seen by quickly looking at some ordinary examples:
- It is appropriate for me to defer to a botanist as to whether a given tree is an elm.
- It is appropriate for me to defer to a chemist as to the molecular structure of glucose.
- It is appropriate for me to defer to an art historian as to the historical influences on a particular piece of art.
- And so forth.
However, at the same time, many have questioned whether moral deference is appropriate. For instance, they have doubted whether:
- It is appropriate for me to defer to a moral expert as to whether it is morally wrong to eat meat.
- It is appropriate for me to defer to a moral expert as to whether the death penalty is morally permissible.
- It is appropriate for me to defer to a moral expert as to whether it is morally obligatory for me to give much more of my resources to charity.
- And so forth.
Let’s examine a case of moral deference in more detail. Consider the following:
Melanie meets Maggie at an academic party. Melanie is a historian and she comes to find out that Maggie is an applied ethicist who has spent most of her career examining the moral case for vegetarianism. In their conversation, Melanie doesn’t ask Maggie to present the main arguments for or against vegetarianism, but simply asks Maggie what her position is on the issue. Maggie says that she believes that it is morally wrong to eat meat. Considering Maggie’s expertise on the matter, Melanie defers and also adopts the belief that it is morally wrong to eat meat. 1
Intuitively, something is amiss with Melanie’s deference, or at least there is something that seems more problematic here than in other cases of more ordinary deference. This is particularly the case if we understand the example as a case of ‘pure moral deference’ – a case where Melanie is not ignorant of any non-moral fact that Maggie knows that is relevant to the issue.
Following McGrath (2011), let’s call the following claim ‘DATUM’.
DATUM: There is something amiss with moral deference that is not present in ordinary cases of deference.
What explains DATUM? DATUM cries out for explanation or to be undermined. However, initially intuitive explanations do not hold up to continued scrutiny. Before providing a novel explanation of DATUM, I will briefly consider and reject a number of competitor explanations of DATUM. 2
First, it is clear that moral deference is not inappropriate simply because it is a kind of deference. As we saw above, we do not see any problem, or at least the same problem, with other kinds of deference. DATUM requires there to be something particularly problematic about moral deference. Further, moral deference cannot be inappropriate simply because it results in the formation of a new moral belief. While moral deference does have this result, we don’t see any problem with forming new moral beliefs. This is something that we do all the time. It might be thought that the problem results from forming a new moral belief due to others. Moral deference is importantly different than simply thinking about a new moral question and coming to a new moral belief. However, moral persuasion is perfectly appropriate, and it too results in a new moral belief on the basis of another. So, any problem with moral deference must lie elsewhere.
Another thought in the literature is that moral deference is inappropriate since morality is equally accessible to us all. If you and I each had equal access to some fact, it would be bizarre for me to simply defer to you on the matter. However, it is implausible that morality is equally accessible to us all in the way required to explain DATUM. First, it is implausible that each of us has thought about moral questions equally long, has the same amount of evidence, has equally good intellectual faculties, and so forth. We aren’t all in the same epistemic position on moral matters. But if that is so, then the inappropriateness of moral deference can’t be explained by our having equal access to morality. It may be that, in principle, we are all able to have equal access to morality. I don’t see why this would be, but even if it were, it would fail to explain DATUM. For instance, it may be that car mechanic knowledge is equally accessible to us all. That is, it could be that so long as we all put in the relevant time and training, we could each be in as good an epistemic position on automotive matters. However, even if so, the fact is that most of us have not done so. This makes an important difference and makes it appropriate to defer to a car mechanic on an automotive issue. So, the idea that morality is equally accessible to us all cannot explain DATUM. 3
A further thought in the literature is that moral deference is inappropriate since it exhibits a lack of autonomy or authenticity. The idea here is that some beliefs simply should not be outsourced – we should figure things out on our own, or at least appreciate the reasons for ourselves. Whereas it may not be important to understand the reasons that led to the car mechanic’s belief, morality could be importantly different, requiring us to appreciate the reasons behind the truth (not just believe it). While it is clearly preferable for individuals not simply to have true moral beliefs but also to appreciate for themselves the reasons behind those truths, this line of reasoning fails to explain DATUM. Moral deference may not get one to the optimal state, but it can provide one with an important improvement. Sometimes such an appreciation is outside the grasp of the individual, or there is not the time to nourish it. In such cases, moral deference (and coming to a true moral belief) is much preferable to sticking with how things seem to the individual in question.
Along these lines, it has been claimed that moral deference is inappropriate since it precludes something more valuable – moral understanding or cultivation of moral virtue. Here too, we can agree that moral deference does not give us everything that we may want, but it does offer us an important improvement. It is better for an individual to have moral virtue and understanding, and one cannot simply get either of these goods by way of deference, but that is not to say that nothing can be gained by moral deference. In moral deference one gains access to moral truth and true moral belief. These too are goods, and goods important to attaining the even more valuable moral virtue and moral understanding. In fact, such moral deference appears to be a necessary and important part of moral education and development. So, if there is something amiss with moral deference, it must lie elsewhere.
Finally, it might be thought that moral deference is inappropriate since morality is normative, and there is a problem with normative deference in general. The examples of appropriate deference we examined were all cases of non-normative deference, so such an explanation gets the heretofore-examined cases correct. However, other kinds of normative deference aren’t problematic at all. Deference appears to be perfectly appropriate in matters of etiquette, logic, and what legal defense I ought to pursue. Each of these domains involves normativity, yet it is easy to find examples of appropriate deference within them. Further, other cases of non-normative deference seem equally amiss. For instance, deferring as to God’s existence or as to whether we have free will seems inappropriate, and inappropriate in the same ways that moral deference is. 4
This final failed explanation points toward a promising positive proposal. In many ways, the proposal is far from novel. In fact, it appears to have its roots in ancient Greek philosophy. We are all familiar with the Socratic claim that “the unexamined life is not worth living.” Even if this claim is too strong as it stands, ‘the examined life is of greater value’ is pretty plausible. It is important that you think about life’s big questions. This is part of the value of philosophy.
Recently a group of Princeton scholars wrote an open letter to students headed off to college for the first time. 5 In brief, the takeaway was simply this: think for yourself. Some pieces of the letter are worth quoting:
“In today’s climate, it’s all-too-easy to allow your views and outlook to be shaped by dominant opinion on your campus or in the broader academic culture. The danger any student—or faculty member—faces today is falling into the vice of conformism, yielding to groupthink.”6
“Thinking for yourself means questioning dominant ideas even when others insist on their being treated as unquestionable. It means deciding what one believes not by conforming to fashionable opinions, but by taking the trouble to learn and honestly consider the strongest arguments to be advanced on both or all sides of questions—including arguments for positions that others revile and want to stigmatize and against positions others seek to immunize from critical scrutiny.” 7
This may start to sound like one of the earlier rejected explanations of DATUM that appealed to autonomy and authenticity. As we saw, that explanation of DATUM failed. We can see further problems here. Imagine the student who does think for himself in all of the ways described and yet comes to the conclusion that:
- Vaccines cause autism.
- The Holocaust never happened.
- The Earth’s climate is not changing in any significant way.
In such cases, something has gone right. The student has thought for himself or herself and thought hard about the issue. That’s a good thing. But something has also importantly gone wrong. The student has badly bungled things, terribly misevaluating the relevant evidence. While it is important to think for oneself, it is also important to humbly acknowledge that one’s epistemic position on many matters is impoverished in comparison to that of a great many others. The insights of these experts should not be dismissed, even if it is important to also think for oneself.
The idea here is that some questions are such that it is good for you to think hard about them. Further, it is good for you to do this even if, at the end of the day, you shouldn’t believe your own conclusions on the matter – even if, after your own careful thought and evaluation, you still need to defer to the relevant experts. This conclusion also fits nicely with insights from the literature on the epistemology of disagreement.
A plausible view in the epistemology of disagreement, one I have defended elsewhere 8, is that what it is reasonable for someone to believe depends importantly on what they are reasonable in believing about the state of the debate surrounding that proposition. In cases where an individual is justified in believing that there is something of a consensus on the matter amongst the relevant experts, she is also justified in adopting that same attitude toward the proposition in question. In cases where an individual is justified in believing that there is significant controversy surrounding the issue amongst the relevant experts, she is justified in suspending judgment about the proposition in question. In these ways, the higher-order evidence one has about what the relevant experts believe has a dramatic effect on what one is justified in believing about the matter.
This proposed explanation of DATUM is evident to me when teaching Introduction to Philosophy. In that course, I want students to think hard and long about difficult ethical issues, God’s existence, whether and how they have free will and knowledge, and so forth. I think that it is important for them as individuals to wrestle with these questions – to think hard about them for themselves. Something would have gone wrong if I simply told them the answers (or at least what I take them to be) and then tested them on how well they could remember the correct answers. No, they need to wrestle with these questions. Simply deferring on these matters robs them of important goods. That said, it doesn’t follow that the belief they come to at the end of the day is justified or one that they should keep. After all, they are novices about such matters. There are a great many who are smarter, more knowledgeable, more intellectually virtuous, etc., and who have also thought about these important questions in much more detail than they have. If a student ignores this fact, then their belief is importantly lacking.
So, according to this explanation of DATUM, moral deference is problematic, but it is not problematic for epistemic reasons. Moral deference can be problematic because it can rob the subject of doing something that is inherently valuable – wrestling first-hand with the important questions of life. Since lives go better for their subjects when they do wrestle with those questions, moral deference can make one’s life worse than it would otherwise be. The best of both worlds is to both wrestle first-hand with the important questions of life, but also appreciate one’s intellectual place in the world and defer (or suspend judgment) as the expert consensus (or disagreement) would have it. So, think for yourself, just don’t stop there!
- This case comes from Matheson, McElreath, and Nobis (2018).
- This follows Matheson, McElreath, and Nobis (2018).
- This reasoning follows McGrath (2011).
- For examples, see Matheson, McElreath, and Nobis (2018).
- The letter can be accessed here: https://jmp.princeton.edu/announcements/some-thoughts-and-advice-our-students-and-all-students
- For instance, see Carey and Matheson (2013) and Matheson (2015).