08.03.2010
Meta-ethics: On Emotions, Responsibility and Determinism
Christopher B. Germann
University of Plymouth
Faculty of Science and Technology
Department of Psychology Ψ
This paper will discuss the role of emotions in moral judgments. It will be
argued that emotions form the fundamental basis of morality. This distinctive
approach comes from David Hume and has been elaborated in important ways
by Peter Strawson. Recent results from cognitive, clinical, social, developmental, and neuroscientific research that provide converging empirical support for this perspective on morality will be presented. Moreover, both philosophers employ a kind of emotion-based defense of moral responsibility. Therefore, issues concerning moral responsibility and determinism will be addressed.
mail@christopher.germann.de
“The hypothesis which we embrace is plain.
It maintains that morality is determined by
sentiment. It defines virtue to be whatever
mental action or quality gives to a spectator
the pleasing sentiment of approbation.... If
you call this metaphysics, and find any-
thing abstruse here, you need only conclude
that your turn of mind is not suited to the
moral sciences.”
~ David Hume, EPM, Appendix I ~
The thesis of the Scottish philosopher David Hume
(1711-1776) is that our moral judgments are based on
our emotional reactions (but see Sutherland, 1976).
Contrary to Hume, many philosophers (e.g., Immanuel Kant) followed a normative approach to ethics. Normative ethics is concerned with questions such as “How should we live?” and “What principles should we live by?”. Hume, on the other hand, follows another branch of ethics or moral philosophy that tries to figure out the nature of morality itself: “What makes something right or wrong?”. This approach is
focused on the fundamental nature of morality and
moral judgment and it is called meta-ethics (see Gar-
ner & Rosen, 1967). One central tradition within
meta-ethics holds that ultimately morality is based in
the emotions (Ayer, 1936; Stevenson, 1944). Hume is
probably the best-known advocate of this approach
to morality which is sometimes called sentimentalism
(Slote, 2003). On his view we have the moral convic-
tions that we do because of the emotional responses
that we have. If we had really different emotional re-
sponses we would have really different moral convic-
tions. Part of Hume’s sentimentalist view of moral-
ity is that our emotions drive our judgments about
right and wrong and that emotions provide the foun-
dations for our moral capacities. Hume only offered
philosophical arguments for this, but recently there
has been a wave of research that appears to support
at least some parts of Hume’s view that emotions play
a critical role in moral judgment (see appendix A for
some results from neuro-imaging studies).
Emotions and moral judgment
A famous experimental paradigm used to inves-
tigate moral decision making is the trolley dilemma
(Thomson, 1976, 1985; Greene et al., 2009). The
trolley dilemma is a thought experiment in which
participants are asked to give their judgment with
regard to two scenarios. In the first scenario (see
Figure 1) the trolley is running out of control down a
track. Unfortunately, five people are tied to the track and the trolley is about to run them over. However,
participants are told that they can flip a switch, which
will redirect the trolley down a different track. Only
a single person is tied to that track. The question is:
Should you flip the switch?
Figure 1. The question is: Should you flip the switch
in order to save the five people?
In the second scenario (see Figure 2), as before,
the trolley is about to run over five people. In this
version of the thought experiment participants have
to imagine themselves standing on a bridge under
which the trolley will pass. Moreover, a tall and heavy
man is standing on the bridge as well, and the only way to stop the trolley and save the lives of the five people is to push that man off the bridge onto the
track. The question is: Should you push the man?
Figure 2. The question is: Should you push the man in
order to save the five people?
In a formal mathematical sense these are exactly equivalent choices (you kill one in order to save five). However, in the first case you do nothing more emotionally charged than pull a lever; in the second case you use your own hands to push a person onto the track. Even though from an economic standpoint both scenarios are exactly equivalent, people are three times more likely to pull the lever than to push the man with their own hands. These findings have been cross-validated by investigating over 200,000 individuals from more than 100 countries (Miller, 2008). Typically, individuals find it hard to come up with a compelling justification for the incongruent decisions they make. It has been argued that this indicates a dissociation between judgments and justifications (Cushman et al., 2006; Hauser et al., 2007; Mikhail, 2007). The judgments people make conform in different ways to utilitarianism and deontology. Utilitarians say that what you should do in both cases is save the five people. Deontologists do not agree: in the case where you would have to push the man, you should not do so, because that is wrong (Greene & Haidt, 2002).
One of the most interesting developments of the last several years is that researchers began to examine whether emotions are involved
in these kinds of judgments. These investigations
are based on the idea of modifying the emotional re-
sponses. The underlying logic is that when people say that it is wrong to push the man, it is because their emo-
tions are telling them that. That’s the part that fits
with Hume’s view that the emotions are at the core
of morality (see also Haidt, 2001, 2002, 2003, 2008).
In a clever recent study, Valdesolo and DeSteno (2006) induced emotions
to investigate this hypothesis experimentally. In this
study one group of participants saw a clip from Satur-
day Night Live that was apparently very funny and an-
other group saw a clip of a Spanish fishing village that
was not funny at all. What the investigators found was
that participants who saw the Saturday Night Live clip were more likely to say that it is okay to push the man off the footbridge than those who saw the clip of the Spanish village. It thus appears that by changing the emotions people have, one can change the kinds of moral
judgments that they make.
There is even more compelling evidence for Hume’s
view that emotions lie at the heart of morality. Re-
searchers found that people who have diminished emotional capacities also show different patterns of moral judgment. One patient population that has been studied in this context has naturally occurring damage to an area of the brain known as the ventromedial prefrontal cortex¹ (another population is people diagnosed as psychopaths, but see appendix B). We already know from previous work (A. Damasio et al., 1990; Anderson et al., 1999) that damage to that area is associated with diminished emotional responses. Researchers gave tasks like the footbridge dilemma to patients with damage to this area of the brain (Koenigs et al., 2007). What they found was that these patients were more likely than controls to say that it is okay to push the man off the footbridge. In other words, the patients gave more utilitarian judgments.
What is important from a Humean perspective is that not having the emotions that normal people have appears to affect how these patients make their
moral judgments. The fact that they have damage
to emotional regions of the brain affects their judg-
ments about what the right thing to do is (see Moll
et al., 2005, 2008, for comprehensive reviews). This provides some evidence that Hume was right about how moral judgment actually works. It
looks like the emotions do play an important role in
moral judgment and it is likely that we would have
radically different views about right and wrong if we
had radically different emotions. Hume also holds
that there is no further deeper justification for moral
beliefs. In contrast to many other philosophers be-
fore and after Hume he maintains that moral beliefs
don’t have any rational basis. He argues that there
is no objective morality that stands apart from our
emotions. Our moral understanding does not come
from logic or reason. Moreover he thinks that any
attempt to ground it in logic or reason or proof is
just hopeless and that we will never have a proof of
morality. Nonetheless, Hume still thinks that it is ap-
propriate to follow our moral beliefs since that’s the
fundamental basis of morality (Nichols, 2004). There will be no replacement for morality that does not de-
pend on the emotions. A nicely fitting analogy is aes-
thetics. If you come to think that your judgments
about beauty are guided by your emotional reactions, does that make you think, “Oh, I guess I shouldn’t have those judgments of beauty”? If somebody convinces me that the reason why I like classical music is that this music has an emotional effect on me, I would never say, “I guess that music must not be any good.” I would not stop regarding something as highly aesthetic just because I found out that emotions have a lot to do with it. Neither, according to Hume, should I give up those kinds of emotion-based aesthetic judgments, and the same goes for morality. Coming to appreciate that morality has an emotional basis should not make you think, “Oh, then I give up morality,” according to Hume’s view (Nichols,
2004). That naturally extends to the case of moral re-
sponsibility. Our ideas of moral responsibility, Hume
suggests, also depend on the particular emotions we
have (Nichols, 2007a). If we had different emotions
we would not have the same notions of moral respon-
sibility. So how do the emotions play in to moral re-
sponsibility? According to Hume, if a person performs
a vicious act that will naturally lead others to feel emo-
tions of blame and resentment towards him (Nichols,
2004).
Taken together, this is a brief sketch of Hume’s view of the sentimental basis of morality, including the sentimental basis of moral responsibility. How does this view figure in the debate over determinism and responsibility?
Moral responsibility
Hume has a defense of moral responsibility that
draws on the idea that the emotions are really im-
portant in judging responsibility. He thinks when we
recognize the role of emotions in moral responsibil-
ity we can see that questions about determinism are
just beside the point. A modified example of this
view goes like this: Imagine you get slandered by a
colleague. Will your emotions be tempered by the-
oretical reflections on determinism? Hume puts the
question in the following way: “A man who is robbed of a considerable sum; does he find his vexation for the loss anywise diminished by these sublime reflections [about necessity]?” (Hume, 1743/1955; content in brackets added). Hume does not even bother to an-
swer the question because he thinks it is so obvious.
The obvious answer is “No”. The emotions are a natu-
ral response to vicious acts like slander or theft. No
sophisticated philosophical theory about determin-
ism is going to get in the way, Hume argues (Nichols,
2004). The big question is whether a victim in this case
¹ The ventromedial prefrontal cortex is a frontal region of the brain just above the eyes. Phineas Gage had damage to this brain area (but see H. Damasio et al., 1994; Sanfey, Hastie, et al., 2003). To put it in a somewhat oversimplified way, these patients behave like “Vulcans”: they cannot bring their emotions into play (but see Charland, 1998; Cohen, 2005).
should find his vexation diminished by thinking about
the necessity of all things. So the question becomes: Should thinking about determinism diminish
our vexation? Hume puts the burden on his opponent.
If the victim doesn’t find his vexation diminished why
should he stop resenting the person? Why should you
think that resentment is incompatible with necessity?
Given that reflecting on determinism will not affect our
emotions of blame, Hume says, we should see that the
emotional reactions are really compatible with deter-
minism (Nichols, 2007a). From this point of view, the
doctrine of determinism is irrelevant to these emo-
tional responses. Since emotional responses are the
basis for morality there is no other authority for moral
right and wrong. That provides a strong presumptive
case for sustaining our resentment (Nichols, 2007a).
In the early 1960s the English philosopher Peter
Strawson (1919-2006) developed a similar view and
this view has been very influential over the last sev-
eral decades (Nichols, 2007a). He defends the propri-
ety of holding people morally responsible (P. Straw-
son, 1962). He focuses on cases where morality most
directly influences our lives. Every single day we in-
teract with people. We encounter our friends, neigh-
bors, colleagues, et cetera. Every single day we have
emotional reactions to people we interact with. We
feel gratitude, resentment, moral outrage, forgiveness,
love, and so forth. Strawson calls these reactive attitudes (P. Strawson, 1962, p.18). They are natural re-
actions, as he says, to the good will or ill will or indif-
ference of others towards us (P. Strawson, 1962). We
react with resentment if we feel that a person’s action
reveals that they don’t respect us and with gratitude
if we think that they have shown a positive attitude
towards us. Strawson argues that the attitudes of re-
sentment and gratitude reflect the fact that we hold
the individuals morally responsible for their actions
(Nichols, 2004). For example, if another student in-
sults me I will feel resentment. Strawson argues that
this emotional reaction carries with it the implication
that I regard him as responsible for his action. The
notion of responsibility is built into this reaction of
resentment. When I feel grateful for a friend’s kind-
ness it is because I think he is morally responsible for
his kindness. In short, he says that these kinds of attitudes like resentment and gratitude presuppose the
responsibility of the person toward whom we are re-
acting. Strawson says that these kinds of attitudes are
sensitive to some qualification (McKenna & Russell,
2007). Sometimes we will excuse an individual be-
cause we think his offending action was accidental.
Suppose that someone hits you in the face with his
elbow but he didn’t mean to. You would excuse him;
you wouldn’t blame or resent him. But one feature of
excuses, Strawson declares, is that you do not give the person “carte blanche”. He is still an appropriate target for resentment, but in this particular case he gets a free pass. Sometimes, though, Strawson argues, we will ex-
empt someone. We will think that a given person can
never be resented because they don’t have the right
faculties to be an appropriate target for resentment
(but see McKenna & Russell, 2007, p.6). Young children are a good example of such a case. Sometimes we see adults get indignant at very young children, and it can seem quite inappropriate. Imagine a situation where a two-year-old intentionally spills his orange juice over your laptop. It is easy to imagine the almost reflexive resentment you experience. But then you would step back from the heat of the moment, realizing that the child is not old enough to be someone you can resent. But even with adults we might conclude that
a particular person is exempt from being held respon-
sible. Someone can be exempt because he is psycho-
logically deranged or because he has severely limited
mental capacities (McKenna & Russell, 2007). In that
case, Strawson argues, one takes an objective attitude
toward the individual (for example, he is a subject for
treatment). Under these circumstances I can fear the person or be disgusted by his actions, but I cannot resent him because he does not meet a minimal standard. It is not appropriate to resent him because of his limitations. Following Strawson, these are the kinds of things that do affect whether we resent some-
one. We either can exempt an individual because they
have severe mental limitations or we can excuse some
particular action they perform as an accident. Those
are the kinds of reasons that lead us to check our re-
sentment. By contrast, he writes, our resentment is
not sensitive to whether determinism is true (P. Strawson, 1962). He says determinism is not the right kind of consideration to temper resentment at all because it does not fit into either category (excuses or exemptions). Obviously it is not an excuse, because excuses only apply to particular actions and determinism is a global thesis. So it would be silly to say that every particular bad action gets excused as if it were an accident. So then, Straw-
son argues, the only possibility is that determinism
would provide a general exemption (everybody gets a
free pass). Strawson says that this doesn’t even make
sense. The whole point of exemptions is that they are
special cases. He writes, “it cannot be a consequence of any thesis which is not itself self-contradictory that abnormality is the universal condition” (as cited in McKenna & Russell, 2007, p.26). The idea is that ex-
emptions are unusual special cases. To say that deter-
minism makes every case a special case doesn’t make
any sense. Furthermore, Strawson explores the general
issue of determinism and responsibility and he con-
siders two questions. First, he says, if we came to be-
lieve in determinism would we give up the reactive at-
titudes? Second, if we came to believe in determin-
ism, should we give it up? For the first question he says that it is practically inconceivable (P. Strawson, 1962, p.12). It is inconceivable that we would put this into practice. He is echoing Hume here when he argues that the reactive attitudes are just too central to what
we care about to be displaced by some theoretical belief like determinism. He says that the human com-
mitment to participation in ordinary interpersonal re-
lationships is too deeply rooted for us to take seri-
ously the thought that a general theoretical conviction
might change our world. So here Strawson is mak-
ing an empirical prediction that people wouldn’t give
up their emotions because of determinism (see also
Haji, 2010). Is Strawson right about that? Well, it is
hard to get clean evidence on this. In Calvinistic soci-
eties, for example, it was assumed that predestination
was true. In their view everything was predetermined
but they had no trouble blaming people. They con-
tinued to blame people even though they believed in
determinism (Nichols, 2007a). The essential question is whether we should give up the reactive attitudes if we believed in determinism. Strawson says the only way to make this decision would be “in the light of an assessment of the gains and losses to human life” (P. Strawson, 1962, p.14). Would it diminish or enrich human life to give up the reactive attitudes? Would it diminish or enrich human life to suppress them, or should we let them continue as they are? Those are the
questions that face us. Strawson doesn’t say that these
are bad questions; he says only that when we try to an-
swer these questions the status of determinism is irrel-
evant (Nichols, 2007a). This gives a very different way
to defend compatibilism. He is saying that determinism is irrelevant to whether a person is responsi-
ble. If the action was an accident that’s relevant. If the
person is very young, that’s also relevant to whether
they are responsible. Determinism, on the other hand,
doesn’t matter. That means that determinism would
be no obstacle to a person being morally responsible.
That is of course the central element of compatibilism
(McKenna, 2009).
Perhaps the best-known response to Peter Straw-
son comes from his son Galen Strawson (G. Straw-
son, 1994). He agrees that the reactive attitudes like
resentment are central to our lives (Nichols, 2007a).
He also agrees that they presuppose moral respon-
sibility. So, Galen Strawson thinks that feelings like
resentment and gratitude are core to our emotional
lives and that when we feel resentment that feeling
of resentment presupposes that the person we resent
is morally responsible (Nichols, 2004). On all that he
is in agreement with Strawson the elder. But Galen
Strawson claims that when I resent someone under
the presumption that he is morally responsible I also
presume that he is not determined. He says that the
idea that responsibility is incompatible with deter-
minism is enshrined in the reactive attitudes them-
selves (Nichols, 2007a). So just as Peter Strawson says
that reactive attitudes like resentment have the no-
tion of responsibility built in, Galen Strawson says that
the reactive attitudes have incompatibilism built in
(Nichols, 2004). He writes “What is more, the roots of
the incompatibilist intuition lie deep in the very reac-
tive attitudes that are invoked in order to undercut it.
The reactive attitudes enshrine the incompatibilist intuition ...” (G. Strawson, 1986, p.89). His view is that
these emotions really are sensitive to whether deter-
minism is true. Our resentment is affected when we
start to think that people’s actions are determined.
One example that suggests that our emotions are sensitive to concerns about determinism comes from the life of a ruthless murderer named Robert Alton Harris (Nichols, 2007a), who was executed in San Quentin’s gas chamber in 1992 (California Depart-
ment of Corrections and Rehabilitations, 2007). In
1978 Harris abducted two sixteen-year-old boys from
a fast food restaurant and he made them drive to
a secluded area where he killed them. After killing
them, Robert Harris ate the rest of the boys’ half-eaten cheeseburgers. He bragged about how he had killed them and laughed about what it would be like for the police to tell the families of the victims that the boys were dead. This story triggers a feeling of moral disgust and outrage. At first glance, “Harris is an archetypal candidate for blame” (McKenna & Russell, 2007, p.128). One subsequently discovers some details about Harris’s bad childhood: he had been neglected by his mother and regularly beaten by his father.
Harris’s sister reported that his mother hated him and that Robert never experienced any affection from her: “He wanted love so bad he would beg for any kind of physical contact ... he would come up to my mother and just try to rub his little hands on her leg or her arm. He just never got touched at all. She’d just push him away or kick him. One time she bloodied his nose when he was trying to get closer to her” (as cited in Watson, 1987, p.273).
Despite the efforts of psychologists to prevent the execution, the California governor Pete Wilson concluded, “As great as is my compassion for Robert Harris the child, I cannot excuse or forgive the choice made by Robert Harris the man ... we must insist on the exercise of personal responsibility and restraint by those capable of exercising it. If we excuse those whose traumatic life experiences have injured them – but not deprived them of the capacity to exercise responsibility and restraint – we leave society dangerously at risk” (as cited in Golden, 1999, p.270).
Some claim that after hearing about the terrible
childhood of Robert Harris the resentment diminishes
(Nichols, 2007a). This, they claim, occurs because
we lose confidence that Harris really deserves to be
blamed. Once we have some idea about what led him to be the way he is, we start to regard him as less responsible. Before we learned about his upbringing, we supposed that he really deserved the resentment
that we felt towards him (Nichols, 2007a). But once
we get the picture about how he ended up that way it
changes our sense that he really deserves our reactive
attitude. This sensitivity to the causal history might just be a particularly vivid example that illustrates a
more general point. Namely, once we get a glimpse of
a deterministic story it undercuts our resentment (but
see Pereboom, 2007). So if we came to think that every person was always inevitably led to their actions, we would likely feel less resentment towards
everyone. That is the view that is promoted by Galen
Strawson. It is plausible that our anger and resent-
ment would be muted if we came to view someone’s
criminal behavior as entirely a product of their genes
and environment. But it is possible that this emo-
tional muting is a sophisticated response. Peter Straw-
son might still be correct that we have a strong gut re-
action to wrongful behavior and that this gut reaction
itself doesn’t care about determinism. That is, it might
be that we feel moral outrage and resentment at a base
level and this base level is entirely insensitive to lofty
thoughts about determinism. This is actually a com-
mon feature of emotions. An emotion like fear can be
triggered by simple stimuli like a big picture of a hairy
spider. The fear system naturally produces a response.
But then the smart reflective system can say to the
fear system, “It is just a picture, dummy”. That mod-
ulates the emotional reaction by dampening the fear
response. Similarly it might be that reflections on de-
terminism dampen the natural automatic outrage we
feel when we hear about Harris (for further considera-
tions on dual-process approaches see for example Za-
jonc, 1980; Greene et al., 2004, 2008; Saunders, 2009;
Evans, 2009). We still get the initial feeling of outrage
but then we convince ourselves that this gut-level feel-
ing is inappropriate. If this is the right story about how
determinism deflates our moral anger towards Harris
there remain very interesting questions about which
emotional reactions should be embraced. Is our gut
reaction of anger better or is our reflective muting of
that emotion better? According to Strawson the elder, to answer this we might once again have to answer the question: What would the gains and losses be for human life if we favor our immediate reaction over the reflective one? What would the gains and losses be if we favor the reflective reaction? To me it is not obvious that the gains and losses would favor the more reflective one.
Conclusion
To sum up, this paper discussed the big question of where our sense of moral right and wrong ultimately comes from. A multitude of findings from different research areas lend credence to the notion that morality is based on the emotions. In addition, this paper pre-
sented a philosophical perspective on moral respon-
sibility that comes from Hume and Peter Strawson. It
is an ambitious approach to defend moral responsi-
bility and it is far from established. I would like to
end by remarking on something particularly interest-
ing and important about this kind of emotion-based
defense of moral responsibility. Hume and Strawson
are not arguing that we should redefine our notion of
free will. They argue that we should retain our notion
of responsibility. We should keep blaming people the way we do because blame is so deeply infused into our emotional life and into the attitudes that are central to our everyday lives. That is why we should keep these attitudes around
and why determinism provides no basis for undercut-
ting them.
References
American Psychiatric Association. (1994). Diagnostic and
statistical manual of mental disorders (4th ed.). Washing-
ton, DC: APA.
Anderson, S., Bechara, A., Damasio, H., Tranel, D., & Dama-
sio, A. (1999). Impairment of social and moral behavior
related to early damage in human prefrontal cortex. Na-
ture Neuroscience, 2(11), 1032-1037.
Ayer, A. (1936). Language, truth, and logic. London: Victor
Gollancz.
Bechara, A., Damasio, H., & Damasio, A. (2000). Emotion,
decision making and the orbitofrontal cortex. Cerebral
Cortex, 10(3), 295-307.
Berthoz, S., Grezes, J., Armony, J. L., Passingham, R. E., &
Dolan, R. J. (2006). Affective response to one’s own moral
violations. Neuroimage, 31(2), 945-950.
Blair, R. (1995). A cognitive developmental approach to morality: Investigating the psychopath. Cognition, 57(1),
1-29.
Blair, R. (1997a). Affect and the moral-conventional distinc-
tion. Journal of Moral Education, 26(2), 187-196.
Blair, R. (1997b). Moral reasoning and the child with psycho-
pathic tendencies. Personality and Individual Differences,
22(5), 731-739.
Blair, R., Jones, L., Clark, F., & Smith, M. (1995). Is the psy-
chopath morally insane? Personality and Individual Dif-
ferences, 19(5), 741-752.
Blair, R., Jones, L., Clark, F., & Smith, M. (1997). The psy-
chopathic individual: A lack of responsiveness to distress
cues? Psychophysiology, 34(2), 192-198.
Blair, R., Mitchell, D., & Blair, K. (2005). The psychopath:
Emotion and the brain. Oxford: Blackwell.
Blair, R., Sellars, C., Strickland, I., Clark, F., Williams, A.,
Smith, M., et al. (1995). Emotion attributions in the psy-
chopath. Personality and Individual Differences, 19(4),
431-437.
Book, A., & Quinsey, V. (2004). Psychopaths: cheaters or
warrior-hawks? Personality and Individual Differences,
36(1), 33-45.
California Department of Corrections and Rehabilitations.
(2007). Research Review Process. Available from
http://www.cdcr.ca.gov (Accessed: 27.01.2010)
Charland, L. (1998). Is Mr. Spock mentally competent?
Competence to consent and emotion. Philosophy, Psy-
chiatry and Psychology, 5, 67-81.
Cleckley, H. (1941). The mask of sanity. St Louis, MO: Mosby.
Cohen, J. (2005). The vulcanization of the human brain: A
neural perspective on interactions between cognition and
emotion. Journal of Economic Perspectives, 19(4), 3-24.
Cushman, F., Young, L., & Hauser, M. (2006). The role
of conscious reasoning and intuition in moral judgment:
Testing three principles of harm. Psychological Science,
17(12), 1082-1089.
Damasio, A., Tranel, D., & Damasio, H. (1990). Individuals
with sociopathic behavior caused by frontal damage fail
to respond autonomically to social-stimuli. Behavioural
Brain Research, 41(2), 81-94.
Damasio, H., Grabowski, T., Frank, R., Galaburda, A., &
Damasio, A. (1994). The return of Gage, Phineas - Clues
about the brain from the skull of a famous patient. Sci-
ence, 264(5162), 1102-1105.
Ellis, H. (1902). Die Unterbringung Geisteskranker Ver-
brecher: [The Disposal of Insane Criminals] Von P.
Näcke. Halle: Marhold. Large 8vo, pp. 57, 1902. Jour-
nal of Mental Science, 48(201), 341-342. Available from
http://bjp.rcpsych.org
Evans, J. S. B. T. (2009). The duality of mind: An historical
perspective. In J. S. B. T. Evans & K. Frankish (Eds.), In
two minds: Dual processes and beyond (p. 33-54). Oxford:
Oxford University Press.
Federal Bureau of Investigation. (2005). Theodore Robert Bundy. Available from http://foia.fbi.gov (Accessed:
19.01.2010)
Fisher, L., & Blair, R. (1998). Cognitive impairment and its
relationship to psychopathic tendencies in children with
emotional and behavioral difficulties. Journal of Abnor-
mal Child Psychology, 26(6), 511-519.
Garner, R., & Rosen, B. (1967). Moral philosophy: A sys-
tematic introduction to normative ethics and meta-ethics.
New York: Macmillan.
Glenn, A. L., & Raine, A. (2009). Psychopathy and instru-
mental aggression: Evolutionary, neurobiological, and le-
gal perspectives. International Journal of Law and Psychi-
atry, 32(4), 253-258.
Golden, J. (1999). “An argument that goes back to the womb”: The demedicalization of fetal alcohol syndrome,
1973-1992. Journal of Social History, 33(2), 269-288.
Greene, J., Cushman, F. A., Stewart, L. E., Lowenberg, K., Nys-
trom, L. E., & Cohen, J. D. (2009). Pushing moral buttons:
The interaction between personal force and intention in
moral judgment. Cognition, 111(3), 364-371.
Greene, J., & Haidt, J. (2002). How (and where) does moral
judgment work? Trends in Cognitive Sciences, 6(12), 517-
523.
Greene, J., Morelli, S., Lowenberg, K., Nystrom, L., & Cohen,
J. (2008). Cognitive load selectively interferes with utili-
tarian moral judgment. Cognition, 107(3), 1144-1154.
Greene, J., Nystrom, L., Engell, A., Darley, J., & Cohen, J.
(2004). The neural bases of cognitive conflict and control
in moral judgment. Neuron, 44(2), 389-400.
Güth, W., & Yaari, M. (1992). An evolutionary approach to
explain reciprocal behavior in a simple strategic game. In
U. Witt (Ed.), Explaining process and change - approaches
to evolutionary economics (p. 23-34). Ann Arbor: Univer-
sity of Michigan Press.
Haidt, J. (2001). The emotional dog and its rational tail: A
social intuitionist approach to moral judgment. Psycho-
logical Review, 108(4), 814-834.
Haidt, J. (2002). “Dialogue between my head and my heart”:
Affective influences on moral judgment. Psychological In-
quiry, 13(1), 54-56.
Haidt, J. (2003). The moral emotions. In R. Davidson,
K. K. R. Scherer, & H. Goldsmith (Eds.), Handbook of af-
fective sciences (p. 852-870.). Oxford: Oxford University
Press.
Haidt, J. (2008). Morality. Perspectives on Psychological Sci-
ence, 3(1), 65-72.
Haji, I. (2010). Free Will and Reactive Attitudes. Philosophi-
cal Quarterly, 60(238), 213-218.
Hare, R. D. (1980). A Research Scale For The Assessment
Of Psychopathy In Criminal Populations. Personality and
Individual Differences, 1(2), 111-119.
Hare, R. D. (2003). The Hare Psychopathy Checklist-Revised. Toronto, Canada: Multi-Health Systems. (see
also: http://www.hare.org/scales/)
Hare, R. D., & Neumann, C. S. (2008). Psychopathy as a clin-
ical and empirical construct. Annual Review of Clinical
Psychology, 4, 217-246.
Harenski, C. L., Antonenko, O., Shane, M. S., & Kiehl, K. A.
(2010). A functional imaging investigation of moral de-
liberation and moral intuition. Neuroimage, 49(3), 2707-
2716.
Hariri, A., Drabant, E., Munoz, K., Kolachana, L., Mattay, V.,
Egan, M., et al. (2005). A susceptibility gene for affec-
tive disorders and the response of the human amygdala.
Archives of General Psychiatry, 62(2), 146-152.
Harpur, T., Hakstian, A., & Hare, R. D. (1988). Factor struc-
ture of the psychopathy checklist. Journal of Consulting
and Clinical Psychology, 56(5), 741-747.
Hauser, M., Cushman, F., Young, L., Jin, R. K.-X., & Mikhail,
J. (2007). A dissociation between moral judgments and
justifications. Mind & Language, 22(1), 1-21.
Heekeren, H., Wartenburger, I., Schmidt, H., Schwintowski,
H., & Villringer, A. (2003). An fMRI study of simple ethical
decision-making. Neuroreport, 14(9), 1215-1219.
Herpertz, S., & Sass, H. (2000). Emotional deficiency and
psychopathy. Behavioral Sciences & The Law, 18(5), 567-
580.
Hobson, J., & Shine, J. (1998). Measurement of psychopa-
thy in a UK prison population referred for long-term psy-
chotherapy. British Journal of Criminology, 38(3), 504-
515.
Huebner, B., Dwyer, S., & Hauser, M. (2009). The role of emo-
tion in moral psychology. Trends in Cognitive Sciences,
13(1), 1-6.
Hume, D. (1743/1955). An enquiry concerning human
understanding (L. Selby-Bigge, Ed.). Oxford: Clarendon
Press.
Knafo, A., Zahn-Waxler, C., Van Hulle, C., Robinson, J. L., &
Rhee, S. H. (2008). The Developmental Origins of a Dispo-
sition Toward Empathy: Genetic and Environmental Con-
tributions. Emotion, 8(6), 737-752.
Koenigs, M., Young, L., Adolphs, R., Tranel, D., Cushman,
F., Hauser, M., et al. (2007). Damage to the prefrontal
cortex increases utilitarian moral judgements. Nature,
446(7138), 908-911.
Kohlberg, L. (1971). From is to ought: How to commit the
naturalistic fallacy and get away with it in the study of
moral development. New York: Academic Press.
McKenna, M. (2009). Compatibilism. In E. N. Zalta (Ed.),
The Stanford encyclopedia of philosophy. Available from http://plato.stanford.edu (Ac-
cessed: 21.01.2010)
McKenna, M., & Russell, P. (2007). Free Will and Reactive
Attitudes: Perspectives on P. F. Strawson’s “Freedom and Resentment”. Aldershot: Ashgate.
Mealey, L. (1995). The sociobiology of sociopathy - An in-
tegrated evolutionary model. Behavioral and Brain Sci-
ences, 18(3), 523-541.
Mealey, L. (1997). Heritability, theory of mind, and the na-
ture of normality. Behavioral and Brain Sciences, 20(3),
527+.
Michaud, S., & Aynesworth, H. (1989). Ted Bundy: Conversations with a killer. New York: New American Library.
Mikhail, J. (2007). Universal moral grammar: theory, evi-
dence and the future. Trends in Cognitive Sciences, 11(4),
143-152.
Miller, G. (2008). The roots of morality. Science, 320(5877),
734-737.
Moll, J., Oliveira-Souza, R. de, Bramati, I., & Grafman, J.
(2002). Functional networks in emotional moral and non-
moral social judgments. Neuroimage, 16(3, Part 1), 696-
703.
Moll, J., Oliveira-Souza, R. de, & Eslinger, P. (2003). Morals
and the human brain: a working model. Neuroreport,
14(3), 299-305.
Moll, J., Oliveira-Souza, R. de, Eslinger, P., Bramati, I.,
Mourao-Miranda, J., Andreiuolo, P., et al. (2002). The neu-
ral correlates of moral sensitivity: A functional magnetic
resonance imaging investigation of basic and moral emo-
tions. Journal of Neuroscience, 22(7), 2730-2736.
Moll, J., Oliveira-Souza, R. de, & Zahn, R. (2008). The neu-
ral basis of moral cognition - Sentiments, concepts, and
values. In Year in Cognitive Neuroscience 2008 (Vol. 1124,
p. 161-180). Blackwell Publishing.
Moll, J., Zahn, R., Oliveira-Souza, R. de, Krueger, F., & Graf-
man, J. (2005). The neural basis of human moral cogni-
tion. Nature Reviews Neuroscience, 6(10), 799-809.
Neumann, C. S., Hare, R. D., & Newman, J. P. (2007).
The super-ordinate nature of the psychopathy checklist-
revised. Journal of Personality Disorders, 21(2), 102-117.
Nichols, S. (2002). How psychopaths threaten moral ratio-
nalism: Is it irrational to be amoral? Monist, 85(2), 285-
303.
Nichols, S. (2004). Sentimental rules: On the natural foun-
dations of moral judgment. New York: Oxford University
Press.
Nichols, S. (2007a). After incompatibilism: A naturalistic de-
fense of the reactive attitudes. Philosophical Perspectives,
21(1), 405-428.
Nichols, S. (2007b). On the psychological diversity of moral
insensitivity. In O. Vilarroya & L. Valencia (Eds.), Biology
of conflicts and cooperation. Barcelona: Littera Ediciones.
Nichols, S., & Folds-Bennett, T. (2003). Are children moral
objectivists? Children’s judgments about moral and
response-dependent properties. Cognition, 90(2), B23-
B32.
Nichols, S., & Vargas, M. (2008). How to be fair to psy-
chopaths. Philosophy, Psychiatry, and Psychology, 14,
153-155.
Nunner-Winkler, G., & Sodian, B. (1988). Children’s understanding of moral emotions. Child Development, 59(5),
1323-1338.
Patrick, C., Zempolich, K., & Levenston, G. (1997). Emo-
tionality and violent behavior in psychopaths - A bioso-
cial analysis. In A. Raine, P. A. Brennan, D. P. Farrington, & S. A. Mednick (Eds.), Biosocial bases of violence
(Vol. 292, p. 145-161). New York: Plenum Press.
Pereboom, D. (2007). Hard incompatibilism. In J. Fischer,
R. Kane, D. Pereboom, & M. Vargas (Eds.), Four views on
free will (p. 85-126). Blackwell.
Phan, K., Wager, T., Taylor, S., & Liberzon, I. (2002). Func-
tional neuroanatomy of emotion: A meta-analysis of
emotion activation studies in PET and fMRI. Neuroimage,
16(2), 331-348.
Piaget, J. (1932). The moral judgment of the child. London:
Kegan Paul.
Prehn, K., Wartenburger, I., Meriau, K., Scheibe, C., Good-
enough, O. R., Villringer, A., et al. (2008). Individual dif-
ferences in moral judgment competence influence neural
correlates of socio-normative judgments. Social Cognitive
and Affective Neuroscience, 3(1), 33-46.
Salekin, R. T., Neumann, C. S., Leistico, A. M. R., & Zalot, A. A.
(2004). Psychopathy in youth and intelligence: An inves-
tigation of Cleckley’s hypothesis. Journal of Clinical Child
and Adolescent Psychology, 33(4), 731-742.
Salekin, R. T., Rogers, R., & Sewell, K. (1996). A review
and meta-analysis of the psychopathy checklist and psy-
chopathy checklist-revised: Predictive validity of danger-
ousness. Clinical Psychology-Science and Practice, 3(3),
203-215.
Sanfey, A., Hastie, R., Colvin, M., & Grafman, J. (2003).
Phineas gauged: decision-making and the human pre-
frontal cortex. Neuropsychologia, 41(9), 1218-1229.
Sanfey, A., Rilling, J., Aronson, J., Nystrom, L., & Cohen, J.
(2003). The neural basis of economic decision-making in
the ultimatum game. Science, 300(5626), 1755-1758.
Saunders, L. (2009). Reason and intuition in the moral
life: A dual process account of moral justification. In
J. S. B. T. Evans & K. Frankish (Eds.), In two minds: Dual
processes and beyond (p. 335-363). Oxford: Oxford Uni-
versity Press.
Schroeder, M., Schroeder, K., & Hare, R. D. (1983). Gener-
alizability of a Checklist for Assessment Of Psychopathy.
Journal of Consulting and Clinical Psychology, 51(4), 511-
516.
Shoemaker, D. (2007). Moral address, moral responsibil-
ity, and the boundaries of the moral community. Ethics,
118(1), 70-108.
Shoemaker, D. (2009). Responsibility and disability.
Metaphilosophy, 40(3-4, Sp. Iss. SI), 438-461.
Singer, T., Seymour, B., O’Doherty, J., Stephan, K., Dolan,
R., & Frith, C. (2006). Empathic neural responses are
modulated by the perceived fairness of others. Nature,
439(7075), 466-469.
Slote, M. (2003). Sentimentalist virtue and moral judgement
- Outline of a project. Metaphilosophy, 34(1-2), 131-143.
Spiecker, B. (1988). Psychopathy - The incapacity to have
moral emotions. Journal of Moral Education, 17(2), 98-
104.
Stevens, D., Charman, T., & Blair, R. (2001). Recognition of
emotion in facial expressions and vocal tones in children
with psychopathic tendencies. Journal of Genetic Psychol-
ogy, 162(2), 201-211.
Stevenson, C. (1944). Ethics and language. New Haven: Yale
University Press.
Strawson, G. (1986). Freedom and belief. New York: Oxford
University Press.
Strawson, G. (1994). The Impossibility of Moral Responsi-
bility. Philosophical Studies, 75(1-2), 5-24.
Strawson, P. (1962). Freedom and resentment. Proceedings
of the British Academy, 48, 1-25.
Sutherland, S. (1976). Hume on Morality and the Emotions.
Philosophical Quarterly, 26(102), 14-23.
Thomson, J. (1976). Killing, Letting Die, and Trolley Prob-
lem. Monist, 59(2), 205-217.
Thomson, J. (1985). The Trolley Problem. Yale Law Journal,
94(6), 1395-1415.
Turiel, E. (1983). The development of social knowledge:
Morality and convention. Cambridge: Cambridge Univer-
sity Press.
Valdesolo, P., & DeSteno, D. (2006). Manipulations of emo-
tional context shape moral judgment. Psychological Sci-
ence, 17(6), 476-477.
Viding, E., Blair, R., Moffitt, T., & Plomin, R. (2005). Evi-
dence for substantial genetic risk for psychopathy in 7-
year-olds. Journal of Child Psychology and Psychiatry,
46(6), 592-597.
Watson, G. (1987). Responsibility and the limits of evil.
In F. Schoeman (Ed.), Responsibility, character, and the
emotions (p. 256-86). Cambridge: Cambridge University
Press.
Zajonc, R. (1980). Feeling and thinking - preferences need
no inferences. American Psychologist, 35(2), 151-175.
Appendix A
Empirical evidence from neuroimaging studies supports the notion that emotions are involved in moral judgments. For instance, in a study by
Singer et al. (2006) participants observed how people
who had either played fairly or unfairly in a prisoner’s dilemma game got punished with electric shocks.²
The researchers found that brain regions associated
with negative emotion were highly activated when
fair players were punished and less active when un-
fair players were punished. Another study by San-
fey, Rilling, et al. (2003) investigated the emotional
responses of participants in an ultimatum game.³
The investigators reported an increase in neuronal
metabolism in brain areas associated with emotions
when unfair behaviour was perceived. In an interesting study by Heekeren et al. (2003), participants had to evaluate whether sentences were semantically or morally
correct. When participants judged the moral content
the emotional structures of the brain (i.e., ventrome-
dial prefrontal cortex, posterior superior temporal sul-
cus) showed an increase in activity. Moreover, Moll,
Oliveira-Souza, Bramati, & Grafman (2002) found in-
creased metabolic activity in brain areas associated
with emotions when participants heard emotionally offensive sentences relative to neutral sentences. In
a related study using fMRI Moll, Oliveira-Souza, Es-
linger, et al. (2002) scanned participants while they
were viewing pictures of emotionally charged scenes
with and without moral content as well as emotion-
ally neutral pictures. The researchers reported that
the orbital and medial prefrontal cortex and the supe-
rior temporal sulcus are recruited by viewing scenes
evocative of moral emotions. In another interesting
study, Moll et al. (2003) instructed participants to indicate whether a moral or factual sentence was right. An example of an incorrect moral sentence would be “They electrocuted an innocent person”, whereas an example of a factually incorrect sentence would be “Trees are made of plastic”. The data confirmed the
predictions of the researchers. Emotional areas were
more active when participants had to make moral
judgments relative to non-moral judgments.
Recently, Prehn et al. (2008) investigated individ-
ual differences in moral judgment competence. They
found that participants’ moral judgment competence
² The shocked persons were, of course, confederates of the experimenter, and no real electric shocks were delivered to them.
³ The ultimatum game (Güth & Yaari, 1992) is often used in economic experiments. Two players have to decide how to divide a sum of money that is given to them. The first player proposes how to divide the sum between the two players, and the second player can either accept or reject this proposal. If the second player rejects, neither player receives anything. If the second player accepts, the money is split according to the proposal.
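To make the payoff logic of the ultimatum game described in footnote 3 concrete, the following minimal Python sketch simulates a single round. The stake size, the offers, and the responder’s acceptance threshold are illustrative assumptions chosen for demonstration, not parameters taken from Güth and Yaari (1992).

# Minimal sketch of one round of the ultimatum game (see footnote 3).
# Stake, offers, and the acceptance threshold are illustrative assumptions.

def ultimatum_round(stake, offer, acceptance_threshold):
    """Return the (proposer, responder) payoffs for one round."""
    if offer >= acceptance_threshold:
        # Proposal accepted: the money is split according to the proposal.
        return stake - offer, offer
    # Proposal rejected: neither player receives anything.
    return 0, 0

stake = 10
for offer in (1, 3, 5):
    proposer, responder = ultimatum_round(stake, offer, acceptance_threshold=3)
    print(f"offer {offer}: proposer gets {proposer}, responder gets {responder}")

Offers that fall below the responder’s threshold leave both players with nothing; it is this economically “irrational” rejection of unfair offers whose emotional correlates Sanfey, Rilling, et al. (2003) investigated.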
was related to the left ventromedial prefrontal cor-
tex and the left posterior superior temporal sulcus
when participants had to judge social norm viola-
tions. A conceptually related study by Berthoz et al.
(2006) examined whether the amygdala is activated
when participants are judging their own moral viola-
tion of social norms. The researchers asked partici-
pants to evaluate the inappropriateness of social be-
haviours. Participants had to judge situations in which
they themselves, or someone else, transgressed so-
cial norms either intentionally or accidentally. In-
creased amygdala activity was found when partici-
pants judged their own intentional transgression of
social norms. The investigators concluded that the
amygdala is associated with affective responses to
moral transgressions.
Bechara et al. (2000) focused on the role of the orbitofrontal cortex in decision making and emotional processing. After reviewing the literature, the researchers concluded that the orbitofrontal cortex plays an essential role in a neural network responsible for decision making and emotional processing.
Using meta-analytic methods Phan et al. (2002) con-
cluded that there are certain structures which are reg-
ularly associated with emotions. These structures in-
clude the orbitofrontal cortex, the temporal pole, the
insula, and the anterior cingulate cortex. In other words, the structures recruited during moral judgment overlap with those regularly associated with emotion.
Recently Harenski et al. (2010) hypothesized a dif-
ferential activation in medial prefrontal areas for im-
plicit and explicit moral tasks. In their study partic-
ipants viewed unpleasant pictures, half of which de-
picted moral violations. Half of the participants were
instructed to judge moral violation severity (explicit
task) while the other half had to indicate whether
the picture was taken indoors or outdoors (implicit
task). The results confirmed the hypothesis of the
researchers. Increased ventromedial prefrontal ac-
tivity was observed when participants performed the
explicit task compared to the implicit task. How-
ever, both groups showed an increase in temporo-
parietal junction activity when presented with pic-
tures of moral violations.
However, it has been remarked that the data are inconclusive and that the conclusion that emotion is necessary for making moral judgments is not justified, due to the correlational nature of the data (e.g., Huebner et al., 2009).
Appendix B
Another line of evidence comes from psychopathic
individuals. The emerging evidence suggests that psy-
chopaths have an emotional deficit.⁴ In order to cat-
egorize a person as psychopathic clinicians often use
the 20-item psychopathy checklist (Hare, 1980, 2003;
Hare & Neumann, 2008). Examples of checklist items are lack of remorse or guilt, lack of empathy, and failure to accept responsibility for one’s own actions (Harpur et al., 1988; Salekin et al., 1996; Hob-
son & Shine, 1998; Neumann et al., 2007). People
who score high on these traits are categorized as psy-
chopaths (see also Schroeder et al., 1983).
Psychopathy is not equivalent to a diagnosis of antisocial personality disorder (American Psychiatric Association, 1994), but it represents a subset of this disorder. In addition to the diagnostic criteria for
antisocial behaviour, psychopathy is marked by emotional impairments (e.g., lack of guilt). The American Psychiatric Association (1994) gives the following
description in the Diagnostic and Statistical Manual
of Mental Disorders (DSM-IV):
“Individuals with Antisocial Personality Disorder
frequently lack empathy and tend to be callous,
cynical, and contemptuous of the feelings, rights,
and sufferings of others. They may have an inflated
and arrogant self-appraisal (e.g., feel that ordinary
work is beneath them or lack a realistic concern about
their current problems or their future) and may be
excessively opinionated, self-assured, or cocky. They
may display a glib, superficial charm and can be
quite voluble and verbally facile (e.g., using technical
terms or jargon that might impress someone who is
unfamiliar with the topic). Lack of empathy, inflated
self-appraisal, and superficial charm are features
that have been commonly included in traditional
conceptions of psychopathy that may be particularly
distinguishing of the disorder and more predictive
of recidivism in prison or forensic settings where
criminal, delinquent, or aggressive acts are likely to be
nonspecific” (American Psychiatric Association, 1994,
p.703).
In some groundbreaking work in the 1990s James
Blair (see Blair et al., 2005) showed that psychopaths
respond atypically to standard kinds of questions
about morality. There is a large body of work in devel-
opmental psychology (Piaget, 1932; Kohlberg, 1971;
Turiel, 1983) that shows that children can distinguish
between classic moral violations (e.g., hitting someone) and classic conventional violations (e.g., talking in class) (see Nichols, 2007b). In general, children rate
conventional violations as less serious than moral vio-
lations (Blair, 1997a). When children are asked why it is
⁴ Psychopaths have no deficit in their reasoning abilities. Their IQ scores are normal and their cognitive architecture is mainly intact (but see Cleckley, 1941; Salekin et al., 2004).
not okay to talk during class, they will normally say something like “because it is against the rules”, whereas, if children are asked why it is wrong to hit someone, they will say “because someone gets hurt” (Nunner-Winkler & Sodian, 1988; Nichols, 2002; Nichols &
Folds-Bennett, 2003). In experiments with children who had psychopathic tendencies, Blair (1997b) found that these children responded differently from normal children when asked some of these questions. In particular, children with psychopathic tendencies were more likely to say that if there is no rule against hitting, then it is okay to hit.⁵ Moreover,
Blair (1995) found that adult psychopathic criminals
also responded differently to these kinds of questions
compared to non-psychopathic criminals. In one study conducted in England, Blair ran experiments with psychopathic and non-psychopathic murderers and asked them the standard moral/conventional questions. When he
asked “Why is it wrong to hurt other people?” the non-
psychopathic individuals gave the normal response
“because you shouldn’t hurt other people”. In con-
trast, when he asked psychopaths the typical response
was “it is not the done thing”, which is the British equivalent of “you’re not supposed to”. What Blair’s
studies show is that psychopaths fail to appreciate the
deep feeling of wrongness that we all have to certain
violent actions (Spiecker, 1988). We all take these feelings to be very serious and important, and they are not
dependent on any external rules (Nichols, 2007b).
This lack of appreciation of the difference between
moral violations and conventional violations is cap-
tured in a remark by the psychopathic serial killer Ted
Bundy who was sentenced to death and executed in
1989 (Federal Bureau of Investigation, 2005). In inter-
views from prison he listed a number of things that he
said are wrong.
He said: “It is wrong for me to jaywalk. It is wrong
to rob a bank. It is wrong to break into other people’s
houses. It is wrong for me to drive without a driver’s
license. It is wrong not to pay your parking tickets. It is
wrong not to vote in elections. It is wrong to intention-
ally embarrass people” (Michaud & Aynesworth, 1989,
p.119).
Ted Bundy, presumably inadvertently, is revealing something deep about his psychology. Normal people would never equate jaywalking and bank robbery. By
fusing all this together he illustrates exactly the point
made by Blair’s studies. Psychopaths have a seriously deficient understanding of moral violations. They do not
appreciate the gravity of moral violations. However,
it seems as if they are able to differentiate right from
wrong. All the things in Bundy’s list are wrong, but it seems as if Bundy does not distinguish between the “radically different kinds of wrongs” (Nichols, 2004,
p.76). He does not differentiate between conventional
and moral violations. Psychopaths fail to appreciate
the distinctive character of moral wrongness and this
is something that normal people do from a very young
age (approximately 3 or 4 years) (Blair, 1997b; Knafo
et al., 2008). Blair et al. (2005) offer an emotion-based explanation for why psychopaths show this pattern of response. Blair maintains that it results from a lack of emotional responsiveness, and he investigated this by showing emotionally distressing pictures (e.g., a cry-
ing child) to psychopaths and normal adults (Blair et
al., 1997). Physiological measures show that normal
adults react with heightened physiological responses
when shown these pictures. However, psychopaths show significantly weaker physiological responses than non-
psychopaths. That suggests that they are less sensitive
to cues of suffering than the rest of us (Nichols & Var-
gas, 2008). Using brain-monitoring techniques, Blair and his colleagues found that psychopaths have abnormally low activity in the amygdala, which is associated with emotional responses (Stevens et al., 2001). Blair’s
theory is that the problem with psychopaths’ moral understanding is due to their deficient emotional reactions (Blair, Sellars, et al., 1995). This provides the basis
for a scientific characterization of psychopaths hav-
ing a mental disease or defect which is perhaps genet-
ically determined (Viding et al., 2005). Recent findings suggest that there is some genetic abnormality in these individuals that leads them to have abnormal development of the amygdala (Patrick et al., 1997;
Hariri et al., 2005). Consequently, these individuals
show a deficit in emotion processing, in particular in processing the suffering of other people.
Following Blair’s theory, our emotional responses to others’ suffering play a critical role in developing a normal appreciation of the wrongfulness of violent offences (Blair et al., 2005). It is because we have these
feelings that we assign such significance to hitting
and killing other people. Because psychopaths have
a diminished capacity for emotional processing they
fail to develop a normal sense of the wrongfulness of
harming and killing others (Nichols & Vargas, 2008).
Recently it has been noted that it is questionable whether psychopaths should be held morally and legally responsible (Shoemaker, 2007, 2009). It is arguable that they do not meet the criteria for legal sanity at all (Her-
pertz & Sass, 2000; Nichols & Vargas, 2008). Indeed,
one term that has been used to describe psychopaths is that they are “morally insane” (Ellis, 1902; Blair, Jones, et al., 1995).
To sum up, psychopaths criminally hurt people because they have a deficiency in their emotional sensitivity to others’ pain. But perhaps this deficiency was never under their control because it was largely the result of genetics. Thus, it is conceivable that they should not be held morally or legally responsible for their crimi-
⁵ Children with psychopathic tendencies were compared to children with other behavioral and developmental problems, and the difference was still significant (see also Fisher & Blair, 1998).
nal acts of violence. One argument against this is that
psychopaths have only a diminished understanding of morality; it is not completely absent. The data (e.g., Blair, 1997b) show that children with psychopathic tendencies do distinguish moral violations from conventional ones; their understanding is just diminished when
compared to other children. It could be argued that
having some minimal understanding of morality is
sufficient to meet the test of sanity.
Another question is whether psychopathy is a mental illness at all, because psychopaths often view themselves as totally normal (Nichols, 2004). The psychologist Linda Mealey (1955-2002) argued that psychopathy is not a mental illness and not the product of brain damage or anything like that. She proposed that psychopathy should be seen as a biological trait which can be advantageous (for the psychopath, not for us). She draws on the idea of what
is termed an “evolutionarily stable strategy” (Mealey,
1995). In game theory, for example, what’s best for
a given individual will depend on the other members
of the population (for further considerations see Book
& Quinsey, 2004). This can also be said for geneti-
cally fixed traits (Mealey, 1997). Mealey argues that if everyone is sympathetic to other people, then that creates an opportunity for a new group to evolve. This group
can exploit the sympathetic behavior of others. This
is comparable to a biological niche that is available.
It might have been a natural evolutionary development for psychopaths to emerge and, if that is right, psychopathy is not a neuropsychological disor-
der or a developmental disorder. It is a trait that basi-
cally works to take advantage of the rest of us (Glenn &
Raine, 2009). Of course, this is speculative, but it makes clear that it remains an open question whether
psychopathy is a mental disease or defect or simply a
different way of being.
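Mealey’s argument is, at its core, a claim about frequency-dependent selection: a “cheater” strategy can prosper while it is rare because it exploits a majority of sympathetic cooperators, but its advantage shrinks as cheaters become more common. The following minimal sketch illustrates this logic with discrete replicator dynamics; the payoff values are illustrative assumptions chosen for demonstration, not figures taken from Mealey (1995).

# A minimal sketch of frequency-dependent selection between a cooperative
# strategy and an exploitative "cheater" strategy. The payoff values are
# illustrative assumptions, not figures from Mealey (1995).

import numpy as np

# Payoff matrix: rows = focal strategy, columns = opponent strategy.
# Strategy 0 = sympathetic cooperator, strategy 1 = exploitative cheater.
PAYOFFS = np.array([
    [3.0, 2.5],   # cooperator meeting (cooperator, cheater)
    [3.5, 0.0],   # cheater meeting (cooperator, cheater): thrives only when rare
])

def replicator_step(freqs):
    """One discrete replicator-dynamics update of the strategy frequencies."""
    fitness = PAYOFFS @ freqs        # expected payoff of each strategy
    mean_fitness = freqs @ fitness   # population-average payoff
    return freqs * fitness / mean_fitness

freqs = np.array([0.99, 0.01])       # cheaters start as a rare mutant
for _ in range(200):
    freqs = replicator_step(freqs)

print(f"equilibrium cheater frequency: {freqs[1]:.2f}")

With these made-up payoffs the cheater strategy invades from rarity but settles at a small equilibrium frequency (roughly 0.17) rather than taking over the population, which is the sense in which exploiting the sympathetic majority can occupy a stable but limited niche.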