Moral Tribes: Emotion, Reason, and the Gap Between Us and Them

New York: Penguin Books, 2014


About the Author: Joshua D. Greene is the John and Ruth Hazel Associate Professor of the Social Sciences and the director of the Moral Cognition Laboratory in the Department of Psychology, Harvard University. He studies the psychology and neuroscience of morality, focusing on the interplay between emotion and reasoning in moral decision-making. His broader interests cluster around the intersection of philosophy, psychology, and neuroscience.


We live in an age of historically declining violence and expanding kindness. But it doesn’t feel like that to most of us. Unprecedented global threats and conflicts demand advances in our ability to coexist peacefully. Greene believes such improvements require changes in our moral thinking and a global morality to help resolve disagreements. This book aims to lead us toward these goals, using philosophy, psychology, and neuroscience to illuminate the nature of modern moral disputes, how they differ from the kinds of problems our brains evolved to solve intuitively, and how our different modes of thinking fit these two types of problems.

Pain matrix
When children see an image of a person in pain, portions of their brain register that pain on an fMRI scan. When the children see a person being intentionally hurt, portions of the brain associated with moral reasoning are also activated. See more at: UChicagoNews

For years the overwhelming consensus was that morality has its origin in religious thought, and that without belief in and guidance from a higher source we would revert to the immoral savages we once were. However, recent studies in animal behavior, developmental psychology, and neuroscience have transformed this notion. From the observations of de Waal and the findings of Bloom, Dunbar, Lieberman, and others, we learn that our moral behavior is innate: it evolved over millions of years to promote cooperation within our group. Each group has its moral code, which provides a map for how individuals can live successfully within it. Because we were able to make decisions that favored the group over the individual, the human race was able to evolve, adapt, and expand as it has.

To gain a deeper understanding of human morality, Joshua Greene scanned the brains of individuals while they puzzled over philosophical questions such as the famous Trolley Problem.

One version, the Footbridge Dilemma, places you on a bridge with a clear view of an oncoming trolley about to strike and kill five workers. The only means of preventing their deaths is to push a rather large observer off the bridge onto the tracks to stop the trolley; this one death would prevent the deaths of the other five, which would arguably be the best outcome. Yet most people find this action morally unacceptable, even when the number of lives saved is raised to millions.

Another version, the Switch Dilemma, poses a similar situation, but the participant can avoid the deaths of the five workers by throwing a switch that redirects the trolley onto a different track, where a single worker will be killed. Most people find this action acceptable.

Why is sacrificing one life to save five acceptable in one instance but not in the other? Greene discovered that we contemplate issues of right and wrong differently in different situations because the neural networks in our brains respond differently depending on the degree of personal involvement in a dilemma. Functional MRI (fMRI) testing reveals that Footbridge-type scenarios activate the ventromedial prefrontal cortex (VMPFC) and the amygdala, regions associated with intuitive, emotional responses; they are quick, even impulsive. Switch-type scenarios, by contrast, activate the dorsolateral prefrontal cortex (DLPFC), an area associated with cognitive control and with slower, more deliberative responses.

The Switch case seems more acceptable than the Footbridge case because of the use of personal physical force in the latter, which by itself ought not to seem morally relevant (we sacrifice one life to save five either way) but is psychologically relevant to our dual-process brains. Only 31% would push the bystander off the footbridge, while 87% would flip the switch. Additional versions of the dilemma reveal more. Majorities would act in versions that save the five through an action that kills a single person less directly: by opening a trap door to drop him onto the tracks (59-63%); by knocking him onto the tracks in a rush to reach the switch (81%); by throwing a switch that diverts the trolley onto a looping sidetrack, where it kills one person whose body stops it from circling back toward the five (81%); and by directing a parallel trolley onto a sidetrack, where its collision with one victim triggers a sensor that stops both trolleys (86%).

Jack Kevorkian
Jack Kevorkian was a pathologist who assisted people suffering from acute medical conditions in ending their lives. After years of conflict with the courts over the legality of his actions, he spent eight years in prison following a 1999 conviction. Kevorkian's actions spurred national debate on the ethics of euthanasia and hospice care. He died in Royal Oak, Michigan, on June 3, 2011.
Drone firing missile
In 2014 the group Reprieve found that drone strikes targeting 41 people in Yemen and Pakistan had killed more than 1,000 other, unnamed people.

These same kinds of judgments distinguish between killing as collateral damage and killing intentionally, and between prescribing palliative drugs that hasten the death of a terminally ill patient and prescribing drugs to end the patient’s life. A similar distinction judges harm caused by things we actively do to be worse than harm caused by inaction. This distinction, too, is widely applied; for example, doctors may not cause a patient’s death, but they may sometimes permit a patient to die. None of these decisions reflects deliberative thought; they are automatic moral judgments.

Jane Goodall
Jane Goodall is one of the world’s leading voices on climate change and protecting the environment. At the U.N. climate summit in Paris in December 2015, she talked about Republican climate change denial, the link between diet and climate change, her hopes "to save the rainforests" from corruption and intensive farming, and how climate concerns drove her to become a vegetarian.

Even ordinary decisions often involve the dual-process brain, for example when deciding between immediate gratification and the “greater good” of future consequences: now vs. later. fMRI experiments with such decisions reveal brain activity like that in the Footbridge and Switch Dilemmas. Decisions for immediate reward exhibit increased VMPFC activity that is absent in the DLPFC-heavy decisions for delayed reward. Similar fMRI results appear when people try to regulate their emotions, for example by looking at pictures that typically elicit negative emotions while trying to apply a positive description of the pictured activity, in effect reappraising the pictures. And similar brain activity appears when people view pictures of racial out-groups (whites viewing pictures of black people, for example).

Greene suggests that the two main problems of modern existence are similarly affected by the dual way our brains are activated. Both problems are essentially social: “Me vs. Us” and “Us vs. Them.” “Most of morality is about gut feelings,” Greene says. “Our gut feelings enable us to be cooperative, to form groups. But the same gut feelings that enable us to form a cohesive group—that enable people to put ‘us’ ahead of ‘me’—also make us put ‘us’ ahead of ‘them,’ not just in terms of our interests, but in terms of our values.”

Sheep Wars
The Sheep Wars were a series of armed conflicts in the Western United States between sheepmen and cattlemen over grazing rights. Between 1870 and 1920, approximately 120 engagements occurred in eight different states or territories. At least 54 men were killed and some 50,000 to over 100,000 sheep were slaughtered.

He discusses the well-known tragedy of the commons, in which multiple sheepherders share a pasture but each acts in his own self-interest, eventually overgrazing and destroying the pasture. He argues that this problem is not really tragic, because in “Me vs. Us” situations our established intuitive mode of thinking, which he compares to the “point and shoot” settings of a camera, produces emotional responses such as guilt and empathy that nudge us toward solutions favoring cooperation with the group. But at the same time, our “point and shoot” responses lead us to solidify the loyalty and strengthen the affinity we feel for our own group at the expense of caring for those outside it. Over time, each group, tribe, and community develops a different moral system as it implements different methods to avert the tragedy of the commons.

The “Us vs. Them” problem is more difficult to overcome, as we see every day among different racial, religious, political, and national groups. All groups may want the same things: health, food, shelter, and leisure. They may share the same core values of honesty and morality; and even in conflict, our minds work similarly, fighting not just for ourselves but for family, friends, community, and our idea of justice. Nevertheless, our specific values and interests may differ from other groups’, producing disagreements as we try to coordinate and solve problems. Here “point and shoot” responses, in which we unreflectively follow our moral instincts, only further complicate our ability to find solutions. Differences in interests affect our gut intuitions about what is to be done because, Greene affirms, “our biases are baked into our gut reactions.” Each group’s view of the facts is likely to be colored by its biases, history, and commitments, and these may well be incompatible with those of other groups. Greene calls this the tragedy of common-sense morality.

He suggests that when both sides of a problem have strong gut reactions that are incompatible, even if in some abstract sense they reflect similar values, the problem can only be solved if we “shift into the manual mode” of the camera metaphor and think more reflectively. Psychologically, these two modes resemble emotion and reason. Emotions are automatic; you cannot choose to experience an emotion (though you can choose to experience something likely to trigger one). For most activities, our automatic settings, our emotional responses, guide us well. But when we recognize the need, our manual mode, reason, must be able to override those automatic emotions.
