We all make a decision every day, consciously or unconsciously: Am I going to cooperate today? Perhaps the question is cooperation with a partner. Or the kids. Or a neighbor. Or the people at school or work. Are we going to “go along to get along,” as the old saying goes? How much of ourselves are we going to give away? Because it feels like that sometimes, doesn’t it? That cooperation is giving parts of yourself away?
Dr. Joshua D. Greene is a cognitive neuroscientist, a philosopher, and the director of Harvard’s Moral Cognition Laboratory. Greene has been doing some fascinating work, which he describes on the website Edge.org, and he has just published a book, Moral Tribes: Emotion, Reason, and the Gap Between Us and Them.
I think that Greene succinctly sums up a central aspect of what morality means. Greene says, “Morality is fundamentally about the problem of cooperation.”
I think Professor Greene is onto something here. This idea clarifies a lot of things that get muddied up when we start reading books on morality and ethics: “Morality is fundamentally about the problem of cooperation.” The thesis of his new book is that our moral lives involve two kinds of human interaction: “me versus us” and “us versus them.”
My examples about getting up in the morning and deciding to cooperate with others (or not) focus on the individual cooperating with a group. But groups cooperate, or don’t, as well, and in those cases, too, I think the formula holds: morality is about cooperation.
We saw a failure to cooperate recently in the Washington budget brouhaha. We see it in Egypt. We see it in Syria. We see it in spying on foreign leaders. We see it in drone strikes. Figuring out what’s moral and what’s not moral is not difficult: “Morality is fundamentally about the problem of cooperation.”
Except . . . Professor Greene does insert that little word “problem.” Greene puts it this way:
Each moral tribe has its own sense of what’s right or wrong—a sense of how people ought to get along with each other and treat each other—but the common senses of the different tribes are different. That’s the fundamental moral problem. (http://wisdomresearch.org/Arete/Greene.aspx)
It’s hard to cooperate with a group that sees things differently. For example, I don’t like a group that would cut funding for food stamps. I don’t like a group that would spy on foreign allies. I don’t like a group that sees “god” differently from the way I do. The list of groups I don’t like goes on and on! (And it may well be that THEY don’t like me either! Maybe they even want to hurt me!) And the perimeters of the groups expand and contract and shift constantly. I don’t think I want to cooperate at all!
Here’s a novel idea: let’s kill everybody we don’t agree with! Well . . . that’s a problem, isn’t it? That’s not such a novel idea, unfortunately.
Greene likens our moral thinking to a camera with two modes: a point-and-shoot, auto-focus mode and a manual mode, in which all the settings have to be consciously manipulated (you know, focus, f-stops).
“Bomb everybody different from us” is the auto-focus, point-and-shoot mode. It’s automatic. It’s gut. And, it’s immoral. It’s a failure to cooperate. The more remote the other group is from us, the more likely we are to react in the point-and-shoot mode.
A key finding in the research done at the Moral Cognition Laboratory is that we have no specific area of the brain that controls moral decision making. When people are asked moral questions, at least three areas of the brain light up. And they are the same three areas that light up when we are asked questions about buying things. Economic decisions.
Several systems work together, evaluating the probability of success and the diminishing returns we are likely to reap. So it appears that our moral reasoning has something to do with how we acquired food back in our hunter-gatherer days.
Imagine you are hiding in a tree. Naked. No weapons. And there’s a dead rabbit right over there and you’re hungry—do you hop down out of the protective environment and take a chance?
One of the basic calculations a hunter-gatherer makes concerning food is, How dangerous is this to me? What’s the profit and what’s the loss?
(We have to be very careful when we get into explanations based on evolution. Neuroscientists can clearly see brain functions in these experiments, but the “why” is much more difficult to discover.)
We all know that watching someone die in our arms feels different than hearing about a death on the telephone. Or reading about it in the newspaper. Or seeing it on television.
Distant things—and distant groups—are much more difficult to care about. We always knew this; now we know it’s in our wiring. How to get around this flaw in a shrinking world is the challenge humanity will or will not solve. Religions and philosophies have been working on it for a while now . . . like, oh, seven thousand years, at least.
Still, it’s all about hopping down out of that tree and saying, “Hello.”
This content is cross-posted on the UU Collective, a Patheos blog.