Tuesday, March 11, 2008

Morality as Genetic Predisposition and Neurobiology

A look at the emerging field of moral psychology

Within the intersecting disciplines of psychology, neurobiology, philosophy of mind, ethics, and cognitive science, a new field of inquiry has emerged of late. Although it goes by different names, including such recent coinages as ‘neuromorality’, the field is perhaps best referred to as moral psychology. I touched on this topic in a previous column, but given the recent fixation of the media upon it, I thought it was time to take a closer look.


One might describe moral psychology as an emerging field of research that delves into questions that have long captivated a broad array of disciplines in the Arts and Sciences, some for several centuries: To what extent do our own bodies influence and determine our moral judgments and behavior? Are there genetic predispositions for everything from altruism to serial killing? How are we to make sense of the uniquely human endeavor of formulating moral judgments? Can an understanding of neurobiology and genetics shed any light on this?


For many researchers in this field, such questions boil down to the challenge of mapping out what some would call the "neuro-anatomy of moral judgment." Across the country, moral psychologists, working in tandem with behavioral psychologists, evolutionary biologists, and researchers in related fields, believe they are hot on the trail of figuring out how humans are "wired for morality."


As a token example of the work in this field, we might note the investigations of Harvard University psychology professor Marc Hauser, author most recently of Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong (2006). Hauser was featured last May in the Harvard University Gazette Online:


In a talk April 26, psychology professor Marc Hauser argued that our moral sense is part of our evolutionary inheritance. Like the “language instinct” hypothesized by linguistic theorist Noam Chomsky, the capacity for moral judgment is a universal human trait, “relatively immune” to cultural differences. Hauser described it as a “cold calculus,” independent of emotion, whose workings are largely inaccessible to our conscious minds.


Hauser, along with other leaders in this emerging field, would have us conclude that morality is ultimately explainable in empirical terms: genetic evolution and inheritance, brain anatomy, neuronal activity, mixed with our environment and education—this and little more.


And that is where the trouble with neuromorality begins. It is indeed unfortunate that the pioneers of the new moral psychology—given all the potential for truly breathtaking and worthwhile insights that their discipline can provide—appear all too ready to succumb to the intellectual hubris that would reduce the broader whole of understanding to one very narrow vantage point. And this is already leading to untenable extremes.


A case in point is a lengthy article entitled “The Moral Instinct,” which ran recently in The New York Times Magazine and was authored by another Harvard psychologist, Steven Pinker. I will engage in a lengthier critique of Pinker’s article next week, but just to give you a taste of some of the unfortunate excesses of neuromorality, allow me to share and comment on the following amazing paragraphs. Writes Pinker:



The gap between people’s convictions and their justifications is also on display in the favorite new sandbox of moral psychologists, a thought experiment devised by the philosophers Philippa Foot and Judith Jarvis Thomson called the Trolley Problem. On your morning walk, you see a trolley car hurtling down the track, the conductor slumped over the controls. In the path of the trolley are five men working on the track, oblivious to the danger. You are standing at a fork in the track and can pull a lever that will divert the trolley onto a spur, saving the five men. Unfortunately, the trolley would then run over a single worker who is laboring on the spur. Is it permissible to throw the switch, killing one man to save five? Almost everyone says ‘yes’.


Consider now a different scene. You are on a bridge overlooking the tracks and have spotted the runaway trolley bearing down on the five workers. Now the only way to stop the trolley is to throw a heavy object in its path. And the only heavy object within reach is a fat man standing next to you. Should you throw the man off the bridge? Both dilemmas present you with the option of sacrificing one life to save five, and so, by the utilitarian standard of what would result in the greatest good for the greatest number, the two dilemmas are morally equivalent. But most people don’t see it that way… When pressed for a reason, they can’t come up with anything coherent, though moral philosophers haven’t had an easy time coming up with a relevant difference, either (p. 35, emphasis my own).


What this is supposed to show is that our deepest convictions about right and wrong are based not on reasons but on deep-seated tendencies, hardwired into our brains by our DNA and evolutionary history. The fact that people have a hard time coming up with reasons for their moral convictions is adduced as evidence either that there are no reasons, or that any reasons given are utterly relative and may or may not reflect the deeper workings of our DNA-driven psychological dispositions.

Pinker’s interpretation of the Trolley Problem—and presumably that of most people in the survey—fails to distinguish between intending to harm and allowing a foreseeable harm on reasonable grounds. The former is immoral; the latter may constitute a licit option, depending on the case. Which is to say, the natural law tradition clearly provides reasons why it might be licit to pull the lever and divert the trolley onto the spur (the first case), and reasons why it would never be licit to throw the fat man down onto the tracks (the second case). The fact that the persons surveyed had trouble articulating reasons for their moral convictions should not suggest that morality is ultimately irrational—determined within the deep recesses of our genetically predisposed subconscious—but simply that most people today have little or no formal training in ethics, let alone in natural law theory. But more on this next week.

To conclude, the field of moral psychology is in many ways fascinating. It will undoubtedly make many valuable contributions not only to our philosophical understanding of human nature and morality, but also to our cultural deliberations about how to educate young people to live sound moral lives. It will do these a grave disservice, however, if moral psychologists aim to reduce entire fields of human understanding (in this case, moral knowledge) to "nothing but" the stuff of neurological function and evolutionary biology.