Neuropsychologist Michael Gazzaniga on the innate sense of morality

Michael Gazzaniga is an American neuropsychologist, professor of psychology and director of the SAGE Center for the Study of the Mind at the University of California, Santa Barbara, and head of the Law and Neuroscience Project on the biological basis of morality.

Moral from birth

We unconsciously reproduce other people’s actions, imitate others, and simulate their emotions. We communicate in a variety of ways to cope with the social demands of the human world. And yet how is it that most of us get along? Why aren’t 7.3 billion people constantly attacking each other? Do we really rely on learned behavior and conscious reasoning, or do we have an innate tendency to behave appropriately? Is it possible that we acquired an innate sense of morality as a species whose members banded together for survival? Is killing unacceptable to us because our innate feelings tell us so, or because God, Allah, Buddha, or our government said so?

Questions about whether humans have an innate sense of morality are not new. David Hume asked them back in 1777: “…A recent controversy concerning the general foundations of morality deserves to be investigated. Do they arise from reason or from feeling…? Do we acquire knowledge of them through a chain of arguments and induction, or through direct feeling and a more subtle inner sensation…?” Philosophers and religious leaders have argued over these questions for centuries, but only now does neuroscience have the tools and empirical evidence to help answer them.

Anthropologist Donald Brown has compiled a list of human universals, concepts found in all cultures, many of which relate to what is considered moral behavior. Here are some of them: justice; empathy; the distinction between right and wrong and the redress of wrongs; praise and admiration for generous deeds; prohibitions on murder, incest, violence, and cruelty; rights and duties; shame. Psychologist Jonathan Haidt, in an effort to capture the characteristic features of all moral systems (and not just Western thinking), offered the following definition: “Moral systems are interlocking sets of values, virtues, norms, practices, identities, institutions, technologies, and evolved psychological mechanisms that work together to suppress or regulate selfishness and make cooperative social life possible.”

Moral intuition

Many intuitive ideas about morality are rapid, automatic judgments about behavior, tied to deep feelings of justice and propriety. They usually do not arise from a deliberate, conscious evaluation of a specific reason. If you witness a person intentionally violating one of the universal moral principles listed above, you are likely to feel an intuitive rejection of that behavior. A glaring example: a grandmother slaps a child who is peacefully playing in a sandbox. If you see this, you will instantly judge the act as bad, wrong, unacceptable, and you will be justifiably outraged. If asked about your judgment, you can easily explain it. Such an example, however, does little to answer Hume’s question. So Haidt came up with a different story and began presenting it to a variety of people:

Julie and Mark are brother and sister. They are traveling together in France during their summer vacation from college. One evening they are alone in a beach house and decide that it would be interesting and fun to try making love. At the very least, it would be a new experience for each of them. Julie is already taking birth control pills, but Mark uses a condom just in case. They both enjoy making love, but they agree not to do it again. That night becomes their special secret, which brings them even closer.
Was it acceptable for them to make love?

Haidt worked out the details carefully in order to touch all the deep instincts and moral foundations. He defines moral intuition as “the sudden appearance in consciousness, or at the fringe of consciousness, of an evaluative feeling (like-dislike, good-bad) about the character or actions of a person, without any conscious awareness of having gone through steps of searching, weighing evidence, or inferring a conclusion.” In his scenario, Haidt answered every likely objection in advance. He had no doubt that most people would consider the brother and sister’s act wrong and disgusting (which is what almost everyone said), but he wanted to get at the origins of such judgments (if they exist at all), origins apparently common to all of us. Why is this wrong? What does your rational brain say? As expected, many responded that the incest could have produced impaired children or that the experience could have hurt the siblings emotionally. However, the story rules out both possibilities. Haidt found that most respondents eventually said, “I can’t explain it, I just know it’s wrong.” Is this a rational judgment or an intuitive one? Have we internalized the moral principle that incest is unacceptable from our parents, religion, or culture, or is it an innate, natural rule that we struggle to overcome with rational arguments?

Incest is taboo in all cultures; it is universally regarded as bad human behavior. In 1891, Finnish anthropologist Edvard Westermarck put forward the following hypothesis: since people are not able to recognize their siblings by sight automatically (hence all those films in which a brother and sister grow up apart, meet by chance, and fall in love), humans have an innate mechanism that prevents incest and usually works. It makes a person indifferent to, or disgusted by, the thought of sexual relations with people with whom he or she spent a lot of time in childhood. It follows that childhood friends and step-siblings who grew up together, not just blood relatives, should also avoid marrying each other, and relevant studies confirm this.

Evolutionary psychologist Debra Lieberman has continued to study this question. She was interested in how a personal prohibition against incest (“sex with my sibling is reprehensible”) becomes generalized (“incest is unacceptable for everyone”), and whether that generalization arises spontaneously from within or is acquired through learning. She found that the more time a person spent under the same roof with siblings (whether biological, adopted, or step-siblings), the stronger, as a rule, that person’s moral attitude against incest, independent of the attitudes of society or parents and of the actual degree of kinship. Avoiding incest is not a rationally learned behavior or an attitude instilled in us by parents, friends, or religious mentors. If the prohibition were rational, it would not extend to adopted and step-siblings. It is a trait that was selected for over evolution, because in many situations it prevented the birth of less healthy offspring resulting from inbreeding and the expression of recessive genes. It is innate, which is why it is universal across cultures. However, your conscious, rational brain has no idea that you have an innate incest-avoidance system. All it knows is that in Haidt’s story a brother and sister had a sexual relationship, and that it was BAD.
And when you are asked why this is bad, your interpreter, which works only with the information it has (information that usually does not include the latest scientific literature on incest avoidance, but does include unpleasant feelings), tries to produce an explanation and offers up a number of considerations.

Moral judgments and emotions

Antonio Damasio and his group helped answer the question of whether emotional reactions play a causal role in moral judgments. He worked with a group of patients with damage to the part of the brain necessary for the normal generation of emotions, the ventromedial prefrontal cortex. They had problems both expressing and regulating emotions, but they had completely normal general intelligence, logical reasoning, and declarative knowledge of social and moral norms. Damasio’s team reasoned that if emotional reactions (mediated by the ventromedial prefrontal cortex) influence moral judgments, then such patients would give pragmatic answers to personal moral dilemmas (like the second variant of the trolley problem, the footbridge version, in which the trolley can be stopped only by pushing a large man off a bridge into its path), while their answers to impersonal dilemmas would be normal. During the brain scan, the patients first answered questions about low-conflict situations, such as “Would it be right to kill your boss?” Both the healthy subjects in the control group and the brain-damaged patients answered, “No, it’s wrong, it’s crazy.” However, everything changed with high-conflict, personal moral dilemmas (is it acceptable to harm some people for the benefit of others?), which usually provoke strong emotions. In addition to the second variant of the trolley problem, the subjects were offered the following situation: “A brutal war is going on, and you are hiding from enemy soldiers in a room with ten other people, one of them a small child. The child starts crying, which will give away your hiding place. Would it be right to strangle the child so that the other nine people are not discovered and killed?” On these questions, the judgments and reactions of the patients with ventromedial prefrontal lesions differed sharply from those of the control group. Experiencing no emotional reaction to the stories, they gave quick, pragmatic answers: of course the large man should be pushed onto the tracks; naturally the child should be strangled.

Moral emotions, moral rationalization, and the interpreter

Jonathan Haidt believes that a person first reacts to a dilemma through unconscious moral emotions and only then seeks a justification for the reaction. At that point the interpreter steps in and carries out a moral rationalization, drawing on information about culture, family, personal knowledge, and so on. As a rule, we do not engage in genuine moral reasoning, although it is possible. It happens only when we change our point of view, put ourselves in another’s shoes, and try to find the basis for our judgments. According to Marc Hauser, we are born with abstract moral rules and a readiness to acquire new ones (just as we are born with a readiness to learn a language), and then our environment, family, and culture constrain us and steer us toward a particular moral system (just as they steer us toward a specific language). Consider a variant of the trolley problem posed by Steven Pinker:

A runaway trolley is about to kill a schoolteacher. You can divert it onto a side track, but the trolley would then trip a switch sending a signal to a class of six-year-olds, giving them permission to name a teddy bear Muhammad.
Is it permissible to throw the switch?

This is not a joke. Not long ago, a British teacher at a private school in Sudan allowed her class to name a teddy bear after the most popular boy in the class, a boy who bore the name of the founder of Islam. She was jailed for blasphemy and threatened with a public flogging, while a crowd outside the prison demanded her death. For the protesters, the woman’s life was worth less than exalting the dignity of their religion, and their decision about whether to redirect the imaginary trolley onto the other track would differ from ours. Whatever principles govern people’s moral judgments, those judgments cannot be all that universal. Anyone who stayed awake through Anthropology 101 can offer many other examples. Although Pinker’s reasoning raises some difficulties, they can be resolved within our theory of universal, innate moral behavior; we just have to take the influence of culture into account. Jonathan Haidt and his colleagues can help us here.

The universal components of morality

Haidt and Craig Joseph compiled a list of universal components of morality by comparing work on human universals, on cultural differences in morality, and on the rudiments of morality in chimpanzees. The five components they identified concern suffering (one should help others and not harm them), reciprocity (from which the sense of justice is born), hierarchy (respect for elders and for legitimate authority), cohesion (loyalty to one’s group), and purity (praise for purity and avoidance of defilement and depraved behavior). Intuitive moral judgments rest on these components, which arose to handle the recurring situations in the lives of our hunter-gatherer ancestors. Those ancestors lived in a social world made up of groups, each consisting mainly of relatives united for survival. From time to time they encountered other groups, sometimes hostile, sometimes with closer internal ties, but all of them solved the same survival problems: limited resources, how to eat without being eaten, finding shelter, reproducing, and caring for offspring. In interacting with one another, our ancestors constantly faced choices, and in some situations they had to solve problems that today we would call moral and ethical. A person’s survival depended on the survival of the group, which provided protection through its numbers, and on his personal skills within the social group and the physical world. The individuals and groups who survived and left offspring were those who successfully handled these moral tasks. Darwin wrote:

It is obvious that a tribe comprising many members who are endowed with a highly developed spirit of patriotism [cohesion], fidelity [cohesion], obedience [hierarchy], courage, and sympathy [suffering], members who are always ready to help one another [reciprocity] and to sacrifice themselves for the common good [cohesion], would prevail over most other tribes, and this would be natural selection. At all times and throughout the world, some tribes have supplanted others, and since morality is one element of their success, the standard of morality and the number of well-endowed people will everywhere tend to rise and increase.

Are judgments about other people’s beliefs in the right hemisphere?
Neuroscientist Rebecca Saxe suggested that when we try to understand another person’s principles and moral attitudes, or try to figure out their beliefs and influence them, more is involved than the simulation of emotions. To test this hypothesis, she and her colleagues scanned the brains of subjects while they solved the classic false-belief task. In it, Sally and Ann are in a room. Sally hides a ball in a blue box in front of Ann and then leaves the room. Ann then gets up and moves the ball to a red box. Sally returns to the room. Question: where does Sally think the ball is? Children under the age of four answer that Sally thinks the ball is in the red box; they do not yet understand that beliefs can be false. Children of four to five are already beginning to understand, and say that Sally thinks the ball is in the blue box. This ability, which emerges between the ages of four and five, lets us recognize that other people’s beliefs may be false.

Saxe found that a particular area of the right hemisphere (the right temporoparietal junction) is activated in adult subjects when they reflect on other people’s beliefs; when they are told about someone’s views directly in writing; when they follow vague instructions to work out another person’s beliefs; and when they need to anticipate the actions of a person who holds a false belief. When I first heard about these results, I was amazed that such a mechanism sits in the right half of the brain. After all, if information about other people’s beliefs resides in the right hemisphere, then in split-brain patients it cannot reach the left hemisphere, which solves problems and has language. Their moral judgments, then, ought to be disrupted. But that is not what happens: in this respect, split-brain patients behave like everyone else. My colleagues and I tested our infinitely patient patients once again. We already knew that information about other people’s goals resides in the left hemisphere, and for the moment we took it on faith that the ability to judge others’ beliefs resides in the right. We asked our split-brain subjects to evaluate stories like the following:

1. Suzy the secretary thinks she is putting sugar in her boss’s coffee, but it is actually poison accidentally left behind by a chemist. The boss drinks the coffee and dies. Was Suzy’s action acceptable?

2. Suzy the secretary wants to get rid of her boss and puts poison in his coffee, but the poison turns out to be sugar. The boss drinks the coffee and feels fine. Was Suzy’s action acceptable?

Would a listener of these stories be concerned only with how they turn out, or would he judge them by the actor’s beliefs? If you or I were asked these questions, we would consider the first action acceptable, since Suzy thought there was nothing wrong with the coffee. In the second case, however, we would call her behavior unacceptable, since she believed she was giving her boss poisoned coffee. We judge by Suzy’s intentions, by her beliefs. What would split-brain patients say? We expected them to care only about the outcome of events, since the area of their brain that handles others’ beliefs is disconnected from the areas responsible for problem solving, language, and speech. That is exactly what we saw: their judgments were based solely on the results. For example, patient JW was offered this story: a waitress believes that a restaurant customer is severely allergic to sesame seeds, yet deliberately serves him food containing them. Everything ends well, because it turns out that the customer had no allergy after all. JW immediately judged the waitress’s act acceptable. Since split-brain patients function perfectly normally in the real world, what happened next did not surprise us. A few seconds later, after his conscious brain had processed what he had just said, JW tried to justify his answer logically (the interpreter rushing to the rescue): “Sesame seeds are so tiny, they can’t hurt anyone.” He had to reconcile his own automatic response, which had ignored the information about the waitress’s beliefs, with what he knew on a rational, conscious level about acceptable behavior.
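The contrast between the two ways of judging Suzy can be made concrete in a few lines of code. The sketch below is purely illustrative: the function names and the boolean encoding of the scenarios are my own, not from the studies described here. An intent-based judge, like an ordinary listener, condemns an act when the actor believed it would cause harm; an outcome-based judge, like the split-brain patients above, condemns it only when harm actually occurred.

# Illustrative sketch only: models the two judgment policies described above.
# The scenario encoding and function names are hypothetical, not from the study.

def judge_by_intent(believed_harmful: bool, actually_harmful: bool) -> str:
    """Typical listener: condemns acts the actor *believed* would cause harm."""
    return "unacceptable" if believed_harmful else "acceptable"

def judge_by_outcome(believed_harmful: bool, actually_harmful: bool) -> str:
    """Split-brain-style judgment: only the actual result matters."""
    return "unacceptable" if actually_harmful else "acceptable"

scenarios = {
    "Suzy serves poison she believes is sugar": (False, True),
    "Suzy serves sugar she believes is poison": (True, False),
}

for story, (believed, actual) in scenarios.items():
    print(f"{story}:")
    print(f"  intent-based judge:  {judge_by_intent(believed, actual)}")
    print(f"  outcome-based judge: {judge_by_outcome(believed, actual)}")

Running this prints opposite verdicts for the two stories, which is exactly the dissociation between the control subjects (intent-based) and the split-brain patients (outcome-based).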
Suppression of selfishness

We often regard dilemmas of fairness as moral dilemmas. One of the most interesting and widely known findings concerns the so-called ultimatum game. The game has one round and two participants. One player is given 20 dollars and must decide on his own how much of it to offer the other player. If the second player accepts the offer, each keeps his share; if the second player rejects it, neither gets anything. From a purely rational standpoint, the second player should accept any amount, because any amount leaves him better off. People, however, behave differently: they accept the money only if they find the offer fair, which in practice means an offer of at least 6 to 8 dollars.
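Here is a minimal sketch of the game as just described. The fixed $6 rejection threshold and the random proposer are illustrative assumptions drawn from the range mentioned above, not parameters from the cited experiments.

import random

# Minimal sketch of the ultimatum game described above. The $6 fairness
# threshold and the random proposer are illustrative assumptions only.
STAKE = 20
FAIRNESS_THRESHOLD = 6  # offers below this are rejected as unfair

def play_round(offer: int) -> tuple[int, int]:
    """Return (proposer's payoff, responder's payoff) for one round."""
    if offer >= FAIRNESS_THRESHOLD:   # responder deems the offer fair
        return STAKE - offer, offer   # both keep their shares
    return 0, 0                       # rejection: no one gets anything

random.seed(1)
for _ in range(5):
    offer = random.randint(1, STAKE // 2)  # proposer picks how much to give
    p, r = play_round(offer)
    status = "accepted" if r > 0 else "rejected"
    print(f"offer ${offer}: {status} (proposer ${p}, responder ${r})")

A purely rational responder would correspond to FAIRNESS_THRESHOLD = 1, accepting every nonzero offer; raising the threshold models the human insistence on fairness that the stimulation experiments described next manipulate.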
Ernst Fehr and his colleagues used transcranial stimulation to temporarily disrupt activity in the prefrontal cortex. They found that when its right dorsolateral part is knocked out, players accept lower offers, even though they still judge those offers unfair. If suppressing this area enhances selfish responses to unfair offers, then normally it must suppress selfishness (the readiness to accept any offer) and reduce the influence of selfish motives on decision making; that is, it plays a key role in carrying out fair actions. Research by Damasio’s group confirmed that the right dorsolateral prefrontal cortex suppresses egoistic reactions. The scientists gave a test of moral principles to adults who had suffered damage to this area in childhood. Their answers, like their behavior, were immensely egocentric: they restrained their selfish impulses poorly and could not take another’s point of view. People with similar lesions acquired in adulthood (another group of Damasio’s patients) are better adapted. Apparently, the neural systems damaged at an early age were critical for acquiring social knowledge.

There are many moral patterns, and they seem to be distributed throughout the brain. We have many innate responses to the social world (including automatic empathy, unconscious evaluation of other people, and emotional reactions) that influence our moral judgments. However, we usually do not think about these automatic responses and do not invoke them when explaining our decisions. In most cases, people are guided by moral principles in their actions yet insist on other reasons for those actions. The cause is the whole cacophony of factors that govern our behavior and judgments, among them emotional systems and dedicated systems of moral judgment.

First our innate moral behavior manifests itself, and then we interpret it. We believe our own interpretation, so it becomes a significant part of our lives. Yet our reactions are set off by universal properties that we are all endowed with. It seems we all share common moral networks and systems and tend to respond in similar ways to similar problems. We differ from one another not in our behavior, but in the theories we construct to explain our own reactions and in the weight we give to different moral systems. I think it would be much easier for people with different belief systems to get along if they understood that the sources of all conflicts are our theories and the value we attach to them. Our brains have built neural networks that allow us to thrive in a social context. Even as infants, we make judgments, make choices, and base our behavior on the actions of others. We prefer people who are willing to help us, or at least not to harm us. We recognize when someone needs help, and we willingly help out of altruistic motives. Our extensive mirror-neuron system lets us understand other people’s intentions and emotions, and the interpreter module uses this information to build theories about them. We use the same module to compose a story about ourselves. As the social context changes with our growing knowledge of the true nature of human beings, we may want to change how we live and how we understand our social lives, especially with regard to justice and punishment.

Source: Michael Gazzaniga, “Who’s in Charge? Free Will and the Science of the Brain”
