Moral Reasoning Definition
Moral reasoning refers to the processes involved in how individuals think about right and wrong and in how they acquire and apply moral rules and guidelines. The psychological study of morality in general is often referred to as the study of moral reasoning, although moral psychology is now understood as encompassing more than just the reasoning process.
Many of the topics that social psychologists were originally interested in (such as obedience and conformity) had to do in one way or another with questions of moral judgment and behavior. Despite this early interest in morality, the study of moral reasoning specifically had its beginnings in the work of moral philosophers and developmental psychologists rather than in social psychology.
Moral Reasoning Research History
Although morality was originally the domain of religion and theology, interest in the psychology of morality has been around since at least the time of the early Greek philosophers. Plato and Aristotle, for instance, devoted much of their discussion to how people came to acquire moral notions. The tradition continued, as Western philosophers such as Immanuel Kant and David Hume wrote much on the psychological processes involved in moral judgment. The views of these two philosophers famously clash over the role of reason versus emotion in moral judgment, with Hume grounding moral judgment in sentiment and Kant placing a much greater emphasis on rational thought as the proper foundation for moral judgment.
Kant’s ideas, particularly his emphasis on reason as the foundation of moral judgment, influenced some of the earliest psychological work on moral reasoning, that of the Swiss psychologist Jean Piaget. Piaget believed that children developed a mature sense of morality as their ability to reason unfolded. Particularly important for Piaget was the idea that mature reasoning caused a shift from children seeing the world only from their own perspective (egocentrism) toward being able to take the perspective of others. For Piaget, this developing ability to reason, combined with the natural social interactions children have with one another (which often involve having to share, take turns, and play games together), caused children to move from a morality based on rules, authority, and punishment (heteronomous morality) to a morality based on mutual respect, cooperation, and an understanding of the thoughts and desires of other individuals (autonomous morality).
Lawrence Kohlberg, a developmental psychologist, expanded upon Piaget’s stage theory of development to include multiple stages of moral reasoning extending into adulthood. Kohlberg first outlined his theory of moral development in 1958, in what was to become one of the most influential psychological dissertations of all time. Heavily influenced by the rationalist philosophies of Kant and John Rawls (whose theory of justice was one of the most influential political theories of the 20th century), Kohlberg, like Piaget, believed that as reasoning developed, so did moral judgment. For Kohlberg, individuals progressed from an early, egocentric morality based on the fear of punishment and the desire for reward (stages 1 and 2, pre-conventional morality), toward a more mature morality based on social norms (stages 3 and 4, conventional morality), and finally (though not always) to an understanding of universal moral principles that exist independently of social convention (stages 5 and 6, post-conventional morality). Like Piaget, Kohlberg believed that exposure to social interactions involving moral conflicts could cause progression from one stage of moral reasoning to the next.
Although Piaget and Kohlberg set the groundwork for the study of moral reasoning and stimulated a wealth of research in the area (Kohlberg’s stage theory continues to stimulate work), their ideas have been challenged. In particular, as the study of moral reasoning expanded from the domain of developmentalists to include other areas of psychology, such as cognitive psychology, evolutionary psychology, social psychology, and cognitive neuroscience, researchers began to question some of the assumptions Piaget and Kohlberg made. For instance, many have argued that stage theories are not the best way to characterize the progression of moral reasoning, and there is mounting evidence that moral emotions may be a greater influence on people’s everyday moral thinking than was believed by the developmentalists.
Social Psychology and Moral Reasoning
Because social psychologists have long studied the areas of reasoning and judgment, they have been particularly well suited to investigate the processes involved in everyday moral reasoning. Recently, researchers have taken the wealth of research on topics such as causal reasoning, intentionality, attitudes, heuristics and biases, and emotion, and applied it toward arriving at a better understanding of moral judgment. One of the most interesting findings to emerge about moral reasoning is that it seems to be different from “regular” reasoning (reasoning about nonmoral issues) in important ways.
For instance, people think about their moral beliefs and attitudes in several ways that differ from how they think about their nonmoral beliefs and attitudes. First, moral attitudes are unlike other attitudes in that they are surprisingly strong and resistant to change. It is very hard, for instance, to convince a pro-life proponent that abortion should be legal, or a pro-choice proponent that abortion should be banned (persuasion in the moral domain is very rare). Second, most people believe that moral truths are universally binding—if a person believes that something is wrong, others should believe this too. Unlike one’s attitude toward chocolate ice cream (the person doesn’t particularly care whether or not others like it), it is problematic if others don’t share a person’s attitude toward rape (it is important to the person that others also believe it is wrong). In fact, although Westerners generally appreciate diversity of all sorts, researchers have shown that diversity of moral opinion causes quite a bit of discomfort. Third, individuals often adhere to strong moral rules regardless of the consequences. For instance, most Westerners believe that it is not permissible to sacrifice one innocent individual to save five. Indeed, sacrificing innocent individuals seems to be seen as impermissible no matter what the benefits. Rules that hold regardless of the consequences are often referred to as deontological rules. Deontological rules don’t always seem rational in the sense that most psychologists use the word, as rationality is often defined as making choices that maximize good consequences. In short, what research seems to show is that people treat their moral beliefs, attitudes, or opinions as values that should be protected at nearly any cost. Because of this, many researchers have referred to these moral rules as sacred or protected values.
Although the topic of moral reasoning has a long history, much work remains to be done before psychologists can be satisfied that they have answered the fundamental question of how people think and decide about issues of right and wrong.
References:
- Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M., & Cohen, J. D. (2001, September 14). An fMRI investigation of emotional engagement in moral judgment. Science, 293, 2105-2108.
- Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108, 814-834.
- Sunstein, C. R. (2005). Moral heuristics. Behavioral and Brain Sciences, 28(4), 531-573.