[Journal abstract] Hiding your true colors may make you feel morally tainted
From the abstract (Psychological Science, May 11)
The Moral Virtue of Authenticity
How Inauthenticity Produces Feelings of Immorality and Impurity
Abstract
The five experiments reported here demonstrate that authenticity is directly linked to morality. We found that experiencing inauthenticity, compared with authenticity, consistently led participants to feel more immoral and impure. This link from inauthenticity to feeling immoral produced an increased desire among participants to cleanse themselves and to engage in moral compensation by behaving prosocially. We established the role that impurity played in these effects through mediation and moderation. We found that inauthenticity-induced cleansing and compensatory helping were driven by heightened feelings of impurity rather than by the psychological discomfort of dissonance. Similarly, physically cleansing oneself eliminated the relationship between inauthenticity and prosocial compensation. Finally, we obtained additional evidence for discriminant validity: The observed effects on desire for cleansing were not driven by general negative experiences (i.e., failing a test) but were unique to experiences of inauthenticity. Our results establish that authenticity is a moral state—that being true to thine own self is experienced as a form of virtue.
Related articles
[News release] Gender difference in moral judgments rooted in emotion, not reasoning, study finds
From the 3 April 2015 EurekAlert! Science News release (excerpt)
If a time machine were available, would it be right to kill Adolf Hitler when he was still a young Austrian artist, to prevent World War II and save millions of lives? Should a police officer torture an alleged bomber to find hidden explosives that could kill many people at a local cafe? When faced with such dilemmas, men are typically more willing than women to accept harmful actions for the sake of the greater good. For example, women would be less likely to support the killing of a young Hitler or the torturing of a bombing suspect, even if doing so would ultimately save more lives.
According to new research published by the Society for Personality and Social Psychology, this gender difference in moral decisions is caused by stronger emotional aversion to harmful action among women; the study found no evidence for gender differences in the rational evaluation of the outcomes of harmful actions.
“Women are more likely to have a gut-level negative reaction to causing harm to an individual, while men experience less emotional responses to doing harm,” says lead research author Rebecca Friesdorf. The finding runs contrary to the common stereotype that because women are more emotional, they are also less rational, Friesdorf says. The journal article was published online in the Personality and Social Psychology Bulletin on April 3, 2015.
[Research article] Morality Rebooted: Exploring Simple Fixes to Our Moral Bugs
From the abstract
Morality Rebooted: Exploring Simple Fixes to Our Moral Bugs
Ting Zhang, Harvard Business School
Francesca Gino, Harvard University – Harvard Business School
Max H. Bazerman, Harvard Business School – Negotiations, Organizations and Markets Unit
April 21, 2014. Harvard Business School NOM Unit Working Paper No. 14-105
Abstract: Ethics research developed partly in response to calls from organizations to understand and solve unethical behavior. Departing from prior work that has mainly focused on examining the antecedents and consequences of dishonesty, we examine two approaches to mitigating unethical behavior: (1) values-oriented approaches that broadly appeal to individuals’ preferences to be more moral, and (2) structure-oriented approaches that redesign specific incentives, tasks, and decisions to reduce temptations to cheat in the environment. This paper explores how these approaches can change behavior. We argue that integrating both approaches while avoiding incompatible strategies can reduce the risk of adverse effects that arise from taking a single approach.
How Do We Make Moral Judgments? Insights from Psychological Science
From the 21 September 2012 article at Science News Daily
We might like to think that our judgments are always well thought-out, but research suggests that our moral judgments are often based on intuition. Our emotions seem to drive our intuitions, giving us the gut feeling that something is ‘right’ or ‘wrong.’ In some cases, however, we seem to be able to override these initial reactions.
Matthew Feinberg and colleagues hypothesized that this might be the result of reappraisal, a process by which we dampen the intensity of our emotions by focusing on an intellectual description of why we are experiencing the emotion.
Across several studies, participants read stories describing moral dilemmas involving behaviors they would probably find disgusting. Participants who reappraised the scenarios logically were less likely to make intuition-based moral judgments. These findings suggest that although our emotional reactions elicit moral intuitions, these emotions can also be regulated.
“In this way,” the researchers write, “we are both slave and master, with the capacity to be controlled by, but also shape, our emotion-laden judgmental processes.”…
Related articles
- Why Mental Pictures Can Sway Your Moral Judgment (psychologicalscience.org)
- Why Pictures Can Sway Your Moral Judgment (npr.org)
- Inner Conflicts – Which Aspect Prevails? (emotionaldetective.typepad.com)
- The more people rely on their intuitions, the more cooperative they become (sciencedaily.com)
Study posits a theory of moral behavior
From the 22 February 2012 EurekAlert! news release
Researchers say theory may help explain ethical lapses that led to recession
WASHINGTON, DC, February 21, 2012 — Why do some people behave morally while others do not? Sociologists at the University of California, Riverside and California State University, Northridge have developed a theory of the moral self that may help explain the ethical lapses in the banking, investment, and mortgage-lending industries that nearly ruined the U.S. economy.
For decades, sociologists have posited that individual behavior results from cultural expectations about how to act in specific situations. In a study, “A Theory of the Self for the Sociology of Morality,” published in the February issue of the American Sociological Review, Jan E. Stets of UC Riverside and Michael J. Carter of CSU Northridge found that how individuals see themselves in moral terms is also an important motivator of behavior.
Those bankers, stockbrokers, and mortgage lenders whose actions helped cause the recession were able to act as they did, seemingly without shame or guilt, perhaps because their moral identity standard was set at a low level, and the behavior that followed from their personal standard went unchallenged by their colleagues, Stets explained.
“To the extent that others verify or confirm the meanings set by a person’s identity standard and expressed in a person’s behavior, the more the person will continue to engage in these behaviors,” Stets said of the theory of moral identity she and Carter advance. “One’s identity standard guides his or her behavior. Then the person sees the reactions of others to his or her behavior. If others have a low moral identity and do not challenge the illicit behavior that follows from a person’s identity standard, then the person will continue to do what he or she is doing. This is how immoral practices can emerge…
…
Wherever individuals are located on this continuum, they act with the goal of verifying the meanings of who they are, as set by their moral identity standard, Stets and Carter said. “We found that individuals with a high moral identity score were more likely to behave morally, while those with a low moral identity score were less likely to behave morally. Respondents who received feedback from others that did not verify their moral identity standard were more likely to report guilt and shame than those whose identities were verified,” they said.
The goal is to live up to one’s self-view, wherever it falls on the moral continuum from very uncaring and unjust to very caring and just, the researchers said. “When the meanings of one’s behavior based on feedback from others are inconsistent with the meanings in one’s identity standard, the person will feel bad,” they said.
More research is needed to identify the source of moral identity meanings…
Related articles
- Effects of Anger, Guilt, and Envy on Moral Hypocrisy (psp.sagepub.com)
- Study posits a theory of moral behavior (esciencenews.com)
- Study posits a theory of moral behavior (eurekalert.org)
- Study posits a theory of moral behavior (physorg.com)
- Morality in Bullying: What is Right and What is Wrong? (education.com)
Are doing harm and allowing harm equivalent? Ask fMRI
Actions trigger immediate indignation. Evaluating passive harms requires more thought
From the 2 December 2011 Science News article
PROVIDENCE, R.I. [Brown University] — People typically say they are invoking an ethical principle when they judge acts that cause harm more harshly than willful inaction that allows that same harm to occur. That difference is even codified in criminal law. A new study based on brain scans, however, shows that people make that moral distinction automatically. Researchers found that it requires conscious reasoning to decide that active and passive behaviors that are equally harmful are equally wrong.
For example, an overly competitive figure skater in one case loosens the skate blade of a rival, and in another case notices that the blade is loose but fails to warn anyone. In both cases, the rival skater loses the competition and is seriously injured. Whether by acting or by willfully failing to act, the overly competitive skater does the same harm.
“What it looks like is when you see somebody actively harm another person that triggers a strong automatic response,” said Brown University psychologist Fiery Cushman. “You don’t have to think very deliberatively about it. You just perceive it as morally wrong. When a person allows harm that they could easily prevent, that actually requires more carefully controlled deliberative thinking [to view as wrong].”