Moral hypocrisy is an antisocial behavior familiar to most of us: people tend to judge their own moral transgressions more leniently than the exact same transgressions when committed by others. Yet, until recently, the origin of this bias was unknown. Northeastern University researchers Piercarlo Valdesolo and David DeSteno have now found that, at heart, the mind is just as sensitive to our own transgressions as to those of others, and that the bias in favor of protecting the self actually grows out of cognitive rationalization processes. The research is discussed in the latest issue of the Journal of Experimental Social Psychology.
In their study, Valdesolo and DeSteno demonstrated not only that participants viewed their own transgressions as significantly more “fair” than the same transgressions enacted by others, but also that this bias was eliminated under conditions of cognitive constraint. They found that hypocrisy readily emerged under normal processing conditions, but disappeared under a cognitive load, which “ties up” the mind’s ability to engage in higher order rationalization and reasoning.
“Our findings support the view that hypocrisy emerges from deliberative processes,” said David DeSteno, Associate Professor of Psychology at Northeastern University. “It stems from volitionally guided justifications, which shows that at a more basic level, humans possess an intuitive negative response to violations of fairness norms, whether enacted by themselves or by others.”
In their studies, the authors gave participants the option to assign a fun task and an onerous task to themselves and others, either randomly or by personal choice. Other participants did not make the choice themselves but instead watched another individual assign himself or herself the more enjoyable task. When the authors asked individuals to judge the fairness of these actions, those who had assigned the preferable task to themselves judged the action to be more fair than did those who had watched another person assign the easy task to him or herself.
However, when these judgments were made under cognitive constraint (i.e., while remembering a random digit string), “participants experiencing cognitive load judged their own transgressions to be as unfair as the same behavior enacted by another,” said Piercarlo Valdesolo, graduate student of psychology at Northeastern University. “It is also clear that when contemplating one’s own transgressions, motives of rationalization and justification temper the mind’s initial negative response to fairness violations and lead to more lenient judgments.”
This study provides strong evidence that moral hypocrisy is well described by a dual-process model of moral judgment, wherein a prepotent negative reaction to the thought of a fairness transgression operates in tandem with higher-order processes to shape decision making.
“In light of our findings, future work should aim to further define the conditions which temper hypocrisy and ultimately suggest ways in which humans can better translate moral feelings into moral actions,” added DeSteno.
Source: Northeastern University via Newswise