On a particular visit to Canada from the UK, my mum declared me to have low moral standards! Gosh, what had I done, you may well ask? Before you let your imaginations run wild: she was actually in despair that I allowed my children’s bedrooms to accumulate mess and clutter!
Not what a good girl like me had been brought up to encourage!
Ma, I think you confused low morality with ‘less than high standards of tidiness’. Oops, mea culpa!
However, if asked whether I’d steal, like most people I would say no. Would I try to save a drowning person? That depends, perhaps on fear of big waves, which in my case is compounded by my being a very bad swimmer! Much research has explored the ways we make moral decisions. But in the clinch, when the opportunity arises to do good or bad, how well do our predictions match the actions we actually take?
A study by Rimma Teper, Michael Inzlicht, and Elizabeth Page-Gould of the University of Toronto Scarborough tested the difference between moral forecasting and moral action, and the reasons behind any mismatch. Published in Psychological Science, a journal of the Association for Psychological Science, the findings look encouraging: participants acted more morally than they predicted they would.
But lest we get sentimental about that result, lead author and psychology PhD candidate Teper offers this: “There has been other work that has shown the opposite effect—that people are acting less morally” than they forecast.
What’s the missing link between moral reasoning and moral action? Emotion. Emotions—fear, guilt, love—play a central role in all thinking and behavior, including moral behavior. But when people are contemplating how they’ll act, “they don’t have a good grasp of the intensity of the emotions they will feel” in the breach, says Teper, so they misjudge what they’ll do.
For this study, three groups of students were given a 15-question math test. One group was told that a glitch in the software would cause the correct answer to appear on the screen if they hit the space bar, and that only they would know they had hit it. This group took the test with a promised $5 reward for 10 or more right answers. Another group was given a description of this moral dilemma and asked to predict whether or not they would cheat on each question. The third group simply took the test without the opportunity to cheat.
During the trial, electrodes measured the strength of participants’ heart contractions, their heart and breathing rates, and the sweat on their palms, all of which increase with heightened emotion. Not surprisingly, those facing the real dilemma were the most emotionally aroused, and their emotions drove them to do the right thing and refrain from cheating.
The students asked only to predict their actions were calmer, and they said they would cheat more than the test-takers actually did. Students who took the test with no opportunity to cheat were calmer as well, indicating that the arousal the first group felt was specific to the moral dilemma.
But emotions conflict, and that figures in decision making too. “If the stakes were higher—say, the reward was $100—the emotions associated with that potential gain might override the nervousness or fear associated with cheating,” says Teper. In future research, “we might try to turn this effect around” and see how emotion leads people to act less morally than they forecast.
“This time, we got a rosy picture of human nature,” coauthor Michael Inzlicht comments. “But the essential finding is that emotions are what drive you to do the right thing or the wrong thing.”
Source: Association for Psychological Science