Moral Outsourcing and Clever Fudge Factors
Under a Chinese law enacted in November, students caught cheating on the high-stakes Gaokao college entrance exam may face up to seven years in prison. Nine million anxious students recently filed into testing centers across the country to take the exam, widely considered the most important test in the life of a Chinese citizen.
A New York Times story by Javier C. Hernandez reports that the harsh penalty was, according to the Chinese newspaper Global Times, intended to enforce fairness and uphold a sense of "social justice," because the test results have such a critical impact on an individual's future. A high score means a prestigious university and a well-paid profession; a low score means shame and a lifetime of menial jobs. Families have gone to extremes to help their kids on the tests: hiring companies to surreptitiously transmit answers, bribing officials for an advance look at the questions, and buying pens and other products designed to facilitate cheating. To help enforce the law, Beijing officials said they had sent eight police officers to each of the city's 96 testing sites. Reactions on Weibo, the Chinese version of Twitter, were mixed: some supported the enforced fairness, while others considered the penalties too harsh.
Do punishments prevent cheating? Is morality more certain when it relies on external enforcement? High-stakes educational testing in the U.S. has been marred by cheating scandals, and some teachers and other adults involved have faced criminal charges. Students, too, have faced sanctions.
Dan Ariely, a professor of psychology and behavioral economics, has studied cheating and other forms of dishonest behavior in business environments and private transactions. He is the author of The Honest Truth About Dishonesty and Predictably Irrational. In the latter, he explains that we internalize the values and ethics of the society we live in: we're unhappy when we're out of compliance and happy when we are, or appear to be, in compliance. His experiments show that we want to maintain a positive view of ourselves as honest people, and we also want to get what we want. When those two goals conflict, he says, we devise what he calls a moral fudge factor. In experiments where circumstances allowed students to correct test answers so they would appear smarter, or to reward themselves with coins for claiming improved scores that could not be verified, most test subjects cheated a little. Compared with the scores of students who had no chance to cheat, the groups who could get away with cheating consistently scored higher. But they boosted their performance only slightly, enough to still feel good about themselves yet not so brazenly that they felt they'd been dishonest.

Ariely and other scholars also examined the Enron scandal, in which a group of executives pushed the company to collapse by deliberately disguising massive debt with creative accounting. The norms within the group blurred and shifted as the cheating progressed, and Ariely writes that when social norms collide with market norms, market norms tend to prevail.
David Mayer, writing in Fast Company, suggests that when our morality and self-interest conflict, one of our fudges is to "outsource" unethical behavior to others who can do it for us. So we sometimes favor leaders, bosses, officials, and political candidates who do or support things we'd rather not personally acknowledge. He writes:
“The psychologist Crystal Hoyt and her colleagues found in several experiments that when productivity is at stake, people are less concerned that their leaders use unethical means to reach their goals. This is consistent with recent coverage in the popular press suggesting that jerks can be better bosses because they're efficient, that narcissists are unusually likely to rise into leadership positions, and that we're psychologically vulnerable to trusting obviously untrustworthy people. Many of us want leaders to engage in whatever "goal-pursuit" best serves our self-interest, and we're more willing to make moral accommodations for those who appear hell-bent on doing that.”
Ariely writes that determined people find their way around laws and regulations intended to enforce ethics that thwart their interests, but he urges against giving up on honesty. What we need, he argues, are reminders of our personal honesty at the moment temptation occurs. In one large experiment, he asked one group of subjects to name 10 books they had read in high school, and another group to name as many of the Ten Commandments as they could remember. Given an experimental chance to cheat, some of the book-list subjects cheated. In the Ten Commandments group, no one cheated, not even those who remembered only one or two commandments. The result wasn't about religion, Ariely wrote in Predictably Irrational; it was that the exercise had evoked the idea of honesty in the subjects' minds.