Bounded Ethicality and Ethical Fading
What if someone made you an offer that would benefit you personally but would require you to violate your ethical standards? What if you thought you could get away with a fraudulent act that would help you in your career?
Most of us think we would do the right thing. We tend to see ourselves as honest, ethical people who, when confronted with a morally dubious situation, would stand up for our convictions.
But research in the field of behavioral ethics says otherwise. Contrary to our delusions of impenetrable virtue, we are no saints.
We’re all capable of acting unethically, and we often do so without even realizing it.
In their book Blind Spots: Why We Fail to Do What’s Right and What to Do About It, Max Bazerman and Ann Tenbrunsel highlight the unintentional but predictable cognitive processes that lead people to act unethically. They make no claims about what is or is not ethical. Rather, they explore the ethical “blind spots,” rooted in human psychology, that prevent people from acting according to their own ethical standards. The authors are business ethicists and emphasize the organizational setting, but their insights apply to ethical decision making more generally.
The two most important concepts they introduce in Blind Spots are “bounded ethicality” and “ethical fading.”
Bounded Ethicality is derived from the political scientist Herbert Simon’s theory of bounded rationality – the idea that when people make decisions, they aren’t the perfectly rational benefit maximizers that classical economics suggests. Instead of choosing the course of action that maximizes their benefit, people accept a less-than-optimal but good-enough solution. They “satisfice” (a combination of “satisfy” and “suffice”), to use Simon’s term.
They do this because they don’t have access to all the relevant information, and even if they did, their minds wouldn’t have the capacity to adequately process it all. Thus, human rationality is bounded by informational and cognitive constraints.
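To make the contrast concrete, here is a toy sketch in Python of the difference between maximizing and satisficing. The example and all its names are purely illustrative – they come from neither Simon nor Blind Spots:

```python
from typing import Callable, Iterable, Optional

def maximize(options: Iterable[str], utility: Callable[[str], float]) -> str:
    """The classical 'rational actor': examine every option, pick the best."""
    return max(options, key=utility)

def satisfice(options: Iterable[str], utility: Callable[[str], float],
              aspiration: float) -> Optional[str]:
    """Simon's satisficer: take the first option that is 'good enough'
    and stop searching, never looking at the rest."""
    for option in options:
        if utility(option) >= aspiration:
            return option
    return None  # nothing cleared the aspiration level

# Choosing a lunch spot by rating (hypothetical numbers).
ratings = {"cafe": 3.9, "diner": 4.2, "bistro": 4.8}
print(maximize(ratings, ratings.get))        # 'bistro' -- the true optimum
print(satisfice(ratings, ratings.get, 4.0))  # 'diner' -- good enough; search stops
```

The point isn’t the code but the shape of the procedure: the satisficer never even sees the options it skips – just as, on the bounded-ethicality account, the decision maker never even registers the considerations outside the frame.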
Similarly, bounded ethicality refers to the cognitive constraints that limit people’s ability to think and act ethically in certain situations. These constraints blind individuals to the moral implications of their decisions, allowing them to act in ways that violate the ethical standards they would endorse upon deeper reflection.
So, just as people aren’t rational benefit maximizers, they’re not saintly moral maximizers either.
Check out this video about bounded ethicality from the Ethics Unwrapped program at the University of Texas at Austin:
Ethical Fading is a process that contributes to bounded ethicality. It happens when the ethical implications of a decision are unintentionally disregarded during the decision-making process. When ethical considerations are absent from the decision criteria, it’s easier for people to violate their ethical convictions because they don’t even realize they’re doing so.
For example, a CEO might frame something as just a “business decision” and decide based on what will lead to the highest profit margin. Obviously, the most profitable decision might not be the most ethically defensible one. It may endanger employees, harm the environment, or even be illegal. But these considerations probably won’t come to mind if he’s only looking at the bottom line. And if they’re absent from the decision process, he could make an ethically suspect decision without even realizing it.
Check out this video about ethical fading from the Ethics Unwrapped program at the University of Texas at Austin:
You’re Not a Saint, So What Should You Do?
Nudge yourself toward morality.
Bazerman and Tenbrunsel recommend preparing for decisions in advance. Consider the motivations that are likely to influence you at the time of the decision and develop proactive strategies to reduce their influence. Pre-commitment strategies are highly effective. Someone who publicly pre-commits to an ethical action is more likely to follow through than someone who doesn’t. Likewise, pre-committing to an intended ethical decision and sharing it with an unbiased and ethical person makes someone more likely to make the ethical decision in the future.
During actual decision making, it is crucial to bring your abstract ethical values to the forefront of the decision-making process. Bazerman and Tenbrunsel point out that “rather than thinking about the immediate payoff of an unethical choice, thinking about the values and principles that you believe should guide the decision may give the ‘should’ self a fighting chance.” One strategy for inducing this type of reflection, they say, is to think about your eulogy and what you’d want it to say about the values and principles you lived by.
There’s also the “mom litmus test.” When tempted by a potentially unethical choice, ask yourself whether you’d be comfortable telling your mom (or dad or anyone else you truly respect) about the decision. Imagining your mom’s reaction is likely to bring abstract principles to mind, they contend.
Yet another strategy for evoking ethical values is to change the structure of the decision. According to Bazerman and Tenbrunsel, people are more likely to make the ethical choice when they can evaluate more than one option at a time. In one study, “individuals who evaluated two options at a time – an improvement in air quality (the ‘should’ choice) and a commodity such as a printer (the ‘want’ choice) – were more likely to choose the option that maximized the public good.” When participants evaluated these options independently, however, they were more likely to choose the printer.
In another study, people decided between two political candidates, one of higher integrity and one who promised more jobs. The people who evaluated the candidates side by side were more likely to pick the higher integrity candidate. Those who evaluated them independently were more likely to pick the one who would provide the jobs.
Bazerman and Tenbrunsel say this evidence suggests that reformulating an ethical quandary as a choice between two options – the ethical one and the unethical one – is helpful because it highlights “the fact that by choosing the unethical action, you are not choosing the ethical action.”
What Implications Do Bounded Ethicality and Ethical Fading Have for Moral Responsibility?
Bazerman and Tenbrunsel don’t address this question directly. But the notion that our default mode of ethical decision making is, in some circumstances, bounded by psychological and situational constraints – influences that affect our ethical decisions without our conscious awareness – seems to be in tension with the idea that we are fully morally responsible for all our actions.
The profit-maximizing CEO, for example, might be seen by his friends and peers as virtuous, caring, and thoughtful. He might care about his community and the environment, and he might genuinely believe that it’s unethical to endanger them. Still, he might unintentionally disregard the moral implications of illegally dumping toxic waste in the town river, harming the environment and putting citizens’ health at risk.
This would be unethical, to be sure, but how blameworthy is he if he had yet to read Blind Spots and instead relied on his default psychology to make the decision? If ethical blind spots are constitutive elements of the human psyche, are the unethical actions caused by those blind spots as blameworthy as those that aren’t?
Either way, we can’t be certain that we’d have acted any differently in the same circumstances.
We’ll all fail the saint test at some point, but that doesn’t make us devils.
Learn More About Behavioral Ethics
Blind Spots: Why We Fail to Do What’s Right and What to Do About It, by Max H. Bazerman and Ann E. Tenbrunsel (Princeton University Press, 2011)