Some insights to help us better understand, and deal with, this important issue.
In 2004, the U.S. Supreme Court heard Cheney v. U.S. District Court for D.C. The central issue in the case was whether then Vice President Dick Cheney was obliged to disclose information about the energy policy task force he led. It turned out that one of the Supreme Court justices, Antonin Scalia, was a hunting companion of Cheney's. An environmental group, the Sierra Club, asked the justice to recuse himself from the case on the grounds that the friendship could compromise Scalia's objectivity. The justice's response is emblematic, and, I'm sure, many of us have heard it or said some version of it: “If it is reasonable to think that a Supreme Court justice can be bought so cheap, the nation is in deeper trouble than I had imagined.”
Justice Scalia's example shows how common divergent interpretations of conflicts of interest are. We frequently overlook conflict-of-interest issues out of practicality: how many times have we hired a friend or relative because they offered a better price or because we trust them? Like Justice Scalia, we treat conflicts of interest as a minor matter of character and believe that the choice of whether or not to act ethically is purely rational.
A growing body of research on behavioral ethics shows that in situations of conflict of interest, various psychological processes lead well-meaning people to behave unethically. Not only that, but these processes also affect people who witness unethical behavior, leading them not to notice or ignore such behavior. This phenomenon is known as bounded ethicality, and it applies to all of us.
A better understanding of the mechanisms that lead to bad behavior in conflict-of-interest situations can help to give more importance to this issue and deal with it more objectively and efficiently.
Another important issue, and a factor that hinders dialogue on the subject, is that it is much more difficult to recognize situations in which we ourselves are conflicted. When other people are in a conflict of interest, it's easier to see.
In 1979, researcher Ulric Neisser carried out an experiment that has become famous—and that many of us have taken part in during courses and MBA programs—demonstrating how much our focus affects our perception. He asked a group of university students to watch a video in which two superimposed groups of players, one wearing white shirts and the other wearing black shirts, passed basketballs among themselves. He asked the students to count the number of times the group wearing black shirts passed the ball. If you haven't taken part in this exercise and want to try this fun test, pause your reading and go to the link: https://youtu.be/nkn3wRyb9Bk?si=XCXfYhBB50DhB6hC . Because the students were focused on counting passes, 79% failed to notice that a woman carrying an umbrella walked among the players. The conclusion: while focusing on a task, the students missed an obvious piece of information that was plainly within their field of vision.
The phenomenon observed in the experiment, previously known as inattentional blindness, is now called bounded awareness, a broader term that covers our failure to perceive non-visual information as well. It means we fail to perceive information, or to seek out information, that is often crucial for good decision-making. Several bounded-awareness processes lead us to overlook other people's unethical behavior: outcome bias, the slippery slope effect, motivated blindness, and indirect blindness.
Outcome bias is our tendency to place too much weight on outcomes. It leads us to make a number of mistakes in the decision-making process:
· The desire for a certain outcome influences our judgment and, consequently, our decisions.
· We are also less likely to condemn unethical behavior when it leads to favorable outcomes. That is, we ignore good decision-making processes in favor of good results.
· On the other hand, we tend to condemn people who follow a sound decision-making process, or make sensible decisions, but arrive at less favorable outcomes.
· When there are one or a few identified victims, judgments tend to be harsher; when the victims are numerous enough to become an anonymous group, judgments tend to be more lenient.
· We tend to judge a behavior as more or less ethical based on the result it provides.
What researchers call the “slippery slope” is the gradual change in the behavior of others. We often fail to observe the gradual erosion of ethical behavior in those around us.
Research on this topic reveals that, when decisions conflict with their own interests, people often rationalize unethical choices in their favor in order to preserve their self-image as morally upright individuals.
Motivated blindness is our tendency to see information that supports our point of view and to overlook or ignore information that contradicts it. A substantial body of research shows that even trained, knowledgeable professionals are susceptible to this bias, that it operates unconsciously, and that it affects all of us, including well-meaning people.
And finally, indirect blindness leads us to be less harsh in judging unethical behavior committed through third parties. In other words, we judge a person more harshly when they act unethically themselves than when they delegate the unethical action to someone else.
Another aspect of indirect blindness is that evaluating two or more actions jointly leads to more rational judgments than evaluating a single action in isolation.
In our contracts, agreements, and engagements, there are many reasons to avoid conflict of interest situations as much as possible. The question is: how? Behavioral research doesn't provide recipes, but it does give us good tips.
Ensuring the conditions for good decision-making is fundamental.
Many of us are familiar with the two systems that underlie our cognitive functioning. System 1 is the most used; it is automatic, fast, emotional, and requires less effort. System 2 is slower, conscious, rational, and requires more effort. One of the ways to ensure a good decision-making process and get away from the biases and heuristics to which we are subject is to train ourselves to move out of System 1 and into System 2. It's not always easy, but taking the time to make decisions means checking that we're considering different points of view, that we're asking the right questions, and that we're not leaving out any information.
Another tip from behavioral ethics research is what has become known as the "nudge," that little push in the right direction. Are the incentives in contracts and guidelines aligned with the objectives? Do they assist in preventing conflicts of interest? Do they help prevent unethical behavior in general?
When hiring, setting up agreements and contracts, or modifying those already in place, it is important to pay attention to the presence of or openness to conflict of interest situations. As with temptations, it is easier to avoid conflicts of interest and cognitive biases than it is to resist them.
By Alessandra Corrêa
This post was originally published in Portuguese on the Head Energia blog.