Ethical Leadership and the Psychology of Decision Making

The focus is on three sources of error arising in:

1. Theories about the world
- Beliefs we hold about how the world works,
- the nature of the causal network in which we live,
- the ways in which our decisions influence the world, especially beliefs about the probabilistic or deterministic texture of the world (and degrees of freedom)
2. Theories about others
- Beliefs about how “we” are different from “them,” or they are different from us
- Often such beliefs are unconscious
3. Theories about ourselves
- Unrealistic beliefs about ourselves may cause us to:
  - Underestimate our exposure to risk
  - Take more than our fair share of the credit for success, or too little of the blame for failure
  - Be too confident that our theory of the world is the correct one
  - Be unclear about our talents and weaknesses
Theories about the World

Successful executives must have accurate knowledge of their world. If they lack this knowledge, they must know how to obtain it. One typical challenge is assessing the risk of a proposed strategy or policy, which involves delineating the policy's consequences and assessing the likelihood of the various possibilities. An executive who assesses a policy's consequences poorly is working from a faulty theory of the world, and the resulting decisions will suffer for it.

There are three components to our theories of the world: (a) the consideration of possible consequences, (b) the judgment of risk, and (c) the perception of causes. [Note: There are classical subsets or “parts” of prudence covering each of these. More on this in due course.]
1. The Cascade of Consequences

"You can never do just one thing.” Major decisions have a spectrum of consequences, not just one, and especially not just the intended consequence. Everyday experience as well as psychological research suggests that, in making complex choices, people often simplify the decision by ignoring possible outcomes or consequences that would otherwise complicate the choice. In other words, there is a tendency to reduce the set of possible consequences or outcomes to make the decision manageable. In extreme cases, all but one aspect of a decision will be suppressed, and the choice will be made solely on the basis of the one privileged feature. The folly of ignoring a decision's possible consequences should be obvious to experienced decision makers, but there are several less obvious ways in which decision errors can create moral hazards. The tendency to ignore the full set of consequences in decision making leads to the following five biases: a) ignoring low-probability events, b) limiting the search for stakeholders, c) ignoring the possibility that the public will "find out," d) discounting the future, and e) undervaluing collective outcomes.
a) Ignoring Low-Probability Events
- If a new product has the potential for great acceptance but a possible drawback, perhaps for only a few people, there is a tendency to underestimate the importance of the risk (see the sketch below).
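As a rough, hypothetical sketch (the numbers here are assumptions, not from the source), the expected-harm arithmetic shows why a risk touching "only a few people" can still be enormous at scale:

    # Hypothetical numbers: a rare product defect across a large user base.
    p_harm = 1e-4                # assumed chance any one user is harmed
    users = 5_000_000            # assumed number of users
    cost_per_case = 250_000      # assumed cost (medical, legal, reputational) per case

    expected_cases = p_harm * users                  # 500 expected cases
    expected_cost = expected_cases * cost_per_case
    print(f"{expected_cases:.0f} expected cases, ${expected_cost:,.0f} expected cost")
    # -> 500 expected cases, $125,000,000 expected cost

A probability small enough to "round to zero" in a meeting does not round to zero once it is multiplied over every exposed stakeholder.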
b) Limiting the Search for Stakeholders
- Getting blind-sided by unanticipated consequences to an altogether different group. A careful analysis of the interests of the stakeholders (those persons or groups whose welfare may be affected by the decision under consideration) is essential to reasonably anticipating potential problems.
c) Ignoring the Possibility the Public Will “Find Out”
- Executives should ask, "What would the reaction be if this decision and the reasons for it were made public?" If they fear this reaction, they should reconsider the decision.
- A decision or policy that must be hidden from public view carries the additional risk that the secret might be revealed. Damage to the self-respect [we did that, and now we have to live with that fact] and institutional respect of those who must implement and maintain the concealment should also be considered a consequence.
d) Discounting the Future
- The consequences of decisions cascade not only over people and groups, but also over time. Figuring out how to address the entire temporal stream of outcomes is one of the most challenging tasks executives face. (Short-term or long-term perspective? See the sketch below.)
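To see how easily the future drops out of an analysis, here is a minimal sketch (assumed numbers, using the standard present-value formula PV = FV / (1 + r)^t):

    # Present value of a $10M harm arriving 30 years from now,
    # under three assumed annual discount rates.
    def present_value(future_cost, rate, years):
        return future_cost / (1 + rate) ** years

    for rate in (0.02, 0.07, 0.15):
        pv = present_value(10_000_000, rate, 30)
        print(f"rate {rate:.0%}: present value ${pv:,.0f}")
    # rate 2%:  ~$5,521,000  -- the future still weighs heavily
    # rate 7%:  ~$1,314,000
    # rate 15%: ~$151,000    -- the same harm has nearly vanished

The choice of discount rate is itself an ethical choice: a high enough rate makes almost any long-term consequence look negligible.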
e) Undervaluing Collective Outcomes
- My actions have consequences for others, perhaps an entire industry (Country? Family? School?). There is a tendency to treat these collective costs as externalities and to ignore them in decision making. To do so, however, is to ignore a broad class of stakeholders whose response could be, "If they voluntarily ignore the collective interests, then it is in the collective interest to regulate their activity." Result: more rules, less freedom to be creatively prudent.
- Ethical decisions must be based on accurate theories about the world. That means, at a minimum, examining the full spectrum of a decision's consequences. Our research suggests that a set of biases reduces the effectiveness of the search for all possible consequences.
2. Judgment of Risk

Theories of the world will be inaccurate if they systematically fail to account for the full spectrum of consequences associated with decisions. And they will be inaccurate if they systematically err in assessing the probabilities associated with the consequences.
a) Denying Uncertainty
- Looking for (expecting? demanding?) certainty in an uncertain world. People want to know what will or did happen, not what may or might happen or have happened. People find it easier to act as if the world were certain and deterministic rather than uncertain and often unpredictable. When executives act as if the world is more certain than it is, they expose themselves and others to poor outcomes. It is merely foolish to ignore risk on one's own behalf, but it is unethical to do so on behalf of others.
- “Fooled by randomness”: we misperceive chance events. Coincidence is not causality.
- One implication of the belief in a deterministic world is the view that evidence should and can be perfect. It never is. How close does it need to be? (Making prudent judgments given all the available evidence, if the evidence was gathered honestly and without bias. “Confirmation bias.”)
- We believe in a deterministic world in some cases because we exaggerate the extent to which we can control it. (“Experts”)
- “Hindsight bias”: in situations in which our expectations or predictions were wrong, we often misremember what our expectations were. We commonly adjust our memories of what we thought would happen to match what we later came to know did happen. This insulates us from understanding our errors.
- We fail to appreciate the role of chance if we assume that every event that occurred was, in principle, predictable. The response "I should have known . . ." implies the belief that some future outcome was inherently knowable, a belief incompatible with the fact that essentially random events determine many outcomes.
b) Risk Trade-offs
- Uncertainty and risk are facts of executive life. What level of uncertainty and/or risk is acceptable?
- One unhelpful answer: no risk; any price is worth paying for complete safety. The illusion that a riskless world can be created is a myth, one consistent with a theory of the world that minimizes the role of chance.
- A certain premium seems to get attached to situations in which all risk can be eliminated.
- Result: misdirected risk-reduction efforts, saving fewer lives at greater cost.
c) Risk Framing
- How is the situation being “framed”? Primarily as attaining a positive or as avoiding a negative? What is the risk? Job losses: lose some for certain, or risk losing all? (Moral purity? Universal moral principles or rules? “We never abandon a fellow worker.”) The sketch after this list shows how the same numbers read in each frame.
- If different stakeholders have different frames, the potential for moral disagreement is great. (Understanding the other person’s moral “frame”; being clear about yours.)
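A minimal sketch of frame dependence (assumed numbers, in the style of the classic "jobs saved vs. jobs lost" framing experiments): the two frames below describe numerically identical options, yet the gain frame tends to elicit the sure thing and the loss frame the gamble.

    jobs = 600

    # Gain frame: "how many jobs are saved?"
    sure_gain = 200                  # 200 jobs saved for certain
    gamble_gain = (1 / 3) * jobs     # 1/3 chance all 600 saved -> 200 expected

    # Loss frame: "how many jobs are lost?"
    sure_loss = jobs - 400           # 400 lost for certain -> 200 kept
    gamble_loss = (1 / 3) * jobs     # 2/3 chance all 600 lost -> 200 expected kept

    print(sure_gain, gamble_gain, sure_loss, gamble_loss)  # 200 200.0 200 200.0

All four expected outcomes are the same 200 jobs; only the description changes.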
3. Perception of Causes
- Being aware of the beliefs we hold about the causal texture of the world, about why things happen or don’t happen.
- Judging causal responsibility is often a precursor to judging moral accountability and to blaming or praising a person, organization, or policy for an outcome. However, even under the best of circumstances, causation is usually complex, and ambiguity about causation is often at the heart of disputes about responsibility, blame, and punishment. (What caused the boat to capsize?)
- Focus on people: the tendency in most cases is to blame a person, not a faulty system.
- Different events: theories about causes often lead people to disagree, because they are explaining different events (often without knowing that this is the problem). Why is life more expensive? (What caused prices to go up? What caused income to go down or not keep pace?) General event or specific event? (Why is my take-home pay less? vs. Why haven’t the workers received a pay raise in five years?)
- Sins of omission: What wasn’t done that needed to be done? It is an old adage that evil prevails when good people fail to act, but we rarely hold the "good" people responsible for the evil they failed to prevent. (Are they responsible? As responsible, or partially? Or in some circumstances not at all?)
Theories about Other People

Do we have erroneous ideas (theories?) about other people? About other groups of people? (“us” vs. “them”)
1. Ethnocentrism
- The perception that "our" way is normal and preferred and that other ways are somehow inferior. The world revolves around our group, and our values and beliefs become the standard against which to judge the rest of the world. (Note: this is not always associated with an “ethnic” group, depending on how broadly one understands that term. “In-group” favoritism. “You people are weird.” You white people, you Asian people, you people who go to Texas A&M, you Texans, you Southerners, you computer programmers, you women, you men.) Which brings us to the next category:
2. Stereotypes
- In addition to the "theory" that "our" group is better than others, we often have specific beliefs about particular groups, which constitute implicit theories about the people in those groups.
- Like ethnocentrism, stereotypes are dangerous because we are often unaware of their influence. We tend to think that our beliefs about groups are accurate, and we can often draw on experience to support these beliefs. (But we had those experiences through a certain interpretive lens.)
- How can we guard against the dangers of ethnocentric and stereotypical theories? Starting with ethnocentrism, we should question arguments based on the belief that "they" are different from "us." The safest assumption to make, in the absence of contrary evidence, is that "they" are essentially the same as "us," and that if we want to know how "they" will react to a situation, a wise first step is to ask how "we" would react. Historically, far more harm has been caused by believing that different groups are basically different than by assuming that all people are essentially the same.
Theories about Ourselves

These problems are not associated with low self-esteem. (But note, there are problems with low self-esteem as well. In fact, many successful people slingshot back and forth between inferiority and superiority complexes.) The goal is to have an accurate sense of oneself: both one's strengths and weaknesses. If we were to give this virtue a name, classically it is called “humility.”
1. The Illusion of Superiority

a) Illusion of Favorability
- This illusion is based on an unrealistically positive view of the self, in both absolute and relative terms. For instance, people highlight their positive characteristics and discount their negatives. In relative terms, they believe that they are more honest, ethical, capable, intelligent, courteous, insightful, and fair than others. People give themselves more responsibility for their successes and take less responsibility for their failures than they extend to others. People edit and filter information about themselves to maintain a positive image, just as totalitarian governments control information about themselves.
b) Illusion of Optimism
- This illusion suggests that people are unrealistically optimistic about their future relative to others. People overestimate the likelihood that they will experience "good" future events and underestimate the likelihood of "bad" future events. In particular, people believe that they are less susceptible than others to risks ranging from the possibility of divorce or alcoholism to injury in traffic accidents. To the extent that executives believe themselves relatively immune from such risks, they may be willing to expose themselves and their organizations to hazards.
c) Illusion of Control
- The illusion of optimism is supported by the illusion of control that we referred to earlier. One reason we think we are relatively immune to common risks is that we exaggerate the extent to which we can control random events. (Indeed, the belief that one is exempt from these illusions, while others are not, is an excellent illustration of the illusion of optimism.)
2. Self-Serving Fairness Biases
- Most executives want to act in a just manner and believe they are fair people. Since they are also interested in performance and success, they often face a conflict between fairness and the desired outcome. They may want a spacious office, a large share of a bonus pool, or the lion's share of the market. Furthermore, they may believe that achieving these outcomes is fair because they deserve them. Different parties, when judging a fair allocation among them, will often make different judgments about what is fair, and those judgments will usually serve each party's interest. These judgments often reflect disagreements about deservedness based on contributions to the collective effort. It is likely that if you asked each division in your organization to estimate the percentage of the company's worth that is created by the division, the sum of the estimates would greatly exceed 100 percent (see the sketch below). (Research has shown this to be true of married couples. The researchers who did the study reported that they had to ask the questions carefully, because spouses would often be amazed, and then angry, about the estimates that their mates gave to questions like, "What percentage of the time do you clean up the kitchen?")
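A toy illustration (made-up numbers) of the over-claiming pattern:

    # Each division's self-reported share of the company's worth.
    claimed_share = {"Sales": 40, "Engineering": 45, "Operations": 30, "Finance": 25}
    print(sum(claimed_share.values()))  # 140 -- the claims sum to 140 percent

Every division can cite real contributions, yet the totals are impossible; each party sees its own efforts in full and everyone else's only in part.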
- One important reason for these self-serving views about fairness is that people are more aware of their own contributions to collective activities than others are likely to be; they have more information about their own efforts than others have, or than they have about others'.
- Furthermore, executives, like other people, credit themselves for their efforts, whereas they are more likely to credit others only for their achievements. They also credit themselves for the temptations they resisted, but judge others strictly by their actions, not by their lack of action.
- Egocentric interpretations of fairness hinder conflict resolution, because each party believes that its own demands are fair and is thus unwilling to agree to what it perceives as an inequitable settlement. It is not just a matter of different interests; it is a matter of what is fair and proper. The difference in perspectives can lead parties to question each other's ethics and morality. There is a temptation to view the other side as immoral when it fails to agree with us.
3. Overconfidence
- Most people are erroneously confident in their knowledge.
- The danger of overconfidence is, of course, that policies based on erroneous information may fail and harm others, as well as the executive who established the policy. Overconfidence, as part of our theories about ourselves, coupled with flawed theories about the world or about other people, poses serious threats to rational and ethical decision making.
- To the degree that people are overconfident in their (conservative) risk assessments - in their beliefs about the availability of scarce resources or the character of people unlike themselves - they will fail to seek additional information to update their knowledge. One cost of overconfidence is a reluctance to learn more about a situation or problem before acting.
- Even if people acknowledge the need for additional information, research has shown that their process for gaining that information may be biased to confirm prior beliefs and hypotheses. (Confirmation bias.)
- What question are you asking? ("This really is a safe grip, isn't it?")
- Will feedback help to eliminate or reduce these biases? We believe that feedback may provide only limited help, because of the tendency to seek and notice confirming information, which forms an additional barrier to learning through experience.

When we consider the combined impact of
the three processes described in this section — the
illusion of superiority, self-centered perceptions of
fairness, and overconfidence — we can see the peril
associated with erroneous theories of the self. The major
peril is that we will come to see ourselves as people for
whom the normal rules, norms, and obligations do not
apply. The danger is that an executive, especially a
successful executive, will hold himself above conventional
ethical principles and subject himself only to
self-imposed rules that others might judge to be
self-serving. He might justify telling a lie on the ground
that it permits him to achieve important benefits for
others (such as shareholders or employees) even though the
shareholders or employees are being duped. He might feel
that inflating an expense account or using company
property for personal ends is not "really" wrong because
of the value that he contributes to the company. Finally,
he may undertake an immoral or illegal act, convinced that
he will never be caught. The tendencies to feel superior,
to generate self-serving, on-the-spot moral rules, and to
be overconfident about beliefs create the potential for
moral shallowness and callowness.

Improving Ethical Decision Making

The causes of poor ethical decisions are often the same as the causes of poor decisions generally: decisions may be based on inaccurate theories about the world, about other people, or about ourselves.
1. Quality
- A general principle is that the types of flaws and biases we have discussed are likely to influence decision making more when decisions are intuitive, impulsive, or subjective rather than concrete, systematic, and objective.
- Reliable evidence about (close attention to) the real world. (Warnings about “data” and “numbers.”)
- Reasoning by anecdote is usually foolish.
- A corollary is that getting high-quality data is obligatory. (What are you measuring? How are you measuring it? What is your measure actually measuring? What small slice of the world are you seeing?)
- Sometimes (all the time?) executives cannot escape making decisions and judgments on subjective bases. But they can take steps to prevent some of the biases from distorting judgment. To combat overconfidence, for instance, it is effective to say to yourself, "Stop and think of the ways in which you could be wrong." Similarly, to avoid minimizing risk, you can ask, "What are the relevant things that I don't know?" A devil's advocate, given the role of scrutinizing a decision for false assumptions and optimistic projections, can often serve this function. (What are your checks and balances? If you don’t have any, you’re asking for trouble.)
- One threat to rational and ethical decision making that we noted earlier stems from the untrustworthiness of human memory. The first step in managing this threat is to acknowledge it. The second is to compensate for it with improved, detailed record keeping.
- Quality management and ethical management are close companions; what promotes one generally promotes the other. Erroneous theories threaten both.
2. Breadth
- By breadth, we mean assessment of the full range of consequences that policies may entail.
- Openness itself is often a signal to potential opponents that nothing is being hidden and there is nothing to fear. (Honesty, transparency: you can’t fake it.)
- A company is (and persons are) part of a broader community that has an interest in its actions. A full accounting for decisions must include a community-impact assessment.
- One’s decisions affect others not only in the present but also in the future.
- Breadth is an important quality of ethical decision making because it is both ethically proper and strategically sound. It means doing the right thing and doing the smart thing. Intentional decisions to exclude stakeholders' interests or input may not only violate their rights, which is an ethical infraction, but also invite opposition, resentment, and hostility, which is stupid.
3. Honesty
- In discussing breadth, we urged openness. But executives can rarely divulge all the information involved in a decision. A policy of openness does not require executives to tell all. It is perfectly ethical and appropriate to withhold some types of information. It is inappropriate, however, to withhold information about a project or policy merely because an executive is ashamed to make it public. We propose that, if an executive feels so embarrassed about some aspect of a project that she wants to hide the information, she probably should not undertake the project. If an idea cannot stand the light of day or the scrutiny of public opinion, then it is probably a bad idea. A variant of this "sunshine test" is to imagine how you would feel if you saw the idea or decision on the front page of the New York Times.
- We ourselves are the easiest audience that we have to play to, and the easiest to fool. Consequently, we should imagine whether our audience would accept the idea or decision. In particular, we should ask whether the people with the most to lose would accept the reasons for our actions. If not, we are probably on moral thin ice.
- One risk often overlooked when practicing deceit is the continual need to maintain the deception. Not only are the original facts hidden, but the fact of hiding must also be hidden.
- While it is important to be honest with others, it is just as important to be honest with yourself. Self-deception — being unaware of the processes that lead us to form our opinions and judgments — is unavoidable. We think we remember things accurately, but careful studies show that we do not. We think we know why we make judgments about other people, but research shows us other reasons.
- If we can accept the fact that the human mind has an infinite, creative capacity to trick itself, we can guard against irrational, unethical decisions. To deny this reality is to practice self-deception. We can learn to suspect our naive judgments. We can learn to calibrate ourselves to judge risk (see the sketch below). We can examine our motives in judging others: are we using hard, reliable information to evaluate subordinates, or are we using stereotypes?
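One concrete way to calibrate yourself, sketched here with hypothetical data: record your predictions along with a stated confidence, then compare each confidence level against how often you were actually right.

    # Hypothetical track record: (stated confidence, did it happen?)
    predictions = [
        (0.9, True), (0.9, True), (0.9, False), (0.9, False),
        (0.6, True), (0.6, False), (0.6, True),
    ]

    buckets = {}
    for confidence, happened in predictions:
        buckets.setdefault(confidence, []).append(happened)

    for confidence, outcomes in sorted(buckets.items()):
        hit_rate = sum(outcomes) / len(outcomes)
        print(f"said {confidence:.0%}, right {hit_rate:.0%} of the time")
    # "said 90%, right 50% of the time" is the signature of overconfidence.

The detailed record keeping urged above is exactly what makes this comparison possible.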
The topic of executive ethics has been dominated by the assumption that executives are constantly faced with an explicit trade-off between ethics and profits. But most people do not say to themselves, “Now I will do the morally bad thing.” Most decisions are not so explicit. If it seemed “really, really evil” or “just plain stupid,” most of us wouldn’t have done it in the first place.
Parts of Prudence

Potential parts of prudence (the virtues connected with it):
- "Good counsel" (euboulia), which concerns deliberating well;
- "Synesis," which concerns judgment in matters of ordinary occurrence;
- "Gnome," which concerns judgment in matters of exception to the law.

Subjective parts of prudence (prudence in different areas):
- The prudence whereby a man rules himself vs. the prudence whereby a man governs a multitude.
- Again, the prudence whereby a multitude is governed is divided into various species according to the various kinds of multitude. There is the multitude which is united together for some particular purpose; thus an army is gathered together to fight, and the prudence that governs this is called “military.”
- There is also the multitude that is united together for the whole of life; such is the multitude of a home or family, and this is ruled by “domestic prudence.”
- And such again is the multitude of a city or kingdom, the ruling principle of which is “regnative prudence” in the ruler, and “political prudence” in the subjects.

Integral parts of prudence (skills and abilities we would need to strengthen to help people make prudent judgments):
- Intelligentia: the understanding of first principles.
- Ratio: discursive reasoning and the ability to research and compare alternatives.
- Memoria: accurate memory; that is, memory that is true to reality, and an ability to learn from experience.
- Docilitas (teachability): an open-mindedness that recognizes variety and is able to seek and make use of the experience and authority of others.
- Sollertia (eustochia: good conjecture): shrewdness or quick-wittedness, i.e., the ability to evaluate a situation quickly.
- Providentia: foresight, i.e., the capacity to estimate whether particular actions can realize goals.
- Circumspection: the ability to take all relevant circumstances into account.
- Caution: the ability to understand and mitigate risk.