It all started innocently enough with a student inquiry about one of our simulations. The dilemma asked the learner to identify the most ethical option when a long-term, valued employee, off his medication for bipolar disorder, gets angry and throws a fire extinguisher at another employee. The student made a good case for the most ethical option being termination, rather than sending him home for two weeks to get his life back in order and then giving him another chance—the option the EthicsGame writing team had determined to be most ethical.
As is the custom when EthicsGame gets a question from a student, I polled the leadership team (conveniently, we all look at the world through a different ethical lens). One of us, with a long history as an HR executive, wanted to protect the other employees and the company and so would terminate the employee. Another, with a long history in the corporate world—who also works with the police—advocated for calling the police. I was taken aback. That would be the last action I would take.
I then got gently schooled in how, with the decline in mental health services, police departments have resources both to make sure the person doesn't hurt themselves or others and to see that they get the help they need. And the employee would still be terminated.
As I began to sort through the different approaches, I noted that all of us agreed about the facts as presented. The importance of starting with an agreement about the facts cannot be overstated. Many of us become committed to the facts as we want them to be, instead of as they are, diminishing any hope of working together, as Michael Blake details in a provocative article entitled “Why bullshit hurts democracy more than lies.”
The problem is that none of us ever has all the facts we want before we have to make a decision, especially when we are responding to an emerging situation. Then came the aha: each of us had filled in the missing details and, in the moment, given the benefit of the doubt to either the employee or the manager based on our own experience and ethical perspective—based on our implicit biases and preferred value priorities. Faced with incomplete or imperfect information, we act on our previous experience and preferred worldview.
Most of us do not want to believe that we have implicit biases, which the Kirwan Institute (2015) defines as “the implicit associations we harbor in our subconscious [that] cause us to have feelings and attitudes about other people based on characteristics such as race, ethnicity, age, and appearance.” However, a fact of life is that none of us can get rid of our biases entirely (nor would eradicating those learned behaviors and that accumulated knowledge of life be particularly useful, as we see in the plight of those with dementia who no longer have access to their memories).
Our biases are learned as we make sense of our world in childhood and are either reinforced or rebutted as we evaluate our lived experience and learn more about others and ourselves. While these biases can help us sort out our value priorities, they can also hinder us from respecting people and making the best decision possible. An antidote to inappropriate action is to become aware of our own biases, test them to see whether they make sense (noticing when a situation has become unsafe and getting out of compromising situations are still useful skills), and work to change them as appropriate (a long-term process). Most importantly, we can learn to notice when our biases come into play, and then test our proposed action against a different set of biases, a different framing of the experience.
I had a chance to test the theory about identifying implicit bias by exploring who got the benefit of the doubt in a conversation about the recent Starbucks incident, in which two African-American men were arrested for loitering while, by their account, waiting for a colleague. One person I spoke with, who had a long history of living among and working with the homeless, defended the barista who made the call because the men looked as though they were homeless—and loitering in the public space. One person, a DA, defended the police because their ability to exercise discretion has been greatly curtailed in order to reduce arbitrary actions. One person, active in civil rights issues, was outraged because of the perceived targeting of the men. Each person filled in details and gave the benefit of the doubt to a different stakeholder in that situation.
I then tried to look at the situation from the vantage point of my conversation partners and realized how hard it was to shift my gaze from my preferred worldview. To even try, I had to use my imagination and envision myself as those other people. It turns out that our imagination and learning tools, such as simulations, can prepare us for that sliver of time when a choice has to be made. As we become aware of our own implicit biases and rehearse what we might do in a particular situation, we begin developing strategies to master the game of life—to live effectively and ethically with other people.