Recognizing & Understanding Stereotypes and Bias

Introduction

 

Cops_in_a_Donut_Shop_2011.jpg

Police officers buying donuts and coffee, an example of perceived stereotypical behavior in the US

 

 

Psychologists once believed that only bigoted people used stereotypes. Now the study of unconscious bias is revealing the unsettling truth: We all use stereotypes, all the time, without knowing it. We have met the enemy of equality, and the enemy is us.

 

Each of us has a biased world view because we are all limited to a single camera perspective. That is, we can only see what comes before us, we can only hear what is around us, and we can only read what is in front of us. No one has the definitive version of reality, including the author of this lesson. Our social location helps inform our world view - our race, class, gender, religion, sexual orientation, culture, etc. Our world view impacts how we view, respond, and react to every experience. Our job in this lesson is to learn what stereotypes and biases are, how to recognize our own biases, and how to move beyond them to a more balanced ability to evaluate and understand people.

 

 

Learning Objectives

 

Participants who successfully complete this workshop should be able to:

Learning Activities:

The learning activities in this lesson include:

 

As you work through this lesson, you will encounter many activities to help reinforce your learning of this material. Some of the activities are graded, and some are not. All of the graded activities may be attempted as often as necessary for you to demonstrate your understanding. Upon completion of this lesson and all the graded activities, and after achieving a score of at least 80%, you will be provided with a Certificate of Completion.

 

This seems like a lot to cover, so we had best get busy...

 

 

Definitions:

Stereotype

From Wikipedia

A stereotype is an exaggerated belief, image or distorted truth about a person or group — a generalization that allows for little or no individual differences or social variation. Stereotypes are based on images in mass media, or reputations passed on by parents, peers and other members of society. Stereotypes can be positive or negative. (Southern Poverty Law Center)

One theory as to why people stereotype is that it is too difficult to take in all of the complexities of other people as individuals. Even though stereotyping is inexact, it is an efficient way to mentally organize large blocks of information. Categorization is an essential human capability because it enables us to simplify, predict, and organize our world. Once one has sorted and organized everyone into tidy categories, there is a human tendency to avoid processing new or unexpected information about each individual. Assigning general group characteristics to members of that group saves time and satisfies the need to predict the social world in a general sense.

Bargh thinks that stereotypes may emerge from what social psychologists call in-group/out-group dynamics. Humans, like other species, need to feel that they are part of a group, and as villages, clans, and other traditional groupings have broken down, our identities have attached themselves to more ambiguous classifications, such as race and class. We want to feel good about the group we belong to—and one way of doing so is to denigrate all those who aren't in it. And while we tend to see members of our own group as individuals, we view those in out-groups as an undifferentiated—stereotyped—mass. The categories we use have changed, but it seems that stereotyping itself is bred in the bone.

Though a small minority of scientists argues that stereotypes are usually accurate and can be relied upon without reservations, most disagree—and vehemently. "Even if there is a kernel of truth in the stereotype, you're still applying a generalization about a group to an individual, which is always incorrect," says Bargh. Accuracy aside, some believe that the use of stereotypes is simply unjust. "In a democratic society, people should be judged as individuals and not as members of a group," Banaji argues. "Stereotyping flies in the face of that ideal."

Stereotypes can have a negative and positive impact on individuals. Joshua Aronson and Claude M. Steele have done research on the psychological effects of stereotyping, particularly its effect on African Americans and women. They argue that psychological research has shown that competence is highly responsive to situation and interactions with others. They cite, for example, a study which found that bogus feedback to college students dramatically affected their IQ test performance, and another in which students were either praised as very smart, congratulated on their hard work, or told that they scored high. The group praised as smart performed significantly worse than the others. They believe that there is an 'innate ability bias'. These effects are not just limited to minority groups. Mathematically competent white males, mostly math and engineering students, were asked to take a difficult math test. One group was told that this was being done to determine why Asians were scoring better. This group performed significantly worse than the control group.

 

Possible prejudicial effects of stereotypes are:

The problem, as Banaji's own research shows, is that people can't seem to help it. A recent experiment provides a good illustration. Banaji and her colleague, Anthony Greenwald, Ph.D., showed people a list of names—some famous, some not. The next day, the subjects returned to the lab and were shown a second list, which mixed names from the first list with new ones. Asked to identify which were famous, they picked out the Margaret Meads and the Miles Davises—but they also chose some of the names on the first list, which retained a lingering familiarity that they mistook for fame. (Psychologists call this the "famous overnight" effect.) By a margin of two-to-one, these suddenly "famous" people were male.

Participants weren't aware that they were preferring male names to female names, Banaji stresses. They were simply drawing on an unconscious stereotype of men as more important and influential than women. Something similar happened when she showed subjects a list of people who might be criminals: without knowing they were doing so, participants picked out an overwhelming number of African-American names. Banaji calls this kind of stereotyping implicit, because people know they are making a judgment—but just aren't aware of the basis upon which they are making it.

 

 

Some of the stereotypes we typically encounter in ourselves and others can include:

World Map of Useless Stereotypes.jpg

The World Map of Useless Stereotypes by Christoph Niemann

 

I think you get the picture - any and all characteristics can, in our minds, create a picture of that entire person and place that individual into a stereotypical group. It is not just characteristics that set an individual apart that create a stereotype, but also characteristics that cause us to place an individual as a member of a group, and then to infer that that individual is exactly like all other members of that group, or that all members of that group are like that individual. That picture can then influence our judgments about that individual. Stereotypes can be positive (blonds have more fun) or negative (the Irish drink too much) in their original intent, but are still an abbreviated and inaccurate characterization that can cause harm.

 

In the video below, Dr. Leeno Karumanchery, President and CEO of Diversity Solutions Inc., explores some of the complexities involved in how and why women and other minoritized groups get stereotyped.

 

And one more short, 3-minute video: Ouch! That Stereotype Hurts!

 

 

Bias

From Wikipedia

Bias is an inclination to present or hold a partial perspective at the expense of (possibly equally valid) alternatives. Bias can also be defined as:

Many of the human behaviors we discuss under this topic are technically known as biases, though in everyday usage the word "bias" refers to our noticing these sorts of behavior in someone else. When a behavioral economist says that people in general have some specified bias, he or she is saying that people tend to behave in a way that is wrong according to the theories of that field.

Bias can come in many forms. Anything biased is generally one-sided. A cognitive bias is any of a wide range of observer effects identified in cognitive science and social psychology including very basic statistical, social attribution, and memory errors that are common to all human beings. Social biases, usually called attributional biases, affect our everyday social interactions. And biases related to probability and decision making significantly affect the scientific method which is deliberately designed to minimize such bias from any one observer.

These processes include information-processing shortcuts, motivational factors, and social influence (Wilcox, 2011). Such biases can result from information-processing shortcuts called heuristics. They include errors in judgment, social attribution, and memory. Cognitive biases are a common outcome of human thought, and often drastically skew the reliability of anecdotal and legal evidence. They are a phenomenon studied in cognitive science and social psychology. Bias can also be defined as an acceptance of a stereotype as fact, despite objective evidence to the contrary.

We can easily see the relationship between these two terms, stereotype and bias: we have specific beliefs about individuals or groups based on some known characteristic (stereotype), and we have an inclination to hold a partial perspective, for or against these people (bias), based on that characteristic.

Below are some subsets of cognitive biases with examples (from YorkPsych)

Self Perception Biases

selfperceptionbias.jpg

photo credit:jcoterhals@flickr

Self Perception biases are the tendency to allow one's dispositions to affect one's way of interpreting information. Self perception biases are distortions of one's own view of self.

1. Bias Blind Spot - the tendency to be unaware of, or ignorant of, one's own biases. This is a case of the blind not knowing, or ignoring, that they are blind. (Pronin and Kugler, 2007)

2. Illusion of Control - the belief that one has at least some control over events and outcomes that one actually has no effect on. The devoted fan who gets out his lucky hat that "always brings the game back whenever the Giants are down" is a good example of this bias. (Kahneman and Tversky, 1972)

3. Restraint Bias - having overconfidence in one's own ability to deny temptations. This is a common bias because people like to believe they can handle whatever faces them and do not want to see themselves as having weak willpower. A Yorkie might fully believe they can become a vegetarian and even spend four days without eating any meat, but when they attend a Carnivores' Club meeting and smell the mouthwatering aroma of bacon, they give into the temptation that they were so confident they would overcome.

4. Self-Serving Bias - the tendency to be less prone to claim a failure than to claim a success. This is mostly due to people thinking their successes were due to their own brilliance, while their errors were caused by mistakes outside of their control (see also cognitive dissonance). In Mr. Fink's titration lab, a student is less likely to claim personal responsibility for the error that ends up skewing some of the results than for his quick thinking that enabled his group to salvage some meaningful data from the experiment.

5. Overconfidence Effect - inappropriately high confidence in one's own answers, opinions or beliefs. These overestimations could be driven by a strong desire to succeed or could just be a consequence of the general optimism produced by cognitive bias. Examples of overconfidence bias include a famous 1983 study in which 93% of drivers reported that they believed they were among the upper 50% of driving skill. (Pohl, 2006)

6. Egocentric Bias - the tendency of people to claim more responsibility for a group project than they actually contributed. Egocentric bias could be observed if, for instance, any one person claimed to run Fall Fair when in reality, anyone who has taken part in Fall Fair knows it is an enormous team effort. (Kruger, Dunning, 1999)

Perception Biases

Perception biases are inaccurate views or conclusions drawn in various ways. They explain certain behavioral vicissitudes as well as how collective debates can result in so many various opinions.

Rearview.jpg

photo credit: suburbanbloke@flickr

1. Attentional Bias - the tendency for one's emotions to determine or affect one's focus. Emotional propaganda plays on this; for instance, certain charity commercials will show pictures of starving kids in Africa to draw attention away from the fact that only a fraction of the money donated actually goes to charitable causes.

2. Availability Heuristic - basing judgments or estimations on what most easily comes to memory. Because we remember cases or events that stand out as unusual or unexpected, this usually results in false assumptions or estimations. (Tversky and Kahneman, 1972) The availability heuristic is hypothesized to be to blame for the misconception that couples are more likely to conceive after they have adopted a child. People tend to remember all of the people who conceive after adoption and tend to forget about all of the cases in which the couples did not conceive after adopting. A more York-oriented example is the common belief students seem to have that if their teacher doesn't show up to class within the first 15 minutes, then they have a free period. This fits the availability heuristic because they most easily remember hearing of cases where other students did get away with this and enjoyed an unexpected free period, rather than the more plentiful instances where the teacher showed up just in the nick of time and was angry at their attempt to desert class.

3. Hindsight Bias - "the I-knew-it-all-along bias"; it is the tendency to believe you knew something when you truly did not. This also includes viewing completed events as more predictable than they actually were. (Pohl, 2006) Hindsight bias can easily be observed outside the science building as Yorkies walking out of a math test will ask one another what they got on Option A and proclaim in frustration that they knew that was what they were supposed to do, but for some reason didn't apply it at the time.

4. Observer Expectancy Effect / Selective Perception - known as the "observer effect", this is a fallacy that can very easily skew results in qualitative scientific experimentation. It is the tendency to manipulate or misinterpret data so that it will support (or disprove) a hypothesis. Essentially, it is the tendency to see what you want or expect to see.

5. Framing Effect - the tendency to interpret information differently based on changes in context or description. A Yorkie might exhibit this in the stress they put on studying for a chemistry quiz in comparison to a chemistry test. Even though Ms. Trachsel will explain that test and quiz scores are valued equally, and this quiz will be the same length as an average test, you might still hear one Yorkie telling another that "It's just a quiz," implying that being a quiz makes it somehow less imperative or important, regardless of how many points it's worth.

6. Choice Supportive Bias - the propensity to believe your choices were better or more righteously chosen than they actually were. This tends to happen when an individual remembers only the positive aspects of the chosen option of a decision, and only the negative aspects of the rejected options. For example, a second semester senior who hasn't taken any AP classes might justify his choice by concentrating on how much stress he would have now had he taken any AP classes, while not thinking about the benefits of passing the AP test and potentially getting college credit.

Logic and Decision Biases

Cognitive biases in logic and decisions are shown mostly through how people go about solving problems in different ways, make various choices, and judge different situations.

1. Base Rate Fallacy - the inclination for someone to base his judgments on specifics rather than the big picture. An example of this could be a York senior who chooses a college for having a strong chemistry program and ignores other aspects such as its location in the middle of a desert.

2. Zero-Risk Bias - the tendency for someone to try to eliminate a small risk rather than lower the likelihood of a greater risk. An example of this could be a Yorkie who decides against joining the cross country team because the team runs on trails adjacent to areas that could contain unexploded ordnance. Rather than always choosing public transportation over driving a car to greatly reduce the risk of death in a transportation accident, the Yorkie reduces a small chance of getting blown to bits. This bias stems from a desire to reduce risk based on proportion rather than by chance. In other words, this Yorkie values a 100% risk decrease from 0.1% to 0% over a 66% risk decrease from, say, 3% to 1%.
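The arithmetic behind this trade-off can be made explicit. The sketch below is illustrative only; the risk figures are the hypothetical ones from the example above, and the function names are made up for this illustration:

```python
# Illustrative sketch of zero-risk bias arithmetic, using the
# hypothetical risk figures from the example above.

def absolute_reduction(before, after):
    """Absolute drop in risk, in raw probability."""
    return before - after

def relative_reduction(before, after):
    """Proportional drop in risk, relative to the starting risk."""
    return (before - after) / before

# Option A: eliminate a tiny risk entirely (0.1% -> 0%)
# Option B: cut a larger risk by two-thirds (3% -> 1%)
print(f"A removes {absolute_reduction(0.001, 0.0):.2%} of total risk "
      f"({relative_reduction(0.001, 0.0):.0%} relative reduction)")
print(f"B removes {absolute_reduction(0.03, 0.01):.2%} of total risk "
      f"({relative_reduction(0.03, 0.01):.1%} relative reduction)")
# Option B prevents 20x more harm in absolute terms, yet zero-risk
# bias favors Option A's appealing "100%" figure.
```

The point of the comparison: the biased chooser is weighing the relative numbers (100% vs. 66.7%) when the absolute numbers (0.1 vs. 2.0 percentage points) are what measure harm avoided.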

3. Anchoring - the inclination for someone to allow one piece of information to outweigh others when making a decision. An example might be a couple considering the fact that the girl they hired to babysit their children goes to Stanford to be more important than the side facts that that girl skips half her classes, rides a motorcycle, and brings her boyfriend with her to babysitting jobs.

4. Belief Bias - the tendency for someone to ignore logical error in an argument based on how believable a conclusion may be. For instance, people often buy into weight loss commercials that promise you could lose 20 pounds despite the illogical claim that you don't have to diet and only have to use their method for 10 minutes every day for two weeks.

5. Semmelweis Reflex - the reflex-like tendency to ignore or reject any information that contradicts what one already believes. An example might be someone who, having once been told that high-fructose corn syrup was unhealthy, refuses to believe it could be all right for their children, despite solid research and facts disputing that misconception.

Probability Biases

A probability bias arises when someone misinterprets precedents or past information and acts on this inaccuracy.

1. Normalcy Bias - the bias best represented in the freshman class, as Yorkies who are used to flying by in classes believe that since they have never received a B before, it simply cannot or will not happen. This is a logical error based on previous experience that usually throws those freshmen into shock. (Hsee and Zhang, 2004)

2. Gambler's Fallacy - the propensity to believe that happenings of the past determine what will happen in the future. Just as its name predicts, this is most commonly exemplified by gamblers who mistakenly tend to think along the lines that since they lost their game the last 6 times, they have a much greater chance of winning this time, or the next time, or the time after that. (Hsee and Zhang, 2004)
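For a game of pure chance, the fallacy is easy to check numerically. The simulation below is a sketch with made-up parameters (a fair coin, a six-loss streak): it estimates how often the flip immediately following six straight losses is a win. Because the flips are independent, the rate stays near 50%, no matter how long the losing streak was.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def win_rate_after_losing_streak(trials=200_000, streak=6):
    """Among simulated sequences that begin with `streak` losses on a
    fair coin, return how often the very next flip is a win."""
    wins = qualifying = 0
    for _ in range(trials):
        # Simulate streak + 1 independent fair flips; True means a win.
        flips = [random.random() < 0.5 for _ in range(streak + 1)]
        if not any(flips[:streak]):   # the first `streak` flips were all losses
            qualifying += 1
            wins += flips[streak]     # outcome of flip number streak + 1
    return wins / qualifying

print(round(win_rate_after_losing_streak(), 3))  # stays close to 0.5
```

The gambler's intuition predicts a value well above 0.5 after a long losing streak; the simulation shows no such boost, because each flip is drawn independently of the ones before it.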

Predictive Biases

Predictive biases usually involve the inaccurate belief that one prematurely knows information about events or people based on large or general ideas rather than specifics.

1. Optimism Bias - the higher tendency to expect positive outcomes of planned actions, rather than negative. People known as optimists tend to be the reassuring, confidence-boosting, Mrs. Sherry-type people who always encourage you to hope for the best.

2. Pessimism Bias - opposite of the Optimism Bias, this is the habit of anticipating negative outcomes rather than positive. Pessimists sometimes suffer from depression, and typically have less hope for success of planned actions.

3. Planning Fallacy - possibly due to deficiencies in the Prefrontal Cortex (Cerebral Cortex), this is the tendency to inaccurately predict the time necessary to complete a task. This can be observed in some York seniors taking AP Psych who underestimate how much time will be needed to complete their textbook-wiki assignment and therefore are up until 2am the night before an installment is due.

4. Stereotyping - a bias in judgement, stereotyping is setting expectations for or drawing conclusions about an individual, based on the group they are tied to. Racial, religious and political stereotyping are most common as one will assume that because someone looks a certain way, believes a certain way or votes a certain way, she is like the majority of all others who affiliate with them.

Conformity Biases

Conformity biases are the most socially based cognitive biases that are exemplified by people young and old in instances varying from politics to surfing.

1. Availability Cascade - the idea that if you believe something enough, it becomes the truth. This idea is subjective to each individual as, for instance, religious upbringing results in different people having concrete belief in opposing concepts.

2. Ingroup Bias - the tendency for someone to be more comfortable or friendly with people whom he perceives as like himself, or as in the same group as himself. This most basically explains the "cliques" of typical high school, as people with common interests gravitate to each other. (Garcia, Song and Tesser, 2010)

3. Out-group Homogeneity Bias - also called homogeneity blindness, this is the tendency for people within a like group to see their own group members as more varied and individualistic than the members of other groups.

4. System Justification - the "go with the flow" tendency for people to adhere to precedents rather than establish something new or different. As exemplified by political parties versus York clubs, people tend to mold to existing political parties with a general fit to their beliefs and interests rather than establish new, more self-specific parties. Yorkies are less subject to system justification than most people: anyone having a unique interest, such as surfing, but finding there is no pre-existing group to facilitate that interest, will easily start a surf club. (Edwards, 1968)

 

This video by Mr. Wray hits the winner bell on two fronts. First, it is a great overview of cognitive bias, and second, it is in the form of a song.

 

Here is something to carefully consider - we all stereotype, and we all have biased perceptions. We even apply stereotypes to ourselves (or they are applied by someone else), and then modify our own behavior based on those stereotypes - I am a mature, college-educated professional, and so should wear a tie to work and probably would not get a tarantula tattooed on my bald head no matter how much I wanted to.

Below is a very good video that describes what is referred to as "Self-fulfilling Prophecy", or the "Stereotype Effect". - we become what we are stereotyped as.

 

 

Unless you are a psychology or sociology major, there really is no need to memorize all the various types of cognitive biases listed above (and there are many more that aren't listed). These lists are provided so that readers might recognize these same kinds of fallacies in themselves. These examples also help us to better understand the people around us, and how they view the world.

Let's test our understanding of these definitions with a short quiz. Don't panic, you can retry this quiz as many times as you like.


 

 

Maya_Angelou_Quote.png

 

 

In order for us to move beyond our own biases and apply critical thinking skills to our evaluations of individuals, we must first be able to recognize and address our own biases. If we can recognize our own biases, we can then identify the questions we must ask that inform us about the true individual.

 

 

 

How We See Ourselves and Others

From SGBA e-Learning Resources Written by Barbara Clow, Yvonne Hanson & Jennifer Bernier

We saw in definitions on the previous page that stereotypes are pictures of people we form in our minds based on some outward characteristic, and that a bias is an opinion we form or have about that person based on the stereotype. Are our stereotypes and biases wrong? Are they right? When our brains take that mental shortcut to a stereotype, does this represent the entire person? Can we recognize when there might be more to an individual than what is readily visible and apparent?

Often the best way to learn is to do, so let's try an activity...

Look in the mirror and describe yourself, thinking about the categories of diversity listed in the table below. Compare yourself to the image in the category and decide if you are similar or different from the person or idea represented in that image.

 Activity 1

 

Category of Diversity

 

Think about how you relate to the categories of diversity below

How We See Others

 

Are you similar or different from the person or idea represented in the image?

Questions to Consider

 

Roll over the word "Questions" below to reveal some further questions to consider for each picture

Sex

Woman.png

Questions

Gender

Gender.png

Questions

 

Ethnicity

Ethnicity.png

Questions

 

Income

Income.png

Questions

 

Employment

Employment.png

Questions

 

Education

Education.png

Questions

 

Age

Age.png

Questions

 

Sexual Orientation

Sexual_orientation.png

Questions

 

Housing

Housing.png

Questions

 

Family Structure

Family_Structure.png

Questions

 

Ability

Ability.png

Questions

 

 

 

Activity Discussion

This exercise helps us see that there are all different kinds of diversity in the world and that people are diverse in different ways. There are some things we have in common with one group of people and not another. There are some things that make us different from one group of people and not another. This exercise also demonstrates the effects of stereotypes and bias, in which we form an impression of an individual based on one or a very few obvious characteristics.

You may have found it difficult to answer the question "Is this person like me or different?" The point of this question is that we don't ever know enough about another person just by looking at them to recognize the points of similarity and difference between them and ourselves. For example, we might be heterosexual, and also married like the couple in picture 8, but why should we assume that the two men in picture 8 are a couple, or married, or even homosexual? There are many cultures in which this display of closeness between men has nothing to do with sexual orientation. We might not be Chinese, but we might have been married in a wedding dress or suit that looks much like the one in picture 3. Then again, what makes us so anthropologically discriminating as to be able to identify this couple as Chinese - might they be Korean, Vietnamese, or even Americans of mixed Asian descent? We might not have Down's syndrome, but we might have a job and an income that allows us to dress well for work, as does the young man in picture 11.

 What this exercise also demonstrates is the power of first impressions - we look, we stereotype based on some characteristic, then form a bias based on that stereotype. In the next section of this lesson we will learn more about this.

Activity 2

Here is another related activity...

Think about the following behaviors that you might observe and/or experience when meeting or socializing with others. How might you react to or interpret these behaviors? What might these behaviors mean to someone from a culture other than your own? After you think about your possible reaction to the behaviors below, roll over each behavior with your cursor to reveal some other meanings.

Using incorrect grammar

Paying the bill for dinner at a restaurant

Asking questions about someone's mother and father

Burping loudly after a meal

Shaking hands when you meet someone

Arriving late for an appointment, class or meeting

Crossing a heart when making a promise

Holding hands in public with someone of the same sex

Repeating the same story

 

 

  

We react to behaviors based on our own cultural norms, and in fact we seldom recognize that there may be other cultural norms, or that they are as valid as our own. Our own culture, and how we were raised within that culture - the things we are taught to respect as acceptable behavior - also contribute to our biases. People who behave like we do are the same as us and are therefore good, while people who do not behave as we do are different and may not be good. Sociologists call this in-group and out-group bias. This is a very old social phenomenon, and a survival instinct that is most likely biological and hard-wired into our brains. Those with the same characteristics and behaviors as us are part of our in-group and are safe, while those with different characteristics and behaviors are part of an out-group and are not safe. For many years American children were taught the concept of "Stranger Danger" to protect them from people who might hurt them, despite the overwhelming evidence that children were most often hurt by someone close.

In the previous activity we learned that characteristics create a first impression (stereotype) that can bias our mental picture of the entire individual; and now in this activity, we have learned that behaviors also create a first impression that can bias our mental picture of an individual. In the next section of this lesson we will learn a bit about how powerful these first impressions can be.

 


Before we move on, let's see how well we understand the material on this page...


 

The Power of First Impressions

In this section we are discussing first impressions (the Halo/Devil Effect), because first impressions result from characteristics that trigger stereotypes and bias. We see an expensive suit, and we immediately form an image of a consummate professional (stereotype). We note a degree from a prestigious university and we "know" that person must be incredibly competent (bias). We see a shabby suit and a degree from a community college, and we might believe this person is an incompetent loser. Read on...

The Halo/Devil Effects

From enVision: http://www.envisionsoftware.com/articles/Halo_Effect.html, Creative Commons Attribution By License

What exactly is the Halo Effect?

A psychology textbook provides a "simplistic" definition of the halo effect as a subjective bias about a person's one outstanding trait extending to influence the total judgment of that person.

First Impression 1.jpg

E. L. Thorndike's 1920 paper titled "A Constant Error in Psychological Ratings", published in the Journal of Applied Psychology, first documented this perception error with regard to rating employees. This has also been followed up by Phil Rosenzweig's book on the same topic called The Halo Effect... and Eight Other Business Delusions That Deceive Managers.

Thorndike therein defined the halo effect as "a problem that arises in data collection when there is carry-over from one judgment to another."

First Impressions are powerful!

He further expanded that it is "...an extension of an overall impression of a person (or one particular outstanding trait) to influence the total judgment of that person. The effect is to evaluate an individual high on many traits because of a belief that the individual is high on one trait. Similar to this is the 'devil effect,' whereby a person evaluates another as low on many traits because of a belief that the individual is low on one trait which is assumed to be critical."

So, to clarify: when an individual is found to possess one desirable trait, that individual is automatically assumed to have many other desirable traits as well. A kind of "angelic halo" surrounds the person, in the eyes of the beholder, and they can do no wrong. If a person is bestowed with good physical beauty, then this person is also presumed to possess a host of other positive attributes as well, such as social competence, intellectual competence, and personal adjustment.

The inverse phenomenon, called the "Devil Effect" and sometimes the "Horn Effect", doesn't seem to get as much attention, even though its impact is just as prevalent in society. Here, if a person seems particularly deficient in a critical trait, then that person is automatically assumed to be deficient in many other traits as well, related or otherwise. For example, an employee who is constantly "late" to work (perhaps due to other non-work responsibilities in the morning) is assumed to be negligent in their work-related duties, not committed to the job/company/project, and perhaps even lazy overall.

First Impressions are powerful!

 

Ultimately, these faulty biases may prove to become factual due to the Pygmalion effect, or "self-fulfilling prophecy", further reinforcing future errors in perception due to bias and predisposition by the observer. The person working long hours (perhaps compensating for technical incompetence), assumed to be a good worker, is given greater opportunity and thus attains greater, albeit undue, career advancement (cf. the Peter Principle). Conversely, the worker who dresses shabbily is assumed to care little about their job, and is therefore bypassed for greater opportunity when the situation arises, regardless of suitability or capacity otherwise. Essentially, this phenomenon is a psycho-social application of the Law of Proximity, whereby certain unrelated observations, found in comparable subjects in a narrow sample set, are assumed to have a high correlation when, in fact, no such correlation exists.

 

First Impressions are powerful!

Application

There are a number of different ways in which the psychology of the halo (or devil) effect may manifest. As you will see, the halo effect is a form of a cognitive bias.

Instead of seeing the observable behaviors of a person, we see a certain picture. From this picture we draw conclusions about them which have no bearing in reality.

A person who is highly connected or good friends with those deemed to possess positive traits is erroneously assumed to possess those same good traits in "birds of a feather flock together" fashion.

If we know the societal role or other demographic information, the person targeted by the halo effect (or devil effect) will be perceived as "just like" all others who have held a similar role. We do not see the distinctions between people of this group and instead simply see:

Teacher

Congressperson

Frenchman

Negro

Blue Collar Worker

Inner City Dweller

When only one or two characteristics are noticed, unrelated conclusions are drawn based upon personal experience, social bias, or group norming.

People who smile are honest.

Blondes are unintelligent.

Within a person and/or a person's group, certain desirable characteristics are noticeably absent. For example, when there is a superstar in a group, it's very easy to misinterpret the actions of those who would otherwise be deemed at least average performers as poor performers, simply because of the context of the evaluation.

With a person or group, characteristics are noticed which are very similar to our own or to those of people we respect greatly. While similar (no pun intended) to the social bias listed above, instead of having a desirable social standing, this person possesses actual traits which are noticeably similar to those of people who have created a strong bias in the observer. So, while the person is obviously a distinct and unique individual, the fact that he reminds you of your shiftless, no-good brother-in-law impacts further observations.

The very first impression left with the observer overrides all subsequent impressions, until a very strong and distinct impression is made to alter the existing path. The first impression is a psychology concept similar to that of inertia or momentum in the physical sciences.

 

Our perception of a person fixates on a few conspicuous characteristics; all other characteristics are ignored. The effect describes the tendency to use one characteristic of a person as a yardstick and to derive sweeping conclusions from it. A well-dressed lady is perceived as wealthier and more educated than the same lady in a jogging suit.

 

Consequences

Ultimately, the halo effect, much like many psychology concepts, can be used as a tool for motivating others to a desired end, or a phenomenon to specifically be alert to when relating to and evaluating situations and people.

Left alone, the halo effect can negatively impact all areas of management. Interviewers can wrongly infer that a candidate has a slew of required characteristics or attributes, simply because the candidate exhibited others which were desirable.

Managers responsible for employee ratings can exhibit the halo effect by letting a strong rating on one critical factor influence the ratings for all other factors. The halo effect is also demonstrated when an overall global impression influences ratings. This problem occurs with employees who are friendly (or unfriendly) toward you, or especially strong (or weak) in one skill.

Having clear and specific rating standards can help avoid the halo effect. Another means to avoid its hazards is to completely assess performance on one factor before moving on to the next. Finally, simply being aware of the halo effect and how it works may afford one the opportunity to judge more objectively and to see whether its harmful effects are at work.

We tend to think of the power of first impressions with regard to job applicants making a good impression when they come for an interview, and in that case a first impression can be incredibly powerful. However, I would like you to also consider the dangers of first impressions. Everybody applies stereotypes to individuals and has biases based on those stereotypes. A first impression that is very positive (to our biases) can cause us to hire a person based on a $2500 suit, perfect hair, and a degree from an Ivy League college instead of their actual knowledge, skills, and abilities.

First Impressions are dangerous!

 

Below is a brief (6:21 min.) video that explains the halo effect through a series of experiments that are interesting as well as informative - okay, maybe even a bit frightening....

 

In the final sections of this lesson, we will learn some tips for recognizing and overcoming our biases in the interview process, and all our interactions with individuals.

 

But first, another little quiz............

 


 

 

 

Moving Beyond Biases and Stereotypes

Excerpts from: University of Hong Kong, Philosophy Department, Open Courseware on Critical Thinking, Logic and Creativity

We have learned that we all stereotype people. We have maybe even extrapolated that we are a stereotype to others. We have also learned that we all have biases, but that we typically only recognize bias in others, and that some people may be biased for or against us based on a stereotype. We've also seen that stereotyping and biases may be hardwired into our brains, and therefore unavoidable. You might now be thinking, "but I want to judge people rationally, and not based on some silly bias I developed as a child", or "I want to understand the real value of the person across the table from me, rather than some shallow stereotypical judgement". How can we move beyond our own ingrained stereotypes and biases? We can do this by honing our critical thinking skills.

Critical thinking is the ability to think clearly and rationally. It includes the ability to engage in reflective and independent thinking. Someone with critical thinking skills is able to do the following:

understand the logical connections between ideas

identify, construct, and evaluate arguments

detect inconsistencies and common mistakes in reasoning

solve problems systematically

identify the relevance and importance of ideas

reflect on the justification of one's own beliefs and values


Other definitions of critical thinking have been proposed and argued throughout history.

Critical thinking is not a matter of accumulating information. A person with a good memory and who knows a lot of facts is not necessarily good at critical thinking. A critical thinker is able to deduce consequences from what he knows, and he knows how to make use of information to solve problems, and to seek relevant sources of information to inform himself.

Critical thinking should not be confused with being argumentative or being critical of other people. Although critical thinking skills can be used in exposing fallacies and bad reasoning, critical thinking can also play an important role in cooperative reasoning and constructive tasks. Critical thinking can help us acquire knowledge, improve our theories, and strengthen arguments. We can use critical thinking to enhance work processes and improve social institutions.

Good critical thinking might be seen as the foundation of science and a liberal democratic society. Science requires the critical use of reason in experimentation and theory confirmation. The proper functioning of a liberal democracy requires citizens who can think critically about social issues to inform their judgments about proper governance and to overcome biases and prejudice.

Why is Critical Thinking Important?

Without thinking critically, you're only looking at the surface of things. When you come across a politician's statement in the media, do you accept it at face value? Do you accept some people's statements and not others'? The chances are you exercise at least some judgment, based on what you know about the particular person, and whether you generally agree with her or not.

Knowing whether or not you agree with someone is not necessarily the same as critical thinking, however. Your reaction may be based on emotion ("I hate that guy!"), or on the fact that this elected official supports programs that are in your interest, even though they may not be in the best interests of everyone else. What's important about critical thinking is that it helps you to sort out what's accurate and what's not, and to give you a solid, factual base for solving problems or addressing issues. Critical thinking helps you to move beyond the stereotypes and your own biases to judge individuals more accurately.

Some specific reasons for the importance of critical thinking:

It identifies bias. Critical thinking identifies both the bias in what it looks at (its object), and the biases you yourself bring to it. If you can address these honestly, and adjust your thinking accordingly, you'll be able to see the object in light of the way it's slanted, and to understand your own biases in your reaction to it.

A bias is not necessarily bad: it is simply a preferred way of looking at things. You can be racially biased, but you can also be biased toward looking at all humans as one family. You can be biased toward a liberal or conservative political point of view, or toward or against tolerance. Regardless of whether most of us would consider a particular bias good or bad, not seeing it can limit how we resolve a problem or issue.

It's oriented toward the problem, issue, or situation that you're addressing. Critical thinking focuses on analyzing and understanding its object. It eliminates, to the extent possible, emotional reactions, except where they become part of an approach or solution.

It's just about impossible to eliminate emotions, or to divorce them from your own deeply-held assumptions and beliefs. You can, however, try to understand that they're present, and to analyze your own emotional reactions and those of others in the situation.

There are different kinds of emotional reactions. If all the evidence points to something being true, your emotional reaction that it's not true isn't helpful, no matter how badly you want to believe it. On the other hand, if a proposed solution involves harming a particular group of people "for the good of the majority", an emotional reaction that says "we can't let this happen" may be necessary to change the situation so that its benefits can be realized without harm to anyone. Emotions that allow you to deny reality generally produce undesirable results; emotions that encourage you to explore alternatives based on principles of fairness and justice can produce very desirable results.

It gives you the whole picture. Critical thinking never considers anything in a vacuum. Its object has a history, a source, a context. Thinking critically allows you to bring these into play, thus getting more than just the outline of what you're examining, and making a realistic and effective solution to a problem more likely.

It brings in other necessary factors. Some of the things that affect the object of critical thought -- previous situations, personal histories, general assumptions about an issue -- may need to be examined themselves. Critical thinking identifies them and questions them as well.

During the mid-90's debate in the United States over welfare reform, much fuss was made over the amount of federal money spent on welfare. Few people realized, however, that the whole entitlement program accounted for less than 2% of the annual federal budget. During the height of the debate, Americans surveyed estimated the amount of their taxes going to welfare at as much as 60%. Had they examined the general assumptions they were using, they might have thought differently about the issue.

It considers both the simplicity and complexity of its object. A situation or issue may have a seemingly simple explanation or resolution, but it may rest on a complex combination of factors. Thinking critically unravels the relationships among these, and determines what level of complexity needs to be dealt with in order to reach a desired conclusion.

It gives you the most nearly accurate view of reality. The whole point of critical thinking is to construct the most objective view available. 100% objectivity may not be possible, but the closer you can get, the better.

Most important, for all the above reasons, it is most likely to help you get the results you want. The closer you are to dealing with things as they really are, the more likely you are to be able to address a problem or issue with some hope of success.

In more general terms, the real value of critical thinking is that it's been at the root of all human progress. The first ancestor of humans who said to himself, "We've always made bone tools, but they break awfully easily. I bet we could make tools out of something else. What if I tried this rock?" was using critical thinking. So were most of the social, artistic, and technological ground breakers who followed. You'd be hard pressed to find an advance in almost any area of humanity's development that didn't start with someone looking at the way things were and saying "It doesn't have to be that way. What if we looked at it from another angle?"

 

Let's watch a short video on critical thinking by QualiaSoup...

 

How to Develop the Critical Stance

From: The Community Tool Box

The Community Tool Box is a web site promoting community health and development by connecting people, ideas, and resources. The following information is written from their "community" perspective, but if we recognize that our college is also a community, and that the staff, faculty, and students are the community members, then the advice below is a good fit from the perspective of our organization.

The critical stance is the generalized ability and disposition to apply critical thinking to whatever you encounter, recognizing assumptions -- your own and others' -- applying that recognition to questioning information and situations, and considering their context.

1. Recognize assumptions.

Each of us has a set of assumptions -- ideas or attitudes or "facts" we take for granted -- that underlies our thinking. Only when you're willing to look at these assumptions and realize how they color your conclusions can you examine situations, problems, or issues objectively.

Assumptions are based on a number of factors -- physical, environmental, psychological, and experiential -- that we automatically, and often unconsciously, bring to bear on anything we think about. One of the first steps in encouraging the critical stance is to try to make these factors conscious.

Sources of assumptions are numerous and overlapping, but the most important are:

Senses. The impact of the senses is so elemental that we sometimes react to it without realizing we're doing so. You may respond to a person based on smells you're barely aware of, for instance.

Experience. Each of us has a unique set of experiences, and they influence our responses to what we encounter. Ultimately, as critical thinkers, we have to understand both how past experience might limit our thinking in a situation, and how we can use it to see things more clearly.

Values. Values are deeply held beliefs -- often learned from families, schools, and peers -- about how the world should be. These "givens" may be difficult even to recognize, let alone reject. It further complicates matters that values usually concern the core issues of our lives: personal and sexual relationships, morality, gender and social roles, race, social class, and the organization of society, to name just a few.

Emotion. Recognizing our emotional reactions is vital to keeping them from influencing our conclusions. Anger at child abusers may get in the way of our understanding the issue clearly, for example. We can't control whether emotions come up, but we can understand how we react to them.

Self interest. Whether we like it or not, each of us sometimes injects what is best for ourselves into our decisions. We have to be aware when self interest gets in the way of reason, or of looking at the other interests in the situation.

Culture. The culture we grew up in, the culture we've adopted, the predominant culture in the society -- all have their effects on us, and push us into thinking in particular ways. Understanding how culture acts upon our and others' thinking makes it possible to look at a problem or issue in a different light.

History. Community history, the history of our organization or initiative, and our own history in dealing with particular problems and issues will all have an impact on the way we think about the current situation.

Religion. Our own religious backgrounds -- whether we still practice religion or not -- may be more powerful than we realize in influencing our thinking.

Biases. Very few of us, regardless of what we'd like to believe, are free of racial or ethnic prejudices of some sort, or of political, moral, and other biases that can come into play here.

Prior knowledge. What we know about a problem or issue, from personal experience, from secondhand accounts, or from theory, shapes our responses to it. We have to be sure, however, that what we "know" is in fact true, and relevant to the issue at hand.

Conventional wisdom. All of us have a large store of information "everybody knows" that we apply to new situations and problems. Unfortunately, the fact that everybody knows it doesn't make it right. Conventional wisdom is often too conventional: it usually reflects the simplest way of looking at things. We may need to step outside the conventions to look for new solutions.

 

This is often the case when people complain that "common sense" makes the solution to a problem obvious. Many people believe, for instance, that it is "common sense" that sex education courses for teens encourage them to have sex. The statistics show that, in fact, teens with adequate sexual information tend to be less sexually active than their uninformed counterparts.

 

2. Examine information for accuracy, assumptions, biases, or specific interests.

Some basic questions to examine information for accuracy, assumptions, biases or specific interests are:

•What's the source of the information? Knowing where information originates can tell you a lot about what it's meant to make you believe.

•Does the source generally produce accurate information?

•What are the source's assumptions about the problem or issue? Does the source have a particular interest or belong to a particular group that will allow you to understand what it believes about the issue the information refers to?

•Does the source have biases or purposes that would lead it to slant information in a particular way, or to lie outright? Politicians and political campaigns often "spin" information so that it seems to favor them and their positions. People in the community may do the same, or may "know" things that don't happen to be true.

•Does anyone in particular stand to benefit or lose if the information is accepted or rejected? To whose advantage is it if the information is taken at face value?

•Is the information complete? Are there important pieces missing? Does it tell you everything you need to know? Is it based on enough data to be accurate?

 

Making sure you have all the information can make a huge difference. Your information might be that a certain approach to this same issue worked well in a similar community. What you might not know or think to ask, however, is whether there's a reason that the same approach wouldn't work in this community. If you investigated, you might find it had been tried and failed for reasons that would doom it again. You'd need all the information before you could reasonably address the issue.

•Is the information logically consistent? Does it make sense? Do arguments actually prove what they pretend to prove? Learning how to sort out logical and powerful arguments from inconsistent or meaningless ones is perhaps the hardest task for learners. Some helpful strategies here might include mock debates, where participants have to devise arguments for the side they disagree with; analysis of TV news programs, particularly those like "Meet the Press," where political figures defend their positions; and after-the-fact discussions of community or personal situations.

Just about anyone can come up with an example that "proves" a particular point: There's a woman down the block who cheats on welfare, so it's obvious that most welfare recipients cheat. You can't trust members of that ethnic group, because one of them stole my wallet.

 

Neither of these examples "proves" anything, because it's based on only one instance, and there's no logical reason to assume it holds for a larger group. A former president was particularly fond of these kinds of "proofs", and as a result often proposed simplistic solutions to complex social problems. Without information that's logically consistent and at least close to complete, you can't draw conclusions that will help you effectively address an issue.

 

•Is the information clear? Do you understand what you're seeing?

•Is the information relevant to the current situation? Information may be accurate, complete, logically consistent, powerful...and useless, because it has nothing to do with what you're trying to deal with.

 

An AIDS prevention initiative, for instance, may find that a particular neighborhood has a large number of gay residents. However, if the HIV-positive rate in the gay community is nearly nonexistent, and the real AIDS problem in town is among IV drug users, the location of the gay community is irrelevant information.

 

•Most important, is the information true? Outright lies and made-up "facts" are not uncommon in politics, community work, employment applications and other situations. Knowing the source and its interests, understanding the situation, and being sensibly skeptical can help to protect learners from acting on false information.


 

 

 

Conclusion

We have learned in this short lesson that we all stereotype, because that is how our brains process the large amounts of data that comprise an individual. We have also learned that we all have biases, but that we tend to recognize them only in others. I am hoping that we have also learned how to recognize and better understand our own biases. For those of you who might still doubt that you, too, are biased, I would like to introduce

Project Implicit.

Project Implicit is a non-profit organization and international collaborative network of researchers investigating implicit social cognition - thoughts and feelings outside of conscious awareness and control. It is the product of a team of scientists whose research produced new ways of understanding attitudes, stereotypes, and other hidden biases that influence perception, judgment, and action. Project Implicit was founded as a multi-university research collaboration in 1998 by three scientists - Tony Greenwald (University of Washington), Mahzarin Banaji (Harvard University), and Brian Nosek (University of Virginia) - and was incorporated as a non-profit in 2001 to foster dissemination and application of implicit social cognition. Project Implicit has since expanded into a substantial web-based infrastructure for supporting behavioral research and education that is available to other laboratories. Finally, Project Implicit provides consulting, education, and training services on implicit bias, diversity and inclusion, leadership, applying science to practice, and innovation.

What is very cool about Project Implicit's web site is that you can take some rather eye-opening tests to evaluate your own biases. By taking these evaluations, you are adding data to this very worthy international research project. If you are brave enough, or honest enough, for this kind of deep evaluation, navigate to this website: https://implicit.harvard.edu/implicit/demo/ and click on "Go to the Demonstration Tests". Once on the page for the demonstration tests, pick one or many and try them out. If you just want more information about Project Implicit, follow this link to their home page: http://www.projectimplicit.net/index.html