A stereotype is an exaggerated belief, image or distorted truth about a person or group — a generalization that allows for little or no individual differences or social variation. Stereotypes are based on images in mass media, or reputations passed on by parents, peers and other members of society. Stereotypes can be positive or negative. (Southern Poverty Law Center)
One theory as to why people stereotype is that it is too difficult to take in all of the complexities of other people as individuals. Even though stereotyping is inexact, it is an efficient way to mentally organize large blocks of information. Categorization is an essential human capability because it enables us to simplify, predict, and organize our world. Once one has sorted and organized everyone into tidy categories, there is a human tendency to avoid processing new or unexpected information about each individual. Assigning general group characteristics to members of that group saves time and satisfies the need to predict the social world in a general sense.
Bargh thinks that stereotypes may emerge from what social psychologists call in-group/out-group dynamics. Humans, like other species, need to feel that they are part of a group, and as villages, clans, and other traditional groupings have broken down, our identities have attached themselves to more ambiguous classifications, such as race and class. We want to feel good about the group we belong to—and one way of doing so is to denigrate all those who aren't in it. And while we tend to see members of our own group as individuals, we view those in out-groups as an undifferentiated—stereotyped—mass. The categories we use have changed, but it seems that stereotyping itself is bred in the bone.
Though a small minority of scientists argues that stereotypes are usually accurate and can be relied upon without reservations, most disagree—and vehemently. "Even if there is a kernel of truth in the stereotype, you're still applying a generalization about a group to an individual, which is always incorrect," says Bargh. Accuracy aside, some believe that the use of stereotypes is simply unjust. "In a democratic society, people should be judged as individuals and not as members of a group," Banaji argues. "Stereotyping flies in the face of that ideal."
Stereotypes can have both negative and positive impacts on individuals. Joshua Aronson and Claude M. Steele have done research on the psychological effects of stereotyping, particularly its effects on African Americans and women. They argue that psychological research has shown that competence is highly responsive to situation and to interactions with others. They cite, for example, a study which found that bogus feedback to college students dramatically affected their IQ test performance, and another in which students were either praised as very smart, congratulated on their hard work, or simply told that they scored high. The group praised as smart performed significantly worse than the others. They attribute this to an 'innate ability bias'. These effects are not limited to minority groups. Mathematically competent white males, mostly math and engineering students, were asked to take a difficult math test. One group was told that this was being done to determine why Asians were scoring better. This group performed significantly worse than the control group.
Possible prejudicial effects of stereotypes are:
- Justification of ill-founded prejudices or ignorance
- Unwillingness to rethink one's attitudes and behavior toward a stereotyped group
- Preventing members of stereotyped groups from entering or succeeding in activities or fields
The problem, as Banaji's own research shows, is that people can't seem to help it. A recent experiment provides a good illustration. Banaji and her colleague, Anthony Greenwald, Ph.D., showed people a list of names—some famous, some not. The next day, the subjects returned to the lab and were shown a second list, which mixed names from the first list with new ones. Asked to identify which were famous, they picked out the Margaret Meads and the Miles Davises—but they also chose some of the names on the first list, which retained a lingering familiarity that they mistook for fame. (Psychologists call this the "famous-overnight effect.") By a margin of two-to-one, these suddenly "famous" people were male.
Participants weren't aware that they were preferring male names to female names, Banaji stresses. They were simply drawing on an unconscious stereotype of men as more important and influential than women. Something similar happened when she showed subjects a list of people who might be criminals: without knowing they were doing so, participants picked out an overwhelming number of African-American names. Banaji calls this kind of stereotyping implicit, because people know they are making a judgment—but just aren't aware of the basis upon which they are making it.
Some of the stereotypes we typically encounter in ourselves and others can include:
The World Map of Useless Stereotypes by Christoph Niemann
- Sexual orientation
- Body type
- Country of origin
- State of origin
- City of origin
- Renter or owner
- Children or no children
- Education level
- School or college attended
- Married or single
- Introverted or extroverted
- Hair color
- Body art
- Scented body products
- Political party
- Club memberships
- Favorite sports
- Favorite teams
- Body odors
I think you get the picture: any characteristic can, in our minds, create a picture of an entire person and place that individual into a stereotypical group. It is not only characteristics that set an individual apart that create a stereotype, but also characteristics that lead us to place an individual in a group and then infer that the individual is exactly like all other members of that group, or that all members of that group are like that individual. That picture can then influence our judgments about the individual. Stereotypes can be positive (blonds have more fun) or negative (the Irish drink too much) in their original intent, but either way they are an abbreviated and inaccurate characterization that can cause harm.
In the video below, Dr. Leeno Karumanchery, President and CEO of Diversity Solutions Inc., explores some of the complexities involved in how and why women and other minoritized groups get stereotyped.
And one more short 3 minute video: Ouch! That Stereotype Hurts!
Bias is an inclination to present or hold a partial perspective at the expense of (possibly equally valid) alternatives. Bias can also be defined as:
- "Preference or inclination that inhibits impartiality; prejudice" (American Heritage Dictionary, 1983).
- A partiality that prevents objective consideration of an issue or situation
- Disposed to favor one over another
- A predisposition or a preconceived opinion that prevents a person from impartially evaluating facts that have been presented for determination
Many of the human behaviors we discuss under this topic are technically known as biases, though in everyday usage "bias" usually refers to noticing these sorts of behavior in someone else. When behavioral economists say that people in general have some specified bias, they are saying that people tend to behave in a way that is wrong according to the theories of their field.
Bias can come in many forms. Anything biased is generally one-sided. A cognitive bias is any of a wide range of observer effects identified in cognitive science and social psychology, including very basic statistical, social attribution, and memory errors that are common to all human beings. Social biases, usually called attributional biases, affect our everyday social interactions. And biases related to probability and decision making significantly affect the scientific method, which is deliberately designed to minimize such bias from any one observer.
These processes include information-processing shortcuts (heuristics), motivational factors, and social influence (Wilcox, 2011). The resulting errors span judgment, social attribution, and memory. Cognitive biases are a common outcome of human thought and often drastically skew the reliability of anecdotal and legal evidence. Bias can also be defined as the acceptance of a stereotype as fact, despite objective evidence to the contrary.
We can easily see the relationship between these two terms, stereotype and bias: we hold specific beliefs about individuals or groups based on some known characteristic (stereotype), and we have an inclination to hold a partial perspective, for or against those people, based on that characteristic (bias).
Below are some subsets of cognitive biases with examples (from YorkPsych)
Self Perception Biases
Self-perception biases are distortions of one's own view of self: the tendency to let one's dispositions affect the way one interprets information.
1. Bias Blind Spot - the tendency to be unaware of, or ignore, one's own biases. This is a case of the blind not knowing that they are blind. (Pronin and Kugler, 2007)
2. Illusion of Control - the belief that one has at least some control over events and outcomes that one actually has no effect on. The devoted fan who gets out his lucky hat that "always brings the game back whenever the Giants are down" is a good example of this bias. (Kahneman and Tversky, 1972)
3. Restraint Bias - overconfidence in one's own ability to resist temptation. This is a common bias because people like to believe they can handle whatever faces them and do not want to see themselves as having weak willpower. A Yorkie might fully believe they can become a vegetarian and even spend four days without eating any meat, but when they attend a Carnivores' Club meeting and smell the mouthwatering aroma of bacon, they give in to the temptation they were so confident they would overcome.
4. Self-Serving Bias - the tendency to take credit for successes while disclaiming failures. This is mostly due to people attributing their successes to their own brilliance while blaming their errors on circumstances outside their control (closely related to cognitive dissonance). In Mr. Fink's titration lab, a student is less likely to claim personal responsibility for the error that ends up skewing some of the results than for the quick thinking that enabled his group to salvage some meaningful data from the experiment.
5. Overconfidence Effect - inappropriately high confidence in one's own answers, opinions, or beliefs. These overestimations could be driven by a strong desire to succeed, or could simply be a consequence of general optimism. A famous example is a 1983 study in which 93% of drivers reported believing they were in the top 50% for driving skill. (Pohl, 2006)
6. Egocentric Bias - the tendency to claim more responsibility for a group effort than one actually contributed. Egocentric bias could be observed if, for instance, any one person claimed to run Fall Fair when, in reality, anyone who has taken part in Fall Fair knows it is an enormous team effort. (Kruger, Dunning, 1999)
Perception Biases

Perception biases are inaccurate views or conclusions drawn in various ways. They help explain variations in behavior, as well as why collective debates can produce so many differing opinions.
photo credit: suburbanbloke@flickr
1. Attentional Bias - the tendency for one's emotions to determine or affect one's focus. Emotional propaganda plays on this; for instance, certain charity commercials will show pictures of starving kids in Africa to draw attention away from the fact that only a fraction of the money donated actually goes to charitable causes.
2. Availability Heuristic - basing judgments or estimations on what most easily comes to memory. Because we remember cases or events that stand out as unusual or unexpected, this often results in false assumptions or estimations. (Tversky and Kahneman, 1972) The availability heuristic is hypothesized to be to blame for the misconception that couples are more likely to conceive after they have adopted a child: people remember all the couples who conceived after adopting and forget the many cases in which couples did not. A more York-oriented example is the common belief among students that if the teacher doesn't show up within the first 15 minutes, the class gets a free. This fits the availability heuristic because students most easily remember the cases where others got away with this and enjoyed an unexpected free, rather than the more plentiful instances where the teacher showed up just in the nick of time and was angry at their attempt to desert class.
3. Hindsight Bias - the "I knew it all along" bias: the tendency to believe you knew something when you truly did not. This also includes viewing completed events as more predictable than they actually were. (Pohl, 2006) Hindsight bias can easily be observed outside the science building as Yorkies walking out of a math test ask one another what they got on Option A and frustratedly proclaim that they knew what they were supposed to do, but for some reason didn't apply it at the time.
4. Observer-Expectancy Effect / Selective Perception - known as the "observer effect," this bias can very easily skew results in scientific experimentation. It is the tendency to manipulate or misinterpret data so that it supports (or disproves) a hypothesis. Essentially, it is the tendency to see what you want or expect to see.
5. Framing Effect - the tendency to interpret information differently based on changes in context or description. A Yorkie might exhibit this in the stress they put on studying for a chemistry quiz in comparison to a chemistry test. Even though Ms. Trachsel will explain that test and quiz scores are valued equally, and this quiz will be the same length as an average test, you might still hear one Yorkie telling another that "It's just a quiz," implying that being a quiz makes it somehow less imperative or important, regardless of how many points it's worth.
6. Choice-Supportive Bias - the propensity to believe your choices were better or more soundly made than they actually were. This tends to happen when an individual remembers only the positive aspects of the chosen option and only the negative aspects of the rejected options. For example, a second-semester senior who hasn't taken any AP classes might justify his choice by concentrating on how much stress he would have had taking AP classes, while not thinking about the benefits of passing the AP test and potentially earning college credit.
Logic and Decision Biases
Cognitive biases in logic and decisions are shown mostly through how people go about solving problems in different ways, make various choices, and judge different situations.
1. Base Rate Fallacy - the inclination to base judgments on specific details while neglecting the big picture (the underlying base rates). An example could be a York senior who chooses a college for its strong chemistry program while ignoring other aspects, such as its location in the middle of a desert.
2. Zero-Risk Bias - the tendency to try to eliminate a small risk entirely rather than lower the likelihood of a greater risk. An example could be a Yorkie who decides against joining the cross country team because the trails run adjacent to areas that could contain unexploded ordnance. Rather than always choosing public transportation over driving a car, which would greatly reduce the risk of death in a transportation accident, the Yorkie instead eliminates a small chance of getting blown to bits. This bias stems from a desire to reduce risk by proportion rather than by absolute chance: the Yorkie values a 100% risk decrease, from 0.1% to 0%, over a 66% risk decrease from, say, 3% to 1%.
3. Anchoring - the inclination to allow one piece of information to outweigh all others when making a decision. An example might be a couple who consider the fact that their babysitter goes to Stanford to be more important than the side facts that she skips half her classes, rides a motorcycle, and brings her boyfriend along to babysitting jobs.
4. Belief Bias - the tendency to ignore logical errors in an argument based on how believable the conclusion is. For instance, people often buy into weight-loss commercials promising that you could lose 20 pounds, despite the illogical claim that you don't have to diet and only have to use the method for 10 minutes every day for two weeks.
5. Semmelweis Reflex - the reflex-like tendency to ignore or reject any information that contradicts what one already believes. An example might be someone who, having once been told that high fructose corn syrup is unhealthy for children, refuses to accept solid research and facts disputing that belief.
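The zero-risk arithmetic in item 2 above can be made concrete with a short sketch (illustrative only; the percentages come from that example, and the function names are my own):

```python
# Zero-risk bias: comparing relative (proportional) vs. absolute risk reduction.
# Figures from the zero-risk example above; function names are illustrative.

def absolute_reduction(before, after):
    """Percentage points of risk removed."""
    return before - after

def relative_reduction(before, after):
    """Fraction of the original risk removed."""
    return (before - after) / before

# Option A: eliminate a tiny risk entirely (0.1% -> 0%)
# Option B: cut a larger risk by two thirds (3% -> 1%)
print(f"A: {relative_reduction(0.001, 0.0):.0%} relative, "
      f"{absolute_reduction(0.001, 0.0):.1%} absolute")
print(f"B: {relative_reduction(0.03, 0.01):.0%} relative, "
      f"{absolute_reduction(0.03, 0.01):.1%} absolute")
# Option B removes 20x more risk in absolute terms,
# yet zero-risk bias favors the "100%" of Option A.
```

The point of the sketch: the bias trades a large absolute gain for a psychologically satisfying "complete" elimination.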
Probability Biases

A probability bias arises when someone misinterprets precedents or past information and acts on that inaccuracy.
1. Normalcy Bias - best represented in the freshman class, as Yorkies who are used to flying by in their classes believe that since they have never received a B before, it simply cannot or will not happen. This is a logical error based on previous experience, and one that usually throws the freshman into shock. (Hsee and Zhang, 2004)
2. Gambler's Fallacy - the propensity to believe that past outcomes determine what will happen in the future. Just as its name suggests, this is most commonly exemplified by gamblers who mistakenly reason that since they have lost the last 6 games, they have a much greater chance of winning this time, or the next time, or the time after that. (Hsee and Zhang, 2004)
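The gambler's fallacy lends itself to a quick simulation (a hypothetical sketch, not from the original text): when games are independent, the win rate in the game immediately following a six-game losing streak is no better than the base rate.

```python
import random

random.seed(42)  # make the run reproducible

def win_rate_after_streak(p_win=0.5, streak=6, trials=100_000):
    """Empirical win rate of games played right after `streak` straight losses."""
    wins = following = 0
    losses_in_a_row = 0
    for _ in range(trials):
        won = random.random() < p_win
        if losses_in_a_row >= streak:  # this game follows a losing streak
            following += 1
            wins += won
        losses_in_a_row = 0 if won else losses_in_a_row + 1
    return wins / following

print(f"win rate after 6 straight losses: {win_rate_after_streak():.3f}")  # ~0.5
```

Despite six straight losses, the next game still lands at roughly the base 50% rate; the streak carries no information about the future.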
Predictive Biases

Predictive biases usually involve the inaccurate belief that one already knows something about events or people, based on large or general ideas rather than specifics.
1. Optimism Bias - the tendency to expect positive outcomes of planned actions rather than negative ones. People known as optimists tend to be the reassuring, confidence-boosting, Mrs. Sherry-type people who always encourage you to hope for the best.
2. Pessimism Bias - the opposite of the optimism bias, this is the habit of anticipating negative outcomes rather than positive ones. Pessimists sometimes suffer from depression, and typically have less hope for the success of planned actions.
3. Planning Fallacy - the tendency to inaccurately predict (usually underestimate) the time necessary to complete a task, sometimes attributed to limitations of the prefrontal cortex. This can be observed in some York seniors taking AP Psych who underestimate how much time will be needed to complete their textbook-wiki assignment and are therefore up until 2am the night before an installment is due.
4. Stereotyping - a bias in judgment: setting expectations for, or drawing conclusions about, an individual based on the group they belong to. Racial, religious, and political stereotyping are most common, as one assumes that because someone looks, believes, or votes a certain way, he or she is like the majority of others in that group.
Conformity Biases

Conformity biases are the most socially rooted cognitive biases, exemplified by people young and old in contexts ranging from politics to surfing.
1. Availability Cascade - the self-reinforcing process by which an idea, repeated and believed widely enough, comes to be treated as truth. Which beliefs take hold varies by individual; religious upbringing, for instance, leaves different people with firm convictions in opposing concepts.
2. Ingroup Bias - the tendency to be more comfortable or friendly with people whom one perceives as similar to oneself, or as members of the same group. This most basically explains the "cliques" of a typical high school, as people with common interests gravitate toward each other. (Garcia, Song and Tesser, 2010)
3. Out-group Homogeneity Bias - also called homogeneity blindness, this is the tendency for people within a group to see their fellow members as more varied and individualistic than the members of other groups.
4. System Justification - the "go with the flow" tendency to adhere to precedents rather than establish something new or different. People tend to mold themselves to existing political parties that roughly fit their beliefs and interests rather than found new, more self-specific parties. Yorkies, by contrast, are less subject to system justification than most people: a student with a unique interest, such as surfing, who finds no pre-existing group to facilitate it will easily start a surf club. (Edwards, 1968)
This video by Mr. Wray hits the winner bell on two fronts. First, it is a great overview of cognitive bias, and second, it is in the form of a song.
Here is something to carefully consider: we all stereotype, and we all have biased perceptions. We even apply stereotypes to ourselves (or have them applied by someone else), and then modify our own behavior based on those stereotypes. I am a mature, college-educated professional, and so I should wear a tie to work and probably would not get a tarantula tattooed on my bald head no matter how much I wanted to.
Below is a very good video describing what is referred to as the "Self-Fulfilling Prophecy," or the "Stereotype Effect": we become what we are stereotyped as.
Unless you are a psychology or sociology major, there really is no need to memorize all the various types of cognitive biases listed above (and there are many more that aren't listed). These lists are provided so that readers might recognize these same kinds of fallacies in themselves. The examples also help us better understand the people around us and how they view the world.
Let's test our understanding of these definitions with a short quiz. Don't panic, you can retry this quiz as many times as you like.