This is going to be the first of many posts on how cognitive bias might undermine social change efforts. Yeah, I know. I will do my best to keep this concrete and interesting.
A number of quirks in how our brains work can lead to mistaken judgments in many areas relevant to social change: problem analysis, strategy, program design, and advocacy. Maybe fundraising, too. Since taking effective action on a problem is what activists most want to do, we'll come back to fundraising later.
1. Confirmation bias - This may cause us to look for reasons why a certain strategy or tactic is the right one. Supportive evidence gets woven into the narrative, lending support to the need for more education for women or more gun control or whatever the topic might be. Negative information gets tossed out.
2. Bandwagon effect - This becomes a real problem when the resources devoted to an issue get out of all proportion to its seriousness. Bandwagon jumping siphons away money and volunteer time. Does every school in the country need an anti-smoking campaign because smoking was recently on the rise in a few schools? No. Maybe that time and money should go toward identifying teenagers who are at risk of suicide. That issue isn't a hot one right now, but it still deserves considerable attention.
3. Backfire effect - People strengthen their beliefs when confronted with disconfirming evidence. This is a particular problem when we raise ideas or make claims that threaten people's fundamental beliefs. Nobody really cares about your evidence that sunscreen is dangerous, but start throwing out evidence that gun ownership carries more costs than benefits, and your efforts to reduce gun violence can run off into a ditch.
4. Availability cascade - A belief becomes more and more plausible through repetition. The more you expose yourself to a false narrative, say, that FEMA is going to start rounding up Christians, the more plausible it will seem. The more you expose yourself to the belief that capitalism is behind the world's problems, the more likely you are to waste time and money on useless crusades against 'banksters' and 'robber barons'.
5. Curse of knowledge - Well-informed people find it hard to think like less-informed people about a problem or issue. Communication can seem fruitless because the audience simply lacks the knowledge needed to evaluate the argument. This might be one reason it is so hard to talk to people about the impacts of climate change. Rational discussion about GMOs and vaccines tends to get undermined for the same reason.
6. Focusing effect - The tendency to place too much emphasis on one aspect of an event. This is especially common in diagnosing things like history: we won our freedom from Britain because we had guns. Actual historians would credit our victory to the huge territory involved, the difficulties of long-distance travel, and help from the French, as well as to the armed resistance.
7. Sunk cost fallacy - Judging further investment to be wise because of the high past investment in a decision, in the face of evidence that the decision was probably a bad one. The error rests on the mistaken assumption that somehow you will be able to recover that investment. The same can happen in activism: a tactic or a policy idea obviously isn't working, but we remain committed to it because of our prior commitment.
8. Zero-risk bias - A preference for reducing a small risk to zero over making a greater reduction in a much greater risk. Some people insist they must own and carry a gun for self-defense. Yet the odds of facing financial hardship because of a sudden job loss or a serious accident are much higher for most of us.
9. Naive realism - The belief that you see reality as it is, and that others who see things differently are uninformed, dumb, or lazy. People who don't seem to care much about ocean pollution, biodiversity loss, or climate change may have very good reasons for not caring that much. Try to find out why, and craft messages that speak to those perceptions (and misperceptions) and concerns.
What did you guys think about this primer on cognitive biases and social change efforts? Comments are welcome. I'll give you a cookie.