Archive for the ‘information dissemination’ Tag

Diagnosis: Denialism


New Scientist has an opinion piece up that hits on an ongoing theme of this blog: communicating statistical science to the public.

In summary, it argues that those who reject scientific ideas like evolution, global warming, vaccination, AIDS, and such are fooling themselves. They come up with complex reasoning based on faulty logic to dismiss scientific claims:

Here’s a hypothesis: denial is largely a product of the way normal people think. Most denialists are simply ordinary people doing what they believe is right. If this seems discouraging, take heart. There are good reasons for thinking that denialism can be tackled by condemning it a little less and understanding it a little more.

Whatever they are denying, denial movements have much in common with one another, not least the use of similar tactics (see “How to be a denialist”). All set themselves up as courageous underdogs fighting a corrupt elite engaged in a conspiracy to suppress the truth or foist a malicious lie on ordinary people. This conspiracy is usually claimed to be promoting a sinister agenda: the nanny state, takeover of the world economy, government power over individuals, financial gain, atheism.

The piece is a bit more cynical than I tend to be, with MacKenzie the author seeing climate deniers, evolution deniers, vaccine deniers, etc. as something of conspiracy theorists. She cites an informal checklist for developing your own denialist argument against accepted science:

  1. Allege that there’s a conspiracy. Claim that scientific consensus has arisen through collusion rather than the accumulation of evidence.
  2. Use fake experts to support your story. “Denial always starts with a cadre of pseudo-experts with some credentials that create a facade of credibility,” says Seth Kalichman of the University of Connecticut.
  3. Cherry-pick the evidence: trumpet whatever appears to support your case and ignore or rubbish the rest. Carry on trotting out supportive evidence even after it has been discredited.
  4. Create impossible standards for your opponents. Claim that the existing evidence is not good enough and demand more. If your opponent comes up with evidence you have demanded, move the goalposts.
  5. Use logical fallacies. Hitler opposed smoking, so anti-smoking measures are Nazi. Deliberately misrepresent the scientific consensus and then knock down your straw man.
  6. Manufacture doubt. Falsely portray scientists as so divided that basing policy on their advice would be premature. Insist “both sides” must be heard and cry censorship when “dissenting” arguments or experts are rejected.

I want to dismiss it!

But I see some really interesting ideas in this piece. One is the reason why people want so desperately to dismiss certain scientific ideas: the science says the way they’ve been leading their life is bad. Global warming says that our fossil fuel-dependent lifestyle is bad. Evolution says that a Creationist view of life is, well, pretty absurd. Anti-smoking research says that your 2-pack-a-day habit is bad for you.

Hence, if science tells you that you’re bad, then you’ve got motivation to dismiss the science.

MacKenzie then pulls this motivation to dismiss science into a discussion of power/agency:

It is this sense of loss of control that really matters. In such situations, many people prefer to reject expert evidence in favour of alternative explanations that promise to hand control back to them, even if those explanations are not supported by evidence (see “Giving life to a lie”).

All denialisms appear to be attempts like this to regain a sense of agency over uncaring nature: blaming autism on vaccines rather than an unknown natural cause, insisting that humans were made by divine plan, rejecting the idea that actions we thought were okay, such as smoking and burning coal, have turned out to be dangerous.

There’s a certain loss of control that occurs when you find out science says you’re bad. Rejecting the science, then, is a way to regain that control. Reminds me of Koerber’s discussion of agency-through resistance. Koerber argues that mothers who breastfeed their babies even when advised against it by their doctors gain a certain degree of agency in the act. Resisting the institution can be empowering. Sounds a lot like Denialists, no?

But that just doesn’t make sense!

MacKenzie hits on an issue that I’ve thought about before. People are more willing to believe anecdotes than statistics.

Greg Poland, head of vaccines at the Mayo Clinic in Minnesota and editor in chief of the journal Vaccine, often speaks out against vaccine denial. He calls his opponents “the innumerate” because they are unable to grasp concepts like probability. Instead, they reason based on anecdote and emotion. “People use mental short cuts – ‘My kid got autism after he got his shots, so the vaccine must have caused it,'” he says. One emotive story about a vaccine’s alleged harm trumps endless safety statistics.

And denialism is more likely to occur in science that isn’t directly visible. If you can’t actually see the science in action, you’re much more likely to dismiss it.

The first thing to note is that denial finds its most fertile ground in areas where the science must be taken on trust. There is no denial of antibiotics, which visibly work. But there is denial of vaccines, which we are merely told will prevent diseases – diseases, moreover, which most of us have never seen, ironically because the vaccines work.

We believe what we see with our own eyes.

There’s, I think, a relationship between dismissing science that tells us we’re bad and dismissing science that we can’t see. If the science is plainly obvious, we’d have a certain intuition about it and we’d have changed our behavior long ago.

Science tells us that jumping off tall buildings is bad for our health, but we have an intuition built into us telling us the same thing. Our intuition was formed based on plainly understandable science.

Science tells us that dumping large amounts of radioactive material into a small lake is bad.  And our intuition tells us the same thing, since it’s plainly obvious what the result is.

Science tells us that dumping large amounts of CO2 into the air is bad. But our intuition tells us that CO2 is a natural part of the atmosphere — why would relatively larger amounts of it be bad? (note that the intuition is only 150 years old or so, the time in which we’ve even known CO2 existed) Science is contradicting our intuition.

Just because you’re paranoid doesn’t mean they’re not after you!

I don’t think that I would go as far as MacKenzie does, saying that believers in denialism rhetoric have “some fragility in their thinking” and that creators of denailism rhetoric suffer from “paranoid personality disorder.” At least not in the case of more general forms of denialism like evolution and global warming. It’s just that scientists are telling us something that contradicts our intuition. And that gives us motivation to find reasons to dismiss the science.

MacKenzie’s claims of fragile thinking and paranoid personalities may apply to more extreme forms of denialism. Believing that pharmaceutical companies started the swine flu of 2009 and that vaccines cause austim probably require those things. But challenging global warming, evolution, tobacco, and such is just placing intuition over measured effects.

International Conspiracy Theorists

I’ve always been fascinated with the conspiracy theory mindset. There’s the surface level rhetoric of taking events that aren’t fully explained and presenting a rationale for them that is plausible — though fully unlikely. But there’s also the deeper issue of why so many people are so willing to subscribe to absolutely absurd explanations for the troubles in the world.

And today there was an article in the NYTimes about conspiracy theories in Pakistan. I didn’t realize this, but Pakistanis are rampant conspiracy nuts who have been blaming their woes on somebody else (the U.S., India, Israel) for decades. That’s got me thinking about conspiracies again – how they form and how they work.

Who buys into them?

Conspiracy nuts tend not to be your typical upstanding citizen who volunteers in the community. But I think we can go deeper than that. Conspiracy nuts tend to be those with at least average intelligence who have been marginalized by society.

My proposed explanation is that they are intelligent enough to know they should be doing better in life. Yet they can’t really figure out why they’re doing so poorly (for whatever reason). So, in an attempt to explain it, they concoct elaborate theories in which their problems in life are not their fault. Their problems are the result of an extremely large, invisible plot over which they have no control.

And, in trying to expose that plot, they are able to regain some “control” over the direction of their lives. They are absolved of responsibility for their failings and achieve a renewed sense of purpose and fulfillment.

Who’s the bad guy?

But that’s the good guy. The bad guy is always a quasi-tangible entity who already has some minor position of power.

Take the article above about Pakistani conspiracy theories. The evil American entity who’s dictating Pakistani life (to the point of cutting off power)? Think tanks. Now, those of us in the U.S. know just how absurd that is. Think tanks tend to be a small group of academics who sit around and debate policy issues. They then communicate those ideas to politicians who may or may not act on them. Think tanks are a pretty weak force in American politics. But they’ve got a mysterious bent to them. What all do they say? To whom? How often are their ideas acted upon? That ambiguity makes them an ideal candidate for a conspiracy theory.

There’s the whole 9-11 Truthers thing. 9-11 was a traumatic day in the U.S. We were completely unprepared for it. And we immediately wanted somebody to blame. There’s those out there who couldn’t imagine it being done by Muslim extremists from the Middle East (I guess the extremists aren’t deemed competent enough?). And the result has been a vast government/corporate conspiracy in which thousands of Americans were killed and a city was sent into metaphoric shock to justify a war that secured oil resources. The explanations tend to focus on ambiguous moments of recent history that are chained together in a way that seems almost plausible. Until you realize how complex the conspiracy network would have to be and that no one from that network ever squealed before the act. But I digress. The perpetrator here is usually taken to be greedy oil corporations and their politician-pawns. We all know that corporations donate prolifically to politicians’ campaign funds. And that those donations buy a certain degree of influence. But how much? Hard to quantify. Again, an entity with some real power, but who’s true power is hard to nail down.

And there’s my personal favorite: the Bilderberg group. There really is a Bilderberg group. They really are a collection of powerful businessmen and former politicians. And they really do meet for a “secret” retreat every year, full of high security and hush-hush conversations. But does the Bilderberg group really control the world’s financial fate? Did they bring the Great Recession on us so that they could make large sums of money? Do they have every major politician in their pocket? Seems absolutely absurd, no? But it’s hard to refute. The group’s members all have a certain degree of power individually and the power of the group as a whole is pretty much unknown.

Can you refute them?

Not really. They’re built around stories and incidents that aren’t fully known. The bases of conspiracy theories are events from the past that were never fully explained. Whether for national security reasons, or just due to their extreme complexity, there are holes that we can’t really fill in. And those holes become fodder for making the issue far more complex than it really is.

As well, true conspiracy nuts build their identity around these theories. Their purpose in life is to reveal the conspiracy to the world. And, since they have so much invested in the theories, they are very reluctant to give them up. So, combine hazy stories with people committed to believing a certain interpretation of those stories, and you get a very intractable situation.

Should you take them seriously?

I’d love to say no. But then you’ve got the whole “death panel” thing from the Long Hot Healthcare Summer of 2009. And, sometimes, conspiracy theories gain enough traction in the public mind that they can influence national policy — and by extension all of our lives. (there were no death panels, btw — complete fabrication designed to be a scare tactic against Obama’s healthcare plan)

They’re also pretty universal across cultures. Every country seems to have its own form of conspiracy nuts. The U.S. government blowing up its own country. Pakistanis seeing think tanks turning off their water. Iranians professing the holocaust never happened. Rastafarians claiming that the world is ruled by a white Babylonian race. Eurabia, a secret alliance between France and the Islamic World. And, well, you get the point.

These seem to be a natural feature of the human condition. We like to place the fault for our woes on somebody else’s shoulders. And, as long as news coverage is less than 100% explanatory, we’ll have conspiracy theories about evil fast food companies rigging our burgers with explosives garnered from black-ops military groups.

Or, I could be part of the conspiracy and am just trying to force you to drink more Kool-Aid.


Forget the numbers?

Yet another post on what has become a near obsession of mine: how do experts communicate numbers to the public?

I’m still working my way through the Intro to Psych class from MIT that motivated the last post. But this post should be a bit more involved than that one. Right now I’m in Lecture 8, a discussion of how we think. What’s caught my eye in this one is the discussion that’s going on about how able we, humans, are to comprehend statistics and big numbers.


And, while the lecture is really just about the basics of this ability, it’s gotten me thinking about the implications of the recognition that the human brain just isn’t wired to understand statistical probability or the implications of really big numbers. I’ve always thought that this was the case (lots of evidence from public understanding of evolution and astrophysics). It’s comforting – though somewhat disturbing – for me to know that there’s a pretty solid psychological foundation for the concept that we have a really hard time grasping probability, weighing options based on big numbers, and understanding the concept of statistical outliers.

Apologies that I can’t provide better sources for the tests that follow. They’re from an audio file and my memory just isn’t good enough to catch the tests that the lecturer named. But look in this Intro to Psych course, lecture 8, about 30 minutes in if you’re interested (there’s also a PDF handout with some incomplete citations). The tests apparently netted more than one Nobel prize, so reliable stuff.

Comparing Numbers

Ok, so we’re all able to judge that 4 > 3, right? Hopefully? But apparently, we’re not very good at judging when one number is bigger than another when the numbers come from formulas. The test goes as follows:

Which of the following is more dangerous?

a) a program that has a 1/3 chance of killing 600 people

b) a program that will definitely kill 200 people

So, you do some quick math, and the expected number of people killed is exactly the same in each case. But, when asked, people will almost always pick option b.
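The “quick math” can be sketched in a few lines of Python, using only the numbers from the test above:

```python
# Expected deaths under each program from the test above.
p_kill = 1 / 3
group_size = 600

expected_a = p_kill * group_size  # risky program: 1/3 chance of killing 600
expected_b = 200                  # certain program: always kills 200

# Statistically, the two options are identical.
print(expected_a, expected_b)  # 200.0 200
```

The only difference between the options is how the identical expected value is framed: as a gamble or as a certainty.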

Forgetting the Base Rate

This test looks at how good we are at determining the probable chance of something happening.

You meet Jack. Jack is a short, skinny, clean-cut man who loves poetry. He’s a self-described feminist who is active in environmental causes. Is Jack more likely to be

a) a classics professor from an Ivy League school

b) a truck driver

Ok, we all say a, right? Thing is, when you factor in the base rate, it’s overwhelmingly b. The base rate is, in statistical terms, the underlying frequency of each group in the population. Or, in layman’s terms, how many people you’re starting out with. There are hundreds of thousands of truck drivers in the US. There are maybe 100 Ivy League classics professors. So, the odds of Jack being a classics professor are just overwhelmed by the odds of him being a truck driver.
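The base rate effect is easy to work out with made-up numbers. Everything below (group sizes, how well the description fits each group) is an assumption for illustration, not data, but it shows why even a description that fits professors far better still loses to the base rate:

```python
# Hypothetical figures, chosen only to illustrate the base rate effect.
n_truckers = 200_000        # truck drivers in the US (assumed)
n_professors = 100          # Ivy League classics professors (assumed)
p_fit_trucker = 0.001       # chance a random trucker matches the description (assumed)
p_fit_professor = 0.5       # chance a random professor matches it (assumed)

# Expected number of people in each group who fit the description:
fits_trucker = n_truckers * p_fit_trucker      # 200 truckers
fits_professor = n_professors * p_fit_professor  # 50 professors

# Even if a professor is 500x more likely to fit, the base rate wins:
p_professor = fits_professor / (fits_professor + fits_trucker)
print(round(p_professor, 2))  # 0.2, so Jack is still 4x more likely a trucker
```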

The Story Dwarfing the Numbers
(or, qualitative over quantitative)

Which are you more afraid of: driving a car or flying in an airplane? The airplane, right? Ok, now which is statistically more dangerous: driving a car or flying in an airplane? Actually, driving a car by pretty much any measurement out there.

Psychologists have some ideas on why this is. And while there are a number of reasons, probably the strongest is the narratives that go along with plane crashes versus car crashes. Plane crashes make the evening news. They get investigated by Congressional commissions. Car crashes are mundane events that might make page 14 of the newspaper. We hear about plane crashes more, so we’re more scared of them.

Telling Stories

So, how this plays out in real life is that when you’re convincing the populace of something, explaining the numbers rarely works.

Take the lottery, for example. Your odds of winning are infinitesimal. Yet, lottos make states enough money to fund entire school systems. Neil deGrasse Tyson, a science popularizer a la Carl Sagan, has a line that I love:

Lottos are a tax on those who failed statistics in high school

Las Vegas is a thriving economy for much the same reason.
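The arithmetic behind Tyson’s quip can be sketched in a few lines. The odds and jackpot figures below are rough assumptions in the ballpark of a real multi-state lottery, not exact numbers:

```python
# Expected return on a $2 ticket, using assumed round numbers.
ticket_price = 2.00
p_jackpot = 1 / 292_000_000   # roughly Powerball-scale odds (assumed)
jackpot = 100_000_000         # assumed jackpot; smaller prizes and taxes ignored

expected_return = p_jackpot * jackpot
print(round(expected_return, 2))  # 0.34, i.e. you expect ~34 cents back per $2
```

The expected value is well under the ticket price, yet the one-in-292-million story of the winner sells tickets anyway.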

But take the health care debate that (thankfully) just ended. It still stands out in my mind when ABC televised Obama’s town hall meeting at the start of the long hot summer of 2009. In the town hall, he tried very hard to explain the economics of his ideas – specifically why the public option was a good idea. He went through hard and soft dollars when calculating long term savings, exactly how much uninsured Americans cost insured Americans by adding to insurance premiums and hospital fees, and other numbers.

Then, a week later, the White House dropped the tactic of explaining the economics of the proposal and instead framed the debate as an attack against evil health insurance companies. The reason, apparently, was that the American people just couldn’t grasp the economic/math stuff.

And there’s also the whole global warming thing. “Climategate” (which was recently debunked) supposedly was a case in which questionable research at one institution 25 years ago invalidated all global warming research. This is a case of the story dwarfing the numbers. There have been thousands of research projects which support anthropomorphic global warming. Invalidating one of those shouldn’t invalidate the other 999+.

Remember Snowpocalypse 2010? And remember the Fox News-types saying that a single massive snow storm invalidates global warming theories (when, in fact, it reinforces them)? Again, a story overwhelming the numbers.

Last example. State of the Union speeches always have the “hero” in the balcony next to the First Lady (ok, they’re really heroes, but you’ll see my point soon). They’re there to put a human face on a large statistical number. “Employment is up across the nation, as evidenced by Frank, sitting next to my wife.” “The stimulus bill has saved thousands of public jobs, as evidenced by Sarah, sitting next to my wife.” “Things are good because of me, as evidenced by” well, you get the point.

Don’t blame the schools

My first thought was to blame the school system. (why not, they’re an easy target) But, then, I asked myself: did we ever understand the numbers? Really, probably not. Could we ever understand the numbers? Really, probably not.

Schools try; teachers try. But you’re just fighting the hard wiring of the human brain when you try to

convince someone that just because their neighbor won the lotto while watching the results standing on their head, it doesn’t mean that they’ll win the lotto by watching the results standing on their head.

It may just be something we have to live with. People don’t like big numbers.

Forget the Numbers?

So, I’m left thinking:

People have a really hard time understanding numbers. People need to understand numbers to understand climate change, health care, education reform, global economics, immigration, mammogram screening policies, and evil aliens visiting Earth.

Uh oh.

How, then, do we communicate the importance and point of these issues to people who have a really hard time understanding the rationale behind them?

Well, we could keep throwing numbers at them until we both give up in disgust.

Or, we could do something productive like give them a story that exemplifies the point the numbers make.

A question comes up: is it even worth giving them the numbers at all? Will providing the climate change data sets add anything to the argument that climate change will submerge New Orleans in 30 years? Will providing the data showing the vast connections between the Chinese and U.S. economies add anything to free trade arguments, beyond showing Bob, who now makes a living moving goods between the two countries?

I don’t think so. But, I think it’s important to include the numbers for ethical reasons.

The Ethics of Numberless Stories

Because if we leave out the numbers, then no one has the ability to fact-check the stories. And we can get into a contest of who has the better story. Whoever has the more sympathetic story wins the debate – not whoever has the better numbers.

There’s, I think, some discussion of this in Risk Communication literature. If any of you know of some, throw it my way, will you?

Is it unethical to use a compelling story to convince someone that something isn’t true? Seems like an easy answer: yes. But, is it unethical to use a compelling story to convince someone that something is true? If all you use is the story – if you omit the numbers – I’d have to say that’s an ethical gray zone.

Convincing someone that the stimulus package created lots of jobs because Mary in New York still has her teaching gig isn’t really convincing them with the facts, is it? It’s almost akin to convincing someone the sky is blue because grass is green. And convincing someone without facts, without logos and only with pathos or ethos, has always bothered me. It’s too close to manipulation.

So, if you want to convince someone that, say, the Earth really is warming, you need to give them the numbers and give them a story that demonstrates how the numbers play out. But if the numbers are never absorbed by your audience (if they never end up meaning anything), have you really communicated them? Sure, you spoke, but did they listen?

And if the numbers aren’t communicated, aren’t you essentially back to convincing someone solely with the story, and all the ethical quandaries therein?

I know the typical scientist’s response: the numbers are there. If they don’t understand them, that’s their fault. But I’m a rhetorician – I don’t like ineffective communication.

The Solution

<purposefully left blank>

Oh, how times have changed!

I got a request to do an entry on the Goldman Sachs “situation.” And I always listen to my fans, err, friends kind enough to read this regularly. Earlier today both the NYTimes and the Wall Street Journal published lead stories about an SEC civil suit accusing the investment bank of directly manipulating the subprime mortgage market for financial gain.

The situation (as far as my non-economics brain can understand) is that Goldman Sachs promoted and possibly sold mortgage packages that it fully expected to collapse. Then, through the purchasing of derivatives, Goldman Sachs (GS) made substantial amounts of money when the mortgages went belly-up. In this way, GS made massive amounts of money by manipulating the market.

What fascinates me as a human being with ethical principles is that GS found it fully acceptable to manipulate the market in such a way that robbed investors of their money and fueled the sale of mortgages to individuals who couldn’t afford them.

What fascinates me as a rhetor is the public communications that surround this event. This story made the front page of two of the most respected publications in the world. It also headlined at USA Today, CNN, and CNBC (which used the title “For some investors, another reason to distrust Wall Street” for its article). And it is a story that will be discussed around countless dinner tables tonight – especially as people pull out their 401k statements.

But if this story were to have happened 20 years ago (or, for that matter, 3 years ago), would it have gotten all of this attention? I don’t think so. Yes, it would have headlined on CNBC and most likely the WSJ, but USA Today and CNN wouldn’t have covered it.

The difference? The economy.

There has been an order-of-magnitude shift in the attention given to Wall Street, or more precisely to Wall Street’s misconduct, since the fall of Lehman Brothers. It’s not that Wall Street suddenly got less ethical (it can be argued that it’s actually more ethical). It’s that we suddenly care. Multi-million dollar bonuses existed long before the financial collapse and the bailout of AIG. Our attitude has just shifted.

An example: Weekend at Bernie’s was on a couple weekends back. I had never seen it, so I watched it for a while. And I found it fascinating that the excesses of Wall Street that we now vilify were glorified in the movie. I remember watching Michael J. Fox’s The Secret of My Succe$s as a teenager and wanting to be an investment banker.

Now, TV shows are more likely to show CEOs of companies emptying port-a-potties than eating caviar in the back of stretch limos.

And I find it really refreshing. It’s nice to see the excesses of greed become socially taboo. I think we’ll all be a little better off if we’re a little less concerned with the size of our car.

Will it last? Actually, I’m doubtful. Once the economy stabilizes (as it appears to be doing), people will again turn to material goods as a motivation for their lives. And the excesses of Wall Street will attract young people more than the Peace Corps will. But maybe we as a society will be a little more restrained after our current scare?

In the meantime, I’m really enjoying watching the two political parties fight over who’s going to stick it to Wall Street bankers harder.

Climategate debunked, finally?

There was the whole “climategate” debacle a few months back where emails were stolen from a leading climate research lab that supposedly showed fraudulent activity. Rhetoric of Science people almost assuredly remember that. Earlier today a second UK panel investigating the matter came out with its final report, which cleared the researchers of any conscious “scientific malpractice.” That’s two panels clearing the researchers, with a third still investigating.

This should come really close to dismissing the accusations that these scientists purposefully misrepresented their data to advance a political agenda. And, hopefully, it will work to dismiss the accusations that all climate change research is suspect because of this particular project. It won’t, but that’s because people are remarkably adept at believing what they want to and finding conspiracies to discredit whatever runs contrary to those beliefs.

This was a well-timed release for me personally, though, because I had just responded to a climate skeptic’s post to the ARST listserv (a listserv for Rhetoric of Science-type folk). My post was one of several that tried to justify why ARST folk hadn’t been more vocal in questioning the legitimacy of climate change science. I took the tack of trying to explain how rarely a consensus of science is actually wrong (extremely rarely, if you’re curious). Others tried to address why we should trust the experts in matters where we have no expertise, a comment that actually ran similar to an early post on this blog. And yet another tried to explain that, though climate change is usually discussed as a fact, that doesn’t necessarily mean the researchers accept it unquestioningly. Simply that the evidence is so overwhelming at this point that it’s pretty darn near undeniable.

What’s struck me about this email thread, and nearly all the public rhetoric surrounding climate change, is that it often takes the form of individuals who know little to nothing about the creation  of scientific knowledge questioning the results of science knowledge creation. Climate change has highlighted (for me at least) the vast gap between the public’s acceptance of scientific knowledge and their awareness of where that knowledge comes from.

As an example, it is quite common for climate skeptics to claim that climate science will soon be disproven by citing Kuhn’s paradigm shifts. Yet, they don’t seem to recognize how very rare paradigm shifts really are. As another example, the original poster in the listserv mentioned earlier claimed that the “climategate” emails called into question the entire body of climate change research (a common claim). Yet the emails are in reference to a single project conducted by a small group of researchers at one location in the 1980s. How those researchers’ actions invalidate the research done by thousands of other researchers over the past 25 years in dozens of countries has always baffled me.

I heard an explanation once, though, that I do think hints at the psychological process involved in such claims. The explanation went that you can view climate change science as either a house of bricks or a house of cards. If it’s a house of bricks, you can take out one brick (invalidate one particular study) and the house still stands. If it’s a house of cards, taking out one card (one study) will send the whole thing toppling down. Given the multiple sources of data analyzed from multiple perspectives by multiple researchers, I’m much more of a “house of bricks” person.
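The house-of-bricks intuition can be made quantitative with a toy model. The per-study error rate below is an assumed figure, chosen only to show how independent evidence compounds:

```python
# Toy "house of bricks": independent studies each have some chance of being
# wrong. For the conclusion to fall, every brick would have to fail at once.
p_study_wrong = 0.05   # assumed error rate for any single study
n_studies = 1000

p_all_wrong = p_study_wrong ** n_studies
print(p_all_wrong < 1e-100)  # True: the chance they're all wrong is effectively zero
```

Knock out one brick (invalidate one study) and the remaining 999 carry essentially the same weight; only in a house of cards does one failure topple everything.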

The complicated nature that science has taken on in the past 100 years seems to have really distanced it from the general public. General relativity is grossly misunderstood. Particle physics research is seen as a danger, as it may create black holes (though you’re more likely to be annihilated by an intergalactic space war than killed by a human-induced black hole). And climate science, with its complex feedback loops and intensely statistical nature, is seen as smoke and mirrors for a more subversive liberal agenda.

Was it always this way? In part. Galileo was imprisoned. Darwin was mocked. And accused witches were hanged in colonial America. I just have the sense that the gap between the state of cutting edge science research and the public’s understanding of science is much larger than before.

And, as such, scientists have a responsibility to spend more of their time educating the public about their research – even if that means doing less of that research. I know, that’s boring. But it’s important.