One explanation for Propaganda Techniques Related to Environmental Scares comes from Paul R. Lees-Haley, Ph.D. The following article was copied from the Quackwatch web site:
Psychologists have studied several perceptual factors that help explain how reasonable people can conclude that they have suffered toxic exposures and injuries when they have not. These include social proof, repeated affirmations, appeals to authority, vividness, confusion of inverse probabilities, confusion techniques, and distraction techniques.
Social proof is the tendency to believe what most people believe. If an advocate creates the impression that "everyone knows" that someone is lying and covering up facts, there is a subtle implication that those who disagree are somehow flawed and lacking in credibility. Identifying a few people who believe a proposition and encouraging them to go public (especially repeatedly) creates the impression that lots of people are experiencing something real. Repeated affirmations create the impression that the assertion is true.
Appeals to authority add weight to these persuasions. If one or more of the people affirming a belief is perceived as authoritative, e.g., a physician or a political leader, more people will be persuaded. It may matter little that the expert is the only one in the universe with that opinion, if he or she is the only one whose opinions we hear. Sometimes politicians are persuaded to join in unfounded but politically advantageous rhetoric. If we like the source of an opinion, we are more likely to believe. So if a popular actor, media figure, politician, or local hero joins the process, more people will endorse the perceived reality.
Vivid examples -- especially dramatic case histories -- often influence judgments more than dull but more accurate quantitative examples. For example, inviting the single child with a birth defect to the town hall meeting may overwhelm the fact that there are fewer birth defects in the neighborhood than in most similar residential areas.
Confusion techniques can create perceptions of toxicity, injury, or disease. For example, illogical but eloquent rhetoric delivered with an air of certainty can create such perceptions if a few clear alarming phrases are woven into the message. If the release of something harmless to humans is announced along with discussions of studies indicating cancer, birth defects, or brain damage in animals, concern or alarm may ensue. A classic technique is to pose an alarming question as the headline of a speech, article, or broadcast, e.g., "Are your children in danger?" We commonly hear announcements that "bad chemicals" or "known carcinogens" are out there, without objective data to clarify whether the type, amount, and location of the substance could actually hurt anyone. When someone questions the plausibility of the alleged toxic exposures, advocates may self-righteously respond that reasonable people have a right to worry -- as though people who try to alleviate unnecessary worry are violating the rights of others.
Manipulators dramatically announce that people in the community have cancer, birth defects, immune disorders, and other disturbing health problems, as if this were a discovery, or something unusual. Facts about the normal prevalence of these problems are seldom disseminated or compared with the numbers contained in these sensational announcements. Have you ever seen a headline, "Cancer and birth-defect rates exactly normal in Ourtown, USA"?
Ignoring coincidence and drawing attention to a few sick people can be highly misleading. In any large population, for example, it is simple to find a few people who have various severe diseases, including some relatively rare ones. When confronted with the facts about an alleged environmental toxin, for example, manipulative advocates may respond with confusion techniques such as: "One sick child is too many, and we resent your implying that it's OK to poison our children" or "How many body bags will it take to convince you people?" In other cases they skip over probability and go directly to the impossible -- in the words of a concerned parent at a town hall meeting, "How are you going to guarantee that my children won't have cancer in twenty years?"
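The base-rate point above can be made concrete with a little arithmetic. The sketch below uses hypothetical numbers (a town of 50,000 and a disease with a prevalence of 1 in 10,000 -- neither figure comes from the article) to show why finding a handful of sick people in a large population is expected by chance alone.

```python
# Illustrative sketch with hypothetical numbers (not from the article):
# in any large population, even a rare disease produces several cases
# purely by chance, so "we found sick people" is not evidence of an
# environmental cause by itself.

def expected_cases(population: int, prevalence: float) -> float:
    """Expected number of cases given a background prevalence."""
    return population * prevalence

# A town of 50,000 and a disease affecting 1 in 10,000 people:
cases = expected_cases(50_000, 1 / 10_000)
print(cases)  # roughly 5 cases expected with no environmental cause at all
```

Without a comparison to this expected background rate, pointing at five sick residents tells us nothing about a local toxin.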
Confusion, distraction, and other propaganda techniques may be used to make spurious accusations that inspire outrage against opposing parties. In response to recent criticisms of junk science, antiscience arguments are on the rise. Advocates tell us, "We can't wait on science. We have to act now!" and "The scientists want us to do nothing! How many people have to die before XYZ does what is right?" One such critic ironically declared, "We can't wait on science, we have to act on the evidence!" Certainly we make most of our decisions in life without conducting a scientific study first. However, the allegation that some environmental toxin caused brain damage in a specific group of people is a factual question that can be answered only by looking at the data, not by emotional reactions to speculation, sensationalism, and innuendo.
Manipulators strive to divorce us from the facts. Rather than encouraging us to examine the evidence and reasoning of people who appear to disagree with us, they block communications and openly or indirectly try to persuade us that people who disagree with their views are dishonest, not trustworthy, incompetent, biased, racist, only concerned with money, insulting our intelligence, corrupt, betrayers of the American dream, and so on. The subtext is: "Do not consider alternative points of view. Do what we tell you, without realizing that we are controlling you." Like cult leaders, manipulators encourage us to close ranks and form an in-group suspicious of those who question the party line.
Manipulators often try to control beliefs and actions by exploiting people's feelings. Inflammatory emotional rhetoric hardens attitudes against the opponent, and subtly justifies bending the rules to fight against the evildoer. Rhetoric that characterizes the opponent as a powerful bully (for example, that the AMA is persecuting "alternative" practitioners) elicits a desire to root for the underdog, and provides emotional justification for bending ethical rules.
Confusion of inverse probabilities is another classic form of invalid interpretation of facts that arouses unnecessary alarm. For example, suppose an announcement of a release of a toxic chemical is accompanied by news that the chemical can cause upper respiratory symptoms, aches and pains, or other common symptoms. Some people with these symptoms will conclude that the chemical was responsible. And this could be true. However, it may also be true that only 10% of persons exposed develop such symptoms, and only 1% of the population was exposed, so that the probability that a particular person has been poisoned is one in a thousand. These important details can be overlooked in the hue and cry following a dramatic toxic spill.
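The arithmetic in the paragraph above can be written out directly. The 10% and 1% figures are the article's own hypothetical values; the point is that the probability a given person was poisoned is the product of two small probabilities, not either one alone.

```python
# Sketch of the paragraph's arithmetic (the 10% and 1% figures are the
# article's hypothetical values, not real data).

p_symptoms_given_exposed = 0.10  # 10% of exposed people develop symptoms
p_exposed = 0.01                 # 1% of the population was exposed

# Probability that a randomly chosen person was both exposed and made
# symptomatic by the exposure:
p_poisoned = p_symptoms_given_exposed * p_exposed
print(p_poisoned)  # approximately 0.001, i.e. one in a thousand
```

Since the symptoms in question (aches, respiratory complaints) are common anyway, far more than one person in a thousand will have them -- which is exactly why so many people wrongly attribute their symptoms to the spill.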
People tend to assume that sensational terms represent reality. Multiple chemical sensitivity and Gulf War syndrome are prime examples. The existence of a name does not necessarily mean that there is a corresponding real event. However, spurious allegations may appear plausible if associated with common symptoms of human existence, especially if depicted by an expert.
Another misleading technique is the use of categorical terms that lead away from a more reassuring (and more reasonable) quantitative reality. For example, an expert witness in a court case may discourse at length on the effects of severe toxic brain injury when testifying about a mild injury. Or instead of stating that a plaintiff has a subtle cognitive impairment that probably will not affect his life very much, the expert describes the plaintiff as "brain damaged." And instead of saying that a plaintiff has less than 1/10 of 1 percent greater likelihood of contracting cancer than the base rate, the expert opines that the plaintiff has "increased risk of developing cancer" due to some exposure. Both statements are technically correct but not presented equally. Interruptions, objections, topic changes and ad hominem arguments may also be used to divert attention from science-based facts.
Dr. Lees-Haley is a psychologist with offices in the Los Angeles area. Researchers conducting studies on related issues can contact him at 21331 Costanso Street, Woodland Hills, CA 91364. Telephone: 818-887-2874; Fax: 818-887-9034; Email: email@example.com
This article was adapted from Lees-Haley PR. Manipulation of perception in mass tort litigation. Natural Resources & Environment 12:64-68, 1997.