Americans have existed under a cloud of media- and government-induced fear since the Creel Commission at the start of World War I. As fascism took hold, FDR warned that "the only thing we have to fear is fear itself." Then the anti-communist hysteria and the witch hunts of McCarthyism and HUAC gripped the United States and propelled the nation into another era of paranoia and distrust. By the time the Cold War ended, Americans were accustomed to being afraid, whether at the behest of politicians or on media command. Even after the spectre of communism had dissipated by the dawn of the 21st century, politicians capitalized on fear-mongering against hidden foreign enemies. Meanwhile, the nightly news droned on about street crime, sex scandals, and violence while titillating the public with reality programming.
September 11, 2001 changed that persistent phobia in a crucial way: it allowed government and media to give Americans' fears a tangible focus. Reality TV shows paled in comparison to the endless, desensitizing reruns of the collapse of the WTC. Terror alerts and false alarms became the order of the day. Suddenly, new villains were omnipresent, not just "over there." Poll after poll proclaimed that the country now supported the unelected and once-unpopular president, was terrified, and demanded action. Media institutions offered little analysis to explain America's anxiety, which was, after all, almost a century in the making. Rather than engaging in sober critical discourse, our spin doctors kept it simple and simply cast blame. We were told that "evildoers" hated our freedom. Daily polls told us what we thought, and that we all thought alike, even though everyday Americans suspected we did not. Michael Moore's Oscar-winning documentary "Bowling for Columbine" addressed the issue of fear and violence in America and how media and government created this very climate, yet most in the press missed that crucial opportunity for a national therapy session.
So how can we clarify the role that the corporate media and their incessant polling play in our collective social mindset? In early 2002 we thought about this question and decided to form our own polling organization to better understand what they are doing. When we look at our own poll results we see confusion, and we have asked ourselves: What do these polls tell us when they show that nearly 50% of the citizenry believed, almost two years afterwards, that Iraq was behind 9-11 (even though no evidence exists to corroborate such a belief)? Or when a large majority of Americans oppose the provisions of the USA PATRIOT Act and declare that media is a major cause of the climate of fear in the U.S., yet many of these same people simultaneously support George Bush's war on terror?
We believe our poll results provide evidence that corporate polls exist mainly to validate corporate media and government disinformation and to tell the public WHAT to think, rather than to reflect what the public actually thinks. The population's apparent consent to the war on terror is virtually manufactured by media and government. Describing and exposing this phenomenon is the focus of Retro Poll. Behind the phenomenon we find that even the best polls, those that use reliable methods, usually conceal vital truths from the public, whether by omitting, burying, or oversimplifying important information, and they do so by choice.
Here are some of the details. Retro Poll uses a unique methodology that investigates people's background knowledge as well as asking their opinions. This allows an assessment of the extent to which background knowledge, or its absence, correlates with views on topical issues. We also compare individuals' responses across different opinion questions, looking for clues. From April 5 to 20, 2003, more than 30 volunteers, mainly college students in the San Francisco area, polled a random sample of the U.S. population on their knowledge and views concerning Constitutional rights and the War on Terrorism. Of over 1,000 people contacted, 215 from 46 states agreed to participate. Detailed results of the poll are reported on the web site.
Like the corporate polls, we buy phone lists from a company that randomly generates and sells such lists for survey and marketing purposes. Of the several hundred Americans one of us (Marc) personally spoke with, about 25-30% agreed to answer the questions. The others either declined or hung up. This isn't surprising. It is commonly accepted in public opinion research that, in random samples, usually 70% or more of those contacted will refuse to participate. With that single act, the refusers destroy the claim that the poll sampled people randomly. The results of any poll can honestly reflect the views of the general population only if the 70% who refuse to talk have nearly identical views to those who agree to be polled. If there are significant differences, the results cannot be said to equate to public opinion.
Polls usually report a statistical "margin of error" for their results. That margin of error depends not upon the number of people called but upon the number who responded, the sample size. Polls typically report a margin of error of about 3% for a sample size of 1,000. But this margin of error statistic, which makes polls look highly accurate, is in essence a cover that hides the 70% who refused to participate. Even if 99% refused and we had to speak to 100,000 people to find 1,000 who would talk with us, the margin of error would still be reported as the same 3%. It would be hiding the problem of non-responders. So the margin of error statistic is not only inappropriate in this circumstance; it suggests a level of certainty that is fraudulent.
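The arithmetic behind that 3% figure can be checked directly. The sketch below uses the standard margin-of-error formula for a polled proportion; the point to notice is that the respondent count `n` is its only sampling input, so the number of people who refused never enters the calculation at all.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Reported 95% margin of error for a polled proportion.

    n -- number of RESPONDENTS (not the number of people contacted)
    p -- assumed proportion; 0.5 gives the worst (widest) case
    z -- critical value for a 95% confidence level
    """
    return z * math.sqrt(p * (1 - p) / n)

# 1,000 respondents yields the familiar "plus or minus 3 points"
print(round(margin_of_error(1000) * 100, 1))  # -> 3.1

# The formula is blind to refusals: 1,000 respondents out of 1,400
# contacted and 1,000 respondents out of 100,000 contacted are
# reported with the identical margin of error.
```

Whether refusers resemble responders is exactly the question this statistic cannot answer, which is why reporting it as the poll's total uncertainty is misleading.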
While it is always possible that those refusing have similar views to those agreeing to be polled, Retro Poll has found evidence to the contrary. When we asked over a thousand people, "Would you take a few minutes to respond to a poll on the impact of the war on terrorism on the rights of the American people?", one woman responded: "You wouldn't want to hear our view on that. People wouldn't like what we think."
"That's OK," we said, "your views are important; they should be counted and reported as part of the democratic process. We want your opinion to count." "No," the woman said insistently. "We're against the war the way they did it. We think they should just bomb all of them, not send our troops over there...." We didn't ask whether she meant bombing everyone in Iraq or some larger group of Muslims or nations, but the woman's self-awareness that her views were outside the "norm" caused her to refuse to participate. Undoubtedly others have specific and different reasons for non-participation that we have difficulty ascertaining, because most won't talk about it.
The "bomb them all" couple may seem the exception among non-responders, but consider this: fewer African Americans and Latin Americans agreed to be polled in both of our national samples. In the current poll 5.7% of respondents were African American (4% in the prior poll), and 6.2% were Latin American (8% in the prior poll), although each of these groups makes up about 12% of the U.S. population (12.5% for Latinos). As a result, our poll sample ended up 79.4% "Caucasian" (i.e., European American), while the actual White/non-Hispanic European American proportion of the population is 69.1% according to the 2000 Census.
It is possible to improve the participation of underrepresented groups in a poll. Gallup reports on its web site that after completing a poll it weights the demographics to assure that correct proportions are represented. Weighting means multiplying the results of an underrepresented group by a factor that brings their input up to intended and expected levels. Another option is simply to over-sample a population that is expected to self-select out of the poll. If, for example, you want to double the number of African American responses, you just begin with a sample that has 24% African Americans instead of 12%. These tricks of the trade work on paper and in statistical analysis, but both fail to address the important questions: "Why would any particular group be more or less likely to participate?" and "Are these refusers different?"
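To make the mechanics concrete, here is a minimal sketch of the weighting step just described. The group shares (5.7% of our sample versus roughly 12% of the population) come from the figures above; the "yes" fractions are invented purely to show how reweighting moves an estimate.

```python
# Post-stratification weighting, in miniature.
# Sample and population shares are from the poll and the 2000 Census;
# the opinion fractions (0.40, 0.60) are hypothetical, for illustration.
groups = {
    #                  (sample share, population share, fraction saying "yes")
    "African American": (0.057, 0.120, 0.40),
    "everyone else":    (0.943, 0.880, 0.60),
}

# Each group's weight is its population share divided by its sample share.
weights = {g: pop / samp for g, (samp, pop, _) in groups.items()}
print(round(weights["African American"], 2))  # -> 2.11 (each answer counted twice over)

# Unweighted vs. weighted overall "yes" estimate:
raw      = sum(samp * yes for samp, _, yes in groups.values())
weighted = sum(pop * yes for _, pop, yes in groups.values())
print(f"{raw:.3f} -> {weighted:.3f}")  # the estimate shifts once reweighted
```

Note that the multiplication succeeds whether or not the respondents being weighted up actually resemble the refusers; the arithmetic cannot tell the difference.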
If those questions sound familiar, they should, for this is just a more specific and powerful example of the pesky problem of the 70% who refuse to participate in polls, the problem that won't go away. When we take it to the level of the under-representation of ethnic groups, however, it is easier to see that there are probably specific socio-political and/or economic reasons why some people are more likely to participate and others are not. These can include issues like English language skills, fear of being monitored by race, lack of self-confidence, or poor educational background. Any of these factors, or dozens of others that may affect people's decisions, would invalidate the principle of a random poll sample that can be used to represent the general public. If those African Americans who agreed to participate were wealthier or better educated than those who refused, then adjusting their input upward by a multiplier (weighting them) to provide a bigger contribution would be a charade, since their views might not represent those of less educated, lower socio-economic classes of African Americans. You might, for example, be inappropriately magnifying the views of a tiny group of African American Republicans.
But the pretense of random samples is only part of the polling problem.
In a recent investigative article on the Field Poll, a group at Poor News was able to tease out a key part of the polling fraud. When directly interviewed, Field Poll leaders claimed that poll publishers in the media and other big-dollar poll funders have no influence on poll subject, content, or interpretation. They claimed that Field researchers choose their own survey topics and that the media support them financially mainly through subscriptions. But when Poor investigators called and pretended to be interested in purchasing (i.e., commissioning) a particular poll, a Field Director told them they would have to come up with six figures to get what they wanted. The caller was given the example of a $100,000 poll, funded by the San Francisco Chronicle and other unnamed sponsors, which found renewed strong public support for nuclear power. Who funded that poll besides the Chronicle? The Field Director didn't say, but we might guess it was the energy industry.
The weak attempt to deny these practices actually conceals more ominous and detrimental purposes and impacts of these polls. Our April 2003 poll on public views concerning the Patriot Act, the War on Terrorism, civil rights, and Iraq revealed a public totally confounded by the disinformation they receive from the media and government, something major polls almost never explore. For instance, when Americans hear specific provisions of the USA Patriot Act, they oppose the intrusions of this law into their civil rights by a wide margin (average 77%). Yet when asked generally what impact the War on Terrorism is having upon civil rights, many of the same people say it is "strengthening" or having "no impact" upon their rights (57%). This inner confusion and these conflicting loyalties were exemplified by a 37-year-old woman from Udora, Kansas, who rejected each of three provisions of the Patriot Act mentioned in the poll and also opposed the use of torture, other outlawed forms of coercion, and lengthy prison detention without trial; she also supported a requirement that the U.S. prove accusations against other nations before attacking them. However, when asked "Should the U.S. support international efforts to prosecute war crimes?" and "Should the U.S. make war against Iraq or other countries the government accuses of supporting terrorism when they are not attacking anyone?", this same Kansan hesitated and replied: "I'm confused. What is Bush for? I want to do whatever Bush wants. I want to support the President."
One might think that the media would be fascinated with this contradictory phenomenon and want to study it. But there are strong financial incentives for polls to provide a simpler picture, one which validates the sponsors and the government. Because most major polls are generated by the mass media and other corporate forces (including foundations that depend upon money from their parent corporations), they will aim to show public views to be consistent with the funders' needs and wishes. The contradictions and confusion in the public outlook, which often derive from media disinformation and government-media collaboration, are a source of embarrassment to the media and will tend to be suppressed.
Likewise, questions are often dumbed down to elicit emotional mass responses rather than to challenge people to think and weigh an issue. Questions like "Do you like the President?", "Is he doing a good job?", "Do you support the troops overseas?", and "Is the war on terrorism protecting your rights?" (our question) are actually biased toward the media's most recent slant on the issue. To say no to any of these implies aberrance. Such questions require a person with a different perspective to risk identifying themselves as a dissenter and outsider, as being against "our" young soldiers or as hating America, should they disagree.
People are so used to having such hidden assumptions placed into mass media and polling discourse that some inevitably find Retro Poll's attempts to neutralize such assumptions and bias to reflect "bias." For instance, the September 2002 Retro Poll contained this factual question (derived from If Americans Knew): "In the Palestinian uprising of the past two years, 84 children were killed by one side before the other side killed a child. Were these 84 children killed by the Israeli Army, Palestinian militants, neither, or don't you know?" Obviously this factual question was chosen with a purpose in mind that would irritate the supporters of Ariel Sharon, but it is nevertheless a factual question with a factual answer. Moreover, it is a factual question aimed at examining the way disinformation has been used in the mass media around the Israeli-Palestinian confrontation. Someone who knows the correct answer but prefers that such bitter and suppressed truths not be highlighted in public will bristle at this question and may call it biased, for it challenges the widely held notion that the Palestinians initiated the terror against civilians. But the question itself is not biased, for those who do not know the answer will simply say so. The question actually measured bias in the mass media coverage when 13% of respondents (more than the number who correctly said the Israeli Army) assumed it had to be the Palestinians rather than answering "don't know" (Retro Pollsters tell people it is better to answer "don't know" than to guess the answers to the factual questions).
Because major polls before the invasion consistently showed at least two-thirds of Americans opposed to attacking Iraq without UN approval, one might ask why it became important to so frequently ask people whether they supported the invasion once war had begun. Was this meant to intimidate the majority into supporting an unprovoked war under threat of being labeled disloyal and anti-American if they did not change their views? Everyone knows, certainly the media knows, that at the initiation of any war the public view will always appear to shift toward support of government policies. This is a well-studied mass "loyalty" effect. By highlighting this shift and making it look like a measure of a real change in public belief (rather than an inevitable byproduct of government action), the media polls fraudulently generated a snowballing "pro-war" effect for the government to feast on, although in actuality revulsion at what the U.S. government was doing remained widespread. Such media behavior empowers right-wing extremism and potentiates attacks on, and weakens the general public's perception of, the peace movement.
The eagerness and frequency with which media conduct all types of polls is a measure of the extent to which relevant news and critical thinking are supplanted by the business of news marketing. Even the more "professional" and "reputable" polling outfits end up as prostitutes to all-powerful government, corporate, and marketing forces and, as in the case of Field, they dare not admit that most of what they do is designed to ensure the financial success of their organizations by pleasing their corporate funders with beneficial results.
Mickey S. Huff (firstname.lastname@example.org) is adjunct faculty at Diablo Valley and Chabot Community Colleges in the San Francisco East Bay. He teaches American History and Critical Thinking and coordinates student outreach for Retro Poll. Marc Sapir (Msapir@compuserve.com) is an East Bay physician and media and peace activist. He directs the Retro Poll project at www.retropoll.org.