Do you ever get the feeling that we only have elections to find out if the polls were right?
– Robert Orben, magician and comedy writer (b. 1927)
Atlanta, GA – Infosurv Research’s home base – is the focus of two hotly contested races this year: one for the Governor’s office, the other for a seat in the Senate. As a result, we are inundated with all types of political messaging, from the candidates, from PACs, and from anyone else with a few dollars to spend for a stake in the outcome!
On top of all that, political polling is rampant. This is gravy for firms involved in political polling, but perhaps not so good for voters bombarded with political calls day and night. The percentages of votes earned by each of the major candidates in both races hover around 45%, with 10% undecided, and a margin of error of 4%. So what does that mean to the average voter? Can you trust the results of these polls? Good question, and even politicians have mixed feelings about polls and how much we should rely on them:
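To see why 45% versus 45% with a 4% margin of error is a statistical tie, it helps to look at where that 4% comes from. Here is a minimal sketch using the standard 95% normal-approximation formula for a sample proportion; the sample size of 600 is an assumption chosen for illustration, since it produces roughly the 4% margin quoted above:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical poll of 600 likely voters, candidate at 45%:
moe = margin_of_error(0.45, 600)
print(f"Margin of error: {moe:.3f}")  # about 0.040, i.e. +/- 4 points

# With both candidates at 45% +/- 4, each could plausibly sit anywhere
# from 41% to 49% -- the poll cannot say who is actually ahead.
```

Note that the margin of error shrinks only with the square root of the sample size, which is why even well-funded polls rarely get much tighter than a few points.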
“You can’t depend upon polls.” Michael Bloomberg
“I don’t pay a lot of attention to polls.” Noam Chomsky
“Don’t worry about polls, but if you do, don’t admit it.” Rosalynn Carter
“How far would Moses have gone if he had taken a poll in Egypt?” Harry S. Truman
4 Survey Design Flaws in Political Polls
Sample design and question wording make a world of difference in the results of any survey. This is especially true with political polls. There are four main ways to bias the results of a political poll with survey design. Let’s take a look at each of these.
- Sampling: The most effective, and perhaps most nefarious, way to influence the result of a political poll is in how the sample of respondents is chosen. Purposely biasing the sample selection to various groups, whether by political affiliation, age, race, gender, geographic location, income, and so forth, can have a profound impact on the results of a poll. Legitimate polls today are weighted to bring an unbalanced sample back into line with a random sample of the population. Some poll results are even weighted based on past election results to account for the impact of the expected voter turnout. But sampling remains a critical element that can be manipulated by the unscrupulous.
- Leading questions: Pollsters can influence the results of their surveys – whether knowingly or not – by the way they word the questions. A Pew Research survey asked respondents whether they would: “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule.” In response, 68% said they favored military action while 25% said they opposed military action. However, when the question was changed to whether they would “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule even if it meant that U.S. forces might suffer thousands of casualties,” responses differed dramatically. Only 43% of respondents said they favored military action while 48% said they opposed it. The introduction of U.S. casualties altered the context of the question and influenced whether people favored or opposed military action in Iraq.
- Loaded words: In a Pew Research survey, 51% of respondents said they favored “making it legal for doctors to give terminally ill patients the means to end their lives” but only 44% said they favored “making it legal for doctors to assist terminally ill patients in committing suicide.” Although both questions ask about the same thing, respondents reacted differently to the word “suicide” than they did to “the means to end their lives.” In another example, respondents have reacted differently to questions using the word “welfare” as opposed to the more generic “assistance to the poor,” with much greater public support for expanding “assistance to the poor” than for expanding “welfare.”
- Question Number and Order: If the survey is wildly biased, and the respondent has been battered with what seems like a never-ending series of one-sided questions, they might have difficulty sticking to their guns. The survey itself can begin to bias their responses. “It doesn’t take a lot of imagination to think of how questions raised during a poll might make someone squeamish to admit he or she is a Republican or a Democrat at the end of the survey,” pollster Scott Rasmussen said. “If the survey was all about those mean-spirited Republicans and their war on women, some people who are marginal Republicans might tell that pollster they’re independents. Or if they are asked about Harry Reid and the Democrats not introducing a budget, that also might pollute the numbers.”
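The weighting mentioned under the sampling flaw above can be sketched concretely. This is a hedged, toy illustration of post-stratification: the respondent data, age groups, and population shares are all hypothetical, and real pollsters weight on many variables at once, but the mechanics are the same: each group's weight is its population share divided by its sample share, so over-represented groups count for less.

```python
from collections import Counter

# Hypothetical raw poll: (age group, candidate preference) per respondent.
# The sample over-represents voters 65+ relative to the population.
respondents = [
    ("18-34", "A"), ("18-34", "B"),
    ("35-64", "A"), ("35-64", "A"), ("35-64", "B"), ("35-64", "B"),
    ("65+", "A"), ("65+", "A"), ("65+", "A"), ("65+", "B"),
]

# Assumed known population shares by age group.
population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}

n = len(respondents)
sample_share = {g: c / n for g, c in Counter(g for g, _ in respondents).items()}

# Post-stratification weight = population share / sample share.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

def weighted_support(candidate):
    """Estimate a candidate's support after reweighting each respondent."""
    total = sum(weights[g] for g, _ in respondents)
    favor = sum(weights[g] for g, v in respondents if v == candidate)
    return favor / total

print(f"Unweighted A: {sum(1 for _, v in respondents if v == 'A') / n:.2f}")
print(f"Weighted A:   {weighted_support('A'):.2f}")
```

In this toy sample the over-represented 65+ group leans toward candidate A, so weighting pulls A's estimate down from 60% unweighted to 55% weighted. The same mechanism is what an unscrupulous pollster can abuse: choose the "population" targets, and you choose the answer.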
Typically, however, evaluating the results of political polling requires examining the survey questionnaire and methodology in detail, a task beyond what most voters can reasonably do. So it is really no wonder that we have become so inured to truth-shading in political advertising that we turn to organizations that exist solely to fact-check the statements made in the advertisements! But whether you are influenced by the polls or choose to ignore them, there is one thing we must all do: VOTE!