Online vs. Live Caller Polls: Which is Better?

July 6, 2020

I was beginning my survey career in the 1980s, just as pollsters were transitioning from in-person interviews to telephone surveys. Gallup finally gave up the in-person survey in 1988, marking the end of public polls using in-person interviews.

Today the new kid on the block is the online poll, where interviews are conducted over the internet via computers and cell phones. This change has blossomed for a number of reasons, the most important of which is cost: a telephone survey is far more expensive to conduct.

The other problem is the response rate, the percentage of people contacted who complete a telephone poll. Today that rate has dropped to 6%. This has led some national firms, like Pew Research, to shift most of their surveys to an online format.

But there are downsides to this change. The most important is the lack of randomness that telephone surveys enjoy. The key requirement for any poll is that every respondent has an equal probability of being selected and interviewed. Probability theory requires that randomness before we can make mathematical statements about the reliability of the poll; in other words, about how much confidence to place in the results given the expected error.
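To make "expected error" concrete, here is a minimal sketch (not from the original post) of the standard margin-of-error calculation that probability theory licenses for a simple random sample, at the conventional 95% confidence level:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Margin of error for a proportion p from a simple random sample
    of size n; z = 1.96 corresponds to 95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

# Example: a 1,000-respondent poll reporting 50% support.
moe = margin_of_error(0.5, 1000)
print(f"margin of error: +/- {moe * 100:.1f} points")  # +/- 3.1 points
```

This formula is exactly what opt-in samples forfeit: without random selection, there is no `n` for which the calculation is valid.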

Opt-in polls are self-selecting, not random; consequently, the results are not testable. That doesn't mean the results are wrong, but it does mean you cannot use probability theory to estimate the potential error. Some major survey firms that use online polls compensate for this by using national panels of voters recruited via probability methods, such as Pew's American Trends Panel.

Even with these issues, most national surveys have switched to online polls. The ultimate question is how different they are from live-caller surveys. To analyze this, I adopted a method called an observational study, which does not require randomization of the subjects involved. This approach is often used in medical, economic, and political studies where randomization of participants is not possible.

In June, most polls started to show Joe Biden expanding his lead over Donald Trump in national and state polls. Taking the results of all surveys conducted in June, I selected national polls using two different survey methods, online and live-caller interviews, that occurred within two days of each other. The object was to measure whether the two methods produced significantly different results.

In June there were 10 live-caller surveys and 10 online-only surveys. In both modes, Biden led in every poll, with leads ranging from seven to fourteen percentage points. In Table 1 below, the number in each column is Biden's percentage-point lead in each survey mode.

Table 1: Biden's poll lead over Trump in June

The average lead for Biden in the online polls is 8.3%; in the live-caller polls it is 10.5%, a difference of 2.2 percentage points. The chart below gives a graphic representation of both poll methods.


The red line shows the live-caller polls' percentage-point lead for Biden, and the blue line the online polls' lead. Although some polls in the two modes produced similar results, overall it is apparent that Biden generally holds a larger lead in the live-caller surveys. But is the difference between the two modes statistically significant?

Using a simple Student's t-test, we can check whether the difference is significant or just a random occurrence.
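As an illustration of the kind of test involved, here is a minimal Python sketch. The individual poll values below are hypothetical, chosen only so their averages match the 8.3% and 10.5% reported above (the actual June figures are in Table 1), and the sketch uses Welch's two-sample t statistic rather than the one-sample SPSS layout shown in Table 2:

```python
import math
import statistics

# Hypothetical Biden leads in percentage points, for illustration only;
# chosen so the means match the reported 8.3 (online) and 10.5 (live-caller).
online = [7, 8, 8, 7, 9, 8, 9, 10, 8, 9]
live = [9, 10, 11, 10, 12, 10, 11, 14, 9, 9]

def welch_t(a, b):
    """Welch's two-sample t statistic for the difference in means."""
    se = math.sqrt(statistics.variance(a) / len(a) +
                   statistics.variance(b) / len(b))
    return (statistics.mean(a) - statistics.mean(b)) / se

print(f"online mean lead: {statistics.mean(online):.1f}")   # 8.3
print(f"live-caller mean lead: {statistics.mean(live):.1f}")  # 10.5
t = welch_t(live, online)
print(f"t statistic: {t:.2f}")
```

A t statistic this far from zero on roughly 18 degrees of freedom is well past the usual .05 cutoff, which is the same conclusion the SPSS output reports.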

Table 2: One-sample t-test of the mode difference (test value = 0), showing the two-tailed significance and the 95% confidence interval of the difference; significance level < .0001.

As shown in Table 2, the difference is significant: the significance level (< .0001) is well below the conventional .05 threshold, so we can be confident that the two modes differ from each other.

So we have two groups of surveys taken in the same time frame: one using live callers dialing a randomized list, and another using online opt-in panels. Both sets show Biden in the lead, but the online surveys show a significantly smaller lead than the live-caller surveys.

In 2016, there was a similar pattern: after June, non-live-interviewer polls narrowed the gap between Trump and Clinton while still showing Clinton leading. In the end, the online surveys in many battleground states had the race closer, and Clinton ultimately lost those states.

Non-live interview firms have always asserted that some voters are more honest about their voting intentions when responding to either an online or IVR (robocall) poll.

This is usually attributed to what political scientists call social-desirability effects: when talking to a real person, a voter may give the answer they consider more acceptable even though they don't believe it. The online survey mode may give some Trump voters the courage to respond truthfully, since there is no live interviewer.

So what is a poll watcher to do? My advice is to average all the polls for the period and ignore the poll mode. It won't guarantee the results are correct, but it increases the number of surveys and, of course, the total number of respondents, and the law of large numbers can increase accuracy. More on this in a future post. Be safe…
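For the June data above, the pooled average works out simply, since there were 10 polls of each mode. A quick sketch using the per-mode averages reported earlier:

```python
# Pooled average Biden lead from the post's reported per-mode means,
# with 10 polls in each mode (weights are equal here).
n_online, mean_online = 10, 8.3
n_live, mean_live = 10, 10.5

pooled = (n_online * mean_online + n_live * mean_live) / (n_online + n_live)
print(f"pooled average lead: {pooled:.1f} points")  # 9.4 points
```

Because the two modes contributed equal numbers of polls, the pooled figure is simply the midpoint of the two mode averages; with unequal counts, the weighting above would matter.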

By Jim Kane

Jim Kane is a pollster and media advisor, and was for fifteen years an Adjunct Professor of Political Science at the University of Florida. Kane is founder of the polling firm USAPoll and served as the Director of the Florida Voter Poll. His political clients have included both Republican and Democratic candidates, including the Republican Party of Florida, and both the Sun-Sentinel and Orlando Sentinel newspapers. At the University of Florida, Professor Kane taught graduate level courses in political science on Survey Research, Lobbying and Special Interest Groups in America, Political Campaigning, and Political Behavior. In addition to his professional and academic career, Jim Kane has been actively involved in local and state policy decisions. He was elected to the Broward County Soil and Water Conservation Board (1978-1982) and the Port Everglades Authority (1988-1994). Kane also served as an appointed member of the Broward County Planning Council (1995-2003), Broward County Management Review Committee (Chair, 1990-1991), Broward County Consumer Protection Board (1976-1982), and the Broward County School Board Consultants Review Committee (1986-1990).
