
Polling

2018 Midterms: Reviewing How We Did And One Big Headscratcher…

Before last week’s midterms, we released four public polls covering key races in Tennessee and Florida.

Overall, the accuracy of our polls was quite good.

In Florida, we were one of the few polling organizations to accurately predict narrow one-point wins by Governor Rick Scott over incumbent Senator Bill Nelson in the Senate race and Congressman Ron DeSantis over Tallahassee Mayor Andrew Gillum in the race for governor.

Since it’s Florida, we will not know the final vote tally for at least another week while recounts, competing lawsuits, and numerous court decisions decide the final outcome. Regardless, we nailed the election night totals.

[Figure: Florida post-mortem — poll vs. election-night results]

In Tennessee, we correctly forecasted an easy win for Republican businessman Bill Lee over former Nashville Mayor Karl Dean in the governor’s race. Lee won by a surprising 21 points; the RealClearPolitics average predicted a 14-point win, and our polling showed a nine-point win.

In looking at the polling of the race, it appears most of the undecided voters ended up voting for the Republican candidate, Bill Lee.

[Figure: Tennessee Governor post-mortem — poll vs. election-night results]

The real headscratcher for us was the Tennessee Senate race. Our polling, along with a nearly concurrent poll conducted by East Tennessee State University, indicated a very close race.

Our polling showed former Democratic Governor and Mayor of Nashville, Phil Bredesen tied with Republican Congresswoman Marsha Blackburn at 47% of the vote with 6% of likely voters undecided.

[Figure: Tennessee Senate post-mortem — poll vs. election-night results]

On election night, Blackburn won the race easily by 11 points (55% to 44%).

So what happened? It appears undecided voters broke en masse to the Republican candidate (Blackburn) just as they did in the Tennessee Governor’s race.
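
To make that concrete, here is a quick back-of-the-envelope sketch in Python. The 47/47/6 figures come from our Senate poll; the undecided splits tested below are assumptions, not measurements.

```python
# Back-of-the-envelope: how splitting the undecided bloc moves the final margin.
# 47/47/6 comes from our Tennessee Senate poll; the splits tested are assumptions.

def allocate_undecideds(dem, rep, undecided, rep_share):
    """Return (dem_final, rep_final) after assigning the undecided bloc."""
    return dem + undecided * (1 - rep_share), rep + undecided * rep_share

for rep_share in (0.50, 0.75, 1.00):
    dem_final, rep_final = allocate_undecideds(47.0, 47.0, 6.0, rep_share)
    print(f"{rep_share:.0%} of undecideds to Blackburn -> "
          f"Blackburn {rep_final:.1f}, Bredesen {dem_final:.1f} "
          f"(margin {rep_final - dem_final:+.1f})")
```

Even handing Blackburn every undecided voter only produces a six-point margin, which is part of why we also dug into the regional numbers and turnout below.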

We compared our polling by region with the preliminary vote totals by region. As you can see below, our margins were close to the actual vote in East and West Tennessee.

[Figures: Targoz post-mortem polling estimates by region; Tennessee actual vote by region]

So what happened in Middle Tennessee? We had Bredesen up 8 points in Middle Tennessee, but Blackburn won her home region by 6 points. Before serving as governor, Bredesen served two terms as Mayor of Nashville, so it was expected that he would do quite well in the region, which also trends Democratic. Blackburn has lived in and represented Williamson County (just south of Nashville) for many years. However, these results were still a surprise.

We took a look at turnout for the state, Metro Nashville-Davidson County, and Shelby County (Memphis). Statewide, it appears turnout was up 55% over the 2014 midterm elections. In Shelby County, turnout was up only 47% from 2014, and in Metro Nashville-Davidson County, just 50%.

[Figure: Tennessee post-mortem turnout vs. 2014]

At first glance, it appears turnout in Davidson and Shelby, both Democratic strongholds, was a bit lower than expected and lagged the increases we saw across the rest of the state. It will be a few more days until we get precinct-level data, but it looks like turnout in Shelby and Davidson should have been higher, which would have made it a closer race.

It should also be noted that strong storms roared across the state and the Middle Tennessee area the evening before and during the early morning of election day. Amid power outages and at least one reported death as a result of the storm, turnout could have been negatively affected by the bad weather.

There is one other thing to note about these results.

After the 2016 election, some pollsters and analysts suggested Trump’s surprising win was the result of “shy” Trump voters who were fearful of publicly identifying as a Trump supporter in polls and surveys. I should note, we were not surprised by the win. Our online polling indicated a close race and Trump win in 2016.

Regardless, many pollsters chalked up their misses in 2016 to quiet Trump supporters and to not including enough working-class/blue-collar voters in their polls. To combat this in 2018, most pollsters made changes to their methodologies to ensure voters from all educational backgrounds were included in their surveys, and the 2018 results show some improvement.

However, the Tennessee results, where undecided voters in both races appear to have broken heavily for the Republican candidate, are very concerning to me. If we accept the shy voter theory, measuring public opinion in the increasingly uncivil environment we face today will become even more challenging. It’s certainly something we will be investigating over the next few months as more data from this election is released.

Hopefully, we can gain some clarity on these issues before the 2020 elections.

Yes, The Election Polls Were Wrong. Here’s How We Fix It.

The cracks in the polling industry have been readily apparent for years, and it was only a matter of time before we had a major polling miss in this country. On Monday, I offered a warning about the accuracy of the polls leading up to election day. I knew a surprise was coming thanks to my own online surveys throughout the year.

For years, I’ve been advising clients to move their polls from phone to online. Political and issue-oriented clients have been more resistant to the change, while private firms have embraced the lower cost, quick turnarounds, and improved accuracy offered by online surveys.

Much of the resistance from clients in the public affairs space (i.e., campaigns, associations, and public affairs groups) is due to guidance from organizations like FiveThirtyEight, AAPOR, and the AP, which for years maligned many of the publicly available online polls.

Granted, in some cases their skepticism and criticism was warranted. But it was totally unfair and wrong to advise media and organizations to simply avoid “unreliable” online polls. There’s a lot of good online polling being done right now such as the USC Dornsife/L.A. Times tracking poll that correctly predicted Trump’s victory.

Another factor is resistance to change. Too many pollsters are wedded to phone polling due to the revenue streams associated with a methodology they have used for years that gave them significant professional and financial success.

Those days are now over.

People lack the time or patience to answer a 15-minute phone survey, and the respondents who do stick with it are probably not an accurate reflection of any group other than partisans or people who are natural joiners or volunteers.

Consumers have tools to avoid being interrupted by telephone polls, such as caller ID and call blocking, just as they successfully avoid TV commercials with subscription services and time-shifting.

The ability to avoid phone polls, and the lack of desire to participate in them, has led to record-low response and cooperation rates. This has been common knowledge for years, and it biases survey results. Adding cell phone interviews to this stew helps but is not a cure, because it is very expensive and difficult to reach the 40% to 50% of the population without a landline phone.

Respondents are simply more honest answering an online survey compared to surveys that are administered by a live interviewer over the phone. This is especially true when testing voting intentions involving two candidates with off-the-chart negative ratings competing in a highly charged media environment.

Short online surveys are the way to go.

Let me add one caveat.

Most online surveys use a panel of pre-recruited individuals or households who have agreed to take part in online market research. Typically, there are not enough of these panelists to conduct a statistically relevant poll in a smaller geography such as a congressional or state legislative district. However, online is a viable and preferred option for statewide races and larger congressional districts.
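
As a rough illustration of why small geographies are a problem, the standard simple-random-sampling margin-of-error formula (only an approximation for non-probability online panels, and the sample sizes below are hypothetical) shows how quickly precision degrades as completes shrink:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion under simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical complete counts: a statewide sample vs. what a panel might
# realistically yield in a single congressional or legislative district.
for n in (800, 400, 150, 75):
    print(f"n = {n:4d}  ->  +/- {margin_of_error(n):.1%}")
```

With only a few dozen panelists in a legislative district, the interval is too wide to say anything useful.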

Okay. So, we should go ahead and convert our phone polls to online?

No.

You can’t just convert your phone poll to an online survey. It’s not that simple. Online respondents require different questions and different methods to interpret the results.

This was the mistake newspapers made in the early days of online news. They simply took their print product, which was declining in readership, and replicated it online. Not taking advantage of the new technology was a huge mistake. It’s the same for polling. Simply taking a phone poll and asking the same questions online is not advisable.

What do I suggest? I don’t pretend to have all of the answers, but these are some ideas that have worked for me.

Shift To Online Polling

There are just too many issues in phone polling, ranging from non-response (i.e. getting a representative sample of people to talk with you) to coverage (i.e. reaching certain segments of the population such as prepaid cell phone households).

Purchase quality online sample or grow your own online panels. Online panel quality matters. A lot. For example, a panel built from coupon clippers and entrants to online sweepstakes, contests, and giveaways will skew your results — unless that’s your target audience.

Respondents are more likely to be honest when answering an online poll. I’ve tested it, and so have others. Use this to your advantage. For example, respondents are more likely to state their actual household income when answering an online survey. Ask for household income in the survey and then use it to weight the data. Too many pollsters disregard income questions in phone polls, fearing that respondents are less than honest in their answers. If you believe this, then don’t ask the question, or find a methodology, such as online, where people will accurately answer it. Stop wasting people’s time. It reflects badly on all of us in the industry.
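
Here is a minimal sketch of what weighting on self-reported household income can look like, assuming you have benchmark targets for income from a source like the ACS. The brackets, target shares, and toy respondent data below are purely illustrative.

```python
import pandas as pd

# Toy respondent file with a self-reported household income bracket.
respondents = pd.DataFrame({
    "income_bracket": ["<50k", "<50k", "50-100k", "50-100k", "100k+"],
    "vote":           ["D",    "D",    "R",       "D",       "R"],
})

# Illustrative benchmark shares for the same brackets (e.g., from the ACS).
targets = {"<50k": 0.40, "50-100k": 0.35, "100k+": 0.25}

# Weight = target share / observed sample share for each bracket.
sample_shares = respondents["income_bracket"].value_counts(normalize=True)
respondents["weight"] = respondents["income_bracket"].map(
    lambda bracket: targets[bracket] / sample_shares[bracket]
)

# Compare the weighted and unweighted candidate shares.
weighted = respondents.groupby("vote")["weight"].sum() / respondents["weight"].sum()
unweighted = respondents["vote"].value_counts(normalize=True)
print(weighted, unweighted, sep="\n")
```

The same idea extends to any variable where online self-reports are reliable enough to weight on.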

Proper Weighting

We know some population groups are over- or under-represented in a survey sample regardless of methodology. We have to do a better job of weighting (i.e. assigning “corrective” values to each one of the sample responses of a survey) regardless of mode (i.e. online, phone, mail, or in-person surveys) to ensure results reflect the profile of our desired audience for our survey.

This election provides a great example. YouGovUS, an online polling firm, polled throughout this election cycle. While they weighted their results across a host of demographic variables, it doesn’t appear they used household income in their weighting scheme. That was a mistake.

Their samples skewed lower income, and Clinton outperformed with lower-income voters while Trump overperformed with upper-income households. Who knows? If they had factored income into their weighting, perhaps they would have had a better read on this election.

Surveys should be heavily weighted. Just using simple demographics such as gender, age, and ethnicity to weight a survey is insufficient. You have to include attitudinal and behavioral measures, in addition to demos such as income, if you have any chance of getting usable results. Harris Interactive did a lot of great work in this area in the early days of online polling.
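
One common way to weight on several variables at once, without needing a full cross-tab of targets, is raking (iterative proportional fitting). Below is a bare-bones sketch with illustrative variable names and marginal targets; a production version would add weight trimming and convergence checks.

```python
import pandas as pd

def rake(df, targets, weight_col="weight", iterations=25):
    """Iterative proportional fitting: nudge weights until each variable's
    weighted marginals match its target shares."""
    df = df.copy()
    df[weight_col] = 1.0
    for _ in range(iterations):
        for var, shares in targets.items():
            current = df.groupby(var)[weight_col].sum() / df[weight_col].sum()
            df[weight_col] *= df[var].map(lambda cat: shares[cat] / current[cat])
    return df

# Illustrative marginal targets: demographics plus income and a behavioral measure.
targets = {
    "gender":     {"M": 0.48, "F": 0.52},
    "income":     {"<50k": 0.40, "50-100k": 0.35, "100k+": 0.25},
    "past_voter": {"yes": 0.60, "no": 0.40},
}

# respondents = pd.read_csv("survey_responses.csv")  # hypothetical respondent file
# weighted = rake(respondents, targets)
```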

Beware Digital and Social Media Signaling

Use digital and social media analytics to augment your polling, not to replace it. Digital plays a role and in some ways replaces qualitative research. However, in the current media environment, language has been weaponized, especially online. The data you collect from social media will reflect the socially desirable aspects of people’s personalities or beliefs. People typically put on their best face online, so it’s really difficult to determine what is real and what is simply social signaling.

Shorter Surveys and Polls

People lack the time or patience to answer a 15-minute survey, and the respondents who do are probably not an accurate reflection of the overall population. Use third-party data or purchase demographic data from panel providers for variables you need to weight the data. If you have a lot to test, launch multiple short surveys instead of long surveys that respondents hate.

Vary Your Sampling

For political polling, don’t rely solely on verified registered voter files for your samples. Conversely, don’t rely on samples of the general population using screening questions to determine voting status or intent. Do both.

Pollsters are like generals: they’re always fighting the last war, and every election is different. Chances are, you will miss. Tight screens might work for one cycle and then be absolutely the wrong option in another. Vary your sampling and you will have a wider range of possible outcomes to review and analyze.
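
A sketch of what that comparison might look like in practice: field the same ballot question against different sample sources and screens, then look at the spread of estimates. The file names and column names below are hypothetical.

```python
import pandas as pd

# Hypothetical inputs: the same ballot question fielded against a sample matched
# to a registered-voter file and against a screened general-population sample.
rv_file = pd.read_csv("rv_file_sample.csv")
gen_pop = pd.read_csv("general_pop_sample.csv")

scenarios = {
    "RV file sample":             rv_file,
    "Gen pop, self-reported RVs": gen_pop[gen_pop["says_registered"] == 1],
    "Gen pop, tight LV screen":   gen_pop[(gen_pop["says_registered"] == 1)
                                          & (gen_pop["voted_last_midterm"] == 1)],
}

# Report the ballot share under each sampling approach and screen.
for name, frame in scenarios.items():
    shares = frame["ballot"].value_counts(normalize=True)
    print(f"{name:28s} R {shares.get('R', 0):.1%}   D {shares.get('D', 0):.1%}")
```

Running the same instrument across frames gives you that wider range of outcomes instead of a single point estimate.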

There are no easy solutions to cure what ails the polling industry. Technology has given us better analysis tools while making it more difficult to collect data to analyze.

Typically, when faced with these situations, people focus on what they know and keep doing the same things over and over until they’re forced to change.

Our jobs are too important to simply ignore the serious issues we now face or to leave it to the next generation to address. We need to change and evolve instead of deriding or attacking other ideas on how we should do our jobs.

The suggestions above are just that, suggestions. If you have a better mousetrap or idea, I would love to hear it, and I hope others in the industry are ready to be open-minded, too.

Journalists Are Out of Touch With Reality

Headlines coming out of the third and final 2016 presidential debate all followed a similar theme: Trump will not commit to accepting the election results if he loses.

The four largest newspapers in the country ran front page headlines that adhered to the same script, echoing post-debate comments from broadcast commentators:

  • Washington Post: “Trump Refuses To Say Whether He’ll Accept Election Results”
  • The Wall Street Journal: “Trump Won’t Commit to Accepting Vote if He Loses”
  • New York Times: “Trump Won’t Say if He Will Accept Election Results”
  • USA Today: “Keep You In Suspense: Trump Won’t Commit To Accepting Vote Results”

Out of a ninety-minute debate, viewed by more than 71 million Americans and covering a variety of topics, this was the major takeaway?

Despite the blaring and repetitive headlines, Americans’ view of this election and whether it’s rigged is unlikely to be shaped by an increasingly out of touch and out of time press.

The belief that Trump’s failure to commit to the outcome of the election was a major gaffe simply illustrates just how out of touch major news organizations are with the country they cover.

Americans increasingly believe the system is rigged against them and view corruption as widespread in government and business. Few will be troubled by Trump’s “rigged” rhetoric since most already believe corruption is a huge problem in the country.

According to Gallup, three in four Americans (75%) perceive corruption as widespread in the country’s government. While this number is from 2014, it’s been steadily increasing since 2007.

So what do Americans fear most? Corrupt government officials. According to the 2016 Chapman University Survey of American Fears, six in ten Americans (61%) identified corrupt government officials as their top fear, eclipsing both terrorist attacks (41%) and not having enough money for the future (40%). Government corruption was also their top fear in 2015.

Americans’ view of government and government officials is dismal, and their assessment of the economy and businesses is no better.

Seventy-one percent of Americans think the U.S. economic system is “rigged in favor of certain groups,” according to a June 2016 poll conducted by Marketplace and Edison Research. This belief was shared regardless of political affiliation or ethnicity.

In a follow-up poll conducted in October 2016, nine out of ten Americans who believed the economic system was rigged in favor of certain groups agreed the U.S. economic system is rigged to benefit politicians (89%) and corporations (86%).

Amidst this backdrop of distrust, why do journalists think voters care whether Trump accepts the outcome of the November election? They probably don’t. If anything, they probably agree that it is rigged.

And as much as the legacy press highlights “the gaffe,” its ability to influence voters’ opinion on this topic is increasingly weak. Americans view the press just as negatively as the aforementioned government officials.

In a 2016 poll conducted by the Media Insight Project for the American Press Institute, fewer than one in ten Americans (6%) said they have a great deal of confidence in the press, and four in ten (41%) said they have hardly any confidence at all. In the same poll, more than one out of three Americans (38%) said they have had an experience with a news or information source that made them trust it less.

One of the great challenges ahead is finding a way to restore faith in institutions and most importantly, the press. Sadly, the damage will take decades to repair. These types of initiatives are often generational and will require the passing of significant time before peoples’ memories of this era fade.

The first step to restoring faith in the press is for the media to admit there’s a problem. Second, media organizations need to accept reality and understand how the public actually thinks.

Too many journalists and thought-leaders view the country as they wish it to be and have segregated themselves into hive minds that shelter them from opposing opinions.

Journalists and the legacy media are some of the worst offenders in this regard, often opting to challenge the grammar of a dissenting voice rather than trying to understand the beliefs and judgments of the messenger. Granted, sheltering yourself, regardless of your profession, from the trolls and vitriol of social media is tempting. However, it skews your view of the country, which is a big problem if you’re a journalist.

Without trust in the press, it’s unlikely faith in any government or political institution will be repaired anytime soon. The divisions and distrust we see today will only continue to grow.

It’s high time journalists and the media return to reality, perhaps survey their audiences to learn how they think, focus coverage on issues the public actually cares about, and then report it accurately.

New Online Voter Panel for Political Polling?

Research Now Group Inc., headquartered in Plano, TX, just launched a new voter panel. The panel gives political pollsters a new tool to measure American voters’ perceptions of various issues, providing insight into voters’ opinions of candidates, voter turnout, key campaign issues, and the views of millennials.

This panel gives researchers access to more than 600,000 deeply profiled, verified voters from every state. Researchers can select constituents based on party affiliation, historical election turnout, and congressional district, among other variables. Panelists can participate in surveys on a variety of platforms (mobile, tablet, or PC), so all voter populations will be represented.

Research Now has identified hundreds of thousands of voters who are historically hard to reach, including 70,000 millennials and other voters with no publicly available phone numbers. This panel is the largest of its kind and marks a new step forward in polling. In recent years, polls have faced a lot of challenges, largely due to changes in phone use, caller ID, and the like. Many voters, especially millennials, do not use landline telephones, so it is difficult to get accurate data. This new panel from Research Now provides a way to access this hard-to-reach population.

While this new panel is a step forward in polling, it is not necessarily the end point for polling improvement. In a previous blog post, I stated that the future of public opinion research lies in a variety of new methods. Social media analysis is one way to gain access to millennials and see their voting preferences, but it is also complicated and not always reliable. Biometric technology can help researchers better understand voters’ tendencies and opinions. A combination of these methods, alongside panels like the new Research Now offering, will be a key part of the public opinion researcher’s toolkit, and it will be interesting to see what other technology emerges as the 2016 race for president continues.

Polling: What Can We Learn From the UK?

While I do not normally follow British politics, I am interested in the opinion polling around the UK’s May elections. Polling is heavily relied on during campaigns so candidates know where they stand with the public and where they could improve. It also provides content for desperate reporters and news agencies looking to fill time and column inches.

The accuracy of pre-election polls is important not only for people who want to know who is currently in the lead, but also for political campaigns searching for an edge for their candidate.

The opinion polls leading up to this year’s UK elections were particularly inaccurate. Nearly every major poll had the Conservative and Labour parties within one point of each other. The polls indicated the election would likely produce a hung parliament, with no party holding a majority of seats.

What actually happened is that the Conservative Party won a slight majority of seats. The Conservatives, led by David Cameron, secured 331 seats, putting them over the 326 needed for a majority. Labour secured 232 seats, the Scottish National Party (SNP) 56, the Liberal Democrats 8, the United Kingdom Independence Party (UKIP) 1, and other parties 22.

Clearly, what actually happened is very different from the neck-and-neck dead heat that the polls predicted.

So, what happened? What went wrong with the polling?

Multiple sources (FiveThirtyEight, The Telegraph, and The Conversation) have ascribed the misses to a failure to sufficiently account for the documented late swing toward the incumbent party (the Conservatives), something that traditionally happens in UK elections. According to Leighton Vaughan Williams’s article in The Conversation, another problem was an overestimation of the number of people who would vote. The Conversation also points to the pollsters’ methodology: most of these polls supplied only party names (Conservative, Labour, etc.) instead of actual candidate names, which tends to “miss a lot of late tactical vote switching.” The late swing, inaccurate turnout assumptions, and methodological issues are all plausible explanations for why the pollsters were so far off.

Granted, polling UK voters is a historically difficult task. Polls in the 1992 election were even more inaccurate than this year’s, and history repeated itself in 2015.

So, what does this mean for the future? Is this a harbinger for our elections in 2016?

It’s no secret that traditional polling methods are quickly becoming outdated. According to MPR News, political polling is evolving to incorporate social media monitoring and analytics. Another type of emerging technology in campaigns is biometrics.

While some countries have started to use biometrics at polling stations to help with voter identification, biometrics has the potential to be much more than that. Using biometrics for polling could make research more effective, since it can measure how strongly a specific person agrees with a statement or question, or how much they want to vote for a candidate. Even though this technology is new and still in the development stage, I think it will change the accuracy and landscape of campaign research. The US presidential race of 2016 is sure to demonstrate some new polling methods, and it will be a good opportunity to observe what does and does not work in a rapidly changing industry.

Poll Today, Gone Tomorrow

The latest employment numbers continue to show significant improvement in the economy. Granted, the recent numbers have been a jumbled mess due to Easter and the Census. But if you look below the misleading headlines and campaign speeches, you will find significant improvement.

It looks like the private sector produced 123,000 jobs in March and 60% of all reporting industries indicated job growth. I think it is now safe to say the economy is growing again and the recession did indeed end in July or August of last year. We have turned a corner and the immediate future is looking better.

Granted, the picture is not completely rosy. Those unemployed less than 26 weeks are likely to get recalled to work. However, those unemployed for more than 26 weeks due to a plant closure will probably remain unemployed for a significant amount of time. 

Moderate GDP growth of 2% to 3% for 2010 and 2011 is definitely on tap.

One thing to watch out for is the impact of a growing economy on the fall elections. Republicans have enjoyed relative strength in most generic ballot polls partly because of the problems in the economy. As the economy begins to expand, Democratic candidates in communities with less structural unemployment (i.e., areas without significant plant closures) will be able to point to the improvement and make the case that their policies are working.

A key to successfully interpreting research is to realize that polling and market research is a snapshot of today, not tomorrow. When assessing the results of any project, you have to include other inputs and information (such as economic data) to really understand what is happening and most importantly what you will face tomorrow or in November.

Glimmer of Consumer Optimism

BIGresearch's March 2010 Consumer Intentions & Actions Survey contains a few kernels of optimism on the economic front. Here are a few highlights:

  • In March, fewer than one in three (29.8%) contend they are confident/very confident in chances for a strong economy. While this figure has risen 2+ points from a month ago (27.2%), it continues in the “about 30%” holding pattern begun in May-09. This month’s reading represents an improvement from a year ago (19.5%) as well as Mar-08 (24.8%), but is still well below Mar-07’s 46.9%.
  • One in five (20.6%) assert that they worry more about political/national security issues, down nearly a point from last month (21.3%) and three points from Mar-09 (23.8%).
  • Consumer confidence showed slight improvement from February: nearly half of those surveyed (48.4%) contend they’ve become more practical in purchasing, up five points from a month ago (43.3%) but still below the 52.7% reading recorded in Mar-09.
  • More than half of those surveyed (55.7%) say they are focused on just the necessities when spending, up more than three points from a month ago (52.1%) but lower than Mar-09’s 58.1%.

Pocketbook Vs The Medicine Cabinet

During a rough economy, pocketbook issues are always number one among voters. According to the latest American Pulse™ Survey, a majority of Americans (55.5%) think the #1 issue the President and Congress should be focusing on is the economy. Following not so closely behind are healthcare reform (18.3%), terrorism (6.4%), Social Security (5.8%), and Afghanistan (5%).

Seemingly, everyone and everything is focused on healthcare and not the economy. Talk about ignoring the needs of your customers/voters.

The study also found that 81.9% of Americans say the U.S. Government is spending too much. Of those who agree, 76.9% say the high level of spending may be sacrificing future economic growth. Over 60% of Americans have negative feelings towards Government spending.

Regarding Government spending, which of the following best describes your feelings? (Adults 18+)

  • Angry, debt is bad: 30.7%
  • Happy, debt is good if it helps people: 6.2%
  • Powerless, no one in Government seems to care: 37.8%
  • Empowered, the more Government does, the better it is for everyone: 6.1%
  • Unsure: 19.3%

Yes, healthcare reform is important. But not addressing the country’s number one issue and piling up a mountain of debt in the process is irrational and incredibly out of touch. Is it really any wonder that more than one out of three Americans say they feel powerless when it comes to government spending? A lot can happen in a year, but I have no doubt that voters will be empowered to punish incumbents in 2010.