

Yes, The Election Polls Were Wrong. Here’s How We Fix It.

The cracks in the polling industry have been apparent for years, and it was only a matter of time before we had a major polling miss in this country. On Monday, I warned about the accuracy of the polls leading up to Election Day; I knew a surprise was coming thanks to my own online surveys throughout the year.

For years, I’ve been advising clients to move their polls from phone to online. Political and issue-oriented clients have been more resistant to the change, while private firms have embraced the lower cost, quick turnarounds, and improved accuracy offered by online surveys.

Much of the resistance from clients in the public affairs space (i.e. campaigns, associations, public affairs groups) stems from guidance from groups such as FiveThirtyEight, AAPOR, and the AP, which for years maligned many of the publicly available online polls.

Granted, in some cases their skepticism and criticism was warranted. But it was unfair and wrong to advise media and organizations to simply avoid “unreliable” online polls. There’s a lot of good online polling being done right now, such as the USC Dornsife/L.A. Times tracking poll that correctly predicted Trump’s victory.

Another factor is resistance to change. Too many pollsters are wedded to phone polling because of the revenue streams tied to a methodology that has brought them years of professional and financial success.

Those days are now over.

People lack the time or patience to answer a 15-minute phone survey, and the respondents who do stick with it are probably not an accurate reflection of any group other than partisans or people who are natural joiners or volunteers.

Consumers have tools, such as caller ID and call blocking, to avoid being interrupted by telephone polls, just as they successfully avoid TV commercials with subscription services and time-shifting.

The ability to avoid phone polls, and the unwillingness to participate in them, has led to record-low response and cooperation rates. This has been common knowledge for years, and it biases survey results. Adding cell phone interviews to the mix helps but is not a cure: it is expensive and difficult to reach the 40% to 50% of the population without a landline phone.

Respondents are simply more honest answering an online survey compared to surveys that are administered by a live interviewer over the phone. This is especially true when testing voting intentions involving two candidates with off-the-chart negative ratings competing in a highly charged media environment.

Short online surveys are the way to go.

Let me add one caveat.

Most online surveys use a panel of pre-recruited individuals or households who have agreed to take part in online market research. Typically, there are not enough of these panelists to conduct a statistically reliable poll in a smaller geography, such as a congressional or state legislative district. However, online is a viable, and often preferable, option for statewide contests and larger congressional districts.

Okay. So, we should go ahead and convert our phone polls to online?

No.

You can’t just convert your phone poll to an online survey. It’s not that simple. Online respondents require different questions and different methods to interpret the results.

This was the mistake newspapers made in the early days of online news. They simply took their print product, which was declining in readership, and replicated it online. Not taking advantage of the new technology was a huge mistake. It’s the same for polling. Simply taking a phone poll and asking the same questions online is not advisable.

What do I suggest? I don’t pretend to have all of the answers, but these are some ideas that have worked for me.

Shift To Online Polling

There are just too many issues in phone polling, ranging from non-response (i.e. getting a representative sample of people to talk with you) to coverage (i.e. reaching certain segments of the population, such as prepaid cell phone households).

Purchase quality online sample or grow your own online panels. Online panel quality matters. A lot. For example, a panel built from coupon clippers and entrants to online sweepstakes, contests, and giveaways will skew your results — unless that’s your target audience.

Respondents are more likely to be honest when answering an online poll. I’ve tested it, and so have others. Use this to your advantage. For example, respondents are more likely to state their actual household income in an online survey. Ask for household income in the survey and then use it to weight the data. Too many pollsters disregard income questions in phone polls, fearing that respondents are less than honest in their answers. If you believe this, then don’t ask the question, or find a methodology, such as online, where people will answer it accurately. Stop wasting people’s time. It reflects badly on all of us in the industry.
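
To make this concrete, here is a minimal sketch of post-stratification weighting by household income in Python. The brackets, population shares, and sample data are all invented for illustration; in practice the targets would come from Census figures or another trusted benchmark.

    # Minimal sketch: post-stratification weighting by self-reported income.
    # All brackets, shares, and responses below are invented for illustration.
    import pandas as pd

    survey = pd.DataFrame({
        "respondent_id": range(1, 9),
        "income_bracket": ["<50k"] * 4 + ["50-100k"] * 2 + ["100k+"] * 2,
    })

    # Assumed population distribution (e.g., drawn from Census benchmarks)
    population_share = {"<50k": 0.40, "50-100k": 0.35, "100k+": 0.25}

    # Each respondent's weight = population share / sample share for their bracket
    sample_share = survey["income_bracket"].value_counts(normalize=True)
    survey["weight"] = survey["income_bracket"].map(
        lambda b: population_share[b] / sample_share[b]
    )

    # After weighting, bracket shares match the population targets
    print(survey.groupby("income_bracket")["weight"].sum() / len(survey))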

Proper Weighting

We know some population groups are over- or under-represented in a survey sample regardless of methodology. We have to do a better job of weighting (i.e. assigning “corrective” values to each sample response) regardless of mode (i.e. online, phone, mail, or in-person) to ensure results reflect the profile of the audience we want to survey.

This election provides a great example. YouGov, an online polling firm, polled in the US throughout this election cycle. While it weighted its results across a host of demographic variables, it does not appear to have used household income in its weighting scheme. That was a mistake.

Their samples skewed toward lower incomes, and Clinton outperformed with lower-income voters while Trump overperformed with upper-income households. Who knows? If they had factored income into their weightings, perhaps they would have had a better read on this election.

Surveys should be heavily weighted. Using simple demographics such as gender, age, and ethnicity to weight a survey is insufficient. You have to include attitudinal and behavioral measures, in addition to demographics such as income, if you want any chance of getting usable results. Harris Interactive did a lot of great work in this area in the early days of online polling.
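
One standard way to weight on several dimensions at once is raking (iterative proportional fitting). The sketch below rakes on one demographic and one attitudinal flag; the targets and responses are made up, and a real scheme would include more variables.

    # Minimal raking (iterative proportional fitting) sketch over two
    # weighting dimensions: a demographic and an attitudinal measure.
    # Targets and responses are illustrative assumptions, not benchmarks.
    import pandas as pd

    survey = pd.DataFrame({
        "age": ["18-34", "18-34", "35-54", "35-54", "55+", "55+", "18-34", "55+"],
        "engaged": [1, 1, 0, 1, 0, 0, 0, 1],  # e.g., "follows politics closely"
    })
    survey["weight"] = 1.0

    targets = {
        "age": {"18-34": 0.30, "35-54": 0.35, "55+": 0.35},
        "engaged": {1: 0.40, 0: 0.60},
    }

    n = len(survey)
    for _ in range(50):  # alternate adjustments until the margins settle
        for var, target in targets.items():
            current = survey.groupby(var)["weight"].sum() / n
            survey["weight"] *= survey[var].map(lambda v: target[v] / current[v])

    # Weighted margins now approximate both sets of targets simultaneously
    print(survey.groupby("age")["weight"].sum() / n)
    print(survey.groupby("engaged")["weight"].sum() / n)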

Beware Digital and Social Media Signaling

Use digital and social media analytics to augment your polling, not to replace it. Digital plays a role and in some ways can replace qualitative research. However, in the current media environment, language has been weaponized, especially online. The data you collect from social media reflects the socially desirable side of people’s personalities and beliefs. People typically put on their best face online, so it is difficult to tell what is real and what is simply social signaling.

Shorter Surveys and Polls

As noted above, people lack the time or patience for a 15-minute survey, and the respondents who do finish are probably not an accurate reflection of the overall population. Use third-party data, or purchase demographic data from panel providers, for the variables you need to weight the data. If you have a lot to test, launch multiple short surveys instead of one long survey that respondents will hate.
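
As a sketch of what that data append can look like, the snippet below joins a short survey to demographics supplied by the panel provider, so the questionnaire itself stays short. The file and column names here are hypothetical.

    # Minimal sketch: appending panel-provided demographics rather than
    # asking for them in the questionnaire. File and column names are
    # hypothetical placeholders.
    import pandas as pd

    responses = pd.read_csv("short_survey_responses.csv")  # just the key questions
    panel_demos = pd.read_csv("panel_demographics.csv")    # purchased append file

    # Join on the panel's respondent ID so no survey time is spent on demos
    merged = responses.merge(panel_demos, on="panelist_id", how="left")

    # The appended fields (age, income, etc.) are now available for weighting
    print(merged.head())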

Vary Your Sampling

For political polling, don’t rely solely on verified registered voter files for your samples. Conversely, don’t rely on samples of the general population using screening questions to determine voting status or intent. Do both.

Pollsters are like generals: they’re always fighting the last war, and every election is different. Chances are, you will miss. Tight screens might work in one cycle and be absolutely the wrong option in another. Vary your sampling and you will have a wider range of possible outcomes to review and analyze.
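
A minimal sketch of the idea: field the same ballot question against two frames, a voter-file sample and a screened general-population sample, and compare the estimates rather than betting on one. The data and column names are invented.

    # Minimal sketch: comparing candidate support across two sample frames.
    # All data and column names are invented for illustration.
    import pandas as pd

    voter_file = pd.DataFrame({"vote": ["A", "B", "A", "A", "B"]})
    gen_pop = pd.DataFrame({
        "vote": ["A", "B", "B", "A", "B", "B"],
        "likely_voter": [True, True, False, True, True, False],
    })

    # Screen the general-population sample down to likely voters
    screened = gen_pop[gen_pop["likely_voter"]]

    for name, frame in [("voter file", voter_file), ("screened gen pop", screened)]:
        support = (frame["vote"] == "A").mean()
        print(f"{name}: candidate A at {support:.0%}")

    # Reporting both estimates yields a range of plausible outcomes instead
    # of a single point tied to one frame's assumptions.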

There are no easy solutions to cure what ails the polling industry. Technology has given us better analysis tools while making it more difficult to collect data to analyze.

Typically, when faced with these situations, people focus on what they know and keep doing the same things over and over until they’re forced to change.

Our jobs are too important to simply ignore the serious issues we now face or to leave it to the next generation to address. We need to change and evolve instead of deriding or attacking other ideas on how we should do our jobs.

The suggestions above are just that, suggestions. If you have a better mousetrap or idea, I would love to hear it, and I hope others in the industry are ready to be open-minded, too.

Wearables and Market Research

Google and long-time clothing producer Levi Strauss & Co. have partnered to produce a whole new kind of fabric: a “smart cloth.” Called Project Jacquard, after the inventor of the programmable loom, the technology lets interactive threads be woven into any fabric on an industrial loom, which means it is easy to produce and could spread widely. The interactive threads currently function like the touchscreen on a phone: they can detect someone swiping or moving their fingers and can connect with other devices, such as a smartphone. This means we might soon have another way to answer our phones or snooze our alarms.

Google and Levi’s are not the first brands to come out with “smart” fabric. Clothing brand Athos has embedded wearable sensors into workout clothes to measure heart rate, breathing rate, the electrical activity generated by muscles (EMG), and more. The idea is that all of this information can be displayed for users so they can get more out of their workouts.

Recently, researchers at the University of California, San Diego were granted $2.6 million to develop smart clothes that help regulate body temperature. Using polymers that expand and shrink, their idea is to make a lightweight, washable, easy-to-use shirt that thickens if the room gets colder and thins out if it gets warmer. That could cut electricity use and heating and cooling costs. The technology is still at a very early stage, but if it develops as they hope, it could help considerably during events like the heat wave recently seen in India.

While Google is certainly not the first company to bring technology into fabrics, it is entering the market with new boundaries to push. As noted, other “smart clothes” rely on sensors or polymers in their fabrics; Google is working with threads that contain microchips, and these fabrics will be programmable to do almost anything. Google is designing the software and will provide support, but other designers will be in charge of the actual products.

Levi’s, for one, will get its chance to use the new software in an exciting way. Perhaps a game embedded in the sleeve of a shirt, or a TV remote woven into the arm of a sofa. Google will remain an interested partner, but the designing is left to companies that may have a better sense of what the market is ready for and what customers want. We will see whether this new wearable tech leads to a touchscreen integrated into a shirt, new remotes embedded in a sofa, or even quicker doctor visits thanks to shirts that measure vital signs.

In terms of the market research industry, this technology could work hand-in-hand with biometrics to better measure responses to commercials, brand messages, and advertising campaigns. Wearable technology can measure heart rate, breathing rate, and potentially other physiological changes that come with liking or disliking a message. If we pair wearables with biometric measures such as facial expression analysis, we will get a better sense of how customers actually react to a commercial, product, or branding message. Time will tell, but I think the combination of “smart clothes” and biometrics will soon become commonplace for market researchers.
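
As a toy illustration of the kind of signal a wearable could feed into this work, the sketch below compares heart-rate readings taken during an ad against a pre-exposure baseline. The readings and the interpretation are invented; real analysis would need proper sampling, artifact handling, and validated thresholds.

    # Toy sketch: change in heart rate during an ad vs. a resting baseline.
    # The readings below are invented; real data would come from the garment.
    import statistics

    baseline_bpm = [68, 70, 69, 71, 70]   # readings before the ad
    during_ad_bpm = [74, 78, 80, 77, 79]  # readings while the ad plays

    delta = statistics.mean(during_ad_bpm) - statistics.mean(baseline_bpm)
    print(f"Mean heart-rate change during ad: {delta:+.1f} bpm")

    # A sustained rise might indicate arousal; pairing it with facial
    # expression analysis would help separate excitement from irritation.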

Polling: What Can We Learn From the UK?

While I do not normally follow British politics, I am interested in the opinion polling around the May elections. Polling is heavily relied on during campaigns so candidates know where they stand with the public and where they can improve. It also provides content for desperate reporters and news agencies looking to fill airtime and column inches.

The accuracy of pre-election polls is important not only for people who want to know who is currently in the lead, but also for political campaigns searching for an edge for their candidate.

The opinion polls leading up to this year’s UK elections were particularly inaccurate. Nearly every major poll had the Conservative and Labour parties within one point of each other. The polls indicated that the election would likely produce a hung parliament, with no party holding a majority of seats.

What actually happened is that the Conservative Party won a slight majority. The Conservatives, led by David Cameron, secured 331 seats, past the 326 needed for a majority. Labour secured 232 seats, the Scottish National Party (SNP) 56, the Liberal Democrats fell to 8, the United Kingdom Independence Party (UKIP) won 1, and other parties make up the remaining 22.

Clearly, what actually happened is very different from the neck-and-neck dead heat that the polls predicted.

So, what happened? What went wrong with the polling?

Multiple sources (FiveThirtyEight, the Telegraph, and The Conversation) have ascribed the miss to a failure to sufficiently account for the documented late swing toward the incumbent party (the Conservatives), something that traditionally happens in UK elections. According to Leighton Vaughan Williams’s article in The Conversation, another problem was an overestimation of the number of people who would vote. The Conversation also points to the pollsters’ methodology: most of these polls supplied only party names (Conservative, Labour, etc.) instead of actual candidate names, which tends to “miss a lot of late tactical vote switching.” The late swing, inaccurate turnout estimates, and methodological issues are all possible reasons the pollsters were so far off.

Granted, polling UK voters is a historically difficult task. Polls in the 1992 election were even more inaccurate, and history repeated itself in 2015.

So, what does this mean for the future? Is this a harbinger for our elections in 2016?

It’s no secret that traditional polling methods are quickly becoming outdated. According to MPR News, political polling is evolving to incorporate social media monitoring and analytics. Another emerging technology in campaigns is biometrics.

While some countries have started to use biometrics at polling stations to help with voter identification, biometrics has the potential to do more. Used for polling, it can make research more effective by measuring how strongly a specific person agrees with a statement or question, or wants to vote for a candidate. Even though this technology is new and still in development, I think it will change the accuracy and landscape of campaign research. The 2016 US presidential race is sure to showcase new polling methods, and it will be a good opportunity to observe what does and does not work in a rapidly changing industry.

Life Lesson or Momentary Pause?

Has the turmoil witnessed during this recession really taught consumers a long-lasting fiscal lesson?

According to BIGresearch, a whopping 37.0% of consumers in June admit they haven’t saved any of their income in the past 12 months, while one in five (22.0%) report saving more than 10%. At the very least, younger generations appear vigilant about feeding their piggy banks: three in five of those 18-24 (58.5%) plan to save “more” than they did last year, compared to just one in four 45-to-54-year-olds (27.6%).

One potential reason for the lack of savings is that consumers are directing spare cash toward debt instead: Americans are paying off their cards, and credit card debt has declined by double digits in the past three months.

Relationships Matter

Regardless of size, advisory boards can help businesses learn more about their organizations, their products, and their sales staff. In downturns, having a relationship with key customers and learning what they need is key.

Advisory boards also help companies with small marketing and research budgets stay current with trends, and they give firms an outside sounding board for testing new products or initiatives. Customers, vendors, and service providers are all potential advisers.

Having created several advisory boards, I can tell you that it is quite easy to do and requires little expense.

If done correctly, the insights, rewards, and feedback can be invaluable.