
2018 Midterms: Reviewing How We Did And One Big Headscratcher…

Before last week’s midterms, we released four public polls covering key races in Tennessee and Florida.

Overall, the accuracy of our polls was quite good.

In Florida, we were one of the few polling organizations to accurately predict narrow one-point wins by Governor Rick Scott over incumbent Senator Bill Nelson in the Senate race and Congressman Ron DeSantis over Tallahassee Mayor Andrew Gillum in the race for governor.

Since it’s Florida, we will not know the final vote tally for at least another week while recounts, competing lawsuits, and numerous court decisions decide the final outcome. Regardless, we nailed the election night totals.

[Chart: Florida post-mortem results]

In Tennessee, we correctly forecast an easy win for Republican businessman Bill Lee over former Nashville Mayor Karl Dean in the governor’s race. Lee won by a surprising 21 points. The RealClearPolitics average predicted a 14-point win, and our polling showed a nine-point win.

In looking at the polling of the race, it appears most of the undecided voters ended up voting for the Republican candidate, Bill Lee.

[Chart: Tennessee Governor post-mortem results]

The real headscratcher for us was the Tennessee Senate race. Our polling, along with a nearly concurrent poll conducted by East Tennessee State University, indicated a very close race.

Our polling showed former Democratic Governor and Nashville Mayor Phil Bredesen tied with Republican Congresswoman Marsha Blackburn at 47% of the vote, with 6% of likely voters undecided.

[Chart: Tennessee Senate post-mortem results]

On election night, Blackburn won the race easily by 11 points (55% to 44%).

So what happened? It appears undecided voters broke en masse to the Republican candidate (Blackburn) just as they did in the Tennessee Governor’s race.
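A quick back-of-the-envelope check on that explanation, using the poll and result figures quoted in this post (my own sketch, not part of the original analysis), suggests undecideds breaking uniformly Republican accounts for much of the gap, but not quite all of it:

```python
# Final Targoz poll: Bredesen 47%, Blackburn 47%, 6% undecided.
# Actual result: Blackburn 55%, Bredesen 44% -- an 11-point margin.
poll_blackburn = poll_bredesen = 47.0
undecided = 6.0
actual_margin = 11.0

# Even if every undecided voter broke to Blackburn, the poll implies
# only a 53-47 finish: a 6-point margin.
max_break_margin = (poll_blackburn + undecided) - poll_bredesen
print(max_break_margin)  # 6.0

# Closing the remaining gap requires some stated Bredesen supporters to
# have flipped or stayed home (each flip moves the margin by 2 points):
# (53 + x) - (47 - x) = 11  =>  x = 2.5 points
flipped = (actual_margin - max_break_margin) / 2
print(flipped)  # 2.5
```

In other words, a lopsided undecided break gets you most of the way to the observed margin, with late movement or differential turnout plausibly supplying the rest.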

We compared our polling by region versus the preliminary vote totals by region. As you can see below, our margins are close to the actual vote in East and West Tennessee.

[Charts: Targoz post-mortem polling estimates; Tennessee actual vote by region]

So what happened in Middle Tennessee? We had Bredesen up 8 points in Middle Tennessee, but Blackburn won her home region by 6 points. Before serving as governor, Bredesen served two terms as Mayor of Nashville, so it was expected that he would do quite well in the region, which also trends Democratic. Blackburn has lived in and represented Williamson County (just south of Nashville) for many years. However, these results were still a surprise.

We took a look at turnout statewide and in Metro Nashville-Davidson County and Shelby County (Memphis). Statewide, it appears turnout was up 55% over the 2014 midterm elections. In Shelby County, turnout was up only 47% from 2014, and in Metro Nashville-Davidson County, just 50%.

[Chart: Tennessee post-mortem turnout]

At first glance, it appears turnout in Davidson and Shelby, both Democratic strongholds, was a bit lower than expected and lagged the increases we saw across the rest of the state. It will be a few more days until we get precinct-level data, but it looks like turnout in Shelby and Davidson should have been higher, which would have made it a closer race.
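To put rough numbers on that intuition, here is a hedged counterfactual using the turnout increases cited above. The 2014 base vote totals are hypothetical placeholders for illustration, not actual county returns:

```python
# Turnout increases over the 2014 midterm (figures cited above).
statewide_increase = 0.55
county_increase = {"Davidson": 0.50, "Shelby": 0.47}

# HYPOTHETICAL 2014 vote totals -- placeholders only, not real returns.
base_2014 = {"Davidson": 150_000, "Shelby": 180_000}

# Votes "missing" relative to each county matching the statewide increase.
shortfall = {county: base_2014[county] * (statewide_increase - inc)
             for county, inc in county_increase.items()}
print(shortfall)  # Davidson ~7,500 and Shelby ~14,400 fewer votes
```

Even with modest base totals, a 5-to-8-point lag in turnout growth across two large Democratic counties translates into tens of thousands of votes in a statewide race.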

It should also be noted that strong storms roared across the state and the Middle Tennessee area the evening before and during the early morning of election day. Amid power outages and at least one reported death as a result of the storm, turnout could have been negatively affected by the bad weather.

There is one other thing to note about these results.

After the 2016 election, some pollsters and analysts suggested Trump’s surprising win was the result of “shy” Trump voters who were fearful of publicly identifying as Trump supporters in polls and surveys. I should note, we were not surprised by the win; our online polling indicated a close race and a Trump win in 2016.

Regardless, many pollsters chalked up their misses in 2016 to quiet Trump supporters, and to not including enough working-class/blue-collar voters in their polls. To combat this in 2018, most pollsters made changes to their methodologies to ensure voters from all educational backgrounds were included in their surveys, and the 2018 results show some improvement.

However, the Tennessee results, where undecided voters in both races appear to have broken heavily for the Republican candidates, are very concerning to me. If we accept the shy-voter theory, measuring public opinion in the increasingly uncivil environment we face today will become even more challenging. It’s certainly something we will be investigating over the next few months as more data from this election is released.

Hopefully, we can gain some clarity on these issues before the 2020 elections.

Tennessee Poll: The Race for Senate is Too Close to Call

Like much of the nation, turnout for the midterm election in Tennessee is setting new records. Interest and participation in this election are extremely high compared to recent history.

In the race for Senate, Republican Rep. Marsha Blackburn and Democratic former Gov. Phil Bredesen are tied (48% to 48%) among likely and early voters. When undecided likely voters are pressed to make a choice (Leaners), both remain tied (49% to 49%).

In the race for Governor, Republican businessman Bill Lee holds a commanding 9-point lead (52% to 43%) over former Democratic Nashville Mayor Karl Dean. When undecided likely voters are pressed to make a choice (Leaners), Lee’s lead remains at 9 points (53% to 44%).

Among early voters, Blackburn holds a 2-point lead over Bredesen and Lee holds a 6-point lead over Dean.

Election day turnout will play a significant role in the selection of Tennessee’s next Senator.

Roughly 4% to 5% of likely voters appear to be splitting their tickets: backing Lee in the Governor’s race while crossing over in the Senate race to support Phil Bredesen, the relatively conservative Democrat who served two terms as Governor.

As mentioned earlier, the state of Tennessee is on pace to set a record for the highest voter turnout in a midterm election. If turnout mirrors the 2016 presidential election, Blackburn could achieve a narrow win. Among voters who voted in 2016 and have voted or plan to vote in 2018, Blackburn leads by 4 points, and Lee leads by a commanding 15 points.

[Charts: Tennessee Senate and Governor ballots among 2016 voters]

Most public polls show Blackburn and Lee with commanding leads. Based on the results of this poll, it’s possible we will be up late Tuesday night to learn who will represent Tennessee in the Senate next year. Concession speeches in the Governor’s race could occur very early next Tuesday evening.


This online poll was conducted with 802 registered voters from October 28-31, 2018 by Targoz Market Research. Respondents were selected from ProdegeMR’s online panel respondents who were matched to voter records from 2016 and 2012. Of the 802 registered voters in the sample, 480 were identified as likely voters including 228 who said they have already voted.

The results reflect a representative sample of registered voters. Results were weighted for age, gender, region, race/ethnicity, income, and political party. Additional behavioral weighting was also used to adjust for respondents’ propensity to be online.
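For readers who want to gauge the uncertainty on these numbers, here is the standard 95% margin-of-error approximation applied to the sample sizes above. This is my own back-of-the-envelope calculation, not a figure from the poll's methodology statement:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error (in percentage points) for a proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n) * 100

print(round(margin_of_error(802), 1))  # all registered voters: ~3.5 points
print(round(margin_of_error(480), 1))  # likely voters: ~4.5 points
print(round(margin_of_error(228), 1))  # early voters: ~6.5 points
```

Note that weighting typically inflates these figures somewhat (the design effect), so the true uncertainty on the likely-voter subsample is, if anything, a bit larger.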

[The poll was conducted by Targoz Market Research of Nashville, TN and was not commissioned or paid for by any candidate or political organization.] Contact: RandyEllison@targoz.com; RandyEllison on Twitter


Ballot Results: Tennessee Senator

QUESTION: If the Tennessee election for U.S. Senator were held today, would you vote for: (ROTATE)

[Chart: Tennessee Senate ballot results, 2018]

Ballot Results: Tennessee Governor

QUESTION: If the Tennessee election for Governor were held today, would you vote for: (ROTATE)

[Chart: Tennessee Governor ballot results, 2018]

Florida Poll: Tight Races for Senate and Governor Could Yield Another Election Night Surprise

Historically, Democrats hold a narrow advantage among registered voters in Florida. However, Republican turnout is generally higher than Democratic turnout on election day.

It appears 2018 could be a repeat of this phenomenon.

Voice of the Reader Survey Finds Rising Book Prices Are Driving Buyers to Delay Purchases, Buy Used Books, or Use Subscription Services

Targoz Strategic Marketing Announces Availability of
Reading Pulse Survey™

Big news from the Targoz team!  We have announced the immediate availability of the Reading Pulse Survey™. Based on six years of survey research, the syndicated study provides book publishers, agents, and sellers with an accurate picture of readers, and delivers actionable data on what readers want and how to influence them to buy.

 “At its heart, Reading Pulse is a voice of the reader survey,” said Randy Ellison, President of Targoz Strategic Marketing. “While everyone in the industry has focused on transactional sales data and analytics, we wanted to concentrate on the most important part of the industry: readers.”

Surprising Results

The survey asked readers about the genres they read, how much they’re willing to pay for a book, and where they buy their books. The study also includes Author Ranking Brand Scores™, which measure and rank the brand strength of bestselling fiction authors by genres, as well as extensive reader segmentation and demographic data.

“While the number of adults reading books has grown over the past five years and readers tell us they are finding more time to read, we are seeing declines in the number of books purchased by readers,” said Ellison. “Rising prices for print and e-book titles are driving book buyers to look for value. High prices are causing readers to delay purchases or to find lower price titles by buying used books, downloading free and discounted e-books from BookBub and other discount newsletters, or by using subscription services. Readers read, and those who consume the most books are looking for value to feed their reading habit. Amazon understands pricing, and our study proves they understand it far better than most publishers.”  

Reading Pulse Survey™ Availability

The Reading Pulse Survey™ report can be purchased at www.readingpulse.com for immediate download. The annual study is a broad-based consumer tracking study which compares readers’ attitudes, habits, and purchase patterns against the general population. This provides buyers with the most accurate and broad-based assessment of the U.S. market.

Yes, The Election Polls Were Wrong. Here’s How We Fix It.

The cracks in the polling industry have been readily apparent for years and it was only a matter of time before we had a major polling miss in this country. On Monday I offered the following warning about the accuracy of the polls leading up to election day. I knew a surprise was coming thanks to my own online surveys throughout the year.

For years, I’ve been advising clients to move their polls from phone to online. Political and issue-oriented clients have been more resistant to the change, while private firms have embraced the lower cost, quick turnarounds, and improved accuracy offered by online surveys.

Much of the resistance from clients in the public affairs space (i.e., campaigns, associations, public affairs groups, etc.) is due to guidance from groups like FiveThirtyEight, AAPOR, and the AP, which for years maligned many of the publicly available online polls.

Granted, in some cases their skepticism and criticism was warranted. But it was totally unfair and wrong to advise media and organizations to simply avoid “unreliable” online polls. There’s a lot of good online polling being done right now such as the USC Dornsife/L.A. Times tracking poll that correctly predicted Trump’s victory.

Another factor is resistance to change. Too many pollsters are wedded to phone polling due to the revenue streams associated with a methodology they have used for years that gave them significant professional and financial success.

Those days are now over.

People lack the time or patience to answer a 15-minute phone survey, and the respondents who do stick with it are probably not an accurate reflection of any group other than partisans or people who are natural joiners or volunteers.

Consumers have tools to avoid being interrupted by telephone polls via caller-id and blocking technologies just as they successfully avoid TV commercials with subscription services and time-shifting.

The ability to avoid or the lack of desire to participate in phone polls have led to record low response and cooperation rates. This has been common knowledge for years and it biases survey results. Adding cell phone interviews to this stew helps, but is not a cure, because it’s very expensive and difficult to reach the 40% to 50% of the population without a landline phone.

Respondents are simply more honest answering an online survey compared to surveys that are administered by a live interviewer over the phone. This is especially true when testing voting intentions involving two candidates with off-the-chart negative ratings competing in a highly charged media environment.

Short online surveys are the way to go.

Let me add one caveat.

Most online surveys use a panel of pre-recruited individuals or households who have agreed to take part in online market research. Typically, there are not enough of these panelists to conduct a statistically relevant poll in a smaller geography such as a congressional or state legislative district. However, online is a viable and a preferred option for statewide and larger congressional districts.

Okay. So, we should go ahead and convert our phone polls to online?


You can’t just convert your phone poll to an online survey. It’s not that simple. Online respondents require different questions and different methods to interpret the results.

This was the mistake newspapers made in the early days of online news. They simply took their print product, which was declining in readership, and replicated it online. Not taking advantage of the new technology was a huge mistake. It’s the same for polling. Simply taking a phone poll and asking the same questions online is not advisable.

What do I suggest? I don’t pretend to have all of the answers, but these are some ideas that have worked for me.

Shift To Online Polling

There are just too many issues in phone polling, ranging from non-response (i.e., getting a representative sample of people to talk with you) to coverage (i.e., reaching certain segments of the population, such as prepaid cell phone households).

Purchase quality online sample or grow your own online panels. Online panel quality matters. A lot. For example, a panel built from coupon clippers and entrants to online sweepstakes, contests, and giveaways will skew your results — unless that’s your target audience.

Respondents are more likely to be honest when answering an online poll. I’ve tested it, and so have others. Use this to your advantage. For example, respondents are more likely to state their actual household income when answering an online survey. Ask for household income in the survey and then use it to weight the data. Too many pollsters disregard income questions in phone polls, fearing that respondents are less than honest in their answers. If you believe this, then don’t ask the question, or find a methodology, such as online, where people will accurately answer it. Stop wasting people’s time. It reflects badly on all of us in the industry.

Proper Weighting

We know some population groups are over- or under-represented in a survey sample regardless of methodology. We have to do a better job of weighting (i.e. assigning “corrective” values to each one of the sample responses of a survey) regardless of mode (i.e. online, phone, mail, or in-person surveys) to ensure results reflect the profile of our desired audience for our survey.
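As a minimal sketch of what assigning those “corrective” values looks like in the simplest (cell-based) case — all figures here are hypothetical, not from any Targoz poll:

```python
# Post-stratification in its simplest form: weight = population share / sample share.
# The income distribution and support figures below are hypothetical.
population_share = {"<50k": 0.40, "50-100k": 0.35, ">100k": 0.25}
sample_counts    = {"<50k": 500,  "50-100k": 200,  ">100k": 100}

n = sum(sample_counts.values())
weights = {g: population_share[g] / (sample_counts[g] / n) for g in sample_counts}
print(weights)  # under-represented groups get weights > 1, over-represented < 1

# A weighted estimate: multiply each group's mean answer by its total weight,
# then divide by the sum of weights.
support = {"<50k": 0.55, "50-100k": 0.48, ">100k": 0.40}  # hypothetical candidate support
total_weight = sum(weights[g] * sample_counts[g] for g in support)
weighted = sum(weights[g] * sample_counts[g] * support[g] for g in support) / total_weight
print(round(weighted, 3))  # 0.488 -- versus an unweighted ~0.514
```

The two-and-a-half-point gap between the weighted and unweighted estimates in this toy example is exactly the kind of correction that separates a good poll from a miss.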

This election provides a great example. YouGovUS, an online polling firm, polled throughout this election cycle. While they weighted their results across a host of demographic variables, it doesn’t appear they used household income in their weighting scheme. That was a mistake.

As you can see below, their samples skewed lower income; Clinton outperformed with lower-income voters while Trump overperformed with upper-income households. Who knows? If they had factored income into their weighting, perhaps they would have had a better read on this election.

Surveys should be heavily weighted. Using simple demographics such as gender, age, and ethnicity to weight a survey is insufficient. You have to include attitudinal and behavioral measures, in addition to demographics such as income, if you have any chance of getting usable results. Harris Interactive did a lot of great work in this area in the early days of online polling.

Beware Digital and Social Media Signaling

Use digital and social media analytics to augment your polling, not to replace it. Digital plays a role and in some ways replaces qualitative research. However, in the current media environment, language has been weaponized, especially online. The data you collect from social media will reflect the socially desirable aspects of people’s personalities or beliefs. People typically put on their best face online, so it’s really difficult to determine what is real and what is simply social signaling.

Shorter Surveys and Polls

People lack the time or patience to answer a 15-minute survey and the respondents that do are probably not an accurate reflection of the overall population. Use third-party data or purchase demographic data from panel providers for variables you need to weight the data. If you have a lot to test, launch multiple short surveys instead of long surveys that respondents hate.

Vary Your Sampling

For political polling, don’t rely solely on verified registered voter files for your samples. Conversely, don’t rely on samples of the general population using screening questions to determine voting status or intent. Do both.

Pollsters are like generals: they’re always fighting the last war, and every election is different. Chances are, you will miss. Tight screens might work for one cycle and then be absolutely the wrong option in another. Vary your sampling and you will have a wider range of possible outcomes to review and analyze.

There are no easy solutions to cure what ails the polling industry. Technology has given us better analysis tools while making it more difficult to collect data to analyze.

Typically, when faced with these situations, people focus on what they know and keep doing the same things over and over until they’re forced to change.

Our jobs are too important to simply ignore the serious issues we now face or to leave it to the next generation to address. We need to change and evolve instead of deriding or attacking other ideas on how we should do our jobs.

The suggestions above are just that, suggestions. If you have a better mousetrap or idea, I would love to hear it, and I hope others in the industry are ready to be open-minded, too.

Could The Polls Be Wrong? It’s Possible.

Over this final weekend before the election, Nate Silver of FiveThirtyEight has taken fire for being too cautious in forecasting the outcome of tomorrow’s election. Some have charged that his probabilities for a Clinton win are too low. Frankly, I don’t blame him. I would be cautious too.

Silver and crew use a results-validated model to make election predictions. Like any model, the results are only as accurate as its inputs. Garbage in, garbage out.

Silver’s models contain a large number of public phone-based polls that typically include some combination of landline and cell phone interviews. This would make me nervous. Very nervous.

The accuracy of phone polling is declining and has been declining for years for a host of reasons. It’s one reason I’ve shifted everything to online polling. Also, media sponsored public polls are generally less reliable than private campaign polls.

Polling by phone is an increasingly expensive and difficult endeavor. Today, phone polling should include some degree of cell phone interviews to augment standard landline phone interviewing. Adding cell phone interviews to this stew helps, but is not a panacea for reaching the 40% to 50% of the population without a landline phone.

Missing from most cell phone surveys are interviews with voters who are not on a cell phone contract. Pre-paid and pay-as-you-go cell phones are two growth areas in the cell phone industry and at least a quarter of the cell phone population is probably missing from most polls. In the UK and Europe, pre-paid is the bulk of the cell phone market.

Also, including cell phone interviews is very expensive. Legally, cell phone interviews must be hand dialed which increases the labor expense for phone polling and reduces the quantity of available public polls. This is especially true at the state level where local media lack the funds to conduct a series of strong phone polls.

However, simply including cell phone interviews doesn’t cure all the issues faced in polling by phone. Getting a representative sample of respondents willing to complete a survey is difficult. On a good day, we may get 11% of a sample to fully respond to a poll by phone. This is tragically and historically low.

Response rates to surveys also vary by season. Rates are low during holidays (i.e. December) especially among consumer audiences. It’s also difficult to get a representative sample during the summer when people are on vacation, traveling, watching youth soccer and baseball, etc.

October is an equally problematic time for polling. Historically, response rates for October phone-based polls can decline up to 25%. October is the first month of the last quarter of the year and is typically a huge month for business travel. You also have Halloween, a bank holiday, benchmark tests for school students under “No Child Left Behind”, fall breaks, youth sports, etc. October isn’t as bad as December for polling, but it’s not far off. It’s entirely possible that respondents to phone polls in October are not representative of who will show up on election day.

This non-response issue is the likely reason we’ve seen such crazy numbers from the October polls. Take a look at the NBC News/Wall Street Journal polls from October and November. The swings in Clinton’s lead from +6 to +9 to +11 and back to +4 are more likely the result of changes in survey response rates than a genuine change in the voting intentions of the electorate.

[Chart: NBC News/Wall Street Journal poll results]

It’s wild swings like those in the NBC poll, along with a lot of prior research on phone polling, that make me question most of the current phone-based polls. As I mentioned earlier, I’ve shifted to online polling, where I typically see more stable numbers without the wild fluctuations we have seen in the phone polls.

Sadly, there are not that many online polls this cycle. Outside of YouGov and newer entries from Google Surveys and SurveyMonkey, we just don’t have a lot to work with. We also don’t have a lot of information on the source of the online sample used by these groups.

Most online surveys use an online panel of pre-recruited individuals or households who have agreed to take part in online market research surveys. Most are compensated in some way, and the quality of these samples varies. The quality of the respondents matters a great deal, so it’s hard to gauge the accuracy of these polls without knowing who their survey respondents are and how they were recruited.

Two polls widely viewed as outliers this cycle are the IBD/TIPP Tracking poll (Phone) and the LA Times/USC Tracking poll (Online). Both polls have been remarkably stable over the final months of this election and both indicate a much closer race than the majority of other public polls. Also, both polls have given Trump leads over the past few months.

One possible reason for the outlier status of these two polls is the use of weighting.

Weighting is a technique in survey research where survey results are re-balanced to more accurately reflect the population you’re polling. A demographic profile (based on known data such as a census age distribution) is often used to rebalance survey results to better reflect real-world results.
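One widely used way to implement that rebalancing across several margins at once is raking (iterative proportional fitting). Neither poll publishes its exact scheme, so the respondents and targets below are made up purely for illustration:

```python
# Raking (iterative proportional fitting): repeatedly rescale weights so the
# weighted margins match known population targets. Toy data for illustration.
respondents = [
    {"age": "18-44", "sex": "F"}, {"age": "18-44", "sex": "M"},
    {"age": "45+",   "sex": "F"}, {"age": "45+",   "sex": "F"},
    {"age": "45+",   "sex": "M"},
]
targets = {"age": {"18-44": 0.45, "45+": 0.55},
           "sex": {"F": 0.52, "M": 0.48}}

weights = [1.0] * len(respondents)
for _ in range(50):                      # alternate over variables until margins converge
    for var, target in targets.items():
        total = sum(weights)
        current = {lvl: sum(w for w, r in zip(weights, respondents)
                            if r[var] == lvl) / total
                   for lvl in target}
        weights = [w * target[r[var]] / current[r[var]]
                   for w, r in zip(weights, respondents)]

age_share = sum(w for w, r in zip(weights, respondents)
                if r["age"] == "18-44") / sum(weights)
print(round(age_share, 2))  # 0.45 -- matches the age target
```

Production weighting schemes add many more margins (income, education, party, past vote) and trim extreme weights, but the core loop is this simple.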

From what I can tell, both the IBD/TIPP Tracking poll and the LA Times/USC Tracking poll heavily weight their results beyond measures such as gender, age, party affiliation, and ethnicity. If their use of weighting cures some of the deficiencies in terms of non-response (i.e. households who don’t respond or answer a phone poll) and coverage (i.e. prepaid cell phone households not in the cell phone samples), they might not be outliers after all.

This is especially true for the LA Times/USC Tracking poll which has consistently given Trump a better chance of winning than pretty much any other polling organization. This poll is conducted online, uses a different set of questions than most “traditional” polls, and weights the results across a broad range of measures. This poll is conducted among a sample of respondents that were recruited specifically for this type of research. Their methodology could be an antidote for the problems with traditional polling that I mentioned earlier, or it could be a major factor in being very wrong about this election. They are basically in a go big or go home situation that I totally support. Stick to your guns and if you are wrong, tell us why.

There is some recent precedent for properly weighted online polls outperforming phone polls: namely, Brexit.

Online polls outperformed phone polls in the UK’s Brexit vote earlier this year. Phone polls tended to show a win for the “Remain” camp. But online polls by TNS UK and Opinium Research accurately predicted a “Leave” win, and both were viewed as outliers by most pundits. Both also used extensive weighting to ensure their results properly reflected the views of potential voters.

It’s entirely possible that the phone polls of this election could be wrong, and if they are, a large component of Silver’s data-driven election model will also be wrong. Nate Silver and crew are right to be cautious. It’s not out of the realm of possibility that we are shocked by the election results tomorrow night.

Based on the public and private polling I’ve tracked and conducted, I can easily envision three very different scenarios occurring tomorrow night.

Scenario 1: Clinton cruises to an easy three- or four-point win in the popular vote and handily wins the electoral vote.

Scenario 2: The election is exceptionally close late into the evening, and we are up waiting for results from Colorado, Arizona, and Nevada to find out the winner.

Scenario 3: Trump wins the popular vote but loses the electoral vote.

Scenario three is my nightmare situation, leaving everyone unhappy and providing zero closure to an overly-stressed electorate.

Conventional wisdom has been wrong all year, and I’m totally expecting some type of surprise tomorrow night. It could be a decisive win by Clinton, a close win by Trump, or a deadlocked election.

So forgive me if I don’t jump on the Nate Silver bashing bandwagon. This has been a crazy year and I expect tomorrow to be the same. I just hope we can learn something from this whole episode that helps us learn more about our fellow citizens and how to properly capture their actual thoughts, concerns, and beliefs for future elections.

Journalists Are Out of Touch With Reality

Headlines coming out of the third and final 2016 presidential debate all followed a similar theme: Trump will not commit to accepting the election results if he loses.

The four largest newspapers in the country ran front page headlines that adhered to the same script, echoing post-debate comments from broadcast commentators:

  • Washington Post: “Trump Refuses To Say Whether He’ll Accept Election Results”
  • The Wall Street Journal: “Trump Won’t Commit to Accepting Vote if He Loses”
  • New York Times: “Trump Won’t Say if He Will Accept Election Results”
  • USA Today: “Keep You In Suspense: Trump Won’t Commit To Accepting Vote Results”

Out of a ninety-minute debate, viewed by more than 71 million Americans and covering a variety of topics, this was the major takeaway?

Despite the blaring and repetitive headlines, Americans’ view of this election and whether it’s rigged is unlikely to be shaped by an increasingly out of touch and out of time press.

The belief that Trump’s failure to commit to the outcome of the election was a major gaffe simply illustrates just how out of touch major news organizations are with the country they cover.

Americans increasingly believe the system(s) is rigged against them and view corruption as widespread in government and business. Few will be troubled by Trump’s “rigged” rhetoric since most already believe corruption is a huge problem in the country.

According to Gallup, three in four Americans (75%) perceive corruption as widespread in the country’s government. While this number is from 2014, it’s been steadily increasing since 2007.

So what do Americans fear most? Corrupt government officials. According to the 2016 Chapman University Survey of American Fears, six in ten Americans (61%) identified corrupt government officials as their top fear, eclipsing both terrorist attacks (41%) and not having enough money for the future (40%). Government corruption was also their top fear in 2015.

Americans’ view of government and government officials is dismal, and their assessment of the economy and businesses is not any better.

Seventy-one percent of Americans think the U.S. economic system is “rigged in favor of certain groups,” according to a June 2016 poll conducted by Marketplace and Edison. This belief was shared regardless of political affiliation or ethnicity.

In a follow-up poll conducted in October 2016, nine out of ten Americans who believed the economic system was rigged in favor of certain groups agreed the U.S. economic system is rigged to benefit politicians (89%) and corporations (86%).

Amidst this backdrop of distrust, why do journalists think voters care whether Trump accepts the outcome of the November election? They probably don’t. If anything, they probably agree that it is rigged.

And as much as the legacy press highlights “the gaffe”, their ability to influence voters’ opinion on this topic is increasingly weak. Americans view the press just as negatively as the aforementioned government officials.

In a poll conducted in 2016 by the Media Insight Project for the American Press Institute, less than one in ten Americans (6%) have a great deal of confidence in the press. Four in ten (41%) Americans said they have hardly any confidence at all in the press. In the same poll, more than one out of three (38%) Americans said they have had an experience with a news and information source that made them trust it less.

One of the great challenges ahead is finding a way to restore faith in institutions and most importantly, the press. Sadly, the damage will take decades to repair. These types of initiatives are often generational and will require the passing of significant time before peoples’ memories of this era fade.

The first step to restoring faith in the press is for the media to admit there’s a problem. Second, media organizations need to accept reality and understand how the public actually thinks.

Too many journalists and thought-leaders view the country as they wish it to be and have segregated themselves into hive minds that shelter them from opposing opinions.

Journalists and legacy media are some of the worst offenders, often opting to challenge the grammar of a dissenting voice rather than understand the beliefs and judgments of the messenger. Granted, sheltering yourself, regardless of your profession, against the trolls and vitriol of social media is tempting. However, it skews your view of the country, which is a big problem if you’re a journalist.

Without trust in the press, it’s unlikely faith in any government or political institution will be repaired anytime soon. The divisions and distrust we see today will only continue to grow.

It’s high time journalists and the media return to reality, perhaps survey their audiences to learn how they think, focus coverage on issues the public actually cares about, and then report it accurately.

New Online Voter Panel for Political Polling?

Research Now Group Inc, headquartered in Plano, TX, just launched a new voter panel. The panel gives political pollsters a new tool to measure American voters’ perceptions of various issues. It provides insight into voters’ opinions of candidates, voter turnout, key campaign issues, and the perceptions of millennials.

This panel gives researchers access to more than 600,000 deeply profiled, verified voters from every state. Researchers can pick constituents based on party affiliation, historical election turnout, and congressional district among other variables. Panelists can participate in surveys via varying platforms (mobile, tablet, or PC), so all voter populations will be represented.
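To make the panel-selection idea concrete, here is a minimal sketch of filtering a voter panel by profile attributes. The field names (`party`, `elections_voted`, `district`) and the sample records are invented for illustration; this is not Research Now's actual data model or API.

```python
# Hypothetical sketch: select panelists matching a requested voter profile.
def select_panelists(panel, party=None, min_turnout=None, district=None):
    """Return panelists matching all supplied profile criteria."""
    selected = []
    for p in panel:
        if party is not None and p["party"] != party:
            continue
        if min_turnout is not None and p["elections_voted"] < min_turnout:
            continue
        if district is not None and p["district"] != district:
            continue
        selected.append(p)
    return selected

# Invented sample records standing in for a deeply profiled panel.
panel = [
    {"id": 1, "party": "R", "elections_voted": 3, "district": "TN-07"},
    {"id": 2, "party": "D", "elections_voted": 1, "district": "TN-05"},
    {"id": 3, "party": "R", "elections_voted": 0, "district": "TN-07"},
]

sample = select_panelists(panel, party="R", min_turnout=2, district="TN-07")
print([p["id"] for p in sample])  # [1]
```

In practice a researcher would layer on many more variables (vote history by election, device type, demographics), but the mechanics are the same: intersect filters over verified voter attributes.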

Research Now has identified hundreds of thousands of voters who are historically hard to reach, including 70,000 millennials and other voters with no publicly available phone numbers. This panel is the largest of its kind and marks a new step forward in polling. In recent years, polls have faced significant challenges, largely due to changes in phone use, caller ID screening, and similar factors. Many voters, especially millennials, do not use landline telephones, so it is difficult to get accurate data. This new panel from Research Now provides a way to access this hard-to-reach population.

While this new panel is a step forward in polling, it is not necessarily the end point for polling improvement. In a previous blog post, I stated that the future of public opinion research lies in a variety of new methods. Social media analysis is one way to gain access to millennials and see their voting preferences, but it is also complicated and not always reliable. Biometric technology can help researchers better understand voters’ tendencies and opinions. A combination of these methods, alongside panels like the new Research Now panel, will be a key part of the public opinion researcher’s toolkit, and it will be interesting to see what other technology emerges as the 2016 race for president continues.

How to Make Customers More Comfortable with Data Collection

Gigya, a customer identity management firm in Mountain View, California, has just released a report about consumers and privacy. This report shows that customers are increasingly concerned about their privacy. This shouldn’t come as a surprise, especially after customers were outraged over Samsung’s Smart TVs. These TVs have the capability to listen to voice commands and then change the channel or turn up the volume automatically, but there is a catch. For this to work, the televisions were listening to everything customers were saying and then sending that information to a third party to have it translated and turned into a command. As expected, people weren’t so happy that their conversations were being recorded and sent to a different company for analysis. In the wake of this incident, Gigya proposes two ways to keep customers satisfied with customized content but still comfortable with data collection.

First, Gigya suggests a focus on collecting first-party data. This means that you should only use information that customers actually provide you. Instead of using information from data-brokers, getting information directly from the consumer allows your customers to feel more comfortable with your brand. First-party data is not only more accurate, but it respects your customers’ privacy. Some companies, like McCormick, are taking first-party data collection a step further and allowing customers to go in and customize their tastes specifically so they can have their own “flavor print.” McCormick then uses this information to suggest recipes and spices to try, which leaves the customers with a positive and customized experience.

Secondly, Gigya suggests being completely transparent about how you will use the customer’s data. Almost half (45%) of customers say they are more willing to give their information if a company makes it clear how they will use the data. What’s more, 80% of respondents reported leaving sites or closing registrations because they were concerned about the type of information requested.  Clearly, telling the customer exactly what the data is for and asking only for data that you need is important to make customers feel comfortable and trust your brand.

Customers’ concern for privacy is certainly not going to disappear anytime soon. It is likely that customers will become even more guarded and less trusting towards data collection as the age of digital technology continues to evolve. It is important, then, to establish a trusting relationship between your company and your customers now. Show your customers what you need their data for, collect it directly from them, and then use it in a way to give them a customized, uniquely-curated experience. That should help you make customized products or services that customers will feel comfortable using again and again.  

TV Streaming: Netflix vs Hulu vs Amazon Prime

Do you find yourself watching Netflix often these days? What about Hulu or Amazon Prime Instant Video? According to our Reading Pulse data, the share of people who regularly watch television programs on online streaming platforms like Netflix and Hulu increased 98% from December 2010 to April 2015. Unsurprisingly, millennials are the most likely to regularly watch television programs this way. This raises the question: are all streaming platforms the same?

According to a new study by iModerate, customers prefer Netflix over both Hulu and Amazon Prime Instant Video, so no, not all platforms are equal. Consumers view Netflix as a cable replacement whereas both Amazon and Hulu face “delivery and brand challenges.” It seems that customers like the wide array of content and uninterrupted viewing on Netflix. Netflix is also becoming part of the social scene for many customers, and having a “Netflix night” is the new normal. People like to compare what shows they have binge-watched with their friends as well as talk about movies and shows together. Netflix is now ingrained in popular culture.

Consumers dislike Hulu’s commercials, which are compulsory even with the paid service. According to iModerate’s data, people are eager to try Hulu, but mostly to watch a specific TV show. Once they watch their show, viewers are not very inclined to search for additional or original Hulu content. Hulu, though, does offer shows that are not offered on Netflix, and Hulu has brand recognition.

Amazon Prime Instant Video gets overlooked because customers feel Amazon does not distinguish instant videos from their Prime service. Consumers forget or are unaware that Amazon offers a streaming option. The iModerate data shows that many consumers feel as though Amazon videos lack value or defining characteristics. Those customers who were aware of Amazon Prime Instant Video reported that it was slow, short on value, and not something they would use if they had to pay for it.

Overall, video streaming has increased dramatically in the past five years. This trend is expected to continue and as of now, it seems as though Netflix will be the main beneficiary. Perhaps Hulu can give Netflix a run for their money if it can solve the commercial dilemma and introduce a first-time-user friendly interface. Amazon Prime Instant Video should perhaps work on branding as well as improving content and speeds. Netflix is “King of the Hill” for now, but we will keep you in the loop if anything further develops. 

Did Apple Change the Music Game Yet Again?

Apple launched its new all-in-one music service last week, Apple Music. The reviews are in, and it seems like Apple might be onto something, if they can clean up a few problems. While Apple has the upper hand on in-house playlist creation and recommending music, they currently have too many problems to give the other streaming services a run for their money.

Reports have shown that there are issues with the social component, the user interface, playlist length, and system bugs among other things. However, the real genius from Apple Music is in Apple’s ability to predict what songs, artists and playlists to recommend for you. Apple Music is now the place to go to discover new music. How does Apple excel at giving users exposure to their new favorite songs, artists, and playlists?

Mostly, Apple uses the Beats Music interface and curated playlists to provide users with this experience. Apple acquired Beats Music in 2014 and is relying heavily on the Beats music streaming service. When users first set up Apple Music, they tap genres and artists that they like on large bubbles, as shown in the image to the left. After picking their favorite artists and genres, new music is generated for users’ specific tastes.

The algorithm that Apple Music uses can generate songs, artists, and playlists for specific tastes much more accurately than Spotify, Pandora, or Google Music.

Apple Music also has specially curated playlists that, again, come from the Beats service. These playlists are hand-created by editors, artists, and people called “curators.” Each playlist is targeted specifically towards a user’s tastes. These playlists and Apple’s ability to recognize what songs or artists users will like are what make Apple Music stand out in the streaming world.
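As a rough illustration of the general idea, here is a toy sketch of taste-based recommendation that scores songs by how many of their tags overlap with a user's chosen genres and artists. This is a simplification for illustration only, not Apple's actual algorithm; the catalog and tags are invented.

```python
# Toy content-based recommender: rank songs by overlap with user tastes.
def score(song_tags, user_tastes):
    """Score a song by how many of its tags match the user's picks."""
    return len(set(song_tags) & set(user_tastes))

def recommend(catalog, user_tastes, top_n=2):
    """Return the titles of the top-N songs ranked by taste overlap."""
    ranked = sorted(catalog, key=lambda s: score(s["tags"], user_tastes),
                    reverse=True)
    return [s["title"] for s in ranked[:top_n]]

# Invented catalog standing in for a streaming library.
catalog = [
    {"title": "Song A", "tags": ["rock", "indie"]},
    {"title": "Song B", "tags": ["pop"]},
    {"title": "Song C", "tags": ["rock", "blues"]},
]

print(recommend(catalog, ["rock", "blues"]))  # ['Song C', 'Song A']
```

Real services combine signals like this with listening history, skips, and human curation, but the initial genre/artist bubbles give the system exactly this kind of seed profile to match against.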

It looks as though Apple’s ability to recommend music sets it apart, but will that be enough to convince users to switch over? Will finding my new favorite band, song, or playlist be enough to offset a bad user interface, bugs, and a hard-to-use product? Time will tell, but I predict that Apple Music will improve drastically over the next three months, and it may have the potential to compete with the other music providers.

Ditto Finds New Way to Observe Customer-Brand Interactions

“Ditto” is a way to quickly show support and agreement for something that someone has said. Well, I say “ditto” to what the company Ditto Labs, Inc. is saying and doing. Ditto Labs is a startup that was built on technology from MIT-trained computer scientists. Ditto’s photo-analytic software scans public photos on social media platforms and recognizes facial expressions, products, clothing, logos, brands, and scenes.

This data can help companies see how customers are actually using their products, when they are using the products, and what other products are being used in conjunction with their product. This will help marketers get a better feel for how to market the product or brand. Ditto can also give ideas about sponsorships or partnerships with companies that are mentioned alongside the target product or brand.

Ditto can be used to engage directly with those who post photos of your brand. This is an easy way to find and communicate with people who are passionate about your product or service. Further, you can ask these consumers if you can use the best photos in galleries and marketing campaigns. Ditto also provides an easy way to ad-target. Many customers do not follow brands on social media even if they use the brand’s products. Ditto can give your company a list of people who share photos with your brand, and then you can target those customers specifically to engage them on social media. Here is a video to demonstrate more of what Ditto can do.

Ditto recently made a presentation at the Insight Innovation eXchange (IIeX) conference in Atlanta that showcased which brands were connected to other brands. Unsurprisingly, Coke was the most frequently seen and most connected brand. Car brands are not highly connected to each other, but are often found in pictures with beverage brands, perhaps due to sponsorships. Alcohol and soda brands are often mentioned together, specifically whiskey and Coke. These key insights are part of why Ditto can be influential in the marketing world. To look closer at their presentation, you can view their slideshare here. Always on the lookout for new ways to research and gain information for clients, I think Ditto might be an innovative way to glean insights from consumers’ everyday lives.
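The brand-connection analysis behind insights like "whiskey and Coke appear together" boils down to co-occurrence counting: once photos are tagged with the brands they contain, you count how often each pair of brands shows up in the same image. Here is a minimal sketch with invented photo data; the brand tags would come from photo-recognition software like Ditto's in practice.

```python
# Count how often each pair of brands appears in the same photo.
from collections import Counter
from itertools import combinations

def cooccurrence(photos):
    """Tally brand pairs across photos (each photo is a list of brands)."""
    pairs = Counter()
    for brands in photos:
        # Sort so (A, B) and (B, A) count as the same pair.
        for a, b in combinations(sorted(set(brands)), 2):
            pairs[(a, b)] += 1
    return pairs

# Invented tag lists standing in for recognized brands in photos.
photos = [
    ["Coke", "Jack Daniel's"],
    ["Coke", "Jack Daniel's", "Ford"],
    ["Ford", "Pepsi"],
]

counts = cooccurrence(photos)
print(counts.most_common(1))  # [(('Coke', "Jack Daniel's"), 2)]
```

From a table like this, a marketer can spot candidate sponsorships or partnerships by looking at which brand pairs co-occur far more often than chance would suggest.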

VoxPopMe Launches Theme Explorer

Video is becoming more and more popular on social media platforms, but how about incorporating video into market research? VoxPopMe, based in Birmingham, England, is a leading platform for video responses in market research. There are a couple of different ways firms can use VoxPopMe services.

Companies can embed a link from VoxPopMe straight into a survey that will allow respondents to record a 15 – 60 second video of themselves answering a question. The idea is that this response can take the place of a traditional blank text box and will lead to more thoughtful and insightful responses. With the VoxPopMe video in your survey, respondents will not have to go to another site or open a new window; the video can be recorded right on a laptop or smartphone within the original survey.

Additionally, companies can pose questions on the VoxPopMe app and reach more than 10 million global respondents. To check it out, I downloaded the app and answered some of the questions. The app is very easy to use and shows you a prompt as well as follow-up questions for your video response. Your response must be at least 15 seconds and no more than 60 seconds, and the app will prompt you if you need to get brighter lighting or speak louder. For some questions, respondents can be paid for their answers. This incentive is thought to make for more thoughtful and meaningful answers. The incentives, however, are not high, with each question valued at about $0.75 per video. I think VoxPopMe has a bright future, especially in regards to brand reviews and opinions about ads.

VoxPopMe just unveiled a new and very exciting feature, “theme explorer.” I spoke with Dean Macko, the managing director of the North American branch of VoxPopMe, to hear more about this new feature. Essentially, this technology can analyze all responses to a specific question and then tell you which themes and key insights are important and recurring. It can show the key themes from all of the responses so that you have an idea of what to look for before watching some or all of the videos. Instead of having people go through all the responses and code them, theme explorer will do all the legwork in much less time. VoxPopMe also analyzes responses based on sentiment so that you can see at first glance whether a respondent is positive, neutral, or negative. There is also a system in place to bring the best videos to the top of the pack. When I asked Mr. Macko what makes a video the “best,” he emphasized that video and audio quality is the most important part. The best videos also answer all of the questions, including prompt questions, and use most of the allotted time.
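To give a flavor of what automated theme and sentiment analysis involves, here is a deliberately minimal sketch: count recurring keywords across response transcripts and tally a crude lexicon-based sentiment. The word lists and transcripts are invented, and real tools like theme explorer are far more sophisticated; this only illustrates the basic mechanics.

```python
# Minimal theme extraction and sentiment tagging over transcripts.
from collections import Counter

POSITIVE = {"love", "great", "easy"}      # toy sentiment lexicons
NEGATIVE = {"hate", "slow", "confusing"}
STOPWORDS = {"i", "the", "it", "is", "a", "and"}

def themes(transcripts):
    """Return the three most frequent non-stopword terms overall."""
    words = Counter()
    for text in transcripts:
        for w in text.lower().split():
            if w not in STOPWORDS:
                words[w] += 1
    return words.most_common(3)

def sentiment(text):
    """Classify one response as positive, negative, or neutral."""
    words = set(text.lower().split())
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    return "positive" if pos > neg else "negative" if neg > pos else "neutral"

# Invented transcripts standing in for video-response speech-to-text.
transcripts = [
    "I love the app it is easy",
    "the app is slow and confusing",
    "great app love it",
]
print(themes(transcripts))
print([sentiment(t) for t in transcripts])  # ['positive', 'negative', 'positive']
```

A production system would work from speech-to-text transcripts and use much richer language models, but the payoff is the same: surfacing recurring themes and overall sentiment without a human watching every video.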

All in all, VoxPopMe can embed video responses into your survey and then analyze the responses, show you overarching themes, bring the best videos to the top, and show you the sentiment of responders at first glance. I think that VoxPopMe is in a position to make video research approachable and affordable in the market research industry.

Contact us to learn more about VoxPopMe and how we can help you with your research.

Wearables and Market Research

Google and long-time clothing producer Levi Strauss Co. have just partnered up to produce a whole new kind of fabric: a “smart cloth.” Called Project Jacquard, after the inventor of the programmable loom, this interactive thread can be woven into any fabric on an industrial loom. This means the technology is easy to produce and could become widespread. These interactive threads currently function like a touchscreen on a phone. They can detect someone swiping or moving their fingers and can connect with other technology, like a smartphone. This means we might soon have another way to answer our phones or snooze our alarms.

Google and Levi are not the first brands to come out with “smart” fabric. Clothing brand Athos has embedded wearable sensors for heartrate, breathing rate, electrical activity generated by muscles (EMG), and more into workout clothes. The idea is that all of this information can be displayed for the user so they can better maximize their workout.

Recently, researchers at the University of California, San Diego were granted $2.6 million to develop smart clothes that help regulate body temperature. By using polymers that expand and shrink, their idea is to make a lightweight, washable, easy-to-use shirt that can thicken if the room gets colder or thin out if the room gets warmer. This will cut down on electricity use and heating and cooling costs. The technology is still in the very early stages, but if it is developed as they hope, it could considerably help during natural disasters like the heat wave recently seen in India.

While Google is certainly not the first company to bring technology into fabrics, they are entering the market with new boundaries to push. As shown, other “smart clothes” use sensors or polymers in their fabrics. Google is working with threads that have microchips in them. These fabrics can be programmed to do almost anything. While Google is designing the software and will provide support, other designers will be in charge of the actual products. Levi’s, for one, will get their chance to use this new software in an exciting way. Perhaps they will embed a game onto the sleeve of a shirt, or maybe embed a TV remote into the arm of a sofa. Google will remain an interested partner, but the designing is left to other companies, who may have a better sense of what the market is ready for and what customers want. We will see if this new wearable tech leads to a touchscreen integrated into a shirt, new remotes embedded into a sofa, or even quicker doctor visits thanks to shirts that measure all vital signs.

In terms of the market research industry, this new technology could work hand-in-hand with biometrics to better measure responses to a myriad of things. Responses to commercials, brand messages, and advertisement campaigns could be tested more efficiently with this new technology. Wearable technology will be able to measure heart rate, breathing rate, and potentially other factors that are important physiological changes that come along with someone either liking or disliking a message. If we could use this wearable technology in conjunction with biometric measures like facial expression analysis, we will be able to get a better feeling for how customers actually react to a commercial, product, or branding message. Time will tell, but I think the combination of “smart clothes” with biometrics will soon become commonplace for market researchers.

Instant Articles: Facebook's Publishing Platform

Facebook has now launched a new feature called “Instant Articles.” These articles are provided by sources like BBC News, National Geographic, and Buzzfeed and are more immersive and interactive than what is currently on your newsfeed. The new platform not only loads content “instantly” (or at least much faster than before), but also lets readers zoom in on pictures, hear the author narrate captions, and watch auto-play videos.

Facebook hopes Instant Articles will change the way we interact with content. Instead of clicking on a link, going to the National Geographic article, and seeing only a large picture without much detail, Instant Articles gives readers the ability to immerse themselves in the piece in a more interactive way. Facebook has enabled its nine current partners (The New York Times, National Geographic, Buzzfeed, NBC News, The Atlantic, The Guardian, BBC News, Spiegel Online, and Bild) to keep tracking audience metrics and to keep their current advertisements on the articles. Thus, Instant Articles could be good news for publishers: the content will likely get more engagement while publishers retain their audience metrics and their own advertising. To see the first Instant Article from The New York Times, follow this link.

Instant Articles is not the only new innovation coming from Facebook. Recently, Facebook announced they’re going to ‘up the ante’ with their buying and selling pages. Currently, people who are part of these buy-and-sell pages are able to post a location, description, price, and photos of the item for sale. Now, Facebook is moving to the next level by introducing a new “all sales groups” option for users. A user belonging to multiple buying and selling groups can now see items for sale from all groups in one convenient place. This new page also hosts a search bar so that you can easily see who has a “sofa” or “coffee table” available. This page puts Facebook in direct competition with companies like eBay and Craigslist. It will be interesting to see how consumers feel about purchasing goods via Facebook and whether this innovation will drive consumers away from eBay or Craigslist. According to The Next Web, Facebook will be testing this new page soon.

Facebook has always been an innovative company, so these new advances are no surprise. It will be interesting to see where Facebook goes in the future – whether it’s faster video content, additional service pages, more partners for instant articles, or virtual reality, we will be waiting to see what Facebook does next.

Polling: What Can We Learn From the UK?

While I do not normally follow British politics, I am interested in the opinion polling around their May elections. Polling is heavily relied on during campaigns so candidates know where they stand with the public and where they could improve. It also provides content for desperate reporters and news agencies looking to fill time and column inches.

The accuracy of pre-election polls is important not only for people who want to know who is currently in the lead, but also for political campaigns searching for an edge for their candidate.

The opinion polls leading up to this year’s UK elections were particularly inaccurate. Nearly every popular poll had the Conservative and Labour parties within one point of each other. The polls indicated that this election would likely produce a “hung” parliament, with no party holding a majority of seats in the UK’s parliamentary system.

What actually happened is that the Conservative Party won a slight majority of seats. The Conservatives, led by David Cameron, secured 331 seats, which puts them in the majority (326 seats are needed for a majority). Labour secured 232 seats, the Scottish National Party (SNP) secured 56, the Liberal Democrats held 8, the United Kingdom Independence Party (UKIP) now has 1, and other parties make up the remaining 22.

Clearly, what actually happened is very different from the neck-and-neck dead heat that the polls predicted.

So, what happened? What went wrong with the polling?

Multiple sources (FiveThirtyEight, Telegraph, and The Conversation) have ascribed the misses to a failure to sufficiently account for the documented late swing toward the incumbent party (the Conservatives). This swing is something that traditionally happens in UK elections. According to Leighton Vaughan Williams’s article on The Conversation, another problem involved an overestimation of the number of people who would be voting. The Conversation also points to the methodology of pollsters. Most of these polls supplied only party names (Conservative, Labour, etc.) instead of actual candidate names, which tends to “miss a lot of late tactical vote switching.” The late swing of votes, inaccuracies in turnout estimates, and issues with the pollsters’ methodology are all possible explanations for why the polls were so inaccurate.

Granted, polling UK voters is a historically difficult task. The polls in the 1992 election were even more inaccurate, and history repeated itself in 2015.

So, what does this mean for the future? Is this a harbinger for our elections in 2016?

It’s no secret that traditional polling methods are quickly becoming outdated. According to MPR news, political polling is evolving to monitor social media usage along with social media analytics. Another type of emerging technology in campaigns is biometrics.

While some countries have started to use biometrics at polling stations to help with voter identification, biometrics has the potential to do more. Using biometrics for polling purposes can make research more effective, since it measures how strongly a specific person agrees with a statement or question, or how much they want to vote for a candidate. Even though this technology is new and still in development, I think it will change the accuracy and landscape of campaign research. The US presidential race of 2016 is sure to demonstrate some new polling methods, and it will be a good opportunity to observe what does and does not work in a rapidly changing industry.

Businesses Overestimate Their Customer Service Prowess

Most businesses and C-Suite occupants think they truly understand their customers. We typically find that many of these business leaders also believe they provide an optimal experience for their consumers.

This is rarely the case, and it appears we are not alone in encountering this issue. 

In a survey of customers and businesses, Millward Brown Digital discovered a significant gap between how businesses believe they care for customers and how valued consumers actually feel.

While three out of four businesses believe they provide an optimal experience for consumers, just 36 percent of customers in the survey say they feel cared for.

This is pretty common, especially among larger organizations. We find the greater the distance between management and the customer, the greater the discrepancy in understanding consumer needs, experience, and loyalty.

If you are doing employee surveys, we recommend you test perceptions among employees from all levels of authority on how consumers view your customer service and product/service value. Once you have these numbers, compare these results with your customer loyalty surveys and see how closely management is in alignment with the consumer.
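The comparison described above amounts to simple arithmetic: average the employee perception scores, average the customer scores, and look at the difference. Here is a small sketch with invented survey numbers; the questions and scale are illustrative, not from the Millward Brown study.

```python
# Hypothetical sketch: measure the gap between how employees think
# customers are served and how customers say they feel (1-10 scale).
def alignment_gap(employee_scores, customer_scores):
    """Return mean employee rating minus mean customer rating."""
    emp = sum(employee_scores) / len(employee_scores)
    cust = sum(customer_scores) / len(customer_scores)
    return round(emp - cust, 2)

# Invented responses: "How well do we serve customers?" vs.
# "How well does this company serve you?"
employees = [9, 8, 9, 7]
customers = [5, 6, 4, 7]

print(alignment_gap(employees, customers))  # 2.75
```

A gap near zero suggests management and customers see the experience similarly; a large positive gap is the warning sign described above, where the company believes it delivers far more than customers feel they receive.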

Strong and successful organizations are usually in alignment with their customers. Firms on the cusp of trouble typically think they provide an optimal customer experience, while their consumers rate their experiences very differently.

Millward Brown surveyed 1,650 mobile phone users over age 18 in the U.S., U.K. and Australia in September 2014 for Mblox. To see their results, you can get the information from Mblox here.