With SurveyUSA releasing statewide polling yesterday, it is worth looking at their most recent experience in California to see whether they learned anything from that “sobering” experience.  In the San Diego Mayoral primary and general elections, SurveyUSA released several polls, and each time PDI pointed out, here and here, that their baseline numbers were way off.  Their projected turnout among young people was sky-high, their Latino turnout was roughly double expectations, their share of independent voters was far too large, and other key demographics gave us concern.  Any campaign making strategic decisions based on those voter universes would have dramatically missed the mark.

In the end, the results were so far off that we even did a little gloating:

So we were surprised today to see that the newest SurveyUSA poll suffers from all of the same problems as the last.  The following table shows our current likely-voter universe breakdown (using 14P3), the participation rates among current voters in the 2010 and 2012 primaries, the SurveyUSA totals, and the error we see in their base sample relative to the average of the three PDI data sources.  All figures are percentages of the electorate.

| Group | PDI 2014 Likely | 2012 Primary | 2010 Primary | SurveyUSA | Avg Error |
|---|---|---|---|---|---|
| DEMOCRAT | 45 | 44 | 43 | 47 | +3 |
| DEMPLUS | 50 | 49 | 48 | | |
| REPUBLICAN | 38 | 38 | 40 | 29 | -10 |
| REPPLUS | 41 | 41 | 44 | | |
| OTHER (NOT DEM OR REP) | 18 | 18 | 17 | 23 | +6 |
| OTHERPLUS | 9 | 10 | 8 | | |
| AGE 18-24 | 7 | 10 | 10 | 18 | +9 |
| AGE 25-34 | 17 | 17 | 19 | 26 | +8 |
| AGE 35-44 | 35 | 35 | 37 | 34 | -1 |
| AGE 65+ / NO AGE | 41 | 38 | 33 | 22 | -15 |
| LATINO | 14 | 13 | 12 | 30 | +17 |
| ASIAN | 8 | 8 | 8 | 13 | +5 |

As this table shows, there are some dramatic errors in their base sample, including:

:: The poll has Latino turnout at 30%.  That is about twice the highest projection for Latino turnout in the primary, and nearly three times the Latino share of the absentee ballots returned so far.

:: They have again created a sample that skews improbably young.  Their age groupings put 65+ voters at 22% of the electorate, yet Seniors already account for 50% of the ballots cast.  If Seniors hold at 50% of the absentee vote, and then not one Senior voted at the polls, SurveyUSA would still have under-counted them, as the arithmetic sketched below shows.
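
To see why that bound holds, here is a back-of-the-envelope check.  The absentee share of total turnout is a hypothetical assumption for illustration; the Senior shares come from the ballot returns and the poll cited above:

```python
# Back-of-the-envelope floor on the Senior share of the final electorate.
# ASSUMPTION for illustration: absentee ballots end up as 60% of all votes
# cast (recent California primaries have been well above half vote-by-mail).
absentee_share_of_total = 0.60   # hypothetical, for illustration
senior_share_of_absentee = 0.50  # from the ballot returns cited above
senior_share_at_polls = 0.00     # worst case: not one Senior votes in person

senior_floor = (absentee_share_of_total * senior_share_of_absentee
                + (1 - absentee_share_of_total) * senior_share_at_polls)

print(f"Senior floor: {senior_floor:.0%} vs. SurveyUSA's 22%")  # 30% vs. 22%
```

On these assumptions the Senior floor is 30%; more generally, any absentee share above 44% of total turnout (0.22 / 0.50) is enough to push the floor past SurveyUSA’s 22%.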

:: The entire basis of their polling sample is that 61% of 1,000 randomly dialed phone numbers in California were determined, by them, to be “certain to vote.”  Polls conducted with random-digit dialing (RDD) should be viewed as suspect from the outset, particularly in a state like California, whose great income, age, ethnic, and regional diversity expresses itself through divergent political ideologies.  To be anything close to realistic, an RDD survey should at least weight its responses, post-survey, to the state’s likely-voter demographics – something SurveyUSA does not do.  A sketch of what that weighting looks like appears below.
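
For readers unfamiliar with that step, here is a minimal sketch of one simple form of post-survey weighting (cell weighting on a single dimension), using hypothetical respondent counts that roughly mirror SurveyUSA’s party splits and targets approximating the PDI likely-voter column; real surveys typically rake across several dimensions at once:

```python
# Minimal sketch of post-survey weighting (cell weighting on one dimension).
# Respondent counts are hypothetical, chosen to roughly mirror SurveyUSA's
# party splits above; targets approximate the PDI 2014 likely-voter shares
# (normalized to sum to 100%). Real surveys rake across several dimensions
# (party, age, ethnicity, region) at once.
respondents = {"DEM": 470, "REP": 290, "OTHER": 240}  # hypothetical raw counts
targets = {"DEM": 0.45, "REP": 0.38, "OTHER": 0.17}   # likely-voter shares

total = sum(respondents.values())
for group, count in respondents.items():
    raw_share = count / total
    weight = targets[group] / raw_share  # >1 boosts under-sampled groups
    print(f"{group}: raw {raw_share:.0%}, target {targets[group]:.0%}, "
          f"weight {weight:.2f}")
```

Each respondent’s answers are then counted with their group’s weight, pulling over-sampled groups (here, independents) back down toward their likely-voter shares and boosting under-sampled Republicans.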

The response to this criticism from SurveyUSA, via Twitter, was quick, if somewhat illogical:

If you believe that poll results are inherently valid because non-voters won’t sit through a survey, there has never been a bad poll.

It is true that SurveyUSA makes some of its data public, and thankfully so, since we would have no way to judge the validity of the survey if they, like campaign pollsters, simply put out the topline results.  But that is not an excuse for publishing a survey so entirely out of alignment with the reality of political campaigns in California that it is basically useless.