http://www.rasmussenreports.com/public_content/about_us/methodology

Ok, it's just 1000 responses taken. I understand that they use sampling methods that try to weight different sub-groups based on observed variability (i.e. you may need to sample fewer African American females than white males to observe trends in the wider population).
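To make that sub-group weighting concrete, here's a minimal one-variable post-stratification sketch. All the numbers and group names are invented for illustration; this is not Rasmussen's actual procedure, just the general idea of re-weighting respondents so each group's share matches its share of the population.

```python
# Hypothetical illustration of demographic weighting (post-stratification).
# Counts and population shares are made up, not from any real poll.

sample = {"white_male": 400, "black_female": 50}         # respondents per group
population = {"white_male": 0.35, "black_female": 0.07}  # assumed population share

n = sum(sample.values())
# weight = (population share) / (sample share); under-sampled groups get weight > 1
weights = {g: population[g] / (sample[g] / n) for g in sample}
print(weights)
```

The effect is that each respondent in an under-represented group counts for more than one respondent when the topline numbers are computed.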
2 points of concern... albeit this goes for many pollsters, not just Ras/Fox.
A) They rely entirely on a computer program to interpret automated telephone answers. In a situation like Florida, I assume there are both a lot of people without registered phone numbers and a lot of heavy accents that basically break the automation and its validity (i.e. I assume the system asks you to repeat yourself if it can't understand your response, and after a couple rounds of that I bet many folks just hang up).
Are my assumptions correct? Is there documented proof that such an automated voice system misses a statistically significant number of VOTERS (and I mean people who actually vote)?
B) They rely on the "likely voter" method, which can contain a number of distorting dynamics. Let's be clear: Rasmussen weights both the folks surveyed and the results of the survey, based on their beliefs about partisan variation... and also about how likely folks are to really vote.
They try to balance things based on their constantly updated data on how people classify themselves as Democrat, Republican or independent. That's fair, but it's the kind of thing that's very "fuzzy."
More importantly, they literally filter and weight their results depending on their identification of respondents as "likely voters." This gets very tricky, because a die-hard Republican might be very angry and frustrated one night... saying they won't vote. The next day, something energizing might cause them to express strong interest in voting. When pollsters rely on this "likely voter" method, they risk confusing results about political leanings with results about enthusiasm. That Republican voter suddenly shows up very strongly in the 2nd survey and can skew a real analysis of trends, because while there is no real shift in McCain vs. Obama support... the strength of that support can tilt the results.
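Here's a toy sketch of that enthusiasm effect. Assume (my assumption, not anything documented) the underlying support is exactly 50/50 on both nights, and the only thing that changes is how strongly the Republican respondents register on the likely-voter screen. All numbers and weights are invented.

```python
# Toy model: identical 50/50 candidate support, but the "likely voter" weight
# applied to one side changes between surveys. Figures are hypothetical.

def topline(groups):
    """groups: list of (candidate, n_respondents, likely_voter_weight)."""
    totals = {}
    for cand, n, w in groups:
        totals[cand] = totals.get(cand, 0) + n * w
    total = sum(totals.values())
    return {c: round(100 * v / total, 1) for c, v in totals.items()}

night1 = [("Obama", 500, 0.9), ("McCain", 500, 0.6)]  # GOP side discouraged
night2 = [("Obama", 500, 0.9), ("McCain", 500, 0.9)]  # same voters, re-energized

print(topline(night1))  # → {'Obama': 60.0, 'McCain': 40.0}
print(topline(night2))  # → {'Obama': 50.0, 'McCain': 50.0}
```

Same voters, same preferences, and yet the published topline swings 10 points, which is exactly the leanings-vs-enthusiasm confusion described above.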
I hope that makes sense.
Has anybody reviewed the best way to handle this issue? Pollsters are businessmen, after all.