2016 Postmortem
Related: Michigan - Baydoun/Foster (D)
http://www.myfoxdetroit.com/story/19905504/michigan-poll-obama-and-romney-in-dead-heat

Foster McCollum White Baydoun (FMW)B, a national public opinion polling and voter analytics consulting firm based in Michigan, representing the combined resources of Foster McCollum White & Associates (Troy, Michigan) and Baydoun Consulting (Dearborn, Michigan), conducted an automated telephone survey of Michigan registered and most-likely November 2012 general election voters for Fox 2 News Detroit to determine their voting and issue preferences for the presidential election.
An initial qualifying statement was read to respondents asking them to participate only if they were very likely to vote in the November General Election.
Thirty-five thousand (35,000) calls were placed, and 1,122 respondents fully participated in the survey. The margin of error for this total polling sample is ±2.93% at a 95% confidence level.
The 2012 United States Presidential election will be held on November 6, 2012. Who are you most likely to vote for in the election?
President Barack Obama 46.92%
Republican Nominee Mitt Romney 46.56%
another candidate 2.30%
Undecided 4.23%
Blast me if you want. I don't think this poll is worth a shit either, but what caught my eye was the fact that they called 35,000 people and only 1,122 responded.
Is a 3.2% response rate normal?
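For what it's worth, the numbers quoted in the writeup check out arithmetically. A minimal sketch (plain Python, nothing assumed beyond the figures in the post) reproducing the stated response rate and margin of error:

```python
import math

calls = 35_000      # automated calls placed
completes = 1_122   # respondents who fully participated

# Response rate discussed in the thread (~3.2%)
response_rate = completes / calls

# Worst-case margin of error at 95% confidence (p = 0.5),
# the standard formula pollsters typically report:
#   MoE = z * sqrt(p * (1 - p) / n), with z = 1.96
moe = 1.96 * math.sqrt(0.25 / completes)

print(f"response rate: {response_rate:.1%}")   # ~3.2%
print(f"margin of error: {moe:.2%}")           # ~2.93%
```

Note that this margin-of-error formula only measures sampling error on a random sample; it says nothing about nonresponse bias, which is the actual complaint in this thread when 33,878 of 35,000 calls go nowhere.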
alcibiades_mystery
(36,437 posts)CreekDog
(46,192 posts)WI_DEM
(33,497 posts)and spending money there. They aren't.
TroyD
(4,551 posts)Even more so than Gravis and some of the others.
And for some strange reason it's supposedly a Democratic firm.
RomneyLies
(3,333 posts)Their likely voter screen is fucked beyond all recognition.
krawhitham
(4,644 posts)ProSense
(116,464 posts)up 15 points in Florida in August.
All of this polling was pretty normal, of course. But there was one last survey. It was from the polling firm Foster McCollum White Baydoun, which conducts polls for Democratic candidates as well as independently. It was a poll of Florida and it had Mr. Romney ahead by nearly 15 points there.
http://fivethirtyeight.blogs.nytimes.com/2012/08/20/aug-20-when-the-polling-gets-weird/
FBaggins
(26,748 posts)Rasmussen had it at 52-45 after the first debate.
We're not going to win MI by 16 points again... it probably won't even be double digits...
... but it isn't anything close to tied.
cthulu2016
(10,960 posts)Obamamama44
(98 posts)Drunken Irishman
(34,857 posts)RandySF
(58,935 posts)It took 35,000 calls to get 1k respondents?
aaaaaa5a
(4,667 posts)fujiyama
(15,185 posts)This is laughable. The pollster may as well have just made up numbers.
flowomo
(4,740 posts)Apparently, F/D has a problem with age distribution (and possibly other things):
"Other pollsters don't show weighted age distributions and just provide unweighted sample sizes. A September poll in Michigan by Foster McCollum White Baydoun put the unweighted break for 66+ at 44 percent while just 3 percent of their unweighted interviews were 18-30 years old. The census data for Michigan estimates that 19 percent of 2008 voters were 18-30, suggesting a huge under-representation of 18- to 30-year-olds in their sample (unless they weighted up their 18-30 year old sample by an enormous amount). That poll showed Obama +2 which was a significant departure from other polling in Michigan at the time."
http://www.huffingtonpost.com/nick-gourevitch/age-different-polling-skew_b_1952251.html
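The "enormous amount" in that quote is easy to quantify. A rough sketch of the implied per-respondent weight, assuming the quoted figures (3% unweighted share for 18-30, 19% census-based target; these are the article's numbers, not FMWB's published weights):

```python
# Figures from the quoted Huffington Post piece: 3% of unweighted
# interviews were 18-30 years old, while the census-based target for
# that group is 19% of 2008 Michigan voters.
unweighted_share = 0.03
target_share = 0.19

# Each 18-30 respondent must be counted this many times over
# for the cell to reach its target share of the weighted sample:
implied_weight = target_share / unweighted_share
print(f"implied weight per 18-30 respondent: {implied_weight:.1f}x")  # ~6.3x
```

Weights that large mean a handful of respondents drive the entire 18-30 cell, so whatever preference that cell shows is extremely noisy.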
krawhitham
(4,644 posts)world wide wally
(21,744 posts)In all seriousness...us "old" people stopped the war in Vietnam, pushed through civil rights, lowered the voting age and drinking age in many states, grew up with the Beatles, made diversity popular, discovered the environment, and elected Clinton.
A simple "Thank you" will work.
But I wonder what happened to all my old hippy friends who betrayed all the values they grew up on and voted for Reagan. I wonder how they voted for Bush... and I STILL CAN'T IMAGINE how they could possibly vote for Romney who represents every fucking thing they ever believed in when they were alive.
Response to flowomo (Reply #13)
Post removed
Welcome_hubby
(312 posts)ncav53
(168 posts)yellowcanine
(35,699 posts)DemocratSinceBirth
(99,710 posts)So the pollster just made a lot of automated calls and didn't weight the responses. That makes no sense.
yellowcanine
(35,699 posts)internet poll except that access is restricted to those who get called.
efoster40
(5 posts)Hello everyone.
My name is Eric Foster, President of Foster McCollum White & Associates. I am the lead pollster for our firm and our partnership with Baydoun Consulting. I want to thank everyone for taking the time to provide their comments and feedback. It is critical and important for everyone to voice their ideas. I would like to respond to the questions about our polling work.
First, I would like to address the topic of our political affiliation. Foster McCollum White Baydoun is nonpartisan. We do not ask for a Democratic or Republican label. Separate from our polling work, FMW has consulted for Democratic and Republican candidates for elected office, candidates for nonpartisan and judicial positions, millage campaigns and ballot initiatives. Our partner Baydoun Consulting consults for Democratic candidates, and the owner of Baydoun Consulting is very involved in statewide Democratic politics. We do not allow our political views or personal voting patterns to impact our polling methodology or reporting. We present the findings in a clear and detailed manner, with explanations of our weighting methodology, respondent demographics versus projected turnout demographics and other key modeling processes, so readers have objective data to review. We are not aligned with either political party, and these polls are independent of any bias from the Democratic or Republican parties. I understand that most of the respondents on blogs are either Democrats or Republicans and may take offense at data findings that reflect negatively on their candidate. Our job is not to present information to make either party like the data; our job is to present unbiased data and let the reviewers draw their own conclusions.
Second, there has been a lot of discussion about our Florida poll in August and our use of historical election turnout statistics collected from individual city and county clerks and the Secretary of State's offices instead of exit polling. I want to explain our methodology for readers' review.
Our Predictive Voter Behavior Analysis (PVBA) model reviews election statistics for age, gender, voting participation pattern and socioeconomic factors to determine the likely voting universe for an upcoming election. Our turnout models are based on historical turnout statistics provided by a state's municipal and county clerks and secretary of state's office for age, gender, party, ethnicity and voting method (early, absentee, poll location) instead of exit polls. We trust the reliability of the election statistics from the clerks' offices to give us valuable data reads on future elections.
The reason we take the historical data for a state is to give us a baseline for each precinct within the state and then build models up from there. We work to identify solid trends of turnout over a series of primary and general election contests so that we can remove outliers within turnout, age, gender, partisanship (if collected) and ethnicity and determine the true participation base for that precinct. We can then project out for the variable election conditions (type, advertising impact, voter mobilization, outlier ballot issue impact, etc.) that allow us to determine our high, moderate and low performing turnout and voter models.
Our polling call list was weighted to the historical weights for age, gender, race, region and congressional district area. Our list is also comprised of voters with previous voting histories in presidential, state and local elections. We include the moderate and low performance voters, but the call files do contain a significant portion of voters with a likely history of participation. We do not call voters who are registered but have never participated in elections. It is difficult to contact people via cell phones because the Telephone Consumer Protection Act (TCPA) (47 U.S.C. 227, 47 CFR 64.1200) prohibits the use of an automatic telephone dialing system to contact any telephone number assigned to a cellular telephone service without express prior consent from the party being called. Based upon this federal law and the difficulty of procuring call files with parties (voters) who have provided their consent, our call files are comprised of landlines.
When we call through the list, we report the demographics of the respondents without weighting. If our demographics match the likely voter demographics for the polling study, we report the baseline results as unweighted. If there are underrepresented groups within our aggregate respondent universe, we use our weighting model to adjust for their representative weight and the groups' reflected polling preference for the baseline questions. We will still report the unweighted demographics of our respondents because they reflect the prevailing interest level of the voting groups at the time of our polling survey.
Based on the respondent universes of our Florida poll, we made the weighting adjustment for the five underrepresented groups in Florida based on our PVBA model. We analyzed the respondents' participation rates against our data models for Florida and also considered the recent spike in presidential election participation rates among the younger age groups and the representative portion that each group makes up of the registered voting base. Even though our model projects lower turnout primarily among voters under 50, we weighted voters ages 18 to 30 at 12% of the possible election universe and voters ages 31 to 50 at 15%, for a total of 27%. Re-elections for Democrats tend to draw fewer younger and minority voters than the initial election, per our historical analysis models. We believe these groups tend to feel that they accomplished the major task of placing their change agent candidate into office and that they should have built enough of a base to sustain themselves.
We have made weighting adjustments to the aggregate baseline responses based on the following five groups who were underrepresented in our aggregate polling respondents:
Male respondents: 42.38% of the respondent universe versus 45.2% in (FMW)B PVBA model projections for the 2012 November general election and Florida's overall registered voter base, with a final weighted share of 45.0% of the aggregate baseline universe.
African American respondents: 6.32% of the actual respondent universe, weighted against the 10.1% (FMW)B PVBA model projection for the 2012 November general election, 13.2% of Florida's overall registered voter base and 13.7% of Florida's adult population, with a final weighted share of 11.5% of the aggregate baseline universe.
Latino American respondents: 4.06% of the actual respondent universe, weighted against the 7.0% (FMW)B PVBA model projection for the 2012 November general election, 12.5% of Florida's overall registered voter base and 21.1% of Florida's adult population, with a final weighted share of 10% of the aggregate baseline universe.
Voters ages 18 to 30: 1.33% of the actual respondent universe, weighted against the 1.8% (FMW)B PVBA model projection for the 2012 November general election and 16.3% of Florida's overall registered voter base, with a final weighted share of 12% of the aggregate baseline universe.
Voters ages 31 to 50: 7.65% of the actual respondent universe, weighted against the 10.1% (FMW)B PVBA model projection for the 2012 November general election and 21.6% of Florida's overall registered voter base, with a final weighted share of 15% of the aggregate baseline universe.
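The adjustments in the list above amount to standard post-stratification: each underrepresented cell is scaled up to its target share before the topline is computed. A minimal sketch of the mechanics, using the 18-30 cell's shares from the list but with invented candidate preferences (the actual crosstabs are not in the post):

```python
# Post-stratification sketch: one underrepresented cell (18-30 year
# olds: 1.33% of respondents, 12% final weighted share) is scaled up
# to its target share and the topline is recomputed.
# The candidate-preference numbers are invented for illustration only.

cells = {
    # group: (respondent share, target share, hypothetical Obama share)
    "18-30":  (0.0133, 0.12, 0.60),
    "others": (0.9867, 0.88, 0.45),
}

unweighted = sum(r * p for r, _, p in cells.values())
weighted = sum(t * p for _, t, p in cells.values())

print(f"unweighted Obama share: {unweighted:.1%}")  # ~45.2%
print(f"weighted Obama share:   {weighted:.1%}")    # ~46.8%
```

Even with these made-up preferences, moving a cell from 1.33% to 12% of the universe shifts the topline by more than a point, which is why the size of these adjustments drew so much attention in the thread.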
The self-identified partisan participation rates from our Florida poll were as follows:
Total Democrats 38.31%
Independents 25.10%
Total Republicans 36.36%
When we polled Florida, a number of additional factors within our cross tabs related to the shift in Obama's fortunes in the state:
White women: he was losing them in our Florida poll.
Voters ages 31 to 50: he won this group handily in 2008, but with the economic challenges and housing struggles, this group is more disenchanted than before.
Florida Latino voters: the Cuban community makes up a significant shift of voters to the Republican Party.
Obama's plan versus Ryan's plan: at the time, Paul Ryan had provided Mitt Romney with cover for lacking details about his economic and budget plans. People could at least understand and make sense of what Ryan wanted to change about government, while President Obama's plan still seemed vague to most voters.
With respect to our Michigan polling, some people have attributed a bias against minority voters to our polls. That is false. Per our PVBA model (which is built on 20 general election cycles' worth of historical voter turnout statistics for age, gender and ethnicity), the 2012 Michigan presidential general election turnout should have 25% minority voters (African American, Latino American, Asian American, Arab American, Native American and multiracial Americans). For our sixth poll in a row, our minority voter participation rate was significantly lower (16% of the respondent weight), in spite of our poll sample call file universe being roughly 26% minority. This indicates a continuing interest gap among these voting groups.
With respect to our October 23rd poll, we made weighting adjustments to the aggregate baseline responses based on the following four groups who were underrepresented in our aggregate polling respondents:
Male respondents: 35.63% of the actual respondent universe, weighted to reflect the 46% (FMW)B PVBA male voter turnout model projection for the 2012 November general election, with a final weighted share of 45.0% of the aggregate universe.
African American respondents: 9.02% of the actual respondent universe, weighted to reflect the 17.5% (FMW)B PVBA model projection for the 2012 November general election, with a final weighted share of 17.5% of the aggregate universe.
Voters ages 18 to 30: 1.96% of the actual respondent universe, weighted to reflect the 16% (FMW)B PVBA model projection for the 2012 November general election, with a final weighted share of 16.00% of the aggregate universe.
Voters ages 31 to 50: 14.20% of the actual respondent universe, weighted to reflect the 25% (FMW)B PVBA model projection for the 2012 November general election, with a final weighted share of 25.00% of the aggregate universe.
With respect to the Nate Silver article, Nate made a number of mistakes in reviewing our methodology and demographic reporting sections. He reported our respondent demographics as if they were our projections about turnout. Considering most polling firms don't publish their respondent demographics, I can understand the confusion. Additionally, we reviewed our methodology section and added language to provide a clearer understanding of our polling construct. We still support what Nate does and continue to provide polling data to him, which he continues to publish to this day. Nate actually helped us by making us understand that more data, not less, is the best way to operate in this industry.
I do want to address something else that is discussed in the polling and survey industry, and that is the question of oversampling. Oversampling is a method in which one attempts to gain a higher proportion of a market, consumer or voter segment for statistical or scientific purposes. When a public opinion polling firm conducts a survey and has built the polling sample call file based on historical demographic research or multiple exit polling models to strengthen reliability, the intent and goal is to receive proportional responses from all of the cross-tabular groupings. Weighting adjustments are made when specific groups do not respond to the calls or complete the survey in their projected proportion of the universe. When that occurs, you apply weighting models to balance the survey to reflect a proportional universe. If a group did not respond to the poll in proportion to the number of persons from that group in the control (call file), that is not purposeful under-sampling or over-sampling of the related groups in that cross-tabular universe (e.g., African Americans should be 17.5% of an election universe and make up 18.7% of the control group (call file), yet only 10% actually complete the survey). Too often the term is thrown out to imply purposeful deception by polling organizations. We don't operate in that capacity, and we believe most of the organizations in public opinion polling do not either.
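One cost of heavy nonresponse weighting that the letter does not address: unequal weights inflate the variance of the final estimate. A sketch of Kish's approximate design effect, deff = n·Σw²/(Σw)², using illustrative sample sizes roughly matching the 1.33%-to-12% age adjustment described earlier (not the firm's actual weights):

```python
# Kish design effect for unequal weights:
#   deff = n * sum(w_i^2) / (sum(w_i))^2
# The effective sample size shrinks to n / deff.
# Cell sizes below are illustrative, not from the actual poll.

n = 1000
small_cell = 13          # ~1.33% of respondents (the 18-30 cell)
big_cell = n - small_cell

w_small = 0.12 / (small_cell / n)  # weight lifting the cell to 12% (~9.2x)
w_big = 0.88 / (big_cell / n)      # weight for everyone else (~0.89x)

weights = [w_small] * small_cell + [w_big] * big_cell
deff = n * sum(w * w for w in weights) / sum(weights) ** 2

print(f"design effect: {deff:.2f}")
print(f"effective sample size: {n / deff:.0f}")
```

Under these assumptions a nominal sample of 1,000 behaves like one of roughly 500-550, so the published margin of error understates the real uncertainty by a wide margin.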
I am taking the time to provide such detail because I believe that firms in this industry are working to provide the public with quality information for review and assessment. Polling is a snapshot in time, and the respondents are influenced by a number of external factors prior to the call and possibly during the survey call. It is our job to provide clear and open information so that you, the reviewers, can understand the science behind the work and the data that supports the findings. We provide more detail about our work than most of our colleagues in the industry. As two Michigan-based small businesses working within the American Dream, we have a model that we believe fits this business universe and provides a high level of detail and information. We will continue to be open and transparent in our work and will always provide you and the public with as much information as possible. If you have any additional questions, please feel free to email me at efoster@fostermccollumwhite.com.
Thank you for taking the time to read my response.
Sincerely,
Eric Foster
Tutonic
(2,522 posts)Why would you even release the results after 33,800 people hung up your ass? Were the 1,122 bedridden or Romney sister wives?
Maximumnegro
(1,134 posts)
Maximumnegro
(1,134 posts)That's what the GOP pollsters are up to. They don't care if it looks bogus; fact is, Nate is gonna put them in his mini WOPR and crank out numbers. This is all about weighing down the average wherever they can. Why do you think Ras has been releasing polls right after other pollsters post debates? Same for ARG, Gravis, etc. They know they look sketchy but people will include them anyway.
OldDem2012
(3,526 posts)* People without caller ID?
* People who answer every call regardless?
Who exactly?
Jennicut
(25,415 posts)They once had Romney up by 15 in Florida. Seriously. Right up there with Gravis Marketing. I wouldn't even bother looking at the poll.
They are really Foster McCollum White and Baydoun. Eric Foster claims to be a Dem but has donated only once, to a Repub congresswoman in 2008.
efoster40
(5 posts)Hello everyone.
My name is Eric Foster, President of Foster McCollum White & Associates. I am the lead pollster for our firm and our partnership with Baydoun Consulting. I want to thank everyone for taking the time to provide their comments and feedback. It is critical and important for everyone to voice their ideas. I would like to respond to the questions about our polling work.
First, I would like to address the topic of our political affiliation. Foster McCollum White Baydoun is non partisan. We do not ask for a Democratic or Republican label. Separate of our polling work, FMW has consulted to Democratic and Republican candidates for elected office, candidates for Non Partisan and Judicial positions, Millage campaigns and ballot initiatives. Our partner Baydoun Consulting consults Democratic candidates and the owner of Baydoun Consulting is very involved in statewide Democratic politics. We do not allow our political views or personal voting patterns impact our polling methodology or reporting. We present the findings in a clear and detailed manner, with explanations on our weighting methodology, demographic respondents versus projected turnout demographics and other key modeling processes, so readers can have objective data to review. We are not aligned with either political party and these polls are independent of any bias from the Democratic or Republican parties. I understand that most of the respondents on blogs are either Democrats or Republicans and they may have offense with the data findings that reflect negatively on their candidate. Our job is not to present information to make either party like the data, our job is to present unbiased data and let the reviewers draw their responses and conclusions.
Second, there has been a lot of discussion about our Florida poll in August and our use of historical election turnout statistics, collected from individual city and county clerks and the Secretary of State's office, instead of exit polling. I want to explain our methodology for readers' review.
Our Predictive Voter Behavior Analysis (PVBA) model reviews election statistics for age, gender, voting participation pattern and socioeconomic factors to determine the likely voting universe for an upcoming election. Our turnout models are based on a state's historical turnout statistics for age, gender, party, ethnicity and voting method (early, absentee, poll location), provided by the municipal and county clerks and the secretary of state's office, instead of exit polls. We trust the reliability of the election statistics from the clerks' offices to give us valid reads on future elections.
The reason we take the historical data for a state is to give us a baseline for each precinct within the state and then build models up from there. We work to identify solid turnout trends over a series of primary and general election contests so that we can remove outliers in turnout, age, gender, partisanship (if collected) and ethnicity, and determine the true participation base for that precinct. We can then project out for variable election conditions (type, advertising impact, voter mobilization, outlier ballot-issue impact, etc.), which allows us to determine our high-, moderate- and low-performing turnout and voter models.
Our polling call list was weighted to the historical weights for age, gender, race, region and congressional district. Our list is also comprised of voters with previous voting histories in presidential, state and local elections. We include the moderate- and low-performance voters, but the call files do contain a significant portion of voters with a likely history of participation. We do not call voters who are registered but have never participated in an election. It is difficult to contact people via cell phone because the Telephone Consumer Protection Act (TCPA) (47 U.S.C. 227, 47 CFR 64.1200) prohibits the use of an automatic telephone dialing system to contact any telephone number assigned to a cellular telephone service without express prior consent from the party being called. Based upon this federal law, and the difficulty of procuring call files of parties (voters) who have provided their consent, our call files are comprised of landlines.
When we call through the list, we record the demographics of the respondents without weighting. If our respondent demographics match the likely-voter demographics for the polling study, we report the baseline results unweighted. If there are underrepresented groups within our aggregate respondent universe, we use our weighting model to adjust for their representative weight and for each group's reported polling preference on the baseline questions. We will still report the unweighted demographics of our respondents, because they reflect the prevailing interest level of the voting groups at the time of our survey.
Based on the respondent universe for our Florida poll, we made weighting adjustments for the five underrepresented groups in Florida based on our PVBA model. We compared the respondents' participation rates to our data models for Florida and also considered the recent spike in presidential-election participation rates for the younger age groups and the portion each group makes up of the registered voting base. Even though our model projects lower turnout primarily among voters under 50, we weighted voters ages 18 to 30 at 12% of the possible election universe and voters ages 31 to 50 at 15%, for a total of 27%. Re-election campaigns for Democrats tend to draw fewer younger and minority voters than their initial election, per our historical analysis models. We believe these groups tend to feel that they accomplished the major task of placing their change-agent candidate into office and that they should have built enough of a base to sustain themselves.
We have made weighting adjustments to the aggregate baseline responses based on the following five groups who were underrepresented in our aggregate polling respondents:
Male respondents: 42.38% of the respondent universe versus 45.2% in the (FMW)B PVBA model projections for the 2012 November general election and in Florida's overall registered voter base, with a final weighted share of 45.0% of the aggregate baseline universe.
African American respondents: 6.32% of the actual respondent universe, weighted to reflect 10.1% in the (FMW)B PVBA model projections for the 2012 November general election, 13.2% of Florida's overall registered voter base and 13.7% of Florida's adult population, with a final weighted share of 11.5% of the aggregate baseline universe.
Latino American respondents: 4.06% of the actual respondent universe, weighted to reflect 7.0% in the (FMW)B PVBA model projections for the 2012 November general election, 12.5% of Florida's overall registered voter base and 21.1% of Florida's adult population, with a final weighted share of 10% of the aggregate baseline universe.
Voters ages 18 to 30: 1.33% of the actual respondent universe, weighted to reflect 1.8% in the (FMW)B PVBA model projections for the 2012 November general election and 16.3% of Florida's overall registered voter base, with a final weighted share of 12% of the aggregate baseline universe.
Voters ages 31 to 50: 7.65% of the actual respondent universe, weighted to reflect 10.1% in the (FMW)B PVBA model projections for the 2012 November general election and 21.6% of Florida's overall registered voter base, with a final weighted share of 15% of the aggregate baseline universe.
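For readers following the arithmetic, the five adjustments listed above amount to multiplying each respondent's weight by (target share / observed share) for their group. Here is a minimal sketch using the observed and final weighted figures from this post; the group labels are shorthand, and the actual survey would apply these factors to per-respondent records:

```python
# Post-stratification sketch: each group's weight factor is
# target_share / observed_share, using figures quoted in the post.
groups = {
    "male":      {"observed": 0.4238, "target": 0.450},
    "black":     {"observed": 0.0632, "target": 0.115},
    "latino":    {"observed": 0.0406, "target": 0.100},
    "age_18_30": {"observed": 0.0133, "target": 0.120},
    "age_31_50": {"observed": 0.0765, "target": 0.150},
}

def weight_factor(observed: float, target: float) -> float:
    """Multiplier applied to every respondent in the group."""
    return target / observed

for name, g in groups.items():
    print(f"{name}: x{weight_factor(g['observed'], g['target']):.2f}")
```

Note that the 18-to-30 group gets weighted up by roughly a factor of nine, so a small handful of respondents ends up speaking for 12% of the topline; weight factors that large substantially inflate the effective margin of error for the weighted result.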
The self-identified partisan participation rates from our Florida poll were as follows:
Total Democrats 38.31%
Independents 25.10%
Total Republicans 36.36%
When we polled Florida, a number of additional factors within our cross tabs related to the shift in Obama's fortunes in the state:
White women: he was losing them in our Florida poll.
Voters ages 31 to 50: he won this group handily in 2008, but with the economic challenges and housing struggles, this group is more disenchanted than before.
Florida Latino voters: the Cuban community makes up a significant shift of voters to the Republican party.
People didn't understand Obama's plan versus Ryan's plan: at the time, Paul Ryan had provided Mitt Romney with cover for lacking details about his economic and budget plans. People could at least understand and make sense of what Ryan wanted to change about government. President Obama's plan still seemed vague to most voters.
With respect to our Michigan polling, some people have alleged a bias against minority voters in our polls. That is false. Per our PVBA model (which is built on 20 general election cycles' worth of historical voter turnout statistics for age, gender and ethnicity), the 2012 Michigan presidential general election turnout should be 25% minority voters (African American, Latino American, Asian American, Arab American, Native American and multiracial Americans). For the sixth poll in a row, our minority voter participation rate was significantly lower (16% of the respondent weight), in spite of our poll sample call file universe being roughly 26% minority. This indicates a continuing interest gap among these voting groups.
With respect to the Nate Silver article, Nate made a number of mistakes in reviewing our methodology and demographic reporting sections. He reported our respondent demographics as if they were our projections about turnout. Considering most polling firms don't publish their respondent demographics, I can understand the confusion. We have since reviewed our methodology section and added verbiage to provide a clearer understanding of our polling construct. We still support what Nate does and continue to provide polling data to him, which he continues to publish to this day. Nate actually helped us by making us understand that more data, not less, is the best way to operate in this industry.
I do want to address something else that is discussed in the polling and survey industry, and that is the question of oversampling. Oversampling is a method in which one attempts to gain a higher proportion of a market, consumer or voter segment for statistical or scientific purposes. When a public opinion polling firm conducts a survey and has built the polling sample call file based on historical demographic research or multiple exit polling models to strengthen reliability, the intent and goal is to receive proportional responses from all of the cross-tabular groupings. Weighting adjustments are made when specific groups do not respond to the calls or complete the survey in their projected proportion of the universe; when that occurs, you apply weighting models to balance the survey to reflect a proportional universe. If a group does not respond to the poll in proportion to its share of the control (call file), that is not purposeful under-sampling or over-sampling of the related groups in that cross-tabular universe (e.g., African Americans should be 17.5% of an election universe and make up 18.7% of the control group (call file), yet only 10% actually complete the survey). The term gets thrown around to imply purposeful deception by polling organizations. We don't operate in that capacity, and we believe most organizations in public opinion polling do not either.
Also, krawhitham, reporting the call volume and response rate is another part of the open disclosure that we have in our polling methodology and process.
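On that disclosure point: the 2.93% margin of error quoted in the thread's opening post is the standard worst-case figure for a simple random sample of 1,122 at 95% confidence, while the roughly 3.2% completion rate (1,122 of 35,000 calls) that krawhitham asked about is a separate issue, since the formula assumes completers are a random draw from the population. A minimal sketch of both calculations:

```python
import math

def margin_of_error(n: int, z: float = 1.96, p: float = 0.5) -> float:
    """Worst-case (p = 0.5) margin of error at 95% confidence (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

n_called, n_completed = 35_000, 1_122
print(f"response rate:   {n_completed / n_called:.1%}")    # ~3.2%
print(f"margin of error: {margin_of_error(n_completed):.2%}")  # ~2.93%
```

The formula says nothing about nonresponse bias: if the 3.2% who answered differ systematically from the 96.8% who did not, the true uncertainty is larger than the quoted margin.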
I am taking the time to provide such detail because I believe that firms in this industry are working to provide the public with quality information for review and assessment. Polling is a snapshot in time, and respondents are influenced by a number of external factors prior to the call and possibly during the survey call. It is our job to provide clear and open information so that you, the reviewers, can understand the science behind the work and the data that supports the findings. We provide more detail about our work than most of our colleagues in the industry. As two Michigan-based small businesses working within the American Dream, we have a model that we believe fits this business universe and provides a high level of detail and information. We will continue to be open and transparent in our work and will always provide you and the public with as much information as possible. If you have any additional questions, please feel free to email me at efoster@fostermccollumwhite.com.
Thank you for taking the time to read my response.
Sincerely,
Eric Foster
outsideworld
(601 posts)Thank you for taking the time to address this post and the comments. Expect a rebuttal of all you said in a moment.
with all due respect, a poll showing Romney leading by 15 points in Florida is bunk. If you're trying to justify those results, you're wasting your time.
brindis_desala
(907 posts)Bogus polls to create a false narrative. We must have independent exit polling on Election Day.