
krawhitham

(4,644 posts)
Wed Oct 24, 2012, 08:35 PM Oct 2012

47 - 47 Michigan - Baydoun/Foster (D)

http://www.myfoxdetroit.com/story/19905504/michigan-poll-obama-and-romney-in-dead-heat

Foster McCollum White Baydoun (FMW)B, a national public opinion polling and voter analytics consulting firm based in Michigan (combining the resources of Foster McCollum White & Associates of Troy and Baydoun Consulting of Dearborn), conducted an automated telephone survey of a random sample of Michigan registered and most-likely November 2012 general election voters for Fox 2 News Detroit, to determine their voting and issue preferences for the presidential election.

An initial qualifying statement was read to respondents asking them to participate only if they were very likely to vote in the November General Election.

Thirty-five thousand (35,000) calls were placed, and 1,122 respondents fully participated in the survey. The margin of error for this polling sample is ±2.93% at a 95% confidence level.

The 2012 United States Presidential election will be held on November 6, 2012. Who are you most likely to vote for in the election?

President Barack Obama 46.92%
Republican Nominee Mitt Romney 46.56%
another candidate 2.30%
Undecided 4.23%


Blast me if you want; I don't think this poll is worth a shit either. But what caught my eye was the fact that they called 35,000 people and only 1,122 responded.

Is a 3.2% response rate normal?
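For what it's worth, both figures in the release check out arithmetically. A quick sketch, using the standard worst-case margin-of-error formula for a simple random sample (which an automated landline poll only approximates):

```python
import math

calls_placed = 35_000
completed = 1_122

# Completion rate: share of dialed numbers that finished the survey.
response_rate = completed / calls_placed
print(f"response rate: {response_rate:.2%}")  # → 3.21%

# Worst-case (p = 0.5) margin of error at 95% confidence (z = 1.96),
# treating the 1,122 completes as a simple random sample.
z = 1.96
moe = z * math.sqrt(0.25 / completed)
print(f"margin of error: ±{moe:.2%}")  # → ±2.93%
```

So the quoted 2.93% is the textbook figure for n = 1,122; the formula says nothing about whether the 3.2% who answered resemble the 96.8% who didn't.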
47 - 47 Michigan - Baydoun/Foster (D) (Original Post) krawhitham Oct 2012 OP
I'd love for Romney to start pumping money into Michigan alcibiades_mystery Oct 2012 #1
no cell phones CreekDog Oct 2012 #31
If Michigan were as close as this poll indicates Romney and Obama would be campaigning WI_DEM Oct 2012 #2
This has been the weirdest of all polling firms this year TroyD Oct 2012 #3
35,000 people called with 1,122 respondents means one thing. RomneyLies Oct 2012 #4
Same company had rMoney up 4 in August krawhitham Oct 2012 #5
Garbage pollster. Had Romney ProSense Oct 2012 #8
Not a chance FBaggins Oct 2012 #6
robocall -- no cell phones cthulu2016 Oct 2012 #7
This is bs-not one ad is being aired here. Nt Obamamama44 Oct 2012 #9
Even Rasmussen has Obama up 7. Drunken Irishman Oct 2012 #10
They suck. RandySF Oct 2012 #11
Trust me. If Romney were competitive in Michigan, he would be there. aaaaaa5a Oct 2012 #12
Yeah, he'd be kind of stupid to give up on 16 EVs so easily if he still had a chance fujiyama Oct 2012 #24
Nick Gourevitch at HuffPo looked at this pollster in October... flowomo Oct 2012 #13
Apparently only old people will answer their phone for these people krawhitham Oct 2012 #15
Hey!... I'm old (64 today) ... but I don't answer my phone world wide wally Oct 2012 #30
Post removed Post removed Oct 2012 #29
In August, this pollster had Romney ahead by 15% in FL Welcome_hubby Oct 2012 #14
Yep gotta call BS. ncav53 Oct 2012 #16
Social scientists call this "self selection." Basically makes the poll meaningless. yellowcanine Oct 2012 #17
I'm Confused DemocratSinceBirth Oct 2012 #22
Yes. Anyone could have been answering the questions. It really is not much better than an yellowcanine Oct 2012 #25
Eric Foster of Foster McCollum White & Associates response to your post about our Michigan poll efoster40 Oct 2012 #28
The 7-11 Poll is more credible than this crap. Tutonic Oct 2012 #18
To weigh down the average Maximumnegro Oct 2012 #19
Eric Foster of Foster McCollum White & Associates response to your post about our Michigan poll efoster40 Oct 2012 #27
They're weighing down the average folks Maximumnegro Oct 2012 #20
At this point before the election, who in their right mind is going to answer the phone?... OldDem2012 Oct 2012 #21
Trust me, Baydoun/Foster has the rep of being one of the worst pollsters out there. Jennicut Oct 2012 #23
Eric Foster of Foster McCollum White & Associates response to your post efoster40 Oct 2012 #26
Well outsideworld Oct 2012 #32
Um, ProSense Oct 2012 #33
This is a known Romney strategy brindis_desala Oct 2012 #34

WI_DEM

(33,497 posts)
2. If Michigan were as close as this poll indicates Romney and Obama would be campaigning
Wed Oct 24, 2012, 08:37 PM
Oct 2012

and spending money there. They aren't.

TroyD

(4,551 posts)
3. This has been the weirdest of all polling firms this year
Wed Oct 24, 2012, 08:38 PM
Oct 2012

Even more so than Gravis and some of the others.

And for some strange reason it's supposedly a Democratic firm.

 

RomneyLies

(3,333 posts)
4. 35,000 people called with 1,122 respondents means one thing.
Wed Oct 24, 2012, 08:39 PM
Oct 2012

Their likely voter screen is fucked beyond all recognition.

ProSense

(116,464 posts)
8. Garbage pollster. Had Romney
Wed Oct 24, 2012, 08:42 PM
Oct 2012

up 15 points in Florida in August.

<...>

All of this polling was pretty normal, of course. But there was one last survey. It was from the polling firm Foster McCollum White Baydoun, which conducts polls for Democratic candidates as well as independently. It was a poll of Florida and it had Mr. Romney ahead by nearly 15 points there.

http://fivethirtyeight.blogs.nytimes.com/2012/08/20/aug-20-when-the-polling-gets-weird/


FBaggins

(26,748 posts)
6. Not a chance
Wed Oct 24, 2012, 08:41 PM
Oct 2012

Rasmussen had it at 52-45 after the first debate.

We're not going to win MI by 16 points again... it probably won't even be double digits...

... but it isn't anything close to tied.

fujiyama

(15,185 posts)
24. Yeah, he'd be kind of stupid to give up on 16 EVs so easily if he still had a chance
Thu Oct 25, 2012, 12:14 AM
Oct 2012

This is laughable. The pollster may as well have just made up numbers.

flowomo

(4,740 posts)
13. Nick Gourevitch at HuffPo looked at this pollster in October...
Wed Oct 24, 2012, 08:45 PM
Oct 2012

Apparently, F/D has a problem with age distribution (and possibly other things):

"Other pollsters don't show weighted age distributions and just provide unweighted sample sizes. A September poll in Michigan by Foster McCollum White Baydoun put the unweighted break for 66+ at 44 percent while just 3 percent of their unweighted interviews were 18-30 years old. The census data for Michigan estimates that 19 percent of 2008 voters were 18-30, suggesting a huge under-representation of 18- to 30-year-olds in their sample (unless they weighted up their 18-30 year old sample by an enormous amount). That poll showed Obama +2 which was a significant departure from other polling in Michigan at the time."

http://www.huffingtonpost.com/nick-gourevitch/age-different-polling-skew_b_1952251.html
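The gap Gourevitch describes implies enormous per-respondent weights. A hypothetical illustration using the figures quoted above (3% unweighted 18-30 share vs. a 19% census-based target):

```python
sample_size = 1_122        # respondents in the Michigan poll
unweighted_share = 0.03    # 18-30 share of raw interviews (per the quote)
target_share = 0.19        # 18-30 share of 2008 Michigan voters (census estimate)

# Each young respondent must stand in for target/observed times their raw weight.
weight_factor = target_share / unweighted_share
n_young = sample_size * unweighted_share

print(f"weight per 18-30 respondent: {weight_factor:.1f}x")       # → 6.3x
print(f"raw 18-30 respondents carrying that weight: {n_young:.0f}")  # → 34
```

In other words, roughly three dozen people would speak for nearly a fifth of the projected electorate, which makes that slice of the poll extremely noisy.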

world wide wally

(21,744 posts)
30. Hey!... I'm old (64 today) ... but I don't answer my phone
Fri Oct 26, 2012, 03:25 AM
Oct 2012

In all seriousness...us "old" people stopped the war in Vietnam, pushed through civil rights, lowered the voting age and drinking age in many states, grew up with the Beatles, made diversity popular, discovered the environment, and elected Clinton.
A simple "Thank you" will work.
But I wonder what happened to all my old hippy friends who betrayed all the values they grew up on and voted for Reagan. I wonder how they voted for Bush... and I STILL CAN'T IMAGINE how they could possibly vote for Romney who represents every fucking thing they ever believed in when they were alive.

Response to flowomo (Reply #13)

DemocratSinceBirth

(99,710 posts)
22. I'm Confused
Thu Oct 25, 2012, 12:00 AM
Oct 2012

So the pollster just made a lot of automated calls and didn't weight the responses. That makes no sense.

yellowcanine

(35,699 posts)
25. Yes. Anyone could have been answering the questions. It really is not much better than an
Thu Oct 25, 2012, 09:10 AM
Oct 2012

internet poll except that access is restricted to those who get called.

efoster40

(5 posts)
28. Eric Foster of Foster McCollum White & Associates response to your post about our Michigan poll
Fri Oct 26, 2012, 12:11 AM
Oct 2012

Hello everyone.

My name is Eric Foster, President of Foster McCollum White & Associates. I am the lead pollster for our firm and our partnership with Baydoun Consulting. I want to thank everyone for taking the time to provide their comments and feedback. It is critical and important for everyone to voice their ideas. I would like to respond to the questions about our polling work.

First, I would like to address the topic of our political affiliation. Foster McCollum White Baydoun is nonpartisan. We do not ask for a Democratic or Republican label. Separate from our polling work, FMW has consulted for Democratic and Republican candidates for elected office, candidates for nonpartisan and judicial positions, millage campaigns, and ballot initiatives. Our partner Baydoun Consulting consults for Democratic candidates, and the owner of Baydoun Consulting is very involved in statewide Democratic politics. We do not allow our political views or personal voting patterns to impact our polling methodology or reporting.

We present the findings in a clear and detailed manner, with explanations of our weighting methodology, respondent demographics versus projected turnout demographics, and other key modeling processes, so readers have objective data to review. We are not aligned with either political party, and these polls are independent of any bias from the Democratic or Republican parties. I understand that most respondents on blogs are either Democrats or Republicans and may take offense at data findings that reflect negatively on their candidate. Our job is not to present information that makes either party like the data; our job is to present unbiased data and let reviewers draw their own conclusions.

Second, there has been a lot of discussion about our Florida poll in August and our use of historical election turnout statistics, collected from individual city and county clerks and the Secretary of State's office, instead of exit polling. I want to explain our methodology for the readers' review.

Our Predictive Voter Behavior Analysis (PVBA) model reviews election statistics for age, gender, voting participation pattern, and socioeconomic factors to determine the likely voting universe for an upcoming election. Our turnout models are based on a state's historical turnout statistics for age, gender, party, ethnicity, and voting method (early, absentee, poll location), provided by the municipal and county clerks and the secretary of state's office, instead of exit polls. We trust the reliability of the election statistics from the clerks' offices to give us valuable data reads on future elections.

The reason we take the historical data for a state is to give us a baseline for each precinct within the state; we then build models up from there. We work to identify solid trends of turnout over a series of primary and general election contests so that we can remove outliers within turnout, age, gender, partisanship (if collected), and ethnicity, and determine the true participation base for that precinct. We can then project out for the variable election conditions (type, advertising impact, voter mobilization, outlier ballot issue impact, etc.) to determine our high-, moderate-, and low-performing turnout and voter models.

Our polling call list was weighted to the historical weights for age, gender, race, region, and congressional district. Our list is also comprised of voters with previous voting histories in presidential, state, and local elections. We include the moderate- and low-performance voters, but the call files do contain a significant portion of voters with a likely history of participation. We do not call voters who are registered but have never participated in elections. It is difficult to contact people via cell phone because the Telephone Consumer Protection Act (TCPA) (47 U.S.C. 227, 47 CFR 64.1200) prohibits the use of an "automatic telephone dialing system" to contact "any telephone number assigned to a cellular telephone service" without "express prior consent" from the party being called. Based upon this federal law and the difficulty of procuring call files of voters who have provided such consent, our call files are comprised of landlines.

When we call through the list, we record the demographics of the respondents without weighting. If our respondent demographics match the likely voter demographics for the polling study, we report the baseline results as unweighted. If there are underrepresented groups within our aggregate respondent universe, we use our weighting model to adjust for their representative weight and for each group's polling preference on the baseline questions. We will still report the unweighted demographics of our respondents, because they reflect the prevailing interest level of the voting groups at the time of our polling survey.

Based on the respondent universe of our Florida poll, we made weighting adjustments for the five underrepresented groups in Florida based on our PVBA model. We analyzed the respondents' participation rates against our data models for Florida, and also considered the recent spike in presidential election participation rates among the younger age groups and the representative portion each group makes up of the registered voting base. Even though our model projects a lower turnout primarily among voters under 50, we weighted voters ages 18 to 30 at 12% of the possible election universe and voters ages 31 to 50 at 15%, for a total of 27%. Re-elections for Democrats tend to draw fewer younger and minority voters than their initial election, per our historical analysis models. We believe these groups tend to feel that they accomplished the major task of placing their change-agent candidate into office and that they should have built enough of a base to sustain themselves.

We have made weighting adjustments to the aggregate baseline responses based on the following five groups who were underrepresented in our aggregate polling respondents:
• Male respondents – 42.38% of the respondent universe versus 45.2% of (FMW)B PVBA model projections for the 2012 November general election and Florida's overall registered voter base, with a final weighted determinant of 45.0% of the aggregate baseline universe.
• African American respondents – 6.32% of the actual respondent universe was weighted to reflect the 10.1% of (FMW)B PVBA model projections for the 2012 November general election, 13.2% of Florida's overall registered voter base, and 13.7% of Florida's adult population, with a final weighted determinant of 11.5% of the aggregate baseline universe.
• Latino American respondents – 4.06% of the actual respondent universe was weighted to reflect the 7.0% of (FMW)B PVBA model projections for the 2012 November general election, 12.5% of Florida's overall registered voter base, and 21.1% of Florida's adult population, with a final weighted determinant of 10% of the aggregate baseline universe.
• Voters ages 18 to 30 years old – 1.33% of the actual respondent universe was weighted to reflect the 1.8% of (FMW)B PVBA model projections for the 2012 November general election and 16.3% of Florida's overall registered voter base, with a final weighted determinant of 12% of the aggregate baseline universe.
• Voters ages 31 to 50 years old – 7.65% of the actual respondent universe was weighted to reflect the 10.1% of (FMW)B PVBA model projections for the 2012 November general election and 21.6% of Florida's overall registered voter base, with a final weighted determinant of 15% of the aggregate baseline universe.

The self-identified partisan participation rates from our Florida poll were as follows:
Total Democrats 38.31%
(Independent): 25.10%
Total Republicans 36.36%

When we polled Florida, a number of additional factors within our cross tabs related to the shift in Obama's fortunes in the state:
• White women – he was losing them in our Florida poll.
• People ages 31 to 50 – he won this group handily in 2008, but with the economic challenges and housing struggles, this group is more disenchanted than before.
• Florida Latino voters – the Cuban community makes up a significant shift of voters to the Republican Party.
• People didn't understand Obama's plan compared to Ryan's plan – at the time, Paul Ryan had provided Mitt Romney with cover for the lack of detail in his economic and budget plans. People could at least understand and make sense of what Ryan wants to change about government. President Obama's plan still seems vague to most voters.

With respect to our Michigan polling, some people have alleged a bias against minority voters in our polls. That is false. Per our PVBA model (which is built on 20 general election cycles' worth of historical voter turnout statistics for age, gender, and ethnicity), the 2012 Michigan presidential general election turnout should have 25% minority voters (African American, Latino American, Asian American, Arab American, Native American, and multiracial Americans). For the sixth poll in a row, our minority voter participation rate was significantly lower, at 16% of the respondent weight, in spite of our poll sample call file universe being roughly 26% minority. This indicates a continuing interest gap among these voting groups.

With respect to our October 23rd poll, we made weighting adjustments to the aggregate baseline responses for the following four groups, which were underrepresented among our aggregate polling respondents:
• Male respondents – 35.63% of the actual respondent universe was weighted to reflect the 46% (FMW)B PVBA male voter turnout model projection for the 2012 November general election, with a final weighted determinant factor of 45.0% of the aggregate universe.
• African American respondents – 9.02% of the actual respondent universe was weighted to reflect the 17.5% of (FMW)B PVBA model projections for the 2012 November general election, with a final weighted determinant factor of 17.5% of the aggregate universe.
• Voters ages 18 to 30 years old – 1.96% of the actual respondent universe was weighted to reflect the 16% of (FMW)B PVBA model projections for the 2012 November general election, with a final weighted determinant factor of 16.00% of the aggregate universe.
• Voters ages 31 to 50 years old – 14.20% of the actual respondent universe was weighted to reflect the 25% of (FMW)B PVBA model projections for the 2012 November general election, with a final weighted determinant factor of 25.00% of the aggregate universe.
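Taken at face value, those adjustments imply large per-respondent weights. A simplified sketch using the stated observed and target shares (the firm's actual model is not public, and a real weighting scheme would also have to reconcile the overlapping gender, race, and age targets, typically via raking):

```python
# (observed %, target %) pairs from the October 23rd Michigan poll bullets.
groups = {
    "male":      (35.63, 45.0),
    "black":     ( 9.02, 17.5),
    "age 18-30": ( 1.96, 16.0),
    "age 31-50": (14.20, 25.0),
}

# Cell weight if each target were applied independently: target / observed.
for name, (observed, target) in groups.items():
    factor = target / observed
    print(f"{name:>10}: {observed:5.2f}% -> {target:5.2f}%  (x{factor:.2f})")
```

The 18-30 cell alone would be weighted up by roughly a factor of eight, meaning about 22 of the 1,122 respondents determine 16% of the headline number.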

With respect to the Nate Silver article, Nate made a number of mistakes in reviewing our methodology and demographic reporting sections. He reported our respondent demographics as if they were our projections about turnout. Considering most polling firms don't publish their respondent demographics, I can understand the confusion. Additionally, we reviewed our methodology section and added language to provide a clearer understanding of our polling construct. We still support what Nate does and continue to provide polling data to him, which he continues to publish to this day. Nate actually helped us by making us understand that more data, not less, is the best way to operate in this industry.

I also want to address something else that is discussed in the polling and survey industry: the question of oversampling. Oversampling is a method in which one attempts to gain a higher proportion of a market, consumer, or voter segment for statistical or scientific purposes. When a public opinion pollster conducts a survey and has built the polling sample call file based on historical demographic research or multiple exit polling models to strengthen reliability, the intent and goal is to receive proportional responses from all of the cross-tabular groupings. Weighting adjustments are made when specific groups do not respond to the calls or complete the survey in their projected proportion of the universe; when that occurs, you apply weighting models to balance the survey to reflect a proportional universe. If a group does not respond to the poll in proportion to the number of persons from that group in the control (call file), that is not purposeful under-sampling or over-sampling of the related groups in that cross-tabular universe (e.g., African Americans should be 17.5% of an election universe and make up 18.7% of the control group (call file), yet only 10% actually complete the survey). Too often the term is thrown around to imply a purposeful deception by polling organizations. We don't operate in that capacity, and we believe most of the organizations in public opinion polling do not either.
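The hypothetical in that paragraph can be made concrete: the group is over-covered in the call file relative to target, so the shortfall comes from differential response and is corrected after the fact with a weight. A sketch using the example figures above:

```python
target_share = 0.175     # group's share of the projected election universe
call_file_share = 0.187  # group's share of the dialed call file
completed_share = 0.10   # group's share of completed interviews

# The call file actually over-covers the group relative to the target...
print(call_file_share > target_share)           # → True

# ...so the gap reflects who answered, not who was dialed,
# and is repaired with a post-survey weight:
weight = target_share / completed_share
print(f"weight per respondent: {weight:.2f}x")  # → 1.75x
```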

I am taking the time to provide such detail because I believe that firms in this industry are working to provide the public with quality information for review and assessment. Polling is a snapshot in time, and respondents are influenced by a number of external factors prior to the call and possibly during the survey call. It is our job to provide clear and open information so that you, the reviewers, can understand the science behind the work and the data that supports the findings. We provide more detail about our work than most of our colleagues in the industry. As two Michigan-based small businesses working within the American Dream, we have a model that we believe fits this business universe and provides a high level of detail and information. We will continue to be open and transparent in our work and will always provide you and the public with as much information as possible. If you have any additional questions, please feel free to email me at efoster@fostermccollumwhite.com.

Thank you for taking the time to read my response.

Sincerely,

Eric Foster

Tutonic

(2,522 posts)
18. The 7-11 Poll is more credible than this crap.
Wed Oct 24, 2012, 09:09 PM
Oct 2012

Why would you even release the results after 33,800 people hung up your ass? Were the 1,122 bedridden or Romney sister wives?


Thank for taking the time to read my response.

Sincerely,

Eric Foster

Maximumnegro

(1,134 posts)
20. They're weighing down the average folks
Wed Oct 24, 2012, 09:32 PM
Oct 2012

That's what the GOP pollsters are up to. They don't care if it looks bogus; fact is, Nate is gonna put them in his mini WOPR and crank out numbers. This is all about weighing down the average wherever they can. Why do you think Ras has been releasing polls right after other pollsters post-debate? Same for ARG, Gravis, etc. They know they look sketchy, but people will include them anyway.

OldDem2012

(3,526 posts)
21. At this point before the election, who in their right mind is going to answer the phone?...
Wed Oct 24, 2012, 09:42 PM
Oct 2012

* People without caller ID?
* People who answer every call regardless?

Who exactly?

Jennicut

(25,415 posts)
23. Trust me, Baydoun/Foster has the rep of being one of the worst pollsters out there.
Thu Oct 25, 2012, 12:11 AM
Oct 2012

They once had Romney up by 15 in Florida. Seriously. Right up there with Gravis Marketing. I wouldn't even bother looking at the poll.

They are really Foster McCollum White and Baydoun. Eric Foster claims to be a Dem but has donated only once, to a Repub congresswoman in 2008.

efoster40

(5 posts)
26. Eric Foster of Foster McCollum White & Associates response to your post
Fri Oct 26, 2012, 12:04 AM
Oct 2012

Hello everyone.

My name is Eric Foster, President of Foster McCollum White & Associates. I am the lead pollster for our firm and our partnership with Baydoun Consulting. I want to thank everyone for taking the time to provide their comments and feedback. It is critical and important for everyone to voice their ideas. I would like to respond to the questions about our polling work.

First, I would like to address the topic of our political affiliation. Foster McCollum White Baydoun is non-partisan. We do not ask for a Democratic or Republican label. Separate from our polling work, FMW has consulted for Democratic and Republican candidates for elected office, candidates for non-partisan and judicial positions, millage campaigns and ballot initiatives. Our partner Baydoun Consulting consults for Democratic candidates, and the owner of Baydoun Consulting is very involved in statewide Democratic politics. We do not allow our political views or personal voting patterns to impact our polling methodology or reporting. We present the findings in a clear and detailed manner, with explanations of our weighting methodology, respondent demographics versus projected turnout demographics, and other key modeling processes, so readers have objective data to review. We are not aligned with either political party, and these polls are independent of any bias from the Democratic or Republican parties. I understand that most of the respondents on blogs are either Democrats or Republicans, and they may take offense at data findings that reflect negatively on their candidate. Our job is not to present information to make either party like the data; our job is to present unbiased data and let the reviewers draw their own conclusions.

Second, there has been a lot of discussion about our Florida poll in August and our use of historical election turnout statistics collected from individual city and county clerks and the Secretary of State's office instead of exit polling. I want to explain our methodology for the readers' review.

Our Predictive Voter Behavior Analysis (PVBA) model reviews election statistics for age, gender, voting participation pattern, ethnicity and socioeconomic factors to determine the likely voting universe for an upcoming election. Our turnout models are based on each state's historical turnout statistics for age, gender, party, ethnicity and voting method (early, absentee, poll location), provided by the municipal and county clerks and the secretary of state's office, instead of exit polls. We trust the reliability of the election statistics from the clerks' offices to give us valid data reads on future elections.

We take the historical data for a state to establish a baseline for each precinct within the state and then build models up from there. We work to identify solid turnout trends over a series of primary and general election contests so that we can remove outliers in turnout, age, gender, partisanship (if collected) and ethnicity and determine the true participation base for that precinct. We can then project out for the variable election conditions (type, advertising impact, voter mobilization, outlier ballot issue impact, etc.) to determine our high, moderate and low turnout and voter models.
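As an illustration only (a minimal Python sketch, not FMW's actual PVBA model), the outlier-removal step described above could be as simple as a trimmed mean over a precinct's historical turnout rates; the precinct data here are hypothetical:

```python
import statistics

def baseline_turnout(history, trim=1):
    """Estimate a precinct's baseline turnout from historical rates,
    dropping the `trim` highest and lowest cycles as outliers."""
    rates = sorted(history)
    kept = rates[trim:len(rates) - trim] if len(rates) > 2 * trim else rates
    return statistics.mean(kept)

# Hypothetical precinct: six past general elections, one outlier cycle (0.91)
history = [0.58, 0.61, 0.60, 0.91, 0.57, 0.62]
print(baseline_turnout(history))
```

A real model would, per the post, also condition on election type and mobilization effects before settling on high/moderate/low scenarios.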

Our polling call list was weighted to the historical weights for age, gender, race, region and congressional district area. Our list is also comprised of voters with previous voting histories in Presidential, state and local elections. We include the moderate- and low-performance voters, but the call files do contain a significant portion of voters with a likely history of participation. We do not call voters who are registered but have never participated in elections. It is difficult to contact people via cell phone because the Telephone Consumer Protection Act (TCPA) (47 U.S.C. 227, 47 CFR 64.1200) prohibits the use of an “automatic telephone dialing system” to contact “any telephone number assigned to a cellular telephone service” without “express prior consent” from the party being called. Based upon this federal law and the difficulty of procuring call files with parties (voters) who have provided their consent, our call files are comprised of landlines.

When we call through the list, we report the demographics of the respondents without weighting. If our demographics match the likely voter demographics for the polling study, we report the baseline results unweighted. If there are underrepresented groups within our aggregate respondent universe, we use our weighting model to adjust for their representative weight and the groups' reflected polling preferences for the baseline questions. We will still report the unweighted demographics of our respondents, because they reflect the prevailing interest level of the voting groups at the time of our polling survey.

Based on the respondent universe of our Florida poll, we made the adjustment weight for the five underrepresented groups in Florida based on our PVBA model. We analyzed the respondents' participation rates against our data models for Florida and also considered the recent spike in Presidential election participation rates for the younger age groups and the representative portion each group makes up of the registered voting base. Even though our model projects lower turnout primarily among voters under 50, we weighted the voters ages 18 to 30 at 12% of the possible election universe and voters ages 31 to 50 at 15%, for a total of 27%. Per our historical analysis models, re-elections for Democrats tend to draw fewer younger and minority voters than their initial election. We believe these groups tend to feel that they accomplished the major task of placing their change-agent candidate into office and that they should have built enough of a base to sustain themselves.

We have made weighting adjustments to the aggregate baseline responses based on the following five groups who were underrepresented in our aggregate polling respondents:
• Male respondents – 42.38% of the respondent universe versus 45.2% per (FMW)B PVBA model projections for the 2012 November general election and Florida’s overall registered voter base, with a final weighted share of 45.0% of the aggregate baseline universe.
• African American respondents – 6.32% of the actual respondent universe, weighted to reflect the 10.1% of (FMW)B PVBA model projections for the 2012 November general election, 13.2% of Florida’s overall registered voter base and 13.7% of Florida’s adult population, with a final weighted share of 11.5% of the aggregate baseline universe.
• Latino American respondents – 4.06% of the actual respondent universe, weighted to reflect the 7.0% of (FMW)B PVBA model projections for the 2012 November general election, 12.5% of Florida’s overall registered voter base and 21.1% of Florida’s adult population, with a final weighted share of 10% of the aggregate baseline universe.
• Voters ages 18 to 30 years old – 1.33% of the actual respondent universe, weighted to reflect the 1.8% of (FMW)B PVBA model projections for the 2012 November general election and 16.3% of Florida’s overall registered voter base, with a final weighted share of 12% of the aggregate baseline universe.
• Voters ages 31 to 50 years old – 7.65% of the actual respondent universe, weighted to reflect the 10.1% of (FMW)B PVBA model projections for the 2012 November general election and 21.6% of Florida’s overall registered voter base, with a final weighted share of 15% of the aggregate baseline universe.
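Taken at face value, the per-respondent weight each group receives is simply its final weighted share divided by its observed respondent share. A quick arithmetic illustration in Python, using only the percentages stated in the Florida figures above:

```python
# (observed respondent share %, final weighted share %) from the Florida poll
groups = {
    "Male":             (42.38, 45.0),
    "African American": (6.32, 11.5),
    "Latino American":  (4.06, 10.0),
    "Age 18-30":        (1.33, 12.0),
    "Age 31-50":        (7.65, 15.0),
}

for name, (observed, target) in groups.items():
    factor = target / observed  # each respondent in the group counts this many times
    print(f"{name}: x{factor:.2f}")
```

Note that voters 18 to 30 end up counted roughly nine times each, which is exactly why weighting tiny subgroups up to a large target share draws methodological scrutiny.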

The self-identified partisan participation rates from our Florida poll were as follows:
Total Democrats: 38.31%
Independents: 25.10%
Total Republicans: 36.36%

When we polled Florida, a number of additional factors within our cross tabs related to the shift in Obama’s fortunes in the state:
• White Women – He was losing them in our Florida poll
• People ages 31 to 50 – He won this group handily in 2008, but with the economic challenges and housing struggles, this group is more disenchanted than before.
• Florida Latino voters – the Cuban community makes up a significant share of the shift of voters to the Republican Party.
• People understood Ryan’s plan better than Obama’s plan – At the time, Paul Ryan had provided Mitt Romney with cover for lacking details about his economic and budget plans. People could at least understand and make sense of what Ryan wanted to change about Government. President Obama’s plan still seemed vague to most voters.

With respect to our Michigan polling, some people have attributed a bias against minority voters in our polls. That is false. Per our PVBA model (which is built on 20 general election cycles' worth of historical voter turnout statistics for age, gender and ethnicity), the 2012 Michigan Presidential general election turnout should have 25% minority voters (African American, Latino American, Asian American, Arabic American, Native American and multiracial Americans). For our sixth poll in a row, our minority voter participation rate was significantly lower, at 16% of the respondent weight, in spite of our poll sample call file universe being roughly 26% minority. This indicates a continuing interest gap among these voting groups.

With respect to the Nate Silver article, Nate made a number of mistakes in reviewing our methodology and demographic reporting sections. He reported our respondent demographics as if they were our projections about turnout. Considering most polling firms don't publish their respondent demographics, I can understand the confusion. Additionally, we reviewed our methodology section and added verbiage to provide a clearer understanding of our polling construct. We still support what Nate does and continue to provide polling data to him, which he continues to publish to this day. Nate actually helped us by making us understand that more data, not less, is the best way to operate in this industry.

I do want to address something else that is discussed in the polling and survey industry, and that is the question of oversampling. Oversampling is a method in which one attempts to gain a higher proportion of a market, consumer or voter segment for statistical or scientific purposes. When a public opinion polling firm conducts a survey and has built the polling sample call file on historical demographic research or multiple exit polling models to strengthen reliability, the intent and goal is to receive proportional responses from all of the cross-tabular groupings. Weighting adjustments are made when specific groups do not respond to the calls or complete the survey in their projected proportion of the universe; when that occurs, you apply weighting models to balance the survey to reflect a proportional universe. If a group did not respond to the poll in proportion to the number of persons from that group in the control (call file), correcting for that is not purposeful under-sampling or over-sampling of the related groups in that cross-tabular universe (e.g., African Americans should be 17.5% of an election universe and make up 18.7% of the control group (call file), yet only 10% actually complete the survey). The term is sometimes thrown out to imply a purposeful deception by polling organizations. We don't operate in that capacity, and we believe most organizations in public opinion polling do not either.
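For the hypothetical African American example in the paragraph above, the nonresponse correction amounts to a single ratio; this is a sketch of the arithmetic, not any firm's production code:

```python
# Hypothetical figures from the example above (all in percent)
election_share = 17.5   # expected share of the election universe
call_file_share = 18.7  # share of the call file (control group)
completed_share = 10.0  # share of completed surveys

# The call file was not under-sampled (18.7% > 17.5%); nonresponse created
# the gap, so completed responses are weighted back up to the expected share:
weight = election_share / completed_share
print(weight)  # 1.75
```

Each completed response in the group counts 1.75 times, restoring the group to its projected proportion without altering who was called.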


Also krawhitham, reporting the call volume and response rate is another part of open disclosure that we have in our polling methodology and process.
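Using the figures from the original post (35,000 calls placed, 1,122 completed surveys), the roughly 3.2% response rate and the stated 95% margin of error both check out:

```python
import math

calls_placed = 35_000
completes = 1_122

response_rate = completes / calls_placed
# 95% margin of error at maximum variance (p = 0.5)
moe = 1.96 * math.sqrt(0.25 / completes)

print(f"response rate: {response_rate:.2%}")  # 3.21%
print(f"margin of error: {moe:.2%}")          # 2.93%
```

The margin of error depends only on the number of completes, not on the number of calls placed; a low response rate raises nonresponse-bias concerns rather than widening the reported interval.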

I am taking the time to provide such detail because I believe that firms in this industry are working to provide the public with quality information for review and assessment. Polling is a snapshot in time, and the respondents are influenced by a number of external factors prior to the call and possibly during the survey call. It is our job to provide clear and open information so that you, the reviewers, can understand the science behind the work and the data that supports the findings. We provide more detail about our work than most of our colleagues in the industry. As two Michigan-based small businesses working within the American Dream, we have a model that we believe fits this business universe and provides a high level of detail and information. We will continue to be open and transparent in our work and will always provide you and the public with as much information as possible. If you have any additional questions, please feel free to email me at efoster@fostermccollumwhite.com.

Thanks for taking the time to read my response.

Sincerely,

Eric Foster

outsideworld

(601 posts)
32. Well
Fri Oct 26, 2012, 08:23 AM
Oct 2012

Thanks for taking the time to address this post and the comments. Expect a rebuttal of all you said in a moment.

ProSense

(116,464 posts)
33. Um,
Fri Oct 26, 2012, 08:29 AM
Oct 2012

with all due respect, a poll showing Romney leading by 15 points in Florida is bunk. If you're trying to justify those results, you're wasting your time.

brindis_desala

(907 posts)
34. This is a known Romney strategy
Fri Oct 26, 2012, 09:06 AM
Oct 2012

Bogus polls to create a false narrative. We must have independent exit polling on Election Day.
