Trump Presidential Campaign Pollster at University of Chicago, CSPAN, December 25, 2017, 4:50am-6:01am EST
next, a trump campaign pollster talks about the campaign's polling methods in 2016. he is joined by strategist jeff roe. from the university of chicago institute of politics, this is just over an hour. [applause] moderator: thank you so much for coming. michael is president and ceo of baselice and associates. he has extensive experience in political campaigns and projects in the u.s. and abroad. his firm has carried out thousands of survey projects.
most notably, donald trump's presidential campaign. he was a research analyst for the national republican congressional committee during the 1988 campaign cycle and conducted opposition research and targeting projects for key congressional campaigns. in 1987, he produced the first-ever orvs. today, he will talk to us about the art of analyzing public opinion. afterwards, jeff roe will sit down with him to continue the conversation. mr. roe is a nationally recognized republican political operative. he has built the largest republican consulting firm in the nation. please join me in welcoming michael baselice to the stage. [applause] mr. baselice: thank you very much. it is a pleasure to be here today. i think i have been tasked with sharing some information about
polling. one of the things we want to look at is what happened in 2016. so, that is where we are going to start. one of the things i want to share with you right away is the headline that took me about 20 years to get. i wanted to get quoted on this. i was asked by a reporter on the eve of the presidential primary in 2008 about hillary clinton and barack obama, a race i was not tracking -- what i thought was going to happen in the upcoming race between obama and clinton. and i said -- you know, i think this race is going to hinge on who turns out. really? so he typed that out, but he was on a deadline. i said, i think the race is going to hinge on not only who turns out, but how many of them there are. that was great.
i put that headline out jokingly, but it really does come back when we talk about polling, in that the race does hinge on who turns out, and which polls had a better handle on that and which did not have a handle on the electorate. my beginning with the trump campaign goes back to june, 2016. you see the name here -- kellyanne conway. everyone knows her. i love the line at the bottom of that paragraph where she was supposed to take up a senior strategic role in the campaign. she did help donald trump when he was considering running in 2012. she was not available until june. i was called by someone about a week before this came out. and he said -- mike, this is like childbirth. they just brought me in for another one, and i would like you to be with me on this one. and that is when i joined him.
kellyanne conway -- everyone knows her story. when paul manafort left the campaign, she was elevated to campaign manager. that was a good day for us. in the two months i was there, going all the way from the end of june until the end of august, we were able to manage a loss of six electoral votes, going from 160 to 154. the race was pretty tight most of the way. mid-september, just a few points' differential. if you fast-forward eight weeks from this point, you find that the race ends with the popular vote at about the same place. a two-point lead for clinton. if you work in third-party candidates, which is often done in polling, having a head-to-head between the republican and the democrat and then throwing in third-party
candidates, you get a different race. what i like is what happens and how close the race becomes and what it looks like depending on how you set up the scale. look at the top one. the scale runs from 50% down to 38%. the one at the bottom goes from zero up to 50. don't think for a second that polling and data cannot be presented in different ways to give you a certain look. certainly, the trump campaign liked the bottom one because it was tighter. at the same time as the national polls that were out through the cycle, there were different statewide polls. one of the things we had to recognize with those is that there were not as many of them
as there were national polls. one of the knocks on what happened in the election was: how many polls were there? looking at mid-october, clinton and trump were just a few points away from securing the electoral votes that they needed. now, i had been with the campaign for several months. we managed to gain 10 electoral votes. we were on our way. minnesota. i am highlighting minnesota for a reason. minnesota was in the tossup category even in the middle of october. real clear politics will put something in the tossup category if they do not have three recent data points. the only public poll in 2012 in texas was a poll by baselice and associates -- maybe the third data point to move texas from tossup to likely republican. without that public poll, there was no reason for anyone to go poll texas. no one really was looking at minnesota. back to the national picture up to
the eve of the election. here is a poll by abc news and the washington post that has clinton with a four-point lead. we can look at horserace numbers all day long and they are interesting to look at. there are other numbers and other questions worth looking at. which candidate is considered to be the most honest? in this series of questions through the eve of the election, donald trump was considered more trustworthy than hillary clinton. the enthusiasm was in favor of donald trump, but then hillary clinton caught up at the end. now, when we look at methodology -- and this is really something we don't talk a lot about because it is so interesting and sexy to talk about the horserace numbers. what is the ballot? who is winning? is that all we need to know? we need to know more about these polls. oftentimes, you can look at the
methodology. here is the methodology for the abc news poll, and it is dated july 2015. it is the same methodology they have been using. they use the same sheet over and over again. digging deeper, the firm producing the poll is from media, pennsylvania. at the bottom of this, it tells us something that is important. it tells us that 65% of the interviews were conducted by cell phone and 35% by landline. does that sound right? is that balanced? a lot of the polls that we were doing were 50-50. one thing for sure is there is no excuse for not finding younger people, because cell phones are how you find younger people. for those of you -- for everyone in the room -- who here has a landline? one guy.
we are not going to find any of you students. they talk about weighting, and they talk about how they weight the data. this makes a big difference. we have to know what that is. they will tell you exactly what they weighted to. here is an article on that data -- an embargo on the data -- talking about how close the race is. going further down into the data, you see some things -- the 47% for clinton. you can see that trump is winning the independents. there were several polls that showed this late in the campaign. late in october, independents started breaking for trump. here is something else, and probably more important than what i was saying.
look at the demographics. this poll has 38% democrat, 31% republican. below that, when you take leaners into account, it tightens up to a four-point differential, which comes pretty close to what i believe it should be in this country. i believe the country is about a three-point democratic advantage over republicans. how do we get there? if we have time, i will show you what the national map looks like on a regional basis. how do i get three points? what was the spread in the actual outcome between trump and clinton? two points. what was the spread between obama and romney in 2012? four points. average the two together, and the most recent data that we have show the country to be about a three-point democratic advantage. if i see a poll that has 10-12 points for democrats over republicans, what will we do
about that? my name is michael baselice, not michael baselice the polling police. i can't do anything about that. i also don't think there should be a poll that has more republicans than democrats in it nationwide, because that is not the way the nation is. this is a look at the early votes in states that have party registration. florida traditionally is a point and a half to two points more democratic than republican in its turnout. when we look at our polls in florida, we had a 1.5-point democratic advantage as well. that is important to know. colorado -- that was one of the states i was charged with. we got all the way within two points -- three points but no better. if you look at colorado, about 35% of those that voted were democrats. only about half the states have party registration.
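the party-identification arithmetic he walks through above -- averaging the last two presidential spreads to get a roughly three-point democratic target, then reweighting a sample that came back too democratic -- can be sketched in a few lines. this is a minimal illustration, not his actual weighting scheme: the 36.5/33.5/30 target split is an assumed breakdown consistent with a three-point edge, and the 38/31/31 sample mirrors the poll discussed above.

```python
# Cell weighting by party ID -- a minimal sketch, not a production weighter.
# Target spread: the average of the last two presidential spreads
# (2 points in 2016, 4 points in 2012).
target_spread = (2 + 4) / 2          # -> 3.0 points

# Assumed target party mix consistent with that spread (hypothetical split).
target = {"D": 0.365, "R": 0.335, "I": 0.300}

# A sample that came back 38% D / 31% R / 31% I, like the poll above.
sample = {"D": 0.38, "R": 0.31, "I": 0.31}

# Each respondent gets weight = target share / sample share for their group,
# so the weighted sample matches the target party mix.
weights = {g: target[g] / sample[g] for g in target}

print(f"target spread: {target_spread:.0f} points")
for g in ("D", "R", "I"):
    print(f"{g}: weight {weights[g]:.3f}")
```

the same arithmetic is what lets a pollster flag a nationwide sample with a 10-12 point democratic edge as implausible: the weights needed to pull it back to a three-point spread would sit far from 1.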
texas does not have party registration. we have to look at other measures. now, we go to the eve of the election. if you are me and you wake up at 3:00 in the morning, eastern time, which is even earlier here for you, and you see clinton has a 90% chance of winning, what are you to believe? are we looking at this public polling data and are we observing it? reuters has a story. they talk a little bit about it and they are almost giving themselves some cya. michigan and ohio were too close to call. it could be enough to tilt ohio and michigan to trump and put pennsylvania in play. by this time, ohio is looking like a trump state. but here, it is looking more like a tossup. michigan and pennsylvania were looking closer.
it also shows that clinton enjoys a one-point lead in florida. most of the polls in florida were close. the public polling was pretty good there. yes, because it does matter who turns out. now, this is a look at the map as it finally ended up, with 232 electoral votes for clinton. some don't like the look of that map. let us make it a cartogram -- allow the states to keep the color they are but base it on the size of the state. that looks more balanced. if you don't like that, you can go to different shades of red and blue. not everything is really red or really blue. there are purple counties throughout the country. if you apply that to the cartogram, that is your nation.
how trump won. quickly, to talk about this, a lot of things were said after the election. a fundamental rewriting of the map. the myths went far and wide. mike murphy said: i have believed in data for 30 years in politics. please keep believing in data. tonight, data died. it did not die. we are still around. you still use pollsters, don't you, jeff? sobriety about what happened tonight is essential. another claim was that polls could not keep up with changing demographics. i might take this quote to task. the polls have demographics in them. the key is -- were the polls right about the demographics? looking at another poll -- this one came pretty close. they tell you how they did the poll.
some of us in the polling field have a problem with doing a survey of adults and then looking at the likely voters within that. now, you have to wonder if we are getting the right number of interviews in each state. at least they tell you what they did. there were 940 likely voters, and the margin of error was 3.2%. they tell you, when you get into the extra tab, that 36% of the respondents were democrats. a little too democratic by my standards for a nationwide poll, but that is what they had. and a point spread -- but a one-point race.
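the 3.2% figure for 940 likely voters matches the textbook margin-of-error formula at 95% confidence and maximum variance (p = 0.5). a quick check, assuming simple random sampling with no design effect from weighting:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a proportion from a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# 940 likely voters, worst case p = 0.5
print(f"{margin_of_error(940):.1%}")  # -> 3.2%
```

in practice, weighting inflates this figure a bit (the design effect), which is one more reason to care how hard a poll's data had to be weighted.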
the democrats, nine out of 10 were voting for hillary clinton. and nine out of 10 republicans were voting for trump. another poll showed him winning the independents at the end. the national polls don't mean anything, even though they came the closest. you need to look at the states. arizona was pretty close. colorado. resources required us to go elsewhere. colorado, 2.9% was the real clear politics average. the best poll that i showed in the final days was a three-point deficit. i was one of those that recommended that we go elsewhere. if you think that we had this great science and formula behind the decisions to move out of colorado and go to michigan and pennsylvania, let me put in my two cents -- it was kind of a necessity. if you are going to lose nevada, virginia, and colorado, the map requires that we go somewhere else. out of necessity, we were looking at wisconsin and
michigan and pennsylvania. florida showed us to be pretty close to even. final outcome, 1.2%, advantage trump. georgia ended up being -- a number of the states were pretty good. iowa was a little off the mark. public polling said it would be tighter. in the end, trump defeated clinton by nearly 10 points. the state of maine was in the right direction. michigan -- most people thought michigan was going to lean in favor of clinton, and there was a slight differential in favor of trump. nobody was close. new hampshire.
margin of error. new mexico -- we took a look at it, but we could not see it getting better than this. north carolina was close. it turned out that it was going to go for trump and set off a wave of things on the east coast that were not good for the clinton campaign. ohio missed the mark. why? same thing as with any of these polls. i complain about some of these polls being too democratic nationwide. some of the polls in ohio had too many democrats. if you have too many democrats, you get an inflated view for hillary clinton. pennsylvania was pretty close. virginia was pretty close. most of the states were pretty good. some of them were off the mark. the knock is, because of the predictive models out there, that the polling was off. here is something we have to look at. look at what i circled. johnson in the public polling is getting 4.8% in the four-way ballot test. jill stein is polling at 2.3% and gets 1.1%. you're looking at about 2.4% of the vote that did not go for these third-party candidates. where do they go? to trump. and clinton falls two points short of the public polling numbers.
there were some votes that we can talk about later, called the hidden trump vote, and how we uncovered that. here, you see polls done in the final days in michigan. only one poll, the last one, done on the sixth with 1,200 voters, had trump with a two-point lead. wisconsin is interesting. clinton has a 6.5-point lead. if you look at the data, there is nothing inside the second of november, or the third, fourth, fifth, sixth, or seventh. it is somewhat dated. it goes back to the 26th of october. my complaint about wisconsin is that there were not enough public polls in the state, not that the data was necessarily wrong. we have to keep in mind that polls are the balance sheets of politics. anyone in here in business? taking business classes? if you have accounting, they teach you that balance sheets are assets
against liabilities -- at a given time. that is what a poll is. at a given time. a poll on october 26 is different than one on november 6. the income statement, which measures net income or net loss in business, is the election returns. election returns tell us performance over time. we see that every two years. a quick look at pennsylvania. hillary clinton lost pennsylvania. here is what is interesting, and what we did not see until about two to three weeks to go in other states like this, particularly florida. the margin of victory in the suburbs was pretty similar to what obama had done in 2008 and 2012. you average out obama's margin of victory in the suburban philadelphia counties, you get about 179,000 votes for hillary
clinton. philadelphia county, which is the same as philadelphia city, has gone democratic all the way back to 1988, and she gets a 455,000-vote margin there. not quite what obama had done in his races prior, but add that to the margin of victory that she enjoyed in the suburbs, and she was basically where obama was. what we saw in pennsylvania was a 2012 turnout. we saw things in philadelphia looking like we expected them to look in terms of turnout. the rest of the state -- if you are familiar with pennsylvania, it is a rectangle. the rest of the state looked different. bush beat dukakis there in 1988. clinton won his two races there. and then the other 62
counties. you can see that bush beat kerry there by 250,000. in 2012, romney defeated obama there. trump wins by 700,000, more than covering the vote differential that he was down in the southeast part of the state. these bring up some points that we were not used to seeing and that pollsters don't like to admit. but we had to. we saw miami look like 2012 -- the typical presidential turnout that we have seen. but then, there were places in the panhandle, jacksonville, tampa that looked like 2014, and that would be good for republicans. pollsters like one turnout model for the whole state. i have done high-turnout models with higher-than-expected african-american voters and latinos. for the first time, we had turnout models in one part of
the state that maybe needed to be applied, and a different one in another part of the state. you saw a lot about these different programs, the upshot and 538, making these predictions and different paths to victory for one side or the other. clinton had more paths to victory based on the public polling and historical data. when you look at the map on the 21st of october, it was 262 votes for clinton and she is getting close. ohio was still pretty tight in the public polls. a point lead for trump. it ends up really taking off for trump in the final weeks. i am showing this because we have to look at what happened, the data within the poll, and what is in the poll. real quick about the exit poll. it only took place in certain states.
only 15% were senior citizens. a good fourth of the vote or more is going to be senior citizens in any presidential election. exit poll data ends up being weighted anyway to reflect the ballot, but it did not reflect the ages. i think they had the right african-american percentages and the latino percentages. you almost have to weight the data to make it look like what happened in those particular states. i will conclude with these comments, and then we will go on with some questioning from jeff. the trump targets were identified as voters who wanted to change the country but were not yet supporting him. that is something we uncovered with our polls. peeling off third-party votes. looking at people that were voting for stein or johnson.
we also had to expand the map and get beyond the states we were going to lose and look at other states. and we had to capitalize on some enthusiasm for trump and deliver on life-improving messages. regardless of what you may think about the two candidates, one of the things we noticed in our data was that hillary clinton was not associated with anything different. trump was going to build a wall. reduce immigration. turn back trade deals. bring jobs back to america. cut taxes. bring a higher credit for working moms for child care. a lot of things he was talking about were issues we felt were resonating. and there was a hidden trump
vote, and i can talk about that. why don't i stop here, jeff, and we can talk more about what happened and what is going on with polling. that was good. [applause] mr. roe: i wrote out some questions. i tried to anticipate what some folks might want to ask. we have seen some important swings in african-american turnout rates and white working-class rates. how did pollsters -- how do you make the decision, when you construct your sample, to account for that? mr. baselice: the first thing we have to do is get an understanding of what the turnout should be based on recent elections. we start there. in texas, for example, i know that there is going to be about 12%-13% african-american turnout, and around 20% should be latino, hispanic. some people might say -- wow. texas is almost 40%
hispanic. if you look at the adult population, it is maybe 38%. then you look at registered voters with a hispanic surname, and it is about 25%; the turnout is about 21%. so you have to understand the electorate and then look at other factors. enthusiasm. and screener questions to get people in the survey in the last days of the campaign. have you voted early or not? if you pick a candidate, are you definitely voting for that candidate? when you look at those answers -- whether it is african-americans or anglos, different age groups, or partisan groups -- that can affect your model a little bit, and it can make you want to weight data differently. i had this happen in 2002 in texas. there was an african-american candidate running for senate.
tony sanchez was running for governor against rick perry. at the top of the ticket, there was an african-american on the democratic side and a hispanic. they were talking about, at that time, a 25% turnout of latinos and 15% african-american, but there was no historical data to back that up -- just because there were two candidates on the list that were black and brown. could i argue that if these candidates were anglo, more anglos would vote? no. and so, you have to look at other questions and see how they are answering them to see if the interest is there, and weight the data accordingly. mr. roe: you started out talking tonight about cell
phones and landlines and the percentages of each. i think we had three hands go up. i have a landline. how do you account for that? how do you account for people that do not take the entire survey or refuse? maybe walk the crowd through how you actually get people on the phone. mr. baselice: when i started in the business in 1989, we had 166 phones down in houston, texas. they were eventually bought out by gallup. phyllis was the head of the phone bank. she was one of the original phone people. she was concerned that at that time, spring 1989, we
were approaching two refusals for every completed interview. how would we ever get a valid sample? now, it is 15. we get 15 refusals. and there are standard refusals -- soft refusals, which is a polite "thank you." and then there are hard refusals and a hangup. they add up to about 15. mr. roe: does that include no answers? mr. baselice: no, we will talk to 16 people and one will do the survey and 15 will refuse. other people screen out, which is another calculation. half of the calls we make go to voicemail. it takes a lot more dials. it used to take about 15 or 16 to get a completed interview, about 28 or 29 years ago.
and now, it can take 100, 110, depending on where we are calling. fortunately for polling -- and you just saw in the national election that the national polls were pretty good -- by telephone, we are seeing people that are old and young, different income levels, black, white, brown, rural, suburban, answering surveys at near equal rates. we are getting representative samples. the first thing we do when we go to polling is get a male-female quota in each part of the state. after that, we look at ages. partisanship.
and race and ethnicity. when i started in the business, you could do 1,000 surveys in a state and let it roll. now, it is harder to find young people. you call a landline and there is a 4% chance you will get a senior citizen. trying to find an 18-24-year-old -- we even ask for the person, and then we go on to someone else in the household. so we needed the cell phones. if you have a cell phone and we think, dialing this cell phone, that you are 42 years old, should we find you as a 42-year-old? someone else answered the phone that qualified, so we talk to them. there is a lot more that goes into it now. we check our demographics as we go through the polling. we may go to the last hour of the last night and say -- stop calling republicans in this region of this state. we need democrats and independents. mr. roe: do you find that people answer fewer questions? as a person that has utilized your services, i don't feel the length of the survey has been reduced.
mr. baselice: we are doing more cell phone interviewing. in some of our national polls, we are doing 50% -- i'm sorry, one third cell phone, one third landline, one third internet. if we can just get someone to answer the call and go through the first couple of questions, we have them. people don't get through six minutes and then say -- something is burning on the stove and i have to go. if we can get them started, they will continue. if they don't get through the demographic questions at the end, we can't use that survey. throw it away.
we can try calling them back if we lose the connection and finish it, but we don't want to wait. we want to continue while the thoughts are present in their mind. we don't want to start over the next day. they may have a new opinion of the candidate. mr. roe: is that called the incidence rate? mr. baselice: incidence is the percentage of people that qualify for the survey that you end up talking to. the easy one is -- are you registered to vote? you would think the incidence from a voter list would be high. but the phone number may be used by someone else. it now belongs to someone else in that county. so the first question is -- are you registered to vote in this county? from there, we calculate the percentage of people we have on the phone and start talking to against those that qualify. we are looking for incidences of over 40%. when they fall below 30, it becomes expensive.
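the figures in this exchange -- roughly 15 refusals per completed interview, about half of dials going to voicemail, and a target incidence above 40% -- can be combined into a rough dials-per-complete estimate. this is a back-of-the-envelope sketch that treats the stages as independent, not how a call center actually models its yield; the rates are the ones quoted in the conversation, used here as assumptions.

```python
# Back-of-the-envelope funnel from dials to completed interviews,
# using the rates quoted in the conversation as assumptions.
answer_rate = 0.5        # about half of dials go to voicemail
incidence = 0.40         # share of people reached who qualify for the survey
complete_rate = 1 / 16   # one complete for every 15 refusals

completes_per_dial = answer_rate * incidence * complete_rate
dials_per_complete = 1 / completes_per_dial

print(f"~{dials_per_complete:.0f} dials per completed interview")  # -> ~80
```

push the incidence or answer rate down a little and the figure climbs quickly toward the 100-110 dials per complete he mentions.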
if you are looking at a likely voter survey, and you start throwing out people because they aren't interested, you can get the survey completed, but it will take more time on the phone. mr. roe: you talked about internet polling. for me, this is one of the first cycles that we have used that in a meaningful way and trusted it. i think we have tried it over the years. do you see the same participation rate? mr. baselice: we thought we were going to show tv commercials to republican primary voters, and we had 80,000 of them. we invited every one of them and 32 people did the survey. a random invite on the internet does not work. you have to have panelists. you have to have people that agree to doing the survey.
we have been effective with that lately in our national surveys. when we do 1,200 interviews nationwide, we will do 400 by cell, 400 by landline, and 400 by internet. that gives more people a chance to participate in the survey. one of the knocks that you have to be aware of with internet surveys exclusively is that they can be too anglo, too educated, and too young. you have to be mindful of your demographics. i did a google survey to see the demographics in texas last summer, and that is what happened. you can pay two dollars an interview and get more senior citizens or minorities to participate. mr. roe: that is through email. mr. baselice: that is the google survey. when we do the other panel surveys, like we do at the national level on issues like tax reform, we have panelists signed up around the country.
mr. roe: and they will take them in the required time frames. mr. baselice: you can take a survey faster than when you have someone asking you questions. mr. roe: kellyanne conway said the president would do better with cell phone -- i'm sorry, with internet interviews than he would with landline or live interviews. and she said it was the shy trump voter. did you buy that? mr. baselice: we started looking at that. in some places, we saw that, and in others we did not. we started looking at the psychology of it -- when you are in front of your camera, it is like being in the ballot
box. there are polling places now that have a screen that you look at and you're just punching numbers. it is more akin to what you do online, and it is private. it is more like you are voting. what we started to look at were questions to uncover the hidden trump vote. we asked questions along the lines of -- do you know someone that supports donald trump for president but won't say so publicly? interesting. and when it first came out in our daily call, i suggested that it didn't mean anything unless we asked the same thing about hillary clinton. what will we do with this number when we get it if we don't know what the hidden clinton vote is? i can remember florida, with about a week or 10 days to go, there was about a 12-point differential, with more people saying they were going to vote for trump than clinton. what do you do with that 12 points? that was something new to us.
it told us that there were more people out there for trump than would admit it. mr. roe: there is a theory about polling and herding -- the herding of pollsters. people start to see -- this is from public surveys -- what other pollsters are using for demographics, and so they use the same demographics so they are not accused of trying to manipulate the data. mr. baselice: that is a problem. if you look at states like ohio, and you don't have enough republicans, and everyone is following the same path, all of the surveys are underrepresenting the republican candidate. the worst thing that can happen is, what do you do if you are the clinton campaign or the trump campaign and you follow that path? we did not. we had set weights we were
looking at from the moment i joined the campaign. we looked at our states. colorado was going to be balanced. i had a few polls with 35 or 36 and we were still down three points. move along. and so, we had our demographics known. the key was not to be swayed by the public polls. you can look at the public polls if you are a campaign, but you cannot rely on them. mr. roe: do you think there is a philosophy that whoever people think will win is more important than who they vote for? is there any psychological motivation behind media or liberal groups that poll to just make this a fait accompli? mr. baselice: the other
position would be: everyone is voting for her, so i don't need to vote. it can work both ways. the biggest mistake i saw was in 2006, when calderon in mexico won. he said -- i am ahead in the polls, vote for me. that is not a reason to go vote for someone. or, go vote for me, i am behind in the polls and i need an extra bump. we have not seen enough data to come to any conclusive result or conclusion as to what the psychology is behind numbers being thrown out to the public in these polls and what they
mean. you can look at this last election, and there were a lot of indicators from the public polls that showed advantage clinton. when it did not turn out that way, everyone started to question the numbers. the national polls were closer. some states were off, but a number of them that had current data were pretty close. mr. roe: if you look at some of the symposiums after the election that talk about what happened, the democrats seem to be relying on the second comey letter as the explanation. what impact did you see from the comey letter? mr. baselice: inconclusive. i can't say that it was a determining factor. what i wish we had now was more
polling that went to the end and asked that question in a number of different ways, to learn its impact. same thing -- my mentor at the tarrance group wrote a book, and he said that it takes a lot of data -- it is still inconclusive. we just don't have enough data one way or the other. mr. roe: when you talk about weighting -- there was more information that showed things going for trump in the last two weeks of october. would that trajectory have continued? let us talk about the weighting of a survey. is there a standard measurement of how far you will weight? if you have people making over $100,000, and you know in that district it is highland park and 5%.
how far will you allow that to weight? mr. baselice: if we have 5% of the district in highland park, then that is supposed to be the number of interviews we are seeking. a 500-sample survey, and we need 25. what makes a difference is the content of those 25 people. it should probably be about 12 male and 13 female. when i do a statewide in texas, i have 16 sample regions.
i have two in dallas, two in fort worth and two in houston. the tendency for pollsters when they call a county is that they find more anglos than minorities. i was finding too many anglos in these urban centers. so i broke out the more minority portions of these areas, and i have a quota for each. that might be over the top, but i figure if i'm going to do this, i have to report data that is representative of the electorate. it may cost more to have more quotas set up. and now i know how many hispanics and african-americans come from these regions. i will still get the proper number of interviews, but now, my content -- as a matter of fact, if i could, let us take a look at this. i have it down here further.
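the quota arithmetic above (5% of a 500-sample survey is 25 interviews, split roughly evenly by gender) can be sketched in a few lines. this is an illustrative sketch only -- the region names and shares below are hypothetical, and `pct_male` is an assumed electorate split, not mr. baselice's actual figure.

```python
# quota targets for a fixed-size survey: each sample region gets interviews
# in proportion to its share of the electorate, and each regional quota is
# then split into male/female cells so the content of the sample matches too.

def region_quotas(total_interviews, region_shares, pct_male=0.48):
    """region_shares: {region: fraction of electorate}.
    returns {region: (total, male, female)} in whole interviews."""
    quotas = {}
    for region, share in region_shares.items():
        n = round(total_interviews * share)
        males = round(n * pct_male)
        quotas[region] = (n, males, n - males)
    return quotas

# hypothetical shares -- a real texas statewide would use 16 regions
shares = {"highland park": 0.05, "dallas minority": 0.10,
          "dallas anglo": 0.10, "rest of state": 0.75}
print(region_quotas(500, shares))
# the highland park cell works out to 25 interviews, about 12 male / 13 female
```

setting a separate quota for the minority portion of an urban region, as described in the talk, would just mean adding more entries to `region_shares` rather than changing the function.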
we have to talk about this also. i know you will want to talk about that ted cruz thing. nationwide, i was telling everyone earlier about how 47% of the country is democratic and 44% republican, based on the averages of the last two presidential elections. gallup had several hundred thousand interviews in certain states in 2015 and 2016 that mirror the same thing. in the northeast, it should be about 24 points. if i see a national poll and it is the other way around, i'm going to question that poll immediately. when did the northeast start voting like that? michigan, pennsylvania, maryland, delaware, ohio, and indiana -- that is a microcosm right there, partisanship wise. the southeast states combined are more
republican. five points more democrat than the central -- the south central. i am from texas. oklahoma, louisiana, mississippi: very republican. if i see a poll that has more democrats in that region, how did that happen? and now the pacific states. if i see a survey that has as many republicans as democrats on the west coast, i question that. we know that data. when i look at a national poll, i question it. there was a poll that just came out from fox news. you will love the headline. this is amazing. it just came out this morning. i saw it getting on the plane. the headline here is -- fox news poll: storm erodes trump's
ratings. that's the headline. look at the numbers on trump and his job approval: 38%. first time fox news has had trump below 40. he was at 42 in mid september. mid august, 41. july, 41. now he's at 38%. and it's the storms -- that is the headline. what should we look at? not only who is in there but how many of them. if we look at the demographics, this poll this time is 47% democrat, 35% republican. the one in august was three points more democrat -- the same as the nation's three-point democrat edge over republican. this one is twelve. maybe the demographics had something to do with the erosion of the job approval. if you read the article, they talk about it.
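the composition point can be made concrete with a quick reweighting exercise: hold each party's approval of trump fixed and re-run the topline under two different partisan mixes. the subgroup approval numbers below are invented for illustration -- they are not the actual fox news crosstabs.

```python
# how party mix alone moves a topline: same approval-by-party numbers,
# reweighted to two different partisan compositions of the sample.

def topline(approval_by_party, party_mix):
    """weighted average of subgroup approval; weights are party shares."""
    return sum(approval_by_party[p] * party_mix[p] for p in party_mix)

# assumed subgroup approval of the president (illustrative only)
approval = {"dem": 0.08, "rep": 0.80, "ind": 0.35}

aug_mix = {"dem": 0.47, "rep": 0.44, "ind": 0.09}  # ~3-point dem edge
oct_mix = {"dem": 0.47, "rep": 0.35, "ind": 0.18}  # 12-point dem edge

print(round(topline(approval, aug_mix), 3))  # 0.421
print(round(topline(approval, oct_mix), 3))  # 0.381
```

with identical approval inside every party, the twelve-point-democrat sample reads about four points lower on the topline than the three-point sample -- roughly the 42-to-38 move in the story, which is the "kick the tires" point.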
meanwhile, the white house received mixed reviews for its response to recent disasters. you have a separate question that gives him mixed reviews on the hurricanes, and yet the headline is that. one thing that we can take away from this tonight is: kick the tires, look at the methodology and see how many republicans and democrats are in the polls. mr. roe: let us start with some questions. yes. >> thank you for the talk. i have two questions, if that is ok. my first question is -- how do the questions you ask leading up to the election impact the questions you will ask in the next election? you talked about the shy trump
supporters. how will those impact your next questions? and to what extent do you think pollsters will start using artificial intelligence, like data mining systems, so that you can see hidden patterns? mr. baselice: i will start with the first one. we are asking questions now about donald trump's job approval and we are seeing 41 or 42. we are also asking other questions, almost like we did in the campaign. do you know someone that approves of the job trump is doing but won't say so publicly? and how about this -- even if you disapprove of the job he is doing, are there some things he is doing that you do like? you take those questions together and you have two thirds of the country that likes something he is doing or likes him overall. that is a much bigger
number. it allows those that have issues to then target another group of individuals that now like something he is doing. we are going to put out the same kinds of questions the next time around. jeff has some analytics he used when he was helping ted cruz get as far as they did. good race, jeff. we looked at acxiom data. nfo group. you may be a wine enthusiast or a "two." we don't even know what a "two" means. but when we see that combination, and it is showing up again and
again with other voters, and you are undecided, ok -- but you are a conservative? now, we have patterns to look at across large numbers of voters. i understand they were doing some data mining but still looking at the ballot test and the name ids, not digging into why you are voting for hillary clinton or why you are voting for donald trump. they did not know the why. we do get large numbers of interviews completed, and when you combine them with the other data, algorithms can be trained to look for other things that voters have in common. and yes, campaigns are using it. democrats were using this before the republicans. we finally caught up in this last election. mr. roe: if you are a cat lover and you like wine, you are probably not conservative.
>> my question is, i have always found it pretty fascinating to look at the hidden trump supporters -- the people that would not say "i am going to vote for him." are you comfortable talking about the things you can use in the future for those kinds of people that would vote for him but would not admit it, maybe even to themselves? mr. baselice: and there may be a question along those lines that i can put out there -- where do you find these votes? there are x
number of votes we are finding for him. then, it was a matter of -- we need some more people to win florida, or more in pennsylvania or michigan. and we had combinations of questions. and you start building there. an easy one would be -- a gender question and an age question. put them together. now, you have males under age 55: two variables that come together to make a complex code. then we start taking other questions. would you prefer someone that wants to take the country in the same direction as obama, or do you think the country should be led by someone that can take the country in a different direction? early on in the campaign, that was great for trump. it was 60-30 for trump. after labor day, it was 53-38, new direction versus keep it going as it is. you may have someone that is undecided on the ballot that wants to go in a new direction. so we looked at that.
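the "complex code" idea -- stacking two or more questions into one segment label, then hunting inside the segments for persuadable voters -- can be sketched like this. the respondents and field names below are made up for illustration; they are not campaign data.

```python
# build a complex code from several survey answers, then pull the cell the
# campaign cares about: undecided voters who want a new direction.

def complex_code(resp):
    """combine gender, an age cut, and the direction question into one label."""
    age_band = "under 55" if resp["age"] < 55 else "55+"
    return f"{resp['gender']}, {age_band}, {resp['direction']}"

# invented respondents for illustration
respondents = [
    {"gender": "male", "age": 42, "direction": "new direction", "ballot": "undecided"},
    {"gender": "male", "age": 61, "direction": "same as obama", "ballot": "clinton"},
    {"gender": "female", "age": 35, "direction": "new direction", "ballot": "undecided"},
]

# persuasion targets: undecided on the ballot but wanting a new direction
targets = [r for r in respondents
           if r["ballot"] == "undecided" and r["direction"] == "new direction"]
for r in targets:
    print(complex_code(r))
```

the same pattern extends to any pair of questions -- each added variable just multiplies the number of cells, which is why large interview counts matter before the cells get too thin to read.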
we even looked at people that were probably leaning toward hillary clinton. we just put aside those that were definitely voting for hillary clinton. we started looking at other combinations of things to come up with where to go to get a few more votes. we knew it was going to be close. we already had a few questions we were developing to expose the hidden trump vote. but what do you do with 36% of the people that say they're going to vote for trump but won't say so publicly, and 24% saying the same thing about clinton? we are still looking at that data. maybe we have to come back in four more years to see if we had expanded on that. >> you had talked about how after the election people said the data was lying.
looking at 2018, how are people going to trust data -- even though the methodology does not lie, it does not tell a complete story. how will you get politicians to hire you, or get citizens to trust you, going forward? what is the potential risk for the data industry writ large? mr. baselice: we look at demographics closely and we are mindful of them, whereas the media -- that is not their first interest. their first interest is getting a story out there and sharing the job approval ratings. the short answer is that there is nowhere else for candidates to go but to the existing polling industry. there are new tools out there. we are looking at internet polling more. quick surveys on the
internet. mining data. voter scoring. voter scoring does not answer the whys. it just shares with you some issues that are important. it is a combination of tools being used. within the industry, to be trusted -- we have all fallen flat on our faces, on both the democrat side and our side. and we are still here. and so, until they can figure out how to replace us -- jeff, don't replace us yet, give me a few more cycles. keep us around. we'll keep innovating. i am playing around with the notion of going back to door to door. it was given up in the 1970's, reluctantly, because not everyone had a land line phone, but there were enough people of different demographic
backgrounds that had phones that i could get a good sample. so they said they would give up going door to door. in mexico, i do door-to-door and telephone surveys. not everyone has a telephone. another way -- and you may have had someone come to your door with an ipad who asked you questions right away. it is just like doing it over the phone. we could do these kinds of surveys over the weekends. in election years when the sun is still up, you can do that. but after halloween, you would have to do a lot of saturday and sunday polling. we have to think of some other things that will help us collect data and give everyone a chance to be interviewed enough to get a representative sample. another way to answer that is
that we will be around in our industry as long as we are useful. if we can't hit the mark time and time again, candidates will look for something else. mr. roe: to add on briefly -- analytics can tell you, for example, instead of 6% in iowa, it can give you more specifics. but without polling you can't understand what moves them: there is a difference between agreement and movement. are you against opioids and the abuse of opioids? yes. but does that move your vote toward somebody that agrees with you? you can't figure that out from data and analytics. you can only figure that out with a person on the phone. different ways to collect -- costs are out the door, 15 bucks to get a survey. i think door-to-door is another avenue.
>> i remember something you did in iowa. i remember reading about it. maybe you can expand on it. when cruz found, through some of the data scoring or voter scoring in the survey, you found some people -- i think they were upset about the fireworks ban issue. they said that's an infringement: i should be able to shoot off fireworks if i want. and you were able to reach those people about government overreach. isn't there something to that? mr. roe: the one that we really used a lot was red light cameras, which we found in the des moines suburbs that republicans hated. if you can find an issue people agree with specifically, it is a special moment in politics. usually, when you call on the phone and find an issue that they agree with, it's a really
powerful moment in politics. we were able to do that with analytics. it wasn't the big issues of the day. we were talking about fireworks bans and red light cameras -- niche issues that were just important to that one voter. >> time for two more questions. get a mic here in the middle. >> i was wondering -- does polling -- do you believe polling had a real effect on corker and flake's decisions? mr. roe: i don't know. i've been around people that have served in congress and the senate, and it's a hard road. it can wear you down. i do not know if the polling was part of their calculus in their decision not to run. i don't know. but it is a hard life. you have a little bit of a life
in congress, depending on if you are chairman of a committee, but i don't know how much polling played into their decision. they just may have had enough. yes, sir? >> i see you have the map divided into 7 different sections. it is interesting to look at the numbers. i'm wondering if there have been studies that take a deeper dive into each of those areas and try to explain why they are the way they are in terms of party affiliation. is it cultural, economic, that kind of thing? have there been more in-depth studies into what makes them the way they are? mr. baselice: one comes to mind immediately.
this was a few years ago. i don't know what happened -- i kept getting calls, three different news outlets for the same story: hispanic voters in california versus hispanic voters in texas. you look at the thousands of interviews i have collected, and in texas about 30% of the hispanic community votes republican. only 24% votes republican over in california. so there's a six-point differential. what's the difference? hispanic voters in california are more liberal by about 15 points. then there's things we don't measure: the lifestyle, the news that you get, the taxes that are different. there's no income tax in texas. there is one in california. there are all kinds of
variables to be looked at. why do you think this way? what kind of news are you getting? there are many factors. what we don't often do after an election is look back and say what percentage of this outcome was because of the demographic makeup of these people -- their spiritual beliefs, economic position and education. we do not do enough to look at all these variables. we are so happy with the victory and so sad about the loss that we piddle around with it but do not do in-depth studies. once in a while, an independent party will pay for one, and folks on the other side of the aisle will do the same and dig deeper into attitudes. quite frankly, more of it should be done. i think we should get a poll done very quickly.
[laughter] >> we'll do one more very quickly. >> another question i had: a lot of the questions that you ask seem to be binary -- yes, no, this, that -- or satisfaction scales, thinking about something like job approval ratings. when does it switch to approve or not approve? how does that play into the methodology and polling data you use? mr. baselice: those are the questions that are easy to ask. one is: tell me why you think this republican healthcare alternative is a good idea. what can you tell me about it? those are open ended questions. then we do viewpoint questions, where we put out both sides of the argument. some people say tax reform is a good idea because of x, y, z, or this plan is good because of x, y, z. then we say others say it's a bad idea because of a, b, c.
so you put out viewpoints. it still gives a choice. most of the questions are designed that way. there are also questions being done on scales. i mentioned one earlier: on a scale of 1 to 10, how interested are you in this election? boy, did we see something in 2014, where republicans were much more interested than democrats. it helped them in the midterm elections. one thing that helped the current republican governor -- you look at the 2014 election results and that is a big deal. one of the mistakes that i saw made was not looking at those types of scales in 2012. i did six surveys of u.s. senate races for a business group and i said democrats are going to win all 6 seats. they said, are you crazy? people did not like obama in 2010. republicans got control of the house.
i said 2012 is different. that is a good obama year. republicans are not going to win any of the seats. we had to look at the scale questions. most of the questions are designed to be approve and disapprove. they are easy to ask, answer and analyze. >> i want to thank everyone for coming. i would like to thank you for having two right-wing knuckle draggers on the stage. thank you for coming, mike. very insightful. thank you all for being here. well done. [applause] [captioning performed by national captioning institute] [captions copyright national cable satellite corp. 2017]