Ever since the 1960s, when the first pollster stopped the first voter and asked how they voted, the media and the politically addicted have viewed exit polls as oracles that explain why candidates won or lost. And when the results differed sharply from the pre-election polls, they were pored over and studied for months afterwards.
It turns out that, rather than being a source of truth, exit polls are deeply flawed and fundamentally biased. That’s according to an analysis of Pew Research data by The New York Times’s Nate Cohn. Pew matched actual voter registration files and vote-history data to establish a profile of who voted for whom in 2016. Their research showed that 45 percent of 2016 voters were whites without a college degree. But the exit polls put that figure at 34 percent. Hmmm.
As it turns out, almost all polls weight their samples to appropriately represent less educated voters. Exit polls are the exception: they aren’t weighted by education, only by demographic characteristics the interviewer infers by looking at the respondent, namely gender, race, and a rough guess of age. Who knows how accurate those guesses are? And education, obviously, isn’t something an interviewer can see.
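To see what education weighting would have meant in practice, here is a minimal sketch of post-stratification: re-weighting a sample so that one group matches its known population share. The two input numbers come from the article (Pew’s 45 percent versus the exit polls’ 34 percent); the computed weights themselves are illustrative, not from the source.

```python
# Post-stratification sketch: adjust respondent weights so that whites
# without a college degree match their known share of the electorate.
# Input shares are from the article; the weights are derived here.

sample_share = 0.34      # share of the group in the 2016 exit-poll sample
population_share = 0.45  # Pew's estimate from matched voter files

# Each under-sampled respondent is counted more than once; everyone
# else is scaled down so the weights still average to 1.
w_group = population_share / sample_share          # ~1.32
w_rest = (1 - population_share) / (1 - sample_share)  # ~0.83

print(f"group weight: {w_group:.2f}, other weight: {w_rest:.2f}")
```

In other words, every non-college white respondent would need to count for roughly a third more than one person, and everyone else for a bit less, to undo the sampling skew.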
Thus, exit polls over-represent young, non-white, and well-educated voters, then compensate by weighting other voters more heavily. So the data is fundamentally flawed. The problem is compounded when pundits and talking heads compare the more accurately weighted pre-election polls to the flawed exit polls. Perhaps it’s time for exit polls to exit the political stage, or at least for their sponsors to thoroughly revise the methodology. Until then, exit polls are our Measurement Menaces of the Month. ∞
Thanks to Paul Sableman at Flickr for the image.