Regardless of the reasons so many of the polls were so far off, it is clear the polling system is broken.
There are a few conservative pollsters who can be trusted, but that’s it for us.
Nate Silver admits to a strong Democrat bias in polling, less so in races involving Donald Trump, though that was bad too.
At one point in this thread, Silver says he doesn’t necessarily expect the anti-Republican bias to continue, but we fully expect it to.
Polls now try to manipulate as opposed to measure public opinion.
The polls, overall, had a huge bias for Democrats. Depending on the race, you could take a poll average and just add four or five or even six points to the Republican total… https://t.co/IlRt1nX38K
— Byron York (@ByronYork) March 26, 2021
That’s the biggest bias in either direction in the cycles our pollster ratings cover (since 1998). It’s likely that some earlier years, certainly 1980 and probably 1994, would have had a bigger bias if you extended back that far. So not unprecedented. But still … not … good.
— Nate Silver (@NateSilver538) March 25, 2021
don’t think we should necessarily expect that polls will continue to have an anti-Republican bias. Historically, the direction of bias is not very predictable as pollsters adjust, adapt, etc. But I do think we may continue to see systematic polling errors in BOTH directions. pic.twitter.com/qRvINCmkLN
— Nate Silver (@NateSilver538) March 25, 2021
In an environment where politics are highly nationalized and polarized, you don’t really have “50 separate contests” for the presidency. For that matter, presidential and downballot outcomes are highly correlated. So if your polls are off in one race, they may be off everywhere.
— Nate Silver (@NateSilver538) March 25, 2021
The other big finding is that we no longer see a clear rationale to give live-caller polls a higher grade by default in our pollster ratings. For one thing, they haven’t particularly outperformed other methodologies. pic.twitter.com/iy502D79RU
— Nate Silver (@NateSilver538) March 25, 2021
For another thing, it no longer really makes sense to classify entire *polling firms* by their methodology. Lots of polling firms mix-and-match methodologies, change them in midstream, etc. Methodology is a characteristic of a poll, not the pollster.
— Nate Silver (@NateSilver538) March 25, 2021
That doesn’t mean that quality doesn’t matter. We find that pollsters that participate in professional transparency/data-sharing initiatives continue to get considerably better results. The pollster ratings will continue to reflect this. pic.twitter.com/gsp1olV914
— Nate Silver (@NateSilver538) March 25, 2021
Something else worth mentioning—it’s common sense but it shows up in the data—is that you should be mildly distrustful of pollsters without a track record. As a rule of thumb, it takes about 20 polls before you can have much confidence that what a pollster does is working. pic.twitter.com/xdRRsUEZOa
— Nate Silver (@NateSilver538) March 25, 2021
Finally, here’s how the most prolific pollsters fared in the general election.
The best-performing pollster was AtlasIntel!
Second was Trafalgar!
Yeah, they incorrectly had Trump winning a few states, but they were close on the margins, and that’s the better metric. pic.twitter.com/yqfbjCequB
— Nate Silver (@NateSilver538) March 25, 2021
And … one last piece (for now) of pollster-ratings-related content. A podcast! If the view I expressed in the article this morning struck you as a little too optimistic about the future of polling, this is shaded a bit more pessimistically FWIW. https://t.co/Ag3ddk5cZV
— Nate Silver (@NateSilver538) March 25, 2021