I know we didn't do that, and I don't think any of our competitors, at least in state-level polling, did either. If you look at our polls' demographic composition by the three things we weight for, which are gender, race, and age, I don't think you'll see much difference between the projections in our final polls and those in the other surveys we did in the same states in September and October.
There were ten states we polled the week before the election that we had polled at least one other time in the previous six weeks. Here's how our final poll and second to last poll in each of those states compared:
| State | Second-to-Last Poll | Last Poll | Difference |
| --- | --- | --- | --- |
| | Obama +10 | Obama +10 | None |
| | Obama +1 | Obama +2 | D +1 |
| | Obama +2 | Obama +1 | R +1 |
| | Obama +10 | Obama +13 | D +3 |
| | Obama +2 | Tie | R +2 |
| | Obama +11 | Obama +17 | D +6 |
| | Obama +1 | Obama +1 | None |
| | Obama +7 | Obama +2 | R +5 |
| | Obama +9 | Obama +6 | R +3 |
| | McCain +8 | McCain +13 | R +5 |
| Average | Obama +4.5 | Obama +3.9 | R +0.6 |
In five of those states we showed movement toward McCain, in three we showed movement toward Obama, and in two we showed no change. On average we had McCain doing about half a percentage point better (R +0.6) in our final poll compared to our second-to-last one, hardly evidence we were 'fixing' our polls.
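For anyone who wants to check that arithmetic, here's a minimal sketch in Python (purely illustrative, not anything we actually ran) that recomputes the shifts from the margins in the table above, with each margin signed from Obama's perspective:

```python
# Margins from the table above, signed from Obama's perspective:
# positive = Obama lead, negative = McCain lead, 0 = tie.
second_to_last = [10, 1, 2, 10, 2, 11, 1, 7, 9, -8]
last = [10, 2, 1, 13, 0, 17, 1, 2, 6, -13]

# Shift between the two polls in each state; negative = movement toward McCain.
shifts = [l - s for s, l in zip(second_to_last, last)]

toward_mccain = sum(1 for d in shifts if d < 0)
toward_obama = sum(1 for d in shifts if d > 0)
unchanged = sum(1 for d in shifts if d == 0)

print(f"Avg second-to-last margin: Obama +{sum(second_to_last) / len(second_to_last):.1f}")
print(f"Avg final margin:          Obama +{sum(last) / len(last):.1f}")
print(f"Avg shift: {sum(shifts) / len(shifts):+.1f} (negative = toward McCain)")
print(f"Toward McCain: {toward_mccain}, toward Obama: {toward_obama}, unchanged: {unchanged}")
```

Running it reproduces the numbers above: Obama +4.5 on average in the earlier polls, Obama +3.9 in the final ones, a shift of 0.6 points toward McCain, with five states moving his way, three moving toward Obama, and two unchanged.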
Interestingly, in two of the three states where we showed the biggest movement between the final two polls, we were about the only polling company to do so. Nobody had Obama winning New Mexico by as much as we did, and nobody had him losing West Virginia by as much as we did. In both cases our final judgment ended up being the most accurate, and it's worth noting again that the movement we found in those two states actually headed in opposite directions.
Our only goal, no matter where on the calendar a poll fell, was to give the most accurate picture of the race at that point in time that we possibly could.