Wednesday, November 7, 2012

How well did Nate Silver do?

The news is saying that Nate Silver (who does election forecasting at FiveThirtyEight) called all fifty states correctly. It's being reported as a victory of math nerds over pundits.

In my humble opinion, getting 50 out of 50 is somewhat meaningless. A lot of those states weren't exactly swing states! And if he had gotten a few of them wrong, that wouldn't have meant his probabilistic predictions were wrong. Likewise, getting them all right doesn't by itself mean he was right; a probabilistic forecast can't be judged on a single yes/no call per state.
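
To make that concrete, here's a quick sanity check with made-up win probabilities (these are illustrative numbers, not Nate's actual forecasts): even a perfectly calibrated forecaster who always picks the favorite expects to miss a call or two, and a clean sweep is partly luck.

```python
# Sketch with hypothetical per-state win probabilities; these are
# NOT FiveThirtyEight's numbers, just an illustration.
probs = [0.99] * 40 + [0.80] * 10  # 40 near-locks, 10 swing-ish states

# If we always call the favorite and states are independent:
p_sweep = 1.0
for p in probs:
    p_sweep *= max(p, 1 - p)  # probability the favorite wins this state

expected_misses = sum(1 - max(p, 1 - p) for p in probs)

print(f"P(perfect 50/50 sweep) = {p_sweep:.2f}")         # ~0.07
print(f"Expected missed calls  = {expected_misses:.1f}") # 2.4
```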

I thought it would be more informative to look at Nate Silver's state-by-state predictions of Obama's vote share. That way, Nate could be wrong even in states like California. So here's what I did for each state: I took the difference between the vote share reported by Google this morning and FiveThirtyEight's final prediction, then divided that difference by Nate's margin of error. The results are in the histogram below.
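
In code, the computation looks something like this (a minimal sketch; the real inputs would be FiveThirtyEight's final state-by-state forecast and the returns reported by Google, and the three values shown here are placeholders):

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder inputs: Obama's predicted vote share, Nate's margin of
# error, and the reported vote share, one entry per state. The real
# analysis uses all states; three made-up rows are shown here.
predicted = np.array([52.0, 49.1, 58.3])  # FiveThirtyEight final forecast (%)
moe       = np.array([ 2.0,  3.1,  2.5])  # margin of error (%)
actual    = np.array([51.5, 48.0, 59.0])  # vote share reported by Google (%)

# Normalized error: how far off each prediction was, in units of its
# own margin of error. Negative = Obama underperformed the forecast.
z = (actual - predicted) / moe

print(f"mean = {z.mean():.2f}, std = {z.std():.2f}")

plt.hist(z, bins=15)
plt.xlabel("(actual - predicted) / margin of error")
plt.ylabel("number of states")
plt.show()
```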

[Figure: histogram of (actual - predicted) / margin of error, one entry per state]

What the figure shows is that Nate's predictions were more accurate than Nate himself claimed!

The mean of the distribution is -0.14, which means that Obama did slightly worse than Nate predicted, but by an amount easily explained by random error.  The standard deviation is 0.5, which means Nate's quoted margins of error were about twice the size of the actual errors.

Of course, Nate's larger reported error is probably there to cover expected systematic error.  For example, if all states had shifted slightly in favor of Obama, that would be a systematic error; it would move the whole histogram sideways rather than widen it.  If the random, state-to-state component of the error has a spread of 0.5 (as observed), then for the total margin of error to come out at 1, Nate must have expected a systematic error of about 0.85 in one direction or the other, since independent error components add in quadrature.
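
The arithmetic behind that number, under my assumption that the random and systematic components are independent (which is how I'm guessing the margins were built; Nate hasn't said):

```python
import math

total_spread  = 1.0  # Nate's margin of error, in normalized units
random_spread = 0.5  # the state-to-state spread actually observed

# Independent components add in quadrature:
# total^2 = random^2 + systematic^2
systematic_spread = math.sqrt(total_spread**2 - random_spread**2)
print(f"implied systematic error = {systematic_spread:.2f}")  # 0.87, i.e. "about 0.85"
```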