17 March 2003

Sorry to temporarily hijack the blog for something related to general blogosphere debates, but I needed to chip in on a math question that came up in these posts at the Volokh Conspiracy and Matt Yglesias's blog.

The question is as follows: two polls each have a margin of error of 3 points, and together they show a decline of 4 points; what are the odds that this decline is actually real?

We compute this as follows. The original distributions are assumed to be roughly normal, with means differing by 4 points and with 3 points = 1.96 standard deviations (this is what a 3 point margin of error means, since landing within 1.96 standard deviations of the mean happens 95% of the time; see this link for how to turn standard deviations into percentages). Thus 1 standard deviation = 3/1.96 points = 1.53 points.
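Just to make that conversion concrete, here's a two-line check in Python (scipy is my choice here, not anything from the original posts):

```python
# Minimal sketch: convert a 95% margin of error into a standard deviation.
from scipy.stats import norm

z_95 = norm.ppf(0.975)       # two-sided 95% cutoff, about 1.96
sigma_poll = 3.0 / z_95      # 3-point margin of error -> about 1.53 points
print(z_95, sigma_poll)
```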

Now we turn to the distribution of the difference of these two random variables. We recall the well known fact (cf. this site) that the difference of two independent normally distributed variables is itself normally distributed, with mean equal to the difference of the means and standard deviation equal to the square root of the sum of the squares of the individual standard deviations. Therefore the difference is normally distributed with mean 4 points and standard deviation sqrt(2) * 1.53 points = 2.16 points.
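A quick sketch of that step, again in Python rather than anything from the linked posts:

```python
# Minimal sketch: standard deviation of the difference of two independent polls.
import math

sigma_poll = 3.0 / 1.96                                # about 1.53 points per poll
sigma_diff = math.sqrt(sigma_poll**2 + sigma_poll**2)  # sqrt(2) * 1.53, about 2.16 points
print(sigma_diff)
```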

We want to know the chances that the true difference is larger than zero. That is the same as asking for the chances of landing anywhere from 4 points below the mean of this difference distribution on up. Translating into standard deviations, we are asking for the probability that a normally distributed variable is smaller than 4/2.16 = 1.85 standard deviations above the mean. Using a standard table, this happens almost 97% of the time.
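For anyone who prefers a calculator to the table, here's the same computation as a Python sketch (scipy's norm.cdf plays the role of the table):

```python
# Minimal sketch: probability that the true decline is bigger than zero,
# given an observed 4-point drop with standard deviation about 2.16 points.
from scipy.stats import norm

observed_drop = 4.0
sigma_diff = (3.0 / 1.96) * 2 ** 0.5                 # about 2.16 points
p_real_decline = norm.cdf(observed_drop / sigma_diff)
print(observed_drop / sigma_diff, p_real_decline)    # about 1.85 sd, about 0.97
```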

Therefore, one can say with almost 97% confidence that these two polls do show that the popularity of the war has declined.

If you want to ask a more particular question, like whether it has dropped by at least two points, this can also be easily calculated using the above method. A 2-point drop sits (4 - 2)/2.16 = 0.93 standard deviations below the observed mean, and the table then gives an 82% confidence level that Bush's Iraq policy popularity dropped by at least two points.
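The same sketch, shifted to the 2-point question (again just my Python check of the table lookup):

```python
# Minimal sketch: probability that the true drop is at least 2 points.
from scipy.stats import norm

observed_drop, threshold = 4.0, 2.0
sigma_diff = (3.0 / 1.96) * 2 ** 0.5                              # about 2.16 points
p_at_least_2 = norm.cdf((observed_drop - threshold) / sigma_diff)
print(p_at_least_2)                                               # about 0.82
```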

Using these sorts of results, here is what we can say at the usual 95% confidence level. Since we only care about error in one direction, 95% confidence corresponds to 1.65 standard deviations, and 1.65 standard deviations is just under 3.6 points. So we can say with 95% confidence that Bush's approval ratings dropped by at least 0.4 points.
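And the one-sided 95% bound as a last Python check (the 1.65 above is the rounded version of norm.ppf(0.95)):

```python
# Minimal sketch: the smallest drop we can claim at one-sided 95% confidence.
from scipy.stats import norm

observed_drop = 4.0
sigma_diff = (3.0 / 1.96) * 2 ** 0.5          # about 2.16 points
z_one_sided = norm.ppf(0.95)                  # about 1.645
lower_bound = observed_drop - z_one_sided * sigma_diff
print(z_one_sided * sigma_diff, lower_bound)  # about 3.56 points, about 0.44 points
```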

So the long and short of it is that Eugene and Matt (I feel weird calling profs by their first names, but anyway) are both right here. Even though the 4-point decline is within the combined 6-point margin of error, you can still say with over 95% confidence that Bush's numbers did actually drop, so the story is accurate. However, Eugene is right that if you really insist on 95% confidence, it's not much of a story, because then you're only sure of a drop of about 0.4 points, which is hardly newsworthy.
