Friday, March 21, 2003
Another poll used inappropriately, by way of the Volokh Conspiracy.
[Eugene Volokh, 5:26 PM]
REPEAT AFTER ME: "I will not believe scientifically invalid polls." "I will not believe scientifically invalid polls." "I will not believe scientifically invalid polls, even if I like their results."
Look, I'd like to think that most MTV viewers support the war; in fact, they might well support the war. But unless I misunderstand the way the poll described here (see also the InstaPundit link) was conducted, that poll doesn't tell us that. It doesn't tell us much of anything, because it counts only those people who choose to vote in it (key phrase: "Among people voting in MTVNews.com's polls . . .," and see also this sample of an mtv.com poll), and we have no reason to believe that they're a representative sample of MTV viewers, or of any other group.
UPDATE: Drat, The Corner falls for this, too.
Remember: if you want to make accurate inferences about a population, your sample selection method needs to be free of systematic biases... this example clearly fails that test. Although I doubt they were trying to make accurate inferences...
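To make the selection-bias point concrete, here is a minimal simulation sketch in Python. All of the numbers (the population size, a true support rate of 50%, and the assumption that supporters are three times as likely to bother voting in an opt-in web poll) are hypothetical, chosen only to illustrate the mechanism, not to describe the MTV poll itself.

```python
# Sketch (hypothetical numbers): why an opt-in web poll can badly misestimate
# a population while a similarly sized random sample does not.
import random

random.seed(0)

N = 100_000                  # hypothetical population of viewers
TRUE_SUPPORT = 0.50          # assume half the population supports the policy
population = [random.random() < TRUE_SUPPORT for _ in range(N)]

def votes_in_optin_poll(person_supports):
    # Assumed participation rates: supporters are 3x as likely to vote.
    p_participate = 0.09 if person_supports else 0.03
    return random.random() < p_participate

# Opt-in "poll": only the self-selected participants are counted.
optin_votes = [s for s in population if votes_in_optin_poll(s)]
optin_estimate = sum(optin_votes) / len(optin_votes)

# Probability (random) sample of about the same size.
random_sample = random.sample(population, len(optin_votes))
random_estimate = sum(random_sample) / len(random_sample)

print(f"True support:           {TRUE_SUPPORT:.1%}")
print(f"Opt-in poll estimate:   {optin_estimate:.1%}")   # roughly 75%: badly biased
print(f"Random sample estimate: {random_estimate:.1%}")  # close to 50%
```

Under these made-up participation rates the opt-in result lands near 75% even though true support is 50%, and no increase in the number of votes fixes it; only the random sample recovers the population figure.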
Keywords: Polls, BUS230
Wednesday, March 12, 2003
Strong Dollar Advocates
Another topic to add to the list of things to write about:
Strong Dollar policy...What does this mean? Is it a good policy? The reality is it’s an intellectually vacant idea, except that talking about it seems to temporarily influence the occasional ignorant FX trader.
Wednesday, March 05, 2003
Lies, Damn Lies, and Polls
By way of the Volokh Conspiracy. An important point on survey data. The question matters, and the alternatives offered matter even more.
[Eugene Volokh, 7:38 PM]
MORE INCONCLUSIVE POLLS ON IRAQ: Edward Boyd (Zonitics) cites the following poll:
ABC News/Washington Post Poll. Feb. 26-March 2, 2003. N=1,022 adults nationwide. MoE ± 3 (total sample). Fieldwork by TNS Intersearch.
"The Bush Administration says it will move soon to disarm Iraq and remove Saddam Hussein from power, by war if necessary, working with countries that are willing to assist, even without the support of the United Nations. Overall, do you support or oppose this policy?"
[Support] 59%
[Oppose] 37%
[No Opinion] 4%
Cool, I like that. But if you read further on the same page (at the pollingreport.com site, a great resource), you also see:
CNN/USA Today/Gallup Poll. Latest: Feb. 24-26, 2003. N=1,003 adults nationwide. MoE ± 3. . . .
"As you may know, the U.S., Great Britain, and Spain plan to submit a resolution to the United Nations that says that Iraq is in serious violation of prior UN resolutions that required Iraq to disarm. Do you think the United States should invade Iraq with ground troops only if the UN approves this new resolution, even if the UN does not approve this new resolution, or do you think the United States should not send ground troops to Iraq at all?" Options were rotated
[Only With UN Approval] 40%
[Even Without UN Approval] 38%
[Not At All] 19%
[No Opinion] 3%
So one poll (2/26-3/2) says 59%-37% in favor of war even without U.N. approval; another (2/24-2/26) says 59%-38% against war without U.N. approval.
So what does this mean? Well, obviously people's reactions are sensitive to the wording of the question, but I think that sort of sensitivity itself shows that people's views on the subject just aren't very firm. A bit over a third of the public supports war without the U.N., a bit over a third of the public opposes it (either altogether or unless the U.N. says yes), and a bit under a third is uncertain.
Thus, we don't have a settled public opinion even now -- and this means that it's utterly futile to predict how the public would view the war after it takes place. It would be pretty pointless, it seems to me, for Democrats, Republicans, or anyone else to focus much on these polls.
I’m not sure I’d come to the same conclusion he does, but he clearly points out an inconsistency. So what explains it? Random sampling error? Each survey lists its margin of error at ±3%, which is not necessarily the margin of sampling error for this question; it is the largest possible sampling error for a dichotomous (two-category) question in this survey. As a short aside, the inaccurate reporting of margins of error for surveys has become a bit of a pet peeve of mine. Rather than report the actual sampling error for each question, which would vary with the variation in responses to that question, pollsters report a single sampling error for the survey. The problem is that there is no such thing as a sampling error for a survey; sampling error attaches to a particular question trying to measure a particular phenomenon. In their defense, they assume the largest possible variation for a dichotomous question (a 50/50 split), thereby biasing the reported margin of error upward.
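For what it’s worth, here is a back-of-the-envelope sketch in Python of that arithmetic, using the reported sample sizes (1,022 and 1,003) and the simple-random-sampling formula for a proportion. Real polls are weighted samples with design effects, so treat these figures as rough approximations, not the pollsters’ actual calculations.

```python
# Question-level margin of sampling error, and whether random sampling error
# could plausibly explain the 59% vs. 38% gap between the two polls.
# (Simple-random-sampling formulas; real polls involve weighting/design effects.)
from math import sqrt

Z95 = 1.96  # ~95% confidence multiplier

def margin_of_error(p, n):
    """95% margin of sampling error for a proportion p from a sample of size n."""
    return Z95 * sqrt(p * (1 - p) / n)

# ABC/WaPo: n = 1,022; CNN/USA Today/Gallup: n = 1,003
print(f"Worst case (p = 0.50), n = 1022: ±{margin_of_error(0.50, 1022):.1%}")  # ~±3.1%
print(f"Observed p = 0.59,  n = 1022:    ±{margin_of_error(0.59, 1022):.1%}")  # ~±3.0%
print(f"Observed p = 0.38,  n = 1003:    ±{margin_of_error(0.38, 1003):.1%}")  # ~±3.0%

# Could sampling error alone explain 59% (poll 1) vs. 38% (poll 2)?
se_diff = sqrt(0.59 * 0.41 / 1022 + 0.38 * 0.62 / 1003)
print(f"~95% margin on the difference: ±{Z95 * se_diff:.1%}")  # ~±4.3 points
print(f"Observed difference:            {0.59 - 0.38:.1%}")    # 21 points
```

The worst case (a 50/50 split) gives roughly ±3.1%, consistent with the reported ±3, and the 21-point gap between the two "without U.N. approval" figures is several times larger than anything sampling error alone could explain.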
There are, of course, many other types of error besides sampling error that could be responsible for the different results: sample selection problems, order bias, administrative errors, and so on. Eugene thinks that people have not made up their minds yet. Notice that the question in the first poll asks whether you support or oppose the policy it outlines. The first answer offered is “support,” and if Eugene is right and people don’t have strong opinions, they will tend to take the first alternative offered (order bias). To avoid this problem, the pollsters should have rotated the alternatives “support” and “oppose,” as responses were rotated in the second poll. I don’t think that is what happened here, though; besides, both surveys allowed people to freely respond with no opinion, and those percentages are quite low, suggesting that people have thought about the issue, even if they haven’t thought very deeply. My guess is that the wording is the root cause of the different results. What exactly do people think they are supporting in the first poll?
I think the difference is due to people’s reluctance to oppose military action once it is undertaken. Call it the Vietnam effect, the rally-round-the-flag effect, or maybe the patriotic effect (no one wants to be so un-American as to oppose the president’s policy), and no one wants to be seen as opposed to the troops. Notice that in the second poll respondents are offered policy alternatives and can choose which they prefer, which is very different from being asked whether they support or oppose a policy already being pursued. I would tend to believe the second poll is a better measure of Americans’ views on the best policy for the US to follow.
Keywords: Polls, BUS230