In a previous post I urged the readers of this blog to participate in a Skidmore student’s survey on politics. I regret that I did so prior to viewing the survey. Having now looked at it I am quite troubled by what I found.
First of all, the questions have continually changed. This is quite bizarre.
Here are three screenshots of one of the questions as it appeared on the site on different days. You will see that on each occasion the question morphed.
Example #1
The second question on this page is about the proposed new charter. In this case it contains the sentence “In addition to this, supporters of charter reform argue it is a government that will bring fresh blood into politics and rid city hall of political cronies by imposing term limits.”
Example #2
The first question on this page is again about the proposed new charter. It does not have the business about cronyism and now ends with the question, “If Mayor Joanne Yepsen supports charter reform, would you support charter reform?”
Example #3
The third question on this page is about charter change but in this case it has neither the business about cronyism nor the question about the mayor’s support.
So, having established that the survey lacks the kind of consistency that is required to tally answers, let’s scrutinize the design of the questions themselves.
Let’s take one of the versions of the question related to the charter:
“Every 10 years the mayor of Saratoga Springs appoints a charter review commission in accordance with the city charter. This commission has revised the charter that will result in a more effective, less polarized, and a more/modern [sic] and professional system of government. If Mayor Joanne Yepsen supports charter reform, would you support charter reform?”
This question violates the very basics of good polling. To begin with, it assumes that the charter as currently proposed “will result in a more effective, less polarized, and more modern and professional system of government.” Who, one might ask, would be opposed to such an improvement in our city government if it is as described? But the question is not even whether you approve of or oppose the charter itself, because it is qualified by the final sentence, which asks, “If Mayor Yepsen supports charter reform, would you support charter reform?” So even if you support the proposed charter reform, the survey wants to find out whether you would change your mind if you learned (in the most unlikely of circumstances) that the mayor opposed it; and if you oppose charter reform, whether you would change your mind if you learned that the mayor supported it.
One has to ask, what is the purpose of this question? Is it to find out how much influence the Mayor has over voters? If that were the purpose, why not simply ask whether the person taking the survey will base their vote on charter reform on what the Mayor advocates?
In fact, this question has the unpleasant characteristics of a push poll. This is a commonly used political strategy in which people are asked (usually by phone) their position on an issue that is presented in an extremely slanted way, meant to spin the person taking the poll in a particular direction. The results are then used to claim support for a particular position on the issue. After all, the question of whether to change the current form of government is quite controversial. People of good faith are quite divided on it. There are those who would vigorously disagree that the proposed charter change would result in the benefits assumed by the wording of this question, but there is no way to register that opinion on this poll as constructed.
Complicating things further is the fact that the charter has not even been completed yet, so who knows what it may or may not do.
There are also two odd questions that appear to be related. The person taking the survey is asked first to rate the city council:
“On a scale from 1 to 5, 1 being the lowest 5 being the highest, rate the city council.”
Then they are asked to rate the mayor:
“On a scale from 1 to 5, 1 being the lowest 5 being the highest, rate the mayor.”
Now, in the Commission form of government, the mayor is a co-equal member of the city council. Again, I would ask, what is the purpose of this question? Is this an attempt to rate the mayor’s standing among the voters? Why not ask the person being surveyed to rate the other commissioners as well?
I am a supporter of the Sustainable Saratoga initiative on affordable housing. Nevertheless, I am bothered by another question that reads like a push poll:
“The Saratoga Springs Housing Task Force is working on an initiative that would create more workforce housing in Saratoga Springs. Proponents of the project argue that it will create more affordable housing for residents. Do you support this initiative?”
Who would be against the city having more affordable housing? What is the likelihood of people taking this poll clicking on the “oppose” button? In fact, while I disagree with them, a number of interest groups that have expressed opposition to the proposed ordinance say that they favor affordable housing but oppose this particular method of achieving it, and they have no way of registering that opinion in this poll. There is no button to push that says “I support initiatives for affordable housing but not this initiative.” Is the goal of the survey to assess support for affordable housing initiatives in Saratoga Springs or to assess support for this particular initiative?
Here is another question from the poll:
“Saratoga schools are now going to implement iReady, an online program that pinpoints where students need to improve to pass state-educational tests was[sic] passed despite a current budget deficit. Do you support this addition to the budget?”
First of all, the wording of this question is exceedingly awkward, to say the least. More importantly, though, I would expect that most people who participate in this survey are like me in that they have little idea of the actual merits or effectiveness of the iReady program. Again, the question asserts that it “pinpoints where students need to improve…” Does it actually do so effectively? We have only the assertion of the person who crafted this survey that it does. And how expensive is it? How much does it add to the deficit? This question is flawed because it assumes a level of knowledge among survey participants that is probably unrealistic, so the results will be unreliable.
Finally, there is the basic problem that this is a self-selecting survey. That is to say, rather than drawing a random sample that would be demographically representative of the community, this survey is being filled out by people who are simply interested in responding, for whatever reason. They may or may not be representative of the broader community. I ask again, what is the purpose of such a study? What will the results tell us?
I am sorry to be so critical of this survey, and I regret that my blog might embarrass the student who is conducting it. My criticism is really directed at the faculty members who are overseeing the student’s work.
I’m glad it wasn’t just me who saw this, John! Thanks for bringing it to light!
“So, having established that the survey lacks the kind of consistency that is required to tally answers”
The surveys are randomized and the questions change across however many iterations of the survey there are. This is part of the design and is a sign of a sophisticated survey, not a defective one, just as the answer key is not the same for every copy of a standardized test in the room. In standardized tests this randomization prevents cheating; in surveys it yields a deeper and broader picture of responses. But you would not know that, and instead you assume nefarious intent or incompetence. How sad.
Paula is correct. Modern surveys, particularly those about subjective topics, often pose a question in multiple ways in order to ferret out biases and perceptions about the topic. The researchers don’t lump the results into a single tally, but if the survey is done correctly they can draw conclusions as to why participants answered one way versus another when the topic is presented differently.
Consider that when the New York Times or the Washington Post publish articles on polling results regarding issues, they display the questions used to solicit answers. I have never seen an article of this nature that deviated from this model. Your comments prompted me to do a search on the issue, and while there were many references to the need to randomize the order of questions, I could find no reference to randomizing the actual text of the questions. I would appreciate it if Paula or BeeBee could cite their source.
Paula, while standardized tests may vary the order of questions, as you pointed out, they do not vary the wording of the questions. I suspect that if this is done in a survey or poll, it is done for reasons other than those stated for this student survey. I would be curious to look at what research there is on varying the actual wording of questions in polls. Could you and BeeBee please refer me to your sources on this? Thanks!
Happy to see your comments. I too found the statements and questions to be a bit unworkable; I thought it was just me for a minute!
At least they had a ‘rate the mayor’ question. I have to admit I was a bit disappointed there wasn’t anything lower than a “1” to rate her with, however.
nonissue, Skidmore student trying to learn, geez
This is a FAKE survey. All surveys are fake. Wake up, people. In this digital age, everything can be modified to the nth degree, ad nauseam. The authors are duping the general population into believing they are credible and worthy. Not. This is a phony group of confidence tricksters riding on their past laurels (if any), but more importantly they are using the Skidmore College name, logo, and brand to further their disingenuous agenda. In short: they are up to no good.
Sorry, but the correct answer is “None of the above.” In reality, it was a real survey, the mayor wasn’t involved, and it was just a (Political) Science Lab experiment gone awry.
John shared some information with me privately and, at his request, I did some background research on the issue. The gist of it is that Prof. Christopher Mann, whose class conducted the survey, has been engaged in researching the way the wording of survey questions can influence their outcome for much of his academic career. You can access many examples of his published work here, almost all of which precedes his arrival in Saratoga Springs:
http://www.christopherbmann.com/publications/
As with many chemistry or biology lab experiments, the teacher already knows what the outcome will likely be, but proposes these experiments so that students can have hands-on experience to see how that outcome was reached. That’s what this survey was all about. It was never meant to be made public and was simply a classroom exercise in how public response varies with the way a question is posed. People who clicked on the link were directed to one of several variations in which only one question was modified. In this case, all of the other questions were window dressing designed to mask the question about the charter. If the other questions were poorly worded, it didn’t matter because they were going to be thrown out, anyway.
If the Skidmore experiment had reached its natural end, students should have seen significant deviations among respondents based on which way the target question was worded. However, John unwittingly outed the crux of the experiment, rendering it useless, so it was taken down.
The only sinister implication is the possible involvement of Prof. Mann’s colleague and charter review chairman, Bob Turner. If you’re of a conspiratorial bent, then you might hypothesize that Prof. Turner proposed to Prof. Mann that this question be the object of the experiment because he (Turner) wanted to see which wording would elicit the most favorable (or least negative) response to the proposed charter change. In that case, Prof. Mann and his class were unwitting pawns in a larger game.
Alternatively, Prof. Mann, as a political scientist, was certainly aware of the charter review controversy and proposed this question himself because of its volatile nature. Short of issuing subpoenas and taking testimony under oath, any judgment about which hypothesis is correct can only be classified as speculation.