
Letters

Flawed Methodology on Living Wage Poll

To the editors:

According to “Weekend Survey Shows Lack of Support for Sit-In” (News, April 30), student support for a living wage for all Harvard workers “has dropped significantly in the last year.” Based on the details in the story, I draw different conclusions. First, the survey method makes it impossible to know what students believed about a living wage last Sunday. Second, we cannot conclude that levels of support have changed.


Of the five attitude questions in the poll, one—whether students would be willing to pay higher tuition if that were necessary to pay for a living wage for all workers—was not kosher. For one thing, the implications of the question are vague. What if it had asked whether students would be willing to pay an extra $5 a year in tuition? An extra $20? Moreover, I’ve seen no claim from Harvard administrators that they would pay for a living wage through a tuition increase. Polls that contain a political message (e.g., a living wage might cost you more) have been termed “push polls,” and their use in last fall’s campaigns to transmit falsehoods about candidates has been decried. Legitimate surveys avoid biased and vague items like the plague. Such questions not only produce meaningless results but also bias the answers to the questions that follow them by shaping the meaning respondents attach to them. And in an e-mail survey, respondents can see all the questions before answering any of them, so even answers to questions that precede the biased item can be affected.

Even if students’ responses to other questions were not skewed, it is still impossible to draw any inferences about change in support for a living wage since The Crimson’s last poll in January 2000.

There are four reasons why the percentage of students who support a living wage for all Harvard workers might differ across the two surveys: (1) by chance alone, the first survey included more pro-living wage students than the second survey; (2) the two surveys differed in question wording, question order, sample selection or survey administration (e-mail and telephone); (3) the attitudes of all Harvard students changed over the last 15 months, the explanation The Crimson prefers; and (4) the two samples did not come from the same populations.

The article does not provide enough information for readers to assess the possibility that the difference was due to sampling error alone, because it does not tell us how many people responded to the items being compared. Nor does it tell us whether the comparison is based on identical surveys, administered through identical designs. Thus, we cannot tell whether the differences reflect question wording, question ordering or survey administration effects, or real change.
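To make that point concrete, here is a minimal sketch of the standard check a researcher would run: a two-proportion test of whether an observed drop in support exceeds ordinary sampling error. All counts below are hypothetical, since The Crimson reported neither sample size; they are not drawn from either poll.

    from math import sqrt

    def two_proportion_z(support_1, n_1, support_2, n_2):
        """Z statistic for the difference between two independent sample proportions."""
        p1, p2 = support_1 / n_1, support_2 / n_2
        pooled = (support_1 + support_2) / (n_1 + n_2)          # pooled proportion
        se = sqrt(pooled * (1 - pooled) * (1 / n_1 + 1 / n_2))  # standard error of the difference
        return (p1 - p2) / se

    # Hypothetical: 60 percent support among 200 respondents in January 2000
    # versus 52 percent among 150 respondents last Sunday.
    z = two_proportion_z(120, 200, 78, 150)
    print(f"z = {z:.2f}")  # about 1.49 here; |z| < 1.96 means the drop is within sampling error

With these invented numbers, an eight-point drop is indistinguishable from chance, which is exactly why readers need the actual sample sizes before accepting the claim of a significant decline.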

Given the timing of the two surveys, it is quite likely that the populations from which the samples were drawn differ, the fourth possible explanation. Last Sunday’s survey population comprised students who were accessible by e-mail or telephone between midnight and 8 p.m. Anyone gone for the weekend (or camping out in the Yard, for that matter) was not part of that population. The January 2000 poll, in contrast, surveyed people on campus immediately before or during finals. The two populations surveyed may also differ because of “nonresponse bias.” In last week’s survey, just 62 percent of the random sample replied. Before generalizing their responses to Harvard students, we need to know whether the 38 percent who did not answer differ systematically from the respondents. Survey researchers routinely compare respondents with nonrespondents on known attributes (concentration, sex, economic background, national origin) that allow inferences about the extent of nonresponse bias.
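Purely for illustration, this is the kind of respondent-versus-nonrespondent comparison just described, scaled to a 200-person sample with a 62 percent response rate. The attribute and every number below are invented, not taken from either poll.

    # Hypothetical check for nonresponse bias: compare respondents with nonrespondents
    # on an attribute known for everyone sampled (here, living on campus).
    def share(group):
        return sum(group) / len(group)

    # 1 = lives on campus, 0 = lives off campus; a 200-person sample, 62 percent responding
    respondents = [1] * 110 + [0] * 14      # the 124 students who answered
    nonrespondents = [1] * 52 + [0] * 24    # the 76 students who did not

    print(f"on-campus share among respondents:    {share(respondents):.2f}")
    print(f"on-campus share among nonrespondents: {share(nonrespondents):.2f}")
    # A large gap on known attributes like this one would warn against generalizing
    # the respondents' answers to all Harvard students.

A sizable gap on any such attribute would signal that the 62 percent who replied cannot simply stand in for the student body as a whole.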

The problems I describe above are not arcane statistical issues. Any good course in research methods addresses them. Failing to pay attention to them results in misleading journalism. The Crimson should do better.

But ultimately, even impeccably designed and interpreted surveys that capture the opinion of the majority cannot be the basis for decisions about the fair treatment of people whose disadvantaged economic and social status silences their voices. Harvard’s decision to pay all its workers a living wage—and I’m confident that our administrators will reach such a decision—will result from their taking the ethical high ground. Sometimes it takes a sit-in for decent, ethical leaders to recognize where that high ground is.

Barbara F. Reskin

May 3, 2001

The writer is a professor of sociology at Harvard.

IOP Welcomes Students

To the editors:

I write on behalf of all 12 members of the Institute of Politics (IOP) Programming Committee, all six student members of the Director’s Task Force, 10 additional former members of the Student Advisory Committee (SAC), nine members of the IOP staff and IOP director David Pryor.

We are puzzled by The Crimson’s call to eliminate the student role at the IOP (Editorial, “Deja Vu at the IOP,” April 24). Since the Institute’s founding in 1966, students and staff alike have remained committed to ensuring that undergraduates are deeply involved in Institute activities.

We are also troubled by The Crimson’s assertion that the former members of SAC undermined the Institute’s mission by interfering with programming efforts. The careless leveling of such a profoundly damaging and absolutely untrue accusation harms all involved.

The IOP is a unique organization, where staff and students must work cooperatively to ensure interesting, diverse programming for students and the Harvard community at large. The restructuring process undertaken this year aims to enhance our ability to work together and to create trust, openness and responsiveness to the student body at large.

We count on The Crimson to uphold the highest editorial standards, and we hope that its misstatements with regard to former SAC members and the student role at the Institute of Politics do not foster a harmful impression around campus that students have no place at the IOP.

Robert F. McCarthy ’02

April 29, 2001

The writer is president of the IOP Programming Committee.
