The independent newspaper of the University of Iowa community since 1868

The Daily Iowan


The accelerating demise of science journalism

This is the last column I’ll write for The Daily Iowan (insert melodramatic crocodile tears here), and it seems fitting that I go out with a great big jamboree about social science, the sorry state of science journalism, and the University of Iowa drinking culture.

As you may have heard, the UI is America’s No. 2 party school, down from No. 1 last year, if you can trust the Princeton Review — you can’t, but more on that later.

In fact, that survey is so worthless that local and national media, yes even the DI, ought to just ignore it like the media have learned to ignore Sarah Palin (unless you’re MSNBC).

Every year, it’s the same song-and-dance: The press makes a big deal out of this sad excuse for a survey, and UI administrators point out that the actual statistics from such reliable sources as the National College Health Assessment show binge drinking and partying in general are declining.

As my colleagues here on the Opinions page and I have pointed out in the past, the Princeton Review bases its rankings on nonrandom sampling and tiny sample sizes. The sample doesn’t accurately reflect college populations, and because so few students are polled, we can’t know if the findings are accurate.

Furthermore, the survey only looks at students’ perceptions of fellow undergrads rather than using statistics on something more concrete such as alcohol consumption.

UI students know about the party-school reputation, and if they believe it, Psychology 101 tells us they’ll look for evidence to confirm that belief and ignore evidence that contradicts it.

Bad sampling + bad measurement = crappy survey.
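To see why tiny samples are such a problem even before self-selection bias enters the picture, here is a minimal simulation with entirely made-up numbers (not Princeton Review data): a hypothetical campus of 30,000 students, a quarter of whom are frequent partiers, polled 50 at a time.

```python
import random

random.seed(42)

# Hypothetical campus: 30,000 students, 25% "frequent partiers" (made up).
population = [1] * 7500 + [0] * 22500  # 1 = frequent partier
true_rate = sum(population) / len(population)

# Even honest random samples of only 50 students give unstable estimates;
# a self-selected online poll adds bias on top of this noise.
estimates = []
for _ in range(5):
    sample = random.sample(population, 50)
    estimates.append(sum(sample) / len(sample))

print(f"true rate: {true_rate:.2f}")
print("estimates from five 50-person samples:", estimates)
```

The five estimates scatter around the true rate just from chance alone, which is why a small, nonrandom poll can crown a different "No. 1 party school" every year without anything on campus actually changing.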

So then, why do journalists pay a lick of attention to the party-school rankings? Given the weak methods, it seems pretty darn irrelevant.

It probably makes attention-grabbing headlines, and Iowa gets some mild national attention, which is unusual outside an election year. It’d also be weird if you’re the one news outlet in Iowa that completely ignores the story.

After quickly surveying this year’s coverage from Iowa’s major newspapers, it appears that virtually all criticism of the Princeton Review survey was left to the editorial pages. Reporters could have easily talked to any old statistician who could tell them in about 10 seconds, “Yeah, this survey sucks.”

No one in particular is really to blame. It’s a systemic problem in journalism. Reporters and their editors are usually terrified of ever looking like they’re taking a side. On its own, that’s not a problem, but taken to extremes, it gives us false equivalency on steroids.

Coverage of climate change is a prime example. The news media still often use a frustrating model that gives equal time to each side of the debate, even though 97 percent of climatologists agree the Earth is warming because humans are producing excess greenhouse-gas emissions.

When there really is a clear fact of the matter, opinions still count for something, but journalists do the public no service when they give decades of conclusive scientific research the same weight as a pure opinion with extremely little scientific support.

Of course, it becomes a lot harder for reporters to parse out the science when science reporters are an endangered species in traditional media. The Columbia Journalism Review reported that the number of weekly science sections in newspapers fell from 95 in 1989 to just 19 in 2012. It also doesn’t help when giants such as CNN completely eliminate their science and technology teams.

Then you end up with general-assignment reporters who are less familiar with the subjects they cover, and you often get glaringly bad mistakes, such as outright misinterpreting a study’s findings, forgetting to account for inflation in the economy, misunderstanding “statistical significance,” confusing percent with percentage points, and many other cringe-worthy errors.
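The percent-versus-percentage-points mix-up in particular takes ten lines of arithmetic to see. With invented numbers (a rate falling from 40 percent to 30 percent), the two phrasings describe very different drops:

```python
# Invented example: a binge-drinking rate falling from 40% to 30%.
old_rate = 0.40
new_rate = 0.30

# Percentage points: the plain difference between the two rates.
point_drop = (old_rate - new_rate) * 100            # 10 percentage points

# Percent: the change measured relative to the starting rate.
percent_drop = (old_rate - new_rate) / old_rate * 100  # 25 percent

print(f"a {point_drop:.0f}-point drop is a {percent_drop:.0f} percent drop")
```

A reporter who writes "drinking fell 10 percent" when the rate fell 10 percentage points has understated the real decline by more than half.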

But there’s at least one thing you can figure out without a ton of scientific or statistical training, maybe a basic research-methods course at most: the Princeton Review’s party-school rankings are meaningless and are hardly worth the media’s time and attention, especially at large-scale outlets whose audiences aren’t even remotely affected by the information.
