Humans are messy and complicated. Psychologists and sociologists have put us through proverbial rats’ mazes for decades, hoping to find patterns in our behavior. Every time, our humanness gets in the way.
Social science is not like physical science: drop 10 eggs from the same height, and they’ll all fall. Feed eggs to 10 people, though, and you may get 10 different responses. Did the subject’s mother serve eggs every Sunday for breakfast? How did she feel about her mother? Are eggs the subject’s favorite food? Is he allergic? It all plays a role.
That’s why reporting on social-science studies can be deceptively hard. Sure, each one opens with a theory that seems solid and sensible — an easy news hook. But dig deeper, and you’ll find that most studies raise more questions than they answer.
These tips, aimed at general assignment reporters, can save you from misinterpreting or sensationalizing a researcher’s hot new findings. Even those on social science beats may learn something new.
Read the whole study
This advice may seem obvious, but when you’ve been handed a 20-page study full of academic language two hours before deadline, it can be tempting to read the introduction and then start putting together your story. Academic papers are not written in inverted-pyramid style, however, and sometimes key points — including data that contradict the findings — are buried on the 19th page.
Take, for instance, a study about whether playing a competitive video game would make subjects more aggressive. Aggressiveness was measured by the amount of hot sauce subjects gave each other after playing a game. (Yes, this sounds a little odd, but the study explains it further.) In general, the subjects did reach for hot sauce after a high-stakes game.
But flip through to the limitations section and you learn that the results might be different in other geographic areas or among different ethnic groups. Would test subjects raised with spicy foods view serving hot sauce as an aggressive act? Probably not.
Demographics can skew results in many ways. Look at the sample size, gender and ethnic or socioeconomic factors. The hot-sauce trial included two groups: one of 60 college students and another of 42. Given the tested region’s demographics, they were probably mostly white and all college age. Would the findings apply to African American 15-year-olds, or to Florida retirees?
These details are worth sharing with readers so they have better context for the findings you’re reporting.
Determine how true-to-life the experiment was
Researchers are always limited by ethical issues that prevent them from, say, subjecting 1,000 kids to 18 years of eating nothing but fast food to determine the health consequences.
So they craft lab trials they hope will elicit responses that can be tested and repeated — and that apply to real life. In reality, if video games make college kids aggressive, Sriracha might not be their weapon of choice. So, is using it in a research setting realistic?
Also, just bringing people into a lab can taint research. People act differently when they know they’re being studied. In addition, being asked to do something they normally wouldn’t do (such as serve cups of hot sauce to strangers) can change their behavior.
Case-control and longitudinal studies are rarer, but they provide more reliable information about the populations in question, either because they have a large pool of subjects, the subjects were studied for a long time in real-life settings, or both.
Ask: Could the findings be backward?
Reporters who spent time in college sociology courses heard the phrase “correlation is not causation” almost as often as the five Ws. It’s easy to forget this maxim when a study reveals a link, for instance, between teen depression and listening to heavy metal. Plenty of journalists would go ahead and say the music caused bad feelings — that’s what many people believed in the 1980s, right?
However, the correlation could go the other way: maybe depressed people seek out heavy metal to make themselves feel better. In fact, that’s just what heavy metal fans reported.
When reporting on such studies, resist the urge to imply one thing causes another. And, if you suspect the correlation could be reversed, make time to find a source who can discuss the other side.
Look for bias
Remember when I said humans are complicated? That’s true of researchers, too. Many do their best to remain objective — and to find objective results — but our tendency to look for patterns or to see what we’re expecting can get the best of us.
Many researchers spend their entire careers studying the same topic. On the upside, this means they’ve got that particular topic in the bag. On the downside, they may be more likely to reach the same conclusions over and over, especially once they start anticipating certain results based on prior tests. This effect can be magnified when they base their research on similar studies — and not on research that shows opposite results.
Funding is another source of bias. Follow the trail, and most money for social science studies can be traced back to the university that hosted them. When industries or lobbyists bankroll a study, it’s easy to see the spin potential. But even universities gain bragging rights when their researchers “prove” something related to a hot topic — or lose them when they don’t.
Readers have become more sensitive to questions of funding, so mentioning it in your reporting is one way to engage their critical thinking.
Avoid sensationalized language
In this era of “most-clicked” articles, it’s tempting to lead with whatever nugget from the study will draw the most readers, whether or not it fairly represents the research. In reality, such findings are usually far more nuanced than the headline version.
Strong writing, combined with reporting that digs up aspects of the research other journalists aren’t covering, is one way to make those nuances sexy. It’ll set you — and your news site — apart.