Photo by Flickr user Sociology at Work.
News outlets are chock-full of articles reporting some new study and what it says about health or climate change or violent video games. Reading these articles can sometimes be a little like playing that childhood game of telephone. By the time the message reaches you, it’s probably at least a little different from what the original person said. I’ve written for Poynter about how journalists can improve their coverage of research. Today I want to give everyday folks some pointers for reading and interpreting actual studies.
1. Get the actual study.
Articles usually mention the researchers involved, and the university, if there was one. Sometimes studies are locked away behind academic-journal paywalls, but if you do enough hunting you can find “author proofs” — non-final drafts — on the researchers’ websites. Google is your friend, and you can’t assess a study if you don’t read it. I’m going to base this post on one such proof, Christopher Ferguson and Cheryl Olson’s Video Game Violence Use Among “Vulnerable” Populations. Load it up and read along.
2. Consider whether the study is linked to any recent events.
Most of the studies I look at here are about kids and violent video games. Though the papers rarely say so explicitly, it’s notable that these studies picked up in frequency after events like the Columbine High School massacre and the ensuing attempts to blame video games like Doom. Be suspicious when the sensationalism comes first and the studies follow. Among other things, it gives researchers an agenda, whether they admit it or not. One of the reasons I like Ferguson and Olson in general is that they come right out and say it:
Existing societal concerns about video games have intensified after the 1999 Columbine High School massacre (Ferguson 2013) and other well-publicized school shootings. The tragic 2012 Sandy Hook Elementary School murders in Newtown, Connecticut resurrected these debates amid reports that the 20-year-old shooter was an avid gamer (e.g., Henderson 2012). … Given … the recurring media focus on video games, researchers need to do more to answer the questions of greatest public concern regarding video games and any potential harm to youth.
3. Look at the demographics of the sample.
The sample means the group that is included in the study; demographics refers to their characteristics. Since Ferguson and Olson have said this is meant to apply to the question of whether violent video games harm youth, especially with respect to school shootings, you’d want the sample to resemble school shooters demographically: age 13 to 17, usually male, usually white, usually suburban. Often, though, the sample is full of college kids who are over 18. That’s less than ideal to begin with.
In Ferguson and Olson’s study here, we have a pretty good range of kids: 377 youth with a mean (average) age of 13. 182 had ADD symptoms, while 284 had symptoms of depression (yes, that means 89 of them had symptoms of both). The sample included 140 boys and 234 girls, as well as 3 who did not declare a gender. There were urban and suburban kids, though it doesn’t say how many. The kids were mostly white, but a significant number of the kids from the urban school were black.
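You can check that overlap figure yourself with the inclusion-exclusion rule. Here’s a minimal sketch; the counts come straight from the study’s sample description, and I’m assuming every child had at least one set of symptoms, which follows from how this “vulnerable” sample was recruited:

```python
# Inclusion-exclusion: |A or B| = |A| + |B| - |A and B|.
total = 377        # youth in the sample
add = 182          # clinically elevated ADD symptoms
depression = 284   # clinically elevated depressive symptoms

# If every child had at least one set of symptoms,
# the overlap (kids with both) must be:
both = add + depression - total
print(both)  # 89
```

If the arithmetic in a study’s sample description doesn’t add up like this, that’s a red flag worth noting.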
4. Look at the size of the sample.
This one’s pretty easy. Bigger is generally better. The more subjects you have, the more you’re getting away from anecdotal information and into actual data. That said, a larger sample of extremely similar kids is only useful if they resemble the outside-world kids you’re trying to represent; better still is a large sample of different kinds of kids. 377, the number in this study, isn’t a ton, but it’s more than the 100-200 subjects many such studies have.
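To see why size matters, here’s a generic statistics sketch (not anything from the study): the 95 percent margin of error on a simple yes/no survey question shrinks as the sample grows, but with diminishing returns.

```python
import math

def margin_of_error(p, n):
    """Approximate 95% margin of error for a proportion p measured on n subjects."""
    return 1.96 * math.sqrt(p * (1 - p) / n)

# A 50/50 answer measured on samples of different sizes:
for n in (100, 377, 1500):
    print(n, round(margin_of_error(0.5, n), 3))  # the error shrinks as n grows
```

At 100 subjects the error is nearly 10 percentage points; at 377 it’s about 5. That’s why a few hundred subjects is meaningfully better than one hundred.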
5. Look at the methodology.
Most sociological experiments are highly synthetic constructs, which are meant to single out and test one particular aspect of everyday life. In this case, Ferguson and Olson opted to put the kids through a survey questionnaire, which can be tricky because the results are only as good as the honesty of the kids. I personally think kids are reasonably reliable at answering certain kinds of questions (like how often they play certain kinds of video games, which was asked here) and not necessarily others (things they might not want adults to know, like sexual or drug experimentation). Still, it’s worth taking most self-report studies with a grain of salt, unless they’re corroborated with other data from the subjects’ lives.
(I’d also like to point out a detail from their survey that is relevant to this blog: Of all the kids they surveyed, only 6.1 percent had played no video games in the prior 6 months, and only 11 percent had played no violent video games. These are kids with a mean age of 13. In other words, most kids are playing them. Where’s the giant surge of violence and aggression?)
6. Look at what they say about the results.
This is especially true if you’re comparing your own read of a study to what was reported in the news. Here are the results from this study, truncated:
With the sample of children with clinically elevated depressive symptoms and regarding delinquent criminality as an outcome only stress and trait aggression were predictive of delinquent criminality. Neither exposure to video game violence nor the interaction between trait aggression and exposure to video game violence were predictive of delinquent outcomes.
In other words, among the kids with symptoms of depression, mood issues alone didn’t correlate with delinquency. Only if they were under stress and were aggressive did they have a tendency to be delinquent. Video games didn’t seem to play a role.
They go through other trait combinations and look at bullying as well as delinquency. Here’s one interesting angle to the study, which is one a lot of news outlets focused on:
Finally, with the sample once again of children with clinically elevated attention deficit symptoms and with regards to bullying behavior only trait aggression was predictive of bullying behaviors along with the interaction between trait aggression and exposure to violent games did approach significance suggesting that highly trait aggressive children who also played violent video games were less likely to engage in bullying behaviors. Exposure to video game violence was not a significant predictor of bullying behaviors.
Note that that result, while interesting, only “approached significance” mathematically, and they immediately point out that none of the findings in this section was statistically significant. In other words, it’s inconclusive.
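“Significance” here means statistical significance: roughly, how unlikely a result is to arise by chance alone, with p < .05 as the usual cutoff. As an illustration with made-up numbers (nothing below is from the study), a simple permutation test shows how a p-value gets computed:

```python
import random

random.seed(0)

# Two hypothetical groups of scores (made-up numbers, not the study's data)
group_a = [3, 5, 4, 6, 5, 4]
group_b = [4, 6, 5, 7, 6, 5]
observed = abs(sum(group_a) / 6 - sum(group_b) / 6)

# Permutation test: reshuffle the group labels many times and count how
# often chance alone produces a gap at least as big as the observed one.
pooled = group_a + group_b
hits = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    if abs(sum(pooled[:6]) / 6 - sum(pooled[6:]) / 6) >= observed:
        hits += 1

p_value = hits / trials
print(p_value)  # below .05 would count as "statistically significant"
```

A result that “approached significance” usually means the p-value landed just above that .05 cutoff, so it hints at a pattern without clearing the conventional bar.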
7. Read the “discussion” section carefully.
This is where the researchers explain their findings and discuss how they think they should be interpreted. While journalists often overemphasize certain things (which makes for a more compelling news story), researchers usually de-emphasize. Here’s what they say about the above result:
[After saying they didn’t find video games were a factor in kids’ violent behavior.] The only exception was our finding that, for children with elevated attention deficit symptoms, trait aggression and video game violence interacted in such a way as to predict reduced bullying. This could be considered some small correlational evidence for a cathartic type effect, although we note it was for only one of four outcomes and small in effect size. Thus we caution against overinterpretation of this result.
8. Read the “limitations” section even more carefully.
This is where researchers will tell you the shortcomings of the methods they used, as well as the extent to which their findings are useful. They’ll usually suggest how future research can build upon what they’ve found, should anyone care to follow up with a larger or at least related study. This study’s limitations section is accompanied by a “word of caution” section, both of which are worth reading in full.
9. Check for bias.
Look at the researchers’ websites. Look at the titles of their papers, and see if you notice a trend. If they keep finding the same things over and over, there’s a chance they’re suffering from researcher bias. There’s a lot of this in the violent-video-game research world, which is one reason you see the same names popping up again and again (and finding the same things again and again).
10. Follow the money.
If research was conducted by an academic, chances are good that the money came from the university where they work. That could include public or private dollars. Unfortunately, researchers are pretty mum about where that money comes from or how much a given study cost. But at least sniff around and see what you can learn.
If you’re tired of reading about video games, this is a great breakdown of how the myth that eating breakfast helps you lose weight became so pervasive, and why it’s wrong. It also discusses the difference between association (correlation) and causation, which is important to understand when reading any sociological study, as well as how even the “gold standard” methods for research can be wrong, especially when they aren’t backed by further research.
So there you go. Now you know how to read a social-science study. Go read some more and have fun! Commenters: can you think of anything I’ve left out that you’d like to add?