Whether you agree or disagree with its tenets, GamerGate has been an important part of the new digital media frontier over the last year or so. It’s one of the biggest demonstrations of the power of Internet media (including, in one instance, a self-published e-book) to organize and amplify the voice of a relatively small movement until it has a nationwide effect. This is a powerful and sobering testament to just how much influence the bits of news and discussion we read off those little glowing screens can really have on us.

But in trying to consider how popular and widely-held the GamerGate movement really was, I just keep thinking of this piece of doggerel:

The lurkers support me in email
They all think I’m great don’t you know.
You posters just don’t understand me
But soon you will reap what you sow.

A 1998 filk by Jo Walton, this song is about the tendency of those on the losing side of an online argument to claim that “lurkers”—people who read the forum but don’t actively contribute to it—are sending them private affirmations of support.

It also sums up one of the central beliefs of adherents to the GamerGate movement—the idea that most gamers out there might not be saying anything about it but are actually on their side, because they are on the side of right and truth and justice. But is that really the case? Superheroes in Racecars blogger Livio De La Cruz decided to look at the available Internet media to see what he could determine about the GamerGate movement’s overall popularity from it.

De La Cruz’s methodology was to look at all the articles he could find from the most popular sites by Alexa rank that fit the criteria he came up with for determining whether GamerGate or related issues were actually their subject matter. He ended up with 1,183 articles altogether, and he made the index of these articles (in the form of links to the original source material) available in its entirety.

After collecting this data, De La Cruz analyzed the articles for content either in favor of or against GamerGate. It should come as very little surprise to anyone who followed the news coverage as it happened that the vast majority of coverage came down against the movement. Coverage in favor tended to come from one of three sources: Breitbart, which overtly supported GamerGate with 60 articles on the subject; The Escapist, which scrupulously attempted to treat GamerGate as a legitimate movement with real grievances; and lone pro-GamerGate writers on the staffs of otherwise anti-GamerGate publications (Forbes, TechCrunch, and CinemaBlend). Apart from that, nearly all GamerGate coverage from sites with an Alexa popularity ranking of below 11,000 was opposed.

Much of the article constitutes a history of the media’s coverage of the GamerGate movement, from the early days when it convinced Intel to pull its ads from Gamasutra, to the later days when much of the media stopped paying attention. I did notice he barely mentioned the Hugo Awards, seeming to consider use of the “GamerGate” term in that context a shorthand for online abuse of any kind. (There were accusations that the Sad or Rabid Puppies leadership attempted to lure GamerGate members into supporting their own movements. To the best of my knowledge, however, these were never substantiated, and De La Cruz doesn’t mention them in his report, though some of the articles about the matter are included in his list of sources.)

So, how convincing is it? As comments from GamerGate adherents beneath the report show (or at least, they did show—the comments section isn’t there anymore; perhaps De La Cruz decided it was likely to cause too much trouble and removed it), it isn’t likely to change the minds of anyone who doesn’t already agree with it. It is an admirable survey of the history of GamerGate, as presented through content available on sites ranging from Wikipedia (Alexa Ranking 7) to Gamasutra (Alexa Ranking 10,664), and I personally do not disagree with its overall conclusion. However, I do have a few misgivings about the study’s methodology that cast a little doubt on its conclusions.

First of all, as a survey of content from selected sources, it is necessarily a bit subjective. De La Cruz’s criteria seem entirely reasonable, but nonetheless he had to make decisions over what to include and what to leave out. Given that De La Cruz himself is clearly opposed to GamerGate, might he have let his researcher’s bias influence the conclusion? Given the disparity in quantity between the pro-GamerGate and anti-GamerGate coverage in the list of sources, I find it hard to imagine he could have left out enough pro-GamerGate coverage to make a difference. But it’s possible his opinions could have colored the conclusions he drew from it.

(I’m not saying that being opposed to GamerGate is a bad thing, as I am myself, but looking at it from the other side, I can see how some people might use that fact to question the report.)

I also have my doubts about De La Cruz’s estimates of the relative audience sizes of the pro- and anti-GamerGate factions, which he derives from the number of subscribers to forums like KotakuInAction, the audience count for the Colbert Report, or the number of views of YouTube videos. But we all know how unreliable viewer and subscriber counts can be, what with dummy accounts and click fraud.

De La Cruz states that “one of my core assumptions is that the most visible coverage of GamerGate had a greater impact on ‘what most people think’ than GamerGate’s social media presence did,” but it seems like a tenuous link to draw. History is replete with examples of ordinary people thinking differently from the media: consider that iconic photo of Harry S. Truman holding up a newspaper announcing an election victory for Dewey.

If you’re going to study “what most people think,” you should do so by examining the people directly. Looking at social media (as Newsweek did, for example, when it hired social media analytics company BrandWatch to analyze tweets under the #GamerGate hashtag) would be one way, though even then you end up with an analysis of what the loudest tweeters think, not of the people who just read and don’t say anything. Surveying gamers or Internet users in general on their feelings about the movement would be another method, though there you run into the problem that people don’t like taking surveys, and it’s all too easy to design a survey study that yields skewed results.

Addressing the overall media coverage does show that, when the media investigated GamerGate adherents’ statements and actions fully, they by and large came to the same conclusions about its nature, as documented in De La Cruz’s report. However, GamerGate adherents can and will (and some of those in the removed comment section did) simply continue to claim that the reporting reflects “media bias,” and that the actual audience was largely with them.

Or, in other words, “The lurkers support me in email.”

On the whole, I find De La Cruz’s report to be a highly useful and informative work of scholarship on how the GamerGate movement started, grew through the media, and eventually petered out. But by its very nature, it’s not going to address the contention De La Cruz meant for it to address: providing an accurate depiction of what “the vast majority of people” think about GamerGate.

I personally think he’s probably right that most people do agree with those conclusions, but his report can’t tell us that. It can only tell us what the popular Internet media thought about GamerGate. It can’t tell us what the average person thought about it.
