
The Troubling Link Between Facebook's Emotion Study and Pentagon Research

Photo by Franco Bouly

Facebook users were rightfully unnerved on discovering that the social media giant had been experimenting with manipulating users' emotions through tweaking the content of news feeds.

Without informing the experiment subjects, data scientists skewed what almost 700,000 Facebook users saw when they logged in. Some were shown posts containing more happy and positive words; some were shown content analyzed as sadder and more negative. After one week, the study found that manipulated users were more likely to post either especially positive or negative words themselves.


They called the effect "emotional contagion." It's troubling both because the experiment was carried out without user consent and, above all, because it seems to have worked. So much for autonomous subjects, freely spreading ideas over social media platforms. Cybernetic dreamers wept.

Facebook has defended the experiment as a means to improve its product. However, evidence has emerged linking the "emotional contagion" experiment to US Department of Defense research into quelling social unrest. One of the authors of the Facebook study, Jeffrey T. Hancock of Cornell University, also received funding from the Pentagon's Minerva Research Initiative to conduct a similar study on the spread of ideas through social media under authoritarian regimes.

As I have written, the Minerva initiative — a series of projects across universities that, in sum, aims to provide a picture both descriptive and predictive of civil unrest — is a pernicious example of using the academy for the militaristic purposes of studying and stemming dissent.

The link between the Facebook study and the DoD research comes only through Hancock, and it is to be expected that scientists receive funding and directives from a number of sources. The DoD did not pay for the Facebook study, but the research link is not irrelevant. The Minerva research on the spread of social unrest through online vectors mirrors Facebook's interest in emotional resonance.

Both a Silicon Valley giant and the Pentagon are pouring funds into tracing how we relate emotionally online. I'd argue that this illustrates the limits of platforms like Facebook for radical social change. As the "emotional contagion" experiment suggests, these platforms are all too easily manipulated by those in positions of power. There is an insurmountable asymmetry between those who run Facebook and those who use it, and a vast power differential between those interested in preventing dissent and those interested in spreading it. As the existence of Minerva shows, too, networked societies are closely observed by those with a stake in social control.

The parlance of "contagion" infects both the DoD and the Facebook research projects. It's a significant metaphor for tracing affect (emotional or political) across networked societies. It is significant too because contagion — the passing of disease — is that which is understood as necessary to control. Facebook wants control over its users' experiences in order to monetize them better. The government wants control because, at base, it is in the business of control. The task then, for those of us unnerved by Facebook and DoD efforts here, is to be less predictable than a contagious disease.

Follow Natasha Lennard on Twitter: @natashalennard

Image via Flickr