The Null Device

Facebook and emotional engineering

In January 2012, Facebook conducted a psychology experiment on 689,003 unknowing users, modifying the stories they saw in their news feeds to see whether this affected their emotional state. The experiment ran automatically over one week: users were randomly selected and randomly assigned to one of two groups, one of which had items containing positive words like “love” and “nice” filtered out of their news feeds, while the other had items containing negative words similarly removed; the software then tracked the affect of each user's subsequent status updates. The result was that the manipulation worked: those who saw only positive and neutral posts tended to post more cheerfully than those who saw only negative and neutral ones. (The experiment, it must be said, was entirely automated, with human researchers never seeing the users' identities or posts.)
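The mechanics of such a design are simple enough to sketch. What follows is a minimal, hypothetical Python rendering of the procedure described above (per-user assignment to one of the two conditions, keyword-based filtering of feed items, and a crude affect score over subsequent status updates); the word lists, the Post type and the function names are assumptions for illustration, not Facebook's code or the study's actual implementation.

```python
# A hypothetical sketch of the experimental design described above; not
# Facebook's actual code. Word lists and names are illustrative only.
import random
from dataclasses import dataclass

POSITIVE_WORDS = {"love", "nice", "happy"}   # stand-ins for a real sentiment lexicon
NEGATIVE_WORDS = {"hate", "awful", "sad"}

@dataclass
class Post:
    author: str
    text: str

def assign_condition(user_id: str) -> str:
    """Assign a selected user to one of the two conditions (deterministic per user)."""
    return random.Random(user_id).choice(["reduce_positive", "reduce_negative"])

def filter_feed(posts: list[Post], condition: str) -> list[Post]:
    """Drop feed items containing words of the targeted emotional valence."""
    banned = POSITIVE_WORDS if condition == "reduce_positive" else NEGATIVE_WORDS
    return [p for p in posts if not (set(p.text.lower().split()) & banned)]

def affect_score(status_updates: list[str]) -> float:
    """Crude proxy for the tracked outcome: net positive-word rate in later posts."""
    words = [w for update in status_updates for w in update.lower().split()]
    if not words:
        return 0.0
    positives = sum(w in POSITIVE_WORDS for w in words)
    negatives = sum(w in NEGATIVE_WORDS for w in words)
    return (positives - negatives) / len(words)
```

Comparing the mean affect_score of the two groups after a week of filtered feeds is, in caricature, the comparison reported above.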

Of course, this sort of experiment sounds colossally unethical, not to mention irresponsible. The potential adverse consequences are too easy to imagine, and too hard to comfortably dismiss. If some 345,000 people's feeds were modulated to serve them a week of negativity in the form of what they thought were their friends' updates, what proportion of those people were adversely affected beyond feeling bummed out for a week? Out of 345,000, what would be the expected number of relationship breakups, serious arguments, alcoholic relapses, or even incidents of self-harm set off by the online social world looking somewhat bleaker and more joyless? And while it may seem that the other cohort, who got a week's worth of sunshine and rainbows, were done a favour, that is not necessarily so: riding the heady rush of good vibes, some of them may have made bad decisions, taking gambles on bad odds because they felt lucky, or dismissing warning signs of problems. And then there's the fact that messages from their friends and family members were deliberately not shown to them if they went against the goals of the experiment. What if someone in the negative cohort was cut off from communications with a loved one far away, for just long enough to introduce a grain of suspicion into their relationship, or someone in the positive cohort didn't learn about a close friend's problems and was unable to offer support?

In academe, this sort of thing would not pass an ethics committee, where informed consent is required. However, Facebook is not an academic operation, but a private entity operating in the mythical wild frontier of the Free Market, where anything both parties consent to (“consent” here being defined in the loosest sense) goes. And when you signed up for a Facebook account, you consented to them doing pretty much whatever they like with your personal information and the relationships mediated through their service. If you don't like it, that's fine; it's a free market, and you're welcome to delete your account and go to Google Plus. Or, if Google's ad-targeting and data mining don't appeal, to build your own service and persuade everyone you wish to keep in touch with to use it. (Except that you can't; these aren't the wild 1990s, when a student could build LiveJournal in his dorm room; nowadays, the legal liabilities and regulatory compliance requirements would keep anyone other than multinational corporations with deep pockets out of the game.) Or go back to emailing a handful of friends, in the hope that they'll reply to your emails in the spare time left over after keeping up with Facebook. Or socialise only with people who live within walking distance of the same pub as you. Or, for that matter, go full Kaczynski and live in a shack in the woods. And when you've had enough of trapping squirrels for your food and mumbling to yourself as you stare at the corner each night, you can slink back to the bright lights, tail between your legs, reconnect with Mr. Zuckerberg's Magical Social Casino, where all your friends are, and once again find yourself privy to sweet, sweet commercially-mediated social interaction. In the end, we all come back. We know that, in this age, the alternative is self-imposed exile and social death, and so does Facebook, so they can do what they like to us.

As novel as this may seem, it is another instance of the neoliberal settlement, tearing up prior settlements and regulations in favour of a flat, market-based system, rationalised by a wilful refusal to even consider disparities of power (“there is no such thing as an unfair deal in a free market, because you can always walk away and take a better offer from one of the ∞ other competitors”, goes the argument taken to its platygæan conclusion). Just as in deregulated economies classes of participants (students, patients, passengers) all become customers, with their roles and rights replaced by what the Invisible Hand Of The Free Market deals out (i.e., what the providers can get away with them acquiescing to when squeezed hard enough), so here those using a means of communication become involuntary guinea pigs in a disruptive and, for half of them, literally unpleasant experiment. All that Facebook has to provide, in theory, is something marginally better than social isolation, and everything is, by definition, as fair as can be.

Facebook have offered an explanation, saying that the experiment was intended to “make the content people see as relevant and engaging as possible”. Which, given the legendarily opaque Facebook feed algorithm, and how it determines which of your friends' posts get a look into the precious spaces between the ads and sponsored posts, is small comfort. Tell you what, Facebook: why don't you stop trying to make my feed more “relevant” and “engaging” and just give me what my friends, acquaintances and groups post, unfiltered and in chronological order, and let me filter it as I see fit?
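For what it's worth, the alternative being asked for is trivial to express. The sketch below assumes a hypothetical Post type and a reader-supplied set of muted words; neither corresponds to any real Facebook API.

```python
# A minimal sketch of the alternative asked for above: friends', acquaintances'
# and groups' posts, newest first, with filtering left entirely to the reader.
# The Post type and mute_words parameter are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float  # Unix time
    text: str

def chronological_feed(posts: list[Post], mute_words: frozenset[str] = frozenset()) -> list[Post]:
    """Return every post, newest first, applying only the reader's own filters."""
    kept = [p for p in posts if not (set(p.text.lower().split()) & mute_words)]
    return sorted(kept, key=lambda p: p.timestamp, reverse=True)
```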
