Facebook Mood Research: who’s really not thinking this through?

I must admit, I’ve been totally bemused by the reaction of many folks and media outlets I usually respect to this “incident”. As you may recall from other news sources, Facebook ran some research to see whether posts they deemed “happier” (or the opposite) had a corresponding effect on the mood of friends seeing those status posts. From what I can make out, Facebook didn’t inject any changes into any text; they merely prioritised specific posts in the feed based on a sentiment analysis of the words in them. With that came cries of outrage that Facebook should not be meddling with the moods of its users.

The piece folks miss is that, due to the volume of status updates – and the limited capacity of your friends to consume that flow of information – an average of 16% of your status posts get seen by folks in your network (the spread, depending on various other factors, runs from 2% to 47%, but the mean is 16% – 1 in 6). That average has been progressively stepping down; two years ago it was around 25%. Facebook’s algorithms make a judgement on how pertinent any status is to each of your friends, and selectively place it (or not) in their feed at the time they read their wall.
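For the technically curious, here is a minimal sketch (in Python) of that kind of per-viewer selection. The Post structure, the affinity/recency scoring and the build_feed function are my own illustrative assumptions rather than anything Facebook has published; the point is simply that every candidate post gets scored for every viewer, and only the top handful of slots survive – which is why only a fraction of your posts are ever seen.

```python
# Minimal sketch of per-viewer feed selection (illustrative only; the
# scoring terms are invented, not Facebook's actual ranking model).
from __future__ import annotations
from dataclasses import dataclass, field
import time

@dataclass
class Post:
    author: str
    text: str
    created_at: float = field(default_factory=time.time)

def relevance_score(post: Post, viewer_affinity: dict[str, float]) -> float:
    """Hypothetical relevance: how close the viewer is to the author,
    decayed by the age of the post."""
    affinity = viewer_affinity.get(post.author, 0.1)
    age_hours = (time.time() - post.created_at) / 3600.0
    return affinity / (1.0 + age_hours)

def build_feed(candidates: list[Post], viewer_affinity: dict[str, float],
               slots: int) -> list[Post]:
    """Rank every candidate post for this viewer and keep only the top
    `slots` entries; everything else is silently dropped, which is why
    only a fraction of any author's posts are ever seen."""
    ranked = sorted(candidates,
                    key=lambda p: relevance_score(p, viewer_affinity),
                    reverse=True)
    return ranked[:slots]
```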

As an advertiser with Facebook, you can add weight to a post’s exposure, showing it in the wall of people with specific demographics or declared interests (aka “likes”). That can be a specific advert, or an invitation to “like” a specific interest area or brand – and hence to be more likely to see that content in your wall alongside other posts from friends.
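Again purely as a sketch, building on the toy ranker above: paid promotion and declared interests would simply be extra weight on the same per-viewer score. The ad_boost parameter, the topic tag and the 1.5 multiplier below are invented for illustration – this is not Facebook’s advertising machinery, just the general shape of it.

```python
# Speculative extension of the toy ranker: advertiser weight and declared
# interests ("likes") simply add more weight to the organic score.
def promoted_score(post: Post, viewer_affinity: dict[str, float],
                   viewer_likes: set[str], topic: str | None = None,
                   ad_boost: float = 0.0) -> float:
    """Base relevance, plus a bonus if the viewer has 'liked' the topic or
    brand the post is tagged with, plus any paid boost on top."""
    score = relevance_score(post, viewer_affinity)
    if topic is not None and topic in viewer_likes:
        score *= 1.5          # declared interest: more likely to surface
    return score + ad_boost   # paid weight sits on top of organic relevance
```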

So, Facebook changed their algorithm, based on text sentiment analysis, to slightly prioritise updates with a seemingly positive (or negative) disposition – and to see if that disposition found its way downstream into your friends’ own status updates. In something like 1 in 1,000 cases, it did have an influence.
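Seen against the toy ranker above, the experiment amounts to a small nudge on the score. Here is a hedged sketch of that nudge; the word lists, the crude word-count sentiment and the 10% adjustment are all invented for illustration (the published study used a rather more careful word-count classifier), but the shape of the intervention is the same: score the text, then lean the ranking slightly towards or away from it.

```python
# Hedged sketch of a sentiment nudge layered on the toy ranker above.
# The word lists and the 10% adjustment are invented for illustration.
POSITIVE = {"great", "happy", "love", "awesome", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "miserable"}

def sentiment(text: str) -> int:
    """Crude word-count sentiment: +1 per positive word, -1 per negative."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def nudged_score(post: Post, viewer_affinity: dict[str, float],
                 favour_positive: bool) -> float:
    """Nudge the base relevance up when the post's sentiment matches the
    condition this viewer was assigned to, and down when it doesn't."""
    base = relevance_score(post, viewer_affinity)
    s = sentiment(post.text)
    if s == 0:
        return base
    matches = (s > 0) == favour_positive
    return base * (1.10 if matches else 0.90)
```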

Bang! Reports everywhere of “How dare Facebook cross the line and start to meddle with the mood swings of their audience”. My initial reaction, and one I still hold, is surprise at the naivety of that point of view, which is totally out of touch with:

  1. the physics of how many people see your Facebook updates
  2. the fact that Facebook did not inject anything into the text – it just prioritised posts based on an automated sentiment analysis of what was written – and, above all:
  3. have people been living under a rock, that they don’t know how editorial decisions get prioritised by *every* media outlet known to man?

There are six Newspaper proprietors in the UK who control virtually all the National Newsprint output, albeit a business that will continue to erode with an ever-ageing readership demographic. Are people so naive as to think that Tabloid headlines, articles and the limited right of reply don’t follow the carefully orchestrated interests of their owners and associated funding sources? Likewise the Television and Radio networks.

The full horror is seeing a Newspaper, relaying stories about foreign benefit cheats, end up hiring a Russian model to act as a Latvian immigrant, inject alleged comments from her to incite a “how dare you” reaction, add the text of a government ministerial condemnation, and then heavily moderate the resulting forum posts to keep up a sense of “Nationalistic” outrage at the manufactured fiction. That I find appalling and beneath any sense of moral decency. That is the land of the Tabloid Press: never let facts get in the way of a good story. That is a part of society actively fiddling with the mood swings of its customers. By any measure, Facebook don’t even get onto the same playing field.

In that context, folks getting their knickers in a twist about this Facebook research are, I fear, losing all sense of perspective. Time to engage brain, and think things through, before imitating Mr Angry. They should know better.