Facebook mood study: Playing with your mind

CLICK - Facebook Likes reveal a lot about you, such as whether you drink beer, have sex regularly, or are happy.

Facebook engaged in a large study to see if users’ emotional states could be affected by their news feed content.
Consent of Human Subjects: Subjects not asked for permission first.
Findings: Extremely small effects.
Research methodology: Poor algorithms used, questionable findings.

Key finding: A reduction in negative content in a person’s news feed on Facebook increased positive content in that user’s posting behavior by about 1/15 of one percent!

We address three questions:

1. Why did some of the checks and balances possibly fail?
2. Should we worry about the study’s findings?
3. What benefits do Facebook users get out of this study?

Non-techie description of the study: News feed: ‘Emotional contagion’ sweeps Facebook

1. Some checks and balances failed

Following the spirit as well as the letter of the law is the key to successful compliance. In turn, any governance depends upon the participants doing their job thoroughly and carefully.

In this case, the academics thought this was an important subject that could be nicely studied with Facebook users. They may not have considered how much it might upset users and the media.

Cornell University has its own procedure in place for getting approval for research with human subjects. As the image below illustrates, the researcher is expected to reflect on the project and, if in doubt, ask for help.

CLICK - Why does the media not get the facts right about the Facebook study? #BigData

The university points out that it did not review the study. Specifically, it did not check whether it met university guidelines for doing research with human subjects. The reasons given were that its staff:


– did not collect the data, and
– did not do the analysis.

Cornell University attempts to minimize its role in this tussle, but its statement reads like damage control:
Media statement on Cornell University’s role in Facebook ‘emotional contagion’ research
US compliance: Federal Policy for the Protection of Human Subjects (‘Common Rule’)
Also interesting: Parsing the Facebook paper’s authorship and review

What about Facebook? By consenting to Facebook’s Data Use Policy — which, if you have a Facebook account, you did — you gave Facebook permission to use you as a test subject if necessary. Unfortunately, Facebook did not include the ‘research’ portion of this policy until May 2012.

Hence, when the study was conducted, users had not yet consented to this new policy. The policy states that the information may be used:

…for internal operations, including troubleshooting, data analysis, testing, research and service improvement…

Even so, making users submit to a catch-all list that does not provide a clear, specific description of a study fails the consent standard. The above illustrates that Facebook does not feel the need to ask your permission before carrying out its tests.

Adam Kramer – the Facebook employee AND first author of the paper – released a statement on his Facebook page (June 29, 2014) indicating that internal review practices were in place at Facebook. What they are and how they were applied to this study remains a mystery.

The paper was published in the Proceedings of the National Academy of Sciences of the United States of America (PNAS) and handled by the subject area editor Susan T. Fiske, a psychologist.

CLICK - Should consent from Facebook users have been acquired BEFORE the study began? Maybe, but how would this have biased the results? #BigData

Ms Fiske had some concerns, but as the image above and the one below both illustrate, it does not appear that the researchers submitted an approval form from their university. Nor did Adam D. I. Kramer submit anything verifying his claim that Facebook’s committee on Research with Human Subjects had reviewed the study.

CLICK - Should consent from Facebook users have been acquired BEFORE the study began? Maybe, but how would this have biased the results? #BigData

I guess editor Susan T. Fiske thought the information provided by the authors was sufficient… Apparently, the authors told her things were okay. If you cannot trust a fellow academic – whom can you trust? Insert sarcasm here.

Interesting read: Did Facebook and PNAS violate human research protections in an unethical experiment?

2. No need to worry about the study’s findings

Previous experiments show that emotional contagion happens in real-world situations. For instance, interacting with a happy person is infectiously pleasant, while crossing swords with a grump can launch an epidemic of grumpiness. Posting negatively worded content on Facebook does not make you many friends (see Forest, A. L., and Wood, J. V. (2012). When social networking is not working: Individuals with low self-esteem recognize but do not reap the benefits of self-disclosure on Facebook. Psychological Science, 23(3), 295–302. doi: 10.1177/0956797611429709. Retrieved May 16, 2012, from http://pss.sagepub.com/content/early/2012/02/07/0956797611429709.abstract).

This is the second study I have come across that tries to assess emotional ‘contagion’ through online exposure to mood-laden text. The current Facebook study revealed that reducing negative posts in a subject’s news feed led to an increase in that person’s positive posts, while increasing negative posts led to a decrease in positive posts.

How big is the effect, you might ask? Kramer et al. (2014) found a 0.07 percent (i.e., about 1/15 of one percent) decrease in negative words in people’s status updates when the number of negative posts in their Facebook news feed decreased.

A 0.07 percent shift works out to roughly one word in every 1,400 written (1/0.0007 ≈ 1,429), so you would have to write a few thousand words before producing even one more or one fewer negative word; this effect is VERY small. The study relied on the Linguistic Inquiry and Word Count (LIWC) text analysis software, which in turn relies on a dictionary of words classified as “sad”, “angry”, or “happy”. It counts their use and then arrives at an estimate of the emotions expressed in a given text. But does it work? The authors explain the use of this tool here:

“Posts were determined to be positive or negative if they contained at least one positive or negative word, as defined by Linguistic Inquiry and Word Count software (LIWC2007) word counting system…”
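To make that rule concrete, here is a minimal sketch of the classification step in Python. The tiny word lists are illustrative stand-ins of my own; the real LIWC2007 dictionaries contain thousands of English entries. Note how both failure cases discussed below – sarcasm and non-English text – sail straight through:

```python
# Minimal sketch of LIWC-style post classification as described in the paper:
# a post counts as positive (or negative) if it contains at least one word
# from the corresponding dictionary. The word lists below are illustrative
# stand-ins; the real LIWC2007 dictionaries hold thousands of English entries.

POSITIVE_WORDS = {"happy", "great", "love", "nice", "sweet"}
NEGATIVE_WORDS = {"sad", "angry", "hate", "awful", "hurt"}

def classify_post(text: str) -> dict:
    words = [w.strip(".,!?;:").lower() for w in text.split()]
    return {
        "positive": any(w in POSITIVE_WORDS for w in words),
        "negative": any(w in NEGATIVE_WORDS for w in words),
    }

# A sarcastic post is scored as purely positive:
print(classify_post("Oh great, my car broke down again. I just love Mondays."))
# -> {'positive': True, 'negative': False}

# A German post registers no emotion at all, because its words
# do not appear in the English dictionaries:
print(classify_post("Ich bin heute unglaublich traurig."))
# -> {'positive': False, 'negative': False}
```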

The study states: “People who viewed Facebook in English were qualified for selection into the experiment.” I, for one, view Facebook in English but often post in German as well. For my case and others like it (e.g., viewing the site in English but also posting in Spanish), the program fails completely: non-English words simply do not appear in its dictionaries.

To make my point, I entered the text below into the tool LIWC Light. The screenshot shows the result I got. Ignore the two right-hand columns, which give the average scores for previously analyzed texts.

CLICK - Linguistic Inquiry and Word Count is one of many primitive attempts to map natural language onto mathematical criteria. These programs see the word 'happy' as positive and the word 'sad' as negative, but they have no ability to detect sarcasm...

Even if we stick to English sentences, the algorithm’s results do not instill trust in these findings. Such attempts to map natural language onto mathematical criteria are useful as far as they go, but they fail to detect sarcasm. For instance:

CLICK - Linguistic Inquiry and Word Count is one of many primitive attempts to map natural language onto mathematical criteria. The program fails to detect sarcasm or irony, which are essential in analyzing social network content, and it has a limited grasp of grammatical constructs that sometimes turn word meanings on their head.

This sentence contains zero negative words according to LIWC’s count, demonstrating that the program cannot handle sarcasm properly.

See more here: Facebook tried to manipulate users’ emotions. But we have no idea if it succeeded.

3. Benefits for Facebook users? What benefits?

Facebook has tested before and learned that when people see more text status updates in their news feed, they write more status updates themselves. Nevertheless, if the validity of the data is questionable because of the tool used, what else, if anything, can we learn from the findings?

Given the issues mentioned above, should you trust the results? Not if you watch the 44-second video of Ronald Reagan with Mikhail Gorbachev.


So we have to wait for another study, one that will hopefully give us more insight into how and why positively worded content does or does not influence your peers on Facebook.

I am pretty sure that the study’s results will be replicated with larger effects (assuming the language coding improves) before long.

Interesting read: In defense of Facebook

Finally, everybody, including family and lovers, tries to understand your behaviour in the hope of getting you to do something; you decide whether that is manipulation. Your mother wants you to eat more vegetables and fruit, your boss wants you to work longer today without overtime, and so forth.

Tip

CLICK - Why sharing a blog's content on social networks matters #Usefulness #SmallData

Facebook already ranks how you interact with your news feed using an algorithm called EdgeRank. It measures things such as how frequently the news feed’s owner interacts with a post’s author and the quality of that interaction; a comment, of course, is more valuable than a Like.
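For illustration, here is a minimal sketch of how an EdgeRank-style score could combine those signals, following the publicly described affinity × edge-weight × time-decay formula. The weights and the decay half-life are made-up values of my own; Facebook’s actual parameters are proprietary and far more elaborate:

```python
import time

# Minimal sketch of an EdgeRank-style score: for each edge (Like, comment,
# share) attached to a post, multiply affinity (how close the viewer is to
# the actor) by an edge-type weight and a time decay, then sum.
# All numbers below are illustrative assumptions, not Facebook's values.

EDGE_WEIGHTS = {"comment": 4.0, "share": 3.0, "like": 1.0}  # comments count more than Likes
DECAY_HALF_LIFE_H = 24.0  # assumed: an edge loses half its value per day

def edge_rank(edges, now=None):
    """edges: iterable of (edge_type, affinity, unix_timestamp) tuples."""
    now = now if now is not None else time.time()
    score = 0.0
    for edge_type, affinity, ts in edges:
        age_hours = (now - ts) / 3600.0
        decay = 0.5 ** (age_hours / DECAY_HALF_LIFE_H)
        score += affinity * EDGE_WEIGHTS.get(edge_type, 0.5) * decay
    return score

now = time.time()
# A fresh comment from a close friend beats a day-old Like from an acquaintance:
print(edge_rank([("comment", 0.9, now - 600)], now))   # ≈ 3.6
print(edge_rank([("like", 0.2, now - 86400)], now))    # ≈ 0.1
```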

You can tweak the settings to make posts appear in their “natural” order.

By the way, if you feel you have wasted too much time on Facebook, why not check? Just download your Facebook history. This is available under ‘General Settings’. If you want to delete your account, you have to fill out a brief form.

Source – Facebook mood study: Why we should be worried!

VERY interesting read – ABSTRACT: Experimental evidence of massive-scale emotional contagion through social networks
Full Paper: Kramer, A. D. I., Guillory, J. E., and Hancock, J. T. (June 2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences of the United States of America (PNAS), 111(24), 8788–8790. doi: 10.1073/pnas.1320040111. Retrieved June 30, 2014, from http://www.pnas.org/content/111/24/8788.full

What do you think?

Facebook has made no secret that its news feed is a manipulated version of reality, and its proprietary EdgeRank algorithm helps. It selects the posts and links from your friends to display in your feed that it has found, through testing, are the most likely to interest you. Moreover, it knows which ones will encourage you to respond and post yourself.

How often do data scientists look at data for marketing purposes? Do you ever hear about it? Most likely not. The best part is that Facebook published these data and did not keep them under wraps. It is certainly nicer to know what purpose you were manipulated for than to be left in the dark. Kudos to Facebook for letting us know what they are trying to understand and not keeping these things secret! Plus, if you always tell people beforehand that they are part of an experiment, it can influence your findings (sometimes called response bias or the Hawthorne effect).

Nevertheless, here we have been the product, or guinea pig, that Facebook tested in this experiment. And while this raises ethical issues, Facebook is an advertising-funded social network. I find it far creepier to think about all the studies where we are the product that a company has been testing but we never find out about it (e.g., Edward Snowden raised a few similar issues…).

– What is your experience: do people feel, and thus post, differently when they see positive or negative posts in their feeds?
– The Office of the Data Protection Commissioner in Ireland has asked Facebook to clarify privacy matters, including consent for this study. Do you think this is an issue for regulators?

Have your say below.


Urs E. Gattiker, Ph.D. - CyTRAP Labs - ComMetrics.

Hooray – you read the whole post by author Urs E. Gattiker – aka DrKPI! Want to hang out more? Check out the news updates on Twitter, join our Social Media Monitoring discussion group on Xing, chat with us on Google+, and receive your fortnightly updates and behind the scenes scoops through our newsletter.

Urs’ latest book, Social Media Audits: Achieving deep impact without sacrificing the bottom line was published in April 2014 by Chandos Publishing / Elsevier – blog readers => grab your 25 percent discount with free shipping now. Extra Tidbit: Need to do an audit? Get Social Media Audit: Measure for Impact (Springer Science Publishers).


Urs E. Gattiker

Professor Urs E. Gattiker - DrKPI is corporate Europe's leading social media metrics expert (see his books). He continues to work with start-ups. Urs is CEO of CyTRAP Labs GmbH and President of the Marketing Club Lago, a member of the German Marketing Association (DMV).

6 thoughts on “Facebook mood study: Playing with your mind”

  • Pingback: Facebook and Cornell U. backlash: Why we should...

  • 28. July 2014 at 12:51
    Permalink

    Dear Urs
    I just wanted to pass on this Info:

    “I can’t say for sure, but even a site like avaaz pretends to be one thing and is maybe another – a wolf in sheep’s clothing. Big business uses sites like these to perform market research without having their name directly attached to it.
    http://www.avaaz.org/en/

    Conspiracy theories???”

    Reply
    • 28. July 2014 at 12:55
      Permalink

      Dear Anonymous
      This is difficult to reply to. I recently received something similar about this avaaz group:

      “Yes, we know about AVAAZ – and meanwhile stay as far away from them as possible. Please know that most of these ‘internet campaign’ groups are in reality perfidious businesses that make tons of money with your cause.
      We always advise true activist groups to run an internet campaign only on their own websites and servers. We did thorough research and can recommend maybe only one remaining group/campaign website that is ready to be transparent and truly co-operative, if a group doesn’t have the capacity to run a campaign on its own website and server.
      The old days of the honest ‘petitionsite’ et al. are long gone.”

      When I read the above, I find what Facebook did rather harmless. Not fair, maybe, but the results were published. And while the effects were minimal, it still makes me wonder… For instance, we know little, if anything, about the procedure Facebook uses when its research involves human subjects, as was the case in the above study.
      Hence, revealing that they did this study is a good first step.
      A second step is to tell us about their guidelines when doing research with human subjects.

      I find doing such things without being open about it worse. That is to say, running ‘do-gooder’ campaigns to find out how people feel about things is legitimate. But as a corporate sponsor you need to reveal who you are – being transparent certainly helps your brand and reputation.
      Doing some things the way AVAAZ does raises questions. Some further material came across my desk, such as:

      “The problem with AVAAZ specifically is that they get contracts from the big boys (political and business) ‘to check out certain public reaction to specific issues’ – testing the waters, so to speak – and advising them on strategic moves (of course against payment).
      They also are the mud-eaters in the fishtank of the big sharks and play one against the other… They are not interested in your cause…”

      What should we think about things like this? I worry about the things we do not know about… Conspiracy?

      Not really, but pushing too hard for profit may increase the risk of such things happening. What does everybody else think?

      Reply
  • Pingback: small win,small data,UPS,Google,iPhone,Harvard,incremental innovation

  • Pingback: non-discretionary release,ethics,morals,validity,reliability,iPhone 6

  • Pingback: business analytics | new product research | social media monitoring |
