Tuesday, July 1, 2014

A Facebook "social experiment" to manipulate your feelings

For one week in early 2012, Facebook ran a somewhat creepy social psychology experiment on about 690,000 users of the site. In conjunction with Cornell University and the University of California, the social media company attempted to influence the emotional state of users by controlling the types of posts that showed up in a person's news feed. Specifically, it reduced the amount of "emotional content" in the news feed, in some cases reducing only negative content, and in others reducing only positive content. As reported in the study, "These results indicate that emotions expressed by others on Facebook influence our own emotions." At the risk of sounding unprofessional: well, duh.

It is in our nature to be affected by the emotions of those around us, whether we are cognizant of it or not. Words, attitudes, appearance, and body language offer clues as to whether someone is happy or sad, and the human psyche more often than not leads us to empathize with those around us. Facebook, however, demonstrated "emotional contagion" strictly through text content. Those who saw fewer positive posts themselves wrote fewer positive posts, and those who saw fewer negative posts in turn produced fewer negative posts. Put loosely, users who saw fewer negative posts were slightly happier.

Honestly, I'm not at all surprised, and at first glance I am not terribly disturbed by it. It's a matter of remembering who the customer is. I'm not paying Facebook for a service, so I am not the customer ... I am the product being consumed. Since I understand that, I can keep it in mind when deciding what to share and what not to share. In truth, I think Facebook did the world a service by revealing, in a controversial way, what all media do. As the study report states: "Because people's friends frequently produce much more content than one person can view, the News Feed filters posts, stories, and activities undertaken by friends. News Feed is the primary manner by which people see content that friends share. Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging." So yes, Facebook manipulates news feeds, because the business is best served when users (the product) are actively engaged. It's not all that different from traditional media - no one would deny that all media pick and choose what to report.
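To make the mechanics concrete, here is a minimal sketch of the kind of filtering the study describes: posts of a given emotional valence are probabilistically omitted from a viewer's feed. The sentiment labels, function name, and omission probability below are my own illustrative assumptions - the actual study classified posts using the LIWC word lists, and Facebook's real ranking algorithm is far more complex.

```python
import random

def filter_feed(posts, suppress="negative", omit_prob=0.5, rng=None):
    """Return a feed with some posts of one valence probabilistically omitted.

    This is a toy illustration, not Facebook's actual algorithm.
    """
    rng = rng or random.Random(42)  # fixed seed for a reproducible example
    kept = []
    for post in posts:
        # Posts matching the suppressed valence are dropped with omit_prob
        if post["sentiment"] == suppress and rng.random() < omit_prob:
            continue  # omit this post from the viewer's feed
        kept.append(post)
    return kept

posts = [
    {"text": "Great day!", "sentiment": "positive"},
    {"text": "So frustrated.", "sentiment": "negative"},
    {"text": "Lost my keys again.", "sentiment": "negative"},
    {"text": "New puppy!", "sentiment": "positive"},
]

feed = filter_feed(posts, suppress="negative")
# The surviving feed skews more positive than the raw stream.
```

The point of the sketch is that nothing about the viewer's own behavior changes at this stage - only what they are shown - yet the study found their subsequent posting behavior shifted to match.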

I do not find it in the least bit surprising that Facebook would try such an experiment. What is disturbing, though, is thinking about how this idea could be used in some really unnerving ways. Facebook has somewhere around a billion users from all around the globe. It's not too much of a stretch to think certain three-letter-acronym agencies could compel the site to use this capability with the express goal of inducing dissatisfaction with a particular government or (if you are a conspiracy theorist) with a candidate for office running against an entrenched incumbent.

Before running wild with such ideas, though, it is worth looking at the actual results of the experiment. Yes, Facebook was able to demonstrate a change in the content of posts by the manipulated users - but the effect was on the order of a tenth of a percent. Statistically measurable, but hardly overwhelming.
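That distinction - statistically measurable versus practically meaningful - falls straight out of the sample size. With hundreds of thousands of users per experimental arm, even a minuscule shift in average behavior produces a significant test statistic. The back-of-envelope calculation below uses made-up numbers (group size, means, and standard deviation are illustrative assumptions, not the study's figures) to show how a hundredth-of-a-point difference in mean "percent positive words" can still clear the usual z ≈ 1.96 significance threshold:

```python
import math

n_per_group = 345_000   # roughly half of ~690,000 users per arm (assumed split)
baseline = 5.25         # hypothetical mean % positive words, control group
shifted = 5.24          # hypothetical mean, treated group: a 0.01-point drop
stdev = 2.0             # hypothetical per-user standard deviation

# Standard error of the difference between two independent sample means
se = stdev * math.sqrt(2 / n_per_group)

# z-statistic for the difference in means
z = (baseline - shifted) / se
# With n this large, z comes out above 1.96 - "significant" despite the
# difference being far too small for any individual user to notice.
```

This is why "statistically significant" headlines deserve a second look: the study's huge sample guaranteed that even trivially small effects would register.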

TL;DR? Facebook performed a research experiment that borders on creepy, proved that it could (minutely) manipulate users' emotional state, and reminded users once again that if you are not paying, you are not the customer - and should keep that in mind.

Do you have something to add? A question you'd like answered? Think I'm out of my mind? Join the conversation below, reach out by email at david (at) securityforrealpeople.com, or hit me up on Twitter at @dnlongen

Whois David?

I have spent the better part of two decades in information technology and security, with roots in application developer support, system administration, and network security. My specialty is cyber threat intelligence - software vulnerabilities and patching, malware, social networking risks, etc. In particular, I strive to write about complex cyber topics in a way that can be understood by those outside the infosec industry.

Why do I do this? A common comment I get from friends and family is that complex security topics give them headaches. They want to know, in simple terms, how to stay safe in a connected world. My peers and I have chosen to make a profession out of hacking and defending; having done this for the better part of two decades, I have a high degree of knowledge in the field. Others have chosen different paths - paths where I would be lost. This is my effort to share my knowledge with those who are experts in something else.

When not in front of a digital screen, I spend my time raising five rambunctious teens and pre-teens - including two sets of twins. Our family enjoys archery, raising show and meat rabbits, and simply living life in the Texas hill country.

For a decade I served as either Commander or a division leader for the Awana Club in Dripping Springs, Texas; while I have retired from that role I continue to have a passion for children's ministry. At the moment I teach 1st through 3rd grade Sunday School. Follow FBC Dripping Springs Kids to see what is going on in our children's ministries.