We’ve all had it happen: you ‘like’ a friend’s “engaged” Facebook status, and suddenly all of your suggested pages are engagement ring websites or wedding venues. Or, if you’re like me, your Facebook ads have already begun facilitating a future crazy cat lady status by continuously suggesting pages to ‘adopt a kitten’.
It’s not news to anyone that Google and Facebook have opened the doors to the largest marketing database in history. All online ads are catered to our likes and dislikes so that even the first page of an identical Google search can be completely different between any two people. Some consider this to be a huge invasion of privacy. For others, it’s convenient; you’re going to be exposed to online advertising anyway, it may as well be for products you’re actually interested in.
But how would you feel if you knew that the Facebook terms and conditions you agreed to allow the company to twist your newsfeed and alter your mood each day?
Recently, Facebook researchers published a scientific report on the ways social media can be used to manipulate the overall mood of users. The paper was aptly titled “Experimental evidence of massive-scale emotional contagion through social networks”. Essentially, Facebook has a team of data scientists who select groups of users at random and tweak their newsfeeds to understand how the human mind can be manipulated via the social media messages it is shown.
The experiment aimed to discover whether social media can actively change a user’s mood simply through the terminology they are exposed to on their newsfeed. The answer: yes. If you were among the 600,000-700,000 users randomly selected, your newsfeed was skewed for a week to shift the balance of positive and negative emotional words you were exposed to. Your subsequent posts were then analysed to understand how this affected your happiness. These Facebook researchers actually showed that they had made people “sadder” during this experimental week.
This is interesting as all hell, but slightly scary. You could be part of a (totally legal) Facebook experiment right now. Consider people with personality disorders, mental health issues or just a tendency to become self-deprecating: the tone of their entire week can be recklessly controlled by Facebook.
On a surface level, yeah, maybe you’ll be a bit of a jerk for a week. On a larger scale, consider that it could be the week you interview for your dream career, or the week a family member dies.
The potential volatility of such subliminal research could have widespread legal and social ramifications. But what does it matter, when you’ll never know you’ve been randomly selected to partake in this emotional manipulation? Clearly, ignorance is not always bliss.
What do you think? Should the terms and conditions on Facebook be more explicit? Should there be a box that users have to consciously tick to allow their activity to be monitored? Or is this all unlikely to ever seriously affect anyone?
WORDS BY CYNDALL MCINERNEY, a student who can’t help but think it would be really easy to blame her poor choices on a suspiciously dark Facebook newsfeed. Take a look at her other posts here.