
A Glance at Facebook's Relationship with Users

Since Facebook was founded in 2004, its mission has been to “give people the power to share and make the world more open and connected,” and with over 1.4 billion users it has achieved nothing short of remarkable success. Facebook has become integrated into many of our daily actions. When I walk through my town, the writing on the cutesy cupcake shop windows urges me to “like” the business on Facebook. When I install a new time-sucking game on my phone, it asks if I want to sign up quickly using my Facebook profile. When I was studying in France and was too nervous to use my shoddy French skills to buy a phone plan, I simply asked my friends to message me through Facebook instead. And recently, when I was scrambling to find a job to occupy me after my impending graduation, I joined a recruiting group for my town on Facebook. These are only a few of the ways that Facebook has become a fixture of my daily life, making interaction with the world as simple as clicking a few buttons, and I’d bet that whatever your feelings about the service, it has become a fixture of your life as well. At this point, it’s much more difficult to opt out of using Facebook while still staying connected and competitive with the rest of the world.

Inevitably, in the course of using Facebook’s services, we give up an enormous amount of personal information about ourselves. A quick look through Facebook’s privacy explanation reveals that it constantly collects data about every single thing you do on the site: what you “like,” what you search, whom you message, what you post, what you write and then choose not to post, and even where your mouse is on the page while you scroll and how long you look at each post. Facebook also collects your location and address book by accessing the device you use, and it gathers information from all the sites and apps you use Facebook to log into. With this massive amount of data, Facebook can paint a remarkably accurate portrait of who you are, down to your demographic information, your interests, your exact location, and your social connections.

If you’ve ever taken a scroll through your News Feed, then you won’t be surprised to find that Facebook uses all of this tantalizing data to attract advertisers who are trying to pinpoint and target specific sets of users. Looking through my own feed, I see frequent ads from Fabletics, a sportswear company for women. Most likely, Fabletics is trying to sell to young, college-aged women, and since Facebook knows precisely which of its 1.4 billion users fall into this demographic, it is an attractive option for any company. Indeed, Facebook is very clever; in addition to these direct advertisements, users will often see companies that are “liked” by their friends. Facebook is hoping to influence its users through their friendship connections. The closer you are to someone, the more likely they are to influence you, so Facebook is quick to show you what your friends are paying attention to in the hopes that you will pay attention too, and it is effective.

Still, all of this is probably in line with what you’d expect from Facebook. After all, it is a corporation worth more than $100 billion, and you and I don’t have to pay a dime to use it. However, a closer look at Facebook’s privacy policy reveals a line that quickly and vaguely mentions that the service may use your data for “research.” It’s not at all strange for a company to carry out research in order to improve its services, but with its power and unparalleled access to personal data, it is not difficult for Facebook’s research to stray into the realm of the controversial. As you might know, Facebook conducted a study in 2012 on nearly 700,000 users wherein it manipulated their News Feeds to show either less positive or less negative content, in an effort to understand how a user’s feed might affect their mood. This study was most likely a response to the idea that when we see our friends posting about their perfect lives, we feel envious or left out, but Facebook’s study revealed that we in fact generally feel happy, not sad, when we see our friends posting happy content. While it may seem that Facebook is within its rights to conduct research to better understand its service, many are left feeling angry and creeped out by Facebook’s ability to manipulate emotions using sorting algorithms.

When we sign up with Facebook, its privacy policy can reasonably lead us to expect that we are trading our data for use of the site and that Facebook is in turn selling that data to advertisers. Many users knowingly consent to this; after all, you get to see content and advertising that is most likely to interest you, and you get to use the service without monetary cost. Win-win, right? However, the privacy policy does not reasonably lead users to expect that their own emotions will be blatantly manipulated. Facebook conducted research on humans that fell short of the standard of informed consent required by Institutional Review Boards (IRBs); it should have explicitly made its users aware of the possibility of being part of the research and given them the chance to opt out, instead of hiding behind a vague statement in its privacy policy. By acting in this manner, Facebook shows users that it is more concerned with maximizing its services than with adequately protecting them from possible psychological harm. In treating users like simple data sources to be mined rather than as actual people, Facebook effectively destroys its own brand trust.

In a Forbes article entitled “In Brands We Trust - Why Brands Must Treat Trust Like Gold,” contributor Steve Olenski reported that “79 percent of respondents are more likely to provide personal information to a ‘trusted’ brand” and quoted Mark Lancaster, CEO of SDL, which conducted the research on brand trust, as saying, “Marketers that understand their customers’ privacy concerns and commit to using customer data judiciously will create a strong customer commitment.” It makes sense; companies like Facebook make billions off of our data, and we only feel comfortable sharing that data, personal information about ourselves, when we trust the brand.

Startlingly, Facebook’s omnipresence in our lives means that we aren’t likely to stop using it altogether, even though many users don’t trust the company to protect their privacy. Yet, while I can distinctly remember signing up for the service 8 years ago and eagerly putting all my interests and thoughts onto my profile, I actively refrain from doing so now that that trust has been broken. It is also not uncommon to hear of people deleting the Facebook Messenger app from their phones because they feel that its ability to access most of the phone’s features, such as the GPS and microphone, is an overreach of what one can reasonably be expected to exchange for Facebook’s services. Unfortunately, the fact that Facebook now essentially forces you to use the app if you’d like to message from a mobile device suggests that rather than focusing on rebuilding brand trust, it will instead focus on making extensive and automatic data collection as ingrained as the need for its services.

Our power relationship with Facebook is clearly disproportionate; it is difficult to protest the actions of a corporation whose services are so ingrained in your life that you are reluctant to stop using them. It will be interesting to see whether our attempts to limit the data we give out will be effective in changing the company’s attitude toward and appreciation of its users, but I can’t say I’m optimistic.
