
Ethics & That Facebook Study

Full disclosure: I am an inactive member of Facebook following a really bad summer of updates. I just ... couldn't emotionally handle it, which is funny given the news that came out over the weekend and was widely discussed on my social network of choice, Twitter. In fact, I feel almost like one of those infamous Twitter wars was on the brink of breaking out over the study. It's gotten so big that it's even made it to Bloomberg.

The shorthand is, Facebook manipulated the news feeds of a bit over 600,000 users to show them either more negative or more positive content, to see if they could affect their emotions. The research findings were: look, we can! The IRB situation is sketchy, in that it seems they extended an existing IRB approval, and they stated that by agreeing to the TOS, Facebook users had consented to be research subjects. There was no opt in. No opt out. No debrief. I don't think anyone knows whether they were or were not part of the study. The thing that was published, and that caused the mess, looked at two weeks of data. If all a user had to do was agree, though, the skeptical side of me wonders whether that means they only did this for two weeks, or if after two weeks they determined that it worked and put it into practice (for whatever practical reason made them feel the need to do the study in the first place).

Anyway, the conversation has been heated because this touches so many disciplines. Everyone has a toe in the water of the implications of this research, which is both good and bad. I am happy to see a bigger conversation happening around these born-digital and algorithm-based studies. I am excited to see people talking about humans, and the experience of human life and emotions. I'm disheartened to see people dismissing the 600k people because they are such a small percentage of the user base. I love that a wider conversation on ethics is starting in terms of methods, consent, IRB, and research that toes the line between corporate and academic research (because more and more things are turning into both).

What I am sad about, though, is my own thing: I'm not seeing a larger conversation on the Terms of Service as the site of consent. For me, I think this is as deep as my toe goes for now. The other stuff is overwhelming because people are ... highly invested (and this is a good thing, as long as we talk to each other instead of over each other).

So, back to the Terms of Service. It's an issue. It is a legally binding contract that the end user has no power to negotiate. If you want to use these services, you agree to their terms. You, the end user, have no terms and no right to demands or disclosure outside of what's written in those terms. Yet we often don't read them. And, as this case is showing, when companies act on the promises made in those contracts, we freak out. So, my hopes from the Facebook study fallout are

  • that we can have a bigger conversation on the ethics of using these spaces and tools that pull us into these contracts
  • that as researchers outside of these organizations, we understand that they will have more data than we will ever have access to
  • that we can come up with methods for academic and corporate research that are ethical, collaborative, and cross disciplinary
    • without putting participants at risk without their knowledge
  • that we can have a bigger conversation on the role of data, the algorithm, and the human

I'd like to share a site, Terms of Service; Didn't Read. It's a good one.

I'd also love to hear what, if anything, the Facebook study has people questioning/asking/contemplating.

3 comments

It's not surprising that the focus of much ire has been whether or not the TOS presents adequate informed consent; what I am surprised to (not) see is any conversation about the ethical obligation that PNAS has in this.

PNAS ought to be screening the methodology sections of its submissions and declining to publish materials that do not meet the ethical standards we expect when working with human subjects. If an article presents fascinating data that was obtained using morally repugnant methods, then the publisher should decide that the material does not meet its guidelines.

So, for me, this is also an interesting moment for discussing the ethical obligations of publishers and the review processes deployed by top-tier publications.


Heh, did you see their comment? 

http://www.pnas.org/content/early/2014/07/02/1412469111.full.pdf+html

This is one of those areas that is gray for me... Facebook is a company, so they don't have the same standards as an academic institution in terms of needing to gain approval for testing; thus my focus on the TOS. My sense of the overall reaction to the study, though, is that if corporations want to come to spaces previously dominated by academics and share their work, then they need to alter part of their business model... meaning, they need to add a line item for a proper external IRB... and I don't know that that is always possible in a business setting.

Plus, there is the issue of speed (just thinking about algorithms changing multiple times throughout the day): if they were to start trying to make their findings more public, and that required a normal academic review process, it would disrupt workflows and a lot of other things.

So, the question that I'd push back at you is: do you think it is okay for industry players to publish in academic journals, or should they stick with industry-focused journals? If they had published this in an industry journal, would it have received the same amount of flak?


I'm also pretty surprised that the PNAS review process was not part of the debate. We know peer review is highly variable in rigor, and this could just be a bad coincidence, but I was expecting some debate. The editor of the paper did comment on the IRB (or lack thereof) and on why more information about the process of data collection was not requested. That's a good start, and her answers make the case that PNAS assumed the university IRB had screened the research.
