Recently we have seen a lot of bad news about digital technologies, social media platforms, and data breaches. A part of me hesitates to call it 'news' since most of what's now part of public/popular knowledge has been known to those who work on social technology and information technology research for some time (as in a decade or so). Nevertheless, there is a steady beat of headlines these days on data being leaked, hacked, or just plain old sold.
Here's some good news: there are several new research efforts underway.*
Susan Benesch and J. Nathan Matias announced the launch of their study to diminish Twitter abuse. I have had the pleasure of getting to know both scholars through a network that works to address online harassment and I am eager to see how their hypothesis that social norms can help to transform online behavior holds up in current U.S. media ecologies. I'm also very keen to learn from their framework for ethical research on and with a public platform. I appreciate that Benesch and Matias have put a lot of thought into the financial, legal, and ethical paradigms for their study.
In a related effort, the Second Workshop on Abusive Language Online is scheduled to take place at the EMNLP (Empirical Methods in Natural Language Processing) Conference in Brussels in October 2018. I am a part of the organizing committee for this event, which is impeccably led by Zeerak Waseem, and a participant from the first workshop held last year (you can see the Proceedings here - they give a hint of the really rich conversations we had over the course of a full day). I'm particularly excited that ALW2 continues the tradition of bringing media and digital studies and critical race studies scholars together with those working in natural language processing and information studies.
Also this week, Facebook announced the launch of a new independent Elections Initiative in partnership with the Social Science Research Council, designed to give researchers access to proprietary data in order to better understand the impacts of FB on elections. Take Facebook's effort here with a grain of salt (and a dash of too late), but I can attest to the difficulty of getting access to proprietary systems for research purposes. I hope this helps address the black-boxing of social media tech. Like the Twitter partnership, I also hope that the involvement of SSRC and several independent funding groups helps to ensure the independent and non-exploitative research we need. See more: SSRC President Dr. Alondra Nelson's statement regarding the launch. If you can stomach Zuck's "I feel bad" rhetoric, you can read more about the initiative in this Atlantic interview. Want more? Check out Cathy Davidson's post on the effort as well!
Finally - if you're in the Seattle area 4/10, consider checking out this great panel: Consent in the Time of Data Mining (tomorrow's panel at Ethical Research for Human Centered Design in Seattle).
* caveat: I worry about "research." As women of color have taught me, too often the veneer of academic research is cover for work that reproduces the colonial and white supremacist structures that pervade dominant cultures, including those of research universities and initiatives. So I'll be watching all of this research carefully. I'm glad that there's attention to independence from the social media companies, but that doesn't guarantee that the research advances the cause of JUSTICE along with that of ethics (check out the work of the Center for Media Justice and this piece by Malkia Cyril and Karlos Schmeider on why we need media justice). Nor does independence from corporations ensure that those with privilege won't steal the work of others, as Moya Bailey and Trudy point out in their piece on the erasure of black women and femmes in social media work. If you're a white scholar working in social media spaces and you wonder if you need to do better, consider consulting the Respect and Power-and-Control Wheels created by the Digital Alchemists.
Image credit: Sam Felder, CC BY-SA 2.0, https://www.flickr.com/people/samfelder/?rb=1