
Settings for Trust in Connected Learning: an interview with danah boyd

This interview is part of the Digital Media and Learning Competition 5 Trust Challenge. The Trust Challenge funds successful collaborations or “laboratories” where challenges to trust in connected learning environments can be identified and addressed. Successful labs will create scalable, innovative, and transformative exemplars of connected learning that bridge technological solutions with complex social considerations of trust.  Find out more about the competition at

danah boyd is a Principal Researcher at Microsoft Research and a Fellow at Harvard's Berkman Center for Internet and Society. She is also the founder and president of a new think/do tank called the Data & Society Research Institute. Her research examines the intersection of technology and society. Currently, she's focused on research questions related to "big data", privacy and publicity, and teen culture. Her recent book - "It's Complicated: The Social Lives of Networked Teens" - has received widespread praise from scholars, parents, and journalists.  Blog: and Twitter: @zephoria

1. What about our contemporary moment makes understanding trust important?

The public projects a lot onto technology. It is seen as both the savior of our current economy and the destroyer of our cultural fabric.  The companies and organizations that build a lot of these systems are perfectly aware of how imperfect they are, but many people assume technology to be perfect and infallible (or outright evil).  To complicate matters further, the organizations that are building or employing new technologies are rarely local or connected deeply to the communities that use them.  As a result, a whole host of questions about trust emerge.  How do we understand the technologies? The companies that build them? The organizations that deploy them? The parties that abuse them? Given our general fear and misunderstanding of technology, this gets complicated very fast.
2. How are you thinking about trust in regard to connected learning?

When we talk about connected learning, we're implicating a whole host of different actors to enable learning - educators, parents, students, librarians, administrators, government agencies, technologists, learning companies, etc.  We need those varied actors to understand, respect, and trust one another.  And then we need them to help bake trust into the systems that they build - technological, social, and governmental.  At a technological level, trust requires security, privacy, and safety sitting at the center of the story.  These things take on a different valence when we're talking about social and governmental decision-making.  But trust starts from collectively recognizing that we're all working towards a desirable goal of empowering learners and realizing that getting there will be imperfect and require iteration.

3. What are some of the biggest challenges to engendering trust you see in connected learning?

Distrust. <grin>  More seriously, I do think that there's a lot of distrust between different actors in the network. Some of this comes from historical battles, but there is also genuine fear and concern about what new technologies and disruption writ large mean for those who have spent their lives in education. The other core issue is that people's failure to understand technology's strengths and weaknesses means that the public often has unreasonable expectations regarding technology and its application.  This is not helped by industry actors who are happy to sell the moon without accounting for the limitations of what various tools can or cannot promise.

4. Do you know of any tools, procedures, apps, and/or systems enabling or disabling trust? How are they doing this? How do these tools, procedures, and/or systems change how learning can happen in connected learning environments?

Advancements in this arena happen at multiple levels. For example, encryption can be a powerful tool for enhancing privacy and security.  Public commitments and correction procedures - such as those made by Wikipedia - can go a long way in building trust over time, even when people doubt the service at the beginning.  Publicly detailed data management plans, such as those required by many federal grants, can be a great mechanism for assessing the efforts of a particular endeavor. The most important thing to remember is that no system is perfect, no procedure infallible.  So a huge part of the process of building and sustaining trust is to plan for what happens when things go wrong.  We do this all the time in education - think about fire drills - but we don't realize how important this is when we think about technology.
5. What are some of the literacies you think are required for learners to have a digital “trust literacy”?

I think that people need to understand how data is collected, aggregated, sold, and used in the process of enabling all sorts of everyday services.  Why do you think you got the results you got on Google? How did Amazon decide to recommend that other product to you? Why are you seeing the ads you're seeing on your local newspaper's site? What happens when you Like something? The more that people can understand how data operates in a networked society, the more that we can have a meaningful conversation about trust.  And the more that people can start asking questions of the services they are using in order to hold those services accountable.
