Trust and the Moment of Technological Faith: interview with Nishant Shah

This interview is part of the Digital Media and Learning Competition 5 Trust Challenge. The Trust Challenge funds successful collaborations or “laboratories” where challenges to trust in connected learning environments can be identified and addressed. Successful labs will create scalable, innovative, and transformative exemplars of connected learning that bridge technological solutions with complex social considerations of trust.  Find out more about the competition at

Nishant Shah has a PhD in cybercultures and is a professor of Internet and Aesthetics of New Media at Leuphana University, Lueneburg, Germany. He is the co-founder of the Bangalore-based Centre for Internet & Society, where he was Research Director for six years, and is also a knowledge partner to the Dutch development non-profit Hivos, working on developing new practices of change in network societies. His current research sits at the intersections of the body, digital technologies, gender and sexuality, collaborative pedagogy, and connected learning.

1. What about our contemporary moment makes understanding trust important?

We live in times of faith. As our technologies become more and more transparent, they also become opaque. We work with machines that promise that what we see is what we get, but that is an empty promise, because as the lag between the input, processing, and display of data shrinks, we lose control over the machinations that run in the background. The contemporary moment is a moment of the interface, where all our attention is geared towards understanding, improving, and analysing interfaces. However, these interfaces are surfaces. Interfaces hide the infrastructure. Even as we worry about better visual representations, more accurate mapping, and stronger connectivity, we are losing control of the real operations where decisions of power, control, regulation, containment, and censorship reside. This is what I call the moment of faith. In order to stop having blind faith that governments, private corporations, technologies, or indeed the people we connect with will all behave as expected, in conditions of transparency, we need to think about trust again. Trust is faith quantified. Trust requires enumeration. Trust needs responsiveness and responsibility. And more than anything else, trust demands reciprocity. So instead of having faith, and then being constantly surprised at how different things are, we need to start thinking about trust – its processes, its mechanics, and its measurement.

2. Often when we hear terms like “student data” or “student privacy” we don’t hear them in conversation with “trust”. Do you have any thoughts on why that might be the case?

I think that there is a deliberate division of intellectual discourse when it comes to some of the most important debates around the intersections of digital technologies and learning. The questions around data, for instance, are divided into two discrete sets. Educators and learners are invited to engage with concerns around privacy, identity, robustness, and authenticity of data, whereas questions of trust, security, licensing, and storage are often relegated to the realms of technologists and designers. This division is not only erroneous but downright dangerous, because it makes people believe that they don’t need to worry about the other concerns – because somebody else is taking care of them. This is why concepts like ‘trust’ become important. In order to talk about the design of trust, we will need to straddle these divisions and think of them as not only co-existent but also inextricably tied to each other. We will have to acknowledge that questions of data identification, identity, quantification, and ownership are tied closely to the digital architecture, conditions of access, protocols of design, and ownership of platforms. This opens up a dialogue between the artificially created silos of technology development and content production, or technical architecture and political control, making these into technosocial questions rather than merely technical or social ones.

3. How are you thinking about trust in regard to connected learning?

For me, what is most important in the landscape of connected learning is to map the bottlenecks of trust. If we were to go with the metaphor of the network, and connections as intersections, then what matters most is finding out the flows of trust: the traffic, the infrastructures, the nodes and hubs and routes that trust processes and data take. Especially because connected learning seeks to overturn the systems of authority that traditionally ensured and safeguarded trust practices, it becomes important to see how different communities of learners interact with each other, but also with service providers, regulators, policy actors, and communities of support and of infrastructure. So when I think of trust in connected learning, I am more interested in thinking about how protocols of trust can be established – how trust can be measured and effectively reported, and how it can be infused with the affective, the human, the subjective, and the personal, rather than just interface and reporting fixes.

4. What are some of the biggest challenges to engendering trust you see in connected learning?

I like to think of opportunities rather than challenges, because a challenge presumes that there is an active resistance against developing trust in connected learning environments and its promises. Active resistance is easier to overcome, because it only needs education, training, and information. The opportunities, however, lie in the more complex questions that emerge in connected learning.

The first set of opportunities is in recognising our education and learning processes as shaped by technologies. When it comes to connected learning, there is an easy argument of novelty that presumes this is the first time our learning is intersecting with technologies. However, even the most cursory critical history will teach us that the entire modern education system has been shaped by technologies of information production, storage, and distribution. The schism between people who make apps for learning and people who engage in the process of teaching and learning has to be removed. And that is going to take more than just practice. It is going to require a common vocabulary and a dialogue that allows the different stakeholders to actually understand each other’s processes and modes of working, and maybe even engineer hands-on immersion into each other’s work. To build connected learning, we might need to first connect the different elements involved in the field and get them to trust each other.

Given that connected learning is not restricted to traditional learning environments, the second set of opportunities is going to be shaped around what is at stake. It is easy to think of fixes and platforms and designs and databases as modes of bringing together connected learning ideas. However, we need to find a common ground, a political vision, a set of values that we embody as we work through new partnerships, open collaborations, and participatory processes in open learning. It is one thing to operationalise trust through processes and practices, but it is going to take more effort and resources to figure out that at the core of our different approaches is a common set of ideas and ideologies that brings us all together.

The third set of opportunities is in dismantling the notion of trust itself. There is already a growing rhetoric of niceness, inclusion, respect, and generosity that is often used to actually penalise radical ideas, or those who refuse to subscribe to one narrative of power and politics. We need to make sure that we think of trust not as a finite thing, but as a continuously iterative process that will find contradictions and discrepancies, not only on the outside, but from within the community. What constitutes trust, and how do we ensure that trust does not become a monopoly or a grand narrative, but constantly allows for mistrust and scepticism, instead of developing into a faith in trust?

5. Do you know of any tools, procedures, apps, and/or systems enabling or disabling trust? How are they doing this? How do these tools, procedures, and/or systems change how learning can happen in connected learning environments?

I am hoping that this competition actually brings to our attention some of the most creative ways by which trust can be enabled and, once enabled, sustained. There are many different existing protocols that are useful in making sure that trust is part of a system, and they range from UI design to architecture to human intervention. For instance, verification certificates, checking URLs for malicious code or redirects, enforcing HTTPS logins, and so on are great ways by which access is made safe, encouraging people to perform complex processes like financial transactions or medical data transfers. Similarly, collaborative databases that verify the provenance of information, self-editing and corrective algorithms that help provide better information sources, and user-generated curation of information that enables new insights and access to online information are all ways by which an environment of trust is created. Human intervention, where community conflicts get mitigated, offensive material gets flagged, trolls get punished, and new people are encouraged to participate, is another way by which trust gets generated. All of these are elements that we need to be able to incorporate into our connected learning environments, along with the traditionally known structures of creating safe and inclusive spaces for learners to create, innovate, and experiment with knowledge content and processes.
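As a concrete illustration of the kind of mechanical trust check mentioned above – enforcing HTTPS logins – a learning platform might screen any login link before embedding it. This is a minimal sketch, not a recommendation from the interview; the function name and example URLs are hypothetical:

```python
from urllib.parse import urlparse

def is_safe_login_url(url: str) -> bool:
    """Accept a login URL only if it uses HTTPS and names a host.

    A deliberately small, mechanical check of the kind described
    above; real platforms would also verify certificates and
    inspect redirects.
    """
    parsed = urlparse(url)
    return parsed.scheme == "https" and bool(parsed.netloc)

# A plain-HTTP login link would be rejected before being shown to learners:
print(is_safe_login_url("https://example.org/login"))  # True
print(is_safe_login_url("http://example.org/login"))   # False
```

The point of the example is not the code itself but the pattern: trust here is not a feeling but an enumerable, testable property of the system, which echoes Shah's earlier remark that trust requires enumeration.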
