#DMLTrust Webinar 3 Write-Up: Social-Emotional Literacies and Digital Citizenship

I was lucky enough to be included on the last ConnectedLearning.tv webinar, produced in collaboration with HASTAC as part of the pre-conversation/kick-off of the DML Trust Challenge. The Trust Challenge will award $1.2 million to institutions and organizations that build tools addressing the challenges to trust in connected learning environments — tools that are scalable, innovative, and transformative exemplars of connected learning, bridging technological solutions with complex social considerations of trust. I was the representative from the HASTAC team administering the DML V Trust Challenge, and I was also there as someone who teaches university-level media and culture courses in connected learning environments. My co-participants included:

  • Anne Collier - Youth/tech news blogger, and Editor of NetFamilyNews.org
  • Janelle Bence - Educator at New Tech High @ Coppell in Dallas
  • Jessie Daniels - Professor at the City University of New York (CUNY), and FemTechNet supporter
  • Anna Smith (moderator) - Educational researcher, teacher educator & teacher; founder of #literacies chat on Twitter

Our topic was Trust, Social-Emotional Literacies and Digital Citizenship Best Practices.

Initial Thoughts

This is an easy-to-talk-about but hard-to-pin-down topic when it comes to contextual frameworks. We are in the process of defining digital citizenship. Our models of citizenship tend to require a certain amount of trust in the systems of control, grouping, or whatever other mechanism is defining the citizenship. But we often talk about the experience and rights of citizenship instead of how the system is encoding citizenship. I think this is the same for digital citizenship. We get caught up in individual experience and have trouble breaking it down to the bits that are actually enabling the types of behaviour that make us ask if something is “good” or “bad”, “safe” or “dangerous”.

Another thought: an hour really isn’t enough time to talk about this. I hope the conversation continues in the various online spaces we find ourselves in.

The Talk

We started by trying to put some bounds on digital citizenship. The basic thing that seemed to come out is that digital citizenship is no longer about behaviour and a social contract. In connected learning spaces particularly, it is not about punishment, control, and safety in the same way. Instead, when we talk about digital citizenship, it is about engagement, relationship management, and impact. We all seemed to agree that, at this moment, an understanding of social justice and social activism is required to make digital citizenship something positive. In addition, digital spaces, where the impact of being in a shared physical space is absent, require a different type of social and emotional literacy.

This brought us to the five Social and Emotional Learning Core Competencies:

  • Self-awareness
  • Self-management
  • Social awareness
  • Relationship skills
  • Responsible decision making

I’m wondering how these core competencies can be built into tools and, as always, whether they need adjustment for digital learning spaces or stay the same. I’m also curious about the stream of thought that sees the digital as a space without a social contract. I’m wondering if a social contract could be created for digital learning spaces or if…

The social contract needs to be created in the learning spaces, assuming they are going to be using a closed or controlled system. This seems to be the thing that was lingering throughout most of the conversation, in my opinion. How digital tools and spaces are used is highly contextual. That means that each iteration of learning on the system requires its own ingroup social contract, guidelines, manifesto, or constitution. It also needs to be fluid enough to adjust to the things that happen in the middle of the module or learning experience. This led to a mini-version of my infamous rants on the value of lurking, which was summed up perfectly in a tweet posted to the Google+ hangout page: lurking is “legitimate peripheral participation” (H/T to Lave & Wenger).

There was pushback, because engagement is key in many ways, but I still think that when we think of engagement in these spaces, we need to acknowledge that it isn’t safe for everyone to engage. Some people face more risk than others, which requires more time to build the trust relationship with both the tool and the group that will be sharing in the learning experience.

And trust is still central. When we get to the point of talking about social and emotional literacies, we have to take into consideration the changes that come with:

  • closed versus open systems
  • murkiness of participants and participant roles
  • awareness of privacy, safety, and best practices of a given tool
  • digitally based practices

All of these things shift how we think about trust in digital connected learning environments.

There are a few half-saids and many unsaids from the webinar I’d like to bring up. I mentioned my past weekend’s issue with “master/slave” language in programming, and how that might alienate some learners. I’ve also received some feedback that these webinars have had an American slant. The Communication Studies scholar in me has to say “of course they do!” When we look at the internet, where it came from, and the languages that are used to turn things into pretty pages, they are, for the most part, American-based tools, started by a very specific group of people — something people are talking about a bit more now that Twitter has released its diversity numbers. Tools, much like systems of citizenship, have built-in biases, assumptions, and abilities based on the people who build them. As informed users of these tools, we should be questioning why this is the case, and what that means for how trust is encoded into our current tools versus the tools that will be made in the future, including for the Trust Challenge.

That brings me to my partial list of things that went unsaid during the conversation but that remain barriers to this new citizenship, which requires trust in both the community and the tools:

  • Language
  • Nationality
  • Geographic Location
  • Socio-Economic Status
  • Race
  • Ethnicity
  • Gender
  • Sexuality
  • Age
  • Access
  • Resources
  • Ability

And I know my own biases mean I am probably missing many. But that is why the conversation has to keep going. I don’t think a single tool can solve all of these, but scalability means we should be working towards tools that can be modified, forked, or adjusted to give as many people as possible accessible and safe learning through digital tools.

Resources and Readings:

Community of Practice: http://en.wikipedia.org/wiki/Community_of_practice

Knowledge Streams: http://inq13.gc.cuny.edu/knowledge-streams/

Social and Emotional Learning, 5 Core Competencies: http://www.casel.org/social-and-emotional-learning/core-competencies

Cheshire, Coye (2011). “Online Trust, Trustworthiness, or Assurance?” Daedalus, Vol. 140, Issue 4: 49–58. http://www.mitpressjournals.org/doi/pdfplus/10.1162/DAED_a_00114