Project Q&A With: NatureBadges: Open Source Nature & Science Badge System

NatureBadges: Open Source Nature & Science Badge System is a collaboration between the Smithsonian Institution’s National Museum of Natural History and LearningTimes. NatureBadges leverages the fact that NMNH is the second-most visited museum in the world to connect the onsite physical museum experience to digital tools for lifelong learning and engagement. The museum will be a hub for a strong international network of science and nature badges so that the audiences introduced to badging through innovative hands-on digital activities at the museum will have the opportunity to jumpstart their informal learning through badges from dozens of organizations.

What are the 3 most important things about building a badge system you would share with another organization just getting started?

We think our experience and advice would be most useful to other museums looking to integrate badging.

1. How do badges work best in the context of a museum visit?
Many badge systems are constructed around particular programs or workshops that participants have signed up for. In contrast, the museum visitor often feels they have little time to engage deeply, which makes designing a meaningful badge system for them challenging. As we know from the Dallas Museum of Art’s new badge system, there are compelling ways to engage the museum visitor through badges. But if you want to connect badges to specific learning goals, having them represent a skill learned or a deep engagement with a subject, you need to think about how a quick 5- or 10-minute experience connects to a badge that means something outside the ecosystem of your museum.

2. How will your badges be meaningful outside your museum?
If your badges are things that you want your users to use when applying to jobs or college, for example, you need to make sure that the assessment and criteria for earning the badge are clear and translatable. In the case of something like photography skills, this might be fairly straightforward. But in a case of learning about geological forces through a game, the learning goals and assessment of them are more complex.

3. What and how will you assess?
This is an important trade-off: in order to create badge-able learning activities that visitors can engage in ad hoc, we need assessment to be done automatically by software rather than by people. Because we don’t want to create quizzes but instead provide inquiry-based experiences, the automatic assessment can measure little more than time spent or whether a visitor accesses something. So the assessment we are able to do captures some aspects of engagement. Any discussion of badges for learning ultimately comes back to a complex discussion about assessment, so it’s important to know that as you embark on your badges project.
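As an illustration, the engagement-only assessment described above amounts to checking which steps a visitor accessed and how long they spent. A minimal sketch, assuming hypothetical names (this is not the actual NatureBadges software):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of automatic, engagement-based assessment:
# the software can only observe which steps a visitor opened and
# how long they spent, not what they actually learned.

@dataclass
class ActivitySession:
    steps_opened: set = field(default_factory=set)
    seconds_spent: float = 0.0

def assess_engagement(session, required_steps, min_seconds):
    """Award credit when every required step was accessed and a
    minimum time-on-task was reached -- a proxy for engagement,
    not a measure of understanding."""
    return required_steps <= session.steps_opened and session.seconds_spent >= min_seconds

# Example: a visitor who opened all 5 steps and spent 6 minutes
session = ActivitySession(steps_opened={1, 2, 3, 4, 5}, seconds_spent=360)
print(assess_engagement(session, {1, 2, 3, 4, 5}, 300))  # prints True
```

The design point is exactly the trade-off named above: everything this check can observe is behavioral, which is why the system measures engagement rather than learning.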

Our badge project is due for beta launch in August 2013 and public launch in November 2013.

We are leading efforts to create and support our badge project in conjunction with education, exhibits, IT, and curatorial partners at NMNH and other Smithsonian museums.

Our badge strategy has involved coordination with other stakeholders at NMNH and beyond. We have designed many iterations of the badging system strategy, integrated it into our evolving, complex system of activities, and designed graphic iterations of the badge interfaces for the system.

Software Development and IT Integration: We have also developed digital software that forms the underlying structure for how our museum and online visitors will earn badges. The digital tools have been created in collaboration with our science educators and provide a flexible structure for the ongoing development of inquiry-based learning experiences. The software supports challenge-based games, skill-building exercises, and guided inquiry. The tools created are based on the tools that our research scientists use, and include a lightbox for comparing media; drawing, measuring, and annotation tools; a personalized fieldbook; and a context tool for using multimedia, including 3D objects, to solve problems and investigate issues. Rebecca has directed the development of this software, including UX oversight, design direction, vendor management, QA testing, data management, IT systems integration, privacy and security coordination, and collaboration with our education and science teams. The software is fully designed and in alpha state, on track for beta release by August 2013.

Content Development: Our educators have been creating content for the badge system. Our UX designer for all of the digital activities has guided our educators in building participatory experiences that our users will do to earn the badges. These interactive activities cover topics ranging from geologic forces to adaptations in creatures living between grains of sand. Although our educators have created participatory activities for our museum audiences in the past, the integration with digital tools and badges is unprecedented, and so the work of integrating our science, our educational philosophy and experience, digital tools, challenge-based participatory experiences for teens, and badges has been challenging and very interesting.

Outreach: Our team has reached out to the rest of the Smithsonian Institution, other badge projects, and other museums to share information about badges; we have presented about badges at AAM and participated in the NY Hall of Science’s badge study. Rebecca is an evangelist for badges at our museum and has presented to our staff on many occasions about the intricacies of digital badging.

The badging system is integrated into an innovative, museum-wide project called Q?rius. The cornerstone of Q?rius is a 10,000 square foot interactive space opening in November 2013 that’s part lab, part collections vault, part DIY garage, and part town square. It features immersive self-guided experiences and programs led by Smithsonian staff on current discoveries and their relevance for visitors’ lives. The badging system is integrated with activities; its development has been a major driver of how activities are developed with and for our audiences. Therefore, a large team composed of IT, education, and exhibits professionals has been involved at some level in its conceptualization and development.

Who were you addressing with your badge system design?

Our primary audience is youth age 10-18. Within that, we are focusing on:

  • Youth visiting the National Museum of Natural History
  • Youth visiting our website
  • Youth volunteers at the museum

What were your initial goals for the badges? Did those goals change at all throughout the design process?

  • Badges should incentivize onsite visitors to spend more time in Q?rius than they would otherwise.
  • Badges should incentivize onsite visitors to pursue more advanced activities and deeper understanding of science.
  • Badges should increase post-visit online visitors. There will be more incentive for people to visit us online after they leave if they have started to earn a badge.
  • NMNH badges should be meaningful and reflect actual time commitment, learning and understanding.
  • The Badge Project should provide us with frameworks for partnerships with other organizations with badge programs, enabling us to lead networks of informal science and nature education organizations.

In the first year:

  • More than 40% of visitors to Q?rius should register and start earning stars towards badges for the activities they complete
  • 5% of Q?rius visitors should earn a badge in the first year
  • The numbers are modest for the first year, which is seen as a pilot year for testing the appeal of badges in the interactive museum environment.

Goals for User Outcomes:

  • Badges should be a fun way to engage with our content, by ‘playing the game’ of badging: earning points, leveling up, seeing progress
  • Badges should help participants understand themselves as science learners for the long term.
  • Badges should make it easy and attractive to get to content related to users’ own interests and to keep engaging with our content after leaving the museum.

Overall, the goals have remained the same, with some slight changes. We realize that in the first year after launch, a primary goal is to test and evaluate this badging system in the context of our museum. We’re building the system to be flexible so we can best do this. Also, we have realized that, for the part of our badging system that will be implemented for onsite visitors, the goal that states “NMNH badges should be meaningful and reflect actual time commitment, learning and understanding” may need to be modified because the need for automatic assessment combined with inquiry-based learning means that full assessment of learning and understanding is not entirely possible.

What types of badges are you using (participation, skill, certification, etc.)? Are there levels or pathways represented in your badges?

We have two different models: participation badges for our onsite and online visitors, and certification badges for our youth volunteers.

In Q?rius, visitors earn stars towards badges by doing activities and participating in programs. Software on touchscreens throughout the Center, integrated with activities that also include touchable objects and scientific tools (like microscopes), automatically awards logged-in visitors badges as they do activities. These badges are awarded based on participation. This same participation model is available online on the Q?rius website, where visitors can earn stars towards badges by doing online activities. Stars represent steps towards badges, and the accumulation of stars towards badges forms a pathway through our activities.

  • Stars, the building blocks for the badges, are earned by doing activities. For example, to earn the Hidden Worlds badge, a user does the following activities:
      • Alien Planets: A 5-step activity to learn pattern recognition skills and apply them to microscopic and macro geologic imagery. (10 stars)
      • Biodiversity Challenge: A 12-step activity to identify microscopic animals and simulate the sequencing of their DNA. (15 stars)
      • Stone Age Chef: A 4-step activity to dig into our 6,000-object education collection and find tools and materials that our ancestors would have needed to create a meal. (10 stars)
      • Dig Deep: A 14-step activity to build a model that simulates major geologic forces and take a challenge to simulate drilling deep into the earth to find precious resources. (17 stars)
  • Youth Volunteers earn stars and badges for completing training and for reaching milestones, such as 100 hours volunteered. There will be multiple levels of badges. Program leaders will be able to easily give virtual badges to all participants, even those who have not registered with Q?rius. All participants can receive an email with their badge. Those who choose to log in to the Credly badge site will be able to share their achievements on social media and anywhere else online.
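The star-accumulation model can be sketched as a running tally, using the Hidden Worlds activities and star values listed above (the structure and function names here are hypothetical, not the production software):

```python
# Hypothetical sketch of stars accumulating toward a badge,
# using the Hidden Worlds activities and star values from the text.
HIDDEN_WORLDS = {
    "Alien Planets": 10,
    "Biodiversity Challenge": 15,
    "Stone Age Chef": 10,
    "Dig Deep": 17,
}

def badge_progress(completed, activities):
    """Return (stars earned, stars required) for a badge whose
    pathway is the given set of activities."""
    earned = sum(stars for name, stars in activities.items() if name in completed)
    required = sum(activities.values())
    return earned, required

# Example: a visitor who has finished two of the four activities
earned, required = badge_progress({"Alien Planets", "Dig Deep"}, HIDDEN_WORLDS)
print(f"{earned}/{required} stars")  # prints 27/52 stars
```

Because progress is just a sum over completed activities, visitors can do the activities in any order, which matches the free-choice, non-linear pathway described above.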

The levels for the youth volunteer badges have not been finalized but will likely include:

  • Q?RIUS Volunteer: Volunteer basic training courses completed
  • Specialist Volunteer: For example, Human Origins Specialist, or Geologist Specialist for special training completed
  • Volunteer Super Star: Awarded for special effort like community outreach or behind-the-scenes work with a scientist
  • Certified Smithsonian Volunteer: All training plus 100 hours spent volunteering
  • Gold Certified Smithsonian Volunteer: 100 hrs + 10 Super Stars

How were the criteria for the badges determined? What pedagogies (if any) informed the learning and badge system design?

All of our activities and training are inquiry-based and constructivist. We are also focused on object-based learning, especially as our museum’s research is centered on our collections. Criteria for earning badges were determined by the difficulty and time commitment necessary to complete the associated participatory activities.

What are three things you learned about badge system design? What would you do differently if you were to start over?

We learned that designing for badging requires a systems-based approach that can be at odds with other requirements for content creation. For example, there are certain topics that we need to cover in order to fulfill our educational mission, but not all of them fit neatly into categories that are intuitive to the end user without over-simplifying the themes.

Also, badges seem well suited to workshops or other contexts where they represent certification or participation in a program with a clear beginning, middle and end. For inquiry-based learning in a free-choice, non-linear, user-driven environment, badge design gets more complicated. Also, badges lend themselves well to a system built on levels of achievement, so it is important that content be organized in this way. We are actually very curious about how the badging system will work when fully implemented in this environment, as we see it as an innovative approach that could be of some value for engaging museum audiences in new ways.

We’re undecided on whether it would have been easier to build the badge system after all of our content was created, instead of developing both simultaneously. On one hand, building the badge system deeply informed the way we thought about our activities and the activities we chose to pursue in important ways. On the other, if our content had been in place before starting, we would have had a clearer set of parameters to work within.

What is left to do? What is left unanswered?

As we have yet to deploy our badge system to the public, there are still questions about rate of adoption and how challenging the earning of badges should be for the optimal experience. We’re looking forward to intensive evaluation in the pre-launch and post-launch period. We will benefit from looking at the assessments done by other badge providers to evolve our evaluation methodology to best answer these questions. We also will continue to benefit from being an active part of the badging community, as we see that our system will succeed if we link it with other badge opportunities.

What are the 3 main challenges to widespread adoption of your badge system for your organization?

Within NMNH and across the Smithsonian, there is a tremendous amount of interest in and support for badges. The main challenges are: creating or using shared assessment criteria so that badges can be connected between very different programs, a system of tagging badges so that users and badge providers can easily find related badges from other organizations, and staff time to integrate existing programs with badge systems.

What is your badge system testing strategy? How have you or will you be testing your badge system prior to deployment?

We have been testing our badge system prior to deployment through extensive user testing integrated with activity testing. The full digital activity system beta launch is in August, so our testing to date has used interactive PowerPoint-based mock-ups of digital interactives and badging. Testing of the user experience of integrating digital experiences, objects, and staff assistance has led to important design modifications, including closer alignment between physical and digital activities, clearer parameters and protocols for identifying when and how to earn points, and refinement of the total number of points per activity and for each badge.

Badge system testing will be integrated with the evaluation plan for Q?rius, which has three main categories of evaluation: Implementation, Improvement, and Impact. Improvement evaluation will be ongoing and will drive changes to the user experience based on feedback we receive from users. Improvement evaluation will continue the type of product testing currently under way, but with the full product.

What is the success of your badge system contingent upon?

  • Our ability to effectively communicate about our badges to our museum audience.
  • Connecting our badge system to other badges.
  • Continued support from our organization throughout an initial experimentation and testing stage.

What have you done, or do you plan to do, to evaluate your badge earner community?

Two types of evaluation will help us to evaluate the badge earner community. Implementation evaluation will provide us with data about the number of users who register to earn badges, the most popular activities for earning points, the total number of people who earn badges, where badges are earned (onsite or at home), and how many people access the badging system both onsite and at home. Impact evaluation will measure the extent to which badges inspire new questions, motivate people to learn (and earn) more, and convey key messages about badge content (e.g., Hidden Worlds). Together with evaluation for improvement, these types of evaluation will paint a thorough picture of the added value of the badging experience for Q?rius users.

What will you do with the results of your evaluations?

Evaluation results will be used to continuously improve the badging system as the development and implementation teams work together to improve Q?rius products and experiences. We also intend to share our evaluation results with the badging community at the Smithsonian and with our stakeholder communities, such as the American Alliance of Museums (AAM), Association of Science-Technology Centers (ASTC), and Visitor Studies Association (VSA).

Please describe any impact your badge system may have already had on your organization and your learners.

In important ways, designing many of our learning experiences with badging in mind has helped us think about creating engaging, intuitive pathways through our complex and varied content. Thinking about a badge system has helped us think about how we can best help our museum visitors, volunteers and interns create personalized, meaningful connections to their experience here. This is important to museum education, and an aspect of this project we’re really looking forward to exploring further.

How would you characterize the impact your badge system will have on the badge ecosystem?

We hope that, in working through the complicated questions of how to integrate a badging system into a museum visitor experience in order to promote inquiry-based learning, we help other organizations that have similar challenges. Also, with the Smithsonian’s extensive reach and our large audiences, we hope that our work can increase recognition of badges among users and potential badge providers.

What plans do you have to scale your badge system?

We are working on plans to extend badging systems at the Smithsonian. Currently the efforts have been limited to specific projects, and have not been pan-institutional. Our next steps are to extend badges within our organization and beyond it to our partners.

How successfully are you getting institutional buy-in, or adoption from your learners?

Institutional buy-in has been successful, with multiple internal symposiums and workshops in the last six months about badging at the Smithsonian. Initial concept-level testing with users has been promising.

Once your badge system is built, how self-sustaining is it? How much do you anticipate maintenance to be?

It is difficult to separate the cost of the badge system from all of the other parts of this project, such as content development, volunteer management, project management, and software maintenance. In some ways, we expect the badge system to be self-sustaining, but we also hope to experiment with and adjust the system continuously, which will require ongoing attention.
