Sorry for the delay, but now I'm back with updates on Virtual Peace!
A couple of weeks ago, we had an all-team meeting, which marked the midpoint of the project's development. Now we've got only a little over a month to launch! While there are still many things to do, and many hurdles to overcome, the meeting made it clear that the technical aspects of the project are well under way. Of course, the most visually compelling part of the meeting was the computer science interns' presentation of their avatar development. You can see more of their work here. The video shows just how well they've been able to make the avatars look like the real participants in the Hurricane Mitch disaster relief efforts. (The video was made by Vanessa Sochat to showcase her work, but it speaks to the quality of work being done by everyone on that end of the project!)
Perhaps more exciting for me was the discussion about game design itself. Watching (and participating) as educators Kacie Wallace, Natalia Mirovitskaya and Tim Lenoir negotiated potential game/simulation features with game designers Tony Sturtzel and Troy Bowman was a real lesson in how to think about the future of digital media and video games/simulations for pedagogical purposes. The group's discussion centered on how to maximize the benefits of digital media -- the ability to record and capture, immediate tallying/score-keeping, multi-player observation and interaction, interactive visual display -- while understanding the limits of the medium and keeping in mind many non-tangible (or at least less tangible) educational goals.
Since Virtual Peace centers on a disaster relief and conflict resolution scenario based on Hurricane Mitch (1998), there is much concrete data to incorporate into the game. From an educational standpoint, it's important to make this as realistic as possible: we're using the real numbers of, say, available doctors, medical supplies or food that might be needed in a given area, as well as real data on what organizations -- CARE, Doctors Without Borders, PAHO, etc. -- had to offer. The list of supply and personnel needs was quite extensive, and needed to be limited in order to be functional from a game/programming standpoint. Another concern was how to display the interactions -- what needs are being met, and by whom? Someone had previously suggested a regional map with updating icons, and it was really motivating to watch this core group quickly move from an abstract display concept to a full parsing of detailed data and display possibilities.
Another concern, all along, has been how to track non-tangible learning in order to translate it into "teachable moments" in the post-game review (basically a classroom/interactive web debriefing of the simulation session). The map function, along with the in-game tagging feature, ended up being more useful for this than previously anticipated. While earlier visions of the game primarily involved the instructor tagging "teachable moments" for after-game review, we came up with a way for the students themselves to offer non-tangible resources or, say, added negotiations -- offers that might not show up in the simple tally of resources -- which could be automatically tagged and reviewed. This will allow both instructors and students to track back to game moments they might have missed (because there will be multiple small meetings and multiple voice channels operating at the same time within the simulation).
And, while it might not seem obvious, that multi-voice-channel feature is one of the coolest aspects of our simulation! Virtual Heroes has helped design numerous educational simulations with multi-text-chat channels, but this will be the first time they've done it with multiple voice channels. While Tony and Troy have indicated that this is a challenge from a programming standpoint, it's definitely an exciting one for both them and us!
Other aspects under negotiation included ways to integrate the interactive website with the simulation itself. Should students be required to switch back and forth between game and website in order to access data? Is that feasible? How much data should be put into the game (which is harder to change/update because of programming requirements) and how much provided on the website (which is more difficult to access during game play), etc.?
Working with this awesome crew to think through many great ideas for interactive learning from a practical, implementation standpoint was really eye-opening for me.
Since the meeting, I've been working with our video guru, Harrison Lee, along with Kacie Wallace and others, to track down and get permissions for video footage to be used both during game-play (another awesome feature!) and on the website to help inform students of both the extent of the disaster they are entering and their roles as disaster relief personnel. Boy, I sure wasn't prepared for the bureaucracy involved! But after a few weeks of contacting various NGOs, relief organizations, etc., we were finally able to track down some awesome footage from CARE, as well as other video available online, and Harrison has been editing this into a compact but informative and compelling opening montage.
Of course, the game designers and computer science interns continue their work, and as I find out more about it, I'll continue to update. The web development team, too, continues to work on a more interactive website. Hopefully that'll be open for viewing and participation soon!