Designing for Disaster in the Age of Algorithms and the Nuclear

There is a conference taking place at The New School right now that is asking how design can imagine and prepare for extreme existential risk. What is extreme existential risk, one might ask? Ed Keller, conference organizer, puts it this way:

"Today, due to everyday revolutions in communications, computation, biotech, and nanotech, we face, statistically speaking, a range of existential risks that could transform or eradicate humanity and irrevocably alter all the systems on our planet. Indeed, along with massive geopolitical transformations catalyzed by energy and resource scarcity and systems management issues, we face constant social upheaval as a consequence of technologically driven globalization.  From fast-forward cultural hybridization to nearly lifelike, esoteric economic instruments [and their spectacular collapse], we can sense an advance wave that heralds ever more extraordinary disruptive phenomena, including truly ubiquitous computation or even embryonic artificial intelligence systems. Farther a-field, off world, there is the risk of global catastrophe, through asteroid impacts or greater cosmologically scaled disasters."

Last Friday's speakers touched on several of these risks, two of which seem especially timely:

1. the nuclear

While John Bolton reflects in the NYT on the recent increase in Republican voting power and its ability to thwart U.S. ratification of the latest version of the nuclear arms reduction treaty, Elizabeth Ellsworth and Jamie Kruse wonder if, from a design perspective, we shouldn't just go ahead and creatively encourage a more open relationship with radioactive nuclear material. That is, since the moment Oppenheimer said "it worked!", we have lived in a historical era defined by the species-made possibility of wholesale self-destruction. While traveling along an open highway, Ellsworth and Kruse took a video, which they shared during their recent presentation, of a truck loaded with nuclear waste headed to WIPP, the Waste Isolation Pilot Plant in New Mexico. They proclaim: this is very real.

So is the San Andreas Fault, they add. What does it mean to live in the shadow of the big one, they ask? Is an earthquake overdue and tectonically destined to devastate the ninth-largest economy in the world? What happens if that truck crashes on its way to the nuclear disposal site? Wait, what's that you say? The same radioactive material that has the potential to destroy our entire species has a half-life of some 24,000 years, and John Bolton gets to tell the NYT that the U.S. needs to abandon its treaty with Russia, which strives to prevent another arms race?

This is real, so real that we can actually take videos of nuclear material as it passes us in commercial trucks along open, commonly traveled highways. Is this what it will take to imagine a world of less denial? Perhaps Ellsworth and Kruse should consider making their video more public so others can consider that question.

2. algorithmic trading

David Gersten, who was creatively assigned to speak after Ellsworth and Kruse, then reminded us that the vast majority of economic transactions in the world now happen as a result of software programs designed to make nanosecond trading decisions that move millions, if not billions, of dollars at a time. But what does this mean, and why is it timely?

The short answer to this question is partially depicted in Erwin Wagenhofer's most recent documentary, Let's Make Money. At the peak of the global financial crisis, Wagenhofer takes us to the southern coast of Spain and shows us millions of newly constructed, albeit empty, condos and townhouses scattered across the desert landscape. More than 800 golf courses interconnect them, draining enough water each year to support 200,000 people, and yet there are no golfers. It is an abandoned wasteland built in part by the recent speculative practices of hedge fund managers and private equity firms and, less famously, by computer-driven algorithms.

Wagenhofer then reminds us that $11.5 trillion is "held" in offshore banking centers, centers that have been carefully designed to evade tax liability. If the income this money generates were somehow taxed at a 30% rate, roughly $250 billion of government revenue would be produced each year. What could we do with such money, one might ask? But before we even consider that question, we have to grapple with the very real notion that the financial and legal frameworks that created offshore banking in the first place were designed precisely to hoard wealth, not share it.
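It is worth pausing on how those numbers fit together: taxing the $11.5 trillion itself once at 30% would yield a one-off $3.45 trillion, so a recurring $250 billion per year only makes sense as a tax on the annual income those holdings earn. A back-of-the-envelope sketch, assuming a roughly 7.5% yearly return (the return rate is my assumption for illustration; only the $11.5 trillion and the 30% rate come from the figures above):

```python
# Back-of-the-envelope check of the offshore tax figure.
# NOTE: the 7.5% annual return is an assumed rate for illustration;
# the $11.5 trillion holdings and 30% tax rate are from the text.
offshore_holdings = 11.5e12   # dollars "held" offshore
annual_return = 0.075         # assumed yearly return on those holdings
tax_rate = 0.30               # hypothetical tax rate on that income

annual_income = offshore_holdings * annual_return   # ~$860 billion/year
tax_revenue = annual_income * tax_rate              # ~$260 billion/year
print(f"${tax_revenue / 1e9:.0f} billion per year")
```

At that assumed return, the result lands within a few percent of the $250 billion cited, which suggests this is the reasoning behind the figure.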

As showcased at Mozilla's Drumbeat festival, Anne Balsamo's work on the technological imagination stresses that many of society's hierarchical assumptions are embodied in our most advanced digital tools. Well, what if the most powerful of these tools are actually banking devices operated by the banking elite, designed to run freely on code and profit, and then suddenly some version of the big one comes along? And what about the mistakenly published "confidential" U.S. government report, 266 pages detailing hundreds of civilian nuclear sites, including maps pointing to stockpiles of nuclear weapons fuel?

Call me cynical, but I suspect that soon there will be not one but a series of big ones, and that people like John Bolton will survive them and then be in very visible, if not political, positions to advocate for controlling the resulting chaos with extreme military force.

For anyone who is interested in hearing more of what Kruse, Ellsworth, or Gersten said, or hearing from any of the other designing-for-existential-risk conference speakers, you can find streaming video feeds here.
