For HASTAC readers too young to understand the reference in my title, here's a bit of history: Long, long ago, way back in the past millennium even, Bill Clinton ran a successful presidential campaign against the supposedly unbeatable incumbent, George H. W. Bush, who was ignoring the recession (Clinton insisted) in order to focus on the end of the Cold War or the Persian Gulf War. Whenever Clinton would veer into those debate areas, his campaign strategist James Carville would get him back on course by reiterating his trump card: "It's the economy, Stupid!"
I wasn't going to respond at all to New York Times Executive Editor Bill Keller's plaintive, if hyperbolic, critique of all social media, "The Twitter Trap," but I'm hearing Carville-like yelling in my ear and realize I have to. Far, far too many powerful, brilliant, important people who should know a lot better are blaming technology for all kinds of things, and I had better come clean and start entering the debate at that level. So, okay: "IT'S NOT THE TECHNOLOGY, STUPID!"
It is just so hard to believe how many reputable intellectuals, writers, scientists, social scientists, and even educators are willing to indulge in a specious logic that they would never allow on another topic. They like to say that the Internet makes us shallow, stupid, distracted, lonely, or, in the case of this piece by the executive editor of the New York Times, that it somehow compromises us morally and spiritually: "My own anxiety," Keller writes, "is less about the cerebrum than about the soul." I can only imagine an executive of his stature snickering with derision at the memory of how so-called "primitive people" said exactly the same thing about photography.
Here are two facts: (1) We now know from the new science of attention and the most recent findings in neuroscience that our brain is not, as was previously thought, an inheritance that comes with all of its components fixed and certain; the brain is a learning organism, and that means it is constantly changed by its environment, by what it experiences, and by its interactions. But (2) except in B-horror movies ("The Brain that Wouldn't Die" or "The Brain from Planet Arous" and so forth), the brain doesn't power itself and it doesn't power us. The brain R us. That is, what we experience, our brain experiences. If we give it a steady diet of junk food or alcohol or Ritalin, it changes. If we give it a steady stream of "Jersey Shore," that's what it learns. If we give it a steady diet of item-response multiple choice testing (the ridiculous form of testing which, we know, does nothing except prepare students to do well on that particular form of testing), it learns how to think like those tests. If we inspire ourselves to curiosity, expose ourselves to challenges, and then succeed and reinforce our ability to take challenges, our brain learns how to extrapolate from challenges. And if we spend all day online doing idiotic things, then, well, that is what we learn how to do well---spending all day online doing idiotic things. We are what we do. Our brain is what it does.
But that's not about technology, it's about humanity. Between the human brain and the computer screen, there is us: our will, our desires, our habits, our training, our work, our incentives, our motivations, our culture, our society, our institutions, all of the things that make us human. It's NOT the Technology, Stupid! It is about what we--you and I--do with the technology. It always has been, it always will be.
This is not to say technology doesn't matter. It does. We are fifteen years into the biggest communications revolution since the invention of steam-powered presses and machine-made ink and paper. That mechanization of printing technologies suddenly made books and newspapers affordable to the masses for the first time in human history. That happened starting in the late eighteenth century and continued through the end of the nineteenth and into the early twentieth, with more and more mechanization that allowed for ever-more rapid printing methods. From the beginning, the availability of cheaply printed books and newspapers had a lot of people very worried--including Founding Fathers like Thomas Jefferson and John Adams. Both worried that a new U.S. ideal of representative democracy would turn to anarchy if the "rabble" had all that unfettered popular culture without a preacher at the ready to tell them how to interpret the text and keep them on track. Being doers, not whiners, Jefferson and Adams both, in different ways, set about thinking through what institutions needed to change if, in fact, a new technology had put books into middle- and working-class people's hands for the first time. They thought about the concept of universal public schooling, for example, since you needed not only to teach people to read but to educate them in how to read wisely and sanely.
We are fifteen years into the commercialization of the Internet. We have all made tremendous adjustments to these new forms of technology and social media. I don't know about you, but I do not need a new "study" to tell me my life has been changed by email, texting, blogging, tweeting, Facebooking, Wikipedia, eBay, Amazon.com, my iPad, my Blackberry, and on and on and on. It's NOT the Technology, Stupid! I hear James Carville shouting. It's about all of the ways life is changing and how technology facilitates, reshapes, and redistributes the everyday patterns, facts, and habits of life. And it is about us figuring out the best ways to live given these rapid and continuing changes.
We all think we know what "work" is and we all think we know what "school" is, but we really only know about the ways that leisure, home, learning, and living were reorganized by the Industrial Revolution for the last couple hundred years (a blink in the timeline of human history). Like Jefferson and Adams figuring out what had to change for a new democratic populace that suddenly had access to all that print, our contemporary leaders like New York Times executive editor Bill Keller need to take their role as cultural arbiters a lot more seriously and think about what needs to come next for our society. It's NOT the technology. It is about reconceiving the possibilities for living, learning, and working together well, and finding the best ways to change our institutions to support those new ways of life. We need new institutions to support our digital ways of living, working, and learning, just as the industrial era needed institutions to support its ways.
Here's an example: The industrial age worked hard to separate "work" from "home." Everything about the common or public schools begun in the mid-nineteenth century reinforced that division: from the school bell ringing for each child at the same time of day, to each child entering school at age 6 whether ready or not, to sitting in tidy rows, and then, later, in the early twentieth century, to all the new ways of measuring success: IQ tests, multiple choice tests. Around the same time came specialization of disciplines, the "two cultures" divide of arts and humanities versus science and technology, professional schools, and on and on. All these metrics and institutions put an emphasis on standardization over standards, uniformity over idiosyncratic creativity, and working in a linear pattern towards a goal. Everything about work (beginning with the physical structure of the office building or the assembly line and extending to Human Resources departments that structure and enforce uniform regulations) was structured to maintain those separations.
We now live in a world where work and leisure are impossibly intermixed and conjoined, at our desktops, on planes, in airports, at picnics, over the dinner table. We need new rules and new norms and new standards for the world we live in now. What we do not need is nostalgia for practices developed 150 years ago for a world that no longer resembles the one we live in.
Please, Mr. Keller. We need you. We need you and "Nicholas Carr, Jaron Lanier, Gary Small, Gigi Vorgan, William Powers, et al" (as you cite them) and other researchers in this field to stop whining and start thinking in creative and innovative ways about how we can remake and redesign our habits and practices, our schools and workplaces, for the world we inhabit now--not the one that some of us, of a certain age, were born into. The world has changed. We have changed. Like Jefferson and Adams, we need to think about what will maximize the opportunities of the world we live in, not the old one we remember, often in a far more golden and glowy light than it deserves. IT'S NOT THE TECHNOLOGY, STUPID! It's about us, you and me, and how we can learn to live, work, and learn together, not just for our future and our kids' future, but for the world that, all of us, together, very much live in right now, today.
Cathy N. Davidson is co-founder of HASTAC and author of The Future of Thinking: Learning Institutions for a Digital Age (with HASTAC co-founder David Theo Goldberg) and the forthcoming Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn (Viking Press, August 18, 2011). An early, prepublication review of Now You See It appeared in Bloomberg BusinessWeek.
A starred review in the May 30 Publishers Weekly notes: "Davidson has produced an exceptional and critically important book, one that is all-but-impossible to put down and likely to shape discussions for years to come."
Fast Company's Anya Kamenetz: "Davidson is a professor at Duke University, a dyslexic, and a geek: The combination has made her a savvy, realistic, and observant critic of today's technoculture. [She] thinks the time has come to reassess our approach ... to everything. She isn't the first to point out that modern-day anxieties about texting tots have analogues in earlier centuries. But her work is the most powerful yet to insist that we can and should manage the impact of these changes in our lives."
For more information, visit www.nowyouseeit.net or order on Amazon.com.