Blog Post

How Does the Internet Change Our Idea of Human Nature?

Reblogged from COSMOS: The Science of Everything

Australia's Number 1 Science Magazine


http://www.cosmosmagazine.com/features/online/3099/happy-40th-birthday-i...

Opinion

Happy 40th birthday, Internet!

27 October 2009

Cosmos Online

In 1969 a UCLA team sent the first message over ARPANET, the computer network that later became known as the Internet. Since then it has fundamentally changed humanity.


That event, which took place on 29 October 1969, is celebrated today as the moment the Internet was born. It ushered in a technological revolution that requires us to understand and appreciate the changeable nature not just of technology but also of our brain, our humanity and our society.

In the blink of an eye, we have changed our most basic ways of proceeding in the world.

Checking our Facebook page before making the morning coffee. Googling our symptoms to decide whether or not to call a doctor. Handing over our day's work to be completed by a colleague halfway around the globe. Using a computing device the size of a wallet to set up a meeting, guide us to a new restaurant, organise a political rally, locate a new church, or introduce us to the love of our lives. And then Tweeting a real-time account of it all to our followers.

Any twentieth-century expert viewing this future in a crystal ball would have declared it pure science fiction. And it isn't the technology that would have shocked them. It's our changed behaviour. Welcome, dear experts, to the twenty-first century, where our only constant is constant change.

If we feel disoriented, it's because we happen to be living in one of the most challenging and transformative eras in all human history. Work, play, communication, interpersonal relations, leisure activities, commerce, politics, economics, security, sources of information and disinformation - all the large and small reflexes of our lives - happen differently than they did just two or three decades ago.

If we feel that everything around us is changing faster than we can manage, it is because the ground has shifted beneath our feet and we have shifted with it.

Our twentieth-century paradigm for human nature was based on the assembly line model of human progress. Each person on the line has a task, a hierarchy of command assigns each task, you do your task, I do mine, the whole thing chugs along, and, in the end, we have a Model T.

You evaluate your success in terms of your ability to do your task - or your ability to get others to do it. You evaluate your company in terms of its ability to turn out Model Ts cheaply and efficiently. You evaluate your society in terms of its ability to keep all of its members on task, in a role, producing a product.

The twentieth-century brain, too, is orderly, with a highly evolved prefrontal cortex passing down 'executive' decisions to the other parts of the brain, right down to the lowly, emotional, reflexive 'reptilian' amygdala hunkering below. Human abilities (and social classes) are arrayed on a similar hierarchy.

Of course, it never works that way in practice, but you contrive ways of testing, norms for evaluating, and systems for educating that reinforce the standards suited to making the best possible contributors to that assembly line. You chart abilities and disabilities, promote according to the former, and punish or prescribe according to the latter.

The first generation of computers did nothing to change the twentieth century's paradigm of human progress and abilities. Whether a professional UNIVAC mainframe the size of a small house or one of the first desktops that began appearing in homes around 1983, the computer was still basically a tool. It didn't disrupt the assembly line paradigm.

In fact, the term 'hardwiring' that became a dominant late twentieth-century metaphor for the human brain comes from the permanent and unchangeable electrical circuitry within an early computer that dictates its capabilities.

Very quickly, the analogy of the hardwired CPU (central processing unit) was extended by psychologists, sociologists, biologists, and philosophers to humans, as a paradigm for the way humans evolved with certain hardwired features of the brain, and therefore of behaviour, that, like your computer, could not be changed.

Then came the Internet. Creating the Internet depended on the antithesis of the Machine Age paradigms of fixed capacities and protocols. The Internet ideal for human productivity was worldwide collaboration enabled by open principles and an interactive, iterative process.

This new mode of creating together produced brilliant operating systems like Linux and Web browsers like Netscape Navigator (now Mozilla Firefox). Eventually it gave us the infinitely malleable and participatory infrastructure of the World Wide Web that has transformed all of our Googling, Facebooking, MySpacing, YouTubing, Flickring, and Twittering lives today.

This mode of collaboration is based on a new paradigm for human nature. If Tim Berners-Lee and his colleagues had believed in the twentieth-century paradigm of top-down, hardwired, cost-benefit Rational Choice economic motivation and individualist progress, the World Wide Web would not exist today.

At its most idealistic, the Information Revolution teaches us that the group, not the individual, powers the world, and that interactive process - not linear progress - is the highest form of human endeavour.

It argues that humans evolved a massive prefrontal cortex not for rational, decisive, linear hierarchical thinking but in order to be able to change, adjust, interact, communicate, and collaborate in situations where the end result is unpredictable.

If the twentieth-century paradigm for the brain is the hardwired CPU, I would argue that the new paradigm for the twenty-first century brain is the iPod or iPhone, with 75,000 possible Apps (and counting) available for downloading, some created by developers, others by users, all in constant need of updates and customising. There's an App for just about everything in the twenty-first century brain because a changing world needs a brain that is not a product but an interactive processor.

An ability to react with alacrity - with cheerful readiness - requires us to understand and appreciate the interactive nature of our brain, our humanity, and our society. We're not there yet. We have not yet taken full advantage of our capacity for change, individually or collectively.

We have not yet rearranged our conceptions, standards, methods of assessment, or traditional institutions to reflect changes that we've already made in our daily lives. We must do so in order to be prepared for changes that will come even more rapidly in the future.


Cathy N. Davidson is the John Hope Franklin Humanities Institute Professor of Interdisciplinary Studies at Duke University in North Carolina, USA. Her most recent book, co-authored with David Theo Goldberg, is The Future of Thinking: Learning Institutions in a Digital Age (MIT Press, 2009).
