Program or Be Programmed


* Book: Program or Be Programmed: Ten Commands for a Digital Age. Douglas Rushkoff. OR Books, 2010.

URL = http://www.orbooks.com/our-books/program/


Introduction

Here is what Douglas writes about his motivation for writing the book:

“As I was shooting the Frontline documentary Digital Nation, about the ways in which technology is changing the way people think and act, I realized we were all missing the point. Technologies don’t change people; people use technologies to change one another.

I see the swath of books coming out from my peers, each one trying to determine whether the net makes us smarter or dumber, and I see them all falling into one extreme or another – either shunning the net entirely, or demanding we all migrate to the digital realm and not look back. The best of them call for some sort of balancing act, where we walk a middle path between techno-utopianism and dystopianism.

But none of this matters if we don’t accept our own roles as the makers of this technology. We are the programmers. By looking at our technologies in a vacuum, as if they can have some effect on us like a pill or an artificial ingredient, we forget about the person on the other side of the screen.

Instead of looking at whether this stuff is so “good” or so “bad,” I want us to start looking at the particular biases of the technologies we are using. How do they lean? What sorts of behaviors do our programs encourage or discourage?

Only then can we begin to judge one another on how we’re choosing to use this stuff, and whether we should be using it differently.

That’s why I wrote this book, looking at how the digital environments we’re creating together (or having created for us by companies we don’t know) influence our choices and behaviors. For a technology as biased toward p2p exchange of ideas and values as this one, the net is nonetheless moving increasingly toward centralized control and the support of traditional monopolies – even as they pretend to be promoting p2p. It’s time we move it consciously in the other direction.

To that end, I’m releasing this book through an independent publisher so that it costs about half of what it would cost through traditional top-down publishers, and I’m donating 10% of my proceeds to WikiMedia and Archive.org.”


Excerpt

"Our screens are the windows through which we are experiencing, organizing, and interpreting the world in which we live. They are also the interfaces through which we express who we are and what we believe to everyone else. They are fast becoming the boundaries of our perceptual and conceptual apparatus; the edge between our nervous systems and everyone else's, our understanding of the world and the world itself. And they have been created entirely by us.

But -- just as we do with religion, law, and almost every other human invention -- we tend to relate to digital technology as a pre-existing condition of the universe. We think of our technologies in terms of the applications they offer right out of the box instead of how we might change them or even write new ones. We are content to learn what our computers already do instead of what we can make them do.

This isn't even the way a kid naturally approaches a video game. Sure, a child may play the video game as it's supposed to be played for a few dozen or hundred hours. When he gets stuck, what does he do? He goes online to find the "cheat codes" for the game. Now, with infinite ammunition or extra-strength armor, he can get through the entire game. Is he still playing the game? Yes, but from outside the confines of the original rules. He's gone from player to cheater.

After that, if he really likes the game, he goes back online to find the modification kit -- a simple set of tools that lets a more advanced user change the way the game looks and feels. So instead of running around in a dungeon fighting monsters, a kid might make a version of the game where players run around in a high school fighting their teachers -- much to the chagrin of parents and educators everywhere. He uploads his version of the game to the Internet, and watches with pride as dozens or even hundreds of other kids download and play his game, and then comment about it on gamers' bulletin boards. The more open it is to modification, the more consistent software becomes with the social bias of digital media.

Finally, if the version of the game that kid has developed is popular and interesting enough, he just may get a call from a gaming company looking for new programmers. Then, instead of just creating his own components for some other programmer's game engine, he will be ready to build his own.

These stages of development -- from player to cheater to modder to programmer -- mirror our own developing relationship to media through the ages. In preliterate civilizations, people attempted to live their lives and appease their gods with no real sense of the rules. They just did what they could, sacrificing animals and even children along the way to appease the gods they didn't understand. The invention of text gave them a set of rules to follow -- or not. Now, everyone was a cheater to some extent, at least in that they had the choice of whether to go by the law, or to evade it. With the printing press came writing. The Bible was no longer set in stone, but something to be changed. Martin Luther posted his ninety-five theses, the first great "mod" of Catholicism, and later, nations rewrote their histories by launching their revolutions.

Finally, the invention of digital technology gives us the ability to program: to create self-sustaining information systems, or virtual life. These are technologies that carry on long after we've created them, making future decisions without us. The digital age includes robotics, genetics, nanotechnology, and computer programs -- each capable of self-regulation, self-improvement, and self-perpetuation. They can alter themselves, create new versions of themselves, and even collaborate with others. They grow. These are not just things you make and use. These are emergent forms that are biased toward their own survival. Programming in a digital age means determining the codes and rules through which our many technologies will build the future -- or at least how they will start out.

The problem is that we haven't actually seized the capability of each great media age. We have remained one dimensional leap behind the technology on offer. Before text, only the Pharaoh could hear the words of the gods. After text, the people could gather in the town square and hear the word of God read to them by a rabbi. But only the rabbi could read the scroll. The people remained one stage behind their elite. After the printing press a great many people learned to read, but only an elite with access to the presses had the ability to write. People didn't become authors; they became the gaming equivalent of the "cheaters" who could now read the Bible for themselves and choose which laws to follow.

Finally, we have the tools to program. Yet we are content to seize only the capability of the last great media renaissance, that of writing. We feel proud to build a web page or finish our profile on a social networking site, as if this means we are now full-fledged participants in the cyber era. We remain unaware of the biases of the programs in which we are participating, as well as the ways they circumscribe our newfound authorship within their predetermined agendas. Yes, it is a leap forward, at least in the sense that we are now capable of some active participation, but we may as well be sending text messages to the producers of a TV talent show, telling them which of their ten contestants we think sings the best. Such are the limits of our interactivity when the ways in which we are allowed to interact have been programmed for us in advance.

Our enthusiasm for digital technology about which we have little understanding and over which we have little control leads us not toward greater agency, but toward less. We end up at the mercy of voting machines with "black box" technologies known only to their programmers, whose neutrality we must accept on faith. We become dependent on search engines and smart phones developed by companies we can only hope value our productivity over their bottom lines. We learn to socialize and make friends through interfaces and networks that may be more dedicated to finding a valid advertising model than helping us find one another.

Yet again, we have surrendered the unfolding of a new technological age to a small elite who have seized the capability on offer. But while Renaissance kings maintained their monopoly over the printing presses by force, today's elite is depending on little more than our own disinterest. We are too busy wading through our overflowing inboxes to consider how they got this way, and whether there's a better or less frantic way to stay informed and in touch. We are intimidated by the whole notion of programming, seeing it as a chore for mathematically inclined menials rather than a language through which we can re-create the world on our own terms.

We're not just building cars or television sets -- devices that, if we later decide we don't like, we can choose not to use. We're tinkering with the genome, building intelligent machines, and designing nanotechnologies that will continue where we leave off. The biases of the digital age will not just be those of the people who programmed it, but of the programs, machines, and life-forms they have unleashed. In the short term, we are looking at a society increasingly dependent on machines, yet decreasingly capable of making or even using them effectively. Other societies, such as China, where programming is more valued, seem destined to surpass us -- unless, of course, the other forms of cultural repression in force there offset their progress as technologists. We shall see. Until push comes to shove and geopolitics force us to program or perish, however, we will likely content ourselves with the phone apps and social networks on offer. We will be driven toward the activities that help distract us from the coming challenges -- or stave them off -- rather than the ones that encourage us to act upon them.

But futurism is not an exact science, particularly where technology is concerned. In most cases, the real biases of a technology are not even known until that technology has had a chance to exist and replicate for a while. Technologies created for one reason usually end up having a very different use and effect. The "missed call" feature on cell phones ended up being hacked to give us text messaging. Personal computers, once connected to phone lines, ended up becoming more useful as Internet terminals. Our technologies only submit to our own needs and biases once we hack them in one way or another. We are in partnership with our digital tools, teaching them how to survive and spread by showing them how they can serve our own intentions. We do this by accepting our roles as our programs' true users, rather than subordinating ourselves to them and becoming the used.

In the long term, if we take up this challenge, we are looking at nothing less than the conscious, collective intervention of human beings in their own evolution. It's the opportunity of a civilization's lifetime. Shouldn't more of us want to participate actively in this project?

Digital technologies are different. They are not just objects, but systems embedded with purpose. They act with intention. If we don't know how they work, we won't even know what they want. The less involved and aware we are of the way our technologies are programmed and program themselves, the more narrow our choices will become; the less we will be able to envision alternatives to the pathways described by our programs; and the more our lives and experiences will be dictated by their biases.

On the other hand, the more humans become involved in their design, the more humanely inspired these tools will end up behaving. We are developing technologies and networks that have the potential to reshape our economy, our ecology, and our society more profoundly and intentionally than ever before in our collective history. As biologists now understand, our evolution as a species was not a product of random chance, but the forward momentum of matter and life seeking greater organization and awareness. This is not a moment to relinquish our participation in that development, but to step up and bring our own sense of purpose to the table. It is the moment we have been waiting for." (http://www.realitysandwich.com/program_or_be_programmed)


On Openness: Sharing is Not Stealing

Douglas calls for a new ethics of sharing, not mindless copying of everyone else's creative work:

"Digital networks were built for the purpose of sharing computing resources by people who were themselves sharing resources, technologies, and credit in order to create it. This is why digital technology is biased in favor of openness and sharing. Because we are not used to operating in a realm with these biases, however, we oft en exploit the openness of others or end up exploited ourselves. By learning the difference between sharing and stealing, we can promote openness without succumbing to selfishness.


No matter how private and individual we try to make our computers, our programs, and even our files, they all slowly but surely become part of the cloud. Whether we simply back up a file by sending it to the server holding our email, or go so far as to create a website archive, we all eventually make use of computing resources we don’t actually own ourselves. And, eventually, someone or something else uses something of ours, too. It’s the natural tug of digital technology toward what may well be its most essential characteristic: sharing. From the CPU at the heart of a computer distributing calculations to various coprocessors, to the single mainframe at a university serving hundreds of separate terminals, computer and network architecture has always been based on sharing resources and distributing the burden. This is the way digital technology works, so it shouldn’t surprise us that the technologists building computers and networks learned to work in analogous ways.
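
The "distributing the burden" pattern Rushkoff describes above can be sketched in a few lines of code. The following is our own minimal illustration, not anything from the book: a coordinator splits a calculation across worker processes, much as a CPU hands work to coprocessors or a mainframe serves many terminals. The function names are invented for the example.

  # Python: hypothetical sketch of sharing a computation across workers.
  from concurrent.futures import ProcessPoolExecutor

  def heavy_calculation(chunk):
      # Stand-in for any divisible piece of work.
      return sum(x * x for x in chunk)

  def distribute(data, workers=4):
      # Split the data into one chunk per worker and share the load.
      size = max(1, len(data) // workers)
      chunks = [data[i:i + size] for i in range(0, len(data), size)]
      with ProcessPoolExecutor(max_workers=workers) as pool:
          return sum(pool.map(heavy_calculation, chunks))

  if __name__ == "__main__":
      print(distribute(list(range(100_000))))  # same answer, shared effort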

Perhaps because they witnessed how effective distributed processing was for computers, the builders of the networks we use today based both their designs as well as their own working ethos on the principles of sharing and openness. Nodes on the Internet, for example, must be open to everyone’s traffic for the network to function. Each node keeps the packets that are addressed to it and passes on the others—allowing them to continue their journey toward their destination. Servers are constantly pinging one another, asking questions, getting directions, and receiving the help they need. This is what makes the Internet so powerful, and also part of what makes the Internet so vulnerable to attack: Pretty much everything has been designed to talk to strangers and offer assistance. This encouraged network developers to work in the same fashion. The net was built in a “gift economy” based more on sharing than profit. Everyone wanted a working network, everyone was fascinated by the development of new software tools, so everyone just did what they could to build it. This work was still funded, if indirectly. Most of the programmers were either university professors or their students, free to work for credit or satisfaction beyond mere cash.
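
The keep-or-pass behavior of a node, as described above, can be made concrete with a toy model. This sketch is our illustration only (real Internet routing is far more involved, and the node names and routing table are hypothetical): each node keeps packets addressed to it and passes the rest toward their destination.

  # Python: toy store-and-forward node, a simplified illustration.
  class Node:
      def __init__(self, name, routes=None):
          self.name = name
          self.routes = routes or {}  # destination name -> next-hop Node
          self.inbox = []

      def receive(self, packet):
          if packet["dest"] == self.name:
              self.inbox.append(packet)            # keep what is addressed to us
          elif packet["dest"] in self.routes:
              self.routes[packet["dest"]].receive(packet)  # pass it along

  # Three nodes in a line: A forwards toward C via B.
  c = Node("C")
  b = Node("B", {"C": c})
  a = Node("A", {"C": b})
  a.receive({"dest": "C", "payload": "hello"})
  print(c.inbox)  # the packet arrived by passing through B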

Pretty much everything we use on the Internet today—from email and the web to streaming media and videoconferencing—was developed by this nonprofit community, and released as what they called freeware or shareware. The thrill was building the network, seeing one’s own innovations accepted and extended by the rest of the community, and having one’s lab or school get the credit. The boost to one’s reputation could still bring financial reward in the form of job advancement or speaking fees, but the real motivator was fun and pride.

As the net became privatized and commercialized, its bias for openness and sharing remained. Only now it is often people and institutions exploiting this bias in order to steal or extract value from one another’s work. Digital technology’s architecture of shared resources, as well as the gift economy through which the net was developed, have engendered a bias toward openness. It’s as if our digital activity wants to be shared with others. As a culture and economy inexperienced in this sort of collaboration, however, we have great trouble distinguishing between sharing and stealing.

In many ways—most ways, perhaps—the net’s spirit of openness has successfully challenged a society too ready to lock down knowledge. Teachers, for example, used to base their authority on their exclusive access to the information their pupils wished to learn. Now that students can find out almost anything they need to online, the role of the teacher must change to that of a guide or coach—more of a partner in learning who helps the students evaluate and synthesize the data they find. Similarly, doctors and other professionals are encountering a more educated clientele. Sure, sometimes the questions people ask are silly ones, based on misleading ads from drug companies or credit agencies. Other times, however, clients demonstrate they are capable of making decisions with their professionals rather than surrendering their authority to them—often leading to better choices and better results.

The net’s bias toward collaboration has also yielded some terrific mass participatory projects, from technologies such as the Firefox browser and Linux operating system to resources like Wikipedia. As examples of collective activity, they demonstrate our ability to work together and share the burden in order to share yet again in the tool we have gained. For many, it is a political act and a personal triumph to participate in these noncommercial projects and to do so for reasons other than money.

These experiences and tools have, in turn, engendered an online aesthetic that is itself based in sharing and repurposing the output of others. As early as the 1920s, artists called the Dadaists began cutting up text and putting it together in new ways. In the 1960s, writers and artists such as William Burroughs and Brion Gysin were experimenting with the technique, physically cutting up a newspaper or other text object into many pieces and then recombining them into new forms. They saw it as a way to break through the hypnosis of traditional media and see beyond its false imagery to the real messages and commands its controllers were trying to transmit to us without our knowledge. Digital technology has turned this technique from a fringe art form to a dominant aesthetic.

From the record “scratching” of a deejay to the cut and paste functions of the text editor, our media is now characterized by co-opting, repurposing, remixing, and mashing-up. It’s not simply that a comic book becomes a movie that becomes a TV series, a game, and then a musical on which new comic books are based. Although slowly mutating, that’s still a single story or brand moving through different possible incarnations. What we’re in the midst of now is a mediaspace where every creation is fodder for every other one.

Kids repurpose the rendering engines in their video games to make movies, called “machinima,” starring the characters in the game. Movies and TV shows are re-edited by fans to tell new stories and then distributed on free servers. This work is fun, creative, and even inspiring. But sometimes it also seems to cross lines. Books are quoted at length or in decontextualized pieces only to be included as part of someone else’s work, and entire songs are repurposed to become the backing tracks of new ones. And almost none of the original creators—if that term still means anything—are credited for their work.

In the best light, this activity breaks through sacrosanct boundaries, challenging monopolies on culture held by institutions from the church to Walt Disney. After all, if it’s out there, it’s everyone’s. But what, if anything, is refused to the churn? Does committing a piece of work to the digital format mean turning it over to the hive mind to do with as it pleases? What does this mean for the work we have created? Do we have any authority over it, or the context in which it is used? We applaud the teenager who mashes up a cigarette commercial to expose the duplicity of a tobacco company. But what about when a racist organization mashes up some video of your last speech to make a false point about white supremacy?

This is the liability of “processing” together. We are living in an age when thinking itself is no longer a personal activity but a collective one. We are immersed in media and swimming in the ideas of other people all the time. We do not come up with our thoughts by ourselves anymore, so it’s awfully hard to keep them to ourselves once we share them. Many young people I’ve encountered see this rather terrifying loss of privacy and agency over our data as part of a learning curve. They see the human species evolving toward a more collective awareness, and the net’s openness as a trial run for a biological reality where we all know each other’s thoughts through telepathy.

Whether or not we are heading for shared consciousness, this “learning curve” should still be in effect. In short, we need to develop the manners and ethics that make living and working together under these conditions pleasant and productive for everyone."


Interview

Excerpted from an interview by Scott Bukti of Newsvine.


Douglas Rushkoff:

"Sometimes I love digital technology a whole lot. Sometimes I am very upset about it. Many theorists are either much too exuberant and utopian about it, or too pessimistic and fearful. There are very few who hold a balanced view. This is because digital technology tends to invite and propel these extremes.

As for me personally, I still love technology. Like I say all over the book: I am not writing about what technology is doing to us. That’s the wrong question. I am writing about what we are doing to one another through technology.

My opinion of technology has not changed. My opinion of people may have changed, however. I am surprised most people would rather remain powerless and unaware of how the world works. I had thought this was the result of oppression. Now I fear this is just the way a majority of people choose to live. So instead of fighting power, I am now more dedicated to making people less afraid of thought. Thinking doesn’t hurt. Not that much, anyway. And it’s better to see what is happening than to die unaware.

On a related note, one of the biggest ways I’ve seen the internet affect people – including me, and I gather you too, since you include it here – is this matter of “always being on.” Or, as you put it in the chapter titled “Do not be always on”:

“The human nervous system exists in the present tense. We live in a continuous ‘now’ and time is always passing for us. Digital technologies do not exist in time, at all. By marrying our time-based bodies and minds to technologies that are biased against time altogether, we end up divorcing ourselves from the rhythms, cycles and continuity on which we depend for coherence.”


  • Do you think this is a common problem people have? Do you have advice on how people can deal with this? Should they sometimes leave all electronic devices off for a day or something?

I wouldn’t tell people to take any particular electronic sabbath or anything. It’s a nice idea, of course, and people are welcome to try it. But such a suggestion or prescription wouldn’t be in keeping with what I’m trying to do, which is liberate people from the false notion that they are living in reaction to technology. Or that tech is doing something to them that they have to mitigate.

The point of this command – do not be always on – and the chapter itself is to show people that digital technology is biased toward asynchronous activity. It exists outside time. This means we get to do digital things in our own time. This was the advantage of email over the telephone. You lose the conversation and the inflection, but you get to answer when you want to.

I look at the problem of information overload and distraction and attention not as problems of technology but problems of the way we use it. If we attempt to live in the same temporal reality as our digital devices, we get really screwed up. And we don’t have to.

The simple advice is not to be always on – don’t connect yourself to your technologies in some permanent, tethered way. Use them consciously when you want to. It’s pretty simple. I get as much email as almost anyone. Upwards of a thousand emails a day. And I do not let them vibrate on arrival. They’re over on a server somewhere waiting for me. That’s the main trick.
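
Rushkoff’s habit amounts to pulling mail on his own schedule rather than letting it push alerts at him. A minimal sketch of that pull model, using Python’s standard imaplib (the host and credentials below are placeholders, and this is our illustration, not code from the book or interview):

  # Python: check mail when you choose to, not when it arrives.
  import imaplib

  def count_waiting_messages(host, user, password):
      with imaplib.IMAP4_SSL(host) as conn:
          conn.login(user, password)
          conn.select("INBOX", readonly=True)  # look without marking anything read
          status, data = conn.search(None, "UNSEEN")
          return len(data[0].split()) if status == "OK" else 0

  # Run this once or twice a day, on your own schedule; the messages
  # simply wait on the server in the meantime.
  # print(count_waiting_messages("imap.example.com", "me", "secret"))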


  • “What the internet lacks today indicates the possibilities for what can only be understood as a new operating system: a 21st century, decentralized way of conducting political, commercial and human affairs.” Could you say more about the new OS, how turbulent the transition will be, and how long you think it will take?

It may be so turbulent that it’s not allowed to happen. When I talk of a new OS, I really just mean a new way of doing things.

For the past 600 years, we’ve been living under a scheme of forced centralization. Kings and Lords got threatened by the emergence of real peer-to-peer economic activity in the late middle ages, and so they made direct commerce illegal. That’s why we have corporations and central currency: they were developed by hired financiers of the 11th and 12th centuries to disrupt the rise of the middle class, what they called the bourgeoisie.

Although they had to fight a few wars to keep this going, they managed to maintain control over our economic and social activity – even after the Enlightenment and its revolutions. So it’s a pretty well entrenched way of doing things, where people get “jobs” in order to work and spend central currency that they earn in order to get stuff.

Now, we have a peer-to-peer medium that would let people transact directly again, even exchange ideas and methods for doing stuff. We don’t have to do everything through Amazon or Murdoch or Wal-Mart or even our government. We can engage directly.

That is, until Comcast or whoever decides they don’t want us using our conduit that way, or net neutrality is no longer maintained." (http://rushkoff.com/2011/02/02/newsvine-interview/)

More Information

  1. Protocollary Power
  2. Long collective interview: http://trustcurrency.blogspot.com/2010/10/community-conversation-with-douglas.html