Protocollary Power is a concept developed by Alexander Galloway in his book Protocol to denote the new way power and control are exercised in distributed networks.
From Alexander Galloway in his book Protocol:
"Protocol is not a new word. Prior to its usage in computing, protocol referred to any type of correct or proper behavior within a specific system of conventions. It is an important concept in the area of social etiquette as well as in the fields of diplomacy and international relations. Etymologically it refers to a fly-leaf glued to the beginning of a document, but in familiar usage the word came to mean any introductory paper summarizing the key points of a diplomatic agreement or treaty.
However, with the advent of digital computing, the term has taken on a slightly different meaning. Now, protocols refer specifically to standards governing the implementation of specific technologies. Like their diplomatic predecessors, computer protocols establish the essential points necessary to enact an agreed-upon standard of action. Like their diplomatic predecessors, computer protocols are vetted out between negotiating parties and then materialized in the real world by large populations of participants (in one case citizens, and in the other computer users). Yet instead of governing social or political practices as did their diplomatic predecessors, computer protocols govern how specific technologies are agreed to, adopted, implemented, and ultimately used by people around the world. What was once a question of consideration and sense is now a question of logic and physics.
To help understand the concept of computer protocols, consider the analogy of the highway system. Many different combinations of roads are available to a person driving from point A to point B. However, en route one is compelled to stop at red lights, stay between the white lines, follow a reasonably direct path, and so on. These conventional rules that govern the set of possible behavior patterns within a heterogeneous system are what computer scientists call protocol. Thus, protocol is a technique for achieving voluntary regulation within a contingent environment.
These regulations always operate at the level of coding--they encode packets of information so they may be transported; they code documents so they may be effectively parsed; they code communication so local devices may effectively communicate with foreign devices. Protocols are highly formal; that is, they encapsulate information inside a technically defined wrapper, while remaining relatively indifferent to the content of information contained within. Viewed as a whole, protocol is a distributed management system that allows control to exist within a heterogeneous material milieu.
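Galloway's point that protocol "encapsulates information inside a technically defined wrapper, while remaining relatively indifferent to the content" can be sketched in a few lines. The layer names and header format below are invented for illustration, not any real protocol stack:

```python
# A minimal sketch of protocol as a formal wrapper: each layer encapsulates
# the payload in its own header while staying indifferent to the content.

def encapsulate(payload: bytes, layers: list[str]) -> bytes:
    """Wrap a payload in one header per layer; the last layer ends up outermost."""
    packet = payload
    for layer in layers:
        header = f"[{layer} len={len(packet)}]".encode()
        packet = header + packet
    return packet

def decapsulate(packet: bytes) -> bytes:
    """Strip headers until only the original payload remains."""
    while packet.startswith(b"["):
        end = packet.index(b"]") + 1
        packet = packet[end:]
    return packet

message = b"any content at all"              # protocol never inspects this
wire = encapsulate(message, ["transport", "network", "link"])
assert decapsulate(wire) == message          # content passes through unchanged
```

The wrapper carries only formal metadata (a layer name and a length); the payload crosses the system untouched, which is exactly the indifference-to-content the passage describes.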
It is common for contemporary critics to describe the Internet as an unpredictable mass of data--rhizomatic and lacking central organization. This position states that since new communication technologies are based on the elimination of centralized command and hierarchical control, it follows that the world is witnessing a general disappearance of control as such.
This could not be further from the truth. I argue in this book that protocol is how technological control exists after decentralization. The "after" in my title refers to both the historical moment after decentralization has come into existence, but also--and more important--the historical phase after decentralization, that is, after it is dead and gone, replaced as the supreme social management style by the diagram of distribution." (http://rhizome.org/discuss/view/12004)
Citations on Design as a function of Protocollary Power
"Yes, networks are grown. But the medium they grow in, in this case the software that supports them, is not grown but designed & architected. The social network ecosystem of the blogosphere was grown, but the blog software that enabled it was designed. Wikis are a socially grown structure on top of software that was designed. It's fortuitous that the social network structures that grew on those software substrates turn out to have interesting & useful properties.
With a greater understanding of which software structures lead to which social network topologies & what the implications are for the robustness, innovativeness, error correctiveness, fairness, etc. of those various topologies, software can be designed that will intentionally & inevitably lead to the growth of political social networks that are more robust, innovative, fair & error correcting." (http://www.greaterdemocracy.org/archives/000471.html)
Mitch Kapor on 'Politics is Architecture'
"'Politics is architecture': The architecture (structure and design) of political processes, not their content, is determinative of what can be accomplished. Just as you can’t build a skyscraper out of bamboo, you can’t have a participatory democracy if power is centralized, processes are opaque, and accountability is limited." (http://blog.kapor.com/?p=29)
Power in Networks: Virtual Location (V. Krebs)
"In social networks, location is determined by your connections and the connections of those around you – your virtual location.
Two social network measures, Betweenness and Closeness, are particularly revealing of a node’s advantageous or constrained location in a network. The values of both metrics are dependent upon the pattern of connections that a node is embedded in. Betweenness measures the control a node has over what flows in the network – how often is this node on the path between other nodes? Closeness measures how easily a node can access what is available via the network – how quickly can this node reach all others in the network? A combination where a node has easy access to others, while controlling the access of other nodes in the network, reveals high informal power." (http://www.orgnet.com/PowerInNetworks.pdf)
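Krebs's two measures can be computed directly from the pattern of connections. The following sketch uses a toy five-node graph and simplified normalization — both are assumptions of this example, not Krebs's own data:

```python
from collections import deque
from itertools import combinations

# Toy undirected network: C sits between the A-B chain and the D-E pair.
graph = {
    "A": ["B"],
    "B": ["A", "C"],
    "C": ["B", "D", "E"],
    "D": ["C", "E"],
    "E": ["C", "D"],
}

def distances(src):
    """BFS distances from src to every reachable node."""
    dist, queue = {src: 0}, deque([src])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def closeness(node):
    """How quickly this node can reach all others in the network."""
    dist = distances(node)
    return (len(graph) - 1) / sum(d for n, d in dist.items() if n != node)

def all_shortest_paths(s, t):
    """Enumerate every shortest path from s to t (fine for tiny graphs)."""
    dist, paths, stack = distances(s), [], [[s]]
    while stack:
        path = stack.pop()
        u = path[-1]
        if u == t:
            paths.append(path)
            continue
        for v in graph[u]:
            if dist[v] == dist[u] + 1 and dist[v] <= dist[t]:
                stack.append(path + [v])
    return paths

def betweenness(node):
    """How often this node lies on the shortest paths between other nodes."""
    score = 0.0
    for s, t in combinations(graph, 2):
        if node in (s, t):
            continue
        paths = all_shortest_paths(s, t)
        score += sum(node in p for p in paths) / len(paths)
    return score
```

On this toy graph, C both reaches everyone quickly (highest closeness) and brokers every path between the two halves (highest betweenness) — the combination Krebs identifies as high informal power.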
Fred Stutzman on Pseudo-Governmental Decisions in Social Software
"When one designs social software, they are forced to make pseudo-governmental decisions about how the contained ecosystem will behave. Examples of these decisions include limits on friending behavior, limits on how information in a profile can be displayed, and how access to information is restricted in the ecosystem. These rules create and inform the structural aspects of the ecosystem, causing participants in the ecosystem to behave a specific way.
"As we use social software more, and social software more neatly integrates with our lives, a greater portion of our social rules will come to be enforced by the will of software designers. Of course, this isn't new - when we elected to use email, we agreed to buy into the social consequences of email. Perhaps because we are so used to making tradeoffs when we adopt social technology, we don't notice them anymore. However, as social technology adopts a greater role in mediating our social experience, it will become very important to take a critical perspective in analyzing how the will of designers changes us." (http://chimprawk.blogspot.com/2006/10/colonialist-perspective-in-social.html)
Philippe Zarifian on the Two Faces of the Control Society
Philippe Zarifian's citation suggests that protocollary power is related to a shift within the 'Society of Control', from disciplinary control to the control of engagement.
"Gilles Deleuze, commenting on Foucault, developed a formidable intuition: we are shifting, he said, from the disciplinary society to the society of control. Or, to put it slightly differently, from the society of disciplinary control to the society of control of engagement. On its first face, this control can be interpreted as a form of exercise of a power of domination, a structurally unequal power, acting instrumentally on the actions of others. This control of engagement differs profoundly from disciplinary control in that it no longer imposes the mould of 'tasks', of assignment to a work station, of confinement within factory discipline. It no longer confines, either in space or in time. It ceases to present itself as enclosure in a prison cell, itself placed under constant surveillance. Following Deleuze's intuition, we pass from the mould to modulation, from confinement to circulation in the open air, from the factory to inter-company mobility. Everything becomes modulable: working time, professional space, the link to the company, the results to be achieved, remuneration... The contract between employee and employer itself ceases to be rigid and stable. It becomes perpetually renegotiable. Everything is permanently liable to be questioned, modified, altered." (translated from the French original at http://perso.wanadoo.fr/philippe.zarifian/page109.htm)
Social Values in Technical Code
"In a paper about the hacker community, Hannemyr compares and contrasts software produced in both open source and commercial realms in an effort to deconstruct and problematize design decisions and goals. His analysis provides us with further evidence regarding the links between social values and software code. He concludes:
"Software constructed by hackers seem to favor such properties as flexibility, tailorability, modularity and openendedness to facilitate on-going experimentation. Software originating in the mainstream is characterized by the promise of control, completeness and immutability" (Hannemyr, 1999).
To bolster his argument, Hannemyr outlines the striking differences between document mark-up languages (like HTML and Adobe PDF), as well as various word processing applications (such as TeX and Emacs versus Microsoft Word) that have originated in open and closed development environments. He concludes that "the difference between the hacker’s approach and those of the industrial programmer is one of outlook: between an agoric, integrated and holistic attitude towards the creation of artifacts and a proprietary, fragmented and reductionist one" (Hannemyr, 1999). As Hannemyr’s analysis reveals, the characteristics of a given piece of software frequently reflect the attitude and outlook of the programmers and organizations from which it emerges" (http://www.firstmonday.org/issues/issue8_10/jesiek/)
Social Media: Re-introducing centralization through the back door
"In media theory much has been made of the one-sided and centralised broadcast structure of television and radio. The topology of the broadcast system, centralised, one-to-many, one-way, has been compared unfavourably to the net, which is a many-to-many structure, but also one-to-many and many-to-one; it is, in terms of topology, a highly distributed or mesh network. So the net has been hailed as finally making good on the promise of participatory media usage. What so-called social media do is to re-introduce a centralised structure through the back door. While the communication of the users is 'participatory' and many-to-many, and so on and so forth, this is organised via a centralised platform, venture-capital funded, corporately owned. Thus, while social media bear the promise of making good on the emancipatory power of networked communication, in fact they re-introduce the producer-consumer divide on another layer, that of host/user. They perform a false Aufhebung of the broadcast paradigm. Therefore I think the term prosumer is misleading and not very useful. While the users do produce something, there is nothing 'pro' as in professional in it.
This leads to a second point. The conflict between labour and capital has played itself out via mechanization and rationalization, scientific management and its refinement, such as the scientific management of office work, the proletarisation of wrongly called 'white collar work', the replacement of human labour by machines in both the factory and the office, etc. What this entailed was an extraction of knowledge from the skilled artisan, the craftsman, the high level clerk, the analyst, etc., and its formalisation into an automated process, whereby this abstraction decidedly shifts the balance of power towards management. Now what happened with the transition from Web 1.0 to 2.0 is a very similar process. Remember the static homepage in html? You needed to be able to code a bit, actually for many non-geeks it was probably the first satisfactory coding experience ever. You needed to set the links yourself and check the backlinks. Now a lot of that is being done by automated systems. The linking knowledge of freely acting networked subjects has been turned into a system that suggests who you link with and that established many relationships involuntarily. It is usually more work getting rid of this than to have it done for you. Therefore Web 2.0 in many ways is actually a dumbing down of people, a deskilling similar to what has happened in industry over the past 200 years.
Wanted to stay short and precise, but need to add, social media is a misnomer. What social media would be are systems that are collectively owned and maintained by their users, that are built and developed according to their needs and not according to the needs of advertisers and sinister powers who are syphoning off the knowledge generated about social relationships in secret data mining and social network analysis processes.
So there is a solution, one which I continue to advocate: lets get back to creating our own systems, lets use free and open source software for server infrastructures and lets socialise via a decentralised landscape of smaller and bigger hubs that are independently organised, rather than feeding the machine ..." (IDC mailing list, Oct 31, 2009)
Protocollary Power and P2P
Michel Bauwens on the P2P aspects of Protocollary Power:
The P2P era indeed adds a new twist, a new form of power, which we have called Protocollary Power and which was first clearly identified and analyzed by Alexander Galloway in his book Protocol. We have already given some examples. One is the fact that the blogosphere has devised mechanisms to avoid the emergence of individual and collective monopolies, through rules that are incorporated in the software itself. Another was whether the entertainment industry would succeed in incorporating software- or hardware-based restrictions to enforce their version of copyright. There are many other similarly important evolutions to monitor: Will the internet remain a point-to-point structure? Will the web evolve into a true P2P medium through Writeable Web developments? The common point is this: social values are incorporated, integrated in the very architecture of our technical systems, either in the software code or the hardwired machinery, and these then enable/allow or prohibit/discourage certain usages, thereby becoming a determinant factor in the type of social relations that are possible. Are the algorithms that determine search results objective, or are they manipulated for commercial and ideological reasons? Is parental control software driven by censorship rules that serve a fundamentalist agenda? Many issues depend on hidden protocols, which the user community has to learn to see (as a new form of media literacy and democratic practice), so that they can become an object of conscious development, favoring peer-to-peer processes rather than restrictive and manipulative command-and-control systems. In P2P systems, the formal rules governing bureaucratic systems are replaced by the design criteria of our new means of production, and this is where we should focus our attention. Galloway suggests that we make a diagram of the networks we participate in, with dots and lines, nodes and edges.
Important questions then become: Who decides who can participate? Or better, what are the implied rules governing participation? (since there is no specific 'who' or command in a distributed environment); what kinds of linkages are possible? Using the example of the internet, Galloway shows how the net has a peer-to-peer protocol in the form of TCP/IP, but that the Domain Name System is hierarchical, and that an authoritative server could block a domain family from operating. This is how power should be analyzed. Such power is not per se negative, since protocol is needed to enable participation (no driving without a highway code!), but protocol can also be centralized, proprietary, or secret, in which case it subverts peer-to-peer processes. However, the stress on protocol, which concerns what Yochai Benkler calls the 'logical layer' of the networks, should not make us forget the power distribution of the physical layer (who owns the networks) and the content layer (who owns and controls the content).
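Galloway's DNS example can be made concrete with a toy resolver. Resolution walks down the tree from the root, so any authority along the way can make everything beneath it unreachable. The zone data and the blocking mechanism below are invented for illustration, not real DNS records:

```python
# Invented zone tree: each zone maps a child label to its delegation or record.
zones = {
    "": {"org": "org-nameserver"},
    "org": {"example": "example-nameserver"},
    "example.org": {"www": "93.184.216.34"},
}

def resolve(name, blocked=frozenset()):
    """Walk the hierarchy root-first; a blocked zone cuts off its whole subtree."""
    answer, zone = None, ""
    for label in reversed(name.split(".")):
        if zone in blocked:
            raise LookupError(f"zone '{zone}' blocked; '{name}' is unreachable")
        answer = zones[zone][label]
        zone = label if zone == "" else f"{label}.{zone}"
    return answer

print(resolve("www.example.org"))              # resolves normally
# resolve("www.example.org", blocked={"org"})  # raises: the whole family is gone
```

The packets travel peer-to-peer, but the names depend on a tree: blocking a single high-level zone (here the hypothetical "org") silences every domain beneath it, which is the hierarchical counterweight to TCP/IP that the passage describes.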
Protocols are Designed by People
"Galloway is correct to point out that there is control in the internet, but instead of reifying the protocol or even network form itself, an ontological mistake that would be like blaming capitalism on the factory, it would be more suitable to realise that protocols embody social relationships. Just as genuine humans control factories, genuine humans – with names and addresses – create protocols. These humans can and do embody social relations that in turn can be considered abstractions, including those determined by the abstraction that is capital. But studying protocol as if it were first and foremost an abstraction without studying the historic and dialectic movement of the social forms which give rise to the protocols neglects Marx’s insight that
Technologies are organs of the human brain, created by the human hand; the power of knowledge, objectified.
Bearing protocols’ human origination in mind, there is no reason why they must be reified into a form of abstract control when they can also be considered the solution to a set of problems faced by individuals within particular historical circumstances. If they now operate as abstract forms of control, there is no reason why protocols could not also be abstract forms of collectivity. Instead of hoping for an exodus from protocols by virtue of art, perhaps one could inspect the motivations, finances, and structure of the human agents that create them in order to gain a more strategic vantage point. Some of these are hackers, while others are government bureaucrats or representatives of corporations – although it would seem that hackers usually create the protocols that actually work and gain widespread success. To the extent that those protocols are accepted, this class that I dub the ‘immaterial aristocracy’ governs the net. It behoves us to inspect the concept of digital sovereignty in order to discover which precise body or bodies have control over it." (http://www.metamute.org/en/Immaterial-Aristocracy-of-the-Internet)
There is no Openness without secrecy and control!
Tim Leberecht insists, taking Wikileaks as a case study, that there is no openness without secrecy:
“an ecosystem on the Social Web could be seen as a system in permanent crisis – it is always in flux, and its composition and value are constantly threatened by a multitude of forces, from the inside and the outside. What if we understood “designing for the loss of control” as designing for structures that are in a permanent crisis? Crises are essentially disruptions that shock the system. They are deviations from routines, and the very variance that the advocates of planning and programs (the “Push” model) so despise. At their own peril, because they fail to realize that variance is the mother of all meaning; it is variance that challenges the status quo, pulls people and their passions towards you, and propels innovation. “Designing for the loss of control” means designing for variance.
One system in permanent crisis that contains a high level of variance is WikiLeaks. The most remarkable thing about the site appears to be the dichotomy between the uncompromised transparency it aims at and the radical secrecy it requires to do so. The same organization that depends on the loss of control for its content very much depends on a highly controlled environment to protect itself and keep operating effectively. But not just that: Ironically, secrecy is also a fundamental prerequisite for the appeal of WikiLeaks’ “there are no secrets” claim. Simply put: there is no light without darkness. And there is no WikiLeaks without secrets.
Applied to systems and solutions design, this means that total openness is the antidote to openness. When everything is open, nothing is open. In order to design openness, one of the first decisions designers have to make is therefore to determine what needs to remain closed. This is a strategic task: making negative choices for positive effects. You need to build enough variance into a system to make it “flow” and yet retain some control over the underlying parameters (access, boundaries, authorship, participants, agenda, process, conversation, collaboration, documentation, etc.). Only if you maintain the fundamental ability to at least manage (and modify) the conditions for openness, will you be able to create it. To design for the loss of control, control the parameters that enable it.” (http://designmind.frogdesign.com/blog/openness-or-how-do-you-design-for-the-loss-of-control.html-0)
Marvin Brown on Civic Design
"A citizen is one among the many—one among others. Citizens are members. We are always citizens “of.” “Of what?” Of the many? Yes. But citizens are not mobs or crowds. Citizens are members of civic communities, and citizens create and re-create civic communities. The civic, in other words, comes into existence when we participate in civic conversations as citizens.
Civic conversations are quite different from commercial conversations. Commercial conversations are about commerce—about the exchange and the overall flow of things. Civic conversations are about how we want to live together—about the design of our collective life. Civic conversations should be the context or platform for commercial conversations. Only when we know how we want to live together will we know how to design the flow of things.
In the history of the United States, for the most part, commercial conversations have dominated civic conversations. Still, we have witnessed the rise of the civic, such as in the civil rights movement. And, now, we see it again. The civic is occupying the commercial.
The goal, of course, is not to eliminate commerce, but to civilize it—to have commercial conversations about how to provide for one another on a civic platform of moral equality and reciprocity. Commerce is not the problem. The problem is its separation from civic norms.
When people say, "We have seen the problem and the problem is us," they deceive themselves. We are not the problem. The problem is one of design. Our current design of how we live together is unjust and unsustainable, and it is still controlled by commercial conversations without any moral foundation. Those who control financial markets are sovereign. If we expand and protect civic conversations we may, in time, participate in the solution—an economy based on civic norms making provisions for this and future generations." (http://www.civilizingtheeconomy.com/2011/12/what-is-a-citizen-and-the-civic/)
Historical Origins of the Internet Protocol
"The principles guiding the early designs of the Internet entailed a deep perversion of traditional models of hierarchical military power. This perversion occurred the moment the military moved from a communications model of command and control to one based on distributed command and control. To understand how this move was possible it is useful to reread the opening words of Reliable Digital Communications Systems Using Unreliable Network Repeater Nodes, perhaps the most bizarre introduction to a technical paper ever written:
The cloud-of-doom attitude that nuclear war spells the end of the earth is slowly lifting from the minds of the many. ... A new view emerges: the possibility of a war exists but there is much that can be done to minimize the consequences.
If war does not mean the end of the earth in a black-and-white manner, then it follows that we should do those things that make the shade of gray as light as possible: to plan now to minimize potential destruction and to do all those things necessary to permit the survivors of the holocaust to shuck their ashes and reconstruct the economy swiftly."
The author is Paul Baran, and the 1960 paper describes the first ever theoretical model for an entirely digital distributed communications network. When Baran started the research that led to his model, a few years earlier, more than a decade of nuclear threat had been enough to dissolve the euphoric sense of American invulnerability that resulted from WWII, so the survivor and his gray world had to be invented.
Cultural artefacts of the time show an imaginary where there is no paradigmatic survivor but rather a reproduction of class structure as societies go through the experience of the apocalypse. Within the space of the ruling ideological framework, Baran's 'shades of gray' started to emerge. Cold War era movies like When the Wind Blows, The War Game, The Day After, or the Japanese (and post-Hiroshima) Barefoot Gen portray almost identical dramas of lay survivors as they negotiate the dawn of hell on earth. These lay survivors were, however, at best a secondary concern of those for whom the network was created. Placebos in the form of nuclear emergency contingency pamphlets were the only packages being distributed to them. Their worse-than-death agony was an expected, integral part of the ever-flourishing collection of nuclear war scenarios. Belonging in this sense to a different category of cultural products of the era are Kubrick’s acclaimed Dr. Strangelove and Herman Kahn’s less acclaimed book On Thermonuclear War, the former a slightly caricatured version of the terroristic rationality of the latter. These portray a very different perspective on surviving the apocalypse, that of the powerful. Survivability of the elite, even after absolute Doomsday-machine-powered annihilation, was initially the one remaining issue.
In between these two extreme experiences of the apocalypse (one mediated by a wooden ‘inner core or refuge’ and a pamphlet, and the other by reinforced concrete and endless sex), existing societies started representing their pre-apocalyptic relationships of power through a new and flourishing ecology: an ever-increasing diversity of individualistic bunkers tailored, ironically, to the individual 'nuclear family' and its corresponding social status. For instance, a booklet called The Family Fallout Shelter, distributed by the US government, noted: "The least expensive shelter described is the Basement Concrete Block Shelter. The most expensive is the Underground Concrete Shelter."
The sanctity of the affordance abyss between the layman and the president was first transgressed by the figure of the secondary commander. Once his needs entered the realm of what is taken seriously after the bomb, the logic of post apocalyptic life (i.e. of the network) had been perverted. It seems now like an insignificant concession, but it was all it took to redraw the diagram of power. In a 1990 interview Paul Baran recalls how the seemingly subtle shift of accommodating for the needs of secondary commanders came to conceptually redefine his model:
The great communications need of the time was a survivable communications capacity that could broadcast a single teletypewriter channel. The term used to describe this need was "minimal essential communications," a euphemism for the President to be able to say "You are authorized to fire your weapons". Or "hold your fire". These are very short messages. The initial strategic concept at that time was if you can build a communications system that could survive and transmit such short messages, that is all that is needed... . The major initial objection to the scheme was its limited bandwidth. The generals would say, "Yes, that would be okay for the President. But I gotta do this, and so and so gotta do this, and that command gotta do that. We need more communication than a single teletypewriter channel." After receiving this message back consistently, I said, "Okay, back to the drawing board. But this time I'm going to give them so damn much communication capacity they won't know what in hell to do with it all." So that became my next objective. Then I went from there to try to design a survivable network with so much more capacity and capability that this common objection to bandwidth limitation would be overcome.
This is the moment when the perversion happened, when the movement toward distributed command and control took place. The limited bandwidth distribution model still reproduced the polarity of power in the sense that it only considered the limited requirements of the President. Boosting bandwidth made it useful for secondary actors. Suddenly, the architecture of the desirable network stopped mimicking the hierarchies of the chain of command. In the aftermath, even in the absence of the top commanders, a network of secondary commanders would have means of communication and perhaps retaliatory power. The shift was reflected not only in the model of distributed communications but at all levels, especially in the characteristics of the data routing protocol. The concept for this protocol received the name of 'hot potato' packet switching.
"Thus, in the system described, each node will attempt to get rid of its messages by choosing alternate routes if its preferred route is busy or destroyed. Each message is regarded as a "hot potato," and rather than hold the "hot potato," the node tosses the message to its neighbour, who will now try to get rid of the message."
In terms of control the 'hot potato' model is a shift from node-centric control to immanent control distributed through the network. Because the node has no control over the full life of a packet, there is no feedback relationship between node and packet. Hence, there is really no nodal control as per Norbert Wiener's seminal definition of control as feedback. Packet control does ultimately happen, but as a result of the whole network informing the packet of the best available route in real time, which is to say that ultimately it is the multitude of packets who are, collectively, in control of themselves. In practical terms this means that end users in Baran's design find themselves in a situation of equipotentiality.
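Baran's 'hot potato' behaviour can be simulated on a toy mesh. The topology, the hop limit, and the tie-breaking rule are all invented for this sketch; in particular, Baran's preference for free outgoing links is crudely approximated by favouring neighbours the packet has not yet visited:

```python
# Invented five-node mesh; nodes 2 and 3 are alternate routes from 1 toward 5.
links = {1: [2, 3], 2: [1, 4], 3: [1, 4], 4: [2, 3, 5], 5: [4]}

def hot_potato(src, dst, destroyed=frozenset(), max_hops=20):
    """Each node immediately tosses the packet onward; no node plans the route."""
    node, route = src, [src]
    for _ in range(max_hops):
        if node == dst:
            return route
        alive = [n for n in links[node] if n not in destroyed]
        if dst in alive:
            node = dst                        # deliver if the target is adjacent
        else:
            fresh = [n for n in alive if n not in route] or alive
            node = min(fresh)                 # deterministic tie-break for the demo
        route.append(node)
    raise RuntimeError("packet discarded after too many hops")

print(hot_potato(1, 5))                   # one route through the intact mesh
print(hot_potato(1, 5, destroyed={2}))    # the packet detours around node 2
```

No node ever sees more than its own neighbours, yet the packet still reaches its destination even when a node is destroyed: delivery emerges from the network as a whole, not from any controlling node, which is the immanent control the passage describes.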
Paul Baran's network was never built, but the model he proposed was a major influence on the creation of ARPANET, the primordial web built by the US Department of Defence. In 1973 the need arose to reconcile incompatibilities among diverse data transmission technologies and to communicate with other networks like the French network CYCLADES, and so the TCP/IP protocol suite was designed. Because TCP/IP enabled networks of diverse characteristics to communicate, the resulting network-of-networks was called the Inter-net.
Robert Kahn and Vinton Cerf, creators of TCP/IP, describe it as "a simple but very powerful and flexible protocol which provides for variation in individual network packet sizes, transmission failures, sequencing, flow control, and the creation and destruction of process-to-process associations." In the TCP/IP protocol, flexibility, simplicity and scalability join survivability as the defining features of the design. However, in a 1988 report called The Design Philosophy of the DARPA Internet Protocols, David D. Clark recounts the original objectives of the Internet architecture and discusses "the relation between these goals and the important features of the protocols."
The solution Paul Baran crafted for the problem of network survivability, to essentially distribute power evenly throughout the network, irreparably breaks the social structure that over centuries had revolved around processes of accumulation and consolidation of power. Because it was initially confined to the few who already were supposed to hold power to begin with, distributed power was a tolerable concept. The egalitarian idea of distributed power, paradoxically made possible only as a means of military ‘command and control’ to survive absolute violence, was accepted under the retroactively delusional assumption that the distribution of power wouldn’t affect its concentration.
The struggles in the network emerge from the tension between the contradictory concepts of 'command' and 'control', buried deep in the protocol that governs it. The expression ‘command and control’ describes the abilities to initiate and to stop action, respectively. “At its crudest level 'command and control' in nuclear war can be boiled down to this: command means being able to issue the instruction to 'fire' missiles, and control means being able to say 'cease firing'.” It can be conflated into a simpler term: ‘power’; as this bipolar attribute is distributed, pre-existing powers experience loss, disorientation and traumatic distress. If we follow Foucault’s propositions that power is always relational and that it exists to be exercised, a network for distributed command and control (i.e. distributed power) creates a situation where centerless power is exercised in all directions. And so the paranoid genesis of the Net, combined with the absolute inability of the military, and even of academia, to foresee the impact their toy would have on the planet, produced the enormous historical faux-pas in the logic of power that is the Internet.” (February 2012)
Wikileaks: Two Exploits Against Protocollary Power
" In its transformation from a group of hacktivists using the opportunities of the internet to let data speak truth to power, to a phenomenon that leveraged much of the internet and impacted the production of mass media, Wikileaks illustrates the new possibilities of protocol and resistance.
In their 2007 book The Exploit, Galloway and Thacker argue that the network is merely a condition of possibility for the operation of protocol, which can direct control around the network. Within the form of power that is protocol lies the potential for an 'exploit', or disruption of protocol. The exploit is a property of the system, but it is also the thing that disrupts the system. This is one thing that WikiLeaks has effectively done: it identified the logic of control underlying both secrets and the way that secrets have in the past become, and may now become, news.
The Wikileaks phenomenon displays a couple of exploits. First, the insertion of Wikileaks into the production of news exploited features of the journalism process, inserting a new intermediary into the process and potentially creating a kind of disruptive innovation in the production of journalistic content. This disruption is based on the permanence and reproducibility of internet data . Second, the response of Anonymous to attempts to shut down the WikiLeaks organization reiterated how the exploit is a central property of a system of protocol: despite the increasing state regulation and governance of the internet, it still to some degree operates based on principles of distributed power.
WikiLeaks became significant because of its lucky ability to exploit the news production process. Through the summer of 2010, internet scholars, security specialists and hacktivists gleefully discussed the tidbits of scandal and deluges of data that WikiLeaks released. This ranged from Sarah Palin’s e-mail to thousands of pages on the US involvement in Afghanistan. But these leaks were not effective at drawing attention to the secrets in the way that Assange may have initially intended, at least in his writings. Thus, the partnerships with news organizations became important in advancing his single purpose. It also contributed to the conflation of WikiLeaks as an organization with Assange as a character, as Anderson's piece in this collection describes in detail.
The partnerships, as they expanded beyond Assange's goals, began to attract significant attention to the leaks. Whereas the leaked information about Afghanistan was so voluminous that only a few media stories broke, the diplomatic cables were redacted by journalists working with large newspapers. As Sarah Ellison's Vanity Fair article describes in wincing detail, this partnership was awkward for Assange, for the other members of WikiLeaks, and especially for the various journalists and newspapers involved.
We can read this unique partnership with a leaker, a non-national holder of leaks, and the conventional mass media as an exploit of the news-making process, and an innovation in reporting. As Aaron Bady notes,
The way most journalists “expose” secrets as a professional practice — to the extent that they do — is just as narrowly selfish: because they publicize privacy only when there is profit to be made in doing so, they keep their eyes on the valuable muck they are raking, and learn to pledge their future professional existence on a continuing and steady flow of it. In muck they trust.
The partnership between the mass media journalists and WikiLeaks has provoked much reflection by the journalistic community. Like Ellison, many of them concentrate on the incongruity of negotiating with Assange, who often threatened to release unredacted diplomatic cables if the process did not go as smoothly as he had hoped. This was exacerbated by the media attention and legal threats placed on Assange himself. However, what these reports have in common is description of a shifting role for the journalist: not necessarily collecting breaking news, but instead working to validate and contextualize raw data that has come from another source. In this way, WikiLeaks has exploited the network of news production and transformed it. Using the capacity of the internet to easily reproduce and maintain identical data, it has created a unique and persistent digital repository for the type of information which in the past would have been provided to journalists by trusted sources.
With the complicity of newsrooms, WikiLeaks has created a significant new innovation in the way that international news is investigated and released. It has succeeded in having previously unavailable material put out into the public domain. But it has not done this by maintaining a 'people power' wiki with every leak freely available. It has instead, through a combination of luck and strategy, added an innovation to the function of newsrooms strained by budget cuts. The publication of stories based on leaked cables continues even now, with a string of revelations appearing this week in India's national newspapers.
The WikiLeaks phenomenon also introduced a second exploit. This one was not the result of any actions of Assange or WikiLeaks volunteers. It was the outcome of a perception by individual internet users that powerful government and corporate interests shouldn't use technical tools to shut down Wikileaks without some retaliation – of the same type, of course.
After the release of the diplomatic cables, Wikileaks experienced censure by the US government, as well as the suspension of its bank accounts. Yochai Benkler describes the events this way in a February 2011 working paper:
“Responding to a call from Senate Homeland Security Committee Chairman Joe Lieberman, several commercial organizations tried to shut down Wikileaks by denial of service of the basic systems under their respective control. Wikileaks' domain name server provider stopped pointing at the domain “wikileaks.org,” trying to make it unreachable. Amazon, whose cloud computing platform was hosting the data, cut off hosting services for the site. Banks and payment companies, like Mastercard, Visa, and PayPal, as well as the Swiss postal bank, cut off payment service to Wikileaks in an effort to put pressure on the site's ability to raise money from supporters around the world. It is hard to identify the extent to which direct government pressure, beyond the public appeals for these actions and their subsequent praise from Senator Joe Lieberman, was responsible for these actions.”
This government pressure, although it was coupled with pressure from US companies, was not centrally coordinated. Instead, it took the form of an 'integrated, cross-system attack' in Benkler's terms. In response, another exploit took place. Individuals aligning themselves with the online group identity Anonymous staged DDoS counter-attacks, succeeding in temporarily shutting down MasterCard's and PayPal's websites. In a Foreign Policy article from December 2010, Evgeny Morozov likened denial of service attacks to “sit-ins” that are intended to disrupt institutions temporarily. He considered the actions of Anonymous to be a kind of direct action against internet censorship, which is separate from the role of WikiLeaks as a whistle-blower with ties to non-governmental organizations and the mass media. In particular, distributed denial of service attacks, which flood web sites with requests, are inconvenient from a technical point of view and require resources to mitigate, but are not inherently damaging. Following these retaliatory actions, thousands of individuals set up mirror sites of all of the wikileaks.org content, defeating the purpose of cutting off access to the site. Both of these activities were undertaken not through action coordinated far in advance, but by thousands of individuals who could be mobilized quickly to act in support of a shared principle.
The response from Anonymous and other internet users is a reminder that the Internet is not structured the way a broadcaster is. It can be 'killed' in one country, but for the moment it remains a set of interlinked distributed networks, where data that can cheaply and easily be reproduced can be maintained for long periods of time, across national borders. Exploits of these networks don't only come from powerful actors. They can come from individuals and collectives, and may indicate a new role for the multitude." (http://mediacommons.futureofthebook.org/tne/pieces/wikileaks-phenomenon-and-new-media-power)
Key Books to Read
- Protocol: How Control Exists After Decentralization. Alexander Galloway. MIT Press, 2004.
- Program or Be Programmed: Ten Commands for a Digital Age. Douglas Rushkoff. OR Books, 2010.
- The Exploit: A Theory of Networks. Alexander Galloway and Eugene Thacker. University of Minnesota Press, 2007.
- The Black Box Society: The Secret Algorithms That Control Money and Information. Frank Pasquale.
- Code/Space: Software and Everyday Life. Rob Kitchin and Martin Dodge. MIT Press, 2011. Examples of code/space include airport check-in areas, networked offices, and cafés that are transformed into workspaces by laptops and wireless access. Kitchin and Dodge argue that software, through its ability to do work in the world, transduces space. They then develop a set of conceptual tools for identifying and understanding the interrelationship of software, space, and everyday life, and illustrate their arguments with rich empirical material. Finally, they issue a manifesto, calling for critical scholarship into the production and workings of code rather than simply the technologies it enables.
- "The WikiLeaks Phenomenon and New Media Power". Alison Powell. The New Everyday, April 8, 2011. http://mediacommons.futureofthebook.org/tne/pieces/wikileaks-phenomenon-and-new-media-power
- "The Politics of Platforms". Tarleton L. Gillespie. New Media & Society, Vol. 12, No. 3, 2010.
- "Democratizing Software: Open Source, the Hacker Ethic, and Beyond". Brent K. Jesiek. First Monday, Volume 8, Number 10, October 2003. http://firstmonday.org/issues/issue8_10/jesiek/index.html
- The Immaterial Aristocracy of the Net
- Overview of architectures of control in the digital environment, by Dan Lockton.
- "The Politics of Code in Web 2.0". Ganaele Langlois, Fenwick McKelvey, Greg Elmer, and Kenneth Werbin. Fibreculture Journal, Issue 14.
- How Commercial Social Networks Hinder Connective Learning
- Kevin Slavin on the Algorithms that Shape Our World, which discusses in particular stock-trading algorithms.