If peer to peer is the relational dynamic at play in distributed networks, and peer production the process whereby common use value is produced, then peer governance refers to the way peer production is managed.
See also the related concept of Panarchy
"Peer governance is a new mode of governance and bottom-up mode of participative decision-making that is being experimented in peer projects, such as Wikipedia and FLOSS (Bauwens, 2005a, and 2005b). Thus peer governance is the way that peer production, the process in which common value is produced, is managed."
"Peer governance’s main characteristics are the Equipotentiality, i.e. the fact that in a peer project all the participants have an equal ability to contribute, although that not all the participants have the same skills and abilities (Bauwens, 2005a, and 2005b); the Heterarchy as a form of community; and the Holoptism i.e. the ability for any part to know the whole (Deleuze, 1986). Holoptism allows participants for free access to all information, in contrast with panoptism where participants have access on a ‘need to know basis’ (Bauwens, 2005a, and 2005b).
Stalder (2008) submits that leadership in peer projects is not egalitarian, but meritocratic: “Everyone is free, indeed, to propose a contribution, but the people who run the project are equally free to reject the contribution outright”, as “the core task of managing a Commons is to ensure not just the production of resources, but also to prevent its degradation from the addition of low quality material”. In the next chapter, where the governance of Wikipedia is explained in detail, it becomes evident that the common conception of peer governance as an anarchistic mode is mistaken. In fact, benevolent dictatorships are common in peer projects (Bauwens, 2005a, and 2005b; Malcolm, 2008): Linus Torvalds is the benevolent dictator of the Linux project (Malcolm, 2008) and, as we see later, Jimmy Wales plays the same role in Wikipedia. In addition, Coffin (2006) mentions some characteristics of the peer governance of successful open source communities. Concretely, membership is open and widespread, premised on participation. Collaboration among the members of the project is geographically dispersed, asynchronous and organized in networks. Moreover, the project is transparent: the dialogues among the participants are recorded, the materials of the project are subject to open review, and there is a mechanism for institutional history. Crucially, a compelling foundational artifact is established, around which production and participation are organized. All of this gives rise to the formation of a community in which the sense of project ownership is widely shared and decisions are taken by a hybrid political system premised on meritocracy: what I call peer governance.
Coffin (2006) also refers to the necessity of a benevolent dictator (who typically is one of the founders of the project), adding that the foundation developers and the early adopters set the project ethos as well. The founder, along with the first members, upholds the right to fork. Axel Bruns (interview with Bruns, 2009) defines the benevolent dictator as “one of several heterarchical leaders of the community, who have risen to their positions through consistent constructive contribution and stand and fall with the quality of their further performance”. Through such leadership roles, they may gain the ability to push through unpopular decisions. So, as Bruns notes, “if they abuse that power, theirs becomes a malicious leadership” and what we should expect at this point is “a substantial exodus of community members”. Therefore, following Bruns’ narrative, “the continued existence of the project at that moment would depend very much on whether the number of exiting members can be made up for in both quality and quantity by incoming new participants”.
2. Main characteristics of peer governance
"Coffin (2006) mentions some obvious characteristics of successful open source/p2p communities. Firstly, the membership is open and widespread, premised on participation. The free collaboration among the members is geographically dispersed, asynchronous and organized in networks. Moreover, projects are transparent and dialogues among participants are recorded, with the materials of projects like Wikipedia subject to open review (there is a mechanism for institutional history). So, at the first glance, openness, networking, participation and transparency appear as the main characteristics of governance in peer projects. More closely, these projects do not operate in strict hierarchies of command and control, but rather in heterarchies. They operate “in a much looser [environment] which … allows for the existence of multiple teams of participants working simultaneously in a variety of possibly opposing directions.”  According to Bruns (2008), heterarchies are not simply adhocracies, but ad hoc meritocracies which, however, are at risk of transforming themselves into more inflexible hierarchies. In addition, following Bauwens (2005a; 2005b), peer projects are based on the organizing principle of equipotentiality, i.e., everyone can potentially cooperate in a project — no authority can pre–judge the ability to cooperate. In peer projects, equipotential participants self–select themselves to the section to which they want to contribute (Bauwens, 2005b). Moreover, unlike panoptism (i.e., the way knowledge is distributed in hierarchical projects where only the top of the pyramid has a full view), peer groups are characterized by holoptism, i.e., the ability for any part to have horizontal knowledge of what is going on, but also the vertical knowledge concerning the aims of the project (Bauwens, 2005b)." (http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2613/2479)
Typology of Commons Regulation
Three Levels of Governance (in the context of forking); from the article: Code Forking, Governance, and Sustainability in Open Source Software 
By Linus Nyman, Juho Lindman:
The nature of the industry dictates that programs cannot maintain a stable steady state for an extended period of time. They must continue to evolve in order to remain useful and relevant. Without continual adaptation, a program will progressively become less satisfactory (Lehman, 1980). Conversely, truly successful software is able to adapt and even outlive the hardware for which it was originally written (Brooks, 1975). Therefore, the ability to change and evolve is a key component of software sustainability. Although stagnation may be a precursor to obsolescence, obsolescence need not creep into a project over time; it is often a design feature.
Popularized in the 1950s by American industrial designer Brooks Stevens (The Economist, 2009), the concept of planned obsolescence stands in stark contrast to the concept of sustainability. Stevens defined planned obsolescence as the act of instilling in the buyer “the desire to own something a little newer, a little better, a little sooner than is necessary” (Brooks Stevens’ biography). Considered “an engine of technological progress” by some (Fishman et al., 1993), yet increasingly problematized in the business ethics literature (Guiltinan, 2009), planned obsolescence is part of every consumer’s life. Although contemporary software development and distribution have characteristics that differ substantially from the industrial products of the 1950s, the revenue models of companies in the software marketplace often welcome elements such as system versioning, to encourage repurchases of a newer version of the same system, or vendor lock-ins that limit customer choice to certain providers of a system or product (for a further review, see Combs, 2000). Newer versions of programs may introduce compatibility problems with earlier operating systems or programs (e.g., lack of backwards compatibility in Internet Explorer, Microsoft Office, or OS X’s OpenStep APIs). Some programs also introduce new file formats, which can cause compatibility issues with earlier versions of the program (e.g., docx vs. doc). Furthermore, end-of-life announcements and concerns over end-of-support deadlines may encourage users to upgrade, regardless of the real need to do so.
The right to fork code makes implementing such elements impracticable in open source. The right to improve a program, the right to combine many programs, and the right to make a program compatible with other programs and versions are all fundamental rights that are built into the very definition of open source. Research has shown these rights are often exercised (Fitzgerald, 2006). The result of this constant collaborative improvement in open source systems is that any program with the support of the open source community can enjoy assured relevance rather than planned obsolescence. Furthermore, with renewed community interest, programs that have decayed and fallen into disuse can be revived and updated by forking the code from the original program. In fact, this is a fairly common practice: of the almost 400 forks studied by Nyman and Mikkonen (2011), 7% involved the reviving of an abandoned project. As long as there is sufficient community interest in a project, forking can allow for constant improvement in software functionality.
The possibility to fork is central to the governance of any open source community. The shared ownership of open source projects allows anyone to fork a project at any time. Therefore, no one person or group has a “magical hold” over the project (Fogel, 2006). Since a fork involving a split of the community can hurt overall productivity, Fogel notes that the potential to fork a program is “the indispensable ingredient that binds developers together”.
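In practice, the mechanics behind this governance safeguard are mundane: because every participant can hold a complete copy of the source (and, in a distributed version control system such as git, of its full history), anyone can turn their copy into an independent project. The following is a minimal, self-contained sketch of that process; the repository, author, and file names are all hypothetical.

```shell
# Stand-in "upstream" project (all names here are hypothetical)
git init --quiet upstream
git -C upstream config user.email dev@example.org
git -C upstream config user.name "Upstream Dev"
echo 'print("hello")' > upstream/app.py
git -C upstream add app.py
git -C upstream commit --quiet -m "Initial release"

# A "fork" is simply a clone that starts evolving on its own terms:
# it carries the complete history and needs nothing further from upstream.
git clone --quiet upstream revived-fork
git -C revived-fork config user.email fork@example.org
git -C revived-fork config user.name "Fork Dev"
echo 'print("hello from the fork")' > revived-fork/app.py
git -C revived-fork commit --quiet -am "Take the project in a new direction"
```

The clone is a fully functional repository: the fork requires no permission from, and retains no technical dependence on, the original project, which is precisely why the mere possibility of forking can bind project leaders to their communities.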
One of the concerns among open source communities is what Lerner and Tirole (2002) call the hijacking of the code. Hijacking occurs when a commercial vendor attempts to privatize a project’s source code. The 2008 acquisition of MySQL, an open source relational database management system, by Sun Microsystems and the subsequent acquisition of Sun by Oracle is an example of a case involving community concern over potential hijacking. It had been argued that such a series of acquisitions would lead to the collapse of both MySQL and the open source movement at large (Foremski, 2006). Responding to such claims, Moody (2009) noted that, while open source companies can be bought, open source communities cannot. Forking provides the community that supports an open source project with a way to spin off their own version of the project in case of such an acquisition. Indeed, this is what happened in the case of MySQL. The original MySQL developer, Michael (“Monty”) Widenius, forked the MySQL code and started a new version under a different name, MariaDB, due to concerns regarding the governance and future openness of the MySQL code (for details, see Widenius' blog [February 5, 2009 and December 12, 2009] and press release).
Similarly, in 2010, community concerns regarding governance led to a forking of the OpenOffice (OO) project. The Document Foundation, which included a team of long-term contributors to OO, forked the OO code to begin LibreOffice. The spinoff project emphasized the importance of a “transparent, collaborative, and inclusive” government (The Document Foundation). A recent analysis of the LibreOffice project indicates that this fork has resulted in a sustainable community with no signs of stagnation (Gamalielsson and Lundell, 2012). Given that forking ensures that any project can continue as long as there is sufficient community interest, we have previously described forking as the “invisible hand of sustainability” in open source software (Nyman et al., 2011).
Commonly, forking occurs due to a community’s desire to create different functionality or focus the project in a new direction. Such forks are based on a difference in software requirements or focus, rather than a distrust of the project leaders. When they address disparate community needs, different versions can prosper.
In a traditional company, it is the management, headed by the CEO and board of directors, that controls the company and provides the impetus for continued development. While the vision of the leadership is similarly integral to the eventual success of any open source project, their continued control is more fragile and hinges upon their relationship with and responses to the community. Forking cannot be prevented by business models or governance systems. The key lies in appropriate resource allocation and careful community management. Managers must strike a delicate balance between providing a driving force while appeasing and unifying the community. (For an overview of open source governance models, see OSS Watch; for discussion on building technical communities, see Skerrett, 2008; for discussion on open source community management, see Byron, 2009.)
Within the dynamic world of open source software, natural selection acts as a culling force, constantly choosing only the fittest code to survive (Torvalds, 2001). However, the right to fork means that any company can duplicate any competitor’s open source software distributions; thus, competitive advantage cannot depend on the quality of the code alone. However, it is worth stressing that possibility does not equal success. The right to fork a commercially successful program with the intention of competing for the same customer base still leaves the would-be competitor with issues regarding trademarks, brand value and recognition, as well as the existing developer and user base of the original program. Even though forking allows companies to compete with identical open source software, it is nevertheless cooperation that is considered to be the key to corporate success (Skerrett, 2011; Muegge, 2011).
Open source software is free, but it is also increasingly developed and supported for commercial gains (Wheeler, 2009). While the right to fork may seem to make for a harsh business environment, open source companies can and do thrive. With its billion-dollar revenue, Red Hat is one such example. While their revenue primarily comes from subscriptions and services related to their software (see Suehle’s TIM Review Q&A for a more in-depth look at the secret of Red Hat’s success), Red Hat’s programs themselves are largely based on forks of programs by other developers. This phenomenon of combining forked programs is not unique to Red Hat: the hundreds of different Linux distributions are all made possible by forking existing products and repackaging them as a new release.
Forking lays the building blocks for innovators to introduce new functionalities into the market, and the plethora of online forges have hundreds of thousands of programs available for forking and reuse in any new, creative way the user can imagine, allowing for the rapid adaptation to the needs of end users. Hence, the practice of forking allows for the development of a robust, responsive software ecosystem that is able to meet an abundance of demands (Nyman et al., 2012).
The old adage, "one man’s trash is another man’s treasure" is particularly salient in open source software development. Soon after Nokia’s abandonment of the MeeGo project in 2011 (press release; MeeGo summary), the Finnish company Jolla announced that it would create a business around its revival, made possible by forking the original code (press release). On July 16, 2012, Jolla announced a contract with D. Phone, one of the largest cell phone retailers in China, and on November 21 they launched Sailfish OS. However, one does not need to be an open source business to benefit from the right to fork. Forking can also aid companies who choose to use an existing program, or develop it for personal use. The requirement in open source to share one’s source code is linked with distribution, not modification, which means that one can fork a program and modify it for in-house use without having to supply the code to others. However, a working knowledge of licenses as well as license compatibility (when combining programs) is crucial before undertaking such an endeavour (for a discussion of licenses, see St. Laurent , Välimäki , or Meeker  for a discussion of architectural design practices in the combining of licenses, see Hammouda and colleagues ." (http://timreview.ca/article/644)
Discussion 1: Description
Steve Weber on the Governance of Peer Production projects
"When I use the term "governance" in this discussion, I am using it in the way it is used in international relations. In that context "governance" is not government, it is typically not authoritative, and in fact it is not about governing in a traditional sense as much as it is about setting parameters for voluntary relationships among autonomous parties. Given that perspective, the central question becomes, How has the open source process used technology along with both new- and old-style institutions of governance to manage complexity among a geographically dispersed community not subject to hierarchical control?
More generally, there has been a dramatic shift in what it means to be a leader in a religious community. Change the foundations of property, and you change the network of relationships that radiate outward from that which is owned, in fundamental and unexpected ways.
I will come back to this point later when I discuss power and communities; but for the moment, recognize an interesting corollary. In nonauthoritative settings -- that is, in relationships primarily characterized as bargaining relationships -- power derives in large part from asymmetrical interdependence. Put simply, the less dependent party to the relationship (the one who would be harmed least by a rupture) is the more powerful. In the kinds of modern religious communities I am talking about here, it is the leader who is dependent on the followers more than the other way around. This dependence changes the dynamics of power. Leaders of open source projects understand the consequences intuitively -- the primary route to failure for them is to be unresponsive to their followers. In Hirschman's terms, this is a community that empowers and positively legitimates exit while facilitating voice." (http://www.sauria.com/blog/2006/Jun/04)
Felix Stalder on the Meritocratic Leadership in Peer Production
= leadership/hierarchy in peer production is not egalitarian, but meritocratic
"The openness in open source is often misunderstood as egalitarian collaboration. However, FOSS is primarily open in the sense that anyone can appropriate the results, and do with them whatever he or she wants (within the legal/normative framework set out by the license). This is what the commons, a shared resource, is about. Free appropriation. Not everyone can contribute. Everyone is free, indeed, to propose a contribution, but the people who run the project are equally free to reject the contribution outright. Open source projects, in their actual organization, are not egalitarian and not everyone is welcome. The core task of managing a commons is to ensure not just the production of resources, but also to prevent its degradation from the addition of low quality material.
Organizationally the key aspects of FOSS projects are that participation is voluntary and – what is often forgotten – that they are tightly structured. Intuitively, this might seem like a contradiction, but in practice it is not. Participation is voluntary in a double sense. On the one hand, people decide for themselves if they want to contribute. Tasks are never assigned, but people volunteer to take responsibility. On the other hand, if contributors are not happy with the project’s development, they can take all the project’s resources (mainly, the source code) and reorganize it differently. Nevertheless, all projects have a leader, or a small group of leaders, who determine the overall direction of the projects and which contributions from the community are included in the next version, and which are rejected. However, because of the doubly voluntary nature, the project leaders need to be very responsive to the community, otherwise the community can easily get rid of them (which is called ‘forking the project’). The leader has no other claim for his (and it seems to be always a man) position than to be of service to the community. Open Source theorist Eric S. Raymond has called this a benevolent dictatorship. More accurately, it is called the result of a voluntary hierarchy in which authority flows from responsibility (rather than from the power to coerce).
Thus, the FOSS world is not a democracy, where everyone has a vote, but a meritocracy, where the proven experts – those who know better than others what they are doing and do it reliably and responsibly – run the show. The hierarchical nature of the organization directly mirrors this meritocracy. The very good programmers end up on top, the untalented ones either drop out voluntarily, or, if they get too distracting, are kicked out. Most often, this is not an acrimonious process, because in coding, it’s relatively easy to recognize expertise, for the reasons mentioned earlier. No fancy degrees are necessary. You can literally be a teenager in a small town in Norway and be recognized as a very talented programmer. Often it’s a good strategy to let other people solve problems more quickly than one could oneself, since usually their definition of the problem and the solution is very similar to one’s own. Thus, accepting the hierarchical nature of such projects is easy. It is usually very transparent and explicit. The project leader is not just a recognized crack, but also has to lead the project in a way that keeps everyone reasonably happy. The hierarchy, voluntary as it may be, creates numerous mechanisms of organizational closure, which allows a project to remain focused and limits the noise/signal ratio of communication to a productive level.
Without an easy way to recognize expertise, it is very hard to build such voluntary hierarchies based on a transparent meritocracy, or other filters that increase focus and manage the balance between welcoming people who can really contribute and keeping out those who do not." (http://publication.nodel.org/On-the-Differences)
Jeremy Malcolm: the balance between hierarchy and decentralisation
Source: Chapter 4 of the book, Multi-Stakeholder Governance and the Internet Governance Forum. Jeremy Malcolm. Terminus, 2008
“The common conception of most open source software projects as being anarchistic is actually a myth. In most open source software development projects, anarchy is balanced with hierarchical control.
It is in fact common for open source software development projects to be governed by a “Benevolent Dictator for life” (or BDFL). These are found in projects ranging from the Linux operating system kernel itself, of which Linus Torvalds is the BDFL, to Linux-based operating system distributions such as Ubuntu, led by Mark Shuttleworth, application software such as the Samba networking suite, coordinated by Andrew Tridgell, and programming languages such as Perl, PHP and Python, in which Larry Wall, Rasmus Lerdorf and Guido van Rossum respectively act as project leaders in perpetuity.
The position of BDFL normally falls to the developer who initiated a project, though in the case of multiple original core developers, the phenomenon of a benevolent oligarchy for life is not unknown (for example Matt Mullenweg and Ryan Boren for the WordPress blog engine).
In the case of the Linux kernel, Torvalds, who is perhaps the archetype of a BDFL, possesses ultimate authority to decide which contributions (“patches”) to the Linux operating system kernel should be accepted and which should be refused. Torvalds no longer personally manages the whole of the kernel and has delegated authority to a number of trusted associates to manage particular subsystems and hardware architectures, but it remains his authority to appoint these so-called “lieutenants” and to supervise their work. A document distributed with the Linux kernel source code that is subtitled “Care And Operation Of Your Linus Torvalds” describes him as “the final arbiter of all changes accepted into the Linux kernel.”
Thus, contrary to what might be assumed from Raymond’s claim about “the Linux archive sites, who’d take submissions from anyone,” the Linux kernel development process is neither anarchistic nor consensual: if Torvalds does not like a patch, it does not go into the kernel. This has often antagonised other kernel developers, one of them commencing a long-running thread on the kernel development mailing list by saying:
Linus doesn’t scale, and his current way of coping is to silently drop the vast majority of patches submitted to him onto the floor. Most of the time there is no judgement involved when this code gets dropped. Patches that fix compile errors get dropped. Code from subsystem maintainers that Linus himself designated gets dropped. A build of the tree now spits out numerous easily fixable warnings, when at one time it was warning-free. Finished code regularly goes unintegrated for months at a time, being repeatedly resynced and re-diffed against new trees until the code’s maintainer gets sick of it. This is extremely frustrating to developers, users, and vendors, and is burning out the maintainers. It is a huge source of unnecessary work. The situation needs to be resolved. Fast.
Torvalds’ initially unapologetic response recalls another classic example of his sardonic view of his position as BDFL, when announcing the selection of a penguin logo for Linux. Acknowledging the comments of those who had expressed reservations about it, Torvalds concluded with the quip, “If you still don’t like it, that’s ok: that’s why I’m boss. I simply know better than you do.”
See also: Linux - Governance
Mozilla and OpenOffice.org
"The Mozilla and OpenOffice.org projects, amongst the most successful and highest-profile of all open source projects, provide a slightly different example of hierarchical ordering in open source software development.In these cases, the authority is not that of an individual, but a corporation: originally Netscape Communications in the case of Mozilla, and Sun Microsystems in the case of OpenOffice.org.
As well as leading development, Netscape originally held the “Mozilla” trade mark (as Linus Torvalds does for “Linux” in various jurisdictions), and until 2001 required modifications to its source code to be licensed under terms that exclusively exempted it from the copyleft provisions applicable to other users. Similarly, Sun required (and continues to require) contributors to the OpenOffice.org project to assign joint copyright in their work to it.
This kind of collective hierarchical control over an open source software project can also be exercised by a civil society organisation. The non-profit Mozilla Foundation, for example, succeeded to the rights of Netscape, such as the trade mark and rights under the Netscape Public License. Membership of its governing body (or “staff”) is by invitation only."
See also: Firefox - Governance
Another example of such an organisation, also taken from one of the most prominent and successful open source projects, is the Apache Software Foundation (ASF), which is best known for the Apache HTTP Server which powers the majority of Web sites on the Internet.
The case of the ASF illustrates well that there are also various strata of developers underneath the BDFL. One study has categorised these into core members (or maintainers), active developers, peripheral developers, bug reporters, readers and passive users, and confirmed previous findings that the core developers are generally the smallest group but write the majority of the project’s code. Whilst developers in lower strata are mostly self-selected, in many projects, including those of the ASF, the core developers are selected by the BDFL, applying stringent meritocratic standards.
The Apache Software Foundation is a non-profit corporation governed by a board of nine directors who are elected by the Foundation’s members for one-year terms, and who in turn appoint a number of officers (63, in 2007) to oversee its day-to-day operations. As of 2007 there are 156 members of the ASF, each of whom was invited to join on the basis of their previous contributions to ASF projects, and whose invitation was extended by a majority vote of the existing members.
Each project overseen by the ASF (such as the HTTP Server Project) is in turn governed by a Project Management Committee (PMC) comprised of a self-selected group of ASF members. Whilst non-members can also contribute to project development, they are required to assign copyright in their contributions to the ASF. PMC chairs are officers of the ASF who have power to define the PMC’s rules (if any) and the responsibility to report to the board. The ASF itself claims that it
- represents one of the best examples of an open organization that has found balance between structure and flexibility. We have grown from 200 committers to almost 800, and that number continues to grow on a daily basis. We have been able to create several software products that are leaders in their market. We have also been able to find balance between openness and economical feasibility. This has earned us respect from a range of people, from single individuals to multinational corporations. We hope to continue to provide inspiration for businesses, governments, education, and for other software foundations.
One final example of hierarchical ordering in open source software development is found in comparing two similar Linux-based operating system distributions, Debian GNU/Linux and Ubuntu. The Debian project was the first to be established, in 1993. Although not incorporated, an associated incorporated body Software in the Public Interest, Inc (SPI) was formed in 1997 to provide the Debian project (along with various other open source projects) with administrative and legal support. It does not take an active role in the development of the Debian distribution.
The Debian project is led by a Project Leader who is elected by the project’s members, known as its Developers (or Maintainers), for a one year term. The Project Leader presides over a Project Secretary, a Technical Committee of up to eight members, and various other technical positions, all of whom are appointed from amongst the Developers by the Project Leader (save that the Project Leader appoints a new Project Secretary jointly with the incumbent in that role). The process for becoming a Developer is a meritocratic one, the requirements of which are set out in a detailed policy document. As at 2007 there are over 1100 Developers, although many of these are inactive.
The autonomy of the Project Leader is expressly limited by clause 5.3 of the Debian Constitution, which provides, “The Project Leader should attempt to make decisions which are consistent with the consensus of the opinions of the Developers” and “should avoid overemphasizing their own point of view when making decisions in their capacity as Leader.” The same applies to the Technical Committee, which is only to make decisions as a last resort, where “efforts to resolve it via consensus have been tried and failed” (clause 6.3.6). Clause 4.1 provides that any decision of the Project Leader may be overruled by a general resolution of the Developers, as may any decision of the Technical Committee by a two thirds majority resolution."
See also: Debian - Governance
This is in contrast to the position within the governance of the Ubuntu distribution. Mark Shuttleworth founded the distribution in 2004, and termed himself its SABDFL (self-appointed Benevolent Dictator for life). He exercises a casting vote on both of its main decision-making bodies: the Technical Board and the Ubuntu Community Council.
The Technical Board is a committee which makes decisions on technical issues relating to the distribution, such as which software it should include and how this should be packaged and installed. It acts by consensus, but only amongst its own members, and is not required to reflect the views of the Ubuntu community at large. Its members, currently four, are appointed by Shuttleworth subject to confirmation by a vote of all developers, and they serve a one year term.
The Community Council is responsible for overseeing the social structure of the Ubuntu community, including the creation of Teams and Projects with responsibility for particular tasks such as documentation and release management, and the appointment of new team leaders, developers and members. It also maintains a Code of Conduct to which all members of the Ubuntu community are required to adhere, and arbitrates disputes arising under it. The Community Council, also currently of four members, is appointed by Shuttleworth subject to confirmation by a vote of all developers and members, and sits for a two year term.
The developers or maintainers of Ubuntu are equivalent to those of Debian and nominated by a similar process, save that they are divided into two tiers: Masters of the Universe or MOTU, who have authority to maintain packages only in the unsupported “universe” and “multiverse” classes available for installation on an Ubuntu system, and Core Developers who are uniquely authorised to maintain packages in the fully supported “main” class which are installed by default by the Ubuntu distribution.
In addition, Ubuntu recognises as “members” those who have provided a significant contribution to the Ubuntu community, perhaps by providing support, writing documentation or reporting bugs, but who do not require the power to directly maintain packages. Members, like developers, are required to agree to the Code of Conduct and have the right to vote to confirm new appointments to the Ubuntu Community Council.
A survey of desktop users of Linux distributions conducted in 2005 by Open Source Development Labs (OSDL) revealed that in little over a year since its establishment in 2004, the Ubuntu distribution was already in use by more than 50 per cent of respondents. A prominent former Debian Developer who resigned in 2006 compared the Debian and Ubuntu distributions by saying,
- “There’s a balance to be struck between organisational freedom and organisational effectiveness. I’m not convinced that Debian has that balance right as far as forming a working community goes. In that respect, Ubuntu’s an experiment—does a more rigid structure and a greater willingness to enforce certain social standards result in a more workable community?”
See also: Ubuntu - Governance
In fact, each of the examples given of open source projects in which a significant hierarchical structure exists or has existed (the Linux kernel, Mozilla, OpenOffice.org, Apache, and Ubuntu) is among the most widely-used open source projects in its class, and has a large and active community of developers.
How can this be reconciled with the earlier hypothesis that it was the very lack of hierarchy that empowered developers and attracted them to volunteer their services to open source projects?
Despite the fact that its significance to developers had earlier been downplayed, the answer is found in the open source licence. It is the open source licence that enforces benevolence upon the dictator. It does this by ensuring that for any open source project there is always relatively costless freedom of exit: any developers who feel they are being oppressed by a project leader can simply cease participating in the project, take its source code, and use it as the base for a new project of their own (known as a “fork” of the original project). This “exit-based empowerment” enjoyed by developers mitigates the power of the project leaders.
As Torvalds has put it,
- I am a dictator, but it’s the right kind of dictatorship. I can’t really do anything that screws people over. The benevolence is built in. I can’t be nasty. If my baser instincts took hold, they wouldn’t trust me, and they wouldn’t work with me anymore. I’m not so much a leader, I’m more of a shepherd. Now all the kernel developers will read that and say, “He’s comparing us to sheep.” It’s more like herding cats.
The Linux kernel has, indeed, been forked numerous times. One prominent fork was that maintained by Red Hat Linux developer Alan Cox, who released a series of kernel source trees differentiated from Torvalds’ official kernel by the “-ac” tag, and which contained patches not yet accepted by Torvalds. However, since 2002 a technical solution to Torvalds’ backlog was found in the use of specialised revision control software (originally, ironically, a proprietary product called BitKeeper, and subsequently an open source equivalent called Git written by Torvalds himself). This has placated many of Torvalds’ critics, and resulted in the obsolescence of many former forks of the kernel.
Both Mozilla’s Firefox browser and the OpenOffice.org office suite have also been forked. The Debian project, for example, has replaced Firefox in its distribution with a forked version called Iceweasel, to escape the onerous trade mark licence conditions imposed by the Mozilla Foundation for the use of the Firefox name and logo. As for OpenOffice.org, a prominent fork called NeoOffice has been customised to integrate more smoothly with the Mac OS X operating system. Debian itself has also spawned a number of derivative distributions, although these are seldom termed forks. A federation of Debian-based distributions (though not including Ubuntu) is the DCC Alliance.
Admittedly, forking an open source project is not completely costless. Although the cost of the infrastructure required to host a new project is minimal (often even free), a new name and logo will be required (either because the originals are protected by trade marks as in the case of Firefox and OpenOffice.org, or simply because of the practical necessity to distinguish the two projects).
Usually, however, the most significant cost is that the new project leader will need to establish a community of users and developers to support the project in the long term. Economic sociologists describe this as the cost of developing social capital. Thus, the more successful the parent project (and the more cohesive its communities of developers and users), the higher its social capital, the higher the transaction costs of a fork, and the more effectively that fork will have to differentiate itself from its parent in order to overcome those costs.
This is illustrated by the case of Samba-TNG, which forked from the highly successful Samba project in 1999, seeking to differentiate itself by being the first to offer the facility to replace a Microsoft Windows NT server as a Primary Domain Controller. However, it struggled to build a development community comparable in size and expertise to that of its parent project, which in the meantime implemented its own version of Samba-TNG’s differentiating feature. In comparison, less dominant and stable projects (such as the oft-criticised PHP-Nuke content management system) have been forked more often and more successfully.
This characteristic of the transaction costs associated with migration from one open source project to another provides a cohesive force against the unnecessary fragmentation of open source projects, that will only be overcome if enough developers become sufficiently dissatisfied to form a viable competing project (which the project leaders have an incentive not to allow to happen, lest they lose their base of developers). In comparison, developers within Microsoft Corporation face much higher transaction costs in replicating their work and their communities elsewhere if they are dissatisfied, if indeed it is possible for them to do so at all.
Thus it is from the unexpected source of the open source licence that a possible solution is found to the problem of how to ensure that an organisation acts benevolently rather than tyrannically, even though it may be structured in hierarchical rather than democratic form. The open source licence achieves this by providing an implicit check on the power of the organisation’s leadership, effectively conditioning its continued exercise of that power on the consent of its constituents.”
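The “right to fork” that underwrites this exit-based empowerment requires nothing more than a copy of the published source tree. A minimal sketch using ordinary Git commands (all repository names, identities, and file contents below are hypothetical) shows how cheaply a dissatisfied developer can take the code and continue independently:

```shell
#!/bin/sh
# Sketch of a fork: anyone holding the source can start an independent
# line of development without upstream's permission.
# All names and paths below are hypothetical.
set -e
tmp=$(mktemp -d)

# The original ("upstream") project, with one released commit
git init -q "$tmp/upstream"
cd "$tmp/upstream"
git config user.email upstream@example.org
git config user.name "Upstream Leader"
echo 'version 1' > module.c
git add module.c
git commit -qm 'initial release'

# A developer "forks" simply by cloning the published history...
git clone -q "$tmp/upstream" "$tmp/fork"
cd "$tmp/fork"
git config user.email fork@example.org
git config user.name "Fork Maintainer"

# ...and commits a patch that upstream declined to accept
echo 'contested patch' >> module.c
git commit -qam 'apply patch rejected upstream'

# The fork now carries history the parent project does not have:
# upstream has 1 commit, the fork has 2.
git rev-list --count HEAD
```

As the passage above notes, the only real costs of such a fork lie outside the code: a new name and logo, and the slow rebuilding of the surrounding community of developers and users.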
Discussion 2: Details
Characteristics of successful collaborative projects
" characteristics of successful free software/open source communities:
- open and widespread membership based upon participation
- geographically distributed, asynchronous, networked collaboration
- project transparency, particularly open, recorded dialog and peer review of project materials, discussions, and decisions
- a compelling foundational artifact to organize participation and build upon
- collaborative, iteratively clarified, living documents and project artifacts
- a mechanism for institutional history
- a community–wide sense of project ownership
- a hybrid political system based upon meritocracy
- a trusted benevolent dictator, typically the project founder
- foundational developers and early adopters who, along with the benevolent dictator, set project ethos
- consensus as a decision–making tool
- upholding the right to fork."
From Iandoli, Klein, & Zollo (2008), summarized by Vasilis Kostakis:
"in order virtual communities to work properly, three important governance problems have to be dealt with:
• Attention governance: we must attract a considerable number of users, reduce the risk of premature convergence and enable sufficient exploration of the search space by countervailing the influences of informational pressure, social pressure and common knowledge;
• Participation governance: we must retain a critical mass of motivated diverse users, and provide them with support and incentives for evidence-based reasoning as well as the sharing of unique personal knowledge;
• Community governance: we must identify the rules and the organizational structures of the community in terms of the process and roles that enable attention governance and effective participation."
Source: Iandoli, L., Klein, M., and Zollo, G. (2008) “Can We Exploit Collective Intelligence for Collaborative Deliberation? The Case of the Climate Change Collaboratorium”, MIT Sloan School Working Paper 4675-08.
Discussion 3: Peer Governance in General
Bob Jessop on Governance and Meta-Governance
Bob Jessop uses the similar term ‘governance’, but I believe we should distinguish peer governance, as the management of bottom-up peer groups, from the extension towards multiple stakeholders of the governance of existing institutions. A further cause of confusion is that the general literature often uses ‘governance’ in a generic way, barely distinguishable from ‘government’, but also in specific ways that have no relationship with peer governance, as in the case of ‘corporate governance’. It is therefore preferable to use ‘peer governance’.
Bob Jessop clearly distinguishes governance, as peer governance, from the market and the state:
“Even on this minimal definition, governance can be distinguished from the ‘invisible hand’ of uncoordinated market exchange based on the formally rational pursuit of self-interest by isolated market agents; and from the ‘iron fist’ (perhaps in a ‘velvet glove’) of centralised, top-down imperative co-ordination in pursuit of substantive goals established from above … Political theorists now suggest that governance is an important means to overcome the division between rulers and ruled in representative regimes and to secure the input and commitment of an increasingly wide range of stakeholders in policy formulation and implementation. In this sense governance also has normative significance. It indicates a revaluation of different modes of co-ordination not just in terms of their economic efficiency or their effectiveness in collective goal attainment but also in terms of their associated values. Thus ‘governance’ has acquired positive connotations such as ‘middle way’, ‘consultation’, ‘negotiation’, ‘subsidiarity’, ‘reflexivity’, ‘dialogue’, etc., in contrast to the anarchy of the market or the state's ‘iron fist’." (http://www.cddc.vt.edu/host/lnc/papers/JessopGovernance.htm)
More commentary by Bob Jessop:
- Generalized peer governance in the context of asymmetrical capital-labour relations, Jessop
“the apparent promise of symmetry in social partnership as a form of reflexive self-organisation may not be realized. For there are marked structural asymmetries in the capital-labour relation and in the forms of interdependence between the economic and the extra-economic conditions for accumulation." (http://www.cddc.vt.edu/host/lnc/papers/JessopGovernance.htm)
- Failures of peer governance/state interrelationships, Jessop
“The second set of potential sources of governance failure concerns the contingent insertion of partnerships and other reflexive, self-organising arrangements into the more general state system – especially in terms of the relative primacy of different modes of co-ordination and access to institutional support and material resources to pursue reflexively-arrived at governance objectives. There are three key aspects of this second set of constraints. First, as both governance and government mechanisms exist on different scales (indeed one of their functions is to bridge scales), success at one scale may well depend on what occurs on other scales. Second, co-ordination mechanisms may also have different temporal horizons. One function of contemporary forms of governance (as of quangos and corporatist arrangements beforehand) is to enable decisions with long-term implications to be divorced from short-term political (especially electoral) calculations. But disjunctions may still arise between the temporalities of different governance and government mechanisms. Third, although various governance mechanisms may acquire specific techno-economic, political, and/or ideological functions, the state typically monitors their effects on its own capacity to secure social cohesion in divided societies. It reserves to itself the right to open, close, juggle, and re-articulate governance arrangements not only in terms of particular functions but also from the viewpoint of partisan and overall political advantage … First, governance attempts may fail because of over-simplification of the conditions of action and/or deficient knowledge about causal connections affecting the object of governance. This is especially problematic when this object is an inherently unstructured but complex system, such as the insertion of the local into the global economy. 
Indeed, this leads to the more general ‘governability’ problem, i.e., the question of whether the object of governance could ever be manageable, even with adequate knowledge … Second, there may be co-ordination problems on one or more of the interpersonal, interorganisational, and intersystemic levels. These three levels are often related in complex ways. Thus interorganisational negotiation often depends on interpersonal trust; and de-centred intersystemic steering involves the representation of system logics through interorganisational and/or interpersonal communication. Third, linked to this is the problematic relationship between those engaged in communication (networking, negotiation, etc.) and those whose interests and identities are being represented. Gaps can open between these groups leading to representational and legitimacy crises and/or to problems in securing compliance. And, fourth, where there are various partnerships and other governance arrangements concerned with interdependent issues, there is a problem of co-ordination among them." (http://www.cddc.vt.edu/host/lnc/papers/JessopGovernance.htm)
- The problem of metagovernance, by Bob Jessop
“The most cursory review of attempts at governance, whether through the market, imperative co-ordination, or self-organisation, reveals an important role for learning, reflexivity, and metagovernance. Indeed, if markets, states, and governance are each prone to failure, how is economic and political co-ordination for economic and social development ever possible and why is it often judged to have succeeded? This highlights the role of the ‘meta-structures’ of interorganisational co-ordination (Alexander 1995: 52) or, more generally, of ‘metagovernance’, i.e., the governance of governance. This involves the organisation of the conditions for governance in its broadest sense. Thus, corresponding to the three basic modes of governance (or co-ordination) distinguished above, we can distinguish three basic modes of metagovernance and one umbrella mode.
First, there is ‘meta-exchange’. This involves the reflexive redesign of individual markets (e.g., for land, labour, money, commodities, knowledge – or appropriate parts or subdivisions thereof) and/or the reflexive reordering of relations among two or more markets by modifying their operation and articulation.
Second, there is ‘meta-organisation’. This involves the reflexive redesign of organisations, the creation of intermediating organisations, the reordering of inter-organisational relations, and the management of organisational ecologies (i.e., the organisation of the conditions of organisational evolution in conditions where many organisations co-exist, compete, co-operate, and co-evolve).
Third, there is ‘meta-heterarchy’. This involves the organisation of the conditions of self-organisation by redefining the framework for heterarchy or reflexive self-organisation.
Fourth, and finally, there is ‘metagovernance’. This involves re-articulating and ‘collibrating’ the different modes of governance. The key issues for those involved in metagovernance are ‘(a) how to cope with other actors’ self-referentiality; and (b) how to cope with their own self-referentiality’ (Dunsire 1996: 320). Metagovernance involves managing the complexity, plurality, and tangled hierarchies found in prevailing modes of co-ordination. It is the organisation of the conditions for governance and involves the judicious mixing of market, hierarchy, and networks to achieve the best possible outcomes from the viewpoint of those engaged in metagovernance.
Thus metagovernance does not eliminate other modes of co-ordination. Markets, hierarchies, and heterarchies still exist; but they operate in a context of ‘negotiated decision-making’." (http://www.cddc.vt.edu/host/lnc/papers/JessopGovernance.htm)
Paul B. Hartzog on Panarchy
Paul B. Hartzog at http://panarchy.com/Members/PaulBHartzog/Writings/Features
"Governance in Panarchy is characterized by the primacy of relational behaviors among governance organizations. Some of these organizations may be traditional nation-states, at least for a while. It is likely that nation-states will be replaced by numerous other governance organizations that demonstrate a better "fit" with their constituents' needs than do today's national governments. Numerous political scholars (Rosenau, et al) have noted trends towards micro- and macro- governance. In addition, governance is becoming increasingly transnational. Through these trends, the distinction between "governmental" and "non-governmental" organizations, particularly in international politics, will become increasingly less apparent, and probably will ultimately be reduced to the possession of military might on the part of "governments". As this function, too, becomes internationalized into a global peace force, even that distinction will fade." (http://panarchy.com/Members/PaulBHartzog/Writings/Features)
"What is governance without government? How does it work? Government is enforcement by coercion backed up by force or the threat of force. Governance involves voluntary compliance by the governed because of shared norms and values. Governance without government only becomes possible in an Information Age, because it relies entirely on accurate information and transparency. The key feature of the following examples is that there is no higher authority enforcing compliance. Rather, the benefits of participation themselves enforce compliance." (http://panarchy.com/Members/PaulBHartzog/Writings/Governance)
European Internet Self-Regulation Proposal
A concrete example of a proposed peer governance scheme for the European internet:
The proposal comes from a coalition of internet players and advocacy groups who have come together to promote, and practice, peer governance of the internet in Europe, and who aim to convince the EU of the value of this approach:
“Flexible, decentralized, evolving, the network is very similar to the internet in its way of functioning.
- A decentralized network
In this peer network, all organizations will be associated at the same level, without any hierarchy between them.
- Flexibility, a core concept
This network will operate in a very flexible way. For some members, the network will essentially be a resource centre, allowing them to gather information and to be in touch with other members. For others, it will facilitate cooperation at a European level on subjects of common interest, feeding the Commission with proposals. The density and configuration of the network will vary according to the demands of its members and to the issues at hand. Working groups will be created with the interested members and will publish recommendations accordingly.
The network is built as an open structure, as regards new members willing to join, or non members it wants to work with. Other organizations in other European countries are welcome to participate in this new process. The network also aims at building a structured dialogue with other international players on subjects of common interest.
- A progressive construction by its members
The launch of the network in December 2003 is just a beginning. The network will be built progressively under the impulse of its members (see next page). The first meeting of the European internet co-regulation network will take place in March 2004 in Paris. It will be the opportunity for the members to set the network’s priorities for the coming months. The launch of the first working groups will take place in June 2004. This website is an immediate illustration of the common will of its participants. It already presents the network, its projects and its members, and proposes links to their websites. It will later include more functionalities, like a collaborative work platform." (http://network.foruminternet.org/article.php3?id_article=20)
Co-regulation (peer governance) and the state, cited from a co-regulation proposal:
“Through this cooperation process, the players have the opportunity to reach a point of consensus on each subject, facilitating respect for the adopted rules. It also allows the implementation, alongside state regulation (rules, laws, international agreements), of new means coming from self-regulation, like best practices, technical means, and efficient networks for sharing information. This process does not aim at discrediting State intervention. States remain the preferred players, as they are the only entities able to decide and enforce public rules.” (http://network.foruminternet.org/article.php3?id_article=20)
- Linux - Governance
- Fetchmail - Governance
- GNU Compiler Collection - Governance
- Firefox - Governance
- Debian - Governance
- Ubuntu - Governance
Open Source Software projects are not self-governing
"Being self-organizing and having central control are not completely incompatible--sort of. Imagine a group of random castaways marooned on a desert island. Initially there is no hierarchy to this group. But the group may choose to elect a leader or committee to guide them. They would choose as leaders the people who seem most able to help them survive. The castaways would be self-organizing, yet have established some central control. This is analogous to an open-source project allowing one member to serve as project leader, because it is in everyone's best interest.
However, the argument only goes so far. Over time, leaders get stuck on certain ways of doing things and become accustomed to their power. They are resistant to change. The group's ability to organize itself diminishes, even though it began in an egalitarian way. Similarly, open-source projects are not very self-organizing two years after they start. From a newcomer's point of view, things are not as democratic as they once were.
Of course, as Raymond points out, unhappy newcomers are free to fork the project if they don't like what the leaders are doing. However, this only proves the project is no longer self-organizing--it is now two projects." (http://www.softpanorama.org/OSS/webliography.shtml)
- The Economist on the place of hierarchy in open source projects, at http://www.economist.com/business/displaystory.cfm?story_id=5624944
- Writings by Paul Hartzog of panarchy.com on the related concept of panarchy
Key Books to Read
- Cyberchiefs: Autonomy and Authority in Online Tribes. Mathieu O’Neil. Macmillan/Pluto Press, 2009. Excellent monograph.
- Panarchy: Understanding Transformations in Human and Natural Systems. L. H. Gunderson and C. S. Holling. Island Press, 2001.
- The Future of the Capitalist State. Bob Jessop. 2001.
"offers a series of key conceptual distinctions - between types of capitalist state and types of state in capitalist society and discusses the process of de-territorialisation and its impact on governance and meta-governance."