The (dis)illusions of a rebel: A reappraisal of the General Public License through techno-organizational analysis

Prodromos Tsiavos

Department of Information Systems

London School of Economics and Political Science

Houghton Street, London WC2A 2AE, United Kingdom

Shifting from the regulation of Risk to the Risks produced by regulation, this study explores the emergence and operation of regulatory ecologies as an organic approach to the issue of Risk and regulation. Theories of counterproductive regulation have identified the growing problem of regulations that produce systemic side-effects and fail to deliver their pre-set goals, especially when Information Technology is the focus or the tool of the regulatory effort. Technical Measures of Protection and highly sophisticated copyright or outsourcing contracts have been employed to manage Risks stemming from such counterproductive regulatory phenomena. However, such interventions have exacerbated rather than eased the problem of regulatory side-effects. The objective of this paper is to investigate a model of regulation-building that adopts an alternative stance towards the phenomenon of counterproductive regulation. After a brief historical analysis of software development, alongside the respective organisational and legal developments, that aims at exploring evolving Risk and Information Systems perspectives, we question the different attitudes that an organisational model such as that of Free/Open Source development entails. A focused narrative of the events that led to the creation of the first Free/Open Source license and its organisational implications supports our working hypothesis that, because of its fundamentally different organisational arrangements, Free/Open Source development represents an alternative approach to Risk. Its crux lies in the proliferation rather than the delimitation of contingencies. This hypothesis is further explored in the Gnutella case study. The aim is to approach the Free/Open Source organisation behind the Gnutella protocol as an alternative regulatory paradigm. Gnutella, being a utilitarian artefact with regulatory properties, is an ideal candidate for investigating changing attitudes towards risk both in information systems development and in regulatory theory. What happens, in terms of Risk attitudes, when the contractual and organisational structure of a regulation/Information Systems building organisation is fundamentally different from the dominant model, as it is in the case of Open Source organisations? The paper concludes with an enumeration of the limitations of such an alternative regulatory model and calls for further research in both the law and the information systems development disciplines.

1. Framing the problem: from regulating risks to risky regulations

Regulating Risks can be a risky business. This is by no means a new observation. Sunstein’s {Sunstein, 1990 #30} example of pollution regulation is an illustrative one: the introduction of regulations that make the installation of emission control equipment on new motor vehicles mandatory can raise the cost of new vehicles and lead car owners to retain their existing old vehicles, thus increasing instead of reducing air-pollution levels. The case of oil-spill regulation exhibits a similar pattern: cleaning oil spills often involves the use of methods and substances that may themselves be dangerous for certain ecosystems, while the possibility of further pollution dispersion as a result of the cleaning efforts cannot be excluded {Katz, 1994 #31}. It seems that attempting to intervene in any complex system entails a multiplication rather than a mitigation of Risks. Grabosky {Grabosky, 1995 #32}, in an attempt to illustrate the typology of risks emanating from a variety of regulatory interventions, presents an array of regulatory examples ranging from asbestos removal {Warren, 1993 #33}, disposal of hazardous waste {Block, 1980 #34} and cross-border pollution {Andrews, 1993 #35} to liability laws that stifle business innovation {Sigler, 1998 #36} or tax regulation that creates parallel markets for tax-avoidance enterprises {Hutter, 1993 #37}.

Building regulation is notoriously difficult. Implementing it is even harder {McBarnet, 1992 #38}. But what happens when the unintended consequences become the norm, when regulation produces uniformly patterned behaviour that is not the one desired by the creators of the regulation? We could put the blame on politicians who are more interested in short-term results and risk perceptions than in long-term solutions and the addressing of actual risks {Leone, 1986 #39}; we can talk of “bad science” or even “engineering flaws in the design and implementation of regulatory activities” {Grabosky, 1995 #32}. However, the problem of “counter-productive regulation” {Grabosky, 1995 #32} remains, and it is here to stay for as long as we approach it superficially, refusing to challenge its underlying premises {Baldwin, 1998 #40}. A closer look at the evolution of regulation, both as theory and as practice, in conjunction with the trajectories of various technologies is essential to gain some insight into the problem of Risk.

Regulation theory has long attempted to address issues of environmental, health or even financial risk {Power, 1997 #41}. As theories and practices of regulation reached a level of maturity in the mid-1990s {Baldwin, 1998 #40}, questions related to the risks created by regulatory intervention started emerging {Grabosky, 1995 #32}. The issue was not just how regulation could contribute to the mitigation or management of the risks of complex systems like the environment or financial markets, but also how risks originating from the operation of regulations themselves could be handled. This development coincided with the “emancipation” of regulatory theory from the realms of market regulation and competition law. Regulation “vocabulary” was now to be used in areas as diverse as family law, corporate governance {Teubner, 2000 #42} or, most importantly for our study, internet regulation {Murray, 2002 #43}, {Biegel, 2001 #44}, {Lessig, 1998 #45}.

Research concerned with the risks related to the deployment and enforcement of regulation has been particularly intense in areas where information technology has had a profound impact, such as the Internet or digital networks of any kind. The works of Lessig {Lessig, 1998 #45}, {Lessig, 2000 #46}, {Lessig, 2001 #47}, Biegel {Biegel, 2001 #44}, Chengalur-Smith {Chengalur-Smith, 2003 #10} or Benkler {Benkler, 2001 #48}, {Benkler, 2002 #49} are excellent examples of a stream of research that sought to address the adverse effects that regulatory interventions can have on technological innovation or on the balancing of rights within techno-legal ecosystems {Tsiavos, 2003 #3}, {Hosein, 2003 #50}. Although Risk does not appear as a technical term in the literature concerning the regulation of (or through) digital technologies, the themes of Risk and blame have a constant presence. To quote Baldwin {Baldwin, 1998 #40}, “as ‘risk’ becomes increasingly politicized and construed as ‘danger’ rather than its original technical meaning of statistical or mathematical probability”, it becomes a central preoccupation of contemporary regulatory studies {Power, 1997 #41}.

2. The importance of being ecological

In the case of the unanticipated effects that regulation concerning information technology often has, the vocabulary used to describe the phenomenon is illustrative of the reasons why a Risk-driven approach is suggested {Grabosky, 1995 #32} as a possible way of dealing with the arising issues. Boyle {Boyle, 1997 #51; Boyle, 2003 #52} refers to “Environmentalism on the Net” and the need to develop regulatory policies that could sustain the preservation of the “ecology” of the commons in a world dominated by technical measures of intellectual property protection and over-stretched copyright regulation. A similar argument is made by Lawrence Lessig in “The Future of Ideas” {Lessig, 2001 #47}. Yochai Benkler {Benkler, 2001 #48; Benkler, 2002 #49} talks of “(t)he battle over the institutional ecosystem in the digital environment” to refer to the evolutionary process by which different production modes (such as the “peer production” that characterises open source development and the peer-to-peer dissemination of content) instigate the creation of different regulatory creatures. Hosein, Tsiavos and Whitley {Hosein, 2003 #50} have referred to the interaction between technology and regulation as a technological ecology of regulation, or T-ecology of regulation. The discourse on risks stemming from regulation is indicative of the image that regulation has of itself and to a great extent resembles the systems that Risk regulation is concerned with: in a rather self-referential fashion, regulation views itself as a complex system that, when interacting with technology, produces side-effects harmful to itself and to all surrounding systems. When referring to his concept of “counterproductive” regulation, Grabosky makes an important remark concerning the systemic nature of society, the importance of an ecological approach to regulatory intervention and the role of Risk in the way late-modern {Kallinikos, 2001 #53} society operates:

“In addition to inadequate understanding of basic causal processes, there is often among policy entrepreneurs an inadequate appreciation of the systemic nature of modern society. Interventions can trigger other causal processes. The functional disruption of related systems is familiar to students of ecology. Similar principles apply in regulatory life. Regulatory policies, like public policies generally, have wider implications (…) Given the density of contemporary social space, efforts to influence one variable are likely to influence others, directly or indirectly. Engineers of a given regulatory domain are often insufficiently aware of the wider social ecology – the complex, interdependent systems of social life in which the target behaviour resides.” {Grabosky, 1995 #32}

The fact that regulatory interventions are often the source of more problems than the ones they purport to solve indicates the need for a radical reconsideration of the way in which regulation is both conceived and built. The model by which regulation is constructed and enacted in information-technology-intensive settings is under increasing strain: instead of achieving regulatory integration and control of contingencies, it causes fragmentation of the regulatory modalities and a proliferation of unanticipated contingencies.

3. Objectives of the study

This paper aims at providing an alternative perspective on how regulation should be treated in relation to Risks stemming from its very own operation. It argues that Risk, understood as the occurrence of unintended and adverse consequences of regulation, cannot be handled by introducing tougher enforcement mechanisms or by reinforcing the rights of the existing stakeholders. The mismatch between the intentionality embedded in regulatory structures and the results of their enforcement is only the symptom of a wider phenomenon that relates to the way in which regulation is produced. Therefore, research that focuses solely on the content and impact of regulation misses the point. The focus should also be on the way regulation is produced and the degree to which different actors have the capacity to inscribe their interests into different regulatory instruments.

Such a perspective has significant implications for the way regulation is conceptualised. We suggest a shift from a mono-dimensional, state-driven model to a multi-source regulation model. In the mainstream regulatory model the primary goal is the control of contingencies and the enforcement of a particular set of possibilities that are considered acceptable by a central authority. On the contrary, the model suggested by this paper advocates a proliferation of contingencies by allowing more actors to have input into the regulatory production process. When control is concentrated in a single point, as happens in traditional legal rule-making, regulation becomes practically difficult to enforce and susceptible to abuses of power. When control is distributed, by contrast, we witness the emergence of a regulatory landscape that, although a first reading would suggest it to be chaotic, is in fact a solution that serves the interests of the involved actors more adequately.

The implications of such a viewpoint for problems of Risk and regulation are profound, as it calls for an opening of regulation to all possible inputs and for a proliferation instead of a delimitation of contingencies.

This paper constitutes an exploratory study of such an alternative regulatory model. By following the evolution of the development process of a technical piece of regulation, namely the open source development of the Gnutella protocol, we investigate the operation of a different model for dealing with risks stemming from regulation. The analysis of the Gnutella case is preceded by a deconstruction of the concept of Free/Open Source licensing schemes. We consider such an approach essential for the realisation of the goals of our study. The Gnutella development is based on the General Public License, the archetypical form of Free/Open Source licenses. It is necessary to understand where this license fits into the history of the Systems Development discipline and the respective legal developments in order to appreciate the paradigm that it represents. The epistemological dimension of this study is therefore crucial for the understanding of alternative regulatory models. Thus, the following section is devoted to an analysis of the research design and methodology of the paper.

4. Research Design and Data Collection

In this section we present the reasons behind the choice of the development of the Gnutella protocol as our primary case study and the relation between that particular case study and the broader Free/Open Source phenomenon. Finally, we present the data collection and analysis techniques that have been employed to draw this first set of conclusions.

4.1 Methodological impetus: Why the Gnutella protocol?

We set out to investigate an alternative regulatory model for risk and regulation and, for that purpose, chose to study a protocol. The choice of a protocol stems from the fact that, although it is a set of rules, it is not regulation in its traditional form of “sustained and focused control exercised by a public agency over activities that are valued by a community” {Selznick, 1985 #54}. It differs from classic regulation both in the sense that it is not state-derived and in that the rules it contains are primarily addressed to non-humans: a protocol is a standard to which different technologies should adhere. At the same time, a standard is exceptionally important for humans, as the technologies that comply with standards are often omnipresent in various facets of human activity. In any case, it falls under the broader definition of regulation as a form of “social control or influence – where all mechanisms affecting behaviour are included, whether these be state-derived or from other sources (e.g. markets). (…) Within this usage of the term ‘regulation’ there is no requirement that the regulatory effects of a mechanism are deliberate or designed rather than merely incidental to other objectives” (Baldwin et al.).
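To make the claim that a protocol is a set of rules addressed primarily to non-humans more concrete, consider the descriptor header defined in version 0.4 of the Gnutella protocol specification: a 16-byte descriptor ID, a one-byte payload type, a time-to-live (TTL) counter, a hop counter, and a four-byte little-endian payload length. The Python sketch below is illustrative rather than a reference implementation; the header layout follows the published v0.4 specification, while the function names are ours.

import struct

# Payload descriptor codes defined by the Gnutella v0.4 specification.
PAYLOAD_TYPES = {0x00: "Ping", 0x01: "Pong", 0x40: "Push",
                 0x80: "Query", 0x81: "QueryHit"}

# 16-byte descriptor ID, payload type, TTL, hops, payload length (little-endian).
HEADER_FORMAT = "<16sBBBI"
HEADER_SIZE = struct.calcsize(HEADER_FORMAT)  # 23 bytes

def parse_header(raw: bytes) -> dict:
    """Decode the 23-byte descriptor header that precedes every Gnutella message."""
    guid, ptype, ttl, hops, length = struct.unpack(HEADER_FORMAT, raw[:HEADER_SIZE])
    return {"guid": guid, "type": PAYLOAD_TYPES.get(ptype, "Unknown"),
            "ttl": ttl, "hops": hops, "payload_length": length}

def may_forward(header: dict) -> bool:
    # The protocol's central behavioural rule: a descriptor whose TTL has
    # run out must not be forwarded, which bounds how far any message
    # may flood through the network.
    return header["ttl"] > 1

def forward(header: dict) -> dict:
    # On each hop, a conforming servent decrements TTL and increments hops.
    return {**header, "ttl": header["ttl"] - 1, "hops": header["hops"] + 1}

Rules such as the TTL decrement are addressed to servents, not to their users, yet they regulate behaviour in the wider sense of the definition quoted above: they bound how far any query may travel, regardless of the intentions of the human issuing it.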

The Gnutella protocol was chosen as a particularly interesting case, as it sets the rules for the creation of a peer-to-peer technology. Peer-to-peer technologies, especially in the late 1990s, have been considered almost synonymous with anarchy, identified with anti-regulation, or characterised as disruptive technologies {Oram, 2001 #59}. This has been the case with technologies such as Napster, Audiogalaxy, Aimster and Kazaa, where the conflict between legal instruments and file-sharing technologies has led to an escalation from both sides, described in the literature as the “cockroach phenomenon” {Tsiavos, 2002 #28}: for every generation of stricter laws and further enforcement tools, more advanced and diverse file-sharing technologies have been developed. The situation has been further complicated by the fact that regulatory interventions, particularly those of a legal nature such as the DMCA, the European Directive on Copyright, the Super-DMCA laws and the draft Enforcement Directives, have all been viewed with increasing scepticism by legal scholars and practitioners, artists and computer scientists. The introduction of further restrictions that could only operate with the assistance of Information Technologies has led to a series of clashes with other legal systems, such as privacy and data protection laws, free speech constitutional clauses or competition law provisions. Moreover, it has set substantial barriers to creativity and innovation by rendering experimentation with generic sets of technologies illegal {Anderson, 2003 #75} and has violated what has often been described as the “right to tinker”. In other words, all the latest versions of copyright laws were a classic example of counter-productive regulation, and peer-to-peer technologies the catalyst that caused all these disruptions.

For all these reasons, the study of the regulatory instrument, namely the Gnutella protocol, that a particular type of peer-to-peer technology, the Gnutella network, has chosen in order to promote innovation makes an interesting theme to investigate. The research question that we set out to investigate relates to the nature of this new form of regulation and the ways in which it deals, explicitly or implicitly, with the issue of Risk.

4.2 Historicity and Agnosticism

Most accounts of regulation, with the notable exception of feminist and critical legal studies, tend to approach it as a self-existing entity whose creation process is neither questioned nor really investigated. This study, on the contrary, approaches the regulations contained in the Gnutella protocol as the product of an organisational setting that keeps constantly evolving, and in that sense Bruno Latour’s work has been of particular importance for us. In his Clarendon Lectures at Oxford University (Latour 2002), Latour, drawing upon ideas that have always been present in his writings, suggests that current thinking about organisations and organizing starts with a problem found in almost all social science research of the past hundred years, namely that it accepts all too readily that social structures exist and that the task of enquiry is to explore the effects of these social groups on particular research questions (Cordella and Shaikh 2003). In contrast to this view, he argues, the actor–network approach does not presuppose that “the social” exists, but rather seeks to understand, in minute detail if necessary, exactly how the thing we call “the social” is formed and maintained.

This paper explores this assumption, which is also alluded to by Weick (Weick 1998) in his discussion of improvisation, that organization is an output rather than an input. This study does not assume an organized project, but rather begins with low-level data collection about a particular OSS project. Indeed, the origins of this paper lie in the problems the researchers had in trying to identify the particular organisational starting point of a project for a related piece of research (Tsiavos et al. 2002).

4.3 Methodology and Data Collection Techniques

F/OS has been described as a social phenomenon, a business model or a movement. Irrespective of its characterisation, it has two main elements at its crux: a copyright license with particular features and a development process that is based on that particular licensing scheme. There is a variety of F/OS licensing schemes, but we chose to concentrate on the General Public License, since it is the one on which the development of the Gnutella protocol is based. Because of our focus on the process of the creation of regulation, we decided to deconstruct the mechanics of the Open Source licensing scheme on which the development of the Gnutella protocol rests.
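To illustrate how this regulatory instrument is inscribed directly into the development artefact itself, the GPL’s own appendix (“How to Apply These Terms to Your New Programs”) instructs developers to attach a notice of the following form to the start of each source file. The rendering below, as comments in a Python source file, uses the version 2 wording that was current at the time of the Gnutella development; the program name, year and author are placeholders:

# <one line to give the program's name and a brief idea of what it does.>
# Copyright (C) <year> <name of author>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.

Every file carrying this notice binds downstream developers: under section 2(b) of GPL version 2, distributed works derived from it must themselves be licensed as a whole under the GPL, which is precisely the mechanism through which the license shapes the development process examined in the remainder of this study.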