Felix Stalder on Thu, 6 Jun 2002 09:16:00 +0200 (CEST)
<nettime> Open Source Intelligence
Open Source Intelligence
Felix Stalder and Jesse Hirsh
First Monday, volume 7, number 6 (June 2002)
URL: http://firstmonday.org/issues/issue7_6/stalder/index.html

Abstract: Over the last decade, the Open Source movement has established a new collaborative approach, uniquely adapted to the Internet, to developing high-quality informational products. Initially, its exclusive application was the development of software (GNU/Linux and Apache are among the most prominent projects), but increasingly we can observe this collaborative approach being applied to areas beyond the coding of software. One such area is the collaborative gathering and analysis of information, a practice we term "Open Source Intelligence". In this article, we use three case studies - the nettime mailing list, the Wikipedia project and the NoLogo Web site - to show some of the breadth of contexts and to analyze the variety of socio-technical approaches that make up this emerging phenomenon.

Contents
= Open Source Collaborative Principles
= A Few Examples of Open Source Intelligence
= The Future of OS-INT

------------------------------------------------------------------------

In the world of secret services, Open Source Intelligence (OS-INT) means useful information gleaned from public sources, such as scientific articles, newspapers, phone books and price lists. We use the term differently. In the following, OS-INT means the application of the collaborative principles developed by the Open Source Software movement [1] to the gathering and analysis of information. These principles include: peer review, reputation- rather than sanctions-based authority, the free sharing of products, and flexible levels of involvement and responsibility.

As with much on the Internet in general, including the Open Source Software movement, practice preceded theory in the case of OS-INT. Many of the Internet's core technologies were created to facilitate free and easy information sharing among peers. This always included two-way and multicast communication, so that information could not only be distributed efficiently but also evaluated collaboratively. E-mail lists - the simplest of all OS-INT platforms - have been around since the mid-1970s [2]. In the 1980s, bulletin boards, FidoNet and Usenet provided user-driven OS-INT platforms with more sophisticated and specialized functionality. In the 1990s, many of these platforms were overshadowed by the emergence of the World Wide Web. Tim Berners-Lee's foundational work on Web standards was guided by a vision of peer collaboration among scientists distributed across the globe [3].

While OS-INT's precedents reach back through the history of the Internet - and, if one were to include peer-reviewed academic publishing, much beyond that - a series of recent events warrants that it be considered a distinct phenomenon that is slowly finding its own identity, maturing from a practice "in itself" to one "for itself." The culture of the Internet as a whole has been changing. The spirit of free sharing that characterized the early days is increasingly being challenged by the commodity-oriented control structures which have traditionally dominated the content industries. At this point, instead of being the norm, the free sharing of information is becoming the exception, in part because the regulatory landscape is changing.
The extension of copyrights and the increasingly harsh prosecution of violations are attempts to criminalize early Net culture in order to shore up the commodity model, which is encountering serious difficulties in the digital environment [4].

In other areas, years of experience with the rise and fall of "proto-OS-INT" forums have accumulated into a kind of connective social-learning process. Uncounted e-mail lists went through boom and bust cycles; large numbers of newsgroups flourished and then fell apart under the pressure of anti-social behavior. Spam became a problem. Endless discussions raged about censorship imposed by forum moderators, controversial debates erupted about the ownership of forums (does it lie with the users or the providers?), and difficulties were encountered when attempting to reach any binding consensus in fluctuating, loosely integrated groups. The condensed outcome of these experiences is a realization that a sustainable, open and collaborative practice is difficult to achieve and that new, specialized approaches must be developed in order to sustain the fine balance between openness and a healthy signal/noise ratio. In other words, self-organization needs some help.

The emerging field of OS-INT is made up of numerous, independent projects. Each of them - such as the nettime e-mail list, Wikipedia and the NoLogo.org Web site, which will be discussed in the following - has a distinct history that has led it to develop different technical and social strategies in order to realize some or all of the open source collaborative principles.

Open Source Collaborative Principles

One of the early precedents of open source intelligence is the process of academic peer review. As academia established a long time ago, in the absence of fixed and absolute authorities, knowledge has to be established through the tentative process of consensus building. At the core of this process is peer review, the practice of peers evaluating each other's work, rather than relying on external judges. The specifics of the reviewing process vary by discipline, but the basic principle is universal. Consensus cannot be imposed; it has to be reached. Dissenting voices cannot be silenced, except through the arduous process of social stigmatization.

Of course, not all peers are really equal, and not all voices carry the same weight. The opinions of people to whom their peers have assigned high reputation carry more weight. Since reputation must be accumulated over time, these authoritative voices tend to come from established members of the group. This gives the practice of peer review an inherently conservative tendency, particularly when access to the peer group is strictly policed, as it is in academia, where diplomas and appointments are necessary to enter the elite circle. The point is that the authority held by some members of the group - which can, at times, distort the consensus-building process - is attributed to them by the group; therefore it cannot (easily) be maintained against the will of the other group members. If we follow Max Weber's definition that power is the ability to "impose one's will upon the behavior of other persons" [5], this significantly limits the degree to which established members can wield power. Eric Raymond had the same limitations in mind when he noted that open source projects are often run by "benevolent dictators" [6].
They are not benevolent because they are somehow better people, but because their leadership is based almost exclusively on their ability to convince others to follow. Thus the means of coercion are very limited. A dictator who is no longer benevolent, i.e. who alienates his or her followers, loses the ability to dictate.

The ability to coerce is limited not only because authority is reputation-based, but also because the products built through the collaborative process are available to all members of the group. Resources do not accumulate with the elite. Abandoning the leader and developing the project in a different direction - known as "forking" in the Open Source Software movement - is therefore relatively easy, and it remains a constant threat to the established players. The free sharing of the products of the collaboration among all collaborators - in both their intermediary and final forms - ensures that there are no "monopolies of knowledge" that would increase the possibility of coercion.

The free sharing of information has nothing to do with altruism or a specific anti-authoritarian social vision. It is motivated by the fact that, in a complex collaborative process, it is effectively impossible to differentiate between the "raw material" that goes into a creative process and the "product" that comes out. Even the greatest innovators stand on the shoulders of giants. All new creations are built on previous creations and provide inspiration for future ones. The ability to freely use and refine those previous creations increases the possibilities for future creativity. Lawrence Lessig calls this an "innovation commons," and cites its existence as one of the major reasons why the Internet as a whole developed so rapidly and innovatively [7].

An often overlooked characteristic of open source collaboration is the flexible degree of involvement in, and responsibility for, the process that can be accommodated. The hurdle to participating in a project is extremely low. Valuable contributions can be as small as a single, one-time effort - a bug report, a penetrating comment in a discussion. Equally important, though, is the fact that contributions are not limited to just that. Many projects also have dedicated, full-time, often paid contributors who maintain core aspects of the system - such as maintainers of the kernel, or editors of a slash site. Between these two extremes - one-time contribution and full-time dedication - all degrees of involvement are possible and useful, and it is easy to slide up or down the scale of commitment. Consequently, dedicated people assume responsibility when they invest time in the project, and lose it when they cease to be fully immersed. Hierarchies are fluid and merit-based, however the peers happen to define merit. This also makes it difficult for established members to hold onto their positions once they stop making valuable contributions. In volunteer organizations this is often a major problem, as early contributors sometimes try to base their influence on old contributions, rather than letting the organizations change and develop.

None of these principles was "invented" by the Open Source Software movement. However, they were updated to work on the Internet and fused into a coherent whole in which each principle reinforces the others in a positive manner.
The conservative tendencies of peer review are counter-balanced by relatively open access to the peer group - a major difference from academia, for instance. Most importantly, the practice of Open Source has proved that these principles are a sound basis for the development of high-end content that can compete with the products of commodity-oriented control structures [8].

A Few Examples of Open Source Intelligence

< nettime >

Nettime is an e-mail list founded in the summer of 1995 by a group of cultural producers and media activists during a meeting at the Venice Biennale. As its homepage states, the list focuses on "networked cultures, politics, and tactics" [9]. Its actual content is almost entirely driven by members' submissions. It is a good example of true many-to-many communication.

Nettime calls its own practice "collaborative text filtering." The filter is the list itself - or, to be more precise, the cognitive capacities of the people on the list. The list consists of peers with equal ability - though not necessarily equal interest - to read and write. The practice of peer review takes place on the list and in real time. The list serves as an early warning system for the community, a discussion board for forwarded texts as well as a sizeable amount of original writing, and, equally importantly, an alternative media channel. This last function became most prominent during the war against Yugoslavia, when many of its members living in the region published their experiences of being on the receiving end of not-so-smart, not-so-precise bombs.

By March 2002, the number of subscribers had grown to 2,500. The number of people who read nettime posts, however, is higher than the number of subscribers. Nettime maintains a public Web-based archive that is viewed extensively, and some of the subscriber addresses are lists themselves. Also, as a high-reputation list, many of its posts get forwarded by individual subscribers to more specialized lists (another kind of collaborative text filtering), in addition to being published in print and other electronic media. The majority of subscribers come from Western Europe and North America, but the number of members from other regions is quite sizeable [10]. Over the years, autonomous lists have been spun off in other languages: Dutch, Romanian, Spanish/Portuguese, French and Mandarin. A Japanese list is currently in preparation.

Despite its growth and diversity, nettime has retained a high degree of cultural coherence and developed an original form of technology-savvy, leftist media critique, stressing the importance of the cultural and social aspects of technology, as well as the importance of art, experimentation and hands-on involvement. This flexible coherence has been strengthened through a series of real-life projects, such as paper publications including a full-scale anthology [11], and a string of conferences and "nettime-meetings" in Europe during the 1990s.

Since its inception, the list has been running on majordomo, a then-popular open source e-mail list package, with assorted hypermail- and mhonarc-based Web archives. Technically, the list has undergone little development. Initially, for almost three years, the list was open and unmoderated, reflecting the close-knit relationships of its small circle of subscribers and the still "clubby" atmosphere of netculture. However, after spam and flame wars became rampant, and the deteriorating signal/noise ratio began to threaten the list's viability, moderation was introduced.
In majordomo, moderation means that all posts go into a queue, and the moderators - called "list-owners," an unfortunate terminology - decide which posts get put through to the list and which are deleted. This technological set-up makes the moderation process opaque and centralized. The many list members cannot see which posts have not been approved by the few moderators. Understandably, in the case of nettime, this has led to a great deal of discussion about censorship and "power-grabbing" moderators. The discussion was particularly acrimonious in the case of traffic-heavy ASCII-art and spam-art, which can be seen either as creative experimentation with the medium or as destructive flooding of a discursive space. Deleting commercial spam, however, was universally favored.

In order to make the process of moderation more transparent, an additional list, nettime-bold, was introduced in February 2000. This channel carries all posts that go into the queue, prior to the moderators' evaluation. Because this list is also archived on the Web, members can see for themselves the difference between what was sent to the list and what was approved by the moderators. In addition to increasing the list's transparency, access to the entire feed of posts gave members the option of implementing parallel but alternative moderation criteria. In practice, this has not yet occurred. Nevertheless, giving members this option has transformed the status of the moderators from exclusive decision makers into "trusted filters." It has also created the possibility of forking (i.e. the list splitting into two differently moderated forums).

Nettime is entirely run by volunteers. Time and resources are donated. The products of nettime are freely available to members and non-members alike. Even the paper publications are available in their entirety in the nettime archives [12]. Reflecting its history and the diversity of its contributors and submissions, nettime has maintained the rule that "you own your own words." Authors decide how to handle the redistribution of their own texts, though, to be frank, it is hard to retain control over a text's after-life once it has been distributed to 2,500 addresses and archived on the Web.

Despite its many advantages - ease of use, low technical requirements for participating, direct delivery of messages into members' inboxes - the format of the e-mail list is clearly limited when it comes to collaborative knowledge creation. Moderation is essential once a list reaches a certain diversity and recognition, but the options for how to effect this moderation are highly constrained. Nettime's solution - establishing an additional, unmoderated channel - has not essentially changed the fact that there is a very strict hierarchy between moderators and subscribers. While involvement is flexible (ranging from lurkers to frequent contributors), responsibility is inflexibly restricted to the two fixed social roles enabled by the software (subscriber and moderator). The additional channel has also not changed the binary moderation options: approval or deletion. The social capacities built into e-mail list software remain relatively primitive, and so are the options for OS-INT projects using this platform.
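The moderation set-up described in this section can be reduced to a small sketch - purely illustrative Python, not majordomo's actual implementation - in which every submission enters a queue and is simultaneously mirrored to an unmoderated "bold" channel, while moderators are offered only the two options the software allows: approve or delete.

class ModeratedList:
    """Toy model of majordomo-style moderation with a parallel 'bold' channel."""

    def __init__(self, name):
        self.name = name
        self.queue = []         # posts awaiting a moderator's decision
        self.archive = []       # posts approved and sent to the list
        self.bold_archive = []  # every submission, visible before moderation

    def submit(self, author, text):
        post = {"author": author, "text": text}
        self.queue.append(post)
        self.bold_archive.append(post)  # the 'bold' list: full, unfiltered feed
        return post

    def moderate(self, post, approve):
        """The only two options the software offers: approve or delete."""
        self.queue.remove(post)
        if approve:
            self.archive.append(post)
        # A rejected post silently disappears from the main list,
        # but it remains visible in the bold archive.

# Usage
nettime = ModeratedList("nettime-l")
p1 = nettime.submit("a subscriber", "a penetrating comment")
p2 = nettime.submit("a spammer", "BUY NOW")
nettime.moderate(p1, approve=True)
nettime.moderate(p2, approve=False)
print(len(nettime.archive), len(nettime.bold_archive))  # 1 2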
< wikipedia.com >

Wikipedia is a spin-off of Nupedia. Nupedia - the name is a combination of GNU and encyclopedia - is a project to create an authoritative encyclopedia inspired, and morally supported, by Richard Stallman's GNU project [13]. However, apart from being published under an open license, Nupedia's structure is similar to a traditional editorial process. Experts write articles that are reviewed by a board of expert editors (with some public input via the "article in progress" section) before being finalized, approved, and published. Once published, the articles are finished. Given this extensive process, it is not surprising that the project has been developing at a glacial pace.

Wikipedia was started in early 2001 as an attempt to create something similar - a free encyclopedia that would ultimately be able to compete with the Encyclopaedia Britannica - but developed through a very different, much more open process. The two projects are related but independent: Nupedia links to articles on Wikipedia if it has no entry for a keyword, and some people contribute to both projects, but most don't.

The project's technological platform is called Wikiweb, named after the Hawaiian word wikiwiki, which means fast [14]. The original software was written in 1994 but was recently rewritten to better handle the rapidly growing size and volume of Wikipedia. The Wiki platform incorporates one of Berners-Lee's original concepts for the Web: to let people not only see the source code, but also freely edit the content of the pages they view. In the footer of most Wikipages is the option to "Edit this page," which gives the user access to a simple form for changing the displayed page's content. The changes become effective immediately, without being reviewed by a board or even by the original author. Each page also has a "history" function that allows users to review the changes and, if necessary, revert to an older version of the page.

In this system, writing and editing are collective and cumulative. A reader who sees a mistake or omission in an article can immediately correct it or add the missing information. Following the open source peer-review maxim, formulated by Eric Raymond as "given enough eyeballs, all bugs are shallow," this allows the project to grow not only in the number of articles, but also in the articles' depth, which should improve over time through the collective input of knowledgeable readers. Since the review and improvement process is public and ongoing, there is no difference between beta and release versions of the information (as there is in Nupedia). Texts continuously change. Peer-review becomes peer-editing, resulting in what Larry Sanger, one of the original project leaders, hailed as the "most promiscuous form of publishing."

At least as far as its growth is concerned, the project has been very successful. It passed 1,000 pages around February 12, 2001, and 10,000 articles around September 7, 2001. In its first year of existence, over 20,000 encyclopedia entries were created - a rate of over 1,500 articles per month. By the end of March 2002, the number of articles had grown to over 27,000. The quality of the articles is a different matter and difficult to judge in a general manner. Casual searching brings up some articles that are in very good shape and many that aren't, which is not surprising given that the project is still very young. Many of the articles function more as invitations for input than as useful reference sources. For the moment, many texts have an "undergraduate" feel to them, which may be appropriate, since the project has just finished its "first year." However, it remains to be seen if the project will ever graduate.
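To make the edit-and-revert mechanics described above concrete, here is a minimal sketch in Python - an illustrative toy, not the actual Wiki code base - of a page that anyone can change immediately, with a change history that lets any reader restore an earlier version.

from datetime import datetime, timezone

class WikiPage:
    """A toy wiki page: anyone may edit immediately, every version is kept."""

    def __init__(self, title, text=""):
        self.title = title
        # Full change history; the last entry is always the current version.
        self.history = [(datetime.now(timezone.utc), "initial", text)]

    @property
    def text(self):
        return self.history[-1][2]

    def edit(self, new_text, author="anonymous"):
        # The change takes effect at once - no editorial board, no approval queue.
        self.history.append((datetime.now(timezone.utc), author, new_text))

    def revert(self, steps_back=1):
        # Undo vandalism or mistakes by restoring an older version as a new edit.
        _, _, old_text = self.history[-(steps_back + 1)]
        self.history.append((datetime.now(timezone.utc), "revert", old_text))

# Usage: one reader improves an article, another reverts a destructive edit.
page = WikiPage("Open Source Intelligence", "OS-INT is gathering informations.")
page.edit("OS-INT is the collaborative gathering and analysis of information.",
          author="reader1")
page.edit("OS-INT is boring.", author="vandal")
page.revert()      # restore reader1's version
print(page.text)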
Both Nupedia and Wikipedia have been supported by Jimbo Wales, CEO of the San Diego-based search engine company Bomis, who has donated server space and bandwidth to the project. The code base was rewritten by a student at the University of Cologne, Germany, and for a bit more than one year Larry Sanger held a full-time position (via Bomis) as editor-in-chief of Nupedia and chief organizer of Wikipedia. In January 2002 the funding ran out and Sanger resigned; he now contributes as a volunteer. There are currently close to 1,200 registered users, but since it is possible to contribute anonymously, and quite a few people do, the actual number of contributors is most likely higher.

Wikipedia has not suffered from the resignation of its only paid contributor. It seems to have reached, at least for the moment, the critical mass necessary to remain vibrant. Since anyone can read and write, the paid editor did not have any special status. His contributions were primarily cognitive, because he had more time than anyone else to edit articles and to write the initial editing rules and FAQ files. His influence was entirely reputation-based. He could, and did, motivate people, but he could not force anyone to do anything against their will.

The products of this encyclopedia are freely available to anyone. The texts are published under the GNU Free Documentation License [15], which states that the texts can be copied and modified for any purpose, as long as the original source is credited and the resulting text is published under the same license. Not only are the individual texts available; the entire project - including its platform - can be downloaded as a single file for mirroring, viewing offline, or any other use. Effectively, not even the system administrator can control the project.

The scale of people's involvement in the project is highly flexible, ranging from the simple reader who corrects a minor mistake, to the author who maintains a lengthy entry, to the editor who continuously improves other people's entries. These roles depend entirely on each contributor's commitment, and are not pre-configured in the software. Everyone has the same editing capabilities.

So far, the project has suffered little from the kind of vandalism that one might expect given its open editing capabilities. There are several reasons for this. First, authors and contributors who have put effort into creating an entry have a vested interest in maintaining and improving the resource, and thanks to the "change history" function, individual pages can be restored relatively easily. The latest version of the platform adds a feature that alerts people, on request, whenever a specific page has been changed. Second, the project still has a "community" character, so there seems to be a shared feeling that it is a valuable resource and needs to be maintained properly. Finally, in the case of real differences over content, it is often easier to create a new entry than to fight over an existing one. This is one of the great advantages of having infinite space. So far, self-regulation works quite well.

It remains to be seen how long the current rate of growth can be sustained, and whether it really translates into an improvement in the quality of the individual encyclopedia entries. So far, the prospects look good, but there are very few examples of the long-term dynamics of such open projects.
Given that its stated competitor, the Encyclopaedia Britannica, has been publishing since 1768, long-term development is clearly essential to such a project.

< NoLogo.org >

NoLogo.org is perhaps the most prominent second-generation slash site. This makes it a good example of how the OS-INT experience, embodied in a specific code base, has reached a stage where it can be replicated across different contexts with relative ease. NoLogo.org is based on the current, stable release of Slashcode, an open source software platform released under the GPL and developed for and by the Slashdot community. Slashdot is the best-known and most obvious example of OS-INT, since it is one of the main news and discussion sites for the open source movement.

Of particular importance for OS-INT is the collaborative moderation process supported by the code. Users who contribute good stories, or good comments on stories, are rewarded with "karma," essentially a point system that enables people to build up their reputation. Once a user has accumulated a certain number of points, she can assume more responsibilities and is even trusted to moderate other people's comments. Points have a half-life, however: if a user stops contributing, the privileges expire. Each comment can be assigned points by several different moderators, and the final grade (from -1 to +5) is an average of all the moderators' judgments. A good contribution is one that receives high grades from multiple moderators. This creates a kind of double peer-review process: the first is the content of the discussion itself, in which people respond to one another; the second is the ranking of each contribution.

This approach to moderation elegantly addresses several problems that bedevil e-mail lists. First, the moderation process is collaborative; no individual moderator can impose his or her preferences. Second, moderation means ranking rather than deleting; even comments ranked -1 can still be read. Third, users set their reading preferences individually, rather than having a moderator set them for everyone: some might enjoy the strange worlds of -1 comments, whereas others might only want to read the select few that garnered +5 rankings. Finally, involvement is reputation- (i.e. karma-) based and flexible. Since moderation is collaborative, it is possible to hand out moderation privileges automatically, and individual moderators have very limited control over the system. As an additional layer of feedback, moderators who have accumulated even more points through consistently good work can "meta-moderate," i.e. rank the other moderators.

The social potential embodied in Slashcode was readily available when Naomi Klein's book No Logo: Taking Aim at the Brand Bullies, published in January 2000, became a sudden international best-seller. In the wake of the anti-globalization protests in Seattle in November 1999 and after, the book began to sell in the tens and later hundreds of thousands. Klein found herself caught in a clash of old and new media and facing a peculiar problem. A book is a highly hierarchical and centralized form of communication: there is a single author and a very large number of readers. It is centralized because readers form a relationship with the author while typically remaining isolated from one another. This imbalance of the broadcast model is usually not a problem, since readers lack efficient feedback channels. Today, however, many readers have e-mail, and they began to find Klein's e-mail address on the Web.
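As an aside, the Slashcode scoring model described above can be sketched in a few lines of Python. The point values, default score and thresholds below are illustrative assumptions, not Slashcode's actual numbers: each comment's grade is the average of several moderators' judgments, kept within the -1 to +5 range; readers filter by a threshold of their own choosing; and moderation privileges depend on accumulated karma, which can decay.

from statistics import mean

KARMA_TO_MODERATE = 5  # assumed threshold, not Slashcode's actual value

class Comment:
    def __init__(self, author, text):
        self.author = author
        self.text = text
        self.judgments = []  # one score per moderator

    @property
    def grade(self):
        if not self.judgments:
            return 1  # assumed default score for an unmoderated comment
        # The final grade is the average of all moderators' judgments,
        # kept within the -1 to +5 range.
        return max(-1, min(5, round(mean(self.judgments))))

class User:
    def __init__(self, name, karma=0):
        self.name = name
        self.karma = karma

    def can_moderate(self):
        return self.karma >= KARMA_TO_MODERATE

    def decay(self):
        """Karma has a half-life: privileges fade when contributions stop."""
        self.karma //= 2

def moderate(comment, moderator, score):
    # Moderation privileges are handed out automatically, based on karma.
    if moderator.can_moderate():
        comment.judgments.append(score)

def visible(comments, threshold):
    """Readers set their own threshold instead of relying on a single gatekeeper."""
    return [c for c in comments if c.grade >= threshold]

# Usage: two moderators rank a comment; a reader filters at +3 and above.
alice, bob = User("alice", karma=8), User("bob", karma=6)
c = Comment("carol", "a useful link with context")
moderate(c, alice, 5)
moderate(c, bob, 3)
print(c.grade)                          # 4
print(len(visible([c], threshold=3)))   # 1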
Klein started receiving e-mails en masse, asking for comments, advice, and information. There was no way she could take all these e-mails seriously and respond to them properly. The imbalance between the needs of the audience and the capacities of the author was just too great, particularly since Klein had no interest in styling herself as the leader or guru of the anti-globalization movement. (Of course, that didn't stop the mass media from doing so without her consent.) As she explains the idea behind NoLogo.org: "Mostly, we wanted a place where readers and researchers interested in these issues could talk directly to one another, rather than going through me. We also wanted to challenge the absurd media perception that I am 'the voice of the movement,' and instead provide a small glimpse of the range of campaigns, issues and organizations that make up this powerful activist network - powerful precisely because it insistently repels all attempts to force it into a traditional hierarchy" [16].

The book, which touched a nerve for many people, created a global, distributed "community" of isolated readers. The book provided a focus, but nowhere to go except to the author. The Slashcode-based Web site provided a readily available platform for the readers to become visible to one another and break through the isolation created by the book. The book and the OS-INT platform are complementary. The book is a momentary and personal solidification of a very fluid and heterogeneous movement; the coherent analysis that the traditional author can produce still has a lot of value. The OS-INT platform, on the other hand, is a reflection of the dynamic multiplicity of the movement, a way of giving something back to the readers (and others), and a connective learning process. More than the book, NoLogo.org fuses action with reflection. Of course, all the problems traditionally associated with public forums are still there: dissent - at times vitriolic and destructive - is voiced, but the moderating system allows members of the group to deal with differences in opinion in ways that do not impede the vitality of the forum. Slashdot's learning process in how to deal with these issues benefited NoLogo significantly. Within the first year, 3,000 users registered on the site, which serves some 1,500 individual visitors per day.

The Future of OS-INT

As a distinct practice, Open Source Intelligence is still quite young and faces a few challenges. First, there is the issue of scale. Compared to traditional broadcast media, OS-INT projects are still very small (with the exception of Slashdot, which has about half a million registered users) [17]. Since scale and exposure significantly affect the social dynamics, growth might not come easily for many projects.

Second, there is the issue of economics. Most OS-INT projects are pure volunteer projects; resources are donated. Wikipedia, for example, depends on Bomis Inc. for hardware and bandwidth. NoLogo.org is financed through royalties from book sales. Most OS-INT projects have not yet produced any revenue to cover some of the inevitable costs. So far, they have quite successfully relied on donations (from sympathetic individuals, corporations or foundations), but the prolonged crisis of the Internet economy does not make it any easier to raise funds - and raising funds becomes more important as the projects grow in size and their infrastructure and bandwidth needs increase.
Compared to traditional production and publishing models, OS-INT projects take place to a large degree outside the traditional monetary economy. Contributors, by and large, are not motivated by immediate financial gain. However, not all resources can be secured without money, so new and creative models for financing such projects need to be found. Slashdot, for example, which for a long time could rely on advertisement as its main source of revenue, recently had to increase the size of its banners in order to keep up with costs; at the same time, it gave users the possibility of accessing the site without advertisement in exchange for a small subscription fee. It is likely that, from an economic point of view, OS-INT projects will develop into a hybrid involving direct revenues (e.g. subscriptions, advertisement), goodwill donations and volunteer efforts. How these different elements relate to one another will vary from project to project. There is a lot of room - and need - for creative experiments.

Despite these challenges, there are good reasons to be optimistic about the future of OS-INT. First, the socio-technological learning process is deepening. The platforms and practices of OS-INT are becoming better understood, and consequently the hurdles for users as well as providers are getting lower. On the users' side, experience in dealing with participatory, rather than broadcast, media is growing; their distinct character is being developed, mastered and appreciated. For providers, the learning experience of OS-INT is embedded in sophisticated, freely available GPL software. The start-up costs for new projects are minimal, and the possibilities for adapting a platform to the idiosyncratic needs of each project are maximized. The resulting diversity, in turn, enriches the connective learning process.

Second, as the mass media converge into an ever smaller number of (cross-industrial) conglomerates, which relentlessly promote and control their multitude of media products, the need for alternative information channels rises, at least among people who invest time and cognitive energy in being critically informed. Given the economics of advertisement-driven mass media, it is clear that the possibilities of an "alternative newspaper" are rather limited. OS-INT platforms, by distributing labor throughout the community, offer the possibility of reaching a wider audience without being subject to the same economic pressures that broadcast and print media face in delivering those audiences to advertisers - particularly since paid subscriptions can provide access to advertisement-free content. The more homogeneous the mainstream media become, the more room opens up for alternatives. And if these alternatives are to be viable, they must not be limited to alternative content, but must also explore the structure of their production. This is the promise and potential of OS-INT.

The range of technologies is as wide as the range of communities, and a close relationship exists between the two. Technologies open and close possibilities in the same sense that social communities do. As Lawrence Lessig pointed out, what code is to the online world, architecture is to the physical world [18]. The way we live and the structures in which we live are deeply related. The culture of technology increasingly becomes the culture of our society.

Acknowledgments

An earlier version of this paper was presented at the conference "Critical Upgrade: Reality Check for Cyber Utopias" (Zagreb, 4-5 May 2002).

Notes
1. We use the term Open Source for its deliberate openness. In contrast to the narrower term Free Software, Open Source seems better suited to label a general collaborative approach not limited to code. We acknowledge the historical and ideological differences between the two concepts, but we believe that they are of limited relevance in the context of the present argument.

2. http://www.zakon.org/robert/internet/timeline/#1970s, accessed 25 March 2002.

3. Tim Berners-Lee with Mark Fischetti, 1999. Weaving the Web: The Original Design and the Ultimate Destiny of the World Wide Web by its Inventor. New York: HarperCollins.

4. Lawrence Lessig, 2001. The Future of Ideas: The Fate of the Commons in a Connected World. New York: Random House.

5. Max Weber, 1954. Max Weber on Law in Economy and Society. Translated by Talcott Parsons. Cambridge, Mass.: Harvard University Press.

6. Eric Raymond, 2000. "Homesteading the Noosphere," at http://www.tuxedo.org/~esr/writings/cathedral-bazaar/homesteading/x349.html.

7. Lawrence Lessig, 2001.

8. Often, but not always, these principles are supported by licenses setting the legal parameters for what can, or cannot, be done with the informational products governed by them. For an overview of the different licenses, see the Open Source Initiative's list of more than 30 "approved licenses" at http://www.opensource.org/licenses.

9. http://www.nettime.org.

10. http://amsterdam.nettime.org/Lists-Archives/nettime-l-0203/msg00080.html.

11. J. Bosma, P. Van Mourik Broekman, T. Byfield, M. Fuller, G. Lovink, D. McCarty, P. Schultz, F. Stalder, M. Wark, and F. Wilding (editors), 1999. Readme! Ascii Culture and the Revenge of Knowledge. New York: Autonomedia.

12. http://www.nettime.org/pub.html.

13. http://www.gnu.org/encyclopedia/free-encyclopedia.html.

14. http://www.wiki.org.

15. http://www.wikipedia.com/wiki/GNU+Free+Documentation+License.

16. http://www.nologo.org/letter.shtml.

17. OS-INT projects take place on the Internet; hence they still cannot have the broad reach of traditional broadcast media.

18. Lawrence Lessig, 1999. Code and Other Laws of Cyberspace. New York: Basic Books.

--------------------++-----
Les faits sont faits. http://felix.openflows.org

#  distributed via <nettime>: no commercial use without permission
#  <nettime> is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: majordomo@bbs.thing.net and "info nettime-l" in the msg body
#  archive: http://www.nettime.org contact: nettime@bbs.thing.net