author    Ralph Amissah <ralph@amissah.com>  2010-08-22 21:58:53 +0000
committer Ralph Amissah <ralph@amissah.com>  2010-08-24 15:55:11 +0000
commit    c9f8bc67faa18a583124aab0ea84828f9aaa4f07 (patch)
tree      d1968a2261c3032189519188cba0a7edcc5b4340 /data
parent    markup sample, viral spiral, many url fixes (diff)
markup samples, corrections to book indexes
Diffstat (limited to 'data')
-rw-r--r--  data/v1/samples/free_as_in_freedom.richard_stallman_crusade_for_free_software.sam_williams.sst |  2
-rw-r--r--  data/v1/samples/the_wealth_of_networks.yochai_benkler.sst | 68
-rw-r--r--  data/v1/samples/two_bits.christopher_kelty.sst | 36
-rw-r--r--  data/v2/samples/democratizing_innovation.eric_von_hippel.sst | 20
-rw-r--r--  data/v2/samples/free_as_in_freedom.richard_stallman_crusade_for_free_software.sam_williams.sst |  2
-rw-r--r--  data/v2/samples/the_wealth_of_networks.yochai_benkler.sst | 32
-rw-r--r--  data/v2/samples/two_bits.christopher_kelty.sst | 36
7 files changed, 107 insertions(+), 89 deletions(-)
diff --git a/data/v1/samples/free_as_in_freedom.richard_stallman_crusade_for_free_software.sam_williams.sst b/data/v1/samples/free_as_in_freedom.richard_stallman_crusade_for_free_software.sam_williams.sst
index e94a4cc..8ef920b 100644
--- a/data/v1/samples/free_as_in_freedom.richard_stallman_crusade_for_free_software.sam_williams.sst
+++ b/data/v1/samples/free_as_in_freedom.richard_stallman_crusade_for_free_software.sam_williams.sst
@@ -2028,7 +2028,7 @@ Although not the first person to view software as public property, Stallman is g
Predicting the future is risky sport, but most people, when presented with the question, seemed eager to bite. "One hundred years from now, Richard and a couple of other people are going to deserve more than a footnote," says Moglen. "They're going to be viewed as the main line of the story."
The "couple other people" Moglen nominates for future textbook chapters include John Gilmore, Stallman's GPL advisor and future founder of the Electronic Frontier Foundation, and Theodor Holm Nelson, a.k.a. Ted Nelson, author of the 1982 book, Literary Machines. Moglen says Stallman, Nelson, and Gilmore each stand out in historically significant, nonoverlapping ways. He credits Nelson, commonly considered to have coined the term "hypertext," for identifying the predicament of information ownership in the digital age. Gilmore and Stallman, meanwhile, earn notable credit for identifying the negative political effects of information control and building organizations-the Electronic Frontier Foundation in the case of Gilmore and the Free Software Foundation in the case of Stallman-to counteract those effects. Of the two, however, Moglen sees Stallman's activities as more personal and less political in nature.
-={Electronic Frontier Foundation;Gilmore, John;Nelson, Theodor Holm+2;Nelson Ted+2}
+={Electronic Frontier Foundation;Gilmore, John;Nelson, Theodor Holm+2;Nelson, Ted+2}
"Richard was unique in that the ethical implications of unfree software were particularly clear to him at an early moment," says Moglen. "This has a lot to do with Richard's personality, which lots of people will, when writing about him, try to depict as epiphenomenal or even a drawback in Richard Stallman's own life work."
diff --git a/data/v1/samples/the_wealth_of_networks.yochai_benkler.sst b/data/v1/samples/the_wealth_of_networks.yochai_benkler.sst
index a7ae407..e9e087a 100644
--- a/data/v1/samples/the_wealth_of_networks.yochai_benkler.sst
+++ b/data/v1/samples/the_wealth_of_networks.yochai_benkler.sst
@@ -80,7 +80,7 @@ Much of the early work in this project was done at New York University, whose la
Since 2001, first as a visitor and now as a member, I have had the remarkable pleasure of being part of the intellectual community that is Yale Law School. The book in its present form, structure, and emphasis is a direct reflection of my immersion in this wonderful community. Practically every single one of my colleagues has read articles I have written over this period, attended workshops where I presented my work, provided comments that helped to improve the articles--and through them, this book, as well. I owe each and every one of them thanks, not least to Tony Kronman, who made me see that it would be so. To list them all would be redundant. To list some would inevitably underrepresent the various contributions they have made. Still, I will try to say a few of the special thanks, owing much yet to ,{[pg xii]}, those I will not name. Working out the economics was a precondition of being able to make the core political claims. Bob Ellickson, Dan Kahan, and Carol Rose all engaged deeply with questions of reciprocity and commonsbased production, while Jim Whitman kept my feet to the fire on the relationship to the anthropology of the gift. Ian Ayres, Ron Daniels during his visit, Al Klevorick, George Priest, Susan Rose-Ackerman, and Alan Schwartz provided much-needed mixtures of skepticism and help in constructing the arguments that would allay it. Akhil Amar, Owen Fiss, Jerry Mashaw, Robert Post, Jed Rubenfeld, Reva Siegal, and Kenji Yoshino helped me work on the normative and constitutional questions. The turn I took to focusing on global development as the core aspect of the implications for justice, as it is in chapter 9, resulted from an invitation from Harold Koh and Oona Hathaway to speak at their seminar on globalization, and their thoughtful comments to my paper. 
The greatest influence on that turn has been Amy Kapczynski's work as a fellow at Yale, and with her, the students who invited me to work with them on university licensing policy, in particular, Sam Chaifetz.
-Oddly enough, I have never had the proper context in which to give two more basic thanks. My father, who was swept up in the resistance to British colonialism and later in Israel's War of Independence, dropped out of high school. He was left with a passionate intellectual hunger and a voracious appetite for reading. He died too young to even imagine sitting, as I do today with my own sons, with the greatest library in human history right there, at the dinner table, with us. But he would have loved it. Another great debt is to David Grais, who spent many hours mentoring me in my first law job, bought me my first copy of Strunk and White, and, for all practical purposes, taught me how to write in English; as he reads these words, he will be mortified, I fear, to be associated with a work of authorship as undisciplined as this, with so many excessively long sentences, replete with dependent clauses and unnecessarily complex formulations of quite simple ideas.
+Oddly enough, I have *{never had the proper context}* in which to give two more basic thanks. My father, who was swept up in the resistance to British colonialism and later in Israel's War of Independence, dropped out of high school. He was left with a passionate intellectual hunger and a voracious appetite for reading. He died too young to even imagine sitting, as I do today with my own sons, with the greatest library in human history right there, at the dinner table, with us. But he would have loved it. Another great debt is to David Grais, who spent many hours mentoring me in my first law job, bought me my first copy of Strunk and White, and, for all practical purposes, taught me how to write in English; as he reads these words, he will be mortified, I fear, to be associated with a work of authorship as undisciplined as this, with so many excessively long sentences, replete with dependent clauses and unnecessarily complex formulations of quite simple ideas.
Finally, to my best friend and tag-team partner in this tussle we call life, Deborah Schrag, with whom I have shared nicely more or less everything since we were barely adults. ,{[pg 1]},
@@ -94,7 +94,7 @@ A series of changes in the technologies, economic organization, and social pract
The rise of greater scope for individual and cooperative nonmarket production of information and culture, however, threatens the incumbents of the industrial information economy. At the beginning of the twenty-first century, we find ourselves in the midst of a battle over the institutional ecology of the digital environment. A wide range of laws and institutions-- from broad areas like telecommunications, copyright, or international trade regulation, to minutiae like the rules for registering domain names or whether digital television receivers will be required by law to recognize a particular code--are being tugged and warped in efforts to tilt the playing field toward one way of doing things or the other. How these battles turn out over the next decade or so will likely have a significant effect on how we come to know what is going on in the world we occupy, and to what extent and in what forms we will be able--as autonomous individuals, as citizens, and as participants in cultures and communities--to affect how we and others see the world as it is and as it might be.
2~ THE EMERGENCE OF THE NETWORKED INFORMATION ECONOMY
-={information economy:emergence of+9;networked environment policy+52;networked environment policy:emergence of+9}
+={information economy:emergence of+9;networked information economy+52|emergence of+9}
The most advanced economies in the world today have made two parallel shifts that, paradoxically, make possible a significant attenuation of the limitations that market-based production places on the pursuit of the political ,{[pg 3]}, values central to liberal societies. The first move, in the making for more than a century, is to an economy centered on information (financial services, accounting, software, science) and cultural (films, music) production, and the manipulation of symbols (from making sneakers to branding them and manufacturing the cultural significance of the Swoosh). The second is the move to a communications environment built on cheap processors with high computation capabilities, interconnected in a pervasive network--the phenomenon we associate with the Internet. It is this second shift that allows for an increasing role for nonmarket production in the information and cultural production sector, organized in a radically more decentralized pattern than was true of this sector in the twentieth century. The first shift means that these new patterns of production--nonmarket and radically decentralized--will emerge, if permitted, at the core, rather than the periphery of the most advanced economies. It promises to enable social production and exchange to play a much larger role, alongside property- and marketbased production, than they ever have in modern democracies.
={nonmarket information producers+4;physical constraints on information production+2;production of information:physical constraints on+2}
@@ -121,7 +121,7 @@ In the networked information economy, the physical capital required for producti
Because the presence and importance of nonmarket production has become so counterintuitive to people living in market-based economies at the end of the twentieth century, part I of this volume is fairly detailed and technical; overcoming what we intuitively "know" requires disciplined analysis. Readers who are not inclined toward economic analysis should at least read the introduction to part I, the segments entitled "When Information Production Meets the Computer Network" and "Diversity of Strategies in our Current Production System" in chapter 2, and the case studies in chapter 3. These should provide enough of an intuitive feel for what I mean by the diversity of production strategies for information and the emergence of nonmarket individual and cooperative production, to serve as the basis for the more normatively oriented parts of the book. Readers who are genuinely skeptical of the possibility that nonmarket production is sustainable and effective, and in many cases is an efficient strategy for information, knowledge, and cultural production, should take the time to read part I in its entirety. The emergence of precisely this possibility and practice lies at the very heart of my claims about the ways in which liberal commitments are translated into lived experiences in the networked environment, and forms the factual foundation of the political-theoretical and the institutional-legal discussion that occupies the remainder of the book.
2~ NETWORKED INFORMATION ECONOMY AND LIBERAL, DEMOCRATIC SOCIETIES
-={democratic societies+15;information economy:democracy and liberalism+15;liberal societies+15;networked environment policy:democracy and liberalism+15}
+={democratic societies+15;information economy:democracy and liberalism+15;liberal societies+15;networked information economy:democracy and liberalism+15}
How we make information, how we get it, how we speak to others, and how others speak to us are core components of the shape of freedom in any society. Part II of this book provides a detailed look at how the changes in the technological, economic, and social affordances of the networked information environment affect a series of core commitments of a wide range of liberal democracies. The basic claim is that the diversity of ways of organizing information production and use opens a range of possibilities for pursuing % ,{[pg 8]}, the core political values of liberal societies--individual freedom, a more genuinely participatory political system, a critical culture, and social justice. These values provide the vectors of political morality along which the shape and dimensions of any liberal society can be plotted. Because their practical policy implications are often contradictory, rather than complementary, the pursuit of each places certain limits on how we pursue the others, leading different liberal societies to respect them in different patterns. How much a society constrains the democratic decision-making powers of the majority in favor of individual freedom, or to what extent it pursues social justice, have always been attributes that define the political contours and nature of that society. But the economics of industrial production, and our pursuit of productivity and growth, have imposed a limit on how we can pursue any mix of arrangements to implement our commitments to freedom and justice. Singapore is commonly trotted out as an extreme example of the trade-off of freedom for welfare, but all democracies with advanced capitalist economies have made some such trade-off. Predictions of how well we will be able to feed ourselves are always an important consideration in thinking about whether, for example, to democratize wheat production or make it more egalitarian. 
Efforts to push workplace democracy have also often foundered on the shoals--real or imagined--of these limits, as have many plans for redistribution in the name of social justice. Market-based, proprietary production has often seemed simply too productive to tinker with. The emergence of the networked information economy promises to expand the horizons of the feasible in political imagination. Different liberal polities can pursue different mixtures of respect for different liberal commitments. However, the overarching constraint represented by the seeming necessity of the industrial model of information and cultural production has significantly shifted as an effective constraint on the pursuit of liberal commitments.
@@ -167,10 +167,10 @@ The networked information economy also allows for the emergence of a more critic
={Balkin, Jack;communities:critical culture and self-reflection+1;critical culture and self-reflection+1;culture:criticality of (self-reflection)+1;democratic societies:critical culture and social relations+1;Fisher, William (Terry);Koren, Niva Elkin;Lessig, Lawrence (Larry);self-organization: See clusters in network topology self-reflection+1;liberal societies:critical culture and social relations}
Throughout much of this book, I underscore the increased capabilities of individuals as the core driving social force behind the networked information economy. This heightened individual capacity has raised concerns by many that the Internet further fragments community, continuing the long trend of industrialization. A substantial body of empirical literature suggests, however, that we are in fact using the Internet largely at the expense of television, and that this exchange is a good one from the perspective of social ties. We use the Internet to keep in touch with family and intimate friends, both geographically proximate and distant. To the extent we do see a shift in social ties, it is because, in addition to strengthening our strong bonds, we are also increasing the range and diversity of weaker connections. Following ,{[pg 16]}, Manuel Castells and Barry Wellman, I suggest that we have become more adept at filling some of the same emotional and context-generating functions that have traditionally been associated with the importance of community with a network of overlapping social ties that are limited in duration or intensity.
-={attention fragmentation;Castells, Manuel;fragmentation of communication;norms (social): fragments of communication;regulation by social norms: fragmentation of communication;social relations and norms:fragmentation of communication;communities: fragmentation of;diversity:fragmentation of communication;Castells, Manuel}
+={attention fragmentation;Castells, Manuel;fragmentation of communication;norms (social): fragmentation of communication;regulation by social norms: fragmentation of communication;social relations and norms:fragmentation of communication;communities: fragmentation of;diversity:fragmentation of communication;Castells, Manuel}
2~ FOUR METHODOLOGICAL COMMENTS
-={information economy:methodological choices+14;networked environmental policy. See policy networked information economy:methodological choices+14}
+={information economy:methodological choices+14;networked environmental policy:See policy;networked information economy:methodological choices+14}
There are four methodological choices represented by the thesis that I have outlined up to this point, and therefore in this book as a whole, which require explication and defense. The first is that I assign a very significant role to technology. The second is that I offer an explanation centered on social relations, but operating in the domain of economics, rather than sociology. The third and fourth are more internal to liberal political theory. The third is that I am offering a liberal political theory, but taking a path that has usually been resisted in that literature--considering economic structure and the limits of the market and its supporting institutions from the perspective of freedom, rather than accepting the market as it is, and defending or criticizing adjustments through the lens of distributive justice. Fourth, my approach heavily emphasizes individual action in nonmarket relations. Much of the discussion revolves around the choice between markets and nonmarket social behavior. In much of it, the state plays no role, or is perceived as playing a primarily negative role, in a way that is alien to the progressive branches of liberal political thought. In this, it seems more of a libertarian or an anarchistic thesis than a liberal one. I do not completely discount the state, as I will explain. But I do suggest that what is special about our moment is the rising efficacy of individuals and loose, nonmarket affiliations as agents of political economy. Just like the market, the state will have to adjust to this new emerging modality of human action. Liberal political theory must first recognize and understand it before it can begin to renegotiate its agenda for the liberal state, progressive or otherwise.
={capabilities of individuals:technology and human affairs+5;human affairs, technology and+5;individual capabilities and action: technology and human affairs+5}
@@ -212,7 +212,7 @@ The important new fact about the networked environment, however, is the efficacy
={collaborative authorship: See also peer production collective social action}
2~ THE STAKES OF IT ALL: THE BATTLE OVER THE INSTITUTIONAL ECOLOGY OF THE DIGITAL ENVIRONMENT
-={commercial model of communication+9;industrial model of communication+9;information economy:institutional ecology+9;institutional ecology of digital environment+9;networked environment policy:institutional ecology+9;proprietary rights+9;traditional model of communication+9}
+={commercial model of communication+9;industrial model of communication+9;information economy:institutional ecology+9;institutional ecology of digital environment+9;networked information economy:institutional ecology+9;proprietary rights+9;traditional model of communication+9}
No benevolent historical force will inexorably lead this technologicaleconomic moment to develop toward an open, diverse, liberal equilibrium. ,{[pg 23]}, If the transformation I describe as possible occurs, it will lead to substantial redistribution of power and money from the twentieth-century industrial producers of information, culture, and communications--like Hollywood, the recording industry, and perhaps the broadcasters and some of the telecommunications services giants--to a combination of widely diffuse populations around the globe, and the market actors that will build the tools that make this population better able to produce its own information environment rather than buying it ready-made. None of the industrial giants of yore are taking this reallocation lying down. The technology will not overcome their resistance through an insurmountable progressive impulse. The reorganization of production and the advances it can bring in freedom and justice will emerge, therefore, only as a result of social and political action aimed at protecting the new social patterns from the incumbents' assaults. It is precisely to develop an understanding of what is at stake and why it is worth fighting for that I write this book. I offer no reassurances, however, that any of this will in fact come to pass.
@@ -220,7 +220,7 @@ The battle over the relative salience of the proprietary, industrial models of i
={property ownership+5;commons}
This is not to say that property is in some sense inherently bad. Property, together with contract, is the core institutional component of markets, and ,{[pg 24]}, a core institutional element of liberal societies. It is what enables sellers to extract prices from buyers, and buyers to know that when they pay, they will be secure in their ability to use what they bought. It underlies our capacity to plan actions that require use of resources that, without exclusivity, would be unavailable for us to use. But property also constrains action. The rules of property are circumscribed and intended to elicit a particular datum--willingness and ability to pay for exclusive control over a resource. They constrain what one person or another can do with regard to a resource; that is, use it in some ways but not others, reveal or hide information with regard to it, and so forth. These constraints are necessary so that people must transact with each other through markets, rather than through force or social networks, but they do so at the expense of constraining action outside of the market to the extent that it depends on access to these resources.
-={constrains of information production:physical+2;physical constraints on information production+2}
+={constrains of information production, physical+2;physical constraints on information production+2}
Commons are another core institutional component of freedom of action in free societies, but they are structured to enable action that is not based on exclusive control over the resources necessary for action. For example, I can plan an outdoor party with some degree of certainty by renting a private garden or beach, through the property system. Alternatively, I can plan to meet my friends on a public beach or at Sheep's Meadow in Central Park. I can buy an easement from my neighbor to reach a nearby river, or I can walk around her property using the public road that makes up our transportation commons. Each institutional framework--property and commons--allows for a certain freedom of action and a certain degree of predictability of access to resources. Their complementary coexistence and relative salience as institutional frameworks for action determine the relative reach of the market and the domain of nonmarket action, both individual and social, in the resources they govern and the activities that depend on access to those resources. Now that material conditions have enabled the emergence of greater scope for nonmarket action, the scope and existence of a core common infrastructure that includes the basic resources necessary to produce and exchange information will shape the degree to which individuals will be able to act in all the ways that I describe as central to the emergence of a networked information economy and the freedoms it makes possible.
={commons}
@@ -236,7 +236,7 @@ Social and economic organization is not infinitely malleable. Neither is it alwa
This book is offered, then, as a challenge to contemporary liberal democracies. We are in the midst of a technological, economic, and organizational transformation that allows us to renegotiate the terms of freedom, justice, and productivity in the information society. How we shall live in this new environment will in some significant measure depend on policy choices that we make over the next decade or so. To be able to understand these choices, to be able to make them well, we must recognize that they are part of what is fundamentally a social and political choice--a choice about how to be free, equal, productive human beings under a new set of technological and ,{[pg 28]}, economic conditions. As economic policy, allowing yesterday's winners to dictate the terms of tomorrow's economic competition would be disastrous. As social policy, missing an opportunity to enrich democracy, freedom, and justice in our society while maintaining or even enhancing our productivity would be unforgivable. ,{[pg 29]},
-:C~ Part One - The Networked Information Economy
+:B~ Part One - The Networked Information Economy
1~p1 Introduction
={communities:technology-defined social structure+9;norms (social):technology-defined structure+9;regulation by social norms: technology-defined structure+9;social relations and norms: technology-defined structure+9;social structure, defined by technology+9;technology:social structure defined by+9}
@@ -299,6 +299,8 @@ The actual universe of information production in the economy then, is not as dep
The ideal-type strategy that underlies patents and copyrights can be thought of as the "Romantic Maximizer." It conceives of the information producer as a single author or inventor laboring creatively--hence romantic--but in expectation of royalties, rather than immortality, beauty, or truth. An individual or small start-up firm that sells software it developed to a larger firm, or an author selling rights to a book or a film typify this model. The second ideal type that arises within exclusive-rights based industries, "Mickey," is a larger firm that already owns an inventory of exclusive rights, some through in-house development, some by buying from Romantic Maximizers. ,{[pg 43]},
={Mickey model+3;Romantic Maximizer model+2}
+<:pb>
+
!_ Table 2.1: Ideal-Type Information Production Strategies
={demand-side effects of information production;Joe Einstein model+1;learning networks+1;limited sharing networks+1;Los Alamos model+1;nonmarket information producers:strategies for information production+1;RCA strategy+1;Scholarly Lawyers model+1;sharing:limited sharing networks}
@@ -491,7 +493,7 @@ How are we to know that the content produced by widely dispersed individuals is
={accreditation:Amazon+1;Amazon+1;filtering:Amazon+1;relevance filtering:Amazon+1}
Amazon uses a mix of mechanisms to get in front of their buyers of books and other products that the users are likely to purchase. A number of these mechanisms produce relevance and accreditation by harnessing the users themselves. At the simplest level, the recommendation "customers who bought items you recently viewed also bought these items" is a mechanical means of extracting judgments of relevance and accreditation from the actions of many individuals, who produce the datum of relevance as byproduct of making their own purchasing decisions. Amazon also allows users to create topical lists and track other users as their "friends and favorites." Amazon, like many consumer sites today, also provides users with the ability ,{[pg 76]}, to rate books they buy, generating a peer-produced rating by averaging the ratings. More fundamentally, the core innovation of Google, widely recognized as the most efficient general search engine during the first half of the 2000s, was to introduce peer-based judgments of relevance. Like other search engines at the time, Google used a text-based algorithm to retrieve a given universe of Web pages initially. Its major innovation was its PageRank algorithm, which harnesses peer production of ranking in the following way. The engine treats links from other Web sites pointing to a given Web site as votes of confidence. Whenever someone who authors a Web site links to someone else's page, that person has stated quite explicitly that the linked page is worth a visit. Google's search engine counts these links as distributed votes of confidence in the quality of the page pointed to. Pages that are heavily linked-to count as more important votes of confidence. If a highly linked-to site links to a given page, that vote counts for more than the vote of a site that no one else thinks is worth visiting. 
The point to take home from looking at Google and Amazon is that corporations that have done immensely well at acquiring and retaining users have harnessed peer production to enable users to find things they want quickly and efficiently.
-={accreditation:Google;communities:critical culture and self-reflection+1;culture:critically of (self-reflection)+1;filtering:Google;Google;relevance filtering:Google}
+={accreditation:Google;communities:critical culture and self-reflection+1;culture:criticality of (self-reflection)+1;filtering:Google;Google;relevance filtering:Google}
The most prominent example of a distributed project self-consciously devoted to peer production of relevance is the Open Directory Project. The site relies on more than sixty thousand volunteer editors to determine which links should be included in the directory. Acceptance as a volunteer requires application. Quality relies on a peer-review process based substantially on seniority as a volunteer and level of engagement with the site. The site is hosted and administered by Netscape, which pays for server space and a small number of employees to administer the site and set up the initial guidelines. Licensing is free and presumably adds value partly to America Online's (AOL's) and Netscape's commercial search engine/portal and partly through goodwill. Volunteers are not affiliated with Netscape and receive no compensation. They spend time selecting sites for inclusion in the directory (in small increments of perhaps fifteen minutes per site reviewed), producing the most comprehensive, highest-quality human-edited directory of the Web--at this point outshining the directory produced by the company that pioneered human edited directories of the Web: Yahoo!.
={accreditation:Open Directory Project (ODP);critical culture and self-reflection:Open Directory Project;filtering:Open Directory Project (ODP);ODP (Open Directory Project);Open Directory Project (ODP);relevance filtering:Open Directory Project (ODP);self-organization:Open Directory Project}
@@ -629,7 +631,7 @@ The independence of Web sites is what marks their major difference from more org
={Slashdot+1;accreditation:Slashdot+1;filtering:Slashdot+1;relevance filtering:Slashdot+1;peer production:maintenance of cooperation+1;structured production:maintenance of cooperation+1}
Cooperation in peer-production processes is usually maintained by some combination of technical architecture, social norms, legal rules, and a technically backed hierarchy that is validated by social norms. /{Wikipedia}/ is the strongest example of a discourse-centric model of cooperation based on social norms. However, even /{Wikipedia}/ includes, ultimately, a small number of people with system administrator privileges who can eliminate accounts or block users in the event that someone is being genuinely obstructionist. This technical fallback, however, appears only after substantial play has been given to self-policing by participants, and to informal and quasi-formal community-based dispute resolution mechanisms. Slashdot, by contrast, provides a strong model of a sophisticated technical system intended to assure that no one can "defect" from the cooperative enterprise of commenting and moderating comments. It limits behavior enabled by the system to avoid destructive behavior before it happens, rather than policing it after the fact. The Slash code does this by technically limiting the power any given person has to moderate anyone else up or down, and by making every moderator the subject of a peer review system whose judgments are enforced technically--that is, when any given user is described by a sufficiently large number of other users as unfair, that user automatically loses the technical ability to moderate the comments of others. The system itself is a free software project, licensed under the GPL (General Public License)--which is itself the quintessential example of how law is used to prevent some types of defection from the common enterprise of peer production of software. The particular type of defection that the GPL protects against is appropriation of the joint product by any single individual or firm, the risk of which would make it less attractive for anyone to contribute to the project to begin with.
The GPL assures that, as a legal matter, no one who contributes to a free software project need worry that some other contributor will take the project and make it exclusively their own. The ultimate quality judgments regarding what is incorporated into the "formal" releases of free software projects provide the clearest example of the extent to which a meritocratic hierarchy can be used to integrate diverse contributions into a finished single product. In the case of the Linux kernel development project (see chapter 3), it was always within the power of Linus Torvalds, who initiated the project, to decide which contributions should be included in a new release, and which should not. But it is a funny sort of hierarchy, whose quirkiness Steve Weber ,{[pg 105]}, well explicates.~{ Steve Weber, The Success of Open Source (Cambridge, MA: Harvard University Press, 2004). }~ Torvalds's authority is persuasive, not legal or technical, and certainly not determinative. He can do nothing except persuade others to prevent them from developing anything they want and add it to their kernel, or to distribute that alternative version of the kernel. There is nothing he can do to prevent the entire community of users, or some subsection of it, from rejecting his judgment about what ought to be included in the kernel. Anyone is legally free to do as they please. So these projects are based on a hierarchy of meritocratic respect, on social norms, and, to a great extent, on the mutual recognition by most players in this game that it is to everybody's advantage to have someone overlay a peer review system with some leadership.
-={Wikipedia project;Torvalds, Linus;Weber, Steve;General Public License (GPL):See also fre software;GPL (General Public License):See Also free software;licensing:GPL (General Public License)}
+={Wikipedia project;Torvalds, Linus;Weber, Steve;General Public License (GPL):See also free software;GPL (General Public License):See Also free software;licensing:GPL (General Public License)}
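The Slashdot moderation mechanism described above--peers judging a moderator "unfair" until the system itself, not an administrator, revokes the moderation privilege--can be sketched roughly as follows. This is an illustrative sketch only, not the actual Slash code; the class names, the vote threshold, and the data structures are assumptions made for illustration.

```python
# Illustrative sketch of technically enforced peer review of moderators,
# in the spirit of the Slash system described in the text. NOT the real
# Slash code: UNFAIR_THRESHOLD and all names are hypothetical.

UNFAIR_THRESHOLD = 5  # hypothetical cutoff of peer "unfair" judgments


class Moderator:
    def __init__(self, name):
        self.name = name
        self.unfair_votes = set()  # peers who judged this moderator unfair

    @property
    def can_moderate(self):
        # The privilege is withdrawn automatically by the system itself
        # once enough peers have judged the moderator unfair--no
        # administrator decision is involved.
        return len(self.unfair_votes) < UNFAIR_THRESHOLD

    def mark_unfair(self, peer_name):
        self.unfair_votes.add(peer_name)


alice = Moderator("alice")
for peer in ["bob", "carol", "dave", "erin", "frank"]:
    alice.mark_unfair(peer)

print(alice.can_moderate)  # False: five peers judged alice unfair
```

The point of the design, as the text notes, is that defection is made impossible in advance rather than policed after the fact: the check lives in the code path that grants the power, not in a sanction applied later.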
In combination then, three characteristics make possible the emergence of information production that is not based on exclusive proprietary claims, not aimed toward sales in a market for either motivation or information, and not organized around property and contract claims to form firms or market exchanges. First, the physical machinery necessary to participate in information and cultural production is almost universally distributed in the population of the advanced economies. Certainly, personal computers as capital goods are under the control of numbers of individuals that are orders of magnitude larger than the number of parties controlling the use of mass-production-capable printing presses, broadcast transmitters, satellites, or cable systems, record manufacturing and distribution chains, and film studios and distribution systems. This means that the physical machinery can be put in service and deployed in response to any one of the diverse motivations individual human beings experience. They need not be deployed in order to maximize returns on the financial capital, because financial capital need not be mobilized to acquire and put in service any of the large capital goods typical of the industrial information economy. Second, the primary raw materials in the information economy, unlike the industrial economy, are public goods--existing information, knowledge, and culture. Their actual marginal social cost is zero. Unless regulatory policy makes them purposefully expensive in order to sustain the proprietary business models, acquiring raw materials also requires no financial capital outlay. Again, this means that these raw materials can be deployed for any human motivation. They need not maximize financial returns.
Third, the technical architectures, organizational models, and social dynamics of information production and exchange on the Internet have developed so that they allow us to structure the solution to problems--in particular to information production problems--in ways ,{[pg 106]}, that are highly modular. This allows many diversely motivated people to act for a wide range of reasons that, in combination, cohere into new useful information, knowledge, and cultural goods. These architectures and organizational models allow both independent creation that coexists and coheres into usable patterns, and interdependent cooperative enterprises in the form of peer-production processes.
={computers;hardware;personal computers;physical machinery and computers}
@@ -803,7 +805,7 @@ The other quite basic change wrought by the emergence of social production, from
The overarching point is that social production is reshaping the market conditions under which businesses operate. To some of the incumbents of the industrial information economy, the pressure from social production is experienced as pure threat. It is the clash between these incumbents and the new practices that was most widely reported in the media in the first five years of the twenty-first century, and that has driven much of policy making, legislation, and litigation in this area. But the much more fundamental effect on the business environment is that social production is changing the relationship of firms to individuals outside of them, and through this changing the strategies that firms internally are exploring. It is creating new sources of inputs, and new tastes and opportunities for outputs. Consumers are changing into users--more active and productive than the consumers of the ,{[pg 127]}, industrial information economy. The change is reshaping the relationships necessary for business success, requiring closer integration of users into the process of production, both in inputs and outputs. It requires different leadership talents and foci. By the time of this writing, in 2005, these new opportunities and adaptations have begun to be seized upon as strategic advantages by some of the most successful companies working around the Internet and information technology, and increasingly now around information and cultural production more generally. Eric von Hippel's work has shown how the model of user innovation has been integrated into the business model of innovative firms even in sectors far removed from either the network or from information production--like designing kite-surfing equipment or mountain bikes. As businesses begin to do this, the platforms and tools for collaboration improve, the opportunities and salience of social production increase, and the political economy begins to shift.
And as these firms and social processes coevolve, the dynamic accommodation they are developing provides us with an image of what the future stable interface between market-based businesses and the newly salient social production is likely to look like. ,{[pg 128]}, ,{[pg 129]},
={von Hippel, Eric}
-:C~ Part Two - The Political Economy of Property and Commons
+:B~ Part Two - The Political Economy of Property and Commons
1~p2 Introduction
={commons+5;property ownership+5}
@@ -1232,7 +1234,7 @@ Another dimension that is less well developed in the United States than it is in
={Gilmore, Dan;Pantic, Drazen;Rheingold, Howard;mobile phones;text messaging}
2~ NETWORKED INFORMATION ECONOMY MEETS THE PUBLIC SPHERE
-={information economy:effects on public sphere+21;networked environment policy:effects on public sphere+21}
+={information economy:effects on public sphere+21;networked information economy:effects on public sphere+21}
The networked public sphere is not made of tools, but of social production practices that these tools enable. The primary effect of the Internet on the ,{[pg 220]}, public sphere in liberal societies relies on the information and cultural production activity of emerging nonmarket actors: individuals working alone and cooperatively with others, more formal associations like NGOs, and their feedback effect on the mainstream media itself. These enable the networked public sphere to moderate the two major concerns with commercial mass media as a platform for the public sphere: (1) the excessive power it gives its owners, and (2) its tendency, when owners do not dedicate their media to exert power, to foster an inert polity. More fundamentally, the social practices of information and discourse allow a very large number of actors to see themselves as potential contributors to public discourse and as potential actors in political arenas, rather than mostly passive recipients of mediated information who occasionally can vote their preferences. In this section, I offer two detailed stories that highlight different aspects of the effects of the networked information economy on the construction of the public sphere. The first story focuses on how the networked public sphere allows individuals to monitor and disrupt the use of mass-media power, as well as organize for political action. The second emphasizes in particular how the networked public sphere allows individuals and groups of intense political engagement to report, comment, and generally play the role traditionally assigned to the press in observing, analyzing, and creating political salience for matters of public interest. The case studies provide a context both for seeing how the networked public sphere responds to the core failings of the commercial, mass-media-dominated public sphere and for considering the critiques of the Internet as a platform for a liberal public sphere.
@@ -1573,6 +1575,8 @@ If culture is indeed part of how we form a shared sense of unexamined common kno
If you run a search for "Barbie" on three separate search engines--Google, Overture, and Yahoo!--you will get quite different results. Table 8.1 lists these results in the order in which they appear on each search engine. Overture is a search engine that sells placement to the parties who are being searched. Hits on this search engine are therefore ranked based on whoever paid Overture the most in order to be placed highly in response to a query. On this list, none of the top ten results represent anything other than sales-related Barbie sites. Critical sites begin to appear only around the twenty-fifth result, presumably after all paying clients have been served. Google, as we already know, uses a radically decentralized mechanism for assigning relevance. It counts how many sites on the Web have linked to a particular site that has the search term in it, and ranks the search results by placing a site with a high number of incoming links above a site with a low number of incoming links. In effect, each Web site publisher "votes" for a site's ,{[pg 286]}, ,{[pg 287]}, relevance by linking to it, and Google aggregates these votes and renders them on their results page as higher ranking. The little girl who searches for Barbie on Google will encounter a culturally contested figure. The same girl, searching on Overture, will encounter a commodity toy. In each case, the underlying efforts of Mattel, the producer of Barbie, have not changed. What is different is that in an environment where relevance is measured in nonmarket action--placing a link to a Web site because you deem it relevant to whatever you are doing with your Web site--as opposed to in dollars, Barbie has become a more transparent cultural object. It is easier for the little girl to see that the doll is not only a toy, not only a symbol of beauty and glamour, but also a symbol of how norms of female beauty in our society can be oppressive to women and girls.
The transparency does not force the girl to choose one meaning of Barbie or another. It does, however, render transparent that Barbie can have multiple meanings and that choosing meanings is a matter of political concern for some set of people who coinhabit this culture. Yahoo! occupies something of a middle ground--its algorithm does link to two of the critical sites among the top ten, and within the top twenty, identifies most of the sites that appear on Google's top ten that are not related to sales or promotion.
={Barbie (doll), culture of+4}
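The link-as-vote mechanism described in the paragraph above can be sketched as a flat incoming-link count. This is an illustrative simplification, not Google's actual algorithm: PageRank weighs votes recursively by the voter's own rank, and all site names and data here are hypothetical.

```python
# Illustrative sketch of link-count relevance ranking: each publisher
# "votes" for a site by linking to it; candidates are ordered by how
# many incoming links they have. A simplification of Google's approach,
# with a made-up miniature web for the Barbie example.
from collections import Counter

links = {  # hypothetical web: page -> pages it links to
    "fansite.example": ["barbie.com", "critique.example"],
    "blog.example": ["critique.example"],
    "shop.example": ["barbie.com"],
    "zine.example": ["critique.example"],
}

# Aggregate the "votes": count incoming links per target page.
incoming = Counter(target for targets in links.values() for target in targets)


def rank(query_hits):
    """Order candidate pages by incoming-link count, highest first."""
    return sorted(query_hits, key=lambda page: incoming[page], reverse=True)


print(rank(["barbie.com", "critique.example"]))
```

Because relevance here is measured in nonmarket action (links placed by publishers) rather than in dollars paid for placement, the critical site can outrank the commercial one whenever more publishers deem it worth linking to.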
+<:pb>
+
% table moved after paragraph
!_ Table 8.1: Results for "Barbie" - Google versus Overture and Yahoo!
@@ -1629,7 +1633,7 @@ Only two encyclopedias focus explicitly on Barbie's cultural meaning: Britannica
The relative emphasis of Google and /{Wikipedia}/, on the one hand, and Overture, Yahoo!, and the commercial encyclopedias other than Britannica, on the other hand, is emblematic of a basic difference between markets and social conversations with regard to culture. If we focus on the role of culture as "common knowledge" or background knowledge, its relationship to the market--at least for theoretical economists--is exogenous. It can be taken as given and treated as "taste." In more practical business environments, culture is indeed a source of taste and demand, but it is not taken as exogenous. Culture, symbolism, and meaning, as they are tied with market-based goods, become a major focus of advertising and of demand management. No one who has been exposed to the advertising campaigns of Coca-Cola, Nike, or Apple Computers, as well as practically to any one of a broad range of advertising campaigns over the past few decades, can fail to see that these are not primarily a communication about the material characteristics or qualities of the products or services sold by the advertisers. ,{[pg 290]},
They are about meaning. These campaigns try to invest the act of buying their products or services with a cultural meaning that they cultivate, manipulate, and try to generalize in the practices of the society in which they are advertising, precisely in order to shape taste. They offer an opportunity to generate rents, because the consumer has to have this company's shoe rather than that one, because that particular shoe makes the customer this kind of person rather than that kind--cool rather than stuffy, sophisticated rather than common. Neither the theoretical economists nor the marketing executives have any interest in rendering culture transparent or writable. Whether one treats culture as exogenous or as a domain for limiting the elasticity of demand for one's particular product, there is no impetus to make it easier for consumers to see through the cultural symbols, debate their significance, or make them their own. If there is business reason to do anything about culture, it is to try to shape the cultural meaning of an object or practice, in order to shape the demand for it, while keeping the role of culture hidden and assuring control over the careful cultural choreography of the symbols attached to the company. Indeed, in 1995, the U.S. Congress enacted a new kind of trademark law, the Federal Antidilution Act, which for the first time disconnects trademark protection from protecting consumers from confusion by knockoffs. The Antidilution Act of 1995 gives the owner of any famous mark--and only famous marks--protection from any use that dilutes the meaning that the brand owner has attached to its own mark. It can be entirely clear to consumers that a particular use does not come from the owner of the brand, and still, the owner has a right to prevent this use. 
While there is some constitutional free-speech protection for criticism, there is also a basic change in the understanding of trademark law--from a consumer protection law intended to assure that consumers can rely on the consistency of goods marked in a certain way, to a property right in controlling the meaning of symbols a company has successfully cultivated so that they are, in fact, famous. This legal change marks a major shift in the understanding of the role of law in assigning control for cultural meaning generated by market actors.
-={Antidilutation Act of 1995;branding:trademark dilutation;dilutation of trademaks;logical layer of institutional ecology:trademark dilutation;proprietary rights:trademark dilutation;trademark dilutation;information production, market-based:cultural change, transparency of+4;market-based information producers: cultural change, transparency of+4;nonmarket information producers:cultural change, transparency of+4}
+={Antidilutation Act of 1995;branding:trademark dilutation;dilutation of trademarks;logical layer of institutional ecology:trademark dilutation;proprietary rights:trademark dilutation;trademark dilutation;information production, market-based:cultural change, transparency of+4;market-based information producers: cultural change, transparency of+4;nonmarket information producers:cultural change, transparency of+4}
Unlike market production of culture, meaning making as a social, nonmarket practice has no similar systematic reason to accept meaning as it comes. Certainly, some social relations do. When girls play with dolls, collect them, or exhibit them, they are rarely engaged in reflection on the meaning of the dolls, just as fans of Scarlett O'Hara, of which a brief Internet search suggests there are many, are not usually engaged in critique of Gone with the ,{[pg 291]}, Wind as much as in replication and adoption of its romantic themes. Plainly, however, some conversations we have with each other are about who we are, how we came to be who we are, and whether we view the answers we find to these questions as attractive or not. In other words, some social interactions do have room for examining culture as well as inhabiting it, for considering background knowledge for what it is, rather than taking it as a given input into the shape of demand or using it as a medium for managing meaning and demand. People often engage in conversations with each other precisely to understand themselves in the world, their relationship to others, and what makes them like and unlike those others. One major domain in which this formation of self- and group identity occurs is the adoption or rejection of, and inquiry into, cultural symbols and sources of meaning that will make a group cohere or splinter; that will make people like or unlike each other.
@@ -1681,10 +1685,10 @@ We can analyze the implications of the emergence of the networked information ec
The opportunities that the networked information economy offers, however, often run counter to the central policy drive of both the United States and the European Union in the international trade and intellectual property systems. These two major powers have systematically pushed for ever-stronger proprietary protection and increasing reliance on strong patents, copyrights, and similar exclusive rights as the core information policy for growth and development. Chapter 2 explains why such a policy is suspect from a purely economic perspective concerned with optimizing innovation. ,{[pg 303]}, A system that relies too heavily on proprietary approaches to information production is not, however, merely inefficient. It is unjust. Proprietary rights are designed to elicit signals of people's willingness and ability to pay. In the presence of extreme distribution differences like those that characterize the global economy, the market is a poor measure of comparative welfare. A system that signals what innovations are most desirable and rations access to these innovations based on ability, as well as willingness, to pay, overrepresents welfare gains of the wealthy and underrepresents welfare gains of the poor. Twenty thousand American teenagers can simply afford, and will be willing to pay, much more for acne medication than the more than a million Africans who die of malaria every year can afford to pay for a vaccine. A system that relies too heavily on proprietary models for managing information production and exchange is unjust because it is geared toward serving small welfare increases for people who can pay a lot for incremental improvements in welfare, and against providing large welfare increases for people who cannot pay for what they need.
2~ LIBERAL THEORIES OF JUSTICE AND THE NETWORKED INFORMATION ECONOMY
-={human development and justice:liberal theories of+7;human welfare:liberal theories of justice+7;information economy:justice, liberal theories of+7;justice and human development:liberal theories of+7;liberal societies:theories of justice+7;networked environment policy:justice, liberal theories of+7;welfare:liberal theories of justice+7|see also justice and human development}
+={human development and justice:liberal theories of+7;human welfare:liberal theories of justice+7;information economy:justice, liberal theories of+7;justice and human development:liberal theories of+7;liberal societies:theories of justice+7;welfare:liberal theories of justice+7|see also justice and human development}
Liberal theories of justice can be categorized according to how they characterize the sources of inequality in terms of luck, responsibility, and structure. By luck, I mean reasons for the poverty of an individual that are beyond his or her control, and that are part of that individual's lot in life unaffected by his or her choices or actions. By responsibility, I mean causes for the poverty of an individual that can be traced back to his or her actions or choices. By structure, I mean causes for the inequality of an individual that are beyond his or her control, but are traceable to institutions, economic organizations, or social relations that form a society's transactional framework and constrain the behavior of the individual or undermine the efficacy of his or her efforts at self-help.
-={background knowledge:see culture bad luck, justice and+2;DSL:see broadband networks dumb luck, justice and+2;luck, justice and+2;misfortune, justice and+2;organizational structure:justice and+2;structure of organizations:justice and+2}
+={background knowledge:see culture bad luck, justice and+2;DSL:see broadband networks dumb luck, justice and+2;luck, justice and+2;misfortune, justice and+2;organization structure:justice and+2;structure of organizations:justice and+2}
We can think of John Rawls's /{Theory of Justice}/ as based on a notion that the poorest people are the poorest because of dumb luck. His proposal for a systematic way of defending and limiting redistribution is the "difference principle." A society should organize its redistribution efforts in order to make those who are least well-off as well-off as they can be. The theory of desert is that, because any of us could in principle be the victim of this dumb luck, we would all have agreed, if none of us had known where we ,{[pg 304]}, would be on the distribution of bad luck, to minimize our exposure to really horrendous conditions. The practical implication is that while we might be bound to sacrifice some productivity to achieve redistribution, we cannot sacrifice too much. If we did that, we would most likely be hurting, rather than helping, the weakest and poorest. Libertarian theories of justice, most prominently represented by Robert Nozick's entitlement theory, on the other hand, tend to ignore bad luck or impoverishing structure. They focus solely on whether the particular holdings of a particular person at any given moment are unjustly obtained. If they are not, they may not justly be taken from the person who holds them. Explicitly, these theories ignore the poor. As a practical matter and by implication, they treat responsibility as the source of the success of the wealthy, and by negation, the plight of the poorest--leading them to be highly resistant to claims of redistribution.
={Rawls, John+1;Nozick, Robert;redistribution theory+1}
@@ -1953,17 +1957,31 @@ group{
Notes:
-a. Large ambiguity results because technology transfer office reports increased revenues for yearend 2003 as $178M without reporting expenses; University Annual Report reports licensing revenue with all "revenue from other educational and research activities," and reports a 10 percent decline in this category, "reflecting an anticipated decline in royalty and license income" from the $133M for the previous year-end, 2002. The table reflects an assumed net contribution to university revenues between $100-120M (the entire decline in the category due to royalty/royalties decreased proportionately with the category).
-
-b. University of California Annual Report of the Office of Technology Transfer is more transparent than most in providing expenses--both net legal expenses and tech transfer direct operating expenses, which allows a clear separation of net revenues from technology transfer activities.
+a. Large ambiguity results because technology transfer office reports increased
+revenues for yearend 2003 as $178M without reporting expenses; University
+Annual Report reports licensing revenue with all "revenue from other
+educational and research activities," and reports a 10 percent decline in this
+category, "reflecting an anticipated decline in royalty and license income"
+from the $133M for the previous year-end, 2002. The table reflects an assumed
+net contribution to university revenues between $100-120M (the entire decline
+in the category due to royalty/royalties decreased proportionately with the
+category).
+
+b. University of California Annual Report of the Office of Technology Transfer
+is more transparent than most in providing expenses--both net legal expenses
+and tech transfer direct operating expenses, which allows a clear separation of
+net revenues from technology transfer activities.
c. Minus direct expenses, not including expenses for unlicensed inventions.
d. Federal- and nonfederal-sponsored research.
-e. Almost half of this amount is in income from a single Initial Public Offering, and therefore does not represent a recurring source of licensing revenue.
+e. Almost half of this amount is in income from a single Initial Public
+Offering, and therefore does not represent a recurring source of licensing
+revenue.
-f. Technology transfer gross revenue minus the one-time event of an initial public offering of LiquidMetal Technologies.
+f. Technology transfer gross revenue minus the one-time event of an initial
+public offering of LiquidMetal Technologies.
}group
@@ -2039,7 +2057,7 @@ Increased practical individual autonomy has been central to my claims throughout
={communities:virtual+9;virtual communities+9:see also social relations and norms}
We are seeing two effects: first, and most robustly, we see a thickening of preexisting relations with friends, family, and neighbors, particularly with those who were not easily reachable in the pre-Internet-mediated environment. Parents, for example, use instant messages to communicate with their children who are in college. Friends who have moved away from each other are keeping in touch more than they did before they had e-mail, because e-mail does not require them to coordinate a time to talk or to pay long-distance rates. However, this thickening of contacts seems to occur alongside a loosening of the hierarchical aspects of these relationships, as individuals weave their own web of supporting peer relations into the fabric of what might otherwise be stifling familial relationships. Second, we are beginning to see the emergence of greater scope for limited-purpose, loose relationships. These may not fit the ideal model of "virtual communities." They certainly do not fit a deep conception of "community" as a person's primary source of emotional context and support. They are nonetheless effective and meaningful to their participants. It appears that, as the digitally networked environment begins to displace mass media and telephones, its salient communications characteristics provide new dimensions to thicken existing social relations, while also providing new capabilities for looser and more fluid, but still meaningful social networks. A central aspect of this positive improvement in loose ties has been the technical-organizational shift from an information environment dominated by commercial mass media on a one-to-many model, which does not foster group interaction among viewers, to an information environment that both technically and as a matter of social practice enables user-centric, group-based active cooperation platforms of the kind that typify the networked information economy.
This is not to say that the Internet necessarily affects all people, all social groups, and networks identically. The effects on different people in different settings and networks will likely vary, certainly in their magnitude. My purpose here, however, is ,{[pg 358]}, to respond to the concern that enhanced individual capabilities entail social fragmentation and alienation. The available data do not support that claim as a description of a broad social effect.
-={communication:thickening of preexisting relations;displacement of real-world interactions;family relations, strengthening of;loose affiliations;neighborhood relations, strengthening of;networked public sphere:loose affiliations;norms (social):loose affiliations|thickening of preexisting relations;peer production:loose affiliations;preexisting relations, thickening of;public sphere:loose affiliations;regulation by social norms:loose affiliations|thickening of preexisting relations;scope of loose relationships;social relations and norms:loose affiliations|thickening of preexisting relations;supplantation of real-world interaction;thickening of preexisting relations}
+={communication:thickening of preexisting relations;displacement of real-world interaction;family relations, strengthening of;loose affiliations;neighborhood relations, strengthening of;networked public sphere:loose affiliations;norms (social):loose affiliations|thickening of preexisting relations;peer production:loose affiliations;preexisting relations, thickening of;public sphere:loose affiliations;regulation by social norms:loose affiliations|thickening of preexisting relations;scope of loose relationships;social relations and norms:loose affiliations|thickening of preexisting relations;supplantation of real-world interaction;thickening of preexisting relations}
2~ FROM "VIRTUAL COMMUNITIES" TO FEAR OF DISINTEGRATION
@@ -2068,7 +2086,7 @@ The concerns represented by these early studies of the effects of Internet use o
={Coleman, James;Granovetter, Mark;Putnam, Robert}
There are, roughly speaking, two types of responses to these concerns. The first is empirical. In order for these concerns to be valid as applied to increasing use of Internet communications, it must be the case that Internet communications, with all of their inadequacies, come to supplant real-world human interactions, rather than simply to supplement them. Unless Internet connections actually displace direct, unmediated, human contact, there is no basis to think that using the Internet will lead to a decline in those nourishing connections we need psychologically, or in the useful connections we make socially, that are based on direct human contact with friends, family, and neighbors. The second response is theoretical. It challenges the notion that the socially embedded individual is a fixed entity with unchanging needs that are, or are not, fulfilled by changing social conditions and relations. Instead, it suggests that the "nature" of individuals changes over time, based on actual social practices and expectations. In this case, we are seeing a shift from individuals who depend on social relations that are dominated by locally embedded, thick, unmediated, given, and stable relations, into networked individuals--who are more dependent on their own combination of strong and weak ties, who switch networks, cross boundaries, and weave their own web of more or less instrumental, relatively fluid relationships. Manuel Castells calls this the "networked society,"~{ Manuel Castells, The Rise of the Network Society, 2d ed. (Malden, MA: Blackwell Publishers, Inc., 2000). }~ Barry Wellman, "networked individualism."~{ Barry Wellman et al., "The Social Affordances of the Internet for Networked Individualism," Journal of Computer-Mediated Communication 8, no. 3 (April 2003). }~ To simplify vastly, it is not that people cease to depend on others and their context for both psychological and social well-being and efficacy.
It is that the kinds of connections that we come to rely on for these basic human needs change over time. Comparisons of current practices to the old ways of achieving the desiderata of community, and fears regarding the loss of community, are more a form of nostalgia than a diagnosis of present social malaise. ,{[pg 363]},
-={Castells, Manuel;Wellman, Barry;displacement of real-world interaction+5;family relations, strengthening of+5;loose affiliations;neighborhood relations, strengthening of+5;networked public sphere:loose affiliations;norms (social):loose affiliations;peer production:loose affiliations;public sphere:loose affiliations;regulations by social norms:loose affiliations;social relations and norms:loose affiliations;supplantation of real-world interaction+5;thickening of preexisting relations+5}
+={Castells, Manuel;Wellman, Barry;displacement of real-world interaction+5;family relations, strengthening of+5;loose affiliations;neighborhood relations, strengthening of+5;networked public sphere:loose affiliations;norms (social):loose affiliations;peer production:loose affiliations;public sphere:loose affiliations;regulation by social norms:loose affiliations;social relations and norms:loose affiliations;supplantation of real-world interaction+5;thickening of preexisting relations+5}
3~ Users Increase Their Connections with Preexisting Relations
={e-mail:thickening of preexisting relations+4;social capital:thickening of preexisting relations+4}
@@ -2139,7 +2157,7 @@ Empirically, it seems that the Internet is allowing us to eat our cake and have
The conceptual answer has been that the image of "community" that seeks a facsimile of a distant pastoral village is simply the wrong image of how we interact as social beings. We are a networked society now--networked individuals connected with each other in a mesh of loosely knit, overlapping, flat connections. This does not leave us in a state of anomie. We are well-adjusted, networked individuals; well-adjusted socially in ways that those who seek community would value, but in new and different ways. In a substantial departure from the range of feasible communications channels available in the twentieth century, the Internet has begun to offer us new ways of connecting to each other in groups small and large. As we have come to take advantage of these new capabilities, we see social norms and software coevolving to offer new, more stable, and richer contexts for forging new relationships beyond those that in the past have been the focus of our social lives. These do not displace the older relations. They do not mark a fundamental shift in human nature into selfless, community-conscious characters. We continue to be complex beings, radically individual and self-interested ,{[pg 377]}, at the same time that we are entwined with others who form the context out of which we take meaning, and in which we live our lives. However, we now have new scope for interaction with others. We have new opportunities for building sustained limited-purpose relations, weak and intermediate-strength ties that have significant roles in providing us with context, with a source of defining part of our identity, with potential sources for support, and with human companionship. That does not mean that these new relationships will come to displace the centrality of our more immediate relationships.
They will, however, offer increasingly attractive supplements as we seek new and diverse ways to embed ourselves in relation to others, to gain efficacy in weaker ties, and to interpolate different social networks in combinations that provide us both stability of context and a greater degree of freedom from the hierarchical and constraining aspects of some of our social relations. ,{[pg 378]}, ,{[pg 379]},
-:C~ Part Three - Policies of Freedom at a Moment of Transformation
+:B~ Part Three - Policies of Freedom at a Moment of Transformation
1~p3 Introduction
@@ -2185,7 +2203,7 @@ The first two parts of this book explained why the introduction of digital compu
={commercial model of communication:mapping, framework for+13;industrial model of communication:mapping, framework for+13;institutional ecology of digital environment:mapping, framework for+13;layers of institutional ecology+13;policy:mapping institutional ecology+13;policy layers+13;traditional model of communication:mapping, framework for+13}
Two specific examples will illustrate the various levels at which law can operate to shape the use of information and its production and exchange. The first example builds on the story from chapter 7 of how embarrassing internal e-mails from Diebold, the electronic voting machine maker, were exposed by investigative journalism conducted on a nonmarket and peer-production model. After students at Swarthmore College posted the files, Diebold made a demand under the DMCA that the college remove the materials or face suit for contributory copyright infringement. The students were therefore forced to remove the materials. However, in order to keep the materials available, the students asked students at other institutions to mirror the files, and injected them into the eDonkey, BitTorrent, and FreeNet file-sharing and publication networks. Ultimately, a court held that the unauthorized publication of files that were not intended for sale and carried such high public value was a fair use. This meant that the underlying publication of the files was not itself a violation, and therefore the Internet service provider was not liable for providing a conduit. However, the case was decided on September 30, 2004--long after the information would have been relevant ,{[pg 390]}, to the voting equipment certification process in California. What kept the information available for public review was not the ultimate vindication of the students' publication. It was the fact that the materials were kept in the public sphere even under threat of litigation. Recall also that at least some of the earlier set of Diebold files that were uncovered by the activist who had started the whole process in early 2003 were zipped, or perhaps encrypted in some form.
Scoop, the Web site that published the revelation of the initial files, published--along with its challenge to the Internet community to scour the files and find holes in the system--links to locations in which utilities necessary for reading the files could be found.
-={Diebold Elections Systems+3;electronic voting machines (case study)+3;networked public sphere:Diebold Election Systems case study+3;policy:Diebold Election Systems case study+3;public sphere:Diebold Election Systems case study+3;voting, electronic+3}
+={Diebold Election Systems+3;electronic voting machines (case study)+3;networked public sphere:Diebold Election Systems case study+3;policy:Diebold Election Systems case study+3;public sphere:Diebold Election Systems case study+3;voting, electronic+3}
There are four primary potential points of failure in this story that could have conspired to prevent the revelation of the Diebold files, or at least to suppress the peer-produced journalistic mode that made them available. First, if the service provider--the college, in this case--had been a sole provider with no alternative physical transmission systems, its decision to block the materials under threat of suit would have prevented publication of the materials throughout the relevant period. Second, the existence of peer-to-peer networks that overlay the physical networks and were used to distribute the materials made expunging them from the Internet practically impossible. There was no single point of storage that could be locked down. This made the prospect of threatening other universities futile. Third, those of the original files that were not in plain text were readable with software utilities that were freely available on the Internet, and to which Scoop pointed its readers. This made the files readable to many more critical eyes than they otherwise would have been. Fourth, and finally, the fact that access to the raw materials--the e-mails--was ultimately found to be privileged under the fair-use doctrine in copyright law allowed all the acts that had been performed in the preceding period under a shadow of legal liability to proceed in the light of legality.
@@ -2205,7 +2223,7 @@ The remainder of this chapter provides a more or less detailed presentation of t
A quick look at table 11.1 reveals that there is a diverse set of sources of openness. A few of these are legal. Mostly, they are based on technological and social practices, including resistance to legal and regulatory drives toward enclosure. Examples of policy interventions that support an open core common infrastructure are the FCC's increased permission to deploy open wireless networks and the various municipal broadband initiatives. The former is a regulatory intervention, but its form is largely removal of past prohibitions on an entire engineering approach to building wireless systems. Municipal efforts to produce open broadband networks are being resisted at the state legislation level, with statutes that remove the power to provision broadband from the home rule powers of municipalities. For the most part, the drive for openness is based on individual and voluntary cooperative action, not law. The social practices of openness take on a quasi-normative face when practiced in standard-setting bodies like the Internet Engineering Task Force (IETF) or the World Wide Web Consortium (W3C). However, none of these have the force of law. Legal devices also support openness when used in voluntaristic models like free software licensing and Creative Commons-type licensing. However, most often when law has intervened in its regulatory force, as opposed to its contractual-enablement force, it has done so almost entirely on the side of proprietary enclosure.
Another characteristic of the social-economic-institutional struggle is an alliance between a large number of commercial actors and the social sharing culture. We see this in the way that wireless equipment manufacturers are selling into a market of users of WiFi and similar unlicensed wireless devices. We see this in the way that personal computer manufacturers are competing ,{[pg 395]}, over decreasing margins by producing the most general-purpose machines that would be most flexible for their users, rather than machines that would most effectively implement the interests of Hollywood and the recording industry. We see this in the way that service and equipment-based firms, like IBM and Hewlett-Packard (HP), support open-source and free software. The alliance between the diffuse users and the companies that are adapting their business models to serve them as users, instead of as passive consumers, affects the political economy of this institutional battle in favor of openness. On the other hand, security consciousness in the United States has led to some efforts to tip the balance in favor of closed proprietary systems, apparently because these are currently perceived as more secure, or at least more amenable to government control. While orthogonal in its political origins to the battle between proprietary and commons-based strategies for information production, this drive does tilt the field in favor of enclosure, at least at the time of this writing in 2005.
-={commercial model of communication:security related policy;industrial model of communication:security-related policy;institutional ecology of digital environment:security-related policy;policy:security-related;security-related policy;traditional model of communication:security-related policy}
+={commercial model of communication:security-related policy;industrial model of communication:security-related policy;institutional ecology of digital environment:security-related policy;policy:security-related;security-related policy;traditional model of communication:security-related policy}
% paragraph end moved above table
diff --git a/data/v1/samples/two_bits.christopher_kelty.sst b/data/v1/samples/two_bits.christopher_kelty.sst
index 39e34b6..1c833c8 100644
--- a/data/v1/samples/two_bits.christopher_kelty.sst
+++ b/data/v1/samples/two_bits.christopher_kelty.sst
@@ -108,7 +108,7 @@ At first glance, the thread tying these projects together seems to be the Intern
={Internet+12:relation to Free Software;Free Software:relation to Internet;public sphere:theories of}
Both the Internet and Free Software are historically specific, that is, not just any old new media or information technology. But the Internet is many, many specific things to many, many specific people. As one reviewer of an early manuscript version of this book noted, "For most people, the Internet is porn, stock quotes, Al Jazeera clips of executions, Skype, seeing pictures of the grandkids, porn, never having to buy another encyclopedia, MySpace, e-mail, online housing listings, Amazon, Googling potential romantic interests, etc. etc." It is impossible to explain all of these things; the meaning and significance of the proliferation of digital pornography is a very different concern than that of the fall of the print encyclopedia ,{[pg 5]}, and the rise of Wikipedia. Yet certain underlying practices relate these diverse phenomena to one another and help explain why they have occurred at this time and in this technical, legal, and social context. By looking carefully at Free Software and its modulations, I suggest, one can come to a better understanding of the changes affecting pornography, Wikipedia, stock quotes, and many other wonderful and terrifying things.~{ Wikipedia is perhaps the most widely known and generally familiar example of what this book is about. Even though it is not identified as such, it is in fact a Free Software project and a "modulation" of Free Software as I describe it here. The non-technically inclined reader might keep Wikipedia in mind as an example with which to follow the argument of this book. I will return to it explicitly in part 3. However, for better or for worse, there will be no discussion of pornography. }~
-={Wikipedia}
+={Wikipedia (collaborative encyclopedia)}
Two Bits has three parts. Part I of this book introduces the reader to the concept of recursive publics by exploring the lives, works, and discussions of an international community of geeks brought together by their shared interest in the Internet. Chapter 1 asks, in an ethnographic voice, "Why do geeks associate with one another?" The answer—told via the story of Napster in 2000 and the standards process at the heart of the Internet—is that they are making a recursive public. Chapter 2 explores the words and attitudes of geeks more closely, focusing on the strange stories they tell (about the Protestant Reformation, about their practical everyday polymathy, about progress and enlightenment), stories that make sense of contemporary political economy in sometimes surprising ways. Central to part I is an explication of the ways in which geeks argue about technology but also argue with and through it, by building, modifying, and maintaining the very software, networks, and legal tools within which and by which they associate with one another. It is meant to give the reader a kind of visceral sense of why certain arrangements of technology, organization, and law—specifically that of the Internet and Free Software—are so vitally important to these geeks.
={geeks;Napster;technology:as argument}
@@ -223,7 +223,7 @@ The study of distributed phenomena does not necessarily imply the detailed, loca
={Weber, Max}
It is in this sense that the ethnographic object of this study is not geeks and not any particular project or place or set of people, but Free Software and the Internet. Even more precisely, the ethnographic object of this study is "recursive publics"—except that this concept is also the work of the ethnography, not its preliminary object. I could not have identified "recursive publics" as the object of the ethnography at the outset, and this is nice proof that ethnographic work is a particular kind of epistemological encounter, an encounter that requires considerable conceptual work during and after the material labor of fieldwork, and throughout the material labor of writing and rewriting, in order to make sense of and reorient it into a question that will have looked deliberate and ,{[pg 21]}, answerable in hindsight. Ethnography of this sort requires a long-term commitment and an ability to see past the obvious surface of rapid transformation to a more obscure and slower temporality of cultural significance, yet still pose questions and refine debates about the near future.~{ Despite what might sound like a "shoot first, ask questions later" approach, the design of this project was in fact conducted according to specific methodologies. The most salient is actor-network theory: Latour, Science in Action; Law, "Technology and Heterogeneous Engineering"; Callon, "Some Elements of a Sociology of Translation"; Latour, Pandora’s Hope; Latour, Re-assembling the Social; Callon, Laws of the Markets; Law and Hassard, Actor Network Theory and After. Ironically, there have been no actor-network studies of networks, which is to say, of particular information and communication technologies such as the Internet. 
The confusion of the word network (as an analytical and methodological term) with that of network (as a particular configuration of wires, waves, software, and chips, or of people, roads, and buses, or of databases, names, and diseases) means that it is necessary to always distinguish this-network-here from any-network-whatsoever. My approach shares much with the ontological questions raised in works such as Law, Aircraft Stories; Mol, The Body Multiple; Cussins, "Ontological Choreography"; Charis Thompson, Making Parents; and Dumit, Picturing Personhood. }~ Historically speaking, the chapters of part II can be understood as a contribution to a history of scientific infrastructure—or perhaps to an understanding of large-scale, collective experimentation.~{ I understand a concern with scientific infrastructure to begin with Steve Shapin and Simon Schaffer in Leviathan and the Air Pump, but the genealogy is no doubt more complex. It includes Shapin, The Social History of Truth; Biagioli, Galileo, Courtier; Galison, How Experiments End and Image and Logic; Daston, Biographies of Scientific Objects; Johns, The Nature of the Book. A whole range of works explore the issue of scientific tools and infrastructure: Kohler, Lords of the Fly; Rheinberger, Towards a History of Epistemic Things; Landecker, Culturing Life; Keating and Cambrosio, Biomedical Platforms. Bruno Latour’s "What Rules of Method for the New Socio-scientific Experiments" provides one example of where science studies might go with these questions. Important texts on the subject of technical infrastructures include Walsh and Bayma, "Computer Networks and Scientific Work"; Bowker and Star, Sorting Things Out; Edwards, The ,{[pg 316]}, Closed World; Misa, Brey, and Feenberg, Modernity and Technology; Star and Ruhleder, "Steps Towards an Ecology of Infrastructure." 
}~ The Internet and Free Software are each an important practical transformation that will have effects on the practice of science and a kind of complex technical practice for which there are few existing models of study.
-={actor network theory;Internet+1}
+={Actor Network Theory;Internet+1}
A methodological note about the peculiarity of my subject is also in order. The Attentive Reader will note that there are very few fragments of conventional ethnographic material (i.e., interviews or notes) transcribed herein. Where they do appear, they tend to be "publicly available"—which is to say, accessible via the Internet—and are cited as such, with as much detail as necessary to allow the reader to recover them. Conventional wisdom in both anthropology and history has it that what makes a study interesting, in part, is the work a researcher has put into gathering that which is not already available, that is, primary sources as opposed to secondary sources. In some cases I provide that primary access (specifically in chapters 2, 8, and 9), but in many others it is now literally impossible: nearly everything is archived. Discussions, fights, collaborations, talks, papers, software, articles, news stories, history, old software, old software manuals, reminiscences, notes, and drawings—it is all saved by someone, somewhere, and, more important, often made instantly available by those who collect it. The range of conversations and interactions that count as private (either in the sense of disappearing from written memory or of being accessible only to the parties involved) has shrunk demonstrably since about 1981.
={ethnographic data:availability of+5}
@@ -307,7 +307,7 @@ _1 2. Boyle, "The Second Enclosure Movement and the Construction of the Public D
2~ From the Facts of Human Activity
Boston, May 2003. Starbucks. Sean and Adrian are on their way to pick me up for dinner. I’ve already had too much coffee, so I sit at the window reading the paper. Eventually Adrian calls to find out where I am, I tell him, and he promises to show up in fifteen minutes. I get bored and go outside to wait, watch the traffic go by. More or less right on time (only post-dotcom is Adrian ever on time), Sean’s new blue VW Beetle rolls into view. Adrian jumps out of the passenger seat and into the back, and I get in. Sean has been driving for a little over a year. He seems confident, cautious, but meanders through the streets of Cambridge. We are destined for Winchester, a township on the Charles River, in order to go to an Indian restaurant that one of Sean’s friends has recommended. When I ask how they are doing, they say, "Good, good." Adrian offers, "Well, Sean’s better than he has been in two years." "Really?" I say, impressed.
-={Doyle, Sean+6;Groper Adrian+6}
+={Doyle, Sean+6;Gropper, Adrian+6}
Sean says, "Well, happier than at least the last year. I, well, let me put it this way: forgive me father for I have sinned, I still have unclean thoughts about some of the upper management in the company, I occasionally think they are not doing things in the best interest of the company, and I see them as self-serving and sometimes wish them ill." In this rolling blue confessional Sean describes some of the people who I am familiar with whom he now tries very hard not to think about. I look at him and say, "Ten Hail Marys and ten Our Fathers, and you will be absolved, my child." Turning to Adrian, I ask, "And what about you?" Adrian continues the joke: "I, too, have sinned. I have reached the point where I can see absolutely nothing good coming of this company but that I can keep my investments in it long enough to pay for my children’s college tuition." I say, "You, my son, I cannot help." Sean says, "Well, funny thing about tainted money . . . there just taint enough of it."
@@ -1120,7 +1120,7 @@ The absence of an economic or corporate mandate for Thompson’s and Ritchie’s
={AT&T+14;McIlroy, Douglas}
UNIX was unique for many technical reasons, but also for a specific economic reason: it was never quite academic and never quite commercial. Martin Campbell-Kelly notes that UNIX was a "non-proprietary operating system of major significance."~{ Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog, 143. }~ Kelly’s use of "non-proprietary" is not surprising, but it is incorrect. Although business-speak regularly opposed open to proprietary throughout the 1980s and early 1990s (and UNIX was definitely the former), Kelly’s slip marks clearly the confusion between software ownership and software distribution that permeates both popular and academic understandings. UNIX was indeed proprietary—it was copyrighted and wholly owned by Bell Labs and in turn by Western Electric ,{[pg 127]}, and AT&T—but it was not exactly commercialized or marketed by them. Instead, AT&T allowed individuals and corporations to install UNIX and to create UNIX-like derivatives for very low licensing fees. Until about 1982, UNIX was licensed to academics very widely for a very small sum: usually royalty-free with a minimal service charge (from about $150 to $800).~{ Ritchie’s Web site contains a copy of a 1974 license (http://cm.bell-labs.com/cm/cs/who/dmr/licenses.html) and a series of ads that exemplify the uneasy positioning of UNIX as a commercial product (http://cm.bell-labs.com/cm/cs/who/dmr/unixad.html). According to Don Libes and Sandy Ressler, "The original licenses were source licenses. . . . [C]ommercial institutions paid fees on the order of $20,000. If you owned more than one machine, you had to buy binary licenses for every additional machine [i.e., you were not allowed to copy the source and install it] you wanted to install UNIX on. They were fairly pricey at $8000, considering you couldn’t resell them. 
On the other hand, educational institutions could buy source licenses for several hundred dollars—just enough to cover Bell Labs’ administrative overhead and the cost of the tapes" (Life with UNIX, 20-21). }~ The conditions of this license allowed researchers to do what they liked with the software so long as they kept it secret: they could not distribute or use it outside of their university labs (or use it to create any commercial product or process), nor publish any part of it. As a result, throughout the 1970s UNIX was developed both by Thompson and Ritchie inside Bell Labs and by users around the world in a relatively informal manner. Bell Labs followed such a liberal policy both because it was one of a small handful of industry-academic research and development centers and because AT&T was a government monopoly that provided phone service to the country and was therefore forbidden to directly enter the computer software market.~{ According to Salus, this licensing practice was also a direct result of Judge Thomas Meaney’s 1956 antitrust consent decree which required AT&T to reveal and to license its patents for nominal fees (A Quarter Century of UNIX, 56); see also Brock, The Second Information Revolution, 116-20. }~
-={AT&T:Bell Labratories+13;licensing, of UNIX+6;proprietary systems: open vs.;monopoly}
+={AT&T:Bell Laboratories+13;licensing, of UNIX+6;proprietary systems: open vs.;monopoly}
Being on the border of business and academia meant that UNIX was, on the one hand, shielded from the demands of management and markets, allowing it to achieve the conceptual integrity that made it so appealing to designers and academics. On the other, it also meant that AT&T treated it as a potential product in the emerging software industry, which included new legal questions from a changing intellectual-property regime, novel forms of marketing and distribution, and new methods of developing, supporting, and distributing software.
@@ -1174,7 +1174,7 @@ Unfortunately, Commentary was also legally restricted in its distribution. AT&T
={trade secret law+1}
Thus, these generations of computer-science students and academics shared a secret—a trade secret become open secret. Every student who learned the essentials of the UNIX operating system from a photocopy of Lions’s commentary also learned about AT&T’s attempt to control its legal distribution on the front cover of their textbook. The parallel development of photocopying has a nice resonance here; together with home cassette taping of music and the introduction of the video-cassette recorder, photocopying helped drive the changes to copyright law adopted in 1976.
-={copyright:changes in}
+={copyright:changes in 1976}
Thirty years later, and long after the source code in it had been completely replaced, Lions’s Commentary is still widely admired by geeks. Even though Free Software has come full circle in providing students with an actual operating system that can be legally studied, taught, copied, and implemented, the kind of "literary criticism" that Lions’s work represents is still extremely rare; even reading obsolete code with clear commentary is one of the few ways to truly understand the design elements and clever implementations that made the UNIX operating system so different from its predecessors and even many of its successors, few, if any, of which have been so successfully ported to the minds of so many students.
={design+2}
@@ -1255,7 +1255,7 @@ The open-systems story is also a story of the blind spot of open systems—in th
={intellectual property;interoperability+21;openness (component of Free Software):intellectual property and}
Standardization was at the heart of the contest, but by whom and by what means was never resolved. The dream of open systems, pursued in an entirely unregulated industry, resulted in a complicated experiment in novel forms of standardization and cooperation. The creation of a "standard" operating system based on UNIX is the story of a failure, a kind of "figuring out" gone haywire, which resulted in huge consortia of computer manufacturers attempting to work together and compete with each other at the same time. Meanwhile, the successful creation of a "standard" networking protocol—known as the Open Systems Interconnection Reference Model (OSI)—is a story of failure that hides a larger success; OSI was eclipsed in the same period by the rapid and ad hoc adoption of the Transmission Control Protocol/Internet Protocol (TCP/IP), which used a radically different standardization process and which succeeded for a number of surprising reasons, allowing the Internet ,{[pg 145]}, to take the form it did in the 1990s and ultimately exemplifying the moral-technical imaginary of a recursive public—and one at the heart of the practices of Free Software.
-={figuring out;Open Systems Interconnection (OSI), as reference model;Openness (component of Free Software):standardization and;protocols:Open Systems Interconnection (OSI)|TCP/IP;standards organizations;TCP/IP (Transmission Control Protocol/Internet Protocol)}
+={figuring out;Open Systems Interconnection (OSI):as reference model;Openness (component of Free Software):standardization and;protocols:Open Systems Interconnection (OSI)|TCP/IP;standards organizations;TCP/IP (Transmission Control Protocol/Internet Protocol)}
The conceiving of openness, which is the central plot of these two stories, has become an essential component of the contemporary practice and power of Free Software. These early battles created a kind of widespread readiness for Free Software in the 1990s, a recognition of Free Software as a removal of open systems’ blind spot, as much as an exploitation of its power. The geek ideal of openness and a moral-technical order (the one that made Napster so significant an event) was forged in the era of open systems; without this concrete historical conception of how to maintain openness in technical and moral terms, the recursive public of geeks would be just another hierarchical closed organization—a corporation manqué—and not an independent public serving as a check on the kinds of destructive power that dominated the open-systems contest.
={Napster}
@@ -1441,7 +1441,7 @@ The growth of Free Software in the 1980s and 1990s depended on openness as a con
={Open Systems:networks and+28}
The struggle to standardize UNIX as a platform for open systems was not the only open-systems struggle; alongside the UNIX wars, another "religious war" was raging. The attempt to standardize networks—in particular, protocols for the inter-networking of multiple, diverse, and autonomous networks of computers—was also a key aspect of the open-systems story of the 1980s.~{ The distinction between a protocol, an implementation, and a standard is important: Protocols are descriptions of the precise terms by which two computers can communicate (i.e., a dictionary and a handbook for communicating). An implementation is the creation of software that uses a protocol (i.e., actually does the communicating); thus two implementations using the same protocol should be able to share data. A standard defines which protocol should be used by which computers, for what purposes. It may or may not define the protocol, but will set limits on changes to that protocol. }~ The war ,{[pg 167]}, between TCP/IP and OSI was also a story of failure and surprising success: the story of a successful standard with international approval (the OSI protocols) eclipsed by the experimental, military-funded TCP/IP, which exemplified an alternative and unusual standards process. The moral-technical orders expressed by OSI and TCP/IP are, like that of UNIX, on the border between government, university, and industry; they represent conflicting social imaginaries in which power and legitimacy are organized differently and, as a result, expressed differently in the technology.
-={moral and technical order;Networks:protools for+3;Open Systems Interconnection (OSI), as reference model+27;protocols:Open Systems Interconnection (OSI)+27|TCP/IP;TCP/IP (Transmission Control Protocol/Internet Protocol)+27;religious wars+3;social imaginary;standards process+3}
+={moral and technical order;Networks:protocols for+3;Open Systems Interconnection (OSI):as reference model+27;protocols:Open Systems Interconnection (OSI)+27|TCP/IP;TCP/IP (Transmission Control Protocol/Internet Protocol)+27;religious wars+3;social imaginary;standards processes+3}
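The footnote's three-way distinction can be sketched in code. In this toy illustration (entirely hypothetical, not drawn from the book or from any real wire protocol), the protocol is an agreed message format, and two independently written implementations interoperate because both follow it; a standard would merely mandate which protocol to use:

```python
# Toy protocol (hypothetical): a message on the wire is "<length>:<payload>"
# in ASCII. The *protocol* is this agreement; the two functions below are
# independent *implementations* of it.

def encode_a(payload: str) -> bytes:
    """Implementation A: its author knows only the wire format."""
    data = payload.encode("ascii")
    return str(len(data)).encode("ascii") + b":" + data

def decode_b(wire: bytes) -> str:
    """Implementation B: written separately, sharing only the protocol."""
    length_part, _, rest = wire.partition(b":")
    return rest[: int(length_part)].decode("ascii")

# Because both follow the same protocol, they can share data, which is
# exactly the interoperability the footnote describes.
assert decode_b(encode_a("hello")) == "hello"
```

A standard, in the footnote's terms, would be a separate document saying "use this wire format for this purpose," without itself doing any communicating.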
OSI and TCP/IP started with different goals: OSI was intended to satisfy everyone, to be the complete and comprehensive model against which all competing implementations would be validated; TCP/IP, by contrast, emphasized the easy and robust interconnection of diverse networks. TCP/IP is a protocol developed by bootstrapping between standard and implementation, a mode exemplified by the Requests for Comments system that developed alongside them as part of the Arpanet project. OSI was a "model" or reference standard developed by internationally respected standards organizations.
={Arpanet (network)+18;Request for Comments (RFC)}
@@ -1467,7 +1467,7 @@ One important feature united almost all of these experiments: the networks of th
={antitrust}
TCP/IP and OSI have become emblematic of the split between the worlds of telecommunications and computing; the metaphors of religious wars or of blood feuds and cold wars were common.~{ Drake, "The Internet Religious War." }~ A particularly arch account from this period is Carl Malamud’s Exploring the Internet: A Technical Travelogue, which documents Malamud’s (physical) visits to Internet sites around the globe, discussions (and beer) with networking researchers on technical details of the networks they have created, and his own typically geeky, occasionally offensive takes on cultural difference.~{ Malamud, Exploring the Internet; see also Michael M. J. Fischer, "Worlding Cyberspace." }~ A subtheme of the story is the religious war between Geneva (in particular the ITU) and the Internet: Malamud tells the story of asking the ITU to release its 19,000-page "blue book" of standards on the Internet, to facilitate its adoption and spread.
-={Malmud, Carl+1;standards process+4}
+={Malamud, Carl+1;standards processes+4}
The resistance of the ITU and Malamud’s heroic if quixotic attempts are a parable of the moral-technical imaginaries of openness—and indeed, his story draws specifically on the usable past of Giordano Bruno.~{ The usable past of Giordano Bruno is invoked by Malamud to signal the heretical nature of his own commitment to openly publishing standards that ISO was opposed to releasing. Bruno’s fate at the hands of the Roman Inquisition hinged in some part on his acceptance of the Copernican cosmology, so he has been, like Galileo, a natural figure for revolutionary claims during the 1990s. }~ The "bruno" project demonstrates the gulf that exists between two models of legitimacy—those of ISO and the ITU—in which standards represent the legal and legitimate consensus of a regulated industry, approved by member nations, paid for and enforced by governments, and implemented and adhered to by corporations.
={Bruno, Giordano;Usable pasts;International Organization for Standardization (ISO)+3}
@@ -1486,10 +1486,10 @@ Until the mid-1980s, the TCP/IP protocols were resolutely research-oriented, and
={Cerf, Vinton+2;Kahn, Robert;TCP/IP (Transmission Control Protocol/Internet Protocol):goals of+2}
The explicit goal of TCP/IP was thus to share computer resources, not necessarily to connect two individuals or firms together, or to create a competitive market in networks or networking software. Sharing between different kinds of networks implied allowing the different networks to develop autonomously (as their creators and maintainers saw best), but without sacrificing the ability to continue sharing. Years later, David Clark, chief Internet engineer for several years in the 1980s, gave a much more explicit explanation of the goals that led to the TCP/IP protocols. In particular, he suggested that the main overarching goal was not just to share resources but "to develop an effective technique for multiplexed utilization of existing interconnected networks," and he more explicitly stated the issue of control that faced the designers: "Networks represent administrative boundaries of control, and it was an ambition of this project to come to grips with the problem of integrating a number ,{[pg 173]}, of separately administrated entities into a common utility."~{ Clark, "The Design Philosophy of the DARPA Internet Protocols," 54-55. }~ By placing the goal of expandability first, the TCP/IP protocols were designed with a specific kind of simplicity in mind: the test of the protocols’ success was simply the ability to connect.
-={Clark,David}
+={Clark, David}
By setting different goals, TCP/IP and OSI thus differed in terms of technical details; but they also differed in terms of their context and legitimacy, one being a product of international-standards bodies, the other of military-funded research experiments. The technical and organizational differences imply different processes for standardization, and it is the peculiar nature of the so-called Requests for Comments (RFC) process that gave TCP/IP one of its most distinctive features. The RFC system is widely recognized as a unique and serendipitous outcome of the research process of Arpanet.~{ RFCs are archived in many places, but the official site is RFC Editor, http://www.rfc-editor.org/. }~ In a thirty-year retrospective (published, naturally, as an RFC: RFC 2555), Vint Cerf says, "Hiding in the history of the RFCs is the history of human institutions for achieving cooperative work." He goes on to describe their evolution over the years: "When the RFCs were first produced, they had an almost 19th century character to them—letters exchanged in public debating the merits of various design choices for protocols in the ARPANET. As email and bulletin boards emerged from the fertile fabric of the network, the far-flung participants in this historic dialog began to make increasing use of the online medium to carry out the discussion—reducing the need for documenting the debate in the RFCs and, in some respects, leaving historians somewhat impoverished in the process. RFCs slowly became conclusions rather than debates."~{ RFC Editor, RFC 2555, 6. }~
-={standards process;Request for Comments (RFC)+2}
+={standards processes;Request for Comments (RFC)+2}
Increasingly, they also became part of a system of discussion and implementation in which participants created working software as part of an experiment in developing the standard, after which there was more discussion, then perhaps more implementation, and finally, a standard. The RFC process was a way to condense the process of standardization and validation into implementation; which is to say, the proof of open systems was in the successful connection of diverse networks, and the creation of a standard became a kind of ex post facto rubber-stamping of this demonstration. Any further improvement of the standard hinged on an improvement on the standard implementation because the standards that resulted were freely and widely available: "A user could request an RFC by email from his host computer and have it automatically delivered to his mailbox. . . . RFCs were also shared freely with official standards ,{[pg 174]}, bodies, manufacturers and vendors, other working groups, and universities. None of the RFCs were ever restricted or classified. This was no mean feat when you consider that they were being funded by DoD during the height of the Cold War."~{ Ibid., 11. }~
={Software:implementation of;standards:implementation+9|validation of;Secrecy+1}
@@ -1568,7 +1568,7 @@ Stallman’s GNU General Public License "hacks" the federal copyright law, as is
={Copyleft licenses (component of Free Software):as hack of copyright law+1;Copyright+1}
Like all software since the 1980 copyright amendments, Free Software is copyrightable—and what’s more, automatically copyrighted as it is written (there is no longer any requirement to register). Copyright law grants the author (or the employer of the author) a number of strong rights over the dispensation of what has been written: rights to copy, distribute, and change the work.~{ Copyright Act of 1976, Pub. L. No. 94-553, 90 Stat. 2541, enacted 19 October 1976; and Copyright Amendments, Pub. L. No. 96-517, 94 Stat. 3015, 3028 (amending §101 and §117, title 17, United States Code, regarding computer programs), enacted 12 December 1980. All amendments since 1976 are listed at http://www.copyright.gov/title17/92preface.html. }~ Free Software’s hack is to immediately make use of these rights in order to abrogate the rights the programmer has been given, thus granting all subsequent licensees rights to copy, distribute, modify, and use the copyrighted software. Some licenses, like the GPL, add the further restriction that every licensee must offer the same terms to any subsequent licensee; others make no such restriction on subsequent uses. Thus, while statutory law suggests that individuals need strong rights and grants them, Free Software licenses effectively annul them in favor of other activities, such as sharing, porting, and forking software. It is for this reason that they have earned the name "copyleft."~{ The history of copyright and software is discussed in Litman, Digital Copyright; Cohen et al., Copyright in a Global Information Economy; and Merges, Menell, and Lemley, Intellectual Property in the New Technological Age. }~
-={Copyright:changes in|rights granted by}
+={Copyright:changes in 1976|rights granted by}
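The paragraph's logic can be pictured as a rule about what terms a derivative work may carry. The sketch below is a toy model, not legal advice and not from the book; the license labels are invented stand-ins for the families it describes: copyleft propagates the same terms downstream, a permissive grant does not, and unlicensed copyright reserves everything.

```python
# Hypothetical labels for three dispensations of a copyrighted program.
COPYLEFT = "GPL-style"             # grants rights, requires the same terms downstream
PERMISSIVE = "BSD-style"           # grants rights, imposes no terms downstream
RESERVED = "all rights reserved"   # the statutory default: no grant at all

def derivative_terms(parent: str) -> str:
    """What terms may a derivative work carry under each dispensation?"""
    if parent == COPYLEFT:
        return COPYLEFT    # the restriction travels with every licensee
    if parent == PERMISSIVE:
        return "any"       # even proprietary terms are permitted
    return "none"          # no right to make the derivative at all

# Copyleft is the fixed point of the "hack": however many generations of
# derivatives are made, the terms remain the same.
terms = COPYLEFT
for _ in range(3):
    terms = derivative_terms(terms)
assert terms == COPYLEFT
```

The fixed-point behavior is what distinguishes copyleft from a simple grant of rights: the license uses copyright's strength to make the grant self-perpetuating.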
This is a convenient ex post facto description, however. Neither Stallman nor anyone else started out with the intention of hacking copyright law. The hack of the Free Software licenses was a response to a complicated controversy over a very important invention, a tool that in turn enabled an invention called EMACS. The story of the controversy is well-known among hackers and geeks, but not often told, and not in any rich detail, outside of these small circles.~{ See Wayner, Free for All; Moody, Rebel Code; and Williams, Free as in Freedom. Although this story could be told simply by interviewing Stallman and James Gosling, both of whom are still alive and active in the software world, I have chosen to tell it through a detailed analysis of the Usenet and Arpanet archives of the controversy. The trade-off is between a kind of incomplete, fly-on-the-wall access to a moment in history and the likely revisionist retellings of those who lived through it. All of the messages referenced here are cited by their "Message-ID," which should allow anyone interested to access the original messages through Google Groups (http://groups.google.com). }~
@@ -1854,10 +1854,10 @@ The final component of Free Software is coordination. For many participants and
={Free Software:open source vs.;Open Source:Free Software vs.;peer production;practices:five components of Free Software+2;Source Code Management tools (SCMs)}
Coordination is important because it collapses and resolves the distinction between technical and social forms into a meaningful ,{[pg 211]}, whole for participants. On the one hand, there is the coordination and management of people; on the other, there is the coordination of source code, patches, fixes, bug reports, versions, and distributions—but together there is a meaningful technosocial practice of managing, decision-making, and accounting that leads to the collaborative production of complex software and networks. Such coordination would be unexceptional, essentially mimicking long-familiar corporate practices of engineering, except for one key fact: it has no goals. Coordination in Free Software privileges adaptability over planning. This involves more than simply allowing any kind of modification; the structure of Free Software coordination actually gives precedence to a generalized openness to change, rather than to the following of shared plans, goals, or ideals dictated or controlled by a hierarchy of individuals.~{ On the distinction between adaptability and adaptation, see Federico Iannacci, "The Linux Managing Model," http://opensource.mit.edu/papers/iannacci2.pdf. Matt Ratto characterizes the activity of Linux-kernel developers as a "culture of re-working" and a "design for re-design," and captures the exquisite details of such a practice both in coding and in the discussion between developers, an activity he dubs the "pressure of openness" that "results as a contradiction between the need to maintain productive collaborative activity and the simultaneous need to remain open to new development directions" ("The Pressure of Openness," 112-38). }~
-={adaptability:planning vs.+1|as a form of critique+1|adaptation vs.;coordination (component of Free Software):individual virtuosity vs. hierarchical planning+2;critique, Free Software+1;goals, lack of in Free Software+1;hackers:curiosity and virtuosity of+1;hierarchy, in coordination+5;planning+1}
+={adaptability:planning vs.+1|as a form of critique+1|adaptation vs.;coordination (component of Free Software):individual virtuosity vs. hierarchical planning+2;critique, Free Software as+1;goals, lack of in Free Software+1;hackers:curiosity and virtuosity of+1;hierarchy, in coordination+5;planning+1}
Adaptability does not mean randomness or anarchy, however; it is a very specific way of resolving the tension between the individual curiosity and virtuosity of hackers, and the collective coordination necessary to create and use complex software and networks. No man is an island, but no archipelago is a nation, so to speak. Adaptability preserves the "joy" and "fun" of programming without sacrificing the careful engineering of a stable product. Linux and Apache should be understood as the results of this kind of coordination: experiments with adaptability that have worked, to the surprise of many who have insisted that complexity requires planning and hierarchy. Goals and planning are the province of governance—the practice of goal-setting, orientation, and definition of control—but adaptability is the province of critique, and this is why Free Software is a recursive public: it stands outside power and offers powerful criticism in the form of working alternatives. It is not the domain of the new—after all Linux is just a rewrite of UNIX—but the domain of critical and responsive public direction of a collective undertaking.
-={Linux (Free Software project)+8;novelty, of free software;recursive public+1}
+={Linux (Free Software project)+8;novelty, of Free Software;recursive public+1}
Linux and Apache are more than pieces of software; they are organizations of an unfamiliar kind. My claim that they are "recursive publics" is useful insofar as it gives a name to a practice that is neither corporate nor academic, neither profit nor nonprofit, neither governmental nor nongovernmental. The concept of recursive public includes, within the spectrum of political activity, the creation, modification, and maintenance of software, networks, and legal documents. While a "public" in most theories is a body of ,{[pg 212]}, people and a discourse that give expressive form to some concern, "recursive public" is meant to suggest that geeks not only give expressive form to some set of concerns (e.g., that software should be free or that intellectual property rights are too expansive) but also give concrete infrastructural form to the means of expression itself. Linux and Apache are tools for creating networks by which expression of new kinds can be guaranteed and by which further infrastructural experimentation can be pursued. For geeks, hacking and programming are variants of free speech and freedom of assembly.
={public sphere:theories of;Apache (Free Software project)+4;experimentation;infrastructure}
@@ -2083,7 +2083,7 @@ Both the Apache project and the Linux kernel project use SCMs. In the case of Ap
While SCMs are in general good for managing conflicting changes, they can do so only up to a point. To allow anyone to commit a change, however, could result in a chaotic mess, just as difficult to disentangle as it would be without an SCM. In practice, therefore, most projects designate a handful of people as having the right to "commit" changes. The Apache project retained its voting scheme, for instance, but it became a way of voting for "committers" instead of for patches themselves. Trusted committers—those with the mysterious "good taste," or technical intuition—became the core members of the group.
The Linux kernel has also struggled with various issues surrounding SCMs and the management of responsibility they imply. The story of the so-called VGER tree and the creation of a new SCM called Bitkeeper is exemplary in this respect.~{ See Steven Weber, The Success of Open Source, 117-19; Moody, Rebel Code, 172-78. See also Shaikh and Cornford, "Version Management Tools." }~ By 1997, Linux developers had begun to use cvs to manage changes to the source code, though not without resistance. Torvalds was still in charge of the changes to the official stable tree, but as other "lieutenants" came on board, the complexity of the changes to the kernel grew. One such lieutenant was Dave Miller, who maintained a "mirror" of the stable Linux kernel tree, the VGER tree, on a server at Rutgers. In September 1998 a fight broke out among Linux kernel developers over two related issues: one, the fact that Torvalds was failing to incorporate (patch) contributions that had been forwarded to him by various people, including his lieutenants; and two, as a result, the VGER cvs repository was no longer in synch with the stable tree maintained by Torvalds. Two different versions of Linux threatened to emerge.
-={Miller, Dave;Source Code Management tools (SCMs):see also Bitkeeper;Concurrent Versioning System (cvs):Linux and;Linux (Free Software project):VGER tree and+2;Bitkeeper (Source Code Management software)+12;Torvalds, Linux:in bitkeeper controversy+12}
+={Miller, Dave;Source Code Management tools (SCMs):see also Bitkeeper;Concurrent Versioning System (cvs):Linux and;Linux (Free Software project):VGER tree and+2;Bitkeeper (Source Code Management software)+12;Torvalds, Linus:in bitkeeper controversy+12}
A great deal of yelling ensued, as nicely captured in Moody’s Rebel Code, culminating in the famous phrase, uttered by Larry McVoy: "Linus does not scale." The meaning of this phrase is that the ability of Linux to grow into an ever larger project with increasing complexity, one which can handle myriad uses and functions (to "scale" up), is constrained by the fact that there is only one Linus Torvalds. By all accounts, Linus was and is excellent at what he does—but there is only one Linus. The danger of this situation is the danger of a fork. A fork would mean one or more new versions would proliferate under new leadership, a situation much like ,{[pg 233]}, the spread of UNIX. Both the licenses and the SCMs are designed to facilitate this, but only as a last resort. Forking also implies dilution and confusion—competing versions of the same thing and potentially unmanageable incompatibilities.
={McVoy, Larry+11;Moody, Glyn;forking:in Linux+1}
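The VGER problem can be made concrete with a small sketch (hypothetical, and far simpler than what cvs or Bitkeeper actually track): treat each tree as a sequence of patches, and once the official tree and the mirror stop exchanging patches, each holds work the other lacks, which is the "two different versions of Linux" scenario.

```python
def diverged_since(official: list[str], mirror: list[str]) -> tuple[list[str], list[str]]:
    """Return the patches unique to each tree after their common prefix."""
    i = 0
    while i < min(len(official), len(mirror)) and official[i] == mirror[i]:
        i += 1
    return official[i:], mirror[i:]

# Patch names here are invented for illustration.
official = ["base", "vm-fix", "patch-from-linus"]
mirror = ["base", "vm-fix", "patch-sent-to-linus", "sparc-port"]

only_official, only_mirror = diverged_since(official, mirror)
# Both trees now hold patches the other lacks: a fork in the making.
assert only_official == ["patch-from-linus"]
assert only_mirror == ["patch-sent-to-linus", "sparc-port"]
```

An SCM's job at this point is to help merge the two unique suffixes back into one history; when the unmerged suffixes keep growing, the fork becomes social as well as technical.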
@@ -2186,7 +2186,7 @@ In part III I confront this question directly. Indeed, it was this question that
={cultural significance;recursive public+3;Free Software:components of+1}
Connexions modulates all of the components except that of the movement (there is, as of yet, no real "Free Textbook" movement, but the "Open Access" movement is a close second cousin).~{ In January 2005, when I first wrote this analysis, this was true. By April 2006, the Hewlett Foundation had convened the Open Educational Resources "movement" as something that would transform the production and circulation of textbooks like those created by Connexions. Indeed, in Rich Baraniuk’s report for Hewlett, the first paragraph reads: "A grassroots movement is on the verge of sweeping through the academic world. The open education movement is based on a set of intuitions that are shared by a remarkably wide range of academics: that knowledge should be free and open to use and re-use; that collaboration should be easier, not harder; that people should receive credit and kudos for contributing to education and research; and that concepts and ideas are linked in unusual and surprising ways and not the simple linear forms that textbooks present. Open education promises to fundamentally change the way authors, instructors, and students interact worldwide" (Baraniuk and King, "Connexions"). (In a nice confirmation of just how embedded participation can become in anthropology, Baraniuk cribbed the second sentence from something I had written two years earlier as part of a description of what I thought Connexions hoped to achieve.) The "movement" as such still does not quite exist, but the momentum for it is clearly part of the actions that Hewlett hopes to achieve. }~ Perhaps the most complex modulation concerns coordination—changes to the practice of coordination and collaboration in academic-textbook creation in particular, and more generally to the nature of collaboration and coordination of knowledge in science and scholarship.
-={coordination (components of Free Software);movement (component of Free Software)+2}
+={coordination (component of Free Software);movement (component of Free Software)+2}
Connexions emerged out of Free Software, and not, as one might expect, out of education, textbook writing, distance education, or any of those areas that are topically connected to pedagogy. That is to say, the people involved did not come to their project by attempting to deal with a problem salient to education and teaching as much as they did so through the problems raised by Free Software and the question of how those problems apply to university textbooks. Similarly, a second project, Creative Commons, also emerged out of a direct engagement with and exploration of Free Software, and not out of any legal movement or scholarly commitment to the critique of intellectual-property law or, more important, out of any desire to transform the entertainment industry. Both projects are resolutely committed to experimenting with the given practices of Free Software—to testing their limits and changing them where they can—and this is what makes them vibrant, risky, and potentially illuminating as cases of a recursive public.
={affinity (of geeks);commons+1;Creative Commons+1;pedagogy;recursive public:examples of+1}
@@ -2208,7 +2208,7 @@ Around 1998 or 1999, Rich decided that it was time for him to write a textbook o
={Burris, C. Sidney;Connexions project:textbooks and+4;Rice University}
At about the same time as his idea for a textbook, Rich’s research group was switching over to Linux, and Rich was first learning about Open Source and the emergence of a fully free operating system created entirely by volunteers. It isn’t clear what Rich’s aha! moment was, other than simply when he came to an understanding that such a thing as Linux was actually possible. Nonetheless, at some point, Rich had the idea that his textbook could be an Open Source textbook, that is, a textbook created not just by him, but by DSP researchers all over the world, and made available to everyone to make use of and modify and improve as they saw fit, just like Linux. Together with Brent Hendricks, Yan David Erlich, ,{[pg 249]}, and Ross Reedstrom, all of whom, as geeks, had a deep familiarity with the history and practices of Free and Open Source Software, Rich started to conceptualize a system; they started to think about modulations of different components of Free and Open Source Software. The idea of a Free Software textbook repository slowly took shape.
-={Linux (Free Software project);Open Source:inspiration for Connexions+27;Reedstorm, Ross}
+={Linux (Free Software project);Open Source:inspiration for Connexions+27;Reedstrom, Ross}
Thus, Connexions: an "open content repository of high-quality educational materials." These "textbooks" very quickly evolved into something else: "modules" of content, something that has never been sharply defined, but which corresponds more or less to a small chunk of teachable information, like two or three pages in a textbook. Such modules are much easier to conceive of in sciences like mathematics or biology, in which textbooks are often multiauthored collections, finely divided into short chapters with diagrams, exercises, theorems, or programs. Modules lend themselves much less well to a model of humanities or social-science scholarship based in reading texts, discussion, critique, and comparison—and this bias is a clear reflection of what Brent, Ross, and Rich knew best in terms of teaching and writing. Indeed, the project’s frequent recourse to the image of an assembly-line model of knowledge production often confirms the worst fears of humanists and educators when they first encounter Connexions. The image suggests that knowledge comes in prepackaged and colorfully branded tidbits for the delectation of undergrads, rather than characterizing knowledge as a state of being or as a process.
={Connexions project:model of learning in|modules in+1}
@@ -2224,7 +2224,7 @@ Free Software—and, in particular, Open Source in the guise of "self-organizing
={Connexions project:relationship to education+2;distance learning+2}
Thus, Rich styled Connexions as more than just a factory of knowledge—it would be a community or culture developing richly associative and novel kinds of textbooks—and as much more than just distance education. Indeed, Connexions was not the only such project busy differentiating itself from the perceived dangers of distance education. In April 2001 MIT had announced that it would make the content of all of its courses available for free online in a project strategically called OpenCourseWare (OCW). Such news could only bring attention to MIT, which explicitly positioned the announcement as a kind of final death blow to the idea of distance education, by saying that what students pay $35,000 and up for per year is not "knowledge"—which is free—but the experience of being at MIT. The announcement created pure profit from the perspective of MIT’s reputation as a generator and disseminator of scientific knowledge, but the project did not emerge directly out of an interest in mimicking the success of Open Source. That angle was ,{[pg 252]}, provided ultimately by the computer-science professor Hal Abelson, whose deep understanding of the history and growth of Free Software came from his direct involvement in it as a long-standing member of the computer-science community at MIT. OCW emerged most proximately from the strange result of a committee report, commissioned by the provost, on how MIT should position itself in the "distance/e-learning" field. The surprising response: don’t do it, give the content away and add value to the campus teaching and research experience instead.~{ "Provost Announces Formation of Council on Educational Technology," MIT Tech Talk, 29 September 1999, http://web.mit.edu/newsoffice/1999/council-0929.html. }~
-={Abelson, Hal;Massachusetts Institute of Technology (MIT):open courseware and+2;Open CourseWare (OCW)+2;Connexions poject:Open CourseWare+2}
+={Abelson, Hal;Massachusetts Institute of Technology (MIT):open courseware and+2;Open CourseWare (OCW)+2;Connexions project:Open CourseWare+2}
OCW, Connexions, and distance learning, therefore, while all ostensibly interested in combining education with networks and software, emerged out of different demands and different places. While the profit-driven demand of distance learning fueled many attempts around the country, it stalled in the case of OCW, largely because the final MIT Council on Educational Technology report that recommended OCW was issued at the same time as the first plunge in the stock market (April 2000). Such issues were not a core factor in the development of Connexions, which is not to say that the problems of funding and sustainability have not always been important concerns, only that the genesis of the project was not at the administrative level or due to concerns about distance education. For Rich, Brent, and Ross the core commitment was to openness and to the success of Open Source as an experiment with massive, distributed, Internet-based, collaborative production of software—their commitment to this has been, from the beginning, completely and adamantly unwavering. Nevertheless, the project has involved modulations of the core features of Free Software. Such modulations depend, to a certain extent, on being a project that emerges out of the ideas and practices of Free Software, rather than, as in the case of OCW, one founded as a result of conflicting goals (profit and academic freedom) and resulting in a strategic use of public relations to increase the symbolic power of the university over its fiscal growth.
={Reedstrom, Ross}
@@ -2292,7 +2292,7 @@ Creative Commons provided more than licenses, though. It was part of a social im
={moral and technical order;social imaginary}
Creative Commons was thus a back-door approach: if the laws could not be changed, then people should be given the tools they needed to work around those laws. Understanding how Creative Commons was conceived requires seeing it as a modulation of both the notion of "source code" and the modulation of "copyright licenses." But the modulations take place in that context of a changing legal system that was so unfamiliar to Stallman and his EMACS users, a legal system responding to new forms of software, networks, and devices. For instance, the changes to the Copyright Act of 1976 created an unintended effect that Creative Commons would ultimately seize on. By eliminating the requirement to register copyrighted works (essentially granting copyright as soon as the ,{[pg 261]}, work is "fixed in a tangible medium"), the copyright law created a situation wherein there was no explicit way in which a work could be intentionally placed in the public domain. Practically speaking an author could declare that a work was in the public domain, but legally speaking the risk would be borne entirely by the person who sought to make use of that work: to copy it, transform it, sell it, and so on. With the explosion of interest in the Internet, the problem ramified exponentially; it became impossible to know whether someone who had placed a text, an image, a song, or a video online intended for others to make use of it—even if the author explicitly declared it "in the public domain." Creative Commons licenses were thus conceived and rhetorically positioned as tools for making explicit exactly what uses could be made of a specific work. They protected the rights of people who sought to make use of "culture" (i.e., materials and ideas and works they had not authored), an approach that Lessig often summed up by saying, "Culture always builds on the past."
-={copyright:requirement to register;sharing source code (component of Free Software):modulations of;creative commons:activism of+1;public domain+4}
+={copyright:requirement to register;sharing source code (component of Free Software):modulations of;Creative Commons:activism of+1;public domain+4}
The background to and context of the emergence of Creative Commons was of course much more complicated and fraught. Concerns ranged from the plights of university libraries with regard to high-priced journals, to the problem of documentary filmmakers unable to afford, or even find the owners of, rights to use images or snippets in films, to the high-profile fights over online music trading, Napster, and the RIAA. Over the course of four years, Lessig and the other founders of Creative Commons would address all of these issues in books, in countless talks and presentations and conferences around the world, online and off, among audiences ranging from software developers to entrepreneurs to musicians to bloggers to scientists.
={Napster;Recording Industry Association of America (RIAA)}
diff --git a/data/v2/samples/democratizing_innovation.eric_von_hippel.sst b/data/v2/samples/democratizing_innovation.eric_von_hippel.sst
index ee567f0..f47afc5 100644
--- a/data/v2/samples/democratizing_innovation.eric_von_hippel.sst
+++ b/data/v2/samples/democratizing_innovation.eric_von_hippel.sst
@@ -105,7 +105,7 @@ The whole sport of high-performance windsurfing really started from that. As soo
By 1998, more than a million people were engaged in windsurfing, and a large fraction of the boards sold incorporated the user-developed innovations for the high-performance sport.
The user-centered innovation process just illustrated is in sharp contrast to the traditional model, in which products and services are developed by manufacturers in a closed way, the manufacturers using patents, copyrights, and other protections to prevent imitators from free riding on their innovation investments. In this traditional model, a user's only role is to have needs, which manufacturers then identify and fill by designing and producing new products. The manufacturer-centric model does fit some fields and conditions. However, a growing body of empirical work shows that users are the first to develop many and perhaps most new industrial and consumer products. Further, the contribution of users is growing steadily larger as a result of continuing advances in computer and communications capabilities.
-={Intellectual property rights:See also Private-collective innovation|copyrights and|innovation and+2;Copyrights:See Intellectual property rights;Manufacturers:government policy and+2;Product development+2;Users:See also Lead Users|government policy and;Economic benefit, expectations of by lead users:by manufacturers+;Economic benefit, expectations of by lead users:by manufacturers+12;Government policy:manufacturer innovation and+2;Manufacturers:expectations of economic benefit by+26}
+={Intellectual property rights:See also Private-collective innovation|copyrights and|innovation and+2;Copyrights:See Intellectual property rights;Manufacturers:government policy and+2;Product development+2;Users:government policy and;Economic benefit, expectations of by lead users:by manufacturers+5;Economic benefit, expectations of by lead users:by manufacturers+12;Government policy:manufacturer innovation and+2;Manufacturers:expectations of economic benefit by+26}
In this book I explain in detail how the emerging process of user-centric, democratized innovation works. I also explain how innovation by users provides a very necessary complement to and feedstock for manufacturer innovation.
@@ -115,7 +115,7 @@ The ongoing shift of innovation to users has some very attractive qualities. It
% check government policy
Users, as the term will be used in this book, are firms or individual consumers that expect to benefit from /{using}/ a product or a service. In contrast, manufacturers expect to benefit from /{selling}/ a product or a service. A firm or an individual can have different relationships to different products or innovations. For example, Boeing is a manufacturer of airplanes, but it is also a user of machine tools. If we were examining innovations developed by Boeing for the airplanes it sells, we would consider Boeing a manufacturer-innovator in those cases. But if we were considering innovations in metal-forming machinery developed by Boeing for in-house use in building airplanes, we would categorize those as user-developed innovations and would categorize Boeing as a user-innovator in those cases.
-={Users:See also Lead users|characteristics of+2;Manufacturers:characteristics of+2}
+={Users:characteristics of+2;Manufacturers:characteristics of+2}
Innovation user and innovation manufacturer are the two general "functional" relationships between innovator and innovation. Users are unique in that they alone benefit /{directly}/ from innovations. All others (here lumped under the term "manufacturers") must sell innovation-related products or services to users, indirectly or directly, in order to profit from innovations. Thus, in order to profit, inventors must sell or license knowledge related to innovations, and manufacturers must sell products or services incorporating innovations. Similarly, suppliers of innovation-related materials or services---unless they have direct use for the innovations---must sell the materials or services in order to profit from the innovations.
={Innovation:See also Innovation communities|functional sources of;Suppliers}
@@ -156,7 +156,7 @@ Research provides a firm grounding for these empirical findings. The two definin
User-innovators with stronger "lead user" characteristics develop innovations having higher appeal in the general marketplace. Estimated OLS function: Y = 2.06 + 0.57x, where Y represents attractiveness of innovation and x represents lead-user-ness of respondent. Adjusted R^{2}^ = 0.281; p = 0.002; n = 30. Source of data: Franke and von Hippel 2003.
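The fitted regression reported above can be evaluated numerically; a minimal Python sketch (coefficients taken from the text as quoted; the example score is hypothetical):

```python
# OLS fit reported in the text (data: Franke and von Hippel 2003):
#   Y = 2.06 + 0.57 * x   (adjusted R^2 = 0.281, p = 0.002, n = 30)
# Y: attractiveness of the innovation in the general marketplace;
# x: "lead-user-ness" of the respondent.

def predicted_attractiveness(lead_userness: float) -> float:
    """Predicted market appeal of an innovation for a given
    lead-user-ness score, per the fitted line quoted above."""
    return 2.06 + 0.57 * lead_userness

# Hypothetical respondent scoring 3.0 on the lead-user-ness scale:
print(round(predicted_attractiveness(3.0), 2))  # 3.77
```

The positive slope (0.57) is what the text summarizes in words: respondents with stronger lead-user characteristics develop innovations with higher predicted general-market appeal.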
!_ Why Many Users Want Custom Products (Chapter 3)
-={Custom products:heterogeneity of user needs and+2;User need+2;Users:See also Lead users|innovate-or-buy decisions by+8|needs of+2}
+={Custom products:heterogeneity of user needs and+2;User need+2;Users:innovate-or-buy decisions by+8|needs of+2}
Why do so many users develop or modify products for their own use? Users may innovate if and as they want something that is not available on the market and are able and willing to pay for its development. It is likely that many users do not find what they want on the market. Meta-analysis of market-segmentation studies suggests that users' needs for products are highly heterogeneous in many fields (Franke and Reisinger 2003).
={Reisinger, H.}
@@ -165,7 +165,7 @@ Mass manufacturers tend to follow a strategy of developing products that are des
={Apache web server software;Manufacturers:lead users and}
!_ Users' Innovate-or-Buy Decisions (Chapter 4)
-={Custom products:heterogeneity of user needs and+3|manufacturers and+3|agency costs and+2;User need+3;Users:needs of+3;Manufacturers:innovation and+9|innovate-or-buy decisions and+4;Users:See also Lead Users|agency costs and+2}
+={Custom products:heterogeneity of user needs and+3|manufacturers and+3|agency costs and+2;User need+3;Users:needs of+3;Manufacturers:innovation and+9|innovate-or-buy decisions and+4;Users:agency costs and+2}
Even if many users want "exactly right products" and are willing and able to pay for their development, why do users often do this for themselves rather than hire a custom manufacturer to develop a special just-right product for them? After all, custom manufacturers specialize in developing products for one or a few users. Since these firms are specialists, it is possible that they could design and build custom products for individual users or user firms faster, better, or cheaper than users could do this for themselves. Despite this possibility, several factors can drive users to innovate rather than buy. Both in the case of user firms and in the case of individual user-innovators, agency costs play a major role. In the case of individual user-innovators, enjoyment of the innovation process can also be important.
={Agency costs+1;Manufacturers:custom products and+2;Custom products:users and+3;Economic benefit, expectations of by lead users:by manufacturers+13}
@@ -180,7 +180,7 @@ A small model of the innovate-or-buy decision follows. This model shows in a qua
={Innovation communities:social welfare, and;Manufacturers:social welfare and+21;Social welfare:manufacturer innovation and+21|user innovation and+21}
Chapter 4 concludes by pointing out that an additional incentive can drive individual user-innovators to innovate rather than buy: they may value the /{process}/ of innovating because of the enjoyment or learning that it brings them. It might seem strange that user-innovators can enjoy product development enough to want to do it themselves---after all, manufacturers pay their product developers to do such work! On the other hand, it is also clear that enjoyment of problem solving is a motivator for many individual problem solvers in at least some fields. Consider for example the millions of crossword-puzzle aficionados. Clearly, for these individuals enjoyment of the problem-solving process rather than the solution is the goal. One can easily test this by attempting to offer a puzzle solver a completed puzzle---the very output he or she is working so hard to create. One will very likely be rejected with the rebuke that one should not spoil the fun! Pleasure as a motivator can apply to the development of commercially useful innovations as well. Studies of the motivations of volunteer contributors of code to widely used software products have shown that these individuals too are often strongly motivated to innovate by the joy and learning they find in this work (Hertel et al. 2003; Lakhani and Wolf 2005).
-={Hertel, G.;Lakhani, K.;Wolf, B.;Innovation process;User:See also Lead users|innovation process and, 7;Free software:See also Open source software;Hackers;Herrmann, S.}
+={Hertel, G.;Lakhani, K.;Wolf, B.;Innovation process;Users:innovation process and+7;Free software:See also Open source software;Hackers;Herrmann, S.}
!_ Users' Low-Cost Innovation Niches (Chapter 5)
={Users:low-cost innovation niches of+3}
@@ -214,7 +214,7 @@ Active efforts by innovators to freely reveal---as opposed to sullen acceptance-
={Innovation communities+3}
Innovation by users tends to be widely distributed rather than concentrated among just a very few very innovative users. As a result, it is important for user-innovators to find ways to combine and leverage their efforts. Users achieve this by engaging in many forms of cooperation. Direct, informal user-to-user cooperation (assisting others to innovate, answering questions, and so on) is common. Organized cooperation is also common, with users joining together in networks and communities that provide useful structures and tools for their interactions and for the distribution of innovations. Innovation communities can increase the speed and effectiveness with which users and also manufacturers can develop and test and diffuse their innovations. They also can greatly increase the ease with which innovators can build larger systems from interlinkable modules created by community participants.
-={Users:innovation communities+2}
+={Users:innovation communities and+2}
Free and open source software projects are a relatively well-developed and very successful form of Internet-based innovation community. However, innovation communities are by no means restricted to software or even to information products, and they can play a major role in the development of physical products. Franke and Shah (2003) have documented the value that user innovation communities can provide to user-innovators developing physical products in the field of sporting equipment. The analogy to open source innovation communities is clear.
={Franke, N.;Shah, S.;Free software;Innovation communities:open source software and|physical products and|sporting equipment and;Open source software:innovation communities and}
@@ -304,7 +304,7 @@ The studies cited in table 2.1 clearly show that a lot of product development an
!_ Table 2.1
Many respondents reported developing or modifying products for their own use in the eight product areas listed here.
-={Lüthje, C.+1;Urban, G.+1;Franke, N.+1;Herstatt, C.+1;Morrison, Pamela+1;von Hippel, E.+1;Lead users:Apache web server software and+1r|library information search system and+1|mountain biking and+1|outdoor consumer products and+1|pipe hanger hardware and+1|printed circuit CAD software and+1|surgical equipment and+;Library information search system+1;Mountain biking+1;Outdoor products+1;Pipe hanger hardware+1;Printed circuit CAD software+1;Surgical equipment+1}
+={Lüthje, C.+1;Urban, G.+1;Franke, N.+1;Herstatt, C.+1;Morrison, Pamela+1;von Hippel, E.+1;Lead users:Apache web server software and+1r|library information search system and+1|mountain biking and+1|outdoor consumer products and+1|pipe hanger hardware and+1|printed circuit CAD software and+1|surgical equipment and+3;Library information search system+1;Mountain biking+1;Outdoor products+1;Pipe hanger hardware+1;Printed circuit CAD software+1;Surgical equipment+1}
table{~h c4; 20; 45; 15; 20;
@@ -844,7 +844,7 @@ Those interested can easily enhance their intuitions about heterogeneity of user
={Users:innovation and+4|innovate-or-buy decisions by+74}
Why does a user wanting a custom product sometimes innovate for itself rather than buying from a manufacturer of custom products? There is, after all, a choice---at least it would seem so. However, if a user with the resources and willingness to pay does decide to buy, it may be surprised to discover that it is not so easy to find a manufacturer willing to make exactly what an individual user wants. Of course, we all know that mass manufacturers with businesses built around providing standard products in large numbers will be reluctant to accommodate special requests. Consumers know this too, and few will be so foolish as to contact a major soup producer like Campbell's with a request for a special, "just-right" can of soup. But what about manufacturers that specialize in custom products? Isn't it their business to respond to special requests? To understand which way the innovate-or-buy choice will go, one must consider both transaction costs and information asymmetries specific to users and manufacturers. I will talk mainly about transaction costs in this chapter and mainly about information asymmetries in chapter 5.
-={Custom products:users and+3;Innovation process+3;Manufacturers:innovation and+3;Transaction costs+3;Users:innovation process+3|and paying for innovations}
+={Custom products:users and+3;Innovation process+3;Manufacturers:innovation and+3;Transaction costs+3;Users:innovation process and+3|and paying for innovations}
I begin this chapter by discussing four specific and significant transaction costs that affect users' innovate-or-buy decisions. Next I review a case study that illustrates these. Then, I use a simple quantitative model to further explore when user firms will find it more cost-effective to develop a solution---a new product or service---for themselves rather than hiring a manufacturer to solve the problem for them. Finally, I point out that /{individual}/ users can sometimes be more inclined to innovate than one might expect because they sometimes value the /{process}/ of innovating as well as the novel product or service that is created.
@@ -1841,7 +1841,7 @@ Users that innovate and wish to freely diffuse innovation-related information ar
={Lessig, L.}
!_ R&D Subsidies and Tax Credits
-={Government policy:&D subsidies and+3}
+={Government policy:R&D subsidies and+3}
In many countries, manufacturing firms are rewarded for their innovative activity by R&D subsidies and tax credits. Such measures can make economic sense if average social returns to innovation are significantly higher than average private returns, as has been found by Mansfield et al. (1977) and others. However, important innovative activities carried out by users are often not similarly rewarded, because they tend to not be documentable as formal R&D activities. As we have seen, users tend to develop innovations in the course of "doing" in their normal use environments. Bresnahan and Greenstein (1996a) make a similar point. They investigate the role of "co-invention" in the move by users from mainframe to client-server architecture.~{ See also Bresnahan and Greenstein 1996b; Bresnahan and Saloner 1997; Saloner and Steinmueller 1996. }~ By "co-invention" Bresnahan and Greenstein mean organizational changes and innovations developed and implemented by users that are required to take full advantage of a new invention. They point out the high importance that co-invention has for realizing social returns from innovation. They consider the federal government's support for creating "national information infrastructures" insufficient or misallocated, since they view co-invention as the bottleneck for social returns and likely the highest value locus for invention.
={Bresnahan, T.;Greenstein, S.;Mansfield, E.;Users:co-invention and}
@@ -2270,7 +2270,7 @@ _* Recall that Urban and von Hippel (1988) tested the relative commercial attrac
={Urban, G.;Printed circuit CAD software}
_* Herstatt and von Hippel (1992) documented a lead user project seeking to develop a new line of pipe hangers---hardware used to attach pipes to the ceilings of commercial buildings. Hilti, a major manufacturer of construction-related equipment and products, conducted the project. The firm introduced a new line of pipe hanger products based on the lead user concept and a post-study evaluation has shown that this line has become a major commercial success for Hilti.
-={Herstatt;Pipe hanger hardware}
+={Herstatt, C.;Pipe hanger hardware}
_* Olson and Bakke (2001) report on two lead user studies carried out by Cinet, a leading IT systems integrator in Norway, for the firm's two major product areas, desktop personal computers, and Symfoni application GroupWare. These projects were very successful, with most of the ideas incorporated into next-generation products having been collected from lead users.
={Bakke, G.;Olson, E.}
diff --git a/data/v2/samples/free_as_in_freedom.richard_stallman_crusade_for_free_software.sam_williams.sst b/data/v2/samples/free_as_in_freedom.richard_stallman_crusade_for_free_software.sam_williams.sst
index 563dd23..213c76e 100644
--- a/data/v2/samples/free_as_in_freedom.richard_stallman_crusade_for_free_software.sam_williams.sst
+++ b/data/v2/samples/free_as_in_freedom.richard_stallman_crusade_for_free_software.sam_williams.sst
@@ -2026,7 +2026,7 @@ Although not the first person to view software as public property, Stallman is g
Predicting the future is risky sport, but most people, when presented with the question, seemed eager to bite. "One hundred years from now, Richard and a couple of other people are going to deserve more than a footnote," says Moglen. "They're going to be viewed as the main line of the story."
The "couple of other people" Moglen nominates for future textbook chapters include John Gilmore, Stallman's GPL advisor and future founder of the Electronic Frontier Foundation, and Theodor Holm Nelson, a.k.a. Ted Nelson, author of the 1982 book, Literary Machines. Moglen says Stallman, Nelson, and Gilmore each stand out in historically significant, nonoverlapping ways. He credits Nelson, commonly considered to have coined the term "hypertext," for identifying the predicament of information ownership in the digital age. Gilmore and Stallman, meanwhile, earn notable credit for identifying the negative political effects of information control and building organizations-the Electronic Frontier Foundation in the case of Gilmore and the Free Software Foundation in the case of Stallman-to counteract those effects. Of the two, however, Moglen sees Stallman's activities as more personal and less political in nature.
-={Electronic Frontier Foundation;Gilmore, John;Nelson, Theodor Holm+2;Nelson Ted+2}
+={Electronic Frontier Foundation;Gilmore, John;Nelson, Theodor Holm+2;Nelson, Ted+2}
"Richard was unique in that the ethical implications of unfree software were particularly clear to him at an early moment," says Moglen. "This has a lot to do with Richard's personality, which lots of people will, when writing about him, try to depict as epiphenomenal or even a drawback in Richard Stallman's own life work."
diff --git a/data/v2/samples/the_wealth_of_networks.yochai_benkler.sst b/data/v2/samples/the_wealth_of_networks.yochai_benkler.sst
index d45d6be..67a8d62 100644
--- a/data/v2/samples/the_wealth_of_networks.yochai_benkler.sst
+++ b/data/v2/samples/the_wealth_of_networks.yochai_benkler.sst
@@ -75,7 +75,7 @@ Much of the early work in this project was done at New York University, whose la
Since 2001, first as a visitor and now as a member, I have had the remarkable pleasure of being part of the intellectual community that is Yale Law School. The book in its present form, structure, and emphasis is a direct reflection of my immersion in this wonderful community. Practically every single one of my colleagues has read articles I have written over this period, attended workshops where I presented my work, provided comments that helped to improve the articles--and through them, this book, as well. I owe each and every one of them thanks, not least to Tony Kronman, who made me see that it would be so. To list them all would be redundant. To list some would inevitably underrepresent the various contributions they have made. Still, I will try to say a few of the special thanks, owing much yet to ,{[pg xii]}, those I will not name. Working out the economics was a precondition of being able to make the core political claims. Bob Ellickson, Dan Kahan, and Carol Rose all engaged deeply with questions of reciprocity and commons-based production, while Jim Whitman kept my feet to the fire on the relationship to the anthropology of the gift. Ian Ayres, Ron Daniels during his visit, Al Klevorick, George Priest, Susan Rose-Ackerman, and Alan Schwartz provided much-needed mixtures of skepticism and help in constructing the arguments that would allay it. Akhil Amar, Owen Fiss, Jerry Mashaw, Robert Post, Jed Rubenfeld, Reva Siegal, and Kenji Yoshino helped me work on the normative and constitutional questions. The turn I took to focusing on global development as the core aspect of the implications for justice, as it is in chapter 9, resulted from an invitation from Harold Koh and Oona Hathaway to speak at their seminar on globalization, and their thoughtful comments to my paper.
The greatest influence on that turn has been Amy Kapczynski's work as a fellow at Yale, and with her, the students who invited me to work with them on university licensing policy, in particular, Sam Chaifetz.
-Oddly enough, I have never had the proper context in which to give two more basic thanks. My father, who was swept up in the resistance to British colonialism and later in Israel's War of Independence, dropped out of high school. He was left with a passionate intellectual hunger and a voracious appetite for reading. He died too young to even imagine sitting, as I do today with my own sons, with the greatest library in human history right there, at the dinner table, with us. But he would have loved it. Another great debt is to David Grais, who spent many hours mentoring me in my first law job, bought me my first copy of Strunk and White, and, for all practical purposes, taught me how to write in English; as he reads these words, he will be mortified, I fear, to be associated with a work of authorship as undisciplined as this, with so many excessively long sentences, replete with dependent clauses and unnecessarily complex formulations of quite simple ideas.
+Oddly enough, I have *{never had the proper context}* in which to give two more basic thanks. My father, who was swept up in the resistance to British colonialism and later in Israel's War of Independence, dropped out of high school. He was left with a passionate intellectual hunger and a voracious appetite for reading. He died too young to even imagine sitting, as I do today with my own sons, with the greatest library in human history right there, at the dinner table, with us. But he would have loved it. Another great debt is to David Grais, who spent many hours mentoring me in my first law job, bought me my first copy of Strunk and White, and, for all practical purposes, taught me how to write in English; as he reads these words, he will be mortified, I fear, to be associated with a work of authorship as undisciplined as this, with so many excessively long sentences, replete with dependent clauses and unnecessarily complex formulations of quite simple ideas.
Finally, to my best friend and tag-team partner in this tussle we call life, Deborah Schrag, with whom I have shared nicely more or less everything since we were barely adults. ,{[pg 1]},
@@ -89,7 +89,7 @@ A series of changes in the technologies, economic organization, and social pract
The rise of greater scope for individual and cooperative nonmarket production of information and culture, however, threatens the incumbents of the industrial information economy. At the beginning of the twenty-first century, we find ourselves in the midst of a battle over the institutional ecology of the digital environment. A wide range of laws and institutions-- from broad areas like telecommunications, copyright, or international trade regulation, to minutiae like the rules for registering domain names or whether digital television receivers will be required by law to recognize a particular code--are being tugged and warped in efforts to tilt the playing field toward one way of doing things or the other. How these battles turn out over the next decade or so will likely have a significant effect on how we come to know what is going on in the world we occupy, and to what extent and in what forms we will be able--as autonomous individuals, as citizens, and as participants in cultures and communities--to affect how we and others see the world as it is and as it might be.
2~ THE EMERGENCE OF THE NETWORKED INFORMATION ECONOMY
-={information economy:emergence of+9;networked environment policy+52;networked environment policy:emergence of+9}
+={information economy:emergence of+9;networked information economy+52|emergence of+9}
The most advanced economies in the world today have made two parallel shifts that, paradoxically, make possible a significant attenuation of the limitations that market-based production places on the pursuit of the political ,{[pg 3]}, values central to liberal societies. The first move, in the making for more than a century, is to an economy centered on information (financial services, accounting, software, science) and cultural (films, music) production, and the manipulation of symbols (from making sneakers to branding them and manufacturing the cultural significance of the Swoosh). The second is the move to a communications environment built on cheap processors with high computation capabilities, interconnected in a pervasive network--the phenomenon we associate with the Internet. It is this second shift that allows for an increasing role for nonmarket production in the information and cultural production sector, organized in a radically more decentralized pattern than was true of this sector in the twentieth century. The first shift means that these new patterns of production--nonmarket and radically decentralized--will emerge, if permitted, at the core, rather than the periphery of the most advanced economies. It promises to enable social production and exchange to play a much larger role, alongside property- and market-based production, than they ever have in modern democracies.
={nonmarket information producers+4;physical constraints on information production+2;production of information:physical constraints on+2}
@@ -116,7 +116,7 @@ In the networked information economy, the physical capital required for producti
Because the presence and importance of nonmarket production has become so counterintuitive to people living in market-based economies at the end of the twentieth century, part I of this volume is fairly detailed and technical; overcoming what we intuitively "know" requires disciplined analysis. Readers who are not inclined toward economic analysis should at least read the introduction to part I, the segments entitled "When Information Production Meets the Computer Network" and "Diversity of Strategies in our Current Production System" in chapter 2, and the case studies in chapter 3. These should provide enough of an intuitive feel for what I mean by the diversity of production strategies for information and the emergence of nonmarket individual and cooperative production, to serve as the basis for the more normatively oriented parts of the book. Readers who are genuinely skeptical of the possibility that nonmarket production is sustainable and effective, and in many cases is an efficient strategy for information, knowledge, and cultural production, should take the time to read part I in its entirety. The emergence of precisely this possibility and practice lies at the very heart of my claims about the ways in which liberal commitments are translated into lived experiences in the networked environment, and forms the factual foundation of the political-theoretical and the institutional-legal discussion that occupies the remainder of the book.
2~ NETWORKED INFORMATION ECONOMY AND LIBERAL, DEMOCRATIC SOCIETIES
-={democratic societies+15;information economy:democracy and liberalism+15;liberal societies+15;networked environment policy:democracy and liberalism+15}
+={democratic societies+15;information economy:democracy and liberalism+15;liberal societies+15;networked information economy:democracy and liberalism+15}
How we make information, how we get it, how we speak to others, and how others speak to us are core components of the shape of freedom in any society. Part II of this book provides a detailed look at how the changes in the technological, economic, and social affordances of the networked information environment affect a series of core commitments of a wide range of liberal democracies. The basic claim is that the diversity of ways of organizing information production and use opens a range of possibilities for pursuing ,{[pg 8]}, the core political values of liberal societies--individual freedom, a more genuinely participatory political system, a critical culture, and social justice. These values provide the vectors of political morality along which the shape and dimensions of any liberal society can be plotted. Because their practical policy implications are often contradictory, rather than complementary, the pursuit of each places certain limits on how we pursue the others, leading different liberal societies to respect them in different patterns. How much a society constrains the democratic decision-making powers of the majority in favor of individual freedom, or to what extent it pursues social justice, have always been attributes that define the political contours and nature of that society. But the economics of industrial production, and our pursuit of productivity and growth, have imposed a limit on how we can pursue any mix of arrangements to implement our commitments to freedom and justice. Singapore is commonly trotted out as an extreme example of the trade-off of freedom for welfare, but all democracies with advanced capitalist economies have made some such trade-off. Predictions of how well we will be able to feed ourselves are always an important consideration in thinking about whether, for example, to democratize wheat production or make it more egalitarian.
Efforts to push workplace democracy have also often foundered on the shoals--real or imagined--of these limits, as have many plans for redistribution in the name of social justice. Market-based, proprietary production has often seemed simply too productive to tinker with. The emergence of the networked information economy promises to expand the horizons of the feasible in political imagination. Different liberal polities can pursue different mixtures of respect for different liberal commitments. However, the overarching constraint represented by the seeming necessity of the industrial model of information and cultural production has significantly shifted as an effective constraint on the pursuit of liberal commitments.
@@ -162,10 +162,10 @@ The networked information economy also allows for the emergence of a more critic
={Balkin, Jack;communities:critical culture and self-reflection+1;critical culture and self-reflection+1;culture:criticality of (self-reflection)+1;democratic societies:critical culture and social relations+1;Fisher, William (Terry);Elkin-Koren, Niva;Lessig, Lawrence (Larry);self-organization: See clusters in network topology self-reflection+1;liberal societies:critical culture and social relations}
Throughout much of this book, I underscore the increased capabilities of individuals as the core driving social force behind the networked information economy. This heightened individual capacity has raised concerns by many that the Internet further fragments community, continuing the long trend of industrialization. A substantial body of empirical literature suggests, however, that we are in fact using the Internet largely at the expense of television, and that this exchange is a good one from the perspective of social ties. We use the Internet to keep in touch with family and intimate friends, both geographically proximate and distant. To the extent we do see a shift in social ties, it is because, in addition to strengthening our strong bonds, we are also increasing the range and diversity of weaker connections. Following ,{[pg 16]}, Manuel Castells and Barry Wellman, I suggest that we have become more adept at filling some of the same emotional and context-generating functions that have traditionally been associated with the importance of community with a network of overlapping social ties that are limited in duration or intensity.
-={attention fragmentation;Castells, Manuel;fragmentation of communication;norms (social): fragments of communication;regulation by social norms: fragmentation of communication;social relations and norms:fragmentation of communication;communities: fragmentation of;diversity:fragmentation of communication;Castells, Manuel}
+={attention fragmentation;Castells, Manuel;fragmentation of communication;norms (social): fragmentation of communication;regulation by social norms: fragmentation of communication;social relations and norms:fragmentation of communication;communities: fragmentation of;diversity:fragmentation of communication;Castells, Manuel}
2~ FOUR METHODOLOGICAL COMMENTS
-={information economy:methodological choices+14;networked environmental policy. See policy networked information economy:methodological choices+14}
+={information economy:methodological choices+14;networked environmental policy:See policy;networked information economy:methodological choices+14}
There are four methodological choices represented by the thesis that I have outlined up to this point, and therefore in this book as a whole, which require explication and defense. The first is that I assign a very significant role to technology. The second is that I offer an explanation centered on social relations, but operating in the domain of economics, rather than sociology. The third and fourth are more internal to liberal political theory. The third is that I am offering a liberal political theory, but taking a path that has usually been resisted in that literature--considering economic structure and the limits of the market and its supporting institutions from the perspective of freedom, rather than accepting the market as it is, and defending or criticizing adjustments through the lens of distributive justice. Fourth, my approach heavily emphasizes individual action in nonmarket relations. Much of the discussion revolves around the choice between markets and nonmarket social behavior. In much of it, the state plays no role, or is perceived as playing a primarily negative role, in a way that is alien to the progressive branches of liberal political thought. In this, it seems more of a libertarian or an anarchistic thesis than a liberal one. I do not completely discount the state, as I will explain. But I do suggest that what is special about our moment is the rising efficacy of individuals and loose, nonmarket affiliations as agents of political economy. Just like the market, the state will have to adjust to this new emerging modality of human action. Liberal political theory must first recognize and understand it before it can begin to renegotiate its agenda for the liberal state, progressive or otherwise.
={capabilities of individuals:technology and human affairs+5;human affairs, technology and+5;individual capabilities and action: technology and human affairs+5}
@@ -207,7 +207,7 @@ The important new fact about the networked environment, however, is the efficacy
={collaborative authorship: See also peer production collective social action}
2~ THE STAKES OF IT ALL: THE BATTLE OVER THE INSTITUTIONAL ECOLOGY OF THE DIGITAL ENVIRONMENT
-={commercial model of communication+9;industrial model of communication+9;information economy:institutional ecology+9;institutional ecology of digital environment+9;networked environment policy:institutional ecology+9;proprietary rights+9;traditional model of communication+9}
+={commercial model of communication+9;industrial model of communication+9;information economy:institutional ecology+9;institutional ecology of digital environment+9;networked information economy:institutional ecology+9;proprietary rights+9;traditional model of communication+9}
No benevolent historical force will inexorably lead this technological-economic moment to develop toward an open, diverse, liberal equilibrium. ,{[pg 23]}, If the transformation I describe as possible occurs, it will lead to substantial redistribution of power and money from the twentieth-century industrial producers of information, culture, and communications--like Hollywood, the recording industry, and perhaps the broadcasters and some of the telecommunications services giants--to a combination of widely diffuse populations around the globe, and the market actors that will build the tools that make this population better able to produce its own information environment rather than buying it ready-made. None of the industrial giants of yore are taking this reallocation lying down. The technology will not overcome their resistance through an insurmountable progressive impulse. The reorganization of production and the advances it can bring in freedom and justice will emerge, therefore, only as a result of social and political action aimed at protecting the new social patterns from the incumbents' assaults. It is precisely to develop an understanding of what is at stake and why it is worth fighting for that I write this book. I offer no reassurances, however, that any of this will in fact come to pass.
@@ -215,7 +215,7 @@ The battle over the relative salience of the proprietary, industrial models of i
={property ownership+5;commons}
This is not to say that property is in some sense inherently bad. Property, together with contract, is the core institutional component of markets, and ,{[pg 24]}, a core institutional element of liberal societies. It is what enables sellers to extract prices from buyers, and buyers to know that when they pay, they will be secure in their ability to use what they bought. It underlies our capacity to plan actions that require use of resources that, without exclusivity, would be unavailable for us to use. But property also constrains action. The rules of property are circumscribed and intended to elicit a particular datum--willingness and ability to pay for exclusive control over a resource. They constrain what one person or another can do with regard to a resource; that is, use it in some ways but not others, reveal or hide information with regard to it, and so forth. These constraints are necessary so that people must transact with each other through markets, rather than through force or social networks, but they do so at the expense of constraining action outside of the market to the extent that it depends on access to these resources.
-={constrains of information production:physical+2;physical constraints on information production+2}
+={constraints of information production, physical+2;physical constraints on information production+2}
Commons are another core institutional component of freedom of action in free societies, but they are structured to enable action that is not based on exclusive control over the resources necessary for action. For example, I can plan an outdoor party with some degree of certainty by renting a private garden or beach, through the property system. Alternatively, I can plan to meet my friends on a public beach or at Sheep's Meadow in Central Park. I can buy an easement from my neighbor to reach a nearby river, or I can walk around her property using the public road that makes up our transportation commons. Each institutional framework--property and commons--allows for a certain freedom of action and a certain degree of predictability of access to resources. Their complementary coexistence and relative salience as institutional frameworks for action determine the relative reach of the market and the domain of nonmarket action, both individual and social, in the resources they govern and the activities that depend on access to those resources. Now that material conditions have enabled the emergence of greater scope for nonmarket action, the scope and existence of a core common infrastructure that includes the basic resources necessary to produce and exchange information will shape the degree to which individuals will be able to act in all the ways that I describe as central to the emergence of a networked information economy and the freedoms it makes possible.
={commons}
@@ -488,7 +488,7 @@ How are we to know that the content produced by widely dispersed individuals is
={accreditation:Amazon+1;Amazon+1;filtering:Amazon+1;relevance filtering:Amazon+1}
Amazon uses a mix of mechanisms to get in front of their buyers of books and other products that the users are likely to purchase. A number of these mechanisms produce relevance and accreditation by harnessing the users themselves. At the simplest level, the recommendation "customers who bought items you recently viewed also bought these items" is a mechanical means of extracting judgments of relevance and accreditation from the actions of many individuals, who produce the datum of relevance as byproduct of making their own purchasing decisions. Amazon also allows users to create topical lists and track other users as their "friends and favorites." Amazon, like many consumer sites today, also provides users with the ability ,{[pg 76]}, to rate books they buy, generating a peer-produced rating by averaging the ratings. More fundamentally, the core innovation of Google, widely recognized as the most efficient general search engine during the first half of the 2000s, was to introduce peer-based judgments of relevance. Like other search engines at the time, Google used a text-based algorithm to retrieve a given universe of Web pages initially. Its major innovation was its PageRank algorithm, which harnesses peer production of ranking in the following way. The engine treats links from other Web sites pointing to a given Web site as votes of confidence. Whenever someone who authors a Web site links to someone else's page, that person has stated quite explicitly that the linked page is worth a visit. Google's search engine counts these links as distributed votes of confidence in the quality of the page pointed to. Pages that are heavily linked-to count as more important votes of confidence. If a highly linked-to site links to a given page, that vote counts for more than the vote of a site that no one else thinks is worth visiting. 
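The "customers who bought items you recently viewed also bought these items" mechanism described above can be illustrated with a toy co-purchase counter. This is a hypothetical sketch of the general idea, not Amazon's actual system; the item names and the simple pair-counting scheme are invented for illustration:

```python
# Toy illustration of extracting "also bought" relevance data as a
# byproduct of individual purchase decisions (not Amazon's real system).
from collections import Counter
from itertools import combinations

def also_bought(orders):
    """orders: one set of purchased items per customer."""
    co = {}
    for basket in orders:
        # Every pair of items bought together is an implicit relevance vote.
        for a, b in combinations(sorted(basket), 2):
            co.setdefault(a, Counter())[b] += 1
            co.setdefault(b, Counter())[a] += 1
    # For each item, rank co-purchased items by how often the pairing occurred.
    return {item: [other for other, _ in c.most_common()] for item, c in co.items()}

recs = also_bought([{"book1", "book2"},
                    {"book1", "book2", "book3"},
                    {"book1", "book3"}])
```

No customer states a judgment of relevance explicitly; the ranking falls out of many independent purchasing decisions, which is the point Benkler is making.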
The point to take home from looking at Google and Amazon is that corporations that have done immensely well at acquiring and retaining users have harnessed peer production to enable users to find things they want quickly and efficiently.
-={accreditation:Google;communities:critical culture and self-reflection+1;culture:critically of (self-reflection)+1;filtering:Google;Google;relevance filtering:Google}
+={accreditation:Google;communities:critical culture and self-reflection+1;culture:criticality of (self-reflection)+1;filtering:Google;Google;relevance filtering:Google}
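The "links as votes of confidence" scheme that the passage attributes to PageRank can be sketched as a short power iteration. This is a minimal illustration of the idea, not Google's implementation; the damping factor, iteration count, and the tiny link graph are assumptions made for the example:

```python
# Minimal power-iteration sketch of the "links as votes" idea behind
# PageRank. Damping factor and iteration count are illustrative choices.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    # Each outbound link passes along a share of the linking
                    # page's own rank: a vote weighted by the voter's standing.
                    new_rank[target] += share
        rank = new_rank
    return rank

# A and C both point to B, so B ends up with the highest rank.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["B"]})
```

Because a page's vote is weighted by its own rank, a link from a heavily linked-to site counts for more than a link from a site no one visits, exactly as the passage describes.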
The most prominent example of a distributed project self-consciously devoted to peer production of relevance is the Open Directory Project. The site relies on more than sixty thousand volunteer editors to determine which links should be included in the directory. Acceptance as a volunteer requires application. Quality relies on a peer-review process based substantially on seniority as a volunteer and level of engagement with the site. The site is hosted and administered by Netscape, which pays for server space and a small number of employees to administer the site and set up the initial guidelines. Licensing is free and presumably adds value partly to America Online's (AOL's) and Netscape's commercial search engine/portal and partly through goodwill. Volunteers are not affiliated with Netscape and receive no compensation. They spend time selecting sites for inclusion in the directory (in small increments of perhaps fifteen minutes per site reviewed), producing the most comprehensive, highest-quality human-edited directory of the Web--at this point outshining the directory produced by the company that pioneered human-edited directories of the Web: Yahoo!.
={accreditation:Open Directory Project (ODP);critical culture and self-reflection:Open Directory Project;filtering:Open Directory Project (ODP);ODP (Open Directory Project);Open Directory Project (ODP);relevance filtering:Open Directory Project (ODP);self-organization:Open Directory Project}
@@ -1229,7 +1229,7 @@ Another dimension that is less well developed in the United States than it is in
={Gillmor, Dan;Pantic, Drazen;Rheingold, Howard;mobile phones;text messaging}
2~ NETWORKED INFORMATION ECONOMY MEETS THE PUBLIC SPHERE
-={information economy:effects on public sphere+21;networked environment policy:effects on public sphere+21}
+={information economy:effects on public sphere+21;networked information economy:effects on public sphere+21}
The networked public sphere is not made of tools, but of social production practices that these tools enable. The primary effect of the Internet on the ,{[pg 220]}, public sphere in liberal societies relies on the information and cultural production activity of emerging nonmarket actors: individuals working alone and cooperatively with others, more formal associations like NGOs, and their feedback effect on the mainstream media itself. These enable the networked public sphere to moderate the two major concerns with commercial mass media as a platform for the public sphere: (1) the excessive power it gives its owners, and (2) its tendency, when owners do not dedicate their media to exert power, to foster an inert polity. More fundamentally, the social practices of information and discourse allow a very large number of actors to see themselves as potential contributors to public discourse and as potential actors in political arenas, rather than mostly passive recipients of mediated information who occasionally can vote their preferences. In this section, I offer two detailed stories that highlight different aspects of the effects of the networked information economy on the construction of the public sphere. The first story focuses on how the networked public sphere allows individuals to monitor and disrupt the use of mass-media power, as well as organize for political action. The second emphasizes in particular how the networked public sphere allows individuals and groups of intense political engagement to report, comment, and generally play the role traditionally assigned to the press in observing, analyzing, and creating political salience for matters of public interest. The case studies provide a context both for seeing how the networked public sphere responds to the core failings of the commercial, mass-media-dominated public sphere and for considering the critiques of the Internet as a platform for a liberal public sphere.
@@ -1628,7 +1628,7 @@ Only two encyclopedias focus explicitly on Barbie's cultural meaning: Britannica
The relative emphasis of Google and /{Wikipedia}/, on the one hand, and Overture, Yahoo!, and the commercial encyclopedias other than Britannica, on the other hand, is emblematic of a basic difference between markets and social conversations with regard to culture. If we focus on the role of culture as "common knowledge" or background knowledge, its relationship to the market--at least for theoretical economists--is exogenous. It can be taken as given and treated as "taste." In more practical business environments, culture is indeed a source of taste and demand, but it is not taken as exogenous. Culture, symbolism, and meaning, as they are tied with market-based goods, become a major focus of advertising and of demand management. No one who has been exposed to the advertising campaigns of Coca-Cola, Nike, or Apple Computers, as well as practically to any one of a broad range of advertising campaigns over the past few decades, can fail to see that these are not primarily a communication about the material characteristics or qualities of the products or services sold by the advertisers. ,{[pg 290]},
They are about meaning. These campaigns try to invest the act of buying their products or services with a cultural meaning that they cultivate, manipulate, and try to generalize in the practices of the society in which they are advertising, precisely in order to shape taste. They offer an opportunity to generate rents, because the consumer has to have this company's shoe rather than that one, because that particular shoe makes the customer this kind of person rather than that kind--cool rather than stuffy, sophisticated rather than common. Neither the theoretical economists nor the marketing executives have any interest in rendering culture transparent or writable. Whether one treats culture as exogenous or as a domain for limiting the elasticity of demand for one's particular product, there is no impetus to make it easier for consumers to see through the cultural symbols, debate their significance, or make them their own. If there is business reason to do anything about culture, it is to try to shape the cultural meaning of an object or practice, in order to shape the demand for it, while keeping the role of culture hidden and assuring control over the careful cultural choreography of the symbols attached to the company. Indeed, in 1995, the U.S. Congress enacted a new kind of trademark law, the Federal Antidilution Act, which for the first time disconnects trademark protection from protecting consumers from confusion by knockoffs. The Antidilution Act of 1995 gives the owner of any famous mark--and only famous marks--protection from any use that dilutes the meaning that the brand owner has attached to its own mark. It can be entirely clear to consumers that a particular use does not come from the owner of the brand, and still, the owner has a right to prevent this use. 
While there is some constitutional free-speech protection for criticism, there is also a basic change in the understanding of trademark law--from a consumer protection law intended to assure that consumers can rely on the consistency of goods marked in a certain way, to a property right in controlling the meaning of symbols a company has successfully cultivated so that they are, in fact, famous. This legal change marks a major shift in the understanding of the role of law in assigning control for cultural meaning generated by market actors.
-={Antidilutation Act of 1995;branding:trademark dilutation;dilutation of trademaks;logical layer of institutional ecology:trademark dilutation;proprietary rights:trademark dilutation;trademark dilutation;information production, market-based:cultural change, transparency of+4;market-based information producers: cultural change, transparency of+4;nonmarket information producers:cultural change, transparency of+4}
+={Antidilutation Act of 1995;branding:trademark dilutation;dilutation of trademarks;logical layer of institutional ecology:trademark dilutation;proprietary rights:trademark dilutation;trademark dilutation;information production, market-based:cultural change, transparency of+4;market-based information producers: cultural change, transparency of+4;nonmarket information producers:cultural change, transparency of+4}
Unlike market production of culture, meaning making as a social, nonmarket practice has no similar systematic reason to accept meaning as it comes. Certainly, some social relations do. When girls play with dolls, collect them, or exhibit them, they are rarely engaged in reflection on the meaning of the dolls, just as fans of Scarlett O'Hara, of which a brief Internet search suggests there are many, are not usually engaged in critique of Gone with the ,{[pg 291]}, Wind as much as in replication and adoption of its romantic themes. Plainly, however, some conversations we have with each other are about who we are, how we came to be who we are, and whether we view the answers we find to these questions as attractive or not. In other words, some social interactions do have room for examining culture as well as inhabiting it, for considering background knowledge for what it is, rather than taking it as a given input into the shape of demand or using it as a medium for managing meaning and demand. People often engage in conversations with each other precisely to understand themselves in the world, their relationship to others, and what makes them like and unlike those others. One major domain in which this formation of self- and group identity occurs is the adoption or rejection of, and inquiry into, cultural symbols and sources of meaning that will make a group cohere or splinter; that will make people like or unlike each other.
@@ -1680,10 +1680,10 @@ We can analyze the implications of the emergence of the networked information ec
The opportunities that the networked information economy offers, however, often run counter to the central policy drive of both the United States and the European Union in the international trade and intellectual property systems. These two major powers have systematically pushed for ever-stronger proprietary protection and increasing reliance on strong patents, copyrights, and similar exclusive rights as the core information policy for growth and development. Chapter 2 explains why such a policy is suspect from a purely economic perspective concerned with optimizing innovation. ,{[pg 303]}, A system that relies too heavily on proprietary approaches to information production is not, however, merely inefficient. It is unjust. Proprietary rights are designed to elicit signals of people's willingness and ability to pay. In the presence of extreme distribution differences like those that characterize the global economy, the market is a poor measure of comparative welfare. A system that signals what innovations are most desirable and rations access to these innovations based on ability, as well as willingness, to pay, overrepresents welfare gains of the wealthy and underrepresents welfare gains of the poor. Twenty thousand American teenagers can simply afford, and will be willing to pay, much more for acne medication than the more than a million Africans who die of malaria every year can afford to pay for a vaccine. A system that relies too heavily on proprietary models for managing information production and exchange is unjust because it is geared toward serving small welfare increases for people who can pay a lot for incremental improvements in welfare, and against providing large welfare increases for people who cannot pay for what they need.
2~ LIBERAL THEORIES OF JUSTICE AND THE NETWORKED INFORMATION ECONOMY
-={human development and justice:liberal theories of+7;human welfare:liberal theories of justice+7;information economy:justice, liberal theories of+7;justice and human development:liberal theories of+7;liberal societies:theories of justice+7;networked environment policy:justice, liberal theories of+7;welfare:liberal theories of justice+7|see also justice and human development}
+={human development and justice:liberal theories of+7;human welfare:liberal theories of justice+7;information economy:justice, liberal theories of+7;justice and human development:liberal theories of+7;liberal societies:theories of justice+7;welfare:liberal theories of justice+7|see also justice and human development}
Liberal theories of justice can be categorized according to how they characterize the sources of inequality in terms of luck, responsibility, and structure. By luck, I mean reasons for the poverty of an individual that are beyond his or her control, and that are part of that individual's lot in life unaffected by his or her choices or actions. By responsibility, I mean causes for the poverty of an individual that can be traced back to his or her actions or choices. By structure, I mean causes for the inequality of an individual that are beyond his or her control, but are traceable to institutions, economic organizations, or social relations that form a society's transactional framework and constrain the behavior of the individual or undermine the efficacy of his or her efforts at self-help.
-={background knowledge:see culture bad luck, justice and+2;DSL:see broadband networks dumb luck, justice and+2;luck, justice and+2;misfortune, justice and+2;organizational structure:justice and+2;structure of organizations:justice and+2}
+={background knowledge:see culture bad luck, justice and+2;DSL:see broadband networks dumb luck, justice and+2;luck, justice and+2;misfortune, justice and+2;organization structure:justice and+2;structure of organizations:justice and+2}
We can think of John Rawls's /{Theory of Justice}/ as based on a notion that the poorest people are the poorest because of dumb luck. His proposal for a systematic way of defending and limiting redistribution is the "difference principle." A society should organize its redistribution efforts in order to make those who are least well-off as well-off as they can be. The theory of desert is that, because any of us could in principle be the victim of this dumb luck, we would all have agreed, if none of us had known where we ,{[pg 304]}, would be on the distribution of bad luck, to minimize our exposure to really horrendous conditions. The practical implication is that while we might be bound to sacrifice some productivity to achieve redistribution, we cannot sacrifice too much. If we did that, we would most likely be hurting, rather than helping, the weakest and poorest. Libertarian theories of justice, most prominently represented by Robert Nozick's entitlement theory, on the other hand, tend to ignore bad luck or impoverishing structure. They focus solely on whether the particular holdings of a particular person at any given moment are unjustly obtained. If they are not, they may not justly be taken from the person who holds them. Explicitly, these theories ignore the poor. As a practical matter and by implication, they treat responsibility as the source of the success of the wealthy, and by negation, the plight of the poorest--leading them to be highly resistant to claims of redistribution.
={Rawls, John+1;Nozick, Robert;redistribution theory+1}
@@ -2052,7 +2052,7 @@ Increased practical individual autonomy has been central to my claims throughout
={communities:virtual+9;virtual communities+9:see also social relations and norms}
We are seeing two effects: first, and most robustly, we see a thickening of preexisting relations with friends, family, and neighbors, particularly with those who were not easily reachable in the pre-Internet-mediated environment. Parents, for example, use instant messages to communicate with their children who are in college. Friends who have moved away from each other are keeping in touch more than they did before they had e-mail, because e-mail does not require them to coordinate a time to talk or to pay long-distance rates. However, this thickening of contacts seems to occur alongside a loosening of the hierarchical aspects of these relationships, as individuals weave their own web of supporting peer relations into the fabric of what might otherwise be stifling familial relationships. Second, we are beginning to see the emergence of greater scope for limited-purpose, loose relationships. These may not fit the ideal model of "virtual communities." They certainly do not fit a deep conception of "community" as a person's primary source of emotional context and support. They are nonetheless effective and meaningful to their participants. It appears that, as the digitally networked environment begins to displace mass media and telephones, its salient communications characteristics provide new dimensions to thicken existing social relations, while also providing new capabilities for looser and more fluid, but still meaningful social networks. A central aspect of this positive improvement in loose ties has been the technical-organizational shift from an information environment dominated by commercial mass media on a one-to-many model, which does not foster group interaction among viewers, to an information environment that both technically and as a matter of social practice enables user-centric, group-based active cooperation platforms of the kind that typify the networked information economy.
This is not to say that the Internet necessarily affects all people, all social groups, and networks identically. The effects on different people in different settings and networks will likely vary, certainly in their magnitude. My purpose here, however, is ,{[pg 358]}, to respond to the concern that enhanced individual capabilities entail social fragmentation and alienation. The available data do not support that claim as a description of a broad social effect.
-={communication:thickening of preexisting relations;displacement of real-world interactions;family relations, strengthening of;loose affiliations;neighborhood relations, strengthening of;networked public sphere:loose affiliations;norms (social):loose affiliations|thickening of preexisting relations;peer production:loose affiliations;preexisting relations, thickening of;public sphere:loose affiliations;regulation by social norms:loose affiliations|thickening of preexisting relations;scope of loose relationships;social relations and norms:loose affiliations|thickening of preexisting relations;supplantation of real-world interaction;thickening of preexisting relations}
+={communication:thickening of preexisting relations;displacement of real-world interaction;family relations, strengthening of;loose affiliations;neighborhood relations, strengthening of;networked public sphere:loose affiliations;norms (social):loose affiliations|thickening of preexisting relations;peer production:loose affiliations;preexisting relations, thickening of;public sphere:loose affiliations;regulation by social norms:loose affiliations|thickening of preexisting relations;scope of loose relationships;social relations and norms:loose affiliations|thickening of preexisting relations;supplantation of real-world interaction;thickening of preexisting relations}
2~ FROM "VIRTUAL COMMUNITIES" TO FEAR OF DISINTEGRATION
@@ -2081,7 +2081,7 @@ The concerns represented by these early studies of the effects of Internet use o
={Coleman, James;Granovetter, Mark;Putnam, Robert}
There are, roughly speaking, two types of responses to these concerns. The first is empirical. In order for these concerns to be valid as applied to increasing use of Internet communications, it must be the case that Internet communications, with all of their inadequacies, come to supplant real-world human interactions, rather than simply to supplement them. Unless Internet connections actually displace direct, unmediated, human contact, there is no basis to think that using the Internet will lead to a decline in those nourishing connections we need psychologically, or in the useful connections we make socially, that are based on direct human contact with friends, family, and neighbors. The second response is theoretical. It challenges the notion that the socially embedded individual is a fixed entity with unchanging needs that are, or are not, fulfilled by changing social conditions and relations. Instead, it suggests that the "nature" of individuals changes over time, based on actual social practices and expectations. In this case, we are seeing a shift from individuals who depend on social relations that are dominated by locally embedded, thick, unmediated, given, and stable relations, into networked individuals--who are more dependent on their own combination of strong and weak ties, who switch networks, cross boundaries, and weave their own web of more or less instrumental, relatively fluid relationships. Manuel Castells calls this the "networked society,"~{ Manuel Castells, The Rise of Networked Society 2d ed. (Malden, MA: Blackwell Publishers, Inc., 2000). }~ Barry Wellman, "networked individualism."~{ Barry Wellman et al., "The Social Affordances of the Internet for Networked Individualism," Journal of Computer Mediated Communication 8, no. 3 (April 2003). }~ To simplify vastly, it is not that people cease to depend on others and their context for both psychological and social well-being and efficacy.
It is that the kinds of connections that we come to rely on for these basic human needs change over time. Comparisons of current practices to the old ways of achieving the desiderata of community, and fears regarding the loss of community, are more a form of nostalgia than a diagnosis of present social malaise. ,{[pg 363]},
-={Castells, Manuel;Wellman, Barry;displacement of real-world interaction+5;family relations, strengthening of+5;loose affiliations;neighborhood relations, strengthening of+5;networked public sphere:loose affiliations;norms (social):loose affiliations;peer production:loose affiliations;public sphere:loose affiliations;regulations by social norms:loose affiliations;social relations and norms:loose affiliations;supplantation of real-world interaction+5;thickening of preexisting relations+5}
+={Castells, Manuel;Wellman, Barry;displacement of real-world interaction+5;family relations, strengthening of+5;loose affiliations;neighborhood relations, strengthening of+5;networked public sphere:loose affiliations;norms (social):loose affiliations;peer production:loose affiliations;public sphere:loose affiliations;regulation by social norms:loose affiliations;social relations and norms:loose affiliations;supplantation of real-world interaction+5;thickening of preexisting relations+5}
3~ Users Increase Their Connections with Preexisting Relations
={e-mail:thickening of preexisting relations+4;social capital:thickening of preexisting relations+4}
@@ -2198,7 +2198,7 @@ The first two parts of this book explained why the introduction of digital compu
={commercial model of communication:mapping, framework for+13;industrial model of communication:mapping, framework for+13;institutional ecology of digital environment:mapping, framework for+13;layers of institutional ecology+13;policy:mapping institutional ecology+13;policy layers+13;traditional model of communication:mapping, framework for+13}
Two specific examples will illustrate the various levels at which law can operate to shape the use of information and its production and exchange. The first example builds on the story from chapter 7 of how embarrassing internal e-mails from Diebold, the electronic voting machine maker, were exposed by investigative journalism conducted on a nonmarket and peer-production model. After students at Swarthmore College posted the files, Diebold made a demand under the DMCA that the college remove the materials or face suit for contributory copyright infringement. The students were therefore forced to remove the materials. However, in order to keep the materials available, the students asked students at other institutions to mirror the files, and injected them into the eDonkey, BitTorrent, and FreeNet file-sharing and publication networks. Ultimately, a court held that the unauthorized publication of files that were not intended for sale and carried such high public value was a fair use. This meant that the underlying publication of the files was not itself a violation, and therefore the Internet service provider was not liable for providing a conduit. However, the case was decided on September 30, 2004--long after the information would have been relevant ,{[pg 390]}, to the voting equipment certification process in California. What kept the information available for public review was not the ultimate vindication of the students' publication. It was the fact that the materials were kept in the public sphere even under threat of litigation. Recall also that at least some of the earlier set of Diebold files that were uncovered by the activist who had started the whole process in early 2003 were zipped, or perhaps encrypted in some form.
Scoop, the Web site that published the revelation of the initial files, published--along with its challenge to the Internet community to scour the files and find holes in the system--links to locations in which utilities necessary for reading the files could be found.
-={Diebold Elections Systems+3;electronic voting machines (case study)+3;networked public sphere:Diebold Election Systems case study+3;policy:Diebold Election Systems case study+3;public sphere:Diebold Election Systems case study+3;voting, electronic+3}
+={Diebold Election Systems+3;electronic voting machines (case study)+3;networked public sphere:Diebold Election Systems case study+3;policy:Diebold Election Systems case study+3;public sphere:Diebold Election Systems case study+3;voting, electronic+3}
There are four primary potential points of failure in this story that could have conspired to prevent the revelation of the Diebold files, or at least to suppress the peer-produced journalistic mode that made them available. First, if the service provider--the college, in this case--had been a sole provider with no alternative physical transmission systems, its decision to block the materials under threat of suit would have prevented publication of the materials throughout the relevant period. Second, the existence of peer-to-peer networks that overlay the physical networks and were used to distribute the materials made expunging them from the Internet practically impossible. There was no single point of storage that could be locked down. This made the prospect of threatening other universities futile. Third, those of the original files that were not in plain text were readable with software utilities that were freely available on the Internet, and to which Scoop pointed its readers. This made the files readable to many more critical eyes than they otherwise would have been. Fourth, and finally, the fact that access to the raw materials--the e-mails--was ultimately found to be privileged under the fair-use doctrine in copyright law allowed all the acts that had been performed in the preceding period under a shadow of legal liability to proceed in the light of legality.
@@ -2218,7 +2218,7 @@ The remainder of this chapter provides a more or less detailed presentation of t
A quick look at table 11.1 reveals that there is a diverse set of sources of openness. A few of these are legal. Mostly, they are based on technological and social practices, including resistance to legal and regulatory drives toward enclosure. Examples of policy interventions that support an open core common infrastructure are the FCC's increased permission to deploy open wireless networks and the various municipal broadband initiatives. The former is a regulatory intervention, but its form is largely removal of past prohibitions on an entire engineering approach to building wireless systems. Municipal efforts to produce open broadband networks are being resisted at the state legislation level, with statutes that remove the power to provision broadband from the home rule powers of municipalities. For the most part, the drive for openness is based on individual and voluntary cooperative action, not law. The social practices of openness take on a quasi-normative face when practiced in standard-setting bodies like the Internet Engineering Task Force (IETF) or the World Wide Web Consortium (W3C). However, none of these have the force of law. Legal devices also support openness when used in voluntaristic models like free software licensing and Creative Commons-type licensing. However, most often when law has intervened in its regulatory force, as opposed to its contractual-enablement force, it has done so almost entirely on the side of proprietary enclosure.
Another characteristic of the social-economic-institutional struggle is an alliance between a large number of commercial actors and the social sharing culture. We see this in the way that wireless equipment manufacturers are selling into a market of users of WiFi and similar unlicensed wireless devices. We see this in the way that personal computer manufacturers are competing ,{[pg 395]}, over decreasing margins by producing the most general-purpose machines that would be most flexible for their users, rather than machines that would most effectively implement the interests of Hollywood and the recording industry. We see this in the way that service and equipment-based firms, like IBM and Hewlett-Packard (HP), support open-source and free software. The alliance between the diffuse users and the companies that are adapting their business models to serve them as users, instead of as passive consumers, affects the political economy of this institutional battle in favor of openness. On the other hand, security consciousness in the United States has led to some efforts to tip the balance in favor of closed proprietary systems, apparently because these are currently perceived as more secure, or at least more amenable to government control. While orthogonal in its political origins to the battle between proprietary and commons-based strategies for information production, this drive does tilt the field in favor of enclosure, at least at the time of this writing in 2005.
-={commercial model of communication:security related policy;industrial model of communication:security-related policy;institutional ecology of digital environment:security-related policy;policy:security-related;security-related policy;traditional model of communication:security-related policy}
+={commercial model of communication:security-related policy;industrial model of communication:security-related policy;institutional ecology of digital environment:security-related policy;policy:security-related;security-related policy;traditional model of communication:security-related policy}
% paragraph end moved above table
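The corrections in this commit all operate on SiSu's `={...}` book-index tags, in which `;` separates index entries, `:` introduces subentries, `|` separates multiple subentries, and a `+N` suffix extends an entry over the N following objects. As a rough, unofficial illustration (this parser is not part of SiSu; the field meanings are inferred from the sample lines above), the tag structure can be sketched like this:

```python
# Illustrative parser for SiSu-style book-index tags such as
# "={communities:virtual+9;virtual communities+9:see also ...}".
# Assumptions (inferred from the markup samples, not from a SiSu spec):
#   ";" separates entries, ":" starts the subentry list, "|" separates
#   subentries, and "+N" marks an entry spanning N following objects.

def _split_span(text):
    """Split 'term+9' into ('term', 9); a term without '+N' gets span 0."""
    head, plus, n = text.rpartition('+')
    if plus and n.isdigit():
        return head, int(n)
    return text, 0

def parse_index_tag(tag):
    """Parse an index-tag line, tolerating a leading diff-style '+'/'-'."""
    body = tag.strip().lstrip('+-')
    if not (body.startswith('={') and body.endswith('}')):
        raise ValueError('not an index tag: %r' % tag)
    entries = []
    for raw in body[2:-1].split(';'):
        term, _, subpart = raw.partition(':')
        term, span = _split_span(term)
        subs = [_split_span(s) for s in subpart.split('|')] if subpart else []
        entries.append({'term': term, 'span': span, 'subs': subs})
    return entries

print(parse_index_tag('={communities:virtual+9;virtual communities+9:'
                      'see also social relations and norms}'))
```

Under these assumptions, the diff pairs above amount to edits of `term`, `span`, and `subs` fields; for example, the `Diebold Elections Systems` / `Diebold Election Systems` pair changes only the head term while keeping its `+3` span.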
diff --git a/data/v2/samples/two_bits.christopher_kelty.sst b/data/v2/samples/two_bits.christopher_kelty.sst
index 85efb46..1cff4f9 100644
--- a/data/v2/samples/two_bits.christopher_kelty.sst
+++ b/data/v2/samples/two_bits.christopher_kelty.sst
@@ -94,7 +94,7 @@ At first glance, the thread tying these projects together seems to be the Intern
={Internet+12:relation to Free Software;Free Software:relation to Internet;public sphere:theories of}
Both the Internet and Free Software are historically specific, that is, not just any old new media or information technology. But the Internet is many, many specific things to many, many specific people. As one reviewer of an early manuscript version of this book noted, "For most people, the Internet is porn, stock quotes, Al Jazeera clips of executions, Skype, seeing pictures of the grandkids, porn, never having to buy another encyclopedia, MySpace, e-mail, online housing listings, Amazon, Googling potential romantic interests, etc. etc." It is impossible to explain all of these things; the meaning and significance of the proliferation of digital pornography is a very different concern than that of the fall of the print encyclopedia ,{[pg 5]}, and the rise of Wikipedia. Yet certain underlying practices relate these diverse phenomena to one another and help explain why they have occurred at this time and in this technical, legal, and social context. By looking carefully at Free Software and its modulations, I suggest, one can come to a better understanding of the changes affecting pornography, Wikipedia, stock quotes, and many other wonderful and terrifying things.~{ Wikipedia is perhaps the most widely known and generally familiar example of what this book is about. Even though it is not identified as such, it is in fact a Free Software project and a "modulation" of Free Software as I describe it here. The non-technically inclined reader might keep Wikipedia in mind as an example with which to follow the argument of this book. I will return to it explicitly in part 3. However, for better or for worse, there will be no discussion of pornography. }~
-={Wikipedia}
+={Wikipedia (collaborative encyclopedia)}
Two Bits has three parts. Part I of this book introduces the reader to the concept of recursive publics by exploring the lives, works, and discussions of an international community of geeks brought together by their shared interest in the Internet. Chapter 1 asks, in an ethnographic voice, "Why do geeks associate with one another?" The answer—told via the story of Napster in 2000 and the standards process at the heart of the Internet—is that they are making a recursive public. Chapter 2 explores the words and attitudes of geeks more closely, focusing on the strange stories they tell (about the Protestant Reformation, about their practical everyday polymathy, about progress and enlightenment), stories that make sense of contemporary political economy in sometimes surprising ways. Central to part I is an explication of the ways in which geeks argue about technology but also argue with and through it, by building, modifying, and maintaining the very software, networks, and legal tools within which and by which they associate with one another. It is meant to give the reader a kind of visceral sense of why certain arrangements of technology, organization, and law—specifically that of the Internet and Free Software—are so vitally important to these geeks.
={geeks;Napster;technology:as argument}
@@ -209,7 +209,7 @@ The study of distributed phenomena does not necessarily imply the detailed, loca
={Weber, Max}
It is in this sense that the ethnographic object of this study is not geeks and not any particular project or place or set of people, but Free Software and the Internet. Even more precisely, the ethnographic object of this study is "recursive publics"—except that this concept is also the work of the ethnography, not its preliminary object. I could not have identified "recursive publics" as the object of the ethnography at the outset, and this is nice proof that ethnographic work is a particular kind of epistemological encounter, an encounter that requires considerable conceptual work during and after the material labor of fieldwork, and throughout the material labor of writing and rewriting, in order to make sense of and reorient it into a question that will have looked deliberate and ,{[pg 21]}, answerable in hindsight. Ethnography of this sort requires a long-term commitment and an ability to see past the obvious surface of rapid transformation to a more obscure and slower temporality of cultural significance, yet still pose questions and refine debates about the near future.~{ Despite what might sound like a "shoot first, ask questions later" approach, the design of this project was in fact conducted according to specific methodologies. The most salient is actor-network theory: Latour, Science in Action; Law, "Technology and Heterogeneous Engineering"; Callon, "Some Elements of a Sociology of Translation"; Latour, Pandora’s Hope; Latour, Re-assembling the Social; Callon, Laws of the Markets; Law and Hassard, Actor Network Theory and After. Ironically, there have been no actor-network studies of networks, which is to say, of particular information and communication technologies such as the Internet. 
The confusion of the word network (as an analytical and methodological term) with that of network (as a particular configuration of wires, waves, software, and chips, or of people, roads, and buses, or of databases, names, and diseases) means that it is necessary to always distinguish this-network-here from any-network-whatsoever. My approach shares much with the ontological questions raised in works such as Law, Aircraft Stories; Mol, The Body Multiple; Cussins, "Ontological Choreography"; Charis Thompson, Making Parents; and Dumit, Picturing Personhood. }~ Historically speaking, the chapters of part II can be understood as a contribution to a history of scientific infrastructure—or perhaps to an understanding of large-scale, collective experimentation.~{ I understand a concern with scientific infrastructure to begin with Steve Shapin and Simon Schaffer in Leviathan and the Air Pump, but the genealogy is no doubt more complex. It includes Shapin, The Social History of Truth; Biagioli, Galileo, Courtier; Galison, How Experiments End and Image and Logic; Daston, Biographies of Scientific Objects; Johns, The Nature of the Book. A whole range of works explore the issue of scientific tools and infrastructure: Kohler, Lords of the Fly; Rheinberger, Towards a History of Epistemic Things; Landecker, Culturing Life; Keating and Cambrosio, Biomedical Platforms. Bruno Latour’s "What Rules of Method for the New Socio-scientific Experiments" provides one example of where science studies might go with these questions. Important texts on the subject of technical infrastructures include Walsh and Bayma, "Computer Networks and Scientific Work"; Bowker and Star, Sorting Things Out; Edwards, The ,{[pg 316]}, Closed World; Misa, Brey, and Feenberg, Modernity and Technology; Star and Ruhleder, "Steps Towards an Ecology of Infrastructure." 
}~ The Internet and Free Software are each an important practical transformation that will have effects on the practice of science and a kind of complex technical practice for which there are few existing models of study.
-={actor network theory;Internet+1}
+={Actor Network Theory;Internet+1}
A methodological note about the peculiarity of my subject is also in order. The Attentive Reader will note that there are very few fragments of conventional ethnographic material (i.e., interviews or notes) transcribed herein. Where they do appear, they tend to be "publicly available"—which is to say, accessible via the Internet—and are cited as such, with as much detail as necessary to allow the reader to recover them. Conventional wisdom in both anthropology and history has it that what makes a study interesting, in part, is the work a researcher has put into gathering that which is not already available, that is, primary sources as opposed to secondary sources. In some cases I provide that primary access (specifically in chapters 2, 8, and 9), but in many others it is now literally impossible: nearly everything is archived. Discussions, fights, collaborations, talks, papers, software, articles, news stories, history, old software, old software manuals, reminiscences, notes, and drawings—it is all saved by someone, somewhere, and, more important, often made instantly available by those who collect it. The range of conversations and interactions that count as private (either in the sense of disappearing from written memory or of being accessible only to the parties involved) has shrunk demonstrably since about 1981.
={ethnographic data:availability of+5}
@@ -293,7 +293,7 @@ _1 2. Boyle, "The Second Enclosure Movement and the Construction of the Public D
2~ From the Facts of Human Activity
Boston, May 2003. Starbucks. Sean and Adrian are on their way to pick me up for dinner. I’ve already had too much coffee, so I sit at the window reading the paper. Eventually Adrian calls to find out where I am, I tell him, and he promises to show up in fifteen minutes. I get bored and go outside to wait, watch the traffic go by. More or less right on time (only post-dotcom is Adrian ever on time), Sean’s new blue VW Beetle rolls into view. Adrian jumps out of the passenger seat and into the back, and I get in. Sean has been driving for a little over a year. He seems confident, cautious, but meanders through the streets of Cambridge. We are destined for Winchester, a township on the Charles River, in order to go to an Indian restaurant that one of Sean’s friends has recommended. When I ask how they are doing, they say, "Good, good." Adrian offers, "Well, Sean’s better than he has been in two years." "Really?" I say, impressed.
-={Doyle, Sean+6;Groper Adrian+6}
+={Doyle, Sean+6;Gropper, Adrian+6}
Sean says, "Well, happier than at least the last year. I, well, let me put it this way: forgive me father for I have sinned, I still have unclean thoughts about some of the upper management in the company, I occasionally think they are not doing things in the best interest of the company, and I see them as self-serving and sometimes wish them ill." In this rolling blue confessional Sean describes some of the people who I am familiar with whom he now tries very hard not to think about. I look at him and say, "Ten Hail Marys and ten Our Fathers, and you will be absolved, my child." Turning to Adrian, I ask, "And what about you?" Adrian continues the joke: "I, too, have sinned. I have reached the point where I can see absolutely nothing good coming of this company but that I can keep my investments in it long enough to pay for my children’s college tuition." I say, "You, my son, I cannot help." Sean says, "Well, funny thing about tainted money . . . there just taint enough of it."
@@ -1106,7 +1106,7 @@ The absence of an economic or corporate mandate for Thompson’s and Ritchie’s
={AT&T+14;McIlroy, Douglas}
UNIX was unique for many technical reasons, but also for a specific economic reason: it was never quite academic and never quite commercial. Martin Campbell-Kelly notes that UNIX was a "non-proprietary operating system of major significance."~{ Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog, 143. }~ Kelly’s use of "non-proprietary" is not surprising, but it is incorrect. Although business-speak regularly opposed open to proprietary throughout the 1980s and early 1990s (and UNIX was definitely the former), Kelly’s slip marks clearly the confusion between software ownership and software distribution that permeates both popular and academic understandings. UNIX was indeed proprietary—it was copyrighted and wholly owned by Bell Labs and in turn by Western Electric ,{[pg 127]}, and AT&T—but it was not exactly commercialized or marketed by them. Instead, AT&T allowed individuals and corporations to install UNIX and to create UNIX-like derivatives for very low licensing fees. Until about 1982, UNIX was licensed to academics very widely for a very small sum: usually royalty-free with a minimal service charge (from about $150 to $800).~{ Ritchie’s Web site contains a copy of a 1974 license (http://cm.bell-labs.com/cm/cs/who/dmr/licenses.html) and a series of ads that exemplify the uneasy positioning of UNIX as a commercial product (http://cm.bell-labs.com/cm/cs/who/dmr/unixad.html). According to Don Libes and Sandy Ressler, "The original licenses were source licenses. . . . [C]ommercial institutions paid fees on the order of $20,000. If you owned more than one machine, you had to buy binary licenses for every additional machine [i.e., you were not allowed to copy the source and install it] you wanted to install UNIX on. They were fairly pricey at $8000, considering you couldn’t resell them. 
On the other hand, educational institutions could buy source licenses for several hundred dollars—just enough to cover Bell Labs’ administrative overhead and the cost of the tapes" (Life with UNIX, 20-21). }~ The conditions of this license allowed researchers to do what they liked with the software so long as they kept it secret: they could not distribute or use it outside of their university labs (or use it to create any commercial product or process), nor publish any part of it. As a result, throughout the 1970s UNIX was developed both by Thompson and Ritchie inside Bell Labs and by users around the world in a relatively informal manner. Bell Labs followed such a liberal policy both because it was one of a small handful of industry-academic research and development centers and because AT&T was a government monopoly that provided phone service to the country and was therefore forbidden to directly enter the computer software market.~{ According to Salus, this licensing practice was also a direct result of Judge Thomas Meaney’s 1956 antitrust consent decree which required AT&T to reveal and to license its patents for nominal fees (A Quarter Century of UNIX, 56); see also Brock, The Second Information Revolution, 116-20. }~
-={AT&T:Bell Labratories+13;licensing, of UNIX+6;proprietary systems: open vs.;monopoly}
+={AT&T:Bell Laboratories+13;licensing, of UNIX+6;proprietary systems: open vs.;monopoly}
Being on the border of business and academia meant that UNIX was, on the one hand, shielded from the demands of management and markets, allowing it to achieve the conceptual integrity that made it so appealing to designers and academics. On the other, it also meant that AT&T treated it as a potential product in the emerging software industry, which included new legal questions from a changing intellectual-property regime, novel forms of marketing and distribution, and new methods of developing, supporting, and distributing software.
@@ -1160,7 +1160,7 @@ Unfortunately, Commentary was also legally restricted in its distribution. AT&T
={trade secret law+1}
Thus, these generations of computer-science students and academics shared a secret—a trade secret become open secret. Every student who learned the essentials of the UNIX operating system from a photocopy of Lions's commentary also learned about AT&T's attempt to control its legal distribution on the front cover of their textbook. The parallel development of photocopying has a nice resonance here; together with home cassette taping of music and the introduction of the video-cassette recorder, photocopying helped drive the changes to copyright law adopted in 1976.
-={copyright:changes in}
+={copyright:changes in 1976}
Thirty years later, and long after the source code in it had been completely replaced, Lions's Commentary is still widely admired by geeks. Even though Free Software has come full circle in providing students with an actual operating system that can be legally studied, taught, copied, and implemented, the kind of "literary criticism" that Lions's work represents is still extremely rare; even reading obsolete code with clear commentary is one of the few ways to truly understand the design elements and clever implementations that made the UNIX operating system so different from its predecessors and even many of its successors, few, if any, of which have been so successfully ported to the minds of so many students.
={design+2}
@@ -1241,7 +1241,7 @@ The open-systems story is also a story of the blind spot of open systems—in th
={intellectual property;interoperability+21;openness (component of Free Software):intellectual property and}
Standardization was at the heart of the contest, but by whom and by what means was never resolved. The dream of open systems, pursued in an entirely unregulated industry, resulted in a complicated experiment in novel forms of standardization and cooperation. The creation of a "standard" operating system based on UNIX is the story of a failure, a kind of "figuring out" gone haywire, which resulted in huge consortia of computer manufacturers attempting to work together and compete with each other at the same time. Meanwhile, the successful creation of a "standard" networking protocol—known as the Open Systems Interconnection Reference Model (OSI)—is a story of failure that hides a larger success; OSI was eclipsed in the same period by the rapid and ad hoc adoption of the Transmission Control Protocol/Internet Protocol (TCP/IP), which used a radically different standardization process and which succeeded for a number of surprising reasons, allowing the Internet ,{[pg 145]}, to take the form it did in the 1990s and ultimately exemplifying the moral-technical imaginary of a recursive public—and one at the heart of the practices of Free Software.
-={figuring out;Open Systems Interconnection (OSI), as reference model;Openness (component of Free Software):standardization and;protocols:Open Systems Interconnection (OSI)|TCP/IP;standards organizations;TCP/IP (Transmission Control Protocol/Internet Protocol)}
+={figuring out;Open Systems Interconnection (OSI):as reference model;Openness (component of Free Software):standardization and;protocols:Open Systems Interconnection (OSI)|TCP/IP;standards organizations;TCP/IP (Transmission Control Protocol/Internet Protocol)}
The conceiving of openness, which is the central plot of these two stories, has become an essential component of the contemporary practice and power of Free Software. These early battles created a kind of widespread readiness for Free Software in the 1990s, a recognition of Free Software as a removal of open systems’ blind spot, as much as an exploitation of its power. The geek ideal of openness and a moral-technical order (the one that made Napster so significant an event) was forged in the era of open systems; without this concrete historical conception of how to maintain openness in technical and moral terms, the recursive public of geeks would be just another hierarchical closed organization—a corporation manqué—and not an independent public serving as a check on the kinds of destructive power that dominated the open-systems contest.
={Napster}
@@ -1427,7 +1427,7 @@ The growth of Free Software in the 1980s and 1990s depended on openness as a con
={Open Systems:networks and+28}
The struggle to standardize UNIX as a platform for open systems was not the only open-systems struggle; alongside the UNIX wars, another "religious war" was raging. The attempt to standardize networks—in particular, protocols for the inter-networking of multiple, diverse, and autonomous networks of computers—was also a key aspect of the open-systems story of the 1980s.~{ The distinction between a protocol, an implementation, and a standard is important: Protocols are descriptions of the precise terms by which two computers can communicate (i.e., a dictionary and a handbook for communicating). An implementation is the creation of software that uses a protocol (i.e., actually does the communicating; thus two implementations using the same protocol should be able to share data). A standard defines which protocol should be used by which computers, for what purposes. It may or may not define the protocol, but will set limits on changes to that protocol. }~ The war ,{[pg 167]}, between TCP/IP and OSI was also a story of failure and surprising success: the story of a successful standard with international approval (the OSI protocols) eclipsed by the experimental, military-funded TCP/IP, which exemplified an alternative and unusual standards process. The moral-technical orders expressed by OSI and TCP/IP are, like that of UNIX, on the border between government, university, and industry; they represent conflicting social imaginaries in which power and legitimacy are organized differently and, as a result, expressed differently in the technology.
-={moral and technical order;Networks:protools for+3;Open Systems Interconnection (OSI), as reference model+27;protocols:Open Systems Interconnection (OSI)+27|TCP/IP;TCP/IP (Transmission Control Protocol/Internet Protocol)+27;religious wars+3;social imaginary;standards process+3}
+={moral and technical order;Networks:protocols for+3;Open Systems Interconnection (OSI):as reference model+27;protocols:Open Systems Interconnection (OSI)+27|TCP/IP;TCP/IP (Transmission Control Protocol/Internet Protocol)+27;religious wars+3;social imaginary;standards processes+3}
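The footnote's three-way distinction can be made concrete in a few lines of code: a protocol is the agreed wire format, an implementation is any program that speaks it, and two implementations written independently can still exchange data so long as both follow the protocol's terms. A minimal sketch, using a hypothetical length-prefixed format rather than any actual Arpanet protocol:

```python
# Toy illustration of the protocol/implementation distinction.
# The "protocol" here is a made-up rule: a message is sent as
# "<length>:<ascii bytes>", e.g. "5:hello". Two implementations,
# written separately, interoperate because both follow that rule.

def impl_one_encode(message: str) -> bytes:
    """Implementation one: writes the length-prefixed format."""
    data = message.encode("ascii")
    return str(len(data)).encode("ascii") + b":" + data

def impl_two_decode(wire: bytes) -> str:
    """Implementation two: reads the same format, independently written."""
    length_part, _, rest = wire.partition(b":")
    length = int(length_part)
    return rest[:length].decode("ascii")

# Because both sides honor the protocol, data crosses the boundary intact:
assert impl_two_decode(impl_one_encode("hello")) == "hello"
```

A standard, in the footnote's terms, would then be the document that fixes this format and limits how it may change; neither function above is the standard, and either could be rewritten freely without breaking interoperation.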
OSI and TCP/IP started with different goals: OSI was intended to satisfy everyone, to be the complete and comprehensive model against which all competing implementations would be validated; TCP/IP, by contrast, emphasized the easy and robust interconnection of diverse networks. TCP/IP is a protocol developed by bootstrapping between standard and implementation, a mode exemplified by the Requests for Comments system that developed alongside them as part of the Arpanet project. OSI was a "model" or reference standard developed by internationally respected standards organizations.
={Arpanet (network)+18;Request for Comments (RFC)}
@@ -1453,7 +1453,7 @@ One important feature united almost all of these experiments: the networks of th
={antitrust}
TCP/IP and OSI have become emblematic of the split between the worlds of telecommunications and computing; the metaphors of religious wars or of blood feuds and cold wars were common.~{ Drake, "The Internet Religious War." }~ A particularly arch account from this period is Carl Malamud’s Exploring the Internet: A Technical Travelogue, which documents Malamud’s (physical) visits to Internet sites around the globe, discussions (and beer) with networking researchers on technical details of the networks they have created, and his own typically geeky, occasionally offensive takes on cultural difference.~{ Malamud, Exploring the Internet; see also Michael M. J. Fischer, "Worlding Cyberspace." }~ A subtheme of the story is the religious war between Geneva (in particular the ITU) and the Internet: Malamud tells the story of asking the ITU to release its 19,000-page "blue book" of standards on the Internet, to facilitate its adoption and spread.
-={Malmud, Carl+1;standards process+4}
+={Malamud, Carl+1;standards processes+4}
The resistance of the ITU and Malamud’s heroic if quixotic attempts are a parable of the moral-technical imaginaries of openness—and indeed, his story draws specifically on the usable past of Giordano Bruno.~{ The usable past of Giordano Bruno is invoked by Malamud to signal the heretical nature of his own commitment to openly publishing standards that ISO was opposed to releasing. Bruno’s fate at the hands of the Roman Inquisition hinged in some part on his acceptance of the Copernican cosmology, so he has been, like Galileo, a natural figure for revolutionary claims during the 1990s. }~ The "bruno" project demonstrates the gulf that exists between two models of legitimacy—those of ISO and the ITU—in which standards represent the legal and legitimate consensus of a regulated industry, approved by member nations, paid for and enforced by governments, and implemented and adhered to by corporations.
={Bruno, Giordano;Usable pasts;International Organization for Standardization (ISO)+3}
@@ -1472,10 +1472,10 @@ Until the mid-1980s, the TCP/IP protocols were resolutely research-oriented, and
={Cerf, Vinton+2;Kahn, Robert;TCP/IP (Transmission Control Protocol/Internet Protocol):goals of+2}
The explicit goal of TCP/IP was thus to share computer resources, not necessarily to connect two individuals or firms together, or to create a competitive market in networks or networking software. Sharing between different kinds of networks implied allowing the different networks to develop autonomously (as their creators and maintainers saw best), but without sacrificing the ability to continue sharing. Years later, David Clark, chief Internet engineer for several years in the 1980s, gave a much more explicit explanation of the goals that led to the TCP/IP protocols. In particular, he suggested that the main overarching goal was not just to share resources but "to develop an effective technique for multiplexed utilization of existing interconnected networks," and he more explicitly stated the issue of control that faced the designers: "Networks represent administrative boundaries of control, and it was an ambition of this project to come to grips with the problem of integrating a number ,{[pg 173]}, of separately administrated entities into a common utility."~{ Clark, "The Design Philosophy of the DARPA Internet Protocols," 54-55. }~ By placing the goal of expandability first, the TCP/IP protocols were designed with a specific kind of simplicity in mind: the test of the protocols’ success was simply the ability to connect.
-={Clark,David}
+={Clark, David}
By setting different goals, TCP/IP and OSI thus differed in terms of technical details; but they also differed in terms of their context and legitimacy, one being a product of international-standards bodies, the other of military-funded research experiments. The technical and organizational differences imply different processes for standardization, and it is the peculiar nature of the so-called Requests for Comments (RFC) process that gave TCP/IP one of its most distinctive features. The RFC system is widely recognized as a unique and serendipitous outcome of the research process of Arpanet.~{ RFCs are archived in many places, but the official site is RFC Editor, http://www.rfc-editor.org/. }~ In a thirty-year retrospective (published, naturally, as an RFC: RFC 2555), Vint Cerf says, "Hiding in the history of the RFCs is the history of human institutions for achieving cooperative work." He goes on to describe their evolution over the years: "When the RFCs were first produced, they had an almost 19th century character to them—letters exchanged in public debating the merits of various design choices for protocols in the ARPANET. As email and bulletin boards emerged from the fertile fabric of the network, the far-flung participants in this historic dialog began to make increasing use of the online medium to carry out the discussion—reducing the need for documenting the debate in the RFCs and, in some respects, leaving historians somewhat impoverished in the process. RFCs slowly became conclusions rather than debates."~{ RFC Editor, RFC 2555, 6. }~
-={standards process;Request for Comments (RFC)+2}
+={standards processes;Request for Comments (RFC)+2}
Increasingly, they also became part of a system of discussion and implementation in which participants created working software as part of an experiment in developing the standard, after which there was more discussion, then perhaps more implementation, and finally, a standard. The RFC process was a way to condense the process of standardization and validation into implementation; which is to say, the proof of open systems was in the successful connection of diverse networks, and the creation of a standard became a kind of ex post facto rubber-stamping of this demonstration. Any further improvement of the standard hinged on an improvement on the standard implementation because the standards that resulted were freely and widely available: "A user could request an RFC by email from his host computer and have it automatically delivered to his mailbox. . . . RFCs were also shared freely with official standards ,{[pg 174]}, bodies, manufacturers and vendors, other working groups, and universities. None of the RFCs were ever restricted or classified. This was no mean feat when you consider that they were being funded by DoD during the height of the Cold War."~{ Ibid., 11. }~
={Software:implementation of;standards:implementation+9|validation of;Secrecy+1}
@@ -1554,7 +1554,7 @@ Stallman’s GNU General Public License "hacks" the federal copyright law, as is
={Copyleft licenses (component of Free Software):as hack of copyright law+1;Copyright+1}
Like all software since the 1980 copyright amendments, Free Software is copyrightable—and what’s more, automatically copyrighted as it is written (there is no longer any requirement to register). Copyright law grants the author (or the employer of the author) a number of strong rights over the dispensation of what has been written: rights to copy, distribute, and change the work.~{ Copyright Act of 1976, Pub. L. No. 94-553, 90 Stat. 2541, enacted 19 October 1976; and Copyright Amendments, Pub. L. No. 96-517, 94 Stat. 3015, 3028 (amending §101 and §117, title 17, United States Code, regarding computer programs), enacted 12 December 1980. All amendments since 1976 are listed at http://www.copyright.gov/title17/92preface.html. }~ Free Software’s hack is to immediately make use of these rights in order to abrogate the rights the programmer has been given, thus granting all subsequent licensees rights to copy, distribute, modify, and use the copyrighted software. Some licenses, like the GPL, add the further restriction that every licensee must offer the same terms to any subsequent licensee; others make no such restriction on subsequent uses. Thus, while statutory law suggests that individuals need strong rights and grants them, Free Software licenses effectively annul them in favor of other activities, such as sharing, porting, and forking software. It is for this reason that they have earned the name "copyleft."~{ The history of the copyright and software is discussed in Litman, Digital Copyright; Cohen et al., Copyright in a Global Information Economy; and Merges, Menell, and Lemley, Intellectual Property in the New Technological Age. }~
-={Copyright:changes in|rights granted by}
+={Copyright:changes in 1976|rights granted by}
This is a convenient ex post facto description, however. Neither Stallman nor anyone else started out with the intention of hacking copyright law. The hack of the Free Software licenses was a response to a complicated controversy over a very important invention, a tool that in turn enabled an invention called EMACS. The story of the controversy is well-known among hackers and geeks, but not often told, and not in any rich detail, outside of these small circles.~{ See Wayner, Free for All; Moody, Rebel Code; and Williams, Free as in Freedom. Although this story could be told simply by interviewing Stallman and James Gosling, both of whom are still alive and active in the software world, I have chosen to tell it through a detailed analysis of the Usenet and Arpanet archives of the controversy. The trade-off is between a kind of incomplete, fly-on-the-wall access to a moment in history and the likely revisionist retellings of those who lived through it. All of the messages referenced here are cited by their "Message-ID," which should allow anyone interested to access the original messages through Google Groups (http://groups.google.com). }~
@@ -1840,10 +1840,10 @@ The final component of Free Software is coordination. For many participants and
={Free Software:open source vs.;Open Source:Free Software vs.;peer production;practices:five components of Free Software+2;Source Code Management tools (SCMs)}
Coordination is important because it collapses and resolves the distinction between technical and social forms into a meaningful ,{[pg 211]}, whole for participants. On the one hand, there is the coordination and management of people; on the other, there is the coordination of source code, patches, fixes, bug reports, versions, and distributions—but together there is a meaningful technosocial practice of managing, decision-making, and accounting that leads to the collaborative production of complex software and networks. Such coordination would be unexceptional, essentially mimicking long-familiar corporate practices of engineering, except for one key fact: it has no goals. Coordination in Free Software privileges adaptability over planning. This involves more than simply allowing any kind of modification; the structure of Free Software coordination actually gives precedence to a generalized openness to change, rather than to the following of shared plans, goals, or ideals dictated or controlled by a hierarchy of individuals.~{ On the distinction between adaptability and adaptation, see Federico Iannacci, "The Linux Managing Model," http://opensource.mit.edu/papers/iannacci2.pdf. Matt Ratto characterizes the activity of Linux-kernel developers as a "culture of re-working" and a "design for re-design," and captures the exquisite details of such a practice both in coding and in the discussion between developers, an activity he dubs the "pressure of openness" that "results as a contradiction between the need to maintain productive collaborative activity and the simultaneous need to remain open to new development directions" ("The Pressure of Openness," 112-38). }~
-={adaptability:planning vs.+1|as a form of critique+1|adaptation vs.;coordination (component of Free Software):individual virtuosity vs. hierarchical planning+2;critique, Free Software+1;goals, lack of in Free Software+1;hackers:curiosity and virtuosity of+1;hierarchy, in coordination+5;planning+1}
+={adaptability:planning vs.+1|as a form of critique+1|adaptation vs.;coordination (component of Free Software):individual virtuosity vs. hierarchical planning+2;critique, Free Software as+1;goals, lack of in Free Software+1;hackers:curiosity and virtuosity of+1;hierarchy, in coordination+5;planning+1}
Adaptability does not mean randomness or anarchy, however; it is a very specific way of resolving the tension between the individual curiosity and virtuosity of hackers, and the collective coordination necessary to create and use complex software and networks. No man is an island, but no archipelago is a nation, so to speak. Adaptability preserves the "joy" and "fun" of programming without sacrificing the careful engineering of a stable product. Linux and Apache should be understood as the results of this kind of coordination: experiments with adaptability that have worked, to the surprise of many who have insisted that complexity requires planning and hierarchy. Goals and planning are the province of governance—the practice of goal-setting, orientation, and definition of control—but adaptability is the province of critique, and this is why Free Software is a recursive public: it stands outside power and offers powerful criticism in the form of working alternatives. It is not the domain of the new—after all Linux is just a rewrite of UNIX—but the domain of critical and responsive public direction of a collective undertaking.
-={Linux (Free Software project)+8;novelty, of free software;recursive public+1}
+={Linux (Free Software project)+8;novelty, of Free Software;recursive public+1}
Linux and Apache are more than pieces of software; they are organizations of an unfamiliar kind. My claim that they are "recursive publics" is useful insofar as it gives a name to a practice that is neither corporate nor academic, neither profit nor nonprofit, neither governmental nor nongovernmental. The concept of recursive public includes, within the spectrum of political activity, the creation, modification, and maintenance of software, networks, and legal documents. While a "public" in most theories is a body of ,{[pg 212]}, people and a discourse that give expressive form to some concern, "recursive public" is meant to suggest that geeks not only give expressive form to some set of concerns (e.g., that software should be free or that intellectual property rights are too expansive) but also give concrete infrastructural form to the means of expression itself. Linux and Apache are tools for creating networks by which expression of new kinds can be guaranteed and by which further infrastructural experimentation can be pursued. For geeks, hacking and programming are variants of free speech and freedom of assembly.
={public sphere:theories of;Apache (Free Software project)+4;experimentation;infrastructure}
@@ -2069,7 +2069,7 @@ Both the Apache project and the Linux kernel project use SCMs. In the case of Ap
While SCMs are in general good for managing conflicting changes, they can do so only up to a point. To allow anyone to commit a change, however, could result in a chaotic mess, just as difficult to disentangle as it would be without an SCM. In practice, therefore, most projects designate a handful of people as having the right to "commit" changes. The Apache project retained its voting scheme, for instance, but it became a way of voting for "committers" instead of for patches themselves. Trusted committers—those with the mysterious "good taste," or technical intuition—became the core members of the group.
The Linux kernel has also struggled with various issues surrounding SCMs and the management of responsibility they imply. The story of the so-called VGER tree and the creation of a new SCM called Bitkeeper is exemplary in this respect.~{ See Steven Weber, The Success of Open Source, 117-19; Moody, Rebel Code, 172-78. See also Shaikh and Cornford, "Version Management Tools." }~ By 1997, Linux developers had begun to use cvs to manage changes to the source code, though not without resistance. Torvalds was still in charge of the changes to the official stable tree, but as other "lieutenants" came on board, the complexity of the changes to the kernel grew. One such lieutenant was Dave Miller, who maintained a "mirror" of the stable Linux kernel tree, the VGER tree, on a server at Rutgers. In September 1998 a fight broke out among Linux kernel developers over two related issues: one, the fact that Torvalds was failing to incorporate (patch) contributions that had been forwarded to him by various people, including his lieutenants; and two, as a result, the VGER cvs repository was no longer in synch with the stable tree maintained by Torvalds. Two different versions of Linux threatened to emerge.
-={Miller, Dave;Source Code Management tools (SCMs):see also Bitkeeper;Concurrent Versioning System (cvs):Linux and;Linux (Free Software project):VGER tree and+2;Bitkeeper (Source Code Management software)+12;Torvalds, Linux:in bitkeeper controversy+12}
+={Miller, Dave;Source Code Management tools (SCMs):see also Bitkeeper;Concurrent Versioning System (cvs):Linux and;Linux (Free Software project):VGER tree and+2;Bitkeeper (Source Code Management software)+12;Torvalds, Linus:in Bitkeeper controversy+12}
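The conflict-management problem an SCM solves, and the point at which it must hand control back to a human committer, can be sketched in miniature: two developers edit the same base file; edits to different lines merge automatically, but edits to the same line conflict and require a decision. This is a simplified line-by-line three-way merge over same-length files, an illustration only, not cvs's or Bitkeeper's actual algorithm:

```python
# Sketch of the merge problem behind the VGER episode: two divergent
# copies of the same source must be reconciled against a common base.
# Nonconflicting edits merge mechanically; overlapping edits need a
# committer's judgment. (Assumes files of equal length, for brevity.)

def merge3(base, ours, theirs):
    """Merge two edited versions against a common base, line by line.
    Returns (merged_lines, conflict_line_indices)."""
    merged, conflicts = [], []
    for i, b in enumerate(base):
        o, t = ours[i], theirs[i]
        if o == t:            # identical on both sides (or both unchanged)
            merged.append(o)
        elif o == b:          # only "theirs" changed this line: take theirs
            merged.append(t)
        elif t == b:          # only "ours" changed this line: take ours
            merged.append(o)
        else:                 # both changed it differently: conflict
            merged.append(o)
            conflicts.append(i)
    return merged, conflicts

base   = ["line one", "line two", "line three"]
ours   = ["line one", "line 2",   "line three"]   # we edited line two
theirs = ["line one", "line two", "line III"]     # they edited line three
merged, conflicts = merge3(base, ours, theirs)
assert merged == ["line one", "line 2", "line III"] and conflicts == []
```

When both sides touch the same line, `conflicts` is nonempty and no mechanical rule can choose between the versions; that residue of human judgment is exactly what made "Linus does not scale" a structural problem rather than a tooling one.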
A great deal of yelling ensued, as nicely captured in Moody’s Rebel Code, culminating in the famous phrase, uttered by Larry McVoy: "Linus does not scale." The meaning of this phrase is that the ability of Linux to grow into an ever larger project with increasing complexity, one which can handle myriad uses and functions (to "scale" up), is constrained by the fact that there is only one Linus Torvalds. By all accounts, Linus was and is excellent at what he does—but there is only one Linus. The danger of this situation is the danger of a fork. A fork would mean one or more new versions would proliferate under new leadership, a situation much like ,{[pg 233]}, the spread of UNIX. Both the licenses and the SCMs are designed to facilitate this, but only as a last resort. Forking also implies dilution and confusion—competing versions of the same thing and potentially unmanageable incompatibilities.
={McVoy, Larry+11;Moody, Glyn;forking:in Linux+1}
@@ -2172,7 +2172,7 @@ In part III I confront this question directly. Indeed, it was this question that
={cultural significance;recursive public+3;Free Software:components of+1}
Connexions modulates all of the components except that of the movement (there is, as yet, no real "Free Textbook" movement, but the "Open Access" movement is a close second cousin).~{ In January 2005, when I first wrote this analysis, this was true. By April 2006, the Hewlett Foundation had convened the Open Educational Resources "movement" as something that would transform the production and circulation of textbooks like those created by Connexions. Indeed, in Rich Baraniuk’s report for Hewlett, the first paragraph reads: "A grassroots movement is on the verge of sweeping through the academic world. The open education movement is based on a set of intuitions that are shared by a remarkably wide range of academics: that knowledge should be free and open to use and re-use; that collaboration should be easier, not harder; that people should receive credit and kudos for contributing to education and research; and that concepts and ideas are linked in unusual and surprising ways and not the simple linear forms that textbooks present. Open education promises to fundamentally change the way authors, instructors, and students interact worldwide" (Baraniuk and King, "Connexions"). (In a nice confirmation of just how embedded participation can become in anthropology, Baraniuk cribbed the second sentence from something I had written two years earlier as part of a description of what I thought Connexions hoped to achieve.) The "movement" as such still does not quite exist, but the momentum for it is clearly part of the actions that Hewlett hopes to achieve. }~ Perhaps the most complex modulation concerns coordination—changes to the practice of coordination and collaboration in academic-textbook creation in particular, and to the nature of collaboration and coordination of knowledge in science and scholarship more generally.
-={coordination (components of Free Software);movement (component of Free Software)+2}
+={coordination (component of Free Software);movement (component of Free Software)+2}
Connexions emerged out of Free Software, and not, as one might expect, out of education, textbook writing, distance education, or any of those areas that are topically connected to pedagogy. That is to say, the people involved did not come to their project by attempting to deal with a problem salient to education and teaching as much as they did so through the problems raised by Free Software and the question of how those problems apply to university textbooks. Similarly, a second project, Creative Commons, also emerged out of a direct engagement with and exploration of Free Software, and not out of any legal movement or scholarly commitment to the critique of intellectual-property law or, more important, out of any desire to transform the entertainment industry. Both projects are resolutely committed to experimenting with the given practices of Free Software—to testing their limits and changing them where they can—and this is what makes them vibrant, risky, and potentially illuminating as cases of a recursive public.
={affinity (of geeks);commons+1;Creative Commons+1;pedagogy;recursive public:examples of+1}
@@ -2194,7 +2194,7 @@ Around 1998 or 1999, Rich decided that it was time for him to write a textbook o
={Burris, C. Sidney;Connexions project:textbooks and+4;Rice University}
At about the same time as his idea for a textbook, Rich’s research group was switching over to Linux, and Rich was first learning about Open Source and the emergence of a fully free operating system created entirely by volunteers. It isn’t clear what Rich’s aha! moment was, other than simply when he came to an understanding that such a thing as Linux was actually possible. Nonetheless, at some point, Rich had the idea that his textbook could be an Open Source textbook, that is, a textbook created not just by him, but by DSP researchers all over the world, and made available to everyone to make use of and modify and improve as they saw fit, just like Linux. Together with Brent Hendricks, Yan David Erlich, ,{[pg 249]}, and Ross Reedstrom, all of whom, as geeks, had a deep familiarity with the history and practices of Free and Open Source Software, Rich started to conceptualize a system; they started to think about modulations of different components of Free and Open Source Software. The idea of a Free Software textbook repository slowly took shape.
-={Linux (Free Software project);Open Source:inspiration for Connexions+27;Reedstorm, Ross}
+={Linux (Free Software project);Open Source:inspiration for Connexions+27;Reedstrom, Ross}
Thus, Connexions: an "open content repository of high-quality educational materials." These "textbooks" very quickly evolved into something else: "modules" of content, something that has never been sharply defined, but which corresponds more or less to a small chunk of teachable information, like two or three pages in a textbook. Such modules are much easier to conceive of in sciences like mathematics or biology, in which textbooks are often multiauthored collections, finely divided into short chapters with diagrams, exercises, theorems, or programs. Modules lend themselves much less well to a model of humanities or social-science scholarship based in reading texts, discussion, critique, and comparison—and this bias is a clear reflection of what Brent, Ross, and Rich knew best in terms of teaching and writing. Indeed, the project’s frequent recourse to the image of an assembly-line model of knowledge production often confirms the worst fears of humanists and educators when they first encounter Connexions. The image suggests that knowledge comes in prepackaged and colorfully branded tidbits for the delectation of undergrads, rather than characterizing knowledge as a state of being or as a process.
={Connexions project:model of learning in|modules in+1}
@@ -2210,7 +2210,7 @@ Free Software—and, in particular, Open Source in the guise of "self-organizing
={Connexions project:relationship to education+2;distance learning+2}
Thus, Rich styled Connexions as more than just a factory of knowledge—it would be a community or culture developing richly associative and novel kinds of textbooks—and as much more than just distance education. Indeed, Connexions was not the only such project busy differentiating itself from the perceived dangers of distance education. In April 2001 MIT had announced that it would make the content of all of its courses available for free online in a project strategically called OpenCourseWare (OCW). Such news could only bring attention to MIT, which explicitly positioned the announcement as a kind of final death blow to the idea of distance education, by saying that what students pay $35,000 and up for per year is not "knowledge"—which is free—but the experience of being at MIT. The announcement created pure profit from the perspective of MIT’s reputation as a generator and disseminator of scientific knowledge, but the project did not emerge directly out of an interest in mimicking the success of Open Source. That angle was ,{[pg 252]}, provided ultimately by the computer-science professor Hal Abelson, whose deep understanding of the history and growth of Free Software came from his direct involvement in it as a long-standing member of the computer-science community at MIT. OCW emerged most proximately from the strange result of a committee report, commissioned by the provost, on how MIT should position itself in the "distance/e-learning" field. The surprising response: don’t do it, give the content away and add value to the campus teaching and research experience instead.~{ "Provost Announces Formation of Council on Educational Technology," MIT Tech Talk, 29 September 1999, http://web.mit.edu/newsoffice/1999/council-0929.html. }~
-={Abelson, Hal;Massachusetts Institute of Technology (MIT):open courseware and+2;Open CourseWare (OCW)+2;Connexions poject:Open CourseWare+2}
+={Abelson, Hal;Massachusetts Institute of Technology (MIT):open courseware and+2;Open CourseWare (OCW)+2;Connexions project:Open CourseWare+2}
OCW, Connexions, and distance learning, therefore, while all ostensibly interested in combining education with networks and software, emerged out of different demands and different places. While the profit-driven demand of distance learning fueled many attempts around the country, it stalled in the case of OCW, largely because the final MIT Council on Educational Technology report that recommended OCW was issued at the same time as the first plunge in the stock market (April 2000). Such issues were not a core factor in the development of Connexions, which is not to say that the problems of funding and sustainability have not always been important concerns, only that the genesis of the project was not at the administrative level or due to concerns about distance education. For Rich, Brent, and Ross the core commitment was to openness and to the success of Open Source as an experiment with massive, distributed, Internet-based, collaborative production of software—their commitment to this has been, from the beginning, completely and adamantly unwavering. Nevertheless, the project has involved modulations of the core features of Free Software. Such modulations depend, to a certain extent, on being a project that emerges out of the ideas and practices of Free Software, rather than, as in the case of OCW, one founded as a result of conflicting goals (profit and academic freedom) and resulting in a strategic use of public relations to increase the symbolic power of the university over its fiscal growth.
={Reedstrom, Ross}
@@ -2278,7 +2278,7 @@ Creative Commons provided more than licenses, though. It was part of a social im
={moral and technical order;social imaginary}
Creative Commons was thus a back-door approach: if the laws could not be changed, then people should be given the tools they needed to work around those laws. Understanding how Creative Commons was conceived requires seeing it as a modulation of both the notion of "source code" and the modulation of "copyright licenses." But the modulations take place in that context of a changing legal system that was so unfamiliar to Stallman and his EMACS users, a legal system responding to new forms of software, networks, and devices. For instance, the changes to the Copyright Act of 1976 created an unintended effect that Creative Commons would ultimately seize on. By eliminating the requirement to register copyrighted works (essentially granting copyright as soon as the ,{[pg 261]}, work is "fixed in a tangible medium"), the copyright law created a situation wherein there was no explicit way in which a work could be intentionally placed in the public domain. Practically speaking an author could declare that a work was in the public domain, but legally speaking the risk would be borne entirely by the person who sought to make use of that work: to copy it, transform it, sell it, and so on. With the explosion of interest in the Internet, the problem ramified exponentially; it became impossible to know whether someone who had placed a text, an image, a song, or a video online intended for others to make use of it—even if the author explicitly declared it "in the public domain." Creative Commons licenses were thus conceived and rhetorically positioned as tools for making explicit exactly what uses could be made of a specific work. They protected the rights of people who sought to make use of "culture" (i.e., materials and ideas and works they had not authored), an approach that Lessig often summed up by saying, "Culture always builds on the past."
-={copyright:requirement to register;sharing source code (component of Free Software):modulations of;creative commons:activism of+1;public domain+4}
+={copyright:requirement to register;sharing source code (component of Free Software):modulations of;Creative Commons:activism of+1;public domain+4}
The background to and context of the emergence of Creative Commons was of course much more complicated and fraught. Concerns ranged from the plights of university libraries with regard to high-priced journals, to the problem of documentary filmmakers unable to afford, or even find the owners of, rights to use images or snippets in films, to the high-profile fights over online music trading, Napster, and the RIAA. Over the course of four years, Lessig and the other founders of Creative Commons would address all of these issues in books, in countless talks and presentations and conferences around the world, online and off, among audiences ranging from software developers to entrepreneurs to musicians to bloggers to scientists.
={Napster;Recording Industry Association of America (RIAA)}