Santa Clara University

STS Nexus

Issues in Global Knowledge Sharing: Prospects and Pitfalls

STUDENT ART COMPETITION - "Code Talkers" by Aaron Uchikura

Geoffrey C. Bowker


For the past few hundred years, many books and articles have begun with a phrase such as: "we are entering a period of rapid change unimagined by our ancestors." The statement is as true and as false now as it has been over the previous two centuries. It is true because we are, as a society, adjusting to a whole new communication medium (the Internet) and to new ways of storing, manipulating, and presenting information. We are, as sociologist Manuel Castells and others remind us, now in many ways an information economy, with many people tied to computers in one way or another during the working day and in their leisure hours (see, for example, Castells' book The Rise of the Network Society, or his article in STS NEXUS 1.1). It is false because we are faced with the same old problems as ever: getting food, shelter, and water to the human population, and living in some kind of equilibrium with nature. What are the potential impacts of the new knowledge economy on science and technology policies for sustainable life?

It has often been asserted that science is a public good, meaning that scientific work does not fit into the globally dominant market economy. In the new knowledge economy, however, we are increasingly seeing the penetration of the market right down to the molecular level, down to the rudimentary elements of scientific enquiry: it is now possible to patent genes and genetically modified plants and animals. In this process, a new nexus has developed for managing nature and natural resources. Taking a fairly wide definition of ownership, we can see three main issues arising from the implementation of this knowledge/information market: control of knowledge, privacy concerns, and patterns of ownership.


Control of Knowledge

The first, control of knowledge, can best be explained by asking the question: "Who has the right to speak authoritatively in the name of science?" Since the mid-nineteenth century this has been a fairly simple question to answer: only professionally trained scientists and doctors could speak for their disciplines. Only they had access to the resources necessary to speak with authority about a given subject—they possessed the journals, the libraries, the professional experience. Within the new information economy this is no longer the case. For example, many patient groups are now forming on the Internet. These groups often know more about a rare condition (for example, renal cell carcinoma) than a local doctor does—they can share information 24 hours a day, and can bring together patients from all over the world. This flattening of knowledge hierarchies can be a very powerful social force. It carries with it, though, the need to educate the newly enfranchised public in critical reading of the Web. Many Web sites look official and authoritative but in fact only push the hobby-horse of a particular individual or group. Through our schools and universities, we have received good training in reading and criticizing print media; but we have little expertise as a culture in dealing with highly distributed information sources such as the Internet.


Privacy Concerns

Privacy concerns are a significant dimension of science and technology policy in the new economy. It is now technically possible to generate and search very large databases, and to integrate data from a whole series of domains. As this happens, the potential for data abuse increases dramatically. Much has been written, for example, about data mining of the Icelandic population in the late 1990s. After considerable public debate, the Icelandic government agreed to sell the medical and genealogy records of the country's 275,000 citizens to a private medical research company that was looking for genetic markers for major illnesses. Iceland was selected for this study for two central reasons: its population comes from a relatively restricted gene pool, and it has excellent medical and genealogical records, the latter dating back approximately a thousand years. While the scientific results may prove useful in this case, the exercise certainly raises the specter of genetic screening of prospective employees by a given company. It is extremely difficult to keep records private over the new information infrastructure; many third-party companies, for example, compile data from a variety of different agencies in order to generate a new, marketable form of knowledge. There is no point in trying to adhere to the old canons of privacy. However, open public debate and education about the possibilities of the new infrastructure are essential.

Knowledge Ownership

According to "public good" lines of argument about science, it is in the interest of the state to fund techno-scientific research, since there will be a payoff for society as a whole in terms of infrastructural development. With the increasing privatization of knowledge (as we turn into a knowledge-based economy), it is unclear to what extent the vaunted openness of the scientific community will endure. Many refer back to a "golden age" when universities were separate from industry in a way that they are not today. While a lot of this talk is highly exaggerated (science has always been an eminently practical pursuit), it remains the case that we are in the process of building new understandings of scientific knowledge.

A key question internationally has been "who owns what knowledge?" This is most visible in fields like biodiversity prospecting, where international agreements reimburse local citizens for bringing in biologically active plants and the like. However, the ownership patterns of this sort of knowledge are very difficult to adjudicate in Western terms. Consider, for example, a Mexican herbalist selling a biologically active plant in a market in Tijuana. He owns the plant, but he is not the source of knowledge about biologically active plants. This knowledge cannot be attributed to a single discoverer (as is needed in many Western courts of law adjudicating matters of intellectual property ownership). It is unlikely that the herbalist will be able to trace the specimen's chain of ownership back to the original harvesting of the specific plant being sold. Instead, this knowledge of origin probably resides in tradition, often held by the women of a collectivity.

Similarly, groups such as the Australian Aborigines or the Native Americans had very different concepts of land ownership from their white settler counterparts. This has led to complex negotiations, continuing even today, about the protection of natural resources. We need anthropological and sociological studies of local knowledge (to the extent that it is being mined by scientists) in order to help design just frameworks for data ownership in different countries. There is a danger, when we talk of the explosion of information in the new knowledge economy, that we forget the role of traditional knowledge in the development of sustainable policies for a particular region. In the Himalayas, for example, research has shown that the management of some parks has relied on models brought in from the outside, typically taught to villagers through the distribution of television programs. Though highly effective, these models ignore centuries of local ecological knowledge, because that knowledge is practice-based, with its own intricate weaving of environmental understanding, religious belief, and mythological expression. These elements are very real contributors to the shaping of knowledge in such communities, despite the fact that they cannot easily be conjured into a form that can be retained or displayed on a computer.


Working collaboratively

Many have expressed hope that in the developing world the new information infrastructure will narrow the knowledge gaps between countries. Thus, an effective global digital library would allow third-world researchers access to the latest journals. Distributed computing environments (such as the GRID, being developed in the United States) would permit supercomputer-grade access for scientists and technologists throughout the world. The expansion of technologies like cell phones in countries without landlines could virtually propel these countries into the "information future." As powerful as these visions are, they need to be tempered with some real concerns. The first is that an information infrastructure such as the Internet functions like a Greek democracy of old: citizens who have access may be considered equals, but those without access are left further and further behind. Secondly, access is never truly equal—the fastest connections and computers (needed for running the latest software) tend to be concentrated in the first world. This point is frequently overlooked by those who hail the end of the digital divide—they forget that the divide is itself a moving target. Thirdly, governments in the developing world have expressed skepticism about the usefulness of putting their data resources out onto the Internet. Just as, in the nineteenth century, the laissez-faire economics of free trade was advocated by the developed countries with the most to gain (because they had organizations in place ready to take advantage of emerging possibilities), so in this age the greatest advocates of the free and open exchange of information are the developed countries with robust computing infrastructures that can exploit the data resources and use them to produce patentable genes, organisms, and so forth. Some developing countries see this as a second wave of colonialism—the first pillaging material resources, and the second pillaging information. All of these concerns can be met through the development of careful information policies.

International electronic communications hold out the promise of breaking down the first-world/third-world divide in science. Developments such as the SPARC project allow scientists to operate equipment remotely via the Internet: atmospheric sensing devices in the harsh Arctic Circle region can be manipulated efficiently without anyone having to travel to that part of the world. The opportunity to attend an international conference "virtually," using teleconferencing equipment, is another example of how electronic communication has the potential to change the international landscape. And if universities succeed in wresting control of scientific publications from the huge publishing houses, then easy and inexpensive access to the latest scientific articles becomes possible for a researcher in places like the Australian outback.

At the same time, there are a number of forces working to reinforce the traditional center/periphery divide in science internationally. Even with the move to open up access to scientific publications and equipment, there is no guarantee that the "invisible colleges," which operate informally and determine various outcomes, such as who gets invited to which conference, will change; indeed the evidence seems to be to the contrary. In the current state of technological development, there is a significant gap between information access in different regions of any given country, or different parts of the world. Consider the analogy of the telephone. In principle, anyone can phone anywhere in the world. However, in practice some regions have more or less reliable phone services, which may or may not include access to digital resources over phone lines. In fact, half of the world's population does not have telephone access of any kind.

We can go beyond the continuing digital divide, however, to consider the possibility of mounting very large-scale scientific data collection and knowledge-sharing efforts. Such efforts are central to the social sciences, and to the sciences of ecology and biodiversity. With the development of handheld computing devices, it is becoming possible for a worker with minimal scientific training to go into the field and bring back significant results. Thus in Costa Rica, the ongoing attempt to catalog botanical species is being carried out largely by "para-taxonomists," who are given enough skill in using interactive keys (which help in plant recognition) to carry out their work almost as effectively as a fully trained systematist. Computer-assisted workers, together with remote sensing devices whose inputs can be processed automatically, hold out the possibility of scaling up the processes of scientific research so that they are truly global in depth and breadth.
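The interactive keys mentioned above can be thought of as a decision tree that a field worker traverses one observable character at a time. The sketch below is a minimal, hypothetical illustration in Python; the questions and species names are invented placeholders, not a real botanical key.

```python
# A minimal sketch of an interactive identification key, modeled as a
# binary decision tree. Characters and taxa here are hypothetical.

class KeyNode:
    """Either an internal question with yes/no branches, or a leaf species."""
    def __init__(self, question=None, yes=None, no=None, species=None):
        self.question = question
        self.yes = yes
        self.no = no
        self.species = species  # set only on leaf nodes

def identify(node, answers):
    """Walk the key; `answers` maps each question to True/False."""
    while node.species is None:
        node = node.yes if answers(node.question) else node.no
    return node.species

# A toy two-question key (invented characters and taxa).
key = KeyNode(
    question="Are the leaves arranged opposite each other?",
    yes=KeyNode(
        question="Is the leaf margin serrated?",
        yes=KeyNode(species="Species A"),
        no=KeyNode(species="Species B"),
    ),
    no=KeyNode(species="Species C"),
)

# Simulate a field worker's observations as fixed answers.
observations = {
    "Are the leaves arranged opposite each other?": True,
    "Is the leaf margin serrated?": False,
}
print(identify(key, lambda q: observations[q]))  # Species B
```

In a real deployment the key would be far larger and the answers would come from prompts on a handheld device, but the core traversal logic is this simple, which is why minimally trained para-taxonomists can use such tools effectively.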


Figure 1 - Countries with highest species diversity


Collaborative work is central to the new knowledge economy. Traditionally, scientific breakthroughs have been associated with particular laboratories—the Cavendish laboratory in Cambridge, England for example. Such laboratories have always been a site for intellectual collaboration—where visiting scholars exchange ideas, conferences are held, and graduate students are trained. However, it is impossible nowadays to imagine managing a large scale scientific project without including highly distributed collaborators, particularly if one is seeking to develop a truly global scientific culture.

Although technoscientific work is inherently collaborative, management structures in universities and industry still tend to support the heroic myth of the individual researcher. Many scientists turn away from the collaborative, interdisciplinary work that is most needed to develop policies for sustainable life, because publishing outside their own field puts their careers at risk. There is significant institutional inertia: an old model of science is being applied to a brave new world.

                This inertia is evident most clearly in the area of scientific publications. First in the field of physics, and then in a number of other scientific disciplines we are witnessing the spread of electronic preprints and electronic journals. Quite simply, traditional academic journals run by huge publishing conglomerates cannot turn around papers quickly enough to meet the needs of scientists working in cutting-edge fields.

Throughout the past two centuries, there has been a relatively stable configuration in which the journal article is the central medium for the dissemination and exchange of scientific ideas. Now there is no reason in principle why a scientist should not publish findings directly to the Web. As in many sectors of the new information economy, the development of the new publication medium is prompting a reconsideration of just what value the large publishing houses add to journal production. As more and more journals go online, publishers are being forced to go beyond their traditional service of providing distribution networks and find new ways to bring their material onto the Web. It seems likely that all major scientific journals will soon be Internet accessible, even though the economics of such distribution is not yet fully worked out (possible charging mechanisms include paying for each paper downloaded, or purchasing an institutional subscription to an entire journal). A more far-reaching implication is that the journal article may no longer function as the unit of currency within the research community. Very large-scale collaborative databases, such as the human genome databank, are a new kind of product made possible by the development of the Web.

The policy implications are clear. Great attention must be paid to the social and organizational setting of technoscientific work in order to take full advantage of the possibilities for faster research and publication cycles. There is a well-known paradox about the development of computing (the productivity paradox), according to which the introduction of computers into a workplace tends to lower productivity in the short term (roughly the first 20 to 30 years). Economist Paul David and others have argued that this is because we are still using old ways of working, trying to adapt them to the production of electronic text. A new academic field, social informatics, has grown up precisely with the goal of exploring the social and organizational aspects of the new knowledge economy.


The choices that we are making now about how to share our collective emerging wisdom (powered by very large databases) are key to the health of our planet and of its human occupants. We are in the process of replacing nineteenth-century imperialism, with its land grabs and pillaging of raw materials, with a new twenty-first-century version built on the pillaging of local knowledge and of scientific knowledge about developing countries. A just and equitable regime of knowledge sharing is possible, and projects like BioForge indicate that the necessary institutional, organizational, and technical procedures can be put in place. At a time when the private sphere seems to be expanding relentlessly, we should pause to consider the benefits, for ourselves and for countless future generations, of a public sphere of scientific knowledge.

About the Author

Geoffrey Bowker

Geoffrey C. Bowker is the Regis and Dianne McKenna Professor at Santa Clara University and Executive Director of the Center for Science, Technology, and Society. Prior to joining Santa Clara University, he was professor and chair of the Department of Communication, UC San Diego. He studies social and organizational aspects of the development of very large-scale information infrastructures. His first book, Science on the Run, discussed the development of information practices in the oil industry. Along with Leigh Star, he has recently completed a book on the history and sociology of medical classifications, Sorting Things Out: Classification and Its Consequences (1999). Since his invitation to join the biodiversity subcommittee of the President's Committee of Advisors on Science and Technology, Bowker has been working in the field of biodiversity and environmental informatics. He has just completed a digital-government-funded project on long-term databases in environmental science. He was a 2002-2003 member of an OECD working group on international data sharing in science. He is also working on projects at the San Diego Supercomputer Center and in the Long Term Ecological Research Network on the formative evaluation of scientific cyberinfrastructures. Bowker has a Ph.D. in history and philosophy of science from Melbourne University.

