Cooperative Work and Coordinative Practices
Contributions to the Conceptual Foundations of
Computer-Supported Cooperative Work (CSCW)
Kjeld Schmidt
Part I and III
Springer, London, 2011
For Irene
Contents
Preface
Part I: Progress report
1. Cooperative work and coordinative practices
Part II: Surveying the connections
2. Riding a tiger, or CSCW (1991)
3. Taking CSCW seriously (1992)
4. The organization of cooperative work (1994)
5. Coordination mechanisms (1996)
6. Of maps and scripts (1997)
7. The critical role of workplace studies in CSCW (2000)
8. The problem with ‘awareness’ (2002)
9. Remarks on the complexity of cooperative work (2002)
10. Ordering systems (2004)
Part III: CSCW reconsidered
11. Formation and fragmentation
12. Frail foundations
13. Dispelling the mythology of computational artifacts
References
Index
Preface
This book is about cooperative work and the coordinative practices through which
order in cooperative work is accomplished.
The development of computing technologies has from the very beginning been tightly interwoven with the development of cooperative work. Indeed, in important respects the challenges facing cooperative work in different domains have at various points been decisive in motivating and shaping crucial computing technologies such as interactive computing and networking. Over the last couple of decades, computing technologies have also, and increasingly, been developed and used for coordinative purposes, as means of regulating complex activities involving multiple professional actors, in factories and hospitals, in pharmaceutical laboratories and architectural offices, and so on. The economic importance of the applications of these coordination technologies is enormous, but their design is often inadequate. The problem is that our understanding of the coordinative practices for which these coordination technologies are being developed is quite deficient, leaving systems designers and software engineers to base their system designs on rudimentary technologies. The result is that these vitally important systems, though technically sound, are typically experienced as cumbersome, inefficient, rigid, and crude.
The research reflected in this book addresses these very practical problems and
is concerned with trying to establish — in the intermundia between the social sciences and computer science — a conceptual foundation for the research area of
Computer-Supported Cooperative Work (CSCW). What is cooperative work in
the first place? Is it something of which we can talk and reason sensibly? Is it a
category of practice that can be observed, described, and analyzed in anything like
a rigorous manner? How do the many actors engaged in this kind of practice accomplish their tasks in an orderly fashion, without succumbing to chaos? Can we
distinguish classes of practices, coordinative practices, by means of which they do
so? Can they be observed, described, and analyzed? How are these coordinative
practices organized? How do they evolve? How do actors manage to organize routine cooperative activities? What difficulties do they face and how do they cope
with them? By means of which conventions, procedures, techniques, etc. do they
regulate their joint work? How are these practices facilitated by traditional technologies, from paper and pencil, forms and binders, to timetables and archives?
What are the costs and benefits of such technologies? Which issues arise when
such practices are computerized, when control and execution of routines and
schemes are transferred to computational artifacts? Can we devise computational
facilities by means of which ordinary actors themselves can develop their coordinative practices, devise methods and tools for improved coordination?
Those are the kinds of questions I have been trying to answer in the course of
the last twenty-five years. They are not questions for which sociology has answers, because they are not questions sociology has raised. They are questions
raised by the diffusion of computing technologies in cooperative work settings.
❧
The book comprises three rather different bodies of text. The bulk of the book —
Part II — consists of articles written from 1991 to 2004 in which I have addressed
and explored the issues and problems of cooperative work and coordinative practices in different directions. What unites these studies is a conception of cooperative work that makes it a researchable phenomenon, amenable to a technological
research program. Instead of the ideological notion of ‘cooperation’ as an ethical
imperative or the sociological notion of ‘cooperative work’ as coextensive with
the notion of the social nature of human conduct, these studies are based on a conception of cooperative work as observable relations of interdependence that are formed in response to practical exigencies but which then in turn require the development of a family of equally observable coordinative practices.
The purpose of assembling these articles, of which some have reached a large
audience and some not, is to place them together, back to back, and thereby highlight their connectedness. In other words, the aim is to present a set of contributions to the conceptual foundations of the field of CSCW that, although it is unfinished business as far as a unified conceptual framework is concerned, is nevertheless sufficiently elaborated and tested as an ensemble to be taken on: applied, extended, amended, challenged…
The articles are reprinted without substantive changes. What changes I have
made are these. I have deleted ‘abstracts’ from articles that had any: they are useful in journals or conference proceedings, as announcements to the busy reader,
but would here be more of a distraction. I have also removed the usual but often
terse acknowledgment statements. Typographical and other minor technical faults
have been corrected without notice. Similarly, incomplete or faulty citations and
references have been corrected, and citations and references have been reformatted to a common standard.
However, although based on a common approach, these studies were not in a
strong sense conducted in a planned and goal-directed manner and the resulting
ensemble of articles evidently exhibits inconsistencies, false starts, in addition to
the inevitable repetitions. To provide the reader with an initial overview of the
meandering argument, Part I contains an introduction in the form of a ‘progress
report’. It gives a sketch of the development of the conception of cooperative
work and coordinative practices and, by making the underlying research strategy
explicit, serves to show how the different contributions are somehow connected.
Finally, since the research represented by these articles has had, in part at least,
a distinctly programmatic character — the subtitle is intended to indicate just that
— the book is also an occasion to revert to where the journey started, to the issue
of what CSCW is all about. Not for the sake of beating a dead horse, but simply
because the discussion is as topical as ever. In fact, as a research area CSCW is in
disarray, and it is time to reconsider CSCW’s research program. This is the aim of
Part III.
Copenhagen, 1 May 2010
Kjeld Schmidt
Acknowledgments
The history of my research, as reflected in this book, is a clear demonstration of how
tricky the concept of cooperative work can be. Some articles were obviously written in
close collaboration with colleagues, while others, the majority, were written by myself.
But even these could not have been written had I not been collaborating, in different
ways, with a large number of colleagues. In fact, irrespective of the formal authorship of
the individual articles, the general framework developed in the articles collected here has
evolved over many years in more or less continual debates with scholars with whom I
have debated my work and who have offered opposition, often staunch but always stimulating, to my notions and contentions. I should mention, at the very least, Hans
Andersen, Liam Bannon, Susanne Bødker, John Bowers, Geof Bowker, Giorgio De
Michelis, Peter Carstensen, Eli Gerson, Christine Halverson, Christian Heath, Erling
Havn, Betty Hewitt, Thomas Hildebrandt, John Hughes, Rachel Israël, Bjarne Kaavé,
Finn Kensing, Kristian Kreiner, Jacques Leplat, Paul Luff, Gloria Mark, Morten Nielsen,
Irene Odgaard, Wolfgang Prinz, Dave Randall, Jens Rasmussen, Mike Robinson, Tom
Rodden, Yvonne Rogers, Pascal Salembier, Dan Shapiro, Wes Sharrock, Carla Simone,
Susan Leigh Star, Lucy Suchman, Carsten Sørensen, Halina Tomaszewska, and Ina Wagner.
As is typical of research work these days, my work has been carried out in collaboration with countless partners and coworkers in a number of European and Danish research
projects such as, to name but the most important, FAOR, TIA, CoTech, MOHAWC,
COMIC, COTCOS, DMM, DIWA, FASIT, IDAK, HIT, CITH, Cosmobiz… Those who were involved will recognize the acronyms and will know my debt. It also so happens
that the research reflected in this volume has been carried out while I was working for a
string of institutions: Dansk Datamatik Center, the research center of the Danish Trade
Union Federation (LO), Risø National Laboratory, the Technical University of Denmark,
the IT University of Copenhagen, the University of Siegen, and Copenhagen Business
School. Without the support of these institutions, none of this could have been accomplished.
An early version of the present book, carrying the same title, was submitted to the IT
University of Copenhagen for the dr.scient.soc. degree. The two official opponents, Wes
Sharrock and Yrjö Engeström, were gracious and the degree was awarded me in June
2007.
The present book differs from the dissertation in many respects. Most importantly, it
contains new chapters in which I, at the instigation of the anonymous reviewers, undertake a critical discussion of CSCW. The new chapters are the three that make up Part III.
On the other hand, in order to prevent the book from becoming excessively large, two
articles have been omitted from Part II.
A couple of sojourns as visiting professor with Volker Wulf’s group at the University
of Siegen, Germany, in 2007 and 2008 gave me the welcome opportunity to begin drafting the new chapters. A version of the account of the formation and fragmentation of CSCW (in Chapter 11) was used as a basis for an article published as a discussion paper
(‘Divided by a common acronym’) at ECSCW 2009. A planned chapter on ‘The concept
of work in CSCW’ was taken out and used as a basis for another article, submitted to
COOP 2010. Readers who want to inspect those aspects of my critique of the state of
CSCW are referred to these articles.
I was fortunate that a number of colleagues, among them Liam Bannon, Susanne
Bødker, Lars Rune Christensen, Lise Justensen, Dave Randall, Satu Reijonen, Signe
Vikkelsø, and Volker Wulf, have commented on versions of Part III (or, rather, fragments
thereof), directing my attention to all kinds of shortcomings, not least points where the
argument was acutely in need of clarification. I thank them all, hastening to add that the
responsibility for remaining shortcomings in terms of style, grammar, logic, clarity,
judgment, and plain good sense is mine alone.
Part I
Progress report
‘So here I am, in the middle way, having had twenty years […]
Trying to learn to use words, and every attempt
Is a wholly new start, and a different kind of failure […]
And so each venture
Is a new beginning, a raid on the inarticulate
With shabby equipment always deteriorating
In the general mess of imprecision of feeling,
Undisciplined squads of emotion. And what there is to conquer
By strength and submission, has already been discovered
Once or twice, or several times, by men whom one cannot hope
To emulate—but there is no competition—
There is only the fight to recover what has been lost
And found and lost again and again: and now, under conditions
That seem unpropitious.’
T. S. Eliot: Four Quartets
Chapter 1
Cooperative work and coordinative
practices
Over the last few decades, the interests and concerns of researchers from areas or
disciplines that are otherwise rather disparate have been converging on a set of
issues that are closely related, in practical terms as well as conceptually, and
which all somehow center upon cooperative work practices.
We have by now a rather overwhelming body of literature that, in different
ways, is concerned with issues of cooperative work practice, although the issues,
as is always the case, are named and framed differently by different research traditions: ‘articulation work’ (Strauss, 1985; Strauss, et al., 1985; Gerson and Star,
1986), ‘situated action’ (Suchman, 1987), ‘due process’ (Gerson and Star, 1986),
‘working division of labor’ (Anderson, et al., 1987), ‘actor networks’ (Latour,
1987; Law and Hassard, 1999), ‘horizontal coordination’ (Aoki, 1988), ‘boundary
objects’ (Star and Griesemer, 1989; Star, 1989), ‘distributed cognition’ (Hutchins,
1991, 1995), ‘socially shared cognition’ (Resnick, et al., 1991), ‘distributed decision-making’ (Rasmussen, et al., 1991), ‘communities of practice’ (Lave, 1991),
‘coordination theory’ (Malone and Crowston, 1990, 1992), ‘cooperative work’
(Bannon and Schmidt, 1989; Schmidt and Bannon, 1992), ‘heedful interrelating’
(Weick and Roberts, 1993), ‘contextualized integration of human activities’
(Harris, 1995, 2000), ‘team work’ (Grudin, 1999), ‘team situational awareness’
(Endsley and Garland, 2000; McNeese, et al., 2001), ‘embodied interaction’
(Dourish, 2001), etc.
Whatever the name and irrespective of the frame, the various research undertakings listed above focus on problems such as: How is concerted action of multiple individuals actually accomplished? Through which practices is such action
coordinated and integrated? How do actors manage to act in a sufficiently concerted way, under conditions of partial knowledge and uncertainty, and how do
they routinely manage to do their joint work in an orderly fashion in spite of local
troubles and the heterogeneity of interests, approaches, and perspectives? What is
the role of formal constructs such as checklists, plans, blueprints, standard operating procedures, classification schemes, coding schemes, notations, etc.? How are
they constructed, appropriated, applied, amended? What is the role of material
artifacts and practices of writing in this context? How do actors interact through
inscribed artifacts and how do they coordinate and integrate their individual activities by means of such devices? How do the different material characteristics of
infrastructures, settings, and artifacts impact on cooperative practices? How do
transformations of these material artifacts, media, and modalities affect the practices and the organization of cooperative work?
These questions have also defined the research work reflected in this book, but
the research strategy that has been developed and pursued in the course of this
work differs in important respects from some of the other approaches. Outlining
this strategy and how it has developed is the topic of this chapter. A brief account
of how it all began in my individual case is an appropriate place to start.
1. The road to CSCW
The problem of cooperative work — understanding the changing forms cooperative work takes under different economic and technological conditions as well
as the skills involved in cooperative work — has, it seems, been with me for ages.
Since I, as a young man in the late 1960s, immersed myself in Marx’ Grundrisse
(1857-58b) and Das Kapital (1867b), the phenomenon of cooperative work has
played a central role in my understanding of working class organization, that is,
cooperative work conceived of as the material source of working class autonomy
and assertiveness and as the source of a progressive constitution of modern industrial society based on the ‘association of free producers’. In fact, I became a sociologist in an attempt to understand these issues and my very first publications addressed those very issues (e.g., 1970).
Then came the so-called ‘microprocessor revolution’ of the early 1980s. With
the microprocessor the computer became a commodity. It became economically
feasible not only to incorporate computers in plastic boxes with keyboards and
screens that could be sold to individuals as ‘personal’ appliances, but also to incorporate computers in virtually any part of the production facilities in industrial
settings. The world of industrial production was in for radical transformation. In
industry, apart from ‘process industries’ such as chemical industries and power
plants where key processes are automatic by nature, manual control had been
prevalent up to this point in time. It is true that the overall flow of materials and
parts had been mechanized in mass-production industries. The assembly lines of
the automobile industries were of course famous but of marginal economic importance; the typical manufacturing enterprise was more like an engineering shop
than a mass-production plant. And in any case, in engineering shops and large-scale manufacturing alike, production control was still ‘manual’: the control of the
production processes (cutting, grinding, welding, etc.) was ‘in the hands’ of
workers. Now, with the advent of the micro-processor, this began to change on a
giant scale. In rapid succession an extended family of new technologies was devised and began to be rolled out: computer-controlled machining centers, industrial robots, automated materials-handling and transportation systems, flexible manufacturing systems (FMS), computer-aided design and manufacturing
(CAD/CAM), production-planning and control systems (MRP), and so on
(Schmidt, et al., 1984). It was also evident that similar upheavals were underway
outside of production proper, in design and engineering, in the administrative domain, etc.
I was thrilled. The chance of witnessing, in one’s lifetime, a technological, organizational, occupational, social, and economic transformation of such magnitude and scope was not lost on me. In that mood, it did not take long for me to decide to enter the fray. My move was, more specifically, motivated by my realization of a number of deep-seated methodological problems with the program I had
been pursuing until then.
(1) A sociological study that tries to determine the social and organizational
impact of a specific technological change is faced with a methodological nightmare. The reason is that, in order to do so, one must first of all be able to characterize the technology in question with respect to actual practices. How can a sociologist, of all people, adequately characterize technologies? Worse, how does he
do that with novel, perhaps not yet fully developed technologies? How does he do
it without understanding the specific roles of different technologies in actual
working practices? What happens, of course, is that sociologists stick to second-guessing, that is, to producing post-festum ‘predictions’, or they produce forecasts on an aggregate level so elevated as to be meaningless.
(2) As pointed out by none other than Marx (1877, 1881), there is no ‘superhistorical’ ‘master key’ that allows us to anticipate societal developments in any
specificity. Processes of social change may exhibit striking regularities but they
always play out in a particular ‘historical milieu’, as a result of which they may
have widely different effects in different local settings. The industrial revolution
in Britain, for example, produced an entirely different ‘historical milieu’ for preindustrial branches of production in Britain as well as for other countries. Understanding technological change and its impact is no exception. Even when a specific technology is understood and has been adequately characterized, its ‘effects’
may differ widely according to the socio-economic milieu, e.g., the national and
regional rate of employment, the quality and coverage of the educational system,
migration patterns, social security systems, labor protection, etc.
(3) Moreover, it quickly dawned on me that central to the transformation process generated by ‘the microprocessor revolution’ were some very complex research issues such as, for example, the famous one of ‘allocation of functionality
between human and machine’. Enlightened engineers were beginning to realize
that the default strategy of automating whatever can be automated leads to all
kinds of dysfunctional socio-technical systems and, especially in the case of safety-critical systems, possibly to accidents and disasters. They discovered that the
‘allocation of functionality’ is a design problem in its own right. At first, it was
hoped that cognitive psychology could help out by offering something close to a
checklist of ‘tasks’ optimally allocated to machines and humans, respectively (Jordan, 1963). It did not work out that way, of course (Kantowitz and
Sorkin, 1987). The problem is a wicked one (Rittel and Webber, 1973).
The design of complex technical systems presumes certain ‘job designs’ which
in turn presume technical systems of certain shapes and forms. Coping with the
circularity of the problem requires an understanding of the ‘dynamics’ of technical system and job design, which in turn requires an understanding of the technical, organizational, and socio-economic environment. And since modern work
is cooperative work, this means that the design of complex technical systems and
the shape and form of the organization of cooperative work are inexorably intertwined. As I saw it, CAD/CAM systems, FMS systems, MRP systems, office information systems, etc. were all systems by means of which workers would cooperate and also coordinate their cooperative activities.
My realizing these methodological problems prompted me to make my move.
Twenty years earlier, in 1965, I had dropped out of university, where I was initially studying philosophy, to become a computer programmer. My motive was partly
to make a living, of course, but partly also a fascination with the new technology. In
1985, I made a parallel move. I left academia to join a private research laboratory
(Dansk Datamatik Center) where I became responsible for the lab’s research in
office information systems. That move took me directly to the research area of
CSCW, which was then just being formed.
In the few years I spent at DDC, I became increasingly involved in doing field
work. After some initial ‘quick and dirty’ workplace studies in various administrative organizations (e.g., a regional planning office, a standardization organization), I did fieldwork in domains as diverse as engineering design (e.g., design of
cement factories), mathematical research, and portfolio management. In the
course of these studies, the problem of cooperative work became far more concrete to me than it had been before 1985. In a report on Integrated Engineering
Workstations from 1987, I summarized the observations I had made in my studies
in a few theses on ‘forms of cooperative work’ (1987), which were first turned
into a short paper that I had the opportunity to present at the first European workshop on CSCW in Athens (1988b), and then into a rather long paper that was presented at a workshop on ‘technology and distributed decision-making’ in Germany (1988c).
The strategy that I, in effect, pursued in this early work exploited the rather unusual experience I had gained from doing a series of workplace studies in very
different settings in quick succession. It provided me with the opportunity of observing patterns of cooperative work in a large number of settings and hence of
subjecting the work in these settings to a comparative analysis. Although I was
doing field work ‘for money’ and did not have much time to do a systematic comparative analysis, my understanding of cooperative work, as it emerged from these
studies, differed radically from the then prevailing notions of cooperative work. I
already knew from the classic study by Heinrich Popitz and his colleagues (1957)
that ‘group work’ is a rare occurrence in industrial settings, the ‘group fetishism’
of especially American sociology notwithstanding. In my own studies, I did not
see much ‘team work’ either, nor did I observe actors solemnly decide to ‘collaborate’ to reach a ‘shared goal’. Instead, the cooperative work arrangements, which
were easily observable, came across, vividly and massively, as an entirely practical matter, motivated by highly pragmatic (but contradictory) concerns such as
external requirements, operational constraints, and limited resources. What I saw
were patterns of cooperative work emerging, changing, and dissolving again in
response to recurring or changing requirements, constraints, and resources.
This insight — that cooperative work is a ubiquitous occurrence in industrial
and similar work settings, that it is not something invented by sociologists, social psychologists, or management consultants but is a routine practical measure to
meet practical needs — was the platform from which I engaged in CSCW. It first
of all provided me with a basis for defining cooperative work, and hence the
scope of CSCW, and was then, in turn, instrumental in pointing to some absolutely key issues for CSCW.
Instead of giving a summary of the various arguments and positions that are
generally spelled out well enough, and more than often enough, in this collection,
I will here concentrate on the strategy rather than the particular campaigns. However, in order to give readers who are unfamiliar with the research reported in this
book a chance to follow the remainder of this introduction, let me introduce a
very simple example of cooperative work in everyday life.
2. The concept of cooperative work: The mundane case of
moving
Consider two men moving a dining table set consisting of a table and six chairs
from one end of a living room to the other.
They may do this for all sorts of reasons. Perhaps they agree to the very idea of
moving the table and even that it should be moved to that particular location.
They may have discussed different possibilities and only then negotiated a solution. In this case they may be said to have a ‘shared goal’ in the sense that a certain future state of affairs, a new location of the table set, has been stated as desirable and explicitly agreed to. ‘OK’, one of them may have said, eventually, ‘I
think you’re right, let’s move it to the window.’ ‘Yes, let’s try,’ the other one said.
Perhaps the two men do not agree on the desirability of moving the table set at
all. Perhaps one of them is merely assisting the other person, for a fee perhaps, or
to return a favor, or out of sheer generosity. Whatever the reason, he does not really care very much about the location of the table set. He does not ‘share’ the other
man’s ‘goal’, he’s just helping out. ‘Now, where do you want it?’, he asks. ‘Over
there, by the window.’ ‘All right.’
Now, whatever the social arrangement — who wants to move the table and
who acquiesces and who merely lends a helping hand to someone else’s project
— actually moving the table involves a specific category of interaction.
Moving the chairs is straightforward. Each of the men just picks up one chair at
a time, carries it to the other end, and puts it down, and then repeats the operation until they are finished. To do this, in fact, two men are not needed. It might
even be easier for one man to do it, since being two requires certain coordinative
measures: they must take care not to be in each other’s way or to bump into the
other, and they have to make sure that they do not put down the chairs in a location where these will become an obstacle to the other or pose an obstruction when
moving the table.
Carrying the table is a different kind of activity. Let us say that the table is
large and heavy. It just might be possible for one of them to move it to the new
location by dragging or pushing it across the floor, but that would surely damage
the beautiful hardwood floor and perhaps also put excessive stress on the joints of
the table. That is, by being more than one to do the job, it is feasible for them to
move the table without causing damage to the table, to other things, or to themselves.
Now, to move the table, each of them grabs it, lifts it, and they then carry it to
its new location. But apart from these individual-sounding actions (grabbing the
table, lifting it, carrying it, and putting it down again), how do they do it as a joint
effort? Which interactions occur between the two men to make it happen?
First of all, and very much as in the case of the chairs, they must somehow
agree about what to do with the table. They also need to make initial arrangements
such as, who takes which end of the table, and perhaps also the exact destination.
Having sorted that out, they need to synchronize their respective actions: when to
pick it up, when to start walking, the general direction in which to walk, the pace
of walking, and when to put it down and at which exact spot by the window. If
they do not handle this coordination well, the trivial task of moving the table may
turn out to be demanding and may cause some broken furniture and even an injured back. These coordinative actions can happen in myriad ways. But basically,
by holding the table in their hands, they are both immediately ‘aware’ of the state
of the table: its location in space (altitude, pitch, and roll), its velocity, its weight.
As soon as one of them walks slightly more briskly or slows down just a little,
tilts the table to this or that side, lowers it or raises it, the changed state of the table is instantly conveyed to the other man, who then has to act accordingly, by doing likewise or by counter-acting. In the act of carrying the table, the two men are
causally interrelated.
The two men are also able to interact in other ways. They may hold the table in
such a way that they can see each other’s faces; each may then be able to gather
from the other man’s expression if he is having problems or what he intends to do
next and may adjust his own actions to that. Each of them can talk, groan, and nod
to make the partner understand his problems or intentions, but also to
acknowledge that the other man’s problems or intentions have been understood,
and so on. If these coordinative actions — the nodding, grunting, talking, swearing, shouting — are not sufficiently effective, any one of the two men can deliberately change the state of the table (force it in a certain direction, stop abruptly,
shake it, etc.) to make a point that did not come across too well verbally or
through gestures. They are thus likely to succeed.
Indeed, moving a table horizontally a few meters across the floor of a room
does not pose extraordinary challenges to the two men’s coordinative competencies. But taking the table through a door opening might. The narrow space of a
doorway, compared to that of a room, will typically impose strict constraints on
the operation. The men cannot move the table horizontally through the door opening but will have to tilt it. Perhaps the width of the doorway is less than the height
of the table and perhaps the next room is a narrow corridor. In this case they may
have to carry the table vertically while simultaneously turning it around its vertical axis to get the legs through. In order to do this, then, they have to take the table through a carefully choreographed sequence of spatial positions while moving
it forward, through the opening, into the corridor. The more severe constraints of
this task, compared to moving a table to another position in the same room,
change the nature of the cooperative task considerably. Since the degrees of freedom are much fewer, the activities of the two men become more ‘tightly coupled’.
There is, literally and metaphorically, less leeway for each of them. They need to
coordinate their individual actions much more closely.
Things would be somewhat different if they were, say, moving a rolled-up carpet. When taking the carpet through the doorway they can bend it fairly easily,
which may practically neutralize the constraints otherwise posed by a doorway
leading into a narrow corridor. Their individual actions are therefore less tightly
coupled, and they need not strive to coordinate their individual actions as closely
and carefully. On the other hand, however, due to the carpet’s floppiness, its state
is not as immediately apperceptible to the two men as that of the table. It may not
be immediately obvious, from the state of the carpet as experienced locally, by
each of the two, what the other is doing to the carpet. That is, while moving the
carpet does not pose strong demands on their coordinative skills, when close coordination for some reason is called for they may have to be more verbally explicit than when moving a table.
Imagine, finally, an effort of moving on an entirely different scale such as, for
instance, when a family is moving to a new house or a firm is relocating to a new
building. In such cases, more than two persons will be involved for the simple
reason that the number of items to be moved is much larger. If only two men were
to do the job, the exercise could easily last for weeks or months. In the case of the
family’s moving to a new home, the effort may involve friends and family or professional movers; in the case of the relocation of the firm, the effort will undoubtedly require dozens of professional movers. In any event, we would observe exactly the same practices as the ones we have just described, the lifting and carrying of chairs, tables, boxes, etc. The important difference is the scale: these actions will be happening in parallel and will be repeated multiple times. However,
since many more items and many more actors are involved, what decision analysts call the ‘space of possibilities’ is vastly larger than in the case of simply
moving one table and six chairs from one end of a room to the other. There simply
are so many things to move and so many places things can be moved to. And, to
confound the problem, there are dozens of actors who are simultaneously picking
up items and moving them to other locations. In such cases we will observe specialized professional practices. Furniture, lamps, carpets, etc. will carry labels telling where they are to be put (‘Kitchen’ or ‘Room K4.55’). Boxes will have similar labels indicating not only their destination but also what they contain (‘Porcelain’, ‘Unfinished manuscripts’). The movers may also have floor plans of the
building, and doors may be similarly marked with labels with inscriptions that
correspond to the inscriptions on the floor plan (‘Kitchen’ or ‘Room K4.55’) and
on the items that are to be taken there. If the relocation operation is a large one,
one may also observe a pre-specified workflow of sorts, for example a list indicating the sequence in which items belonging to particular building sections are to
be relocated (‘Monday: Ground floor; Tuesday: First floor’, etc.), in order to
avoid congestion, confusion, chaos.
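The specialized coordinative scheme just described (matching inscriptions on items, doors, and floor plans, plus a sequencing schedule) can be sketched as a simple data structure. The following Python fragment is purely illustrative: the item identifiers and label values are invented here and are not taken from any actual relocation operation.

```python
# Illustrative sketch of the movers' labeling scheme: items and doors carry
# matching inscriptions, and a schedule sequences sections to avoid congestion.
# All identifiers and labels are invented for illustration.

# Each item's label states its destination; boxes also state their contents.
item_labels = {
    "table-1": {"destination": "Room K4.55"},
    "box-12": {"destination": "Kitchen", "contents": "Porcelain"},
    "box-13": {"destination": "Room K4.55", "contents": "Unfinished manuscripts"},
}

# Door labels correspond to the inscriptions on the floor plan and on the items.
door_labels = {"Kitchen": "ground floor, left", "Room K4.55": "fourth floor"}

# A pre-specified workflow: which building sections are moved on which day.
schedule = [("Monday", "Ground floor"), ("Tuesday", "First floor")]

def route(item_id: str) -> str:
    """Resolve an item's destination via its label and the matching door label."""
    destination = item_labels[item_id]["destination"]
    return f"{item_id} -> {destination} ({door_labels[destination]})"
```

The point of the sketch is merely that the inscriptions form a system of cross-references: an item's label is meaningful only because the same inscription recurs on a door and on the floor plan.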
3. Strategic distinctions
Our mundane example has already induced us to make a series of important and
interlaced distinctions.
(1) The first distinction is of course that of individual work and cooperative
work. Whereas each of the men could move the chairs individually, they could not
do so, for whatever reason, when it came to moving the table. For this task the
joint effort of two men was required; when approaching and then grabbing the
table, the two men entered a cooperative work arrangement.
One should notice, however, that there is an important issue of scope or granularity in making this distinction in an actual case. Instead of focusing on their
moving the chairs and then moving the table as separate tasks, we could just as well look at the moving of the whole dining table set as one task. Looked at this way,
it would still be a cooperative effort, since two men would be required for at least
part of the effort, but it would still be composed of a range of actions, of which
some (moving the chairs) were ‘loosely coupled’ and others (moving the table)
‘tightly coupled’ cooperative activities. In fact, the example we are looking at here
could itself be merely a small part of a much larger cooperative relocation effort.
The level of scope or granularity at which we describe it depends on the purpose
of our investigation.
We also distinguish the work itself, the work of moving the table set, from the
secondary interactions required to coordinate and integrate the contributions of
multiple individuals, for which I have adopted the term used by Anselm Strauss
and his colleagues: articulation work (Strauss, 1985; Gerson and Star, 1986).
(2) When we describe the cooperative activities of moving the furniture, we
are applying a distinct analytical perspective. We look at the cooperative effort
and the practices involved in that effort without knowing very much about the socio-economic setting in which it takes place. In fact, we do not need to know the
socio-economic roles of the two men: if either or both of them are wage earners
and do this for a salary, or if they live there and do it for their own benefit, or if
one of them is providing neighborly help. Nor do we need to know anything about
their ‘state of mind’, that is, if they are happy or not happy about the whole ordeal. In short, we can focus on and investigate cooperative work and coordinative
practices as a distinct domain of practice, while leaving the socio-economic and
organizational setting in the background.
To be more precise, a number of distinctions are involved here (cf. Schmidt,
1994c, 2002a). There is the unfolding pattern of cooperative interdependencies
and interactions, as the two men engage in the task and perform their work: as
they approach the table set, pick up the chairs and carry them, one at a time, to the
end of the room, and then return to the table, pick it up, and carry that too. These
shifting patterns of actually enacted relationships are what I call the cooperative
work arrangements. Other categories of relationship can also be distinguished, in
particular the relatively stable configuration of actors for which the term work organization is normally used. The distinction is that of ‘mobilization’ versus ‘deployment’, that is, between the contingency arrangement (e.g., the particular configuration of workers with a range of skills deemed adequate to handle the tasks
expected on a particular shift) and the enacted arrangement. In our example of
moving, two men are enlisted in the contingency arrangement because the work to
be done includes the moving of a large and heavy table; the enacted arrangements
coalesce and dissipate again, as the two men first move the chairs and then, jointly, the table.
Both of these perspectives are essential when looking at cooperative work, not
only the enacted arrangements but also the contingency arrangement, because the
shifting cooperative work arrangements play out among the members of the work
organization. They combine and deploy as the situation unfolds, on the basis of
what is to be done, what it requires, who is ready, etc.
By contrast to these perspectives, in which cooperative work is conceived of as
material relationships and which are central to CSCW, there are of course other
perspectives, in which cooperative work is conceived of from the point of view of
the socio-economic relationships that are also played out in cooperative activities,
in and through the material relationships. There is the unit of appropriation
through which resources are committed and pooled and the results of the effort are
allocated to the participants — the economic unit, if you will. In our case, different units may be involved: the family whose furniture it is and who will hopefully
benefit from the moving about, and possibly the neighbor who may be helping, or
the professional movers who in turn may be wage earners or members of a cooperative. And there are, finally, the contractual arrangements through which members of the unit regulate their diverse, partially incongruent, sometimes conflicting
interests and concerns: the pizza and beer that the neighbor is due, or the contract
specifying the ‘transaction’ between the family and the movers.
In making these distinctions, or rather, in recognizing the different domains of
discourse in which we talk differently about practical organization of work, I was
(implicitly) influenced by the Marxian distinction between ‘material’ and ‘social’
relationships of human sociality (for an excellent reconstruction, cf. G. A. Cohen,
1978). However, I was also strongly influenced by neo-classical institutional economics (Williamson, 1979, 1981). This is not as unprincipled as it may seem.
Williamson makes exactly the same distinction as I do here: between the relationship of interdependence in work and its immediate organization (the cooperative
work arrangement and the work organization) on one hand, and on the other the
contractual governance arrangements regulating ‘transactions’, the relationships
of ownership and appropriation. The small but important difference is that he focuses on the socio-economic relationships of transfer of ownership (‘transactions’) and considers the cooperative work arrangements as a singularity, whereas
I have shifted ‘figure and ground’ and focus on the cooperative work arrangement, pushing the socio-economic relationships to the back.
This distinction — between cooperative work and the contractual settings in
which it is situated — is useful for defining the ‘boundary’ between CSCW and
Information Systems research and other areas of organizational IT. The distinction
is not, as it is sometimes posited (e.g., Grudin, 1994, 1999), one of size (‘small
groups’ versus ‘organizations’) but one of perspective. The CSCW perspective
addresses IT sub specie cooperative work practices, irrespective of the institutional economics (not to mention the size) of the arrangement, whereas IS research
addresses IT sub specie the socio-economic interests and motives of the actors
(business models and the concomitant performance measurement and remuneration arrangements).
The distinction between cooperative work and the institutional and contractual
arrangement is fundamental to my strategy. It allows us to single out ‘cooperative
work’ as a distinct category of practice that can be conceived of independently of
actors’ motives and interests and thus to talk fairly unambiguously about ‘interdependence’.
(3) In our scenario we begin to discriminate different ‘kinds’ of relationships of
interdependence. We noticed, for example, that moving chairs and moving tables
and moving carpets involve interdependencies with rather different characteristics. We also noticed that moving the table across a room involves interdependencies that are distinctly different from moving the same table from one room to another. These different characteristics can be conceived of as so many kinds of
complexity of cooperative work.
Being interdependent in work is categorially different from being ‘interdependent’ by virtue of sharing a scarce resource, such as the road system in the
morning rush-hour, or being ‘interdependent’ by virtue of sharing a budget, as one
does when employed with the same company. Different rules apply and hence different practices are involved. Without the distinction, the term ‘interdependence’
is analytically useless.
Thus defined, the concept of ‘interdependence’ itself plays a strategic role.
First of all, it has provided a firm ground for defining cooperative work in a way
that does not subscribe to notions of occult alignment of minds such as ‘shared
goal’ or ‘shared understanding’.1 Such mentalist definitions invariably end up in
tautologies: cooperative work is defined by a shared goal, and the criterion for ascribing a shared goal to actors is that they — well, act in concert. If cooperative
work is conceived of this way, we are not really, i.e., accountably, able to speak of
a cooperative effort that is not carried out successfully. By contrast, the concept of
interdependence in work enables us to conceive of cooperative work in terms of
actual observable conduct.
In addition it has served a heuristic or methodological function. When we conceive of cooperative work in terms of observable interdependencies, the obvious
next step is to investigate the different characteristics of different relations of interdependence, such as, for instance, the ‘degree of coupling’, the direction of dependence, as well as irreversibility, uncertainty, and various temporal characteristics, etc. This has proved analytically quite productive in the context of workplace
studies, by offering a useful path towards a systematic conceptual framework for
workplace studies and for comparative analysis.
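As a rough illustration of what such a comparative framework might record, one could tabulate a relation of interdependence along the dimensions just mentioned. The schema below is my own illustrative sketch, not part of the published framework; the field names and example values are invented.

```python
# Illustrative (invented) schema for comparing relations of interdependence
# along the dimensions named in the text: degree of coupling, direction of
# dependence, irreversibility, uncertainty, and temporal characteristics.
from dataclasses import dataclass

@dataclass
class Interdependence:
    coupling: str        # 'loose' or 'tight' (degree of coupling)
    direction: str       # e.g. 'mutual', 'one-way'
    irreversible: bool   # can an individual action be undone?
    uncertainty: str     # e.g. 'low', 'high'
    temporal: str        # e.g. 'synchronous', 'sequential'

# Two relations from the moving example, characterized for comparison:
carrying_table = Interdependence("tight", "mutual", False, "low", "synchronous")
moving_chairs = Interdependence("loose", "mutual", False, "low", "sequential")
```

Even so crude a tabulation makes the analytical point visible: the same two actors, the same room, but categorially different relations of interdependence.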
The concept of interdependence expresses the particular material, dynamic, and
environmental characteristics of a particular cooperative effort. It therefore also
1 I am not exaggerating. Endsley and Jones, for example, explicitly conceive of ‘common goal’ and ‘shared
understanding’ as ‘overlapping’ ‘sets’ and even go as far as to talk about ‘shared situation awareness’ and
‘shared mental models’ (Endsley and Jones, 2001).
implies that different relations of interdependence may have different characteristics. The relationships of interdependence that actors enter when forming a specific cooperative work arrangement to undertake a particular task pose specific
‘complexities’ for the cooperating actors to cope with that differ from those posed
by a similar task in another setting or by another task. Carrying a table across a
floor poses coordinative complexities of a more manageable kind than carrying
the same table through a narrow door frame, whereas the relocation of an entire
firm from one address to another in turn poses quite different and far less tractable
complexities.
Although the concept of complexity was at first introduced rather informally in
my thinking about cooperative work and was not discussed at all until my ‘Remarks on the complexity of cooperative work’ (2002a), it served well as an intuitive and cogent way of expressing what motivated the widespread use of specialized coordinative practices involving coordinative artifacts. The concept of the
complexity of cooperative work was also useful by implicitly highlighting those
settings and practices for which CSCW technology might be most relevant and
have the highest potential.
(4) We have, in our fictional case, observed a variety of coordinative practices,
ranging from the trivial one of moving the chairs prior to moving the table and
avoiding collisions, to carrying the table in synchrony in the right direction, to the
entirely different specialized practices of using standardized inscriptions to identify or categorize items and of developing work schedules.
The concept of interdependence also, and this has been its most important strategic function, offers a framework for carefully and deliberately embracing the
entire spectrum of coordinative practices, from actors’ effortlessly and unnoticeably aligning their own activities with those of others (in CSCW often referred to
under the label ‘mutual awareness’), to actors’ methodically regulating their interdependent activities through pre-established schemes expressed in a set of rules
(conventions, operational procedures) and concomitant, appropriately formatted
textual artifacts (forms, taxonomies, schedules, etc.). So, instead of creating a categorical gulf between, say, ‘awareness’ and ‘workflows’, the concept of interdependencies enables us to conceive of coordinative practices as an open-ended repertoire of practices, some inconspicuously quotidian and ubiquitous, others exceedingly specialized and sophisticated.
The idea that different cooperative work arrangements have to cope with interdependencies of different complexity and that they develop different coordinative
practices to accomplish exactly that, gave me a handle on a vast and heterogeneous class of coordinative practices that all rely on coordinative artifacts and, behind that, practices of writing. I dubbed them ‘coordination mechanisms’.2 They
are massively present in modern cooperative work settings and one would have to
be blind (or ideologically blinded) not to notice them. They are ubiquitous because economically vital.
4. Coordinative practices: From ‘coordination mechanisms’ to
‘ordering systems’
The concept of coordination mechanisms was developed in opposition to the then
prevailing opinion in CSCW according to which IT systems cannot or should not
regulate interaction. While the observation that formal organizational constructs
are widely used in cooperative work was far from controversial, apprehension had
grown and become widespread in the CSCW community with respect to the idea
that computer systems could be successfully designed to regulate cooperative interaction by means of computational procedures, workflows, process models, etc.
These misgivings were not at all groundless. Early attempts in CSCW to build systems that somehow imposed rules on cooperative interaction such as The Coordinator (Flores, et al., 1988), DOMINO (Victor and Sommer, 1989), etc. were generally perceived as failures, sometimes even by the designers themselves
(Kreifelts, et al., 1991a), and a number of critical sociological studies by Lucy
Suchman and others argued that such constructs, instead of determining action in
a ‘strong sense’, by specifying step by step how work is actually performed, serve
as ‘maps’ that responsible and competent actors may or may not consult to accomplish their work (Suchman, 1987).
From the very beginning I found these interpretations of the experiences and of
the field work data problematic and found the conclusions drawn from them unduly pessimistic. Fearing that the thinking that was already rapidly becoming the
CSCW canon (Agre, 1990) would condemn CSCW research to a program that,
devoid of sociological realism and practical relevance, posited that computational
systems should simply provide a space of sorts for unregulated interaction (‘media
spaces’, ‘workspaces’, ‘collaborative virtual environments’), I suggested an alternative research program (Schmidt, 1991a).3 Using Suchman’s dictum that ‘plans
are resources for situated action’ as my shield, I very cautiously sketched my alternative approach:
‘models of cooperative work in CSCW systems (whether procedures, schemes of allocation of
tasks and responsibilities, or taxonomies and thesauri, etc.) should be conceived of as resources
for competent and responsible workers. That is, the system should make the underlying model
2 In fact, these practices were at first called ‘mechanisms of interaction’ (1994b), but having realized that the
term ‘interaction’ covers about everything in social life and that the term ‘mechanism of interaction’ thus
was far too sweeping, I later adopted the more modest term ‘coordination mechanisms’.
3 My critique of Suchman’s analysis of the role of formal organizational constructs was unfolded a few years
later in my article entitled ‘Of maps and scripts’ (1997). (Cf. also the discussion in Chapter 12).
accessible to users and, indeed, support users in interpreting the model, evaluate its rationale and
implications. It should support users in applying and adapting the model to the situation at hand;
i.e., it should allow users to tamper with the way it is instantiated in the current situation, execute
it or circumvent it, etc. The system should even support users in modifying the underlying model
and creating new models in accordance with the changing organizational realities and needs. The
system should support the documentation and communication of decisions to adapt, circumvent,
execute, modify etc. the underlying model. In all this, the system should support the process of
negotiating the interpretation of the underlying model, annotate the model or aspects of it etc.’
(Schmidt, 1991a)
4.1. Coordination mechanisms in practice
I had originally begun to concern myself with work practices that somehow depend on formal constructs when I was doing my early work on office information
systems for administrative work domains in the 1980s, but from about 1990, when
I joined Jens Rasmussen’s group at Risø, my colleagues and I started on a systematic investigation of the phenomenon. Bjarne Kaavé was already engaged in his
fascinating study of production planning and control practices in a Danish manufacturing plant (Kaavé, 1990). Our discussions and analyses of his observations
played an important role in developing my understanding of interdependence and
of the role of coordination mechanisms. A little later, Peter Carstensen and Carsten Sørensen did a study of a large industrial design project and were able to observe, virtually first hand, how a group of ordinary engineers developed and
adopted a set of procedures and forms (e.g., a bug report form, a binder, a spreadsheet with a project schedule) in an attempt to cope with a cooperative effort that
had become chaotic (Carstensen, 1994; Carstensen, et al., 1995a; Carstensen,
1996; Carstensen and Sørensen, 1996). To us this was a demonstration that coordination mechanisms cannot be reduced (Braverman style) to mere control instruments in the service of capital. They are, in some important respects at least,
indispensable practical means for maintaining order under conditions of division
of labor. Hans Andersen’s study of ‘change notes’ in another design organization
complemented these findings (H. H. K. Andersen, 1994b).
This view of coordination mechanisms — that they are essential means that
members of cooperative work arrangements devise, adopt, and adapt in order to
be able to manage their complex interdependencies — was later substantiated by a
series of studies of ‘self-governing production groups’ in Danish industry that
were carried out from 1998 onwards. Such groups are a key element in a strategy
that aims at increasing the competitive power of manufacturing in high-cost
Western countries by increasing operational flexibility and product quality. However, as had been pointed out by Irene Odgaard in a study of production groups
at a large Danish manufacturing company (1994), the groups were largely unable
to accomplish the coordination tasks that had been delegated to them because they
did not have the requisite tools to do it properly.
Inspired by this, my colleagues and I embarked on a series of studies of shopfloor production planning and control in Danish industrial enterprises that lasted
from 1998 to 2002. In the first of these studies, of shop-floor planning and control
in a manufacturing enterprise that was then switching from forecast-driven to order-driven production, we were able to show that the standard MRP system was
far too crude to offer the required coordination support on the shop floor. On the
other hand, it was evident that the models underlying the MRP system (e.g., bill
of materials, routing schemes, processing schemes) were as indispensable as in
the case studied by Kaavé. Our study resulted in a demonstrator prototype of a
system that would exploit the models underlying the MRP system but give operators extensive power to overrule the plans generated by the MRP system, whenever they decided that the generated plans were inadequate, and our prototype would
then, again on the basis of known interdependencies, try to anticipate the effects
of the new plans enforced by the operators. One could say that the kind of system
we sketched was an interactive MRP system (Carstensen, et al., 1999; Odgaard, et
al., 1999).
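The logic of such an ‘interactive MRP’ system, in which operators overrule generated plans and the system anticipates the consequences along known interdependencies, can be caricatured in a few lines. This is an invented sketch, not the actual demonstrator; the operation names, durations, and dependency structure are made up.

```python
# Invented sketch: an operator overrules a generated plan, and the system
# re-derives downstream start times from known interdependencies.

# Each operation: planned start time (in hours) and its duration.
plan = {"cut": 0, "weld": 4, "paint": 9}
duration = {"cut": 4, "weld": 5, "paint": 2}
# Known interdependencies: an operation cannot start before its predecessor ends.
depends_on = {"weld": "cut", "paint": "weld"}

def overrule(plan, op, new_start):
    """Apply an operator's override, then propagate its effects downstream."""
    plan = dict(plan)  # leave the original plan untouched
    plan[op] = new_start
    # Propagate in dependency order: push each dependent operation to after
    # its predecessor ends, if the override has made its start infeasible.
    for succ, pred in depends_on.items():
        earliest = plan[pred] + duration[pred]
        if plan[succ] < earliest:
            plan[succ] = earliest
    return plan
```

The point is merely structural: the MRP models (routing, precedence) remain indispensable, but the operator, not the system, has the last word on the plan.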
After that, in a subsequent and much larger research project, we launched a series of concurrent field studies in five Danish manufacturing enterprises: a shipyard, a maritime propulsion manufacturing plant, a cable manufacturer, a manufacturer of steel cabinets, and a manufacturer of electronic instruments
(Carstensen, et al., 2001; Carstensen and Schmidt, 2002). This series of studies, in
which members of production groups played an active role, further substantiated
the line of thinking that was orienting our research: that the kinds of construct we
have called ‘coordination mechanisms’ are of critical importance to actors in
complex cooperative work settings; that actors build, adopt, manipulate, adapt
such schemes when it is deemed useful to do so; that their ability to do so is critical to the productivity, effectiveness, and quality of their work and essential to
their collective control of their daily working life; and that there may be potentially vast benefits to be gained from developing information technologies that support ordinary workers in those practices. In short, the studies served as ‘proof of
concept’ for the concept of ‘coordination mechanism’ as well as a powerful reminder of the practical importance of the problem.
4.2. Understanding computational coordination mechanisms
We knew of course, from the outset, that for coordination mechanisms to be truly
viable the existing technological platforms were insufficient. Therefore, at the
same time as these analytical and design studies were pursued, but tightly interlaced with them, another long-term research program was launched in close collaboration with Carla Simone and her colleagues (at the Departments of Computer
Science at the Universities of Milano and Torino).
In this work, we understood coordination mechanisms as consisting of two
basic and closely related elements: (a) a coordinative protocol: a set of rules pertaining to interaction (taken-for-granted ways of proceeding, established conventions, official policies, standard operating procedures); and (b) a coordinative artifact: a stable data structure expressed in a standardized graphical format.
From our field work we concluded that a computational coordination mechanism should meet a set of requirements which we expressed as follows: Computational coordination mechanisms should be ‘malleable’. This has several implications. A CSCW system of this kind should enable actors to define the protocol of
a new coordination mechanism and also to later redefine it, in order to be able to
meet changing conditions, by making lasting modification to it. Furthermore, actors should be able to control the execution of the protocol and make local and
temporary modifications to its behavior, for example to cope with unforeseen contingencies. In order for actors to be able to define, specify, and control the execution of the mechanism, the protocol should be ‘visible’ to actors at ‘the semantic
level of articulation work’, i.e., it should be expressed in terms that are meaningful to competent members of the cooperative work arrangement. Moreover, to allow for incomplete initial specification of the protocol, it should be possible for
actors to specify the behavior of the computational coordination mechanism incrementally, while it is being executed. And finally, we had observed that coordination mechanisms, even though they were typically developed for handling specific coordination issues, so to speak enter relationships with other mechanisms.
More precisely, a particular coordination mechanism will typically be part of a
wider complex of interdependent mechanisms. A change to the state of one mechanism may thus have implications for the state of another, and the propagation of
state changes from one mechanism to another will therefore have to be taken care
of somehow, manually or automatically (Schmidt, et al., 1995). Consequently, a
computational coordination mechanism should be constructed in such a way that
it can be linked to other coordination mechanisms in the wider setting (for the
consolidated formulation of this conception, cf. Schmidt and Simone, 1996).
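Schematically, the dual structure (protocol plus artifact) and the requirements of malleability and linkability could be rendered as follows. This is a toy illustration of the concept only, not Ariadne or any system we built; all class, method, and field names are invented.

```python
# Toy sketch of a computational coordination mechanism: a protocol (rules the
# actors can inspect and modify) paired with a coordinative artifact (a stable
# data structure), linkable to other mechanisms so that state changes can
# propagate between them. Everything here is invented for illustration.

class CoordinationMechanism:
    def __init__(self, name, rules, artifact):
        self.name = name
        self.rules = rules        # coordinative protocol, visible to actors
        self.artifact = artifact  # coordinative artifact, e.g. a form's fields
        self.linked = []          # other mechanisms observing state changes

    def redefine_rule(self, key, rule):
        """Lasting modification of the protocol ('malleability')."""
        self.rules[key] = rule

    def update(self, field, value):
        """Change the artifact's state and propagate to linked mechanisms."""
        self.artifact[field] = value
        for other in self.linked:
            other.on_peer_change(self.name, field, value)

    def on_peer_change(self, peer, field, value):
        # A real system would apply its own protocol here; the toy just records
        # the propagated change for the actors to deal with.
        self.artifact.setdefault("log", []).append((peer, field, value))

bug_form = CoordinationMechanism(
    "bug-report", {"severity": "must be set"}, {"status": "open"})
project_schedule = CoordinationMechanism("project-schedule", {}, {})
bug_form.linked.append(project_schedule)
bug_form.update("status", "closed")
```

The design choice the sketch tries to convey is that the protocol and the propagation behavior are ordinary, inspectable data, not logic buried in the system: that is what makes local modification and incremental specification conceivable.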
In a systematic attempt to fully understand and, ultimately, meet these requirements, an experimental notation was developed, i.e., a set of categories and
predicates of articulation work and the rules of their combination. Ariadne, as the
notation was called, was designed to enable actors to express, construct, maintain,
execute, and link computational coordination mechanisms. (The work involved
was extensive and resulted in a large number of publications. For a brief summary
of this work, cf. Simone and Schmidt, 1998). In the development of the Ariadne
notation, a considered decision was made to postpone the implementation and
concentrate on developing a formal specification of its elements and on evaluating
it against the requirements and scenarios derived from our field studies. This
strategy was adopted, deliberately and explicitly, in order to avoid having the notation influenced, in an implicit and uncontrollable manner, by the inevitable limitations of currently available implementation platforms. The formal specification
showed that it was feasible to construct malleable coordination mechanisms by
means of the notation. Subsequently, a ‘concept demonstration’ of the formal
specification of the notation was implemented in an environment which is particularly suitable to managing relational structures and their behavior. This partial implementation established that the internal architecture of the Ariadne notation is
workable (Simone, et al., 1995b).
In the context of the overall strategy I have been pursuing, the concept of ‘coordination mechanisms’ has done a good job. It has provided a workable approach
for CSCW research to address the realities of complex cooperative work settings
and has thus offered an alternative to programs that, in my view, could cut no ice and that have since been abandoned. In addition, we
were able to show that the recommended approach was theoretically feasible.
4.3. Coordination mechanisms reconsidered
As noted above, we defined a coordination mechanism as consisting of a coordinative protocol as well as a concomitant coordinative artifact. This duality of coordination mechanisms was important for our work, for a number of reasons. It
was crucial that the concept of coordination mechanism was not, as so often happens, instantly dissolved in idle metaphorical talk. It is almost a defining characteristic of present-day intellectual life that any new and interesting concept that
arrives on the scene is immediately appropriated, stretched, transformed, abused,
and eventually rendered practically useless. And it was obvious that there was a
significant temptation to use the term to denote any type and form of convention,
from dinner party etiquette to the grammar of ordinary speech. We therefore
found it of critical importance to restrict the use of the term to the historically specific class of practices that have developed as an integral aspect of complex cooperative work practices, coordinative practices that are sufficiently standardized
and specialized that they are complemented by standardized and specialized coordinative artifacts. Similarly, my colleagues and I did not want the vast array of
artifacts that populate cooperative work settings, bug report forms as well as screwdrivers, timetables as well as machining stations, to be included under the
concept of coordination mechanisms. That would instantly render the concept meaningless. Artifacts ‘as such’ have nothing but abstract materiality in common; the notion is thus an empty one. We wanted to address a specific class of
artifacts, namely, specialized artifacts, coordinative artifacts, that have been devised to serve in a regulatory capacity in cooperative work arrangements and that,
thus, are used in accordance with specific sets of rules, namely, coordinative protocols. In view of this, and for the sake of intellectual economy, it was assumed —
or rather stipulated — that a ‘coordination mechanism’ is defined by having one
and only one ‘artifact’.
This stipulation had an additional advantage. We had observed that coordination mechanisms, although devised for handling specific coordination issues, are
regularly used in conjunction with other mechanisms. The ‘one artifact, one protocol’ stipulation seemed to make it relatively straightforward to identify and delimit individual coordination mechanisms and thereby to conceive of and construct well-bounded computational (models of) coordination mechanisms that
then, again in a relatively straightforward way, could be combined to form complex coordination mechanisms.
Now let us look at the costs of that strategy. For analytical purposes the
concept of coordination mechanisms has serious shortcomings. Some of the shortcomings were known and had been identified from the beginning or early in the
process, namely the ways in which coordinative issues of time and space were
dealt with. As far as issues of time were concerned, when the Ariadne notation
was devised, we were well aware that we only had a superficial understanding of
the issue of time in coordinative practices. As a result, the temporal aspects of coordination could only be expressed as pre- and post-conditions, and as points in
time, of course. The problem was duly noted and left for later work. As for spatial
issues in coordinative practices, when we began to investigate production control systems such as MRP systems in more depth, it became clear that the coordinative
protocols incorporated in these systems could not express spatial aspects of coordination, such as, for instance, limited storage space on the shop floor or in the
shipping department. Again the problem was duly noted and left for later work.
More seriously, the deliberate rigor was obtained at a high price. Firstly, the
concept of coordination mechanisms was developed on the paradigm of workflows and thus does not support analysts in understanding, describing, or even noticing other kinds of coordinative protocols. Secondly, the ‘one artifact, one protocol’ stipulation is unduly restrictive when used analytically and may lead analysts to engage in futile analytical exercises. I will address the problems in that
order.
(1) The concept of coordination mechanisms was developed on the paradigm
of pre-established workflows: an MRP system, a kanban system, a bug report, a
project schedule, a change note. The kind of protocol we took as the exemplar has
a fundamentally temporal structure: when A has done x, the task (in state x’) is to
be transferred to B who then has to do y.
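A protocol of this procedural kind can be rendered, purely for illustration, as a small state-transition structure. The sketch below is my own invention; the names are hypothetical, and the pairing of one protocol with one artifact follows the stipulation discussed above, but this is not the Ariadne notation itself.

```python
# Hypothetical sketch (not the Ariadne notation): a procedural coordinative
# protocol of the kind 'when A has done x, the task (in state x') is
# transferred to B, who then has to do y', paired with one artifact.

from dataclasses import dataclass, field


@dataclass
class CoordinationMechanism:
    """One coordinative artifact paired with one coordinative protocol."""
    artifact: str
    # transitions: (state, actor, action) -> next state
    transitions: dict = field(default_factory=dict)

    def advance(self, state: str, actor: str, action: str) -> str:
        """Transfer the task to its next state, if the protocol permits it."""
        try:
            return self.transitions[(state, actor, action)]
        except KeyError:
            raise ValueError(
                f"protocol does not permit {actor!r} to do {action!r} in state {state!r}"
            )


# A minimal bug-report workflow: when A has reported the fault, the task
# is transferred to B, who then has to fix it.
bug_report = CoordinationMechanism(
    artifact="bug report form",
    transitions={
        ("open", "A", "report"): "reported",
        ("reported", "B", "fix"): "resolved",
    },
)

state = bug_report.advance("open", "A", "report")  # -> "reported"
state = bug_report.advance(state, "B", "fix")      # -> "resolved"
```

The point of the sketch is only that such a protocol is fundamentally temporal: it says nothing about classification schemes, spatial constraints, or any other ordering principle.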
The fact that we granted privileged status to protocols of this procedural kind
was not the result of an oversight. It was obvious already then that other coordinative techniques play a crucial role in cooperative work, most importantly classification schemes. In fact, the Ariadne notation has a slot for ‘conceptual structures’,
but these were only treated in a rudimentary manner, subordinate to procedural
protocols. Our problem was not one of principle but one of practical expediency:
we had not as yet had the opportunity to do proper studies of practices of classifi-
cation that would have enabled us to address them thoroughly, and so we simply
left the problem for later.
Anyway, this makes the concept of ‘coordination mechanism’ overly exclusive
when used analytically. It excludes ab initio important coordinative practices, not
by definition, but due to the framework’s lack of a rich and differentiated set of
distinctions. It notoriously makes analysts overlook coordinative artifacts and protocols of crucial importance, for the simple reason that they are not, or apparently
not, part of a coordination mechanism.
In sum, then, workflow specifications (schedules, timetables, routing schemes) should be demoted from the status of paradigm to that of a special case of coordinative artifacts and protocols. Although it may, in some cases, be relevant and appropriate to single out coordination mechanisms from the wider cluster of coordinative practices, the analyst should bear in mind that workflows cannot work if actors are not also using, for example, maps, templates, location designators, etc., as well as classification schemes, nomenclatures, ranking schemes, verification and validation procedures, coding schemes, notations, etc.
(2) As noted above, the ‘one artifact, one protocol’ stipulation turns out to be
unduly restrictive too.
On one hand, there evidently are coordinative protocols to which no coordinative artifacts are attached, at least not directly, and protocols evidently exist at
multiple levels of abstraction. That we realized this when the framework was developed is evident from the fact that we felt it necessary to include ‘policies’ (e.g., Simone et al., 1995b). ‘Policies’ were taken to be exactly that:
global protocols, with no associated artifacts, that constrain the local specification
and application of protocols.
On the other hand, there evidently are coordinative protocols to which multiple
artifacts are associated. It would be excessively pedantic or directly meaningless
to divide such protocols into a range of discrete sub-protocols to correspond to the
various distinct artifacts. To take but a simple example: organizing a meeting in an organization, a design meeting, say, requires not only that a call or an agenda be issued, perhaps with the minutes of the preceding meeting attached, a list of participants, etc., but also a myriad of artifacts such as clocks and calendars, codes for rooms (names or numbers) and often inscriptions of these codes on doors,
floor plans, etc. Although all of these artifacts of course presume and imply historically developed skills and conventions,4 it makes little analytical sense to insist, for each artifact, that the associated protocol must be discrete, nor does it for
that matter make much sense to insist, in each and every case, on conceiving of
the associated skills and conventions as coordinative protocols. The problem here
is, on one hand, that artifacts such as clocks and calendars, besides their use for coordinative purposes in work settings, are used generally in modern civilization, for an infinite variety of purposes, and, on the other hand, that it leaves large blank spots on the map of coordinative practices if they are left out.
4 For excellent accounts of the development of these ‘generic’ script-based coordinative practices, cf. the classic work by Jack Goody (1977, 1987), David Olson (1994), and Alfred Crosby (1997).
The coordination mechanism framework may thus engender a rather doctrinaire approach to analysis, and in fact, experience has shown that scholastic debates easily erupt as to whether a particular artifact is or is not part of a particular coordination mechanism. An alarm on the bridge of a ship, for instance? City maps in
fire engines? Deployment plans? Access instructions? This conclusion is reinforced when we take into account not just clocks and calendars and maps and
floor plans. Let us therefore, in accordance with the strategic aim of not granting
privileged status to particular types of coordinative practice, try a fresh look at
what is actually there.
(3) What an analyst observes when entering a modern workplace is a plethora
of coordinative artifacts. He or she will see bulletin boards, shift staffing plans,
vacation plans, phone lists, shelves with dozens and dozens of binders, often subdivided by markers, stacks of files on desks and shelves, in- and outboxes, archive
boxes, production plans at work stations, production orders, part drawings, product specifications, etc. The analyst should not dogmatically exclude any of these
from consideration. We have to understand how they are used, as a heterogeneous
totality, in coordinative practices.
For me the occasion for unpacking the concept of coordination mechanisms and resuming the original program of embracing coordinative practices in their endlessly rich multiplicity arose when I began seriously to address the issue of classification schemes.
Classification schemes, I knew from the very beginning, are a vitally important phenomenon, but also one for which I did not have anything like a proper empirical foundation. I had observed their importance in my various studies, for instance, of the distribution of research papers within the mathematical community, of the handling of labor protection and tariff contract cases in trade unions, etc. In fact, it was to a large degree the realization that classification practices are hugely important in cooperative work that motivated Liam Bannon and myself to highlight ‘common information spaces’ as a central problem for CSCW (Bannon and Schmidt, 1989; Schmidt and Bannon, 1992).
A serious attempt to get a handle on the issue of classification was undertaken
in the middle of the 1990s, in collaboration with Hans Andersen. The study of the
design organization in a large Danish manufacturer of water pumps identified
some, to us, very interesting coding practices, in particular a ‘product key’, a coding scheme that, by generating a unique, predictable, and reproducible designation
for each of the about 25,000 product variants produced by the company, also and
at the same time, generated a rigorous classification of the same items and of the
hundreds of thousands of associated documents. The scheme had the additional
advantage of being open-ended: changes to the design of a particular product variant, e.g., the introduction of new materials, say Teflon, for sealing a shaft, would be reflected in the coding scheme and thereby in the name and classification of the variant (Schmidt et al., 1995; H. H. K. Andersen, 1997).
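The logic of such a coding scheme can be indicated with a deliberately simplified sketch; the fields and code tables below are invented for illustration and are, of course, far cruder than the actual product key.

```python
# Hypothetical sketch of a 'product key': a coding scheme that generates a
# unique, predictable, and reproducible designation for each product variant
# and, at the same time, classifies the variants. The field names and code
# tables are invented for illustration only.

PUMP_TYPE = {"circulator": "CR", "submersible": "SB"}
MATERIAL = {"cast iron": "A", "stainless steel": "B"}
SEALING = {"rubber": "1", "teflon": "2"}  # a new material simply gets a new code


def product_key(pump_type: str, stages: int, material: str, sealing: str) -> str:
    """Compose the designation from the variant's properties. The same
    properties always yield the same key (reproducible), and each segment
    of the key classifies the variant along one dimension."""
    return f"{PUMP_TYPE[pump_type]}{stages:02d}-{MATERIAL[material]}-{SEALING[sealing]}"


key = product_key("circulator", 4, "stainless steel", "teflon")
# The key both names the variant and locates it in the classification
# (type, number of stages, material, sealing); a design change, say a
# Teflon seal, is reflected directly in the name and classification.
```

The open-endedness lies in the code tables: a new material or sealing extends a table without disturbing the designations of existing variants.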
But it was not until Ina Wagner and I, from 2001 onwards, jointly undertook a
systematic analysis of coordinative artifacts based on the rich set of data she had
gathered in the course of her long-term ethnographic study of architectural work
practices, that substantial progress was finally made.
4.4. Ordering systems
Investigating the large and heterogeneous collection of coordinative artifacts Ina
Wagner had gathered in her ethnography of an architectural office, we began to
understand that a wide variety of ordering principles were at play in the coordinative practices of architectural work. There was, of course, the temporal ordering of workflows (sequential order, stipulated phases, deadlines, versions). But there was also, intertwined with the workflows, a multiplicity of ordering principles, such as practices of validating documents (creator, status), as well as practices of identifying and naming documents, practices of association, aggregation,
and classification of documents, coding schemes, notations, etc.
Accordingly, we embarked on a meticulous analysis of each of the different
coordinative artifacts used by the members of the architectural office, in an effort
to reconstruct the practical logic and principles of ordering embodied in their different graphical formats and their interrelationships.
We quickly abandoned all hope of coming back with a finite set of ordering
principles. The task we set out to accomplish was, rather, to understand the logic
of these practices, without forsaking the observed multiplicity of coordinative artifacts and practices. The strategy we adopted can be said to involve two moves:
(1) The practical logic of multiplicity. The multiplicity of coordinative artifacts
and protocols we observe in modern cooperative work settings is not accidental;
nor is it an artifact of superficial analysis. The multiplicity is constitutive. We
have long ago learned from Gerson and Star that ‘No representation of the world is either complete or permanent’ (Gerson and Star, 1986, p. 257). Coordinative
protocols and the principles of ordering they incorporate are constructed to handle
issues that are ‘local’ in the sense that they are limited to a specific activity, process, project, etc. or to a specific coordinative issue. They are specialized constructs.
Rationality is always local rationality. This is not because the world is absurd
or irrational, as if such a proposition would make sense. Nor is it because we mere mortals, constrained by ‘bounded rationality’, cannot grasp the rationality of the
world, as if it would make sense to conceive of an agent with infinite rationality.
Rationality is always local simply because the production of any kind of insight,
knowledge, conceptualization, theory, representation, formulae, model, principle,
protocol, procedure, scheme, and so on requires time and effort and because time
and effort in practice are limited resources. Rationality is local for reasons of
practical epistemology. The fragmented rationality that we seem to observe in the
multiplicity of protocols and artifacts is the result of what Bourdieu has called the
economy of practical logic: ‘The economy of logic […] dictates that no more logic
is mobilized than is required by the needs of practice’ (Bourdieu, 1980, p. 145).
Practitioners are not in a position to indulge in unnecessary ‘logic’; to get the job
done and in general ‘move on’, they have to economize on logic, consistency, and
coherence.
(2) The principle of historical specification.5 In making sense of the wealth of
coordinative practices and their tricky interdependencies and in developing the
required rather delicate distinctions, it was of paramount importance to steer well
clear of what Gilbert Ryle has called the ‘intellectualist legend’ (Ryle, 1949). This legend is deeply entrenched in our thinking; it is a cornerstone of our modern mythology and constitutes the key strategic asset of cognitivism. In order to account
for intelligent conduct, the intellectualist imputes occult operations of inference of
a specifically intellectual nature to any kind of intelligent conduct. With respect to
classification in particular, the intellectualist confounds specialized practices such
as classification, which have been developed historically and rely on complex literate practices, with an organism’s ability to tacitly and immediately discriminate
ordinary features of the world such as edible and non-edible things or sad and
happy faces. The problem we were facing was the same problem that, a decade earlier, had led us to make the ‘one artifact, one protocol’ stipulation, namely, the urge to use concepts indiscriminately, to blur or ignore distinctions, which has
gained so much impetus and become so prevalent in the course of the cognitivist
movement. But instead of issuing a new stipulation, we were now able to express
the criteria much more succinctly. What makes a coordinative protocol what it is
(in addition to its specific coordinative function, of course), is not that it has a
specific bond to an artifact, but that it is a specific kind of literate practice (that
serves a specific coordinative function). In other words, coordinative practices are
specialized practices that in turn presume an entire range of other, equally historically specific, literate practices.
In our effort to get a grip on the specifics of the practices we were trying to understand, the work of Ludwig Wittgenstein and Gilbert Ryle again and again
helped us to stave off imminent confusion. Furthermore, in our attempt to disentangle the web of interlaced semiological practices as (elements of) coordinative
practices and to do so meticulously and accountably, the work of the British ‘integrational linguist’ Roy Harris has proved to be immensely valuable (cf., e.g.,
Harris, 1986, 1995, 2000).
5 The term ‘principle of historical specification’ was introduced by Karl Korsch (1938).
As this work progressed and matured, Ina Wagner and I, in our effort to embrace the multifarious nature of coordinative practices in contemporary workplaces as exemplified in the work of architects, developed an approach in
which coordinative artifacts and protocols in their infinite variety are taken as the
point of departure, without any presumption that they bond or have to bond in
specific ways. However, in going beyond the concept of coordination mechanisms, the concepts of coordinative artifacts and protocols were not abandoned at
all. They are applied, not as mere subordinate elements of coordination mechanisms, but in an open-ended way, as observable and reportable phenomena. Similarly, the concept of the interlinkage of coordinative protocols and artifacts is not
abandoned either but is rather, again, applied in an open-ended way. The emphasis is on how myriads of coordinative protocols and artifacts are related and connected in different ways and in an intricately recursive manner, and how they
form more or less well-defined and more or less tightly coupled clusters. We call
such clusters ‘ordering systems’ (Schmidt and Wagner, 2004).6 The concept is
related to the concept of interlinked coordination mechanisms but does not grant
privileged status to a certain kind of coordinative protocol and artifact, nor does it
stipulate a strict pairing of the elements. The purpose is to support the analyst in
embracing the motley of coordinative practices required in highly complex cooperative work settings.
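By way of illustration only (the rendering is mine, not a formal definition from the article), the contrast with the ‘one artifact, one protocol’ stipulation can be indicated schematically: an ordering system appears as a cluster in a many-to-many graph of protocols and artifacts, with no stipulated pairing between the elements.

```python
# Illustrative sketch only: an 'ordering system' rendered as a cluster in a
# many-to-many graph of coordinative protocols and artifacts. The protocol
# and artifact names are invented examples.

from collections import defaultdict

links = [  # (protocol, artifact) pairs observed in a hypothetical setting
    ("meeting scheduling", "agenda"),
    ("meeting scheduling", "calendar"),
    ("meeting scheduling", "floor plan"),   # one protocol, several artifacts
    ("document validation", "drawing"),
    ("document naming", "drawing"),         # one artifact, several protocols
]

protocol_to_artifacts = defaultdict(set)
artifact_to_protocols = defaultdict(set)
for protocol, artifact in links:
    protocol_to_artifacts[protocol].add(artifact)
    artifact_to_protocols[artifact].add(protocol)

# No strict pairing is presumed: the analyst simply records which protocols
# and artifacts are observably interlinked, and the clusters that emerge in
# the graph are candidate 'ordering systems'.
```

The design point of the sketch is the relaxation itself: where the earlier stipulation made each mechanism a pair, the graph leaves the bonding of protocols and artifacts an open, empirical question.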
5. CSCW’s radical program
As this progress report comes to an end, a few words on the rationale of the entire
program are called for. Why would a conceptual foundation for CSCW be needed
in the first place? Why the systematically conceptual approach that has been pursued so doggedly? Why not simply build computational artifacts that can be put to
good use here and now?
Firstly, has the strategy led me the wrong way? Does the complexity of the research program I have been developing, as evidenced by its still unfinished state,
the openness of the whole thing at this stage, indicate that the strategy is somehow
deeply flawed, perhaps even mistaken? Might it not, after all, be wiser to reconsider the alternative strategy, the one of developing technologically advanced
‘spaces’ of whatever kind for unregulated interaction? Well, that would amount to
not answering the question that was asked and answering another one instead.
However useful such ‘spaces’ might be for many purposes, they do not offer anything like a strategy for the complex cooperative work settings, the factories and
hospitals, pharmaceutical laboratories and architectural offices, for which specialized coordinative protocols and artifacts, ordering systems, and galaxies of ordering systems are vitally indispensable. Practitioners simply develop these practices in order to somehow master the complexities of their interdependent work.
6 We began using the term ‘ordering systems’ at the same time as the Nestor of systematic zoology, the late Ernst Mayr, began using it in exactly the same way (E. Mayr and Bock, 2002).
Furthermore, the apparent simplicity of the ‘awareness space’ strategy, if one
can call it that, is deceptive. The problem with the notion of ‘awareness’, as it is used in CSCW, is not just that it is poorly understood or that it has barely been defined. The problem is, I have slowly come to realize, that it grows out of an effort to give an explanation where none is needed.7 The notion of ‘awareness’ is
used as a proxy for a mental state of some kind (‘awareness information’, or what
have you) that the individual produces and that then somehow prompts the individual to adjust his or her conduct accordingly.
As the word ‘awareness’ has been used in CSCW, by myself and many others,
it is what Gilbert Ryle calls a ‘heed concept’ (1949, pp. 135 ff.). It does not explain a performance by reference to some occult preceding state; it characterizes
it. Just as the term ‘intelligence’ cannot be used to explain smart conduct, but rather is used to characterize the conduct in question as inventive, ingenious, clever,
diligent, adroit, imaginative, cunning, or whatever is meant in the context, the
term ‘mutual awareness’ does not explain anything, nor does it stand proxy for an
explanation, but is rather used for describing that a particular cooperative activity
is successfully aligned and meshed, and that this was accomplished effortlessly
and inconspicuously, without conversations, queries, commands, gesturing,
emails, or other kinds of interruptive interaction.
However, the term that was picked for this job, the term ‘awareness’, is not a
‘heed concept’ at all. One can be aware of things that one does not take into account (or heed) in one’s actions. The fact that I did not heed some good advice could, but does not necessarily, imply that I was not aware of it: I could have ignored the advice for all sorts of reasons. ‘Awareness’ is an ‘attention concept’. It was probably picked to do the job for which it has been used because ‘being aware’ is close to ‘realizing’, ‘being conscious of’, and ‘noticing’ but is still used quite differently. Alan White, one of Ryle’s colleagues, has elaborated the difference well:
‘What one is conscious of or what one realises, one must be aware of. But one can become aware
of things otherwise than by realising them and one may be aware of them even when one is not
conscious of them. […] Being aware of something is entirely different from noticing something.
We may become, remain and cease to be aware whereas we either notice or fail to notice. We can
be continuously aware but we cannot continuously notice. “Noticing”, but not “being aware”, signifies an occurrence.’ (White, 1964)
This is not the place to elaborate these distinctions.8 The point I am trying to make is that ‘awareness’ in CSCW has been used as a heed concept and that it has, at the very same time, had all the usual connotations of ‘awareness’: that one can ‘be aware’ without necessarily noticing whatever it is one is or becomes aware of and without realizing it. That is, ‘awareness’ has been doing not just one but two jobs in CSCW: the job of a heed concept and the job of an attention concept. Hence, I submit, all the confusion. ‘Awareness’ has been used in two ways: officially, to describe that somebody is acting in accordance with the situation, and, unofficially, to imply that this is accomplished because of some un- or subconscious processes or mechanisms or some particular mental state. As a result, it has made us look in the wrong direction. It has made us search for a mental intermediary where none normally is.
7 I have sketched a critique of the notion of ‘awareness’ in CSCW in the article ‘The problem with “awareness”’ (2002b), but that critique is merely an outline of what in my view needs to be done in this matter (cf. also Schmidt and Simone, 2000).
8 Alan White’s work is both an excellent introduction to Ryle and an essential corrective (cf., e.g., White, 1964, 1967, 1982).
Cooperating actors mutually heed what the others are doing, and do so effortlessly and without interrupting ongoing work, because they (normally) know the work and hence know what the others are doing, could be doing, should be doing, would not be doing, and so on. They know the drill. Heeding what goes on is part of competent conduct. Their heeding is also effortless and seamless because work (normally) takes place in a material setting that is replete with indications of states and processes and activities that, for competent actors, (normally) make it straightforward to assess the state of affairs and know what adjustments might be called for.
Now, if this argumentation holds, this means that we, instead of searching for
putative intermediate mental states, should try to identify the strategies competent
cooperating actors employ to heed what colleagues are doing etc. How do they
discriminate significant states, possible states, problematic states, etc.? What do
they monitor for in the setting? What is ignored as irrelevant, what is taken into
account? And so on. As the next step, this then leads to constructive explorations
in an entirely different direction than that of ‘space’ technologies (no pun intended), namely, in the direction of finding ways in which these very specific monitoring strategies can be supported (by sensor technology, for instance). The conclusion is, then, that in order to support ‘mutual awareness’ adequately, the strategy
would be to develop novel kinds of protocols, based on in-depth analyses of the
perceptual and similar coordinative strategies of cooperating actors in complex
settings, that among the actors convey selected indications about, for example,
significant patterns of states. Ironically, then, instead of abandoning our program
of developing technologies to support the coordinative protocols that have been
developed in complex cooperative work settings, we should rather explore ways
of complementing ‘natural protocols’ with ‘artificial’ coordinative protocols.
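As a toy illustration of this direction (the sketch, the state patterns, and the names are entirely my own invention, not a proposal from the text), such an ‘artificial’ coordinative protocol would convey only selected indications of significant state patterns, derived from analyses of what competent actors actually monitor for, rather than a general ‘space’ of awareness information.

```python
# Toy illustration (hypothetical throughout): convey only selected
# indications of significant state patterns to the actors concerned,
# rather than broadcasting everything. The monitored patterns stand in
# for the results of in-depth analyses of actors' monitoring strategies.


def significant(states: dict) -> list:
    """Return indications only for patterns a competent actor would
    monitor for; the two patterns below are invented examples from an
    imagined shop floor."""
    indications = []
    if states.get("buffer_level", 0) > 90:
        indications.append("upstream buffer nearly full")
    if states.get("station_3") == "halted" and states.get("station_4") == "running":
        indications.append("station 4 will starve shortly")
    return indications


sensor_readings = {"buffer_level": 95, "station_3": "halted", "station_4": "running"}
for msg in significant(sensor_readings):
    print(msg)  # only the selected, significant indications reach the actors
```

Everything that falls outside the monitored patterns is deliberately ignored, mirroring the point that competent actors attend selectively, not to the setting as an undifferentiated whole.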
The program of CSCW, when taken seriously, is indeed an ambitious one. As
argued repeatedly over the years, and perhaps most explicitly in my remarks on
‘The critical role of workplace studies in CSCW’ (2000), the problem for CSCW
is a radical one. If CSCW is to deliver on its promise and develop technologies
that can support cooperative work as it exists out there, in laboratories, factories,
and hospitals, etc., the field must be able to offer technologies that enable ordinary workers to do, in computational environments, what they do now: express
coordinative protocols and construct coordinative artifacts. In contrast to the coordination technologies that do exist and the systems that are being rolled out, the
problem for CSCW is that of developing technologies that enable ordinary actors
to construct computational devices by means of which they can (a) regulate their
own complex cooperative activities more efficiently, safely, dependably, etc. and
(b) at the same time control the regulation of their activities.
More than that. If such means are not developed, the ongoing diffusion of information technologies in working life will undoubtedly cause all sorts of impediments and disruptions, as the ability of workers to manage their everyday coordinative concerns is eroded or lost. These prospects are already becoming reality.
The proliferation of network protocols (ftp, email, http, chat, instant messaging,
etc.) already creates parallel and mutually exclusive communication channels, which invariably impedes mutual awareness in ways strikingly reminiscent of classical multi-user database systems, which were carefully designed to shield actors from experiencing the presence of others and thus to create a setting in which actors can only monitor the activities of colleagues with great difficulty and outside of the system (Rodden et al., 1992). This problem is not alleviated but rather
compounded by rigid coordinative protocols that are incorporated in groupware
systems and in group calendar systems, scheduling systems, booking systems,
document management systems, CAD systems, electronic patient records, etc.
The various protocols are of course specialized constructs that cannot simply be
unified for all purposes. The problem is that, as it is, actors are left with little or
no means of practical integration of the various protocols to get the job done, locally and temporarily — other than doing it manually. Thus, without a rigorous
conceptual foundation, collaboration technologies are at risk of becoming part of
the problem rather than the solution.
6. For lack of a conclusion
The research represented by this book never had the character of a research project. It was never that well delimited. It was rather a research program, or even
more to the point: the research primarily aimed at developing a research program.
This is reflected in the subtitle of the book: Contributions to the conceptual foundations of CSCW.
A conclusion in the standard sense is thus not appropriate. Anyway, this much
has, after all, been achieved:
(1) Cooperative work has been identified as a phenomenon we can study systematically, as a category of work practice, distinct from its organizational and
socio-economic form, and irrespective of what mutual feelings of companionship
actors may or may not have. That is, cooperative work practices have been made a
researchable phenomenon.
(2) This in turn has cleared a path for making coordinative practices, their
methods and techniques, a researchable phenomenon as well.
(3) The research is grounded in investigations of specialized coordinative practices in different settings and has identified key features of these practices: the
central role of coordinative artifacts and their associated protocols. It has shown
that coordinative practices often – if not generally – involve entire clusters of such
coordinative artifacts, ‘ordering systems’, and that there is what could be called a
higher-order logic to this clustering, namely, that the same general schemes and
notations are reused and recombined endlessly.
(4) These investigations may also serve, in a loose way, as examples of how
investigations of coordinative practices might be performed. Perhaps the main
contribution of the research lies here, in offering, certainly not a paradigm, but
some examples that other researchers may want to emulate, extend, develop.
(5) It has finally been demonstrated that it is – in principle, at least – technically feasible to create computational environments by means of which ordinary
workers, not programmers, can define and execute ‘coordination mechanisms’ in
a fully distributed and flexible manner. For this purpose a notation for defining
‘computational coordination mechanisms’ was specified. However, the technical
aspects of this research of course lie outside the scope of the present book.
The research program represented by the articles collected in Part II of this
book has made substantial progress but is far from finished. It has only just begun.
Many issues are still open, even wide open.
Part II
Surveying the connections
‘One difficulty with philosophy is that we lack a synoptic view.
We encounter the kind of difficulty we should have with the geography of a country for which we had no map, or else a map of isolated bits. We can walk about the country quite well, but when we
are forced to make a map we go wrong. A map will show different
roads through the same country, any one of which we can take,
though not two, just as in philosophy we must take up problems
one by one though in fact each problem leads to a multitude of
others. We must wait until we come round to the starting point
before we can proceed to another section, that is, before we can
either treat of the problem we first attacked or proceed to another.
In philosophy matters are not simple enough for us to say “Let's
get a rough idea”, for we do not know the country except by knowing the connections between the roads. So I suggest repetition as a
means of surveying the connections.’
Wittgenstein, The Yellow Book (1934-35, p. 45)
Part III
CSCW reconsidered
‘We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.’
T. S. Eliot: Four Quartets
Chapter 11
Formation and fragmentation
There is an old Danish maxim, befitting a nation of seafarers: ‘When there is
wind, it’s time to make sail’. For CSCW, now is such a time.
CSCW is unique in being not merely an interdisciplinary research field but a
field of technological research that depends critically on in-depth studies of actual
cooperative work practices in material settings. In accordance with this commitment CSCW has articulated and undertaken a critical examination and revision of
fundamental assumptions and tenets in computing concerning socially distributed
control of computationally distributed systems. At the same time, and by virtue of
this commitment to development of technology, CSCW has been a major force in
developing an understanding of work practices that has upset and overthrown the
intellectualist and mechanistic (or ‘cognitivist’) notions and theories of orderly
activities that only one or two decades ago seemed unassailable and unquestionable. In the process, conceptual frameworks and investigative strategies and techniques have been developed that help us to home in on the ways in which mundane
artifacts and clusters of artifacts are deployed and developed by practitioners. This
has set a new and very high standard for rigorous analysis of actual work practices.
On the other hand, however, many signs indicate general perplexity and a sentiment that the research program that has brought the field this far no longer offers
reliable directions. New computing technologies are redefining the general technological matrix in which CSCW was originally formed and have spawned new
areas of technological research. For example, technologies such as wireless networks (GSM, WiFi, Bluetooth), positioning technologies (GPS, RFID, etc.), sensor and actuator technologies, handheld and wearable devices, etc., have given
rise to research areas next to CSCW such as ‘ubiquitous’ or ‘pervasive’ computing, while the very same technologies obviously offer great potential for one of
the central issues in CSCW, namely the support of ‘mutual awareness’. Is that an
indication that CSCW is, or is on the way to becoming, a thing of the past (cf., e.g.
Crabtree, et al., 2005)? Similar concerns are engendered by the development of
technologies such as high-level computational notations for ‘business process
modelling’ (e.g., BPEL), ‘peer-to-peer’ protocols, ‘service-oriented architectures’
(SOA), and so on (cf. van der Aalst, 2007).73 In the course of two decades, the
technological matrix of CSCW has become extremely heterogeneous. Moreover,
new application areas for ‘collaborative computing’ (broadly defined) have developed, such as, for instance, large-scale cooperative efforts in scientific communities (‘e-science’, ‘cyberinfrastructures’).74 In return, these application areas are
bound to have significant influence on ‘collaborative computing’ (e.g., computational ‘ontologies’), but will undoubtedly also add to the heterogeneity of the matrix.75
These centrifugal forces are strengthened by a surge of studies of the various
ways in which well-known ‘collaboration technologies’ are being used: the new
patterns and styles of social interaction that reportedly can be observed among
people communicating ‘on-line’ by means of what is often — awkwardly —
called ‘social software’ or ‘social media’, a motley of protocols, services, and facilities ranging from simple protocols such as instant messaging and chat to blogs
to ‘social networking’ services and facilities such as Facebook, Twitter, Second
Life, Flickr, YouTube, MySpace, etc. These new ‘collaboration’ facilities, currently appearing in rapid succession, have enchanted sectors of the general public.
It is hardly surprising, therefore, that this surge of public interest would captivate
CSCW researchers as well; for it is of course quite interesting to sociologists and
psychologists alike when a mass audience discovers and appropriates a technology, especially one that enables people to experience new forms of intercourse and
interaction. However, the focus of this line of research is on the new forms of social life and human interaction these resources are reported to afford and engender. For CSCW, this represents a major problem shift inasmuch as the commitment to development of technology is receding. By focusing on the presumptive
‘effects’ of the various ‘social’ software and media designs, or on the new forms
of interaction they afford, this line of research has, for all practical purposes,
abandoned the ambition of contributing constructively, systematically, or simply
accountably to the development of new technology. In short, this research is reactive with respect to technology and design (cf. Schmidt, 2009). In a related development, to some extent overlapping, CSCW’s focus on ‘work’ is increasingly
considered a historical relic of no great importance (Crabtree, et al., 2005). On
this view, it is considered of little or no import whether the envisioned ‘collaboration technologies’ are used for ordinary work in hospitals and factories or for
games and gossip (for a critique, cf. Schmidt, 2010).
73 For an impression of the rich variety of collaborative computing technologies that are currently being investigated, cf. the recent conferences on Computer Supported Cooperative Work in Design (e.g., Shen, et al.,
2008). (Cf. also: http://www.cscwd.org/).
74 These research programs have been outlined by the Atkins committee under the US National Science
Foundation (Atkins, et al., 2003) and by the e-Infrastructure Working Group under the Office for Science and
Innovation in the UK (2004). These new application areas are already the object of intense interest in CSCW
(cf., e.g., Jirotka, et al., 2006; Lee, et al., 2010).
75 There is a considerable literature on ‘ontologies’. For an overview, cf. Gruber’s short article (2009). For
initial studies of the actual construction of computational ‘ontologies’ by members of scientific communities
cf. the studies by Dave Randall, Wes Sharrock and their colleagues (Randall, et al., 2007b).
The field is in flux. There is confusion as to the direction of the field. There is
even confusion as to what constitutes an interesting and relevant problem or a valid contribution. The field is becoming fragmented. It is time to reappraise the
field. The objective cannot be to restate or rephrase the traditional definitions of
CSCW, for the fragmentation is a clear indication that the received understandings of the field’s program and its conceptual foundation are now deficient. We
need to reconsider CSCW’s research program.
It is customary to say that CSCW emerged in the late 1980s. And it is of course a
fact that CSCW as an institutionalized research field was formed at this time. And
indeed, this was no accident. CSCW formed in response to the possibilities and
challenges posed by the then novel network technologies as represented by the
Internet. All this is true and important. However, the problem with this way of
framing CSCW historically is that it elides the central role that the challenges facing cooperative work practices have played in the development of computing
technologies in general. It tempts us to think of CSCW as a field concerned with
the application of an already existing technology and with its effects.
The coupling of computing technologies with cooperative work is not a historical accident, nor is it, for that matter, a recent event. The challenges of enabling
the actual formation of cooperative work relations in the first place, or of reducing
the complexity of coordinating cooperative work, or even of eliminating the overhead cost of coordination by making cooperative work superfluous by automating
the work process — have been significant motivating forces in the development
of machine technologies in general and of computing technologies in particular. It
is therefore quite appropriate to start there.
‘Collaborative computing’, understood broadly and informally as computing
technologies that facilitate, mediate, or support collaboration in general and cooperative work in particular, is not yesterday’s child. Understood this broadly,
‘collaborative computing’ is about as old as travelling by jet airplane. However,
continuities notwithstanding, ‘collaborative computing’ has undergone major
shifts of technological paradigm. And for the purpose of reconsidering CSCW’s
research program, identifying and articulating these paradigm shifts are a crucial
first step. The aim of this chapter is to do just that. In fact, the bulk of the chapter
is devoted to an outline of the major shifts in the prehistory and emergence of
computing technologies, from the development of the concept and techniques of
systematic division of labor in the 18th century to the development of the technologies of mechanical control in the Industrial Revolution, to the development of
the ‘universal’ stored-program computer in the middle of the 20th century as a
generic control mechanism, to the development of real-time computing during the
Cold War as a means of facilitating large-scale cooperative work.
This rather extensive discussion of technological developments and shifts that
predate CSCW by decades and even centuries will show that CSCW, with benefit,
can be seen as an endeavor in continuation of the long tradition of development of
work organization and technology. More than that, the outline of technological
development will sketch the background against which the practice-oriented research program of CSCW is to be understood: computing technologies offer unparalleled means of constructing machine systems that facilitate, mediate, and
regulate cooperative work — unparalleled, not just in terms of speed of operation
but also, and most importantly, in terms of operational flexibility and cost of construction and modification of designs. This has, in turn, made it technically and
economically realistic to develop types of machine system, coordination technologies, that support cooperative work by regulating interdependent activities
(workflow systems, etc.) but would have been practically impossible until about
20 years ago. Seen in this light, CSCW is not an avant-gardist fancy but, rather, a
research effort that arises from practical concerns of ordinary work organizations.
Having recounted the prehistory and formation of CSCW, I turn to the development of the different research programs in CSCW. I will first recount the story
of the development, prior to the official inauguration of CSCW, of research programs concerned with ‘computer-mediated communication’ and ‘office automation’. By the time when CSCW formed as an institutionalized research field, these
research programs had landed in a situation where they were seen as inherently
flawed. Key researchers in the constituent communities realized that these research programs, in different ways, were based on assumptions about cooperative
work practices that were fundamentally problematic, both conceptually and methodologically. This prompted what the philosopher of science Imre Lakatos (1970)
has termed a ‘problem shift’. A research program was developed and articulated
in which ethnographic and other kinds of in-depth workplace studies would play a
key role in developing computing technologies that are adequate for the task of
‘supporting’ cooperative work by uncovering and conceptualizing the logics of
cooperative work practices.
After having outlined the main stages of this development, I will move on to
discuss the accomplishments of this practice-oriented program: or rather, its
patchy and tentative achievements. I will here focus on the complicated and apparently puzzling relationship between ethnography and the development of technology; the reason being that much of the confusion about CSCW’s program may
be due to simplistic notions about the relationship between ethnography and development of technology in CSCW.
My aim in all this is not to write a history of CSCW but to outline the space
CSCW occupies in the greater scheme of things.
1. Cornerstones: The concepts of ‘practice’ and ‘technology’
Concepts are institutions. They change over time; not by fiat but as the accumulated result of their distributed use — sometimes coinciding, sometimes contradictory — in the normative activities of people. In the words of John Austin, ‘Our
common stock of words embodies all the distinctions men have found worth
drawing, and the connexions they have found worth marking, in the life-times of
many generations’ (Austin, 1961, p. 130). The concepts of ‘practice’ and ‘technology’ come with a suite of connotations and references, indeed a load of baggage, that we can only ignore at our peril, for then we do not know what we are in
fact saying.
That is, we first of all have to briefly clarify the concepts of ‘technology’ and
‘technology development’ and their relationship to concepts such as ‘artifact’,
‘system’, and ‘design’ — and how these concepts are related to the concept of
‘practice’.
In these matters, gross simplifications abound. While ‘technology’ is, to the practicing engineer, a complex of principles, models, and concepts, and, to the
anthropologist, the material aspect of a culture, it is, to the sales manager of an
electronics store, the gadgets on the shelves. This is confusing enough but can,
with a little care, be handled as an instance of peaceful coexistence of diverging
concepts. But with the notion of ‘technology development’ the situation is one of
outright strife. For some, technology is simply the practical brother of science,
with science providing the theories and models and with technology merely ‘applying’ the insights of science by ‘deriving’ technical principles etc. from scientific theories. For others, technology is not only a scientific discipline in its own
right but, what’s more, the breadwinner of the family of the sciences. And for others again, technology is largely parasitic on the accumulated practical experience
of ordinary workers. The first camp will refer to science-based technologies such
as semiconductors and pharmaceuticals to justify their claims, while the third
camp will refer to the record of several thousand years of history of technological
development (agriculture and metallurgy, the wheel and the steam engine). Passions are running high, so high, indeed, that it is tempting to think of it all as
simply different groups of social actors clamoring for honor, attention, funding,
patent royalties, and a place in the sun. Anyway, in this ideological fog of scientism against romanticism against technocratic exploiters of both, the appreciation
of the enormous complexity and variation of technological development is trampled flat. However, over the last few decades historians of technology have managed to develop a quite differentiated picture. Since CSCW has had its share of
simplistic notions about technology development, an understanding of the complexities and variability of technology development will be essential for the purpose of reconsidering CSCW as a technological research area and of identifying
the technical paradigms and program shifts that have characterized its course so far.
With the concept of ‘technology’ we are fortunate in that the pedigree is fairly
well documented. It originates as part and parcel of the ‘practice turn’ initiated by
the intellectual movement of the 17th and 18th centuries that is generally referred
to as the Enlightenment. The concept of ‘practice’, by contrast, was developed in
the course of life-times of very many generations and has grown out of a tradition
that goes back to early thinking about work, the nature of work, and the role of
work in human life and in the very concept of humanity, and, as such, it is a tradition that has followed a course determined by both practical and intellectual concerns. And, indeed, the modern concept of ‘practice’ was developed in intimate
relationship with the development of the concept of ‘technology’.
The concept of praxis (or practice) derives from antique Greek thinking of the
4th century BCE. Both Plato and Aristotle made a distinction between theoria
(contemplative activity), poesis (making, production), and praxis (mere activity).
While the distinction between theoria and praxis does not seem alien to modern
readers, the one between poesis and praxis certainly seems odd. The distinction
reflects the extremely sharp class divisions that characterized Greek society of
that time.76
In his Nicomachean Ethics Aristotle stated that the concepts of ‘making [poesis] and acting [praxis] are different’, pointing out that different kinds of reasoning are involved: ‘the reasoned state of capacity to act is different from the reasoned state of capacity to make.’ He was also careful to point out that praxis is not
subsumed under poesis; they are of different kinds: ‘they are not included one in
the other; for neither is acting making nor is making acting.’ On the other hand,
poesis involves art (techne): ‘art is identical with a state of capacity to make, involving a true course of reasoning’ (Nic. Ethics, 1140a).
Now, in contrast to Plato, Aristotle did not belittle experience. It is, Aristotle
remarked, ‘through experience’ that ‘science and art come to men’. This is
achieved by abstraction: ‘art arises when from many notions gained by experience
one universal judgement about a class of objects is produced’ (Metaphysics,
981a). In the same vein, he recognized the importance of experience in an ordinary line of action such as treating a patient:
‘With a view to action experience seems in no respect inferior to art, and men of experience succeed even better than those who have theory without experience. (The reason is that experience is
knowledge of individuals, art of universals, and actions [praxis] and productions [poesis] are all
concerned with the individual […])’ (Metaphysics, 981a)
In the art of medicine, that is, a man that has ‘theory without experience’ is likely
‘to fail to cure’, because he does not know the individual or particular instance
included in the universal, that is, because he is unlikely to be able to deal with the
contingencies of the particular case.
However, torn between, on one hand, a striving for knowledge of the divine
and universal ‘forms’ of which particular things are but instantiations, and on the
other hand a redeeming curiosity in the rich multiplicity of the world of experience, that is, trapped in an insurmountable dichotomy of the universal and the particular, Aristotle’s thinking exhibited great strain. So, having acknowledged the
importance of experience for useful mundane purposes, he went on to apply the
same dichotomy of the universal and the particular to his analysis of the role of
experience:
76 The authority on the question of slavery in ancient Greek society is Finley (Finley, 1959, 1973). For an
updated synthesis, cf. Garlan’s study (1988). For Aristotle’s view on slavery more specifically, cf. the penetrating study by Garnsey (1996). Wiedemann (1981) has produced a comprehensive collection of antique
texts on slavery. — My understanding of the relationship between theoria and praxis in Aristotle is indebted
to Farrington (1944/49), Redlow (1966), and Bien (1968-69, 1989).
‘But yet we think that knowledge and understanding belong to art rather than to experience, and
we suppose artists to be wiser than men of experience […]; and this because the former know the
cause, but the latter do not. For men of experience know that the thing is so, but do not know why,
while the others know the ‘why’ and the cause. Hence we think also that the master–workers in
each craft are more honourable and know in a truer sense and are wiser than the manual workers,
because they know the causes of the things that are done (we think the manual workers are like
certain lifeless things which act indeed, but act without knowing what they do, as fire burns, – but
while the lifeless things perform each of their functions by a natural tendency, the labourers perform them through habit); thus we view them as being wiser not in virtue of being able to act, but
of having the theory for themselves and knowing the causes.’ (Metaphysics, 981a-b)
The statement, remarkable to us, that manual workers are ‘like certain lifeless
things’ that ‘act without knowing what they do, as fire burns’ is not accidental, a
slip of the pen; it expresses a view with deep roots in Greek thinking, especially in
the thinking of Plato and Aristotle and their schools. And it is this contempt in
which they held ordinary manual work, the work of carpenters and shoemakers
and ploughmen, that underlies their distinction of poesis and praxis. The slave is,
Aristotle said, ‘the minister of action’ (Politics, 1254a).
The underlying rationale for this categorization was first of all that the work of
slaves and work that could be performed by slaves, in short, manual work, consists in bodily activities to meet bodily needs, and that whatever activity involves
a significant element of bodily action, action on the condition of the material
world, is slave work, praxis. His ordering can be conceived of as a scale of relations of superiority and inferiority: ‘the rule of the soul over the body, and of the
mind and the rational element over the passionate, is natural and expedient’, and
the same applies to men and animals, and the male and the female: ‘the one rules,
and the other is ruled; this principle, of necessity, extends to all mankind’. He then
went on to consider the ordering of work activities:
‘Where then there is such a difference as that between soul and body, or between men and animals
(as in the case of those whose business is to use their body, and who can do nothing better), the
lower sort are by nature slaves, and it is better for them as for all inferiors that they should be under the rule of a master. […] Whereas the lower animals cannot even apprehend a principle; they
obey their instincts. And indeed the use made of slaves and of tame animals is not very different;
for both with their bodies minister to the needs of life.’ (Politics, 1254b)
Aristotle extended this to include any action that serves to satisfy a need, be it by
producing necessities of life or pleasure, as opposed to knowledge that does not
‘aim at giving pleasure or the necessities of life’:
‘At first he who invented any art that went beyond the common perceptions of man was naturally
admired by men, not only because there was something useful in his inventions, but because he
was thought wise and superior to the rest. But as more arts were invented, and some were directed
to the necessities of life, others to its recreation, the inventors of the latter were always regarded as
wiser than the inventors of the former, because their branches of knowledge did not aim at utility.
Hence when all such inventions were already established, the sciences which do not aim at giving
pleasure or the necessities of life were discovered, and first in the places where men first began to
have leisure. This is why the mathematical arts were founded in Egypt; for there the priestly caste
was allowed to be at leisure.’ (Metaphysics, 981b).
That is, the criterion of Aristotle’s ranking of activities is the inverse of utility.
Inventions aiming at giving pleasure are in turn topped by ‘the sciences which do
not aim at giving pleasure or at the necessities of life’, science pursued in order to
simply know. This ranking of activity and knowledge reflects two related circumstances. Aristotle took for granted that the arts that ‘aim at giving pleasure or the
necessities of life’ had already completed their task. The very idea of technical
development beyond what had already been accomplished, and hence the notion
of building theoretical development upon practical experience, was alien to this
view. This view, in turn, is intimately related to his political philosophy and his
effort to perpetuate the system of slavery: ‘as the man is free […] who exists for
his own sake and not for another’s’, that is, the master as opposed to the slave, ‘so
we pursue this as the only free science, for it alone exists for its own sake’ (Metaphysics, 982b).
Finally, as a third criterion for the ranking, Aristotle mobilized the intellectual
incapacity of the slave: ‘in general it is a sign of the man who knows, that he can
teach, and therefore we think art more truly knowledge than experience is; for artists can teach, and men of mere experience cannot’ (Metaphysics, 981b). Manual
workers, ‘men of mere experience’, rank low, in Aristotle’s view, not because
their work is deficient in some way (what they do is obviously generally successful: ‘men of experience succeed even better than those who have theory without
experience’), but because they cannot explain what they are doing in terms of
‘first causes and principles’.
In sum: ‘the man of experience is thought to be wiser than the possessors of
any sense-perception whatever, the artist wiser than the men of experience, the
master–worker than the mechanic, and the theoretical kinds of knowledge to be
more of the nature of Wisdom than the productive.’ (Metaphysics, 981b-982a).
And ‘the slave has no deliberative faculty at all’ (Politics, 1260a). The distinction
between theoria, poesis, and praxis is an expression of this ranking scheme.
Where Plato and Aristotle (and the generations of Christian scholastics and
theologians who followed in their footsteps) praised bios theoretikos, the Renaissance thinkers of the new and rapidly expanding world of bourgeois society had
an entirely different agenda. They certainly did not subscribe to the idea that technical knowledge had achieved what could be achieved. They had things to do in
this world. Francis Bacon, for example, to take perhaps the clearest voice among
them, rejected the notion, received from ‘the ancients’, that anything useful at all
could be accomplished when men, in ‘mad effort and useless combination of forces’ ‘endeavor by logic (which may be considered as a kind of athletic art) to
strengthen the sinews of the understanding’ (Bacon, 1620, Preface). Thus, in explicit contradiction of Plato and Aristotle, Bacon argued that theory and practice
are equals, so to speak, and was thereby able even to conceive of theorizing
proved wrong in practice: ‘sciences fair perhaps in theory, but in practice inefficient’ (Bacon, 1620, §II:xlv):
‘Although the roads to human power and to human knowledge lie close together and are nearly the
same, nevertheless, on account of the pernicious and inveterate habit of dwelling on abstractions it
is safer to begin and raise the sciences from those foundations which have relation to practice, and
to let the active part itself be as the seal which prints and determines the contemplative counterpart.’ (Bacon, 1620, §II:iv)
On this view, ordinary working practices and practical knowledge were no longer
categorially separated from scientific knowledge. It was conceivable to ‘begin and
raise the sciences from those foundations which have relation to practice’. However, Bacon’s ‘practice turn’ was of course largely programmatic. Production was
craft-based and science immature: Galilei had just started his career when Bacon
published his Novum Organum.
However, a century or so later, when Denis Diderot, together with d’Alembert,
edited the famous Encyclopédie, ou dictionnaire raisonné des sciences, des arts et
des métiers (1751-66), the relationship between science and practice that Bacon
had vaguely sensed and promulgated was becoming reality. Diderot thus wrote an
article on ‘Arts’, i.e., the practical crafts, arts, techniques, and sciences, for the
first volume of the Encyclopédie in which he, following Bacon, flatly observed
that ‘It is man’s work [l’industrie de l’homme] applied to the products of nature’,
his effort to satisfy ‘his needs’, ‘that has given birth to the sciences and the arts’
(Diderot, 1751, pp. 265 f.). He then went on to describe the relation between ‘theory’ and ‘practice’ as a reciprocal one:
‘every art has its speculation and its practice: the speculation is nothing but the idle knowledge of
the rules of the art, the practical aspect is the habitual and unreflective application of the same
rules. It is difficult, if not impossible, to develop the practice without speculation, and, reciprocally, to have a solid grasp of the speculation without the practice. There are in every art with respect
to the material, the instruments, and the operation a multitude of circumstances which can only be
learned in practice [usage]. It is for practice to present difficulties and pose phenomena, while it is
for speculation to explain the phenomena and dissolve the difficulties; from which it follows that
hardly anyone but an artisan who masters reasoning can talk well about his art.’ (Diderot, 1751, p.
266, emphases deleted).
To illustrate his argument, Diderot discussed the relationship between academic
geometry and the practical geometry as exercised in workshops:
‘Everyone will readily agree that there are few artists who can dispense with the elements of mathematics. However, a paradox, the truth of which is not immediately obvious, is that, in many situations, these elements would actually harm them if the precepts were not corrected in practice by
knowledge of a multitude of physical circumstances: knowledge of location, position, irregular
forms, materials and their properties, elasticity, rigidity, friction, texture, durability, as well as the
effects of air, water, cold, heat, dryness, etc.’ (Diderot, 1751, p. 271).
He went on to argue that, for instance, no levers exist ‘for which one could calculate all conditions’. Among these conditions are a large number that are very important in practice:
‘From this follows that a man who knows only intellectual [academic] geometry is usually rather
incompetent and that an artist who knows only experimental geometry is very limited as a worker.
But, in my opinion, experience shows us that it is easier for an artist to dispense with intellectual
geometry than for any man to dispense with some experimental geometry. In spite of the calculus,
the entire issue of friction has remained a matter for experimental and handicraft mathematics.
[…] How many awful machines are not proposed daily by men who have deluded themselves that
levers, wheels, pulleys, and cables perform in a machine as they do on paper and who have never
taken part in manual work, and thus have never known the difference in effect of one and the same machine in reality and as a plan?’ (Diderot, 1751, p. 271).
In other words, following Bacon, Diderot completely reversed the internal relationship of the Aristotelian concept-pair ‘theoria / praxis’. When we talk of ‘practice’ we no longer conceive of it as mere regular activity devoid of ‘reasoning’ and ‘deliberation’. The notional separation of praxis and poesis has been dissolved, and both the ‘capacity to make’ and the ‘capacity to act’ have been united in the modern concept of practice — united but not conflated. The modern concept of ‘practice’ expresses and is used for emphasizing the complex dialectics of general precepts and action.77
The aim of Diderot and his fellows was not merely to celebrate the practices of artisans and handicraft workers, but to find ways to improve received practices. The modern concept of ‘practice’ was developed as an integral intellectual component of this interventionist endeavor, as a conceptual resource for a movement devoted to understanding and transforming actual productive practices through
investigation and rationalization. And it was in the nexus defined by the modern
conception of practice in its relationship to the concepts of experience, techniques,
skills, and knowledge that the concept of technology was developed.
The concept of technology was developed and articulated to express and reflect
the effort of developing the practices of the arts through ‘speculation’, that is,
through systematic rationalization of the techniques applied in those practices. In
1675 the French minister of finance Jean-Baptiste Colbert, in his persistent effort
to improve the state of the economy and promote the development of manufacturing, invited the Académie royale des sciences (founded in 1666) to produce comprehensive and detailed descriptions of the ‘arts and trades’, that is, the manifold
techniques applied in the various branches of production: ‘The king wished the
Academy to work unceasingly upon a treatise on mechanics, in which theory and
practice should be explained in clear manner that could be grasped by everyone’
(Académie Royale des Sciences, 1729-34, p. 131). As expressed in the Academy’s own mémoire from 1699, the Academy ‘voluntarily accepted’ the assignment of ‘describing the crafts in their present condition’, knowing full well that
the task was ‘dry, thorny, and not at all dazzling’. The resulting Description was
to ‘penetrate to the ultimate details, although it would often prove very difficult to
acquire them from artisans or to explain them’. Thereby ‘an infinity of practices,
full of spirit and inventiveness, but generally unknown, will be drawn from their
shadows’. In this way the crafts would be preserved for posterity, but in addition
‘able men’ who lack the leisure to visit the artisans’ workshops would be able to
‘work on the perfection’ of these practices, just as the Academy itself would not
fail to remark if something might usefully be amended (Académie Royale des Sciences, 1702, pp. 117 f.).

77 Kant summarized the modern concept of ‘practice’ when, in an essay, he started out by recapitulating: ‘One calls a conceptualization of rules, even of practical rules, a theory when these rules, as principles, are thought of in a certain generality and thus have been abstracted from a multitude of conditions that nonetheless necessarily influence their application. On the other hand, one does not call just any operation a praxis; rather, only such a purposive endeavor is considered a praxis that is taken to be attained by following certain generally accepted principles of procedure.’ (Kant, 1793, p. 127).
This was of course not the first attempt to give in-depth accounts of specific
work practices based on on-site observations.78 But with hundreds of ‘arts and
trades’ to investigate and describe, the task undertaken by the Academy was of
course enormous, as it involved extensive fieldwork in different lines of trade in
different parts of the country. What is more, a systematic approach was required
and had to be developed. It is not surprising that the research progressed only glacially. In any event, after a couple of decades, a number of researchers (Billettes, Jaugeon, Carré, and others) were assigned to the task and began producing
reports to the Academy, to some extent by requesting staff at the provincial prefectures to undertake the fieldwork and submit reports based on their observations. Papers were read at the regular meetings of the Academy and then filed. In
1720, the scientist René-Antoine Ferchault de Réaumur was put in charge of the
cooperative effort, but by his death in 1757 only a few pieces of the accumulated
analyses had been made publicly available. The reason for the lack of obvious
progress — apart from the enormity of the task, of course — seems to have been
dissatisfaction with the quality of the initial analytical work which was seen as not
sufficiently systematic and accurate. In fact, some of the papers read at the Academy were not included in the Academy’s printed proceedings (cf. e.g., Peaucelle
and Manin, 2006; Peaucelle, 2007, pp. 140 ff.). However, Duhamel du Monceau,
who replaced Réaumur as project manager, succeeded in getting the publication
process organized, and from 1761 to 1788 altogether 81 treatises (about 100 volumes) were published under the title Descriptions des arts et métiers.79
The aim of all this, as re-stated in the Academy’s preamble to the Descriptions,
was not merely to ‘examine and describe in turn all operations of the mechanical
arts’ but also and equally ‘to contribute to their progress’. The Academy expected
that ‘new degrees of perfection of the arts’ would be achieved when scholars undertake the effort of investigating and developing the ‘often ingenious operations
performed by the artisan in his workshop; when they see by themselves the needs
of the art, the boundaries at which it stops, the difficulties that prevent it from going further, the assistance that one could transfer from one art to another and
which the worker is rarely expected to know.’ Subjecting work practices as they
have slowly evolved from ‘obscurity’ to systematic study, rationalizing them,
would show the competent worker a way to ‘overcome the obstacles that they
have been unable to cross’, a way to ‘invent new tools’, etc. (Académie Royale
des Sciences, 1761, pp. xvi f.).
The ‘dry, thorny, and not at all dazzling’ effort of the Academy had a huge impact: ‘there can be no doubt’ that, contemporaneously, these scientific descriptions of arts and handicrafts ‘exerted a potent influence in western Europe’ (Cole and Watts, 1952, p. 1). It first of all provided a reservoir of empirical findings for much of the contents of Diderot and d’Alembert’s highly influential Encyclopédie. Although the Encyclopédie started appearing a decade ahead of the Descriptions, many of the authors who contributed to the Encyclopédie had access to the reports of the Academy researchers or were involved in both projects, so that on many topics the Encyclopédie anticipated the published contents of the Descriptions, though typically in a less detailed and accurate form.80 From the Descriptions and the Encyclopédie, the scientific description and reconstruction of the techniques of handicraft production percolated out into French and European economic life, via dictionaries, journals, and other forms of popularization.

78 Agricola’s description of the metal trades is a case in point (1556).
79 A revised edition, collected in 20 volumes, started appearing in Neuchâtel in Switzerland shortly after (Académie Royale des Sciences, 1771-83). A German translation, also in 20 volumes, was published from 1762 to 1795 under the title Schauplatz der Künste und Handwerke (Cole and Watts, 1952, p. 18).
Furthermore, the Descriptions provided a model for scholars, demonstrating that received practices were accessible to scholarly analysis and might be much improved by application of the insights, methods, etc. of the physical, chemical, and mechanical sciences. Such systematic studies of work practices with a view to their rationalization were given the name ‘technology’ by the contemporary German
scholar Johann Beckmann (1777).81 Referring to the Descriptions des arts et métiers and similar works as the ‘most esteemed general writings on technology’ (p.
39), that is, the model of such research, he defined ‘technology’ as follows:
‘Technology is the science of the transformation of materials or the knowledge of handicrafts. Instead of merely instructing workers to follow the master worker’s prescriptions and habits in order
to fabricate a product, technology provides systematically ordered fundamental directives: how one can, for exactly these ends, find the means on the basis of true principles and reliable experiences, and how one can explain and exploit the phenomena occurring in the process of fabrication’
(Beckmann, 1777, p. 19).
Technology, he stated, provides ‘complete, systematic, and perspicuous explanations of all works, their outcomes, and their grounds’ (p. 20).
Beckmann, in his prolific research, continued the paradigm defined by the
Academy in Paris. In fact, he even integrated it in his teaching, in that his students
were taken to local workplaces: ‘One must have tried, without any preparation or
instruction, to get acquainted with factories and manufactories to know how difficult it is’ (p. a7). One of these students, Johann Poppe, continued this work and
wrote the first systematically organized history of technology.82 His brief summary of the emergence of technology as a scholarly discipline deserves quoting in
this context: ‘In the eighteenth century many scholars undertook the arduous task
of obtaining a precise understanding of handicraft, manufactures, and factories.
Some have even made it into a specific research topic. One had become sufficiently convinced of the benefits that these efforts would have for the manual worker himself and for many managers in public office.’ (Poppe, 1807, p. 62 f.)

80 The competition posed by the Encyclopédie may have provoked a ‘sense of urgency’ at the Academy and encouraged it to speed up its own publication effort (Cole and Watts, 1952, p. 10).
81 Beckmann (1777, p. 20) mentioned that he had suggested the term, with some trepidation, in 1772, namely in a review of a book by a French upholsterer on the principles of this trade: ‘Technologie oder Kenntnis der Handwerker’, or ‘technology or knowledge of the craftsman’ (1772, p. 309). (On Beckmann, cf. Exner, 1878; Beckert, 1983.)
82 ‘As a history of technology of substantial scope, Poppe’s book remained almost unique for a century and a half, during which nearly everyone forgot that it existed’ (Multhauf, 1974, p. 2) — with the notable exception of Karl Marx, who immersed himself in this work (Marx, 1861-63, vol. 3.1 and 3.6).
Pointing out that the ‘stipulations and customs’ that regulate handicraft ‘were often based on deficient or even non-existent principles’ and were often influenced
by ‘strong prejudices’, Poppe emphasized the role of technology: ‘Different
scholars that have obtained knowledge of several handicrafts have recently, by
virtue of their sciences, already cleared away many things that obstructed the
greater perfection of these trades’ (ibid., p. 63), and referring explicitly to the Descriptions, he stated that ‘The meticulous description of the arts and handicrafts
that the Academy brought to light belongs to the greatest scholarly works of the
18th century. It engendered several works of a similar kind, to some extent very
valuable works, not only in French but also in other languages’ (ibid., pp. 91 f.).
In short, the concepts of ‘technology’ and ‘practice’ were from birth joined at
the hip, with technology as a systematic effort to investigate and transform the
techniques applied in the practices of the useful arts. Accordingly, technology is
traditionally and usefully defined as rationalized or systematic knowledge of the
useful ‘arts’ or techniques (cf. Layton, 1974). Development of technology, then,
is essentially a systematic conceptual endeavor that results in technical
knowledge, methods, principles, etc. ‘Technology’ is an ability-word.83
The term ‘technology’ has of course been adopted for other purposes. In anthropology, for example, ‘technology’ is used as a designation for the ensemble of
techniques of a particular collective of people. In short, it is used as a synonym of
‘material culture’. Now, there is certainly an important and ineluctable element of
craft skill and know-how to the development and application of technology. But
then there is most certainly also an important element of know-how to scientific
work, as there is to all work, however ‘intellectual’ it may be. However, the problem with equating technology with technique is that this usage tends to make us
blind to those practices that are specific to science and technology: systematic application of received knowledge (theoretical or empirical), rationalization of principles and methods, rigorous testing and observation, candid reporting among
peers, replicability in experimental results, etc. For scientist and technologist
alike, ‘similar normative imperatives remain: no engineer, any more than a scientist, can get away with fudged data, obscure concepts, or imprecise, inadequately described measurements’ (Constant, 1984, p. 32). In short, technology, like science, is systematic knowledge. The difference lies in technology’s ‘emphasis on
purpose’, as the historian of technology Donald Cardwell puts it (1994, p. 486).
The technological artifact is proof that the idea actually works and works as intended.
As noted by the Academy in its preamble, techniques are ‘born in obscurity’;
they become technology only as a result of systematic analysis and rationalization. And in fact, the great majority of inventions are and have always been what Cardwell calls ‘empirical’: inventions made ‘by arranging familiar components or materials in a novel way and without resort to abstract or scientific thinking’ (Cardwell, 1994, p. 492). Inventions that spring from craft techniques may at a later stage be subjected to reflection and undergo conceptual systematization. In fact, the development of a technology by way of systematization of empirically developed techniques has, in many cases, led to major scientific insights. Thus thermodynamics was, by and large, created by engineers and not by scientists studying heat (Cardwell, 1994). ‘Whether one takes steam power, water power, machine tools, clock making or metallurgy, the conclusion is the same. The technology developed without the assistance of scientific theory, a position summed up by the slogan “science owes more to the steam engine than the steam engine owes to science”’ (Laudan, 1984, p. 10).

83 This also means that an account of technological development cannot adequately be given in the form of a list of inventions and artifacts but must be an account of the development of technical knowledge, focusing on continuity and diffusion of knowledge, including conceptual fractures and problem shifts. This, of course, also applies to an account of the prehistory and formation of CSCW.
Now, not all technologies develop in this way: born as techniques ‘in obscurity’, which then evolve through a distributed process of incremental improvements, only to be subjected, eventually, to systematic rationalization and thus
turned into a technology. Some technologies are born in the bright light of scientific insights which are then transformed into technologies: the standard example
of that is of course the dramatic history of semiconductor technology, developed on the
basis of the theories of quantum mechanics and solid state physics. Still, the transformation of articulated scientific theories and models into workable technologies
is not a straightforward process of deduction; it requires a ‘separate and additional
act of invention’ (Cardwell, 1994, p. 491). All kinds of practical issues have to be
identified, understood, and resolved in the course of transforming scientific insight into a useful technique. As vividly illustrated by the development of semiconductor technology, theories of quantum mechanics and solid state physics only
provided the essential theoretical framework for understanding the observed phenomena. Developing these insights into workable technologies required decades of innovation work by thousands of technicians. That is, according to Cardwell:
‘We have on the one hand crafts, technics and inventions; on the other hand technology, applied
science and inventions. And common to both sides we have innovation, which we interpret as the
action needed to put an invention into practice. Generally speaking, we say that inventions related
to or springing from technics and crafts do not involve systematic knowledge and are, in a sense,
empirical; inventions deriving from technology or applied science involve systematic or scientific
knowledge.’ (Cardwell, 1994, p. 4).
But, as Cardwell then points out, it is, of course, much more complicated than
that: ‘there is a great deal more to effective innovation than these simple definitions suggest’ (ibid.).
CSCW is often loosely described as a field ultimately devoted to the design of
collaborative systems. However, this manner of speaking is misleading, for the
term ‘systems design’ normally refers to the engineering practices of devising a
specific configuration of existing, typically well-known, technologies (such as
software architectures, protocols, modules, interfaces, etc.) so as to meet specific
requirements. But CSCW is not a specialized branch of practical engineering addressing the specific technical issues of designing, building, introducing, and
evaluating ‘collaborative’ systems for specific settings or types of setting; it is rather a field of research devoted to the development of technologies that system
designers can apply, together with other technologies, old or new, in building systems for cooperative work settings. The terms ‘technology development’ and ‘systems design’ are used for distinctly different types of socio-technical transformation.
The concepts of ‘invention’ and ‘design’ are similarly categorially distinct.
One could, roughly, say that ‘design’ is an intention-word, whereas ‘invention’ is
an outcome-word. The concept of ‘design’ suggests premeditation, forethought in
devising a plan, while ‘invention’ emphasizes the creation of something quite
new. We can exhibit forethought in devising an artifact when we master the required techniques and know the odds; we can then proceed ‘by design’. When it
comes to inventing something, however, we are, to some extent, in certain critical
areas, fumbling, stumbling, searching for solutions. To design an artifact we rely
on systematic technical knowledge; without it, we cannot exhibit forethought in
devising a plan for its construction.
Now, in a particular design effort there may of course be subordinate elements
of invention, just as there typically are subordinate elements of design in any invention. That is, the boundary is blurred in so far as technologies are multi-level
complexes. The point is that technological development provides the basis for design work, the general systematic knowledge which is then applied in the particular design solutions, while the incremental improvements represented by successive or competing design solutions in turn contribute to the further development
and maturation of the technology.
Technologies are, as a rule, not designed; they are developed through a series
of inventions and conceptualizations. It is typically an open-ended process with
all kinds of false starts and dead-end paths, with deliberate search as well as serendipitous discoveries, with protracted periods of incremental innovation that may
be interrupted by abrupt changes.
The conflation of technology and systems design is rooted in the notion that
technology does not fall under the broad category of knowledge but rather is a
category of artifact. A candid formulation of this position can be found in George
Basalla’s The Evolution of Technology (1988):
‘The artifact — not scientific knowledge, nor the technical community, nor social and economic
factors — is central to technology and technological changes. Although science and technology
both involve cognitive processes, their end results are not the same. The final product of innovative
scientific activity is most likely a written statement, the scientific paper, announcing an experimental finding or new theoretical position. By contrast, the final product of innovative technological activities is typically an addition to the made world: a stone hammer, a clock, an electric motor.
[…] The artifact is a product of the human intellect and imagination and, as with any work of art,
can never be adequately replaced by a verbal description.’ (Basalla, 1988, p. 30).
In some respects, Basalla’s argument is rather obscure, for instance when he says
that ‘the artifact […] can never be adequately replaced by a verbal description’: in
which sense of the words ‘adequately’ and ‘replace’ could a ‘verbal description’
possibly ‘replace’ ‘the artifact’? Is this even a meaningful proposition? What
seems to be said here is anyway oddly fetishistic, in that it hinges on the implicit
assumption that the product of scientific or technological work somehow speaks
for itself. Basalla’s central claim here is of course that what is essential to a technology is somehow embodied in the thing itself; that being a hammer, a clock, an
electric motor is somehow an intrinsic material property of the thing. But technological artifacts that are not integral to a living practice are merely a heap of junk.
Or perhaps they are on exhibit in a museum as a representation of a past technology the use of which may now be unknown. In fact, it so happens that modern archeology has had to develop experimental methods in order to understand the ‘real-life
processes’ through which the physical remnants were produced. For example, the
experimental archeologist Nicholas Toth has reconstructed the seemingly simple
task of fabricating stone hammers similar to those produced by hominids living in
East Africa between 2.4 and 1.5 million years ago. The research effort was extensive and exhausting, but the conclusion clear: ‘Although the products of Oldowan
technology are quite simple, the processes required in the hominid mind to produce these forms show a degree of complexity and sophistication: in other words,
skill’ (Schick and Toth, 1993, pp. 133 f.). In sum, ‘Technology is something much
larger than the tool itself. It refers to the system of rules and procedures prescribing how tools are made and used’ (ibid., p. 49). Or in the words of the historian
Rachel Laudan, the ‘mute presence of the remaining artifacts does not speak for
itself’, and this is the obvious reason why ‘technological knowledge can easily be
lost’ (Laudan, 1984, p. 7).
I quote Basalla because his position, though deeply problematic, has been influential and is widely cited. Moreover, Basalla’s claim has been mobilized in the
general area of Human-Computer Interaction (HCI) in an attempt to understand
the apparently surprising, and to some unsettling, observation that the technologies of interactive computing were developed well in advance of any formulated
theory of interactive computing and that the technical solutions were not ‘deduced’ from psychological theory. In 1991 John Carroll thus noted that ‘some of
the most seminal and momentous user interface design work of the last 25 years’
such as Sutherland’s Sketchpad and Engelbart’s NLS in the 1960s ‘made no explicit use of psychology at all’ (Carroll, 1991, p. 1), and that ‘the original direct
manipulation interfaces’ as represented by the Xerox Alto (released in 1973) and
the Xerox Star (1981) ‘owed little or nothing to scientific deduction, since in fact
the impact went the other way round’ (Carroll, et al., 1991, p. 79). These observations are well founded. More than that, when Carroll and his colleagues then argued that it was not embarrassing but quite legitimate for HCI to engage in post
hoc ‘scientific investigations’ of the techniques of interactive computing that had
developed incrementally through ‘emulation of prior art’ (ibid., p. 75), this was
again quite justified, for this has been and remains the normal form of technology
development.
However, Carroll and his colleagues then slip into conceiving of technical artifacts in the fetishistic manner propounded by Basalla: ‘it seems typical in HCI
for new ideas to be first codified in exemplary artifacts and only later abstracted
into discursive descriptions and principles’ (Carroll, et al., 1991, p. 79). The problem with this interpretation lies in the very notion that ‘ideas’ are somehow ‘codified’ ‘in’ ‘artifacts’. Codified? As pointed out by the molecular biologist Jacques
Monod years ago, there is nothing in the form, structure, or behavior of an object
that makes it an artifact (Monod, 1970). What makes the thing an artifact is the
practices to which it belongs, the practices for which it has been designed and
built, for which it has been appropriated, and in which it is used on a regular basis,
and one can only ‘decode’ the artifact if one knows and understands these practices
(cf. Bannon and Bødker, 1991). After all, ‘reverse engineering’ requires the same
general competencies as the engineering of a device. That is, the problem here is
that, in focusing on the artifact itself, the practices as part of which the techniques
developed and in which they have become integrated have been effectively expunged from consciousness.
In short, technology cannot be reduced to the technical artifact, although the artifact, of course, plays a pivotal role in the demonstration and application of the
technology. ‘Technology’ is an ability-word.
To complicate matters even more, immensely more, technical knowledge has
what one could call systemic character. The development of a technology will depend upon received knowledge (scientific or practical) that, although it does not
provide a theory of the technology under development, is nevertheless indispensable. Thus even in the case of a technical development as conspicuously craft-based as the development of the steam engine, the mechanists who developed it
would have been at a complete loss without their mastery of geometry and arithmetic and techniques of measurement. More than that, a technology typically
comprises a set of what could be termed constituent technologies, that is, technologies that have been adopted, reconfigured, and put to novel uses as components
of the new technology. Thus distinctly different technologies, each of which may
have evolved incrementally in a distributed way, may deliberately be ‘shifted laterally’, from their context of origin, to become part of or be combined in a novel
technology. A case in point is the railroad. When, in 1829, the Stephensons
demonstrated their prototype steam locomotive, the Rocket, and won the competition,
‘the individual components of the railroad, considered as a system, had been assembled over the
previous fifty years or so. The canal builders had mastered the techniques of drilling tunnels,
building up embankments and digging out cuttings. They had established the legal precedents for
compulsory purchase of land and they had learned how to muster and manage large bodies of
skilled and unskilled men. The millwrights of the mining industry had invented and developed the
locomotive. The stage coach system had popularized the idea of public passenger transport at so
much per mile and the discipline of the timetable. These components had been progressively refined. What was required was the ability to put them all together and weld them into the public
railroad so very different from the little mine trackways with their rough, primitive steam locomotives, “iron horses”. The Stephensons came to realize that the railroad could extend to working
people a luxury that only the affluent had been able to afford. Their vision of a rail network to
provide ordinary folk with a cheap, fast, safe and comfortable means of travelling wherever they
wanted to go amounted, when combined with their practical abilities to bring it about, to the
achievements of genius.’ (Cardwell, 1994, p. 234 f.)
Not only were received technologies appropriated and combined in the formation of railroad technology but the accumulated experience with railroad technology prompted the deliberate development of other technologies. When railroad
networks began to operate and spread, issues of coordination began to emerge and
be understood. The operation of geographically dispersed but interdependent and
temporally critical activities, such as those of a railway service, requires communication facilities that operate at a much higher speed than the train service itself. Thus
the ‘rapid spread of railroads heightened the demand for the fastest means of
communication along the lines’ (Cardwell, 1994, p. 251). This practical need, to a
large degree, then drove the development of the technologies of the electric telegraph. The discovery of the electromagnetic effect (by H. C. Ørsted in 1820) was
by then already being applied experimentally to communication over distances of
several kilometers (for research purposes) but the coordination needs of the new
railroad operations in Britain soon became a major motivation for developing this
technology. It was tried successfully in 1837-38, and from 1842 practical implementation got underway. By 1848, operations of about half of the British railroad
lines were coordinated by telegraph (Standage, 1998, p. 61).
Now, with expanding networks of rail services, coordinated by means of telegraph services, another ‘bottleneck’ emerged that required additional technological developments. The coach lines had been operating in an environment with a
multitude of local times as determined by longitude, and the railroad companies
initially tried to continue the practice of the coach lines. But, as the historian of
technology David Landes puts it, ‘trains moved too fast for this, continually exposing passengers and crew to discrepancies and confusions’. The first step was to
use the telegraph ‘to transmit almost instantaneously an exact hour and minute
from the central office to every point on the line’. This established ‘a standard
time for all those served by a given network’, but this of course caused coordination problems across railway networks. The next step was therefore to unify local
railway practices by agreeing to adopt the local time at the Greenwich observatory
as a national standard. This was done by the end of 1847. But, as Landes comments, ‘The effectiveness of the change depended, of course, on the creation of a
national time service, communicating precise time signals at regular intervals to
clocks and stations around the country.’ (Landes, 1983, pp. 285 f.). This was, in
turn, made possible by the development of the ‘use of galvanism’, i.e., telegraphic
signals, to synchronize timekeepers in different locations and, indeed, in the
words of the Astronomer Royal, Airy: ‘the extensive dissemination throughout the
Kingdom of accurate time-signals, moved by an original clock at the Royal Observatory’ (quoted in Howse, 1980, p. 95). Implementation of this new technique
of automatic synchronization was begun on the South Eastern Railway in September 1851 and was completed by August the next year. It was in the following
years extended to post and telegraph offices, town hall clocks, workshops of
chronometer makers, and to factory clocks (ibid., pp. 96-105).
The case of railroad technology illustrates the systemic nature of technology. A
given technology presumes a — sometimes vast — network of other technologies
that serve as component technologies or as part of the technical platform for the
one under consideration. Sometimes the need for such auxiliary or component
technologies is only understood as the new technology gets under way, problems
are identified, and practitioners begin to search for solutions. And sometimes the
relationship between two otherwise different technologies is such that one can
conceive of their development as something akin to co-evolution: the further development of one technology depends on the progress of the other, and the advances in one technology may be held up for a long time until another technology
has achieved a certain level of maturity (stability, performance/cost ratio, etc.).
The systemic character of technology is a key to understanding the dramatic
technological changes that sometimes occur. A new technology may be developed
for certain purposes and it may then be realized by researchers or engineers that it
can substitute for a quite different component technology in an otherwise unrelated
technological complex — and it may turn out that the substitute offers solutions to
known problems or even to bottlenecks or limitations that had not yet been perceived. Sometimes the component technologies had been developed for other purposes but are then shifted laterally and combined in innovative ways to form a
new technology (in the case of railroads: rails, heat engines, machine tools, time
tables, etc.).
The systemic character of technology has important methodological implications, in that the challenge of uncovering and grasping ‘the internal development
of technology’ (Laudan, 1984) should be an overriding concern of any effort to
understand the development of technologies — be it railways or CSCW. That is, it
is essential to conceive of technology as systemic and hence of the research and
engineering efforts of developing technical knowledge as (loosely) coherent or
(tentatively) converging ‘paradigms’ (Constant, 1984; Laudan, 1984; T. P.
Hughes, 1987).
Before I move on, I should point out that the Kuhnian notion of ‘paradigm’
should not be taken as equivalent to the notion of ‘theory’ but as a set of ‘universally recognized scientific achievements that for a time provide model problems and solutions to a community of practitioners’ (Kuhn, 1962, p. x), that is, a
complex of notions of what constitutes a researchable, relevant, and interesting
problem as well as notions of what constitutes findings and solutions; measurement standards, protocols, procedures, methods, instruments, equipment; forms of
apposite argument and conceptualization as well as genres of presentation of data
and findings; as well as the institutionalized forms of these notions, standards, and
forms: channels, policies, and procedures of publication, review, etc.; research
review boards, schemes, etc.; educational institutions, textbooks, professional organizations, and so on — all centered on and incarnated in those contributions that
are considered exemplary, the ‘paradigm’ (Sharrock and Read, 2002). The development of technical knowledge exhibits similar characteristics. The most important difference is, again, the ‘emphasis on purpose’ in technological research,
which, in practice, is bound up with the concerns for usefulness, costs, stability,
performance, maintenance, etc.
There is an inherent risk in focusing on the ‘internal development of technology’, however. The concept of technology may then simply be confounded with
that of engineering, and particular technologies may become categorized according to certain component technologies. This has happened rather frequently in
the historiography of computing, when ‘computing’ as a technology is dated from
the Colossus (1943) and the ENIAC (1946) because they were the first electronic
digital computers or the Manchester ‘Baby’ (1948) because it was the first stored-program digital computer. They were all important steps and their design embodied technologies that have later been found important. But technology is nothing if
it is not ‘a useful art’. The relationship between technology and practice is internal, that is, conceptual, like ‘figure’ and ‘ground’: you can’t have one without the
other. A given technology is only a technology in relation to its application context and we can only compare and identify paradigmatic shifts in relation to that
specific application context.
Does this mean, then, that the ‘internal development of technology’ is a chimera? Not at all. In fact, technological development is often or, indeed, typically
driven by technologists’ addressing known or anticipated internal ‘anomalies’ in
the established technology, trying to overcome instabilities and performance limitations, reduce energy consumption, increase reliability, etc. The paradox dissolves as soon as we remember that technologies are systemic and that the immediate target of component technologies, perhaps technologies way down in the
hierarchy (e.g., in the case of the computer technologies assembled in my laptop
computer, technologies such as memory circuits, screen drivers, compilers, parsing
algorithms, etc.), is not the practice of the so-called ‘end user’ but the practice of
the engineers who design and build specific laptop products (as configurations of
component technologies).
The point of these brief remarks on the systemic nature of technology is that
what we, in the context of CSCW, focus on are computing technologies at the
‘level’ of cooperative work practices. This does not just apply to the history of
technology but equally to our understanding of the processes of technological development in which CSCW is engaged.
2. Computing technologies and cooperative work
‘By thinking about the history of “technology-in-use” a
radically different picture of technology, and indeed of
invention and innovation, becomes possible. A whole
invisible world of technologies appears.’
Edgerton (2006, p. xi).
From its very beginning, digital computing technology has been applied to cooperative work. Or rather, computing technology was not simply applied to cooperative work; the effort to meet various challenges of cooperative work has had significant impact on the development of the concept of computing. Addressing challenges of cooperative work has played a central role in developing ‘the computer’
into what we know today.
I will briefly recount the major technological paradigm shifts in ordinary work
settings. The point of departure is the development, prior to the industrial revolution, of systematic cooperative forms of work based on advanced division of labor
that, in turn, was premised on having developed and mastered techniques of decomposition and recomposition of work processes. However, this form of cooperative work had quite limited development potential in terms of productivity, labor
shortages, etc., and the development of machine technology can be seen as a
means to overcome these restraints. And again, while the emergence of machinery
at the point of production in the form of ‘machine tools’ and eventually ‘machine
systems’ and concomitant forms of cooperative work relations showed enormous
development potential, it too, eventually, had limitations rooted in the costs of
constructing and modifying mechanical means of operational control. For that
reason, mechanical machine technologies were, largely, confined to areas of mass
production where the construction costs could be offset by high volumes. However, computing technologies, which were initially developed as a means of overcoming the bottleneck of increasingly complex cooperative forms of calculation
work for scientific and engineering purposes, provided the technical basis for constructing and modifying control mechanisms at costs that were extremely low and
that have, in fact, now been continually falling for half a century. The costs of
constructing machine systems have in fact become so insignificant that relatively
flexible machine systems for vast and varying arrangements of cooperative work
have become economically viable. This is where the CSCW research program
arises, not just as an academic pastime, but as a practical problem for modern industrial civilization.
2.1. Division of labor: Progressive forms of work organization
The development of computing can only be understood when seen as an integral
aspect of the development of work practices and their technologies over the last
three centuries.
To make this claim understandable it is useful to briefly introduce the taxonomy of work organization that has been in widespread use in economic and technological historiography: (i) ‘craft work’ or domestic handicraft, (ii) the ‘putting
out’ system or the ‘Verlagssystem’, (iii) the ‘manufactures’, and (iv) the machine-based ‘factory’ (e.g., Sombart, 1916; Braudel, 1979). The key distinguishing feature among these forms is the different arrangements of division of labor; although the machine-based ‘factory’ should be seen as a form of work organization
in which the systematic division of labor in production is transcended, if only tentatively.
A note of caution, before we move on. These progressive forms of work organization should not be thought of as stages, each stage leading inexorably to the
next as a necessary step in the development of the work organization. The picture
is far more complex than that. Firstly, technological development is ‘uneven’: different forms of work organization coexist across branches of production, across
geographic regions, even within the production of the same product. Work organization based on handicraft continues in small-scale production and is indeed reinvented time and again when novel kinds of products are first produced.
One can for example observe the same forms at play in the history of the electronic and computer industries. In fact, in the modern automotive industry some components may be produced in domestic handicraft settings, others by advanced machinery, while the ultimate product is assembled in a work organization based on
systematic specialization of handicraft. Secondly, retrograde development may
take place, so that technologically backward forms become competitive again and
gain a new lease of life when, for instance, large reservoirs of dispossessed peasants
or underemployed workers in rural areas are created or become available or when
dramatic technological changes put working conditions under pressure. That is, to
repeat, the forms of work organization that we observe are not stages. Nor are
they to be thought of as ‘models’ or ‘ideal types’. They are rather like recurring
themes that are played again and again but in varying ways and contexts, often in
the same sequence but sometimes not. So, instead of conceiving of these forms as
stages, one might use a term such as progressive forms of work organization, if
we let the term ‘progressive forms’ refer to two important features: On one hand,
they represent a (precariously) cumulative process of development of organizational and technological knowledge. On the other hand, this cumulative effect is,
of course, strengthened by the totalizing effect of the market. In so far as practical
applications of this knowledge turn out to be successful in some recognized way
(profitable, dependable, viable), the specific form of work organization feeds back
as a competitive challenge to other actors in the market.
Handicraft. In craft work, each worker typically masters the entire range of
tasks involved in fabricating the ultimate product. In its simplest and most widespread form, craft work was an indispensable aspect of the subsistence economy
of traditional rural life (e.g., spinning and weaving). These activities were performed by peasants as a sideline, that is, in periods or at times when agricultural
tasks did not fully occupy their capacity. The unit of production was the individual peasant household, with division of labor, if any, based on age and gender. In
each of these ‘monocellular’ units, tasks were ‘undifferentiated and continuous’
(Braudel, 1979, p. 298). On the other hand, in domains of work that demanded a
degree of skill that in turn required full-time engagement with the work (e.g.,
blacksmith, cutler, nailmaker, shoemaker, etc.), the work would not be carried out
as a secondary occupation; specialization would be required. However, this type
of craft work would be organized in a way rather similar to the sideline work in
peasant households: as tiny family workshops, each with a master tradesman, twothree journeymen, and one or two apprentices. In both types, adult workers would
possess the skills to undertake the entire range of tasks, and any division of labor
between them would be occasional and transient.
The ‘putting out’ system. As an integral part of the development of the world
market, especially from the 16th century, this ‘horde of little establishments,
where family craft working was carried on’ (Braudel, 1979, p. 298), gradually became geared to the market and subsumed under new and systematic schemes of
division of labor. Instead of being autonomous units working in parallel without
any overriding plan or scheme, the individual units now became units of a larger
scheme devised and coordinated by ‘merchant entrepreneurs’: ‘a number of individual units spread over a wide area but interconnected’ (ibid., p. 300). Coordinating the work of the many interconnected units, the merchant entrepreneur would
advance the raw materials, take care of transportation of goods from one unit to
the next (spinner, weaver, fuller, dyer, shearer). ‘The pattern in every case was a
sequence of manufacturing operations, culminating in the finished product and its
marketing.’ (ibid.).
This system, for which no generally accepted term exists,84 developed from the
end of the middle ages. It spread across the non-agricultural economy at large but
became particularly dominant in the textile trades (spinning, weaving, etc.). However, the reason for paying attention to it here is its historical outcome; namely,
that it promoted specialization at the level of production units. This was most pronounced in branches of production that fabricated ‘small and delicate objects’
such as lace and pins and needles, etc. (needles would for instance pass through
72 hands during the production process). But perhaps more importantly, the ‘putting out’ system occasioned merchant entrepreneurs and their assistants to develop practices of analyzing and coordinating work processes (standardization of materials, relative proportions of component processes, logistics) (Sombart, 1916).
Manufactures. From the middle of the 16th century and towards the end of the
18th century the process of deepening and systematic division of labor took a new
form: ‘manufactures’. Its characteristic feature was the centralization under one
roof of the workforce. This made possible not only systematic supervision (and
reduced costs of transportation etc.), but also ‘an advanced division of labor’
(Braudel, 1979, p. 300). What defines this movement is the systematic decomposition of received work processes into component processes and the corresponding specialization of individual workers to perform specific component processes.

84 The ‘putting-out’ system is sometimes dubbed ‘proto-industrialization’ in modern economic historiography (e.g., Mendels, 1972) but this notion is typically used as a designation for a stage in economic history (as pointed out by Coleman, 1984).
In many domains of work the form of work organization that characterized handicraft work, namely, that the individual worker mastered the entire range of tasks
and that the different workers at the workshop could replace each other in performing different kinds of task, was transformed into an organization based on
systematic division of labor in the workshop.
In 1597 Thomas Deloney published a eulogy to ‘the famous Cloth Workers
in England’ as represented by the cloth maker John Winchcombe, ‘a famous and
worthy man’. The text is of interest because it, as pointed out by the editor of Deloney’s works, shows ‘a detailed knowledge of Newbury, its surroundings, and
the county families of Elizabethan Berkshire, which could only have been obtained by an actual residence there’ and, more to the point, because it offers a contemporary, if colorful and apologetic, description of the work organization in an
early manufacture:
‘Within one roome being large and long,
There stood two hundred Loomes full strong:
Two hundred men the truth is so,
Wrought in these Loomes all in a row.
By euery one a pretty boy,
Sate making quils with mickle ioy;
And in another place hard by,
An hundred women merrily,
Were carding hard with ioyfull cheere,
Who singing sate with voices cleere.’ (Deloney, 1597, pp. 20 f.).
The poet goes on to enumerate the various handicrafts at this site by taking the
reader through an imaginary tour of the establishment, pointing out — in addition
to the 200 weavers, each assisted by a ‘pretty boy’, and the 100 joyfully singing
carders — 200 ‘pretty maids’ ceaselessly engaged in spinning in an adjoining
chamber, and in another room 150 ‘children of poore silly men’ ‘picking wool’,
and further, in yet other rooms, 50 male shearers, 80 rowers, 40 dyers, and 20
fullers.
The manufactures were not only a form of organization that was seen as profitable and laudable; they were the object of scholarly interest, as a technology. The
principles of systematic division of labor were formulated repeatedly in the course of
the 17th and 18th century, for instance by William Petty:
‘the Gain which is made by Manufactures, will be greater, as the Manufacture it self is greater and
better. For in so vast a City Manufactures will beget one another, and each Manufacture will be
divided into as many parts as possible, whereby the work of each Artisan will be simple and easie;
As for Example. In the making of a Watch, if one Man shall make the Wheels, another the Spring,
another shall Engrave the Dial-Plate, and another shall make the Cases, then the Watch will be
better and cheaper, than if the whole work be put upon any one Man.’ (Petty, 1683, p. 472).
Diderot went further and formulated the principles of the systematic division of
labor in manufactures in his article on ‘Art’ in the first volume of the Encyclopédie. Discussing the advantages of manufactures, Diderot argued:
‘The speed of work and the perfection of the product both depend entirely on the number of workers assembled. When a manufacture is large, each operation is dedicated to a different man. This
worker here only makes one unique thing his entire life, that one there another thing, so that each
operation is performed well and promptly, and so that each product, while the best made, is also
the cheapest’ (Diderot, 1751, pp. 275 f.).
However, the classic example of this form of work organization is undoubtedly
that of pin manufactures, as given by Adam Smith in the first chapter of his
Wealth of Nations. Arguing that the manufacture of pins, though ‘a very trifling
manufacture’, is one ‘in which the division of labour is very often taken notice of’
and thus a paradigmatic case of ‘division of labour’, Smith goes on to describe the
organization of a particular workshop:
‘One man draws out the wire, another straights it, a third cuts it, a fourth points it, a fifth grinds it
at the top for receiving the head; to make the head requires two or three distinct operations; to put
it on, is a peculiar business, to whiten the pins is another; it is even a trade by itself to put them
into the paper; and the important business of making a pin is, in this manner, divided into about
eighteen distinct operations, which, in some manufactories, are all performed by distinct hands,
though in others the same man will sometimes perform two or three of them. I have seen a small
manufactory of this kind where ten men only were employed, and where some of them consequently performed two or three distinct operations.’ (A. Smith, 1776b, pp. 4 f.).
In reading Smith’s analysis it is important to realize that what he conceptualizes
as division of labor is also, by the same token, cooperative work. The systematic
division of labor is necessarily complemented by systematic cooperative work.
This insight, which was implicit in Smith’s argument, was made explicit by his followers such as Edward Wakefield and John Stuart Mill, although their discussion is confused because they, as dedicated followers of Smith, confound the systematic and regular division of labor at the point of production with the disorganized and spontaneous division of labor at the level of the economy at large
(cf., e.g., Mill, 1848, § I.VIII.1). The point is made more clearly by Mill’s contemporary, the German political economist Friedrich List who, as a protectionist,
was prone to be critical of Smith:
‘a division of the operations, without the association of productive powers for a common end,
would be of very little help in the production. That such a result may be obtained, it is necessary
that the different individuals be associated intellectually and bodily and that they cooperate. He
who makes the head of pins must count upon the labors of him who makes the points, so that he
does not run the risk of making pin heads in vain. The work activities should be in a suitable proportion to each other; the workers ought to be located as near to each other as possible; and their
cooperation should be insured.’ (List, 1841a, p. 165; cf., List, 1841b, p. 231).
For our purposes, this insight is important. Cooperative work is found in the entire
course of human history, but until the systematic division of labor in manufactures
it either occurred sporadically (e.g., hunting of large game, clearing of forest) or
was marginal to ordinary production (construction of sacral buildings, fortifications, dams, roads), and the work organization of the manufactures is thus the first
example of systematic and continual cooperative work at the point of production.
Smith was of course not particularly interested in the technological and managerial issues of work organization, but in the effects of work organization on the
creation of ‘wealth’, and he therefore quickly moved on to expound three advantageous effects of division of labor (A. Smith, 1776b, p. 7): ‘First, the improvement of the dexterity of the workman necessarily increases the quantity of the
work he can perform; and the division of labour, by reducing every man’s business to some one simple operation, and by making this operation the sole employment of his life, necessarily increases very much the dexterity of the workman.’
However, in his analysis Smith stayed well within the limits of the manufactures,
emphasizing the advantages over artisanal work but blind to the inherent limitations of the manufactures. He made some very enthusiastic claims concerning the
gains in productivity as a result of division of labor. To determine the effect of
division of labor on productivity, Smith compared his case with that of a single
polyvalent worker:
‘Those ten persons […] could make among them upwards of forty−eight thousand pins in a day.
Each person, therefore, making a tenth part of forty−eight thousand pins, might be considered as
making four thousand eight hundred pins in a day. But if they had all wrought separately and independently, and without any of them having been educated to this peculiar business, they certainly could not each of them have made twenty, perhaps not one pin in a day; that is, certainly, not
the two hundred and fortieth, perhaps not the four thousand eight hundredth part of what they are
at present capable of performing, in consequence of a proper division and combination of their
different operations.’ (A. Smith, 1776b, p. 7).
That is, Smith asserted that the production of pins, under the scheme of division
of labor, had increased labor productivity by a factor of 4,800, or at least 240. However, the claim that a single worker, on his own, would hardly be able to produce
one pin per day, let alone twenty, was sheer guesswork on Smith’s part. In fact,
in artisanal pin-making in France of the 18th century an apprentice should be able
to produce 1,000 pins in half a day, or 2,000 on a daily basis (Peaucelle, 2007, p.
196). An increase in productivity by a factor of 2.4 (from 2,000 to 4,800 pins per worker per day) is of course significant by any standard, but it is not the mind-blowing rate that has mesmerized Smith’s readers since
then. The point of this is not to be pedantic but rather to show that the manufactures work organization was severely limited by the characteristics of the human sensory-motor system. Productivity may increase as a result of the increase in dexterity of the individual worker, but within a few weeks of training the learning curve would reach a plateau.
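The arithmetic at stake here can be made explicit. A minimal sketch of the comparison, using only the figures quoted above (Smith’s 48,000 pins produced by ten workers, his guesses of twenty or one pin for a solitary worker, and Peaucelle’s figure of 2,000 pins a day for a French apprentice):

```latex
% Smith's claimed gain, on his own figures:
\frac{48{,}000 / 10}{20} \;=\; \frac{4{,}800}{20} \;=\; 240,
\qquad\text{or even}\qquad
\frac{4{,}800}{1} \;=\; 4{,}800.
% The gain implied by Peaucelle's archival figure of 2,000 pins a day:
\frac{4{,}800}{2{,}000} \;=\; 2.4.
```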
As a second advantage of division of labor, Smith pointed to the saving of ‘the time commonly lost in passing from one sort of work to another’, an advantage that ‘is much greater than we should at first view be apt to imagine it’. Productivity will
of course also increase as continuity of operation eliminates the cost of transiting
from one operation to the next (set-up costs, in modern terminology) but, again,
when the set-up costs have been eliminated the productivity rate will also flatten
out.
And as the third and last advantage, Smith argued that specialization engenders
technical innovation:
‘everybody must be sensible how much labour is facilitated and abridged by the application of
proper machinery. It is unnecessary to give any example. I shall only observe, therefore, that the
invention of all those machines by which labour is so much facilitated and abridged seems to have
been originally owing to the division of labour. Men are much more likely to discover easier and
readier methods of attaining any object when the whole attention of their minds is directed towards
that single object than when it is dissipated among a great variety of things.’ (A. Smith, 1776b,
pp. 7-10.).
Again, despite the use of the term ‘machinery’, Smith’s analysis remained firmly
entrenched within the conceptual horizon of the manufactures. When talking
about ‘machinery’ he was not talking about what we today understand by machinery, namely devices that can operate automatically (under human supervision, of
course, but without continual human involvement in the transformation process).
This issue will be discussed at length later, but it has to be pointed out here that
Smith, like his contemporaries, was using the terms ‘machine’ and ‘machinery’
very loosely to denote any kind of composite tool that somehow augments human
capacity such as, for example, ‘the ship of the sailor, the mill of the fuller, or even
the loom of the weaver’ (A. Smith, 1776b, p. 12).
The way in which Smith presents the case (‘I have seen a small manufactory of this
kind’) and not least the way he has been read may lead one to assume that he in
his analysis described something that to his contemporaries was breaking news. It
was not. As pointed out above, the manufactures form of work organization had
been the object of study for a century before Smith described the pin manufacture.
More than that. The pin manufacture had been a case of special interest to scholars several decades before Smith wrote about it. And although Smith claimed to
have observed the workshop he described, the evidence indicates that this is actually not true (Peaucelle, 2006, 2007). As Smith himself pointed out, the work organization of pin making had, at the time, ‘very often’ been ‘taken notice of’. In
fact, the division of labor in the manufacture of pins had been described long before Smith made use of the case (in his Lectures on Jurisprudence from 1762-64
and in his Wealth of Nations) and it was common knowledge among scholars who
had an interest in technology and economy. Thus, Ephraim Chambers’s Cyclopædia from 1728, well-known and respected at the time, emphasized the advanced division of labor in the manufacture of pins.85 The primary sources for
Smith’s description were French. As mentioned earlier, the French Académie des
science had since 1675 been collecting data concerning manufacturing processes
based on field reports from local observers who were attached to the prefectures
but were doing their research at the direction of scholars attached to the
L’Académie (Peaucelle and Manin, 2006). One of the production processes on
which reports were collected (since 1693) was that of the pin manufactures in
Normandy (Peaucelle and Manin, 2006; Peaucelle, 2007). The analyses were
eventually made publicly available in various documents, first of all in one of
the volumes of Descriptions des arts et métiers, a collection of writings devoted
to the Art of pin-making edited by Duhamel de Monceau and containing texts by Réaumur, Perronet, and Chalouzières (Réaumur, et al., 1761), but also, and better known, in the very detailed accounts in Diderot’s Encyclopédie (Delayre, 1755; Perronet, 1765). The findings of these reports and analyses were then further distributed in various forms: in journal reviews (e.g., in the Journal des sçavans, 1761) and in pocket dictionaries (such as Macquer’s Dictionnaire portatif des Arts et Métiers, 1766). In short, the work organization of manufactures was a topic of great interest, and the organization of the manufacture of pins was becoming a paradigm case for the technological and managerial knowledge of this kind of work organization.

85 ‘Notwithstanding that there is scarce any Commodity cheaper than Pins, there is none that passes thro’ more hands e’re they come to be sold. — They reckon 25 Workmen successively employ’d in each Pin, between the drawing of the Brass-Wiar, and the sticking of the Pin in the Paper’ (Chambers, 1728).
Figure 1. Illustrations of the pin making manufacture from Diderot’s Encyclopédie (the supplementary series of plates: Recueil de planches, sur les sciences, les arts liberaux, et les arts méchaniques, avec leur explication, vol. 3, Paris, 1765). Pins were made from brass wire that is drawn
and cut and so forth. The topmost plate shows winding, unwinding, and washing coils of wire before drawing it. The second plate shows the processes of drawing the wire (right), cutting it (center), and pointing the pins (left). The bottom plate depicts the final process of heading the pins by
annealing (right, at the back) and hammering (at the contraption in the center). The pins are finally
tin-coated, washed, polished, and packed (left). The pin making process was described in detail by
Delayre in vol. 5 of l’Encyclopédie (1755), by Duhamel du Monceau in a volume of the Descriptions specifically devoted to the art of making pins (Réaumur, et al., 1761), and by Perronet in vol.
3 of the Recueil de planches (1765). (Cf. also Gillispie, 1959, vol. 1, plates 184-186 and associated
text).
That is, the form of cooperative work that is characteristic of the manufactures
was investigated, analyzed, and referred to by contemporaries as extant and ongoing cases; they became an element of common knowledge: hailed by poets and
propagandists such as Deloney, studied and described in detail by scholars such as
Delayre and Perronet, and ultimately elevated to paradigm status by Adam Smith.
The manufactures play an important historical role in the development of machinery in production processes. The analysis of work processes involved in the decomposition of craft work, the standardization of methods for each constituent process, and their recomposition and planned integration paved the way for the mechanization of constituent processes. Automatic machines had, of course, existed for centuries by this time; clocks are a case in point. At the point of production, however, machinery only became significant by the end of the 18th century.
Now, manufactures were never simply transformed into machine-based factories,
nor did machinery always emerge from manufactures. Mechanization could also
occur from the construction of machinery to automate an entire process. However,
the manufactures play a critical role in the development of machinery for the automatic enactment of production processes, in that they developed widespread and (fairly) systematic knowledge of the principles of division of labor: the concepts of division of labor, of sequences of elementary operations, of productivity as measured by output per worker per day, of different cadences in a composite process, of ‘line balancing’, etc. The transformation of craft work into a form based
on systematic division of labor can be seen as having been instrumental in developing the requisite technical and managerial practices for mechanization of production to be conceivable as a practical option. That is, the very idea that a received work practice could be analyzed, decomposed, and composed anew was
the decisive concept. In fact, our very concept of technology is predicated on this
insight.
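The notion of ‘line balancing’ mentioned above can be illustrated by a small calculation. The station times and staffing figures below are hypothetical, chosen only to exhibit the logic: in a composite process the pace of the whole line is set by its slowest station (its cadence), so output rises by reassigning workers to the bottleneck, not by speeding up everyone.

```python
# Hypothetical station times for a pin-making line (minutes per batch per worker).
# The throughput of a composite process is limited by its slowest station, which
# is why the 18th-century analysts counted workers per operation: staffing the
# bottleneck is what raises the output of the line as a whole.

station_minutes = {"drawing": 2.0, "cutting": 1.0, "pointing": 6.0, "heading": 3.0}
workers = {"drawing": 1, "cutting": 1, "pointing": 1, "heading": 1}

def cadence(times, staff):
    # The effective time of a station falls in proportion to its staffing;
    # the line's cadence is the slowest effective station time.
    return max(t / staff[s] for s, t in times.items())

print(cadence(station_minutes, workers))  # 6.0 -- pointing is the bottleneck
workers["pointing"] = 3                   # rebalance the line
print(cadence(station_minutes, workers))  # 3.0 -- heading now limits output
```

The point of the sketch is only structural: tripling the staff at one station halves the cadence of the whole line, while adding workers anywhere else would change nothing.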
In the development of technology — of technical knowledge, that is — Smith’s
version of the story is hugely important, since it provided the paradigmatic account: that a concerted arrangement of workers, each specialized in performing a
simple operation in a systematically planned sequence, could achieve a productivity greatly surpassing that of the same number of skilled workers, even artisans,
working in parallel. More importantly, it directed the attention of engineers, technologists, and managers, as well as scholars, to the fact that in a
carefully planned division of labor, the collaborating workers, as a concerted collective, could master a process that none of them could master and perhaps even
account for individually. This was underscored, albeit indirectly, by Charles Babbage when he, in a comment on Adam Smith’s analysis of the advantages of the
division of labor in manufactures, made some critical corrections. Going back to
one of the sources used by Smith (namely the abovementioned Perronet), Babbage
points out that a major factor making division of labor economically advantageous
is the circumstance that, whereas workers with the skills of artisans, and corresponding salaries, would be required for the most complex operations, other operations,
such as heading pins, could be performed by workers with very little training and correspondingly low wages (women and children):
‘it appears to me, that any explanation of the cheapness of manufactured articles, as consequent
upon the division of labour, would be incomplete if the following principle were omitted to be
stated. That the master manufacturer, by dividing the work to be executed into different processes,
each requiring different degrees of skill or of force, can purchase exactly that precise quantity of
both which is necessary for each process; whereas, if the whole work were executed by one
workman, that person must possess sufficient skill to perform the most difficult, and sufficient
strength to execute the most laborious, of the operations into which the art is divided.’ (Babbage,
1832, § 226).
This has been much discussed, and rightly so, for it is an important corrective to
Smith’s analysis of division of labor. The point in our context, however, is that
Babbage here underscores a lesson that is easily ignored: that the individual
worker in such arrangements only needs to understand and master his or her partial task and that the partial task might be made so simple that it can be mechanized. From this insight emerges the modern concept of the control function,
predicated on a distinction between the planning and the execution of operations.
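Babbage’s wage argument lends itself to a back-of-the-envelope calculation. The figures below are hypothetical, invented only to exhibit the structure of the argument; they are not drawn from Babbage’s or Perronet’s data:

```python
# Babbage's principle, illustrated with invented numbers: when work is divided,
# each operation can be paid at the wage its skill level commands, whereas a
# single undivided craftsman must be paid the highest wage for every hour of
# every operation, including the simplest.

operations = {
    # operation: (hours per 1,000 pins, hourly wage the required skill commands)
    "drawing wire":  (1.0, 4.0),
    "straightening": (0.5, 1.0),
    "pointing":      (1.5, 3.0),
    "heading":       (2.0, 1.0),  # Babbage's example: little training needed
    "whitening":     (1.0, 2.0),
}

# Undivided: one worker skilled enough for the hardest task does everything.
artisan_wage = max(wage for _, wage in operations.values())
cost_undivided = sum(hours for hours, _ in operations.values()) * artisan_wage

# Divided: each operation is paid 'exactly that precise quantity' of skill.
cost_divided = sum(hours * wage for hours, wage in operations.values())

print(cost_undivided)  # 24.0
print(cost_divided)    # 13.0
```

On these (arbitrary) figures the divided arrangement costs little more than half as much in wages, even before any productivity gain from specialization is counted, which is precisely the corrective Babbage added to Smith.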
The systematic division of labor that characterizes manufactures has inherent
limitations, as pointed out by Marx:
‘For a proper understanding of the division of labour in manufacture, it is essential that the following points be firmly grasped: First, the analysis of the production process into its specific phases
coincides, here, strictly with the dissolution of a handicraft into its various partial operations.
Whether complex or simple, the performance retains the character of handicraft and, hence, remains dependent on the strength, skill, quickness, and sureness, of the part-worker in handling his
instruments. The handicraft remains the basis. This narrow technological basis excludes a really
scientific analysis of the production process, since each partial process to which the product is
subjected must be capable of being carried out as partial handicraft. Precisely because handicraft
skill, in this way, remains the foundation of the process of production, each worker becomes exclusively appropriated to a partial function and that, for the duration of his life, his labour power is
turned into the organ of this detail function.’ (Marx, 1867a, pp. 274 f.).86
The manufactures’ work organization was not only severely limited by the characteristics of the human sensory-motor system and by the fact that the learning curve would reach a plateau; it was also, Marx argued (with Ure), limited by its severe dependence on the handicraft worker and thus by that worker’s resistance to attempts to speed up the process.
Still, the fact that the performance retains the character of handicraft has not prevented foremen and engineers in the twentieth century from taking specialization to even greater heights than those represented by the classic manufactures. In
order to fully exhaust the performance potential of workers, work processes and
operations were subjected to careful systematic analysis and redesign, and for this
86 Author’s translation from the German original. The standard English translation (by Moore and Aveling)
is unfortunately rather inaccurate and, indeed, occasionally misleading. Anyway, the English translation of
the relevant passages can be found in Chapter XIII (‘Co-operation’), Chapter XIV (‘Division of labour and
manufacture’), and Chapter XV (‘Machinery and modern industry’), esp. Section 1 (‘The development of
machinery’, pp. 374 ff.). (Marx, 1867c).
purpose engineers such as Frederick W. Taylor (1911), Henri Fayol (1918), and their many followers (e.g., Urwick and Brech, 1945, 1946, 1948) developed and applied advanced methods and techniques of observation such as stopwatches, video cameras, etc., as well as methods of work design such as instruction cards and slide rules, etc. Likewise, to optimize the performance of the human sensory-motor system, physiologists became involved in the design of elementary operations, tools, and work stations and eventually developed ‘human factors’ or ergonomics as an engineering discipline with its own repertoire of methods and techniques.
In a parallel attempt to overcome workers’ resistance, social psychologists became involved in developing methods of increasing workers’ ‘motivation’. This
effort is famously represented by the so-called ‘Hawthorne experiments’ carried
out between 1924 and 1933 (e.g., Roethlisberger and Dickson, 1939). Whereas Taylor consistently argued for financial means of motivating workers, the social psychologists concluded from these ‘experiments’ that measures such as friendly supervision and ‘social relations’ in ‘work groups’ were of overwhelming importance. However, more recent and more rigorous studies comparing the Hawthorne conclusions with the Hawthorne evidence show ‘these conclusions to be almost wholly unsupported’ (Carey, 1967) and that there is ‘no evidence of Hawthorne effects’ (Jones, 1992). This has not prevented the formation of a ‘human
relations’ profession with its own repertoire of methods and tests, however (for
discussions of this phenomenon of managerial ideology, cf. Gillespie, 1991;
Brannigan, 2004).
While initially focused on increasing daily output per worker by optimizing the
part-performance of the part-worker, the methods and techniques developed by
the ‘scientific management’ school were not restricted to that. Major improvements in overall productivity have been obtained by optimizing the flow of work
pieces from one part-worker to the next, in particular by motorizing the transportation of work pieces, typically by means of conveyor belts. The assembly line
one finds in, e.g., automobile assembly plants, and which many sociologists of work have taken as exemplary of the machine-based factory system, is in fact, on the contrary, the most advanced form of the systematic division of labor exemplified by the pin manufacture. The confusion is nurtured by imagery in which workers, to the naïve eye, appear as small cogwheels in a vast automaton, and it is reinforced by overindulgence in metaphorical discourse in which certain organizational forms are talked of as ‘machines’.87 But even superficial observation of the organization of work in these plants shows that it is generally that of extreme division of labor (for a collection of historical photos, cf. Marius Hammer, 1959).
What became ‘mechanized’ in automobile assembly and in other kinds of assembly work in the course of the 20th century was the transportation of parts and assemblies between workstations, not the operations at the various work stations,
87 The distinguished historian Sigfried Giedion, to take but one example, posited that ‘The symptom of full
mechanization is the assembly line, wherein the entire factory is consolidated into a synchronous organism’
(1948, p. 5).
and the ‘mechanization’ of the transportation system merely consisted in a conveyor-belt arrangement powered by electrical motors. In fact, it is only quite recently, after microprocessor technology began to stabilize in the late 1970s, that self-acting machinery has been applied in significant ways in the automotive industry (e.g., welding robots). Moreover, the assembly line was never representative of industrial production, not even of mass production. Because the assembly line requires large investments in an inflexible workflow layout, it was and remains an exceptional case in modern manufacturing. In the words of Taylor’s biographer, Robert Kanigel: ‘While the assembly line remains a common if tired
metaphor, it defines surprisingly little of modern manufacturing. The Taylorized
workplace, on the contrary, appears everywhere, heedless of the lines between one
industry and the next’. In short, ‘Fordism was the special case, Taylorism the universal’ (1997, p. 498).
Later in the twentieth century, engineers such as Taiichi Ohno of Toyota developed and refined the repertoire of methods and techniques developed by the
‘scientific management’ school (Ohno, 1988). Although the principles he and his
colleagues developed, widely known as the ‘Toyota production system’, are
hailed as a break with ‘Taylorism’, they are at least as rigorous in their attention to
systematic analysis of operations and continuity of the flow of work as previous
forms of ‘scientific management’. What is new is rather, on the one hand, the refinement of the manufactures’ form of work organization to take into account performance criteria other than output per worker per day, namely criteria such as the flexibility and responsiveness of the overall cooperative work effort (by ‘just-in-time’ principles) as well as product quality (cf. Peaucelle, 2000), and, on the other hand, a strong emphasis on ‘continual improvement’ and the involvement of workers in achieving this (cf. Spear and Bowen, 1999).
Marx’s observation remains valid: the performance of cooperative work based
on advanced de- and recomposition of labor ‘remains dependent on the strength,
skill, quickness, and sureness, of the part-worker in handling his instruments’.
With the introduction of machinery, this dependence is broken, and with it also the dependence of human productivity on the progressive deepening of the division of labor.
2.2. Machinery: The issue of the control function
The factory, the form of organization of work based on mechanization of the constituent work processes, is typically seen as yet another progressive form of work
organization. The reason for this is that the advanced division of labor, based on
decomposition of received handicraft and the recomposition of the production
process as a systematically designed process, historically provided the technology
of work analysis that is an essential foundation of machinery. At the same time,
however, the factory represents a form of work organization that cannot be
grasped as yet another permutation of handicraft and division of labor.
Mechanization can be conceived of as a process that unfolds in two dimensions: on one hand the process of transfer of operational control from workers to
(thereby increasingly ‘self-regulating’) technical implements, i.e., machines, and
on the other hand the process of technical integration of multiple machines into
machine systems. It is the latter process, the development of machine systems, that
is of particular concern here, as computational coordination technology is a special but now dominant type of machine system.
In manual work, as exemplified by traditional craft work, the process of transformation — the operation of the grindstone, spinning wheel, handloom — is literally ‘in the hands’ of the worker. Some external source of power may be employed (e.g., animal, wind, water, steam), but the operation of the tool (stone, spindle, shuttle) is controlled by the worker, and the performance thus depends on his or her skills. With machinery, the control of the operation of the tool is performed ‘automatically’, by the implement, without continual human intervention or mediation.
This conception is a modern one; it originates in the industrial revolution. One
of the first to make this distinction was Charles Babbage who, for most of the
1820s, conducted extensive field work in ‘a considerable number of workshops
and factories, both in England and on the Continent, for the purpose of endeavouring to make myself acquainted with the various resources of mechanical art’
(Babbage, 1832, p. iii). Faced with ‘the wide variety of facts’ obtained at these
visits, he found it ‘impossible not to trace or to imagine’ ‘some principles which
seemed to pervade many establishments’, in particular the principles of division
of labor and the ‘mechanical principles which regulate the application of machinery to arts and manufactures’ (ibid., pp. iii f.). It was, presumably, his extensive
exposure to the facts on the ground, coupled with his skill at generalization, that then enabled him to see, with some clarity, what was specific in the new
machine technologies that were then being developed and applied, namely that a
set of tools is ‘placed in a frame’, moved ‘regularly by some mechanical contrivance’, and thus ‘acted on by a moving power’ (ibid., §§ 10, 224, pp. 12, 174):
‘When each process has been reduced to the use of some simple tool, the union of
all these tools, actuated by one moving power, constitutes a machine.’ (ibid, §
225, p. 174).
However, the first to express the notion of ‘control’ succinctly was Andrew
Ure in 1835. Like Babbage before him, he had engaged in what we today would
call field studies in the factory districts of England. In his own words, he ‘spent
several months in wandering through the factory districts of Lancashire, Cheshire,
Derbyshire, &c., with the happiest results to his health; having everywhere experienced the utmost kindness and liberality from the mill-proprietors’ (Ure, 1835,
p. ix). Having returned from his field trip, he summarized his findings in a book
that, while shamelessly apologetic for the factory regime, contains astute observations of the then emerging technologies of ‘automatic’ production:
‘The principle of the factory system […] is, to substitute mechanical science for hand skill, and the
partition of a process into its essential constituents, for the division or graduation of labour among
artisans. On the handicraft plan, labour more or less skilled, was usually the most expensive element of production […] but on the automatic plan, skilled labour gets progressively superseded,
and will, eventually, be replaced by mere overlookers of machines.’ (Ure, 1835, p. 20)
The concept underlying Babbage and Ure’s distinction is that of what we today
call the control function and the related notion of transfer of control from human
to implement; that is, the (partial) elimination of human labor in so far as the actual transformation process is concerned. Simply put, on this view a machine differs from a tool by being able to perform autonomously, within certain limits, by
virtue of a mechanical control function.88
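The modern concept of a control function that this passage gestures at can be sketched, anachronistically, as a feedback loop. The following toy simulation is not anything found in Babbage or Ure; it merely shows, in miniature, what it means for an implement to regulate its own tool without continual human intervention:

```python
# A toy proportional feedback controller: the 'implement' senses the deviation
# of its tool speed from a set-point and corrects it automatically -- the
# essence of a control function, i.e. of 'self-acting' operation.

def run_governor(setpoint, steps=50, gain=0.5):
    speed = 0.0
    for _ in range(steps):
        error = setpoint - speed   # the machine measures its own deviation...
        speed += gain * error      # ...and adjusts itself, with no operator
    return speed

final = run_governor(100.0)
print(round(final, 3))  # converges to the set-point: 100.0
```

The worker who once held the tool ‘in the hands’ is here replaced by the loop itself; what remains for the human is to set the set-point and supervise, which is exactly the shift in the role of the worker that Babbage and Ure observed.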
Following Babbage and Ure, but in a conceptually more stringent manner, Marx made the concept of the control function the cornerstone of his analysis of the transformation of work in the course of the industrial revolution. The classic statement on
machinery is in the discussion of ‘Machinery and big industry’ in Capital:
‘On a closer examination of the machine tool or the working machine proper, one recognizes by
and large — though often, no doubt, in very modified form — the apparatus and tools used by the
handicraftsman or the worker of manufactures, but instead of as tools of humans, now as tools of a
mechanism, or mechanical tools. […] The working machine is therefore a mechanism that, after
being set in motion, performs with its tools the same operations that were formerly done by the
workman with similar tools. Whether the motive power is derived from humans or from some other machine, makes no difference in this respect. After the transfer of the tool proper from humans
to a mechanism, a machine replaces a mere tool. The difference strikes one at once, even when the
human remains the prime mover.’ (Marx, 1867a, pp. 303 f.).
This was not a casual remark. Marx had first read Ure and Babbage some twenty years earlier (Marx, 1845b, pp. 325-351) and had made this concept of machinery a
pivotal element of his understanding of contemporary developments. Shortly after
these initial studies he wrote a small book in which he, quoting Babbage and Ure,
stated rather unequivocally that
‘What characterizes the division of labor in the automatic workshop is that labor has there completely lost its character of specialism. But the moment every special development stops, the need
for universality, the tendency towards an integral development of the individual begins to be felt.
The automatic workshop wipes out specialists and craft-idiocy.’ (Marx, 1847, chapter 2.2).
And again, ten years later, in the first outline of his economic theory, the
Grundrisse from 1857-58, he makes this point forcefully:
‘Work no longer appears so much as included within the production process; but rather the human
engages in that process as its overseer and regulator. […] It is no longer the worker who interposes
the modified natural object [i.e., the tool] as an intermediate between the object and himself; but
rather, he now interposes the natural process, which he transforms into an industrial one, as a
means between himself and inorganic nature, which he masters. He stands beside the production
process, rather than being its main agent.’ (Marx, 1857-58a, p. 581).
He subsequently reread Babbage and Ure in the context of his extensive studies of
the development of production technologies and work organization (cf. Marx,
88 Charles Kelley (1968) offers a good introduction to the modern concept of control.
1861-63, pp. 229-318, 1895-2090) that found their ultimate expression in the
chapters on ‘Cooperative work’, ‘Division of labor’, and ‘Machinery’ in Capital.
In view of the fact that the mathematical theory of control functions had not
been proposed yet when Babbage, Ure, and Marx offered these initial formulations of the modern concept of machinery, this is of course quite remarkable.
Their ability to do so, however, was not the result of prescience but was a consequence of their keen interest in what we today would call the changing allocation
of function between human and machine, which they considered of the utmost economic and social importance. This methodological preference is made explicit in several places, for instance in a letter from 1863 in which Marx summarized his initial conclusions from a thorough study of the development of the progressive forms of work organization and especially machinery:
‘there is considerable controversy as to what distinguishes a machine from a tool. After its own
crude fashion, English (mathematical) mechanics calls a tool a simple machine and a machine a
complicated tool. English technologists, however, who take rather more account of economics,
distinguish the two (and so, accordingly, do many, if not most, English economists) in as much as
in one case the motive power emanates from man, in the other from a natural force. […] However,
if we take a look at the machine in its elementary form, there can be no doubt that the industrial
revolution originates, not from motive power, but from that part of machinery called the working
machine by the English, i.e., not from, say, the use of water or steam in place of the foot to move
the spinning wheel, but from the transformation of the actual spinning process itself, and the displacement of that part of human labor that was not mere exertion of power (as in treading a wheel),
but was concerned with processing, working directly on the material to be processed. […] To
those who are merely mathematicians, these questions are of no moment, but they assume great
importance when it comes to establishing a connection between human social relations and the
development of these material modes of production.’ (Marx, 1863, pp. 320 f.). (Cf. also Marx,
1861-63, pp. 1915-1917).
Now, what is interesting from a CSCW perspective, as opposed to a history of science and technology point of view, is not these early developments of the concept of machinery. What is very relevant to CSCW is rather the concept of machine systems these authors suggested. In formulations that are strikingly visionary, Ure observed:
‘The term Factory, in technology, designates the combined operation of many orders of workpeople, adult and young, in tending with assiduous skill a system of productive machines continuously impelled by a central power. This definition includes such organizations as cotton-mills,
flax-mills, silk-mills, woollen-mills, and certain engineering works; but it excludes those in which
the mechanisms do not form a connected series, nor are dependent on one prime mover. But I conceive that this title [the term Factory], in its strictest sense, involves the idea of a vast automaton,
composed of various mechanical and intellectual organs, acting in uninterrupted concert for the
production of a common object, all of them being subordinated to a self-regulated moving force.’
(Ure, 1835, pp. 13 f.).
In his conception of machine systems Marx again followed Ure, but his conception of machine systems is significantly more developed. This of course reflects the rapid development that machine technology had undergone since Ure was roaming the factory districts of England some 33 years earlier. Based on studies of ‘automatic workshops’ as represented by paper and envelope factories, power looms,
and printing presses, Marx stated:
‘A proper machine system only takes the place of the particular independent machines, where the
work piece [Arbeitsgegenstand] undergoes a continuous series of different process steps, carried
out by a chain of machine tools that, while of different species, complement one another.’ (Marx,
1867a, p. 309) (cf. also Marx, 1861-63, pp. 1940-1946).
He compared this system of interoperating machines with the division of labor that was characteristic of the manufactures:
‘The cooperation by division of labor that characterizes Manufacture here reappears, only now as a
combination of partial working machines. The specialized tools of the various specialized workmen, such as those of the beaters, cambers, twisters, spinners, etc., in the woollen manufacture, are
now changed into the tools of specialized machines, each machine constituting a special organ,
with a particular function, in the combined tool-mechanism system.’ (Marx, 1867a, p. 309).
However, Marx went on, ‘an essential difference at once manifests itself’:
‘In the manufacture, it must be possible for each particular partial process to be performed by
workers, individually or in groups, by means of their manual tools. While the worker is adapted to
the process, then on the other hand, the process was previously adapted to the worker. This subjective principle of division of labor vanishes in production by machinery. The process as a whole is
here considered objectively, in and for itself, analyzed into its constituent phases; and the problem
of how to execute each partial process and connect the different partial processes into a whole, is
solved by technical application of mechanics, chemistry, etc.’ (Marx, 1867a, pp. 309 f.).
According to Marx, then, in the advanced factory, based on machine systems, the worker is no longer subsumed under the regime of extreme specialization that characterizes the manufactures but becomes a member of a cooperative ensemble
that, as a collective, supervises the operation of the machine system:
‘The combined working machine, now an organized system of different species of individual machines and of groups of such machines, becomes increasingly perfect in so far as the process as a
whole becomes a continuous one, i.e., the less the raw material is interrupted in its passage from
its first phase to its last; in other words, the more its passage from one phase to another is conveyed by the mechanism itself, not by human hand.’ (Marx, 1867a, p. 310).
Marx here clearly conceived of machine systems in terms of a system of automatically interacting machines, as a system of interoperating control functions. With
the development of systems of machines that interoperate, cooperative work becomes (to some extent and at different levels of granularity) mediated and regulated by these machine systems:
‘The implements of labor acquire in machinery a material mode of existence that implies the substitution of human force by natural forces and of experience-based routine by the conscious application of science. In the manufacture, the organization of the social work process is purely subjective, a combination of partial workers; in the machine system, modern industry creates an objective
productive organism, which the worker meets as an existing material condition of production. In
simple cooperation, and even in cooperation predicated on division of labor, the displacement of
the individualized worker by the socialized worker still appears to be more or less accidental. Machinery […] operates only in the hand of directly associated or communal work. Hence the cooperative character of the work process now becomes a technological necessity dictated by the nature of the implement of work itself.’ (Marx, 1867a, p. 315).
With the industrial mode of production, on this view, cooperative work is more
than an economically advantageous arrangement; it is ‘a technological necessity’.
The machine system presumes a cooperative work arrangement for its operation
and the individual activities of the cooperating workers are in turn mediated and
regulated by the machine system.
Figure 2. Machine system: The printing press of The Times, c. 1851, built by Cowper & Applegath. From a ‘Survey of the Existing State of Arts, Machines, and Manufactures’ by a Committee
of General Literature and Education appointed by the Society for Promoting Christian Knowledge
(1855, p. 211), one of Marx’s primary sources on ‘machine systems’.
While remarkably modern, the conception of machine systems developed by
contemporary analysts such as Ure and Marx is, not surprisingly, limited by the
practical horizon of the factory system of the 19th century. As is evident from the
above quotations, Ure’s conception of ‘the factory’ is predicated not only on ‘the idea of a vast automaton, composed of various mechanical and intellectual organs, acting in uninterrupted concert for the production of a common object’ but also on the premise that all of them are ‘subordinated to a self-regulated moving force’.
What, on Ure’s conception, makes the ‘system of productive machines’ a system,
‘a vast automaton’, is the fact that multiple machines are being ‘continuously impelled by a central power’ (Ure, 1835, pp. 13 f.). The same ambiguity can be
found in Marx:
‘A system of machinery, whether based on mere cooperation of working machines of the same
kind, as in weaving, or on a combination of machines of different kinds, as in spinning, constitutes
in and for itself one huge automaton, as soon as it is driven by a self-acting prime mover. […] The
machinery-based enterprise achieves its most developed form as an organized system of working
machines that receives its motion from a central automaton by transmission machinery.’ (Marx,
1867a, pp. 310 f.)
The influence from Ure is obvious in that Marx, in discussing machine systems,
also made a ‘prime mover’ the defining feature of machine systems, as opposed to
control of operations. The picture of the central steam engine that, while entrenched in the basement of the factory, drives all machines in the factory via a
highly visible and comprehensive power-transmission system of driving belts,
shafts, and gear trains, surely must have been as evocative then as it is now. It is
nevertheless, prima facie, somewhat puzzling that Marx, who confidently and clearly stated that ‘the industrial revolution originates, not from motive power, but from that part of machinery called the working machine’ and from ‘the elimination of that part of human labor that was not mere exertion of power […], but was concerned with processing, working directly on the material to be processed’, would include ‘a self-acting prime mover’ as defining of a machine system. This
ambiguity is not accidental, however. It is rooted in historically given conceptual
limitations. The transmission of power to the tool and the automatic control of the
movements of the tool had not yet been technically separated. Ure and Marx
therefore basically defined ‘machinery’ in terms of the role of the worker vis-à-vis
the implement as such, that is, in terms of the ‘self-acting’ character of its operation, not in terms of specific technical features. For Marx the defining feature was
that it is the machinery, not the worker, that immediately controls the movements
of the tool.
The notion of a distinct control mechanism was absent from the reasoning of
Marx. For good reasons. First of all, in spite of his attention to technical issues in
the historical development of machinery, his primary concern was the changing
role of workers in production, for, as he put it in the letter cited above, the questions of the role of the worker vis-à-vis the implement were 'of no consequence' to
‘pure mathematicians’, ‘but they become very important when it is a question of
demonstrating the connection of human social relations to the development of these material modes of production’ (Marx, 1863, p. 321). Thus, when he, following
Ure, made the ‘prime mover’ a defining feature of ‘machine systems’, he was
considering the obvious fact that, while a worker might be the source of energy
with a single working machine, as had been the case for centuries in the case of
the spinning wheel, it was practically impossible for a worker to move a connected system of machines. That is, a continuous ‘external’ supply of energy to drive
the machine system was simply a necessary precondition for it to be workable at all.
Consequently, while his definition of machinery as such is clearly based on the
notion of a control function, this concept is abandoned when the phenomenon of
machine systems is considered.
Distinct control mechanisms, physically separate from the energy transmission
systems of the implement, were extremely rare. Generally the control function
was performed by the physical union of the mechanism of transfer of power and
the mechanism of controlling the movements of the tool. This practico-technical
circumstance made it difficult to formulate and apply a strict concept of control,
for Ure and Marx and for generations of technologists too.
318
Cooperative Work and Coordinative Practices
For more than a century, machine systems were a rare phenomenon, restricted
to a limited set of branches of mass production. The reason for this is of critical
importance for understanding the role of computing technologies and, by implication, CSCW.
In early machines, such as Richard Roberts’ ‘self-acting mule’ (1825-30), the
control of the behavior of the tool was completely integrated with the system of
transmission and transformation of energy. That is, the movement of the tool and
the workpiece was regulated by the very same parts of the machine (driving belts,
rack-and-pinion, gears, clutches, camshafts, crankshafts, etc.) that transferred energy to the tool (or the workpiece), that is, made it move. For reasons of economy
and reliability, the overwhelming concern of mechanics would therefore be to
keep the number of parts to a minimum. In the words of Larry Hirschhorn in his
brilliant study of mechanization and computer control in the work place: ‘in a
good mechanical design the same part or series of parts simultaneously transmits
power, transforms motion, and controls the speed and direction of movement, in
this way minimizing the number of parts and preventing unwanted action’
(Hirschhorn, 1984, p. 16). Because of this, it is difficult and expensive to modify
the design of such machines. This is crucial for understanding the characteristics
of mechanical technologies and, by contrast, of computing. As Hirschhorn puts it:
‘well-designed machines are also highly constrained ones, single-purpose in character and design and hard to modify. […] In general, since the systems of transmission, transformation, and control share the same parts, modifying one system
inevitably means modifying the others. […] In becoming more productive, they
lose flexibility’ (Hirschhorn, 1984, p. 18). As a result, for more than a century the
domains in which machinery could be applied productively were quite limited.
Machinery was generally only applied to branches of mass-production where the
investment in special-purpose machinery could be amortized. And as far as machine systems are concerned, the scope of productive application was even more
limited. The cost of building and modifying machine systems prevented machine-system technology from spreading beyond a few branches of mass-production such as, for example, the production of envelopes, newspaper printing, chemical production, power production and distribution, metallurgy, and of course transportation.
For machines to be effectively flexible requires that the mechanism of control be
‘physically separate’ from the mechanism of energy transmission (O. Mayr,
1969). In retrospect it is possible to discern distinctive control mechanisms in the
form of devices such as float valves (devised by, e.g., Ctesibios in the 3rd century
BCE, Heron in the 1st century CE, Banū Mūsā in the 9th), thermostats (by, e.g.,
Drebbel in the 17th century), regulators in windmills (by British millwrights in the
18th century), and speed regulators in steam engines (Watt) (O. Mayr, 1969). But
it is significant that even Watt does not seem to have recognized the 'universality' of his invention. He thought of it not as a 'new invention' but merely as an application of a device made by others for the regulation of water mills and windmills, and he does not seem to have commented on the underlying 'feedback' principle. This leads the authority on the history of control mechanisms, Otto
Mayr, to conclude: ‘One might infer from his silence that he did not see anything
particularly interesting in this principle. Compared with the large but straightforward task of producing power, the regulating devices and whatever theory they
involved may have appeared to the sober Watt as secondary, if not marginal’ (O.
Mayr, 1969, pp. 112 f.). The prospect of mastering a vast reservoir of motive
power and of applying it to augment human productive capacity was so
dominant that it was difficult to discern that something ultimately stupendous was
underway: the development of technologies of control systems. Even those who
understood the essentials of the paradigm shift — Ure and Marx — did not
get it quite right. When discussing machine systems and what made the interconnected machines a system, they were still limited by the concept of the ‘prime
mover’.
The conceptual distinction between the mechanism of provision of energy and
the mechanism of control, or between ‘energy flow’ and ‘control flow’, only became articulated and stable as the distinction was made in practice: when machine
builders began in actual practice to physically separate the mechanism for provision of energy from the mechanism for control of operations. And this separation was only accomplished in the course of a very long process of technological development.89
For a century only sporadic progress was made in terms of physically separate
and hence easily replaceable control mechanisms. Henry Maudslay’s slide rest
lathe for cutting screws from around 1800 represents one of the earliest and, for
ages, prevalent approaches to realizing the separation of control and power mechanisms. As expressed in a state-of-the-art review from 1855, published on the occasion of the World Exhibition in London in 1851, by the end of the 18th century
'nearly every part of a machine had to be made and finished to its required form by
mere manual labour; that is, we were entirely dependent on the dexterity of the
hand and the correctness of the eye of the workman, for accuracy and precision in
the execution of the parts of machinery’. However, with the ‘the advances of the
mechanical processes of manufacture […] a sudden demand for machinery of unwonted accuracy arose’. Maudslay’s new approach consisted in introducing a
‘slide rest’, i.e., a template whereby the movements of the lathe’s cutting tool
would be determined:
‘The principle here alluded to is embodied in a mechanical contrivance which has been substituted
for the human hand for holding, applying, and directing the motion of a cutting-tool to the surface
of the work to be cut, by which we are enabled to constrain the edge of the tool to move along or
across the surface of the object, with such absolute precision, that with almost no expenditure of
muscular exertion, a workman is enabled to produce any of the elementary geometrical forms —
lines, planes, circles, cylinders, cones, and spheres — with a degree of ease, accuracy, and rapidity,
that no amount of experience could have imparted to the hand of the most expert workman.’
(Committee of General Literature and Education, 1855, pp. 238 f.)
89 For an overview of this development, cf. Stuart Bennett’s two volume History of Control Engineering
since 1800 (1979, 1993).
This technology was soon applied as ‘part of every lathe, and applied in a modified form in the boring mill, the planing machine, the slotting engine, the drilling
machine, &c. &c’ (ibid.) where geometrically accurate lines, planes, circles, cylinders, cones, and spheres were required and gave rise to simple automatic machines for manufacturing parts. ‘Soon after its introduction the slide-rest was
made self-acting, that is, its motion along or across the surface to which the tool it
held was applied were rendered independent of the attention of the workman in
charge of it.’ (ibid.). (Cf. also, Gilbert, 1958).
In his discussion of this technology, Hirschhorn makes an important point:
‘From the beginning, automatic machine tools required some form of template so
that the eyes and hands of the worker, which once guided the tool, could be replaced by a machine piece. The template became the control element of the machine, determining the feed rate and movement of the tool.’ (Hirschhorn, 1984, p.
22). What Hirschhorn does not say but implies is that a control mechanism based on a template is of course a pattern-transference mechanism in that it transfers its shape or some transformation of its shape. Jacquard's famous punched-card device for controlling silk-weaving looms (also from the beginning of the 19th century) operated in a similar manner, the string of cards imparting — again in an analog manner — a particular pattern to the fabric (cf. W. English, 1958). An impressive piece of design in its own right, the Jacquard loom is not belittled by observing that it is farfetched and rather fanciful to see in it some kind of anticipation of numerically controlled machinery (e.g., Essinger, 2005). In its own very sophisticated way it basically imparts patterns the way a boot leaves a mark in the
snow: by pattern transference.
Control devices based on templates have taken many forms and do not really
concern us here. It is sufficient to point to the large family of control devices
based on a so-called cam, often an irregularly shaped component, which, while rotating, guides the motions of a 'cam follower' in a way similar to the way in which a
rotating axis guides the movement of the piston. Compared to other linking mechanisms (gear trains, etc.), the cam provides increased flexibility in developing ‘the
form and timing of a particular motion’, since ‘one cam can simply be replaced by
another’. ‘In effect’, Hirschhorn states, ‘the control of movement is no longer contained in an array of gears, clutches, and mechanical stops that together make up
the structure of the machine, but rather in the cam, which is separate from the
body of the machine’ (Hirschhorn, 1984, pp. 22 f.). However, as a control technology camshafts and similar mechanisms again suffered from severe limitations.
On the one hand, the construction and production of an irregularly shaped object that can impart a sequence of precise movements to a tool posed serious challenges. On the other hand, and worse, although the cam is physically separate from
the machine and can be replaced, it is a part of the energy transmission system in
that it both provides the guide for the tool and the force to move it:
'The linkage between the control and transformation systems of the machine places limits on the economics of cam making. Not only must the cam be powerful enough — that is, of sufficient metallic thickness and mass — to impart the necessary force; it must also be shaped so as to guide the tool accurately. Cams tend to be difficult and costly to shape and quick to wear out. Cam-following machines were thus used mostly for large runs that produced universal pieces such as screws and bolts.' (Hirschhorn, 1984, p. 23)
To put it crudely, mechanical control mechanisms were about as costly to construct and modify as the mechanisms they were designed to control. It was for this
reason that machine systems remained an economically marginal phenomenon,
confined to paper production and similar branches of manufacturing that were
producing vast quantities of simple and identical products. The exceptional cases
such as transfer lines in the automotive industry (car frames, cylinder blocks) were
truly impressive but were just that: impressive exceptions in an ocean of highly
specialized manual work combined with islands of semiautomatic machine tools
and motorized flow lines (Wild, 1972).
Machinery only began to spread beyond the confines of mass-production when
portable electrical motors began to appear at the end of the 19th century. They
were initially applied in machinery such as sewing machines, lathes, drilling machines, and printing presses. But by the 1920s the major part of industry was
‘electrified’. The electrical motor technology offered several advantages but, most
importantly, it made it technically and economically feasible to make machines
more flexible:
‘as long as power was obtained from a water wheel, a steam engine, or from a large electrical motor that powered a number of devices, the problems of transmitting power to each mechanism and
component of a machine required considerable ingenuity. The mechanical problems of transmitting
mechanical energy where it was needed to all portions of the machine greatly complicated machine structure.' (Amber and Amber, 1962, p. 146).
With the portable electric motor these almost paralyzing constraints could be relaxed, and this made it possible to create machinery that could be modified at lower cost:
‘The mechanic could now place two or more motors in a particular machine. The constraint on
machine design was reduced, since different parts could move at different speeds without being
connected to the same primary power source. Long gear trains were eliminated. In large machines,
independent portable motors could now direct individual segments moving in different planes,
eliminating the need for linkages that translated motions in one place to motions in another.’ ‘To
change the relative speeds of different machine sections, the mechanic or engineer, instead of
stripping the machine and replacing old gears and cams with new ones, need only adjust the relative speeds of the different electric motors. […]. The relaxation of machine constraints opened the
way to increasingly general-purpose machines, machines that could be modified at reasonable
cost.’ (Hirschhorn, 1984, pp. 19, 21).
It was, in practice, only with electromagnetic and electronic control devices
(switches, valves, transistors) that the concept of separate control devices began to
become articulated, but it was only with the advent of the electronic computer that
an economically feasible control technology became available. Machine systems remained an economically marginal phenomenon not just until the development of the electronic computer in the late 1940s, but until microprocessors, around 1980, made it technically and economically feasible not only to design control mechanisms and incorporate them in industrial machinery in the form of Computer-Numerical Controlled or CNC machinery (as reflected at the time in, e.g., Groover, 1980; Kochhar and Burns, 1983; Koren, 1983) and robotics (e.g., Engleberger, 1980), but also to interlink machines to form automatically coordinated machine systems in the form of Flexible Manufacturing Systems or FMS (e.g., Merchant, 1983) and to begin to explore ideas like Computer Integrated Manufacturing or CIM (e.g., Harrington, 1979). In short, it was only with the electronic computer in the form of the microprocessor that an economically viable technical basis for separating control and transformation devices was established.
Now, computer technologies did not, of course, originate from the challenges
of controlling machines in manufacturing. The stored-program electronic computer was developed for the purpose of automatic processing of massive calculation
work.
2.3. The universal control system: The stored-program computer
The electronic computer is just as much a machine, i.e., an automatically operating material artifact, as a self-acting mule or a Jacquard loom. It is conceivable
that it could be constructed from wheels and gears or electromechanical switches.
The difference is that in the former case it is the substantive nature of the moving
parts, the interaction of ‘rigid bodies’, that causes the state change; in the electronic mechanism it is the electrical charge that causes the state change. The (enormous) advantage of the latter is first of all that electrons, due to their small mass
compared to gears, can travel at a velocity close to the speed of light with minimal
expenditure of energy. Where one of Babbage's gears may have had a mass of, say, 100 grams, and the parts of the electromechanical switch circuits in tabulating machines that were in use for most of the 20th century may have had a mass of only 1 gram and hence could work so much faster, electrons have a mass of 9 × 10⁻²⁸ grams (Goldstine, 1972, p. 144). This of course means that 'turning a bit' 'on' or
‘off’ requires an insignificant amount of energy. Instead of state changes propagating in a system of interconnected gears and shafts, clouds of electrons are milling about in an equally causal way. In a ‘macroscopic’ mechanism, one cogwheel
meshes with another, the movement of the first causing the next cogwheel to
change to another position, and a discrete state change has been propagated within
the mechanism. Similarly, in an electronic mechanism electrons are amassed at a
certain ‘gate’ and, when the charge has reached a pre–specified threshold value,
the state of the gate switches. In either case, the pattern of state changes is an observable and highly regular correlation. But just as gears may jam, a wayward
cloud of electrons amassing at a particular spot of the circuit (triggered by, say,
energetic particles from cosmic radiation) may cause a gate to open and thus turn
a bit. Accordingly, a lot of engineering skill goes into the design of both types of
mechanism — ‘macroscopic’ as well as ‘microscopic’ — in order to ensure that
state changes that are supposed to propagate, and only those, do propagate; in
short, that state changes propagate in a dependable way.90 That is, electronic
computers are causal systems just as much as any other machines, from the medieval water clock to the mechanical calculator to the punched-card tabulator.
Whether macroscopic or microscopic, the behavior of a machine is a causal process, or a configuration of causal processes, that has been harnessed and thus, under certain conditions, behaves in an extremely regular fashion.
What we normally call ‘a computer’ — the Macintosh laptop in front of me,
say, or a mainframe computer somewhere in a climate-controlled windowless room — of course consists of a number of machines that are quite tangible: one or more CPUs as well as various specialized integrated circuits such as the arithmetic-logic unit, data bus, input and output devices, network connections, etc. In addition, however, 'the computer' consists of machines, 'software programs', that are
just as material as ‘chips’: they are just invisible and intangible, but so are X-rays
and TV signals.
The computer owes its advantages not just to the enormous speeds afforded by
electronics but to the stored-program architecture originally outlined by Alan Turing in his famous article on computable numbers (Turing, 1936).91 Programs are
treated as data and, when launched, reside in the computer’s storage as — invisible but no less physical — electronic patterns that can be activated and deactivated virtually instantaneously.
In short, the computer can be seen as the ultimate control mechanism; or rather,
in Turing’s words in a talk on his design for the Automatic Computing Engine
(ACE) given in 1947, the computer can ‘imitate’ the control mechanism of any
machine: a typesetter, a printing press, a lathe, a machining center, a jukebox. Turing started by referring to his paper ‘On computable numbers’ (1936):
‘Some years ago I was researching on what might now be described as an investigation of the theoretical possibilities and limitations of digital computing machines. I considered a type of machine
which had a central mechanism, and an infinite memory which was contained on an infinite tape.
This type of machine appeared to be sufficiently general.’ (Turing, 1947, p. 378).
He then elaborated on the implications of the stored program architecture:
‘It can be shown that a single special machine of that type can be made to do the work of all. It
could in fact be made to work as a model of any other machine. The special machine may be
called the universal machine; it works in the following quite simple manner. When we have decided what machine we wish to imitate we punch a description of it on the tape of the universal machine. This description explains what the machine would do in every configuration in which it
might find itself. The universal machine has only to keep looking at this description in order to
find out what it should do at each stage. Thus the complexity of the machine to be imitated is concentrated in the tape and does not appear in the universal machine proper in any way.
90 As Herman Goldstine puts it, the ENIAC 'had to operate with a probability of malfunction of about 1 part in 10¹⁴ in order for it to run for 12 hours without error. Man had never made an instrument capable of operating with this degree of fidelity or reliability, and this is why the undertaking was so risky a one and the accomplishment so great.' (Goldstine, 1972, p. 153).
91 ‘In the conventional literature, von Neumann is often said to have invented the stored-program computer,
but he repeatedly emphasized that the fundamental conception was Turing’s’ (Copeland and Proudfoot, 2005,
p. 114 et passim).
If we take the properties of the universal machine in combination with the fact that machine processes and rule of thumb processes are synonymous we may say that the universal machine is one
which, when supplied with the appropriate instructions, can be made to do any rule of thumb process. This feature is paralleled in digital computing machines such as the ACE. They are in fact
practical versions of the universal machine. There is a certain central pool of electronic equipment,
and a large memory. When any particular problem has to be handled the appropriate instructions
for the computing process involved are stored in the memory of the ACE and it is then “set up” for
carrying out that process.’ (Turing, 1947, p. 383).
The Universal Turing Machine is a mathematical construct, not a real-world
machine; it has infinite storage, which no real machine can have, of course. So,
while mathematics deals with ‘theorems, infinite processes, and static relationships’, ‘computer science emphasizes algorithms, finitary constructions, and dynamic relationships’. This means that ‘the frequently quoted mathematical aphorism, “the system is finite, therefore trivial,” dismisses much of computer science’
(Arden, 1980, p. 9). That is, the Universal Turing Machine cannot be taken simply
as the theoretical model of the computer. However, the concept of the stored-program computer as 'universal' or, better, inexhaustibly malleable, is the key concept in computing technology.
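Turing's point that 'the complexity of the machine to be imitated is concentrated in the tape' can be made concrete with a minimal interpreter. The sketch below is my own illustration, not Turing's formulation: the function `run` plays the part of the fixed 'universal machine', while the particular machine it imitates exists only as data, a transition table standing in for the description punched on the tape.

```python
from collections import defaultdict

def run(description, tape, state='s', max_steps=10_000):
    """The fixed 'universal machine': it keeps looking at the imitated
    machine's description to find out what to do in each configuration."""
    cells = defaultdict(lambda: '_', enumerate(tape))  # unbounded tape, blank = '_'
    head = 0
    for _ in range(max_steps):
        if state == 'halt':
            break
        state, symbol, move = description[(state, cells[head])]
        cells[head] = symbol
        head += {'L': -1, 'R': 1, 'N': 0}[move]
    return ''.join(cells[i] for i in sorted(cells)).strip('_')

# One particular 'machine', given purely as data: invert a binary string.
invert = {
    ('s', '0'): ('s', '1', 'R'),
    ('s', '1'): ('s', '0', 'R'),
    ('s', '_'): ('halt', '_', 'N'),
}
print(run(invert, '0110'))  # prints 1001
```

Feeding `run` a different table makes the same interpreter imitate a different machine; none of the complexity of the imitated machine appears in the interpreter itself.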
The stored-program computer can be reconfigured endlessly, manually or automatically, in response to internal and external state changes. In fact, what we
call a computer forms a hierarchy of distinct but interacting control mechanisms:
an ‘operating system’ that itself is a hierarchical system of control mechanisms
devoted to the managing of data and programs stored in RAM, input and output
and external storage devices (‘drivers’), and so on, as well as so–called application programs that may also be hierarchical systems and form hierarchical relations with other application programs. This means that the configuration and reconfiguration of the system of machines constituting the computer, the programs
in RAM as well as programs on local harddisks and remote ‘servers’, can be controlled automatically, one machine triggering the execution of another in response
to certain conditions, and that human intervention, if required, can be performed
semi–automatically or ‘interactively’ as, for instance, a mouse click making one
software machine activate another.
What is more, the design and construction of software machines (i.e., programming, compiling, testing) can itself be performed semi-automatically. This again means that designing and constructing software machines is immensely cheaper and faster than designing and constructing, say, a camshaft or a gear train, or for that matter rewiring the 'switchboard' control panel of a punched-card tabulator. And it goes without saying that what applies to designing and constructing software machines applies equally to redesigning and reconstructing software machines. Moreover, of course, software machine design specifications can be copied and transported automatically and at insignificant cost. And, finally, the stored-program technology also makes the construction and modification of large-scale machine systems incomparably inexpensive. In short, with the electronic
stored-program computer, we have the technology for 'the production of machines by machines' vaguely but perspicaciously anticipated by Marx (1867a, p.
314).
However, computing technologies did not come out of a box, ready to 'plug and play'. They do not, first of all, originate from a particular body of mathematical theory; to be sure, their development has depended critically upon a host of
mathematical theories (recursive function theory, Boolean algebra, Shannon’s information theory, etc.), but they were not the result of the application of any particular theory. As pointed out by the eminent historian of computation Michael
Mahoney, computer science has taken ‘the form more of a family of loosely related research agendas than of a coherent general theory validated by empirical results. So far, no one mathematical model had proved adequate to the diversity of
computing, and the different models were not related in any effective way. What
mathematics one used depended on what questions one was asking, and for some
questions no mathematics could account in theory for what computing was accomplishing in practice.’ (Mahoney, 1992, p. 361). It is no surprise, then, that ‘the
computer’ was not ‘invented’: ‘whereas other technologies may be said to have a
nature of their own and thus to exercise some agency in their design, the computer
has no such nature. Or, rather, its nature is protean’ (Mahoney, 2005, p. 122). It
would be more accurate to conceive of this in terms of costs and thus say that
computing technology is protean in that the costs of construction and modification
of software machines are drastically reduced compared to those of previous machine technologies. Anyway, according to Mahoney, there therefore was a time, a
rather long time, ‘when the question “What is a computer, or what should it be”,
had no clear-cut answer’ and the computer and computing thus only acquired
‘their modern shape’ in the course of an open-ended process that has lasted decades (Mahoney, 1992, p. 349). And there is no reason why one should assume that
the concept of computing as we know it has solidified and stabilized: the jury is
still out, as the immense malleability of ‘the computer’ is being explored in all
conceivable directions.
In other words, it is confused to conceive of ‘the computer’ as one technology.
Not only does ‘the computer’ in front of me incorporate a host of technologies,
from metallurgy to semiconductor technology to programming languages and operating systems; it can assume an endless range of very different incarnations. It is
a protean technology, and how it develops is determined by its applications in a
far more radical sense than any other technology.
To understand the place of CSCW in this open-ended array of known and possible forms of technology requires that we have an idea of the received concepts
of computing: the practical applications for which various computing technologies
were developed and that, accordingly, have formed our conceptions of computing
and computational artifacts. To do so, I will highlight some of the conceptually
distinct forms.
2.4. Origins of computing technologies in cooperative work
Electronic computing technologies initially arose from the development of technologies of large-scale calculation work in science and engineering, on the one hand,
and in administrative work organizations on the other.
The progressive forms of work organization discussed above also appear as a
recurring theme in the development of the practices of computing. For a couple of
centuries the development of these technologies followed the pattern we have dubbed progressive forms of work organization. While this might
be taken as proof that these forms are indeed universally necessary forms of development, the banal truth may be simply that managers of large-scale computing
work knew of the ‘putting out’ system and the systematic division of labor of the
manufactures and applied these as tested technologies. Whatever the case, the major
milestones are the following.
2.4.1. Division of ‘mental labor’
In the first year of the French Revolution the new revolutionary regime decided to
scrap the received systems of measurement and introduce a new and conceptually
coherent system based on the decimal notation. The motive was no secret: it was
an intervention to ensure that the myriad of local measurement systems did not
pose obstacles for the new regime in its need for raising taxes: ‘At that time, each
region was free to establish its own set of measures. Local officials easily manipulated these measures to their own advantage in a number of ways. Commonly,
they could keep a large measure to collect taxes of grain and produce but reserve
smaller measures for the payment of their own debts’ (Grier, 2005, pp. 33 f.). On
top of these mundane motives, however, the new metric system, as it is now
known, was to be devised and presented in such a way as to demonstrate the
grandeur of the revolutionary regime. Hence the new metric system would only
allow decimal fractions for the sub-divisions of the units of measure ‘of whatever type’, and accordingly the quadrant of a circle and angular measures were
also to be made to conform to this rule.
The task of producing these tables was assigned to the director of the Bureau du cadastre at the École des ponts et chaussées, Gaspard de Prony, who later related that this implied that 'all existing trigonometric tables, whether presented
in natural or logarithmic form […] became useless; and it was found necessary
to calculate new ones’ (de Prony, 1824). Moreover, also in the name of grandeur, de Prony ‘was engaged expressly not only to compile tables which left nothing to be desired about their accuracy, but also to make of them “a monument to
calculation the greatest and the most impressive that had ever been executed or
even conceived”’. Adding that these were ‘the exact expressions that were used
in the brief’ he was given, de Prony specified that this meant that the tables
would have to be extended and calculated to 14 or 15 decimal places instead of
8 or 10.
Chapter 11: Formation and fragmentation
327
De Prony accepted the assignment 'unconditionally' but, realizing that he 'could not hope to live long enough to finish the project', he found himself in an 'embarrassment more arduous than [he] could hide'. However, a happy circumstance 'unexpectedly' helped him out of this 'embarrassment':
‘Having one day noticed, in the shop of a seller of old books a copy of the first English edition
1776, of Smith’s “Treatise on the Wealth of Nations”, I decided to acquire it, and on opening the
book at random, I came across the chapter where the author had written about the division of labour; citing, as an example of the great advantages of this method, the manufacture of pins. I conceived all of a sudden the idea of applying the same method to the immense job with which I had
been burdened, to manufacture my logarithms as one manufactures pins. I have reasons to believe
that, without realising it, I had already been prepared for this realisation from certain parts of
mathematical analysis, on which I had then been giving tuition at the École Polytechnique.’ (de
Prony, 1824).
De Prony may have been exaggerating somewhat in this account, given about 30
years after the event, perhaps as a rhetorical gesture, for the pin manufacture paradigm was very well known in the circles of engineers and scientists he belonged
to. In fact, his teacher at the École des ponts et chaussées, benefactor, and later immediate superior and predecessor as director at the École, was the very same Jean-Rodolphe Perronet who had provided the most thorough analysis of the very same pin manufacture in Normandy that Smith had described (Bradley, 1998). Be that as it may, de Prony certainly applied the principles of manufactures to accomplish the task at hand.
In a way, the division of labor had already been applied some fifty years before
by Alexis-Claude Clairaut in an effort to calculate the orbit of the comet Halley
had identified in 1682 and thereby its next perihelion. The task was massive because the calculation of the orbit of the comet required a solution to a ‘three body
problem’, as the orbit is influenced by the gravitational fields of large bodies such
as the Sun, Saturn, and Jupiter. Halley did not manage to arrive at a satisfactory
solution but Clairaut invented a method of dividing the calculation in such a way
that it could be performed in a system of division of labor. Together with two friends, Joseph-Jérôme Lalande and Nicole-Reine Lepaute, Clairaut embarked on the massive task. Almost five months later, in November 1758, they were able to
publish the prediction that the next perihelion would occur on 13 April 1759
(Grier, 2005). Although the prediction was one month off the mark, the principle
of performing massive computations in parallel, as a cooperative effort based on
systematic division of labor, was picked up and refined for similar purposes. Lalande himself, from 1759 tasked with calculating astronomical tables published
annually by the Académie des sciences under the title Connaissance des temps,
employed a small number of skilled ‘computers working out of their own homes’
to do so (Croarken, 2003). In the UK, Nevil Maskelyne, the Astronomer Royal at Greenwich, similarly tasked with providing British sailors with a practical technique for determining longitude, advocated a method (the 'lunar distance method') that also required the annual calculation and publication of tables (in a Nautical Almanac), and he thus had to devise a computing system that would enable him to accomplish this as a practical task. He was familiar with the form of
work organization that had been developed by Lalande (whom he knew well) and
adopted this ‘distributed structure, albeit on a slightly larger scale’: ‘Maskelyne
proposed employing a number of computers, each of whom was to undertake the
complete set of calculations for specified months of the Nautical Almanac’, and
for that purpose he ‘designed a distributed system using relatively skilled workers
and which had more in common with cottage industries such as lace making than
in the factory manufacture of pins’ (Croarken, 2003, p. 52). Maskelyne provided
not only paper and ink, but also ‘computing plans’, that is, instructions that were
written on one side of a sheet of folded stationery and that would summarize each
step of the calculation. ‘On the other side of the paper he drew a blank table, ready
for the computer to complete’ (Grier, 2001, p. 30).
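Maskelyne's 'computing plans' can be read as an early separation of the program from the worker: the plan fixes the sequence of steps, and the blank table receives the results. A minimal sketch of that structure follows; the step names and arithmetic are invented for illustration and have nothing to do with the actual contents of the Almanac:

```python
# A "computing plan" as data: a fixed sequence of arithmetic steps that any
# computer (human or machine) can follow without understanding the astronomy.
# The steps and column names below are hypothetical, for illustration only.

def run_plan(plan, inputs):
    """Fill in one row of the blank table by applying the plan's steps in order."""
    row = dict(inputs)
    for target, op in plan:
        row[target] = op(row)   # each step writes one cell of the table
    return row

# Hypothetical plan: two intermediate columns and a result column.
plan = [
    ("double",  lambda r: 2 * r["x"]),
    ("shifted", lambda r: r["double"] + 7),
    ("result",  lambda r: r["shifted"] * r["x"]),
]

# The same plan, applied mechanically to several "blank table" rows.
table = [run_plan(plan, {"x": x}) for x in (1, 2, 3)]
for row in table:
    print(row["x"], row["result"])
```

The point of the sketch is that the plan, like Maskelyne's folded sheet, is an artifact separate from the person executing it: the same plan can be handed to any worker for any month's rows.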
De Prony, who knew Maskelyne well and would have been familiar with the
British arrangement of mass-calculation work (Grier, 2001, p. 35), did not have to
start from scratch. However, whereas the previous experiments in cooperative calculation based on division of labor remained entrenched in artisanal work, as in
the example of calculating the perihelion of Halley’s comet, or in the ‘putting out’
system, as in the examples of calculating astronomical tables, de Prony went all
the way, so to speak, and adopted the form of full-fledged manufactures. He devised ‘his new manufacture’, as he later called it, as a ‘system of the division of
labour’ in three sections: The first was composed of four or five ‘geometricians of
very high merit’ that were given the task of choosing the mathematical formulae
to be used for calculation and checking; the second section was composed of ‘calculators, who possessed a knowledge of analysis’, and whose task it was to construct the ‘spreadsheets’ to be filled in by the members of the third section. The
sheets were divided into 100 intervals, with the numbers of the top line provided by the 'calculators' of the second section.
‘This third group comprised no less than seventy or eighty individuals; but it was the easiest to
form, because, as I had foreseen, they did not need, in order to be admitted [to this group] any preliminary instruction; the one essential condition, for their admission [to the group], was for them to
know the first two rules of arithmetic […]. The ninety-nine remaining lines were then filled in by
means of purely mechanical operations carried out by the 3rd section, each of whom was performing 900 to 1000 additions or subtractions per day, nearly all of whom not having the least theoretical notion on the work which they were doing.’ (de Prony, 1824)
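The arithmetic that made this division of labor possible is the method of differences: for a suitable approximating polynomial the higher-order differences are constant, so once the second section has supplied a starting value and its differences, the rest of the sheet can be filled by addition alone. The following sketch illustrates the principle with a toy function and step size; these are illustrative choices, not de Prony's actual formulae:

```python
# De Prony's sheets, sketched. The second section supplies the top line
# (a starting value plus a column of finite differences); the third
# section fills the remaining lines using nothing but addition. For a
# polynomial of degree k, the k-th difference is constant, so the
# scheme is exact. Function and step size below are illustrative only.

def seed_differences(f, x0, step, order):
    """The second section's work: initial value and forward differences of f."""
    values = [f(x0 + i * step) for i in range(order + 1)]
    diffs = []
    while values:
        diffs.append(values[0])
        values = [b - a for a, b in zip(values, values[1:])]
    return diffs  # [f(x0), first difference, second difference, ...]

def fill_sheet(diffs, lines):
    """The third section's work: extend the table by additions alone."""
    d = list(diffs)
    sheet = []
    for _ in range(lines):
        sheet.append(d[0])
        # propagate the differences: each entry absorbs its neighbour
        for i in range(len(d) - 1):
            d[i] += d[i + 1]
    return sheet

# Tabulate x^3 exactly, using only the additions a former wig-dresser could do.
sheet = fill_sheet(seed_differences(lambda x: x ** 3, 0, 1, 3), lines=10)
print(sheet)  # first entries: 0, 1, 8, 27, 64, ...
```

Note that `fill_sheet` contains no multiplication at all, which is precisely why the third section needed only 'the first two rules of arithmetic'.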
The claim that the computers were only required to master addition and subtraction is underscored by the fact that the ranks of the third group were staffed by former 'hairdressers', that is, wig dressers, who were in deep trouble after the aristocratic fashion of wearing wigs had suddenly become a hazardous one (Grattan-Guinness, 1990). De Prony said almost as much when, in his account decades later, he noted 'that many among them came to seek and find, in this special workshop, a safeguard, a refuge which, happily, was not violated, and that the political circumstances of that time rendered these fully necessary' (de Prony, 1824).
The ‘calculators’ certainly had no prior understanding of interpolation but that did
not prevent the cooperative effort from working very efficiently, producing 700
results per day (Grattan-Guinness, 1990). When the project was completed, in
1801, the concerted work of the former hairdressers had produced ‘about
2,300,000 numbers, of which 4 or 500,000 consisted of 14 to 25 digits’ of which
‘99 per cent’ had been calculated ‘by means of a manufacturing procedure’. The
tables were ultimately collected in ‘seventeen grand in-folio volumes’ and were
deposited at the Paris Observatory. The quality of the work of the calculators was
as flawless as it gets. Having explained that 'All the calculations were done twice: expedite means of verification, but very rigorous, were prepared in advance', de Prony emphasized that he 'noticed that the sheets the most exempt from error were, in particular, furnished by those who had the most limited intelligence [education], an existence, so to speak, "automatique"' (de Prony, 1824). The former
hairdressers were thus working in an arrangement very similar to that of the
workers in the pin manufactures: they mastered their local task fragment but not
the larger scheme as conceived by de Prony and his master planners.
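The verification scheme, duplicate independent computation followed by a mechanical comparison of the two copies, can be sketched in a few lines; the sample sheets below are invented for illustration:

```python
# De Prony's check, sketched: every sheet is computed twice by independent
# workers, and the two copies are compared cell by cell. Any disagreement
# flags a line for recomputation. The sample values are invented.

def compare_sheets(sheet_a, sheet_b):
    """Return the line numbers where the two independent copies disagree."""
    return [i for i, (a, b) in enumerate(zip(sheet_a, sheet_b)) if a != b]

copy_1 = [0, 1, 8, 27, 64]
copy_2 = [0, 1, 8, 28, 64]      # one slip by the second computer
print(compare_sheets(copy_1, copy_2))   # a single disagreement, at line 3
```

The comparison itself, like the additions, requires no understanding of what is being tabulated, which is why it could be built into the division of labor rather than left to the 'geometricians'.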
The historical role of de Prony's calculation manufacture is similar to that of Adam Smith's pin manufacture case, although it has not been as dramatized and hyped.
First of all, the cooperative arrangement of calculation organized by de Prony became widely known, not least due to the prominence given to it by Charles Babbage. In 1821 Babbage and his friend John Herschel undertook to produce a set of
mathematical tables for the British Nautical Almanac. Organizing the work in accordance with the principles devised by Maskelyne, Babbage and Herschel employed two skilled calculators under the ‘putting out’ scheme. — Now, to understand the work the two men were engaged in, one should be aware that mathematical tables were of enormous practical importance in the emerging industrial
economy but were also riddled with calculation and typesetting errors. The Nautical Almanac, a set of tables that from the point of view of maritime safety was
‘crucially significant’, ‘contained at least one thousand errors, while the multiplication tables its computers used sometimes erred at least 40 times per page’
(Schaffer, 1996, p. 277). Herschel later commented that ‘an undetected error in a
logarithmic table is like a sunken rock at sea yet undiscovered, upon which it is
impossible to say what wrecks may have taken place’ (Swade, 2003, p. 157). In
short, assured reliability of calculation was of critical economic and social importance. — So, while Babbage and Herschel in 1821 were engaged in the tedious
task of proofreading the manuscripts they had received from their two calculators,
it was suggested by one of them, ‘in a manner which certainly at the time was not
altogether serious, that it would be extremely convenient if a steam-engine could
be contrived to execute calculations for us, to which was replied that such a thing
was quite possible, a sentiment in which we both entirely concurred' (Babbage,
1822, quoted in Campbell-Kelly, 1994b, p. 14).
The shared sentiment should be understood against this background. At that time, in
1821, Babbage and Herschel knew of de Prony’s work. A few years earlier, in
1817, both of them had signed a letter recommending de Prony’s appointment to
the British Royal Society (Bradley, 1998, p. 209), and in 1819, the two of them
had visited Paris, and during this trip Babbage was able to inspect, to some extent,
the Tables du cadastre that — in spite of having been produced twenty years earlier — still had not been printed (for financial reasons). During a visit to the designated publisher of the tables, Didot, Babbage was able to see typeset pages and
was given a copy of the section of the sine tables (Schaffer, 1996, p. 278).
Anyway, shortly after realizing that calculations could be performed 'by steam', Babbage began to design an experimental prototype to demonstrate the feasibility
of mechanical production of tables on the basis of the method of finite differences
(with which he was ‘completely familiar’ well before his visit to Paris, cf.
Lindgren, 1987, pp. 44 f.). We do not know to what extent he took de Prony's table manufacture as a guide or model in this effort, but when he presented the prototype of the Difference Engine to the public in June 1822 — he did that in an open letter to the president of the Royal Society to obtain official support (Babbage, 1822) — Babbage used de Prony's Tables du cadastre as a proof of concept: if a careful arrangement of workers, based on division of labor, can
produce sophisticated mathematical tables of high quality by means of straightforward but repeated addition, then a properly designed machine could do the
same, thus making ‘the intolerable labour and fatiguing monotony of a continued
repetition of similar arithmetical calculations’ redundant while at the same time
reducing the required labor force by about 88 percent.
Babbage is often described as one of the earliest pioneers of mechanical computing. His work is of course already well known and this is not the place for an
account and discussion of his impressive oeuvre.92 But from a practical point of
view, Babbage’s work on the Difference Engines and the Analytical Engine could
be seen as a wasted effort. In spite of the large sums that were invested in their
design and construction, none of the projects were brought to completion. However, his work is of course of great historical interest in its own right. First of all,
it had direct influence on the design of a series of difference engines that were
built and used over the following decades by engineers such as Scheutz, Wiberg, Grant,
and Hamann, and in contrast to Babbage’s designs, some of these machines were
actually used (Lindgren, 1987). But on balance, the approach developed by Babbage turned out to be of marginal practical utility (M. R. Williams, 2003). Or in
the words of Alan Bromley, ‘That these were not extensively used or developed,
despite the apparent complete success of the Wiberg machine, indicates that the
entire idea was not well judged. The sub-tabulation task, though laborious, was
not the dominant mathematical task in the preparation of tables nor, with adequate
organization and management, was it of overwhelming practical importance’
(Bromley, 1990, p. 96). One is thus led to the conclusion that the ‘fruits of Babbage’s considerable genius’ were ‘effectively wasted as far as practical influence
is concerned’ (ibid., p. 97).
One should of course not underestimate the moral example of his projects.
They demonstrated to computing researchers in the 20th century (such as Howard Aiken and Vannevar Bush) that complex computation by means of automatic digital artifacts was feasible. However, the inspiration has never been transformed into anything technologically specific.

92 For general descriptions of the development of Babbage's work, cf. his autobiography (Babbage, 1864, chapters V, VII, and VIII), the host of studies in the history of technology devoted to Babbage (e.g., Collier, 1970; Lindgren, 1987; Bromley, 1990; Schaffer, 1996), as well as a few reliable popular biographies (e.g., Swade, 2000).

In fact, it is when 'we come to examine the
facilities available for programming the Analytical Engine that Babbage’s designs
begin to look strange to modern eyes’ (Bromley, 1990, p. 87). ‘The conclusion
seems inescapable that Babbage did not have a firm command of the issues raised
by the user-level programming of the Analytical Engine. It would be quite wrong
to infer that Babbage did not understand programming per se.’ In so far as programming is concerned, his focus was what we now call microprogramming and
it was from this base that Babbage explored the ideas of user-level programming.
However,
‘The issues of data structuring simply did not arise at the microprogramming level. There is some
evidence to suggest that Babbage’s ideas were moving in the directions now familiar in connection
with the control mechanisms for loop counting in user-level programs. Had an Analytical Engine
ever been brought to working order, there can be no doubt that Babbage’s programming ideas
would have been developed greatly.’ (Bromley, 1990, p. 89)
In other words, if one considers the contribution of the Babbage engines narrowly from the point of view of their sophisticated technicalities, divorced from their use, one is likely to miss the fact that the technology he developed was not used and that he did not arrive at the point where use issues did arise.
The real importance of de Prony's example is not its probable impact on the development of Babbage's engine designs. It was, in the words of Babbage, 'one of the most stupendous monuments of arithmetical calculation which the world has yet produced' (Babbage, 1822, p. 302), and in his On the Economy of Machinery and
Manufactures he not only gave a detailed account of de Prony’s accomplishment
but he did so under the heading ‘On the division of mental labour’ (Babbage,
1832, §§ 241-247), stating the conclusion to be drawn from de Prony’s example
sharply in the opening sentence of the chapter: ‘We have already mentioned what
may, perhaps, appear paradoxical to some of our readers that the division of labour can be applied with equal success to mental as to mechanical operations, and
that it ensures in both the same economy of time.’ (Babbage, 1832, § 241). To see
the historical importance of this, one should know that it was this book, not the ill-fated design projects, that defined Babbage's reputation among his contemporaries, and it was reprinted many times and translated into a large number of European
languages. He was, in fact, better known as an economist than as a technologist.
His book was based on extensive and conscientious fieldwork in the textile manufacturing districts of England and on the Continent undertaken in the course of the 1820s, and as pointed out by the historian Campbell-Kelly in his introduction to Babbage's autobiography (1994b), the Economy of Machinery and Manufactures is generally seen as being 'in the direct line of descent' from Adam Smith's Wealth of Nations to Frederick Winslow Taylor's Principles of Scientific Management (1911).
Accordingly, to the generations of scientists, economists, managers, and workers who read Babbage’s On the Economy of Machinery and Manufactures, de
Prony’s example was the ‘proof of concept’ that the principle of division of labor,
which they of course knew from Adam Smith and from their daily work, could be
applied equally well to 'mental labour'. Cooperative work based on advanced division of 'mental labor', as devised by de Prony in the wake of the French Revolution, became widespread in the course of the 19th and early 20th centuries. It not
only became the standard way of handling the increased load of scientific and engineering calculation (Grier, 2001), but it also, with the rise of large-scale financial and industrial corporations, became the predominant way of organizing administrative work in such settings. That is, the real importance of de Prony’s example lies in the model it provided for the organization of mental work in the next
two centuries: the organization of ‘human computers’ and ‘calculators’ in scientific and engineering laboratories, in insurance companies and accounting offices,
in inventory management and production planning in manufacturing, and so forth.
(For a description of a classical case, the Railway Clearing House, cf. Campbell-Kelly, 1994a.) Babbage's direct assault on the mechanization of mental labor
came to naught. It was de Prony’s scheme that ruled the day for more than 150
years.
For a century after de Prony’s calculation manufacture and for more than half a
century after the difference engines designed on the Babbage model, calculation
work remained strictly ‘manual’, as one is awkwardly tempted to call it, meaning
that all operations are performed by mind and hand, assisted by the use of pen and
paper and perhaps an abacus or a slide rule.
The technology of mechanical calculation machines matured only in the course of the 19th century, and then only at a glacial pace. For although mechanical calculating machines date back to the 17th century (e.g., Schickard, ca. 1620; Pascal, 1642; Leibniz, 1674), we should not (again) be misled by the chronology of
inventions and disregard technology in actual use. As pointed out by a historian of
computing, ‘Mechanical calculating machines were essentially useless toys during
the first two centuries of their development. The level of [metal-working] technology of the day guaranteed that any attempt to produce a reliable, easy to use
instrument was doomed to failure.’ (M. R. Williams, 1990, p. 50). The ‘first machine that can be said to have been a commercial success’ was a calculator created
by Thomas de Colmar in 1820, but the technology of mechanical calculation only
stabilized late in the 19th century with the development of the Baldwin-Odhner
calculator (patented 1875) that offered a practical and robust solution to the carry
issue (a variable-toothed gear). The most famous designs based on this technology were perhaps the calculators produced by the Brunsviga company in Germany from 1892. The scope and level of automatic control remained quite narrow and low, however, restricted to, for instance, control of carry operations in addition
tasks. As in the cotton trades one hundred years earlier, mechanization of ‘mental
labor’ was an incremental process.
Mechanical calculation machines were operated in isolation from each other,
by the individual part-worker, as a means of speeding up the tedious task of doing
massive arithmetical tasks. The reason for this is that the level of automatic control of operations was so rudimentary (automatic carry was the overwhelming issue) that one can safely say that 'In these machines the control function was
provided by the human operator’ (Bromley, 1990, p. 59). They were little more
than sophisticated tools. Under these conditions, mechanical calculation machines did not permit the operator to escape subjection to specialization in performing a specific operation (or a narrow range of related operations) and take on the role of supervising the operations of the machine. The devices were simply incorporated into the received division of labor as a means of increasing the speed at which individual operations could be performed.
2.4.2. Mechanization of ‘mental labor’
Calculation in administrative work began to be mechanized around the beginning of the 20th century, with the invention and dissemination of punched-card
tabulators (and associated punching equipment, sorters, printers, etc.). Tabulating
machines were soon equipped with plugboards or switchboards, so that the machines could be reconfigured to handle other tasks and card formats.
Invented for use in the processing of massive statistical data (the US census
1890), punched-card machinery quickly became appropriated and used by railroad
and utilities companies as well as by manufacturers and government agencies.
However, because of their costs, punched-card tabulators were generally confined
to use in settings such as these that were in need of (or could exploit) large-scale
data processing:
‘Punched-card machinery was expensive to rent and consequently was only used, at first, by very
large organizations that could make good use of its ability to make short work of a large volume of
transactions; the needs of small businesses could be met adequately by less automatic but lowercost bookkeeping machines, such as those made by Burroughs. The Hollerith [punched-card] machines, however, arrived at a critical period in the development of large-scale American enterprise;
it was during this period in the late nineteenth and early twentieth centuries that much of modern
business accounting practice came into existence, particularly cost accounting in manufacturing.’
(Campbell-Kelly, 1990, p. 145).
As large-scale economic organizations evolved, the use of punched-card tabulating machinery became widespread. By 1913 a journalist reported that
‘the system is used in factories of all sorts, in steel mills, by insurance companies, by electric light
and traction and telephone companies, by wholesale merchandise establishments and department
stores, by textile mills, automobile companies, numerous railroads, municipalities and state governments. It is used for compiling labor costs, efficiency records, distribution of sales, internal
requisitions for supplies and materials, production statistics, day and piece work. It is used for analyzing risks in life, fire and casualty insurance, for plant expenditures and sales of service, by public service corporations, for distributing sales and cost figures as to salesmen, department, customer location, commodity, method of sale, and in numerous other ways. The cards besides furnishing
the basis for regular current reports, provide also for all special reports and make it possible to
obtain them in a mere fraction of the time otherwise required.’ (quoted in Campbell-Kelly, 1990,
p. 145).
That is, with the use of tabulating machinery important sections of administrative
work assumed the character of the machine-based factory.
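Stripped of the machinery, a tabulating run amounted to sorting a deck of fixed-field cards on one column and then accumulating totals per group; a reconfigured plugboard corresponds, roughly, to choosing different key and amount columns for the same deck. A minimal sketch, with an entirely hypothetical card layout and values:

```python
# Punched-card tabulation, sketched: sort a deck on one field, then
# accumulate a total per group -- the core of what a sorter followed by
# a tabulator did. The card layout (fields and values) is hypothetical.

from itertools import groupby
from operator import itemgetter

# Each "card" records one transaction: (department, salesman, amount).
deck = [
    ("hardware", "Jones", 120),
    ("textiles", "Smith",  45),
    ("hardware", "Brown",  80),
    ("textiles", "Jones",  60),
]

def tabulate(deck, key_index, amount_index):
    """Sort the deck on the key field, then total each resulting group."""
    sorted_deck = sorted(deck, key=itemgetter(key_index))
    return {
        key: sum(card[amount_index] for card in group)
        for key, group in groupby(sorted_deck, key=itemgetter(key_index))
    }

print(tabulate(deck, 0, 2))   # totals by department
print(tabulate(deck, 1, 2))   # the same deck, re-sorted by salesman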
It is of some relevance here to note that, in some instances, the technology was
used also in the coordination of cooperative work, in a manner that in some ways
anticipates kanban systems, as opposed to off-line administration of work settings
and processes. In the 1920s and 1930s ‘the increasing variety of styles, colors, and
options began to slow production and delay delivery’, and ‘automobile companies
turned to the use of tabulator machinery to overcome these delays’ (Norberg,
1990, p. 774). Norberg describes an innovative use of tabulating machinery in
Chrysler Corp. for purposes of coordination:
‘Upon receipt of a dealer’s order, two cards were punched with the essential information supplied
by the dealer: routing, region, district, dealer’s name, order number, item number, model, body
type, paint, trim, wheels, transmission, radio, heater, and other options. One card went to the
equivalent of a production planning office where a “Daily Master Building Schedule” was prepared, and one went to the car distribution department where it was filed according to region and
dealer. Multiple copies of the production card went to the various inventory-control points for
parts, while several copies stayed with the car as it was constructed. When the car reached the
shipping department, one of the last cards remaining was checked to see that the order was correct.
If so, the car was shipped and the dealer was notified when to expect it.’ (Norberg, 1990, p. 774)
In this case the punched card is not simply a record of an event to be processed
at a later stage for secondary use, e.g., for statistical purposes, for purposes of
payment, etc. The card is a mechanically generated coordinative artifact that provides the various stations in the large-scale network of activities with appropriate
information about the particular order.
Like desktop calculators before them, tabulating machines were and remained stand-alone devices, with operators supervising the operation of the individual
machines and handling the transfer of cards between machines: punchers, sorters,
tabulators, printers. On the basis of stacks of discrete cards a higher degree of automatic integration of operations was not feasible, but the technology remained in
use until the cost of electronic computing made it a viable option for ordinary
work settings to move beyond the confines of punched-card tabulator technology.
By the late 1960s traditional punched-card machines had effectively gone out of
production, and by the late 1980s punched cards had all but vanished (Campbell-Kelly, 1990, p. 151).
So, although component technologies of punched-card tabulating were appropriated and used in the first generations of electronic computers (punched cards,
card readers, printers, plugboards, etc.), the electronic digital computer did not
grow out of this technology, nor did it grow out of the needs of administrative
work. The first generations of electronic digital computers were designed and
built for massive scientific and engineering calculation.
While administrative work became mechanized, the mechanization of scientific and engineering calculation remained sporadic and fragmentary, still relying on desktop calculators; this lasted until the Second World War. In a few large-scale
research settings that could afford the cost, punched-card tabulators were appropriated for the purposes of scientific and engineering calculation (e.g., Snedecor,
1928; W. J. Eckert, 1940; McPherson, 1942), but in general human computers
were still tasked with calculating by rote in accordance with a systematic method
supplied by an overseer.
Donald Davies, who was to play a key role in the development of packet-switched
digital networks, recalls from his work as a young scientist in the UK during the
Second World War:
‘The Second World War saw scientific research projects of a size and complexity that reached new
levels. Underlying much of the work were complex mathematical models, and the only way to get
working solutions was to use numerical mathematics on a large scale. In the Tube Alloys project,
for example, which became the UK part of the Manhattan Project to make a fission bomb, we had
to determine the critical size of a shape of enriched uranium and then estimate mathematically
what would happen when it exploded. For this problem we used about a dozen “computers” —
young men and women equipped with hand calculators (such as the Brunsviga). These human
computers were “programmed” by physicists like myself.’ (Davies, 2005, p. vii).
This setting was not unique. As Davies puts it, the 'same story, with different
physics and different mathematics, was repeated in many centres across the United Kingdom’. In the words of Jack Copeland, ‘The term “computing machine”
was used increasingly from the 1920s to refer to small calculating machines
which mechanized elements of the human computer’s work. For a complex calculation, several dozen human computers might be required, each equipped with a
desktop computing machine.’ (Copeland, 2006b, p. 102). However, just as the
vastly increased scope of administrative data processing had put the calculation
manufacture form of work organization under increasing pressure and thus engendered the rapid growth of punched-card technologies, the scale of calculations required in modern science and engineering caused similar tensions to arise: ‘By the
1940s, […] the scale of some of the calculations required by physicists and engineers had become so great that the work could not easily be done with desktop
computing machine. The need to develop high-speed large-scale computing machinery was pressing.’ (Copeland, 2006b, p. 102). In short, in the domains of science and engineering too, cooperative calculation work, organized on the de
Prony model, had exceeded its capacity for further development.
In sum, it was the problems facing the cooperative efforts of human computers
in science and engineering that motivated the first significant steps towards electronic digital computers, in particular the Colossus, designed by Thomas Flowers
and Max Newman at Bletchley Park in the UK during 1943 for breaking the encrypted messages produced by the German Geheimschreiber, a family of sophisticated teletype cipher machines used for strategic communication within the Nazi
military leadership (Copeland, 2006a; Gannon, 2006),93 and the ENIAC, designed
in the US in 1944-45 for calculating projectile trajectories (Goldstine, 1972; Van
der Spiegel, et al., 2000). The need for similar applications motivated the subsequent series of experimental stored-program computers such as, in the US,
93 The British government kept the very existence of the Colossus secret until 1975, and its function was not
publicly known until 1996 when the US Government declassified documents, written by US liaison officers
at Bletchley Park during the war, in which the function of the Colossus was described. However, a ‘vital report’ (Good, et al., 1945) was only declassified in June 2000 (Copeland, 2006c).
• the EDVAC, 1945-52, designed by John Mauchly, Presper Eckert, and John
von Neumann (von Neumann, 1945; J. P. Eckert, 1946);
• the Princeton IAS computer, 1946-52, designed by von Neumann (Aspray,
2000), etc.,
and in Britain,
• the ACE, 1945-50, designed by Turing and Wilkinson (Turing, 1945; Turing
and Wilkinson, 1946-47; Turing, 1947; Copeland, 2005);
• the Manchester Mark I, 1946-48, designed by Newman (Napper, 2000); and
• the EDSAC, 1946-49, designed by Maurice Wilkes.
(For general accounts of these efforts, cf. Augarten, 1984, Chapters 4-5;
Campbell-Kelly and Aspray, 1996, Chapter 4.)
If we take Turing’s ACE as an example, the motivation was clearly laid out. In his proposal, written towards the end of 1945, Turing opened the report by stating:
‘Calculating machinery in the past has been designed to carry out accurately and moderately
quickly small parts of calculations which frequently recur. The four processes addition, subtraction, multiplication and division, together perhaps with sorting and interpolation, cover all that
could be done until quite recently […]. It is intended that the electronic calculator now proposed
should be different in that it will tackle whole problems. Instead of repeatedly using labour for
taking material out of the machine and putting it back at the appropriate moment all this will have
to be looked after by the machine itself. This arrangement has very many advantages.
(1) The speed of the machine is no longer limited by the speed of the human operator.
(2) The human element of fallibility is eliminated, although it may to an extent be replaced by mechanical fallibility.
(3) Very much more complicated processes can be carried out than could easily be dealt with by
human labour.
Once the human brake is removed the increase in speed is enormous.’ (Turing, 1945, p. 371).
The same motivation was underscored when Charles G. Darwin, the director of
the UK National Physical Laboratory, in April 1946 wrote a memorandum in
which he argued the case for building the computer proposed by Turing:
‘In the past the processes of computation ran in three stages, the mathematician, the [human] computer, the machine. The mathematician set the problem and laid down detailed instructions which
might be so exact that the computer could do his work completely without any understanding of
the real nature of the problem; the computer would then use the arithmetical machine to perform
his operations of addition, multiplication, etc. In recent times, especially with use of punched card
machines, it has been possible gradually for the machine to encroach on the [human] computer’s
field, but all these processes have been essentially controlled by the rate at which a man can work.’
(Darwin, 1946, p. 54).
Darwin went on by stressing that ‘The possibility of the new machine started from
a paper by Dr. A. M. Turing some years ago when he showed what a wide range
of mathematical problems could be solved, in idea at any rate, by laying down the
rules and leaving a machine to do the rest’ (Darwin, 1946, p. 54).
Computing technology as represented by these pioneering calculating machines was designed to eliminate the ‘human brake’, that is, the cooperative work of human computers and punched-card operators, just as automatic machine systems a century previously, in other domains of work, had eliminated the cooperative work of workers in paper mills, etc., while of course constituting cooperative work of an entirely different sort, performed by machine operators and technicians.
The first electronic digital computers such as the Colossus and the ENIAC
were not stored-program computers. They were specifically designed for performing massive calculations, not as general-purpose computers. Thus, to facilitate
configuration and reconfiguration the machines were equipped with plugboards
similar to those used in punched-card tabulators and, in the case of the Colossus,
also switches. However, the configuration work was tedious and the cost in terms
of time significant. For example, configuring the ENIAC by plugging cables, ‘its
users were literally rewiring the machine each time, transforming it into a special-purpose computer that solved a particular problem’; and, consequently, it took ‘up
to two days’ to configure the ENIAC to solve a new problem, which it might then
solve in a minute (Ceruzzi, 1990, p. 241). Moreover, the ENIAC had been designed for calculating projectile trajectories and was not particularly suited for
other types of massive calculation such as solving partial differential equations
(Campbell-Kelly and Aspray, 1996, p. 91). The impetus to develop stored-program computers came from such limitations.
The stored-program digital computer technology (e.g., EDVAC and ACE) had
been developed as a ‘universal’ or ‘general’ technology of large-scale calculation
machinery that was far less expensive and far more flexible than building series of
specialized calculation machine systems. But it was then — in one of those lateral
shifts in which a technology developed for one domain of work is picked up and
appropriated for another domain — gradually and hesitatingly transformed and
appropriated for administrative purposes.
The application of stored-program electronic computers in work settings only
began in the 1950s. The first ‘business’ application, a payroll program, ran on 12
February 1954, on the then newly finished LEO computer (modelled on the Cambridge EDSAC design), calculating the wages of bakery staff of the Lyons teashop chain in the UK (Ferry, 2003). This application is typical of the use of electronic computers for business purposes, that is, for economic, commercial, organizational, managerial, etc., purposes (beyond applications of scientific calculation): batch processing of large numbers of transaction records.
The stored-program electronic computer was appropriated for business administration purposes as a substitution technology; that is, the new computers were
designed for automating the work of the central computing departments of large-scale organizations, replacing entire batteries of punched-card tabulators by a single computer such as the IBM 1401. Most of the computer systems installed in
commerce and government during the 1950s and 1960s were ‘simply the electronic equivalents of the punched-card accounting machines they replaced.’
(Campbell-Kelly and Aspray, 1996, p. 157). The punched-card technology had
helped to shape the organization of business, and by the 1950s and 1960s
‘the highly centralized accounting systems of industry were very much geared to what was technically achievable with the commercially available machines, and a generation of accountants between the two world wars grew up on a diet of the standard textbooks on mechanized accounting. When computers became available in the 1950s and 1960s, they tended to be used at first as glorified electric accounting machines and were simply absorbed into old-fashioned accounting systems.’ (Campbell-Kelly, 1990, pp. 146 f.).
What characterized this computing technology was automatic processing (recording, sorting, merging, aggregating, calculating, printing) of data concerning
economic activities: statistical analysis (actuarial data, sales analysis), invoicing,
sales reports, payroll, inventory control, financial reporting. That is, this was a
technology developed and used for administrative and logistical purposes in ordinary business settings (payroll calculation, production planning). The computer
systems were used for performing various housekeeping tasks in organized cooperative work settings, but the interdependent activities of the cooperating workers
were neither facilitated by the system, nor mediated in and through the system,
nor regulated by the system: one worker’s actions were not effectuated and propagated to other workers by means of the system. The system remained ‘outside’ of
the practices and settings whose economic transactions it was processing.
2.5. Facilitation of cooperative work: Real-time computing
Parallel to the development of these technologies of automatic handling of administrative calculation, an entirely different computing technology was being developed that directly addressed the facilitation of cooperative work, namely ‘online’
‘real-time’ computing systems such as air defense systems (SAGE) and airline
reservation systems (SABRE). With this technology computational machine systems were constructed that would constitute the common field of work of multiple
cooperating actors interacting ‘in real time’.
2.5.1. Project Whirlwind
The increased role of airplanes in warfare in World War II caused two bottlenecks: testing new airplane designs and training crews for them were costly and
caused intolerable delays: ‘in 1943 it was taking far too much time and money to
train flight crews to man the more complex, newer warcraft in production, and it
was taking far too much time and money to design high-performance airplanes’
(Redmond and Smith, 1980, p. 1). The obvious path of developing a particular
flight simulator for each particular airplane model was not sustainable, as it would
lead to increased costs of ‘providing a new and different flight trainer for each
warplane model in combat use’. In 1943, this prospect led researchers at MIT’s
Servomechanisms Laboratory together with US Navy planners to develop the alternative strategy of developing configurable flight simulators that could match
the flight characteristics of any particular airplane design: ‘a protean, versatile,
master ground trainer that could be adjusted to simulate the flying behavior of any
one of a number of warplanes’ (Redmond and Smith, 1980, p. 2). The project,
named Whirlwind, was undertaken in 1944 under the leadership of Jay W. Forrester.94
To realize the idea of a ‘protean and versatile’ simulator, the simulator should
incorporate a computer and, what is more, a computer with the capacity to process
incoming data and solve the system of differential equations at a rate that was sufficient to match the rate of external state changes. That is, the computer should be
able to handle external events rapidly enough to be ready for the next external
event or ‘in real time’.
Forrester initially opted for a design based on an analog computer, but it was
clear that an analog computer ‘would not be nearly fast enough to operate the
trainer in real time’ (Campbell-Kelly and Aspray, 1996, p. 159). However, in the
summer of 1945 he learned that digital computer technology was a viable option.
He learned this from another MIT student, Perry Crawford, who had developed
some initial concepts for real-time process control based on digital electronic calculating systems. Noting that it had ‘recently’ been proposed at MIT ‘that electronic calculating systems can perform a valuable function in fire-control operations’, his thesis set out to ‘describe the elements and operation of a calculating
system for performing one of the operations in the control of anti-aircraft gunfire,
which is, namely, the prediction of the future position of the target.’ (Crawford,
1942, p. 1). At the time, the control systems for gun control and so on were still
invariably based on mechanical or electromechanical technologies. What Crawford suggested was that the mathematical functions could be modelled in a digital
computer that could thereby be made to control real-world processes such as
tracking a moving target: ‘Crawford was the first person to fully appreciate that
such a general-purpose digital computer would be potentially faster and more
flexible than a dedicated analog computer’ (Campbell-Kelly and Aspray, 1996, p.
160). Crawford explained all this to Forrester in 1945 and, as Forrester later put it,
it ‘turned on a light in [my] head’ (ibid.).
The stored-program computer technology was made the foundation of the subsequent development work in Project Whirlwind (by virtue of access to the
ongoing EDVAC design work, cf. Goldstine, 1972), and as the project progressed,
the objective of building a flight simulator receded into the background; instead,
the effort focused increasingly on the challenge of building a real-time digital
computer system (Crawford, 1946). ‘The two young men quickly realized that
they would not be developing simply a sophisticated flight trainer. Instead, they
had stumbled onto a design concept so fundamental that its generality of application was almost staggering to contemplate’ (Redmond and Smith, 1980, p. 217).
94 The following account of Whirlwind and its legacy, the SAGE system and beyond, is based on the insightful studies by O’Neill (1992) and by Campbell-Kelly and Aspray (1996). The two volumes by Redmond and
Smith (1980, 2000) offer a detailed and accurate account; however, the information that is relevant from the
perspective of technology-in-practice has to be dug out from an account that is overwhelmingly focused on
issues of research governance and project management; this makes this major piece of research less immediately useful for researchers with CSCW or HCI interests and concerns.
A major technical issue in developing Whirlwind was the speed and reliability
of computer memory. Consequently, significant effort was devoted to developing
the new technology of magnetic core memory, based on a web of tiny magnetic
ceramic rings. Core memory technology, which did not become operational until
the summer of 1953, made Whirlwind ‘by far the fastest computer in the world
and also the most reliable’ (Campbell-Kelly and Aspray, 1996, p. 167). It was a
‘monumental achievement’ (O’Neill, 1992, p. 13).95 The operational speed offered by core memory was further underscored by the development for Whirlwind
of
‘the intricate details of “synchronous parallel logic” — that is, the transmitting of electronic pulses, or digits, simultaneously within the computer rather than sequentially, while maintaining logical coherence and control. This feature accelerated enormously […] the speeds with which the
computer could process its information.’ (Redmond and Smith, 1980, p. 217).
Whirlwind was also ‘first and far ahead in its visual display facilities’ which,
among other things, facilitated the ‘plotting of computed results on airspace maps’
(Redmond and Smith, 1980, p. 216). Complementary to this feature was a ‘light
gun’ with which the operator could select objects and write on the display: ‘As a
consequence of these two features, direct and simultaneous man-machine interaction became feasible’ (ibid.).
That the Whirlwind research continued and eventually became massively funded despite the fact that no stored-program digital computer existed at the time
(and would not exist until the ‘Manchester Baby’ ran its first test on 21 June 1948)
was due to external events in 1949. The US intelligence services revealed that the
Soviet Union had exploded a nuclear bomb in August that year and furthermore
possessed bomber aircraft capable of delivering such weapons to targets in the US. A committee was quickly put to work to evaluate the implications for US air defense. In 1950, the committee concluded that the existing air defense system
was wholly inadequate for the current situation. It candidly compared the existing
system ‘to an animal that was at once “lame, purblind, and idiot-like”’, adding, in
order not to leave readers guessing, that ‘of these comparatives, idiotic is the
strongest’ (ADSEC Report, October 1950, quoted in Redmond and Smith, 1980,
p. 172). The problem was basically that the coordination effort of the existing system severely limited its capacity. As O’Neill explains the conundrum:
‘In the existing system, known simply as the “manual system”, an operator watched a radar screen
and estimated the altitude, speed, and direction of aircraft picked up by scanners. The operator
checked the tracked plane against known flight paths and other information. If it could not be
identified, the operator guided interceptors into attack position. When the plane moved out of the
range of an operator’s radar screen, there were only a few moments in which to “hand-over” or
transfer the direction of the air battle to the appropriate operator in another sector.’ (O’Neill, 1992,
p. 15).
95 Together with printed circuits, core memory made possible the mass production of computers such as the
IBM 1401, announced in October 1959.
This cooperative work organization, the ‘manual system’, did not scale up to meet
the challenge of large numbers of aircraft with intercontinental reach and carrying
nuclear weapons. The problem was systemic (Wieser, 1985). In the words of
O’Neill again:
‘In a mass attack, the manual handling of air defense would present many problems. For example,
the manual system would be unable to handle detection, tracking, identification, and interception
for more than a few targets in the range of any one radar. The radar system did not provide adequate coverage for low altitude intrusion. The way to get around the lack of low altitude surveillance was the “gap filler” radar. Because this “gap filler” radar was limited to a few tens of miles, the system required more frequent hand-overs, and further taxed the manual-control system and its operators by reducing the time available to intercept an attacking bomber.’ (O’Neill, 1992, pp. 15-16).
It was concluded, then, that a radical transformation of the air-defense organization and its technical infrastructure was required. The new air
defense system that was eventually built and was named Semi-Automatic Ground
Environment, or SAGE, was divided into 23 Direction Centers distributed
throughout the USA. Each of these centers would be responsible for monitoring
the airspace of the sector and, if required, for directing and coordinating military
response. The work of each Direction Center was supported by a high-speed electronic digital processing machine that would receive and process data from radar
sites via a system of hundreds of leased telephone circuits. In 1950 it was decided
that the SAGE computer system would be based on the Whirlwind design.
Whirlwind was, of course, an experimental system and was not fit for production. An engineered version of Whirlwind was developed which was initially
simply known as Whirlwind II, but then named XD-1 until it finally, as a production version manufactured by IBM, was renamed AN/FSQ-7 (or the Q-7 as it was
often called).
‘The direction center itself was to be semiautomatic; that is, routine tasks would be done automatically under the supervision of operators. A high-speed digital computer would collect target reports from the radar network, transform them into a common coordinate system, perform automatic track-while-scan […], and compute interceptor trajectories. Operators filtered the radar data,
had override control (i.e., could initiate or drop tracks), performed friend-or-foe identification
function, assigned interceptors to targets, and monitored engagements through voice communication with the interceptor pilots.’ (Wieser, 1985, p. 363)
To determine the feasibility of the plan, a ‘computer-controlled collision-course interception’ test was undertaken on 20 April 1951 above Bedford, Massachusetts. According to the test report by C. Robert Wieser dated 23 April 1951,
‘three successive trial interceptions were run with live aircraft under control of the
WWI [Whirlwind I] computer’ (quoted in Redmond and Smith, 2000, p. 1). The
pilot of the intercepting aircraft reported that from a distance of about 60 kilometers (40 miles) he was brought to within 1,000 meters of his target. Three days
later it was decided to build a prototype of the SAGE system, ‘an elaborate, multi-radar experimental system tied into Whirlwind I’ (Redmond and Smith, 2000, p. 2). The development of this experimental prototype, called the Cape Cod system, proceeded in an entirely iterative manner. Robert Wieser, who was deeply involved in the development of the Cape Cod prototype as an engineer, recalls that
the ‘development of the new concept and its embodiment in the Cape Cod System
relied heavily on iterative cycles of experiment-learn-improve. […] However inelegant, the approach worked very well. The ever-present realism of radar clutter,
telephone-line noise, and limited computer memory drove the development pace
faster than a mathematical analytical approach could ever have done’ (Wieser,
1985, p. 364).
The Cape Cod system was based on the ‘engineered version’ of Whirlwind, the
XD-1, and was ready for experiments in 1952. It supported 30 air-force operators
working next to each other at consoles equipped with CRT displays on which digitized radar data could be selected for analysis by means of a light pen (Campbell-Kelly, 2003, p. 37).
In March 1953, Robert Wieser gave a talk to visitors to a Cape Cod demonstration. Introducing them to what they were about to see, he said:
‘The radar data is fed into the Whirlwind I computer at the Barta Building in Cambridge, which
processes the data to provide 1) vectoring instructions for mid-course guidance of manned interceptors and 2) special displays for people who monitor and direct the operation of the system. [¶]
In processing data, the computer automatically performs the track-while-scan function, which consists of 1) taking in radar data in polar coordinates, 2) converting it to rectangular coordinates referred to a common origin, 3) correlating or associating each piece of data with existing tracks to find out which pieces of data belong to which aircraft, 4) using the data to bring each track up to date with a new smoothed velocity and position, and 5) predicting track positions in the future
for the next correlation or for dead reckoning if data is missed. Once smoothed tracks have been
calculated, the computer then solves the equations of collision-course interception and generates
and displays the proper vectoring instructions to guide an interceptor to a target.
This process is not, however, wholly automatic. The initiation of new tracks can be done automatically or manually, or both methods can be used, each in different geographical areas of the system. Also the decision as to which aircraft tracks are targets and which tracks are interceptors is
made by people and inserted manually into the machine by means of a light gun. The light gun is a
photocell device which is placed over the desired blip on the display scope and then sends a pulse
into the computer to indicate to the computer that action (for example, “start tracking”) is to be
taken on that particular aircraft. The action which the machine takes is defined by manually setting
a selector switch, which the computer automatically senses and interprets (for example, “handle
this aircraft as an interceptor”). The human beings make decisions and improvise while the computer handles the routine tasks under their supervision. In order to facilitate human supervision, a
rapid, flexible display system is required. The principal means of display is the cathode-ray tube,
which can accept information very rapidly and present both symbols and geographical positions of
aircraft. Flexibility is achieved by programming the computer to display various categories of information on different display cables. The human operator can switch these cables at his scope and
thus select at any time the type of information (or combination of types) which he wishes to observe.’ (Wieser, 1953, p. 2).
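The track-while-scan cycle that Wieser describes (polar-to-rectangular conversion, correlation with existing tracks, smoothing, and dead-reckoned prediction) can be sketched in modern terms. What follows is an illustrative reconstruction only, not Whirlwind code: the alpha-beta smoothing filter, its gains, and all identifiers are assumptions introduced for exposition.

```python
import math

def polar_to_rect(r, theta):
    """Convert a radar return in polar coordinates (range r, azimuth
    theta in radians) to rectangular coordinates referred to a common
    origin (step 2 in Wieser's description)."""
    return r * math.cos(theta), r * math.sin(theta)

class Track:
    """Keep one aircraft track up to date with a smoothed velocity and
    position, and predict its future position for the next correlation
    (steps 4 and 5). The alpha/beta gains are illustrative values."""
    def __init__(self, x, y, alpha=0.5, beta=0.2):
        self.x, self.y = x, y
        self.vx = self.vy = 0.0
        self.alpha, self.beta = alpha, beta

    def predict(self, dt):
        # Dead-reckoned position, used if the next radar return is missed.
        return self.x + self.vx * dt, self.y + self.vy * dt

    def update(self, mx, my, dt):
        # Correlate the new measurement with the predicted position and
        # smooth position and velocity (a simple alpha-beta filter).
        px, py = self.predict(dt)
        rx, ry = mx - px, my - py            # residual
        self.x, self.y = px + self.alpha * rx, py + self.alpha * ry
        self.vx += self.beta * rx / dt
        self.vy += self.beta * ry / dt

# A target flying east at one range unit per scan, seen over three scans:
track = Track(*polar_to_rect(10.0, 0.0))     # first return at (10, 0)
for r, theta in [(11.0, 0.0), (12.0, 0.0)]:
    track.update(*polar_to_rect(r, theta), dt=1.0)
print(track.predict(1.0))
```

The point of the sketch is only to make the division of labour concrete: the machine does the coordinate conversion, smoothing, and prediction on every scan, while decisions such as which blip starts a track were, as Wieser stresses, left to the operators.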
From the beginning, that is, the system was planned as ‘semi-automatic’: the computer system was designed to work in a mode radically different from
the automatic calculations for which other contemporary computers were being
designed. The Cape Cod prototype and the ultimate SAGE computer systems (the
Q-7s) were deliberately designed for interactive computing, complete with real-time computing, graphical CRT displays, handheld selection devices (light guns, joysticks), and direct manipulation.
The SAGE system was an enormous machine system, a system of 23 interconnected computers (and an equal number of backup computers) connected to a vast
array of radar stations, intercept fighter aircraft, etc., that afforded a large-scale
cooperative work effort encompassing a distributed ensemble of 2,300 operators.
Again, O’Neill’s account is very informative about the actual work at the centers:
‘As many as 100 Air Force personnel used a SAGE computer in a single facility at the same time.
They used an assortment of equipment to communicate to the computer, such as cathode ray tube
(CRT) displays, keyboards, switches, and light guns. This equipment provided the operators with
the information they needed to make decisions and provided them a way to send commands to the
computer.
The SAGE system was designed to make use of consoles with CRTs for displaying the tracks of
radar-reported aircraft and providing visual maps to improve comprehension. Computer programs
controlled the beam that created a track’s image on the CRT surface by supplying the coordinates
needed in drawing and refreshing the image. Light guns allowed the operators to aim at an unidentified plane image on a screen and the system would then give information about the specified
image. If the operator determined that a plane was hostile, the system would tell the operator
which of the many interceptors or guided missiles on hand were in the best position to intercept it.
The operators filtered the radar data, had the ability to override, performed the friend-or-foe identification function, assigned interceptors to targets, and monitored engagements through voice
communication with the interceptor pilots. SAGE required visual displays, real-time responsiveness, and communication between computers. These are all elements of interactive computer use.
From the beginning, SAGE was designed to replace the manual system only partially. It was semiautomatic; a human element remained an important part of the system. The splitting of tasks was
seen as a good approach to the problem; machines were necessary because human operators working without computers could not make the necessary calculations fast enough to counter an attack.
But the person was also very important. The computers could keep the minds of the human operators “free to make only the necessary human judgements of battle - when and where to fight.”’
(O’Neill, 1992, pp. 19-21)
Whirlwind and its aftermath, Cape Cod and SAGE, inaugurated and defined a
new technological paradigm, typically referred to as computerized real-time transaction processing systems (O’Neill, 1992, p. 22). This is a technology that facilitates workers in a cooperative effort inasmuch as the system provides them with
a common field of work in the form of a data set or other type of digital representation (possibly coupled to facilities outside of the digital realm) and thus enables
them to cooperate by changing the state of the data set in some strictly confined
way:
‘On-line transaction processing systems were programmed to allow input to be entered at terminals, the central processor to do the calculation, and the output displayed back at the terminal in a
relatively short period of time. These systems could be used, for example, to allow several users to
make inquiries or update requests, which were handled as they arrived with only a short delay. The
services that could be requested were restricted to those already pre-programmed into the system.’
(O’Neill, 1992, pp. 22-23)
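The restriction O’Neill notes, that the services available at the terminals were limited to those already pre-programmed, can be illustrated with a minimal sketch: a shared data set that multiple terminals may change only through a small, fixed repertoire of transactions. All names and values here are hypothetical, drawn neither from SAGE nor from SABRE.

```python
# Shared inventory: the common field of work of the cooperating operators.
seats = {"AA101": 3, "AA205": 1}

def inquire(flight):
    """Pre-programmed service: report available seats on a flight."""
    return seats.get(flight, 0)

def book(flight, n):
    """Pre-programmed service: a booking either succeeds against the
    shared state or is refused; it cannot do anything else."""
    if seats.get(flight, 0) >= n:
        seats[flight] -= n
        return True
    return False

# The fixed repertoire: requests outside it are simply rejected.
SERVICES = {"inquire": inquire, "book": book}

def transaction(request):
    """Dispatch one terminal request to a pre-programmed service."""
    name, *args = request
    if name not in SERVICES:
        raise ValueError(f"service not pre-programmed: {name}")
    return SERVICES[name](*args)

print(transaction(("inquire", "AA101")))   # 3
print(transaction(("book", "AA101", 2)))   # True
print(transaction(("inquire", "AA101")))   # 1
```

One worker’s booking immediately changes what every other terminal sees, which is exactly the sense in which such systems constituted a common field of work, while the `ValueError` branch marks the paradigm’s limit: no interaction outside the pre-programmed repertoire.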
The SAGE system certainly facilitated cooperative work, in that state changes initiated by one worker propagated via the system to other workers, but from the
point of view of CSCW, one would not categorize it as a system that supported
cooperative work. The operators were interdependent in their work by virtue of
the system (radar stations, communication lines, tracking devices, handovers) and
interacted in an orderly way regulated by the system but were strictly limited in
what action and interaction they could undertake in and through the system. As
O’Neill points out, ‘Although they could specify what information was displayed,
the operators could not change the program or the situations’ (O’Neill, 1992, p.
21). Thus, when a US Air Force colonel at the time characterized the SAGE system as ‘a servomechanism spread over an area comparable to the American Continent’, he was quite right (Mindell, 2002, p. 313).
The limits of this paradigm were evident from the very beginning. It was, for
example, obvious that there would be situations where air-defense operators at
Direction Centers would need to interact with colleagues, perhaps at another center, in ways different from ‘those already pre-programmed into the system’. Thus,
to enable operators to handle contingencies, it was realized at a very early stage in
the development of the system that operators would need to be able to interact
outside of the functionalities afforded by the system. Accordingly, in the summer of
1953, a ‘“man-to-man” telephone intercommunication system’ was installed so as
to enable operators in the Combat Center (of the Direction Center) and at some
remote locations to simply talk to one another (Redmond and Smith, 2000, pp.
310 f.). That is, operators were not supported in enacting or developing coordinative practices by the computational system. The handling of contingencies had to
be carried out outside of the computational system, as ordinary conversations
among operators within each center or as telephone conversations among operators across centers. CSCW begins with the realization of these limitations in the
Whirlwind paradigm.
2.5.2. The Whirlwind legacy
The SAGE system had immediate and far-reaching impact on computing technologies. From the early 1960s, applications of the paradigm epitomized by SAGE
such as computerized transaction processing systems for air traffic control, banking, manufacturing production planning and control, and inventory control began
to upset the forms of work organization that had dominated administrative work
settings for ages, either in the form of armies of ‘human computers’ organized on
the basis of the de Prony principles or in the form of punched-card machine operators working in configurations similar to those of the cotton mills of Lancashire around 1830.
It began with airline reservation systems. The first major civilian project to exploit the Whirlwind paradigm was the SABRE airline reservation system developed by IBM for American Airlines. It was, again, a critical situation in a cooperative work setting that motivated the development effort.
In 1954 an airline reservations office would have looked somewhat like this portrayal of American Airlines’ Chicago office:
‘A large cross-hatched board dominates one wall, its spaces filled with cryptic notes. At rows of
desks sit busy men and women who continually glance from thick reference books to the wall display while continuously talking on the telephone and filling out cards. One man sitting in the back
of the room is using field glasses to examine a change that has just been made on the display
board. Clerks and messengers carrying cards and sheets of paper hurry from files to automatic
machines. The chatter of teletype and sound of card sorting equipment fills the air. As the departure date for a flight nears, inventory control reconciles the seating inventory with the card file of
passenger name records. Unconfirmed passengers are contacted before a final passenger list is sent
to the departure gate at the airport. Immediately prior to take off, no-shows are removed from the
inventory file and a message sent to downline stations canceling their space.’ (McKenney, 1995, p.
97).
An office like this would house about 60 reservation clerks and 40 ‘follow-up
clerks’ and ‘board maintainers’. The logic of this scene is as follows. In the 1930s American Airlines had adopted a decentralized reservations system maintained at each flight’s departure point. This system, called ‘request-and-reply’, involved significant coordination work, as it required agents to coordinate
with the inventory control department (two messages) before confirming the reservation with a third message. In addition, passenger-specific data (name, telephone number, and itinerary) had to be recorded at the time of confirmation on a
‘passenger name record card’ and subsequently transmitted via telephone or teletype to the inventory control department. To reduce the cost of coordination, an
amended system was introduced a decade later: until a flight was 80 percent sold
out, agents were free to accept bookings and were only required to report actual
sales, allowing passenger requests to be accommodated quickly. This cut message volume in half. At the same time, the inventory control department monitored sales, and when available seats decreased to a prescribed level, a ‘stop-sale message’ was broadcast to all agents. To make this ‘sell and report’ system
work required a buffer of seats. As McKenney observes, ‘These largely manual
systems were time-consuming, requiring from several hours to several days to
complete a reservation. Moreover, they were cumbersome and unreliable, plagued
by lost names, reservations made but not confirmed with travelers, and inaccuracies in the passenger lists held at boarding gates, with the result that most business
travelers returning home on a Friday would have their secretaries book at least
two return flights.’ (McKenney, 1995, p. 98)
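The coordinative logic of the ‘sell and report’ scheme is simple enough to sketch in a few lines of code. The following is a minimal simulation; only the 80-percent free-sale rule and the stop-sale broadcast come from the account above, while the class names, seat counts, and numbers of agents are illustrative assumptions:

```python
# Illustrative sketch of the 'sell and report' scheme described above.
# Only the 80-percent rule and the stop-sale broadcast follow the text;
# all names and numbers are hypothetical.

class Inventory:
    """Central inventory control for a single flight."""
    def __init__(self, seats):
        self.available = seats
        # 'Stop-sale' is broadcast once the flight is 80 percent sold,
        # leaving a buffer of unsold seats:
        self.stop_sale_level = int(seats * 0.2)
        self.stop_sale = False

    def report_sale(self):
        # Agents merely report actual sales; no request/reply round-trip.
        self.available -= 1
        if self.available <= self.stop_sale_level:
            self.stop_sale = True  # 'stop-sale message' to all agents

class Agent:
    def __init__(self, inventory):
        self.inventory = inventory

    def book(self):
        # Until the stop-sale broadcast, agents accept bookings freely.
        if self.inventory.stop_sale:
            return False  # booking again requires explicit coordination
        self.inventory.report_sale()
        return True

flight = Inventory(seats=40)
agents = [Agent(flight) for _ in range(3)]
sold = sum(agent.book() for agent in agents for _ in range(20))
# Sales stop automatically once the prescribed buffer (here 8 seats) is reached.
```

Compared with ‘request-and-reply’, the three-message round-trip per booking disappears; the price paid is the buffer of seats held back.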
At that point, although thousands of reservations were processed every day, the
operation ran virtually without the use of machinery, apart from an electromechanical system at the inventory control department for maintaining an inventory
of sold and available seats. In fact, as pointed out by Campbell-Kelly and Aspray,
the reservations office ‘could have been run on almost identical lines in the 1890s’
(1996, p. 170). The reason for this was that existing data-processing technologies
operated in batch-processing mode. The primary goal of punched-card tabulating
machinery that dominated the administrative domain in large-scale business operations until this time was to reduce the cost of each transaction. This was done by
completing each operation for all transaction records before the next operation on
the records was begun. An individual transaction simply had to wait until an economically viable batch of transactions had been accumulated. That is, the cost of
each transaction was traded off against the lapse time of processing the transaction. According to Campbell-Kelly and Aspray, ‘the time taken for each transaction to get through the system was at least an hour, but more typically half a day’
(ibid.). As noted above, the new business computing technologies were designed
and applied as little more than advanced punched-card tabulators. What these
technologies offered was not another mode of operation but batch-processing at a
higher level of automation. With punched-card tabulating machinery human involvement was required in the transition from one batch process to the next: cards
had to be picked up and carried from, say, sorting to tabulating to re-sorting. But
with computers equipped with tape stations this vestige of human intervention
could be eliminated. When the transactions represented by the punched cards had been ‘read’ by the system’s card reader, the worker could stand back and supervise the entire process. For most businesses, batch processing was a cost-effective mode of operation; it was how accounting offices updated accounts, banks processed checks, insurance companies issued policies, and utility companies invoiced clients. The airline industry, by contrast, had ‘instantaneous processing
needs’, and trading transaction cost off against lapse time was not an option.
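The trade-off just described, per-transaction cost against lapse time, can be put in back-of-the-envelope form. In the sketch below all numbers are invented for illustration; only the shape of the trade-off follows the text:

```python
# Sketch of the batch-processing trade-off: a fixed setup cost (loading
# cards, rigging the tabulators) is amortized over the batch, while every
# queued transaction waits for the batch to fill. Numbers are illustrative.

SETUP_COST = 100.0    # fixed cost of one batch run
ARRIVAL_RATE = 10.0   # transactions arriving per hour

def batch_economics(batch_size):
    cost_per_txn = SETUP_COST / batch_size
    avg_lapse_hours = batch_size / ARRIVAL_RATE / 2  # mean wait to fill batch
    return cost_per_txn, avg_lapse_hours

for size in (10, 100, 1000):
    cost, lapse = batch_economics(size)
    print(f"batch of {size:4d}: ${cost:6.2f} per transaction, ~{lapse:5.1f} h lapse")
```

The larger the batch, the cheaper the transaction and the longer the wait; for an airline selling a perishable product, no point on this curve was acceptable.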
Hence the overwhelmingly manual character of the entire reservations operation by 1954. But in the 1950s the volume of traffic and the number of flights increased. Thus ‘the ability to process passenger reservations assumed increased
importance. Given the highly perishable nature of the airlines’ product, reservation personnel needed to have rapid access to seat availability, be able to register
customer purchases instantly, and in the event of a cancellation quickly recognize
that a valuable item of stock had been returned to inventory and was available for
resale.’ (McKenney, 1995, pp. 98 f.). Consequently, operating the reservations
office as a manufactory was no longer viable. The ‘boards became more crowded’
and the clerks ‘had to sit farther and farther away from them’. So, by 1953 ‘American Airlines had reached a crisis.’ (Campbell-Kelly and Aspray, 1996, pp. 170
f.).
Having learned of the real-time transaction processing technology that was
then under rapid development, American Airlines decided to undertake a thorough
transformation of its reservation system based on this paradigm. The new system
was to be developed by IBM (where Perry Crawford had been employed in 1952
to develop real-time applications). The project, later dubbed SABRE, was formally established in 1957 and was ‘at the time easily the largest civilian computerization task ever undertaken’. The system was implemented in the early 1960s
on IBM 7090 computers, which were for all practical purposes solid-state versions of the Q-7. The final system, which was fully operational in 1964, connected some 1,100 agents using desktop terminals across the US, who had access to ‘a
constantly updated and instantly accessible passenger name record’ containing
various information about the passenger, including telephone contacts, hotel and
automobile reservations, etc. (Campbell-Kelly and Aspray, 1996, pp. 172-174;
Campbell-Kelly, 2003). As in the SAGE system, the SABRE system facilitated
cooperative work: one worker’s action on computational objects (or structured
data sets) represented in the system in some form, e.g., his or her entering a flight
reservation (name, date, flight number, etc.), is stored (and perhaps transmitted to
another center) to allow other workers (perhaps elsewhere) to perceive and act on
this information. In other words, the system mediates the propagation of changes
to the state of the common field of work of the many workers involved in the
large-scale cooperative effort of handling flight reservations.
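Schematically, the mediation just described amounts to a shared store of records that any agent can update and any other agent can read. A minimal sketch, with hypothetical class and field names (not SABRE’s):

```python
# Minimal sketch of system-mediated cooperative work: one worker's change
# to a shared record is propagated so that other workers, perhaps at other
# centers, can perceive and act on it. All names are hypothetical.

class PNRStore:
    """Shared store of passenger name records: the common field of work."""
    def __init__(self):
        self.records = {}

    def write(self, pnr, **fields):
        # One worker's action changes the state of the common field of work.
        self.records.setdefault(pnr, {}).update(fields)

    def read(self, pnr):
        # Another worker, elsewhere, perceives the updated state.
        return dict(self.records.get(pnr, {}))

store = PNRStore()
# An agent in one office enters a reservation...
store.write("SMITH/J", flight="AA123", date="1964-05-01", phone="FL 5-2368")
# ...and an agent at another center immediately sees the updated record:
record = store.read("SMITH/J")
```

What the sketch leaves out is precisely what the text emphasizes: everything beyond these limited, predetermined interactions had to be handled outside the system.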
In this paradigm the computational system does not support articulation work.
What characterizes these systems is real-time online transaction processing.
‘Transaction processing allows a person to interact with the system in ways that
are limited and predetermined’ (O’Neill, 1992, p. 2). Whatever might be required
to handle a given task in addition to and beyond what the transaction contains,
e.g., sorting out ambiguities, mistakes, etc., is done outside of the system, through
some other medium, e.g., by telephone. In short and rather schematically, real-time online transaction processing systems facilitate cooperative work but not articulation work.
From the vantage point of CSCW, there are many reasons to take note of the
Whirlwind project and its aftermath. As already described, Whirlwind provided
the paradigm for a computing technology in which the interdependent activities of
multiple workers (hundreds, even thousands) are facilitated and mediated by
the system. More than that: Whirlwind played a crucial and unique role in the development of computing technology beyond the paradigm of massive calculation:
digital communication networks, real-time computing, interactive computing, direct manipulation, etc. The conceptual foundation of much of modern computing
was developed in the course of this effort.
2.5.3. Interactive computing
Interactive computing grew out of technologies and practices far away from scientific and administrative calculation. It grew out of technologies of real-time control of external processes. The stored-program digital computer provided the very
possibility of extending and transforming the technologies of real-time control of
external processes, but it is no accident that Whirlwind was developed by MIT’s
Servomechanisms Laboratory. The interactive computing paradigm, as developed in the course of Whirlwind and its successors, extended and transformed technologies of gunnery control that in turn built on knowledge of real-time processes such as
anti-aircraft control, bomb aiming, automatic aircraft stabilizers, etc.
However, the Whirlwind legacy is not simply a faint pattern constructed in retrospect by the historian. The continuity was real to actors at the time.
First of all, there was a direct personal continuity: ‘The people who worked on
the two projects gained an appreciation for how people could interact with a computer. This appreciation helped the development and maturing of interactive computing, linked with time-sharing and networking’ (O’Neill, 1992, p. 11). More
than that, the Whirlwind project and the subsequent development of the SAGE air
defense system were research and development projects on a huge scale. The
overall cost of the SAGE system exceeded the cost of the Manhattan project by a
wide margin. The project therefore had tremendous impact on American computer
science and engineering communities. In fact, half of the trained programming
labor force of the US at the time (1959) were occupied with the development of
the software for the SAGE system (Campbell-Kelly, 2003, p. 39). (Cf. also Baum,
1981).
Secondly, when the Whirlwind project was finished in 1953 and many researchers from the Servomechanisms Lab were relocated to work on Cape Cod
and SAGE at a new laboratory (the Lincoln Laboratory), the original Whirlwind
prototype was not scrapped but remained available on the MIT campus. Furthermore, at the Lincoln Laboratory researchers developed transistorized experimental
computers based on the Whirlwind architecture. Named TX-0 and TX-2, they
‘provided quick response and a variety of peripheral input/output equipment
which allowed these machines to be used interactively’. These computers were
also handed over to the MIT campus in 1958 and were used intensively by graduate
students and researchers: ‘The attitude of the people using the Whirlwind computer and these test computers was important in the establishment of a “culture” of
interactive computing in which computers were to be partners with people in creative thinking’ (O’Neill, 1992, p. 24). For example, Fernando Corbató, who later
played a central role in the development of time-sharing, recalls that ‘many of us
[at MIT] had cut our teeth on Whirlwind. Whirlwind was a machine that was like
a big personal computer, in some ways, although there was a certain amount of
efficiency batching and things. We had displays on them. We had typewriters, and
one kind of knew what it meant to interact with a computer, and one still remembered.’ (Corbató, 1990, p. 14). In fact, the interactive computing ‘culture’ O’Neill
refers to was so ingrained among Whirlwind programmers that they were not particularly interested in time-sharing when this movement got under way but were
rather, in Corbató’s words, preoccupied with ‘creating in some sense a humongous personal computer (laugh). This included people like Ivan Sutherland, for example, who did the Sketchpad using TX-2. That was a direct result of being able
to have it all to himself, as though it were a big personal computer.’ (Corbató,
1990, p. 15).
Thus the protracted and costly research effort that went into building the
Whirlwind and its offspring provided the basic principles and techniques of the
interactive computing paradigm. Later developments extended and refined the notion of ‘interacting’ with a computer in ‘real time’ far beyond the initial and quite
narrow constraints of the concept of real-time transaction processing and also beyond other Whirlwind technologies such as CRT displays, and so on. Major steps
were taken already in the 1960s, in laboratories using Whirlwind-type computers
such as the TX-2. Ivan Sutherland’s Sketchpad (1963) is probably the first documented implementation of modern computer graphics technologies.
A series of major advances resulted from the tenacious research effort of
Douglas Engelbart who, of course, was well aware of the interactive computing
paradigm when this effort began (Engelbart, 1962, §IIIA5; cf. also Bardini, 2000,
passim). It is well known that the first experimental versions of the computer mouse were developed by Engelbart’s laboratory at the Stanford Research Institute
(W. K. English, et al., 1967), but a significantly more important step was taken in
1968 when Engelbart at an AFIPS conference demonstrated that many of the
technologies that we now consider essential components of interactive computing
(direct manipulation, bit-mapped displays, mouse, message handling, etc.) could
be realized in an integrated fashion on the basis of an architecture providing unrestrained access to multiple application programs (Engelbart and English, 1968).
Some of the crucial points on the further development trajectory of this technology (or web of technologies) include the Xerox Alto from 1973 (Thacker, et al.,
1979; Lampson, 1988; Thacker, 1988) and the Xerox 8010 ‘Star’ from 1981 (D.
C. Smith, et al., 1982a; D. C. Smith, et al., 1982b; J. Johnson, et al., 1989). Interactive computing, as we know it, finally became a practical reality with the release of the Apple Macintosh in 1984.
2.5.4. The arrested growth of interactive computing
Whirlwind went online in 1951, development of the Cape Cod system was finished around 1956, and the SAGE system went operational around 1960. Still, it
took a generation for the technologies of interactive computing to become a practical reality for workers outside of a small population of computer scientists and
workers in certain time-critical work settings such as air defense and airline reservation. What impeded the further development of interactive computing, beyond
on-line transaction processing? The answer is not only that the development of
Whirlwind was funded at a level of generosity that has hardly been seen since,
but also that the basic principles of interactive computing could not be disseminated beyond the few areas where their application was economically sustainable.
For computing equipment was enormously expensive until the development of
integrated circuits in the 1960s and especially microprocessors in the 1970s made
mass production of computers economically viable.96

96 Excellent general accounts of the history of semiconductor technology can be found in the studies by Braun and Macdonald (1978) and by Orton (2009). The development of the transistor and integrated circuit technologies is recounted by the historians Riordan and Hoddeson (1997). Based on interviews with key actors, Reid (2001) gives a readable journalistic account of the semiconductor history from the beginnings to the invention of the microprocessor, but his account is unfortunately marred by American hero-worship. Bo Lojek (2007), on the other hand, offers a detailed and candid story of this development, reminding us that the history of modern technology is somewhat distorted by the picture of technology development as ‘a systematic effort of exceptional leadership’ carefully nurtured by corporate PR departments and patent lawyers and that, based on his own experience as an engineer in the semiconductor industry, ‘the company establishment was frequently one of the biggest, if not the biggest, obstacle’.

Semiconductor technology was very long in the making, involving advances in fundamental theories of physics as well as the development of mass markets for electronics products. Researchers in the area of solid-state physics had investigated the electrical properties of semiconductors (although not by that name) since the 1830s. What caught their attention was the fact that the electrical resistance of substances such as silver sulphide increased with increasing temperature, contrary to the behavior of ordinary conductors. In 1874 it was discovered (by Ferdinand Braun) that a metal wire contacting a crystal of lead sulphide (another semiconductor) would conduct electricity in one direction only. The rectifying effect,
as it was called, was used in very simple radio receivers (‘crystal radio’) in the
first decades of the twentieth century, but the effect could not be explained
until the development of quantum mechanics in the 1920s and 1930s provided the
theoretical framework for beginning to model the behavior of electrons in solids.
But it was not until the end of the 1930s that the rectification effect was explained
(by Mott, Schottky, and Davydov) and the theoretical basis for the invention of
the transistor was laid, and even then an enormous systematic effort — e.g., charting the properties of semiconductors such as germanium and silicon in the course
of developing radar technology during World War II — was required to arrive at
the point when the transistor effect was actually discovered (by Bardeen, Shockley, and Brattain at Bell Labs on 23 December 1947). Still, as Braun and Macdonald put it in their history of semiconductor technologies, even then
‘a vast amount of development work remained to be done if the transistor was ever to become a
technological achievement rather than just a scientific curiosity’. ‘This is a case of a very large
gulf of ignorance separating the early ideas from any possibility of realisation. Much knowledge
had to be accumulated before the original ideas could be modified so as to create practical devices’
(Braun and Macdonald, 1978, pp. 24 f., 47).
So, although transistors were in commercial production from 1951, the technology did not affect the computer industry for more than a decade. The production of transistors posed ‘staggering problems’. Vital parameters such as the conductivity of a semiconductor crystal depend critically on the nature, amount, and distribution of non-native atoms in the crystal lattice (at accuracies on the order of 1 part in 10⁸), which made semiconductor production an exceedingly difficult art to master.
In the early 1950s the price for one transistor was about $20 while the price for a
thermionic valve (or vacuum tube) was about $1; by the end of 1953, when the
annual production reached 1 million transistors, the price was about $8. Such prices initially prevented transistors from supplanting thermionic valves in most civilian applications. One of the first applications was in hearing aids, beginning around
1953, and by the end of 1954 the portable radio was launched, but in general the
technology only developed slowly, largely driven by military applications and the
US space program.
Moreover, at this time the transistor was not nearly as reliable as the thermionic
valve. For several years, the material of choice for transistors was germanium, which
is far more pliant in the production process but also highly sensitive to variations
in temperature and not very suitable for making the switches that make up digital
circuitry in the first place. Silicon, on the other hand, would afford a far more reliable transistor but was also very difficult to work with. Silicon transistors therefore only became an economic reality for the computer industry with the invention of integrated circuits in the early 1960s. Consequently, the computer industry
stayed with the thermionic valve until around 1960 when printed circuits provided
a way to apply transistor technology in large-scale production of computers such
as the IBM 1401.
In addition to the ‘staggering problems’ encountered in the production of transistors, the application of the technology in computing was impeded by another
problem, an impasse called the ‘tyranny of numbers’. Circuits consisted of discrete components (transistors, resistors, capacitors, etc.) that had to be handled
individually. In the words of Jack Morton, a manager of semiconductor research at
Bell Labs, ‘Each element must be made, tested, packed, shipped, unpacked, retested, and interconnected one-at-a-time to produce the whole system’. To engineers it was evident that this meant that the failure rate of circuitry would increase
exponentially with increased complexity. Thus, so long as large systems had to be
built by connecting ‘individual discrete components’, this posed a ‘numbers barrier for future advances’ (quoted in Reid, 2001, p. 16). No matter how reliable the
individual components were, the circuits were only as reliable as the connections,
and connections were generally manually wired: ‘The more complex the system,
the more interconnections were needed and the greater the chance of failure
through this cause’ (Braun and Macdonald, 1978, p. 99).
The integrated circuit — an entire circuit made out of one monolithic crystal —
was developed to break this barrier. The idea was conceived and developed by
several researchers independently of each other (Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor). As Kilby expressed his idea in
his notebook on 24 July 1958: ‘The following circuit elements could be made on a
single slice: resistors, capacitor, distributed capacitor, transistor’ (quoted in Reid,
2001, p. 77). The tedious and error-prone manual process of testing and assembling circuits from discrete components could be eliminated. However, it was not
the application of integrated circuits in computers that primarily motivated and
drove the development of this technology but more or less frivolous applications
such as digital wristwatches and pocket calculators. But as the integrated circuit
technology matured and prices fell, small computer manufacturers such as Digital
Equipment and Data General adopted the integrated circuit as the basis of compact ‘minicomputers’ that could be sold for less than $100,000. At this stage, interactive computing in the form of engineering workstations slowly began to become a reality beyond a few exceptionally well-equipped computing research laboratories. However, it took the microprocessor to make interactive computing an
economic reality.
It is worth noting that the microprocessor was not developed for the computer
industry at all, but was conceived as a means of reducing the cost and time required to design circuitry for appliances such as electronic calculators. One could
say that the ‘numbers barrier’ had been replaced by a ‘circuit designer barrier’. In
1969, Intel, then a new company, landed a contract to develop a set of calculator
chips for Hayakawa Electric Co. Tasked to work on the project, Marcian E. Hoff suggested making a general-purpose computer chip instead of a set of hard-wired
calculators. The idea was subsequently developed by Stan Mazor and Federico
Faggin and others, and the first ‘micro-programmable computer on a chip’, the Intel 4004, was eventually released at the end of 1971: some 24 years after the discovery of the transistor and the demonstration of the first stored-program digital computer, and 20 years after the conception of interactive computing was first realized in Whirlwind. The Intel 4004 was followed by the 8008 (an 8-bit word-length processor) chip, presented in 1972. Parallel developments occurred at Texas Instruments. By the mid-seventies the microprocessor was an established and
accepted technology complete with computer-aided design environments: ‘In no
more than five years a whole industry had changed its emphasis from the manufacture of dedicated integrated circuits [for pocket calculators, etc.], to the manufacture of what in effect were very small computers’ (Braun and Macdonald,
1978, p. 110). Where the 4-bit Intel 4004 of 1971 contained 2,300 devices (transistors, etc.), the 32-bit processors that began to be produced about ten years later
integrated about 200,000 transistors and were comparable with mainframe computers in performance (ibid., pp. 108-112).
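These two data points, 2,300 devices in 1971 and roughly 200,000 about ten years later, imply a doubling time of about a year and a half, in line with what later came to be known as Moore’s law. A back-of-the-envelope check:

```python
# Back-of-the-envelope check of the transistor counts quoted above:
# 2,300 devices (Intel 4004, 1971) to roughly 200,000 about ten years later.
import math

growth = 200_000 / 2_300        # roughly an 87-fold increase
doublings = math.log2(growth)   # about 6.4 doublings in ten years
doubling_time = 10 / doublings  # about 1.6 years per doubling
```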
As pointed out by Braun and Macdonald (ibid.), underlying this remarkable
leap was a ‘favorable constellation’ of factors in the semiconductor industry.
Firstly, the invention of electronic calculators and their success had created a mass
market for integrated circuits. Advances in solid-state physics in general and semiconductor manufacturing technologies in particular (e.g., clean-room fabrication
of ‘metal–oxide–semiconductors’ or MOS by means of photolithography) made it
possible to produce integrated circuits with increasing densities and low power
consumption. Secondly, related technical developments in high-density MOS
technology made it possible to produce semiconductor memory chips. Thirdly,
minicomputer technology had matured to the stage that their hardware architectures could serve as a model for microprocessor design. And so on.
The microprocessor, initially developed for calculators, was shifted laterally
and formed a critical component technology in the personal computer that was
then created. In the words of Gordon Bell, the designer of many of Digital’s PDP
computers, in his keynote address to the ACM conference on the History of Personal Workstations, ‘The microprocessor, memory, and mass storage technology
appearing in 1975 lead directly to the personal computer industry’. Improvements
in these technologies were ‘the sole enabling determinants of progress in computing’ (1988, pp. 9 f.).
That is, when Engelbart undertook his experimental work on the NLS in the second half of the 1960s, integrated circuits were in their infancy. The computer
platform that was available to his Augmentation Research Center for its experiments consisted of a couple of CDC minicomputers, but in 1967, the laboratory,
thanks to a grant from ARPA, was able to acquire its first time-sharing computer;
it cost more than $500,000 (Bardini, 2000, pp. 123, 251). The Xerox Alto, released only seven years later, in 1974, had a performance comparable to that of a
minicomputer but the cost had dwindled to about $12,000. This was still well
above what was commercially viable, but the Alto was in any case primarily developed as an experimental platform. Eventually 1,500 machines were built and used
by researchers at Xerox and at a few universities (Thacker, 1988).
‘Alto served as a valuable prototype for Star. Over a thousand Altos were eventually built, and
Alto users have had several thousand work-years of experience with them over a period of eight
years, making Alto perhaps the largest prototyping effort in history. There were dozens of experimental programs written for the Alto by members of the Xerox Palo Alto Research Center. Without the creative ideas of the authors of those systems, Star in its present form would have been
impossible.’ (D. C. Smith, et al., 1982b, p. 527 f.).
In other words, thanks to the drastic reduction in the cost of computing power brought about by microprocessor technology, an experimental platform for interactive computing research had become available. And then, eventually, one day in 1975 Apple co-founder Stephen Wozniak could walk in and buy a MOS Technology 6502 microprocessor for $20 (Freiberger and Swaine, 2000, p. 264), and less than ten
years later Apple could release the first Macintosh at a price of about $2,500. The
technology of interactive computing was, so to speak, dormant for a decade or
more, until Moore’s law propelled the technology forward (cf. also Grudin, 2006a,
b).
The long gestation period was not due to theoretical shortcomings. For the
technologies of interactive computing were never derived from any preexisting
theoretical knowledge. In fact, the technologies of interactive computing were developed by computer technicians to satisfy requirements they themselves had
formulated on the basis of principles and concepts known from their own daily
work practices. On one hand, in the design of the Alto and the Star the technicians
built on and generalized concepts that were deeply familiar to any Western adult
working in an office environment. In their own words, ‘the Star world is organized in terms of objects that have properties and upon which actions are performed. A few examples of objects in Star are text characters, text paragraphs,
graphic lines, graphic illustrations, mathematical summation signs, mathematical
formulas, and icons’ (D. C. Smith, et al., 1982b, p. 523). On the other hand, in
interpreting these everyday concepts technically the designers applied the conceptual apparatus of object-oriented programming (objects, classes of objects,
properties, messages) that had been developed by Kristen Nygaard and Ole-Johan
Dahl as a technique to obtain ‘quasi-parallel processing’ (Dahl and Nygaard,
1966; cf. also Nygaard and Dahl, 1978) and had later been further developed in the early 1970s by Alan Kay and others in the form of Smalltalk:
‘Every object has properties. Properties of text characters include type style, size, face, and posture
(e.g., bold, italic). Properties of paragraphs include indentation, leading, and alignment. Properties
of graphic lines include thickness and structure (e.g., solid, dashed, dotted). Properties of document icons include name, size, creator, and creation date. So the properties of an object depend on
the type of the object.’ (D. C. Smith, et al., 1982b, p. 523)
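The object model described in the quotation can be sketched in a few lines. The property names below are taken from the quoted examples; the code itself is, of course, a present-day illustration, not Star’s implementation:

```python
# Sketch of Star's object/property model: every object has properties, and
# which properties it has depends on the type of the object. The property
# names follow the quotation; everything else is illustrative.

class StarObject:
    properties = ()  # property names legal for this type of object

    def __init__(self, **props):
        for name in props:
            if name not in self.properties:
                raise ValueError(
                    f"{type(self).__name__} has no property {name!r}")
        self.props = props

class Character(StarObject):
    properties = ("style", "size", "face", "posture")

class Paragraph(StarObject):
    properties = ("indentation", "leading", "alignment")

class DocumentIcon(StarObject):
    properties = ("name", "size", "creator", "creation_date")

c = Character(size=12, posture="italic")  # legal for a character
p = Paragraph(alignment="justified")      # legal for a paragraph
# Paragraph(posture="bold") would raise: paragraphs have no 'posture'.
```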
Similarly, the technicians could build on ‘fundamental computer science concepts’ concerning the manipulation of data structures in order to provide application-independent or ‘generic’ commands that would give a user the ability to master multiple applications and to ‘move’ data between applications:
‘Star has a few commands that can be used throughout the system: MOVE, COPY, DELETE,
SHOW PROPERTIES, COPY PROPERTIES, AGAIN, UNDO, and HELP. Each performs the
same way regardless of the type of object selected. Thus we call them generic commands. […]
These commands are more basic than the ones in other computer systems. They strip away extraneous application specific semantics to get at the underlying principles. Star’s generic commands
are derived from fundamental computer science concepts because they also underlie operations in
programming languages. For example, program manipulation of data structures involves moving
or copying values from one data structure to another. Since Star’s generic commands embody fundamental underlying concepts, they are widely applicable. Each command fills a host of needs.
Few commands are required. This simplicity is desirable in itself, but it has another subtle advantage: it makes it easy for users to form a model of the system.’ (D. C. Smith, et al., 1982b, p.
525).
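What makes such a command ‘generic’ can be illustrated directly: MOVE, COPY, and DELETE written once, against any container of objects, performing the same way regardless of the type of the object selected, just as value-moving operations do in a programming language. The sketch below is a modern illustration, not Star code:

```python
# Generic commands in the Star sense: one implementation each of MOVE,
# COPY, and DELETE, applicable to any object in any container.
# (Illustrative sketch; Star's actual commands operated on screen selections.)

def copy(obj, target):
    target.append(obj)   # like copying a value into a data structure

def move(obj, source, target):
    source.remove(obj)   # MOVE = removal from source + insertion in target
    target.append(obj)

def delete(obj, source):
    source.remove(obj)

document = ["a paragraph", "a graphic line"]
folder = ["an icon"]

move("a paragraph", document, folder)  # same command, whatever the object
copy("an icon", document)
delete("a graphic line", document)
```

Because the commands embody a single underlying concept, few of them are required, and each applies everywhere: the simplicity the Star designers were after.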
The important point here is that the design concepts reflected the technicians’ own
practical experience, in that they could generalize the concepts of their own work
(typographical primitives such as characters, paragraphs, type styles, etc.) and draw on similar concepts already generalized in computer science (MOVE, COPY, DELETE, etc.).
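The idea can be illustrated with a minimal sketch (hypothetical, and of course not Star’s actual implementation): a single MOVE or COPY command that acts uniformly on any selected object, whatever its type, just as moving or copying values between data structures does in a programming language.

```python
# A minimal, hypothetical sketch of 'generic' commands in the spirit of the
# Star design: MOVE and COPY act uniformly on any object, regardless of type.

def copy(obj, source, destination):
    """COPY: the object appears at the destination; the source is unchanged."""
    destination.append(obj)

def move(obj, source, destination):
    """MOVE: copy the object to the destination, then remove it from the source."""
    copy(obj, source, destination)
    source.remove(obj)

# The same two commands apply to document icons, words, or any other object.
folder_a = ["report.doc", "budget.xls"]
folder_b = []
move("report.doc", folder_a, folder_b)

paragraph = ["The", "quick", "fox"]
clipboard = []
copy("quick", paragraph, clipboard)
```

The point of the sketch is that neither command knows anything about what kind of object it is handling: the application-specific semantics have been stripped away, leaving only the underlying data-structure operation.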
No ethnographic fieldwork was required to develop an understanding of concepts like character, paragraph, line, and illustration or move, copy, and paste. For
a long stretch, then, interactive computing technologies could be further developed and refined in a manner very similar to the way in which email was developed: by technicians building tools for their own use, based on their daily practices and in a rather intuitive iterative design and evaluation process. Thus, for example, when Steve Wozniak was designing the Apple II, he ‘was just trying to
make a great computer for himself and impress his friends at the Homebrew
Computer Club. […] Most of the early Apple employees were their own ideal
customers.’ The members of the Macintosh design team were similarly motivated:
‘we were our own ideal customers, designing something that we wanted for ourselves more than anything else’ (Hertzfeld, 2005, pp. xvii f.). They knew the concept of interactive computing from the work of Engelbart (who had been building on
the Whirlwind experience); they knew from their own lives what they would require of its design; and they had the advanced technical skills to realize their ideas.
That is, the technologies of interactive computing were initially developed in
the course of a deliberate design effort (in the Whirlwind and Cape Cod projects),
drawing upon principles of ‘man-machine systems’ based on experiences from
servomechanisms. After that, the technologies of interactive computing were subjected to two decades of almost ‘arrested growth’. However, the technology of
microprocessors, that is, mass-produced CPUs, provided a burgeoning platform of development on which the computer scientists at SRI, at Xerox PARC, at Apple, and
elsewhere could extend, elaborate, and refine the principles of interactive computing on the basis of their own practical experiences.
The first CHI conference convened in 1982, that is, one year after the release
of the Xerox Star and two years before the release of the first Macintosh. In other words, as an institutionalized research field, HCI only emerged after the technology had
matured and was on the way out of the laboratories. This has caused all kinds of
soul searching.
As shown earlier, John Carroll noted that ‘some of the most seminal and momentous user interface design work of the last 25 years’ such as Sutherland’s
Sketchpad and Engelbart’s oN-Line System (NLS) ‘made no explicit use of psychology at all’ (Carroll, 1991, p. 1). Carroll’s observation is fully vindicated by
the original documents, although it should be mentioned that psychologists at
RAND and the SDC such as Allen Newell were involved in the development of
the SAGE system software already from the 1950s, tasked with testing the software modules that were to support operators at the air defense direction centers
(Chapman, et al., 1959; Baum, 1981). That is, psychology did contribute to the
development of interactive computing, but the contribution was made almost a
decade prior to Sutherland and Engelbart’s work and had become incorporated in
the interactive computing paradigm as represented by real-time transaction processing technology that formed the technological baseline for their work.97
As already noted, Carroll’s observations also apply to the development of the
Xerox Alto and the Xerox Star. But again a little nuance is required, inasmuch as
human factors researchers were involved in ‘human factors testing’ of the design
of the Star. Some of that work was reported at the first CHI conference by Bewley
et al. (1983). However, in the same paper Bewley et al. claim that ‘Recognizing
that design of the Star user interface was a major undertaking, the design team
approached it using several principles, derived from cognitive psychology’ (p.
72). The authors provide no justification for this claim, nor is it supported by the
available contemporary documentation concerning the ‘Star’ design process. In
fact, the very general features they cite were already present in the design of the
Alto that was designed ten years earlier. The claim thus has the appearance of a
rationalization. On the other hand, just for the record, the development of the
mouse did draw upon principles from experimental psychology, in particular
‘Fitts’ law’ (1954) concerning the capacity of the human motor system (cf. D. C.
Smith, et al., 1982b).
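For reference, Fitts’ law predicts the time needed to acquire a target from its distance and its size. In the formulation now standard in HCI (the so-called Shannon formulation, a later refinement of Fitts’ original 1954 statement, given here for illustration rather than as part of the Star documentation):

```latex
MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)
```

where $MT$ is the movement time, $D$ the distance to the target, $W$ the width of the target, and $a$ and $b$ empirically determined constants. The practical import for input-device design is that pointing time grows only logarithmically with the ratio of distance to target size.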
Anyway, and as a matter of fact, the principles of ‘direct manipulation’ were
only articulated and systematized a decade after the release of the Xerox Alto
(Shneiderman, 1983; Hutchins, et al., 1986). HCI research
developed as a post festum systematization effort devoted to understanding the
principles of the interactive computing techniques that had emerged over three
decades, from the Whirlwind to the Alto, the Star, and the Macintosh.
2.5.5. Interactive computing and the cybernetic notion of ‘human-computer
system’
Interactive computing had its intellectual roots in the real-time transaction processing technologies that in turn emerged from control-engineering conceptions of
human control of processes in ‘real time’ (gun control, bomb aiming, servo steering). Generating research traditions such as ‘man-machine systems’ research, this
97 Stuart Card remarks cannily that what Newell was engaged in at RAND and SDC today would be called
‘Computer Support for Cooperative Work’ (1996, p. 259), a claim not without merit.
conception of machinery has left a rich and quite influential legacy: a conceptual
framework that has been expressed in different terms over time — from the original notions of ‘man-machine system’ (Sutherland, 1963), ‘man-computer symbiosis’ (Licklider, 1960), and ‘human intellect’ ‘augmentation system’ (Engelbart,
1962) to contemporary notions of ‘human-computer interaction’ and ‘interaction
design’ — but the underlying conception is the same. What unites all these different notions is the concept of unity of control in the man-machine system dyad: the
notional steersman directing the dyad towards his or her ‘goal’. It is thus, invariably, presumed that control is vested in an individual or in a ‘team’ or ‘community’
acting in such a way that unity of control can be taken as a given.
The argument can be made that the basic concept of HCI — the human-computer dyad — still reflects this outlook and that the defining conceptual axis
of HCI research has remained that of the human-machine dyad. It goes without
saying that communication and interaction among people is not excluded from
this conception, but the conception is incommensurable with conceptions of the
essentially distributed nature of cooperative work and of the sociality of human
competencies. This becomes evident when attempts are made to address technical requirements of cooperative work under the aegis of this conception. When Engelbart and Lehtman in 1988 outlined their vision of a ‘community handbook’ as ‘a monolithic whole’, ‘a uniform, complete, consistent, up-to-date integration of the special knowledge representing the current status of the community’ (Engelbart and Lehtman, 1988), the essentially distributed nature of such a construct was entirely absent from the conception.
The very same conception underlies the ‘groupware’ research paradigm: computing technologies that facilitate the exchange of messages and files and thereby
‘collaboration’ or ‘group work’ or ‘shared knowledge’ among dispersed individuals working together ‘at a distance’ as a ‘group’ or ‘team’ towards a ‘collaborative
goal’ (cf., for example, G. M. Olson and Olson, 2003). The problem with such
notions is that the issue of the distributed character of cooperative work — the
issue of the distributed control of coordination — is reduced to issues of geographical dispersion: the issues of heterogeneity of practices, incongruence and
incommensurability of conceptual schemes, etc. are not reflected and cannot be
integrated in the cybernetic ‘human-machine system’ conception.
2.6. Facilitation of articulation work: Computer-mediated communications
Computer-mediated communications began with the ‘time-sharing’ operating systems that matured around 1960. Since computer systems at the time were excessively expensive (printed circuit boards and core memory were only then becoming standard technologies), it was mandatory that computer systems operated close to full capacity. Consequently, most of the
few computers that were around were running in a batch-processing mode, one
job after another on a ‘first-in, first-served’ basis, or, as it was aptly expressed by
J. C. R. Licklider, the ‘conventional computer-center mode of operation’ was ‘patterned after the neighborhood dry cleaner (“in by ten, out by five”)’ (Licklider and
Clark, 1962, p. 114). This economic regime could be tolerated for administrative
purposes such as payroll and invoice processing that, since punched-card tabulating had been adopted earlier in the century, was organized on batch-processing
principles anyway. But in most work settings, this regime effectively precluded all
those applications that require high-frequency or real-time interactions between
user and the digital representations. In particular, the ‘in by ten, out by five’ regime made programming, especially debugging, a deadening affair. This gave ordinary computer technicians a strong motive for devising alternative modes of operation. As described by O’Neill, researchers at MIT, who had ‘cut their teeth’ on
Whirlwind and had experienced interactive computing first-hand, ‘were unwilling
to accept the batch method of computer use for their work’ and ‘sought ways to
continue using computers interactively’. However, ‘it would be too costly to provide each user with his or her own computer’ (O’Neill, 1992, p. 44). So, around
1960 the idea of letting a central computer system service several users ‘simultaneously’ was hatched. In the words of John McCarthy, one of the fathers of the
idea, the solution was an operating system that would give ‘each user continuous
access to the machine’ and permit each user ‘to behave as though he were in sole
control of a computer’ (McCarthy, 1983). The first running operating system of
this kind seems to have been the Compatible Time-Sharing System or CTSS. It
was built at MIT by a team headed by Fernando Corbató and was first demonstrated in 1961 (Corbató, et al., 1962). The various users were connected to the
‘host’ computer via terminals and each would have access to the computing power
of the ‘host’ as if he or she was the only user.
Now, the users of the first of these systems were typically engaged in cooperative work. Some were engaged in developing operating systems or other large-scale software projects and were, as a vital aspect of this, engaged in various
forms of discourse with colleagues within the same project teams and research
institutions, that is, with colleagues already connected to the central computer system. Likewise, software technicians would need to coordinate with system operators about possibly lost files to be retrieved, about eagerly-awaited print jobs in
the queue, etc. The time-sharing operating system they were building or using
provided a potential solution to this need, and the idea of using the system to
transfer text messages from one worker to another did not require excessive technical imagination. As one of the designers of one of the first email systems recalls:
‘[CTSS] allowed multiple users to log into the [IBM] 7094 from remote dial-in terminals[] and to
store files online on disk. This new ability encouraged users to share information in new ways.
When CTSS users wanted to pass messages to each other, they sometimes created files with names
like TO TOM and put them in "common file" directories, e.g. M1416 CMFL03. The recipient could
log into CTSS later, from any terminal, and look for the file, and print it out if it was there.’ (Van
Vleck, 2001)
A proper mail program, ‘a general facility that let any user send text messages to
any other, with any content’ was written for CTSS by Tom Van Vleck and Noel
Morris in the summer of 1965 (ibid.). It allowed one programmer to send a message to other individual programmers, provided one knew the project they worked
on, or to everybody on the same project. The message was not strictly speaking
‘sent’; it was appended to a file called MAILBOX in the recipient’s home directory.
The same year Van Vleck and Morris also devised a program (.SAVED) ‘that allowed users to send lines of text to other logged-in users’, that is, a primitive form
of ‘instant messaging’ (ibid.).
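The mechanism can be illustrated with a minimal sketch (hypothetical; the actual CTSS MAIL command was of course not written in Python, and the function names below are invented): ‘sending’ simply appends the message to a MAILBOX file in the recipient’s home directory on the shared machine, where the recipient’s own session later reads it.

```python
# Hypothetical sketch of the CTSS-style mail mechanism: a message is not
# transmitted anywhere; it is appended to a MAILBOX file in the recipient's
# home directory on the shared machine.
import os
import tempfile

def send_message(home_root, recipient, sender, text):
    """Append a message to the recipient's MAILBOX file."""
    mailbox = os.path.join(home_root, recipient, "MAILBOX")
    os.makedirs(os.path.dirname(mailbox), exist_ok=True)
    with open(mailbox, "a") as f:
        f.write(f"FROM {sender}: {text}\n")

def read_mailbox(home_root, user):
    """The recipient reads accumulated messages at next login."""
    mailbox = os.path.join(home_root, user, "MAILBOX")
    if not os.path.exists(mailbox):
        return []
    with open(mailbox) as f:
        return f.read().splitlines()

# Usage: two users sharing one time-sharing machine.
root = tempfile.mkdtemp()
send_message(root, "tom", "noel", "Your print job is in the queue.")
messages = read_mailbox(root, "tom")
```

The sketch makes the limitation discussed below concrete: since ‘delivery’ is just a write to a local file, sender and recipient must have accounts on the same host.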
The scope of the exchange of messages with these and similar programs was
limited by the boundary of the hierarchy comprising the local central computer
system and the terminals connected to it. Messages could not travel beyond the
event horizon of this black hole. This world of isolated systems dissolved with the
development of network computing. The motivation driving the development was
(again) not to develop facilities for human interaction, not to mention cooperative
work, but to utilize scarce resources in a more economical way.
For Licklider, who also initially headed the development of ARPANET, the
motivation for the network was to reduce ‘the cost of the gigantic memories and
the sophisticated programs’. When connected to a network, the cost of such
shared resources could be ‘divided by the number of users’ (Licklider, 1960, p. 8).
This vision was spelled out by Lawrence Roberts, who took over as manager of
the development of ARPANET in 1966. Noting that the motivation for past at-
tempts at computer networks had been ‘either load sharing or interpersonal
message handling’, he pointed to ‘three other more important reasons for
computer networks […], at least with respect to scientific computer applications’, namely:
‘Data sharing: The program is sent to a remote computer where a large data base exists.’
‘Program sharing: Data is sent to a program located at a remote computer and the answer is returned.’
‘Remote Service: Just a query need be sent if both the program and the data exist at a remote location. This will probably be the most common mode of operation until communication costs come
down [since this] category includes most of the advantages of program and data sharing but requires less information to be transmitted between the computers.’ (Roberts, 1967, p. 1).
On the other hand, electronic mail was explicitly ruled out as of no significance:
‘Message Service: In addition to computational network activities, a network can be used to handle
interpersonal message transmissions. This type of service can also be used for educational services
and conference activities. However, it is not an important motivation for a network of scientific
computers’. (Roberts, 1967, p. 1).
These motivations for developing the ARPANET were upheld and confirmed as
the net began to be implemented:
‘The goal of the computer network is for each computer to make every local resource available to
any computer in the net in such a way that any program available to local users can be used remotely without degradation. That is, any program should be able to call on the resources of other
computers much as it would call a subroutine. The resources which can be shared in this way include software and data, as well as hardware. Within a local community, time-sharing systems
already permit the sharing of software resources. An effective network would eliminate the size
and distance limitations on such communities.’ (Roberts and Wessler, 1970, p. 543).
Thus, as summarized by Ian Hardy in his very informative history of the origins of network email, the primary motive was economic.
‘ARPANET planners never considered email a viable network application. [They] focused on
building a network for sharing the kinds of technical resources they believed computer researchers
on interactive systems would find most useful for their work: programming libraries, research data, remote procedure calls, and unique software packages available only on specific systems.’
(Hardy, 1996, p. 6).
After pioneering work on the underlying packet-switching architecture and
protocols, the experimental ARPANET was launched in 1969, connecting a measly
four nodes.98 In the summer of 1971, when the network had expanded to fifteen
nodes, a programmer named Ray Tomlinson at BBN (Bolt, Beranek, and Newman), devised a program for sending email over the network. He recalls that,
while he was making improvements to a single-host email program (SNDMSG) for
a new time-sharing operating system (TENEX) for the PDP-10 minicomputer,
‘the idea occurred to [him]’ to combine SNDMSG with an experimental file-transfer
protocol (CPYNET) to enable it to send a message across the network, from one
host to another, and append it to the recipient’s MAILBOX file. For that purpose he
also devised the address scheme NAME@HOST that has become standard (Hardy,
1996; Tomlinson, 2001). ‘The first message was sent between two machines that
were literally side by side. The only physical connection they had (aside from the
floor they sat on) was through the ARPANET’, that is, through the local Interface
Message Processor (IMP) that handled packet switching. To test the program, he
sent a number of test messages to himself from one machine to the other. When
he was satisfied that the program seemed to work, he sent a message to the rest of
his group explaining how to send messages over the network: ‘The first use of
network email announced its own existence’ (Tomlinson, 2001). The program
was subsequently made available to other sites on the net that used TENEX on
PDP-10s and was soon adapted to the IBM 360 and other computers (Salus, 1995, p.
95).
An instant success within the tiny world of ARPANET programmers, this very
first network email program triggered a chain reaction of innovation that within
less than a couple of years resulted in the email designs we use today: a list of
available messages indexed by subject and date, a uniform interface to the handling of sent and received mail, forwarding, reply, etc. — all as a result of programmers’ improving on a tool they used themselves. In 1977, an official ARPANET standard for electronic mail was adopted (Crocker, et al., 1977).
The history of network email after that is well known. The technology migrated beyond the small community of technicians engaged in building computer
98 Much of the pioneering work on packet switching was done by Donald Davies and his colleagues at the
British National Physical Laboratory (Davies, 1965, 1966). The NPL network, which was also launched in
1969, was built with rather similar objectives in mind (On NPLNet, cf. also Abbate, 1999).
networks to computer research in general and from there to the world of science
and eventually to the world at large.
What is particularly remarkable in this story, and what also surprised those involved when they began to reflect on the experience, was ‘the unplanned, unanticipated and unsupported nature of its birth and early growth. It just happened, and
its early history has seemed more like the discovery of a natural phenomenon than
the deliberate development of new technology’ (Myer and Dodds, 1976, p. 145).
And at a meeting in January 1979, convened to discuss ‘the state of computer
mail in the ARPA community and to reach some conclusions to guide the further
development of computer mail systems’, it was ‘noted’ as a fact ‘that most of the
mail systems were not formal projects (in the sense of explicitly sponsored research), but things that “just happened”’ (Postel, 1982, p. 2). In the same vein, the
official ARPANET Completion Report notes that ‘The largest single surprise of
the ARPANET program has been the incredible popularity and success of network mail’ (Heart, et al., 1978, p. III-110).
Network email was not only ‘unplanned’ and ‘unanticipated’; it was ‘mostly
unsupported’ (Heart, et al., 1977, p. III-67; Abbate, 1999, p. 109), for, as noted
already, the objective of the ARPANET was resource sharing. Not only had Lawrence Roberts not included network mail in the original plans for the network, he
had excluded it as ‘not an important motivation for a network of scientific
computers’ (Roberts, 1967, p. 1). However, as Janet Abbate points out in her history of the emergence of the Internet, the aimed-for resource sharing failed to materialize. By 1971, when the original fifteen sites for which the net had been built
were all connected, ‘most of the sites were only minimally involved in resource
sharing’ (Abbate, 1999, p. 78). She notes that the ‘hope that the ARPANET could
substitute for local computer resources was in most cases not fulfilled’, and adds
that, as the 1970s progressed, ‘the demand for remote resources actually fell’,
simply because minicomputers ‘were much less expensive than paying for time on
a large computer’ (ibid., pp. 104 f.). The budding network technology represented
by ARPANET was on the verge of being superseded and outcompeted by the proliferation of relatively inexpensive minicomputers. Thus, ‘Had the ARPANET’s
only value been as a tool for resource sharing, the network might be remembered
today as a minor failure rather than a spectacular success’ (ibid., pp. 104-106).
As it happened, email quickly ‘eclipsed all other network applications in volume of traffic’ (Abbate, 1999, p. 107). In fact, according to Hafner and Lyon,
email amounted to 75 percent of the traffic on the net as early as 1973 (2003, pp.
189, 194). One reason why the ARPANET succeeded as an experiment was that
‘since members used the network protocols in their own work, they had the incentive and the experience to create and improve new services’ (Abbate, 1999, p. 69).
That is, as in the case of local email on time-sharing operating systems, network email came as an afterthought, devised by computer technicians for their
own use, as a means for coordinating their cooperative effort of building, operating, and maintaining a large-scale construction, in this case the incipient Internet.
Email was thrown together like the scaffolding for a new building, only to become a main feature, relegating the resulting building itself, which had been the
original and official objective, to something close to a support structure.
This pattern — technicians building tools for use in their own laboratories —
was to be repeated again and again, as evidenced by, for example, CSNET, developed by Larry Landweber and others in 1979-81 to provide under-privileged computer scientists with access to ARPANET as well as mail, directory services, news, announcements, and discussion on matters concerning computer science; USENET,
which was developed in 1979 by Jim Ellis and Tom Truscott as a news exchange
network for Unix users without access to ARPANET (cf. Quarterman, 1990, pp.
235-251; Hauben and Hauben, 1997); the World Wide Web developed in 1989 at
CERN by Tim Berners-Lee and Robert Cailliau (Berners-Lee, 1990; Gillies and
Cailliau, 2000); and the ARCHIE and GOPHER network file search protocols designed by Alan Emtage in 1989 and Mark P. McCahill in 1991, respectively
(Gillies and Cailliau, 2000).
3. The formation of CSCW
In important respects, digital computing technologies developed in response to
challenges faced by cooperative work. Firstly, the digital stored-program computer was developed as a technology for automating the large-scale calculation work
that since the time of the French Revolution had been performed cooperatively, in
an advanced form of division of labor (‘human computers’) but by the middle of
the 20th century was becoming far too complex to be carried out by the received
methods. Secondly, the basic concepts and techniques of interactive computing
were developed as a technology of facilitating large-scale cooperative work (air
defense, air traffic control, airline reservation) that had become too complex to be
done by conventional means, manual or mechanical. Thirdly, the basic concepts
and techniques of distributed messaging (network email, etc.) were developed as a
technology for facilitating and mediating the coordinative activities required by
the cooperative effort of building and managing a large-scale infrastructure,
namely, the ARPANET, and were quickly complemented by a host of other technologies such as mailing lists, news groups, file sharing, etc.
Furthermore, of course, with the arrival of affordable personal computers (such as the Apple II in 1977, the IBM PC in 1981, and the Macintosh in 1984) and the establishment of the basic Internet protocols (TCP/IP in 1983), interactive computing was reaching a state where it was becoming realistic for ordinary workers, not merely to have access to the strictly constrained repertoire of functionality available
in real-time transaction processing systems, but to have unrestricted local command of a computer with a palette of applications. It was thereby becoming realistic for computer technologies to become an integral part of cooperative work practices in entirely new ways.
3.1. Proto-CSCW: ‘Computer-mediated communications’
The new possibilities opened for several partly overlapping research areas with
different conceptual frameworks.
First of all, of course, email and other technologies of computer-based message
handling caught the attention of researchers and technology pundits as a fascinating new topic for studies of communication in organizations, as a new management technology along with the telegraph, teleprinters, fax, microfilm, etc., and
engendered a small corpus of evaluation studies of the presumptive effects and
impact of electronic mail (Uhlig, 1977; Kiesler, et al., 1984; Eveland and Bikson,
1987) as well as studies that can better be categorized as market analyses (Panko,
1977, 1984).
More importantly, a small but persistent research area was formed that focused
on developing a computing technology generally known as ‘computer conferencing’ (for an overview of this work, cf. Kerr and Hiltz, 1982). In fact, computer
conferencing research developed simultaneously with network email, in the
course of the 1970s. But in contrast to email, in a ‘conference’ communications
were not restricted to point-to-point message exchanges but were as a default
‘public’ exchanges among attendees at the online ‘conference’. That was the idea,
anyway. ‘Computer conferencing’ was originally merely a variant of the online
transaction processing paradigm, with dispersed participants logging in to a central host computer, with ‘messages’ treated like transactions, and with exchanges
similarly constrained by a pre-established structure. In EMISARI, for instance,
each ‘message’ was confined to 10 lines of text (Kerr and Hiltz, 1982), while control was centralized and fixed. ‘Computer conferencing’ was later, as distributed
network architectures became ubiquitous, implemented on a message-passing
model; but the notion of centralized control of the communication structure has
remained fundamental. In fact, the more ambitious experiments in this line of research, such as EMISARI and EIES by Turoff et al. (Turoff, 1971, 1972, 1973;
Hiltz and Turoff, 1978) or FORUM and PLANET by Vallee et al. (Vallee, 1974),
explored the rather grand design vision of ‘group communication’ structured according to some presumptively rational model.
Later, ‘computer conferencing’ was often advocated as a remedy for the ‘information overload’ which was seen as an inexorable consequence of point-to-point message exchange (Palme, 1984; Hiltz and Turoff, 1985), but, ironically, in
the 1980s ‘computer conferencing’ was often used, by users who at the time did
not have direct access to the Internet, as a platform for exchanging emails.
Anyway, ‘computer conferencing’ soon became categorized together with
email as ‘computer-mediated communication’, a label still very much in use (Kerr
and Hiltz, 1982; Hiltz and Turoff, 1985; Rice, 1987; Quarterman, 1990; Rice,
1990; Turoff, 1991; Barnes, 2003). However, the research agenda was never succinctly defined, but the general drift of the efforts was clear enough. In an early
central study of the field, commissioned by the NSF (Hiltz and Kerr, 1981), the
authors attempted to ‘collect and synthesize current knowledge about computer-
mediated communication systems’ (ibid., p. viii). However, the ‘synthesis’ focused on issues of ‘acceptance’. The authors note that the idea of computer-mediated communication ‘seems simple enough at first glance’, but ‘It is the applications and impacts that are startling, and the acceptance of the technology that
is problematical’ (Kerr and Hiltz, 1982, p. ix). That is, there was a problem with
‘acceptance’ but the technology was anyhow ‘startling’. Instead of reflecting on
underlying design assumptions, the authors argued that, ‘computer-mediated
communication […] requires that people accept fairly radical changes in the way
they work and even in the way they think, if they are to reap the potential benefits’ (ibid.).
In presenting the objectives of their book, Kerr and Hiltz highlighted the following issues in ‘computer-mediated communications’:
‘1. What are the important considerations in designing software or choosing a system from the
many available options and capabilities?
2. What factors determine whether such systems are likely to be accepted or rejected?
3. What are the likely impacts of such systems upon the individuals, groups, and organizations
which use them? It is not the economic costs and benefits, but the social problems and “payoffs”
in the form of enhanced performance and organizational efficiency that should be the main considerations in deciding whether or not to use a computer-mediated communication system.’ (Kerr and
Hiltz, 1982, p. x)99
This research agenda has, in a way, become classical: it can be seen as a model of
an approach to CSCW research that has been going on stubbornly ever since:
which factors determine user ‘acceptance’ and what are the likely impacts of such
systems?100 So, while the experiments with conferencing systems sometimes allowed for long-term use and thus evolution of ‘user behavior’ (e.g., Hiltz and
Turoff, 1981) and considerable effort was devoted to extensive empirical evaluation, the evaluation studies associated with computer conferencing never questioned the underlying principles and concepts of the conferencing idea: the evaluations invariably focused on ‘acceptance’ and ‘satisfaction’.
In any event, this whole line of research on ‘computer-mediated communication’ remained mired in common-sense notions of ‘communication’ with respect
to work practices and never began to address the principles and concepts underlying the various communication techniques that had been developed by practitioners and visionaries. The systematic conceptual effort required to transform technique into technology was absent.
However, the critical questions were raised, but outside of the small coterie of
computer conferencing researchers. The systematic conceptual effort was under-
99 A fourth issue was also listed, but is not really an ‘issue’ but a call for ‘formal evaluation and feedback
from users to guide the implementation’ (Kerr and Hiltz, 1982, p. x).
100 Take for example the research agenda of Sproull and Kiesler: ‘How will these technologies influence and
change organizations? Does a computer network make work groups more effective? How do people treat one
another when their only connection is a computer message? What kinds of procedures best suit long-distance
management using a computer network? What problems do these technologies alleviate — and what problems
do they create?’ (Sproull and Kiesler, 1991, p. ix).
364
Cooperative Work and Coordinative Practices
taken by a separate research program, normally and confusingly also categorized
as ‘computer-mediated communication’. The European ‘computer-mediated
communications’ community emerged in the wake of the European efforts to develop computer networking (cf. Gillies and Cailliau, 2000). As TCP/IP slowly became available in operating systems and developers began to be able to take it for
granted, and as the ‘message handling’ standards stabilized in the first half of the
1980s, European researchers, organized under the aegis of the European Commission’s COST-11 program, embarked on what was seen as the logical next step,
namely, developing the standards required for putting it all together: email as well
as directories, calendars, schedules, and so on.
The point of departure for these efforts is well summarized and exemplified by
this observation made by Bowers et al. and presented at a European conference on
‘teleinformatics’ (Speth, 1988): ‘Simple electronic mail meets only the most basic
messaging requirements of group communication, while relatively sophisticated
conferencing and bulletin board systems offering supposedly “all-purpose” facilities reflect their designers’ limited intuitions about what users will wish to do, and
such systems can be difficult to adapt to support specific tasks’ (Bowers, et al.,
1988, p. 195). That is, as opposed to the previous line of research on computer-mediated communications, the CMC research undertaken under COST-11 was
proactive, predicated on the insight that, for instance, the email technology that
had been developed by the enthusiastic ARPANET technicians for their own use
was rudimentary. The extant email protocol did not, for example, allow users to
configure the protocol for special purposes, nor was the protocol sufficiently expressive for general use in work settings. What was required was a specification
language that would allow users to express more than, say, ‘TO’, ‘FROM’, and ‘SUBJECT’. Accordingly, ambitious attempts were made to extend the email protocol by using speech-act theory as a grammar so as to enable users to express the illocutionary point of a message by categorizing it as a ‘REQUEST’ or a ‘PROMISE’; cf. THE COORDINATOR by Winograd and Flores (Flores, et al., 1988) and CHAOS by De Cindio, De Michelis, and Simone (De Cindio, et al., 1986).
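The idea can be illustrated with a minimal sketch of a message format that carries an illocutionary category alongside the usual header fields, together with a toy ‘conversation for action’ rule. The category set and all names here are hypothetical, loosely modelled on published accounts of THE COORDINATOR, not on its actual implementation:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical illocutionary categories in the spirit of speech-act theory
# (the actual category sets of THE COORDINATOR and CHAOS differed).
class Act(Enum):
    REQUEST = "request"
    PROMISE = "promise"
    DECLINE = "decline"
    REPORT_COMPLETION = "report-completion"

@dataclass
class Message:
    sender: str
    recipient: str
    subject: str
    act: Act          # the illocutionary point, beyond TO/FROM/SUBJECT
    body: str = ""

# A 'conversation for action' constrains which acts may follow which:
FOLLOW_UPS = {
    Act.REQUEST: {Act.PROMISE, Act.DECLINE},
    Act.PROMISE: {Act.REPORT_COMPLETION},
}

def may_follow(prev: Act, nxt: Act) -> bool:
    """Check whether `nxt` is a legal reply to `prev` in this toy protocol."""
    return nxt in FOLLOW_UPS.get(prev, set())
```

On this scheme a reply categorized as a PROMISE is a legal follow-up to a REQUEST, while the reverse is not; the point of such a grammar was precisely that the system, not just the reader, could act on the category.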
Similarly, it was realized that ‘email’ and ‘conferencing’ should be seen as
merely instances of a range of ‘group communication’ facilities that therefore had
to be systematically conceptually integrated. Realizing this, research under the
COST-11 program aimed at developing such a conceptual foundation. As a representative of one of the research efforts, the AMIGO project, Hugh Smith noted
that ‘a number of electronic message systems provide the necessary low-level
support for a variety of higher-level structured group communication activities’,
such as news distribution, conferencing, information storage and remote retrieval
but he then added that ‘there is no direct high-level processing support for these
activities within the messaging system’. Specifying a number of critical shortcomings, he emphasized that it was a ‘fundamental requirement that communicated information should be able to be used for more than one function’ but that this requirement was not met, and he went on to note that neither was the requirement
that many kinds of services were needed to ‘support structured communication
Chapter 11: Formation and fragmentation
365
activities in addition to the basic information transfer capability provided’. He further noted that there were ‘no standardized services to support group communication’, and that most existing systems were ‘inflexible and not easy for the end-users to adapt to their specific purposes’ (Hugh T. Smith, 1988, pp. 90 f.). He
called attention to the critical issue that conferencing systems such as those mentioned above ‘often have a centralized resource architecture’ and added that these
architectures therefore do not scale up: ‘In the future the sheer size and political/geographical separation of networked communities will demand that the resources and the management of group communication activities be distributed’
(ibid., p. 90). The COSMOS project, also under COST-11, was moving along parallel lines, focusing on developing ‘a high-level, user-oriented language by means
of which users can alter the structure of their communication environment’
(Bowers, et al., 1988, p. 195; cf. also Bowers and Churcher, 1988, p. 125).
However, the COST-11 researchers’ critical studies of existing ‘computer-mediated communications’ techniques and the rigorous attempts to reconstruct
‘computer-mediated communications’ on a systematic conceptual foundation led
to unanticipated conclusions: instead of concluding in a systematically reconstructed technology for ‘group communications’, this line of research ended in the
realization that the shortcomings they had identified were even deeper. Because of
their rigorous approach they realized that the problem was rooted in treating ‘communication’ as a separate kind of activity that, presumably, generally is (or can
be) carried out divorced from work practices. This realization led to a program
shift in CSCW as the focus was moved from the concept of ‘communication’ to
the concept of ‘cooperative work practices’.
3.2. The crisis of the message-handling paradigm
Although the experiments with ‘computer conferencing’ at the time were reported
as very promising and successful, this particular research program ran out of
steam. This had to do with underlying conceptual limitations. ‘Computer
conferencing’ research shared with the standard message exchange paradigm the
presumption that human communication generally is or can be treated as a distinct
activity. True, workers do interrupt their primary work to have conversations and
exchange notes, letters, memos about their work (and about other matters). They
also, occasionally, put their work aside to go to meetings. For some workers, e.g.,
managers, the major part of their work day may be spent in conversations and
meetings. But apart from managerial work and in the greater scheme of things,
conversations and meetings are exceptions, interruptions, ‘a necessary evil’ perhaps, or simply considered ‘a waste of time’. And even when workers engage in
conversations and meetings, such discourses are generally related to the state of
affairs in their work, to the flow of work, the schedule, the production facilities,
and the archives, and in their deliberations workers will discuss schedules, plans,
schemes, and so on; they will collate, arrange, distribute, present, hand out, walk
up to, gather around, point to, gesture at, inspect, amend, etc. all sorts of artifacts.
By the mid-1980s this insight began to mature and be voiced (cf., e.g., Bannon,
1986, p. 443). The ‘computer-mediated communications’ research program had
arrived at a critical juncture. The European ‘computer-mediated communications’
researchers soon realized that the ‘message-handling’ model underlying ‘computer-mediated communications’ was quite limited (Pankoke-Babatz, 1989a). In
work practices, communication is normally not a separate activity; it is typically
an integrated aspect of doing the work. It was therefore considered necessary to be
able to incorporate communication functionality in the various domain-specific
applications. On the other hand, the European ‘computer-mediated communications’ researchers rejected the ‘computer conferencing’ paradigm as a way to provide structure to the exchange of messages. Guided by ‘a strong commitment to
the actual situation in working life’ (Pankoke-Babatz, 1989c, p. 20), they rejected
the idea underlying the ‘computer conferencing’ paradigm of providing ‘a new
model’ of communication. Instead, they aimed at providing a model that ‘might
be used in the design and implementation’ of local and temporary ‘patterns’ of
interaction. That is, instead of deciding on a particular preconceived conception of
communication functionalities and applications, they ‘chose […] to look at activities and the regulations required by a group of people to co-operatively execute a
particular activity. The model we want to develop should therefore allow specification of such regulations’ (Pankoke-Babatz, 1989c, p. 20). That is, the aim was
to build what one could call an abstract model or a notation that would make it
possible ‘to model the activities, businesses, tasks, actions or work-flow[s], which
are performed by a group of co-operating people’, so as to, in turn, ‘facilitate the
required co-ordination and possibly to automate co-ordination, thus reducing the
co-ordination effort required of the participants in an activity’ (Pankoke-Babatz,
1989b, p. 14).
The European ‘computer-mediated communications’ researchers knew very
well that the development of such computational models and architectures would
have to be grounded in ‘fundamental understanding of Group Communication
processes’ (p. 14), which in turn, because of the complexity and variability of
working practices, would need contributions from ‘sociology, anthropology, economics and political science’ (p. 21). Their ‘strong commitment to the actual situation in working life’ was amply demonstrated in the predominance of the practice-oriented program in the European CSCW research community that began to
coalesce as these research activities ended in 1988.
This critique of the underpinnings of ‘computer-mediated communications’
was also expressed — clearly and succinctly — by Irene Greif in her ‘Overview’
of CSCW in her influential CSCW: A Book of Readings (1988b). Having noted
the rapid development of ‘computer-mediated communications’ from electronic
mail to computer conferencing she then observed:
‘Computer conferencing has since been expanded to support a wide range of “many-to-many
communication” patterns. However, when computer conferencing is applied to some task, the
model breaks down. The unstructured body of messages is suitable for the free-flowing text of
natural language, but does not let us set the computer to work on our problems. Designers who
draw pictures, software developers who jointly write code, financial analysts who collaborate on a
budget — they all need coordination capabilities as an integral part of their work tools. That means
coordination support within the CAD engineer’s graphics package, within the programmer’s
source-code editor, within the budget writer’s spreadsheet program. It means support for managing
versions of objects, be they pictures, programs, or spreadsheets. It means ways to distribute parts
of the object for work by contributing group members, ways to track the status of those distributed
parts, ways to pull completed objects back together again. The limit of electronic mail and computer conferencing is that they have such features for managing messages only. CSCW widens the
technology’s scope of application to all the objects we deal with.’ (Greif, 1988b, pp. 7 f.)
Greif’s judgment that ‘the model breaks down’ completely matched the diagnosis
that had been made in the European ‘computer-mediated communications’ research community. It is also significant that Greif had reached strikingly similar
conclusions with respect to the new research program: ‘Methodologies for testing
individual user interfaces don’t apply as well to group support systems. As a result, CSCW is looking more to anthropology to find methodologies for studying
groups at work in their natural settings’ (Greif, 1988b, p. 10).
In short, it was becoming clear that the ‘computer-mediated communications’
research program was deeply flawed in its underlying ‘message handling’ outlook, in its focus on communication in the abstract, divorced from the work practices of which it normally is an integral part, but also severely limited in the way it
conceived of the role of empirical studies in technological development. It was
becoming clear, at least to some, that in-depth studies of cooperative work practices in ‘natural settings’ were a prerequisite.
The technologies of computer-mediated communication had not failed in any
direct sense, nor had their potentials been exhausted. What had happened was
what the historian of technology Edward Constant, adopting Kuhn’s concept of
‘anomaly’, has termed ‘presumptive anomaly’. It occurs, not when systems based
on the technology fail ‘in any absolute or objective sense’ but when it is assumed
or known that the technology in question is seriously limited and that it will fail or
be inadequate under certain conditions: ‘No functional failure exists; an anomaly
is presumed to exist; hence presumptive anomaly’ (Constant, 1980, p. 15). Take,
for example, the turbojet revolution as described by Constant. This technological
development did not come about because conventional propeller aircraft technology had ‘failed or faltered by any means or measure: it still held out a great deal
of development still to be done; it still promised and in the event delivered greatly
increased performance’. But the insights of aerodynamics indicated that the conventional technology would fail when aircraft approached the speed of sound and
probably would become inefficient (in terms of speed, fuel efficiency) relative to
alternative technologies such as turbojet propulsion (Constant, 1980, pp. 15 f. et
passim).
The message-handling technologies were seen as having landed in a similar
presumptive crisis: on the communication-mediation paradigm, predicated on
technologies of message handling, it would not be possible to address the coordination challenges of ordinary cooperative work in a way that integrated communication and coordination with everyday work practices and techniques. It was clear
that message-handling technologies had critical limitations. That message-handling technologies were found critically limited with respect to work practices
does not mean, of course, that they were found useless for cooperative work settings or for other settings. Nor does it mean that message-handling technologies
could not be further developed, refined, etc. It just means that they were and remain of marginal relevance to key coordinative issues in cooperative work settings.101
3.3. Automation of articulation work: ‘Office automation’
At the same time as it was becoming clear to many ‘computer-mediated communications’ researchers, especially in Europe, that the ‘message handling’ paradigm
was at odds with typical everyday cooperative work practices and that the paradigm thus had to be overcome, researchers in the ‘office automation’ movement
were arriving at similar conclusions, although their point of departure was of
course entirely different.
The ‘office automation’ movement had begun in high spirits in the 1970s,
stimulated by different but intersecting technical developments. As with ‘computer-mediated communications’, the baseline was the advent of computer networks.
But the approach was radically different. Instead of conceiving of computer networks as a ‘medium’, that is, as a facility that regulates human interaction in negligible ways, the ‘office automation’ program deliberately aimed at regulating interaction in significant ways. The seminal idea was that various new techniques
for constructing executable models that had been invented made it worthwhile to
explore whether and to which extent such representations might be exploited as a
means of modelling and regulating ‘office procedures’ and other kinds of workflows: on one hand, the algebraic techniques for building computational models of
distributed systems developed by Petri and others since the early 1960s (cf., e.g.,
Zisman, 1977; Clarence A. Ellis, 1979) and, on the other hand, the equally sophisticated techniques for constructing complex adaptive models developed under the
Artificial Intelligence label (cf., e.g., Hewitt, 1977; Fikes and Henderson, 1980;
Barber and Hewitt, 1982). These hopes were soon defeated, however. Experimental applications such as DOMINO turned out to be experienced as ‘straitjackets’ in
actual use (Kreifelts, 1984; Victor and Sommer, 1989; Kreifelts, et al., 1991a;
Kreifelts, et al., 1993). Comparable lessons were learned from the CHAOS experiment (De Cindio, et al., 1988; Bignoli and Simone, 1989; Simone, 1993; Simone
and Divitini, 1999). That is, ‘office work’ was not at all as easily captured and
modelled as had been presumed. Handling contingencies and dealing with inconsistencies turned out to be an essential aspect of cooperative work practices. The
‘office automation’ program had landed in a crisis of its own.
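The kind of executable procedure model at issue can be suggested by a minimal place/transition (Petri-net-style) sketch of a hypothetical ‘order handling’ procedure; the places and transitions are invented for illustration and do not reproduce DOMINO or CHAOS. The rigidity that practitioners experienced as a ‘straitjacket’ shows up directly: a contingency with no matching transition leaves no legal move.

```python
# A minimal place/transition net for a toy 'order handling' procedure.
# A transition fires only when all its input places hold tokens -- which
# is exactly why an unanticipated contingency has no move available.

NET = {
    # transition: (input places, output places)
    "register_order": ({"order_received"}, {"order_registered"}),
    "approve":        ({"order_registered"}, {"approved"}),
    "ship":           ({"approved"}, {"shipped"}),
}

def enabled(marking: set, transition: str) -> bool:
    inputs, _ = NET[transition]
    return inputs <= marking

def fire(marking: set, transition: str) -> set:
    if not enabled(marking, transition):
        raise ValueError(f"{transition} not enabled")
    inputs, outputs = NET[transition]
    return (marking - inputs) | outputs

m = {"order_received"}
m = fire(m, "register_order")
m = fire(m, "approve")
# A contingency such as 'customer changes the order after approval'
# has no transition here: the model offers no legal move, and in
# actual use such rigidity was felt as a 'straitjacket'.
```

The simplification is deliberate: the point is not that such nets cannot be made richer, but that any fixed repertoire of transitions presumes the very closure of ‘office work’ that the field studies called into question.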
101 However, this also meant that CSCW as an institutionalized research field would be an arena in which
different research programs that do not address the same problem and are characterized by distinctly different
research modalities would co-exist.
At this point a new approach to technological research was devised: a few sociologists became involved in the effort to understand the status of ‘office procedures’ and cooperative work in general, on one hand Lucy Suchman and Eleanor
Wynn (Wynn, 1979; Suchman, 1982, 1983; Suchman and Wynn, 1984) and on
the other Eli Gerson and Susan Leigh Star (Gerson and Star, 1986).
That this coupling of sociological and technological research would first occur
in the ‘office automation’ movement was hardly accidental. Email and most other
communication technologies were devised by computer technicians for their own
use. That is, they were developed in a distributed and incremental fashion to solve
local problems in practices that were well-known to the designers; and as they
were found to be of general utility they were then — post festum — subjected to
standardization and design. Their development did not require workplace studies
of any kind. On the other hand, computer-conferencing systems were developed
in a proactive manner; they were strictly speaking designed. But their design was
based on normative models of what was claimed to be rational decision making,
not on what was taken to be a well-grounded understanding of an actual practice.
By contrast, however unrealistic the experimental designs of the ‘office automation’ movement turned out to be, nobody was under the illusion that one workflow model would fit all, and each workflow model was presumed to be empirically valid. That is, building technical systems that regulate actions and interactions in the strong sense envisioned by the ‘office automation’ movement was unproblematically thought to require some kind of analysis and modelling of existing procedures. When the models ultimately turned out not to work as anticipated,
the natural next step was to look more carefully at the reality of ‘office work’.
This is anyway what happened. And it was also realized, eventually, that the
problem was not just with this or that particular model or modelling technique. It
was realized that the problem was conceptual. Those early studies of ‘office work’
indicated that received concepts of cooperative work as mere ‘execution’ of preconceived ‘procedures’ were inherently problematic. This point was driven home,
emphatically, both by Gerson and Star and by Suchman in her contemporaneous
critique of the concept of ‘plans’ in cognitive science (Suchman, 1987).
This insight was a fatal blow to the conceptual basis of the ‘office automation’
movement.
3.4. The CSCW research program
The work of Suchman, Wynn, Gerson, and Star had significance beyond these, as
it were, immediate implications. It showed, by way of example, that in-depth studies of actual working practices could have strong impact on conceptual issues in
the development of computing technologies. It demonstrated the adequacy of, and
indeed need for, empirical studies of work practices on the model of Réaumur,
Perronet, and Beckmann — even in this area of, putatively, ‘applied’ science.
This, in my view, was the defining moment of CSCW. The early contributions
by Wynn, Suchman, Gerson, and Star provided the ‘exemplars’, in a Kuhnian
sense, for defining a new research program in which in-depth studies of cooperative work ‘in the wild’ were considered a prerequisite for developing computer
technologies for human interaction. However, we should remember that new research paradigms are not necessarily heralded as such when they arrive on the
scene. In fact, as pointed out by Kuhn, ‘we must recognize how very limited in
both scope and precision a paradigm can be at the time of its first appearance’.
Thus the ‘success of a paradigm […] is at the start largely a promise of success
discoverable in selected and still incomplete examples.’ (Kuhn, 1962, pp. 23 f.).
This observation certainly applies to the emergence of the practice-oriented research program of CSCW.
The exemplary role of these studies was not only a function of the findings or
of the role of field work in producing them. In both cases the research was integral to settings in which computer scientists and sociologists were addressing the
same set of problems. The work of Suchman and Wynn was, of course, an important part of the research at Xerox PARC where Suchman would later head a
highly influential interdisciplinary group of researchers. It is less well known but
important to note that the work of Gerson and Star anticipated much of what was later
to unfold in CSCW in that their research was part of a collaborative research network involving both sociologists and computer scientists. The network, which also included Carl Hewitt, Anselm Strauss, Rob Kling, Adele Clarke, Joan Fujimura, Walt Scacchi, and Les Gasser, brought together sociologists with a track
record in workplace studies of health care and biological research work as well as
computer scientists engaged in developing what would later be known as distributed AI and agent-based architectures.
So, when Liam Bannon and I wrote our programmatic article for the first European CSCW conference in 1989, this was the kind of work we had in mind:
‘CSCW should be conceived as an endeavor to understand the nature and characteristics of cooperative work with the objective of designing adequate computer-based technologies. That is,
CSCW is a research area addressing questions like the following: What are the specific characteristics of cooperative work as opposed to work performed by individuals in seclusion? What are the
reasons for the emergence of cooperative work patterns? How can computer-based technology be
applied to enhance cooperative work relations? How can computers be applied to alleviate the
logistic problems of cooperative work? How should designers approach the complex and delicate
problems of designing systems that will shape social relationships? And so forth. The focus is to
understand, so as to better support, cooperative work.’ (Bannon and Schmidt, 1989, p. 360).
In sum, two intellectual movements merged in the formation of CSCW. On one
hand, ‘computer-mediated communication’ as a technologically oriented research
program had arrived at a stage where it was beginning to dawn on many participants that the program was barking up the wrong tree. It had been focusing on aspects of interaction (‘communication’) that were conceived of as divorced from
work practices but which normally are an integral part of doing the work and
deeply enmeshed in the materiality of the setting and its organizational conventions and procedures. To move beyond that impasse, it was found necessary to
develop an understanding of actual cooperative work practices. On the other hand,
the ‘office automation’ program had landed in a situation where it had become
clear that formal organizational constructs such as prescribed procedures are not
mere algorithmic subroutines but part and parcel of professional work practices. It
was, again, found necessary to develop an understanding of actual cooperative
work practices. Here the history of CSCW proper begins.
A note of clarification. When I point to the early work of Suchman, Wynn,
Gerson, and Star as ‘exemplars’ of practice-oriented contributions to technological research, this of course does not mean that the formation of CSCW was not part
of a wider intellectual movement than circumscribed by Ethnomethodology and
Symbolic Interactionism. To the contrary. It was, and is, a distinct research effort
within a much broader movement that, in different ways, strives to understand
computing in its social context. Thus the Participatory Design movement, which
originated in Scandinavia (e.g., Bjerknes, et al., 1987; Ehn, 1988), brought together computer scientists and others striving to understand the development and
use of computing systems as embodied social practices (building on the work of
Marx, Heidegger, Wittgenstein). Likewise, subversive elements within Artificial
Intelligence such as Terry Winograd quite early had serious doubts as to the conceptual foundations of AI and defected, drawing on philosophical traditions as diverse as those of Heidegger and Austin (1986; 1986). In fact, AI was at the time under strong and effective external criticism from philosophers in the same traditions, e.g., Dreyfus (1979) and Searle (1987, 1989), and from a Wittgensteinian
tradition (e.g., Shanker, 1987b, c). At about the same time, a related movement
away from cognitive science towards an ‘ecological’ and ‘naturalistic’ conception
of computing based on Gibsonian psychology was unfolding in Human Factors
engineering (e.g., Vicente and Rasmussen, 1992; Flach, et al., 1995). When I
nonetheless point to these early ‘exemplars’ it is because they, in different ways
and from different intellectual traditions, demonstrated that in-depth studies of
work practices could contribute to the development of the conceptual foundations
of computing technology.
The fecundity of CSCW’s practice-oriented program became evident immediately, even as the program was being tentatively articulated. The first report on the
Lancaster group’s study of air traffic control was presented to the incipient CSCW
community in 1989 (Harper, et al., 1989a) and was quickly followed by the equally emblematic study of the London Underground control room (Heath and Luff,
1991a). Nor did it take long for it to become clear that these new insights would
have radical implications for not only the development of certain classes of applications but for underlying computer technologies. This was, for example, made
explicit with respect to the research area of distributed systems by Rodden and
Blair in their classic paper from 1991. Referring to the ‘rich patterns of cooperation found in CSCW’ as depicted in the early harvest of ethnographic studies,
the authors emphasized that coordinative practices are specific to work domain
and setting and that cooperating ensembles work ‘in dynamic and unexpected
ways and are themselves dynamic’. Having examined distributed systems architectures in the light of this insight, Rodden and Blair concluded that ‘existing ap-
372
Cooperative Work and Coordinative Practices
proaches to control in distributed systems are inadequate’ (Rodden and Blair,
1991, p. 49) and that the implications for technological research are fundamental:
‘For example, consider the problem of shared access to resources. In most distributed systems this
is dealt with by masking out the existence of other users. Hence sharing is transparent with each
user unaware of the activity of others. This clearly contradicts the needs of CSCW. […] The problem with this approach is that presumed control decisions are embedded into the system and hence
cannot be avoided or tailored for specific classes of application. This is the root of the problem in
supporting CSCW. Because of the dynamic requirements of CSCW applications, it is very unlikely
that such prescribed solutions will be suitable.’ (Rodden and Blair, 1991, p. 59).
Rodden and Blair concluded that ‘CSCW demands a fresh approach to control
which is specifically tailored for cooperative working’ (1991, p. 60). This was a
crucial programmatic proposition. The key problem for CSCW is not ‘communication’ or ‘resource sharing’ but cooperating actors’ control of their interaction
and, by implication, of the computational regulation of their interaction. The ‘root
of the problem in supporting CSCW’, they pointed out, is that coordinative protocols cannot be prescribed once and for all and that ‘control decisions’ must, ultimately, be in the hands of practitioners. This problem is fundamentally different
from the issue of user control of system behavior in HCI, in that control in cooperative work settings is, in principle, distributed. In sum, then, informed by the
findings of the initial ethnographic studies of cooperative work in real-work settings, Rodden and his colleagues, quite succinctly, articulated a line of research
that radically challenges fundamental assumptions in computer science as expressed in the architecture of operating systems, network services (client-server
protocols), database systems, etc. The ‘root of the problem in supporting CSCW’
has since then been spelled out and explored from different perspectives: ‘event
propagation mechanisms’ for ‘awareness’ support, ‘coordinative artifacts and protocols’, and so on.
With the initiation of this line of research, the CSCW research program had so
to speak been fully specified. The early studies by Wynn, Suchman, Gerson, and
Star had demonstrated how sociological inquiries could address conceptual issues
in technological research. The potential of the practice-oriented program, as exemplified by these early paradigms, had been decisively demonstrated in the studies of control room work and other studies that then made Rodden and his colleague subject distributed systems theory to critical scrutiny. Furthermore, the
work of Rodden and his colleagues also had significance beyond its immediate
implications for computer science. With their re-conceptualization of fundamental
issues in distributed computing, CSCW’s practice-oriented research program had
been complemented by an exemplar of the correlative technological research. The
reciprocality of the contributions of sociology and computer science respectively
had been exemplified.
In sum, what had been articulated was a technological research program devoted to the development of computing technologies — but not just that: it was a
quite special technological research program, namely, a technological research
program that depended critically on conceptual contributions from sociology and
anthropology. That is, the required empirical work was proactive in its orientation: intended to contribute to the conceptual foundation of the technology. In this
regard, CSCW is quite extraordinary: technological development predicated on
ethnographic studies.
This research program thus differed distinctly from the kinds of sociological
and socio-psychological studies of technology that had been conducted previously, in that the emphasis was on development of technology, not on evaluation of
‘user satisfaction’ with particular systems, and even less so on prognostications
about the organizational or behavioral ‘effect’ and ‘impact’ of certain technologies.
3.5. Technology and ethnography: An ‘odd mix’?
CSCW seems like a strange bird. To some it looks like ‘requirements engineering’, ‘systems development’, or participatory design. Others, however, simply
take it to be a special area within HCI. To others again it seems like a version of
technology assessment or quality evaluation, or even an extension of media studies. What causes the confusion may be the seemingly paradoxical combination of
technology development grounded in, of all things, ethnographic studies of work
practices: an ‘odd mix’ indeed! (Finholt and Teasley, 1998).
The sense of ‘oddity’ is dispelled when CSCW is looked at in the light of the history of technology. As pointed out above, technological research and development
is immensely variegated. Techniques are generally developed in an ‘empirical’
process of incremental improvement of existing techniques, typically in reaction
to known or anticipated bottlenecks, problems of reliability, performance limitations, production and maintenance costs, etc. The same, of course, applies to technologies, only that, with technologies, the development involves systematic analysis and rationalization of existing techniques and technologies. Still, technologies
have systemic character; they form complexes of component and enabling technologies. Fortuitous arrival of new component or enabling technologies may suddenly propel a given technology forward. The microprocessor is a case in point. It
was developed for desktop calculators but was then laterally shifted and appropriated as an essential component technology for personal computers.
Now, some technologies are developed in a proactive way, in an open but directed search for solutions to deal with a (known, anticipated, or imagined) societal problem or to create new possibilities. Obvious examples are railway technology, telegraphy, radio communication, turbojet, radar, space flight, satellite navigation, nuclear fusion energy — as well as digital electronic computing, real-time
transaction processing and digital computer networking. In its orientation, CSCW
belongs to this family of technology development efforts.
That is, in its orientation, the CSCW research program runs counter to the research program that dominates HCI. HCI represents a quite normal modus of
technological research in computing technologies and in general. HCI research
arrived on the scene after interactive computing, and it was from the beginning
devoted to the conceptual rationalization and refinement of technologies for which
the basic principles had already been developed, albeit not systematically articulated and conceptualized. But for CSCW’s practice-oriented research program, the
requisite technology — the basic concepts and principles — hardly exist yet but
have to be conceived and developed in a proactive technological research effort
informed by ethnographic and other forms of workplace studies. That is, the two
research fields, although closely related with respect to many component technologies and, to some extent, with respect to the disciplines and methods involved, represent radically different modalities of research. This difference in research modus has been and remains a source of much confusion.
To make the confusion worse, and as shown above, email and other forms of
‘computer-mediated communication’ developed in a ‘spontaneous’ or ‘empirical’
(unintentional, distributed) process of innovation and dissemination made possible
by the fact that technicians ‘used the network protocols in their own work’ and
therefore had ‘the incentive and the experience to create and improve new services’ (Abbate, 1999, p. 69). Thus, just like HCI, CSCW was anticipated by decades of incremental innovation of computer-mediated communication techniques
such as email, instant messaging, shared data sets, and so on, and to continue in
this modus would therefore easily appear the natural and unproblematic way of
proceeding. However, the practice-oriented research program reversed CSCW’s research modus in the course of the field’s early years of formation: it is oriented in the opposite direction vis-à-vis practical technical innovation.
There is yet another source of confusion, however. The design of the very first
computer applications for commercial purposes (payroll systems, etc.) was based
on studies of actual practices. As early as 1953, the requirements analysis for one
of the first business applications, the design of a program for the ordering of
goods for Lyons Teashops in the UK, involved genuine field work (Ferry, 2003,
pp. 121-129). What was new in CSCW was not the idea of doing requirements
analysis as an integrated part of the process of building a particular system for a
particular setting, incorporating an array of more or less well-known technologies
in a way that serves the identified needs. What was new in CSCW was the idea of
doing workplace studies for the purpose of developing new technologies.
In-depth studies of coordinative practices in the age-old tradition exemplified
by the ‘dry, thorny, and not at all dazzling’ studies conducted two or three centuries ago by French academicians such as Réaumur and Perronet or by German
scholars such as Beckmann and Poppe are critically important for CSCW. In that
respect, there is nothing ‘odd’ in technological research informed by studies of
work practice. However, the specific function of these studies in CSCW is not the
improvement of the investigated practices in any immediate or direct sense102 but
102 Studies of coordinative practices with a view to their systematic rationalization would fall under the rubrics of operations research, logistics, etc. (e.g., Morse and Kimball, 1951). Some of the work of ‘scientific management’ practitioners also belongs here. On the other hand, ‘Business Process Reengineering’ is too Bolshevistic in its general stance towards actual work practices to be considered in this context.
the development of novel coordination technologies that may then be appropriated
for the investigated practices (as well as others) in order to improve and transform
those practices. In other words, in contrast to the classic technological studies of
handicraft-based work practices, the practice-oriented research program of CSCW
is not reactive (post hoc systematization and rationalization) but essentially proactive. In CSCW, the path from ethnographic studies to the conception of technical
solutions to systems design is far more convoluted than in the classic technology
studies and far more convoluted than suggested by the notion of ‘implications for
design’ (Dourish, 2006b; Schmidt, et al., 2007).
4. Accomplishments and shortcomings
It is often intimated, if not indeed insinuated, that CSCW has not accomplished
anything substantial. Or it is bluntly alleged that CSCW has not yet resulted in
technologies in actual use.
The situation is more nuanced than that. Take Lotus Notes, for example. Developed in the late 1980s and early 1990s by a team at Lotus headed by Irene
Greif, and originally released in 1989, it not only had a certified CSCW pedigree
but also incorporated important early CSCW insights concerning the importance
of integrating the provision of message-handling functionalities such as email and
computer conferencing with a shared document database system over the Internet
(Greif and Sarin, 1986; Kawell, et al., 1988). More importantly, its example
spawned a large and rather messy market for what is sometimes called ‘collaborative working environments’ or ‘integrated groupware platforms’ (e.g., Lotus Notes/DOMINO, Microsoft SharePoint, Novell SiteScape, etc.). By 2000, about 70
different commercial platforms of this kind had been registered, and at about the
same time more than 19,000 software houses world-wide were offering products
for the Lotus Notes/DOMINO platform alone (Wilczek and Krcmar, 2001). It is
estimated that the global market for these systems by 2005 was on the order
of $2 billion (Prinz, 2006). The number of users of such systems is estimated to be
well above 200 million, a number that does not include Google’s Mail, Wave,
Calendar, etc. So, if market size and actual use are the criteria, then something
must be said to have been accomplished.
But in fact, the vast majority of what is being marketed as ‘collaborative environments’ represents the state of CSCW at the end of the 1980s, that is, the state before the systematic coupling of workplace studies to the development of technologies could have an effect. What is more, the design principles underlying these systems — one could call them the ‘groupware’ paradigm — derive from the prehistory of CSCW, not from ethnographic work in CSCW.
‘Groupware’ is basically defined as a package of computer-mediated communication functionalities such as email and instant messaging combined with shared repositories. These basic functionalities are now generally supplemented
by calendar and address book functions. These functionalities are to some extent
integrated, typically by offering elementary interoperability of calendar and messaging in the form of alarms, but also shared address lists, and so on. ‘Groupware’
is a category of products that have been shaped with a view to a mass market that
not only includes team communications in work settings but also any setting
where people may form teams or groups. It is as close to a generic product as it
can get. ‘Groupware’ packages are expressions of a design strategy that offers integration of messaging and file repositories within the framework of existing operating systems architectures, that is, as yet another application next to other applications. From the point of view of commercial software design, this may be
rational, for one thereby dodges the challenge of developing computational coordinative protocols that can be invoked by any application.
The point is that in the design of these products the thicket of cooperative work
practices has been carefully avoided. The technical solutions to the problems
CSCW is addressing will have to be incorporated in all kinds of software machines, from operating systems and network protocols to domain-specific applications (desktop publishing, computer-aided design, production planning and control, electronic patient records, and so forth). CSCW and Groupware are not coextensive fields. The conceptual horizon of groupware ends where that of the
CSCW research program begins.
4.1. Ethnography and technology: A ‘big discrepancy’?
When allegations about CSCW’s lack of fecundity are being made, what is implicitly expected is that what is unique to CSCW — namely, the systematic coupling of ethnographic and other forms of in-depth workplace studies to the development of technologies for cooperative work settings — should have led to novel technologies. And the impression is that this coupling is not working properly.
A widely cited source of the impression that there is such a problem, a ‘gap’, in
the transmission of insights from workplace studies to ‘design’ is an article by
Plowman, Rogers, and Ramage in which they reported on a survey of a large part
of the workplace studies that had by then been published in the area of CSCW
(1995). In the article, the authors claimed to have found a ‘paucity of papers detailing specific design guidelines’ (p. 313). They carefully and explicitly refrained
from concluding that ‘workplace studies do not produce specific design guidelines’, but nevertheless suggested that the observed paucity ‘can be attributed to
the lack of reported research which has developed to the stage of a system prototype’ (ibid.).
They went on to surmise that what impeded the progression from workplace
studies to ‘the stage of system prototype’ could be what they described as ‘a big
discrepancy between accounts of sociality generated by field studies and the way
information can be of practical use to system developers’ (p. 321). Not surprisingly, this conjecture (and others in the same vein) has fuelled widespread concern
and continual discussion about the role of workplace studies in CSCW (Dourish
and Button, 1998; Crabtree, et al., 2005; Dourish, 2006b; Randall, et al., 2007a;
Crabtree, et al., 2009).
It is undoubtedly true that the sample analyzed by Plowman, Rogers, and
Ramage showed only a few papers ‘detailing specific design guidelines’. However,
their assertion that there is a ‘lack of reported research’ informed by workplace
studies ‘which has developed to the stage of a system prototype’ was false. As
pointed out in my contribution to the collection on Workplace Studies edited by
Luff et al. (2000), the alleged ‘paucity’ does not ‘reflect on the actual impact of
workplace studies on the development of CSCW technologies’ (Schmidt, 2000, p.
146). It is simply an artifact of a flawed method: The ‘histories of how workplace
studies inform the development of CSCW technologies’ are ‘not always readily
visible in papers reporting on findings from workplace studies. The transfer of
findings and insights typically happens in the course of discussions within cross-disciplinary research teams and is often only documented in design-oriented papers’ (Schmidt, 2000, p. 148, n. 3).
In other words, the presupposition that there is an immediate and direct route
from workplace study to ‘implications for design’ to ‘detailed design guidelines’
to the ‘stage of system prototype’ reflects a widespread but simplistic notion of
the development of technology. To investigate the development of knowledge in
any field, and most certainly the development of technology in an interdisciplinary field such as CSCW, would require that one traces and maps out the complex
pattern formed by the more or less tentative articulations, interpretations, categorizations, conceptualizations of observed phenomena; the ways in which concepts
and ideas percolate and propagate within the research community, as manifested
in more or less sporadic citation patterns or in the often elusive cross-fertilization
among researchers in collaborating laboratories and research projects; and the various simultaneous or subsequent generalizations and transformations of these concepts and ideas as they are appropriated by other actors, perhaps pursuing diverging aims and addressing different problems and issues. Thus, in the case of the
impact of workplace studies in CSCW on technological development one would
have to investigate the (partial and patchy) conceptual framework emerging from
the propagation and transformation of concepts derived from workplace studies,
such as ‘awareness’, ‘boundary object’, ‘artifact’, ‘work setting’, ‘coordinative
protocol’, etc. and how this (partial and patchy) conceptual framework is being
interpreted and appropriated in different ways in technological research.
There is another source of ambiguity in determining the form and extent of the
impact of workplace studies on technical research in CSCW. Workplace studies in
CSCW play two major roles. On the one hand, workplace studies have served the critical role of demonstrating the ‘socially organized’ character of human activity.
This critical role is obvious and has been the cause of justifiable celebration. For
instance, in 2002 Jack Carroll published a collection of articles under the title Human-Computer Interaction in the New Millennium (Carroll, 2002) to take stock
of the accomplishments of HCI and discuss the challenges and opportunities facing the field, in which he makes the following observation:
‘At first, the focus [of CSCW research] was on collaborative systems (groupware) and human-computer interactions with collaborative systems. But the more significant impact of CSCW, one
that goes beyond the conference, is the recognition that all systems are used in a social context. In this sense, CSCW has become more a view of HCI than a subcommunity with[in] it. CSCW has
served as a conduit for the expansion of the science foundation of HCI to incorporate activity theory, ethnomethodology, and conversation analysis, among others’ (Carroll, 2002, p. xxxiii).
This is a distinguished accomplishment by any standard. On the other hand, however, workplace studies also serve the constructive role of being instrumental in
providing, inductively, the empirical and conceptual basis for the technology development that CSCW is ultimately all about. This role is much less obvious and
not at all well understood.
4.2. The case of ‘awareness engines’
To take but one example: the concept of ‘awareness’. As CSCW formed as a research field, it was quickly established that the coordination and integration of
cooperative work activities typically involve highly sophisticated ‘awareness’
practices, i.e., practices of ‘monitoring’ unfolding occurrences in the setting and,
conversely, of making local occurrences ‘publicly visible’ for others in the setting
(e.g., Harper, et al., 1989a; Heath and Luff, 1991a). These practices, which are generally named ‘mutual awareness’, are crucial because they are practically effortless. Although they involve delicate techniques, they typically require no noticeable effort on the part of skilled practitioners because they are spatio-temporally integrated
with the activities of doing the work. Skilled actors are able to coordinate and integrate their interdependent activities without having to interrupt the flow of
work. In fact, to be able to do so is typically what is required to be considered
competent in a given cooperative work practice.
This insight prompted intensive research into different ways of providing computational support for these subtle practices, and over the years different approaches have been developed and explored (for an overview, cf. Schmidt,
2002b). However, what is interesting here is, first of all, that this rich line of research, while inconclusive, has resulted in facilities that are in actual use.
The obvious instance of this is the early web-based groupware system named
Basic Support for Cooperative Work (BSCW), initially developed at GMD in
Germany in the mid 1990s. In contrast to the line of ‘collaborative environments’
in the Lotus Notes tradition, BSCW offers elementary support for ‘mutual awareness’. Users can see traces of actions on documents in the workspace in which
they are participants. Each user can, for example, see if and when another user has
‘read’ a document, changed its name and description, moved it, and so on. Not
only is BSCW of CSCW provenance; the design of the system was clearly informed by the findings of early ethnographic studies (Mariani and Prinz, 1993;
Bentley, et al., 1997a; Bentley, et al., 1997b; Appelt, 1999).
BSCW is currently installed on more than 1,000 servers, typically at research
institutions, servicing ‘tens of thousands of workgroups and hundreds of thousands of users’ (Orbiteam, 2009). Log-analysis studies of patterns of usage in
BSCW show that the ‘awareness service’ is widely used, especially by experienced users (Appelt, 2001). Nevertheless, the ‘awareness service’ technology as
represented by BSCW is crude and primitive. Its major limitation is that the massively distributed nature of awareness practices is severely curtailed by the architecture. The shared ‘workspace’ (i.e., a directory) is a distinct set of objects to
which a user obtains access by invitation (although ‘rights’ may differ), and
events thus cannot propagate beyond the ‘workspace’ as defined by the inheritance rules of the directory structure. In short, BSCW’s ‘awareness service’ should merely be taken as an inspired first shot at how computational artifacts
might be designed to support the observed ‘awareness’ practices (for a critical
analysis of BSCW, cf. Schümmer and Lukosch, 2007, pp. 501-525).
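The workspace-bounded event model just described can be caricatured in a few lines of Python. This is a deliberately naive sketch for illustration only, not BSCW’s actual implementation; all names (the class, the actors, the documents) are invented. The point it makes is merely architectural: action events are recorded per shared workspace and are visible only to its members, so they cannot propagate beyond it.

```python
# Hypothetical sketch (not BSCW's code): an "awareness service" whose
# events are confined to a shared workspace and its invited members.
from dataclasses import dataclass, field


@dataclass
class Workspace:
    name: str
    members: set = field(default_factory=set)
    events: list = field(default_factory=list)

    def record(self, actor, action, document):
        """Log an action on a document so that co-members can see a trace of it."""
        if actor in self.members:
            self.events.append((actor, action, document))

    def recent_activity(self, user):
        """A member sees traces of others' actions; a non-member sees nothing."""
        if user not in self.members:
            return []
        return [e for e in self.events if e[0] != user]


ws = Workspace("project-x", members={"ann", "ben"})
ws.record("ann", "read", "report.doc")
ws.record("ann", "renamed", "report.doc")
ben_view = ws.recent_activity("ben")      # Ben sees Ann's actions
outsider_view = ws.recent_activity("carl")  # Carl is not a member: no events
```

The sketch also makes the limitation visible: since visibility is an attribute of the workspace object itself, there is simply no path along which an event could reach an actor outside it.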
The picture of ‘awareness’ practices that emerged from the initial ethnographic studies was one of horizontal adaptation in a massively distributed mode of interaction. Cooperative work is essentially characterized by distributed control. Now,
work plans and conventions and the concomitant coordinative artifacts of course
play a critical role in reducing the effort of handling the complex interdependencies by providing the normative background of taken-for-granted expectations
(‘He’s supposed to have finished x by now, and… yes indeed, there it is!’). But
they can be seen as ‘local and temporary closures’, to use the apt expression of
Gerson and Star (1986), in contrast to which ‘awareness’ practices can very well
be understood as ‘the work to make work work’. The importance attributed to the
concept of ‘awareness’ in CSCW research largely derives from this insight, and
there is a close conceptual affinity between concepts like ‘situated action’, ‘articulation work’, and ‘mutual awareness’. That is, the defining feature of ‘awareness’
practices as the concept emerged from the ethnographic record is that of massive
distributedness.
In direct continuation of these findings, CSCW research into how to ‘support’
awareness practices has increasingly and predominantly pursued calculi of distributed computation so as to be able to express, facilitate, emulate collaborating
workers’ practices of paying heed to occurrences in their work settings. An early
example is the ‘spatial’ or ‘aura-focus-nimbus’ model that was developed by Benford and others in the COMIC project (Benford and Fahlén, 1993; Benford, et al.,
1994a; Benford, et al., 1994b) and was later generalized by Rodden (1996) and
Sandor et al. (1997). Another approach, suggested by Simone and Bandini (1997,
2002), explored the possible utility of the reaction-diffusion model as a way of
representing and facilitating the distributed propagation of state changes within a
cooperative work setting.
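The core intuition of the ‘spatial’ model can be conveyed in a rough sketch. In the published model, A’s awareness of B is computed from A’s focus (the region A is attending to) and B’s nimbus (the region over which B projects its presence). The three-level quantification below, and all names and radii, are this sketch’s own simplifying assumptions, not the model as Benford and his colleagues specified it:

```python
# Rough sketch of the "aura-focus-nimbus" idea (after Benford & Fahlen):
# awareness of B by A is a function of A's focus and B's nimbus.
# Circular regions and the three discrete levels are illustrative only.
import math


def within(point, centre, radius):
    return math.dist(point, centre) <= radius


def awareness(a_pos, a_focus, b_pos, b_nimbus):
    """Return 'full', 'peripheral', or 'none' awareness of B by A."""
    in_focus = within(b_pos, a_pos, a_focus)    # B falls inside A's focus
    in_nimbus = within(a_pos, b_pos, b_nimbus)  # A falls inside B's nimbus
    if in_focus and in_nimbus:
        return "full"
    if in_focus or in_nimbus:
        return "peripheral"
    return "none"


level_near = awareness((0, 0), 5.0, (3, 0), 5.0)  # mutual: full awareness
level_half = awareness((0, 0), 2.0, (3, 0), 5.0)  # only B's nimbus reaches A
level_far = awareness((0, 0), 2.0, (9, 0), 2.0)   # out of range entirely
```

What makes such a calculus attractive for the purpose at hand is that it is symmetric and decentralized: each entity controls only its own focus and nimbus, and the resulting awareness relation is computed pairwise, without any central coordinator.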
What characterizes these otherwise very different approaches is that they all, in
the words of Sandor et al., attempt ‘to integrate awareness support at fundamental
levels of cooperative system architecture’ (Sandor, et al., 1997, p. 222). As noted
above, this approach was explicitly ‘informed’ by the ethnographic record (cf.,
e.g., Rodden and Blair, 1991; Rodden, et al., 1992). In other words, it was realized at a very early stage in CSCW that, since ‘awareness’ practices are an integral aspect of cooperative work practices, ‘awareness support’ cannot be provided
by a special kind of application, on a par with, say, email or instant messaging; it
has to be provided as a service available to all applications. It was therefore realized that ‘awareness support’ would have to be provided ‘at fundamental levels of
cooperative system architecture’, that is, at the level of the operating system of
networked computational artifacts. In short, the early ethnographic studies were
almost immediately seen to have radical ‘implications for design’ with respect to
operating systems architectures, database technology, and so on.103 The ‘groupware’ model was no longer viable as a model for CSCW.
This has further implications. It means that the transition from workplace studies to the development of new technologies for, e.g., ‘awareness support’ would
become quite complex. Major conceptual and technical issues had — and have —
to be identified and resolved, and at the same time the enormous investment in the
installed base of computer platforms and application software acts as a major buffer against anything like swift change. This in turn has had the effect that the ‘awareness support’ provided by IT systems in actual use has stayed at the level represented by the rather crude facilities of BSCW, or the even cruder ‘awareness support’ offered by instant messaging, where users are notified (by icons or sounds) of
the on-line presence of ‘buddies’.
The case of ‘awareness support’ is illuminating in that it also demonstrates internal sources of seeming stagnation. The trajectory of the line of technological
research on ‘awareness support’ that is informed by ethnographic studies is strikingly convoluted.
What was initially being pursued was a calculus for modelling massively distributed interactions among entities. It goes without saying that facilities for integrating ‘awareness support’ at fundamental levels of operating system architectures will have to be quite generic, but these facilities will at the same time also
have to be able to support the domain-specific character of these practices.
The ethnographic finding that practices of heeding (to use a less misleading term than ‘awareness’) are domain-specific practices seems to have been put aside,
temporarily perhaps, in favor of advancing the ability to model distributed social
processes in general. In cooperative work, actors skillfully ‘monitor for’ certain
cues, signals, etc. in the setting and, conversely, skillfully ‘display’ certain cues,
signals, etc. concerning their local activities as relevant to colleagues. That is,
heeding in cooperative work is first of all not a mental state acquired
‘passively’ (as the term ‘awareness’ might suggest) but a practical stance of monitoring (or displaying), and the categories of cues, signals, etc. that actors are
monitoring for (or displaying) are specific to the given domain of work and the
work setting. Practices of heeding are not generic abilities; they are skills practitioners only master with training and experience, highly elaborate practices that
are virtually effortless because they are an integral aspect of the domain-specific
work practices. And like any other practice, they are essentially conventional and
thus normative (‘What are you doing?! Are you asleep?’ or ‘Don’t worry; I see
103 Identical conclusions have been drawn from other workplace studies investigating ‘coordination mechanisms’ in cooperative work settings (Schmidt, et al., 1993; Schmidt, 1994d).
the problem and can handle it!’). Without facilities that support actors in these
skilled techniques, by allowing them to express the categories of cues, signals, etc. they are monitoring for (or displaying) — in short, the protocols of their heeding practices — computational support of ‘mutual awareness’ is unlikely to proceed significantly beyond the rudimentary ‘awareness service’ of BSCW or similar systems.
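What such a facility might amount to computationally can be suggested with a deliberately naive sketch. Everything in it is hypothetical — the service, its methods, and the domain-specific event category are invented for illustration. The idea it isolates is simply that actors themselves declare the categories of cues they monitor for, and that an occurrence ‘displayed’ under a category reaches exactly those colleagues:

```python
# Hypothetical sketch: actors express the categories of cues they monitor
# for (and display), i.e., a rudimentary protocol of heeding practices.
from collections import defaultdict


class HeedingService:
    def __init__(self):
        # category of cue -> set of actors monitoring for it
        self.subscriptions = defaultdict(set)

    def monitor(self, actor, category):
        """An actor declares a category of cues it is monitoring for."""
        self.subscriptions[category].add(actor)

    def display(self, actor, category, detail):
        """An actor makes a local occurrence publicly visible; exactly the
        colleagues monitoring for that category are notified."""
        return {a: (actor, category, detail)
                for a in self.subscriptions[category] if a != actor}


svc = HeedingService()
svc.monitor("controller", "train-delayed")
svc.monitor("signaller", "train-delayed")
notified = svc.display("signaller", "train-delayed", "IC 104, +12 min")
```

The sketch of course omits everything that makes the problem hard — the conventional and normative character of the categories, their negotiation and maintenance, and their integration into the flow of work — but it indicates what it would mean for the protocols of heeding to be expressible by the practitioners rather than fixed by the application.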
There may have been sensible intellectual economy in putting these issues
aside. It is interesting, however, that recent research has begun to address those
very issues. Gross and Prinz have for example presented an ‘Event and Notification Infrastructure’ that aims at providing ‘awareness information in a way that is
adequate for the current situation of the user’. Gross and Prinz emphasize that
‘context itself depends on parameters like the current task, the current type of cooperation, the artefacts and tools used, and so forth’ (Gross and Prinz, 2003, p.
296). In fact, recognizing that heeding practices, like any other practice, are essentially normative, this research aims at developing facilities for contextual awareness support that ‘will allow users to establish conventions’ (Gross and Prinz,
2004, p. 300).
This brings us back to the problem of the putative ‘gap’ or ‘discrepancy’ in
CSCW. The problem is not that there is a ‘lack of reported research [informed by
workplace studies] which has developed to the stage of a system prototype’. There
is evidently no such lack; in fact, the ethnographic findings have obviously engendered a rich tradition even in the fairly esoteric area of ‘awareness support’.
This should not be taken to imply that the relationship between workplace
studies and development of technology is unproblematic. Workplace studies are
no substitute for serious and meticulous conceptualization. On the contrary, and to
paraphrase Garfinkel, workplace studies in CSCW serve ‘as aids to a sluggish imagination’ (Garfinkel, 1967, p. 38). The function of workplace studies is not to
produce ‘implications for design’ or anything of the sort but first of all to challenge taken-for-granted assumptions about cooperative work and coordinative
practices and thus kindle an otherwise sluggish technical imagination. Thus,
bringing findings from ethnographic studies of cooperative work to bear on technological development involves conceptual work — or rather: it essentially consists in conceptual work. As suggested above, there are two aspects to this: a critical one and a constructive one.
The conceptual work of bringing findings from ethnographic studies of cooperative work to bear on technological development involves dissolving run-of-the-mill constructions seeping in from the various disciplines and from the metaphysics of common-sense theorizing (e.g., ‘shared goals’), questioning what is inadvertently taken as prototypical (e.g., ‘face-to-face’ interaction in ‘teams’), sorting
out category mistakes and the ubiquitous transgressions of sense (e.g., ‘media
richness’), and so on. However, the research on ‘awareness support’ stumbled in
this critical effort.
The early ethnographic studies did not interpret the findings in terms of
‘awareness’. In fact, the term ‘awareness’ does not appear in them. The findings were
nonetheless soon interpreted in the light of that concept. Now, since ‘awareness’
is an ‘attention word’, not a ‘heed word’ (Ryle, 1949; White, 1964), this caused
significant confusion. The ethnographic findings were implicitly given a mentalistic interpretation by being subsumed under the notion of ‘awareness’ that seems
to have been imported from social psychology and small-group sociology. The
findings produced in the early ethnographic studies were thereby, so to speak,
contaminated by abstract constructs, obviously modelled on so-called ‘face-to-face interaction’, which had more affinity to Goffman’s notion of ‘focused’ and ‘unfocused’ interaction (Goffman, 1961, 1963) than to the highly specific findings
articulated by the early ethnographic record. The result was a mentalistic notion of
‘awareness’ from which actual practices and protocols had somehow been excluded. The mentalistic interpretation in turn encouraged design experiments such as
‘media spaces’ in which much effort was spent on trying to recreate, with as much
fidelity as possible, the putative paragon of all human interaction: the ‘face-to-face’ chat in the office corridor (for critical analyses, cf. Gaver, 1992; Heath and
Luff, 1993).
As a result, much of the crucial insight gained by the ethnographic findings
was easily overlaid by concepts from social psychology such as ‘shared understanding’, ‘shared goals’, etc., in which the very practices through which ‘understanding’ or ‘goals’ become ‘shared’, i.e., unproblematically aligned, taken-for-granted, etc., are glossed over. Consequently, many of the initial explorations into
‘awareness’ services were experiments with — exactly — shared data sets facilitated by joint access to a directory on a server, perhaps augmented by a notification service. Sometimes this was graphically dressed up in a ‘room’ metaphor.
However, it was quickly realized that the conception of cooperative work arrangements underlying this design was faulty, predicated as it was on the deeply
problematic idea of the ‘group’ or ‘team’ as a well-defined and clearly bounded
unit. Researchers accordingly began to deconstruct the notion of cooperative work
arrangements as defined in terms of physical space (cf., e.g., Fitzpatrick, et al.,
1996; Harrison and Dourish, 1996; Dourish, 2006a; Ciolfi, et al., 2008).
Bringing findings from ethnographic studies of cooperative work to bear on
technological development also requires the constructive task of reconstructing
the logic of ‘awareness’ practices, the protocols or conventions that competent
actors routinely follow and expect others to follow, and the ways in which such
protocols or conventions are established and maintained. This kind of investigation cannot be driven by the way sociology or anthropology frames problems. To
be able to ‘inform design’, this kind of investigation must be informed by the way technological research tentatively frames its problems. It is noteworthy in this context that another group of researchers, led by Carla Simone, in pursuing what
seems to be the exact same research problem, has resolved that the ethnographic
evidence of heeding practices that was produced almost twenty years ago (in a
different intellectual milieu and in pursuit of far less specific research questions),
now needs to be complemented and enriched. Simone and her colleagues have
therefore undertaken new ethnographic studies of awareness practices with an explicit focus on the ‘conventional’ or normative nature of these practices (Cabitza,
et al., 2007; Cabitza and Simone, 2007; Cabitza, et al., 2009).
The point of all this, then, is that the transition from ethnographic studies to
technological development is an immensely complex effort. On closer inspection,
what may look like a ‘gap’ turns out to be a maze of pathways. Some lines of research lead straight to useful and innovative applications. Other lines of research
turn out to be futile. Yet other lines of research turn out to raise questions that require additional ethnographic work. And so on.
That said, while there is no ‘gap’ or ‘discrepancy’ between workplace studies
and technological development, the impression that something is amiss in CSCW
is not entirely wrong. I suggest that the perceived sluggishness, the impression that progress is at best intermittent and hesitant, at worst that things are going in circles or backwards, is to a large extent precipitated by an increasing fragmentation of CSCW.
4.3. Logics of fragmentation
There is an interesting logic to the eventual fragmentation of CSCW.
As in any kind of research, the presence of distinctly different paradigms is of course a recipe for trouble. CSCW is certainly no exception. More
specifically, the continued presence of research within CSCW conducted under the
‘groupware’ paradigm would itself make the very concepts of ‘work’ and ‘work
practice’ problematic. In a ‘groupware’ perspective the term ‘work’ does not
mean work practice, for that concept is alien within a ‘groupware’ paradigm; but
rather, the term ‘work’ merely stands proxy for ‘workplace’ or ‘team’, for what is
addressed is a priori delimited to communication in abstraction from the specific
skills, techniques, procedures, and material settings involved in work. In ‘computer-mediated communications’ or ‘groupware’ systems research, ‘work’ as a research phenomenon has been transubstantiated, in much the same way as ‘work’
is spirited away from much of ‘sociology of work’ (Sharrock and Anderson, 1986,
Chapter 6). It was therefore but a small step for ‘groupware’ research to abandon
the pretense that this research has anything to do with ‘work’ in any ordinary
sense.
On the other hand, however, the practice-oriented research program crumbled
from within. It has made significant progress but has not succeeded in showing
convincing technical solutions to the problem of computational regulation of cooperative work in complex settings. This is not surprising. In contrast to ‘groupware’ research and development, the practice-oriented research program is not
about developing a new class of applications on par with other applications but is
about developing technologies that will provide all applications with coordinative
functionality. That is, CSCW has to break the ‘groupware’ barrier. What is required are technologies that support workers in coordinating their cooperative activities in the domain-specific categories of their work and from within their domain-specific application programs; the coordinative functionality thus has to be
provided by computational artifacts that — as separate control mechanisms residing in the network operating system — have to interface with the data structures
and transformation processes of the application programs. This applies to ‘awareness support’ as well as ‘workflows’ or ‘ontologies’. To put it directly, CSCW
research is developing a new kind of technology that, compared to the large installed base of polished ‘groupware’ products, does not have much to show for it
— not until all the elements are there and have been put together, that is.
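The architectural point made above can be illustrated with a minimal sketch: coordinative functionality provided not as yet another ‘groupware’ application, but as a separate control mechanism that domain-specific applications interface with. The names here (`CoordinationMechanism`, `register_dependency`, `may_start`) are hypothetical, chosen for illustration only.

```python
# Sketch, under stated assumptions, of a coordination mechanism as a
# separate computational artifact: it encodes a coordinative protocol
# (here, simple task dependencies) and regulates which activities may
# proceed, in categories supplied by the application programs.

class CoordinationMechanism:
    """A separate control mechanism encoding a coordinative protocol."""

    def __init__(self):
        self.dependencies = {}   # task -> set of prerequisite tasks
        self.completed = set()

    def register_dependency(self, task, prerequisite):
        self.dependencies.setdefault(task, set()).add(prerequisite)

    def complete(self, task):
        self.completed.add(task)

    def may_start(self, task):
        # A task may start only when all its prerequisites are completed.
        return self.dependencies.get(task, set()) <= self.completed


cm = CoordinationMechanism()
cm.register_dependency("review", "draft")
assert not cm.may_start("review")    # blocked until the draft exists
cm.complete("draft")
assert cm.may_start("review")
```

The design choice the sketch highlights is the one argued for in the text: the mechanism resides outside any particular application and would have to interface with the applications’ own data structures, which is precisely what makes the effort so much harder than shipping a polished stand-alone product.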
But these problems of public perception are the least of it. The real issue is that
the CSCW research program was never clearly articulated. Great progress was
achieved in involving workplace studies in technological research, but at the same
time the very notion of work practice has been vacated, emptied of any content other than the notion of mere contingent activity, disconnected from the concepts of
rules and regulations. The practice-oriented program of CSCW — the very idea of
computational regulation of interaction — has ended in contradiction. On one
hand ‘groupware’ is seen as severely limited because it is predicated on communication divorced from work practice, but on the other hand the integration of
computational functionality into the coordinative practices is seen as conceptually
wrongheaded. This contradiction has, in the end, left the CSCW research program largely paralyzed.
Critical workplace studies certainly have flourished and have been justly celebrated, but constructive workplace studies, goal-directed studies of coordinative
practices, protocols, logics of combination and recombination, have progressed only slowly and sporadically. So, when frivolous applications of computing technologies became a mass market and seized the imagination of the public, the practice-oriented research program, resting on such frail foundations, was a pushover.
Chapter 12
Frail foundations
CSCW, as a research area, is in disarray. Not only are there different schools of
thought, but the different communities are not investigating the same phenomenon
or the same kind of phenomena, nor do they engage in any kind of discourse
about findings. In fact, they would not even be able to compare notes.
This would be a serious predicament for any research area. For an interdisciplinary area this is fatal. What normally unites a research area or a discipline is the
common framework of exemplars, methods, techniques, textbooks, educational
programs and institutions, etc. In an interdisciplinary field, all this is missing, for
here all or most of this scaffolding belongs to the constitutive disciplines. What
unites an interdisciplinary area is, by and large, merely the research problem its
members have gathered to investigate in their different ways and the ‘cornerstone’
concepts in which the problem is identified and expressed. But if the very conceptual foundation, the set of cornerstone concepts, is muddled, incoherent, disputed,
then none of the other factors that otherwise help an area in conceptual crisis get through to the other side are there to stabilize it. Accordingly, without
reasonable clarity about CSCW’s conceptual foundation, its research problem, the
research community has no accountable criteria of quality, relevance, priority, directions. Its research program dissolves; surpassed paradigms linger on among the
living; arguments erupt about shifting the area’s focus this way or that way or
even ‘widening’ it. CSCW is therefore subjected to centrifugal forces that are
tearing it to pieces: disciplinary chauvinism, distractions of shifting funding
schemes, changing technical fashions, frivolous media interests, etc.
Why is CSCW in such disarray? Why has CSCW been unable to effectively
supersede the various lines of research focusing on mediation of communication?
Why has there, in fact, been a regression to versions of ‘computer-mediated
communications’ that focus on evaluation studies of new applications of the well-known technologies of exchanges of messages and files? There are several reasons for that. One is surely that CSCW researchers have been reluctant to pursue
the design of computational artifacts for regulating cooperative work and thus are
inclined to conceive of ‘Computer-Supported…’ as something close to ‘Computer-Mediated…’. Their motives may be different: they may, based on experience,
find existing coordination technologies of this sort intolerably rigid; they may be
inhibited for ideological reasons; or they may be of the persuasion that regulation of human activity by machines is inherently impossible, a conceptual chimera. In
fact, the motives may be mixed, but the latter seems to be the one that has carried
most weight.
Reluctance to pursue the design of computational artifacts for regulating cooperative work may also go a long way to explain the urge to the move the focus of
CSCW ‘away from work’, for in leisure and domestic settings there is no compelling need for regulation of interaction by means of sophisticated coordinative artifacts and, hence, none by means of computational artifacts, and still less is
there a need for actors to modify and construct computational artifacts to regulate
interaction. Such practices typically belong to professional work in complex settings. And on the other hand, communication technologies such as email, http,
messaging, chat, etc. are not domain-specific; they do not even have a domain bias. They are as generic as pen and paper. That is, when the focus is on these generic communication technologies there is no reason whatsoever for focusing on
cooperative work. Hence, I suggest, the fragmentation of CSCW.
That is, there is deep confusion in our understanding of work practice, of action
and plan, of computation, and hence of the very issue of computational regulation
of cooperative work. In short, there is considerable confusion in CSCW concerning the field’s conceptual foundation.
The conceptual confusion is double-sided: on one hand, the notion of ‘plans’ or
‘procedures’ versus ‘situated action’, and on the other hand, the notion of ‘computation’ inherent in any conception of computational regulation. These notions are
all fraught with confusion and mystification. I made a modest attempt to elucidate
the concepts of ‘plans’ and ‘situated action’ many years ago (Schmidt, 1994d,
1997), albeit with little apparent success. Suchman, for instance, never dignified
the critique with a reply (cf., for example, Suchman, 2007). There may be several
reasons for this. Some would, for example, argue that I misunderstood Suchman’s
argument and made a fuss about nothing. Alternatively, though, my argumentation may have been too hushed to be heard and too timid to cut the mustard. In
any event, I was certainly influenced by what I was trying to critique, sharing in
some ways the confusion and mystification I was trying to dispel, and hence not
able to articulate my concerns and objections with sufficient clarity to be effective. Therefore, as a contribution to clarifying CSCW’s program and thereby, I
hope, helping to bring it out of the current crisis, I will try to disentangle the issue of
‘plans and situated actions’ and then move on to the concepts of computation and
computational regulation.
1. Suchman vs. cognitivism
Suchman’s book is a sharp and generally well-articulated critique of ‘cognitive
science’ as exemplified by the cognitivist tradition within cognitive psychology as
well as by its intellectual counterparts in ‘artificial intelligence’ and the offspring
of this in the form of ‘expert systems’ and ‘office automation’. The focal point of
her critique is the concept of ‘plans’ as it was conceived of by cognitivist theorists. Her target was the view ‘that purposeful action is determined by plans’ and
that this was considered ‘the correct model of the rational actor’, and what motivated her critique was the observation that this view was ‘being reified in the design of intelligent machines’ (Suchman, 1987, p. ix).
In this regard, the formulation given by George Miller, Eugene Galanter, and Karl Pribram in their book Plans and the Structure of Behavior (1960) is representative:
‘Any complete description of behavior should be adequate to serve as a set of instructions, that is,
it should have the characteristics of a plan that could guide the action described. When we speak of
a Plan in these pages, however, the term will refer to a hierarchy of instructions, and the capitalization will indicate that this special interpretation is intended. A Plan is any hierarchical process
in the organism that can control the order in which a sequence of operations is to be performed.
A Plan is, for an organism, essentially the same as a program for a computer.’ (Miller, et al., 1960,
p. 16; quoted in Suchman, 1987, pp. 36 f.).
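The cognitivist claim quoted above — a Plan as a hierarchy of instructions that controls the order of operations, ‘essentially the same as a program for a computer’ — can be made concrete with a toy sketch. This is offered only to make the claim vivid; the surrounding argument is that the claim itself is confused.

```python
# A 'Plan' in Miller, Galanter, and Pribram's sense, rendered literally
# as a program: a leaf is a primitive operation, an inner node is an
# ordered list of sub-Plans, and the hierarchy controls the sequence.
# The example ('make coffee') is invented for illustration.

def execute(plan, log):
    """Execute a hierarchical Plan, appending operations to log in order."""
    if isinstance(plan, str):
        log.append(plan)           # primitive operation
    else:
        for subplan in plan:       # hierarchical control of sequence
            execute(subplan, log)


log = []
execute([["boil water", "grind beans"], "brew", "pour"], log)
# The hierarchy alone determines the order in which operations occur.
```

On the cognitivist view, intelligent action just is the running of such a program ‘in the organism’ — which is exactly the move that Suchman’s critique, and the conceptual critique developed below, call into question.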
Suchman summarized the basic tenets of the movement as follows:
‘The cognitivist strategy is to interject a mental operation between environmental stimulus and
behavioral response: in essence, to relocate the causes of action from the environment that impinges upon the actor to processes, abstractable as computation, in the actor’s head. The first premise
of cognitive science, therefore, is that people — or “cognizers” of any sort — act on the basis of
symbolic representations: a kind of cognitive code, instantiated physically in the brain […]. The
agreement among all participants in cognitive science and its affiliated disciplines, however, is that
cognition is not just potentially like computation, it literally is computational.’ (Suchman, 1987, p.
9).
In opposition to this view and drawing on various resources, principally anthropology and sociology (e.g., ethnomethodology and conversation analysis),
Suchman developed a powerful argument based on the concept of ‘situated actions’: ‘By situated actions I mean simply actions taken in the context of particular, concrete circumstances’ (p. viii). Her key argument is that
‘however planned, purposeful actions are inevitably situated actions’ ‘because the circumstances
of our actions are never fully anticipated and are continuously changing around us. As a consequence our actions, while systematic, are never planned in the strong sense that cognitive science
would have it. Rather, plans are best viewed as a weak resource for what is primarily ad hoc activity. It is only when we are pressed to account for the rationality of our actions […] that we invoke the guidance of a plan. Stated in advance, plans are necessarily vague, insofar as they must
accommodate the unforeseeable contingencies of particular situations. Reconstructed in retrospect,
plans systematically filter out precisely the particularity of detail that characterizes situated actions, in favor of those aspects of the actions that can be seen to accord with the plan.’ (pp. viii f.)
Whereas the course of rational action, on the cognitivist view, is causally determined by some, putative, preformed ‘plan’, ‘scheme’, etc., Suchman argued for an
alternative account:
‘The alternative view is that plans are resources for situated action, but do not in any strong sense
determine its course. While plans presuppose the embodied practices and changing circumstances
of situated action, the efficiency of plans as representations comes precisely from the fact that they
do not represent those practices and circumstances in all of their concrete detail.’ (p. 52)
This account is expressed in a slightly more specific manner by the end of the
book:
‘For situated action […] the vagueness of plans is not a fault, but is ideally suited to the fact that
the detail of intent and action must be contingent on the circumstantial and interactional particulars
of actual situations’. ‘Like other essentially linguistic representations, plans are efficient formulations of situated actions. By abstracting uniformities across situations, plans allow us to bring past
experience and projected outcomes to bear on our present actions. As efficient formulations, however, the significance of plans turns on their relation back to the unique circumstances and unarticulated practices of situated activities.’ (pp. 185 f.)
The influence Suchman’s book has had is of course to a large extent due to this
critique of cognitivism. In fact, for many researchers in the communities concerned with human factors of computing (HCI, PD, CSCW), the book had the effect of something like a declaration of independence. But it also, and not least,
provided initial formulations of the practice-oriented research program for which
her previous work on office procedures had provided one of the exemplars:
‘I have introduced the term situated action. That term underscores the view that every course of
action depends in essential ways upon its material and social circumstances. Rather than attempting to abstract action away from its circumstances and represent it as a rational plan, the approach
is to study how people use their circumstances to achieve intelligent action. Rather than build a
theory of action out of a theory of plans, the aim is to investigate how people produce and find
evidence for plans in the course of situated action. More generally, rather than subsume the details
of action under the study of plans, plans are subsumed by the larger problem of situated action.’
(p. 50)
Not only did Suchman offer an initial formulation of the practice-oriented research program; she did so in a way that from the very outset pointed to the concept of the embodiment of action and the materiality of work practices as foundational: ‘all activity, even the most analytic, is fundamentally concrete and embodied’ (p. viii); accordingly, the materiality of the work setting is not a liability but
an asset to the practitioner: ‘the world is there to be consulted should we choose to
do so’ (p. 47); ‘plans presuppose the embodied practices and changing circumstances of situated action’ (p. 52). In doing so, she emphasized and outlined critical aspects of the understanding of cooperative work that subsequently evolved in
CSCW.
However, Suchman’s book has also left an intellectual legacy that hampers
CSCW with respect to critical aspects of the real-world problems that
it initially set out to address and which remain a domain it is uniquely equipped to
address: the design and use of computational regulation devices as a means of
dealing with the complexities of coordinative practices in cooperative work.
Suchman of course did not deny that ‘plans’ are produced and used, nor did she
say or imply that ‘plans’ are more or less useless. The problems with her account
are far more subtle than that. In my attempt to unravel these problems, I will first
show that Suchman, unwittingly and tacitly, accepted basic premises of the cognitivist position she was trying to dismantle. I will argue that she, because of this, was
unable to dispose of cognitivism’s confusion of normative and causal propositions. In the end she therefore wound up with an account that effectively reproduced cognitivism’s view of the nature of computational artifacts.
1.1. Suchman’s strategy
First of all, what kind of argument is presented in Plans and Situated Actions?
It has been argued that the focus of Suchman’s argument ‘is on the notion of
plan as deployed in cognitive science (plans-according-to-cognitive-science)’ and
that, therefore, much of her ‘argumentation does not concern “plans” as we might
use them in ordinary affairs’ (Sharrock and Button, 2003, p. 259). It is certainly
true that it is the cognitive science notion of ‘plans’ Suchman is critiquing in her
book and that her propositions should be read in that context. However, this gallant reading is not supported by the text, and it has in fact been rejected by Suchman: ‘My aim was to take on both senses of “plan,” […] and to explore the differences between them’ (Suchman, 2003, p. 300). Indeed, Suchman’s research prior
writing the book had focused on ‘“plans” as we might use them in ordinary affairs’, namely, organizational procedures, and in that work she did talk about ordinary ‘procedures’ in exactly the same terms as she talked about ‘plans’ in the
book (cf. Suchman, 1982, 1983; Suchman and Wynn, 1984). This concordance
justifiably led Agre to read the book as a book about ordinary plans (Agre, 1990).
However, for unknown reasons, those earlier and concordant studies seem to have
been forgotten by those who have later defended the book against its critics.
Anyway, her exploration of the differences between the cognitivist and the ordinary sense of ‘plan’ was not the conceptual analysis that would have been required to do the job. If it had been that kind of argument she would have been trying to expose the deep conceptual confusion underlying cognitivism. But that she
did not do.
Cognitivism is a special version of what Ryle calls ‘the intellectualist legend’.
This legend is characterized by implicitly taking the assumed pattern of
intellectual conduct (e.g., a cycle of theorizing, planning, acting, evaluating) as
the paradigm of all intelligent conduct. The theorist thus explains intelligent action by ascribing an anterior ‘plan’ to the action, the latter being the execution of
this occult ‘plan’. But as pointed out by Ryle, ‘Intelligent practice is not a stepchild of theory. On the contrary theorising is one practice amongst others and is
itself intelligently or stupidly conducted’ (Ryle, 1949, p. 26). Preconceived plans,
whether ascribed or avowed, can be as smart or as stupid as any action, planned or
not. A tennis player who is playing without a preconceived plan, or who does not
stop and contemplate her next move, is not necessarily playing mindlessly. We
sometimes make the effort of developing plans for our actions and we sometimes
postpone action until we have a plan, but we most often simply act and, thankfully, in doing so we normally act intelligently, competently, heedfully, etc. If anterior plans were a requisite for intelligent conduct, then the development of plans
would in turn require anterior plans to be intelligent, and so on. ‘To put it quite
generally, the absurd assumption made by the intellectualist legend is this, that a
performance of any sort inherits all its title to intelligence from some anterior internal operation of planning what to do’ (Ryle, 1949, p. 31).
The cognitivist version of ‘the intellectualist legend’ is one that accounts for
intelligent practice in terms of (postulated, occult) causal processes. As already
shown, Miller et al. (1960) jump from ‘description of behavior’ to ‘a set of instructions’ to ‘plan’ to organic ‘process’ to sequential ‘control’ to ‘computer program’. In the words of Stuart Shanker in his incisive critique of cognitivism, this
‘muddle’ is the ‘result of trying to transgress logical boundaries governing the
employment of concepts lying in disparate grammatical systems’ (Shanker,
1987c, p. 73). In other words, the muddle is the result of an entire series of category mistakes in close order.
I will develop this line of argument below. The point here is that Suchman does
not deploy this kind of argument. Instead she counters the conceptual confusion of
cognitivism by propounding what is basically an empirical argument, an ‘alternative account of human action’ (Suchman, 1987, p. x), drawing on observational
studies and conceptualizations from social science. This mismatch, an empirical
argument against conceptual confusion, is a significant source of ambiguity.
One more observation on the kind of argument that is presented in Plans and
Situated Actions is required. Instead of a conceptual critique of cognitivism,
Suchman mobilized an array of social science accounts of human action and interaction: ethnomethodology, conversation analysis, studies of instructions, etc. – as
if to say, ‘Look, the cognitivist account is not realistic. The sociological account
offers a richer picture’. Paradoxically, however, if this was indeed her aim, the
examples of ‘plans and situated actions’ she offered and discussed are far from
representative of ordinary plans in ordinary affairs. Here, in a critique of cognitivism premised on presenting an alternative account, one would have expected
that the rich multiplicity would have been demonstrated. For does it make any
sense at all to talk about ‘plans’ in abstract generality, as if it were a genuine concept? Should we not say of the word ‘plan’ what Wittgenstein says of the word ‘to
think’: ‘It is not to be expected of this word that it should have a unified employment; we should rather expect the opposite’ (Wittgenstein, 1945-48, § 112).
The concept of plan is certainly multifarious in its uses. We talk about the floor
plan of a building, CAD plans, production plans or schedules, maintenance plans,
project plans, cancer treatment plans, and so on. What these uses have in common
is the central role of some artifact. In fact, the English word ‘plan’ is derived from
French ‘plan’ (a ground plan or map, as in ‘plan de ville’, from the Latin
‘planum’, a flat surface). The notion is of a drawing on a flat surface, a standard
of correctness that can be used as instruction for action and guide in action and
that can be inspected and consulted in case of doubt, used as proof in case of dispute, and so on.
We certainly also talk about plans in a derived sense in other contexts of ordinary discourse. We use the term, for instance, to claim not only intent but considered intent: ‘I plan to leave early so that I can meet you by noon’, meaning something like ‘I have indeed been considering when to leave and decided to depart
early so that I’m certain to arrive at our rendezvous by noon.’ By using the term
‘plan’ in that way in such a context one is declaring not only commitment but also
that one has given one’s promise some serious thought. One might elaborate by
saying: ‘I plan to depart from point A at 9:00, which means that I’ll be at point B
at noon: As promised.’ Such avowals of considered commitment derive their
force from using the term ‘plan’, with its received connotations of publicly visible
inscription, metaphorically: my commitment is as firm as if it was on public record.
Now, Suchman also ascribed ‘plans’ to situations where such avowals may not
or need not to have been uttered, for example to what goes on prior to white water
canoeing:
‘in planning to run a series of rapids in a canoe, one is very likely to sit for a while above the falls
and plan one’s descent. The plan might go something like “I’ll get as far over to the left as possible, try to make it between those two large rocks, then backferry hard to the right to make it
around that next bunch.”’ (p. 52)
It should be noticed, in passing, that this example, possibly due to its brevity, is
ambiguous. Somebody is contemplating how to approach a line of action and we
are told that the plan may require a ‘great deal of deliberation, discussion, simulation, and reconstruction’. We are not told if the plan has any public status, whether the articulation of the plan amounts to an avowal of commitment. That is, what
would happen if the planner or her co-canoeist disregards the plan they have
committed to? If one of them breaks the plan and they end up wet, cold, and
bruised, could the other not then object, ‘But we agreed to backferry hard to the
right, not to the left’? As it stands, the example reads as if the ‘plan’ and its possible ‘abandonment’ were somehow of no consequence to either of them or to others.
In what sense is it a plan then?
Suchman also referred to ‘plans’ belonging to practices that, by contrast, involve the use of inscribed artifacts, such as a traveller (a lone traveller!) using a
map to find his way (p. 189). But a map in the hands of a traveller belongs to an
entirely different kind of practice than someone’s contemplating the course of a
canoe trip. And it can, in turn, be used in quite different ways, as a representation
of the geography and what it may offer to a traveller with time to spare or as a
representation of the trajectory one wants to or, indeed, is obliged to follow.
And, again, ‘plans’ incorporated in the help system of a photocopier, the use of
which Suchman analyzed in the book, belong to practices of yet another kind. In
this case the putative ‘plans’ are computational artifacts that are supposed to regulate peoples’ use of the photocopier. That is, as opposed to the map in the hands
of the bored traveller, this artifact has the capacity to execute controlled state
changes which may physically prevent a user from (or enable him in) doing this
or that in certain circumstances. Whatever merit the particular design may have or
lack, a practice in which a causal mechanism (a computational artifact) is used as
part of a technique of planning is certainly quite different from a practice in which
the artifact of planning is static, which again is quite different from a practice in
which one has promised, on one’s honor, to perform a task in a certain way,
which, finally, surely is different from a practice in which a person is pondering
the best course of canoeing. These various practices may have something in
common (beyond the noun ‘plan’ we tend to use when referring to these practices)
but nothing that would make ‘plans’ in all fuzzy generality a researchable phenomenon.
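The contrast drawn above — between a static artifact of planning, such as a map, and a computational artifact with ‘the capacity to execute controlled state changes’ that prevent or enable actions — can be sketched as a toy state machine. The names and transitions are invented for illustration; this is not a reconstruction of the copier help system Suchman analyzed.

```python
# Sketch of a computational artifact that regulates use causally:
# it only permits actions that are legal transitions from its current
# state, and physically blocks everything else.

class RegulatingArtifact:
    def __init__(self):
        self.state = "idle"
        # Permissible (state, action) -> next-state transitions.
        self.transitions = {
            ("idle", "load_original"): "ready",
            ("ready", "press_start"): "copying",
            ("copying", "remove_copy"): "idle",
        }

    def attempt(self, action):
        nxt = self.transitions.get((self.state, action))
        if nxt is None:
            return False          # the machine prevents the action
        self.state = nxt
        return True


copier = RegulatingArtifact()
assert not copier.attempt("press_start")   # blocked: no original loaded
assert copier.attempt("load_original")
assert copier.attempt("press_start")
```

Unlike a map in a traveller’s hands, which merely stands available for consultation, an artifact of this kind causally intervenes in the course of action — which is what makes the practice it belongs to so different in kind.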
What is most remarkable, however, is that any discussion of ordinary plans —
time tables, production schedules, project plans, clinical protocols — is absent
from the analysis. It would not be surprising were such real-world plans systematically omitted from cognitivist discourse,104 but their absence in the context of a
call for ethnographic studies of actual socially situated action is very remarkable
indeed. Their absence is all the more remarkable in light of the fact that Suchman’s early work focused precisely on practices in which such ordinary plans loom
large (e.g., Suchman, 1983).
1.2. Counter-cognitivism
On a decisive issue Suchman’s critique of cognitivism is very clear and firm. She
stated — emphatically and repeatedly — that ‘our actions, while systematic, are
never planned in the strong sense that cognitive science would have it’ (p. ix); that
‘plans are resources for situated action, but do not in any strong sense determine
its course’ (p. 52), and so on. By the expression ‘strong sense’ she obviously
meant ‘causal sense’. That is, she took issue with the basic cognitivist proposition
that rational action is to be explained by reference to some (obscure) causal ‘process’.
However, she did not realize that in ordinary language the concept ‘plan’ (‘following a plan’, ‘agreeing to a plan’, ‘violating a plan’) is a concept of normative
behavior. Hence an expression such as ‘planning in the strong [causal] sense’ is
unintelligible, as it suggests the very possibility of a causal determination of action
by a plan. Plans are normative constructs; that is, they provide criteria for whether
or not a particular action is correctly executed.
On the other hand, if Suchman was using the word ‘plan’ in the sense underlying cognitivism’s metaphysical language (material processes that somehow maintain other material processes in a certain state) then her key propositions in turn
lose their sense. What might it, for instance, then mean to say that, ‘as essentially
linguistic representations, plans are efficient formulations of situated actions’ (p.
186)? The problem is, of course, that if one uses the word ‘plan’ in a contrived
sense, transposed from a quite different domain of discourse, then one has lost the
104 Miller et al. actually do refer to ordinary plans: ‘A public plan exists whenever a group of people try to
cooperate to attain a result that they would not be willing or able to achieve alone. Each member takes upon
himself the performance of some fragment of the public plan and incorporates that fragment into his individual, personal Plans’ (Miller, et al., 1960, p. 98). However, they obviously do not notice that they are here
implying ‘plans’ in a normative sense.
Chapter 12: Frail foundations
393
ability to account for ordinary plans in our everyday life. (At the same time one
has introduced a source of confusion in computer science as well.)
Thus, when Suchman talked about ‘plans’, it is generally not clear if the word
is to be read as ‘plans-according-to-cognitive-science’ or as ‘“plans” as we might
use them in ordinary affairs’ (Sharrock and Button, 2003, p. 259). As noted
above, this was deliberate on Suchman’s part. This is more than a source of ambiguity, however. Suchman was herself obviously not aware of this rather important
distinction. The normative status of plans is completely absent from her account.
Cognitivism’s foundations were thus left effectively untouched and intact.
Having from the outset accepted the cognitivist concept of ‘plan’ as meaningful, albeit empirically objectionable, and having thus conceded the high ground,
how could Suchman then possibly mount anything like an attack on cognitivism?
She did so by deploying what could be dubbed a strategy of containment. That is,
she tried to keep the cognitivist peril at bay by neutralizing its account of rational
action and the pernicious implications that flow from it, by insisting that ‘plans
are best viewed as a weak resource for what is primarily ad hoc activity’, etc. The
intellectual costs of this strategy are very high indeed, however. To protect the
phenomenon of real-world human action — socially organized, materially situated, embodied, intentional action — from being reduced to a mere epiphenomenon
of some hidden causal process, Suchman introduced a categorical gap between
plan and situated action. She did that, for instance, when she stated that ‘situated
action turns on local interactions between the actor and contingencies that, while
they are made accountable to a plan, remain essentially outside of the plan’s
scope’ (pp. 188 f.). The proposition is somewhat ambiguous in as much as it is not
entirely clear what is meant by the term ‘scope’, but the expression ‘essentially
outside’ would normally serve to indicate that the concepts of ‘situated action’
and ‘plan’ are conceptually unrelated, i.e., that they are unrelated by definition,
and that any relation therefore would be contingent. This reading is confirmed by
statements throughout the book in which ‘plans’ are granted a role prior to and
subsequent to action, but not in action. For example, and as shown above, Suchman asserted that ‘It is only when we are pressed to account for the rationality of
our actions, given the biases of European culture, that we invoke the guidance of a
plan’ (p. ix). In a similar vein she posited that ‘plans are a constituent of practical
action, but they are constituent as an artifact of our reasoning about action, not as
the generative mechanism of action’ (p. 39). The reading I am suggesting, that
Suchman construed a categorical separation of ‘plans’ and ‘situated actions’, is
further confirmed by her discussion of the canoe example mentioned in passing
above:
‘A great deal of deliberation, discussion, simulation, and reconstruction may go into such a plan.
But, however detailed, the plan stops short of the actual business of getting your canoe through the
falls. When it really comes down to the details of responding to currents and handling a canoe, you
effectively abandon the plan and fall back on whatever embodied skills are available to you. The
purpose of the plan in this case is not to get your canoe through the rapids, but rather to orient you
in such a way that you can obtain the best possible position from which to use those embodied
skills on which, in the final analysis, your success depends.’ (p. 52)
394
Cooperative Work and Coordinative Practices
That is, on Suchman’s view, when one starts acting, one ‘effectively abandons’
the plan; its only purpose now is to ‘orient’ one so as to be able to improvise
smartly. The same picture of plans or ‘abstract representations of situations and of
actions’ was painted in the conclusions of the book:
‘The foundation of actions […] is not plans, but local interactions with our environment, more and
less informed by reference to abstract representations of situations and of actions, and more and
less available to representation themselves. The function of abstract representations is not to serve
as specifications for the local interactions, but rather to orient or position us in a way that will allow us, through local interactions, to exploit some contingencies of our environment, and to avoid
others.’ (p. 188).
Again action was portrayed as essentially ad hoc action, ‘local interactions’, and
again the role of ‘plans’ was seen as merely that of affording improvisation: ‘orient or position us in a way that will allow us, through local interactions, to exploit
some contingencies of our environment, and to avoid others’.
Now, Suchman did state, quite clearly, that ‘the essential nature of action,
however planned or unplanned, is situated’ (p. x). I can of course have no problem
with that statement. All action, planned or improvised, is situated, i.e., contingent,
embodied, materially contextualized, etc. But it is important to realize that this is
not a substantive or empirical proposition but a logico-grammatical one. It simply
states that action and context are internally related concepts. But Suchman tended
to forget this. As already shown, ‘situated actions’ were often characterized as
‘essentially ad hoc’ (e.g., pp. ix, 48, 61, 78), which is plain nonsense. Action is
essentially (!) situated, and some actions are ad hoc, while other actions certainly
are not ad hoc. The action of executing a plan is, being an action, situated; but it is
not, again by definition, ad hoc.
In sum, it is internal to the concept of action that action is situated, contingent.
To say that action is essentially ad hoc or that plans flounder on the contingent
nature of action is deeply confused. Some action is characterized by being overwhelmingly spontaneous, unpremeditated, ad hoc, improvised, etc. Other action is
planned in the sense that there is an obligation to execute the action in certain
ways: steps to be taken in a certain sequence, by certain actors, at certain times, by
using certain resources, etc. Plans do not cause action to take a particular course,
for they cannot cause anything. Just like rules, conventions, notations, etc., plans
are normative constructs of our common practices.
1.3. Transcendental judgments
‘We understand what it means to set a pocket-watch to
the exact time, or to regulate it to be exact. But what if it
were asked: Is this exactness ideal exactness? […]
No single ideal of exactness has been envisaged; we do
not know what we are to make of this idea’
(Wittgenstein, 1945-46, § 88)
The costs of Suchman’s strategy of containing cognitivism are not limited to the
categorial separation of ‘plans’ and ‘actions’ and to the ‘abandonment’ of plans at
a desolate place outside of situated action.
As we have seen, the cognitivist notion of ‘plan’ is predicated on a notion of ‘a
complete description of behavior’ (Miller, et al., 1960). Within a specific practice
the notion of a complete description or a completely specified plan, instruction,
recipe, etc. of course makes sense, in as much as there are criteria for what can be taken to be a completely specified plan, etc. That is, the plan, the instruction, etc. is complete when it is unproblematic for an ordinary practitioner to follow it, apply it, use it, etc. However, the cognitivist notion of ‘a complete description of behavior’ presumes completeness in the abstract, irrespective of any specific practice.
The cognitivist notion of a ‘complete’ plan is therefore as absurd as the fantastic story of the ‘perfect map’ that Borges has given us. The story is presented as a
text fragment entitled ‘Of Exactitude in Science’ that Borges, playfully, pretends
to have found in an old book, ‘Travels of Praiseworthy Men (1658) by J. A. Suarez Miranda’:
‘… In that Empire, the craft of Cartography attained such Perfection that the Map of a Single
province covered the space of an entire City, and the Map of the Empire itself an entire Province.
In the course of Time, these Extensive maps were found somehow wanting, and so the College of
Cartographers evolved a Map of the Empire that was of the same Scale as the Empire and that
coincided with it point for point. Less attentive to the Study of Cartography, succeeding Generations came to judge a map of such Magnitude cumbersome, and, not without Irreverence, they
abandoned it to the Rigours of sun and Rain. In the western Deserts, tattered Fragments of the Map
are still to be found, Sheltering an occasional Beast or beggar; in the whole Nation, no other relic
is left of the Discipline of Geography.’ (Borges, 1946, p. 141)
Here the absurdity is obvious enough. But the very idea that a ‘complete description’ could be given outside of a practice in which the criterion for completeness
is given is equally absurd.
Suchman was at pains to demonstrate that the cognitivist completeness notion
is untenable, which it certainly is. The problem with Suchman’s strategy is that
she engaged in an overwhelmingly empirical argument to demonstrate that the
metaphysical assumptions of cognitivism are factually groundless. But they are
not groundless; they are meaningless. Her strategy therefore, unwittingly, reproduced the metaphysics of cognitive science, albeit in the inverse. This shows in
several ways.
Suchman repeatedly stated that plans are inherently ‘vague’ compared with action: ‘Stated in advance, plans are necessarily vague, insofar as they must accommodate the unforeseeable contingencies of particular situations’ (p. ix). Similarly,
she stated that ‘plans are inherently vague’ (pp. 38, 185) or referred to the ‘representational vagueness’ of plans (p. 185). But could action be anything but ‘situated’
in the sense used by Suchman: contextual, embodied, etc.? Would ‘doing x’ not
be categorically different from ‘thinking about doing x’ (or ‘describing doing x’ or
‘planning doing x’)? But of course it would! This is as trivial as saying that there
is a difference between the landscape and the map of the landscape.
How does one compare the ‘vagueness’ or ‘completeness’ of ‘plans’ vis-à-vis
‘actions’? Is there a metric out there one can employ? Surely not. But what does it
mean, then, that ‘plans are inherently vague’? When Suchman was making these
statements she was falling back into the metaphysical framework of the cognitivist tradition she was otherwise trying to demolish. The cognitivists’ transcendental use of the term ‘completeness’ and Suchman’s equally transcendental use
of the opposite term ‘vagueness’ are both examples of the kind of metaphysical
smoke that is produced when language is idling. The notion of ‘vague’ plans, ‘incomplete’ plans, etc. presupposes the logical possibility of distinct plans, complete plans, etc. The terms ‘vague’ and ‘complete’ are used as characterizations of
specific plans with respect to specific practices.
According to which criteria can a plan be said to be ‘vague’? Are there criteria
of ‘vagueness’ or specificity or adequacy outside of the particular practice?
Suchman pointed out herself that ‘While plans can be elaborated indefinitely, they
elaborate actions just to the level that elaboration is useful; they are vague with
respect to the details of action precisely at the level at which it makes sense to forego abstract representation, and rely on the availability of a particular, embodied
response’ (p. 188). True: provided one has indefinite time and resources, then
plans can be elaborated indefinitely. But if the criterion of a plan’s appropriate
level of elaboration is a practical one, then surely some plans are vague (with respect to the criteria internal to the practice in question) and others not. In fact, we
would hardly make plans if they were not, generally, suitably specific and complete for our practical purposes.
The same metaphysical form of discourse is in evidence when Suchman stated
that the ‘circumstances of our actions […] are continuously changing around us’
(p. viii). Sure, there is a sense in which we can say that the world changes continually: electrons jump about, molecules form and dissolve, cells reproduce and decay, etc. But the notion of the ‘circumstances of our actions’ refers to the circumstances that are of practical significance, those that to a practitioner make a difference. And in that sense the ‘circumstances of our actions’ may or may not change.
Hence, we can unproblematically talk about doing the same thing under the same
circumstances. That is, in her honorable effort to turn the table on the cognitivism’s realist notion of ‘complete description’ in the abstract, Suchman fell back
into the trite nominalist notion that ‘everything is unique’: one cannot jump into
the same river twice, nothing stays the same, all action is therefore ad hoc. 105
The source of these confused ideas is, I suggest, the cognitivist notion that
plans are ‘descriptions’, ‘representations’, etc. On the cognitivist account, ‘plans’
are hypotheses derived inductively that (in some unintelligible way) are operating
causally. To this Suchman countered with the classical objection that inductively
derived hypotheses (and theories) are underdetermined with respect to manifold
reality and that a given hypothesis therefore is only one possible interpretation out
of many. Hence, I presume, the confusing talk about essential vagueness of
‘plans’.
In sum, Suchman was trying to demolish the cognitivist program but she did
not, from the very outset, dissolve cognitivism’s obscure notion of ‘plans’. That
is, she agreed to fight the cognitivist notion of ‘plans’ on the very battlefield
chosen and defined by cognitivism. The cognitivist notion of ‘plans’, phantasmal
objects that supposedly determine rational action in a causal sense, is a meaningless construct and would have to be demolished as such. By accepting this construct as having sense, Suchman also, unwittingly and against everything she otherwise stood for, engaged in a metaphysical discourse and got trapped like a fly in a
bottle.
For CSCW the immediate problem with the metaphysics underlying cognitivism as well as that retained in Suchman’s reversal of cognitivism is that this
form of theorizing tends to make us blind to the multiplicity of practices and
hence to phenomena that are crucially important to us. Consequently, although
Suchman’s book was justly received enthusiastically as portending a liberation
from the cognitivist dogma and from its stifling effects on socio-technical research, and while it offered vital initial formulations of the practice-oriented program of CSCW, it also, at the same time, contributed to stifling that program by
reproducing the metaphysical form of reasoning characteristic of cognitivism.
105 Suchman has recently, in response to criticisms of propositions such as ‘situated actions are essentially
ad hoc’, made some rather guarded comments: ‘I see my choice of the term ad hoc here as an unfortunate
one, particularly in light of subsequent readings of the text. The problem lies in the term’s common connotations of things done anew, or narrowly, without reference to historically constituted or broader concerns.
Perhaps a better way of phrasing this would be to say that situated actions are always, and irremediably, contingent on specific, unfolding circumstances that are themselves substantially constituted by those same actions. This is the case however much actions may also be informed by prescriptive representations, past experience, future considerations, received identities, entrenched social relations, established procedures, built
environments, material constraints and the like. To be rendered effective the significance and relevance of
any of those must be reiterated, or transformed, in relation to what is happening just here and just now.’
(Suchman, 2007, p. 27, n. 4). — It first of all needs to be said that the ‘choice of the term ad hoc’ is not simply ambiguous but confusing in as much as ‘ad hoc’, as any dictionary will confirm, has a stable meaning,
namely, something done ‘for the particular end or purpose at hand and without reference to wider application
or employment’, in contrast to ‘planned’ or ‘coordinated’. Anyway, removing the term from the text does not
alter anything. Should the phrase ‘contingent on specific, unfolding circumstances’ be read as a logico-grammatical one or an empirical one? If the former reading is intended, the clause simply states that
action is irremediably situated or that planning is not acting. In the latter reading, the question remains: ‘irremediably contingent’ according to which criteria? Does it, on this account, even make sense to talk of an
action as an identifiable course of conduct?
1.4. Regularity and normativity
‘A rule stands there like a signpost. — Does the signpost leave no doubt about the way I have to go? Does it
show which direction I am to take when I have passed
it, whether along the road or the footpath or cross-country? But where does it say which way I am to follow it; whether in the direction of its finger or (for example) in the opposite one?’
‘The signpost is in order — if, under normal circumstances, it fulfils its purpose.’
(Wittgenstein, 1945-46, §§ 85, 87)
The extent to which Suchman in her book stayed within the cognitivist framework
is particularly clear in her understanding of the term ‘plan’. For cognitivism, for
which persons are mere carriers of ‘plans’, a ‘plan’ is a description of action. On
this view, the normative sense of plans, as a prescribed or agreed-to course of action, is not considered at all. Suchman conceived of plans in exactly the same
way:
‘in our everyday action descriptions we do not normally distinguish between accounts of action
provided before and after the fact, and action’s actual course’ (p. 38 f.). ‘Like all action descriptions, instructions necessarily rely upon an implicit et cetera clause in order to be called complete’
(p. 62). ‘The general task in following instructions is to bring canonical descriptions of objects and
actions to bear on the actual objects and embodied actions that the instructions describe’ (p. 101).
This is again remarkable. As already noted, the term ‘plan’ is generally used in
ways comparable to those of the term ‘rule’. We use the term ‘rule’ both descriptively, to indicate regularity (‘As a rule, I clock out when the bars open’), and as a
criterion of correct conduct (‘The house rule says no fighting and no spitting on
the floor!’). Similarly, we certainly sometimes use the term ‘plan’ as an ‘action
description’: ‘He seemed to be working according to a plan’, or ‘He acted in conformance with Plan B’; but we also use the term to refer to normative constructs:
‘This is our plan…’ or ‘According to the production plan, we have to be finished
today.’
How can we talk and reason sensibly about rules and plans and ad hoc activities and about the role of ‘formalisms’ in work practices?
To extract ourselves from the quagmire of cognitivism, which Suchman set out
to do but did not succeed in doing, we should first of all heed some cardinal distinctions of which Wittgenstein has reminded us.106 A major source of confusion
is many social scientists’ apparently stubborn refusal to distinguish between, on
one hand, mere regularity of behavior, and on the other hand, following a rule.
106 The following discussion is, of course, primarily based on Wittgenstein’s famous analysis in Philosophical Investigations (1945-46, esp. §§ 82-87 and 185-242) but also his Remarks on the Foundations of Mathematics (1937-44). For excellent but not entirely mutually congruent commentaries on this subtle analysis, cf.
the works of Winch (1958), Pitkin (1972), von Savigny (1974, 1994), Baker and Hacker (1984, 2009), Malcolm (1986, 1995), Hacker (1989, 1990), and Williams (1999).
Mere regularity: that is, people’s exhibiting observable and reasonably predictable patterns of behavior, or their acting in observable conformity with a rule.
What many sociologists and psychologists try to tease out by detecting correlations in behavior (e.g., differing suicide rates) is this: mere regularity. Such
patterns may be important ‘sociological facts’, but they may also be of the same
stuff as the notorious canals on Mars.
When we talk about rule following, by contrast, we talk about something entirely different: practices that not only involve observable regularity of conduct
but also the ability of actors to explain, justify, sanction, reprimand, etc. actions
with reference to rules, and often also the ability to teach rules, formulate rules,
debate rules, etc. In his Remarks on the Foundations of Mathematics, Wittgenstein gives a wonderfully straightforward illustration of this point:
‘Let us consider very simple rules. Let the expression be a figure, say this one:
⎢— —⎥
and one follows the rule by drawing a straight sequence of such figures (perhaps as an ornament).
⎢— —⎥ ⎢— —⎥ ⎢— —⎥ ⎢— —⎥ ⎢— —⎥
Under what circumstances should we say: someone gives a rule by writing down such a figure? Under
what circumstances: someone is following this rule when he draws this sequence? It is difficult to
describe this.
If one of a pair of chimpanzees once scratched the figure ⎢— —⎥ in the earth and thereupon the
other the series ⎢— —⎥ ⎢— —⎥ etc., the first would not have given a rule nor would the other be
following it, whatever else went on at the same time in the mind of the two of them.
If however there were observed, e.g., the phenomenon of a kind of instruction, of shewing how
and of imitation, of lucky and misfiring attempts, of reward and punishment and the like; if at
length the one who had been so trained put figures which he had never seen before one after another in sequence as in the first example, then we should probably say that the one chimpanzee
was writing rules down, and the other was following them.’ (Wittgenstein, 1937-44, VI §42)
The difference between ‘regularity’ and ‘rule following’ is a categorial one.
The formulation of a rule is not an empirical proposition, whereas the formulation
of a regularity is; the formulation of a rule is a normative one: it provides criteria
for what is correct and what is not, what is right and what is wrong. One may, for
example, observe that people of a certain age are disproportionately represented
among those who commit suicide; this would be an observation of a regularity.
But making this observation is obviously not the expression of a rule. If I said to a
particular person that he had tried to commit suicide at the wrong age, I would most certainly be regarded as demented, and rightly so. This should be clear
enough, or so I should like to think. Let us therefore move on to the more tricky
issue of the relationship between rule and action.
What may make Wittgenstein’s observation on our concept of ‘following a
rule’ particularly hard for sociologists and other social scientists to accept is the
strong inclination in social theory to conceive of rational action as necessarily involving some kind of ‘interpretation’ of rules, instructions, precedents, etc. on the
one hand and ‘interpretation’ of the situation on the other. This is the intellectualist legend at work: the actor is portrayed in the image of the scholar bent over an
ancient text fragment trying to develop a version that is both loyal to the original
and at the same time understandable to the modern reader.
Wittgenstein reminds us, however, not to confound following a rule and interpreting a rule. To do so he takes the reader of his Philosophical Investigations
through a lengthy reductio ad absurdum, demonstrating that a course of action
that exhibits regularity can be made out to conform with multiple and mutually
contradictory rule formulations. After having done that, Wittgenstein stops the
reductio and lets his fictional interlocutor ask: ‘But how can a rule teach me what
I have to do at this point? After all, whatever I do can, on some interpretation, be
made compatible with the rule’. Wittgenstein replies:
‘No, that’s not what we should say. Rather: every interpretation hangs in the air together with what
it interprets, and cannot give it any support. Interpretations by themselves do not determine meaning.’ (Wittgenstein, 1945-46, § 198).
The operative word here is interpretation. A few sections later, Wittgenstein emphasizes this point:
‘This was our paradox: no course of action could be determined by a rule, because every course of
action can be brought into accord with the rule. The answer was: if every course of action can be
brought into accord with the rule, then it can also be brought into conflict with it. And so there
would be neither accord nor conflict here.
That there is a misunderstanding here is shown by the mere fact that in this chain of reasoning we
place one interpretation behind another, as if each one contented us at least for a moment, until we
thought of yet another lying behind it. For what we thereby show is that there is a way of grasping
a rule which is not an interpretation, but which, from case to case of application, is exhibited in
what we call “following the rule” and “going against it”.
That’s why there is an inclination to say: every action according to a rule is an interpretation. But
one should speak of interpretation only when one expression of a rule is substituted for another.’ (Wittgenstein, 1945-46, § 201)
In following a rule there is no space for interpretation: ‘an interpretation gets us
no closer to an application than we were before. It is merely an alternative formulation of the rule, another expression in the symbolism which paraphrases the initial one’ (Baker and Hacker, 2009, p. 92).
To be sure, a course of action sometimes involves interpretation, but it does not
necessarily do so. As Wittgenstein puts it in Zettel, ‘an interpretation is something
that is given in signs. It is this interpretation as opposed to a different one (running differently)’ (Wittgenstein, 1945-48, § 229). That is, we talk of ‘interpretation’ when referring to the substitution of one linguistic construct (rule formulation, instruction, command, statement, representation, etc.) by another and, supposedly, more useful construct. In following a rule, the rule is not ‘interpreted’ or
the like; it is simply applied or enacted, because that is what following a rule
means. Understanding a rule means that I can apply it without engaging in interpreting the rule. As succinctly summarized by Baker and Hacker, ‘to grasp a rule
is to understand it, and understanding a rule is not an act but an ability manifested
in following the rule’ (Baker and Hacker, 2009, p. 96). And Wittgenstein again:
‘What happens is not that this symbol cannot be further interpreted, but: I do no
interpreting. I do not interpret because I feel at home in the present picture. When
I interpret, I step from one level of thought to another’ (Wittgenstein, 1945-48, §
234).
In fact, action normally does not involve interpretation. In our ordinary work
practices we do not normally engage in interpretation work whenever we follow
instructions or execute plans. We sometimes have to, of course, but generally we
do not. That is the whole point of the concept of ‘the natural attitude’. Interpretation is required when doubt is a practical issue, and doubt needs grounds too
(Wittgenstein, 1949-51, § 124). Endless doubt is impossible: ‘If you tried to doubt
everything you would not get as far as doubting anything. The game of doubting
itself presupposes certainty’ (§ 115). We interpret when it is conceivable to us that
we could be wrong, e.g., when we are uncertain about the meaning of a rule formulation or do not yet fully understand it.
At the end of his discussion of the notion of following a rule in the Philosophical Investigations, Wittgenstein lets his interlocutor ask: ‘How am I able to obey a
rule?’ To which Wittgenstein replies:
‘If this is not a question about causes, then it is about the justification for my acting in this way in
complying with the rule.
Once I have exhausted the justifications, I have reached bedrock, and my spade is turned. Then I
am inclined to say: “This is simply what I do.”’ (Wittgenstein, 1945-46, § 217)
Two paragraphs later, Wittgenstein wraps up this line of argument by saying:
‘When I follow the rule, I do not choose.
I follow the rule blindly.’ (§ 219)
If read out of context, the phrase ‘I follow the rule blindly’ can be misunderstood
as suggesting that normative behavior is irrational, or non-rational: ‘But in context
it signifies not the blindness of ignorance, but the blindness of certitude. I know
exactly what to do. I do not choose, after reflection and deliberation, I just ACT —
in accord with the rule’ (Baker and Hacker, 1984, p. 84). In other words, what is
meant is not that the actor in following the rule proceeds mindlessly but that he or
she goes on as a matter of course. What is meant by saying ‘I follow the rule
blindly’ is made perfectly clear in the Remarks on the Foundations of Mathematics:
‘One follows the rule mechanically. Hence one compares it with a mechanism.
“Mechanical” — that means: without thinking. But entirely without thinking? Without reflecting.’
(Wittgenstein, 1937-44, VII §61).
When following the rule, that is, the actor proceeds as ‘a matter of course’
(Wittgenstein, 1945-46, § 238), for doubt is not an option. The rule ‘always tells
us the same, and we do what it tells us’: ‘One does not feel that one has always
got to wait upon the nod (the prompt) of the rule. On the contrary, we are not on
tenterhooks about what it will tell us next, but it always tells us the same, and we
do what it tells us.’ (Wittgenstein, 1945-46, § 223). The rule ‘is my final court of
appeal for the way I’m to go’. On Wittgenstein’s account, then, the concept of
‘rule’ should be understood in its ‘internal’ or ‘logico-grammatical’ relations to
the concept of ‘practice’ and thereby to the concept of ‘techniques’ (Baker and
Hacker, 2009, pp. 140-145). Understanding a rule means possessing the ability to
do certain things correctly and is manifested in following a rule and mastering the
appropriate techniques.
If the question ‘How am I able to obey a rule?’ is about causes, however, then
the answer is simply, in Wittgenstein’s words, that ‘We are trained to do so; we
react to an order in a particular way’ (Wittgenstein, 1945-46, § 206). That is,
‘there is no explanation of our ability to follow rules — other than the pedestrian
but true explanation that we received a certain training.’ (Malcolm, 1986, p.
180).107
For Wittgenstein, then, to follow a rule is a practice. A ‘person goes by a signpost only in so far as there is an established [ständigen] usage, a custom’ (§ 198).
The rule, in contrast to the various spoken or written expressions and representations of the rule, does not exist independently of the action, as some mysterious
mental entity. But nor does it make sense to think of rule following as something
only one person could do only once in his or her life.
‘It is not possible that there should have been only one occasion on which only one person followed a rule. It is not possible that there should have been only one occasion on which a report
was made, an order given or understood, and so on. — To follow a rule, to make a report, to give
an order, to play a game of chess, are customs (usages, institutions)’ (Wittgenstein, 1945-46, §
199).
To follow a rule means mastering a technique (§ 199) and is thus ‘a practice’ (§
202).
The concept of ‘practice’, then, should not be conceived of as mere conduct or
behavior, nor as incessant improvisation or ‘irremediably contingent’ action. A
practice is constituted by a rule (or an array of rules) that provides the standard of
correct or incorrect conduct. It is the rule (or array of rules) that identifies a course
of action as an instance of this practice, as opposed to an instance of another practice. In the words of Peter Winch, ‘what the sociologist is studying, as well as his
study of it, is a human activity and is therefore carried on according to rules. And
it is these rules, rather than those which govern the sociologist’s investigation,
which specify what is to count as doing “the same kind of thing” in relation to that
kind of activity’ (Winch, 1958, p. 87, emphasis deleted). The identity and integrity of rules and practices over time are themselves the result of practitioners’ ‘reflective’ efforts of instructing and teaching, of commanding and correcting, of
emulating and practicing, of correcting their own transgressions and asking for guidance, and of contemplating and negotiating new ways of doing things. Practices
are upheld.
107 Meredith Williams has an interesting discussion of the key issue of learning in normative behavior (M.
Williams, 1999, chapter 7).
Chapter 12: Frail foundations
2. Work and interpretation work
Why would Suchman be conceptually blind to the role of plans in action, that is,
to the normative character of plans?
The reason seems to be that she interposed interpretation between the plan and
the action. Thus, to act, the actor must first interpret the situation and the plan with
respect to the situation and only then act. She seemed to believe that she was following Garfinkel in this, but that would be a misrepresentation of Garfinkel. Garfinkel is quoted (ibid., p. 62) for this observation:
‘To treat instructions as though ad hoc features in their use was a nuisance, or to treat their presence as grounds for complaining about the incompleteness of instructions, is very much like complaining that if the walls of a building were gotten out of the way, one could see better what was
keeping the roof up.’ (Garfinkel, 1967, p. 22)
It is a wittily put but carelessly general observation, in that the term ‘ad hoc’ of
course presumes the logical possibility that instructions can be complete. The criterion of the completeness or incompleteness of instructions is internal to the particular practice. However, Suchman then elaborated:
‘Like all action descriptions, instructions necessarily rely upon an implicit et cetera clause in order
to be called complete. The project of instruction-writing is ill conceived, therefore, if its goal is the
production of exhaustive action descriptions that can guarantee a particular interpretation. What
“keeps the roof up” in the case of instructions for action is not only the instructions as such, but
their interpretation in use. And the latter has all of the ad hoc and uncertain properties that characterize every occasion of the situated use of language.’ (Suchman, 1987, p. 62)
Suchman was here, again, making the argument that the map is not complete
compared to the terrain and that its use therefore requires ‘interpretation’ and, accordingly, has ‘all of the ad hoc and uncertain properties that characterize every
occasion of the situated use of language’. There is no reason to reiterate why this
was confused. My point here is that she, explicitly, interposed interpretation as a
necessary intermediary between instruction and action. (For a parallel critique, cf.
Sharrock and Button, 2003).
2.1. Garfinkel (mis)interpreted
It is highly relevant and instructive to note the nature of the case Garfinkel is referring to in the above quote. The study, conducted in the early 1960s by Garfinkel in collaboration with Egon Bittner, was concerned with selection activities
at an outpatient psychiatric clinic: ‘By what criteria were applicants selected for
treatment?’. Their sources of information were the clinical records. The most important of these were intake application forms and the various contents of case
folders. They took care to point out that clinical folders contain records that are
generated by the activities of clinical personnel and that ‘almost all folder contents, as sources of data for our study, were the results of self-reporting procedures’ (Garfinkel, 1967, pp. 186 f.).
Altogether 1,582 clinic folders were examined by two graduate students of sociology who were tasked with extracting information and filling in a ‘coding sheet’.
In doing this, the coders were permitted to make inferences and encouraged to undertake ‘diligent search’. Nonetheless, they were unable to obtain answers to a great
many of the items in the coding sheet. For about half of the items dealing with
clinical issues, the coders only got information from between 0 and 30 percent of
the cases. Garfinkel and Bittner’s thorough account of the reasons for this rather
dismal performance is a most informative discussion of the methodological problems that arise in studies that depend on secondary use of clinical records produced for internal use in the clinical setting. The gist of it is this:
‘We came to think of the troubles with records as “normal, natural” troubles. […] “Normal, natural
troubles” are troubles that occur because clinic persons, as self-reporters, actively seek to act in
compliance with rules of the clinic’s operating procedures that for them and from their point of
view are more or less taken for granted as right ways of doing things. […] The troubles we speak
of are those that any investigator — outsider or insider — will encounter if he consults the files in
order to answer questions that depart in theoretical or practical import from organizationally relevant purposes and routines under the auspices of which the contents of the files are routinely assembled in the first place. Let the investigator attempt a remedy for shortcomings and he will
quickly encounter interesting properties of these troubles. They are persistent, they are reproduced
from one clinic’s files to the next, they are standard and occur with great uniformity as one compares reporting systems of different clinics, they are obstinate in resisting change, and above all,
they have the flavor of inevitability. This inevitability is revealed by the fact that a serious attempt
on the part of the investigator to remedy the state of affairs, convincingly demonstrates how intricately and sensitively reporting procedures are tied to other routinized and valued practices of the
clinic. Reporting procedures, their results, and the uses of these results are integral features of the
same social orders they describe. Attempts to pluck even single strands can set the whole instrument resonating.’ (Garfinkel, 1967, pp. 190 f.)
That is, the ‘troubles’ arise whenever clinical records are used to ‘answer questions that depart in theoretical or practical import’ from the purposes for which
they were assembled and external to the practices that for clinicians ‘are more or
less taken for granted as right ways of doing things’. A clinic, like any enterprise,
operates within a fixed budget and must, in its daily operation, consider the comparative costs of recording and obtaining alternative information. Some information items, such as sex and age of patients, are of course cheaply acquired,
while other items, such as occupational history, require expensive reporting efforts (p. 192). At the same time, clinical records are assembled for future, variable, and generally unknown purposes. Consequently, such future purposes do not,
in and of themselves, carry much weight in the busy daily life of the clinic.
The division of labor in the clinic adds another source of ‘normal, natural troubles’: ‘The division of work that exists in every clinic does not consist only of differentiated technical skills. It consists as well of differential moral value attached
to the possession and exercise of technical skills.’ For instance, the role records
play in the accomplishment of administrative responsibilities is quite different
from the role they play in the pursuit of professional medical responsibilities, and
Garfinkel and Bittner pointed to ‘the wary truce that exists among the several occupational camps as far as mutual demands for proper record-keeping are concerned’ (p. 194). Thus, clinicians exhibit ‘abiding concerns for the strategic consequences of avoiding specifics in the record, given the unpredictable character of
the occasions under which the record may be used as part of the ongoing system
of supervision and review’ (p. 194).
Now, the specific character of clinical work and the specific role of record
keeping in these practices pose another source of trouble, a ‘critical source of
trouble’ (p. 197). Clinical work consists in what Garfinkel and Bittner termed
‘remedial activities’. One of the crucial features of these is that ‘recipients [of
treatment] are socially defined by themselves and the agencies as incompetent to
negotiate for themselves the terms of their treatment’. Clinicians undertake to exercise that competence for them; they take responsibility for their patients. Accordingly ‘the records consist of procedures and consequences of clinical activities as a medico-legal enterprise’ (p. 198). This means that records are written
and gathered for ‘entitled readers’. A ‘competent readership’ is presumed. The
‘contents of clinic folders are assembled with regard for the possibility that the
relationship may have to be portrayed as having been in accord with expectations
of sanctionable performances by clinicians and patients.’ (p. 199). Thus ‘terms,
designations, and expressions contained in a document’ in the records are not ‘invoked in any “automatic” way to regulate the relationship’ of the terms to therapeutic activities. ‘Instead, the ways they relate to performances are matters for
competent readership to interpret’ (p. 199). That is, ‘considerations of medico-legal responsibility exercise an overriding priority of relevance as prevailing
structural interests whenever procedures for the maintenance of records and their
eligible contents must be decided.’ So, although records may be put to uses that
are different from those that serve the interests of considerations of medico-legal
responsibility, ‘all alternatives are subordinated’ to considerations of medico-legal
responsibility ‘as a matter of enforced structural priority’. ‘Because of this priority, alternative uses are consistently producing erratic and unreliable results’ (p.
200).
All these conditions have important implications for the relationship between
writer and reader of clinical records:
‘As expressions, the remarks that make up these documents have overwhelmingly the characteristic that their sense cannot be decided by a reader without his necessarily knowing or assuming
something about a typical biography and typical purposes of the user of the expressions, about
typical circumstances under which such remarks are written, about a typical previous course of
transactions between the writers and the patient, or about a typical relationship of actual or potential interaction between the writers and the reader. Thus the folder contents much less than revealing an order of interaction, presuppose an understanding of that order for a correct reading.’ (p.
201).
The records do work, however, because ‘there exists an entitled use of records’:
‘The entitlement is accorded, without question, to the person who reads them from the perspective
of active medico-legal involvement in the case at hand and shades off from there. The entitlement
refers to the fact that the full relevance of his position and involvement comes into play in justifying the expectancy that he has proper business with these expressions, that he will understand
them, and will put them to good use. The specific understanding and use will be occasional to the
situation in which he finds himself. […] The possibility of understanding is based on a shared,
practical, and entitled understanding of common tasks between writer and reader.’ (p. 201).
That is, clinical records, especially records concerning legally touchy ‘remedial
activities’, pose quite specific methodological challenges for secondary analytical
use by non-competent readership. To emphasize this, Garfinkel and Bittner made
a comparison with actuarial records:
‘A prototype of an actuarial record would be a record of installment payments. The record of installment payments describes the present state of the relationship and how it came about. A standardized terminology and a standardized set of grammatical rules govern not only possible contents,
but govern as well the way a “record” of past transactions is to be assembled. Something like a
standard reading is possible that enjoys considerable reliability among readers of the record. The
interested reader does not have an edge over the merely instructed reader. That a reader is entitled
to claim to have read the record correctly, i.e., a reader’s claim to competent readership, is decidable by him and others while disregarding particular characteristics of the reader, his transactions
with the record, or his interests in reading it.’ (ibid., p. 202).
Clinical records belong to practices quite different from actuarial records:
‘In contrast to actuarial records, folder documents are very little constrained in their present meanings by the procedures whereby they come to be assembled in the folder. Indeed, document meanings are disengaged from the actual procedures whereby documents were assembled, and in this
respect the ways and results of competent readership of folder documents contrast, once more,
with the ways and results of competent actuarial readership.’ (p. 203).
The actuarial record ‘is governed by a principle of relevance with the use of which
the reader can assess its completeness and adequacy at a glance.’ By contrast,
with clinical records the reader, so to speak, reassembles the entries to ‘make the
case’.
It should be clear from this that no set of coding instructions, however elaborate and however meticulously designed, could have ameliorated the troubles Garfinkel and Bittner experienced. The troubles were found to be an inexorable feature of secondary use of clinical records.
In other words, the trouble with following instructions in this case is, essentially, the kind of trouble one should expect to experience when engaged in reusing records that have been produced for specific purposes within one work practice — outside
of that practice and thus for purposes for which these records were not originally
intended. Garfinkel and Bittner were indeed quite adamant in pointing this out. What
they described is the kind of work historians engage in when they immerse
themselves in the archives, the collections of internal memos, minutes, and private
letters, etc. This kind of work is, essentially, interpretation work.
These findings are important for sociological studies that depend on documentary evidence produced within a particular practice for local purposes, especially
if issues of ethics and legal responsibility are at stake. They are also very informative with respect to the persistent troubles with ‘organizational memory’ and
‘knowledge management’ systems. But to construe Garfinkel’s argument as positing that instructions, by virtue of being linguistic constructs, always, everywhere,
under all circumstances, are ‘incomplete’ and require ‘interpretative work’ and,
hence, have ‘all of the ad hoc and uncertain properties that characterize every occasion of the situated use of language’, is preposterous. In spite of Garfinkel’s insistence that ethnomethodological studies are not meant to ‘encourage permissive
discussions of theory’ (p. viii), Suchman’s interpretation turns an incisive analysis
of certain methodological issues in investigating certain kinds of work practice
into a philosophical proposition.
That Suchman, at this crucial point in her argumentation, should have read
Garfinkel in such a way is puzzling. Righteous fervor in the struggle against the
cognitivist version of ‘rule governance’ would go some way towards explaining
the urge to come up with a counter-theory. But there is also the special character
of Garfinkel’s Studies in Ethnomethodology to consider. In this book Garfinkel
focused on what one could call problematic situations, either ‘normally, naturally’
problematic ones like the coding case, or contrived ones like the breaching experiments and the counselling experiment. This focus is hand-in-glove with his objective: showing that the taken-for-granted assumptions of everyday life, the ‘natural attitude’, are researchable phenomena in their own right. This focus may,
however, leave the impression that, according to Garfinkel, ordinary people, in the
natural attitude of their daily work, struggle to make sense of coding schemes,
production plans, administrative procedures, time tables. But that actors are not
‘judgmental dopes’ does not mean that they make it through the day by engaging
in endless interpretive work and ad hoc activities.108
2.2. Sources of the interpretation myth in sociology
Rule-skepticism is strong in the social sciences. There is an urge to interpose ‘interpretation work’ to account for practitioners’ acting in accordance with rules,
plans, schemes, etc. — an urge so strong, in fact, that even Ryle’s and Wittgenstein’s demolition of this myth seems exceedingly difficult to grasp and accept.
One source of this strong skepticist urge is the bafflement of a field worker when
faced with myriad activities and inscriptions that do not seem to add up and make
sense. It is natural for the field worker to project this bafflement and ascribe interpretation work to the observed practices: a natural fallacy.
In the kinds of setting that are of primary concern to CSCW (cooperative work
in organizational settings) the field worker will often find it difficult to align observable rule formulations (stipulated procedures, etc.) with the rules followed by
practitioners. In such settings, the field worker will often find presumptive rule
formulations that are not enforced, or seem to be in mutual contradiction, etc. One
may furthermore come across rule formulations (stated procedures, work schedules, project plans) of which members seem ignorant or which they deliberately
disregard. One may find rule formulations that seem to instruct actors to do a series of tasks in a specific order, for instance, ‘first do A, then B, and finally C’,
but then members, sometimes or often, can be observed to jump from A to C
while skipping B, or to do B first and then A and C. Having experienced this, the
field worker will be tempted to report that organizational rules do not instruct
members what to do in a step-by-step manner, that they only convey general policies, not operational guidelines, etc. An investigation of work practices that accounts for these practices in terms of the stated rules, by reference to the proverbial
rule book, will obviously produce an utterly distorted account. In view of this, sociologists have adopted various strategies. Some will introduce a distinction between ‘formal’ and ‘informal’ organization or between ‘formal’ and ‘informal’
rules, etc. (e.g., Selznick, 1948). Others will effectively dissolve the very notion
of rule-governed action by adopting the rule-skepticist position that work practices are ‘essentially ad hoc’ (e.g., Suchman, 1983, 1987; Bucciarelli, 1988a; Button
and Harper, 1995).
108 It is only fair to point out that Suchman is not alone in this rule-skepticist generalization of Garfinkel. An article by Button and Harper offers a case in point (1995). The same abrupt generalization of Garfinkel’s very specific observations can be found in a critical comment on Plans and Situated Actions by Sharrock and Button (2003).
The field worker’s fallacy basically consists in mistaking the logical grammar
of the concept of ‘rule formulations’ for that of ‘rules’ (Baker and Hacker, 2009,
Chapter II). That we in our ordinary discourse do distinguish ‘rule formulations’
from ‘rules’ is evident. The same rule can be stated in different ways; it can be
expressed orally, in writing, by gesturing, and so on; it can be formulated in different languages, by means of different notations, at different levels of detail, at
different levels of formalization, by definition or by examples, by offering different examples, etc. One can make copies of rule formulations, but not of rules.
However, the field worker’s rule-skepticist fallacy is also a natural one, inasmuch as we in our ordinary language do not always make a sharp distinction between the concepts of ‘rule’ and ‘rule formulation’. In everyday life, when someone changes the formulation of a rule, the change will often be seen as a change of
the rule. That is, in the words of Baker and Hacker, rules and rule-formulations
cannot be simply segregated into ‘watertight compartments’, for ‘the grammars of
“rule” and “rule-formulation” run, for a stretch, along the same tracks’ (Baker and
Hacker, 2009, p. 47).
As pointed out by Egon Bittner (1965), the investigator has no privileged access to the governing sense of a stated rule (as formulated in
a standard operating procedure or in a graphical representation of a classification
scheme). The field worker is, by definition, an outsider to the setting; he or she
does not (yet) understand the rule and has not (yet) been trained in applying the
rule. In this, the investigator is in exactly the same situation as a novice being instructed
in the use of the same rule: like the novice, the investigator does not master the
rule, hence the doubt, the uncertainty, the tentative applications. In other words,
for lack of understanding, investigators are left with trying to interpret the stated
rule. This is a practical-epistemological condition that makes field work serious
work.
It is important to keep in mind that there is nothing intrinsic in the form of a
rule formulation that makes it a rule formulation; the sign does not need to have a
specific form, such as, say, ‘If… then…’ or ‘Thou shalt not…’. In the words of
Baker and Hacker, again,
‘For the architectural historian or engineering students the blueprint describes the building or machine. It is used as a description; and if the building is other than as is drawn in the blueprint, then
the blueprint is false. But for the builder or engineer the blueprint is used as an instruction or rule
dictating how he should construct the building or machine. If what he makes deviates from the
blueprint then (other things being equal) he has erred — built incorrectly’ (Baker and Hacker,
2009, p. 52).
As field workers we cannot expect that rules are nicely stated in rule formulations
that have the paradigmatic form of rule formulations; that is, we have to look at
the actual practices of rule governed action and at the actual role of the various
rule-stating artifacts (time tables, standard operating procedures, notation
schemes, etc.) in those practices, to determine what the rule is.
But this does not mean, of course, that ethnographic accounts by necessity are
capricious or subjective. For a field worker trying to establish empirically what
the rules of a particular practice actually are, the stated rule (the printed schedule,
the procedure description, etc.) is of course an important source of data; but only
one source among many, and one to whose understanding the field worker has no privileged access. To establish what the rules in fact are, the field worker will
have to consider how the stated rule is observably used in the setting. How are
members instructed in applying the rule? How is it explained, exemplified, etc.?
Furthermore, how is it invoked to explain, justify, excuse, rectify, chastise, or reprimand actions? How are actions that to an outsider might be seen as a
transgression of the stated rule actually treated by members? Are they approved or
applauded; are they countenanced or condoned; or are they corrected, censured,
castigated? In short, the task of determining the operative sense of a stated rule ‘is
left to persons whose task it is to decide such matters’ (Zimmerman, 1966, p.
155).
In all of this, the field worker is engaged in determining the rules of the practices empirically; their normative character to practitioners themselves can easily
escape him or her. This is an insidious source of confusion and misrepresentation:
the source of a natural fallacy. The fallacy arises when the inexorable investigational condition — that the operative sense of stated rules requires interpretation
— is somehow construed as the human condition. That is, the interpretation work
in which the field worker, as an outsider, is compelled to engage is conceived of
as an inescapable condition for all, members and outsiders alike. The field worker’s fallacy consists in elevating his or her own mundane epistemological problems to the level of a human condition. This is of course confused and is a variant
of the intellectualist legend: the conditions of intellectual work are taken as the paradigm
of the human condition, and specifically intellectual practices as the model of rational
conduct.
3. The consequences of counter-cognitivism
Since CSCW aims at devising technologies that, when used, regulate aspects of
interaction in cooperative work settings, and since work practices are both historically specific and specific to domains and settings, we need to understand the
specificity of work practices: the rules, concerns, criteria, typifications, distinctions, priorities, notations, schemes, plans, procedures, etc.; how such rules etc.
are applied unreflectively and unhesitatingly and also sometimes with some uncertainty; how they are sometimes questioned, debated, amended, elaborated; how
they are taught and instructed; and how they evolve over time and how they propagate beyond local settings, are acquired, emulated, appropriated, etc.; what techniques practitioners bring to bear in their practices, how contingencies are dealt
with routinely, how they employ the tools of the trade as intended in their design,
how they often use them in ‘unanticipated’ ways, and how they often also seize
incidental resources in the setting; and so on.
If CSCW researchers are to be able to do so, preconceived constructions of ‘human nature’ or ‘sociality’ must not be applied as templates in technological design or in the
production and analysis of ethnographic findings: that would cause immediate sterility, since the specificity of cooperative work practices would then have been lost.
It is thus of vital importance for CSCW to heed Egon Bittner’s call for ‘realism in
field work’ and develop and maintain what he calls an ‘unbiased interest in things
as they actually present themselves’ to practitioners (Bittner, 1973). For these reasons phenomenological sociology as represented by Alfred Schütz and the ethnomethodologists has played a very important role in CSCW. This was not, of
course, preordained, as other intellectual traditions offer contributions that, for
these purposes, are concordant with the phenomenological movement (e.g., the
philosophies of Wittgenstein and Ryle, the tradition of ‘symbolic interactionism’,
and in some respects also the psychology of Vygotsky). So far many if not most
CSCW researchers will agree. The problem is that, to achieve this, ‘the field
worker needs not only a good grasp of the perspectives of those he studies but also’, as pointed out by Bittner, ‘a good understanding of the distortive tendencies
his own special perspective tends to introduce’.
Because of the metaphysical tenet of Plans and Situated Actions, the version of
ethnomethodology that has been widely received in CSCW is one in which phenomenological sociology has been deprived of its most important insights: the
principle of specificity of practices (‘zones of relevance’, ‘finite provinces of
meaning’, etc.), the notion of the ‘natural attitude’ of working, the principle of
taking the point of view of practitioners (what are they up to? how does the world
look when one does that kind of work?), the method of conceiving of work practices in practitioners’ own terms, as opposed to through universal constructs, externally imposed criteria, etc.
While Suchman’s book showed a way out of cognitivism, towards studies of
actual work practices, it also — unwittingly but effectively — was instrumental in
establishing another dogma, the dogma that plans play no role in action, i.e., in
determining the course of action. The dogma was not intended but was the consequence of Suchman’s mirroring cognitivism’s metaphysical form of reasoning.
The existence of this dogma is evident. Consider, for example, this admonishment
from a widely cited article written by Paul Dourish and Graham Button:
‘The disturbingly common caricature of her position is that there are no plans, but only “situated
actions” — improvised behaviors arising in response to the immediate circumstances in which
actors find themselves and in which action is situated. In fact, as Suchman has been at pains to
point out, she did, in fact, accord an important status to plans as resources for the conduct of work.
Her argument was that plans are one of a range of resources that guide the moment-by-moment
sequential organization of activity; they do not lay out a sequence of work that is then blindly interpreted.’ (Dourish and Button, 1998, pp. 405 f.)
This defense of Suchman stayed within the metaphysical discourse established by
Suchman’s original argument: ‘plans […] do not lay out a sequence of work that
is then blindly interpreted’. Disregarding the perplexing expression ‘blindly interpreted’ (what would it mean to ‘interpret blindly’?), I take it that the authors
meant to say ‘plans do not lay out a sequence of work that is then blindly [followed]’. Well, plans sometimes are. In fact, they are most of the time. They are
routinely applied as unproblematic guidelines or instructions, and if plans do not
lay out a sequence of work that is then normally followed without reflection then
there are no plans as we ordinarily understand the term. The statement that ‘plans
do not lay out a sequence of work that is then blindly [followed]’ only makes
sense if ‘plans’ are not really ordinary plans, but rather ‘plans’ as conceived of in
cognitivist theorizing. But then ‘plans’ as conceived of in cognitivism cannot be
followed (or ‘interpreted’), for they are obscure causal mechanisms.
In another widely cited article Button and Harper cautioned ‘designers’ in
CSCW not to misunderstand the concept of ‘work practice’ (Button and Harper,
1995). Their concern was occasioned by the increased use of the concept of ‘work
practice’ in the ‘design community’, especially in the wake of Suchman’s Plans
and Situated Actions. They noted with barely suppressed irritation that the concept
of ‘work practice’ was ‘being invoked more and more in the rhetoric that surrounds design’ (p. 266) and had become ‘something of a rallying cry in many
quarters of CSCW’ (p. 279); but what gave them particular cause for concern was
that the concept of ‘work practice’ had its ‘origins in sociology’ and had ‘a well
established place in the sociology of work’ and that the ‘sociological underpinnings’ of the concept and ‘the order of work organisation to which Suchman refers’ might not be properly understood (pp. 263-265). The problem, according to
Button and Harper, was that the concept of ‘work practice’ in sociology of work
was used for ‘describing amongst other things the rule book formulations of work
as well as the situated responses to contingent circumstances’ (p. 265, emphasis
added). The authors’ issue with this use of the concept of ‘work practice’ seems to
be that ‘rule book formulations of work’ are even considered in accounts of work
practices. They contrasted this with what they presented as the ethnomethodological position. Referring to Garfinkel and Bittner’s study of the ‘normal, natural
troubles’ of coding clinical records they made the following claims:
412
Cooperative Work and Coordinative Practices
‘The finding of Garfinkel’s study was […] that in practice it was not possible to exhaustively and
explicitly stipulate the coding rules. However full and detailed the rules in the coders’ handbook
were made, each time coders had to administer the schedule, there was a need for decision and
discretion. The coders would resort to a variety of practices to decide what the coding rules actually required of them and whether what they were doing was actually (or virtually) in correspondence with those rules. These were essentially ad hoc relative to the coding manual’s purportedly
systematic character. It was through the implementation of these ad hoc practices that coders
achieved their work of coding. The formalised account of the work of coding as applying the rules
omits the very practices that organise that work.’ (Button and Harper, 1995, p. 265).
According to Button and Harper, then, ‘work-practices constitute, in its [sic] detail and in the face of the unfolding contingencies of work, the temporal order of
work and the ordinary facticity of domains of work’ (p. 264). In short, practices
are ‘essentially ad hoc’.
Now, it should be said immediately and with emphasis that this pallid notion of
practice has not prevented these authors from investigating complex professional
practices and, in doing so, blessing CSCW with some of the most influential studies of work practices (e.g. Harper, et al., 1989a, b; Bowers, et al., 1995). Harper
and Button have provided us with very insightful analyses of normatively regulated conduct in ordinary work settings. What we have, then, is what looks like a
clean separation of ‘theory’ and ‘practice’, the hallmark of a dogma. Its paralyzing
effects show up elsewhere.
The upshot of all this is that Plans and Situated Actions has been instrumental
in replacing the cognitivist dogma with another. The new dogma, while certainly
far more fertile in terms of encouraging empirical work than that of the cognitivist
wasteland, made CSCW largely numb to the massive web of coordinative practices and techniques that characterizes typical modern workplaces. Their presence is,
of course, not flatly denied; it is even acknowledged that they serve as ‘important’
‘resources’ for situated action. But it is, ab initio and dogmatically, posited that
‘plans do not lay out a sequence of work that is then blindly [followed]’, and that
work practices are ‘essentially ad hoc’. The concept of ‘practice’, which plays a
defining role for CSCW’s research program, is hereby emptied of content, rendered useless. The practice-oriented research program of CSCW is effectively undermined.
The implication is a strong methodological bias even in ethnographic fieldwork. To see this, take for instance Bob Anderson’s support of Suchman
against her critics:
‘If we set the context for the ethnography at the level at which many ethnographers feel most comfortable, we will find they are almost obsessed with change of one sort or another. In picking their
way through the minutiae of routine action, prominence is (endlessly) given to the innovative, the
ad hoc, and the unpredictable rife in the workplace and elsewhere. Change, here, is the very stuff
of ethnography.’ (Anderson, 1997, p. 177, emphases added).
This is an accurate statement of the diagnosis. However, Anderson does not mean
this as a critique but as a formulation of an optional methodological preference.
What Anderson does not take into account is Egon Bittner’s warning, in the
early days of ethnomethodology, against the fieldworker’s fallacy: Because the
ethnographer is ‘a visitor whose main interest in things is to see them’, to him or
her ‘all things are primarily exhibits’; they are ‘never naturally themselves but only specimens of themselves’ (Bittner, 1973, p. 121). These are unavoidable conditions for any ethnography. However, as a result, a certain intellectualism may distort the ethnography:
‘Since the field worker […] always sees things from a freely chosen vantage point — chosen, to be
sure, from among actually taken vantage points — he tends to experience reality as being of subjective origin to a far greater extent than is typical in the natural attitude. Slipping in and out of
points of view, he cannot avoid appreciating meanings of objects as more or less freely conjured.’
(Bittner, 1973, pp. 121 f.).
And in doing so, the field worker describes the setting ‘in ways that far from being realistic are actually heavily intellectualized constructions that partake more of
the character of theoretical formulation than of realistic description’ (Bittner,
1973, pp. 123 f.). Bittner is here restating and underscoring Winch’s fundamental
proposition, namely, ‘it is not open’ to the sociological investigator ‘to impose his
own standards from without. In so far as he does so, the events he is studying lose
altogether their character as social events’ (Winch, 1958, p. 108).
If we in our analyses of work practices give prominence to ‘the innovative, to
the ad hoc, to the unpredictable rife in the workplace’, or to the ‘unfolding contingencies of work’ we will be under the ‘serious misunderstanding’ of rendering
practices in ways that are ‘heavily intellectualized constructions.’ That is, if we
heed Bittner’s warning, we cannot take Anderson’s rather agnostic position and
simply accept the reputed ‘obsession with change’ as a methodological preference
some may wish to adopt and others not. For an ethnography characterized by an
‘obsession with change’ is a ‘heavily intellectualized construction’ just as much as
one obsessed with stable structures, equilibrium, and what not.
Obviously, such ‘heavily intellectualized constructions that partake more of the
character of theoretical formulation than of realistic description’ can only impede
CSCW’s ability to address the construction and use of plans in cooperative work,
for the ordinary plans of ordinary cooperative work in ordinary organizational life
are then rendered delusory, the natural attitude of practitioners excluded, if they
are seen at all.
In sum, the consequence of all this is, at best, that our studies become unrealistic. At worst, CSCW abandons its practice-oriented technological research program and becomes irrelevant as yet another research area producing post hoc descriptions of the various uses of new products, services, and facilities coming onto
the market.
4. The problem of computational artifacts
I have focused on Suchman’s account for obvious reasons. Her contribution is,
justifiably, seen as a major contribution to the intellectual foundation of CSCW
and HCI. By centering on the relationship between the concepts of ‘plans’ and
‘situated action’, it provided the foundational conceptual apparatus of CSCW: its
axis of conceptualization. However, this conceptualization no longer holds up.
Confusion reigns with respect to what constitutes CSCW’s problem and its scope.
In fact, CSCW’s central problem, its axis of conceptualization, cannot be adequately defined in terms of ‘plans and situated action’. CSCW’s program is not
usefully conceived of in terms of ‘plans and situated action’ or similar, for the
problem CSCW has to address is not primarily how normative constructs such as
ordinary plans and other organizational rules are applied in practice but how practitioners use or may use machines in following the rules of their cooperative work
practices (schedules, procedures, categorizations, schemes). And that is an entirely different issue. I will suggest, therefore, that CSCW’s problem is circumscribed
by the concepts of rule-governed work practices on one hand and on the other
highly regularized causal processes in the form of ‘computational artifacts’ used,
within cooperative work practices, for purposes of coordinating interdependent
activities. How do we incorporate mechanical regulation of action and interaction
in cooperative work practices? How can we exploit the immense flexibility of
computational technology to give ordinary workers effective control over the distributed execution, construction, and maintenance of mechanized coordinative
protocols?
The point of Suchman’s critique of cognitivism was of course to be able to address the problem of computational artifacts. The problem she set out to address
was not the role of plans in situated action in general but the problem of how
plans incorporated in computational artifacts (‘reified in the design of intelligent
machines’) affect human action and interaction, how they can be appropriated in
our practices, and what implications for design may be derived from that: ‘I have
attempted to begin constructing a descriptive foundation for the analysis of human-machine communication’ (p. 180).
Based on her study of a problematic case of human-computer interaction (a
computer-based system intended to instruct users of a complex photocopier),
Suchman concluded:
‘The application of insights gained through research on face-to-face human interaction, in particular conversation analysis, to the study of human-computer interaction promises to be a productive
research path. The initial observation is that interaction between people and machines requires
essentially the same interpretive work that characterizes interaction between people, but with fundamentally different resources available to the participants. In particular, people make use of a rich
array of linguistic, nonverbal, and inferential resources in finding the intelligibility of actions and
events, in making their own actions sensible, and in managing the troubles in understanding that
inevitably arise. Today’s machines, in contrast, rely on a fixed array of sensory inputs, mapped to
a predetermined set of internal states and responses. The result is an asymmetry that substantially
limits the scope of interaction between people and machines. Taken seriously, this asymmetry
poses three outstanding problems for the design of interactive machines. First, the problem of how
to lessen the asymmetry by extending the access of the machine to the actions and circumstances
of the user. Secondly, the problem of how to make clear to the user the limits on the machine’s
access to those basic interactional resources. And finally, the problem of how to find ways of
compensating for the machine’s lack of access to the user’s situation with computationally available alternatives.’ (Suchman, 1987, pp. 180 f.).
Suchman summarized the conclusion as follows:
‘I have argued that there is a profound and persisting asymmetry in interaction between people and
machines, due to a disparity in their relative access to the moment-by-moment contingencies that
constitute the conditions of situated interaction. Because of the asymmetry of user and machine,
interface design is less a project of simulating human communication than of engineering alternatives to interaction’s situated properties’ (Suchman, 1987, p. 185).
The metaphor of ‘asymmetry’ is perplexing. Would it make sense to say that my
relationship to my alarm clock is ‘asymmetrical’ because it seems completely insensitive to the effects of last night’s excesses, due to ‘a disparity’ in our ‘relative
access to the moment-by-moment contingencies’? Does a notion of ‘asymmetry’
not presume a common metric, some significant commonality? In which way,
then, are the two parties commensurate? I take it that Suchman would have answered those expressions of disbelief by emphasizing what she already said in the
quote above: that ‘interaction between people and machines requires essentially
the same interpretive work that characterizes interaction between people, but with
fundamentally different resources available to the participants’. This is truly bewildering: ‘essentially the same interpretive work’!
Now, Suchman certainly did not intend to advocate a cognitivist position but
rather to provide an ‘alternative account’, but something was fatally amiss in this
line of reasoning. The problem, or so it seems to me, is that she conceived of
computational artifacts as linguistic constructs: ‘I argue that the description of
computational artifacts as interactive is supported by their reactive, linguistic, and
internally opaque properties’ (pp. 7, 16). On that view, it was not a dramatic jump
to assert that ‘interaction between people and machines requires essentially the
same interpretive work that characterizes interaction between people’. But then
the alternative account to cognitivism has already been effectively abandoned.
The containment strategy turns out not to contain cognitivism at all. The high
ground has been evacuated, the major highway intersections long since lost, defeat all but conceded. The strategy has landed us in something akin to a Green
Zone, surviving at the mercy of the adversary we proudly set out to vanquish.
Still, the notion that computational artifacts have ‘linguistic properties’ is, of
course, one that is taken for granted in the ordinary discourse of computer science
where it is in common usage (witness terms like ‘programming language’ etc.).
There, in the context of everyday reasoning about programming, it is (largely) unproblematic, as harmless and pragmatic as vernacular expressions such as ‘sunrise’ and ‘sunset’. However, outside of the discourse of computer science, and especially in the context of reasoning about human-computer interaction and computer-supported cooperative work, the notion of computational artifacts as linguistic constructs is fatally misleading. This should be evident inasmuch as Suchman
conceived of computational artifacts in exactly the same terms as she conceived of
‘plans’: as ‘essentially linguistic representations’ (p. 186). It is a category mistake
of the first order.
The problem is, as should be clear by now, that the seeds of the defeat were
there from the very beginning, inherent in Suchman’s strategy of trying to demolish cognitivism while accepting cognitivism’s metaphysical discourse and its
mechanistic premises. In the cognitivist account, there is no room, i.e., no logical
space, for considering normative regularity. There is only room for empirical observations of concomitance (and a significant amount of speculative theorizing, of
course). On this view, ‘rules’ can only be conceived of as hypothetical propositions, derived inductively, by correlation, from observed regularity. The notion
that rules, plans, etc., are an essential part of our practices, as public standards of
correctness or incorrectness, is unintelligible on this view. Ordinary normative
activities, such as asking for the way to the nearest grocery shop, suggesting a
definition of a word, setting the table for a dinner party, executing a production
plan, etc. have no place in this discourse. Suchman’s ‘alternative account’ unfortunately left this view unchallenged. As a result, the normative nature of our practices was lost in the fire. For the same reason, computational artifacts are ascribed
properties of the normative category, which means that we end up trying to
understand ‘human-computer interaction’ with the key concepts of ‘practices’ and
‘computational artifacts’ completely confounded.
Chapter 13
Dispelling the mythology of
computational artifacts
‘If calculating looks to us like the action of a machine,
it is the human being doing the calculation that is the
machine.’
(Wittgenstein, 1937-44, IV § 20)
‘Turing’s “Machines”.
These machines are humans who calculate.’
(Wittgenstein, 1946-49, § 1096)
What is paralyzing CSCW is the assimilation of the normative concept of plans,
schemes, schedules, and similar organizational constructs with the mechanist concept of causal determination of rational action. This leaves no conceptual room for
CSCW. Given this confusion, any thought anybody may have of building or studying computational artifacts that embody plans, schemes, schedules, will unavoidably be met with disbelief or disinterest: ‘But we know already that that’s
wrong/impossible, for action is essentially ad hoc!’ So, instead the good CSCW
researcher will focus on building or studying computational artifacts that embody
as little computational regulation of human interaction as possible. The ideal, one
might say, is computational artifacts on the model of plasticine.
The root of these problems is the concept of computational artifacts, or rather:
the philosophy of computing, a channel of endless mystification. To take just one
example, from a book by a well-respected computer scientist:
‘I etch a pattern of geometric shapes onto a stone. To the uninitiated, the shapes look mysterious
and complex, but I know that when arranged correctly they will give the stone a special power,
enabling it to respond to incantations in a language no human has ever spoken. I will ask the stone
questions in this language, and it will answer by showing me a vision: a world created by my spell,
a world imagined within the pattern of the stone.’ (Hillis, 1998, p. vii)
The interesting thing is that Hillis then goes on to give an instructive account,
completely free from such, well, incantations, of the concept of computing, showing how computing operations can be performed by means of causal processes in
mundane (mechanical, hydraulic) devices.
What mystifies us is the ‘mythology of a symbolism’, as Wittgenstein puts it:
in this case the mythology of computational artifacts. It is, prima facie, a mythology of our everyday dealings with computational artifacts. What mystifies us is,
first of all, the computer’s apparent vivacity: the impression of an infinite number
of possible internal states and its dynamic reactivity. It is an unavoidable aspect of
using computers that the user engages in forms of thinking in which ‘interacting’
with electronic processes is considered natural; it is, in a strict Schutzian sense,
the user’s natural attitude. As computer users and technicians we engage — as the
most natural thing in the world — in ascribing linguistic and other anthropomorphic properties to computational artifacts: programming ‘language’,
‘memory’, ‘information’, ‘ontology’. The computational artifact mystifies us in
the same way as clocks and their ability to ‘tell time’ for ages have allured thinkers to speculate about the mystery of time. This mystification is unavoidable even
if one knows that the time piece executes a causal process in a manner that is carefully devised to be exceptionally regular, and that it is this regular causal process
that allows us to use the clock as a standard metric for measuring the sequential
order or irreversible nature of other causal processes, from heartbeats to train
movements (Gell, 1992). A clock does not ‘tell’ us anything more about time than
the meter rod ‘tells’ us about space. But the meter rod in Paris, which has served
as standard referent for measuring spatial properties of things, does not mystify
us, because it does not do anything. The clock, however, seems to do something
when its springs uncoil, its gears turn, and its hands move; it is an automaton, and
we therefore unhesitatingly ascribe the property of ‘telling time’ to it.
Or to take another, more mundane, example, there are devices for sorting eggs
according to their size. The eggs will roll along a slightly declining plane that has
holes in different, but increasing sizes: small eggs will drop into the first holes,
medium size eggs into the next ones, and so on, until only XL eggs remain. The
device does not measure the size of eggs; the operator measures the size of eggs
by means of the device, according to the standards agreed to by the egg industry.
What makes us engage in ‘incantations’ and ascribe ‘egg-sorting’ behavior to the
‘egg sorter’ is that the ‘egg sorter’ is, within the horizon of sorting eggs, self-acting. It is the automatic performance — the allocation of the control function to
the technical implement — that prompts us to engage in such ascription of function.
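The causal logic of such a device can be rendered in a few lines of code, in an entirely illustrative sketch (the hole sizes below are invented for the example, not taken from any industry standard). Note that the function does not ‘measure’ anything: it merely reproduces the causal fact that an egg falls through the first hole large enough for it; it is the operator, applying agreed standards, who counts the resulting slot as a size grade.

```python
# Illustrative sketch of the causal process of the 'egg sorter': an egg
# rolls along the declining plane and drops through the first hole it
# fits into. The hole sizes are invented for the example.
def drop_slot(diameter_mm, hole_sizes_mm=(42, 47, 52)):
    for slot, hole in enumerate(hole_sizes_mm):
        if diameter_mm <= hole:
            return slot  # the egg drops into this hole
    return len(hole_sizes_mm)  # too big for every hole: it remains (XL)

# The device has 'sorted' nothing; it is we who read slot 0 as S,
# slot 1 as M, and so on, according to agreed standards.
print(drop_slot(40), drop_slot(60))  # 0 3
```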
Another source of mystification is the manner in which the software machine is
built: it is built by a process that, to the programmer and his or her admiring observers, is a linguistic activity. The programmer writes text: code. What happens
to the code afterwards is hidden from view, out of mind, for it is executed automatically: the transformation of the ‘source code’ by the compiler into ‘object
code’ in binary format. In this process, a linguistic construct, the source code, is
transformed into a binary machine that, in the form of electronic patterns, can be
loaded into RAM and then executed in conjunction with the hard-wired machinery of the CPU. The source code is not a machine but the blueprint for one; but in
contrast to an ordinary blueprint, which is also a linguistic construct, the source
code is not handed over to a machinist who then builds the intended machine but
is submitted to a special software machine, the compiler, that automatically performs the construction of the intended machine as specified in the blueprint or
source code.109
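The two-stage character of this process (text first, machine behavior afterwards) can be glimpsed even within a single language. The following is an illustrative sketch using Python’s built-in compile step; the ‘blueprint’ label and the example statement are of course invented for the purpose:

```python
# Illustrative sketch: a linguistic construct (the source text) is
# transformed by the compiler into something that is no longer text.
source = "result = 6 * 7"  # the 'blueprint': plainly a linguistic construct

# The compiler turns the text into a code object; inspecting it with the
# dis module would show machine-like bytecode instructions, not prose.
code_obj = compile(source, "<blueprint>", "exec")

# Executing the code object runs the causal process the text specified.
namespace = {}
exec(code_obj, namespace)
print(namespace["result"])  # 42
```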
A deeper source of the mystification surrounding the computer, or rather: a
more insidious source, is that we routinely use software machinery, i.e., regimented causal processes, as an integrated aspect of our normative or rule-following
practices and that we are tempted to assimilate the two — the causal process and
our use of it — and become mystified, struck with awe at the fruit of our hands.
Just as it is not clocks that ‘know’ the time or ‘tell us’ what time it is, but we who
use clocks to ‘tell the time’, it is not the computer that computes, calculates, executes plans, searches for information, etc., but we who apply computational artifacts to do so in our normative practices of computing, calculating, executing
plans, searching for information, etc., just as we have previously employed abacuses, slide rules, desktop calculators, time pieces, filing cabinets, etc.
1. Computational artifacts, or Wittgenstein vs. Turing
The source of this conceptual muddle is, again, the failure to grasp the categorial
distinction between normative behavior and mere regularity. To sort out this mess,
which in the case of computational artifacts is exceptionally dense, we are in the
fortunate situation that Wittgenstein was developing a critical conceptual analysis
of the concept of calculation or computation that was then, in the late 1930s, being developed by Alan Turing and others.
It is fairly well-known that Turing and Wittgenstein knew each other from
1937, that Turing followed Wittgenstein’s lectures on the foundations of mathematics (Wittgenstein, 1939), and that they in the course of these lectures engaged
in discussions concerning the nature of mathematics. What is less well known is
that Turing sent a reprint of his paper to Wittgenstein in February 1937
(Copeland, 2004, p. 130) and that Wittgenstein, in a bulky collection of manuscripts on the philosophy of mathematics written from 1937 to 1944, subjected
(inter alia) Turing’s mechanistic framing of his argument to incisive critical discussion (Wittgenstein, 1937-44).110
A few remarks on the background and general context are required, however.
Turing’s machine was devised in the context of the ‘foundations crisis’ that mathematicians and philosophers then believed had struck mathematics, and Wittgenstein’s critique of Turing and his mechanist thesis was but a theme in his critical discussion of the very notion that such a crisis existed and of the attempts to overcome it.
109 This linguistic mystique also appears when a user ‘interacts’ with the computational artifact by writing ‘commands’ that then activate an entire system of machines under the control of the computer’s operating system. It is worth noticing that such a mystique is absent when a user pushes the start button on a dishwasher.
110 For very useful commentaries on Wittgenstein’s critique of Turing in particular and the ‘mechanist thesis’ in general, cf. the penetrating studies by Stuart Shanker (1987c, d, 1995, 1998).
1.1. Foundations lost
The conceptual underpinnings of computing technologies (symbolic logic, algorithmics, recursive functions, etc.) grew out of techniques that were developed in
the last decade of the 19th century and the first decades of the 20th in an attempt
to establish mathematics on a complete and consistent foundation.
By the end of the 19th century, leading mathematicians had become deeply
concerned with the foundations of their science. It was felt by many that the impeccable certainty and consistency of mathematical knowledge could be questioned. It was widely perceived as a ‘foundations crisis’. Developments in mathematics such as non-Euclidian geometry (by Lobachevski, Riemann, and others)
and Cantor’s theory of sets and ‘transfinite numbers’ had made it problematic for
many mathematicians to continue to rely on intuition in proving theorems. Their
concerns became acute when paradoxes and absurdities began to emerge.
To save mathematics some mathematicians (especially L. E. J. Brouwer) went
so far as to state that only those mathematical proofs that could be constructed by
‘finite’ methods, i.e., by definite and surveyable steps, and which therefore were
deemed ‘intuitively’ evident, were acceptable as bona fide members of the body
of mathematics. Going under the name ‘intuitionism’ or ‘constructivism’, this
program was prepared to jettison whatever could not be proved by finite methods,
although this would chuck off large chunks of highly useful mathematics. Unsurprisingly this Procrustean program was deemed unattractive by most of the mathematicians that participated in trying to overcome the ‘foundations crisis’.
The main axis of attack on the crisis was developed by the British philosopher
Bertrand Russell. He saw mathematics as an ‘edifice of truths’, ‘unshakable and
inexpugnable’, a vehicle of reason that lifts us ‘from what is human, into the
realm of absolute necessity, to which not only the actual world, but every possible
world, must conform’ (Russell, 1902, pp. 69, 71). Little wonder that he felt an
urge to safeguard mathematics. In his effort to do so, he developed a research program that in the foundational debates became known as ‘logicism’, a program to
rebuild mathematics into a monolithic structure of propositions based on a drastically limited set of axioms and rules of inference. Building on the work of Frege
and Peano, he first made an impressive attempt to reconstruct mathematics on the
basis of first-order logic in the form of a theory of ‘classes’ or sets (Russell,
1903). However, he had barely finished the book before he realized that the edifice he had erected to salvage the ‘realm of absolute necessity’ by establishing
mathematics on the unshakeable basis of logic — was deeply flawed: the concept
of ‘class’ derived from Cantor’s set theory led to absurdities. Mathematicians like
David Hilbert were dismayed:
‘In their joy over the new and rich results, mathematicians apparently had not examined critically
enough whether the modes of inference employed were admissible; for, purely through the ways in
which notions were formed and modes of inference used — ways that in time had become customary — contradictions appeared, sporadically at first, then ever more severely and ominously. They
were the paradoxes of set theory, as they are called. In particular, a contradiction discovered by
Zermelo and Russell had, when it became known, a downright catastrophic effect in the world of
mathematics.’ (Hilbert, 1925, p. 375)
Russell spent most of a decade — together with Alfred Whitehead — building
another foundation of mathematics (Whitehead and Russell, 1910). In order to
exclude paradoxes, the new foundation was built by means of an elaborated logical calculus called the ‘theory of types’, ‘a complicated structure which one could
hardly identify with logic’ in the normal sense (Davis and Hersh, 1980, p. 333). In
Hilbert’s words, ‘Too many remedies were recommended for the paradoxes; the
methods of clarification were too checkered’ (Hilbert, 1925, p. 375).
In an effort to avoid these problems Hilbert developed another approach, generally known as ‘formalism’. Where Russell dreamt of restoring the ‘realm of absolute necessity’, Hilbert’s program was predicated on the ‘conviction’ that every
mathematical problem is solvable: ‘We hear within us the perpetual call: There is
the problem. Seek its solution. You can find it by pure reason, for in mathematics
there is no ignorabimus’ (Hilbert, 1900, p. 248). For Hilbert, therefore, the situation was intolerable:
‘Let us admit that the situation in which we presently find ourselves with respect to the paradoxes
is in the long run intolerable. Just think: in mathematics, this paragon of reliability and truth, the
very notions and inferences, as everyone learns, teaches, and uses them, lead to absurdities. And
where else would reliability and truth be found if even mathematical thinking fails?’ (Hilbert,
1925, p. 375).
To salvage mathematics as an unshakeable edifice, as the ‘paragon of reliability
and truth’, Hilbert formulated a grandiose ‘formalist’ program to demonstrate or
ensure — in the most rigorous manner — that the corpus of mathematical calculi
constitute a complete, consistent, and decidable formal system. His vision was to
develop mathematics ‘in a certain sense’ ‘into a tribunal of arbitration, a supreme
court that will decide questions of principle and on such a concrete basis that universal agreement must be attainable and all assertions can be verified’ (Hilbert,
1925, p. 384). Hilbert’s strategy was to regain the certainty of ‘intuitive’ perceptual inspection even in regions of mathematics such as ‘transfinite number’ theory
that defied perceptual inspection, and to do so by treating ‘formulas’ as ‘concrete
objects that in their turn are considered by our perceptual intuition’ (Hilbert, 1925,
p. 381). In other words, the approach he proposed was to encode mathematical
propositions in a ‘formal’ language, that is, (a) to ‘replace’ ‘contentual references’
by ‘manipulation of signs according to rules’, (b) treat ‘the signs and operation
symbols as detached from their contentual meaning’ and (c) thereby ultimately
obtain ‘an inventory of formulas that are formed by mathematical and logical
signs and follow each other according to definite rules’ (Hilbert, 1925, p. 381).
Now, Hilbert most certainly did not think of mathematics as a meaningless game
(in epistemological matters, he was a ‘realist’ or Platonist); but, to stave off impending skepticism, he was ready to advocate this radically formalistic approach
as a methodology, and as a result his formalist program was accused of treating
mathematics as a meaningless game. Unruffled, Hilbert retorted by appropriating
the epithet: ‘What, now, is the real state of affairs with respect to the reproach that
mathematics would degenerate into a game? […] This formula game enables us to
express the entire thought-content of the science of mathematics in a uniform
manner and develop it in such a way that, at the same time, the interconnections
between the individual propositions and facts become clear’ (Hilbert, 1927, p.
475).
Although Hilbert’s program took a different course than Russell’s, the two
programs shared basic premises, namely that the problems facing mathematics at
the time were epistemological problems: how could mathematicians ensure the
impeccable and unassailable validity of the propositions of higher mathematics
where ‘perceptual intuition’ seemingly had lost traction (‘transfinite numbers’
etc.)? What united the two camps was the belief that mathematics was in need of
epistemological underpinnings and that recent extensions to mathematics made
the received epistemological foundations questionable, as they no longer ensured
‘self-evident’ truths. And both programs shared the belief that if it could be shown
that the whole of mathematics could be derived from a few ‘self-evident’ axioms
and rules of inference, in a finite sequence of steps, then the state of impeccable
certitude could be regained.
Wittgenstein played a key role in the development of mathematical logic and in
the development of the philosophy of mathematics,111 but did not share the sense
of crisis. There was, in his view, nothing in the development of mathematics that
should cause angst among mathematicians. In the midst of the intellectual turmoil
he commented coolly: ‘I have the impression that the whole question has been put
wrongly. I would like to ask, Is it even possible for mathematics to be inconsistent?’ (Wittgenstein, 1929-32, p. 119). Observing the acute fear of deep-seated
inconsistencies and antinomies, he commented that a contradiction ‘is only a contradiction if it is there’: ‘People have the notion that a contradiction that nobody
has seen might be hidden in the axioms from the beginning, like tuberculosis. You
do not have the faintest idea, and then some day or other the hidden contradiction
might break out and then disaster would be upon us’ (ibid., p. 120). Russell’s antinomies (e.g., his infamous paradox) had of course caused anxiety and wrought
havoc, but in Wittgenstein’s view these antinomies ‘have nothing whatsoever to
do with the consistency of mathematics; there is no connection here at all. For the
antinomies did not arise in the calculus but in our ordinary language, precisely
because we use words ambiguously’ (ibid., p. 121). On Wittgenstein’s diagnosis,
the problem was not a problem within mathematics but was rooted in deep confusion in the way mathematicians and philosophers alike interpreted mathematics: ‘If I am unclear about the nature of mathematics, no proof can help me. And if I am clear about the nature of mathematics, then the question about its consistency cannot arise at all’ (ibid.). The source of the confusion, Wittgenstein found, was the view, shared by both logicists and formalists, that mathematics is an investigation that, on a par with physics, investigates a mathematical reality that exists independently of mathematical practice and which serves to adjudicate the correctness of that practice and language. A major part of his critical examination of the philosophy of mathematics was therefore devoted to exposing these dearly held epistemological assumptions (Shanker, 1986b, 1987a; Gerrard, 1991).

111 In fact, the philosophy of mathematics was Wittgenstein’s central concern from the Tractatus until about 1945 (when he switched his attention to another muddle: the philosophy of psychology). This is evidenced by the records of his discussions with members of the Vienna Circle (1929-32), the manuscripts published as Philosophical Remarks (1930) and Philosophical Grammar (1931-34), his lectures at Cambridge (1932-33, 1939), his Remarks on the Foundations of Mathematics (1937-44), and of course the Philosophical Investigations (1945-46). Indeed, it was his intention that the second volume of Philosophical Investigations should be based on the later manuscripts on mathematics.
The overriding aim of his argument with the philosophy of mathematics was to show that mathematical propositions are not empirical propositions about pre-existing objects but, rather, that ‘mathematics forms a network of norms’ (Wittgenstein, 1937-44, VII §67). Mathematical propositions do not appear ‘hard’, ‘inexorable’, non-negotiable because they are crystallizations of experiential evidence, for then they would be contingent; they appear ‘hard’ because they
are normative: If one says ‘If you follow the rule, it must be like this’, then one
does not have ‘any clear concept of what experience would correspond to the opposite’, for ‘the word “must” surely expresses our inability to depart from this
concept’. The statement that ‘it must be like this’, does not mean, ‘it will be like
this’: ‘On the contrary: “it will be like this” chooses between one possibility and
another. “It must be like this” sees only one possibility’. Consequently, ‘By accepting a proposition as self evident, we also release it from all responsibility in
face of experience.’ This, Wittgenstein argued, is the source of the ‘hardness’ of
mathematical propositions (IV, §§ 29-31).
Mathematical propositions are rules: they specify the grammar of number words. Still, mathematical propositions are rules of a special kind, namely rules whose certainty is established by ‘proofs’. There is nothing otherworldly about the proof and its objectivity: the proof is not — like a physical law — based on experiential evidence that can be produced to justify it; it is we who accept the rule and accept to be guided by the rule. The mathematical proposition can be said to be
true in much the same way that one can ask: ‘Is it true that there are 29 days in
February this year?’. That is, its truth is grammatical, it refers to the actuality of a
conventionally constituted fact. But the rule expressed by a mathematical proposition is not the result of an arbitrary choice; the proof ‘confers certainty’ upon the
proposition by ‘incorporating it — by a network of grammatical propositions —
into the body of mathematics’ and thereby making it incontestable (Hacker, 1989,
p. 333). The proof establishes a connection between mathematical concepts and
provides a concept of the connections. The proof ‘introduces a new concept’; ‘the
proof changes the grammar of our language, changes our concepts. It makes new
connexions, and it creates the concept of these connexions. (It does not establish
that they are there; they do not exist until it makes them.)’ (Wittgenstein, 1937-44,
III §31).
In accordance with this view of mathematics — as a network of concepts —
Wittgenstein emphasized that mathematical calculi could not be reduced to one
formalism: ‘After all, the natural numbers are not identical with the positive integers, as though one could speak of plus two soldiers in the same way that one
speaks of two soldiers; no, we are here confronted with something new’
(Wittgenstein, 1929-32, p. 36). That is, the concept of the natural number 2, the concept of the integer +2, and the concept of the rational number 2/1 are not the same concepts; their grammars are distinctly different (cf. Waismann, 1930; Waismann, 1936, pp. 60 f.). One can subtract two people from the list of guests to be invited for dinner but one cannot have a negative number of guests for dinner. So, pointing to the
grammatical heterogeneity of mathematics, Wittgenstein objected vehemently
against the very idea that mathematics be subjected to forced formalization:
‘“mathematics” is not a sharply delimited concept’; rather, ‘mathematics is a motley of techniques and proofs’ (Wittgenstein, 1937-44, III §§46, 48), and the ‘invasion of mathematics by mathematical logic’ is a ‘curse’, ‘harmful’, a ‘disaster’
(Wittgenstein, 1937-44, V §§ 24, 46). Consequently, Wittgenstein saw ‘mathematical logic [as] simply part of mathematics. Russell’s calculus is not fundamental; it is just another calculus’ (Wittgenstein, 1932-33, p. 205). He similarly argued that what Hilbert thought of as a ‘metamathematics’ was just another mathematical calculus next to the others and that the attempt to reduce the ‘motley’ of
mathematics to that single calculus was harmful: ‘There is no metamathematics’
(Wittgenstein, 1931-34, p. 296).
The point of Wittgenstein’s critique of mathematical philosophy was not to
denigrate or invalidate any part of mathematics. On the contrary. It deliberately
and carefully left mathematics as it was, only — perhaps — unburdened of some
of the metaphysical excretions that had accumulated in the form of mathematicians’ prose interpretations of their achievements and problems and in the form of
philosophical ‘invasions’.
The epistemological angst that gripped mathematicians and philosophers of
mathematics a century ago has long since dissipated. The foundational effort ended ‘mid-air’ around 1940 (Davis and Hersh, 1980, p. 323). As shown by Imre
Lakatos (1967), some of the leading contemporary participants in foundational
studies, from Russell to John von Neumann, concluded that the foundational program had collapsed. For instance, in 1947 von Neumann concluded that ‘Hilbert’s
program is essentially dead’ and that ‘it is hardly possible to believe in the existence of an absolute immutable concept of mathematical rigour dissociated from
all human experience’ (von Neumann, 1947). It would of course be false to interpret this as an indication that Wittgenstein’s critique of the foundational program
has been accepted, for that is surely not the general situation.112 Other developments have played a larger part.
112 In fact, Wittgenstein’s manuscripts on mathematics were met with sometimes excited opposition, at first
at least (key texts from this debate can be found in Shanker, 1986a). The situation is now somewhat changed,
though (cf., e.g., Shanker, 1987a; Tait, 2005).
Writing in 1936, Friedrich Waismann, Wittgenstein’s ally in the early 1930s,
observed that, although it had ‘looked as if Hilbert’s methods of attack would lead
to the desired end’, the situation ‘changed essentially’ with the publication of Gödel’s article on ‘undecidable propositions’ (1936, pp. 100 f.). In Waismann’s interpretation, Gödel showed that ‘the consistency of a logico-mathematical system
can never be demonstrated by the methods of this system’:
‘We had previously visualized mathematics as a system all of whose propositions are necessary
consequences of a few assumptions, and in which every problem could be solved by a finite number of operations. The structure of mathematics is not properly rendered by this picture. Actually
mathematics is a collection of innumerably many coexisting systems which are mutually closed by
the rules of logic, and each of which contains problems not decidable within the system itself.’
‘Mathematics is not one system but a multitude of systems; we must, so to speak, always begin to
construct anew.’ (Waismann, 1936, pp. 102, 120).
This picture by a colleague of Wittgenstein may not quite be official doctrine, but
it is close. The notion of a universal formal language that could encompass the motley of mathematics and subject it to one uniform representational form has all but evaporated. At any rate, Waismann’s description seems an accurate description of the proverbial ‘facts on the ground’, for, as one observer has put
it, ‘Like the hordes and horses of some fabulous khan, today’s mathematicians
have ridden off in all directions at once, conquering faster than they can send
messages home.’ (Bergamini, 1963, p. 169). The ‘motley of mathematics’ has asserted itself.
There is also the possibility that mathematicians have simply learned to live
without Russell’s high-strung faith in ‘the realm of absolute necessity’ and have
learned to subsist and get on with their business without being fortified by Hilbert’s conviction of the guaranteed solvability of mathematical problems. And in
this mathematicians may have, at least indirectly, been influenced by Wittgenstein’s persistent attempts to develop a cure against epistemological hypochondria.
Anyway, what first of all killed off the foundational program was probably the
mundane fact that, as pointed out by Davis and Hersh in their very insightful and
balanced account of The Mathematical Experience, the entire program was ‘not
compatible with the mode of thought of working mathematicians’, for ‘From the
viewpoint of the producer, the axiomatic presentation is secondary. It is only a
refinement that is provided after the primary work, the process of mathematical
discovery’ (Davis and Hersh, 1980, p. 343).
No surprise then that mathematicians are now rather pragmatic about the epistemological issues that previously bewitched the foundationalists. In the words of
the notable mathematician Paul Cohen:
‘The Realist [i.e., Platonist] position is probably the one which most mathematicians would prefer
to take. It is not until he becomes aware of some of the difficulties in set theory that he would even
begin to question it. If these difficulties particularly upset him, he will rush to the shelter of [Hilbert’s] Formalism while his normal position will be somewhere between the two, trying to enjoy
the best of two worlds.’ (P. J. Cohen, 1967, p. 11)
Citing this, Davis and Hersh add — rather irreverently — that ‘Most writers on
the subject seem to agree that the typical working mathematician is a Platonist on
weekdays and a formalist on Sundays. […] The typical mathematician is both a
Platonist and a formalist — a secret Platonist with a formalist mask that he puts
on when the occasion calls for it’ (Davis and Hersh, 1980, pp. 321 f.). That is,
when engaged in doing mathematics mathematicians conceive of themselves as
engaged in discovering existent objects and relations. This is, in Schutzian terms,
the ‘natural attitude’ of mathematicians: the necessary assumptions of their daily
work, the conceptual horizon that is, and has to be, taken for granted to get on
with business. But when pressed to argue the status of these elusive objects and
relations prior to their discovery, mathematicians will adopt the official policy of
formalism: that all they do is really to manipulate (arcane) symbols.
The sound and fury of the ‘foundations crisis’ was not in vain, however; not at
all. Sure, as far as regaining the former serene certitude of mathematics is concerned, the enormous effort Russell and Whitehead and many others put into the
foundational program came to naught, but unlike Babbage’s misguided attempts
at ‘unerringly certain’ calculation ‘by steam’, these efforts, while also misguided
on Wittgenstein’s view, were not wasted. In their failed attack on the ‘foundations
crisis’ the techniques of mathematical logic were developed immensely compared
to what had been achieved by Frege and Peano before them, and in doing so they
provided the means for the formalization of algorithmics. The notion of an algorithm had, of course, been known and applied for ages (cf., e.g., Chabert, 1999);
but what the foundations program accomplished was to provide algorithmics with
the requisite techniques and thus make ‘algorithm’ a rigorous concept.
The story is of interest to us because the foundationalists, in the course of trying to overcome the ‘foundations crisis’, developed what — incidentally — became essential parts of the conceptual foundation of computing technology. As
Russell’s biographer, Ray Monk, puts it:
‘in the process (and this is perhaps where the lastingly important aspect of the work lies), [Russell
and Whitehead] had given an enormous boost to the development of mathematical logic itself,
inventing techniques and suggesting lines of thought that would provide the inspiration for subsequent mathematical logicians, such as Alan Turing and John von Neumann, whose work, in
providing the theoretical basis for the theory of computing, has changed our lives.’ (Monk, 1996,
p. 195).
When Monk uses a phrase like ‘the theoretical basis for the theory of computing’,
a note of caution is required, however. Computing is not a sharply defined technology but rather a family of technologies; nor are computing technologies developed on a clearly delimited theoretical foundation. Like mathematics, computing is a motley of techniques and concepts that defies overarching formalization. However, the techniques and concepts of mathematical logic have been essential to the development of computing technologies. The very concept of the computational artifact originated in these ‘metamathematical’ efforts. The metaphysical excesses of ‘metamathematics’ stick to the philosophy of computing. This also
means that Wittgenstein’s critique of the way in which this work was understood
by those involved is of the highest relevance for the way in which we in CSCW
and related fields conceive of computational artifacts and investigate their use in
cooperative work practices.
1.2. Turing’s ambiguous machine
Turing’s famous article, ‘On computable numbers…’, from 1936 was designed as
an attempt to investigate a central problem in Hilbert’s program that had not been
addressed by Gödel, namely the problem of decidability. And as Gödel had done with respect to the question of completeness and consistency, Turing demonstrated that no consistent formal system of arithmetic is decidable.
Turing began the article by considering the work of a human computer, ‘a man
in the process of computing’. Like de Prony and many others before him, he did
so in a language that is deliberately mechanistic:
‘We may compare a man in the process of computing a real number to a machine which is only
capable of a finite number of conditions q1, q2, …, qR which will be called “m-configurations”. The
machine is supplied with a tape […] running through it, and divided into sections (called
“squares”) each capable of bearing a “symbol”. At any given moment there is just one square […]
which is “in the machine”. We may call this square the “scanned square”. The symbol on the
scanned square may be called the “scanned symbol”. The “scanned symbol” is the only one of
which the machine is, so to speak, “directly aware”’ (Turing, 1936, p. 231).
‘We may compare a man […] to a machine’. At this stage, Turing was explicitly
engaged in a comparison. The mechanistic language was obviously used with
some caution, as the use of citation marks and expressions such as ‘so to speak’
suggest.
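Turing’s picture, finitely many ‘m-configurations’, a tape divided into ‘squares’, and a single ‘scanned square’ at a time, translates directly into a few lines of code. The sketch below is a modern paraphrase rather than Turing’s notation, and the example transition table (a unary successor machine) is an invented illustration:

```python
# A minimal sketch of Turing's machine: finitely many 'm-configurations',
# a tape of 'squares', and one 'scanned square' at a time. The transition
# table below (a unary successor machine) is an illustrative example,
# not one of Turing's own.

from collections import defaultdict

def run(table, tape, state="q1", halt="halt", steps=1000):
    tape = defaultdict(lambda: " ", enumerate(tape))  # blank squares = " "
    pos = 0
    for _ in range(steps):
        if state == halt:
            break
        # The scanned symbol is 'the only one of which the machine is,
        # so to speak, "directly aware"'.
        scanned = tape[pos]
        symbol, move, state = table[(state, scanned)]
        tape[pos] = symbol
        pos += {"R": 1, "L": -1, "N": 0}[move]
    return "".join(tape[i] for i in sorted(tape)).strip()

# Example: append one '1' to a block of 1s (unary n -> n+1).
succ = {
    ("q1", "1"): ("1", "R", "q1"),   # scan right across the block
    ("q1", " "): ("1", "N", "halt"), # first blank: write '1' and halt
}

print(run(succ, "111"))  # a block of three 1s becomes four
```

Each step reads only the scanned symbol, and the pair (state, scanned symbol) determines everything that follows — precisely the finite, local determination Turing’s comparison turns on.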
After a lengthy argument describing the technical operations performed by the
‘man in the process of computing a real number’ in these mechanistic terms, Turing returned to his analysis of the work of the human computer, but now in less
cautious mechanistic language:
‘The behaviour of the computer at any moment is determined by the symbols which he is observing, and his “state of mind” at that moment. […] Let us imagine the operations performed by the
computer to be split up into “simple operations” which are so elementary that it is not easy to imagine them further divided. Every such operation consists of some change of the physical system
consisting of the computer and his tape. We know the state of the system if we know the sequence
of symbols on the tape, which of these are observed by the computer […], and the state of mind of
the computer. We may suppose that in a simple operation not more than one symbol is altered.
Any other changes can be split up into simple changes of this kind. The situation in regard to the
squares whose symbols may be altered in this way is the same as in regard to the observed squares.
We may, therefore, without loss of generality, assume that the squares whose symbols are changed
are always “observed” squares.’ (Turing, 1936, p. 250)
Turing then summarized the results of his argument so far:
‘The simple operations must therefore include:
(a) Changes of the symbol on one of the observed squares.
(b) Changes of one of the squares observed to another square within L squares of one of the previously observed squares.
The operation actually performed is determined […] by the state of mind of the computer and the
observed symbols. In particular, they determine the state of mind of the computer after the operation is carried out.’ (Turing, 1936, p. 251).
Having talked, so far, of a person involved in some kind of computing, Turing
now — and rather inconspicuously — slipped into describing ‘a machine to do the
work of this computer’ in exactly the same language that was earlier used to characterize the operations of the human computer: ‘We may now construct a machine
to do the work of this computer. To each state of mind of the computer corresponds an “m-configuration” of the machine.’ (Turing, 1936, p. 251).
It was no longer simply a comparison. Turing now, at the end of his discussion
and without further argument, conceptually equated the human computer’s ‘state
of mind’ to an ‘m-configuration’, that is, to what today is called a program. Later
he would do that without any of the caution he displayed at the beginning of his
famous paper. For example, in the lecture he gave to the London Mathematical
Society in 1947, the ‘rules of thumb’ of the human computer and the ‘machine
process’ are without further ado referred to as ‘synonymous’: ‘One of my conclusions was that the idea of a “rule of thumb” process and a “machine process” were
synonymous’ (Turing, 1947, p. 378).
What Turing implied in 1936 but made explicit in 1947 was the ‘mechanist thesis’ that underlies the cognitivist concept of rational behavior: that the execution of ‘plans’, ‘schemes’, etc. in human practices and their execution in machines are categorially identical.
1.3. Conceptions of the ‘mechanical’
In obvious reference to Turing’s concept of mechanical calculation, Wittgenstein
(in a manuscript written 1942-44) raised the startling question: ‘Does a calculating
machine calculate?’, and to answer that he introduced a (striking, if hastily
sketched) thought experiment:
‘Does a calculating machine calculate? Imagine that a calculating machine had come into existence by accident; now someone accidentally presses its knobs (or an animal walks over it) and it
calculates the product 25 × 20.
I want to say: it is essential to mathematics that its signs are also employed in mufti.
It is the use outside mathematics, and so the meaning of the signs, that makes the sign-game into
mathematics.’ (Wittgenstein, 1937-44, V §2)
That is, for a calculating machine to be said to calculate it has to be part of a practice in which techniques of calculation are mastered and routinely performed, that
is, ‘employed in mufti’, applied to do ordinary work of that sort.
Let me elaborate his scenario a little. Imagine some ecologist doing field work
somewhere in the jungles of Central Africa. Let us say that he is collecting data
on the local ecological system and that he at some point is doing some statistical
calculation using one of the $100 computers designed by the MIT Media Lab.
Now, for some reason — perhaps to take a nap — he leaves abruptly, leaving
the computer unattended until its battery runs out of power. Shortly afterwards a
group of bonobo chimpanzees comes along and finds it. Out of idle curiosity one
of them turns the handle, winding up the device that is then suddenly re-activated
and continues the operations it was performing before its battery ran out of power.
Is the computer still doing calculations? The computer is still the same thing and
it still behaves in the usual highly regular manner, but does it still serve as a computer?
What Wittgenstein was suggesting with his scenario is that the concept of calculation belongs to the normative category; it implies that whoever does the calculation understands the rules of the calculus in question; that is, that the calculator
has the ability to apply the rules and can justify the procedure and the result with
reference to the rules.
However, it is not as clear-cut as this, for we also have the concept of somebody doing calculations ‘mechanically’. Thus, in the following section, Wittgenstein introduced another, quite different, scenario involving a human computer:
‘But is it not true that someone with no idea of the meaning of Russell’s symbols could work over
Russell’s proofs? And so could in an important sense test whether they were right or wrong?
A human calculating machine might be trained so that when the rules of inference were shewn it
and perhaps exemplified, it read through the proofs of a mathematical system (say that of Russell),
and nodded its head after every correctly drawn conclusion, but shook its head at a mistake and
stopped calculating. One could imagine this creature as otherwise perfectly imbecile.’
(Wittgenstein, 1937-44, V §3)
The question implicitly suggested here is whether the human computer, this ‘otherwise perfectly imbecile’ creature, can in fact be said to be making logical inferences even when the ‘human calculating machine’ has no idea of the meaning of Russell’s system.
The case of the misplaced calculator and that of the ‘perfectly imbecile’ ‘human calculating machine’ point to a serious muddle in our notions of calculation,
computing, and mechanical procedures. Let us try to sort out some of the pertinent
distinctions:
(i) Ordinary (competent) calculation: A human computer has learned the rules
of a certain calculus and masters the concomitant methods and techniques. Having
learned to follow the rules, the human computer is able to apply the rules appropriately, explain and justify his or her moves with reference to the rules, etc.
(ii) Routine calculation: Again, a human computer has learned the rules of a
certain calculus and masters the pertinent methods and techniques. However,
since the human computer has performed this kind of calculation often, he or she
proceeds confidently, unhesitatingly, and without forethought. For that reason this
is often referred to as ‘mechanical calculation’ and can of course be said to be
‘mechanical’ in as much as the human computer proceeds steadily and effortlessly, but, as Wittgenstein points out, it is not performed mindlessly:
‘One follows the rule mechanically. Hence one compares it with a mechanism.
“Mechanical”? That means: without thinking. But entirely without thinking? Without reflecting.’
(Wittgenstein, 1937-44, VII §60)
The human computer masters the rules but does not necessarily have them present to mind and may need a minute or two to recall them if asked for an explanation,
for example. But still, we would find it very strange indeed if the human computer
would not be able to recognize a fault if one was found and pointed out to him.
(iii) Distributed routine calculation: As exemplified by the form of cooperative calculation work that de Prony devised on the model of manufactures, this is the prototypical case
of human computing. For a certain task, an algorithm has been specified, and the
constituent partial or ‘atomic’ operations have been ‘divided’ and allocated among
multiple ‘human computers’, in a carefully calculated proportion and sequence.
Although the cooperative effort as a whole may be organized according to the
rules of a sophisticated algorithm, the partial operations of the individual human
calculator may be extremely rudimentary and, as in de Prony’s example, reduced
to filling in a predesigned spreadsheet by operations of addition and subtraction.
The work of the human computers may also in this case be characterized as
‘mechanical’; but again, the human computers do not perform their operations
thoughtlessly but rather ‘without reflection’, as a matter of course, with a confidence rooted in routine. What is specific is that they, while being able to explain
and justify (by invoking the rules) what they do individually, in their ‘atomic’ operations, nevertheless typically will be unable to account for the rules of the algorithm governing the cooperative effort in its totality.
This is then often turned into a claim that a multitude of ‘meaningless’ operations may, on an aggregate level, appear as complex calculation. But in making such an inference one forgets the algorithm carefully designed by de Prony and his fellow mathematicians. Anyway, there was nothing ‘meaningless’ about the work of the hairdressers: they were conscientiously following the rules of addition and subtraction as well as the protocol embedded in the prefabricated spreadsheet: ‘it
makes no sense to speak of a “meaningless sub-rule”, simply because it is unintelligible to speak of following a rule — no matter how simple or complex — without understanding that rule’ (Shanker, 1987c, p. 89).
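In de Prony’s case the carefully designed algorithm was the method of finite differences: once the mathematicians had fixed a polynomial approximation and computed its leading differences, every further table value required nothing but the additions parcelled out on the prefabricated sheets. A minimal sketch (the polynomial here is an arbitrary illustration, not one of de Prony’s actual interpolations):

```python
# The method of differences behind de Prony's scheme, in miniature.
# The mathematicians compute a polynomial approximation and its initial
# differences; tabulating further values then needs only addition, the
# rote work parcelled out to the 'human computers'. The polynomial
# f(n) = n**2 + n + 41 is an arbitrary illustration.

def initial_differences(f, order):
    """Leading entries of the difference table: f(0), the first
    difference at 0, the second difference at 0, and so on."""
    row = [f(n) for n in range(order + 1)]
    diffs = []
    while row:
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    return diffs

def tabulate(diffs, count):
    """Extend the table by `count` values using additions alone."""
    diffs = list(diffs)
    table = []
    for _ in range(count):
        table.append(diffs[0])
        # One 'spreadsheet' step: add each difference into the one above.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

d = initial_differences(lambda n: n * n + n + 41, order=2)
print(tabulate(d, 5))  # matches f(0)..f(4): 41, 43, 47, 53, 61
```

Each pass of the inner loop corresponds to one cell of the spreadsheet: an addition any of the human computers could perform without any view of the algorithm as a whole.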
One can compare this with someone who has learned to move chess pieces according to the rules governing their movement (pawns move forward, bishops move
diagonally, etc.) but, perversely, has never been taught that it is a game and that it
is about winning by trapping the opponent’s king. The imbecile player, or rather,
chess piece mover, is able to move the pieces about in accordance with the rules
but cannot be said to play chess: ‘the game, I should like to say, does not just have
rules; it has a point’ (Wittgenstein, 1937-44, I §20, cf. also III §85). Similarly, the
hairdressers cannot be said to be calculating trigonometric tables, but one could certainly say that the entire ensemble, that is, the 80 hairdressers in their specific formation together with de Prony and his assistants, was calculating trigonometric tables.
(iv) The machine-as-symbol: Mathematicians conceive of what they do in
terms of the ‘necessity’ and ‘inexorability’ of mathematical propositions once
‘proved’, and this notion of ‘necessary’ and ‘inexorable’ inferences is often expressed in metaphors such as ‘machinery’ and ‘mechanical’. There is an age-old tradition of using such metaphors to express the sense that the result of a calculation is somehow already there even before the calculation started. Wittgenstein’s
discussion of this is instructive (Wittgenstein, 1939, pp. 196-199). Speaking
‘against the idea of a “logical machinery”’, he is arguing that ‘there is no such
thing’. He gives as an example the use of an imaginary mechanism such as an idealized crankshaft in a geometrical proof:
Figure 1. Wittgenstein’s example of a ‘kinematic’ proof (Wittgenstein, 1939, p. 195).
In this kind of proof one works out how the piston will move if the crankshaft is
moved in a particular way, and in doing so, ‘One always assumes that the parts
are perfectly rigid. — Now what is this? You might say, “What a queer assumption, since nothing is perfectly rigid.” What is the criterion for rigidity? What do
we assume when we assume the parts are rigid?’, Wittgenstein asks and goes on:
‘In kinematics we talk of a connecting rod—not meaning a rod made of steel or brass or what-not.
We use the word “connecting rod” in ordinary life, but in kinematics we use it in quite a different
way, although we say roughly the same things about it as we say about the real rod: that it goes
forward and back, rotates, etc. But then the real rod contracts and expands, we say. What are we to
say of this rod: does it contract and expand?—And so we say it can’t. But the truth is that there is
no question of it contracting or expanding. It is a picture of a connecting rod, a symbol used in this
symbolism for a connecting rod. And in this symbolism there is nothing which corresponds to a
contraction or expansion of the connecting rod.’
The machine is not a machine but a symbolic machine that stands as a symbol for
a certain mathematical operation:
‘If we talk of a logical machinery, we are using the idea of a machinery to explain a certain thing
happening in time. When we think of a logical machinery explaining logical necessity, then we
have a peculiar idea of the parts of the logical machinery—an idea which makes logical necessity
much more necessary than other kinds of necessity. If we were comparing the logical machinery
with the machinery of a watch, one might say that the logical machinery is made of parts which
cannot be bent. They are made of infinitely hard material—and so one gets an infinitely hard necessity.’ (Wittgenstein, 1939, p. 196)
When mathematicians use terms such as ‘machine’, ‘mechanism’, and ‘mechanical’, they are talking about a ‘picture’, a ‘symbol’: ‘if I say that there is no
such thing as the super-rigidity of logic, the real point is to explain where this idea
of super-rigidity comes from—to show that the idea of super-rigidity does not
come from the same source which the idea of rigidity comes from’. ‘It seems as if
we had got hold of a hardness which we have never experienced’. Like the idea of
‘the inexorability or absolute hardness of logic’ the idea of ‘super-hardness’ or
‘super-rigidity’ of the ‘machine-as-symbol’ is a strangely inverted expression of
the normative nature of mathematics: the parts of the machine are super-hard because they are not part of any material machine but are rules we do not question simply because they are rules and are very carefully connected to and integrated in the grammar of our number words:
‘“It isn’t a machine which might be explored with unexpected results, a machine which might
achieve something that couldn’t be read off from it. That is, the way it works is logical, it’s quite
different from the way a machine works. Qua thought, it contains nothing more than was put into
it. As a machine functioning causally, it might be believed capable of anything; but in logic we get
out of it only what we meant by it.”’ (Wittgenstein, 1931-34, p. 247).
In sum, Turing’s machine anno 1936 is not a machine but a symbolic machine,
that is, a calculus. ‘If calculating looks to us like the action of a machine, it is the
human being doing the calculation that is the machine’ (Wittgenstein, 1937-44, IV
§20). We accept the rules as incontestable and let our calculations be guided by
the rules: by doing so, we rely on the rules to lead us to correct results, just as we rely on a ruler to guide us in drawing a straight line. This is what is meant in mathematical logic by ‘mechanical procedure’.
(v) Machine calculation: Now, here ‘machine’ (and ‘mechanical’, etc.) means
something categorially different from the mathematician’s ‘machine-as-symbol’,
namely, the use of a causal process (classical mechanics, hydraulics, electromechanical devices, analog electronics, digital electronics, software machines) that has been carefully devised to operate in an extremely regular manner (behaving uniformly, dependably) and thus to transform specific input into specific output according to some pattern with an extremely high degree of probability. Although an electronic computer operates in a highly regular manner, it is unintelligible to say that it follows
rules when operating, when, for example, performing calculations; it is a transgression of the logical boundaries demarcating grammatically distinct categories
to ascribe rules and rule-following to a machine: in short, it is a transgression of
sense.
What does make sense is to say that we make the machine behave as if it follows rules and performs calculations, in the same way (only much faster) as we
might use a mechanical calculator to do the same. Or better, it is we who follow
rules by, in part, using mechanical calculators or digital computers. And it is we
who use the machine to produce certain transformations of physical patterns that,
within a certain domain of practice, can be taken as calculations, representations,
letters, plans, etc. and are routinely taken as such. A computational artifact is a
computational artifact only within a practice of ‘computation’ that, like other
forms of rule-following, is a normative activity, whereas the behavior of a computational artifact is the manifestation of causal processes. Now, computation by
means of electronic computers is certainly different from computation by means
of paper and pen, by means of slide rules or abacuses, and so on. Computational artifacts are machines: they operate more or less automatically. That is to
say, they undergo highly dependable, causally regulated state changes without
human intervention (at least for a period of time). But computational artifacts are
still a technique of computation by virtue of being used by practitioners in their
normatively defined practices.
The concept of ‘mechanical’ performance is obviously not a sharply defined
concept. It is a concept developed over centuries by millwrights, blacksmiths, machinists, engineers, and — lately — mathematicians and computer scientists.113 In
short, it is a concept with a history, and a rather mixed one at that. So it may be
warranted to call it a family-resemblance concept. But if so, it is a family-resemblance concept deeply divided by a categorial split, encompassing concepts
in disparate grammatical domains. It is noteworthy, however, that our language
has a large number of words that we use to make far more fine-grained distinctions, if not always very sharply or consistently: rule formulation in the form of
prose, lists, structured English, blueprints, etc.; rule codification, i.e., rules arranged according to a specific scheme; rule formalization, i.e., rules encoded in a
notation (a finite set of symbols); software coding, i.e., the formalized rules expressed in the calculus of a programming language; and implementation, i.e., incorporation of the code on a specific platform, taking into account the specific instruction set, word length, memory constraints, etc. That is, it is not that practitioners fail to make these distinctions; rather, practitioners of different ilks make the distinctions differently. This is as it always is, and it normally causes no harm, as these different uses typically co-exist without intersecting. The problem arises when practitioners — and philosophizing theorists — make general propositions about ‘mechanical’ with no regard for this conceptual multiplicity.
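The distinctions just listed can be illustrated with a deliberately trivial example. The rule, the notation, and the function below are all invented for the illustration; the point is only to show the same rule passing from prose formulation through formalization to software coding:

```python
# The same rule at successive levels of expression (a toy illustration;
# the rule and all names are invented for the example):
#
# Rule formulation (prose): 'A case older than 30 days must be
# escalated to the supervisor.'
#
# Rule formalization (a notation): escalate(c) <- age(c) > 30
#
# Software coding: the formalized rule expressed in the calculus of a
# programming language:
def must_escalate(case_age_days: int) -> bool:
    return case_age_days > 30

# Implementation would then fix this code on a specific platform,
# with its specific instruction set, word length, memory, etc.
print(must_escalate(45))  # prints True
```
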
In much of the literature on the theory of computation, even among the most
notable scholars, these different uses of the notion of ‘mechanical’ are thrown together in the most primitive manner. Take, for example, Hao Wang’s discussion
of Turing’s contribution to our understanding of algorithms and mechanical calculation. He started by defining the concept of algorithm as ‘a finite set of rules
which tell us, from moment to moment, precisely what to do with regard to a given class of problems’, and then, to illustrate the concept of algorithm, introduced
the ubiquitous schoolboy as an example of a human calculator: ‘a schoolboy can
learn the Euclidian algorithm correctly without knowing why it gives the desired
result’ (Wang, 1974, p. 90). This argument is a central tenet in the philosophy of
computing — but also deeply confused. There is, in Shanker’s words, ‘something
profoundly out of focus’ in this last sentence: ‘the thoughts which it contains are
pulling in opposite directions, crediting the schoolboy with the very piece of
knowledge which is immediately denied him’ (Shanker, 1986c, p. 22). Shanker
then elaborates his objections:
‘to learn to apply an algorithm correctly involves more than merely producing the “right” results.
In order to say of the schoolboy that he has learnt — grasped — the rule, we will demand more
than simply the set of his results to justify such a judgment. The criteria for crediting someone
with the mastery of a rule are more complex than this; we place it against the background of his
explaining, justifying, correcting, answering certain questions, etc. We would no doubt be very
puzzled by the case of a schoolboy who could invariably give the “right” result for an algorithm and yet could provide us with absolutely no information about how or why he had derived that result.’ (Shanker, 1986c, p. 22).
113 It does not help that the terms ‘machine’ and ‘mechanical’ have been adopted by sociologists as a metaphor to characterize certain organizational forms such as, for example, the cold indifference and imperviousness of state bureaucracies: the ‘mechanical efficiency’ that ‘reduces every worker to a cog in this machine’ (Weber, 1909, p. 127).
The source of the confusion here is that Wang, like so many other philosophers
of mathematics and logic, confused the mastery of the steps of the algorithm with
mastery of the algorithm. In the words of Shanker, ‘All that we could rightly say
in such a situation is that the schoolboy has learned a series of (for him) independent rules; but to learn how to apply each of these sub-rules does not amount
to learning the algorithm’ (Shanker, 1986c, p. 22). The confusion is all the more remarkable as Wang, immediately after having first credited and then debited the
schoolboy with knowledge of the algorithm, commented that: ‘In practice, when a
man calculates, he also designs a small algorithm on the way. But to simplify matters, we may say that the designing activity is no longer a part of his calculating
activity.’ (Wang, 1974, p. 90). But in ‘simplifying’ matters this way Wang made
himself guilty of setting up a thought experiment in which an algorithm is part of
the experimental setup — and then surreptitiously removing the algorithm from
his interpretation. This is hardly ‘simplification’ but rather obfuscation. Like Turing (and most mathematical logicians in general) Wang simply took for granted
that routine (‘noncreative’) calculation is of the same category as a causal process
performed by a machine. He did so quite explicitly by saying that what makes a
process ‘mechanical’ is that it is ‘capable of being performed by a machine, or by
a man in a mechanical (noncreative) way’ (ibid., p. 91). Wang here made the same
transgression of logical boundaries as Turing did when he surreptitiously slipped
from his ‘mechanical’ account of the human computer’s work to devising ‘a machine to do the work of this computer’ in exactly the same language: ‘What this
means is that whenever the operations of a system or organism can be mapped
onto an algorithm, there is no logical obstacle to the transfer of that algorithm to
the system or organism itself’ (Shanker, 1987b, p. 39).
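Wang’s illustration, the Euclidean algorithm, is indeed expressible as a finite set of rules that tell us, from moment to moment, precisely what to do. A minimal sketch (the function name is mine); it also illustrates the point at issue: whatever executes these steps produces the ‘right’ results without thereby mastering the algorithm in the normative sense:

```python
def euclid_gcd(a: int, b: int) -> int:
    """Euclid's algorithm as a finite set of step-by-step rules."""
    while b != 0:
        # The single rule applied at every step:
        # replace the pair (a, b) by (b, a mod b).
        a, b = b, a % b
    return a

print(euclid_gcd(1071, 462))  # prints 21
```
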
Turing, Wang, and a host of other philosophers of mathematics, logic, and computing flicker between the normative concept of following a rule, with its associated notions of correctness and justification, and the causal concept of machinery and the mechanical. Although it ‘is unintelligible to speak of a machine
following a rule’ (Shanker, 1987c, p. 89), a ‘non-normative sense of following
rules’ has been introduced (Shanker, 1987b, p. 39). The upshot is a transgression
of the boundaries of sense which has created an intellectual muddle with detrimental consequences.
It is important to emphasize that their fault is not the notion that software machines can perform sophisticated operations and can do so in flexible ways hitherto unthinkable. Of course not: CSCW is predicated on those notions and the technologies that have been built on them. Nor is the issue that software machines can be integrated into (normatively defined) practices so as to automatically regulate aspects of action and interaction. Of course not: this is an observable fact. In
fact, strictly speaking, the issue is not even that computer scientists and logicians,
in the natural attitude of their own habitat, resort to language that, outside of their
domain, can be fatally inaccurate, ambiguous, misleading (‘programming language’, ‘error’, ‘rule’, ‘calculation’, etc.). It is clear that when computer scientists
are engaged in programming, discussing programs, debating the complexity of an
algorithm — then such usage is of course completely legitimate. It is simply a
kind of shorthand. However, just as the ‘talk of mathematicians becomes absurd
when they leave mathematics’ (Wittgenstein, 1932-33, p. 225), the issue arises
when computer scientists (and journalists!) bring these professional metaphors
into ordinary language. Then a bonfire of metaphysics erupts. The issue is that, having ventured outside of their respective fields and begun philosophizing about ‘a machine following a rule’, they descend into sheer unintelligible nonsense.
This is the real issue. By transgressing the logical boundaries governing the use of
concepts of disparate grammatical domains, Turing and his followers have created
a conceptual imbroglio in which investigations of computing technologies in and
for actual work practices are gravely handicapped. This concept of computational
artifact, as received from the philosophy of mathematical logic, is confused: it leads some to ascribe causal characteristics to normative actions, while others respond that actions are not causally determined and hence not rule-governed. This leaves no logical space for CSCW.
2. Room for CSCW
The point of all this is not to say that CSCW’s research program is centered on the
concept of computation, nor even that CSCW is or should be focused on dispelling the metaphysical fog in which the concepts of computing and computational
artifact are still shrouded. The point of all this is rather to clear the construction
site of the piles of debris and rubble that prevent us from proceeding and from
even seeing the challenge and identifying the possibilities and issues. For CSCW
to progress and even exist as an intellectually respectable research area requires
that fundamental concepts such as ‘computation’, ‘formalization’, ‘computational
artifact’, ‘mechanical procedure’, ‘mechanization’, ‘causal process’, ‘regularity’,
‘rule following’, etc. become surveyable, and, a fortiori, that we develop a differentiated conceptualization of the ways in which such techniques and practices
function and develop. Without clarity on these issues CSCW will remain a degenerative research program, caught up in endless conceptual confusion and unprincipled design experiments.
The concepts of computation, plans, and computational artifacts are critically important but are not focal issues for CSCW. They are inevitable issues, not as the subject matter of CSCW, but in the sense that their clarification is
the precondition for creating the ‘logical space’ in which CSCW research is intellectually legitimate.
CSCW’s research program is centered on the issues of developing computing
technologies that enable workers engaged in cooperative work to regulate their
interdependent and yet distributed activities effectively, efficiently, dependably, in a timely and unhurried manner, with dignity, etc., and, at the same time, enable them to control, specify, adapt, modify, etc. the behavior of the regulating computational artifacts.
The point of clarifying and emphasizing that ‘it is we who are executing the instructions, albeit with the aid of this sophisticated electro-mechanical tool’
(Shanker, 1987c, p. 82), is to let the buzzing fly out of the bottle — to show that
there are no conceptual obstructions (notions of rational action as ‘essentially ad
hoc’ or of ‘rule following machinery’) prohibiting CSCW from engaging in the
development of coordination technologies: the technologies of focal interest to
CSCW.
With classic machinery, be it a ‘Spinning Jenny’ or a CNC machining station
or a digital photo-editing application like Photoshop, the worker interposes, in
Marx’s words, a ‘natural process […] as a means between himself and inorganic
nature’. By contrast, with coordination technologies we are interposing a ‘natural
process’, not as a means between us and the work piece — but as a regulating
causal process into the web of our cooperative work relationships. What characterizes coordination technologies is that causal processes are specifically devised
and employed as techniques of coordinative practices. That is, with coordination
technologies, coordinative practices, the normative constructs of coordinative
conduct such as plans, schedules, procedures, schemes, etc., are enacted and imposed (to some extent and in some respects) by means of causally unfolding processes as embodied by computational artifacts. This is not a paradox or a contradiction; it is simply a tension to be explored and mastered and thereby overcome.
The fundamental problem of CSCW is defined by this preliminary tension: the
design of (causal) mechanisms for (normative) practices.
What is new is that we are beginning to construct devices that we use in our
coordinative practices — to regulate our interactions. Probably the first example
of this kind of technique is the mechanical clock in the town hall towers of medieval Europe that regulated the life of the town. In the words of Lewis Mumford:
‘the regular striking of the bells brought a new regularity into the life of the
workman and the merchant. The bells of the clock tower almost defined urban existence’ (Mumford, 1934, p. 14). The clock provided medieval townspeople with
a criterion for determining what was the correct time (e.g., ‘morning mass’, ‘bank
hour’) under given circumstances, in the form of a commonly accepted and publicly visible metric of time. And the townspeople would typically let their activities be guided by the hands of the clock, or by the sound of its bell, or they might
ignore it, if it was of no consequence or if they for some reason had other plans
for the day. Now, as mentioned earlier, the development of railroad operations
required time-critical coordination on a large scale, which in turn motivated the development and deployment of telegraph communication systems and, furthermore, the establishment of national time-measurement conventions and the ‘galvanic’ system of automatic synchronization of timekeepers. Like the town-hall
clock, the latter system, the system of automatic synchronization, brought ‘a new
regularity’ into the lives not just of townspeople within earshot but of millions, in the UK and beyond. The (normative) standard method of time measurement (based on Greenwich time) was made effective on such a large scale by
the highly regular periodic dissemination of electric pulses over the telegraph
network. It was not the causal propagation of the electric pulse that forced railroad
workers, travellers, post office clerks, watchmakers, etc. to synchronize their daily
conduct; it was these people who followed the rule of standard time measurement by taking note of, and adjusting to, the time as read from their electromagnetically regularized timekeepers.
Now, both the town hall clock and the nationally synchronized clocks of course
represented very simple coordination technologies: the clock has two states (it either sounds or it does not), and more sophisticated designs soon had the number of
strikes correspond to the time of day. The coordination technologies we now have
available are of course integrated into our work practices in significantly more
sophisticated ways. A computational calendar system may issue notifications of
up-coming tasks and appointments and may be used for calculating and identifying possible time slots for meetings under consideration, etc. Similarly, workflow
management systems are used not only for routing documents, case folders, etc., through a system of division of labor, according to case types, formal roles, schedules, sequential dependencies, etc., but may also be used for keeping track of
deadlines and for issuing reminders.
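The slot-finding function of such a calendar system can be sketched as a simple interval computation. This is an assumed toy model, not the algorithm of any actual product: merge everyone’s busy intervals and return the gaps:

```python
def free_slots(busy_by_person, day_start, day_end):
    """Merge all participants' busy intervals and return the gaps,
    i.e., the time slots in which a meeting could be scheduled."""
    busy = sorted(iv for person in busy_by_person for iv in person)
    slots, cursor = [], day_start
    for start, end in busy:
        if start > cursor:          # a gap before this busy interval
            slots.append((cursor, start))
        cursor = max(cursor, end)   # advance past the busy interval
    if cursor < day_end:
        slots.append((cursor, day_end))
    return slots

# Two workers' busy hours in a 9-17 working day (hours as integers):
print(free_slots([[(9, 10), (13, 14)], [(9, 11), (15, 16)]], 9, 17))
# prints [(11, 13), (14, 15), (16, 17)]
```
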
The point I want to make is that the problem with computational artifacts is not a principled one. Just as it is not the ruler that ‘in any strong sense’ makes me, i.e., causes me to, draw a straight line but I who exploit its material properties (stiffness, geometry) to draw what is conventionally considered a straight line, so it is we who follow rules by using computational artifacts. There is no conceptual problem in this, and the state of angst should be called off. The problem that we do have with using machines in our rule-based practices is a practical one: it is a
problem of cost, namely, the cost (effort, time, resources, risk, etc.) of construction and modification of such artifacts.
3. The practical inexorability of CSCW
With electronic stored-program computing (equipped with high-level programming languages, code libraries, editors, compilers, interpreters, and what have
you), the construction of machinery has become immensely inexpensive. The cost
of modifying such machines is negligible compared to previous technologies.
Consequently, over the last couple of decades it has become economically feasible
to construct vast machine systems, even global ones.
Already in the case of the mechanical machine systems of the 19th century,
such as printing presses and process industries, workers cooperated through causal
processes, and the coordination of their local tasks was to a large extent preordained by the structure of the machinery. But their ability to change the protocols embodied in the machine systems was practically non-existent. At best, the only means of coordination available to them during operations were monitoring of bodily conduct, shouting, and conversations (cf. the description of the hot rolling mill
in Popitz, et al., 1957).
The stored-program architecture that made computing a practically universal
control technology has reduced the cost of constructing machinery. And propelled
by steady advances in semiconductor technology since the invention of the integrated circuit and especially the microprocessor, the development of computing
technology is steadily eroding the cost of constructing and modifying software
machinery.
As a result, the very concept of ‘automation’ has been transformed. Until quite
recently the concept of automation remained identical to that developed by Ure and Marx, namely the concept of a technical system that,
with only occasional human intervention, provides virtually immutable operational continuity on a large scale, such as power looms, transfer lines, paper mills, nuclear power plants, oil refineries, etc. (cf., e.g. Diebold, 1952; Piel, et al., 1955;
Blauner, 1964; Pollock, 1964; Luke, 1972). With software machinery, this notion
of ‘wall-to-wall’ automation has been superseded by the interactive-computing
concept of fine-grained and highly reactive automatic control mechanisms (e.g.,
‘word wrap’) that can interoperate in flexible ways. Computing technologies have
provided the technical means for the integration of myriads of automatic processes. Ironically, this means that large-scale machine systems are becoming ubiquitous. Automatic control mechanisms can be deployed in domains and for activities for which classic ‘wall-to-wall’ automation was not feasible at all, simply because production conditions had been too uncertain or variable for it to have been
economically viable. With software control mechanisms, making machines interconnect and form large-scale systems is inexpensive and effortless, compared to
connecting by rods and gears, as it can be done by transmitting data or code between software machines. Thus, computing technologies, combined with network
technologies, have made the construction of vast machine systems (ERP systems, CAD/CAM systems, financial transaction systems, mobile phone systems) a
viable option.
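‘Word wrap’ is itself a convenient miniature of such a fine-grained, highly reactive control mechanism. A greedy sketch (assuming fixed-width characters and space-separated words):

```python
def word_wrap(text, width):
    """Greedy word wrap: break a text into lines no longer than
    'width' characters, splitting only at spaces."""
    lines, line = [], ""
    for word in text.split():
        if line and len(line) + 1 + len(word) > width:
            lines.append(line)  # current line is full; start a new one
            line = word
        else:
            line = f"{line} {word}" if line else word
    if line:
        lines.append(line)
    return lines

print(word_wrap("the quick brown fox jumps over the lazy dog", 15))
# prints ['the quick brown', 'fox jumps over', 'the lazy dog']
```
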
Machine systems are or are becoming a dominant technology in all domains of
work: in manufacturing and mining; in transportation by air, sea, rail, or road; in
construction; in administrative work such as accounting, trading, banking, and
insurance; in retail trade (supermarkets); in hospital work; in architectural design,
engineering design, and programming; in experimental science; in newspaper
production, radio and television program production and broadcasting, and increasingly also in movie production. Consequently, cooperative work is increasingly carried out in and through the artificial causal processes constituted by computational machine systems, and the realm of ordinary work is correspondingly characterized by increasingly widely ramified cooperative work relationships.
On the other hand, and again by virtue of the stored-program architecture,
computing technologies also provide the general basis for constructing, at another
level of abstraction, coordination technologies that facilitate fine-grained automatic control of routine interdependencies.
Not only does computer technology offer the means of control mechanisms
that are physically cleanly separated from the mechanisms of transforming, converting, and transmitting energy, and not only does computing technology offer
the means of control mechanisms to interconnect machine systems; it also, by the
same token, affords the logical and physical separation of mechanisms for the
regulation of interdependent cooperative activities from the mechanisms of controlling the various productive processes, be they processes of fabrication, transportation, etc. or processes of accounting, financial transactions, etc. That is, it
affords the emergence of coordination technologies as a distinct class of technologies, separate from those of machine systems, as it is becoming technically and
economically feasible for ordinary workers to construct and modify the control
systems that mediate and regulate their cooperative activities.
The concept of the distinct control mechanism is again crucial. In coordination
technologies such as workflow systems, scheduling systems, etc., the computational protocols regulating the coordination of interdependent activities are — can
be, or rather: should be — technically separate from the procedures of the ‘first
order’ machine systems (calculating aircraft vectors or passenger lists, controlling
production machinery, etc.). It is therefore technically feasible to modify the computational coordinative protocols at very low cost. In other words, interactive
computing technologies make it technically and economically feasible even for
ordinary workers to devise, adapt, modify the computational protocols of their
coordinative practices.
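The technical separation at issue can be sketched in a few lines: the coordinative protocol is kept as plain data, distinct from any ‘first-order’ computation, so that inspecting and modifying it is cheap. The case types, states, and roles below are invented for the illustration:

```python
# The coordinative protocol as plain data: who is to act next, and
# what they are to do, for each (case type, state) pair. Because it is
# data, not code, workers could in principle amend it directly.
protocol = {
    ("invoice", "received"):   ("clerk", "register"),
    ("invoice", "registered"): ("controller", "approve"),
    ("invoice", "approved"):   ("accountant", "pay"),
}

def route(case_type, state):
    """Look up the next role and action; None if the protocol is silent."""
    return protocol.get((case_type, state))

print(route("invoice", "registered"))  # prints ('controller', 'approve')
```

Modifying the workflow then amounts to editing a table entry, not rebuilding the ‘first-order’ machinery that calculates the invoices themselves.
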
Like the town hall clock, the computational artifact does not cause practitioners to behave in a certain way. Just as the medieval townspeople might let their activities be guided by the hands of the clock, or by the sound of its bell, workers today may let computational artifacts guide their interactions. Or, like the medieval
townspeople, workers may disregard the presumptive mechanically issued command and decide that they have reasons — valid reasons — not to act on the signals produced by the artifact.
Again, the problem is not conceptual; it is not principled; it is practical. The
challenge is to devise technologies that allow ordinary workers to construct and
modify computational artifacts for regulating their coordinative practices and to
do so with minimal effort. To do this, however, requires that we understand how
ordinary coordinative artifacts are being constructed, negotiated, adopted, amended, combined and recombined. Is there a pattern, a logic even, to the way they are
constructed, put together, combined and recombined, and so on? To address those
questions systematically requires that we overcome our intellectual paralysis.
One may think of CSCW as a fad; it certainly has had its share of folly. And
one may find that it has had its time. But the research problem that prompted the formation of CSCW 25 years ago was not an arbitrary construction, incidentally
devised; it is a problem that emerges again and again in the practical development
and application of software machinery in cooperative work settings, in the design
of workflow systems and ‘business-process execution languages’, in building
computational taxonomies or ‘ontologies’ for scientific work, in building models of product families, etc. It arises from the issues involved in interposing causal
processes as regulatory mechanisms in and for cooperative work relations. That is,
the CSCW research program arises from the organization of modern work, from
the incessant attempts to make the coordination of interdependent activities more
efficient, reliable, flexible, etc. by means of computational artifacts.
References
Abbate, Janet Ellen: Inventing the Internet, The MIT Press, Cambridge, Mass., and London, 1999.
(Paperback ed., 2000).
Abbott, Kenneth R.; and Sunil K. Sarin: ‘Experiences with workflow management: Issues for the
next generation’, in J. B. Smith; F. D. Smith; and T. W. Malone (eds.): CSCW’94: Proceedings
of the Conference on Computer-Supported Cooperative Work, 24-26 October 1994, Chapel
Hill, North Carolina, ACM Press, New York, 1994, pp. 113-120.
Académie Royale des Sciences: Histoire de l'Académie royale des sciences, M.DC.XCIX [1699],
avec les mémoires de mathématique & de physique, tirez des registres de cette Académie, vol.
1, Boudot, Paris, 1702. – Gabriel Martin etc., Paris, 1732.
Académie Royale des Sciences: Histoire de l'Académie royale des sciences depuis son
etablissement en 1666 jusqu’en 1699, vol. 1-13, Compagnie des libraires, Paris, 1729-34. –
Gabriel Martin, etc., Paris, 1732.
Académie Royale des Sciences: L'art du meunier, du boulanger, du vermicellier. Text ed. by J.-E.
Bertrand. Société typographique. Neuchatel, 1761. (Descriptions des arts et métiers, faites ou
approuvées par Messieurs de l'Académie royale des sciences de Paris, vol. 1). (2nd ed., 1771).
Académie Royale des Sciences: Descriptions des arts et métiers, faites ou approuvées par
Messieurs de l'Académie royale des sciences de Paris, vol. 1-20, Société typographique,
Neuchatel, 1771-83. Text ed. by J.-E. Bertrand. (Nouvelle édition publiée avec des
observations, & augmentée de tout ce qui a été écrit de mieux sur ces matières, an Allemagne,
en Angleterre, en Suisse, en Italie).
Ackerman, Mark S.; and Thomas W. Malone: ‘Answer Garden: A tool for growing organizational
memory’, in: Proceedings of the Conference on Office Information Systems, April 1990,
Cambridge, Mass., ACM, New York, 1990, pp. 31-39.
Ackerman, Mark S.: ‘Augmenting the organizational memory: A field study of Answer Garden’,
in J. B. Smith; F. D. Smith; and T. W. Malone (eds.): CSCW’94: Proceedings of the
Conference on Computer-Supported Cooperative Work, 24-26 October 1994, Chapel Hill,
North Carolina, ACM Press, New York, 1994, pp. 243-252.
Agre, Philip E.: [Review of] L. A. Suchman: Plans and Situated Actions (Cambridge University
Press, Cambridge, 1987), Artificial Intelligence, vol. 43, 1990, pp. 369-384.
Agricola, Georgius: De Re Metallica, 1556. Transl. from De Re Metallica, Froben 1556. Transl.
by H. C. Hoover and L. H. Hoover. – Dover Publications, New York, 1950.
Allport, Alan: ‘Attention and control: Have we been asking the wrong questions? A critical review
of twenty-five years’, in D. E. Meyer and S. Kornblum (eds.): Attention and Performance XIV:
Synergies in Experimental Psychology, Artificial Intelligence, and Cognitive Neuroscience,
MIT Press, Cambridge, Mass., 1993, pp. 183-218.
Amber, George H.; and Paul S. Amber: Anatomy of Automation, Prentice-Hall, Englewood-Cliffs,
New Jersey, 1962.
Andersen, Hans H. K.: ‘Classification schemes: Supporting articulation work in technical
documentation’, in H. Albrechtsen (ed.): ISKO’94: Knowledge Organisation and Quality
Management, 21-24 June 1994, Copenhagen, Denmark, 1994a.
Andersen, Hans H. K.: ‘The construction note’, in K. Schmidt (ed.): Social Mechanisms of
Interaction, Computing Department, Lancaster University, Lancaster, UK, September 1994b,
pp. 161-186. COMIC Deliverable 3.2. <ftp://ftp.comp.lancs.ac.uk/pub/comic>
Andersen, Hans H. K.: Cooperative Documentation Production in Engineering Design: The
‘Mechanism of Interaction’ Perspective, PhD diss., Risø National Laboratory, Roskilde,
Denmark, June 1997.
442
Cooperative Work and Coordinative Practices
Andersen, N. E., et al.: Professional Systems Development: Experience, Ideas, and Action,
Prentice-Hall, Englewood-Cliffs, New Jersey, 1990.
Anderson, Robert J.; Wes W. Sharrock; and John A. Hughes: ‘The division of labour’, in: Action
Analysis and Conversation Analysis, Maison des Sciences de l’Homme, September 1987,
Paris, 1987.
Anderson, Robert J.: ‘Representations and requirements: The value of ethnography in system
design’, Human-Computer Interaction, vol. 9, 1994, pp. 151-182.
Anderson, Robert J.: ‘Work, ethnography, and system design’, in A. Kent and J. G. Williams
(eds.): The Encyclopedia of Microcomputing, vol. 20, Marcel Dekker, New York, 1997, pp.
159-183.
Aoki, Masahiko: A New Paradigm of Work Organization: The Japanese Experience, vol. 36,
World Institute for Development Economics Research, Helsinki, Finland, 1988.
Appelt, Wolfgang: ‘WWW based collaboration with the BSCW system’, Lecture Notes in
Computer Science, vol. 1725, 1999, pp. 66-78.
Appelt, Wolfgang: ‘What groupware functionality do users really use? Analysis of the usage of the
BSCW system’, in K. Klöckner (ed.): PDP 2001: Proceedings of the Ninth Euromicro
Workshop on Parallel and Distributed Processing, 7-9 February 2001, Mantova, Italy, IEEE
Computer Society Press, 2001, pp. 337-341.
Arden, Bruce W. (ed.): What Can be Automated? The Computer Science and Engineering
Research Study (COSERS), The MIT Press, Cambridge, Mass., and London, 1980.
Aristotle: Nicomachean Ethics (c. 334-322 BCE-a). Transl. by W. D. Ross and J. O. Urmson. In J.
Barnes (ed.): The Complete Works of Aristotle. Princeton University Press, Princeton, New
Jersey, 1984, vol. 2, pp. 1729-1867.
Aristotle: Metaphysics (c. 334-322 BCE-b). Transl. by W. D. Ross. In J. Barnes (ed.): The
Complete Works of Aristotle. Princeton University Press, Princeton, New Jersey, 1984, vol. 2,
pp. 1552-1728.
Aristotle: Politics (c. 334-322 BCE-c). Transl. by B. Jowett. In J. Barnes (ed.): The Complete
Works of Aristotle. Princeton University Press, Princeton, New Jersey, 1984, vol. 2, pp. 1986-2129.
Aspray, William: ‘The Institute for Advanced Study computer: A case study in the application of
concepts from the history of technology’, in R. Rojas and U. Hashagen (eds.): The First
Computers: History and Architectures, The MIT Press, Cambridge, Mass., and London, 2000,
pp. 179-194.
Atkins, Daniel E., et al.: Revolutionizing Science and Engineering Through Cyberinfrastructure:
Report of the National Science Foundation Blue-Ribbon Advisory Panel on
Cyberinfrastructure, National Science Foundation, Washington, D.C., January 2003.
<http://www.nsf.gov/cise/sci/reports/atkins.pdf>
Augarten, Stan: Bit by Bit: An Illustrated History of Computers, George Allen & Unwin, London,
1984.
Austin, John Langshaw: Philosophical Papers, Oxford University Press, Oxford, 1961. Text ed.
by J. O. Urmson and G. J. Warnock. (3rd ed., 1979).
Babbage, Charles: A Letter to Sir Humphry Davy, Bart., President of The Royal Society, etc. etc.
on the Application of Machinery to the Purpose of Calculating and Printing Mathematical
Tables (London, 3 July 1822). Text ed. by P. Morrison and E. Morrison. In C. Babbage: On
The Principles and Development of the Calculator, and Other Seminal Writings by Charles
Babbage and Others. Dover Publications, New York, 1961, pp. 298-305.
Babbage, Charles: On the Economy of Machinery and Manufactures, Charles Knight, London,
1832. (4th ed., 1835). – Augustus M. Kelley, Publishers, Fairfield, New Jersey, 1986.
Babbage, Charles: Passages from the Life of a Philosopher, Longman, etc., London, 1864. Text
ed. by M. Campbell-Kelly. – William Pickering, London, 1994.
Bacon, Francis: The New Organon: Or True Directions Concerning the Interpretation of Nature
(1620). Transl. from Novum Organum, Sive Vera de Interpretatione Naturae. Transl. by J.
Spedding; R. L. Ellis; and D. D. Heath. In F. Bacon: The Works. Taggard and Thompson,
Boston, 1863.
Bahrdt, Hans Paul: Industriebürokratie: Versuch einer Soziologie des Industrialisierten
Bürobetriebes und seiner Angestellten, Ferdinand Enke Verlag, Stuttgart, 1958.
Bahrdt, Hans Paul: ‘Die Krise der Hierarchie im Wandel der Kooperationsformen’, in:
Verhandlungen des 14. deutschen Soziologentages, Stuttgart 1959, Deutsche Gesellschaft für
Soziologie, 1959, pp. 113-121. – [Reprinted in Renate Mayntz (ed.): Bürokratische
Organisation, Kiepenheuer & Witsch, Köln-Berlin, 1971, pp. 127-134.].
Baker, Gordon P.; and Peter M. S. Hacker: Scepticism, Rules and Language, Basil Blackwell
Publisher, Oxford, 1984.
Baker, Gordon P.; and Peter M. S. Hacker: Wittgenstein: Rules, Grammar, and Necessity: Essays
and Exegesis of §§ 185-242, Basil Blackwell, Oxford, 2009. (G. P. Baker and P. M. S. Hacker:
An Analytical Commentary on the Philosophical Investigations, vol. 2). (2nd rev. ed.; 1st ed.,
1985).
Bannon, Liam J.: ‘Computer-Mediated Communication’, in D. A. Norman and S. W. Draper
(eds.): User Centered System Design, Lawrence Erlbaum, New Jersey, 1986, pp. 433-452.
Bannon, Liam J.; Niels Bjørn-Andersen; and Benedicte Due-Thomsen: ‘Computer Support for
Cooperative Work: An appraisal and critique’, in: Eurinfo’88: 1st European Conference on
Information Technology for Organisational Systems, 16-20 May 1988, Athens, Greece, 1988.
Bannon, Liam J.; and Kjeld Schmidt: ‘CSCW: Four characters in search of a context’, in P.
Wilson; J. M. Bowers; and S. D. Benford (eds.): ECSCW’89: Proceedings of the 1st European
Conference on Computer Supported Cooperative Work, 13-15 September 1989, Gatwick,
London, London, 1989, pp. 358-372.
Bannon, Liam J.; and Susanne Bødker: ‘Beyond the interface: Encountering artifacts in use’, in J.
M. Carroll (ed.): Designing Interaction. Psychology at the Human-Computer Interface,
Cambridge University Press, Cambridge, 1991, pp. 227-253.
Bannon, Liam J.; and Susanne Bødker: ‘Constructing common information spaces’, in J. A.
Hughes, et al. (eds.): ECSCW’97: Proceedings of the Fifth European Conference on
Computer-Supported Cooperative Work, 7–11 September 1997, Lancaster, U.K., Kluwer
Academic Publishers, Dordrecht, 1997, pp. 81-96.
Barber, Gerald R.; and Carl Hewitt: ‘Foundations for office semantics’, in N. Naffah (ed.): Office
Information Systems, INRIA/North-Holland, Amsterdam, 1982, pp. 363-382.
Barber, Gerald R.: ‘Supporting organizational problem solving with a work station’, ACM
Transactions on Office Information Systems, vol. 1, no. 1, January 1983, pp. 45-67.
Barber, Gerald R.; Peter de Jong; and Carl Hewitt: ‘Semantic support for work in organizations’,
in R. E. A. Mason (ed.): Information Processing ’83: Proceedings of the IFIP 9th World
Computer Congress, 19-23 September 1983, Paris, France, North-Holland, Amsterdam, 1983,
pp. 561-566.
Bardini, Thierry: Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal
Computing, Stanford University Press, Stanford, Calif., 2000.
Bardram, Jakob E.: ‘Plans as situated action: An activity theory approach to workflow systems’, in
J. A. Hughes, et al. (eds.): ECSCW’97: Proceedings of the Fifth European Conference on
Computer-Supported Cooperative Work, 7–11 September 1997, Lancaster, U.K., Kluwer
Academic Publishers, Dordrecht, 1997, pp. 17-32.
Barnard, Chester I.: The Functions of the Executive, Harvard University Press, Cambridge, Mass.,
1938.
Barnes, Susan B.: Computer-Mediated Communication: Human-to-Human Communication Across
the Internet, Allyn and Bacon, Boston, etc., 2003.
Basalla, George: The Evolution of Technology, Cambridge University Press, Cambridge, 1988.
Baum, Claude: The System Builders: The Story of SDC, System Development Corporation, Santa
Monica, Calif., 1981.
Beckert, Manfred: Johann Beckmann, B. G. Teubner, Leipzig, 1983.
Beckmann, Johann: Politisch-Oekonomische Bibliothek worinn von den neusten Büchern, welche
die Naturgeschichte, Naturlehre und die Land- und Stadtwirthschaft betreffen, zuverlässige und
vollständige Nachrichten ertheilt werden, Vandenhoeck und Ruprecht, Göttingen, 1772.
Beckmann, Johann: Anleitung zur Technologie, oder zur Kenntniß der Handwerke, Fabriken und
Manufacturen, vornehmlich derer, welche mit der Landwirthschaft, Polizey und
Cameralwissenschaft in nächster Verbindung stehn; nebst Beyträgen zur Kunstgeschichte,
Vandenhoeck und Ruprecht, Göttingen, 1777. (5th ed., 1802).
Bell, C. Gordon: ‘Toward a history of (personal) workstations’, in A. Goldberg (ed.): A History of
Personal Workstations, Addison-Wesley, Reading, Mass., 1988, pp. 4-47.
Benford, Steven D.: ‘A data model for the AMIGO communication environment’, in R. Speth
(ed.): EUTECO ’88: European Teleinformatics Conference on Research into Networks and
Distributed Applications, 20-22 April 1988, Vienna, Austria. Organized by the European
Action in Teleinformatics COST 11ter, North-Holland, Brussels and Luxemburg, 1988, pp. 97-109.
Benford, Steven D.; and Lennart Fahlén: ‘A spatial model of interaction in large virtual
environments’, in G. De Michelis; C. Simone; and K. Schmidt (eds.): ECSCW’93: Proceedings
of the Third European Conference on Computer-Supported Cooperative Work, 13-17
September 1993, Milano, Italy, Kluwer Academic Publishers, Dordrecht, 1993, pp. 109-124.
Benford, Steven D., et al.: ‘Managing mutual awareness in collaborative virtual environments’, in
G. Singh; S. K. Feiner; and D. Thalmann (eds.): VRST’94: Proceedings of the ACM SIGCHI
Conference on Virtual Reality and Technology, Singapore, 23-26 August 1994, ACM Press,
New York, 1994a, pp. 223-236.
Benford, Steven D.; Lennart E. Fahlén; and John M. Bowers: ‘Supporting social communication
skills in multi-actor artificial realities’, in: ICAT’94: Proceedings of Fourth International
Conference on Artificial Reality and Tele-Existence, 14-15 July 1994, Nikkei Hall, Otemachi,
Tokyo, Japan, 1994b, pp. 205-228. <ic-at/papers/list.html>
Benford, Steven D.; and Chris Greenhalgh: ‘Introducing third party objects into the spatial model
of interaction’, in J. A. Hughes, et al. (eds.): ECSCW’97: Proceedings of the Fifth European
Conference on Computer-Supported Cooperative Work, 7–11 September 1997, Lancaster,
U.K., Kluwer Academic Publishers, Dordrecht, 1997, pp. 189-204.
Bennett, Stuart: A History of Control Engineering, 1800-1930, Peter Peregrinus, London, 1979.
(Paperback ed., 1986).
Bennett, Stuart: A History of Control Engineering, 1930-1955, Peter Peregrinus, London, 1993.
Bentley, Richard; and Paul Dourish: ‘Medium versus mechanism: Supporting collaboration
through customization’, in H. Marmolin; Y. Sundblad; and K. Schmidt (eds.): ECSCW’95:
Proceedings of the Fourth European Conference on Computer-Supported Cooperative Work,
10–14 September 1995, Stockholm, Sweden, Kluwer Academic Publishers, Dordrecht, 1995,
pp. 133-148.
Bentley, Richard, et al.: ‘Basic support for cooperative work on the World Wide Web’,
International Journal of Human-Computer Studies, vol. 46, 1997a, pp. 827-846.
Bentley, Richard; Thilo Horstmann; and Jonathan Trevor: ‘The World Wide Web as enabling
technology for CSCW: The case of BSCW’, Computer Supported Cooperative Work (CSCW):
The Journal of Collaborative Computing, vol. 6, no. 2-3, 1997b, pp. 111-134.
Bergamini, David: Mathematics, Time Inc., New York, 1963.
Berners-Lee, Tim: Information management: A proposal, Technical report, CERN, Geneva, May
1990. (2nd ed.; 1st ed. March 1989). – Reprinted in Tim Berners-Lee: Weaving the Web: The
Past, Present and Future of the World Wide Web by its Inventor, London and New York, 1999.
(Paperback ed., 2000), pp. 229-251.
Bertalanffy, Ludwig von: General System Theory: Foundations, Development, Applications,
Braziller, New York, 1968. – Penguin Books, Harmondsworth, 1973.
Best, Michael H.: The New Competition: Institutions of Industrial Restructuring, Polity Press,
Cambridge, 1990.
Bewley, William L., et al.: ‘Human factors testing in the design of Xerox's 8010 “Star” office
workstation’, in R. N. Smith; R. W. Pew; and A. Janda (eds.): CHI’83: Proceedings of the
SIGCHI conference on Human Factors in Computing Systems, 12-15 December 1983, Boston,
Mass., ACM Press, New York, 1983, pp. 72-77.
Bien, Günther: ‘Das Theorie-Praxis-Problem und die politische Philosophie bei Platon und
Aristoteles’, Philosophisches Jahrbuch, vol. 76, 1968-69, pp. 264-314.
Bien, Günther: ‘Praxis (Antike)’, in J. Ritter and K. Gründer (eds.): Historisches Wörterbuch der
Philosophie, vol. 7, 1989, pp. 1277–1287.
Bignoli, Celsina; and Carla Simone: ‘AI Techniques for supporting human to human
communication in CHAOS’, in P. Wilson; J. M. Bowers; and S. D. Benford (eds.):
ECSCW’89: Proceedings of the 1st European Conference on Computer Supported Cooperative
Work, 13-15 September 1989, Gatwick, London, London, 1989, pp. 133-147.
Bittner, Egon: ‘The concept of organization’, Social Research, vol. 32, no. 3, Autumn 1965, pp.
239-255.
Bittner, Egon: ‘Objectivity and realism in sociology’, in G. Psathas (ed.): Phenomenological
Sociology: Issues and Applications, John Wiley & Sons, New York, etc., 1973, pp. 109-125.
Bjerknes, Gro; Pelle Ehn; and Morten Kyng (eds.): Computers and Democracy: A Scandinavian
Challenge, Avebury, Aldershot, 1987.
Blauner, Robert: Alienation and Freedom: The Factory Worker and His Industry, University of
Chicago Press, Chicago, 1964.
Bly, Sara; Steve R. Harrison; and Susan Irwin: ‘Media spaces: Bringing people together in a
video, audio, and computing environment’, Communications of the ACM, vol. 36, no. 1,
January 1993, pp. 28-47.
Bogia, Douglas P., et al.: ‘Supporting dynamic interdependencies among collaborative activities’,
in S. M. Kaplan (ed.): COOCS’93: Conference on Organizational Computing Systems, 1-4
November 1993, Milpitas, California, ACM Press, New York, 1993, pp. 108-118.
Bogia, Douglas P., et al.: ‘Issues in the design of collaborative systems: Lessons from
ConversationBuilder’, in D. Z. Shapiro; M. Tauber; and R. Traunmüller (eds.): The Design of
Computer Supported Cooperative Work and Groupware Systems, North-Holland Elsevier,
Amsterdam, 1996, pp. 401-422. – Originally published at the Workshop on CSCW Design,
Schärding, Austria, 1-3 June 1993.
Borges, Jorge Luis: ‘On exactitude in science’ (Los Anales de Buenos Aires, 1946). Transl. from
‘Del rigor en la ciencia’. Transl. by N. T. di Giovanni. In J. L. Borges: A Universal History of
Infamy. Allen Lane, London, 1973, p. 149.
Bourdieu, Pierre: Le sens pratique, Les Éditions de Minuit, Paris, 1980. – English translation: The
Logic of Practice, translated by Richard Nice, Polity Press, Cambridge, 1990.
Bourdieu, Pierre: The Logic of Practice, Polity Press, Cambridge, 1990. Transl. by R. Nice.
(French original, Le sens pratique, 1980).
Bowers, John M.; John Churcher; and Tim Roberts: ‘Structuring computer-mediated
communication in COSMOS’, in R. Speth (ed.): EUTECO ’88: European Teleinformatics
Conference on Research into Networks and Distributed Applications, 20-22 April 1988,
Vienna, Austria. Organized by the European Action in Teleinformatics COST 11ter, North-Holland, Brussels and Luxemburg, 1988, pp. 195-209.
Bowers, John M.; and John Churcher: ‘Local and global structuring of computer-mediated
communication’, in I. Greif and L. A. Suchman (eds.): CSCW’88: Proceedings of the
Conference on Computer-Supported Cooperative Work, 26-28 September 1988, Portland,
Oregon, ACM Press, New York, 1988, pp. 125-139.
Bowers, John M.: ‘The Janus faces of design: Some critical questions for CSCW’, in J. M. Bowers
and S. D. Benford (eds.): Studies in Computer Supported Cooperative Work: Theory, Practice
and Design, North-Holland, Amsterdam, 1991, pp. 333-350.
Bowers, John M.: ‘The politics of formalism’, in M. Lea (ed.): Contexts of Computer-Mediated
Communication, Harvester, New York, 1992, pp. 232-261.
Bowers, John M.; Graham Button; and Wes W. Sharrock: ‘Workflow from within and without:
Technology and cooperative work on the print industry shopfloor’, in H. Marmolin; Y.
Sundblad; and K. Schmidt (eds.): ECSCW’95: Proceedings of the Fourth European
Conference on Computer-Supported Cooperative Work, 10–14 September 1995, Stockholm,
Sweden, Kluwer Academic Publishers, Dordrecht, 1995, pp. 51-66.
Bowker, Geoffrey C.; and Susan Leigh Star: ‘Situations vs. standards in long-term, wide-scale
decision-making: The case of the International Classification of Diseases’, in J. F. Nunamaker,
Jr. and R. H. Sprague, Jr. (eds.): Proceedings of the Twenty-Fourth Annual Hawaii
International Conference on System Sciences, 7-11 January 1991 Kauai, Hawaii, IEEE
Computer Society Press, vol. 4, 1991, pp. 73-81.
Bowker, Geoffrey C.; and Susan Leigh Star: Sorting Things Out: Classification and Its
Consequences, MIT Press, Cambridge, Mass., 1999.
Bradley, Margaret: A Career Biography of Gaspard Clair François Marie Riche de Prony, Bridge
Builder, Educator and Scientist, The Edwin Mellen Press, Lewiston, New York, etc., 1998.
Brannigan, Augustine: The Rise and Fall of Social Psychology: The Use and Misuse of the
Experimental Method, Aldine De Gruyter, New York, 2004.
Braudel, Fernand: Civilization and Capitalism, 15th-18th Century. Volume II: The Wheels of
Commerce, Harper & Row, New York, 1982. Transl. from Les jeux de l’échange, Librairie
Armand Collin, Paris, 1979. Transl. by S. Reynolds.
Braun, Ernest; and Stuart Macdonald: Revolution in Miniature: The History and Impact of
Semiconductor Electronics, Cambridge University Press, Cambridge, 1978. (Paperback ed.,
1980; 2nd ed., 1982).
Bromley, Allan G.: ‘Difference and analytical engines’, in W. Aspray (ed.): Computing Before
Computers, Iowa State University Press, Ames, Iowa, 1990, pp. 59-98.
Brooks, Frederick P., Jr.: ‘No silver bullet: Essence and accidents of software engineering’
(Computer, 1987). In F. P. Brooks, Jr.: The Mythical Man-Month: Essays on Software
Engineering. Addison-Wesley, Reading, Mass., 1995, pp. 177-203.
Bucciarelli, Louis L.: ‘Reflective practice in engineering design’, Design Studies, vol. 5, no. 3,
July 1984, pp. 185-190.
Bucciarelli, Louis L.: ‘An ethnographic perspective on engineering design’, Design Studies, vol. 9,
no. 3, July 1988a, pp. 159-168.
Bucciarelli, Louis L.: ‘Engineering design process’, in F. A. Dubinskas (ed.): Making Time:
Ethnographies of High-Technology Organizations, Temple University Press, Philadelphia,
1988b, pp. 92-122.
Button, Graham; and Richard H. R. Harper: ‘The relevance of “work practice” for design’,
Computer Supported Cooperative Work (CSCW): An International Journal, vol. 4, no. 4, 1995,
pp. 263-280.
Cabitza, Federico; Marcello Sarini; and Carla Simone: ‘Providing awareness through situated
process maps: the hospital care case’, in T. Gross, et al. (eds.): GROUP 2007: International
Conference on Supporting Group Work, 4-7 November 2007, Sanibel Island, Florida, USA,
ACM Press, New York, 2007, pp. 41-50.
Cabitza, Federico; and Carla Simone: ‘“…and do it the usual way”: fostering awareness of work
conventions in document-mediated collaboration’, in L. J. Bannon, et al. (eds.): ECSCW 2007:
Proceedings of the Tenth European Conference on Computer Supported Cooperative Work,
24-28 September 2007, Limerick, Ireland, Springer, London, 2007, pp. 119-138.
Cabitza, Federico; Carla Simone; and Marcello Sarini: ‘Leveraging coordinative conventions to
promote collaboration awareness: The case of clinical records’, Computer Supported
Cooperative Work (CSCW): The Journal of Collaborative Computing, August 2009, pp. 301-330.
Campbell-Kelly, Martin: ‘Punched-card machinery’, in W. Aspray (ed.): Computing Before
Computers, Iowa State University Press, Ames, Iowa, 1990, pp. 122-155.
Campbell-Kelly, Martin: ‘The Railway Clearing House and Victorian data processing’, in L. Bud-Frierman (ed.): Information Acumen: The Understanding and Use of Knowledge in Modern
Business, Routledge, London and New York, 1994a, pp. 51-74.
Campbell-Kelly, Martin: ‘Introduction’. In C. Babbage: Passages from the Life of a Philosopher,
William Pickering, London, 1994b, pp. 7-36.
Campbell-Kelly, Martin; and William Aspray: Computer: A History of the Information Machine,
Basic Books, New York, 1996.
Campbell-Kelly, Martin: From Airline Reservations to Sonic the Hedgehog: A History of the Software
Industry, The MIT Press, Cambridge, Mass., and London, 2003.
Card, Stuart K.: ‘The human, the computer, the task, and their interaction: Analytic models and
use-centered design’, in D. Steier and T. M. Mitchell (eds.): Mind Matters: A Tribute to Allen
Newell, Lawrence Erlbaum, Mahwah, New Jersey, 1996, pp. 259-312.
Cardwell, Donald: The Fontana History of Technology, FontanaPress, London, 1994.
Carey, Alex: ‘The Hawthorne studies: A radical criticism’, American Sociological Review, vol. 32,
no. 3, June 1967, pp. 403-416.
Carroll, John M.: ‘Introduction: The Kittle House Manifesto’, in J. M. Carroll (ed.): Designing
Interaction: Psychology at the Human-Computer Interface, Cambridge University Press,
Cambridge, 1991, pp. 1-16.
Carroll, John M.; Wendy A. Kellogg; and Mary Beth Rosson: ‘The task-artifact cycle’, in J. M.
Carroll (ed.): Designing Interaction: Psychology at the Human-Computer Interface,
Cambridge University Press, Cambridge, 1991, pp. 74-102.
Carroll, John M.: ‘Introduction: Human-computer interaction, the past and the present’, in J. M.
Carroll (ed.): Human-Computer Interaction in the New Millennium, ACM Press, New York,
2002, pp. xxvii-xxxvii.
Carstensen, Peter H.: ‘The bug report form’, in K. Schmidt (ed.): Social Mechanisms of
Interaction, Computing Department, Lancaster University, Lancaster, UK, September 1994,
pp. 187-219. COMIC Deliverable D3.2. <ftp://ftp.comp.lancs.ac.uk/pub/comic>
Carstensen, Peter H.; Carsten Sørensen; and Henrik Borstrøm: ‘Two is fine, four is a mess:
Reducing complexity of articulation work in manufacturing’, in: COOP’95: International
Workshop on the Design of Cooperative Systems, 25–27 January 1995, Antibes-Juan-les-Pins,
France, INRIA Sophia Antipolis, France, 1995a, pp. 314-333.
Carstensen, Peter H.; Carsten Sørensen; and Tuomo Tuikka: ‘Let’s talk about bugs!’,
Scandinavian Journal of Information Systems, vol. 7, no. 1, 1995b, pp. 33-54.
Carstensen, Peter H.: Computer Supported Coordination, Risø National Laboratory, Roskilde,
Denmark, 1996. – Risø-R-890(EN).
Carstensen, Peter H.; and Carsten Sørensen: ‘From the social to the systematic: Mechanisms
supporting coordination in design’, Computer Supported Cooperative Work (CSCW): The
Journal of Collaborative Computing, vol. 5, no. 4, 1996, pp. 387-413.
Carstensen, Peter H.; Kjeld Schmidt; and Uffe Kock Wiil: ‘Supporting shop floor intelligence: A
CSCW approach to production planning and control in flexible manufacturing’, in S. C. Hayne
(ed.): GROUP’99: International Conference on Supporting Group Work, 14-17 November
1999, Phoenix, Arizona, ACM Press, New York, 1999, pp. 111-120.
Carstensen, Peter H., et al.: IT-støtte til produktionsgrupper: Debatoplæg fra IDAK-projektet, CO-Industri etc., København, June 2001. <http://cscw.dk/schmidt/papers/idak_prod.gr.pdf>
Carstensen, Peter H.; and Kjeld Schmidt: ‘Self-governing production groups: Towards
requirements for IT support’, in V. Marík; L. M. Camarinha-Matos; and H. Afsarmanesh
(eds.): Knowledge and Technology Integration in Production and Services: Balancing
Knowledge and Technology in Product and Service Life Cycle. IFIP TC5/WG5.3. Fifth IFIP
International Conference on Information Technology in Manufacturing and Services
(BASYS’02), 25-27 September 2002, Cancun, Mexico, Kluwer Academic Publishers, Boston,
Mass., 2002, pp. 49-60.
Ceruzzi, Paul E.: ‘Electronic calculators’, in W. Aspray (ed.): Computing Before Computers, Iowa
State University Press, Ames, Iowa, 1990, pp. 223-250.
Chabert, Jean-Luc (ed.): A History of Algorithms: From the Pebble to the Microchip, Springer-Verlag, Berlin, Heidelberg, New York, 1999. Transl. from Histoire d’algorithmes. Du caillou
à la puce, Paris, 1994. Transl. by C. Weeks.
Chalmers, Matthew: ‘Awareness, representation and interpretation’, Computer Supported
Cooperative Work (CSCW): The Journal of Collaborative Computing, vol. 11, no. 3-4, 2002.
Chambers, Ephraim: Cyclopædia, or, an Universal Dictionary of Arts and Sciences [etc.], J. and J.
Knapton, London, 1728. <http://digital.library.wisc.edu/1711.dl/HistSciTech.Cyclopaedia02>
Chapman, Robert L., et al.: ‘The Systems Research Laboratory's air defense experiments’,
Management Science, vol. 5, no. 3, April 1959, pp. 250-269.
Child, John: ‘Organizational design for advanced manufacturing technology’, in T. D. Wall; C. W.
Clegg; and N. J. Kemp (eds.): The Human Side of Advanced Manufacturing Technology, John
Wiley, Chichester, 1987, pp. 101-133.
Ciborra, Claudio U.: ‘Information systems and transactions architecture’, International Journal of
Policy Analysis and Information Systems, vol. 5, no. 4, 1981, pp. 305-324.
Ciborra, Claudio U.: ‘Reframing the role of computers in organizations: The transaction costs
approach’, in L. Gallegos; R. Welke; and J. Wetherbe (eds.): Proceedings of the Sixth
International Conference on Information Systems, 16-18 December 1985, Indianapolis,
Indiana, 1985, pp. 57-69.
Ciborra, Claudio U.; and Margrethe H. Olson: ‘Encountering electronic work groups: A
Transaction Costs perspective’, in I. Greif and L. A. Suchman (eds.): CSCW’88: Proceedings
of the Conference on Computer-Supported Cooperative Work, 26-28 September 1988,
Portland, Oregon, ACM Press, New York, 1988, pp. 94-101.
Cicourel, Aaron V.: ‘The integration of distributed knowledge in collaborative medical diagnosis’,
in J. Galegher; R. E. Kraut; and C. Egido (eds.): Intellectual Teamwork: Social and
Technological Foundations of Cooperative Work, Lawrence Erlbaum, Hillsdale, New Jersey,
1990, pp. 221-242.
Ciolfi, Luigina; Geraldine Fitzpatrick; and Liam J. Bannon (eds.): ‘Settings for collaboration: The
role of place’. [Special theme of] Computer Supported Cooperative Work (CSCW): The
Journal of Collaborative Computing, vol. 17, no. 2-3, April-June 2008.
Coase, Ronald H.: ‘The nature of the firm’, Economica N.S., vol. 4, no. 16, November 1937, pp.
386-405.
Cohen, Gerald Allan: Karl Marx’s Theory of History: A Defence, Clarendon Press, Oxford, 1978.
Cohen, Jack; and Ian Stewart: The Collapse of Chaos: Discovering Simplicity in a Complex
World, Penguin Books, New York, 1994.
Cohen, Paul J.: ‘Comments on the foundations of set theory’ (Proceedings of the Symposium in
Pure Mathematics of the American Mathematical Society, held at the University of California,
Los Angeles, California, July 10-August 5, 1967, 1967). In D. S. Scott (ed.): Axiomatic Set
Theory. American Mathematical Society, Providence, Rhode Island, 1971, pp. 9-15.
Cole, Arthur H.; and George B. Watts: The Handicrafts of France as Recorded in the Descriptions
des Arts et Metiérs, 1761-1788, Baker Library, Harvard Graduate School of Business
Administration, Boston, 1952.
Coleman, D. C.: ‘Proto-industrialization: A concept too many’, The Economic History Review,
New Series, vol. 36, no. 3, August 1983, pp. 435-448.
Collier, Bruce: The Little Engines that Could've: The Calculating Machines of Charles Babbage,
PhD diss., The Department of History of Science, Harvard University, Cambridge, Mass.,
August 1970. <http://robroy.dyndns.info/collier/index.html>
Committee of General Literature and Education: The Industry of Nations: A Survey of the Existing
State of Arts, Machines, and Manufactures. Part II, Society for Promoting Christian
Knowledge, London, 1855.
Commons, John R.: Institutional Economics, University of Wisconsin Press, Madison, 1934.
Condillac, Etienne Bonnot, abbé de: Traité des systêmes, où l'on en démêle les inconvénients et
les avantages (La Haye, 1749). Librairie Arthème Fayard, Paris, 1991.
Conklin, Jeff; and Michael L. Begeman: ‘gIBIS: A hypertext tool for exploratory policy
discussion’, in I. Greif and L. A. Suchman (eds.): CSCW’88: Proceedings of the Conference on
Computer-Supported Cooperative Work, 26-28 September 1988, Portland, Oregon, ACM
Press, New York, 1988, pp. 140-152.
Conklin, Jeff: ‘Design rationale and maintainability’, in B. Shriver (ed.): Proceedings of 22nd
Annual Hawaii International Conference on System Sciences, IEEE Computer Society, vol. 2,
1989, pp. 533-539.
Conklin, Jeff: ‘Capturing organizational knowledge’, in R. M. Baecker (ed.): Readings in
Groupware and Computer-Supported Cooperative Work: Assisting Human-Human
Collaboration, Morgan Kaufmann Publishers, San Mateo, Calif., 1993, pp. 561-565. –
Originally published in D. Coleman (ed.): Proceedings of Groupware ’92, Morgan Kaufmann,
1992, pp. 133-137.
Constant, Edward W., II: The Origins of the Turbojet Revolution, Johns Hopkins University Press,
Baltimore, Maryland, 1980.
Constant, Edward W., II: ‘Communities and hierarchies: Structure in the practice of science and
technology’, in R. Laudan (ed.): The Nature of Technological Knowledge: Are Models of
Scientific Change Relevant?, D. Reidel Publishing, Dordrecht, Boston, and London, 1984, pp.
27-46.
Cook, Carolyn L.: ‘Streamlining office procedures: An analysis using the information control net
model’, in: National Computer Conference, 1980, pp. 555-565.
Copeland, B. Jack (ed.): The Essential Turing: Seminal Writings in Computing, Logic, Philosophy,
Artificial Intelligence, and Artificial Life plus The Secrets of Enigma, Oxford University Press,
Oxford, 2004.
Copeland, B. Jack (ed.): Alan Turing’s Automatic Computing Engine: The Master Codebreaker’s
Struggle to Build the Modern Computer, Oxford University Press, Oxford, 2005.
Copeland, B. Jack; and Diane Proudfoot: ‘Turing and the computer’, in B. J. Copeland (ed.): Alan
Turing’s Automatic Computing Engine: The Master Codebreaker’s Struggle to Build the
Modern Computer, Oxford University Press, Oxford, 2005, pp. 107-148.
Copeland, B. Jack (ed.): Colossus: The Secrets of Bletchley Park’s Codebreaking Computers,
Oxford University Press, Oxford, 2006a.
Copeland, B. Jack: ‘Colossus and the rise of the modern computer’, in B. J. Copeland (ed.):
Colossus: The Secrets of Bletchley Park’s Codebreaking Computers, Oxford University Press,
Oxford, 2006b, pp. 101-115.
Copeland, B. Jack: ‘Introduction’, in B. J. Copeland (ed.): Colossus: The Secrets of Bletchley
Park’s Codebreaking Computers, Oxford University Press, Oxford, 2006c, pp. 1-6.
Corbató, Fernando J.; Marjorie Merwin-Daggett; and Robert C. Daley: ‘An experimental time-sharing system’, in G. A. Barnard (ed.): SJCC’62: Proceedings of the Spring Joint Computer
Conference, 1-3 May 1962, San Francisco, California, vol. 21, AFIPS Press, 1962, pp. 335-344.
Corbató, Fernando J.: ‘Oral history interview’, Interviewed by A. L. Norberg, 18 April 1989 and
14 November 1990, at Cambridge, Mass. Interview transcript, Charles Babbage Institute,
University of Minnesota, Minneapolis (OH 162).
COSMOS: Specification for a Configurable, Structured Message System, Queen Mary College,
London, August 1989. – 68.4 Ext/ALV.
London, August 1989. – 68.4 Ext/ALV.
Coulter, Jeff: Rethinking Cognitive Theory, Macmillan, London, 1983.
Cournot, Antoine Augustin: An Essay on the Foundations of our Knowledge, The Liberal Arts
Press, New York, 1956. Transl. from Essai sur les fondements de nos connaissance et sur les
caractères de la critique philosophique, Hachette, Paris, 1851. Transl. by M. H. Moore.
Crabtree, Andy; Thomas A. Rodden; and Steven D. Benford: ‘Moving with the times: IT research
and the boundaries of CSCW’, Computer Supported Cooperative Work (CSCW): The Journal
of Collaborative Computing, vol. 14, 2005, pp. 217-251.
Crabtree, Andy, et al.: ‘Ethnography considered harmful’, in R. B. Arthur, et al. (eds.): CHI 2009:
Proceedings of the 27th international conference on Human factors in computing systems,
Boston, Mass., 4-9 April 2009, ACM Press, New York, 2009, pp. 879-888.
Crawford, Perry Orson: Automatic Control by Arithmetical Operations, MSc Thesis, Dept. of
Physics, Massachusetts Institute of Technology, Cambridge, Mass., 1942. – [The copy of
Crawford’s thesis available on-line from MIT is incomplete (it ends on p. 58). According to
MIT’s Document Services the ‘thesis file has been corrupted’ and Crawford’s thesis ‘appears
to have gone missing from the collection’ (Email communication, 2 and 8 December 2009).].
<http://hdl.handle.net/1721.1/11232>
Crawford, Perry Orson: ‘Application of digital computation involving continuous input and output
variables’ (5 August 1946). In M. Campbell-Kelly and M. R. Williams (eds.): The Moore
School Lectures: Theory and Techniques for Design of Electronic Digital Computers. (Revised
ed. of Theory and Techniques for Design of Electronic Computers, 1947-1948). The MIT
Press, Cambridge, Mass., 1985, pp. 374-392.
Croarken, Mary: ‘Tabulating the heavens: Computing the Nautical Almanac in 18th-century
England’, IEEE Annals of the History of Computing, vol. 25, no. 3, July-September 2003, pp.
48-61.
Crocker, David H., et al.: ‘RFC 733: Standard for the format of ARPA network text messages’,
Request for Comments, no. 733, Network Working Group, 21 November 1977.
<http://www.rfc-editor.org/rfc/rfc733.txt>
Croft, W. B.; and L. S. Lefkowitz: ‘Task support in an office system’, ACM Transactions on
Office Information Systems, vol. 2, 1984, pp. 197-212.
Crosby, Alfred W.: The Measure of Reality: Quantification and Western Society, 1250-1600,
Cambridge University Press, Cambridge, 1997.
Cummings, Thomas; and Melvin Blumberg: ‘Advanced manufacturing technology and work
design’, in T. D. Wall; C. W. Clegg; and N. J. Kemp (eds.): The Human Side of Advanced
Manufacturing Technology, John Wiley, Chichester, 1987, pp. 37-60.
Cyert, Richard M.; and James G. March: A Behavioral Theory of the Firm, Prentice-Hall,
Englewood-Cliffs, New Jersey, 1963.
Dahl, Ole-Johan; and Kristen Nygaard: ‘SIMULA: an ALGOL-based simulation language’,
Communications of the ACM, vol. 9, no. 9, September 1966, pp. 671-678.
Dahrendorf, Ralf: Sozialstruktur des Betriebes, Gabler, Wiesbaden, 1959.
Danielsen, Thore, et al.: ‘The AMIGO Project: Advanced group communication model for
computer-based communications environment’, in H. Krasner and I. Greif (eds.): CSCW’86:
Proceedings. Conference on Computer-Supported Cooperative Work, Austin, Texas, 3-5
December 1986, ACM Press, New York, 1986, pp. 115-142.
Darwin, Charles G.: ‘Automatic Computing Engine (ACE)’ (NPL, 17 April 1946). In B. J.
Copeland (ed.) Alan Turing’s Automatic Computing Engine: The Master Codebreaker’s
Struggle to Build the Modern Computer. Oxford University Press, Oxford, 2005, pp. 53-57.
Davenport, Thomas H.: Process Innovation: Reengineering Work through Information
Technology, Harvard Business School Press, Boston, Mass., 1993.
Davies, Donald Watts: ‘Proposal for the development of a national communication service for on-line data processing’, Manuscript, [National Physical Laboratory, UK], 15 December 1965.
<http://www.cs.utexas.edu/users/chris/DIGITAL_ARCHIVE/NPL/DaviesLetter.html>
Davies, Donald Watts: Proposal for a Digital Communication Network, National Physical
Laboratory, June 1966.
<http://www.cs.utexas.edu/users/chris/DIGITAL_ARCHIVE/NPL/Davies05.pdf>
Davies, Donald Watts: ‘Foreword’, in B. J. Copeland (ed.): Alan Turing’s Automatic Computing
Engine: The Master Codebreaker’s Struggle to Build the Modern Computer, Oxford
University Press, Oxford, 2005, pp. vii-ix.
Davis, Philip J.; and Reuben Hersh: The Mathematical Experience, Birkhäuser, Boston, 1980. –
Pelican Books, Harmondsworth, England, 1983.
De Cindio, Fiorella, et al.: ‘CHAOS as coordination technology’, in H. Krasner and I. Greif (eds.):
CSCW’86: Proceedings. Conference on Computer-Supported Cooperative Work, 3-5
December 1986, Austin, Texas, ACM Press, New York, 1986, pp. 325-342.
De Cindio, Fiorella, et al.: ‘CHAOS: a knowledge-based system for conversing within offices’, in
W.Lamersdorf (ed.): Office Knowledge: Representation, Management and Utilization, Elsevier
Science Publishers B.V., North-Holland, Amsterdam, 1988.
De Michelis, Giorgio; and M. Antonietta Grasso: ‘How to put cooperative work in context:
Analysis and design requirements’, in L. J. Bannon and K. Schmidt (eds.): Issues of Supporting
Organizational Context in CSCW Systems, Computing Department, Lancaster University,
Lancaster, UK, October 1993, pp. 73-100. COMIC Deliverable D1.1.
<ftp://ftp.comp.lancs.ac.uk/pub/comic>
de Prony, Gaspard F. C. M. Riche: ‘Notice sur les Grandes Tables Logarithmique &c.’ (Paris, 7
June 1824). In: Recueil des Discourses lus dans la seance publique de L'Academie Royale Des
Sciences, 7 Juin 1824. – [Translation entitled ‘On the great logarithmic and trigonometric
tables, adapted to the new decimal metric system’ (Science Museum Library, Babbage
Archives, Document code: BAB U2l/l). Excerpt probably copied at Charles Babbage’s
request.].
<http://sites.google.com/site/babbagedifferenceengine/barondeprony'sdescriptionoftheconstructi>
Degani, Asaf; and Earl L. Wiener: Human Factors of Flight-Deck Checklists: The Normal
Checklist, Ames Research Center, National Aeronautics and Space Administration, Moffett
Field, California, May 1990. – NASA Contractor Report 177549; Contract NCC2-377.
Delayre, Alexandre [Delaire]: ‘Épingle’, in D. Diderot and J. L. R. d’Alembert (eds.):
Encyclopédie, ou dictionnaire raisonné des sciences, des arts et des métiers, vol. 5, Briasson,
David, Le Breton & Durand, Paris, 1755, pp. 804-807. Text ed. by R. Morrissey. – ARTFL
Encyclopédie Projet, University of Chicago, 2008. <http://portail.atilf.fr/encyclopedie/>
Deloney, Thomas: The Pleasant Historie of Iohn Winchcomb, in his Younger Yeares Called Iack
of Newbery (London, 1597; 10th edition, 1626). Text ed. by F. O. Mann. In T. Deloney: The
Works of Thomas Deloney. Oxford at the Clarendon Press, London etc., 1912, pp. 1-68.
Demchak, Chris C.: War, Technological Complexity, and the U.S. Army, PhD diss., Department of
Political Science, University of California, Berkeley, Calif., 1987.
Descartes, René: Meditations on First Philosophy (Paris, 1641, vol. 2). In R. Descartes: The
Philosophical Writings of Descartes. Cambridge University Press, Cambridge, 1984, pp. 1-62.
Diderot, Denis: ‘Art’ (Encyclopédie, ou dictionnaire raisonné des sciences, des arts et des métiers,
Paris, 1751). In D. Diderot: Œeuvres. Robert Laffont, Paris, 1994, vol. 1 (Philosophie), pp.
265-276.
Diebold, John: Automation: The Advent of the Automatic Factory, Van Nostrand, New York, etc.,
1952.
Divitini, Monica, et al.: A multi-agent approach to the design of coordination mechanisms, Centre
for Cognitive Science, Roskilde University, Roskilde, Denmark, 1995. Working Papers in
Cognitive Science and HCI. – WPCS-95-5.
Divitini, Monica; Carla Simone; and Kjeld Schmidt: ‘ABACO: Coordination mechanisms in a
multi-agent perspective’, in: COOP’96: Second International Conference on the Design of
Cooperative Systems, 12–14 June 1996, Antibes-Juan-les-Pins, France, INRIA Sophia
Antipolis, France, 1996, pp. 103-122.
Dourish, Paul; and Victoria Bellotti: ‘Awareness and coordination in shared workspaces’, in M.
M. Mantei; R. M. Baecker; and R. E. Kraut (eds.): CSCW’92: Proceedings of the Conference
on Computer-Supported Cooperative Work, 31 October–4 November 1992, Toronto, Canada,
ACM Press, New York, 1992, pp. 107-114.
Dourish, Paul; and Sara Bly: ‘Portholes: supporting awareness in a distributed work group’, in P.
Bauersfeld; J. Bennett; and G. Lynch (eds.): CHI’92 Conference Proceedings: ACM
Conference on Human Factors in Computing Systems, 3-7 May 1992, Monterey, California,
ACM Press, New York, 1992, pp. 541-547.
Dourish, Paul, et al.: ‘Freeflow: Mediating between representation and action in workflow
systems’, in G. M. Olson; J. S. Olson; and M. S. Ackerman (eds.): CSCW’96: Proceedings of
the Conference on Computer-Supported Cooperative Work, 16-20 November 1996, Boston,
Mass., ACM Press, New York, 1996, pp. 190-198.
Dourish, Paul: ‘Extending awareness beyond synchronous collaboration’, in T. Brinck and S. E.
McDaniel (eds.): CHI’97 Workshop on Awareness in Collaborative Systems, 22-27 March
1997, Atlanta, Georgia, 1997. <http://www.usabilityfirst.com/groupware/awareness>
Dourish, Paul; and Graham Button: ‘On “technomethodology”: Foundational relationships
between ethnomethodology and system design’, Human-Computer Interaction, vol. 13, no. 4,
1998, pp. 395-432.
Dourish, Paul: Where the Action is: The Foundation of Embodied Interaction, MIT Press,
Cambridge, Mass., 2001.
Dourish, Paul: ‘Re-space-ing place: “Place” and “space” ten years on’, in P. J. Hinds and D.
Martin (eds.): Proceedings of CSCW'06: The 20th Anniversary Conference on Computer
Supported Cooperative Work, Banff, Alberta, Canada, 4-8 November 2006, ACM Press, New
York, 2006a, pp. 299-308.
Dourish, Paul: ‘Implications for design’, in R. E. Grinter, et al. (eds.): CHI 2006: Proceedings of
the SIGCHI Conference on Human Factors in Computing Systems, 22-27 April 2006,
Montréal, Québec, Canada, ACM Press, New York, 2006b, pp. 541-550.
Dreyfus, Hubert L.: What Computers Can’t Do: The Limits of Artificial Intelligence, Harper &
Row, New York, 1979. (Rev. ed.; 1st ed. 1972).
Drury, Maurice O’C.: ‘Conversations with Wittgenstein’ (1974). Text ed. by R. Rhees. In R.
Rhees (ed.): Ludwig Wittgenstein: Personal Recollections. Rowman and Littlefield, Totowa,
New Jersey, 1981, pp. 112-189.
Dubinskas, Frank A.: ‘Cultural constructions: The many faces of time’, in F. A. Dubinskas (ed.):
Making Time: Ethnographies of High-Technology Organizations, Temple University Press,
Philadelphia, 1988, pp. 3-38.
Eckert, J. Presper: ‘A preview of a digital computing machine’ (15 July 1946). In M. Campbell-Kelly and M. R. Williams (eds.): The Moore School Lectures: Theory and Techniques for
Design of Electronic Digital Computers. (Revised ed. of Theory and Techniques for Design of
Electronic Computers, 1947-1948). The MIT Press, Cambridge, Mass., 1985, pp. 109-126.
Eckert, Wallace J.: Punched Card Methods in Scientific Computation, The Thomas J. Watson
Astronomical Computing Bureau, New York, 1940. – The MIT Press, Cambridge, Mass., and
London, 1984.
Edgerton, David: The Shock of the Old: Technology and Global History Since 1900, Profile
Books, London, 2006.
Egger, Edeltraud; and Ina Wagner: ‘Negotiating temporal orders: The case of collaborative time
management in a surgery clinic’, Computer Supported Cooperative Work (CSCW): An
International Journal, vol. 1, no. 4, 1993, pp. 255-275.
Ehn, Pelle: Work-Oriented Design of Computer Artifacts, Arbetslivscentrum, Stockholm, 1988.
Ellis, C. A.; S. J. Gibbs; and G. L. Rein: ‘Groupware: Some issues and experiences’,
Communications of the ACM, vol. 34, no. 1, January 1991, pp. 38-58.
Ellis, Clarence A.: ‘Information control nets: A mathematical model of office information flow’,
in: Proceedings of the ACM Conference on Simulation, Measurement and Modeling, August
1979, Boulder, Colorado, ACM, New York, 1979, pp. 225-239.
Ellis, Clarence A.; and Gary J. Nutt: ‘Office information systems and computer science’,
Computing Surveys, vol. 12, no. 1, March 1980, pp. 27-60.
Ellis, Clarence A.; and Marc Bernal: ‘Officetalk-D: An experimental office information system’,
in: Proceedings of the ACM SIGOA Conference on Office Information Systems, 21-23 June
1982, Philadelphia, Pennsylvania, 1982, pp. 131-140.
Ellis, Clarence A.: ‘Formal and informal models of office activity’, in R. E. A. Mason (ed.):
Information Processing ’83. Proceedings of the IFIP 9th World Computer Congress, 19-23
September 1983, Paris, France, North-Holland, Amsterdam, 1983, pp. 11-22.
Ellis, Clarence A.; Karim Keddara; and Grzegorz Rozenberg: ‘Dynamic change within workflow
systems’, in N. Comstock, et al. (eds.): COOCS’95: Conference on Organizational Computing
Systems, 13-16 August 1995, Milpitas, California, ACM Press, New York, 1995, pp. 10-21.
Endsley, Mica R.; and Daniel J. Garland (eds.): Situation Awareness Analysis and Measurement,
Lawrence Erlbaum Associates, Mahwah, New Jersey, and London, 2000.
Endsley, Mica R.; and William M. Jones: ‘A model of inter- and intrateam situation awareness:
Implications for design, training, and measurement’, in M. McNeese; E. Salas; and M. R.
Endsley (eds.): New Trends in Cooperative Activities: Understanding System Dynamics in
Complex Environments, Human Factors and Ergonomics Society, Santa Monica, Calif., 2001,
pp. 46-67.
Engelbart, Douglas C.: Augmenting human intellect: A conceptual framework (Prepared for:
Director of Information Sciences, Air Force Office of Scientific Research, Washington 25,
D.C., Contract AF49(638)-1024), Stanford Research Institute, Menlo Park, Calif., October
1962. – AFOSR-3233. <http://www.dougengelbart.org/pubs/augment-3906.html>
Engelbart, Douglas C.; and William K. English: ‘A research center for augmenting human
intellect’, in: FJCC’68: Proceedings of the Fall Joint Computing Conference, 9-11 December
1968, San Francisco, California. Part I, vol. 33, AFIPS Press, New York, 1968, pp. 395-410.
– Also in I. Greif (ed.): Computer-Supported Cooperative Work: A Book of Readings, Morgan
Kaufmann Publishers, San Mateo, Calif., 1988, pp. 81-105.
Engelbart, Douglas C.: ‘Authorship provisions in AUGMENT’, in: Proc. IEEE Compcon
Conference, 1984. – Also in I. Greif (ed.): Computer-Supported Cooperative Work: A Book of
Readings, Morgan Kaufmann Publishers, San Mateo, Calif., 1988, pp. 107-126.
Engelbart, Douglas C.; and Harvey Lehtman: ‘Working together’, Byte, vol. 13, no. 13, December
1988, pp. 245-252.
Engleberger, Joseph E.: Robotics in Practice: Management and Applications of Industrial Robots,
Kogan Page, London, 1980.
English, W.: ‘The textile industry: Silk production and manufacture, 1750-1900’, in C. Singer, et
al. (eds.): A History of Technology, Volume IV: The Industrial Revolution, c 1750 to c 1850,
Oxford University Press, Oxford, 1958, pp. 277-307.
English, William K.; Douglas C. Engelbart; and Melwyn L. Berman: ‘Display-selection
techniques for text manipulation’, IEEE Transactions on Human Factors in Electronics, vol.
HFE-8, no. 1, March 1967, pp. 5-15.
Essinger, James: Jacquard’s Web: How a Hand-loom Led to the Birth of the Information Age,
Oxford University Press, Oxford, 2005. (Paperback. ed., 2007).
Eveland, J. D.; and Tora K. Bikson: ‘Evolving electronic communication networks: an empirical
assessment’, Office: Technology and People, vol. 3, no. 2, 1987, pp. 103-128.
Exner, Wilhelm Franz: Johann Beckmann: Begründer der technologischen Wissenschaft, Gerold,
Wien, 1878. – Kessinger Pub Co, Whitefish, Montana, 2009. – [The reprint title contains a
typo: Johann Beckmann: Segründer der technologischen Wissenschaft].
Farrington, Benjamin: Greek Science, Penguin Books, London, 1944/49. – Spokesman,
Nottingham, 2000.
Fayol, Henri: Administration industrielle et générale, Dunod, Paris, 1918, 1999. – Originally
published in Bulletin de la Société de l’Industrie minérale, 1916. (English translation: General
and Industrial Management, Pitman, London, 1949).
Ferry, Georgina: A Computer Called LEO: Lyons Teashops and the World’s First Office
Computer, HarperCollins, London, 2003. (Paperback ed., 2004).
Fikes, Richard E.; and D. Austin Henderson, Jr.: ‘On supporting the use of procedures in office
work’, in: Proceedings of the 1st Annual Conference on Artificial Intelligence, 18-20 August
1980, Stanford University, American Association for Artificial Intelligence, 1980, pp. 202-207.
Finholt, Thomas A.; and Stephanie D. Teasley: ‘Psychology: The need for psychology in research
on computer-supported cooperative work’, Social Science Computer Review, vol. 16, no. 1,
Spring 1998, pp. 40-52.
Finley, Moses I.: ‘Was Greek civilization based on slave labour?’ (Historia, 1959). In M. I. Finley
(ed.): Slavery in Classical Antiquity: View and Controversies. W. Heffer & Sons, Cambridge,
1960, pp. 53-72.
Finley, Moses I.: The Ancient Economy, University of California Press, Berkeley, Calif., 1973.
(3rd ed., 1999).
Fish, Robert S.; Robert E. Kraut; and Barbara L. Chalfonte: ‘The VideoWindow system in
informal communications’, in T. Bikson and F. Halasz (eds.): CSCW’90: Proceedings of the
Conference on Computer-Supported Cooperative Work, 7-10 October 1990, Los Angeles,
Calif., ACM Press, New York, 1990, pp. 1-11.
Fitts, Paul M.: ‘The information capacity of the human motor system in controlling amplitude of
movement’, Journal of Experimental Psychology, vol. 47, no. 6, June 1954, pp. 381-391.
Fitzpatrick, Geraldine; William J. Tolone; and Simon M. Kaplan: ‘Work, locales and distributed
social worlds’, in H. Marmolin; Y. Sundblad; and K. Schmidt (eds.): ECSCW’95: Proceedings
of the Fourth European Conference on Computer-Supported Cooperative Work, 10–14
September 1995, Stockholm, Sweden, Kluwer Academic Publishers, Dordrecht, 1995, pp. 1-16.
Fitzpatrick, Geraldine; Simon M. Kaplan; and Tim Mansfield: ‘Physical spaces, virtual places and
social worlds: A study of work in the virtual’, in G. M. Olson; J. S. Olson; and M. S.
Ackerman (eds.): CSCW’96: Proceedings of the Conference on Computer-Supported
Cooperative Work, 16-20 November 1996, Boston, Mass., ACM press, New York, 1996, pp.
334-343.
Fitzpatrick, Geraldine, et al.: ‘Supporting public availability and accessibility with Elvin:
Experiences and reflections’, Computer Supported Cooperative Work (CSCW): The Journal of
Collaborative Computing, vol. 11, no. 3-4, 2002.
Flach, John M., et al. (eds.): Global Perspectives on the Ecology of Human-Machine Systems,
Lawrence Erlbaum, Hillsdale, New Jersey, 1995.
Flores, Fernando, et al.: ‘Computer systems and the design of organizational interaction’, ACM
Transactions on Office Information Systems, vol. 6, no. 2, April 1988, pp. 153-172.
Freiberger, Paul; and Michael Swaine: Fire in the Valley: The Making of the Personal Computer,
McGraw-Hill, New York, etc., 2000. (2nd ed.; 1st ed. 1984).
Fuchs, Ludwin; and Wolfgang Prinz: ‘Aspects of organizational context in CSCW’, in L. J.
Bannon and K. Schmidt (eds.): Issues of Supporting Organizational Context in CSCW Systems,
Computing Department, Lancaster University, Lancaster, UK, October 1993, pp. 11-47.
COMIC Deliverable D1.1. <ftp://ftp.comp.lancs.ac.uk/pub/comic>
Fuchs, Ludwin; Uta Pankoke-Babatz; and Wolfgang Prinz: ‘Supporting cooperative awareness
with local event mechanisms: The GroupDesk system’, in H. Marmolin; Y. Sundblad; and K.
Schmidt (eds.): ECSCW’95: Proceedings of the Fourth European Conference on Computer-Supported Cooperative Work, 10–14 September 1995, Stockholm, Sweden, Kluwer Academic
Publishers, Dordrecht, 1995, pp. 245-260.
Gannon, Paul: Colossus: Bletchley Park’s Greatest Secret, Atlantic Books, London, 2006.
(Paperback ed., 2007).
Garfinkel, Harold: Studies in Ethnomethodology, Prentice-Hall, Englewood-Cliffs, New Jersey,
1967. – Polity Press, Cambridge, 1987.
Garfinkel, Harold; and Egon Bittner: ‘“Good” organizational reasons for “bad” clinic records’
(1967). In H. Garfinkel: Studies in Ethnomethodology. Prentice-Hall, Englewood-Cliffs, New
Jersey, 1967, pp. 186-207.
Garlan, Yvon: Slavery in Ancient Greece, Cornell University Press, Ithaca and London, 1988.
Transl. from Les esclaves en Grèce ancienne, François Maspero, Paris, 1982. Transl. by J.
Lloyd. (Revised ed.).
Garnsey, Peter: Ideas of Slavery from Aristotle to Augustine, Cambridge University Press,
Cambridge, 1996.
Gasser, Les; and Alan H. Bond: ‘An analysis of problems and research in distributed artificial
intelligence’, in L. Gasser and A. H. Bond (eds.): Readings in Distributed Artificial
Intelligence, Morgan Kaufmann Publishers, San Mateo, Calif., 1988, pp. 3-35.
Gaver, William W.: ‘Sound support for collaboration’, in L. J. Bannon; M. Robinson; and K.
Schmidt (eds.): ECSCW’91: Proceedings of the Second European Conference on Computer-Supported Cooperative Work, 24–27 September 1991, Amsterdam, Kluwer Academic
Publishers, Dordrecht, 1991, pp. 293-308.
Gaver, William W.: ‘The affordances of media spaces for collaboration’, in M. M. Mantei; R. M.
Baecker; and R. E. Kraut (eds.): CSCW’92: Proceedings of the Conference on Computer-Supported Cooperative Work, 31 October–4 November 1992, Toronto, Canada, ACM Press,
New York, 1992, pp. 17-24.
Gaver, William W., et al.: ‘Realizing a video environment: EuroPARC’s RAVE system’, in P.
Bauersfeld; J. Bennett; and G. Lynch (eds.): CHI’92 Conference Proceedings: ACM
Conference on Human Factors in Computing Systems, 3-7 May 1992, Monterey, California,
ACM Press, New York, 1992, pp. 27-35.
Gaver, William W.: ‘Provocative awareness’, Computer Supported Cooperative Work (CSCW):
The Journal of Collaborative Computing, vol. 11, no. 3-4, 2002, pp. 475-493.
Geertz, Clifford: The Interpretation of Culture: Selected Essays, Basic Books, New York, 1973.
Gell, Alfred: The Anthropology of Time: Cultural Constructions of Temporal Maps and Images,
Berg, Oxford/Providence, 1992.
Gerrard, Steve: ‘Wittgenstein’s philosophies of mathematics’, Synthese, vol. 87, 1991, pp. 125-142.
Gerson, Elihu M.; and Susan Leigh Star: ‘Analyzing due process in the workplace’, ACM
Transactions on Office Information Systems, vol. 4, no. 3, July 1986, pp. 257-270.
Gerson, Elihu M.: [Personal communication]. August 1989.
Gibson, James J.: The Ecological Approach to Visual Perception, Houghton-Mifflin, Boston,
1979. – Lawrence Erlbaum Associates, Hillsdale, New Jersey, 1986.
Giedion, Siegfried: Mechanization Takes Command: A Contribution to Anonymous History,
Oxford University Press, New York, 1948.
Gilbert, K. R.: ‘Machine-tools’, in C. Singer, et al. (eds.): A History of Technology, Volume IV:
The Industrial Revolution, c 1750 to c 1850, Oxford University Press, Oxford, 1958, pp. 417-441.
Gillespie, Richard: Manufacturing Knowledge: A History of the Hawthorne Experiments,
Cambridge University Press, Cambridge, 1991. (Paperback ed., 1993).
Gillies, James; and Robert Cailliau: How the Web was Born: The Story of the World Wide Web,
Oxford University Press, Oxford, 2000.
Gillispie, Charles Coulston (ed.): A Diderot Pictorial Encyclopedia of Trades and Industry:
Manufacturing and the Technical Arts in Plates Selected from ‘L’Encyclopedie, ou
dictionnaire raissoné des sciences, des arts et des métiers’ of Denis Diderot, vol. 1-2, Dover
Publications, New York, 1959. Transl. from Recueil de planches, sur les sciences, les arts
liberaux, et les arts méchaniques, avec leur explication, Paris, 1763-72.
Goffman, Erving: Encounters: Two Studies in the Sociology of Interaction, Bobbs-Merrill,
Indianapolis, 1961.
Goffman, Erving: Behavior in Public Places: Notes on the Social Organization of Gatherings, The
Free Press, New York, 1963.
Goldstine, Herman H.: The Computer from Pascal to von Neumann, Princeton University Press,
Princeton, New Jersey, 1972.
Good, Jack; Donald Michie; and Geoffrey Timms: General Report on Tunny: With Emphasis on
Statistical Methods, Government Code and Cypher School, 1945. vol. 1-2. – Public Record
Office, HW 25/4 & HW 25/5. – Bibliographical data based on
<http://www.alanturing.net/tunny_report/>. – Partial transcript by Tony Sale, March 2001:
<http://www.codesandciphers.org.uk/documents/newman/newman.pdf>
Goody, Jack: The Domestication of the Savage Mind, Cambridge University Press, Cambridge,
1977.
Goody, Jack: The Logic of Writing and the Organization of Society, Cambridge University Press,
Cambridge, 1986.
Goody, Jack: The Interface Between the Written and the Oral, Cambridge University Press,
Cambridge, 1987.
Goody, Jack: The Power of the Written Tradition, Smithsonian Institution Press, Washington D.C.,
2000.
Grattan-Guinness, Ivor: ‘Work for the hairdressers: The production of de Prony’s logarithmic and
trigonometric tables’, IEEE Annals of the History of Computing, vol. 12, no. 3, July-September
1990, pp. 177-185.
Greenberg, Saul: ‘Sharing views and interactions with single-user applications’, in: COIS’90:
Proceedings of the Conference on Office Information Systems, April 1990, Boston, 1990.
Greenberg, Saul: ‘[Introduction to] Computer-supported Cooperative Work and Groupware’, in S.
Greenberg (ed.): Computer-supported Cooperative Work and Groupware, Academic Press,
London, 1991, pp. 1-8.
Greif, Irene; and Sunil K. Sarin: ‘Data sharing in group work’, in H. Krasner and I. Greif (eds.):
CSCW’86: Proceedings. Conference on Computer-Supported Cooperative Work, Austin,
Texas, 3-5 December 1986, ACM Press, New York, 1986, pp. 175-183.
Greif, Irene (ed.): Computer-Supported Cooperative Work: A Book of Readings, Morgan
Kaufmann Publishers, San Mateo, Calif., 1988a.
Greif, Irene: ‘Overview’, in I. Greif (ed.): Computer-Supported Cooperative Work: A Book of
Readings, Morgan Kaufmann Publishers, San Mateo, Calif., 1988b, pp. 5-12.
Grier, David Alan: ‘Human computers: The first pioneers of the information age’, Endeavour, vol.
25, no. 1, March 2001, pp. 28–32.
Grier, David Alan: When Computers Were Human, Princeton University Press, Princeton and
Oxford, 2005.
Grinter, Rebecca E.: ‘Supporting articulation work using software configuration management
systems’, Computer Supported Cooperative Work (CSCW): The Journal of Collaborative
Computing, vol. 5, no. 4, 1996a, pp. 447-465.
Grinter, Rebecca E.: Understanding Dependencies: A Study of the Coordination Challenges in
Software Development, PhD diss., University of California, Irvine, 1996b.
Grønbæk, Kaj; and Preben Mogensen: ‘Informing general CSCW product development through
cooperative design in specific work domains’, Computer Supported Cooperative Work
(CSCW): The Journal of Collaborative Computing, vol. 6, no. 4, 1997, pp. 275-304.
Groover, Mikell P.: Automation, Production Systems, and Computer-Aided Manufacturing,
Prentice-Hall, Englewood-Cliffs, New Jersey, 1980.
Gross, Tom; and Wolfgang Prinz: ‘Awareness in context: A light-weight approach’, in K. Kuutti,
et al. (eds.): ECSCW 2003: Proceedings of the Eighth European Conference on Computer
Supported Cooperative Work, 14–18 September 2003, Helsinki, Finland, Kluwer Academic
Publishers, Dordrecht, 2003, pp. 295-314.
Gross, Tom; and Wolfgang Prinz: ‘Modelling shared contexts in cooperative environments:
Concept, implementation, and evaluation’, Computer Supported Cooperative Work (CSCW):
The Journal of Collaborative Computing, vol. 13, no. 3-4, August 2004, pp. 283-303.
Gruber, Tom: ‘Ontology’, in L. Liu and M. T. Özsu (eds.): Encyclopedia of Database Systems,
Springer-Verlag, 2009.
Grudin, Jonathan: ‘Why groupware applications fail: Problems in design and evaluation’, Office:
Technology and People, vol. 4, no. 3, 1989, pp. 245-264.
Grudin, Jonathan: ‘CSCW: The convergence of two development contexts’, in S. P. Robertson; G.
M. Olson; and J. S. Olson (eds.): CHI’91 Conference Proceedings: ACM SIGCHI Conference
on Human Factors in Computing Systems, New Orleans, Louisiana, 27 April – 2 May 1991,
ACM Press, New York, N.Y., 1991, pp. 91-97.
Grudin, Jonathan: ‘Computer-Supported Cooperative Work: History and focus’, IEEE Computer,
vol. 27, no. 5, May 1994, pp. 19-26.
Grudin, Jonathan: ‘CSCW and Groupware: Their history and trajectory’, in Y. Matsushita (ed.):
Designing Communication and Collaboration Support Systems, Gordon and Breach Science
Publishers, Amsterdam, 1999, pp. 1-15.
Grudin, Jonathan: ‘The GUI shock: Computer graphics and human-computer interaction’,
Interactions, March-April 2006a, pp. 45-47, 55.
Grudin, Jonathan: ‘The demon in the basement’, Interactions, November-December 2006b, pp.
50-53.
Gunn, Thomas G.: Manufacturing for Competitive Advantage. Becoming a World Class
Manufacturer, Ballinger, Cambridge, Mass., 1987.
Gutwin, Carl: Workspace Awareness in Real-Time Distributed Groupware, PhD dissertation,
Department of Computer Science, The University of Calgary, Calgary, Alberta, December 1997.
Gutwin, Carl; and Saul Greenberg: ‘The effects of workspace awareness support on the usability
of real-time distributed groupware’, ACM Transactions on Computer-Human Interaction, vol.
6, no. 2, September 1999, pp. 243-281.
Gutwin, Carl; and Saul Greenberg: ‘A descriptive framework of workspace awareness for real-time groupware’, Computer Supported Cooperative Work (CSCW): The Journal of
Collaborative Computing, vol. 11, no. 3-4, 2002.
Haas, Christina: Writing Technology: Studies on the Materiality of Literacy, Lawrence Erlbaum,
Mahwah, New Jersey, 1996.
Hacker, Peter M. S.: Insight and Illusion: Themes in the Philosophy of Wittgenstein, Thoemmes
Press, Bristol, England, 1989. (2nd ed.; 1st ed. 1972).
Hacker, Peter M. S.: ‘Malcolm on language and rules’ (Philosophy, 1990). In P. M. S. Hacker:
Wittgenstein: Connections and Controversies. Clarendon Press, Oxford, 2001, pp. 310-323.
Hafner, Katie; and Matthew Lyon: Where Wizards Stay up Late: The Origins of the Internet,
Simon & Schuster, London, etc., 2003. (2nd ed.; 1st ed., 1996).
Hammer, Marius: Vergleichende Morphologie der Arbeit in der europäischen Automobilindustrie:
Die Entwicklung zur Automation, 1959.
Hammer, Michael; and Jay S. Kunin: ‘Design principles of an office specification language’, in D.
Medley (ed.): AFIPS’80: Proceedings of the National Computer Conference, 19-22 May 1980,
Anaheim, California, vol. 49, AFIPS Press, Arlington, Virginia, 1980, pp. 541-547.
Hammer, Michael; and Marvin Sirbu: ‘What is office automation?’, in: Proceedings. First Office
Automation Conference, 3-5 March 1980, Atlanta, Georgia, 1980.
Hardy, Ian R.: The Evolution of ARPANET email, History thesis paper, University of California at
Berkeley, Berkeley, Calif., 13 May 1996.
<http://www.ifla.org.sg/documents/internet/hari1.txt>
Harper, Richard H. R.; John A. Hughes; and Dan Z. Shapiro: ‘Working in harmony: An
examination of computer technology in air traffic control’, in P. Wilson; J. M. Bowers; and S.
D. Benford (eds.): ECSCW’89: Proceedings of the 1st European Conference on Computer
Supported Cooperative Work, 13-15 September 1989, Gatwick, London, London, 1989a, pp.
73-86.
Harper, Richard H. R.; John A. Hughes; and Dan Z. Shapiro: The Functionality of Flight Strips in
ATC Work: The report for the Civil Aviation Authority, Lancaster Sociotechnics Group,
Department of Sociology, Lancaster University, Lancaster, U.K., January 1989b.
Harper, Richard H. R.; John A. Hughes; and Dan Z. Shapiro: ‘Harmonious working and CSCW:
Computer technology and air traffic control’, in J. M. Bowers and S. D. Benford (eds.): Studies
in Computer Supported Cooperative Work. Theory, Practice and Design, North-Holland,
Amsterdam, 1991, pp. 225-234.
Harper, Richard H. R.; and John A. Hughes: ‘What a f—ing system! Send ’em all to the same
place and then expect us to stop ’em hitting: Managing technology work in air traffic control’,
in G. Button (ed.): Technology in Working Order: Studies of Work, Interaction, and
Technology, Routledge, London and New York, 1993, pp. 127-144.
Harrington, Joseph: Computer Integrated Manufacturing, Krieger, Malabar, Florida, 1979.
Harrington, Joseph: Understanding the Manufacturing Process. Key to Successful CAD/CAM
Implementation, Marcel Dekker, New York, 1984.
Harris, Roy: The Origin of Writing, Duckworth, London, 1986.
Harris, Roy: Signs of Writing, Routledge, London and New York, 1995.
Harris, Roy: Rethinking Writing, Indiana University Press, Bloomington and Indianapolis, 2000.
Harrison, Steve; and Paul Dourish: ‘Re-placing space: The roles of place and space in
collaborative systems’, in G. M. Olson; J. S. Olson; and M. S. Ackerman (eds.): CSCW’96:
Proceedings of the Conference on Computer-Supported Cooperative Work, 16-20 November
1996, Boston, Mass., ACM Press, New York, 1996, pp. 67-76.
Hauben, Michael; and Ronda Hauben: Netizens: On the History and Impact of Usenet and the
Internet, IEEE Computer Society Press, Los Alamitos, Calif., 1997.
Heart, Frank, et al.: Draft ARPANET Completion Report, [Bound computer printout, unpublished],
Bolt Beranek and Newman, 9 September 1977. – Quoted from Abbate: Inventing the Internet,
The MIT Press, Cambridge, Mass., and London, 1999.
Heart, Frank, et al.: ARPANET Completion Report, Bolt Beranek and Newman, 4 January 1978.
(Also published as A History of the ARPANET: The First Decade, BBN 1981, Report #4799.).
<http://www.cs.utexas.edu/users/chris/DIGITAL_ARCHIVE/ARPANET/DARPA4799.pdf>
Heath, Christian C.; and Paul Luff: ‘Collaborative activity and technological design: Task
coordination in London Underground control rooms’, in L. J. Bannon; M. Robinson; and K.
Schmidt (eds.): ECSCW’91: Proceedings of the Second European Conference on Computer-Supported Cooperative Work, 24–27 September 1991, Amsterdam, Kluwer Academic
Publishers, Dordrecht, 1991a, pp. 65-80.
Heath, Christian C.; and Paul Luff: ‘Disembodied conduct: Communication through video in a
multi-media office environment’, in S. P. Robertson; G. M. Olson; and J. S. Olson (eds.):
CHI’91 Conference Proceedings: ACM SIGCHI Conference on Human Factors in Computing
Systems, New Orleans, Louisiana, 27 April-2 May 1991, ACM Press, New York, N.Y., 1991b,
pp. 99-103.
Heath, Christian C.; and Paul Luff: ‘Collaboration and control: Crisis management and multimedia
technology in London Underground control rooms’, Computer Supported Cooperative Work
(CSCW): An International Journal, vol. 1, no. 1-2, 1992a, pp. 69-94.
Heath, Christian C.; and Paul Luff: ‘Media space and communicative asymmetries: Preliminary
observations of video mediated interaction’, Human-Computer Interaction, vol. 7, 1992b, pp.
315-346.
Heath, Christian C.; and Paul Luff: ‘Disembodied conduct: Interactional asymmetries in video-mediated communication’, in G. Button (ed.): Technology in Working Order: Studies of Work,
Interaction, and Technology, Routledge, London and New York, 1993, pp. 35-54.
Heath, Christian C., et al.: ‘Unpacking collaboration: the interactional organisation of trading in a
City dealing room’, Computer Supported Cooperative Work (CSCW): An International
Journal, vol. 3, no. 2, 1995a, pp. 147-165.
Heath, Christian C.; Paul Luff; and Abigail J. Sellen: ‘Reconsidering the virtual workplace:
Flexible support for collaborative activity’, in H. Marmolin; Y. Sundblad; and K. Schmidt
(eds.): ECSCW’95: Proceedings of the Fourth European Conference on Computer-Supported
Cooperative Work, 10–14 September 1995, Stockholm, Sweden, Kluwer Academic Publishers,
Dordrecht, 1995b, pp. 83-99.
Heath, Christian C.; and Paul Luff: ‘Convergent activities: Line control and passenger information
on the London Underground’, in Y. Engeström and D. Middleton (eds.): Cognition and
Communication at Work, Cambridge University Press, Cambridge, 1996, pp. 96-129.
Heath, Christian C., et al.: ‘Configuring awareness’, Computer Supported Cooperative Work
(CSCW): The Journal of Collaborative Computing, vol. 11, no. 3-4, 2002.
Henderson, Kathryn: On Line and On Paper: Visual representations, Visual Culture, and
Computer Graphics in Design Engineering, MIT Press, Cambridge, Mass., and London, 1999.
Heritage, John C.: Garfinkel and Ethnomethodology, Polity Press, Cambridge, 1984.
Hertzfeld, Andy: Revolution in the Valley, O’Reilly, Sebastopol, Calif., 2005.
Hewitt, Carl: ‘Viewing control structures as patterns of passing messages’, Artificial Intelligence,
vol. 8, 1977, pp. 323-364.
Hewitt, Carl: ‘Offices are open systems’, ACM Transactions on Office Information Systems, vol.
4, no. 3, July 1986, pp. 271-287.
Hilbert, David: ‘Mathematical problems’ (International Congress of Mathematicians, Paris, 1900).
Transl. from ‘Mathematische Probleme’. Transl. by M. F. Winston. In J. J. Gray: The Hilbert
Challenge. Oxford University Press, Oxford, 2000, pp. 240-282. – [German original in Archiv
für Mathematik und Physik, vol. 1, 1901; English transl. in Bulletin of the American
Mathematical Society, vol. 8, 1902]
Hilbert, David: ‘On the infinite’ (Münster, 4 June 1925). Transl. from ‘Über das Unendliche’.
Transl. by S. Bauer-Mengelberg. In J. van Heijenoort (ed.) From Frege to Gödel: A Source
Book in Mathematical Logic, 1879–1931. Harvard University Press, Cambridge, Mass., 1967,
pp. 367-392.
Hilbert, David: ‘The foundations of mathematics’ (Hamburg, July 1927). Transl. by S. Bauer-Mengelberg and D. Føllesdal. In J. van Heijenoort (ed.) From Frege to Gödel: A Source Book
in Mathematical Logic, 1879–1931. Harvard University Press, Cambridge, Mass., 1967, pp.
464-479.
Hillis, W. Daniel: The Pattern on the Stone: The Simple Ideas that Make Computers Work,
Weidenfeld & Nicolson, London, 1998. – Phoenix, London, 2001 (Paperback ed.).
Hiltz, Starr Roxanne; and Murray Turoff: The Network Nation: Human Communication via
Computer, Addison-Wesley, Reading, Mass., 1978. – The MIT Press, Cambridge, Mass., 1993
(Rev. ed.).
Hiltz, Starr Roxanne; and Murray Turoff: ‘The evolution of user behavior in a computerized
conferencing system’, Communications of the ACM, vol. 24, no. 11, November 1981, pp. 739-751.
Hiltz, Starr Roxanne; and Elaine B. Kerr: Studies of Computer Mediated Communications
Systems: A Synthesis of the Findings: Final Report on a Workshop Sponsored by the Division
of Information Science and Technology, National Science Foundation, Computerized
Conferencing and Communications Center, New Jersey Institute of Technology, Newark, New
Jersey, 1981. – Research Report Number 16. <http://library.njit.edu/archives/cccc-materials/>
Hiltz, Starr Roxanne; and Murray Turoff: ‘Structuring computer-mediated communication systems
to avoid information overload’, Communications of the ACM, vol. 28, no. 7, July 1985, pp.
680-689.
Hirschhorn, Larry: Beyond Mechanization: Work and Technology in a Postindustrial Age, MIT
Press, Cambridge, Mass., and London, 1984.
Hobbes, Thomas: Leviathan, or The Matter, Forme, & Power of a Common-Wealth Ecclesiaticall
and Civill, London, 1651. – Penguin Books, Harmondsworth, Middlesex, 1968.
Hodgskin, Thomas: Labour Defended Against the Claims of Capital, Or, The Unproductiveness of
Capital Proved with Reference to the Present Combinations Amongst Journeymen, Knight &
Lacey, London, 1825. – Augustus M. Kelley, Publishers, New York, 1969.
Holland, John H.: Hidden Order: How Adaptation Builds Complexity, Addison-Wesley Publishing
Co., Reading, Mass., 1995.
Holt, Anatol W.: ‘Coordination Technology and Petri Nets’, in G. Rozenberg (ed.): Advances in
Petri Nets 1985, vol. 222, Springer-Verlag, Berlin, 1985, pp. 278-296. Text ed. by G. Goos and
J. Hartmanis.
Howard, Robert: ‘Systems design and social responsibility: The political implications of
‘Computer-Supported Cooperative Work’: A commentary’, Office: Technology and People,
vol. 3, no. 2, 1987.
Howse, Derek: Greenwich Time and the Discovery of the Longitude, Oxford University Press,
Oxford, 1980. – Philip Wilson Publishers, London, 1997.
Hughes, John A., et al.: The Automation of Air Traffic Control, Lancaster Sociotechnics Group,
Department of Sociology, Lancaster University, Lancaster, UK, October 1988.
Hughes, John A.; David W. Randall; and Dan Z. Shapiro: ‘CSCW: Discipline or paradigm? A
sociological perspective’, in L. J. Bannon; M. Robinson; and K. Schmidt (eds.): ECSCW’91:
Proceedings of the Second European Conference on Computer-Supported Cooperative Work,
24–27 September 1991, Amsterdam, Kluwer Academic Publishers, Dordrecht, 1991, pp. 309-323.
Hughes, Thomas P.: ‘The evolution of large technological systems’, in W. E. Bijker; T. P. Hughes;
and T. J. Pinch (eds.): The Social Construction of Technological Systems: New Directions in
the Sociology and History of Technology, The MIT Press, Cambridge, Mass., and London,
1987, pp. 51-82. (Paperback ed., 1989).
Hull, David L.: Science as a Process: An Evolutionary Account of the Social and Conceptual
Development of Science, The University of Chicago Press, Chicago, 1988.
Hutchins, Edwin L.; James D. Hollan; and Donald A. Norman: ‘Direct manipulation interfaces’, in
D. A. Norman and S. W. Draper (eds.): User Centered System Design, Lawrence Erlbaum,
New Jersey, 1986, pp. 87-124.
Hutchins, Edwin L.: ‘Mediation and automatization’, Quarterly Newsletter of the Laboratory of
Comparative Human Cognition [University of California, San Diego], vol. 8, no. 2, April
1986, pp. 47-58.
Hutchins, Edwin L.: ‘The social organization of distributed cognition’, in L. B. Resnick; J. M.
Levine; and S. D. Teasley (eds.): Perspectives on Socially Shared Cognition, American
Psychological Association, Washington, DC, 1991, pp. 283-307.
Hutchins, Edwin L.: Cognition in the Wild, The MIT Press, Cambridge, Mass., and London,
England, 1995.
Hutchins, Edwin L.; and Tove Klausen: ‘Distributed cognition in an airline cockpit’, in Y.
Engeström and D. Middleton (eds.): Cognition and Communication at Work, Cambridge
University Press, Cambridge, 1996, pp. 15-34.
Ishii, Hiroshi: ‘TeamWorkStation: Towards a seamless shared workspace’, in T. Bikson and F.
Halasz (eds.): CSCW’90: Proceedings of the Conference on Computer-Supported Cooperative
Work, 7-10 October 1990, Los Angeles, Calif., ACM Press, New York, 1990, pp. 13-26.
Ishii, Hiroshi; Minoru Kobayashi; and Jonathan Grudin: ‘Integration of inter-personal space and
shared workspace: ClearBoard design and experiments’, in M. M. Mantei; R. M. Baecker; and
R. E. Kraut (eds.): CSCW’92: Proceedings of the Conference on Computer-Supported
Cooperative Work, 31 October–4 November 1992, Toronto, Canada, ACM Press, New York,
1992, pp. 33-42.
Jirotka, Marina, et al. (eds.): ‘Special Issue: Collaboration in e-Research’. [Special theme of]
Computer Supported Cooperative Work (CSCW): The Journal of Collaborative Computing,
vol. 15, no. 4, August 2006.
Johannsen, Gunnar, et al.: ‘Final report of experimental psychology group’, in N. Moray (ed.):
Mental Workload: Its Theory and Measurement, Plenum Press, New York and London, 1979,
pp. 101-114.
Johansen, Robert: Groupware: Computer Support for Business Teams, Free Press, New York and
London, 1988.
Johnson, Allen W.; and Timothy Earle: The Evolution of Human Societies: From Foraging Group
to Agrarian State, Stanford University Press, Stanford, Calif., 1987.
Johnson, Jeff, et al.: ‘The Xerox Star: A Retrospective’, IEEE Computer, vol. 22, no. 9, September
1989, pp. 11-26, 28-29.
Johnson, Philip: ‘Supporting exploratory CSCW with the EGRET framework’, in M. M. Mantei;
R. M. Baecker; and R. E. Kraut (eds.): CSCW’92: Proceedings of the Conference on
Computer-Supported Cooperative Work, 31 October–4 November 1992, Toronto, Canada,
ACM Press, New York, 1992, pp. 298-305.
Johnson-Lenz, Peter; and Trudy Johnson-Lenz: ‘Groupware: the process and impacts of design
choices’, in E. B. Kerr and S. R. Hiltz (eds.): Computer-Mediated Communication Systems,
Academic Press, New York, 1982.
Johnston, William A.; and Veronica J. Dark: ‘Selective attention’, Annual Review of Psychology,
vol. 37, 1986, pp. 43-75.
Jones, Stephen R. G.: ‘Was there a Hawthorne effect?’, The American Journal of Sociology, vol.
98, no. 3, November 1992, pp. 451-468.
Jordan, Nehemiah: ‘Allocation of functions between man and machines in automated systems’,
Journal of Applied Psychology, vol. 47, no. 3, June 1963, pp. 161-165.
Kaavé, Bjarne: Undersøgelse af brugersamspil i system til produktionsstyring, M.Sc. diss.,
Technical University of Denmark, Lyngby, Denmark, 1990.
Kanigel, Robert: The One Best Way: Frederick Winslow Taylor and the Enigma of Efficiency, The
MIT Press, Cambridge, Mass., and London, 1997. (Paperback ed., 2005).
Kant, Immanuel: ‘Über den Gemeinspruch: Das mag in der Theorie richtig sein, taugt aber nicht
für die Praxis’ (Berlinische Monatsschrift, September 1793). Text ed. by W. Weischedel. In I.
Kant: Werke in zwölf Bänden. Suhrkamp Verlag, Frankfurt a. M., 1964, vol. XI, pp. 125-172.
Kantowitz, Barry H.; and Robert D. Sorkin: ‘Allocation of functions’, in G. Salvendy (ed.):
Handbook of Human Factors, John Wiley, New York etc., 1987, pp. 355-369.
Kaplan, Simon M., et al.: ‘Flexible, active support for collaborative work with Conversation
Builder’, in M. M. Mantei; R. M. Baecker; and R. E. Kraut (eds.): CSCW’92: Proceedings of
the Conference on Computer-Supported Cooperative Work, 31 October–4 November 1992,
Toronto, Canada, ACM Press, New York, 1992, pp. 378-385.
Kaptelinin, Victor: ‘Computer-mediated activity: Functional organs in social and developmental
contexts’, in B. A. Nardi (ed.): Context and Consciousness: Activity Theory and Human-Computer Interaction, The MIT Press, Cambridge, Mass., 1997, pp. 45-68.
Kasbi, Catherine; and Maurice de Montmollin: ‘Activity without decision and responsibility: The
case of nuclear power plants’, in J. Rasmussen; B. Brehmer; and J. Leplat (eds.): Distributed
Decision Making. Cognitive Models for Cooperative Work, John Wiley & Sons, Chichester,
1991, pp. 275-283.
Kauffman, Stuart: At Home in the Universe: The Search for Laws of Self-Organization and
Complexity, Oxford University Press, New York and Oxford, 1995.
Kawell, Jr., Leonard, et al.: ‘Replicated document management in a group communication
system’, in I. Greif and L. A. Suchman (eds.): CSCW’88: Proceedings of the Conference on
Computer-Supported Cooperative Work, 26-28 September 1988, Portland, Oregon, ACM
Press, New York, 1988, pp. 395[-404].
Kelley, Charles R.: Manual and Automatic Control: A Theory of Manual Control and its
Application to Manual and to Automatic Systems, Wiley, New York, 1968.
Kern, Horst; and Michael Schumann: Industriearbeit und Arbeiterbewußtsein: Eine empirische
Untersuchung über den Einfluß der aktuellen technischen Entwicklung auf die industrielle
Arbeit und das Arbeiterbewußtsein, vol. 1-2, Europäische Verlagsanstalt, Frankfurt am Main,
1970.
Kerr, Elaine B.; and Starr Roxanne Hiltz: Computer-Mediated Communication Systems: Status
and Evaluation, Academic Press, Orlando, etc., 1982.
Kiesler, Sara; Jane Siegel; and Timothy W. McGuire: ‘Social psychological aspects of computer-mediated communication’, American Psychologist, vol. 39, no. 10, October 1984, pp. 1123-1134.
Kling, Rob: ‘Social analyses of computing: Theoretical perspectives in recent empirical research’,
Computing Surveys, vol. 12, no. 1, March 1980, pp. 61-110.
Kling, Rob: ‘Cooperation, coordination and control in computer-supported work’,
Communications of the ACM, vol. 34, no. 12, December 1991, pp. 83-88.
Kochhar, A. K.; and N. D. Burns: Microprocessors and their Manufacturing Applications, Edward
Arnold, London, 1983.
Koren, Yoram: Computer Control of Manufacturing Systems, McGraw-Hill, New York, etc.,
1983.
Korsch, Karl: Karl Marx, Chapman & Hall, London, 1938. – Russell & Russell, New York, 1963.
Kraut, Robert E.; Carmen Egido; and Jolene Galegher: ‘Patterns of contact and communication in
scientific research collaboration’, in J. Galegher; R. E. Kraut; and C. Egido (eds.): Intellectual
Teamwork: Social and Technological Foundations of Cooperative Work, Lawrence Erlbaum,
Hillsdale, New Jersey, 1990, pp. 149-171.
Kreifelts, Thomas: ‘DOMINO: Ein System zur Abwicklung arbeitsteiliger Vorgänge im Büro’,
Angewandte Informatik, vol. 26, no. 4, 1984, pp. 137-146.
Kreifelts, Thomas, et al.: ‘Experiences with the DOMINO office procedure system’, in L. J.
Bannon; M. Robinson; and K. Schmidt (eds.): ECSCW’91: Proceedings of the Second
European Conference on Computer-Supported Cooperative Work, 24–27 September 1991,
Amsterdam, Kluwer Academic Publishers, Dordrecht, 1991a, pp. 117-130.
Kreifelts, Thomas, et al.: ‘A design tool for autonomous agents’, in J. M. Bowers and S. D.
Benford (eds.): Studies in Computer Supported Cooperative Work. Theory, Practice and
Design, North-Holland, Amsterdam, 1991b, pp. 131-144.
Kreifelts, Thomas; Elke Hinrichs; and Gerd Woetzel: ‘Sharing to-do lists with a distributed task
manager’, in G. De Michelis; C. Simone; and K. Schmidt (eds.): ECSCW’93: Proceedings of
the Third European Conference on Computer-Supported Cooperative Work, 13-17 September
1993, Milano, Italy, Kluwer Academic Publishers, Dordrecht, 1993, pp. 31-46.
Kuhn, Thomas S.: The Structure of Scientific Revolutions (1962; 2nd ed. 1969). University of
Chicago Press, Chicago, 1969.
Kuutti, Kari: ‘Activity Theory as a potential framework for Human-Computer Interaction
research’, in B. A. Nardi (ed.): Context and Consciousness: Activity Theory and Human-Computer Interaction, The MIT Press, Cambridge, Mass., 1997, pp. 17-44.
La Porte, Todd R. (ed.): Organized Social Complexity: Challenge to Politics and Policy, Princeton
University Press, Princeton, New Jersey, 1975.
La Porte, Todd R.; and Paula M. Consolini: ‘Working in practice but not in theory: Theoretical
challenges of “high-reliability organizations”’, Journal of Public Administration Research and
Theory, vol. 1, no. 1, 1991, pp. 19-47.
Laiserin, Jerry: ‘The convergence of CAD standards’, Digital Architect, April 2002.
Lakatos, Imre: ‘A renaissance of empiricism in the recent philosophy of mathematics?’ (1967).
Text ed. by J. Worrall and G. Currie. In I. Lakatos: Mathematics, Science and Epistemology:
Philosophical Papers, Volume 2. (Paperback ed., 1980). Cambridge University Press,
Cambridge, 1978, pp. 24-42.
Lakatos, Imre: ‘Falsification and the methodology of scientific research programmes’, in I.
Lakatos and A. Musgrave (eds.): Criticism and the Growth of Knowledge: Proceedings of the
International Colloquium in the Philosophy of Science, London, 1965, Cambridge University
Press, Cambridge, 1970, pp. 91-196. – Also in Imre Lakatos: The Methodology of Scientific
Research Programmes: Philosophical Papers, Volume 1, Cambridge University Press.
Cambridge, 1978, pp. 8-101.
Lampson, Butler W.: ‘Personal distributed computing: The Alto and Ethernet software’, in A.
Goldberg (ed.): A History of Personal Workstations, Addison-Wesley, Reading, Mass., 1988,
pp. 293-335.
Landes, David S.: Revolution in Time: Clocks and the Making of the Modern World, Harvard
University Press, Cambridge, Mass., 1983.
Latour, Bruno: Science in Action: How to follow scientists and engineers through society, Harvard
University Press, Cambridge, Mass., 1987.
Laudan, Rachel: ‘Introduction’, in R. Laudan (ed.): The Nature of Technological Knowledge: Are
Models of Scientific Change Relevant?, D. Reidel Publishing, Dordrecht, Boston, and London,
1984, pp. 1-26.
Lauwers, J. Chris; and Keith A. Lantz: ‘Collaboration awareness in support of collaboration
transparency: Requirements for the next generation of shared window systems’, in J. C. Chew
and J. Whiteside (eds.): CHI’90 Conference Proceedings: ACM SIGCHI Conference on
Human Factors in Computing Systems, Seattle, Washington, 1-5 April 1990, ACM Press, New
York, N.Y., 1990, pp. 303-311.
Lave, Jean: ‘Situated learning in communities of practice’, in L. B. Resnick; J. M. Levine; and S.
D. Teasley (eds.): Perspectives on Socially Shared Cognition, American Psychological
Association, Washington, DC, 1991, pp. 63-82.
Law, John; and John Hassard (eds.): Actor Network Theory and After, Blackwell, London, 1999.
Layton, Edwin T.: ‘Technology as knowledge’, Technology and Culture, vol. 15, no. 1, January
1974, pp. 31-41.
LeCuyer, A.: ‘Design on the Computer: Frank O. Gehry und Peter Eisenman’, ARCH+, no. 128,
September 1995, pp. 26-29.
Lee, Charlotte P., et al. (eds.): ‘Special Issue: Scientific collaboration through cyberinfrastructure’.
[Special theme of] Computer Supported Cooperative Work (CSCW): The Journal of
Collaborative Computing, vol. 19, 2010. – [Forthcoming].
Lévi-Strauss, Claude: La pensée sauvage, Librairie Plon, Paris, 1962. – Plon, Paris, 1990.
Licklider, Joseph Carl Robnett: ‘Man-computer symbiosis’ (IRE Transactions on Human Factors
in Electronics, March 1960). In R. W. Taylor (ed.): In Memoriam: J. C. R. Licklider, 1915-1990. Digital Systems Research Center, Palo Alto, Calif., 1990, pp. 4-11. – [Also in A.
Goldberg (ed.): A History of Personal Workstations, Addison-Wesley, Reading, Mass., 1988,
pp. 131-140]
Licklider, Joseph Carl Robnett; and Welden E. Clark: ‘On-line man-computer communication’, in
G. A. Barnard (ed.): SJCC’62: Proceedings of the Spring Joint Computer Conference, 1-3 May
1962, San Francisco, California, vol. 21, AFIPS Press, 1962, pp. 113-128.
Lindgren, Michael: Glory and Failure: The Difference Engines of Johann Müller, Charles
Babbage and Georg and Edvard Scheutz, Linköping University, Linköping, 1987.
List, Friedrich: Das nationale System der politischen Ökonomie, Cotta Verlag, Stuttgart u.
Tübingen, 1841a.
List, Friedrich: The National System of Political Economy, J. B. Lippincott, Philadelphia, 1856.
Transl. from Das nationale System der politischen Ökonomie, Cotta Verlag, Stuttgart u.
Tübingen, 1841b. Transl. by G. A. Matile.
Lojek, Bo: History of Semiconductor Engineering, Springer, Berlin-Heidelberg, 2007.
Luff, Paul; Jon Hindmarsh; and Christian C. Heath (eds.): Workplace Studies: Recovering Work
Practices and Informing System Design, Cambridge University Press, Cambridge, 2000.
Luke, Hugh D.: Automation for Productivity, John Wiley & Sons, New York etc., 1972.
Lutters, Wayne G.; and Mark S. Ackerman: ‘Achieving safety: A field study of boundary objects
in aircraft technical support’, in E. Churchill, et al. (eds.): CSCW 2002: ACM Conference on
Computer Supported Cooperative Work, 16 - 20 November 2002, New Orleans, Louisiana,
ACM Press, New York, 2002, pp. 266-275.
Lynch, Michael: ‘Extending Wittgenstein: The Pivotal Move from Epistemology to the Sociology
of Science’, in A. Pickering (ed.): Science as Practice and Culture, Chicago University Press,
Chicago, 1993, pp. 215-265.
Maas, W.; and J. van Rijs: FARMAX: Excursions on Density, 010 Publishers, Rotterdam, 1998.
Mahoney, Michael S.: ‘Computers and mathematics: The search for a discipline of computer
science’, in J. Echeverria; A. Ibarra; and T. Mormann (eds.): The Space of Mathematics:
Philosophical, Epistemological, and Historical Explorations, Walter de Gruyter, Berlin and
New York, 1992, pp. 349-363.
Mahoney, Michael S.: ‘The histories of computing(s)’, Interdisciplinary Science Reviews, vol. 30,
no. 2, 2005, pp. 119-135.
Malcolm, Norman: Nothing is Hidden: Wittgenstein’s Criticism of His Early Thought, Basil
Blackwell, Oxford, 1986.
Malcolm, Norman: Wittgensteinian Themes: Essays, 1978-1989, Cornell University Press, Ithaca
and London, 1995. Text ed. by G. H. von Wright.
Malone, Thomas W.; JoAnne Yates; and Robert I. Benjamin: ‘Electronic markets and electronic
hierarchies’, Communications of the ACM, vol. 30, no. 6, June 1987, pp. 484-497.
Malone, Thomas W.; and Kevin Crowston: ‘What is coordination theory and how can it help
design cooperative work systems?’, in T. Bikson and F. Halasz (eds.): CSCW’90: Proceedings
of the Conference on Computer-Supported Cooperative Work, 7-10 October 1990, Los
Angeles, Calif., ACM Press, New York, 1990, pp. 357-370.
Malone, Thomas W.; Kum-Yew Lai; and Christopher Fry: ‘Experiments with Oval: A radically
tailorable tool for cooperative work’, in M. M. Mantei; R. M. Baecker; and R. E. Kraut (eds.):
CSCW’92: Proceedings of the Conference on Computer-Supported Cooperative Work, 31
October–4 November 1992, Toronto, Canada, ACM Press, New York, 1992, pp. 289-297.
Malone, Thomas W.; and Kevin Crowston: The Interdisciplinary Study of Coordination, Center of
Coordination Science, MIT, Cambridge, Mass., 1992.
Malone, Thomas W., et al.: ‘Tools for inventing organizations: Toward a handbook of
organizational processes’, in: Proceedings of the 2nd IEEE Workshop on Enabling
Technologies Infrastructure for Collaborative Enterprises, 20-22 April 1993, Morgantown,
West Virginia, 1993.
Malone, Thomas W.; Kum-Yew Lai; and Christopher Fry: ‘Experiments with Oval: A radically
tailorable tool for cooperative work’, ACM Transactions on Information Systems, vol.
13, no. 2, April 1995, pp. 177-205.
Mantei, Marilyn M., et al.: ‘Experiences in the use of a media space’, in S. P. Robertson; G. M.
Olson; and J. S. Olson (eds.): CHI’91 Conference Proceedings: ACM SIGCHI Conference on
Human Factors in Computing Systems, New Orleans, Louisiana, 27 April-2 May 1991, ACM
Press, New York, N.Y., 1991, pp. 203-208.
March, James G.; and Herbert A. Simon: Organizations, John Wiley, New York, 1958.
Mariani, John A.; and Wolfgang Prinz: ‘From multi-user to shared object systems: Awareness
about co-workers in cooperation support object databases’, in H. Reichel (ed.): GI-Tagung:
Informatik - Wirtschaft - Gesellschaft, Springer, Berlin-Heidelberg, 1993, pp. 476-481.
<http://www.fit.fraunhofer.de/~prinz/papers/gi93-awareness.html>
Mark, Gloria: ‘Conventions and commitments in distributed CSCW groups’, Computer Supported
Cooperative Work (CSCW): The Journal of Collaborative Computing, vol. 11, no. 3-4, 2002.
Marx, Karl: ‘Thesen über Feuerbach’ (Manuscript, Spring 1845a). In K. Marx and F. Engels:
Werke. Dietz Verlag, Berlin, 1962, pp. 5-7.
Marx, Karl: Brüsseler Hefte 1845 (Manuscript, 1845b). Text ed. by G. A. Bagaturija, et al. In K.
Marx and F. Engels: Gesamtausgabe (MEGA➁). Akademie Verlag, Berlin, 1998, vol. IV/3, pp.
111-433.
Marx, Karl: Misère de la philosophie: Reponse a La philosophie de la misère de M. Proudhon, A.
Frank / C. G. Vogler, Paris / Bruxelles, 1847.
Marx, Karl: ‘Der achtzehnte Brumaire des Louis Bonaparte’ (Die Revolution: Eine Zeitschrift in
zwanglosen Heften, New York, 1852). Text ed. by M. Hundt, et al. In K. Marx and F. Engels:
Gesamtausgabe (MEGA➁). Dietz Verlag, Berlin, 1985, vol. I/11, pp. 96-189.
Marx, Karl: ‘Einleitung’ (Manuscript, August 1857). Text ed. by V. K. Brušlinskij; L. R.
Mis’kevič; and A. G. Syrov. In K. Marx and F. Engels: Gesamtausgabe (MEGA➁). Dietz
Verlag, Berlin, 1976-81, vol. II/1.1 (Ökonomische Manuskripte 1857/58), pp. 17-45. – English
transl. by Nicolaus, in Marx: Grundrisse, Pelican, Harmondsworth, 1973.
Marx, Karl: Grundrisse der Kritik der politischen Ökonomie (Manuscript, 1857-58a). Text ed. by
V. K. Brušlinskij; L. R. Mis’kevič; and A. G. Syrov. In K. Marx and F. Engels:
Gesamtausgabe (MEGA➁). Dietz Verlag, Berlin, 1976-1981, vol. II/1, pp. 47-747.
Marx, Karl: Grundrisse der Kritik der politischen Ökonomie (Rohentwurf), 1857-1858. Anhang
1850-1859 (Manuscript, 1857-58b). Dietz Verlag, Berlin, 1953.
Marx, Karl: Zur Kritik der politischen Ökonomie (Manuskript 1861-1863) (1861-63). Text ed. by
A. Schnickmann, et al. In K. Marx and F. Engels: Gesamtausgabe (MEGA➁). Dietz Verlag,
Berlin, 1976-1982, vol. II/3.1-II/3.6.
Marx, Karl: [Letter to Friedrich Engels], London, 28 January 1863. In K. Marx and F. Engels:
Werke. Dietz Verlag, Berlin, 1964, vol. 30, pp. 319-323.
Marx, Karl: ‘Ökonomische Manuscripte 1863-1867’ (1863-67). In K. Marx and F. Engels (eds.):
Gesamtausgabe (MEGA).
Marx, Karl: Das Kapital. Kritik der politischen Ökonomie. Erster Band. Buch I: Der
Produktionsprocess des Kapitals (Hamburg, 1867a). Text ed. by E. Kopf, et al. In K. Marx and
F. Engels: Gesamtausgabe (MEGA➁). Dietz Verlag, Berlin, 1983, vol. II/5.
Marx, Karl: Das Kapital. Kritik der politischen Ökonomie. Erster Band. Buch I: Der
Produktionsprocess des Kapitals (Hamburg, 1867b; 4th ed., 1890). Text ed. by F. Engels. In K.
Marx and F. Engels: Werke. Dietz Verlag, Berlin, 1962, vol. 23.
Marx, Karl: Capital: A Critique of Political Economy. Book One: The Process of Production of
Capital (Hamburg, 1867c; 4th ed., 1890). Transl. by S. Moore and E. Aveling. Text ed. by F.
Engels. In K. Marx and F. Engels: Collected Works. Lawrence & Wishart, London, 1996, vol.
35.
Marx, Karl: ‘À la rédaction de l’Otečestvennyâ Zapiski’ (Draft letter, in French, October-November 1877). Text ed. by H. Schwab, et al. In K. Marx and F. Engels: Gesamtausgabe
(MEGA➁). Dietz Verlag, Berlin, 1985, vol. I/25, pp. 112-117.
Marx, Karl: ‘Lettre à Vera Ivanovna Zasulič (Projet I-IV)’ (Draft letters, in French, 18 February – 8 March 1881). Text ed. by H. Schwab, et al. In K. Marx and F. Engels: Gesamtausgabe
(MEGA➁). Dietz Verlag, Berlin, 1985, vol. I/25, pp. 217-240.
Mayr, Ernst; and Peter D. Ashlock: Principles of Systematic Zoology, McGraw-Hill, New York,
1991. (2nd ed.; 1st ed. 1969).
Mayr, Ernst; and Walter J. Bock: ‘Classifications and other ordering systems’, Journal of
Zoological Systematics & Evolutionary Research, vol. 40, no. 4, December 2002, pp. 169-194.
Mayr, Otto: The Origins of Feedback Control, The MIT Press, Cambridge, Mass., and London,
1970. Transl. from Zur Frühgeschichte der technischen Regelungen, R. Oldenbourg Verlag,
Wien, 1969.
McCarthy, John: ‘Reminiscences on the history of time sharing’, Stanford University, Winter or
Spring 1983. <http://www-formal.stanford.edu/jmc/history/timesharing/timesharing.html>
McKenney, James L.: Waves of Change: Business Evolution through Information Technology,
Harvard Business School Press, Boston, Mass., 1995.
McNeese, Michael; Eduardo Salas; and Mica R. Endsley (eds.): New Trends in Cooperative
Activities: Understanding System Dynamics in Complex Environments, Human Factors and
Ergonomics Society, Santa Monica, Calif., 2001.
McPherson, J. C.: ‘Mathematical operations with punched cards’, Journal of the American
Statistical Association, vol. 37, no. 218, June 1942, pp. 275-281.
Medina-Mora, Raul, et al.: ‘The Action Workflow approach to workflow management
technology’, in M. M. Mantei; R. M. Baecker; and R. E. Kraut (eds.): CSCW’92: Proceedings
of the Conference on Computer-Supported Cooperative Work, 31 October–4 November 1992,
Toronto, Canada, ACM Press, New York, 1992, pp. 281-288.
Mendels, Franklin F.: ‘Proto-industrialization: The first phase of the industrialization process’, The
Journal of Economic History, vol. 32, no. 1, March 1972, pp. 241-261.
Merchant, M. Eugene: ‘Flexible Manufacturing Systems: Robotics and computerized automation’,
The Annals of the American Academy of Political and Social Science, vol. 470, no. 1, 1983, pp.
123-135.
Mickler, Otfried; Eckhard Dittrich; and Uwe Neumann: Technik, Arbeitsorganisation und Arbeit:
Eine empirische Untersuchung in der automatischen Produktion, Aspekte Verlag, Frankfurt
am Main, 1976.
Mill, John Stuart: Principles of Political Economy, with Some of Their Applications to Social
Philosophy, John W. Parker, London, 1848.
Miller, George Armitage: ‘What is information measurement?’, The American Psychologist, vol.
8, January 1953, pp. 3-11.
Miller, George Armitage: ‘The magical number seven, plus or minus two: Some limits on our
capacity for processing information’, Psychological Review, vol. 63, no. 2, March 1956, pp.
81-97.
Miller, George Armitage; Eugene Galanter; and Karl H. Pribram: Plans and the Structure of
Behavior, Holt, Rinehart & Winston, New York, 1960.
Mills, Charles Wright: White Collar: The American Middle Classes, Oxford University Press,
New York, 1951.
Mindell, David A.: Between Human and Machine: Feedback, Control, and Computing before
Cybernetics, Johns Hopkins University Press, Baltimore and London, 2002.
Mintzberg, Henry: The Structuring of Organizations: A Synthesis of the Research, Prentice-Hall,
Englewood-Cliffs, New Jersey, 1979.
Mitchell, W. J. T.: Picture Theory: Essays on Verbal and Visual Representation, The University of
Chicago Press, Chicago, 1994.
Mitrofanov, Sergei Petrovich: Scientific Principles of Group Technology, National Lending
Library for Science and Technology, Boston Spa, Yorkshire, England, 1966. Transl. from
Nauchnye osnovy gruppovoi tekhnologii, Leningrad, 1959. Transl. by E. Harris. Text ed. by
T. J. Grayson.
Monden, Yasuhiro: Toyota Production System: Practical Approach to Production Management,
Industrial Engineering and Management Press, Institute of Industrial Engineers, Norcross,
Georgia, 1983.
Monk, Ray: Bertrand Russell: The Spirit of Solitude, Jonathan Cape, London, 1996. – Vintage,
1997 (Paperback ed.).
Monod, Jacques: Chance and Necessity: An Essay on the Natural Philosophy of Modern Biology,
Collins, London, 1972. Transl. from Le hasard et la nécessité. Essai sur la philosophie
naturelle de la biologie moderne, Editions Seuil, Paris, 1970. Transl. by A. Wainhouse.
Moore Jr., Barrington: Social Origins of Dictatorship and Democracy: Lord and Peasant in the
Making of the Modern World, Boston, 1966. – Penguin University Books, Harmondsworth,
1973.
Moray, Neville (ed.): Mental Workload: Its Theory and Measurement, Plenum Press, New York
and London, 1979a.
Moray, Neville: ‘Models and measures of mental workload’, in N. Moray (ed.): Mental Workload:
Its Theory and Measurement, Plenum Press, New York and London, 1979b, pp. 13-21.
Morse, Philip M.; and George E. Kimball: Methods of Operations Research, The Technology
Press of MIT, Cambridge, Mass., 1951. (rev. ed.; 1st ed., US Navy, 1946). – Dover
Publications, Mineola, New York, 2003.
Multhauf, Robert P.: ‘Some observations on the state of the history of technology’, Technology
and Culture, vol. 15, no. 1, January 1974, pp. 1-12.
Mumford, Lewis: Technics and Civilization, Harcourt Brace & Co., New York, 1934. – Harcourt
Brace & Co., San Diego, etc., 1963 (Paperback ed.).
Myer, Theodore H.; and David Dodds: ‘Notes on the development of message technology’, in D.
M. Austin (ed.): Berkeley Workshop on Distributed Data Management and Computer
Networks, 1976, Lawrence Berkeley Laboratories, 1976, pp. 144-154. – LBL-5315.
Napper, R. B. E.: ‘The Manchester Mark I Computers’, in R. Rojas and U. Hashagen (eds.): The
First Computers: History and Architectures, The MIT Press, Cambridge, Mass., and London,
2000, pp. 365-378.
Nardi, Bonnie A.: A Small Matter of Programming: Perspectives on End User Computing, The
MIT Press, Cambridge, Mass., 1993.
Nardi, Bonnie A.; Steve Whittaker; and Erin Bradner: ‘Interaction and outeraction: Instant
messaging in action’, in W. A. Kellogg and S. Whittaker (eds.): CSCW 2000: ACM Conference
on Computer Supported Cooperative Work, 2–6 December 2000, Philadelphia, Pennsylvania,
USA, ACM Press, New York, 2000, pp. 79-88.
Neisser, Ulric: Cognition and Reality: Principles and Implications of Cognitive Psychology, W. H.
Freeman & Co., San Francisco, 1976.
Newton, Janice: ‘Technology and cooperative labour among the Orokaiva’, Mankind, vol. 15, no.
3, December 1985, pp. 214-222.
Nicolis, Grégoire; and Ilya Prigogine: Exploring Complexity: An Introduction, W. H. Freeman &
Co., New York, 1989.
Norberg, Arthur L.: ‘High-technology calculation in the early 20th century: Punched card
machinery in business and government’, Technology and Culture, vol. 31, no. 4, October 1990,
pp. 753-779.
Norman, Donald A.; and Edwin L. Hutchins: Computation via direct manipulation, Institute for
Cognitive Science, University of California, San Diego, La Jolla, California, 1 August 1988. –
ONR Contract N00014-85-C-0133.
Norman, Donald A.: ‘Cognitive artifacts’, in J. M. Carroll (ed.): Designing Interaction:
Psychology at the Human-Computer Interface, Cambridge University Press, Cambridge, 1991,
pp. 17-38.
Nuland, Sherwin B.: [Review of] A. Gawande: Complications: A Surgeon’s Notes on an Imperfect
Science (Metropolitan Books), The New York Review of Books, vol. 49, no. 12, 18 July 2002,
pp. 10-13.
Nurminen, Markku I.: People or Computers: Three ways of looking at information systems,
Studentlitteratur, Lund, Sweden, 1988.
Nuttgens, Patrick: The Story of Architecture, Phaidon Press, London, 1997. (2nd ed.; 1st ed. 1983).
Nygaard, Kristen; and Ole-Johan Dahl: ‘The development of the SIMULA languages’, ACM
SIGPLAN Notices, vol. 13, no. 8, August 1978, pp. 245-272.
O’Neill, Judy Elizabeth: The Evolution of Interactive Computing through Time-sharing and
Networking, Ph.D. dissertation, University of Minnesota, 1992.
Odgaard, Irene: “Praksis var værre end forventet”: Produktionsgrupper på Ålestrup: Grundfos,
Denmark. Casestudie-rapport om nye former for arbejdsorganisation, IMF Project on new
forms of work organization. Unpublished report, Specialarbejderforbundet i Danmark (SiD),
København, 26 August 1994.
Odgaard, Irene, et al.: Produktionsgrupper: organisationsudvikling og IT-støtte, CO-industri etc.,
København, 1999.
Ohmae, Kenichi: Triad Power: The Coming Shape of Global Competition, Free Press, New York,
1985.
Ohno, Taiichi: Toyota Production System: Beyond Large-scale Production, Productivity Press,
New York, 1988.
Olson, David R.: The World on Paper: The Conceptual and Cognitive Implications of Writing and
Reading, Cambridge University Press, Cambridge, 1994.
Olson, Gary M.; and Judith S. Olson: ‘Groupware and computer-supported cooperative work’, in
J. A. Jacko and A. Sears (eds.): The Human-Computer Interaction Handbook: Fundamentals,
Evolving Technologies and Emerging Applications, Lawrence Erlbaum, Mahwah, New Jersey,
2003, pp. 583-595.
Opitz, H.; W. Eversheim; and H. P. Wiendahl: ‘Workpiece classification and its industrial
application’, International Journal of Machine Tool Design Research, vol. 9, 1969, pp. 39-50.
Opitz, H.: A Classification System to Describe Workpieces, Pergamon Press, Oxford, 1970. Transl.
from Werkstückbeschreibendes Klassifizierungssystem, Verlag W. Girardet, Essen. Transl. by
R. A. A. Taylor. Text ed. by W. R. MacConnell.
Orbiteam: ‘BSCW at work: References’, August 2009.
<http://www.bscw.de/english/references.html>
Orlikowski, Wanda J.: ‘Learning from NOTES: Organizational issues in groupware
implementation’, in M. M. Mantei; R. M. Baecker; and R. E. Kraut (eds.): CSCW’92:
Proceedings of the Conference on Computer-Supported Cooperative Work, 31 October–4
November 1992, Toronto, Canada, ACM Press, New York, 1992, pp. 362-369.
Orton, John: Semiconductors and the Information Revolution: Magic Crystals that Made IT
Happen, Academic Press, Amsterdam, etc., 2009.
Ouchi, William G.: ‘Markets, bureaucracies, and clans’, Administrative Science Quarterly, vol. 25,
March 1980, pp. 129-141.
Palme, Jacob: ‘You have 134 unread mail! Do you want to read them now?’, in H. T. Smith (ed.):
Proceedings of the IFIP Conference on Computer Based Message Services, Nottingham, U.K.,
Elsevier North-Holland, New York, 1984, pp. 175-184.
Panko, Raymond R.: ‘The outlook for computer mail’, Telecommunications Policy, June 1977, pp.
242-253.
Panko, Raymond R.: ‘Electronic mail’, in J. Slonim; E. A. Unger; and P. S. Fisher (eds.):
Advances in Data Communications Management, vol. 2, John Wiley & Sons, New York, 1984,
pp. 205-222.
Pankoke-Babatz, Uta (ed.): Computer Based Group Communication: The AMIGO Activity Model,
Ellis Horwood Publishers, Chichester, 1989a.
Pankoke-Babatz, Uta: ‘Preface’, in U. Pankoke-Babatz (ed.): Computer Based Group
Communication: The AMIGO Activity Model, Ellis Horwood Publishers, Chichester, 1989b,
pp. 13-16.
Pankoke-Babatz, Uta: ‘Group communication and electronic communication’, in U. Pankoke-Babatz (ed.): Computer Based Group Communication: The AMIGO Activity Model, Ellis
Horwood Publishers, Chichester, 1989c, pp. 17-66.
Peaucelle, Jean-Louis: ‘From Taylorism to post-Taylorism: Simultaneously pursuing several
management objectives’, Journal of Organizational Change Management, vol. 13, no. 5, 2000,
pp. 452-467.
Peaucelle, Jean-Louis: ‘Adam Smith’s use of multiple references for his pin making example’, The
European Journal of the History of Economic Thought, vol. 13, no. 4, 2006, pp. 489-512.
Peaucelle, Jean-Louis; and Stéphane Manin: ‘Billettes and the economic viability of pin-making in
1700’, in: Eleventh World Congress of Accounting Historians, July 2006, Nantes, 2006.
<http://adamsmithslostlegacy.com/ftp.adamsmithslostlegacy.com/Pinmaking-Billettes.pdf>
Peaucelle, Jean-Louis: Adam Smith et la division du travail: La naissance d'une fausse idée,
L’Harmattan, Paris, 2007.
Perronet, Jean-Rodolphe: ‘Epinglier: Description de la façon dont on fabrique les épingles à Laigle
en Normandie’, in D. Diderot and J. L. R. d’Alembert (eds.): Recueil de planches, sur les
sciences, les arts liberaux, et les arts méchaniques, avec leur explication. Troisième livraison,
298 Planches, Briasson, David & Le Breton, Paris, 1765, pp. 5:1-5:8. Text ed. by R.
Morrissey. – ARTFL Encyclopédie Projet, University of Chicago, 2008. – [Encyclopédie…,
vol. 21]. <http://portail.atilf.fr/encyclopedie/>
Perrow, Charles: ‘The Organizational Context of Human Factors Engineering’, Administrative
Science Quarterly, vol. 28, December 1983, pp. 521-541.
Perrow, Charles: Normal Accidents: Living with High-Risk Technologies, Basic Books, New York,
1984.
Perrow, Charles: Complex Organizations. A Critical Essay, Random House, New York, 1986.
(Third ed.).
Petty, William: Another Essay in Political Arithmetick, Concerning the Growth of the City of
London, with the Measures, Periods, Causes, and Consequences thereof (1683, vol. II). Text
ed. by C. H. Hull. In W. Petty: The Economic Writings of Sir William Petty. Cambridge
University Press, Cambridge, 1899, pp. 449-487. – Reprinted by Augustus M. Kelley, New
York, 1963.
Piel, Gerard, et al. (eds.): Automatic Control, G. Bell and Sons, London, 1955. (2nd ed.; 1st ed.
1955).
Pietroforte, Roberto: ‘Communication and governance in the building process’, Construction
Management and Economics, vol. 15, 1997, pp. 71-82.
Pitkin, Hanna Fenichel: Wittgenstein and Justice: On the Significance of Ludwig Wittgenstein for
Social and Political Thought, University of California Press, Berkeley, Calif., 1972.
(Paperback ed., 1993).
Plowman, Lydia; Yvonne Rogers; and Magnus Ramage: ‘What are workplace studies for?’, in H.
Marmolin; Y. Sundblad; and K. Schmidt (eds.): ECSCW’95: Proceedings of the Fourth
European Conference on Computer-Supported Cooperative Work, 10–14 September 1995,
Stockholm, Sweden, Kluwer Academic Publishers, Dordrecht, 1995, pp. 309-324.
Pollock, Friedrich: Automation: Materialien zur Beurteilung der ökonomischen und sozialen
Folgen, Europäische Verlagsanstalt, Frankfurt a. M., 1964.
Popitz, Heinrich, et al.: Technik und Industriearbeit: Soziologische Untersuchungen in der
Hüttenindustrie, J. C. B. Mohr, Tübingen, 1957.
Poppe, Johann Heinrich Moritz: Geschichte der Technologie seit der Wiederherstellung der
Wissenschaften bis an das Ende des achtzehnten Jahrhunderts: Erster Band, Röwer, Göttingen,
1807. – Georg Olms Verlag, Hildesheim etc., 1999.
Postel, Jonathan Bruce: ‘RFC 808: Summary of computer mail services meeting held at BBN on
10 January 1979’, Request for Comments, no. 808, Network Working Group, 1 March 1982.
<http://www.rfc-editor.org/rfc/rfc808.txt>
Potts, Colin; and Lara Catledge: ‘Collaborative conceptual design: A large software project case
study’, Computer Supported Cooperative Work (CSCW): The Journal of Collaborative
Computing, vol. 5, no. 4, 1996, pp. 415-445.
Pougès, Claude, et al.: ‘Conception de collecticiels pour l’aide à la prise de décision en situation
d’urgence: La nécessité d’une approche pluridisciplinaire et intégrée’, in B. Pavard (ed.):
Systèmes coopératifs: De la modélisation à la conception, Octares Éditions, Toulouse, 1994,
pp. 351-375.
Powell, Walter W.: ‘Neither market nor hierarchy: Network forms of organization’, in B. M. Staw
and L. L. Cummings (eds.): Research in Organizational Behavior, vol. 12, JAI Press,
Greenwich, Conn., 1989, pp. 295-336.
Prigogine, Ilya; and Isabelle Stengers: La nouvelle alliance: Métamorphose de la science, Paris,
1979.
Prinz, Wolfgang: ‘TOSCA: Providing organisational information to CSCW applications’, in G. De
Michelis; C. Simone; and K. Schmidt (eds.): ECSCW’93: Proceedings of the Third European
Conference on Computer-Supported Cooperative Work, 13-17 September 1993, Milano, Italy,
Kluwer Academic Publishers, Dordrecht, 1993, pp. 139-154.
Prinz, Wolfgang: ‘NESSIE: An awareness environment for cooperative settings’, in S. Bødker; M.
Kyng; and K. Schmidt (eds.): ECSCW’99: Proceedings of the Sixth European Conference on
Computer-Supported Cooperative Work, 12–16 September 1999, Copenhagen, Kluwer
Academic Publishers, Dordrecht, 1999, pp. 391-410.
Prinz, Wolfgang: ‘Industry perspective on Collaborative Environments’, in I. L. Ballesteros (ed.):
New Collaborative Working Environments 2020: Report on Industry-led FP7 Consultations
and 3rd Report of the Experts Group on Collaboration@Work, European Commission, DG
Information Society and Media, Bruxelles, February 2006, pp. 5-22.
Pycock, James; and Wes W. Sharrock: ‘The fault report form: Mechanisms of interaction in design
and development project work’, in K. Schmidt (ed.): Social Mechanisms of Interaction,
Computing Department, Lancaster University, Lancaster, UK, September 1994, pp. 257-294.
COMIC Deliverable D3.2. <ftp://ftp.comp.lancs.ac.uk/pub/comic>
Pycock, James: ‘Mechanisms of interaction and technologies of representation: Examining a case
study’, in K. Schmidt (ed.): Social Mechanisms of Interaction, Computing Department,
Lancaster University, Lancaster, UK, September 1994, pp. 123-148. COMIC Deliverable D3.2.
<ftp://ftp.comp.lancs.ac.uk/pub/comic>
Quarterman, John S.: The Matrix: Computer Networks and Conferencing Systems Worldwide,
Digital Press, Bedford, Mass., 1990.
Randall, David W.; Richard H. R. Harper; and Mark Rouncefield: Fieldwork for Design: Theory
and Practice, Springer, London, 2007a.
Randall, David W., et al.: ‘Ontology building as practical work: Lessons from CSCW’, in:
Proceedings of the third International Conference on e-Social Science, 7-9 October 2007, Ann
Arbor, Michigan, 2007b.
Rasmussen, Jens: On the Structure of Knowledge: A Morphology of Mental Models in a Man-Machine Context, Risø National Laboratory, Roskilde, Denmark, November 1979.
Rasmussen, Jens; and Morten Lind: Coping with complexity, Risø National Laboratory, Roskilde,
Denmark, June 1981. – Risø-M-2293, DK-4000 Roskilde, Denmark.
Rasmussen, Jens: ‘The role of hierarchical knowledge representation in decisionmaking and
system management’, IEEE Transactions on Systems, Man, and Cybernetics, vol. SMC-15, no.
2, March/April 1985, pp. 234-243.
Rasmussen, Jens: ‘A cognitive engineering approach to the modelling of decision making and its
organization in process control, emergency management, CAD/CAM, office systems, library
systems’, in W. B. Rouse (ed.): Advances in Man-Machine Systems Research, vol. 4, JAI
Press, Greenwich, Conn., 1988, pp. 165-243.
Rasmussen, Jens; Berndt Brehmer; and Jacques Leplat (eds.): Distributed Decision Making:
Cognitive Models for Cooperative Work, John Wiley & Sons, Chichester, 1991.
Réaumur, René Antoine Ferchault de; Henri Louis Duhamel du Monceau; and Jean-Rodolphe
Perronet: Art de l'épinglier par M. de Reaumur; avec des additions de M. Duhamel du
Monceau, & des remarques extraites des memoires de M. Perronet, inspecteur general des
ponts & chaussees, Saillant et Noyon, Paris, 1761.
Redlow, Götz: Theoria: Theoretische und praktische Lebensauffassung im philosophischen
Denken der Antike, VEB Deutscher Verlag der Wissenschaften, Berlin, 1966.
Redmond, Kent C.; and Thomas M. Smith: Project Whirlwind: The History of a Pioneer
Computer, Digital Press, Bedford, Mass., 1980.
Redmond, Kent C.; and Thomas M. Smith: From Whirlwind to MITRE: The R&D Story of the
SAGE Airdefense Computer, The MIT Press, Cambridge, Mass., and London, 2000.
Reekie, Fraser: Reekie’s Architectural Drawing, Architectural Press, Amsterdam etc., 1995. Text
ed. by T. McCarthy. (4th ed.; 1st ed. 1946).
Reid, T. R.: The Chip: How Two Americans Invented the Microchip and Launched a Revolution,
Random House, New York, 2001. (2nd ed.; 1st ed., 1985).
Resnick, Lauren B.; John M. Levine; and Stephanie D. Teasley (eds.): Perspectives on Socially
Shared Cognition, American Psychological Association, Washington D.C., 1991.
Reynolds, Peter C.: On the Evolution of Human Behavior: The Argument from Animals to Man,
University of California Press, Berkeley, Calif., 1981.
Rice, Ronald E.: ‘Computer-mediated communication and organizational innovation’, Journal of
Communication, vol. 37, no. 4, Autumn 1987, pp. 65-94.
Rice, Ronald E.: ‘Computer-mediated communication system network data: Theoretical concerns
and empirical examples’, International Journal of Man-Machine Studies, vol. 32, no. 6, June
1990, pp. 627-647.
Riordan, Michael; and Lillian Hoddeson: Crystal Fire: The Invention of the Transistor and the
Birth of the Information Age, W. W. Norton & Co., New York and London, 1997. (Paperback
ed., 1998).
Rittel, Horst W. J.; and Melvin M. Webber: ‘Dilemmas in a general theory of planning’, Policy
Sciences, vol. 4, 1973, pp. 155-169.
Roberts, Lawrence G.: ‘Multiple computer networks and intercomputer communication’, in J.
Gosden and B. Randell (eds.): SOSP’67: Proceedings of the First ACM Symposium on
Operating System Principles, 1-4 October 1967, Gatlinburg, Tennessee, ACM Press, New
York, 1967, pp. 3.1-3.6.
Roberts, Lawrence G.; and Barry D. Wessler: ‘Computer network development to achieve
resource sharing’, in H. L. Cooke (ed.): SJCC’70: Proceedings of the Spring Joint Computer
Conference, 5-7 May 1970, Atlantic City, New Jersey, vol. 36, AFIPS Press, Montvale, New
Jersey, 1970, pp. 543-549.
Robertson, Toni: Designing Over Distance: A Study of Cooperative Work, Embodied Cognition
and Technology to Enable Remote Collaboration, Submitted for the Degree of Doctor of
Philosophy, School of Computing Sciences, University of Technology, Sydney, 1997. 195 pp.
Robertson, Toni: ‘The public availability of actions and artefacts’, Computer Supported
Cooperative Work (CSCW): The Journal of Collaborative Computing, vol. 11, no. 3-4, 2002.
Robinson, Mike: ‘Double-level languages and co-operative working’, AI & Society, vol. 5, 1991,
pp. 34-60.
Robinson, Mike; and Liam J. Bannon: ‘Questioning representations’, in L. J. Bannon; M.
Robinson; and K. Schmidt (eds.): ECSCW’91: Proceedings of the Second European
Conference on Computer-Supported Cooperative Work, 24–27 September 1991, Amsterdam,
Kluwer Academic Publishers, Dordrecht, 1991, pp. 219-233.
Rochlin, Gene I.; Todd R. La Porte; and Karlene H. Roberts: ‘The self-designing high-reliability
organization: Aircraft carrier flight operations at sea’, Naval War College Review, Autumn
1987, pp. 76-90.
Rochlin, Gene I.: ‘Informal organizational networking as a crisis-avoidance strategy: U. S. naval
flight operations as a case study’, Industrial Crisis Quarterly, vol. 3, 1989, pp. 159-176.
Rodden, Tom A.; and Gordon Blair: ‘CSCW and distributed systems: The problem of control’, in
L. J. Bannon; M. Robinson; and K. Schmidt (eds.): ECSCW’91: Proceedings of the Second
European Conference on Computer-Supported Cooperative Work, 24–27 September 1991,
Amsterdam, Kluwer Academic Publishers, Dordrecht, 1991, pp. 49-64.
Rodden, Tom A.; John A. Mariani; and Gordon Blair: ‘Supporting cooperative applications’,
Computer Supported Cooperative Work (CSCW): An International Journal, vol. 1, no. 1-2,
1992, pp. 41-68.
Rodden, Tom A.: ‘Populating the application: A model of awareness for cooperative applications’,
in G. M. Olson; J. S. Olson; and M. S. Ackerman (eds.): CSCW’96: Proceedings of the
Conference on Computer-Supported Cooperative Work, 16-20 November 1996, Boston, Mass.,
ACM Press, New York, 1996, pp. 87-96.
Roethlisberger, Fritz J.; and William J. Dickson: Management and the Worker, Harvard University
Press, Cambridge, Mass., 1939.
Rogers, Yvonne: ‘Coordinating computer-mediated work’, Computer Supported Cooperative
Work (CSCW): An International Journal, vol. 1, no. 4, 1993, pp. 295-315.
Rønby Pedersen, Elin; and Tomas Sokoler: ‘AROMA: Abstract representation of presence
supporting mutual awareness’, in S. Pemberton (ed.): CHI’97 Conference Proceedings: ACM
SIGCHI Conference on Human Factors in Computing Systems, Atlanta, Georgia, 22-27 March
1997, ACM Press, New York, 1997, pp. 51-58.
Roseman, Mark; and Saul Greenberg: ‘TeamRooms: Network places for collaboration’, in G. M.
Olson; J. S. Olson; and M. S. Ackerman (eds.): CSCW’96: Proceedings of the Conference on
Computer-Supported Cooperative Work, 16-20 November 1996, Boston, Mass., ACM Press,
New York, 1996, pp. 325-333.
Roth, Emilie M.; and David D. Woods: ‘Cognitive task analysis: An approach to knowledge
acquisition for intelligent system design’, in G. Guida and C. Tasso (eds.): Topics in Expert
System Design. Methodologies and Tools, North-Holland, Amsterdam, 1989, pp. 233-264.
Rouse, William B.; and Sandra H. Rouse: ‘Measures of complexity of fault diagnosis tasks’, IEEE
Transactions on Systems, Man, and Cybernetics, vol. 9, no. 11, November 1979, pp. 720-727.
Russell, Bertrand: ‘The study of mathematics’ (1902). In B. Russell: Mysticism and Logic and
Other Essays. George Allen & Unwin, London, 1917, pp. 47-57. – [First published in New
Quarterly, November, 1907].
Russell, Bertrand: The Principles of Mathematics, 1903. (2nd ed., 1937).
Ryle, Gilbert: The Concept of Mind, Hutchinson’s University Library, London, 1949.
Ryle, Gilbert: ‘The thinking of thoughts: What is “le Penseur” doing?’ (University Lectures,
University of Saskatchewan, 1968). In G. Ryle: Collected Papers. Volume II: Collected
Essays, 1929-1968. Hutchinson & Co, London, 1971, pp. 480-496.
Sabbagh, Karl: Skyscraper: The Making of a Building, Macmillan, London, 1989. – Penguin
Books, New York, 1991.
Salus, Peter H.: Casting the Net: From ARPANET to Internet and Beyond, Addison-Wesley,
Reading, Mass., etc., 1995.
Sandor, Ovidiu; Cristian Bogdan; and John M. Bowers: ‘Aether: An awareness engine for
CSCW’, in J. A. Hughes, et al. (eds.): ECSCW’97: Proceedings of the Fifth European
Conference on Computer-Supported Cooperative Work, 7–11 September 1997, Lancaster,
U.K., Kluwer Academic Publishers, Dordrecht, 1997, pp. 221-236.
Sartre, Jean-Paul: Critique de la raison dialectique. Tome I, Gallimard, Paris, 1960.
Savage, Charles M.: Fifth Generation Management for Fifth Generation Technology (A Round
Table Discussion), Society of Manufacturing Engineers, Dearborn, Michigan, 1987.
Schaffer, Simon: ‘Babbage's calculating engines and the factory system’, Réseaux, vol. 4, no. 2,
1996, pp. 271-298.
Schäl, Thomas: ‘System design for cooperative work in the Language Action Perspective: A case
study of The Coordinator’, in D. Z. Shapiro; M. Tauber; and R. Traunmüller (eds.): The Design
of Computer Supported Cooperative Work and Groupware Systems, North-Holland Elsevier,
Amsterdam, 1996, pp. 377-400.
Schick, Kathy D.; and Nicholas Toth: Making Silent Stones Speak: Human Evolution and the
Dawn of Technology, Simon & Schuster, New York, 1993.
Schmidt, Kjeld: ‘Fremmedgørelse og frigørelse’ (1970). In K. Marx: Kritik af den politiske
økonomi: Grundrids. Rhodos, København, 1970, pp. 7-19.
Schmidt, Kjeld; Kay Clausen; and Erling C. Havn: Computer-integrerede produktionssystemer
(CIM), teknologien og dens samfundsmæssige betingelser og konsekvenser, Roskilde
University, Roskilde, Denmark, September 1984. – CIM-projektet, working paper, no. 1.
<http://cscw.dk/schmidt/papers/cim1984.pdf>
Schmidt, Kjeld: ‘A dialectical approach to functional analysis of office work’, in: IEEE
International Conference on Systems, Man, and Cybernetics, 14-17 October 1986, Atlanta,
Georgia, IEEE Press, New York, 1986, pp. 1586-1591.
Schmidt, Kjeld: Teknikerens arbejde og teknikerens arbejdsplads, Dansk Datamatik Center,
Lyngby, Denmark, December 1987. TIA projektet. <http://cscw.dk/schmidt/papers/tia.pdf>
Schmidt, Kjeld: ‘Functional Analysis Instrument’, in G. Schäfer, et al. (eds.): Functional Analysis
of Office Requirements. A Multiperspective Approach, Wiley, Chichester, 1988a, pp. 261-289.
Schmidt, Kjeld: ‘Forms of cooperative work’, in H.-J. Bullinger, et al. (eds.): EURINFO’88:
Proceedings of the First European Conference on Information Technology for Organisational
Systems, 16-20 May 1988, Athens, Greece, Elsevier Science Publishers (North Holland),
1988b.
Schmidt, Kjeld: ‘Cooperative work: A conceptual framework’, in: New Technology, Distributed
Decision Making, and Responsibility, 5-7 May 1988, Bad Homburg, Germany, 1988c.
Schmidt, Kjeld: Analysis of Cooperative Work. A Conceptual Framework, Risø National
Laboratory, Roskilde, Denmark, June 1990. – Risø-M-2890.
Schmidt, Kjeld: ‘Riding a tiger, or Computer Supported Cooperative Work’, in L. J. Bannon; M.
Robinson; and K. Schmidt (eds.): ECSCW’91: Proceedings of the Second European
Conference on Computer-Supported Cooperative Work, 24–27 September 1991, Amsterdam,
Kluwer Academic Publishers, Dordrecht, 1991a, pp. 1-16.
Schmidt, Kjeld: ‘Computer support for cooperative work in advanced manufacturing’,
International Journal of Human Factors in Manufacturing, vol. 1, no. 4, October 1991b, pp.
303-320.
Schmidt, Kjeld: ‘Cooperative work: A conceptual framework’, in J. Rasmussen; B. Brehmer; and
J. Leplat (eds.): Distributed Decision Making: Cognitive Models for Cooperative Work, John
Wiley & Sons, Chichester, 1991c, pp. 75-109.
Schmidt, Kjeld; and Liam J. Bannon: ‘CSCW, Or What’s In A Name?’. [Unpublished
manuscript]. August 1991.
Schmidt, Kjeld; and Liam J. Bannon: ‘Taking CSCW seriously: Supporting articulation work’,
Computer Supported Cooperative Work (CSCW): An International Journal, vol. 1, no. 1-2,
1992, pp. 7-40.
Schmidt, Kjeld, et al.: ‘Computational mechanisms of interaction: Notations and facilities’, in C.
Simone and K. Schmidt (eds.): Computational Mechanisms of Interaction for CSCW,
Computing Department, Lancaster University, Lancaster, UK, October 1993, pp. 109-164.
COMIC Deliverable D3.1. <ftp://ftp.comp.lancs.ac.uk/pub/comic>
Schmidt, Kjeld: ‘Cooperative work and its articulation: Requirements for computer support’,
Travail Humain, vol. 57, no. 4, December 1994a, pp. 345-366.
Schmidt, Kjeld: Modes and Mechanisms of Interaction in Cooperative Work, Risø National
Laboratory, Roskilde, Denmark, 1994b. – Risø-R-666(EN).
Schmidt, Kjeld: ‘The organization of cooperative work: Beyond the “Leviathan” conception of the
organization of cooperative work’, in J. B. Smith; F. D. Smith; and T. W. Malone (eds.):
CSCW’94: Proceedings of the Conference on Computer-Supported Cooperative Work, 24-26
October 1994, Chapel Hill, North Carolina, ACM Press, New York, 1994c, pp. 101-112.
Schmidt, Kjeld: ‘Mechanisms of interaction reconsidered’, in K. Schmidt (ed.): Social
Mechanisms of Interaction, Computing Department, Lancaster University, Lancaster, UK,
September 1994d, pp. 15-122. COMIC Deliverable D3.2.
<ftp://ftp.comp.lancs.ac.uk/pub/comic>
Schmidt, Kjeld (ed.): Social Mechanisms of Interaction, Computing Department, Lancaster
University, Lancaster, UK, September 1994e. COMIC Deliverable D3.2.
<ftp://ftp.comp.lancs.ac.uk/pub/comic>
Schmidt, Kjeld, et al.: A ‘contrat sociale’ for CSCW systems: Supporting interoperability of
computational coordination mechanisms, Centre for Cognitive Science, Roskilde University,
Roskilde, Denmark, 1995. Working Papers in Cognitive Science and HCI. – WPCS-95-7.
Schmidt, Kjeld; and Tom A. Rodden: ‘Putting it all together: Requirements for a CSCW platform’,
in D. Z. Shapiro; M. Tauber; and R. Traunmüller (eds.): The Design of Computer Supported
Cooperative Work and Groupware Systems, North-Holland Elsevier, Amsterdam, 1996, pp.
157-176. – Originally published at the Workshop on CSCW Design, Schärding, Austria, 1-3
June 1993.
Schmidt, Kjeld; and Carla Simone: ‘Coordination mechanisms: Towards a conceptual foundation
of CSCW systems design’, Computer Supported Cooperative Work (CSCW): The Journal of
Collaborative Computing, vol. 5, no. 2-3, 1996, pp. 155-200.
Schmidt, Kjeld: ‘Of maps and scripts: The status of formal constructs in cooperative work’, in S.
C. Hayne and W. Prinz (eds.): GROUP’97: Proceedings of the ACM SIGGROUP Conference
on Supporting Group Work, 16-19 November 1997, Phoenix, Arizona, ACM Press, New York,
1997, pp. 138-147.
Schmidt, Kjeld: ‘The critical role of workplace studies in CSCW’, in P. Luff; J. Hindmarsh; and C.
C. Heath (eds.): Workplace Studies: Recovering Work Practice and Informing System Design,
Cambridge University Press, Cambridge, 2000, pp. 141-149.
Schmidt, Kjeld; and Carla Simone: ‘Mind the gap! Towards a unified view of CSCW’, in R.
Dieng, et al. (eds.): Designing Cooperative Systems: The Use of Theories and Models.
Proceedings of the 4th International Conference on the Design of Cooperative Systems (COOP
2000) [23-26 May 2000, Sophia Antipolis, France], IOS Press, Amsterdam etc., 2000, pp. 205-221.
Schmidt, Kjeld; and Ina Wagner: ‘Ordering systems in architectural design and planning: A
discussion of classification systems and practices’, in G. C. Bowker; L. Gasser; and B. Turner
(eds.): Workshop on Infrastructures for Distributed Collective Practice, 6-9 February 2002,
San Diego, 2002a.
Schmidt, Kjeld; and Ina Wagner: ‘Coordinative artifacts in architectural practice’, in M. Blay-Fornarino, et al. (eds.): Designing Cooperative Systems: A Challenge of the Mobility Age.
[Proceedings of the 5th International Conference on the Design of Cooperative Systems
(COOP 2002), 4-7 June 2002, Saint Raphaël, France], IOS Press, Amsterdam etc., 2002b, pp.
257-274.
Schmidt, Kjeld: ‘Remarks on the complexity of cooperative work’, Revue des sciences et
technologies de l’information. Série Revue d’intelligence artificielle (RSTI-RIA), vol. 16, no. 4-5, Paris, 2002a, pp. 443-483.
Schmidt, Kjeld: ‘The problem with “awareness”: Introductory remarks on “Awareness in
CSCW”’, Computer Supported Cooperative Work (CSCW): The Journal of Collaborative
Computing, vol. 11, no. 3-4, 2002b, pp. 285-298.
Schmidt, Kjeld; and Ina Wagner: ‘Ordering systems: Coordinative practices and artifacts in
architectural design and planning’, Computer Supported Cooperative Work (CSCW): The
Journal of Collaborative Computing, vol. 13, no. 5-6, 2004, pp. 349-408.
Schmidt, Kjeld; Ina Wagner; and Marianne Tolar: ‘Permutations of cooperative work practices: A
study of two oncology clinics’, in T. Gross, et al. (eds.): GROUP 2007: International
Conference on Supporting Group Work, 4-7 November 2007, Sanibel Island, Florida, USA,
ACM Press, New York, 2007, pp. 1-10.
Schmidt, Kjeld: ‘Divided by a common acronym: On the fragmentation of CSCW’, in I. Wagner,
et al. (eds.): ECSCW 2009: Proceedings of the 11th European Conference on Computer-Supported Cooperative Work, 7-11 September 2009, Vienna, Austria, Springer, London, 2009,
pp. 223-242.
Schmidt, Kjeld: ‘“Keep up the good work!”: The concept of “work” in CSCW’, in M. Lewkowicz,
et al. (eds.): COOP 2010: 9th International Conference on the Design of Cooperative Systems,
18-21 May 2010, Aix-en-Provence, France, Springer, London, 2010.
Schneider, Karin; and Ina Wagner: ‘Constructing the “dossier representatif”: Computer-based
information-sharing in French hospitals’, Computer Supported Cooperative Work (CSCW): An
International Journal, vol. 1, no. 4, 1993, pp. 229-253.
Schonberger, Richard J.: Japanese Manufacturing Techniques: Nine Hidden Lessons in Simplicity,
Free Press, New York, 1982.
Schümmer, Till; and Stephan Lukosch: Patterns for Computer-Mediated Interaction, John Wiley
& Sons, Chichester, England, 2007.
Schütz, Alfred [A. Schutz]: ‘The problem of rationality in the social world’ (Economica, 1943).
Text ed. by A. Brodersen. In A. Schutz: Collected Papers. Vol. II. Studies in Social Theory.
Martinus Nijhoff, The Hague, 1964, pp. 64-90.
Schütz, Alfred [A. Schutz]: ‘On multiple realities’ (Philosophy and Phenomenological Research,
June 1945). Text ed. by M. Natanson. In A. Schutz: Collected Papers. Vol. I. The Problem of
Social Reality. Martinus Nijhoff, The Hague, 1962, pp. 207-259.
Schütz, Alfred [A. Schutz]: ‘The well-informed citizen: An essay on the social distribution of
knowledge’ (Social Research, 1946). Text ed. by A. Brodersen. In A. Schutz: Collected
Papers. Vol. II. Studies in Social Theory. Martinus Nijhoff, The Hague, 1964, pp. 120-134.
Schütz, Alfred [A. Schutz]: Reflections on the Problem of Relevance (Manuscript, 1947-51). Text
ed. by R. M. Zaner. Yale University Press, New Haven, 1970.
Schütz, Alfred [A. Schutz]: ‘Common-sense and scientific interpretations of human action’
(Philosophy and Phenomenological Research, September 1953). Text ed. by M. Natanson. In
A. Schutz: Collected Papers. Vol. I. The Problem of Social Reality. Martinus Nijhoff, The
Hague, 1962, pp. 3-47.
Schütz, Alfred [A. Schutz]: The Phenomenology of the Social World, Northwestern University
Press, Evanston, Ill., 1967. Transl. from Der sinnhafte Aufbau der sozialen Welt; eine
Einleitung in die Verstehende Soziologie, J. Springer, Wien, 1932. Transl. by G. Walsh and F.
Lehnert.
Searle, John R.: ‘Minds and brains without programs’, in C. Blakemore and S. Greenfield (eds.):
Mindwaves: Thoughts on Intelligence, Identity and Consciousness, Basil Blackwell, Oxford,
1987, pp. 209-233. (Paperback ed., 1989).
Searle, John R.: Minds, Brains and Science: The 1984 Reith Lectures, Pelican Books, London,
1989.
Sellen, Abigail; and Richard H. R. Harper: The Myth of the Paperless Office, MIT Press,
Cambridge, Mass., 2001.
Selznick, Philip: ‘Foundations of the theory of organization’, American Sociological Review, vol.
13, 1948, pp. 25-35.
Shanker, Stuart G. (ed.): Ludwig Wittgenstein: Critical Assessments: Volume Three: From the
Tractatus to Remarks on the Foundations of Mathematics: Wittgenstein on the Philosophy of
Mathematics, Croom Helm, London, 1986a.
Shanker, Stuart G.: ‘Introduction: The portals of discovery’, in S. G. Shanker (ed.): Ludwig
Wittgenstein: Critical Assessments: Volume Three: From the Tractatus to Remarks on the
Foundations of Mathematics: Wittgenstein on the Philosophy of Mathematics, Croom Helm,
London, 1986b, pp. 1-25.
Shanker, Stuart G.: ‘Introduction: The nature of philosophy’, in S. G. Shanker (ed.): Ludwig
Wittgenstein: Critical Assessments: Volume Four: From Theology to Sociology: Wittgenstein’s
Impact on Contemporary Thought, Croom Helm, London, 1986c, pp. 1-28.
Shanker, Stuart G.: Wittgenstein and the Turning-Point in the Philosophy of Mathematics, Croom
Helm, London and Sydney, 1987a.
Shanker, Stuart G.: ‘AI at the crossroads’, in B. P. Bloomfield (ed.): The Question of Artificial
Intelligence: Philosophical and Sociological Perspectives, Croom Helm, London, etc., 1987b,
pp. 1-58.
Shanker, Stuart G.: ‘The decline and fall of the mechanist metaphor’, in R. Born (ed.): Artificial
Intelligence: The Case Against, Routledge, London and New York, 1987c, pp. 72-131.
Shanker, Stuart G.: ‘Wittgenstein versus Turing on the nature of Church's thesis’, Notre Dame
Journal of Formal Logic, vol. 28, no. 4, October 1987d, pp. 615-649.
Shanker, Stuart G.: ‘Turing and the origins of AI’, Philosophia mathematica, vol. 3–3, 1995, pp.
52-86.
Shanker, Stuart G.: Wittgenstein’s Remarks on the Foundation of AI, Routledge, London, 1998.
Shannon, Claude E.: ‘A mathematical theory of communication’, The Bell System Technical
Journal, vol. 27, July, October 1948, pp. 379-423, 623-656.
Shannon, Claude E.; and Warren Weaver: The Mathematical Theory of Communication,
University of Illinois Press, Urbana, 1949.
Shapiro, Dan Z., et al.: ‘Visual re-representation of database information: the flight data strip in air
traffic control’, in M. Tauber; D. E. Mahling; and F. Arefi (eds.): Cognitive Aspects of Visual
Languages and Visual Interfaces, Elsevier Science, Amsterdam, 1994, pp. 349-376.
Sharrock, Wes W.; and Robert J. Anderson: The Ethnomethodologists, Ellis Horwood Publishers,
Chichester, 1986.
Sharrock, Wes W.; and Rupert Read: Kuhn: Philosopher of Scientific Revolution, Polity Press,
Cambridge, 2002.
Sharrock, Wes W.; and Graham Button: ‘Plans and Situated Action ten years on’, The Journal of
the Learning Sciences, vol. 12, no. 2, 2003, pp. 259-264.
Sheil, Beau A.: ‘Coping with complexity’, Office: Technology and People, vol. 1, no. 4, January
1983, pp. 295-320.
Shen, Weiming, et al. (eds.): CSCWD 2007: Computer Supported Cooperative Work in Design,
IV: 11th International Conference, 26-28 April 2007, Melbourne, Australia, Springer, 2008.
Shepherd, Allan; Niels Mayer; and Allan Kuchinsky: ‘Strudel: An extensible electronic
conversation toolkit’, in T. Bikson and F. Halasz (eds.): CSCW’90: Proceedings of the
Conference on Computer-Supported Cooperative Work, 7-10 October 1990, Los Angeles,
Calif., ACM Press, New York, 1990, pp. 93-104.
Shneiderman, Ben: ‘Direct manipulation: A step beyond programming languages’, IEEE
Computer, vol. 16, no. 8, August 1983, pp. 57-69.
Simon, Herbert A.: ‘The architecture of complexity’ (Proceedings of the American Philosophical
Society, December 1962). In H. A. Simon: The Sciences of the Artificial. MIT Press,
Cambridge, Mass., 1969, pp. 192-229.
Simon, Herbert A.: ‘How big is a chunk?’ (Science, 1974). In H. A. Simon: Models of Thought.
Yale University Press, New Haven and London, 1979, pp. 50-61.
Simon, Herbert A.: Models of Thought, Yale University Press, New Haven and London, 1979.
Simone, Carla; and Kjeld Schmidt (eds.): Computational Mechanisms of Interaction for CSCW,
Computing Department, Lancaster University, Lancaster, UK, October 1993. COMIC
Deliverable D3.1. <ftp://ftp.comp.lancs.ac.uk/pub/comic>
Simone, Carla: ‘Integrating co-operative work supports from CHAOS perspective (abstract)’,
SIGOIS Bulletin, vol. 13, no. 4, 1993, p. 25.
Simone, Carla; and Kjeld Schmidt (eds.): A Notation for Computational Mechanisms of
Interaction, Computing Department, Lancaster University, Lancaster, UK, September 1994.
COMIC Deliverable D3.3. <ftp://ftp.comp.lancs.ac.uk/pub/comic>
Simone, Carla, et al.: ‘A multi-agent approach to the design of coordination mechanisms’, in V.
Lesser (ed.): Proceedings of the First International Conference on Multi-Agent Systems, San
Francisco, Calif., USA, June 12-14, 1995, AAAI Press, Menlo Park, Calif., 1995a.
Simone, Carla; Monica Divitini; and Kjeld Schmidt: ‘A notation for malleable and interoperable
coordination mechanisms for CSCW systems’, in N. Comstock, et al. (eds.): COOCS’95:
Conference on Organizational Computing Systems, 13-16 August 1995, Milpitas, California,
ACM Press, New York, 1995b, pp. 44-54.
Simone, Carla; and Stefania Bandini: ‘Compositional features for promoting awareness within and
across cooperative applications’, in S. C. Hayne and W. Prinz (eds.): GROUP’97: Proceedings
of the ACM SIGGROUP Conference on Supporting Group Work, 16-19 November 1997,
Phoenix, Arizona, ACM Press, New York, 1997, pp. 358-367.
Simone, Carla; and Monica Divitini: ‘Ariadne: supporting coordination through a flexible use of
the knowledge on work processes’, Journal of Universal Computer Science, vol. 3, no. 8,
1997, pp. 865-898.
Simone, Carla; and Kjeld Schmidt: ‘Taking the distributed nature of cooperative work seriously’,
in: Proceedings of the 6th Euromicro Workshop on Parallel and Distributed Processing, 21-23
January 1998, Madrid, IEEE Computer Society Press, Los Alamitos, Calif., 1998, pp. 295-301.
Simone, Carla; and Monica Divitini: ‘Integrating contexts to support coordination: The CHAOS
project’, Computer Supported Cooperative Work (CSCW): The Journal of Collaborative
Computing, vol. 8, no. 3, September 1999, pp. 239-283.
Simone, Carla; and Stefania Bandini: ‘Integrating awareness in cooperative applications through
the reaction-diffusion metaphor’, Computer Supported Cooperative Work (CSCW): The
Journal of Collaborative Computing, vol. 11, no. 3-4, 2002.
Sluizer, S.; and P. Cashman: ‘XCP: An experimental tool for supporting office procedures’, in:
Proceedings of the First International Conference on Office Automation, 17-19 December
1984, New Orleans, Louisiana, IEEE Computer Society Press, Silver Spring, Maryland, 1984,
pp. 73-80.
Smith, Adam: An Inquiry into the Nature and Causes of the Wealth of Nations, vol. 1-2, Printed
for W. Strahan; and T. Cadell, in the Strand, London, 1776a.
Smith, Adam: An Inquiry into the Nature and Causes of the Wealth of Nations, Printed for W.
Strahan; and T. Cadell, in the Strand, London, 1776b. Text ed. by E. Cannan. – The Modern
Library, New York, 1994.
Smith, David Canfield, et al.: ‘Designing the Star User Interface’, Byte, April 1982a, pp. 242-282.
Smith, David Canfield, et al.: ‘The Star user interface: An overview’, in R. K. Brown and H. L.
Morgan (eds.): AFIPS’82: Proceedings of the National Computer Conference, 7-10 June 1982,
Houston, Texas, AFIPS Press, Arlington, Virginia, 1982b, pp. 515-528.
Smith, H. T.; P. A. Hennessy; and G. A. Lunt: ‘An object-oriented framework for modelling
organizational communication’, in J. M. Bowers and S. D. Benford (eds.): Studies in Computer
Supported Cooperative Work. Theory, Practice and Design, North-Holland, Amsterdam, 1991,
pp. 145-157.
Smith, Hugh T.: ‘The requirements for group communication services’, in R. Speth (ed.):
EUTECO ’88: European Teleinformatics Conference on Research into Networks and
Distributed Applications, 20-22 April 1988, Vienna, Austria. Organized by the European
Action in Teleinformatics COST 11ter, North-Holland, Brussels and Luxemburg, 1988, pp. 89-95.
Snedecor, George W.: ‘Uses of punched card equipment in mathematics’, The American
Mathematical Monthly, vol. 35, no. 4, April 1928, pp. 161-169.
Sombart, Werner: Der moderne Kapitalismus: Historisch-systematische Darstellung des
gesamteuropäischen Wirtschaftslebens von seinen Anfängen bis zur Gegenwart: Band II: Das
europäische Wirtschaftsleben im Zeitalter des Frühkapitalismus, Duncker & Humblot,
München and Leipzig, 1916. (2nd ed.; 1st ed., 1902). – Deutscher Taschenbuch Verlag,
München, 1987.
Sørensen, Carsten: ‘The augmented bill of materials’, in K. Schmidt (ed.): Social Mechanisms of
Interaction, Computing Department, Lancaster University, Lancaster, UK, September 1994a,
pp. 221-236. COMIC Deliverable D3.2. <ftp://ftp.comp.lancs.ac.uk/pub/comic>
Sørensen, Carsten: ‘The CEDAC board’, in K. Schmidt (ed.): Social Mechanisms of Interaction,
Computing Department, Lancaster University, Lancaster, UK, September 1994b, pp. 237-245.
COMIC Deliverable D3.2. <ftp://ftp.comp.lancs.ac.uk/pub/comic>
Sørensen, Carsten: ‘The product classification scheme’, in K. Schmidt (ed.): Social Mechanisms of
Interaction, Computing Department, Lancaster University, Lancaster,
UK, September 1994c, pp. 247-255. <ftp://ftp.comp.lancs.ac.uk/pub/comic>
Sørgaard, Pål: ‘A cooperative work perspective on use and development of computer artifacts’, in:
10th Information Systems Research Seminar in Scandinavia (IRIS), 10-12 August 1987,
Vaskivesi, Finland, 1987.
Spear, Steven; and H. Kent Bowen: ‘Decoding the DNA of the Toyota Production System’,
Harvard Business Review, September-October 1999, pp. 96-106.
Speth, Rolf (ed.): EUTECO ’88: European Teleinformatics Conference on Research into Networks
and Distributed Applications, 20-22 April 1988, Vienna, Austria. Organized by the European
Action in Teleinformatics COST 11ter, North-Holland, Amsterdam etc., 1988.
Sproull, Lee; and Sara Kiesler: Connections: New Ways of Working in the Networked Organization, The MIT Press, Cambridge, Mass., 1991.
Standage, Tom: The Victorian Internet: The Remarkable Story of the Telegraph and the
Nineteenth Century’s On-line Pioneers, Walker Publishing Co., New York, 1998. (Paperback
ed., 2007).
Star, Susan Leigh; and James R. Griesemer: ‘Institutional ecology, “translations” and boundary
objects: Amateurs and professionals in Berkeley’s Museum of Vertebrate Zoology, 1907-39’,
Social Studies of Science, vol. 19, 1989, pp. 387-420.
Star, Susan Leigh: ‘The structure of ill-structured solutions: Boundary objects and heterogeneous
distributed problem solving’, in L. Gasser and M. Huhns (eds.): Distributed Artificial
Intelligence, vol. 2, Pitman, London, 1989, pp. 37-54.
Stefik, M., et al.: ‘WYSIWIS revised: Early experiences with multiuser interfaces’, ACM
Transactions on Office Information Systems, vol. 5, no. 2, April 1987, pp. 147-167.
Stinchcombe, Arthur L.: Creating Efficient Industrial Administrations, Academic Press, New
York and London, 1974.
Stinchcombe, Arthur L.: Information and Organizations, University of California Press, Berkeley,
Calif., 1990.
Storrs, Graham: ‘Group working in the DHSS large demonstrator project’, in P. Wilson; J. M.
Bowers; and S. D. Benford (eds.): ECSCW’89: Proceedings of the First European Conference
on Computer Supported Cooperative Work, 13-15 September 1989, Gatwick, London, London,
1989, pp. 102-119.
Strauss, Anselm L.: ‘Work and the division of labor’, The Sociological Quarterly, vol. 26, no. 1,
1985, pp. 1-19.
Strauss, Anselm L., et al.: Social Organization of Medical Work, University of Chicago Press,
Chicago and London, 1985.
Strauss, Anselm L.: ‘The articulation of project work: An organizational process’, The
Sociological Quarterly, vol. 29, no. 2, 1988, pp. 163-178.
Strauss, Anselm L.: Continual Permutations of Action, Aldine de Gruyter, New York, 1994.
Styles, Keith: Working Drawings Handbook, Architectural Press, Amsterdam etc., 1995. (3rd edition; 1st ed. 1982).
Suchman, Lucy A.: ‘Systematics of office work: Office studies for knowledge-based systems.
Digest’, in: Office Automation Conference, 5-7 April 1982, San Francisco, 1982, pp. 409-412.
Suchman, Lucy A.: ‘Office procedure as practical action: Models of work and system design’,
ACM Transactions on Office Information Systems, vol. 1, no. 4, October 1983, pp. 320-328.
Suchman, Lucy A.; and Eleanor Wynn: ‘Procedures and problems in the office’, Office:
Technology and People, vol. 2, 1984, pp. 133-154.
Suchman, Lucy A.: Plans and Situated Actions: The Problem of Human-Machine Communication,
Cambridge University Press, Cambridge, 1987.
Suchman, Lucy A.: Notes on Computer Support for Cooperative Work, Dept. of Computer
Science, University of Jyväskylä, Jyväskylä, Finland, May 1989. – WP-12.
Suchman, Lucy A.; and Randall H. Trigg: ‘Understanding practice: Video as a medium for
reflection and design’, in J. Greenbaum and M. Kyng (eds.): Design at Work: Cooperative
Design of Computer Systems, Lawrence Erlbaum, Hillsdale, New Jersey, 1991, pp. 65-89.
Suchman, Lucy A.: ‘Response to Vera and Simon's Situated action: A symbolic interpretation’,
Cognitive Science, vol. 17, no. 1, January-March 1993a, pp. 71-75.
Suchman, Lucy A.: ‘Technologies of accountability: On lizards and airplanes’, in G. Button (ed.):
Technology in Working Order. Studies of work, Interaction, and Technology, Routledge,
London and New York, 1993b, pp. 113-126.
Suchman, Lucy A.: ‘Writing and reading: A response to comments on Plans and Situated
Actions’, The Journal of the Learning Sciences, vol. 12, no. 2, 2003, pp. 299-306.
Suchman, Lucy A.: Human-Machine Reconfigurations: Plans and Situated Actions, 2nd Edition,
Cambridge University Press, New York, 2007.
Sutherland, Ivan Edward: ‘Sketchpad: A man-machine graphical communication system’, in E. C.
Johnson (ed.): SJCC’63: Proceedings of the Spring Joint Computer Conference, 21-23 May
1963, Detroit, Michigan, vol. 23, AFIPS Press, Santa Monica, California, 1963, pp. 329-346.
Swade, Doron: The Cogwheel Brain: Charles Babbage and the Quest to Build the First Computer,
Little, Brown & Co., London, 2000. – Abacus, London, 2001 (Paperback ed.).
Swade, Doron: ‘The “unerring certainty of mechanical agency”: Machines and table making in the
nineteenth century’, in M. Campbell-Kelly, et al. (eds.): The History of Mathematical Tables:
From Sumer to Spreadsheets, Oxford University Press, Oxford, 2003, pp. 145-176. (Reprinted
2007).
Swenson, K. D., et al.: ‘A business process environment supporting collaborative planning’,
Collaborative Computing, vol. 1, no. 1, March 1994, pp. 15-24.
Symon, Gillian; Karen Long; and Judi Ellis: ‘The coordination of work activities: Cooperation and
conflict in a hospital context’, Computer Supported Cooperative Work (CSCW): The Journal
of Collaborative Computing, vol. 5, no. 1, 1996, pp. 1-31.
Syri, Anja: ‘Tailoring cooperation support through mediators’, in J. A. Hughes, et al. (eds.):
ECSCW’97: Proceedings of the Fifth European Conference on Computer-Supported
Cooperative Work, 7–11 September 1997, Lancaster, U.K., Kluwer Academic Publishers,
Dordrecht, 1997, pp. 157-172.
Tait, William W.: The Provenance of Pure Reason: Essays in the Philosophy of Mathematics and
Its History, Oxford University Press, Oxford, 2005.
Tang, John C.: ‘Findings from observational studies of collaborative work’, International Journal
of Man-Machine Studies, vol. 34, 1991, pp. 143-160.
Taylor, Frederick Winslow: The Principles of Scientific Management, Harper & Brothers, New
York, 1911. – W. W. Norton & Co., New York, 1967.
Taylor, Talbot J.: Mutual Misunderstanding: Scepticism and the Theorizing of Language and
Interpretation, Duke University Press, Durham and London, 1992.
Taylor, Talbot J.: Theorizing Language: Analysis, Normativity, Rhetoric, History, Pergamon,
Amsterdam etc., 1997.
Taylor, Talbot J.: ‘Language constructing language: the implications of reflexivity for linguistic
theory’, Language Sciences, vol. 22, 2000, pp. 483-499.
Thacker, Charles P., et al.: Alto: A personal computer, Xerox PARC, 7 August 1979. – CSL-79-11. (Reprinted February 1984).
<http://www.scribd.com/doc/2183225/Alto-A-Personal-Computer>
Thacker, Charles P.: ‘Personal distributed computing: The Alto and Ethernet hardware’, in A.
Goldberg (ed.): A History of Personal Workstations, Addison-Wesley, Reading, Mass., 1988,
pp. 267-289.
The OSI e-Infrastructure Working Group: Developing the UK’s E-infrastructure for Science and
Innovation, Office of Science and Innovation (OSI), 2004.
<http://www.nesc.ac.uk/documents/OSI/report.pdf>
Thompson, James D.: Organizations in Action: Social Science Base of Administrative Theory,
McGraw-Hill, New York, 1967.
Thorngate, Warren: ‘Got a minute? How technology affects the economy of attention [Abstract of
closing plenary talk]’, in W. A. Kellogg and S. Whittaker (eds.): CSCW 2000: ACM
Conference on Computer Supported Cooperative Work, 2–6 December 2000, Philadelphia,
Pennsylvania, USA, 2000.
Tomlinson, Raymond S.: ‘The first network email’, 2001.
<http://openmap.bbn.com/~tomlinso/ray/firstemailframe.html>
Trevor, Jonathan; Tom A. Rodden; and Gordon Blair: ‘COLA: A lightweight platform for
CSCW’, in G. De Michelis; C. Simone; and K. Schmidt (eds.): ECSCW’93: Proceedings of the
Third European Conference on Computer-Supported Cooperative Work, 13-17 September
1993, Milano, Italy, Kluwer Academic Publishers, Dordrecht, 1993, pp. 15-30.
Trevor, Jonathan; Tom A. Rodden; and Gordon Blair: ‘COLA: A lightweight platform for
CSCW’, Computer Supported Cooperative Work (CSCW): An International Journal, vol. 3,
no. 2, 1995, pp. 197-224.
Turing, Alan M.: ‘On computable numbers, with an application to the Entscheidungsproblem’, Proceedings of the London Mathematical Society, vol. 42 (Series 2), no. 3-4, November-December 1936, pp. 230-265.
Turing, Alan M.: ‘Proposed electronic calculator (1945)’ (Report, 1945). In B. J. Copeland (ed.)
Alan Turing’s Automatic Computing Engine: The Master Codebreaker’s Struggle to Build the
Modern Computer. Oxford University Press, Oxford, 2005, pp. 369-454. – Written late 1945;
submitted to the Executive Committee of the NPL February 1946 as ‘Report by Dr. A. M.
Turing on Proposals for the Development of an Automatic Computing Engine (ACE)’.
Turing, Alan M.; and James H. Wilkinson: ‘The Turing-Wilkinson lecture series (1946-47)’
(Lectures, London, 1946-47). In B. J. Copeland (ed.) Alan Turing’s Automatic Computing
Engine: The Master Codebreaker’s Struggle to Build the Modern Computer. Oxford
University Press, Oxford, 2005, pp. 459-527.
Turing, Alan M.: ‘Lecture on Automatic Computing Engine’ (London Mathematical Society, 20
February 1947). In B. J. Copeland (ed.) The Essential Turing: Seminal Writings in Computing,
Logic, Philosophy, Artificial Intelligence, and Artificial Life plus The Secrets of Enigma.
Oxford University Press, Oxford, 2004, pp. 378-394.
Turnbull, David: ‘The ad hoc collective work of building Gothic cathedrals with templates, string,
and geometry’, Science, Technology, & Human Values, vol. 18, no. 3, Summer 1993, pp. 315-340.
Turoff, Murray: ‘Delphi and its potential impact on information systems’, in R. R. Wheeler (ed.):
FJCC’71: Proceedings of the Fall Joint Computer Conference, 16-18 November 1971, Las
Vegas, Nevada, vol. 39, AFIPS Press, Montvale, New Jersey, 1971, pp. 317-326.
Turoff, Murray: ‘Delphi conferencing: Computer-based conferencing with anonymity’,
Technological Forecasting and Social Change, vol. 3, 1972, pp. 159-204.
Turoff, Murray: ‘Human communication via data networks’, SIGCAS Computers and Society, vol.
4, no. 1, May 1973, pp. 15-24.
Turoff, Murray: ‘Computer-mediated communication requirements for group support’, Journal of
Organizational Computing, vol. 1, 1991, pp. 85-113.
Uhlig, Ronald P.: ‘Human factors in computer message systems’, Datamation, vol. 23, no. 5, May
1977, pp. 120-126.
Ure, Andrew: The Philosophy of Manufactures: or, an Exposition of the Scientific, Moral, and
Commercial Economy of the Factory System of Great Britain, Charles Knight, London, 1835.
– Frank Cass & Co., London, 1967.
Urwick, Lyndall F.; and Edward F. L. Brech: The Making of Scientific Management. Volume I:
Thirteen Pioneers, Sir Isaac Pitman And Sons, London, 1945. (Rev. ed., 1957).
Urwick, Lyndall F.; and Edward F. L. Brech: The Making of Scientific Management. Volume II:
Management in British Industry, Management Publications Trust, London, 1946.
Urwick, Lyndall F.; and Edward F. L. Brech: The Making of Scientific Management. Volume III:
The Hawthorne Investigations, Management Publications Trust, London, 1948.
Vallee, Jacques F.: Group Communication through Computers. Volume 1: Design and Use of the
FORUM System, Institute for the Future, Menlo Park, Calif., July 1974. – IFF Report R-32.
van der Aalst, Wil M.P.: ‘Exploring the CSCW spectrum using process mining’, Advanced
Engineering Informatics, vol. 21, 2007, pp. 191–199.
Van der Spiegel, Jan, et al.: ‘The ENIAC: History, Operation, and Reconstruction in VLSI’, in R.
Rojas and U. Hashagen (eds.): The First Computers: History and Architectures, The MIT
Press, Cambridge, Mass., and London, 2000, pp. 121-178.
Van Vleck, Tom: ‘The history of electronic mail’, 1 February 2001. Latest update, 25 May 2008.
<http://www.multicians.org/thvv/mail-history.html>
Vicente, Kim J.; and Jens Rasmussen: ‘Ecological interface design: Theoretical foundations’,
IEEE Transactions on Systems, Man, and Cybernetics, vol. SMC-22, no. 4, 1992, pp. 589-606.
Victor, Frank; and Edgar Sommer: ‘Supporting the design of office procedures in the DOMINO
system’, in P. Wilson; J. M. Bowers; and S. D. Benford (eds.): ECSCW’89: Proceedings of the
1st European Conference on Computer Supported Cooperative Work, 13-15 September 1989,
Gatwick, London, London, 1989, pp. 148-159.
Victor, Frank; and Edgar Sommer: ‘Supporting the design of office procedures in the DOMINO
system’, in J. M. Bowers and S. D. Benford (eds.): Studies in Computer Supported Cooperative
Work. Theory, Practice and Design, North-Holland, Amsterdam, 1991, pp. 119-130.
Von Glinow, Mary Ann; and Susan Albers Mohrman (eds.): Managing Complexity in High
Technology Organizations, Oxford University Press, New York and Oxford, 1990.
von Neumann, John: The First Draft Report on the EDVAC, Moore School of Electrical
Engineering, University of Pennsylvania, 30 June 1945. – Contract No. W–670–ORD–4926
Between the United States Army Ordnance Department and the University of Pennsylvania.
Text ed. by M. D. Godfrey. <http://qss.stanford.edu/~godfrey/vonNeumann/vnedvac.pdf>
von Neumann, John: ‘The mathematician’, Works of the Mind, vol. 1, no. 1, 1947, pp. 180-196.
von Savigny, Eike: Die Philosophie der normalen Sprache: Eine kritische Einführung in die
‘ordinary language philosophy’, Suhrkamp, Frankfurt a. M., 1974. (2nd ed., 1st. ed. 1969).
von Savigny, Eike: Wittgensteins ‘Philosophische Untersuchungen’: Ein Kommentar für Leser.
Band 1: Abschnitte 1-315, Vittorio Klostermann, Frankfurt a. M., 1994. (2nd ed.; 1st ed. 1988).
Vygotskij, Lev Semenovič [L. S. Vygotsky]: ‘The instrumental method in psychology’
(Manuscript, 1930). In L. S. Vygotskij: The Collected Works of L. S. Vygotsky. Volume 3:
Problems of the Theory and History of Psychology. Plenum Press, New York and London,
1997, pp. 85-89. – Theses of a talk read in 1930 at the N. K. Krupskaya Academy of
Communist Education.
Wagner, Ina; and Rainer Lainer: ‘Open planning: inspirational objects, themes, placeholders, and
persuasive artefacts’, in: Proceedings Colloque Architecture des systèmes urbains, 5 July 2001,
Université de Technologie de Compiègne, 2001.
Waismann, Friedrich: ‘The nature of mathematics: Wittgenstein’s standpoint’ (Königsberg, 1930).
Transl. from ‘Über das Wesen der Mathematik: Der Standpunkt Wittgensteins’. Transl. by S.
G. Shanker. In S. G. Shanker (ed.) Ludwig Wittgenstein: Critical Assessments: Volume Three:
From the Tractatus to Remarks on the Foundations of Mathematics: Wittgenstein on the
Philosophy of Mathematics. Croom Helm, London, 1986, pp. 60-67.
Waismann, Friedrich: Introduction to Mathematical Thinking: The Formation of Concepts in
Modern Mathematics, Hafner Publishing Co., London, 1951. Transl. from Einführung in das
mathematische Denken: Die Begriffsbildung der modernen Mathematik, Gerold u. Co., Wien,
1936. Transl. by T. J. Benac.
Wakati, Osamu A.; and Richard M. Linde: Study Guide for the Professional Practice of
Architectural Working Drawings, John Wiley & Sons, New York, 1994. (2nd ed., Student ed.).
Wakefield, Edward: A View of the Art of Colonization, with present reference to the British
Empire; in letters between a statesman and a colonist, John W. Parker, London, 1849.
Waldrop, N. Mitchell: Complexity: The emerging science at the edge of order and chaos, Simon &
Schuster, 1992. – Penguin Books, 1994 (Paperback ed.).
Wang, Hao: From Mathematics to Philosophy, Routledge & Kegan Paul, London, 1974.
Weber, Max: ‘Die wirtschaftlichen Unternehmungen der Gemeinden’ (28 September 1909). Text
ed. by W. Schluchter. In M. Weber: Studienausgabe der Max-Weber-Gesamtausgabe. Mohr
Siebeck, Tübingen, 1999, vol. I/8 (Wirtschaft, Staat & Sozialpolitik: Schriften und Reden,
1900-1912), pp. 127-130.
Weick, Karl E.; and Karlene H. Roberts: ‘Collective mind in organizations: Heedful interrelating
on flight decks’, Administrative Science Quarterly, vol. 38, 1993, pp. 357-381.
Wertsch, James V.: Vygotsky and the Social Formation of Mind, Harvard University Press,
Cambridge, Mass., and London, 1985.
Wertsch, James V.: Voices of the Mind: A Sociocultural Approach to Mediated Action, Harvard
University Press, Cambridge, Mass., 1991.
White, Alan Richard: Attention, Basil Blackwell, Oxford, 1964.
White, Alan Richard: The Philosophy of Mind, Random House, New York, 1967.
White, Alan Richard: The Nature of Knowledge, Rowman and Littlefield, Totowa, New Jersey,
1982.
Whitehead, Alfred North; and Bertrand Russell: Principia Mathematica, Cambridge University
Press, Cambridge, 1910. (2nd ed., 1927; paperback ed. ‘to *56’, 1962).
Whitley, Richard: ‘Cognitive and social institutionalization of scientific specialties and research
areas’, in R. Whitley (ed.): Social Processes of Scientific Development, Routledge & Kegan
Paul, London, 1974, pp. 69-95.
Wiedemann, Thomas: Greek and Roman Slavery, Croom Helm, London, 1981. – Routledge, New
York, 1988.
Wieser, C. Robert: Cape Cod System and demonstration, MIT Lincoln Laboratory, Division 6,
Cambridge, Mass., 13 March 1953. – Project Whirlwind Limited Memorandum VI - L-86.
<http://dome.mit.edu/handle/1721.3/41510>
Wieser, C. Robert: ‘The Cape Cod System’, IEEE Annals of the History of Computing, vol. 5, no.
4, April-June 1985, pp. 362-369.
Wilczek, Stephan; and Helmut Krcmar: ‘Betriebliche Groupwareplattformen’, in G. Schwabe; N.
Streitz; and R. Unland (eds.): CSCW-Kompendium: Lehr- und Handbuch zum
computerunterstützten kooperativen Arbeiten, Springer, Heidelberg, 2001, pp. 310-320.
Wild, Ray: Mass-production Management: The Design and Operation of Production Flow-line
Systems, John Wiley & Sons, London, 1972.
Williams, Meredith: Wittgenstein, Mind and Meaning: Toward a Social Conception of Mind,
Routledge, London and New York, 1999.
Williams, Michael R.: ‘Early calculation’, in W. Aspray (ed.): Computing Before Computers, Iowa
State University Press, Ames, Iowa, 1990, pp. 3-58.
Williams, Michael R.: ‘Difference engines: From Müller to Comrie’, in M. Campbell-Kelly, et al.
(eds.): The History of Mathematical Tables: From Sumer to Spreadsheets, Oxford University
Press, Oxford, 2003, pp. 123-144. (Reprinted 2007).
Williamson, Oliver E.: Markets and Hierarchies: A Transactional and Antitrust Analysis of the
Firm, Free Press, New York, 1975.
Williamson, Oliver E.: ‘Transaction-cost economics: The governance of contractual relations’,
Journal of Law and Economics, vol. 22, no. 2, October 1979, pp. 233-261.
Williamson, Oliver E.: ‘The economics of organization: The transaction cost approach’, American
Journal of Sociology, vol. 87, no. 3, November 1981, pp. 548-577.
Wilson, Paul: Computer-Supported Cooperative Work. An Introduction, Intellect, Oxford,
England, 1991.
Winch, Peter: The Idea of a Social Science and its Relation to Philosophy, Routledge & Kegan
Paul, London, 1958. (Second ed., 1990).
Winograd, Terry: ‘A language/action perspective on the design of cooperative work’, in H.
Krasner and I. Greif (eds.): CSCW’86: Proceedings. Conference on Computer-Supported
Cooperative Work, Austin, Texas, 3-5 December 1986, ACM Press, New York, 1986, pp. 203-220.
Winograd, Terry; and Fernando Flores: Understanding Computers and Cognition: A New
Foundation for Design, Ablex Publishing Corp., Norwood, New Jersey, 1986.
Wittgenstein, Ludwig: Wittgenstein and the Vienna Circle: Conversations Recorded by Friedrich
Waismann (1929-32). Transl. by J. Schulte and B. McGuinness. Text ed. by B. McGuinness.
Basil Blackwell, Oxford, 1979.
Wittgenstein, Ludwig: Philosophical Remarks (Manuscript, 1930). Transl. from Philosophische
Bemerkungen. Transl. by R. Hargreaves and R. White. Text ed. by R. Rhees. Basil Blackwell,
Oxford, 1975.
Wittgenstein, Ludwig: Philosophical Grammar (Manuscript, 1931-34). Transl. by A. Kenny. Text
ed. by R. Rhees. Basil Blackwell, Oxford, 1974; Paperback ed. 1980.
Wittgenstein, Ludwig: Philosophy for Mathematicians: Wittgenstein’s Lectures, 1932-33. From
the Notes of Alice Ambrose (1932-33). Text ed. by A. Ambrose. In L. Wittgenstein:
Wittgenstein’s Lectures: Cambridge, 1932-1935. From the Notes of Alice Ambrose and
Margaret Macdonald. Basil Blackwell, Oxford, 1979, pp. 203-225.
Wittgenstein, Ludwig: The Yellow Book: Wittgenstein’s Lectures and Informal Discussions during
dictation of the Blue Book 1933-34. From the Notes of Alice Ambrose (1934-35). Text ed. by
A. Ambrose. In L. Wittgenstein: Wittgenstein’s Lectures: Cambridge, 1932-1935. From the
Notes of Alice Ambrose and Margaret Macdonald. Basil Blackwell, Oxford, 1979, pp. 41-73.
Wittgenstein, Ludwig: Remarks on the Foundations of Mathematics (Manuscript, 1937-44).
Transl. from Bemerkungen über die Grundlagen der Mathematik. Transl. by G. E. M.
Anscombe. Text ed. by G. H. von Wright; R. Rhees; and G. E. M. Anscombe. Basil Blackwell
Publishers, Oxford, 1978.
Wittgenstein, Ludwig: Lectures on the Foundations of Mathematics, Cambridge, 1939: From the
Notes of R. G. Bosanquet, Norman Malcolm, Rush Rhees, and Yorick Smythies (1939). Text ed.
by C. Diamond. The University of Chicago Press, Chicago and London, 1976.
Wittgenstein, Ludwig: Philosophical Investigations (Manuscript, 1945-46). Transl. from
Philosophische Untersuchungen. Transl. by G. E. M. Anscombe; P. M. S. Hacker; and J.
Schulte. Text ed. by P. M. S. Hacker and J. Schulte. In L. Wittgenstein: Philosophical
Investigations. Blackwell Publishers, Oxford, 4rd ed., 2009, pp. 1-181. – TS 227: Previously
published as Philosphical Investigations, Part 1.
Wittgenstein, Ludwig: Zettel (Manuscript, 1945-48). Transl. by G. E. M. Anscombe. Text ed. by
G. E. M. Anscombe and G. H. von Wright. Basil Blackwell Publishers, Oxford, 1967, 2nd
edition 1981.
Wittgenstein, Ludwig: Philosophical Investigations (Manuscript, 1945-49). Transl. from
Philosophische Untersuchungen. Transl. by G. E. M. Anscombe. Basil Blackwell Publishers,
Oxford, 1958.
Wittgenstein, Ludwig: Remarks on the Philosophy of Psychology. Volume I (Manuscript, 1946-49). Transl. by C. G. Luckhardt and M. A. E. Aue. Text ed. by G. H. von Wright and H.
Nyman. Basil Blackwell Publisher, Oxford, 1980.
Wittgenstein, Ludwig: On Certainty (Manuscript, 1949-51). Transl. by D. Paul and G. E. M.
Anscombe. Text ed. by G. E. M. Anscombe and G. H. von Wright. Basil Blackwell Publishers,
Oxford, 1967, 2nd ed. 1981.
Womack, James P.; Daniel T. Jones; and Daniel Roos: The Machine that Changed the World: The
Story of Lean Production, Rawson Associates, New York, 1990. – Harper Collins Publishers,
New York, 1991.
Woo, Carson C.; and Frederick H. Lochovsky: ‘Supporting distributed office problem solving in
organizations’, ACM Transactions on Office Information Systems, vol. 4, no. 3, July 1986, pp.
185-204.
Woods, David D.: ‘Commentary: Cognitive engineering in complex and dynamic worlds’,
International Journal of Man-Machine Studies, vol. 27, no. 5-6, November-December 1987,
pp. 571-585.
Woods, David D.: ‘Coping with complexity: the psychology of human behavior in complex
systems’, in L. P. Goodstein; H. B. Andersen; and S. E. Olsen (eds.): Tasks, Errors and Mental
Models. A Festschrift to celebrate the 60th birthday of Professor Jens Rasmussen, Taylor &
Francis, London, 1988, pp. 128-148.
Wynn, Eleanor H.: Office Conversation as an Information Medium, Ph.D. dissertation, University
of California, Berkeley, 1979.
Wynn, Eleanor H.: ‘Taking practice seriously’, in J. Greenbaum and M. Kyng (eds.): Design at
Work: Cooperative Design of Computer Systems, Lawrence Erlbaum, Hillsdale, New Jersey,
1991, pp. 45-64.
Yates, JoAnne: Control through Communication: The Rise of System in American Management,
Johns Hopkins University Press, Baltimore and London, 1989.
Zerubavel, Eviatar: Patterns of Time in Hospital Life: A Sociological Perspective, University of
Chicago Press, Chicago and London, 1979.
Zimmerman, Don Howard: Paper Work and People Work: A Study of a Public Assistance Agency,
Ph.D. dissertation, University of California, Los Angeles, 1966.
Zimmerman, Don Howard: ‘Record-keeping and the intake process in a public welfare agency’, in
S. Wheeler (ed.): On Record: Files and Dossiers in American Life, Russell Sage Foundation,
New York, 1969a, pp. 319-354.
Zimmerman, Don Howard: ‘Tasks and troubles: The practical bases of work activities in a public
assistance agency’, in D. A. Hansen (ed.): Explorations in Sociology and Counseling,
Houghton-Mifflin, New York, 1969b, pp. 237-266.
Zisman, Michael D.: Representation, Specification and Automation of Office Procedures, Ph.D.
dissertation, Dept. of Decision Sciences, The Wharton School, Univ. of Pennsylvania, PA,
1977.
Index