Toward A Social Ethic of Technology
Introduction
Most approaches to ethics focus on individual behavior. In this paper, a
different approach is advocated, that of social ethics, which is offered as a
complement to individual ethics. To some extent, this is an exercise in
renaming some current activities, but it is also intended to clarify what is a
distinct and valuable ethical approach that can be developed much further
than it is at present. What is described here as social ethics is certainly
practiced, but it is not usually treated as a subject for philosophical inquiry.
Social ethics is taken here to be the ethical study of the options available to
us in the social arrangements for decision-making (Devon 1999; see also a
follow-on article to the present one, Devon and Van de Poel 2004). Such
arrangements involve those for two or more people to perform social
functions such as those pertaining to security, transportation,
communication, reproduction and child rearing, education, and so forth. In
technology, social ethics can mean studying anything from legislation to
project management. Different arrangements have different ethical tradeoffs;
hence the importance of the subject.
An illustration of social ethics is provided by the case of abortion (a
technology). The opponents of abortion take a principled position and argue
that abortion is taking a life and therefore that it is wrong. The opponents of
abortion believe all people should be opposed and have little interest in
variations in decision making practices. The pro-choice proponents do not
stress taking a position on whether abortion is good or bad but rather on
taking a position on who should decide. They propose that the pregnant
woman, rather than, say, male-dominated legislatures and churches, should
have the right to decide whether or not an abortion is the right choice for her.
The pro-choice position would legalize abortion, of course, hence the debate.
The pro-choice position, then, is based on social ethics. Very clearly,
different social arrangements for making a decision about a
technology (abortion in this case) can have very different ethical
implications and hence should be a subject for conscious reflection and
empirical inquiry in ethics.
When engineering ethics focuses only on engineers and not on the many other participants in
decision-making in technology, it exacerbates the problem (Devon 1991).
Studying ethics and technology means looking at both individual and
collective behavior in the production, use, and disposal of technology. This
broad scope may be contrasted with the best-developed sub-topic of
professional ethics applied to engineering, which has concentrated on roles
and responsibilities of working engineers (see Figure 1).
[Figure 1: a diagram relating social ethics to individual ethics. Elements include
the design process, project management, organizational behavior, policy,
legislation, and social responsibility (social ethics); and professional ethics
(engineers and other professionals) and other individuals (individual ethics).]
Following an IEEE Spectrum roundtable on the subject, several engineers wrote letters that included the
argument that two of the ethicists had confused a political stance with
ethics (IEEE Spectrum February 1997). The topic was the work of engineers
in various technologies such as chemical and other warfare technology, and
even working on the Cook County Jail. The ethicists in question did, in fact,
indicate personal opposition to such technologies and the letter writers were
making ethical defenses of working in such fields of engineering.
The letter writers in this case clearly felt that engineering ethics, as
presented, was excluding their values and, worse, condemning them. The same
experience occurred in the newsletters and meetings of a small, short-lived
group called American Engineers for Social Responsibility, in which I
participated. A single set of values was presented under a general rubric for
values, implicitly excluding (pejoratively) those who held other values, some
of whom told us as much. On the other hand, many engineers who feel there
are major ethical problems with the deployment of their skills can gain little
solace from codes of engineering ethics, and not much more from the
discourse of their professional societies.
We presently have no satisfactory way of handling this type of
discourse/conflict within engineering ethics, beyond making optimistic
injunctions such as calling for employers to accommodate any disjuncture
between the ethical profiles of employees and the work assigned to them by
the companies that employ them (Schinzinger & Martin 1989, p. 317; Unger
1997, pp. 6-7). This frustration has led to protest emerging as a theme in
engineering ethics, and this, in turn, gets rejected by many engineers as being
politics rather than ethics.
There is a way of dealing with the problem. Taking a social ethics approach
means recognizing not only that the ends and means of technology are
appropriate subjects for the ethics of technology, but also that differences in
value systems that emerge in almost all decision-making about technology are
to be expected. The means of handling differences, such as conflict resolution
processes, models of technology management, and aspects of the larger
political system, must be studied. This is not to suggest that engaging in
political behavior on behalf of this cause or that is what ethics is all about.
That remains a decision to be made at the personal level. Rather, the ethics
of technology is to be viewed as a practical science. This means engaging in
the study of, and the improvement of, the ways in which we collectively
practice decision making in technology. Such an endeavor can enrich and
guide the conduct of individuals, but it is very different than focusing on the
still obliged to do the best we can. There is simply no point in making ethical
judgments in a state of reparable ignorance.
Some texts have appeared that provide new resources in areas where
information has been lacking. For example, it is now possible to have some
idea of the global social and environmental changes that create the life cycles
of consumer products (Ryan, et al. 2000; Graedel & Allenby 2003). This is at
least a surrogate for inclusion (see below). But it is still easier for engineers to
understand a lot about how a technology works as a technology, while having
a limited understanding of its possible uses and its social and environmental
impacts in extraction, production, use, and disposal. Experts are usually paid
for their technical expertise, not for their contextual understanding, nor do
their bosses usually ask for it. It is irritating to wrestle with, and to solve,
the technical issues of a problem, only to be confronted with social issues such
as marketability, regulatory constraints, or ethical concerns (Devon 1989). It
is a recipe for producing defensive behavior. So, it is not enough to call for
cognizance; we need a methodology. And, while cognizance can be achieved
by social responsibility approaches at the individual level, the methodology
suggested will show how social ethics can powerfully supplement the
conscience and awareness of individuals.
The Role of Inclusion
This brings us to our second general value: we need to make sure the right
people are included in the decision making. Deciding who the right people
are should be a major focus in the social ethics of technology. Who they
might be is a point of concern in any industry where the clients, customers,
design and manufacturing staff, sales engineers, lawyers, senior management,
and various service units such as personnel are all relevant to a project. And
there will be other stakeholders such as environmental agencies, and the
community near a production plant, a landfill, a building, or a parking lot.
The classic article by Coates on technology assessment is instructive in this
regard (Coates 1971). Inclusion might be viewed as the difficult task of adding
stakeholder values to shareholder values, but that would be a misleading
representation.
Neglecting different stakeholders will have different outcomes at different
points in history. Neglect your customer and you risk losing money. Fail to
design for the environment and you may pay heavily later. Neglect safety
standards and you risk losses in liability as well as sales. Neglect
underrepresented minorities and the poor by placing toxic waste sites in their
communities and you may get away with it for a long time, but probably not
forever. In general, neglecting stakeholders, even when you are free to do so,
is a calculated risk and rarely ethical. The consequences of failure can be
severe. Nuclear energy technology ground to a halt with huge amounts of
capital at stake, in part, because the stakeholder issue was so poorly handled.
Once the public trust had gone, even reasonable arguments were discounted.
Involving diverse stakeholders helps with the problem of cognizance since
this diverse representation will bring disparate points of view and new
information to bear on the design process. There is also evidence that
inclusiveness with respect to diversity generates more creativity in the design
process (Leifer 1997) and facilitates the conduct of international business
(Lane, DiStefano, & Maznevski 1997). Creating more and different options
allows better choices to be made. While the final choice made may not be the
most ethical one, a wide range of choices is likely to provide an alternative
that is fairly sound technically, economically, and ethically. To some extent
then, the broader the range of design options that are generated, the more
ethical the process is. Thus, increasing representation in the design process
by stakeholders is ethical in itself and it may be in its effect on the final
product or process, also, by expanding cognizance and generating more
options. One area of design that is growing rapidly is inclusive, or universal,
design, which began as the study of adaptive technology for those with
disabilities. It is now embracing a continuum approach to human needs and
abilities, with much interest, for example, in the effects of aging (Clarkson, et al.
2003). It is clear that such designs often have benefits for the average
consumer, such as ramps into buildings and wider, better-grip pens. This reflects
the power of diversity that comes from more inclusive social processes in
design.
Democratizing design is not straightforward. Experts exercise much executive
authority. Corporate and government bosses think the decisions are theirs.
Clients are sure that they should decide since they pay. And the public is not
always quick to come forward because we have strongly meritocratic values.
Purely lay institutions like juries are sometimes regarded with suspicion. Yet
in Denmark they have been experimenting with lay decision-making about
complex issues like genetic engineering. Lay groups are formed that exclude
experts in the areas of the science and technology being examined. At some
point, such experts are summoned and they testify under questioning before
the lay group. Then the lay group produces a report and submits it to
parliament. These lay groups ask the contextual questions about the science/
technology being examined: what will it do, what are the costs and benefits
and to whom, who will own it, what does it mean for our lives, for the next
generation, or for the environment? The results have been encouraging, and
industries have become increasingly interested in the value of these early
assessments by the general public for determining the direction their product
design and development should take (Sclove 1996).
The Decision Making Process
So far it has been argued that:
an individual ethics approach with social issues appended via the concept of
social responsibility. The comparison is provided in Table I.
Table I: Social and Individual Ethics Compared

                     Social Ethics of Technology         Individual Ethics of Technology
Subject population   Everyone                            Engineers
Target process       Social arrangements for making      Individual accountability
                     decisions about technology
Key loyalties        Inclusive process and cognizance    Fiduciary loyalty and conscience
                                                         (social responsibility)
Conceptualization    Seamless connection to social       Political values and processes are
                     and political life                  seen as externalities
The debate in IEEE Spectrum ground to a halt over a clash of opinions and
an irreconcilable disjuncture between what is ethics and what is politics. Using
a social ethics framework, the differences of opinion would be treated as
normal, and the idea of a boundary between ethics and politics would be
rejected as detrimental to both ethics and politics. The discussion would focus
on assessing the technologies and the social arrangements that produced
them. Asymmetries between those who control the technology and those who
are affected by the technology would characterize at least a part of this
discussion.
Recent coverage of the plight of workers at the secret government site, Area
51, in Nevada by the Washington Post (July 21, 1997) may be illustrative for
this discussion. The workers are sworn to secrecy and the government denies
the worksite even exists. According to the account, the workers are exposed
to very damaging chemicals through disposal by burning practices. Their
consequent and severe health problems cannot be helped nor the causes
addressed, because, officially, nothing happened at no such place. While
ethical defenses of weapons production exist, the situation as described in
the Washington Post reveals a problem. The problem is occurring where
there is a large asymmetry in the social arrangements for decision making in
technology between those who control it and those who are affected by it. A
social ethics of technology provides a framework for discussing these
arrangements that brings everyone to the table. And much could be done here
without jeopardizing national security. A good result of such a discussion
would be the generation of a variety of options in the social arrangements for
pursuing the technology at hand, some of which would surely be safer for the
workers' health.
Social Ethics of Technology in Practice
If the social ethics of technology is so important, it is reasonable to assume
that we are already doing it. This appears to be true. A social ethics of
technology is at work in legislatures, town councils, and public interest groups.
Elements may be found in books on engineering and even in codes of
engineering ethics. The tools are those of technology assessment, including
environmental impact assessment, and management of technology. But these
tools, like the social ethics of technology, are poorly represented in the
university. There is no systematic attempt to focus in the name of ethics on
the variety and efficacy of the social processes involved in designing,
producing, using, and disposing of technology.
In education, for example, two of the best texts on the sub-field of
engineering ethics address a lot of social ethics topics (Schinzinger & Martin
1989; Unger 1997). They study both means and ends, and both individual and
social processes. But the subject matter is always reduced to the plight of
individual engineers, their rights and social responsibilities. As the authors of
one text summarize their views, "We have emphasized the personal moral
autonomy of individuals" (Schinzinger & Martin 1989, p. 339). They note
that "there is room for disagreement among reasonable people ... and there
is the need for understanding among engineers and management about the
need to cooperatively resolve conflicts" (op. cit., p. 340). But this is said as a
caveat to their paradigm of understanding individual responsibilities. A decade
later they reiterate this view in a text with far more social and environmental
issues than they had before: "Engineers must ... reflect critically on the moral
dilemmas they will confront" (Schinzinger & Martin 2000, p. ix). A social
ethics approach would view these statements about value differences and
management/employee conflicts as starting points and systematically explore
the options for handling them. Further, even the emphasis on employee-management conflict is perhaps exaggerated by the focus on the individual.
There are also some win-win options in conflictual situations as seen by
accomplishments in negotiation and in design for the environment practices.
An individual ethics approach tends to set the individual up with a choice
between fiduciary responsibility and whistle blowing. This disempowers
References
Aristotle. Nicomachean Ethics. New York: Macmillan, 1990.
Aristotle. Aristotle's Politics, H. G. Apostle and Lloyd Gerson, eds. and trans., Grinnell, Iowa:
The Peripatetic Press, 1986.
Boisjoly, R. personal communication after a lecture at Penn State, 1998. See also
http://onlineethics.org/moral/boisjoly/RB-intro.htm
Clarkson, J., Coleman, R., Keates, S. & Lebbon, C. Inclusive Design. London: Springer, 2003.
Coates, J.F. Technology Assessment. The Futurist 5 (6): 1971.
Denise, T., Peterfreund, S.P. & White, N. Great Traditions in Ethics, 8th Ed., New York:
Wadsworth. 1996, p. 1.
Devon, R.F. A New Paradigm for Engineering Ethics. A Delicate Balance: Technics, Culture
and Consequences. IEEE/SSIT, 1991.
________. Towards a Social Ethics of Technology: The Norms of Engagement. Journal of
Engineering Education (January): 1999.
Devon, R. & Van de Poel, I. Design Ethics: the Social Ethics Paradigm. International
Journal of Engineering Education 20 (3): 2004, 461-469.
Dewey, J. The Quest for Certainty. in Denise, T., Peterfreund, S.P. & White, N. (eds.), Great
Traditions in Ethics, 8th Ed., Belmont, CA: Wadsworth. 1996.
Fielder, J. H., & Birsch, D. (eds.). The DC-10 Case: A Study in Applied Ethics, Technology, and
Society. Buffalo, New York: State University of New York Press, 1992.
Graedel, T. E., & Allenby, B.R. Industrial Ecology. Upper Saddle River, New Jersey: Pearson
Education, 2003.
IEEE Spectrum, December, 1996. Ethics Roundtable: Doing the Right Thing, pp. 25-32; and
letters responding in February 1997, pp. 6-7, March 1997, p. 6, April 1997, p. 6.
Kunkle, G.C. New Challenge or the Past Revisited? The Office of Technology Assessment in
Historical Context. Technology in Society 17 (2): 1995.
Lane, H.W., DiStefano, J. & Maznevski, M.L. International Management Behavior. Cambridge,
Mass: Blackwell, 1997.
Leifer, L. Design Team Performance: Metrics and the Impact of Technology. in Seidner, C.J.,
& Brown, S.M. (eds.), Evaluating Corporate Training: Models and Issues. Norwell,
Mass: Kluwer, 1997.
Martin, M.W. & Schinzinger, R. Ethics in Engineering. 2nd Ed. New York: McGraw Hill, 1989.
Sclove, R. Technology Review, July 1996.
Scribner, C.F. & Culver, C.G. Investigation of the Collapse of L'Ambiance Plaza. Journal of
Performance of Constructed Facilities 2 (2): 1988, pp. 58-79.
Smith, H. Rethinking America. New York: Random House, 1995.
Taylor, H.D. Flixborough: Implications for Management. London: Keith Shipton
Developments Limited, 1975.
Ulrich, K.T. & Eppinger, S.D. Product Design and Development. New York: McGraw-Hill/Irwin,
2004.
Unger, S.H. Controlling Technology: Ethics and the Responsible Engineer. 2nd ed., New York:
Wiley & Sons, 1997.
Vaughan, D. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at
NASA. Chicago: University of Chicago Press, 1997.
Whitbeck, C. Ethics as Design: Doing Justice to Moral Problems. Hastings Center
Report, May/June 1996, pp. 9-16.