PHYSICS, COMMUNITY AND THE CRISIS IN
PHYSICAL THEORY
(PHYSICS TODAY, NOVEMBER 1993,
34-40)
We
are in the midst of a restructuring of the physical sciences. Internally, they
are stratifying into independent levels, with stable basic principles;
externally, budgets are shrinking and political objectives are changing.
Silvan S. Schweber1
1. Silvan Schweber
is the Koret Professor of the History of Ideas and a
professor of physics in the Martin Fisher School of Physics at Brandeis
University, in Waltham, Massachusetts. He is also an associate in the
department of the history of science at Harvard University in Cambridge,
Massachusetts.
A deep sense of unease permeates the physical sciences.
We are in a time of great change: The end of the cold war has ushered in an era
of shrinking budgets, painful restructuring and changing objectives. At the
same time, the underlying assumptions of physics research have shifted. Traditionally,
physics has been highly reductionist, analyzing
nature in terms of smaller and smaller building blocks and revealing
underlying, unifying fundamental laws. In the past this grand vision has bound
the subdisciplines together. Now, however, the
reductionist approach that has been the hallmark of theoretical physics in the
20th century is being superseded by the investigation of emergent phenomena,
the study of the properties of complexes whose "elementary"
constituents and their interactions are known. Physics, it could be said, is
becoming like chemistry.
Such observations, of course, are not new. In 1929, in
the wake of the enormous success of nonrelativistic quantum mechanics in
explaining atomic and molecular structure and interactions, Dirac, one of the
main contributors to those developments, asserted in a famous quotation that
"the general theory of quantum mechanics is now almost complete."1
Whatever imperfections still remained were believed to be connected with the
synthesis of the theory with the special theory of relativity. But these were
"of no importance in the consideration of atomic and molecular structure
and ordinary chemical reactions.... The underlying physical laws necessary for
the mathematical theory of a large part of physics and the whole of chemistry
are thus completely known, and the difficulty is only that the exact application
of these laws leads to equations much too complicated to be soluble."
Forty years later, the head of the theory division at
CERN, Léon van Hove, made similar comments. In an after‑dinner speech entitled "The
Changing Face of Physics," delivered during a Battelle Colloquium on phase
transitions and critical phenomena held in Geneva in September 1970, he said2
that it seemed that
physics
now looks more like chemistry in the sense that, in percentage, a much larger
fraction of the total research activity deals with complex systems, structures
and processes, as against a smaller fraction concerned with the fundamental
laws of motion and interaction. This colloquium is a good example. Surely, we
all believe that the fundamentals of classical mechanics, of the
electromagnetic interaction, and of statistical mechanics dominate the
multifarious transitions and phenomena you discuss this week; and I presume
that none of you expects his work on such problems to lead to modifications of
these laws. You know the equations better than the phenomena. You are after the
missing link between them, i.e., the intermediate concepts which should allow a
quantitative understanding and prediction of phenomena.
Van Hove was pointing to an important
transformation that had taken place in physics: As had previously happened in
chemistry, an ever larger fraction of the efforts in the field were being
devoted to the study of novelty rather than to the elucidation of fundamental
laws and interactions. Van Hove could have elaborated his picture of chemistry
further by pointing to its predominantly applied and utilitarian concern and to
the close ties that academic chemistry has traditionally had with industry. The
same was becoming true for physics, and in fact for most of the sciences.
The present situation is in part a consequence of the
enormous success of quantum mechanics and quantum field theory and of the
theoretical physics based on them since World War II.
World War II was a watershed for physics. It gave
physicists the opportunity to display their powers. They emerged from the war
transformed, with the state recognizing their value for its security and its
power.3 The enormous expenditures for research and development
during World War II and the lucrative support of science by the government
after the war brought about a revolution in the physical sciences. The
revolution was driven principally by novel techniques and instruments:
microwave generators and detectors; nuclear reactors; cyclotrons, synchrotrons
and linear accelerators; transistorized electronics and computers; nuclear
magnetic resonance devices; cryostats; masers and lasers. All these changed the
practice of physics and also transformed the theoretical foundations. That
transformation culminated in a new understanding of condensed matter physics
and the establishment in high‑energy physics of what is now called the standard
model. These conceptual developments in fundamental physics have revealed a
hierarchical structure of the physical world. Each layer of the
hierarchy is successfully represented while remaining largely decoupled
from other layers.4 These advances have supported the notion of the
existence of objective emergent properties and have challenged the reductionist
approach. They have also given credence to the notion that to a high degree of
accuracy our theoretical understandings of some of these domains have been
stabilized, since the foundational aspects are considered known. Let me amplify
these remarks.
Reduction and unification
The revolutionary achievements in physics during the
period from 1925 to 1927 stemmed from the confluence of a theoretical
understanding, the representation of the dynamics of microscopic particles by
quantum mechanics, and the apperception of an approximately
stable ontology (embodying the notion that all atoms, molecules, solids and so
on are built out of electrons and nuclei). By "approximately stable" I mean
that these particles (electrons and nuclei) could be treated as ahistoric objects, whose physical characteristics
were seemingly independent of their mode of production and whose lifetimes
could be considered as essentially infinite. One could assume these entities to
be essentially "elementary" point-like objects, each species specified by its
mass, spin, statistics (bosonic or fermionic) and electromagnetic properties such as its
charge and magnetic moment.
The success of quantum mechanics at the atomic level
immediately made it clear to the more perspicacious physicists that the laws
behind the phenomena had been apprehended, that they could therefore control
the behavior of simple microscopic systems and, more
importantly, that they could create new structures, new objects and new
phenomena.5 Thus already in 1928 John Slater had a vision of a
molecular engineering based on quantum mechanics. Condensed matter physics has
indeed become the study of systems that have never before existed. Phenomena
such as superconductivity are genuine novelties in the universe.
Quantum mechanics reasserted that the
physical world presented itself hierarchically. The world was not carved up
into terrestrial, planetary and celestial spheres but layered by virtue of
certain constants of nature. As Dirac emphasized in the first edition of Principles of Quantum Mechanics, Planck's constant allows us to parse the world into microscopic and macroscopic
realms, or more precisely into the atomic and molecular domains and the macroscopic
domains composed of atoms and molecules. The story repeated itself with the
carving out of the nuclear domain: quasistable
entities (neutrons and protons) could be regarded as the building
blocks of nuclei, and phenomenological theories could account for many properties
and interactions of nuclei.
During the 1970s the establishment of the standard model
description of the entities that populate the subnuclear
worlds (quarks, gluons and leptons) marked the attainment of another stage
in the attempt to give a unified description of the forces of nature. The
program traces its origins to Newton and his realization of the universality of
gravitation. Different aspects of the program were anchored during the 19th
century. Oersted and Faraday gave credibility to the quest by providing the
first experimental indications that the program of unification had validity.
Maxwell constructed a model for a unified theory of electricity and magnetism
and provided a mathematical formulation that explained many of the observed
phenomena and predicted new ones. Joseph von Fraunhofer
and others demonstrated that the laws of physics discovered here on Earth also
apply to stellar objects. With Einstein the vision became all-encompassing. Einstein advocated unification
coupled to a radical form of theory reductionism. In 1918 he said, "The
supreme test of the physicist is to arrive at those universal elementary laws
from which the cosmos can be built up by pure deduction."
Unification and reduction are the two tenets that have
dominated fundamental theoretical physics during the present century. One
characterizes the hope of giving a unified description for all physical
phenomena; the other the aspiration to reduce the number of independent
concepts necessary to formulate the fundamental laws. The conceptual or
idealistic component in reductionism takes precedence over the materialistic
one of reducing the number of so‑called elementary particles. The impressive
success of the enterprise since the beginning of the century has deeply
affected the evolution of all the physical sciences as well as that of
molecular biology.
But ironically, the very concepts that helped physicists
achieve the important advances that led to the standard model have eroded these
foundational tenets upon which the program is based. The ideas of symmetry
breaking, the renormalization group and decoupling suggest a picture of the
physical world that is hierarchically layered into quasi-autonomous domains,
with the ontology and dynamics of each layer essentially quasistable
and virtually immune to whatever happens in other layers. At the height of its
success, the privileged standing of high‑energy physics and the reductionism
that permeated the field were attacked.
Anderson's attack
In 1972 Philip Anderson, one of the foremost condensed
matter physicists, challenged the radical theory reductionist view held by the
majority of elementary‑particle physicists. His criticism was not only an
attack on the philosophical position of the high‑energy physicists; it also
challenged their dominance within the physics community and within the councils
of state. In an article entitled "More is Different,"6 Anderson
asserted that
the reductionist hypothesis does not by any means imply
a "constructionist" one: The ability to reduce everything to simple
fundamental laws does not imply the ability to start from those laws and
reconstruct the universe. In fact, the more the elementary‑particle physicists
tell us about the nature of the fundamental laws, the less relevance they seem
to have to the very real problems of the rest of science, much less to those of
society. The constructionist hypothesis breaks down when confronted with the
twin difficulties of scale and complexity.
Anderson believes in emergent laws. He
holds the view that each level has its own "fundamental" laws and its
own ontology. Translated into the language of particle physicists, Anderson
would say each level has its effective Lagrangian and
its set of quasistable particles. In each level the
effective Lagrangian, the "fundamental" description at that
level, is the best we can do. But it is not enough to know the
"fundamental" laws at a given level. It is the solutions to
equations, not the equations themselves, that provide a mathematical
description of the physical phenomena. "Emergence" refers to
properties of the solutions, in particular the properties that are
not readily apparent from the equations.
Moreover, the behavior of a
large and complex aggregate of "elementary" entities is not to be
understood "in terms of a simple extrapolation of the properties of a few
particles."6 Although there may be suggestive indications of
how to relate one level to another, it is next to impossible to deduce the complexity and novelty that can
emerge through composition. The study of the new behavior
at each level of complexity requires research that Anderson believes "to
be as fundamental in its nature as any other." Although one may array the
sciences in a roughly linear hierarchy according to the notion that the
elementary entities of science X obey the laws of science Y one step lower, it
does not follow that science X is "just applied science Y." The elementary
entities of condensed matter physics obey the laws of elementary‑particle
physics, but condensed matter physics is not just "applied elementary‑particle
physics," nor is chemistry applied many‑body physics, pace Dirac.
In his article Anderson sketched how the theory of broken symmetry helps
explain the shift from quantitative to qualitative differentiation in condensed
matter physics and why the constructionist converse of reductionism breaks
down.
Developments in quantum field theory and in the use of
renormalization‑group methods have given strong support to Anderson's views.
The insights from the renormalization‑group approach and from the effective
field theory method have also greatly clarified why the description at any one
level is so stable. In fact, renormalization‑group methods have changed
Anderson's remark "the more the elementary‑particle physicists tell us
about the nature of the fundamental laws, the less relevance they seem to have
to the very real problems of the rest of science," from a folk theorem
into an almost rigorously proved assertion.
Effective field theories
When renormalization was first developed,
after World War II, it was regarded as a technical device to get rid of the
divergences in perturbation theory and seemed to be a remarkably effective way
of sweeping the problems under the rug. Renormalizability became a criterion
for theory selection.7 Only renormalizable, local, relativistic
quantum field theories were adopted for the representation of the interactions
of what were then thought to be the "elementary particles." Note the
cunning of reason at work: The divergences that had previously been considered
a disastrous liability now became a valuable asset. However, the question of
why nature should be described by renormalizable theories was not addressed.
Such theories were simply the only ones in which calculations could be done.
The impressive advances of the late
1960s and early 1970s that culminated in the standard model were a triumph of
renormalization theory. In the lecture he delivered in 1979 in Stockholm when
he received the Nobel Prize in Physics, Steven Weinberg stressed that "to
a remarkable degree, our present detailed theories of elementary‑particle
interactions can be understood deductively, as consequences of symmetry
principles and of the principle of renormalizability which is invoked to deal
with the infinities."8 Relativistic local fields were once
again the "fundamental" entities, but now gauge symmetry and
renormalizability, together with the insights derived from spontaneously broken
symmetries of the Goldstone-Higgs variety, were the guiding principles.
But by the end of the 1970s the cunning of reason once
again had partially undermined the triumph. In the early 1950s Ernst Stueckelberg, Andreas Petermann,
Murray Gell‑Mann and Francis Low9 had made a fundamental observation
regarding the breakdown (due to renormalization) of naive dimensional analysis
in quantum field theory. The importance of that observation was not realized by
the community until the early 1970s, when Kenneth Wilson made manifest the
implications of Gell‑Mann and Low's paper. Wilson had derived important
insights from the work of condensed matter physicists attempting to explain
phase transitions (in particular the research of Lars Onsager, Benjamin
Widom, Leo Kadanoff and Michael Fisher) and from the work of axiomatic field
theorists who had elucidated the mathematical problems encountered in analyzing the properties of products of field operators.
Wilson's work made clear that renormalization was not a technical device to
eliminate the divergences "but rather is an expression of the variation of
the structure of physical interactions with changes in the scale of the
phenomena being probed."10
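Schematically, and in my notation rather than the article's, the observation of Stueckelberg, Petermann, Gell-Mann and Low can be stated as a renormalization-group equation for a coupling g measured at a scale μ,

\[
\mu \frac{d g(\mu)}{d\mu} = \beta\bigl(g(\mu)\bigr),
\]

so that the strength of an interaction is not a fixed number but varies with the scale at which it is probed; this running is the breakdown of naive dimensional analysis referred to above.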
The renormalization-group method made it clear that one
must take seriously the energy cutoff Λ that is introduced in the
regularization method that renders a relativistic quantum theory meaningful. It
was then realized that the physics that is observed at accessible energies (of
order E) can be described, neglecting terms of order (E/Λ)², by an "effective" field
theory in which the interaction terms are finite in number and limited in
complexity. In the effective field theory, the one that essentially any theory reduces to at sufficiently low energies,11 the nonrenormalizable interaction terms (which
are consistent with the symmetries of the interactions) are all suppressed by
powers of 1 over the cutoff, and hence negligible.
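As a schematic illustration (the notation here is mine, not the article's), such an effective Lagrangian can be organized as a sum over local operators O_i of mass dimension d_i that are consistent with the symmetries, each weighted by the appropriate power of the cutoff:

\[
\mathcal{L}_{\mathrm{eff}} = \sum_{i} \frac{c_i}{\Lambda^{\,d_i-4}}\,\mathcal{O}_i .
\]

Operators with d_i > 4, the nonrenormalizable interactions, then contribute to observables at energy E only at relative order (E/Λ)^(d_i-4), which is why they are negligible at accessible energies.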
Now once one accepts nonrenormalizable interactions,
the key issue in addressing a theory like quantum electrodynamics "is not
whether it is or is not renormalizable, but rather how renormalizable is it?"12
How large are the nonrenormalizable interactions in
the "effective" theory? What range of energy scales is well described by the degrees of freedom that
one keeps in the theory? Renormalizability, it turns out, has to do with the
range in energy over which the theory is valid.
In condensed matter physics the notions of an effective
description and an effective interaction predated the renormalization‑group
approach. Landau had emphasized those concepts in areas as diverse as the quasiparticle theory of helium and the Landau‑Ginzburg-Abrikosov theory of
superconductors. Anderson took the same point of view in his work. John
Bardeen, Leon Cooper and Robert Schrieffer showed how an effective Hamiltonian
approach could explain the behavior of
superconductors. All this was known to well‑trained condensed matter physicists
by 1960.
The concept of "universality" (the related notion that the long-wavelength
behavior is independent of small-distance behavior) predates the publication of Wilson's
work on the renormalization group. A. A. Migdal and
Alexander Polyakov used the term to describe the results of field theoretic
calculations in the late 1960s. The idea played a major role in the phenomenological
explorations of critical phenomena in the 1960s and was present in some form in
the work of Cyril Domb, Fisher and their associates at King's College, and it
was explicitly referred to in the 1967 review paper13 by Kadanoff
and coworkers. In the late 1960s most workers in
critical phenomena knew and accepted the view that much of the macroscopic behavior was quite independent of the microscopic forces.
Wilson's brilliant insights systematized the approach. In condensed matter
physics, renormalization‑group methods were not the source of universality; in
fact, the field provided the examples that were responsible for the further
development of the methods.14
Decoupling
Renormalization‑group methods in condensed matter
physics gave new insights into why the details of the physics of matter at
microscopic length scales and high energies are inconsequential for critical
phenomena. What is important is the symmetry involved, the conservation laws
that hold, the dimensionality of space and the range of the interactions. The
method is not restricted to critical phenomena. It has been extended to show
that for a many‑body system one can, by integrating out the short‑wavelength,
high‑frequency modes (which are associated with the atomic and molecular constitution),
arrive at a hydrodynamical description that is valid
for a large class of fluids, and which is insensitive to the details of the
atomic composition of the fluid. The particulars of the short‑wave (atomic)
physics are amalgamated into parameters that appear in the hydrodynamic
description. Those parameters, such as density and viscosity, encapsulate the
ignorance of the short‑distance behavior. The physics
at atomic lengths (and a fortiori high-energy physics) has become decoupled. (David Nelson
emphasized these points in a roundtable discussion with Weinberg on "What
is fundamental physics?" that was held in the department of the history of
science at Harvard University in May 1992.)
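In the Wilsonian language (again my notation, not the article's), this decoupling can be sketched as follows: if the degrees of freedom of the system are split into long-wavelength modes φ< and short-wavelength, atomic-scale modes φ>, integrating out the latter yields an effective action for the former alone,

\[
e^{-S_{\mathrm{eff}}[\phi_<]} = \int \mathcal{D}\phi_>\; e^{-S[\phi_<,\,\phi_>]},
\]

and the short-distance physics survives only through the values of the parameters (for a fluid, the density, the viscosity and so on) that appear in S_eff.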
High‑energy physics and condensed matter physics have
become essentially decoupled in the sense that the existence of a top quark, or
of any new heavy particle discovered at CERN or elsewhere, is irrelevant to the
concerns of condensed matter physicists, no matter how great their intellectual
interest in it may be. The same is true to some extent in nuclear physics. This
fragmentation has resulted in the exploration and conceptualization of the
novelty capable of expression in the aggregation of entities in each level, novelty that is evidently contained in
the "fundamental laws" (the effective Lagrangian)
at that level and that does not challenge their "fundamental"
character. The challenge is how to conceptualize this novelty: That is what
Anderson meant by "More is different."6
The statement that condensed matter and high-energy
physics have become decoupled refers to the ontology that is used: Electrons
and nuclei are the elementary particles of condensed matter physics, and the
relevant features of the internal constitution of a nucleus are embodied in the
(empirically determined) parameters stating its spin, magnetic moment, electric
quadrupole moment and so on. Further detail is irrelevant for describing (at
the usual level of accuracy) the phenomena probed by condensed matter physics.
There is, however, a great deal of exchange of ideas
between the two fields. More than anything else, it was the importance of
symmetry breaking, and in particular the beauty of the Higgs‑Anderson mechanism
of generating masses in massless gauge theories, that made the particle
physicists recognize the significance of the insights and techniques of
condensed matter physicists. Then Wilson's work on the renormalization group
brought home to both communities the importance of mutual education. And indeed
the cross‑fertilization has been of great value and mutual benefit to both subdisciplines. The common pastures include the topics of
scaling and renormalization, topological defects, two‑dimensional models, Monte
Carlo techniques, nonlinear σ models and much else.
The commonality of theoretical techniques used to
address problems in what were different fields is a general phenomenon.
Something similar is happening in areas of chemistry, meteorology and ecology,
where the mathematics of dynamical systems has given deep new insights into
such diverse phenomena as oscillating chemical reactions, the onset of
turbulence and population dynamics. The computer has been central in this development.
It has made "modeling" a new form of theoretical understanding
and has allowed "visualization" of systems where nonlinearity is an essential
feature. We can now deal with many phenomena that were not amenable to the
analytical methods fashioned until World War II.
The interdisciplinary nature of the new communities
studying these phenomena is also striking. The communities are held together not
by paradigms but by tools: renormalization-group methods, NMR
machines, lasers, neural networks, computers and so on. It is also noticeable
that despite the apparent increase in specialization, the interconnectedness of
science is becoming more prominent. Tools and concepts are constantly being
carried from one field to another in ways that are difficult to anticipate by
any logical and structural analysis.15
We need to reconceptualize the
growth of scientific knowledge. The Kuhnian model
will no longer do. The new model will have to take into account that something important
has happened. A hierarchical arraying of parts of the physical universe has
been stabilized, each part with its quasistable
ontology and quasistable effective theory, and the
partitioning is fairly well understood. For the energy scales that are
experimentally probed in atomic, molecular and condensed matter physics the
irrelevance (to a very high degree of accuracy) of the domains at much shorter
wavelengths has been justified. Effectively a kind of "finalization"
has taken place in these domains. The date
when finalization occurred for nonrelativistic quantum mechanics can be taken to be 1957, when Bardeen,
Cooper and Schrieffer explained superconductivity. (Until then, there was the
nagging possibility that quantum mechanics might break down at distances above
about 150 Å.)
Stabilization and community
The internal advances within physics that I have
sketched have altered the traditional relationships among the various branches
of physics. The shared commitment to unification and reduction that earlier had
bound the various subdisciplines together has been
substantially weakened. Moreover, these developments have taken place at a time
when the various sub‑branches confront severe problems. Together these two
trends have produced a deep sense of unease that has permeated the discipline,
and particularly the high‑energy physics community.16
In the past the
great successes of the program of reductionism and unification were taken as
confirmation of the existence of causal connections between different layers in
the physical world. Those successes also provided a justification for claiming
the fundamental nature of high‑energy physics activities and furnished a basis
for arguing its relevance to other areas of science. But decoupling theorems
and the effective field theory viewpoint have challenged these assumptions.
The sense of crisis in the high‑energy community has
been amplified by the disparity between the time scales for constructing new
accelerators and detectors and for creating novel theories. There have
essentially been no major new experimental results since the early 1980s, when
the discovery of the W's and the Z0 corroborated the electroweak
theory. Thus the empirical basis for new theoretical advances has been absent.
The cancellation of the SSC implies that this may be the state of affairs for a
long time to come. The termination of that project spells the end of an era for
high‑energy physics in the US.
Meanwhile, particle physics theorists have divided
themselves into various camps: phenomenologists, effective field theorists,
string theorists. String theorists represent the tradition of seeking unifying
theories, and many of them share Dirac's view that theories ought to be
beautiful. And everyone seems to believe that some of the most exciting aspects
of high‑energy physics are to be found in astrophysics. Nor has the condensed
matter community been unaffected. The excitement generated by the solution of
the problem of phase transitions has abated. The problems of explaining high‑temperature
superconductivity have proven more refractory than initially anticipated. The
departure of a number of distinguished practitioners to such fields as
biophysics and neural networks has not escaped notice.
The end of the cold war in the late 1980s exacerbated
the unease. During the cold war, national prestige and national security could
be invoked to justify the large outlays connected with experimental high‑energy
physics and other "fundamental" areas. But the new realities have called
into question the assumption that society will continue to give high‑energy
physicists, cosmologists and astrophysicists generous support simply because of
their ability to produce exciting new knowledge that will enhance national
prestige.
The sense of crisis has been widely noted. In the fall
of 1992 The New York Times reported that some 800 applications had been
received for a single tenure‑track position in the physics department at
Amherst College. PHYSICS TODAY in March 1992 (page 55) described the widespread
concern about the tightening job market for physicists, in particular for new
PhDs and postdocs. And the roundtable discussion featured in the February 1993
issue of PHYSICS TODAY (page 36) suggests that physics faces a situation as
difficult as that during the early 1930s, in the depths of the Depression. Nor
is the situation likely to improve in the foreseeable future. Universities are
undergoing a structural change and are planning to cut the size of their
tenured faculties. The industrial sector is cutting back its research
activities. Government funding is slackening, and its emphasis is shifting to
applied research that will enhance competitiveness and productivity. It is
probable that the discipline will shrink sharply over the next decade or so.
The crisis is
particularly acute for particle physics and cosmology. Those fields claim to
provide, in the reductionist sense, an ultimate basis for our understanding of
the physical world. The crisis manifests itself both at the cognitive and at
the social level. Fundamental research in physics is driven by a passion to
reveal the secrets of nature by probing deeper and deeper into the physical
world. This passionate internal dynamic, which has constantly propelled physics
toward an intellectual mastery beyond the creation of technologically
exploitable knowledge, is a deeply imbedded social practice that has ancient
roots in the religious sphere. But the modern pursuit is conducted within a
society whose dominant conception of rationality follows the doctrine of instrumentalism:
Truth is valued less than usefulness. The justification of a
reductionist pursuit within such a society depends, by and large, on the
relevance of such resource-expensive research (expensive in terms of both
capital and talent) to the goals set by the society.
The conceptual dimension of the crisis has its roots in
the seeming failure of the reductionist approach, in particular its
difficulties accounting for the existence of objective emergent properties.
Thus the formulation of the strong interaction as a non‑Abelian gauge theory of
quarks and gluons can be considered a "fundamental" description. But it has
proven very difficult to derive from quantum chromodynamics (without invoking any empirical data) an effective chiral Lagrangian describing low-energy pion-nucleon scattering,
or to deduce from QCD the binding energy of the deuteron and explain why
it is so small. Judging from the results obtained in the mathematical
description of phase transitions in various dimensions, ascertaining the
properties of the solutions of the "fundamental" equations (to say
nothing of obtaining actual solutions) is an extremely difficult mathematical
task involving delicate limiting procedures. The expectations that had seemed
firmly established by previous successes, namely further reductionism and
further unification, have not been met thus far. In fact, this lack of success
has, in some quarters, called into question unification and reductionism as a
strategy for the community, and thus has undermined an important
component of the value system shared by the community.
At the social level, the cutbacks in Federal funding and
the contraction of job opportunities in universities, national laboratories and
industrial laboratories are demanding a deep and painful restructuring of the
community. And for us in the United States this reassessment comes at a time
when we have to face the cost of waging the cold war, a conflict that has left
us almost bankrupt economically and in need of finding new bonds to hold the
nation together.
Where we are
The huge success and the stability of the theoretical
descriptions given by quantum mechanics and quantum field theory imply that
most scientists are no longer testing the foundational aspects of the domains
they work in. Therefore the goals of most of the scientific enterprise are no
longer solely determined internally; other interests come into play. The
scientific enterprise is now largely involved in the creation of novelty: in the design of objects that never
existed before and in the creation of conceptual frameworks to understand the
complexity and novelty that can emerge from the known foundations and ontologies. And precisely because we create
those objects and representations we must assume moral responsibility for them.
I emphasize the act of creation to make it clear that science
as a social practice has much in common with other human practices. But there are
important differences: In the physical sciences, nature places strong
constraints on our experiments and means of observation and plays the role of
ultimate arbiter. Stated slightly differently, the stabilities of the natural
order studied by the physical sciences have a time scale that is "quasi-infinite"
compared with most other time scales.
I believe that in the reconstruction we are engaged in
we must accept that the separation between the moral sphere and the scientific
sphere cannot be maintained. The history of the present century makes clear
that we must reject instrumental rationality, the notion that control and
usefulness should be the overriding criteria guiding our behavior.
Similarly, the relativism of the postmodernist position poses dangers by
allowing everyone to have his or her own criteria. Yet the stubborn question
remains: How shall we determine the universal criteria that I believe we must
have and commit ourselves to, criteria that transcendental philosophy had
postulated as a priori? Part of the answer surely must come from the lessons we
have learned from Darwin. On the one hand we have a moral responsibility for
the future of our species, and in fact for all life forms on Earth, and on the other hand we have a moral
responsibility to leave the future open, this in the face of having acquired
the knowledge of how to affect our own evolution and evolution in general.
Let me conclude by coming back to fundamental physics. I
believe that elementary‑particle physics has a privileged position, in that the
ontology of its domain and the order manifested by that domain refer to the
building blocks of the higher levels. But though the domain may have a
privileged status, the community investigating it is part of the collective
human enterprise. That community must confront the implications of all-encompassing
visions such as Einstein's and recognize their danger. Yet it must also
recognize the potency of those visions and accommodate them to a more human
scale.
I also believe that fundamental physics has a special
role to play precisely because of its remoteness from everyday phenomena and
its seeming lack of relevance to utilitarian matters, what its proponents after
World War II called its purity. There ought to be a part of the scientific
enterprise that does not respond easily to the demand for relevance. It has
become clear that that demand can easily become a source of corruption of the
scientific process. Elementary‑particle physics, astrophysics and cosmology are
among the few remaining areas of science whose advancement is determined
internally, based on experimental findings within the field and on its own
intrinsic conceptual structure. Particle physics and cosmology have not been "stabilized"
and may never be. Scientists engaged in fundamental physics have a special role, and
a special responsibility, as a community committed to the visions of Bohr and
Charles Sanders Peirce. (For Bohr, the practice of science exhibited a
commitment to an underlying moral order; Peirce's vision was of a community
determining truth through consensus and asymptotically approaching
"Truth.") Scientists constitute a model of what Jǖrgen
Habermas has called a communicative community: one
that exists under the constraint of cooperation, trust and truthfulness, and that
is uncoerced in setting its goals and agenda.17
That community is a guarantor that one of the most exalted of human aspirations, "to
be a member of a society which is free but not anarchical,"18 can indeed be satisfied.
For the past few years I
have collaborated closely with Tian Yu Cao and have had the benefit of his sharp critical faculties, vast erudition
and impressive technical knowledge.
This paper is a testimony to our constant dialogue.
References
1. P. A. M. Dirac, Proc. R. Soc. London, Ser. A 123, 714 (1929).
2. Quoted in C. Domb, Contemp. Phys. 26(1), 49 (1985).
3. For an entry into the enormous body of works on the physicists and World
War II, see M. Fortun, S. S. Schweber, Social
Studies of Science, Fall 1993, in press.
4. For a historical overview, see T. Y. Cao, S. S. Schweber, Synthese, Fall 1993, in press.
5. S. S. Schweber, Hist. Studies Phys. Biol. Sci. 20(2), 339 (1990). J. D. Bernal, The World, the Flesh and the Devil, Dutton, New York (1929).
6. P. W. Anderson, Science 177, 393 (1972).
7. F. J. Dyson, report to the Oldstone Conf., April 1949; see S. S. Schweber, QED and the Men
Who Made It, Princeton U. P., Princeton, N.J. (1994), p. 554.
8. S. Weinberg, Rev. Mod. Phys. 52, 515 (1980).
9. E. C. G. Stueckelberg, A. Petermann, Helv. Phys. Acta 26, 499 (1953). M. Gell-Mann, F. Low, Phys. Rev. 95, 1300 (1954). See also S. Weinberg, in Asymptotic Realms of Physics: Essays in Honor of Francis E. Low, A. H. Guth, K. Huang, R. L. Jaffe, eds., MIT P., Cambridge, Mass. (1983), p. 1.
10. D. Gross, in Recent
Developments in Quantum Field Theory, J. Ambjorn,
B. J. Durhuus, J. L. Petersen, eds., Elsevier, New
York (1985), p. 151. For an overview of Wilson's contributions, see his Nobel
lecture: K. Wilson, Rev. Mod. Phys. 55, 583 (1983).
11. H. Georgi,
in The New Physics, P. Davies, ed., Cambridge U. P., Cambridge, England
(1989), p. 446.
12. P.G. Lepage,
in From Action to Answers, Proc. 1989 Adv. Inst. in Elementary Particle
Physics, T. DeGrand, T. Toussaint, eds., World
Scientific, Singapore (1990), p. 483.
13. L.P. Kadanoff, W. Goetze, D. Hamblen, R. Hecht, E.
A. S. Lewis, V. V. Palciauskas, M. Rayl, J. Swift, D. Aspnes, J.
Kane, Rev. Mod. Phys. 39, 395 (1967).
14. A referee of this article stressed
the points made in this and the previous paragraph. See also M. Dresden, in Physical
Reality and Mathematical Description, C. P. Enz,
J. Mehra, eds., Reidel, Dordrecht, The Netherlands
(1985), p. 133.
15. This point was made by Harvey
Brooks already in 1973; see H. Brooks, Am. Sci. 75, 513 (1987).
B. L. R. Smith, American Science Policy since World War II, Brookings
Inst., Washington, D.C. (1990).
16. S.S. Schweber, in The Rise of
the Standard Model, Proc. June 1992 SLAC Conf., L. Brown, L. Hoddeson, M.
Riordan, eds., Cambridge U. P., Cambridge, England, to appear in 1994. I also
draw on a draft for a proposed workshop on the crisis in physics prepared by T.
Y. Cao and S. S. Schweber.
17. R. J. Bernstein, ed., Habermas and
Modernity, MIT P., Cambridge, Mass.
(1985).
18. I.I. Rabi, in The Scientific Endeavor: Centennial Celebration of the National
Academy of Sciences, Rockefeller Inst. P., New York (1963), p. 303.