An intellect which at any given moment knew all the forces that animate Nature and the mutual positions of the beings that comprise it, if this intellect were vast enough to submit its data to analysis, could condense into a single formula the movement of the greatest bodies of the universe and that of the lightest atom: for such an intellect nothing could be uncertain; and the future just like the past would be present before its eyes. Pierre Simon Laplace, Philosophical Essay on Probabilities (1814).


"The highest object at which the natural sciences are constrained to aim, but which they will never reach, is the determination of the forces which are present in nature, and of the state of matter at any given moment - in one word, the reduction of all the phenomena of nature to mechanics." Gustav Robert Kirchhoff, 1865.

The idea that all movement could in theory be reduced to simple laws of nature was first recorded for posterity by Democritus of Abdera in 4th Century BC Greece. This reductionist outlook was also expressed by the Roman philosopher Lucretius during the 1st Century BC. After the Dark Ages, during which Greek and Roman ideas lost favor, thinkers grew disillusioned with religion and became curious about the ideas of those ancient challengers of piety and ritual. Isaac Newton wrote about the forces of nature as a basis for understanding movement during the 17th Century. The Philosophes of 18th Century France restored rationalism as the ultimate guide to Truth. Scientific discoveries continued to provide support for the reductionist notion, as shown by leading 19th Century scientists such as Laplace and Kirchhoff.

During the early 20th Century the reductionist paradigm came under challenge by Quantum Physics. The new physics does not claim to require divine intervention or primitive spirits, but it does appear to require randomness for events at physical scales the size of the atom and smaller. The so-called Newtonian physics is correct as far as it goes, but it cannot explain a category of physical events associated with the atom and its interaction with light. At some future time it may be possible to portray quantum physics with the same deterministic quality as Newtonian physics, but for now it is prudent to assume that some events are subject to random outcomes. The strict form of determinism called for by Newtonian physics should be replaced by a probabilistic form of determinism. However, both forms of determinism are reductionist since they are based on the notion that the movements of all particles, and their interaction with light, conform to the laws of physics. This last phrase is key, “conform to the laws of physics,” and it is dealt with at greater length in Appendix A.

"Reductionism" either angers people or delights them. The entire enterprise of science is based on reductionist tenets. Whereas all scientists practice their profession in accordance with the reductionist paradigm, there is a part of the brain so opposed to it that roughly 40% of scientists claim not to believe in reductionism. These conflicted scientists are found mostly in the humanities, where muddled thinking is less of a handicap; within the physical sciences there is almost universal belief in reductionism (especially among the most esteemed scientists).

The end-point of reductionist thinking can be most easily described by the following analogy: the universe is like a giant billiard table, in which all movements are mechanical, having been set in motion by the explosive birth of the universe 13.7 billion years ago! Of course this analogy neglects quantum physics, but nothing essential to our understanding of the evolution of life, and of human nature, is lost by this simplification. The march of events, from one moment to the next, is captured by this simple-minded, deterministic view. Since this version of reductionism is essential to the rest of this book, and since it is a fascinating subject in its own right, I devote the rest of this chapter to a brief description of reductionism and an appendix to its fuller treatment.

A Rigid Universe 

"The Universe Rigid" was possibly H. G. Wells's most important manuscript. It languished with the publisher, who didn't understand it, and eventually it was lost. Instead of reconstructing it, Wells turned its central idea into a story, The Time Machine (1895).

Ten years later, Albert Einstein published his Special Theory of Relativity (1905), which expanded upon the idea, already familiar to physicists of the day, of time as a fourth dimension. This concept was treated in "The Universe Rigid" essay with unusual insight, especially for a non-physicist. The following is a brief summary of it that appeared in The Time Machine:

"... Suppose you knew fully the position and the properties of every particle of matter... in the universe at any particular moment of time... Well, that knowledge would involve the knowledge of the condition of things at the previous moment, and at the moment before that, and so on. If you knew and perceived the present perfectly, you would perceive therein the whole of the past. ... Similarly, if you grasped the whole of the present, ... you would see clearly all the future. To an omniscient observer ... he would see, as it were, a Rigid Universe filling space and time - a Universe in which things were always the same. He would see one sole unchanging series of cause and effect... If 'past' meant anything, it would mean looking in a certain direction; while 'future' meant looking the opposite way. From the absolute point of view the universe is a perfectly rigid unalterable apparatus, entirely predestinate, entirely complete and finished... time is merely a dimension, quite analogous to the three dimensions of space."

This passage describes the underlying principle of what is referred to in today's parlance as "reductionism."  It leaves no room for spirits, mysticism or gods. Most intellectuals use the term "reductionist" as a disparaging epithet, but I believe that reductionism is the crowning achievement of human thought. 

Mechanical Materialism

Reductionism is the belief that complex phenomena can be "reduced" to simpler physical processes, which themselves can in theory be reduced to the simplest level of physical explanation where elementary particles interact according to the laws of physics. The ancient concept of Mechanical Materialism captures the essence of reductionism, but relies upon the outdated concept that at the most basic level the particles are like stones, and interact by hitting each other like billiard balls.

Nevertheless, reductionism is the fulfillment of what Democritus and Lucretius dreamed about, a mechanistic world‑view that as a bonus could also deliver people from the tyranny of religion. Lucretius would agree with the statement: "There is no need for the aid of the gods, there is not even room for their interference.... Man's actions are no exception to the universal law, free‑will is but a delusion." (Bailey, 1926, describing how Lucretius viewed the world). 

It will be instructive to review mechanical materialism before describing the version of reductionism required by modern physics.  

Imagine a game of billiards photographed from above, and consider the frames replayed in slow motion. After the cue sends one ball into motion, the entire sequence of subsequent impacts and bounces is determined. If this were not so, if the balls had a mind of their own, or if some mysterious outside force intervened, then consistently good players would not exist. Now imagine a very slow replay of the motions of the billiard balls; millisecond by millisecond the movements unfold with an undeniable inevitability. A careful analysis would reveal that conserved momentum and elastic collisions govern the placement and velocity of each ball in the next millisecond.

Given two successive "frames," an observer would know the positions and velocities of every ball, and he could calculate their placement, velocities and future impacts for any arbitrarily short instant later. He could thus predict the following frame, and the process that allowed the prediction of frame 3 from frames 1 and 2 could be repeated for frames 2 and 3 to predict frame 4. And so on, for all future frames.  In this way, the observer could predict all future movements (don’t worry about the fact that we've ignored friction). 

By a similar process the observer could infer a previous frame from any two neighboring frames. Thus, frames 1 and 2 could be used to predict frame 0, etc. Therefore, by knowing any two frames, all future and past frames could be inferred. This is the thought H. G. Wells captured with his unpublished Universe Rigid essay. 
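The two-frame argument above can be sketched in a few lines of code. This is a toy illustration under the chapter's frictionless, collision-free assumptions; the function names are my own invention, not from any physics library.

```python
# Toy illustration of the two-frame argument: positions sampled at two
# successive frames fix the velocities, and with the velocities every later
# (and earlier) frame follows. One position number per ball; frictionless,
# collision-free motion is assumed, as in the text.

def next_frame(frame_a, frame_b):
    """Predict frame 3 from frames 1 and 2 by linear extrapolation."""
    return [b + (b - a) for a, b in zip(frame_a, frame_b)]

def previous_frame(frame_a, frame_b):
    """Run the same inference backward: frames 1 and 2 yield frame 0."""
    return [a - (b - a) for a, b in zip(frame_a, frame_b)]

frame1 = [0.0, 5.0]   # positions of two balls in the first frame
frame2 = [1.0, 4.5]   # one frame later: ball 1 moved +1.0, ball 2 moved -0.5

frame3 = next_frame(frame1, frame2)      # [2.0, 4.0]
frame0 = previous_frame(frame1, frame2)  # [-1.0, 5.5]
```

Repeating `next_frame` on each newly predicted pair generates every future frame, which is exactly the observer's procedure described above.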

Reductionism as a Basis for Physics

The 19th Century saw a de‑mystification of various science disciplines. The reshaping was done by rationalists, building upon the legacy of the 18th Century philosophes. The rationalists firmly placed science on a footing that has endured throughout the 20th Century. Like the machines of 19th Century inventors, the paradigm developed by 19th Century scientists was "mechanistic." 

Ernst Mach forced metaphysics out of physics (Holton, 1993, pg. 32). Chemistry was changed from a floundering quest for transmuting common elements to gold, into a physics‑based understanding of atoms and molecules. Darwin displaced God from the creation of life by presenting his theory of evolution, even though this may have saddened him personally.

By the end of the 19th Century, when Wells began to write, the intellectual atmosphere was congenial to ideas that reduced mysterious happenings to a juxtaposition of commonplace physical events. Each event in isolation was conceptually simple. It is the mere combining of many such events that causes things to appear incomprehensible.

Reductionism is based on a concept taught in college Physics 101. I remember well that without fanfare the physics instructor stated that there are only four forces in nature (gravity, electromagnetism, the strong nuclear force and the weak force), and that these forces act upon a finite number of particles that are pulled this way and that by the summation of all forces acting upon each particle. In laboratory experiments where the number of relevant forces can be controlled and confined to only one or two, motions are observed to be governed by a simple law: F = m•a, or "force equals mass times acceleration" ("acceleration" is the rate of change of the velocity vector). It is easier to understand this law of nature by rewriting it in the form a = F/m, which states that a particle's acceleration equals the sum of forces acting on it divided by the particle's mass. Mathematically, a and F are vectors, which is why these symbols are written in "bold" typeface, and m is a scalar (no orientation is involved); thus, the equation a = F/m keeps track of the 3-dimensional orientation of forces and accelerations. Since forces can originate from many sources, they must be added together to yield one net force.

At every instant a particle is responding to just one net force. It responds by accelerating in the direction of that force (which has a magnitude and direction).  The particle's velocity vector changes due to its acceleration.  Since the time history of a particle's velocity specifies where it goes, the particle's "behavior" is completely determined by the forces acting upon it. This description is called Newtonian physics, and it reigned supreme throughout the 19th Century.  
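The two paragraphs above amount to a recipe: sum the forces, divide by mass, and march velocity and position forward in small time steps. A minimal sketch of that recipe follows; the particular force (constant gravity on a 1 kg particle) is invented purely for illustration.

```python
# Minimal sketch of Newtonian determinism in three dimensions: sum the
# forces on a particle, divide by mass (a = F/m), and step velocity and
# position forward. The force used here (constant gravity on a 1 kg
# particle) is invented for illustration.

def step(pos, vel, forces, mass, dt):
    net = [sum(f[i] for f in forces) for i in range(3)]   # vector sum of forces
    acc = [f_i / mass for f_i in net]                     # a = F / m
    vel = [v + a * dt for v, a in zip(vel, acc)]          # update velocity
    pos = [p + v * dt for p, v in zip(pos, vel)]          # then position
    return pos, vel

pos, vel = [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]  # start at origin, drifting along x
gravity = [0.0, 0.0, -9.8]                   # newtons, acting on a 1 kg particle
for _ in range(10):                          # ten steps of 0.1 s each
    pos, vel = step(pos, vel, [gravity], mass=1.0, dt=0.1)
# after 1 s: x has advanced about 1 m, vertical speed is -9.8 m/s
```

Given exact initial positions, velocities, and forces, every later state is fixed; the loop is the billiard-table universe in miniature.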

Quantum Physics

During the late 19th Century a disturbing number of laboratory measurements were made that defied explanation using Newtonian physics. Radioactivity was a puzzle, for it seemed that atoms of certain (radioactive) elements would spontaneously, and at random, emit a particle. There was also the puzzle of atoms absorbing and emitting light at only specific wavelengths, producing a unique spectral pattern for each atomic element. Newtonian physics had no way to accommodate these and other puzzling phenomena.

Quantum physics was developed in response to these puzzling measurements, all of which were related to mysterious phenomena inside the atom. The new physics expanded upon the idea that everyday objects were constructed from electrons orbiting a nucleus composed of protons and neutrons (themselves now known to be built from quarks; the Standard Model counts 12 elementary building blocks of matter). It was proposed that electrons could be thought of as waves, with a wavelength such that the only permitted orbital circumferences around a nucleus were those containing an integer number of wavelengths. Changes in an electron's orbit involved changes in energy, so if an electron moved to a higher energy orbit (farther from the nucleus and larger in circumference) it had to absorb energy from somewhere (such as a photon of light) corresponding exactly to the difference in the electron's energy in the two orbits - hence the quantization of spectral absorption features for each atomic element.
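The orbit-quantization argument is what produces the unique spectral fingerprint mentioned earlier. As a hedged illustration, the standard Rydberg formula for hydrogen predicts the visible (Balmer) emission lines; the constant and formula are textbook values, and the function name is my own.

```python
# Hedged illustration: the Rydberg formula gives hydrogen's discrete
# spectral wavelengths, the fingerprint pattern described in the text.
# 1/lambda = R * (1/n_low**2 - 1/n_high**2), where R is the Rydberg constant.

R = 1.097e7  # Rydberg constant, per metre (textbook value)

def wavelength_nm(n_low, n_high):
    """Wavelength (nm) of the photon for an electron dropping between orbits."""
    inverse_wavelength = R * (1.0 / n_low**2 - 1.0 / n_high**2)
    return 1e9 / inverse_wavelength  # metres -> nanometres

# Balmer series: the electron falls to the n = 2 orbit, emitting visible light.
balmer = [wavelength_nm(2, n) for n in (3, 4, 5)]
# roughly 656 nm (red), 486 nm (blue-green), and 434 nm (violet)
```

Only these discrete wavelengths appear because only whole-number orbit changes are allowed, which is the quantization the text describes.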

As quantum physics developed to explain more laboratory experiments related to the atom, the theory became weirder and weirder. Quantum mechanics (QM) was developed to deal with particles, and quantum field theory was developed to explain radiation and its interaction with particles.  Quantum physics has been described as inherently probabilistic, or indeterminate, and has been characterized as having so much "quantum weirdness" that our minds are intuitively unprepared to comprehend it. Quantum physics “works” in the sense that it gives a better account than any other theory for atomic scale physical phenomena. Contrary to popular belief, it does not discredit Newtonian physics, which is still valid for large scale phenomena; rather, it is more correct to say that quantum physics supplements Newtonian physics.  Almost every physical situation can be easily identified as requiring one or the other embodiment of physical law.

It now seems that two of the aforementioned four forces can be "unified" (the weak and the electromagnetic). One of the main goals of physics today is to create a “unified” theory that incorporates all the explanatory power of the four forces plus the weird but useful explanatory power of quantum physics.  

One of the most counter-intuitive properties of quantum physics is the notion that events are not strictly determined but are only probable, and that particles are not tiny things at a specific location but are probability functions in 3-dimensional space. When a particle moves, the probability function describing its location moves. In the laboratory it is impossible to measure a particle's position without changing its velocity, and similarly impossible to measure a particle's velocity without changing its position. The Heisenberg Uncertainty Principle quantifies this trade-off: the product of the uncertainties in a particle's position and momentum can never be smaller than a fixed constant of nature.
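The uncertainty relation can be made concrete with a back-of-the-envelope calculation: confining an electron to an atom-sized region forces a large spread in its velocity. The constants below are standard textbook values; the function name is illustrative.

```python
# Back-of-the-envelope Heisenberg calculation: delta_x * delta_p >= hbar/2,
# so confining an electron to an atom-sized region (delta_x ~ 1 angstrom)
# forces a minimum spread in its velocity. Constants are textbook values.

HBAR = 1.055e-34        # reduced Planck constant, joule-seconds
M_ELECTRON = 9.11e-31   # electron mass, kilograms

def min_velocity_uncertainty(delta_x, mass):
    """Smallest possible velocity spread for a given position spread."""
    return HBAR / (2.0 * mass * delta_x)

dv = min_velocity_uncertainty(1e-10, M_ELECTRON)  # atom-sized confinement
# dv comes out near 6e5 m/s -- enormous on atomic scales
```

For a billiard ball the same formula gives an immeasurably tiny spread, which is why the uncertainty principle never intrudes on everyday Newtonian physics.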

Einstein believed, but could not prove, that although we didn’t know of a way to measure a particle’s position and velocity simultaneously with great accuracy, the particle nevertheless has a well-defined position and velocity, and it interacts with other particles as if this is so. His speculation was described as needing a “basement level” of physical laws, which had not yet been discovered. With a “basement level” of physical laws the apparent “unknowableness” of a particle’s properties would be just that, apparent “unknowableness.” The particle “knows” where it is located and how fast it is going, and in what direction relative to the rest of the universe - even if humans can’t know.

This “quantum weirdness” is often cited to discredit the idea that events are “determined.” But we cannot rule out the possibility that future physicists will discover a basement level of physical law, and that this will restore Newtonian physics as a complete theory for all size scales. The new Newtonian physics would have the old Newtonian physics as a first approximation, valid for use with the vast majority of physical phenomena dealt with on a daily basis. 

Starting here I will present only brief summaries of chapter sub-sections that have been moved to Appendix A for this Second Edition of Genetic Enslavement.

Levels of Physical Explanation

The matter of “levels of physical explanation” must be dealt with for the reader who is not prepared to accept the existence of a basement level of physical law.

In the physical sciences it is common to treat a physical process at a “higher level” than atoms interacting in accordance with the most basic level of physical law, a = F/m and quantum physics. Instead, other “laws” are constructed for everyday settings, either derived from the basic level of laws or derived from experiment and deemed compatible with the basic laws. One example should serve to illustrate this.

Consider the atmosphere, which consists of an immense number of molecules. Any thought of using a = F/m applied at the level of molecules for the purpose of predicting the weather would be silly because of its impracticality. There is no way to know the position and velocity of all the molecules in the atmosphere at a given time for establishing the "initial conditions" required for subsequent calculation using a = F/m. The meteorologist employs a “higher level of physical explanation” by inventing “laws” that govern such aggregate properties as "atmospheric pressure," “temperature,” and "wind speed."

In each case the invented property and the rules for using it can be derived from a = F/m, so these handy properties and rules for usage are "emergent properties" of the basic level of physical laws. Every atmospheric scientist would acknowledge that whenever a meteorologist relies on a handy rule, such as "wind speed is proportional to pressure gradient," what is really occurring in the atmosphere is the unfolding of an immense system of particles obeying a = F/m.
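"Temperature" is perhaps the simplest such emergent property: nothing in a = F/m mentions it, yet it falls out as the mean kinetic energy of a swarm of molecules. Below is a hedged sketch with invented molecular-speed data (a toy Gaussian sample, not a proper Maxwell distribution); the constants are textbook values.

```python
# "Temperature" as an emergent property: it does not exist at the level of
# individual molecules, yet it is computable from their speeds as a mean
# kinetic energy. The speed data below are an invented toy sample.

import random

K_B = 1.381e-23   # Boltzmann constant, joules per kelvin
M_N2 = 4.65e-26   # mass of a nitrogen molecule, kilograms

def temperature(speeds, mass):
    """Mean kinetic energy per molecule converted to kelvins.

    For an ideal gas, (3/2) * k_B * T equals the mean of (1/2) * m * v**2.
    """
    mean_ke = sum(0.5 * mass * v**2 for v in speeds) / len(speeds)
    return (2.0 / 3.0) * mean_ke / K_B

random.seed(0)
speeds = [random.gauss(500.0, 120.0) for _ in range(10000)]  # ~ room-temp air
t = temperature(speeds, M_N2)  # lands near room temperature (~300 K)
```

No single molecule has a temperature; the number exists only as a summary that humans compute over the swarm, which is precisely the reductionist's point about such aggregate properties.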

Just because scientists find it useful to employ "emergent properties" does not mean that the emergent properties exist; rather, they are no more than a useful tool for dealing with a complex system. A "pressure gradient" doesn't exist in nature; it exists only in the minds of humans. Model idealizations of an atmosphere can be used to prove, using a = F/m, that the thing called a "pressure gradient" is associated with wind. But these very proofs undercut the independent existence of the concept, for they "invent" the pressure gradient for use in a model that then relies on a = F/m. The handy meteorology rules, and their "emergent property" tools, are fundamentally redundant to a = F/m.

The refinements of modern physics do not detract from the central concept of materialism, which is that everyday (large-scale) phenomena are the result of the mindless interaction of a myriad of tiny particles in accordance with invariant laws of physics. Reductionists acknowledge the importance of the many levels for explaining complex phenomena, but they insist that all levels higher than the basic level of physical explanation are fundamentally “unreal” and superfluous, even though the higher level of explanation may be more “useful” than a lower level of explanation.

Science embraces what might be termed the "first law of reductionism," that whenever a phenomenon can be explained by recourse to a more basic level of physical law, the “higher level” explanation should only be used when it is drastically simpler to use and unlikely to be misleading. Whenever a higher level of explanation is used, there should be an acknowledgement that it is being used for convenience only.

Living Systems

Reductionists view living systems as subject to the same physical laws as non-living systems. Therefore, the behavior of a living system is an emergent property of a complex physical entity. A living thing is thus an automaton, or robot.

The thing we call "mind" is an "emergent property" of an automaton’s brain. The brain consists of electrons and protons, and these atomic particles obey the same physical laws as inanimate electrons and protons.

Such things as "thoughts, emotions and intentions" are mental constructions of the brain that in everyday situations are more "useful" than the laws of physics for the study of behavior. In spite of their usefulness, they are not actually causing the movement of particles in the living organism, and they don't exist at the most fundamental level of understanding.  Even “free will” must be shorn of its essential features, and recast as another "emergent" product of real causes. 

“Consciousness,” like “free will,” is also an emergent property of automatons, just as the "wind" is an emergent phenomenon of the atmosphere. I don’t object to the use of “consciousness” for the same reason that I don’t object to the use of “wind” when an atmospheric science problem is to be solved.

It has been argued that the physicist exhibits "faith" in extending what is observably true in simple settings to more complicated ones. This assertion of faith is true, but the faith follows from the physicist's desire to invoke a minimum of assumptions for any explanation. 

Some Practical Considerations Concerning Levels of Explanation

The brain evolved, like every other organ, to enhance survival of the genes that encode its assembly. It should be no surprise, therefore, to find that it is an imperfect instrument for comprehending reality. If it is more efficient to construct brain circuits for dealing with the world using concepts such as spirits and prayer, rather than reductionist physics, then the "forces of evolution" can be expected to select genes that construct brain circuits employing these pragmatic but false concepts. Since no task pertaining to survival requires the a = F/m way of thinking, the brain will find this to be a difficult concept. It is a triumph of physics to have discovered that a = F/m and quantum physics rule everything!

A naive person might believe that the primitive person, viewing everything in terms of spirits, is thinking at a higher level than the scientist. This would be a ludicrous belief. A primitive is a lazy and unsophisticated thinker. He is totally oblivious to reductionist "levels of thought." As I will describe later, he uses a brain part that is incapable of thinking rationally: the right prefrontal cortex. Human evolution's latest, and possibly most magnificent, achievement is the left prefrontal cortex, which evolution uses to usurp functions from the right prefrontal cortex when rational thought is more appropriate (i.e., feasible). Too often contemporary intellectuals will unthinkingly succumb to the pull of primitive thought, as when someone proudly proclaims that they are "into metaphysics" (an oxymoron).

A fuller exposition of this topic cannot be given without a background of material that will be presented in later chapters. For now, I will merely state that mysticism is a natural way of thought for primitive humans. It is "easier" for them to invoke a "wind spirit" explanation than the reductionist ones, such as a = F/m, or higher level derivative physical concepts. They do this without realizing how many ad hoc assumptions they are creating, which in turn require explanations, and this matter is never acknowledged (as with invoking God as an explanation, without explaining "God"). Their thinking may seem acceptable from the standpoint of a right prefrontal cortex (or "efficient" from the perspective of the genes that merely want to create a brain that facilitates the gene's "goal" of existing in the future), but it is terribly misguided from the standpoint of the thinker endowed with a functioning left prefrontal cortex, that demands rational explanations with a minimum of assumptions. This unthinking proliferation of ad hoc assumptions bothers the reductionist, but it doesn't bother the unsophisticated primitive. 

Reductionism is for the Few

H. G. Wells must have understood the issues of this chapter. The reductionist paradigm was an important part of intellectual thought during the 19th Century, and Wells grasped it more surely than even many scientists today. Scientists, engineers and inventors must have been held in high esteem during the second half of the 19th Century, and the first half of the 20th. The per capita number of significant discoveries and innovations, as measured by Asimov's Chronology of Science and Discovery (Asimov, informally distributed in the 1980s, formally published 1994) peaked at about the middle of this period (actually, 1910 AD, as described in Chapter 15, and specifically Fig. 15.12).  

Late in the 19th Century, after Darwin’s evolution by natural selection instead of divine guidance had time to register with intellectuals, the idea of “humans as automatons” was part of the climate of opinion. Thomas Huxley was intrigued by this idea, and Darwin humored him by signing letters with a reference to it (Sagan and Druyan, 1992, pg. 70). Reductionism requires that all living things be viewed as automatons, or robots created by evolution. Yet none of today’s academics seem brave enough to defend this idea.

Ernst Mach (1893) deserves mention as an early champion of the idea that all branches of science will eventually be viewed as unified. He was a continuing inspiration for those who attempted to advance this perspective throughout the first half of the 20th Century (Holton, 1993). He was one of the most important in a series of "flame-bearers" who kept alive an idea that came out of ancient Greece with the writings of Democritus of Abdera (Sagan, 1980).

Reductionist ideas were at least understood by literary people during the early 20th Century. In 1931 novelist Theodore Dreiser, for example, wrote "I have pondered and even demanded of cosmic energy to know Why. But now I am told by the physicist as well as the biologist that there can be no Why but only a How, since to know How disposes finally of any possible Why." (Dreiser, 1931). 

Sadly, we cannot expect today's intellectuals to have the same profound understanding of the nature of reality as was exhibited a couple generations ago by such writers and social commentators as Wells and Dreiser. The quality of thought over time, in a specific subject area, is not always progressive. As with civilizations, there is a rise and fall in the sophistication of world views. Indeed, as the 21st Century begins we are in the midst of a renewed interest in returning to the comforts of primitive outlooks, as described in the next chapter.

