The Fourth Edition is available in hardback and Kindle from Amazon.com. A web sample is available at: http://brucegary.net/book_GE4/

GENETIC ENSLAVEMENT:

 A CALL TO ARMS FOR INDIVIDUAL LIBERATION


Third Edition


 

 

 

Bruce L. Gary

 

 

Reductionist Publications, d/b/a

5320 E. Calle Manzana

Hereford, AZ 85615


 _________________________________________________________________________________________________________________________________

 

Published by Reductionist Publications, d/b/a

5320 E. Calle Manzana

Hereford, AZ 85615

 

Copyright 2008 by Bruce L. Gary

 

All rights reserved except for brief passages quoted in a review.  No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form and by any means: electronic, mechanical, photocopying, recording or otherwise without express prior permission from the publisher.  Requests for usage permission or additional information should be addressed to:  BLG Publishing, 5320 E. Calle Manzana; Hereford, AZ 85615.

 

Third Edition: 2008 September 12

 

Printed by Fidlar-Doubleday, Kalamazoo, MI; USA

 

ISBN 978-0-9798446-0-7

 _________________________________________________________________________________

 


Books by Bruce L. Gary

 

ESSAYS FROM ANOTHER PARADIGM, 1992, 1993 (Abridged Edition)

 

GENETIC ENSLAVEMENT:

A CALL TO ARMS FOR INDIVIDUAL LIBERATION, 2004, 2006, 2008 (this 3rd edition)

 

THE MAKING OF A MISANTHROPE: BOOK 1, AN AUTOBIOGRAPHY, 2005

 

A MISANTHROPE’S HOLIDAY: VIGNETTES AND STORIES, 2007

 

EXOPLANET OBSERVING FOR AMATEURS, 2007

 

QUOTES FOR MISANTHROPES: MOCKING HOMO HYPOCRITUS, 2007

 

THE MAKING OF A MISANTHROPE: BOOK 2, MIDNIGHT THOUGHTS (2008)

 

_____________________________________________________________________________________________________________

 

 

 

 

                                       

"The topic for today is:  What is reality?"

 

 

 ___________________________________________________________________________________________________________________

 

  

 

“Men value women because they can make babies. Women value men because they can support and protect a family. The genes value both because their enslavement offers a prospect for genetic immortality.” Bruce L. Gary

 

 

"So free we seem, so fettered fast we are." Robert Browning, Andrea del Sarto, 1855

 


"Do you know what the real question for a thinker is? The real question is: How much truth can you stand?" Spoken by the Nietzsche character in When Nietzsche Wept, by Irvin D. Yalom, 1992

 

"When God is at last dead for Man, when the last gleam of light is extinguished and he is surrounded by the impenetrable darkness of an uncaring universe that exists for no purpose, then at last Man will know that he is alone and must create his own values to live by." Nietzsche (altered quotation)

 
“It’s a privilege to have been born and to live on this planet for a few decades.”
Richard Dawkins, in a debate June 2007

_____________________________________________________________________________________________________

C O N T E N T S

 

    Prologue                                                                                                  

 

    Introduction                                                                                         

            Outlaw Genes in 1962, Uncrossed Paths in 1963,

Book Overview

 

  1   Reductionism                                                                                    

            Universe Rigid, F= ma, No Room for Spirit Forces,

Dreiser's "No Why, Only How"

 

  2   Spiritual Heritage                                                                     

            Primitives Need Spirits, We Must Resist the Backwards Pull

 

  3   Genetics Tutorial ‑ Part I                                                        

            Review of Earth Life, Competition is Between Genes, Which

Genes Compete, Gene Interaction Effects, Trade‑Offs

and Compromises, Individual Welfare Irrelevant,

Inclusive Fitness

 

  4  Genetics Tutorial ‑ Part II                                                                  
            Pre‑adaptation, Species‑Shaping Forces, How Many Genes

Compete, Pace of Evolution, Unintended Deleterious

Effects, Dangers of Fast Evolution, Lag and

Regression, Mutational Load, Reverse Evolution,

Pleiotropy and Polygenes

 

  5   Genetics Tutorial ‑ Part III                                                              

             Remote Sensing Metaphor

 

  6   Evolution Concepts and Humans             

 GEP, Men Bear Greater Burden of Selective Forces,

Takeover Infanticidal Males, Monogamy and Cuckolding,

Men and Women Shape Each Other, Birth Order,

Duality of Morality, Emotions Control the Rational,

Consciousness

 

  7   Brain Anatomy and Function                   

            Vertical organization, Cerebral Lobes, Function, Laterality

 

  8   The Brain's Role in Evolution              

            Prefrontal is Recent, Modules and Genes, Competing

Modules, Result‑Driven Thinking, Niches, Individual

Ontogeny & Species Phylogeny

 

  9   Artisans Set the Stage for Civilizations - Part I                       

            Tool Making Artisans go Full‑Time, New Artisan Niches

 

10  Artisans Set the Stage for Civilizations - Part II                      

            Co‑Evolution of Niches and Genes, LB‑Driven

Rise of Civilizations

 

11 Lessons from Sailing Ships                                                        

            Co-evolution of genes for Altruism/Selfishness and

            Intolerance/Tolerance (Group Selection Theory)

 

12  Levels of Selection, Rise and Fall of Civilizations                  

            Gene Selection, Group Selection and Individual

Selection, New Measure for Strength of Selective

Forces, Rise and Fall of Civilizations, Oscillations

            as a Transitional Mode

 

13  The Origins of Two Cultures ‑ Part I                                      

            Evolution of a New Left Brain, "Resentful" Right Brains,

LB/RB conflicts

 

14  The Conflicts of Two Cultures ‑ Part II                                 

            Example Newspaper Articles, Example Books, Eastern

Thought, Fiction and Art, Spiritual Scientists

 

15  Factors Influencing Fate of Civilizations – Part I                 

            Natural Catastrophes, Group Selection Theories

 

16  Factors Influencing Fate of Civilizations – Part II               

Producers/Parasites, RB/LB Conflicts, Sexual Selection

 

17  Factors Influencing Fate of Civilizations – Part III             

Troubadours, Women Speed Civilization’s Fall

 

18  Factors Influencing Fate of Civilizations – Part IV            

Turning Inward, Mutation Load (dysgenia)

 

19   Factors Influencing Fate of Civilizations – Part V            

            Fascism Causing Collapse of American Empire

 

20   Dating the Demise of Humanity                                       

            New Time Scale for Humanity, Doomsday Argument and

Anthropic Principle, Probabilities of Population Collapse

 

21   A Global Civilization Crash Scenario    

            Tribalism's Starring Role; Communism & Fascism

as Twin Gene‑Driven Enemies of Artisan‑Created

Civilization, Genetic Entrenchment and Culturgens,

Conformism, RB's Revengeful Victory Over LB

 

22   Living Wisely ‑ Seeking Positives        

            Mount Cognoscenti, Life Dilemmas, Eschewing the

Crowd, Activity Categories, Emphasize Positives,

Brief Encounters

 

23   A Call to Arms ‑ Identifying Outlaw Genes

            Prospects for Replacing Gonad Man After the Crash,

Genetic Pitfalls

 

24  Utopias                                                          

            Isolated Communities, Cognoscenti Societies, Platonic

Aestheticism

 

25  Repudiation of the Foregoing                     

            Ultimate Meaninglessness of Everything, A Hierarchy

for Dealing With Reality, Existentialism

 

26  A Free Man's Worship                              

Annotated Version of a Bertrand Russell Essay

 

Your Odyssey                                                  

 

Appendix A: Reductionism                            

 

Appendix B: Human Virus Examples            

 

Appendix C: Remote Sensing Analogy        

 

Appendix D: World Population Equations   

 

Appendix E: More Repudiation of the Foregoing

 

References                                                            

 

Index for Authors                                                 

 

Index for Words                                                   


_______________________________________________________________________________________________

 


PROLOGUE

 

"Generally speaking, it is quite right if great things ‑ things of much sense for men of rare sense ‑ are expressed but briefly and (hence) darkly, so that barren minds will declare it to be nonsense, rather than translate it into a nonsense that they can comprehend. For mean, vulgar minds have an ugly facility for seeing in the profoundest and most pregnant utterance only their own everyday opinion." Jean Paul, as quoted by Friedrich Nietzsche, Philosophy in the Tragic Age of the Greeks, 1872.


Dear reader, you normaloid idiot!

 

Well, maybe you deserve an explanation for that greeting.

 

A perceptive alien visitor to Earth might report home that humans are the dumbest and most despicable creatures on the planet!

 

At least the other animals don’t claim to know things which, in fact, are absurd nonsense. Only humans believe in such imaginary things as heaven, hell, guardian angels, telepathy and all kinds of gods. Only humans maintain that the world was created by some imagined godly entity just for them and that this God continues to watch everything and tests humans so that He may reward or punish them in accordance with how pleased He is by their behavior. Only humans believe that they are so different from non-living things that their “consciousness” exempts them from the laws of physics. But the most incriminating human trait is that Homo sapiens is the only species that has itself for its most dangerous enemy, and a revealing irony is that most killing is done on behalf of this thing they call “religion.”

 

Human imagination is so poor, and human conceit so great, that people cannot imagine themselves as automatons assembled by genes. Even those few humans who do accept that they were assembled by genes seem unable to imagine that these genes have achieved longevity in the species gene pool by assembling automatons that serve those very genes instead of the individual. This saves them from the indignity of realizing that they are foolish slaves to tiny lifeless molecules that use them for aimless ends.

 

Humans, this alien might conclude, are hopeless!

 

So now, dear reader, we must have a delicate conversation about you in relation to this book. If you are like that clueless 99% of humans, those I call “normaloids,” then let me suggest that you abandon this book and resume your pathetic, unthinking life! You may do so now! Please do so now!

 

Are you still reading? Are you a normaloid pretending to be one of that 1% of thinking humans? I give you one last chance to feel the guilt of reading something not meant for you.

 

Cognoscenti

 

The following was written for the diminishing numbers of “the cognoscenti.” And to the cognoscenti who may be holding this book, I apologize for writing things that are inherently self-evident. You may have already thought of them yourself, and gone beyond my modest collection of thoughts. But if, by chance, you have not already discovered the self-evident ideas in this book then I hope you enjoy the following.

 

Reductionism and Hypocrisy

 

I'm a robot! So are you! This book views people as robots assembled by genes for the "purpose" of serving them by behaving in ways that have led to genetic prosperity in the ancestral environment. Only this “reductionist” viewpoint provides insight into the many bizarre aspects of human nature.

 

Every thinking person should be disappointed in humanity! Indeed, every thinking person should become a “misanthrope.” In youth it is easy to idealize human nature, to believe what people say about themselves. Later, perhaps in the teen years, human hypocrisy is discovered. The so-called “pursuit of Truth” becomes a hollow promise. Adults who continue to believe in childish notions of human nature look foolish.

 

I’m more disappointed than bitter. I can say that with each year's accumulation of disappointment in human nature my interest in writing this book wanes. Among the plethora of book publications there are only a handful for the reader who knows how to think. Even most of those intended for serious reading are fundamentally flawed.  Why, I keep asking, are so many people incapable of thinking!

 

Alas, there is an explanation; an explanation, indeed, for all the flaws in human nature! We are the way we are because the genes have constructed us this way, and they have done so because it serves them!

 

The genes that assemble us were survivors in the "ancestral environment" (AE). Not only did they make fools of us in the AE, but in the modern environment our inherited tendencies make new fools of us in ways that were not even anticipated by the genes.

 

Anyone who occasionally glimpses humans this way has the opportunity of choosing a path leading to a belief that humans are victims of genetic enslavement. Life takes on new meaning for the person who then wishes for liberation from that enslavement. This book is dedicated to that rare person already on such a journey of liberation.

 

The mind is a terrible thing to trust

 

Humans are severely handicapped at comprehending such things as sub‑atomic strings vibrating in 11 dimensions, a universe that will expand forever and cause all matter to "evaporate" in 10^100 seconds, or even the everyday experience of seeing a commercial jet airplane that appears to be 35 degrees ahead of where the sound is coming from. The list of things we are ill-equipped to understand is immense!
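 

The jet example can be made concrete with a rough back-of-the-envelope check (my own sketch, not the author's, using assumed speeds): for a jet passing nearly overhead, the sound we hear now was emitted several seconds ago, and during that delay the plane keeps moving, so it leads the apparent sound source by roughly arctan(plane speed / sound speed).

import math

# Rough check of the jet example (illustrative numbers, not from the text).
# For a jet passing nearly overhead, the sound arriving now was emitted a few
# seconds ago; during that delay the plane keeps moving, so it leads the
# apparent sound source by roughly arctan(v_plane / v_sound).
v_plane = 240.0   # m/s, an assumed cruise speed (~860 km/h)
v_sound = 343.0   # m/s, speed of sound in air near the ground

lead_angle_deg = math.degrees(math.atan(v_plane / v_sound))
print(f"apparent lead angle: about {lead_angle_deg:.0f} degrees")   # ~35 degrees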

 

We cannot readily understand these things because they never affected the survival of our ancestors’ genes. How many more aspects of our world are inherently elusive because they never mattered to genetic survival? Or worse, how many things are hidden from us because they belong to a category of knowledge that would have adversely affected the survival of the genes our ancestors carried, even though this insight might have enlightened the individual?

 

The layman seems stubbornly committed to the belief that our minds can be trusted to have an intuitive understanding of all things. Layman and professional alike will instinctively object to any suggestion that our genes construct brains that "intentionally" handicap our ability to comprehend the way the genes have enslaved us. To put it bluntly, I am suggesting that our minds are designed to steer us away from Truth when alternative false beliefs safeguard genetic enslavement of the individual, even when this blinded vision diminishes individual well‑being.

 

Humanities versus Physical Sciences

 

Don't expect humility from humans. Just as every serious thinker must become exasperated with others, so should he become exasperated with himself (I use "him" instead of "him/her"). Even within the physical sciences, where I earned a living for 43 years, it is necessary to consciously maintain vigilance against well‑meaning, intruding intuitions. Imagine how difficult the task must be within the humanities, which are blatantly undisciplined compared to the physical sciences. Physical scientists deal with quantifiable predictions which can be tested by observations. In the humanities, on the other hand, practitioners seem more concerned with loyalty to charismatic leaders, and their beliefs, than to the pursuit of objective truth. Imagine, then, how easily investigations in the humanities can go astray.

 

And gone astray they have! The long endeavor to understand "human nature" has had more false leads from well‑meaning professionals with social agendas than probably any other field. For example, some people contend that "human nature" doesn't exist, believing instead that our minds are "blank slates" at birth, ready to be written upon for the creation of whatever mental structures conform to the external world. Others state that “human races” don’t exist, yet insist on affirmative action preferences for  non-existent minority races. Such beliefs are congenial to those who secretly wish to fiddle with the social environment for the purpose of correcting social injustices. Marxist minds are naturally attracted to the humanities, and have tried for nearly a century to hijack anthropology and distort it for their purposes.

 

In spite of the odds against progress, and in spite of energetic people who seem bent on leading others astray, there are achievements to be proud of in the study of human nature. Anthropology and psychology may have a sordid record of undisciplined meddling by people with political agendas, yet uphill progress in these fields has surely occurred.

Academic Quarrels

 

I recognize that most readers will object to this misanthropic portrayal of human nature and my cynical description of "human behavioral scientists." They may be inclined to agree with some of it, but they will quibble with specifics, or insist on different ways of approaching the subject. Just as tribes need to fission when they become too big, major subject areas within academe need to splinter to form "schools of thought" that go their separate ways by maintaining petty quarrels. For example, evolutionary psychologists complain about sociobiologists not having the proper "nuance" concerning adaptation versus optimization, and they use this minor complaint to build a wall of separation when as a practical matter the two fields are essentially one.

 

I am mindful of the need for petty carping by academics, or the inevitability of it, but I deplore the loss of vision that it inflicts upon those caught up in it. Sometimes a professional becomes so involved with argument over petty differences, and concern over whose grant request will be funded, that he forgets to stand back from day‑to‑day controversies in his field to see it in the larger perspective. The preoccupation with professional details may render the professional practitioner blind to bigger visions that can only be seen from a distance. An outsider, looking in, will occasionally be worth listening to, for he brings with him that distant "big picture" perspective. I claim to bring a "big picture" perspective to the subject of sociobiology, and this should interest the serious lay reader as well as the professional sociobiologist.

 

This book asks a lot from the reader without a background in sociobiology, and I realize that few, if any, will read it through. The professional sociobiologist will readily understand most of my message, but he will be troubled by the fact that he does not recall reading other articles by me in sociobiology journals. The lay reader will not be bothered that my publications are in a totally unrelated field, but he will find much of the material unfamiliar and will be repelled by it.

 

I will not be disappointed if neither the sociobiologist nor the lay person reads what follows. My life-long romp in the realm of ideas, and my writing of the essays that appear in this book, have been more fun than I imagine positive reader feedback or book sales would be. Indeed, as of this Second Edition writing (2006 January) fewer than a dozen copies of the first edition had been sold.

 

When I’m optimistic I recall Henri Beyle (Stendhal), who believed that his writings would escape notice until a century after his death. His forecast was amazingly accurate. Such a fate could in theory happen to this little book, but I now realize that the process of creating it was reward enough. I had more fun writing it than any reader could possibly experience in its reading. Like any creation, this book was written for the author.


INTRODUCTION

 

BEGINNINGS OF AN IDEA AND BOOK OVERVIEW

 

Washington, DC in 1962 was an exciting place. President Jack Kennedy created a “Camelot” aura that fed hope for unbounded progress. But the Cuban Missile Crisis brought a sobering chill to the country, especially to residents of Washington, DC. On my way to work I'd look north at the Capitol Building and wonder if it would be blown up by a Soviet missile while I was looking at it.

 

My first job after college was at the U.S. Naval Research Laboratory, where I worked as a radio astronomer specializing in Jupiter's radiation belts. Freed of time‑consuming college coursework, I was able to broaden my reading. A few years earlier, the double‑helix structure of DNA had been discovered. Perhaps stimulated by this, or maybe from the sheer momentum of a childhood fascination with the way genes influence behavior, I stumbled upon a thought which I now believe is the second‑most profound one of the 20th Century: “outlaw genes.”

 

1963 Identification of Outlaw Genes

 

On February 23, 1963, I was imagining the possibility of categorizing gene mutations as either adding to or subtracting from their own ability to survive into the future, and I needed terminology for this gene attribute. "Gene Survival Value" came to mind. Given a sufficiently well-thought-out measurement protocol, any gene could theoretically be placed on a GSV spectrum, with endpoints labeled PGSV and NGSV - standing for "positive GSV" and "negative GSV." (I recall being dissatisfied with such awkward terms.) At about the same time I was also struggling to devise theoretical concepts that might guide an individual in choosing a "rewarding life path," as ill‑defined as such a concept can be in youth. Longevity was one factor, so given the GSV example I invented ISV, for Individual Survival Value. The ISV extremes, of course, were PISV and NISV. At this critical juncture, it seemed right to draw an X‑Y coordinate system, representing GSV and ISV. (In retrospect, "individual well‑being" would have been a better parameter to adopt than Individual Survival Value.) The figure on the next page is a rendition of this scatter diagram.

 

In theory, any gene could be "placed" in such a diagram (I hadn't encountered the concept of polygenes or pleiotropy at that time, to be discussed in a later chapter). I imagined genes for this and that, and placed them in the diagram. I recall thinking that there had to be more dots in the upper‑right quadrant, corresponding to PGSV/PISV. 

 

I realized that there shouldn't be many dots in the opposite corner since NGSV/NISV mutations should quickly disappear. Likewise, there shouldn't be many dots in the upper‑left NGSV/PISV quadrant, though wouldn't it be nice if genes flourished when they promoted individual happiness regardless of the cost to themselves? But it was the lower‑right corner that awaited me with a surprise! Gene mutations of this type would "by definition" flourish while "punishing" the individual carrying them! And nothing could be done about it, short of replacing the forces of natural selection with artificially created ones. This gene category has fascinated me ever since!

Figure 1.1 An X‑Y matrix of "genetic survival value" and "individual survival value" with hypothetical markings of the locus of individual genes (as conceived in 1962).
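 

For readers who like to see an idea in code, here is a minimal sketch of the bookkeeping described above; the gene names and numerical scores are invented purely for illustration. Each hypothetical gene gets a GSV and an ISV value, the sign pattern assigns it to a quadrant, and the PGSV/NISV quadrant is the home of the "outlaw genes."

# Minimal sketch of the GSV/ISV quadrant idea. The gene labels and scores are
# hypothetical, invented only to illustrate the classification; GSV > 0 means
# a mutation helps its own survival, ISV > 0 means it helps the individual.
hypothetical_genes = {
    "gene_A": (0.8, 0.6),    # PGSV/PISV: good for the gene and the individual
    "gene_B": (-0.7, -0.5),  # NGSV/NISV: should quickly disappear
    "gene_C": (-0.4, 0.9),   # NGSV/PISV: nice for us, but not favored
    "gene_D": (0.9, -0.8),   # PGSV/NISV: the "outlaw gene" quadrant
}

def quadrant(gsv, isv):
    g = "PGSV" if gsv >= 0 else "NGSV"
    i = "PISV" if isv >= 0 else "NISV"
    return g + "/" + i

for name, (gsv, isv) in hypothetical_genes.items():
    tag = "  <-- outlaw gene" if (gsv >= 0 and isv < 0) else ""
    print(f"{name}: {quadrant(gsv, isv)}{tag}")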

 

Why hadn't I read about such genes? Surely others knew about the inherent conflict between the individual and some of the genes within! I looked forward to someday reading about these "outlaw genes," and the philosophical dilemmas they posed. I stashed these original diagrams and writings on the matter in a file, which remained closed for decades. Nevertheless, I did not forget about these genes and during the past four decades I have written about the subject in my spare time.

 

Coincidences

 

In the Fall of 1963 I enrolled at the University of California at Berkeley for graduate studies in astronomy. As the prospect of taking required courses on such topics as stellar spectroscopy sank in, I realized that my career path had taken a wrong turn, of sorts, since my heart was with the humanities. I managed to add classes in psychology and anthropology as a consolation for the dry astronomy stuff. (I quit before semester's end, and have been gainfully employed in the physical sciences ever since.)

 

Although coincidences can shape lives, more often they don't. While I was at Berkeley a little‑known biologist, George C. Williams, was using the school library to write a manuscript that would be published in 1966 as Adaptation and Natural Selection: A Critique of Some Current Evolutionary Thought. He was making a case for the view that selection forces work at the level of the genes, not the individual (and definitely not the species). Although this perspective was inherent in my thinking I failed at the time to grasp its novelty. I assumed that somewhere in the humanities was a field in which everyone believed this. Of course I was wrong, for Williams was engaged in creating such a field.

 

In this same year, 1963, William D. Hamilton prepared manuscripts describing "inclusive fitness" (Hamilton, 1964a,b), which is an essential part of understanding how gene competition drives evolution. The work of both Hamilton and Williams provided essential footings, one decade later, for Edward O. Wilson's milestone book Sociobiology: The New Synthesis (Wilson, 1975). In my opinion, sociobiology is the most important idea of the 20th Century.

 

I sometimes wonder how my life's path might have differed if I had met Williams at Berkeley in 1963. A conversation with him could have clarified for me the emerging nature of the new field, and the opportunity for a role that I might have played in that emergence. Although the field was closer to my heart than astronomy, I never ran into G. C. Williams, and I never realized that he was helping to give birth to "my" field.

 

Overlooked Idea

 

Even now, four decades later, no one has written clearly about the mischievous genes (to my knowledge). The Selfish Gene, by Richard Dawkins (1976), comes close; but it never explicitly states that genes "enslave" the individual for their selfish advancement while harming the enslaved individual. Mean Genes (Burnham and Phelan, 2000) comes even closer, but its emphasis is on practical steps for resisting self‑defeating behaviors rather than the theoretical origins of the genes responsible for those behavioral predispositions.

 

Why is there such a paucity of discussion about the philosophical implications of such a profound flaw in our origins and present nature? Why have the professional anthropologists, philosophers and others been so slow to address a subject that captured my unwavering attention 40 years ago, when I was fresh out of college and struggling to establish a career in an unrelated field? Sociobiologists have written about conflicts between competing gene alleles carried by individuals of various relatedness (Hamilton, 1964a,b), between parents and offspring (Trivers, 1974), and between siblings (Sulloway, 1996), but not between the individual and his genes! If any field has a mandate to ask the questions I stumbled upon in 1962 it is the new field of sociobiology!

 

If my idea has merit then sociobiologists have simply overlooked an obvious “next step” in the unfolding of implications for the basic tenet of the field. The history of science has many examples of simple yet profound new ideas being overlooked by the professionals. Every idea has many discoverers, and probably most of them only half realize the import of their discovery. The oft‑discovered idea remains out of the public domain until it is grasped by someone having the energy to push it into the mainstream.

 

Some of the genes within us are enemies of the individual, in the same sense that outlaws are the enemies of a society. This thought should challenge the thinking of every sentient being. The discipline of philosophy should be resurrected, and restructured along sociobiological precepts. If this is ever done the new field would have as its major philosophical dilemma the following question:

 

"What should an individual do with the mental pull toward behaviors that are harmful to individual welfare, yet which are present because they favor the survival of the genes that create brain circuits predisposing the individual to those behaviors?"

 

In other words, should the individual succumb to instincts unthinkingly, given that the gene‑contrived emotional payoffs may jeopardize individual safety and well‑being? Or, should the individual be wary of instincts and thoughts that come easily and forfeit the emotional rewards and ease of living in order to more surely live another day - to face the same dilemma? Should some compromise be chosen?   How can any thinking person fail to be moved by these thoughts?

 

Overview of This Book

 

In writing this book I have wrestled with the desire to proceed directly to the matters of outlaw genes, and how an individual might deal with them. But every time I returned to the position that a proper understanding of the individual's dilemma requires a large amount of groundwork. For example, how can I celebrate the artisan way of life without first describing why the genes created the artisan?

 

In the first edition of this book I included the many groundwork chapters in their entirety before the culminating chapters. The first person to read the book (Dr. M. J. Mahoney) stated that “Once I hit Levels of Selection [Chapter 11] I couldn't put the book down.” That’s when I realized that I had violated the first principle of writing, which is to “quickly engage the reader before you lose them.” In this edition I have shortened the groundwork chapters by moving most of that material to appendices. The groundwork chapters have become a primer for the paradigm that leads inevitably to the positions of the main message of this book.

 

The remainder of this introduction is a précis for the book chapters.

 

There is no guiding hand in evolution; the natural process of the genes acting on their own behalf leads to individuals who are mere "agents" for these genes. This is the perspective of "sociobiology," also called "evolutionary psychology," and presented most effectively for the general public by Richard Dawkins in The Selfish Gene (1976). To understand the "blindness" of evolution one must first understand that the universe is just a "mechanism," that every phenomenon reduces to the action of blind forces of physics acting upon dumb particles. This outlook is called "reductionism," and is the subject of Chapter 1.

 

Lest the reader surmise that this book is about the physics of life, I attempt an impassioned appeal, in Chapter 2, for an embrace of modern man's scientific approach to understanding life, and a rejection of the primitive backwards pull that captures most unwary thinkers. This appeal provides a foretaste of the spicy sting of chapters found in the second half of the book.

 

Since genes are such an essential player in everything, I found it necessary to include tutorial chapters on genetics. The first of these genetics tutorials, Chapter 3, presents general properties of genes, such as how they compete and cooperate with each other, and have no concern for individual welfare beyond what serves them. The second genetics tutorial, Chapter 4, explores some subtle properties of genes that will be needed by later chapters. For example, since in every new environment some genes will fare better than others, it is useful to think of genes as being "pre‑adapted" and "pre‑maladapted" to novel environments. This will be an important concept in considering artisan niches in the modern world.

 

Chapter 5 is not necessary for the development of the book’s theme, but for those who understand it the chapter will provide a deeper insight into the mathematics of pre-adaptation and pre-maladaptation.

 

Chapter 6 pulls together some of the genetics ideas and applies them to human evolution. Certain insights are needed for a person to intelligently deal with emotions that control or attempt to discredit intellect. For example, how can a person handle jealousy without understanding cuckoldry?

 

Chapters 7 and 8 are devoted to the brain. The most recent advance in the evolution of the human brain is the refashioning of the left prefrontal cortex. It is important to view the brain as an organ designed by the genes to aid in gene survival. Rationality is a new and potentially dangerous tool created by the genes, and it must be kept under the control of "mental blinders" to assure that the agendas of other genes are not thwarted. Competing brain modules, cognitive dissonance, and self‑deception are just a few concepts that any sentient being must know about when navigating a path through life's treacherous shoals.

 

In Chapter 9 I write about the first artisan, whose precarious role as a full‑time tool and weapon maker may have begun 60,000 years ago. When the climate finally warmed 11,600 years ago at the start of our present "interglacial," called the Holocene, the small number of existing artisan roles served as a model for an explosion of new ones. The new artisans made high‑density populations possible and eventually led to the creation of civilizations (Chapter 10). Since I will celebrate the artisan way of life it is necessary to understand how it came into existence and why others in society are likely to view it warily. I will outline a theory for "anti‑intellectualism" and suggest that it may play a role in a civilization's decline.

 

Group selection still attracts controversy, and I use it to argue that tribal warfare led to ever‑larger tribes, which required that their membership be ever‑more subservient to "tribal requirements" since the entire tribal membership had a shared destiny. But, as I argue in Chapter 11, when group selective forces were at their maximum during the Holocene, something new happened that heralded the first‑ever "individual selection" dynamic. The artisans assumed a leadership role in molding culture, governance, and opening opportunities for individual expression of creative and productive labors that led to a state that we now call "civilization."

 

But a civilization is vulnerable to outside attack by societies that remain uncivilized, that foster religious fanaticism. These stay‑behind societies harbor resentment of the material wealth of the civilized society, and instead of achieving wealth for themselves by surrendering their group‑serving grip on the individual, they instead mobilize the individual to discredit their rich neighbor and declare cultural warfare on them. Religious zeal serves these super-tribes by fostering fanatical, suicidal attacks on those societies that respect the individual. But since individuals in the civilized society think first of themselves, the civilization's defense is half‑hearted and ultimately ineffective.

 

It is inevitable that civilizations arise with an ambivalent self‑hatred. This is because people whose thinking style is overly influenced by their "primitive" right brain are naturally resentful of the world created by those new left‑brain artisans. The new world order favors the left‑brained artisan (engineer, scientist and other rational thinkers) and relegates to some vague periphery the contributions that can be made by the old‑style people. Thus, every civilization should have "two cultures" that are in conflict, and this is treated in Chapters 12 and 13.

 

Chapter 14 begins to address the matter of what factors might contribute to the decline and fall of civilizations. One theory invokes a back‑and‑forth dominance of artisan "producers" versus opportunistic "parasites." Another suggests that the “two cultures” war, or the “War of the Brain Halves,” is eventually won by those who succumb to the primitive pull. Gingerly, I also suggest that dysgenia might undermine our genetic vigor and sap societal energies.

 

I discovered the Anthropic Principle (and learned that it had been written about and published obscurely a few years before my discovery of it). I use this idea to predict the approximate date range for a significant crash in the human population.  In the process of calculating this horrific event, I show that the rate of technological innovations exhibits a trace over time that foretells population patterns. From this analysis it appears that we are now in the second major "rise and fall" pattern of innovation rate and population, the latter pattern being displaced a few centuries after the first. This is described in Chapter 15.

 

I attempt to survey some possible population crash scenarios in Chapter 16. However, I conclude that the future is so difficult to predict that it is prudent to only present possibilities.

 

In Chapter 17 I begin my "call to arms" for individuals to emancipate themselves from the genetic grip. All previous chapters are preamble to this one and those that follow. My appeal must be qualified by some nitty‑gritty facts of genetics, such as pleiotropy and polygenes. Nevertheless, I present a litany of "genetic pitfalls" that any emancipated person should wish to avoid.

 

Because any reader will expect a book such as this to give specific suggestions for how to use insight to live wisely, I feel obligated to present in Chapter 18 my feeble attempt to address the subject. It is an attempt to describe ways that an individual may live wisely in a world wracked with defects caused by outlaw genes. Some genes are our enemy because they lead to dysfunctional human societies, while other genes are our enemy because they lead us as individuals to want the wrong things. The individual's task is to liberate oneself from the genes, and choose wisely. The IQ form of intelligence allows insight, and this insight must be placed into the service of an enlightened "emotional intelligence" to arrive at new personal values to live by. The questing person will understand the wisdom in the saying, which applies to the unthinking person: "If you get what you want, you deserve what you get." However, I readily acknowledge that my attempt to realize this chapter's goal is feeble, and the reasons for this are developed at the end of the book.

 

Chapter 19 follows naturally from the previous chapter, since an individual who wishes to pursue an individual‑emancipated life must do so within the constraints of living in a society where individual liberation is difficult. When a sufficient number of people awaken to their enslaved condition, thoughts may turn to a way for them to coalesce in a shared search for a winning place. I describe utopias and prospects for isolated enclaves as a path toward a stable community where individual liberation may be sought. However, I warn that the world is becoming too "small" for enclaves to remain safe from meddlesome outsiders. Since the door of feasibility for creating isolated space communities has shut, and since the earth is already "too small" for self‑sustaining communities to remain secret, there are no feasible refuges for utopias. I conclude that today's world will not tolerate the formation of an enlightened society of liberated individuals, and that those who might wish to live in such a society must be content with learning how to live a good life as individuals with secret dreams while being surrounded by an ever‑increasing number of primitive hoi polloi. The "society of the cognoscenti" will remain dispersed, and may only occasionally recognize each other during normal encounters.

 

Chapter 20 is supposed to be a surprise, but the subtitle sort of gives it away: Repudiation of the Foregoing. I will say no more.

 

Chapter 21 is an annotated version of Bertrand Russell's essay, “A Free Man's Worship.” It is an excellent example of how a liberated person thinks, and I use it to illustrate the point of the preceding chapter. Namely, once a person is liberated from genetic enslavement and free to choose values to live by that are compatible with the cognoscenti's insights, an aesthetic and poetic attitude toward "existence" can be achieved. The existentialist need not be a sourpuss, nor must he become a passive esthete. The thoughtful existentialist may end up a compassionate humanist with a lust for existence!

 

So now dear reader, if you exist, do take the following speculations with a light heart; hopefully your thoughts will be led in directions that are as congenial to your inherited ways of thinking as the following are to mine.


CHAPTER 1

 

REDUCTIONISM

 

An intellect which at any given moment knew all the forces that animate Nature and the mutual positions of the beings that comprise it, if this intellect were vast enough to submit its data to analysis, could condense into a single formula the movement of the greatest bodies of the universe and that of the lightest atom: for such an intellect nothing could be uncertain; and the future just like the past would be present before its eyes. Pierre Simon Laplace, Philosophical Essay on Probabilities (1814).

 

"The highest object at which the natural sciences are constrained to aim, but which they will never reach, is the determination of the forces which are present in nature, and of the state of matter at any given moment ‑ in one word, the reduction of all the phenomena of nature to mechanics." Gustav Robert Kirchhoff, 1865.

 

The idea that all movement could in theory be reduced to simple laws of nature was first recorded for posterity by Democritus of Abdera in 4th Century BC Greece. This reductionist outlook was also expressed by the Roman philosopher Lucretius during the 1st Century BC. After the Dark Ages, during which Greek and Roman ideas had lost favor, disillusionment with religion grew among thinkers, and they became curious about the ideas of those ancient challengers of piety and ritual. Isaac Newton wrote about the forces of nature as a basis for understanding movement during the 17th Century. The Philosophes of 18th Century France restored rationalism as the ultimate guide to Truth. Scientific discoveries continued to provide support to the reductionist notion, as shown by leading 19th Century scientists such as Laplace and Kirchhoff.

 

During the early 20th Century the reductionist paradigm came under challenge by Quantum Physics. The new physics does not claim to require divine intervention or primitive spirits, but it does appear to require randomness for events at physical scales the size of the atom and smaller. The so-called Newtonian physics is correct as far as it goes, but it cannot explain a category of physical events associated with the atom and its interaction with light. At some future time it may be possible to portray quantum physics with the same deterministic quality as Newtonian physics, but for now it is prudent to assume that some events are subject to random outcomes. The strict form of determinism called for by Newtonian physics should be replaced by a probabilistic form of determinism. However, both forms of determinism are reductionist since they are based on the notion that the movements of all particles, and their interaction with light, conform to the laws of physics. This last phrase is key, “conform to the laws of physics,” and it is dealt with at greater length in Appendix A.

 

"Reductionism" either angers people or delights them. The entire enterprise of science is based on reductionist tenets. Whereas all scientists practice their profession in accordance with the reductionist paradigm, there’s a part of the brain that is so opposed to it that ~40% of scientists claim to not believe in reductionism. These conflicted scientists are found mostly in the humanities, where muddled thinking is less of a handicap; within the physical sciences there is almost universal belief in reductionism (especially among the most esteemed scientists).

 

The end-point of reductionist thinking can be most easily described by the following analogy: the universe is like a giant billiard table, in which all movements are mechanical ‑ having been set in motion by the explosive birth of the universe 13.7 billion years ago! Of course this analogy neglects quantum physics, but nothing essential to our understanding of the evolution of life, and of human nature, is lost by this simplification. The march of events, from one moment to the next, is captured by this simple-minded, deterministic view. Since this version of reductionism is essential to the rest of this book, and since it is a fascinating subject in its own right, I devote the rest of this chapter to a brief description of reductionism and an appendix to its fuller treatment.

 

A Rigid Universe

 

"The Universe Rigid" was possibly H. G. Well's most important manuscript. It languished with the publisher, who didn't understand it, and eventually it was lost. Instead of reconstructing it, Wells turned its central idea into a story, The Time Machine (1895).

 

Two dozen years later, Albert Einstein published his Special Theory of Relativity (1920), which expanded upon the idea, already familiar to physicists of the day, of time as a fourth dimension. This concept was treated in "The Universe Rigid" essay with unusual insight, especially for a non-physicist. The following is a brief summary of it that appeared in The Time Machine:

 

"... Suppose you knew fully the position and the properties of every particle of matter... in the universe at any particular moment of time:.. Well, that knowledge would involve the knowledge of the condition of things at the previous moment, and at the moment before that, and so on. If you knew and perceived the present perfectly, you would perceive therein the whole of the past. ... Similarly, if you grasped the whole of the present, ... you would see clearly all the future.  To an omniscient observer ... he would see, as it were, a Rigid Universe filling space and time ‑ a Universe in which things were always the same. He would see one sole unchanging series of cause and effect... If ‘past' meant anything, it would mean looking in a certain direction; while ‘future' meant looking the opposite way. From the absolute point of view the universe is a perfectly rigid unalterable apparatus, entirely predestinate, entirely complete and finished... time is merely a dimension, quite analogous to the three dimensions of space."

 

This passage describes the underlying principle of what is referred to in today's parlance as "reductionism."  It leaves no room for spirits, mysticism or gods. Most intellectuals use the term "reductionist" as a disparaging epithet, but I believe that reductionism is the crowning achievement of human thought.

Mechanical Materialism

Reductionism is the belief that complex phenomena can be "reduced" to simpler physical processes, which themselves can in theory be reduced to the simplest level of physical explanation where elementary particles interact according to the laws of physics. The ancient concept of Mechanical Materialism captures the essence of reductionism, but relies upon the outdated concept that at the most basic level the particles are like stones, and interact by hitting each other like billiard balls.


Nevertheless, reductionism is the fulfillment of what Democritus and Lucretius dreamed about, a mechanistic world‑view that as a bonus could also deliver people from the tyranny of religion. Lucretius would agree with the statement: "There is no need for the aid of the gods, there is not even room for their interference.... Man's actions are no exception to the universal law, free‑will is but a delusion." (Bailey, 1926, describing how Lucretius viewed the world).

 

It will be instructive to review mechanical materialism before describing the version of reductionism required by modern physics.  

 

Imagine a game of billiards photographed from above, and consider frames redisplayed in slow motion. After the cue sends one ball into motion, the entirety of subsequent impacts and bounces is determined. If this were not so, if the balls had a mind of their own, or if some mysterious outside force intervened, then consistently good players would not exist. Now imagine a very slow replay of the motions of the billiard balls; millisecond by millisecond the movements unfold with an undeniable inevitability. A careful analysis would reveal that conservation of momentum and elastic collisions govern the placement and velocity of each ball in the next millisecond.

 

Given two successive "frames," an observer would know the positions and velocities of every ball, and he could calculate their placement, velocities and future impacts for any arbitrarily short instant later. He could thus predict the following frame, and the process that allowed the prediction of frame 3 from frames 1 and 2 could be repeated for frames 2 and 3 to predict frame 4. And so on, for all future frames.  In this way, the observer could predict all future movements (don’t worry about the fact that we've ignored friction). 

 

By a similar process the observer could infer a previous frame from any two neighboring frames. Thus, frames 1 and 2 could be used to predict frame 0, etc. Therefore, by knowing any two frames, all future and past frames could be inferred. This is the thought H. G. Wells captured with his unpublished Universe Rigid essay.
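 

The frame-to-frame logic can be sketched in a few lines of code (a toy example of my own: one frictionless ball moving in a straight line rather than a full billiard table). Two successive frames fix the velocity, and the very same update rule runs the film forward to frame 3 or backward to frame 0.

# Toy version of the "two frames determine everything" argument: a single
# frictionless ball moving at constant velocity in one dimension. All numbers
# are invented for illustration.
dt = 0.001          # seconds between frames
frame1 = 0.500      # ball position at frame 1, meters
frame2 = 0.503      # ball position at frame 2, meters

velocity = (frame2 - frame1) / dt       # velocity inferred from the two frames

frame3 = frame2 + velocity * dt         # predict the next frame
frame0 = frame1 - velocity * dt         # infer the previous frame

print(f"inferred velocity: {velocity:.1f} m/s")
print(f"predicted frame 3: {frame3:.3f} m")
print(f"inferred frame 0:  {frame0:.3f} m")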

 

Reductionism as a Basis for Physics

 

The 19th Century saw a de‑mystification of various science disciplines. The reshaping was done by rationalists, building upon the legacy of the 18th Century philosophes. The rationalists firmly placed science on a footing that has endured throughout the 20th Century. Like the machines of 19th Century inventors, the paradigm developed by 19th Century scientists was "mechanistic."

 

Ernst Mach forced metaphysics out of physics (Holton, 1993, pg. 32). Chemistry was changed from a floundering quest for transmuting common elements into gold to a physics‑based understanding of atoms and molecules. Darwin displaced God from the creation of life by presenting his theory of evolution, even though this may have saddened him personally.

 

By the end of the 19th Century, when Wells began to write, the intellectual atmosphere was congenial to ideas that reduced mysterious happenings to a juxtaposition of commonplace physical events. Each event in isolation was conceptually simple. It is the mere combining of many such events that causes things to appear incomprehensible.

 

Reductionism is based on a concept taught in college Physics 101. I remember well that without fanfare the physics instructor stated that there are only four forces in nature (gravity, electromagnetism, the strong nuclear force and the weak force), and that these forces act upon a finite number of particles that are pulled this way and that by the summation of all forces acting upon each particle. In laboratory experiments where the number of relevant forces can be confined to only 1 or 2, motions are observed to be governed by a simple law: F = m•a, or "force equals mass times acceleration" ("acceleration" is the rate of change of the velocity vector). It is easier to understand this law of nature by rewriting it in the form a = F/m, which states that a particle's acceleration is equal to the sum of forces acting on it divided by the particle's mass. Mathematically, a and F are vectors, which is why these symbols are written in "bold" typeface, and "m" is a scalar (no orientation is involved); thus, the equation a = F/m keeps track of the 3‑dimensional orientation of forces and accelerations. Since forces can originate from many sources, they must be added together to yield one net force.

 

At every instant a particle is responding to just one net force. It responds by accelerating in the direction of that force (which has a magnitude and direction).  The particle's velocity vector changes due to its acceleration.  Since the time history of a particle's velocity specifies where it goes, the particle's "behavior" is completely determined by the forces acting upon it. This description is called Newtonian physics, and it reigned supreme throughout the 19th Century.
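 

A short sketch (mine, with invented numbers) shows how little bookkeeping this Newtonian picture requires: sum the forces acting on a particle, apply a = F/m component by component, then advance the velocity and position over a small time step.

# Sketch of the Newtonian bookkeeping for one particle: sum the forces, apply
# a = F/m component by component, then advance velocity and position over a
# small time step. All numbers are invented for illustration.
mass = 2.0                                # kg
forces = [(0.0, -19.6), (3.0, 0.0)]       # N: gravity on a 2 kg mass, plus a sideways push

# The net force is the vector sum of every force acting on the particle.
net_fx = sum(f[0] for f in forces)
net_fy = sum(f[1] for f in forces)

ax, ay = net_fx / mass, net_fy / mass     # a = F/m

dt = 0.1                                  # seconds
vx, vy = 1.0, 0.0                         # m/s, initial velocity
x, y = 0.0, 10.0                          # m, initial position

vx, vy = vx + ax * dt, vy + ay * dt       # velocity changes in the direction of the net force
x, y = x + vx * dt, y + vy * dt           # the velocity history determines where the particle goes

print(f"acceleration: ({ax:.1f}, {ay:.1f}) m/s^2")       # (1.5, -9.8)
print(f"velocity after one step: ({vx:.3f}, {vy:.3f})")
print(f"position after one step: ({x:.3f}, {y:.3f})")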

 

Quantum Physics

 

During the late 19th Century a disturbing number of laboratory measurements were made that defied explanation using Newtonian physics. Radioactivity was a puzzle, for it seemed that atoms of certain (radioactive) elements would spontaneously, and at random, emit a particle. There was also the puzzle of atoms absorbing and emitting light at only specific wavelengths, producing a unique spectral pattern for each atomic element. Newtonian physics had no way to accommodate these and other puzzling phenomena.

Quantum physics was developed in response to these puzzling measurements, all of which were related to mysterious phenomena inside the atom. The new physics expanded upon the idea that everyday objects were constructed from electrons orbiting a nucleus composed of protons and neutrons (now known to be constructed from 12 elementary building blocks of matter).  It was proposed that electrons could be thought of as a wave, with a wavelength such that the only permitted orbital circumferences around a nucleus were those with an integer number of wavelengths. Changes in an electron’s orbit involved changes in energy, so if an electron moved to a higher energy orbit (farther from the nucleus and larger in circumference) it had to absorb energy from somewhere (such as a photon of light) that had an energy corresponding exactly to the difference in the electron’s energy in the two orbits – hence the quantization of spectral absorption features for each atomic element.
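 

A worked example (my own, using the textbook Bohr model of hydrogen rather than anything specific to this book) shows how a particular orbital jump pins down a particular wavelength of light.

# Worked example using the Bohr model of hydrogen: the energy of orbit n is
# E_n = -13.6 eV / n^2, so a jump between two orbits absorbs or emits a photon
# whose energy is exactly the difference between them, fixing its wavelength.
E1_EV = 13.6                      # hydrogen ground-state binding energy, eV
HC_EV_NM = 1240.0                 # h*c expressed in eV*nm (rounded)

def orbit_energy(n):
    return -E1_EV / n**2          # eV

n_low, n_high = 2, 3              # the jump that gives the red H-alpha line
delta_E = orbit_energy(n_high) - orbit_energy(n_low)    # ~1.89 eV
wavelength = HC_EV_NM / delta_E                         # ~656 nm

print(f"energy difference: {delta_E:.2f} eV")
print(f"photon wavelength: {wavelength:.0f} nm (the red H-alpha line)")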

As quantum physics developed to explain more laboratory experiments related to the atom, the theory became weirder and weirder. Quantum mechanics (QM) was developed to deal with particles, and quantum field theory was developed to explain radiation and its interaction with particles.  Quantum physics has been described as inherently probabilistic, or indeterminate, and has been characterized as having so much "quantum weirdness" that our minds are intuitively unprepared to comprehend it. Quantum physics “works” in the sense that it gives a better account than any other theory for atomic scale physical phenomena. Contrary to popular belief, it does not discredit Newtonian physics, which is still valid for large scale phenomena; rather, it is more correct to say that quantum physics supplements Newtonian physics.  Almost every physical situation can be easily identified as requiring one or the other embodiment of physical law.

It now seems that two of the aforementioned four forces can be "unified" (the weak and the electromagnetic). One of the main goals of physics today is to create a “unified” theory that incorporates all the explanatory power of the four forces plus the weird but useful explanatory power of quantum physics.  

One of the most counter-intuitive properties of quantum physics is the notion that events are not strictly determined but are only probable, and that particles are not tiny things at a specific location but are probability functions in 3-dimensional space. When a particle moves the probability function describing its location moves. In the laboratory it is impossible to measure a particle’s position without changing its velocity; and it is similarly impossible to measure a particle’s velocity without changing its position. The Heisenberg Uncertainty Principle quantifies the partitioning of position and velocity uncertainty.
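 

In symbols, the principle says that the product of the position uncertainty and the momentum uncertainty can never be smaller than about Planck's constant divided by 4π. A quick numerical illustration (mine, using standard constants) shows why this matters for an electron confined to an atom but is utterly negligible for a billiard ball.

# Heisenberg uncertainty: (delta x) * (delta p) >= hbar / 2. Confining an
# electron to roughly an atom's width forces a large velocity uncertainty;
# confining a billiard ball to a tenth of a millimeter forces essentially none.
HBAR = 1.055e-34            # J*s
M_ELECTRON = 9.11e-31       # kg
M_BALL = 0.17               # kg, roughly a billiard ball

def min_velocity_uncertainty(mass, delta_x):
    # Smallest velocity uncertainty allowed for a given position uncertainty.
    return HBAR / (2.0 * mass * delta_x)

print(f"electron in an atom: {min_velocity_uncertainty(M_ELECTRON, 1e-10):.1e} m/s")   # ~5.8e+05
print(f"billiard ball:       {min_velocity_uncertainty(M_BALL, 1e-4):.1e} m/s")        # ~3.1e-30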

Einstein believed, but could not prove, that although we didn’t know of a way to measure a particle’s position and velocity simultaneously with great accuracy, the particle nevertheless has a well-defined position and velocity, and it interacts with other particles as if this is so. His speculation was described as needing a “basement level” of physical laws, which had not yet been discovered. With a “basement level” of physical laws the apparent “unknowableness” of a particle’s properties would be just that, apparent “unknowableness.” The particle “knows” where it is located and how fast it is going, and in what direction relative to the rest of the universe - even if humans can’t know.

This “quantum weirdness” is often cited to discredit the idea that events are “determined.” But we cannot rule out the possibility that future physicists will discover a basement level of physical law, and that this will restore Newtonian physics as a complete theory for all size scales. The new Newtonian physics would have the old Newtonian physics as a first approximation, valid for use with the vast majority of physical phenomena dealt with on a daily basis.

Starting here I will present only brief summaries of the chapter sub-sections that were moved to Appendix A beginning with the Second Edition of Genetic Enslavement.

Levels of Physical Explanation

The matter of “levels of physical explanation” must be dealt with for the reader who is not prepared to accept the existence of a basement level of physical law.

In the physical sciences it is common to treat a physical process at a “higher level” than atoms interacting in accordance with the most basic level of physical law, a = F/m and quantum physics. Instead, other “laws” are constructed for everyday settings, either derived from the basic level of laws or derived from experiment and deemed compatible with the basic laws. One example should serve to illustrate this.

Consider the atmosphere, which consists of an immense number of molecules. Any thought of using a = F/m applied at the level of molecules for the purpose of predicting the weather would be silly because of its impracticality. There is no way to know the position and velocity of all the molecules in the atmosphere at a given time for establishing the "initial conditions" required for subsequent calculation using a = F/m. The meteorologist employs a “higher level of physical explanation” by inventing “laws” that govern such aggregate properties as "atmospheric pressure," “temperature,” and "wind speed."

In each case the invented property, and the rules for using it, can be derived from a = F/m, so these handy properties and rules are "emergent properties" of the basic level of physical laws. Every atmospheric scientist would acknowledge that whenever a meteorologist relies on a handy rule, such as "wind speed is proportional to pressure gradient," what is really occurring in the atmosphere is the unfolding of an immense system of particles obeying a = F/m.
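One standard form of such a handy rule, given here only as an illustration, is the geostrophic wind relation, in which the wind speed V is proportional to the horizontal pressure gradient (ρ is the air density and f the Coriolis parameter):

\[ V \;=\; \frac{1}{\rho f}\,\bigl|\nabla_{h} p\bigr| . \]

Nothing in this formula is anything more than a convenient summary of what countless molecules, each obeying a = F/m, are doing.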

Just because scientists find it useful to employ "emergent properties" does not mean that the emergent properties exist; rather, they are no more than a useful tool for dealing with a complex system. A "pressure gradient" doesn't exist in nature; it exists only in the minds of humans. Model idealizations of an atmosphere can be used to prove, using a = F/m, that the thing called a "pressure gradient" is associated with wind. But these very proofs belie the existence of the concept, for they "invent" the concept of a pressure gradient for use in a model that then uses a = F/m. The handy meteorology rules, and their "emergent property" tools, are fundamentally redundant to a = F/m.

The refinements of modern physics do not detract from the central concept of materialism, which is that everyday (large-scale) phenomena are the result of the mindless interaction of a myriad of tiny particles in accordance with invariant laws of physics. Reductionists acknowledge the importance of the many levels for explaining complex phenomena, but they insist that all levels higher than the basic level of physical explanation are fundamentally “unreal” and superfluous, even though the higher level of explanation may be more “useful” than a lower level of explanation.

Science embraces what might be termed the "first law of reductionism," that whenever a phenomenon can be explained by recourse to a more basic level of physical law, the “higher level” explanation should only be used when it is drastically simpler to use and unlikely to be misleading. Whenever a higher level of explanation is used, there should be an acknowledgement that it is being used for convenience only.

Living Systems

Reductionists view living systems as subject to the same physical laws as non-living systems. Therefore, the behavior of a living system is an emergent property of a complex physical entity. A living thing is thus an automaton, or robot.

The thing we call "mind" is an "emergent property" of an automaton’s brain. The brain consists of electrons and protons, and these atomic particles obey the same physical laws as inanimate electrons and protons.

Such things as "thoughts, emotions and intentions" are mental constructions of the brain that in everyday situations are more "useful" than the laws of physics for the study of behavior. In spite of their usefulness, they are not actually causing the movement of particles in the living organism, and they don't exist at the most fundamental level of understanding.  Even “free will” must be shorn of its essential features, and recast as another "emergent" product of real causes. 

“Consciousness,” like “free will,” is also an emergent property of automatons, just as the "wind" is an emergent phenomenon of the atmosphere. I don’t object to the use of “consciousness” for the same reason that I don’t object to the use of “wind” when an atmospheric science problem is to be solved.

 

It has been argued that the physicist exhibits "faith" in extending what is observably true in simple settings to more complicated ones. This assertion of faith is true, but the faith follows from the physicist's desire to invoke a minimum of assumptions for any explanation.

 

Some Practical Considerations Concerning Levels of Explanation

 

The brain evolved, like every other organ, to enhance survival of the genes that encode for its assembly. It should be no surprise, therefore, to find that it is an imperfect instrument for comprehending reality. If it is more efficient to construct brain circuits for dealing with the world using concepts such as spirits and prayer, rather than reductionist physics, then the "forces of evolution" can be expected to select genes that construct brain circuits employing these pragmatic but false concepts. Since no task pertaining to survival requires the a = F/m way of thinking, the brain finds this a difficult concept. It is a triumph of physics to have discovered that a = F/m and quantum physics rule everything!

 

A naive person might believe that the primitive person, viewing everything in terms of spirits, is thinking at a higher level than the scientist. This would be a ludicrous belief. A primitive is a lazy and unsophisticated thinker. He is totally oblivious to reductionist "levels of thought." As I will describe later, he uses a brain part that is incapable of thinking rationally: the right prefrontal cortex. Human evolution's latest, and possibly most magnificent, achievement is the left prefrontal cortex, which evolution uses to usurp functions from the right prefrontal cortex when rational thought is more appropriate (i.e., feasible). Too often contemporary intellectuals unthinkingly succumb to the pull of primitive thought, as when someone proudly proclaims that they are "into metaphysics" (an oxymoron).

 

A fuller exposition of this topic cannot be given without a background of material that will be presented in later chapters. For now, I will merely state that mysticism is a natural way of thought for primitive humans. It is "easier" for them to invoke a "wind spirit" explanation than the reductionist ones, such as a = F/m, or higher-level derivative physical concepts. They do this without realizing how many ad hoc assumptions they are creating, each of which in turn requires an explanation, and this matter is never acknowledged (as with invoking God as an explanation, without explaining "God"). Their thinking may seem acceptable from the standpoint of a right prefrontal cortex (or "efficient" from the perspective of genes that merely want to create a brain facilitating the genes' "goal" of existing in the future), but it is terribly misguided from the standpoint of a thinker endowed with a functioning left prefrontal cortex, which demands rational explanations with a minimum of assumptions. This unthinking proliferation of ad hoc assumptions bothers the reductionist, but it doesn't bother the unsophisticated primitive.

 

Reductionism is for the Few

 

H. G. Wells must have understood the issues of this chapter. The reductionist paradigm was an important part of intellectual thought during the 19th Century, and Wells grasped it more surely than even many scientists today. Scientists, engineers and inventors must have been held in high esteem during the second half of the 19th Century, and the first half of the 20th. The per capita number of significant discoveries and innovations, as measured by Asimov's Chronology of Science and Discovery (Asimov, informally distributed in the 1980s, formally published 1994) peaked at about the middle of this period (actually, 1910 AD, as described in Chapter 15, and specifically Fig. 15.12).

 

Late in the 19th Century, after Darwin’s evolution by natural selection instead of divine guidance had time to register with intellectuals, the idea of “humans as automatons” was part of the climate of opinion. Thomas Huxley was intrigued by this idea, and Darwin humored him by signing letters with a reference to it (Sagan and Druyan, 1992, pg. 70). Reductionism requires that all living things be viewed as automatons, or robots created by evolution. Yet none of today’s academics seem brave enough to defend this idea.

 

Ernst Mach (1893) deserves mention as an early champion of the idea that all branches of science will eventually be viewed as unified. He was a continuing inspiration for those who attempted to advance this perspective (Holton, 1993) throughout the first half of the 20th Century. He was one of the most important in a series of "flame-bearers" keeping alive an idea that came out of ancient Greece with the writings of Democritus of Abdera (Sagan, 1980).

 

Reductionist ideas were at least understood by literary people during the early 20th Century. In 1931 novelist Theodore Dreiser, for example, wrote "I have pondered and even demanded of cosmic energy to know Why. But now I am told by the physicist as well as the biologist that there can be no Why but only a How, since to know How disposes finally of any possible Why." (Dreiser, 1931).

 

Sadly, we cannot expect today's intellectuals to have the same profound understanding of the nature of reality as was exhibited a couple generations ago by such writers and social commentators as Wells and Dreiser. The quality of thought over time, in a specific subject area, is not always progressive. As with civilizations, there is a rise and fall in the sophistication of world views. Indeed, as the 21st Century begins we are in the midst of a renewed interest in returning to the comforts of primitive outlooks, as described in the next chapter.

 


CHAPTER 2

                              

RESISTING THE BACKWARD PULL

TOWARD OUR SPIRITUAL HERITAGE

 

"O miserable minds of men!  O blind hearts!  In what darkness of life, in what great dangers ye spend this little span of years! ... Life is one long struggle in the dark."  Lucretius, On the Nature of Things, ca 60 BC.

 

"It does no harm to the mystery to know a little about it. For far more marvelous is the truth than any artists of the past imagined! Why do the poets of the present not speak of it? What men are poets who can speak of Jupiter if he were like a man, but if he is an immense spinning sphere of methane and ammonia must be silent?"  Richard Feynman, Lectures in Physics, Vol. 1, Addison Wesley, 1963.

 

The term "New Age" is a misnomer, and an insult to better ages. It is a misnomer because it is a regression to primitive ways of thinking, ways which should have remained buried, yet which have resurfaced due to a mysterious mental pull toward the primitive. This pull is unfortunately endemic to the flawed human mind. "New Age" embraces the occult, a belief in angels, spirits, astrology, magic and life after death. It is a return to the kinds of enslaving thoughts which Lucretius urged his disciples to be rid of 2000 years ago.

 

The Primitive's Reliance on Spirits

 

The environment of our primitive ancestors, including both the physical and social aspects, rewarded genes that constructed brains that could deal with the world, which is profoundly different from stating that their environment rewarded brains that could understand the world. As I argue in a later chapter, primitive people did not employ the full powers of a modern left prefrontal cortex, but instead relied, in both cerebral hemispheres, upon the more primitive design of today's right cerebral cortex. To the extent that "producing grandchildren" (a convenient measure of genetic success) became more dependent upon mastery of a world of human relationships instead of mastery of the natural world, the architecture of the human brain evolved in ways that favored comprehending the social world at the expense of the natural one.

 

The social arena is less predictable than the natural one, so different mental abilities were rewarded in an environment requiring social skills. When a brain that evolved for the social setting addresses matters in the inanimate world, it should not surprise us to find that such a brain employs "weird logic" in this neglected realm. The primitive's vision of the world, being unguided by rational thought, was filled with spirits that behaved like people. Primitive people have gods for lightning, wind, rain, light, dark, and whatever seems important to a primitive's precarious life. Thus, when the sun god loses a conflict, according to this weird logic, it follows that there shall be wind, rain and lightning.

 

Today’s common belief systems provide evidence that for our ancestors the need to competently deal with human affairs was more important to the evolving human genome than the corresponding need to competently deal with the inanimate world. In high‑tech modern Japan, for example, the indigenous Shinto cult and religion remains popular. Shinto worship centers on "a vast pantheon of spirits, or kami, mainly divinities personifying aspects of the natural world, such as the sky, the earth, heavenly bodies, and storms. Rites include prayers of thanksgiving; offerings of valuables..." (Encarta Encyclopedia, 2000). Even well‑educated Chinese still believe in Feng Shui (the need to please spirits by a proper placement of furniture, entrances, etc). American Indians, who crossed the Bering Strait 13,000 years ago, brought with them a burdensome need for believing in spirits that demanded ritual obedience. There seems to be an abundance of depressing examples from every culture.

 

The Dyads of Primitive Thinking

 

Dyads abound in primitive thinking. Night and day, good and bad, friend and foe, birth and death ‑ they all contribute to a "yin and yang world." It is not surprising that when primitive men floundered to explain the world, they relied upon a dyadic competition. Thus, night and day are engaged in a daily struggle, literally; and at sunrise the "day" has become victorious over "night," and so on. But it gets complicated, for during winter the stronger competitor is night, whose exhaustion gives day the upper hand during summer.

 

Conflict permeates a primitive's thinking, because conflicts between tribes define primitive life. Nevertheless, men battle upon a stage set by even stronger forces than themselves. The weather is overwhelmingly strong, as is the ocean, the occasional earthquake, tsunami and volcano. There must be gods in heaven who unleash the thunderstorm and the lightning that punish and reward men. Since powerful men can be appeased, or at least slightly influenced, perhaps so can the gods. Man's quest for control over his fate led him in false directions, for gods cannot be appeased when they don't exist.

 

We should laud the primitive's urge to explain, even though it seems to be only weakly motivated by an urge to understand. The human claim for nobility rests upon this urge. But let us also not be mistaken about the explanations created by primitive men: Primitives have been stupendously wrong in almost every instance!

 

Their explanations were wrong because they arose from a primitive right brain. Only recently, with the ascendance of the aforementioned, fast-evolving left brain, with its logical mode of thought and lack of traditional "wisdom," has it been possible to conjure up correct explanations. But so strong is the irrepressible right brain that even many contemporary "intellectuals" still believe that primitive explanations contain some profound and subtle wisdom that makes them "just as valid."

 

Thinking men of every age seem to have sensed a pull toward primitive thinking, and worse, toward primitive behaving. The decay of civilization has been an ever‑present concern for those who live in a civilized state. This concern was expressed by ancient Greek philosophers, just as it is in today's world.

 

We know that the civilized state is not secure, because we sense the presence of that insistent and primitive right brain. To use the primitive's own metaphor, we are engaged in a struggle between good and bad, between light and dark, and it is now "late afternoon." Some of us who worry about the approach of evening, and a long night, admonish our contemporaries to resist the "primitive pull," to stay the course that brought us to this glorious noon, atop the highest mountain, by keeping the new faith as it struggles with the old. It has become a battle between the two titans of human history:  the two brain halves!

 

The Modern's "Spiritual Cleansing"

 

The primitive way of thinking is more efficient to implement than the modern physicist's cumbersome a = F/m and quantum physics way of thinking.

 

We moderns smile at our ancestors' faltering attempts to see order in nature. From our 21st Century perspective, we see that their "explanations" are pathetically simple-minded, and emphatically wrong!

 

Yet most people today feel comfort in being pulled in this primitive direction. It's as if the more complicated and up‑to‑date explanations require too much effort, resisting as they must the objections of old brain circuits. The result, for most people, is that the brain maintains old and new understandings side‑by‑side. The human brain is amazingly adept at compartmentalizing thought, and allowing the most irrational beliefs to coexist beside enlightened ones. "Cognitive dissonance" is minimized by insulating brain circuits from each other.

 

More than a few scientists surrender rationality on Sunday. I once worked with a scientist, a master of magnetic fields on planets in our solar system, who believed in the many levels of heaven taught by the Mormon Church. I give more examples of this in Chapter 13.

 

During Humanity's long march to the present, we have progressed from "magic" explanations to "rational" ones. Those brave thinkers who led the march have shed the magic and embraced the rational. Rationality led to reductionism, which I believe is Humanity's greatest intellectual achievement! The march forward has been led by people whose style of thinking adheres to the values of our left cerebral hemisphere, or left brain. The regressive, backwards pull is from a majority of "neurologically primitive people" whose thinking style remains right‑brained.

 

Humanity's path to reductionism has been "forced" by necessity. Imagine the consequences of taking your car to a minister for its repair, instead of a car mechanic; or seeking medical help from a shaman medicine man instead of a medical doctor or nutritionist. Our primitive ancestors had no need for car mechanics, and in their time medical doctors didn't exist, so they had less to lose by adhering to magic. With the unfolding of time, and the accumulation of technology, there has been a growing need to distinguish between spiritual and rational explanations.

 

Not all aspects of modern life require rational explanation. My friend who believed in many levels of heaven was unencumbered by this belief during his work hours. His spiritual beliefs might, in fact, have had a stabilizing effect in his personal life. It is relatively inconsequential whether a person believes they will go to heaven when they die, or dissolve to dust! It is more important that they know about family budgets, cars, computers, and nutrition.

 

Whereas it may not matter to a person's success at living whether all remnants of spirituality have been purged from his intellectual outlook, it does matter to the person engaged in a serious endeavor to understand "the nature of reality." Every serious thinker is obligated to undertake a lifelong vow to cleanse away all vestiges of spirituality!

 

We must control the impulse to regress to a world of spirits, no matter how comforting it may be. As argued by Lucretius, we must move forward, abandon belief in personal guardian angels and protecting gods, and replace them with understandings based on rational thought.

 

Are We Making Progress?

 

During the past 80 years scientists have slowly aligned their personal beliefs with rationality. In 1916 Leuba surveyed the beliefs of 1000 randomly selected scientists and found that 42% of them believed in God, whereas a similar survey conducted in 1996 showed only a modest decline to 39%. The belief in immortality declined by a slightly greater amount, from 51% to 38%. Perhaps more revealing, the more accomplished scientist is less likely to believe. Leuba (1914) surveyed 400 "greater" scientists and found belief in God to be 28%, whereas Leuba (1933) found that 19 years later the belief rate declined to 15%.

 

Today, Larson and Witham (1998) report that among 517 American scientists who belong to the prestigious National Academy of Sciences only 7.0% believed in God. Considering a belief in immortality, the above studies report that for 1916, 1933 and 1998 the belief rate was 35%, 18% and 7.9%. Among the general population of non‑scientists, 96% believe in God!

 

There appears to be a decoupling between what people of accomplishment believe and what the hoi polloi believe. Among the "greatest" and "accomplished" scientists the rates of belief in God and immortality are low and declining dramatically; among the ranks of scientists as a group the belief rates are under half and declining slowly; whereas among the general population the belief in God is high and remains unchanged. Only the intellectuals are abandoning God!

 

Faltering Progress

 

As my chapters on the brain explain, I believe that the right prefrontal cerebral cortex finds mysticism and religion congenial to its way of thinking, whereas the left prefrontal cerebral cortex is inclined to think rationally. There is little doubt that the left cerebral cortex is a more recently evolved brain area than the right, as it is responsible for speech, conceptual thought and logical thinking. The practice of science requires a well‑developed left brain, although a well‑functioning right brain is also required in a supporting role. I speculate that scientists typically have left brains that "dominate" their right, in the sense that the left brains use the right brains as "tools" in the pursuit of left‑brain‑directed activities. The scientist values things that the left brain values, and the scientist's approach to studying a problem, and the standards of proof, are consistent with the style of thought of a left brain. Among the others, it is the right brain that employs its left as a tool, keeping it subservient to right‑brain values and goals. This more common style is a phase humanity must continue to "evolve through" if it is ever to reach a winning place as a sentient species.

 

As I look back upon the recorded history of the human groping for an understanding of who we are I discern good and bad eras. The first good era was 5th Century BC Greece, when the Ionian philosophers articulated the reductionist paradigm, as described by Sagan in his book Cosmos (1980, pg. 175). Democritus was the shining star of that era. The next era would be 1st Century BC Rome, when Lucretius wrote his famous poem On the Nature of Things. There followed a Dark Ages millennium during which anyone who thought rationally had to keep their thoughts secret. Stirrings of rationality started in 17th Century Denmark. The 18th Century Philosophes in France produced a full-bloom resurgence of interest in Greek thought, but this rebirth was short-lived due to the French Revolution. During the 19th Century European discoveries buttressed support for rationalism and the reductionist paradigm in particular. The new physics of the early 20th Century began to discredit reductionism, unfairly in my opinion, though rationalism continued to prevail. The global depression quieted many independent thinkers (like H. L. Mencken), and this marked the beginnings of a slow return to spiritualism. At this writing, in 2006, I see only a downslide of critical thinking that cannot compete with the primitive appeal of religion and an excess of politically-correct embrace of diversity of thought, no matter how irrational. A new Dark Ages might be approaching.

 

Laying a Groundwork of Understanding

 

The prospects are poor that during the next century the general public will embrace a rational outlook. Most people, even most intellectuals, are inclined to use the term "reductionism" disparagingly. This can be understood if their belief systems are influenced by a primitive, right prefrontal cortex. Rationality, which is a left prefrontal cortex creation, is in conflict with the old right pre‑frontal cortex. A fuller explanation of this will follow chapters on human evolution, the brain's role in human evolution, the appearance of the artisan, his role in the rise of civilizations, and the resentment of the artisan's rapid rise to power.

 

These chapters, in turn, will be preceded by a tutorial on genetics, using the sociobiological paradigm. This will be our task for the next three "genetics tutorial" chapters.

 

If, dear reader, you find the genetics tutorial chapters tedious, then skip them if you must. You merely risk not having some tools for understanding the "micro‑motives" underlying the "macro‑behaviors" under discussion. The genes, after all, underlie everything pertaining to life!

 

The chapters describing human evolution (Ch. 6) and the brain's role in human evolution (Ch 8) are intended to illustrate reductionist ways of thinking about the evolution of human nature, and should not be skipped. They provide a background for understanding the following speculations on the rise and fall of civilizations.

 

The utopias and living wisely chapters will resume the main theme of this book, which is concerned with the individual's predicament of living with "outlaw genes." The intervening chapters present a story of how humans came by the weird human nature we're stuck with, and I see no way of resuming the main topics without the preparation of these intervening chapters.


CHAPTER 3

                              

GENETICS TUTORIAL ‑ PART I

 

"...organisms die but their genes pass on ‑ often mutated and redistributed, it is true, but genes nevertheless; and it is difficult, therefore, to escape the conclusion that the design of the organism is merely to provide for gene multiplication and survival..."  Carl Sagan, "Radiation and the Origin of the Gene," Evolution, January, 1957.

 

People once believed that the universe was created for Mankind and all other life was placed here for our use. This was gradually replaced by the harsher belief that our species competed with other species, and that Human Nature was designed to do what was good for the species. Anyone who acted selfishly was aberrant, and would be punished in a later life. Darwin believed that individuals competed with each other, and that the victors of individual conflict shaped human nature. But now the competition has descended one more level: sociobiologists argue that gene alleles compete with each other for positions on chromosomes (W. D. Hamilton, 1964a,b; G. C. Williams, 1966; E. O. Wilson, 1975; R. Dawkins, 1976).

 

If the combatants are the genes, then what are we individuals? We are the "lumbering machines" carrying the genes that assembled us for the genetic competition (Dawkins, 1976). An individual is like a puppet, whose behavior is directed by strings that are pulled, ultimately, by tiny genes (please excuse the poetic license and anthropomorphism of this phrasing). The demotions our egos suffered during the past couple of centuries continue into the 21st, as people must now deal with the thought that we are created by our genes for gene battles, and that the genes do not care about the individual's welfare.

 

If the genes are this important then we should know their story, from the beginnings of life to the present. I shall present a recapitulation of the evolution of life on earth, with a proper emphasis on the role of genes. Some of these descriptions are speculative, yet illustrate ways that I believe the subject should be approached. The essence of every speculation is, of course, mechanistic reductionism!

 

A Brief History of Life

 

When earthly life started, 3.5 billion years ago, tiny replicating molecules resembling DNA (or maybe RNA) must have competed with each other to incorporate molecular building blocks into copies of themselves. In time those molecules that accidentally created a protective "coating" survived longer. This crucial event might have been hastened by the existence of water droplets that would naturally form surface layers of like-charged molecules with hydrophobic ends (Donaldson et al, 2004). A droplet with such a covering is a rudimentary cell, as the cell "wall" may have protective properties.

 

Well-functioning one-celled forms, which housed their cell-creating DNA floating free inside (prokaryotes), appeared early in this history. Approximately 2 billion years ago a variant of the prokaryotic one-celled life form appeared in which the several DNA molecules were confined to a cell nucleus (eukaryote). This represented one more structural level of protection for the DNA. Whereas it must have helped the DNA survive, it also required that a solution be found for a slightly more complicated replication process.

 

A trend is evident, and it has continued throughout the long story of life on Earth. To compete better, DNA molecules have had to wrap themselves in an ever‑more complex structure, devise ever‑more complex methods for replication, and retain control of their protective structures for competition with other life forms.

 

Multi‑celled creatures did not appear until about 1 billion years ago. Each cell contained a nucleus with an identical set of genes inside. It is possible that initially all cells in a multi‑celled life form were identical. A grouping of cells has a smaller "surface area per mass" exposed to the watery environment than one‑celled forms, and this added protection may have rewarded the forms that tended to stick together.

 

Specialization of some cells in a many‑celled creature may have been the next step in the evolution of life. With a reliable association of cells having the same DNA, there existed a reward for any gene mutations that helped the outermost layer of cells to specialize in protective matters. Since all cells had the same nucleus, it could still be argued that the sacrifice of a few cells to become mere "protective skin" while forsaking reproduction themselves nevertheless enhanced the reproductive prospects of the identical genes inside the cells that were being protected. This concept is a kernel for "inclusive fitness," described below.

 

In this way, skin might have been the first "organ" to evolve. Once a method had evolved for guiding a cell's properties to be responsive to its surroundings, the path lay open for the evolution of any number of organs. Organisms competed with differently constructed organisms in seeking food - and perhaps in consuming each other. But since the outcome of that competition was determined by the properties produced by the genes within, it is more insightful to view the competition as occurring between the various gene groups than between individuals.

 

Gene Competition Within and Between Species

 

Specifically, only the genes that differed were in competition. Identical genes in competing organisms might appear to be in competition, but only because they were part of an organism that had different genes. Since the fate of identical genes in different organisms was not in question, they were not competing with each other. If organisms had chromosomes that differed at only one gene location, and only two gene forms (alleles) existed at this location throughout a population of organisms, then it would be the two genes (gene alleles) that were in "competition."

 

The casual observer who thought that two kinds of organisms existed that were competing with each other (for food, let's say) would be misunderstanding the situation. A deeper understanding shows that two genes were competing with each other.

 

The word "competing" is a human‑invented concept. It is important to remember that the entire process of genes competing, with one gene mutation slowly yielding to another, is purely mechanistic. Obviously, the genes are unaware that they are "competing." Only a human observer would remark "these two genes are in competition."

 

Individuals within a species may "compete" with each other, and so may individuals belonging to different species. The underlying dynamic is the same:  every gene acts as if it wants to proliferate and last forever. Again, a gene does not "want" to proliferate. Rather, those that in fact proliferate are the ones that express themselves as individual characteristics that the human mind will identify as "wanting" to proliferate. Thanks to our primitive right brain (that evolved for dealing with social settings) it is helpful to use social metaphors for explaining mechanical processes.

 

It is theoretically possible that two species could exist in competition with each other while having no genes in dispute within each species. In other words, all individuals of one species would be genetically identical, and the same for the other species. Yet, since we are supposing that they are competing for the same resources in their environment, for example, the two species are in competition with each other. We should further specify that the two species hold some of the same genes (this is very likely, since they have common ancestors). For this hypothetical situation, two large chunks of genes are in competition with each other and the "winner will take all" after one species is exterminated (which is common for species that occupy identical niches).

 

This is one extreme of a continuum of situations. At the other end is a situation in which there is no between‑species competition, but there is competition between genes within the single species. In other words, some gene loci on the chromosomes have two or more alleles, and these alleles are in competition with each other. The simplest case would be one gene locus with two alleles, and the two alleles are competing with each other for exclusive presence at one locus on the chromosome.

 

This simple situation is likely to end with a complete win by one of the alleles.  However, there are special cases where there will be a steady-state percentage representation of both gene alleles (i.e., an evolutionarily stable strategy, or ESS, as described originally by John Maynard Smith and later by Dawkins, 1976).

 

In the real world there will be some within‑species competition of counterpart gene alleles and some between‑species competition between all the genes that are different between the two species. For the between‑species competition, large chunks of genes are involved; but only some of these will be under the influence of selection pressures.

 

Thus, for the typical situation, some gene alleles will be changing their representation frequency within the species gene pool due to "other species" selection forces, while other gene alleles will be changing their representation frequency within the species gene pool due to "within species" selection forces.

 

Appendix B presents examples of the ruthless nature of gene competition using human virus examples. 

 

Inclusive Fitness

 

The individual organisms called humans should be forgiven for seeing everything in terms of what it is good for to the individual organism. Dawkins (1982) describes an aberrant period in biology when the paradigm shifted from a "selfish organism" perspective (starting with Darwin) to an "adaptation for the benefit of the species" paradigm (during the mid-20th Century). Given that many (unsophisticated) people still believe that individuals do things for the benefit of the species, the level at which evolutionary competition is perceived to occur needs to shift in the other direction, leading eventually to the "selfish gene" paradigm.

 

In other words, if we seek insight into how evolution works we should not ask "what good are genes to individuals?" but "what good are individuals to genes?"

 

An example will serve to show how ridiculous an imperfect paradigm can be. A graduate student, who must still retain remnants of the belief that adaptation is for the good of the individual, studied the Australian redback spider. She describes how the male positions himself during copulation so that the female can eat his body during a leisurely insemination process, thus sating her so that she does not seek another male. This assures that the sacrificing male's sperm will not have to compete with the sperm of another male. Summing up the article, the magazine author presents the following astounding quote attributed to the student: "Sexual cannibalism has always been thought of as a conflict between males and females, in which males are just being overpowered. It's important to realize that it can be advantageous for the male."

 

Only someone handicapped by the "good of the organism" paradigm could say such a ridiculous thing! The beneficiary, clearly, is the gene that causes the male to behave in this bizarre manner, not the individual male. He is a victim, even more than the female. What is missing from the summarizing statement is an acknowledgement that genes produce behaviors which can manipulate and victimize individuals, behaviors that benefit only the genes that create them.

 

Inclusive fitness states that genes tend to produce individual behavior which maximizes the presence of the behavior‑producing gene in succeeding generations (Hamilton, 1964a,b). Since biological relatives are likely to be carriers of the same genes as the behaving individual, the fate of the genes in relatives matters as much as the genes in self. For social animals that live among relatives the merit of an action must take into account the consequences upon those "genetically related" carriers of the genes.

 

The mathematical treatment of inclusive fitness is straightforwardly presented in many places. I shall simply give a feeling for it by repeating a traditional example. A gene will "reward" a self-sacrificial act if the act saves at least two siblings, or eight first cousins, or thirty-two second cousins, etc., since in each case the number of identical gene copies preserved (on average) at least equals the one copy lost.
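In the standard notation of Hamilton's rule (an illustrative aside, not the book's own notation), a gene promoting self-sacrifice spreads when

\[ rB > C, \]

where C is the reproductive cost to the actor, B the benefit to the recipients, and r the coefficient of relatedness: 1/2 for a sibling, 1/8 for a first cousin, 1/32 for a second cousin. Saving two siblings, eight first cousins, or thirty-two second cousins each gives rB = 1, just balancing the cost of one lost life.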

 

Inclusive fitness provides an explanation for "altruism" by the argument that an altruistic act promotes the proliferation of genes generating the act. Sometimes these altruistic acts are at the expense of the individual; but that's OK from a gene's perspective, for the individual is merely a tool created by the genes for gene proliferation!

 

The inclusive fitness paradigm is useful in many other situations, some of which will be dealt with in the following chapters.


CHAPTER 4

                              

GENETICS TUTORIAL ‑ PART II

 

"Thus the earliest vertebrates, like the earliest amphibia, the earliest mammals, and the earliest primates, were small predators. Over and over again in evolution, the originators of new modes of life were small predators, and the key innovations at each stage conferred a selective advantage in predation." John Morgan Allman, Evolving Brains, 1999, p. 73.

 

In this chapter I continue describing some basic principles of evolution that apply to all living things. It will serve as a foundation for the more speculative and interesting evolutionary results found in human nature.

 

Pre‑Adaptation

 

Some genes are "pre-adapted" for new environments. A gene is pre-adapted if it conferred a negligible reward for its presence in the genome before some new environmental challenge appeared for the first time, yet confers a significant genetic benefit once that challenge arrives.

 

Modern society provides many examples. Computers didn't exist before the mid-20th  Century, yet we find that many individuals are naturally talented for computer programming, design, networks, and other aspects of computer use. These people have genes that are pre‑adapted for the computer environment.

 

Pre‑adaptations are always present, as the following thought experiment illustrates. Imagine any task, and a procedure for reliably measuring performance of that task. The task could be jumping as high as possible, or remembering a sequence of numbers ‑ any task will do provided performance can be measured objectively, producing a continuous range of scores (a binary result, such as pass/fail, does not meet this "continuous range of scores" criterion). After two people have performed the task, there will invariably be a "better" and "worse" performance. After many people have performed the task, the test scores may form something resembling the Gaussian, or "bell curve" distribution, with many scoring near a middle region, and fewer scoring really well and poorly. The top scorers can be described as "pre‑adapted" for the task (provided the task is novel or evolutionarily "new").
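For readers who like the bell curve in symbols (a standard form, offered only as an aside), the Gaussian density with mean μ and standard deviation σ is

\[ f(x) \;=\; \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/2\sigma^2}, \]

and only about 2.3 percent of scores fall above μ + 2σ; it is these rare high scorers whom I am calling "pre-adapted" when the task is evolutionarily new.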

 

In real‑world situations, whatever the change in environment, whatever the change in job opportunities, whatever new sporting games are invented, there will always be new consistent winners and losers. Winners in the new task might have been mediocre performers in the old ones (and old winners may become the new losers).

 

"We are what we're good at," and the forces of selection measure us by what we're good at in the context of our times. Whereas the computer whiz is pre‑adapted to this era, so might a “nobody” of today be pre‑adapted to some future era. We should be careful in judging others, for they might have shined outstandingly in past settings, or be examples of a type that will shine in future ones. Faceless nobodies of times past might have rivaled the best of today's stars, if only given the chance by a change of environment. Chance is everything!

 

Pre‑Maladaptation

 

But if the winners are pre‑adapted, the losers are "pre‑maladapted." It may seem "unfair" to a civilized mentality to believe that "pre‑ordained" winners and losers will exist when new opportunities appear, but every species has been molded by this unfair rewarding of individuals through abrupt environmental change. We may not like it, but this is the way things work.

 

Anyone feeling gratitude for evolutionary accomplishments should also feel thankful for the diversity of individual performance. Thanks to "inequality" evolution proceeds! But while we celebrate inequality, and rewards for the pre‑adapted, let us also have compassion for the pre‑maladapted, the world's ill‑fated losers, for they cannot be responsible for the changes that doom them.

 

Over and over, in this book, we shall encounter repugnant examples in nature. Our lesson is to accept that Nature doesn't care about individuals, only the genes! And the genes have no qualms about wasting individuals for their sake. Fish lay thousands or even millions of eggs, so that on average only one or two will survive. Several insect species produce male brains that are programmed to allow the female to make a nutritious meal of the male after copulation - to postpone her mating with a competitor, or to nourish his offspring! The historical record shows that humans will send legions of young men to battle, like fodder, to be maimed or killed in the prime of their lives. The victors in battle rape the vanquished men's women, then march home as heroes, with greater rights for domestic breeding. In all these settings, the individual is "sacrificed," for he is engaged in risky behaviors whose benefits accrue reliably only to the genes.

 

Humans who ponder the consequences of what I'm calling a pre‑maladaptation have grounds for bemoaning their bad luck. I like the thought that each person "has their time," a time when they would have some maximum of pre‑adaptation, and since people are born into times "at random," they most often are "out of their preferred time." Imagine how frustrating it would have been for Beethoven to have been born before pianos existed, and before orchestras. Or for Einstein to have lived before the preliminaries of 19th Century physical theory had been set. Delay Darwin's birth a century; would he have become the giant we know today? Bring to the 21st Century such notables as H. G. Wells, Lucretius, Democritus, Shakespeare, Homer, and others; what would become of them? We cannot know how fortuitously attuned to their age these giants were, or what nobodies they might have been had the "roll of their genes" occurred at some other time.


Species Shaping Forces

 

Pre‑adaptation is a useful concept calling attention to the fact that whatever an organism's make‑up it will have some kind of “match” to every hypothetical new environment, and the match of some individuals will be adaptive. It might be adaptive whether or not the environment in question has ever existed before, and whether or not any of the organism's ancestors have been exposed to that environment. In such cases, we should not say that the organism has become adapted to the environment in question, just because it fares better than some of its cohorts.  Rather, it is adapted due to a pre‑adaptation.

 

Most organisms will be pre‑maladapted to the new environment. Thus, most individuals will fall behind, watching a minority of pre‑adapted individuals leap forward. The greater the number of changes to the environment, the greater will be the disparity in relative rewards between the pre‑adapted and pre‑maladapted. A species should evolve "faster" at such times.

 

When I refer to changes in the "environment" I mean to include not only the climate for a region, and the disappearance of a food staple (plant or animal), or appearance of new foods, but also the appearance of a new predator, the invasion of a new parasite, or the adoption of a new element of "culture" (in the case of humans, and perhaps chimpanzees).

 

There is a special case, unique to humans, in which culture has created an entirely new environmental condition:  the removal of most of the natural threats to survival.

 

An advanced civilization shields people from diseases, animal predators, and, in some cases, the need to work. It even shields people from each other to a great extent, by reducing the frequency of outbreaks of "tribal warfare." In this environment genes that in harsher, unforgiving environments would be maladaptive would now be neutral. Only the most severe genetic defect will be eliminated from a human genome shielded this way.

 

Under these conditions we might want to think in terms of "potential pre‑adaptation" and "potential pre-maladaptation." Today's genome is accumulating a large reservoir of potential pre‑maladapted genes, carried unknowingly by individuals who may be reproductively successful only because they are not subjected to selective forces.

 

At the risk of getting ahead of my story, I believe that such genes will become apparent only after natural forces of evolution are restored, and put "the squeeze" on our burgeoning global population. Winners and losers in this new environment will not be close-call winners and losers; they'll be clear-cut winners and losers. The disparity between those now destined to win and those destined to lose is greater than ever, and growing faster than ever. The complexion of Humanity could change dramatically après le déluge.

 

To understand a species we must consider the selective forces that have "shaped" it. In other words, we must learn what kills individuals before they reach reproductive age, what factors determine which individuals reproduce after reaching maturity, what foods are eaten, and how precarious is the supply.

 

For example, if our ancestors 5 million years ago were eaten by lions the survivors would have been good at avoiding lions. This might have rewarded the evolution of bipedality, which would have enabled standing tall and running fast. It might also have rewarded the capacity for social cooperative strategies, a precursor to intelligence.

 

Another theory speculates that our ancestors had to learn how to find and store root plants that would have grown on the grasslands (Wrangham and Peterson, 1996). This would have rewarded the creation of digging tools, and the ability to carry extra roots to a storage place at a home base, which in turn would have rewarded bipedalism and a self‑control that provisioning requires.

 

Whichever environment accompanied the branching of bipedal chimpanzees from their jungle‑dwelling forebears 5 million years ago, we can be sure that the forces of selection rewarded individuals carrying genes for dealing with whatever were the causes of mortality in their new environment, whether they were escaping from lions or digging and storing roots.

 

Perhaps 500,000 years ago some humans migrated to the edge of constantly‑moving glaciers. Mortality in this new setting would have been climate‑related, such as cold and hunger. We may presume that genes for planning and foresight were rewarded. To the extent that large animals were hunted, and meat became an essential food source, genes for a strategic type of cooperative hunting would also have been rewarded.

 

After the last glacial cold period, that peaked 19,000 years ago, humans had to adapt to an ever‑warming climate. For some, this meant adopting an agricultural lifestyle. Those who were pre‑adapted for farming would have prospered, provided they could also band together for mutual protection from raiders. Others remained nomadic, and targeted the new farmers. Thus, the main killer of Man became other men (it probably has been "other men" for the past 100,000 years, at least). As farming achieved unprecedented success, urban living became possible, sometime after 3000 BC. This created opportunities for microbes, which competed with Man as the main killer of men. Our ancestors are the ones with immune systems that afforded protection against "urban" diseases.

 

In every step of this evolution toward modern Man, the change in what killed people was a principal selective "force."

 

H. G. Wells made the point, 100 years ago, that long‑lived life forms cannot adapt to fast changes of conditions, unlike short‑lived forms, that can adapt. This leads more often to the demise and replacement of long‑lived large creatures by other large creatures, both of whom are competing with small, short‑lived creatures. He warns that humans, with a long life span, are vulnerable on this account.

 

The looming threat to Humanity posed by viruses and bacteria may become a classic example of this evolutionary dynamic. How ironic if our demise, or loss of greatness, which is most often portrayed in terms of dramatic events, such as global thermonuclear war, instead is dealt by tiny viruses!

 

If in fact viruses produce large‑scale human die‑offs during the 21st century (Garrett, 1995), the survivors will be those with pre‑adapted immune systems; not the physically strongest or most intelligent. This possibility illustrates in dramatic fashion the principle that "a species is shaped by what kills its members."

 

How Many Genes Can Compete?

 

Human tribes are supposed to have numbered 50 to 100 individuals throughout much of our prehistory. The number of adults in such a tribe would have been about half this number, half of whom would have been adult males (12 to 25 in number). It is tempting to think that no more genes than this can be under competition in the tribal genome at any one time. But such an assumption is erroneous, as I will illustrate.

 

When a man goes into the world "to be measured," it is his phenotype that is being measured. And his phenotype could be the result of many genes (interacting with each other to produce a unique phenotype). Consider the extreme case where each of the men in a tribe differs from "an average" by just one allele. Consider 4-year intervals, during which each woman of child-bearing age bears one baby. During each 4-year interval, if only one man prospers and is accorded sole breeding status, then every 4 years one allele can be declared a winner over its competitor(s). In a lifetime, 10 alleles can be declared winners (where we imagine the environment places great importance upon a different aspect of the phenotype in each 4-year period). After 80 years, 20 alleles could be declared winners, etc.
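A minimal numerical sketch of this thought experiment (my own illustration, with assumed numbers; nothing here comes from field data) shows how the count of loci touched by selection can exceed the number of adults:

    # Thought-experiment sketch: a small tribe, many biallelic loci, and in each
    # 4-year interval selection acts strongly on just one locus (hypothetical numbers).
    import random

    random.seed(1)
    N_ADULT_MALES = 25      # assumed number of adult males in the tribe
    N_LOCI = 500            # assumed number of loci carrying competing alleles
    N_INTERVALS = 200       # 4-year intervals simulated (800 years)

    freq = [0.5] * N_LOCI   # starting frequency of one allele at each locus
    touched = set()         # loci whose frequencies selection has pushed

    for _ in range(N_INTERVALS):
        locus = random.randrange(N_LOCI)            # the trait "measured" this interval
        freq[locus] = min(1.0, freq[locus] + 0.25)  # strong selection nudges that locus
        touched.add(locus)

    print(len(touched), "loci affected, versus", N_ADULT_MALES, "adult males")

With these assumed numbers the count of affected loci comes out far larger than the number of adult males, which is the only point the thought experiment needs.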

 

Even though this is only a thought experiment, it shows that there is no fundamental mathematical reason forbidding the number of gene sites under allelic competition from exceeding the number of adult males in a self-sufficient tribe, or even the total number of tribe members. (A "multiple regression" statistical argument is also possible, and more persuasive to me, but I shall spare my readers that daunting argument.)

 

Every individual is a carrier for many genes that are competing with allele counterparts. The number could be 50, or 500; and it doesn't matter if the individual is a member of a tribe with only 50 or 100 members.

 

Migration, New Gene Competition, and Pace of Evolution

 

The number of gene loci hosting allelic competitions has undoubtedly increased since the advent of urbanization, and with the more recent globalization of our species. A tribe of Africans may be homozygous for genes influencing skin color, but if they were captured and brought to America as slaves their descendants would find that those same genes had become a factor in determining individual welfare in the non-African society. The same argument applies to many genes.

 

The inescapable conclusion is that the more diverse a population becomes, due to migration, the more genes there are in competition with each other. Does this mean human evolution is progressing faster today, or slower? If a person's "measure" is affected by more genes, it must take longer for all genes to have their measure taken. Stated another way, when more genes enter the fray of competition, those already in competition may feel a decrease of selective pressure influencing their fate. Aspects of a person which were important in the tribal setting suddenly recede in importance, as other genes, which had been firmly established many generations earlier, resume their competition with alleles they had never before encountered, or had encountered long ago and had triumphed over. The coming together of ethnicities must introduce major changes to the set of genes that are subject to evolutionary forces, in terms both of which genes are in competition and the relative strength of selective forces upon specific genes. These considerations suggest that the pace of evolution has slowed in modern times.

 

Another slowing influence is the declining rate of infant and childhood mortality. Unless something important has been overlooked, these arguments point to the same conclusion (Kondrashov, 1988).

 

Conservation of Selective Pressures: Pleiotropy and Polygenes

 

When selective forces suddenly reward a new capability the species undergoes a quick disintegration in other, more recently‑acquired capabilities. This is due to the random, unintended deleterious effects that any mutation produces, which places a brake on the speed with which new capabilities can be acquired.

 

To understand this, recall that a gene has many effects, referred to as pleiotropy.  This is most dramatically illustrated when a mutation occurs that has no redeeming consequences. For example, one mutation causes its carrier to have 6 fingers, short stature and heart murmurs (Ellis‑van Creveld syndrome). These phenotypic effects are seemingly unrelated, yet they are caused by just one allele. Mutations that are adaptive, judged by the fact that they have been selected during the course of evolution, will also have many effects, with perhaps just one of them being adaptive to a far greater extent than the numerous, small  negative effects. Thus, whenever a mutation occurs and confers an increment of adaptive advantage, its future in the gene pool will depend not only on how well it performs its adaptive task, but also upon how many unintended, deleterious effects come with it.

 

Assume for the moment that there are only 40,000 genes in the human genome. Since there are more than 40,000 properties defining a human, each gene must have more than one beneficial effect. This implies that after a gene is "in place" it can be modified over time to produce additional desired effects. An "old gene" may thus have several beneficial effects, in addition to a few small negative ones. The selection of a modified, dual-purpose (or multi-purpose) gene must occur with "painful slowness," since the original function of the gene should not be disturbed appreciably, and since every mutation is likely to produce other unwanted effects. Getting things "just right" must require many generations and many small compromises.

 

Whenever a new selective force becomes important, the other selective forces must lose importance, else the population will drop to dangerous levels. This "partitioning" of selective pressure leads to a more conservative behavior of our genome, causing already established gene alleles to remain longer than otherwise.

 

By the same reasoning, a recently established gene allele is more likely than a long-established one to be disrupted with deleterious effects. This "genetic entrenchment" is due in part to the rewards of redundancy for genes that are important enough to respond to selective forces for long periods of time. A task performed by a gene will be less vulnerable to mutations of that gene if the gene has been exposed to mutation and selection for a long time. In addition, genes that exist for a long time may become "depended upon" by other genes that are selected later and that in some way require the presence of the first gene for their new effects to be expressed properly. When two or more genes must be present to produce a specific phenotypic trait that has adaptive value, those genes are referred to as a "polygene" group. Genes that are members of a polygene are more difficult to get rid of, provided they have not become harmful. New genes have not had this opportunity to achieve robustness, or become entrenched, and they are thus more likely to be lost to random mutations because they are likely to have small phenotypic consequences. The concept of "genetic entrenchment," and a culturgen counterpart to this concept, is treated at greater length in Chapter 16.

 

Brain Genes

 

For humans it has been estimated that at least 20% of the genome influences the brain. This is not to say that 20% of human genes are present exclusively for brain wiring, since many genes will exist mainly for other purposes which have "acquired" brain wiring roles. If one of these genes mutates it is more likely to affect its new brain task than the older, original task. Undoubtedly some genes are mostly brain‑related, and probably some genes are exclusively brain‑related. Whether a gene is partly, mostly, or exclusively brain‑related, if it recently acquired this role it is likely to be more vulnerable to random mutation than the other parts of the genome, or to older genes that mostly affect anatomy or physiology.

 

Parts of the modern human brain evolved during the past 100,000 to 200,000 years, and some people speculate that for the past 40,000 years little has changed. I will argue later that brain genes continue to evolve in response to changing social conditions, which add in subtle ways to the repertoire of human behaviors. For now, I merely claim that behaviors which are uniquely human, and which are recently evolved, are most vulnerable to disruption by the appearance of new selective forces.

 

If a new adaptation has been selected for strongly, it might acquire robustness even in a relatively short time. Human language, which may have appeared 200,000 years ago, is a candidate example. Language played such a crucial role during its evolution that the genes that code for it are probably already robust.

 

The capacities for reading and writing have a briefer evolutionary history, and the genes that code for these abilities are more vulnerable. Until recently, few people engaged in reading and writing; these genes provided a niche to only a small fraction of the population during the past 4,000 years. It is therefore not surprising that dyslexia affects several percent of the population, whereas a comparable impairment of spoken language is far less common.

 

Unintended Deleterious Effects

 

I suffer from occasional 20-minute blind spells, called "scintillating scotoma." It is an impairment produced by a gene that in women produces migraine headaches. As I type with difficulty through a flashing zig-zag blind-spot pattern, it occurs to me that I am paying a penalty for some genetic mutation that is doing good somewhere else. Every mutation does many small bad things for every big good one, and for most carriers the good must outweigh the sum of the small bad things; otherwise the gene allele would not have evolved.

 

In the case of my blind‑spells a dilated blood vessel is putting pressure on a nerve fiber carrying signals from my eyes to my brain's occipital lobe. What if the dilation occurred elsewhere within my brain? I might not know that it was occurring since I could not see it. But it might nevertheless have subtle effects upon mood or thought. There must be people, probably many people, who do indeed experience mild mental afflictions, lasting 20 minutes for example, which are counterparts to my scintillating scotoma. We should be prepared, then, for the possibility that a certain amount of irrational human behavior is caused by genes that are conferring a greater adaptation benefit in some other behavioral realm, with the unintended side effect that behavior is mildly irrational in a different realm.

 

I frequently think about the penalties that are paid when evolutionary pressures for one trait rise above the others. Sure, you can quickly evolve skin color in response to latitude migrations, but you'll pay with other unintended defects that accumulate, until the new skin has been achieved and a better balance of evolutionary forces has been established.

 

Only 12,000 years ago, just after the climate warmed but before the glaciers had melted enough to raise the world's sea level, people in Siberia migrated across Beringia to the New World. As they moved south, generation after generation, they would have lost their need for light skin. Central American Indians are dark-skinned, and this must have been achieved in less than 10,000 years. But those who continued their migration southward, past the equator, would have needed to re-achieve light skin. Perhaps at each migration juncture those who were best adapted to the latitude stayed behind and the others continued the migration. This could have minimized the risks of unintended deleterious mutations, but it is more likely that the southward migration was so hurried that skin color played no role.

 

Such fast adaptations must have produced defects in other aspects of the American Indian. Perhaps they lost the ability to metabolize alcohol; we shall probably never know what compromises the genes had to make to adapt quickly to the need for a different skin color.

 

Cancer may afflict humans more than most other species because we have recently undergone a rapid evolution under strong selective forces that rewarded brain re‑wiring (to accommodate behavioral adaptations) and immune system enhancements (to fight pathogens seizing the opportunities offered them by the newly evolved super‑tribe human lifestyle). To achieve these new traits, genes must have been selected that would normally not be acceptable because of their unintended deleterious effects, and a defense against cancer may have been one such compromised ability.

 

The Dangers of Fast Evolution

 

Species evolve at different rates. Even a given species may remain genetically static for many generations, then respond to an abrupt change in climate by evolving fast. Rates of change must vary by orders of magnitude, with long eras of equilibrium punctuated by short periods of disruptive change. Mammals existed throughout most of the dinosaur era, yet flourished only after the meteorite impact of 65 million years ago (which killed the dinosaurs by way of a brief, disruptive climate change lasting several years).

 

The equilibrium periods are available for "clean up" of unintended deleterious effects created during the fast evolving times.

 

The great diversity of human anatomy, relative to other animals, testifies to the great potential for fast human evolution. Strong selective forces on other traits must have superseded selection on such things as head shape, for example.

 

When a species is suddenly subjected to a strong selective pressure, a few gene sites will suddenly grow in importance. More than two alleles may exist at each "hot" site (if only one allele exists, it won't be a site for selective pressure). Other sites, being relegated to lesser importance, are likely to accumulate mildly deleterious mutations with less consequence than before the fast evolution (to use a metaphor, it's as if no one is "minding the store" when a new one appears). Humans, who have been evolving fast for the past 7 million years (since separating from the chimp lineage), must have many multi-allelic gene sites. The more alleles that are in competition, the greater the fraction of maladaptive offspring. Thus, the faster a species evolves in response to some new selective pressure, the lower the offspring survival rate must be in order to prevent a proliferation of the unfit.
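
A rough calculation (my own, with invented numbers) shows how quickly the fraction of maladaptive offspring grows with the number of competing loci. Suppose that at each of k competing loci the currently favored allele has frequency p, and that loci assort independently in a diploid offspring:

\[
\Pr(\text{offspring carries only favored alleles}) = p^{2k},
\qquad
\Pr(\text{offspring carries at least one disfavored allele}) = 1 - p^{2k}.
\]

With p = 0.9 and only k = 20 loci in play, 1 - 0.9^40 is approximately 0.985, so nearly every offspring carries at least one disfavored allele somewhere; the more sites that are contested, the larger this fraction becomes.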

 

Is it not ironic that today, after coming out of a phase of extremely fast evolution in several traits, humans have just achieved what must be the highest offspring survival rate ever? Does this not mean that humans also must be exhibiting the greatest rate of survival of maladaptive individuals? How long can this last? This topic will be returned to in Chapters 6 and 8.

 

Lag and Regression

 

An abrupt environmental change, such as those at the onset of an interglacial (occurring every 120,000 years, typically), must set evolution in motion in new directions. Until a new "optimum" has evolved, producing stasis (and genetic consolidation), there will be "lag." Some things are easier to evolve than others, and they will lag less. Skin color may be one example.

 

Because adaptation takes time, there could be a lag in many traits after an environmental change. Present aspects of human nature should "make sense" only in the Pleistocene context, not necessarily in that of the Holocene (the past 12,000 years). For this purpose it has been useful to create the term "environment of evolutionary adaptation" (EEA), also referred to as the Ancestral Environment (AE).  Common behaviors that were adaptive 20,000 years ago need not be "adaptive" today (Symons, 1979).

 

The Yanomamo Indians of South America appear to be more "primitive" than the Asian stock from which they descend, which began migrating to the New World ~12,000 years ago. How can this happen? Could the forces of evolution actually cause a population to regress? Yes! And maybe this happened with the Yanomamo. Their regression is only in relation to what was adaptive in their former setting. By definition, they must be better adapted to the Venezuelan jungle than were the original Siberian stock, or even the partially modified Central American Indian stock.

 

The longer the race, the greater the disparity between the contestants – especially between winners and losers. This is certainly true for a foot race, but is it true in evolution? Consider that our ancestry traces back to a chimpanzee‑like animal 7 million years ago, or 2 billion years ago to a one‑celled life form, and 3 or 4 billion years ago to strands of DNA. Things like those early DNA strands may exist today, as do many one‑celled life forms that may resemble those in our ancestry. So "yes," the longer the race, the greater the spread between the evolutionary contestants (note that all extant living forms are "winners").

 

In human affairs there is a discernible spreading in the quality of life of winners and losers. The most prosperous people of today have a higher standard of living than the most prosperous of yesterday, yet there are people living today who are no better off than the worst off yesterday. Can there be stability in a world where the rich get richer, and the poor stay poor? This is a topic for Chapters 11, 14 and 16.

 

Evolutionary Reversal

 

Random mutations rarely produce benefits to the individual organism (i.e., for the ability of the organism to stay alive, out‑compete its contemporaries and to out-reproduce them). A mutation that alters a gene is likely to have effects on many phenotypic traits (pleiotropy), and usually all or most traits suffer from random mutations. For a mutation to succeed, it must confer some advantage that outweighs damage done to many other traits. "Forward" evolutionary change is thus difficult.

 

After a genetic mutation spreads throughout a gene pool, it becomes part of a genetic setting that new mutations must deal with. If a new mutation relies upon the presence of the first one, and if this second mutation also spreads throughout the gene pool, then the first gene has a more secure future. This occurs because any challenge to the first gene must confer an advantage that outweighs the contributions of two genes - the first one and the other gene that relies upon the first one for its proper expression. The longer a gene stays within a genome the greater is the chance that other genes will become dependent on it and therefore provide it with additional security. When this happens, the gene has become "entrenched."

 

Consider the situation of environmental change that reverses itself at some later time.  The first change may lead to the appearance and widespread acceptance into the genome of a mutation. Let us assume that this new gene, which has almost completely displaced an older one, confers an adaptive advantage in the new climate. Suppose, now, that before this new allele has time to become entrenched, the climate changes back to the original state. The few individuals who carry copies of the original gene allele will become a source for the quick re‑emergence of the original allele. Evolution can be said to have "reversed" itself.

 

If the second climate change occurred much later, however, this evolutionary reversal might not be feasible. First, the original allele may have disappeared, and second, other genes may have become dependent upon the presence of the new allele, making it more difficult to dislodge from its entrenched location. In theory, both difficulties for an evolutionary reversal can be overcome, but they may constitute an insurmountable obstacle to the reversal.

 

Laboratory evidence exists for "reverse evolution" (Teotonio and Rose, 2000). Fruit flies from a standard stock were selected for various experiments over the course of 20 years (200 generations) and were subjected to new environments to produce variant strains. When fruit flies from these new strains were subjected to the original environment, in every case reverse evolution was observed. In two cases, the reversal was almost complete after only 10 generations; others required 50 generations. In some cases the amount of reverse evolution was small.

 

At every instant of a species' evolutionary history, the most vulnerable genes are the most recently acquired ones. This concept will be returned to in later chapters.

 

Culture can be thought of as a collection of "culturgens" or "memes" ‑ similar to a genome being comprised of a collection of genes (Lumsden and Wilson, 1981).  Although some similarities exist between genetic and cultural evolution, the differences are striking. This topic will also be dealt with in a future chapter (Chapter 16), as a unifying theory for understanding the rise and fall of civilizations.

 

Mutational Load

 

Although the idea of "mutational load" was described by Kondrashov (1988), we owe H. G. Wells its first brief expression (ironically, in the same journal, almost 100 years earlier). In 1895 Wells wrote: "Has anything arisen to show ... that where the life and breeding of every individual of a species is about equally secure, a degenerative process must not inevitably supervene?" (Wells, 1895).

 

Primitive people today produce about 7 offspring per woman. Allowing for slightly shorter life spans in past times, about 6 offspring per woman was normal. On average, only 2 survived to adulthood. Is it possible that some of the 4 who died were genetically inferior? Yes, of course.

 

Approximately half of all conceptions fail to produce a live birth. It is speculated that the half that die are genetically defective due to some incompatibility between the paternal and maternal alleles. It is a small step to suggest that there will be a residue of live births that are also destined to fail to survive childhood due to genetic defects. If this is true, then what would be the genetic consequences of intervening medically to sustain all live births through childhood and into adulthood?

 

If some of the two-thirds of live births that formerly died in childhood (a fraction derived from the ratio of childhood mortality rates in primitive and modern societies) died because of genetic defects, and if all live births now live a full and reproductive life, then the genetic defects they carry will surely be contributed to the gene pool in larger numbers than would have occurred in the ancestral environment (AE). Our gene pool must inevitably accumulate these defective genes at a higher rate than in the past. This phenomenon is called "genetic load" (Kondrashov, 1988).
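
The following toy simulation (a minimal sketch of my own, not the author's or Kondrashov's model) contrasts the two regimes described above. Each generation every individual gains a Poisson-distributed number of mildly deleterious mutations; in the "ancestral" regime only the least burdened third of each generation reproduces, while in the "modern" regime everyone does. The population size, mutation rate, and survival fraction are arbitrary assumptions.

import numpy as np

rng = np.random.default_rng(1)

def mean_load(generations=200, pop_size=500, new_mutations=0.5, survival_fraction=1/3):
    """Mean number of deleterious mutations per individual after the given
    number of generations, in a deliberately crude haploid cartoon."""
    load = np.zeros(pop_size)                           # mutations carried per individual
    for _ in range(generations):
        load += rng.poisson(new_mutations, pop_size)    # new mutations this generation
        if survival_fraction < 1:
            keep = int(pop_size * survival_fraction)
            survivors = load[np.argsort(load)[:keep]]   # least-burdened individuals survive
        else:
            survivors = load                            # everyone survives to reproduce
        load = rng.choice(survivors, size=pop_size, replace=True)  # survivors repopulate
    return load.mean()

print(f"Ancestral regime (1/3 survive): mean load = {mean_load(survival_fraction=1/3):.1f}")
print(f"Modern regime (all survive):    mean load = {mean_load(survival_fraction=1.0):.1f}")

In the regime without childhood selection the mean mutation count simply climbs generation after generation, which is the quantitative content of the "genetic load" worry; how closely such a cartoon resembles real human genetics is, of course, a separate question.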

 

It may be impossible for a species to average only one offspring per adult for a long time. With no excess of births, the downward pulling force of “genetic load” would degrade the gene pool of the species. Therefore, the survival ratio must be kept well below one if humanity is to maintain a healthy genetic future! We who survive without serious genetic defects should be grateful to those less fortunate, whose deaths in the past made us possible.

 

I feel sorry for the bent masses of future people, for they will suffer from cruel disabilities that were traditionally weeded out by the neglect of less benign times in the AE. Humanity reaps what it sows, and it is sowing the wrong genes ever more often and preserving defective offspring with an excess of unthinking compassion.

 

Compassion can be a double‑edged sword. What seems laudable for one generation may in fact create unlaudable consequences for many future generations. I shall return to this moral dilemma in Chapter 11.


CHAPTER 5

                              

GENETICS TUTORIAL ‑ PART III

 

Adapting to Novel Environments

 

I want to distinguish between "outlaw genes" and those that are innocent by virtue of a changed environment. The genes that reward eating sweets are usually mal-adaptive in today's setting, but in the ancestral environment (AE) they were adaptive. If we moderns lived in the AE we would categorize the sweet-tooth genes as helpful to individual welfare as well as genetic welfare. This concept can be illustrated using something well known in the remote sensing field: statistical retrieval theory. It is described in Appendix C; a very brief description is given here.

Figure 5.01 The filled square region contains environments, physical and social, that have been encountered in the past by our ancestors, and for which we are adapted.  The open square symbols represent modern world environments which humans are experiencing for the first time, and for which we may not be pre‑adapted. This is a mere 2‑D representation of a many‑dimensioned world environment.

 

The basic concept is that evolution creates organisms that are relatively well-adapted to their AE, and when the environment abruptly changes to something that the species has never experienced the individuals are likely to be mal-adapted in ways that cannot be easily imagined. In theory the individuals could be better off in the new environment, but it is more likely that they will be worse off.

 

In the above figure, think of the region in which the solid square symbols are found as corresponding to one climate regime, such as a jungle environment, and think of the open squares outside that region as representing the environmental conditions for another climate regime, such as the glacier's edge. When our human ancestors migrated northward from Africa into Europe they were moving from one environment to another, and some of their genes had to "adapt" (be replaced). One of the genes that adapted controlled skin pigmentation. Others influenced nose length, eye color, hairiness, stature, and so on. After the adaptations were essentially accomplished, the new set of genes achieved a better performance in the new reality space. It could be said that the new race of people was comfortably within the environmental regime indicated by the dotted circle and solid squares in the above figure.
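
Figure 5.01's idea can be put in toy numerical form (my own sketch, not the statistical retrieval formalism of Appendix C). Represent each environment as a point in a multi-dimensional space, characterize the ancestral environments by their mean and covariance, and flag a new environment as "novel" when it lies far from that cloud; the two dimensions and all numbers below are invented for illustration.

import numpy as np

rng = np.random.default_rng(2)

# Pretend ancestral environments are described by two coordinates,
# e.g. mean temperature (deg C) and relative humidity (%).
ancestral = rng.normal(loc=[25.0, 60.0], scale=[3.0, 10.0], size=(200, 2))
center = ancestral.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(ancestral, rowvar=False))

def novelty(environment):
    """Mahalanobis distance from the ancestral cloud; large values mean the
    setting lies outside the region our ancestors were selected in."""
    d = np.asarray(environment, dtype=float) - center
    return float(np.sqrt(d @ cov_inv @ d))

print(f"Jungle-like setting  (26 C, 65%): distance = {novelty([26, 65]):.1f}")
print(f"Glacier-edge setting (-5 C, 30%): distance = {novelty([-5, 30]):.1f}")

Points with small distances correspond to the solid squares of the figure; the glacier-edge point corresponds to an open square, an environment for which no pre-adaptation can be assumed.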

 

Adapting to Changeable Environments

 

There is also the matter of having to adapt to a climate that has greater annual variations. At mid‑ and high‑latitudes the seasons are more pronounced than in the tropics; a tribe of people who must endure climate extremes during the course of a year will have to adapt to a wider range of conditions than a tribe that lives in the tropics. Cro-Magnon man, who evolved adaptations for the mid‑latitude climate extremes in Europe, must have made the compromises that rendered him less than perfectly adapted to hot summers and cold winters, yet able to survive in both. A region that is subject to periodic climate changes, which occur faster than the time needed for the gene pool to evolve a new adaptation, is in effect a region with a bigger reality space to which the genes must adapt.

 

For another example, El Nino weather patterns repeat every 4 to 7 years, creating shifting amounts of rain, temperature and other seasonal properties in some mid-latitude regions. It is unclear how long El Nino/La Nina cycles have been occurring, but this is a convenient example for illustrating how our ancestors who had left the jungle may have had to deal with wide ranges of reality space. When the genes adapt to climates that shift back and forth on timescales shorter than evolution can track, the adaptation will have to cover the entire range of climates and fauna. The larger the range of settings, however, the greater the penalties paid within each individual setting.

 

Tolerating Diversity as a Solution

 

One way the genes may have solved this problem is to "tolerate diversity." In any diverse population some individuals are likely to be pre-adapted to never-before encountered environments. This is a "group selection" argument. Populations that are relatively isolated compete without coming into contact, by merely surviving or perishing when environmental conditions change wildly. We can speculate that the populations that survived will be the ones that tolerated diversity, since some of their members were pre-adapted for future conditions. Such populations would be especially pre-adapted for climate changes that had never occurred in the past. The drastic climate fluctuations that occurred during the transition from the Pleistocene to the Holocene (18,000 to 10,000 years ago) would have been largely unprecedented (a similar period of climate change occurred about 120,000 years ago). Thus, the introduction to the Holocene may have favored those tribes that were inherently more tolerant of diversity.

 

Evolution of Behavior Repertoires

 

Most environmental changes are repetitive, such as the El Nino/La Nina cycles. Environmental changes that recur at intervals of less than 10,000 years, for example, are candidates for another genetic solution, described next.

 

When an environment changes wildly, a person may take a reading of present conditions, which could include climate, population density, food scarcity, or social setting, and then change behavior in response to the perceived change of setting. Humans have a larger repertoire of behavioral responses to situations than any other animal! Ancestors who encountered a variety of distinctly different environments may have unknowingly prepared their descendants for a faster "within a lifetime" adaptation to any of these environments, compared with the descendants of people who never encountered the same range of environments. This is asking a lot from natural selection, for we are assuming the creation of individuals who are pre-adapted in a very sophisticated way to environmental change. These people are capable of instinctively responding to a specific environmental change by changing their behaviors in a specific manner that is adaptive. Is it asking too much to invoke the evolution of this capability?

 

In essence, we're asking if natural selection can evolve a human brain that has circuits that do the following:  "IF (this setting) THEN (employ that behavior or lifestyle)." These circuits are analogous to the human immune system's large repertoire for doing "IF (this pathogen) THEN (employ that immune response)," as pointed out by Gazzaniga (1997). We know that the human immune system is immense, so the evolution of the capability is apparently possible. Its evolution may have been forced by the coming together of tribes to form large settlements, and eventually urban centers. Some diseases flourish when population density is high, or when the population size is large. These new diseases would reward people with more capable immune systems.

 

I am suggesting that humans today are prepared to read their setting and shift their behaviors, and even their group's lifestyle, in a way that is adaptive. An extreme example would be a tribe that is sedentary when the environment produces abundant supplies of food, but switches to a hunter/gatherer mode when the environment is less bountiful. When the switch occurs, requiring a new lifestyle, many things related to behavior might have to change ‑ such as marriage customs, property ownership, status hierarchies, etc. The genes would simply code for a switch‑over in many behaviors in response to a new perceived setting.
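
Written as ordinary branching code, the idea looks something like the sketch below (my own illustration; the settings, thresholds and behavior bundles are invented, and real "IF/THEN circuits" would of course not be a lookup this crude).

def choose_lifestyle(food_abundance, population_density):
    """Map a crude 'reading' of the environment (values between 0 and 1)
    onto a bundle of behaviors, in the spirit of IF (setting) THEN (lifestyle)."""
    if food_abundance > 0.7:
        # Bountiful setting: settle down, accumulate and defend property.
        return {"mode": "sedentary village",
                "property": "owned and inherited",
                "status": "hierarchy based on stored wealth"}
    elif population_density > 0.5:
        # Crowded but lean setting: split into smaller foraging bands.
        return {"mode": "hunter-gatherer, small bands",
                "property": "portable possessions only",
                "status": "hierarchy based on hunting prowess"}
    else:
        # Sparse and lean setting: wide-ranging nomadism.
        return {"mode": "nomadic hunter-gatherer",
                "property": "portable possessions only",
                "status": "hierarchy based on hunting prowess"}

print(choose_lifestyle(food_abundance=0.9, population_density=0.2)["mode"])
print(choose_lifestyle(food_abundance=0.3, population_density=0.8)["mode"])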

 

Our ancestors probably encountered many environmental changes, especially during the Holocene, presenting many opportunities for the genes to develop a reliance on lifestyle mode changes based on a perception of "conditions." The adaptation to variable environments would simultaneously have rendered us physically adapted to no one environment in particular, giving us the appearance of inferiority to other animals, which are well adapted to a narrower range of environments. The human brain, on the other hand, has become capable of switching between a large repertoire of behaviors, and when a mode switch is made correctly, the new lifestyle can be well adapted to the new environment. These factors are ideal for the creation of "culture" - which allows for quick behavioral "adaptations" to environmental changes.

 

How lucky for humans if a fluctuating environment produced mental abilities for adjusting behavior that were later made available for challenges unrelated to the environment. It must be common for a mental "tool" to be created in response to one challenge and only later become useful for other tasks.

 

Risks of Behavior Repertoires

 

How unlucky for humans if this same capability for achieving adaptive changes in lifestyle by "taking readings" of one's setting could render civilizations vulnerable to "opportunist" individuals. This speculation is dealt with in Chapters 11, 14 and 16.

 

The mismatch between the modern brain, evolved for an ancestral environment, and the modern world, recently shaped by Man himself, is treated (but not from a rigorous sociobiological perspective) in the book New World, New Mind: Moving Toward Conscious Evolution, by Ornstein and Ehrlich (1989).

 

Later chapters will come back to this point, so for now just remember that the modern world is a man‑made environment with very little of the ancestral environment to provide assurance that our living in it will appear to be adaptive, or even stable.

 


CHAPTER 6

                              

EVOLUTION CONCEPTS AND HUMANS

 

"In a very real sense human beings are machines constructed by the nucleic acids to arrange for the efficient replication of more nucleic acids. ... We are, in a way, temporary ambulatory repositories for our nucleic acids."  Carl Sagan, The Cosmic Connection, Garden City, NY: Anchor Press, 1973.

 

Having described some basic tenets of genetics in the previous two chapters, we are now ready to undertake theoretical understandings of human behavior. It is important to keep in mind throughout the rest of this book that all behavior is the outcome of a competition among gene alleles for representation at specific locations on chromosomes.  Thus, the "macro‑behaviors" at the individual level, which will be the subject of the rest of this book, are the result of "micro‑motives" at the gene level.

 

GEP

 

An individual's "phenotype" is "the way it is" ‑ its anatomy, physiology and behavior. An individual's "genotype" (inherited genes) interacts with environment to produce the "phenotype." Thus, a person's phenotype is who the person has become, as opposed to who they might have become had their environment been different. This powerful concept (Symons, 1979) can be referred to by the equation: G + E = P, or GEP.

 

Anatomy includes, for example, stature, the height a person achieves in adulthood. In a society where food is plentiful, as for most people in the United States, stature is determined almost entirely by the genes. But in a society where food is scarce for some people and plentiful for others, stature is determined most strongly by food availability (the environment, unless the genes determine access to food).

 

Consider the case of Japanese stature before and after World War II. Children after the war grew taller than their parents. Between the generations the disparity in the availability of food was large, and this diet difference was large "in relation to" the genetic variation between individuals. Stature differences were determined almost entirely by environment. This example illustrates that if the variation of environment is large, environment can be the dominant cause of phenotype variation, whereas if the variation of environment is small, making the variation of genotype relatively more important, genotype can be the dominant cause of phenotype variation.
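
The stature example can be restated in the standard quantitative-genetics form (my summary, assuming genotype and environment contribute independently, with no interaction or covariance between them):

\[
\operatorname{Var}(P) = \operatorname{Var}(G) + \operatorname{Var}(E),
\qquad
H^{2} = \frac{\operatorname{Var}(G)}{\operatorname{Var}(G) + \operatorname{Var}(E)} .
\]

When the environmental variance dwarfs the genetic variance, as with the across-generation difference in Japanese diet, the heritability H^2 is small and environment dominates the observed phenotype variation; when the environmental variance is small, as with stature in a well-fed population, H^2 approaches one and genotype dominates.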

 

Physiology includes, for example, immune response. The genes create an immune system that includes a repertoire of responses to specific pathogen stimulations. A virus will elicit an immune response that is appropriate if our ancestors were survivors of the same (or similarly‑shaped) virus. The repertoire is limited by ancestor experience, so an individual is likely to be vulnerable to new viruses with novel shapes.

 

If the environment harbors the same viruses and bacteria that our ancestors survived, and if we are considering a stable population with no immigrants from other regions, then essentially all people who get sick will recover, and it will not be apparent that genotype is affecting phenotype (vulnerability to disease). But if the population includes immigrants from distant places, where there has been a different virus exposure history, there may be dramatic differences in who recovers and who dies from local sicknesses. The immigrants will be at a disadvantage when infected by local viruses and the native population will be vulnerable to any viruses brought by the immigrants. For dramatic illustration, old world explorers came to the new world and brought diseases that killed most people in the new world (Diamond, 1996).

 

The "behavior" component of phenotype is the most interesting, and the most challenging to understand. The "immune system's response repertoire" is a useful starting analogy for understanding behavior (Gazzaniga, 1992; Gazzaniga, 1997; Jerne, 1967). A "stimulus" in the environment can produce a "behavioral response," as when an abrupt approach of something to the face produces an eye blink. We blink our eye because we have ancestors who survived more successfully than those who didn't have the eye blink response.

 

A reductionist will want to employ the "stimulus/response" (S/R) explanation for behavior as much as possible. With effort, this approach to understanding behavior succeeds more often than is conventionally acknowledged. Since it is the simplest possible type of explanation, it should be invoked as a first hypothesis.

 

S/R fails to account for behaviors that are self-initiated, i.e., motivated behaviors. For example, this morning I decided to hike in the mountains in the afternoon. Planning a day's activities, as with life goals, requires something that in humans is identified as "prefrontal" cerebral activity, and a general sub-cortical "drive" mediated by a "reticular activating system" (as described in Chapters 7 and 8). The prefrontal cortex initiates broad goals, such as a career path; behavioral programs, such as preparing a speech; and specific behaviors, such as talking.

 

I will argue that the "shape" of behavioral programs, and the "shape" of life paths, are initiated by brain circuits that we inherit. The specific behavioral programs, and specific life paths, are the product of an interaction between inherited brain circuits and the environment.

 

Not only is an individual's environment a changing thing (career opportunities, available books, current beliefs, etc), but the human environment can change dramatically from one generation to the next, and especially from one millennium to another. It is very likely that everyone was capable at birth of adapting to a hunter‑gatherer lifestyle, as was common in the AE (ancestral environment).

 

A human trait that seems to be common, such as "greed," may be expressed in only specific environments. A person born into a tribal hunter‑gatherer setting, where there are few possessions (because it's hard to carry things from camp to camp) may grow up without expressing greed. The same person growing up in an agrarian society might be greedy. In other words, the person's "genotype" will interact with "environment" to produce a lesser or greater amount of "greed" (phenotype), which illustrates G + E = P.

 

There are limits to an environment's influence. This is easiest to illustrate using dramatically different genotypes, such as individuals of different species. We cannot make a snake behave like a cat just by cuddling it while it grows up. The snake is limited in what it can become. No amount of environmental adjustment will ever make a snake cat-like, because snake genes do not have cat behavior in their repertoire. Instead of Gsnake + Ecat = Pcat we are limited to Gsnake + Ecat = Psnake. Although snakes and cats are different species, the GEP equation still holds. Illustrating the concept with such different creatures makes the point easier to understand.

 

In any population there is a variation of genetic predispositions, or genotype. Thus, given a fixed environment, there will still be a variation in phenotypes, and in this case it will be due entirely to the variation of genotypes. Where there is wide variation in the environment, even a uniform genotype will produce a variation of phenotype, and such variation will be due entirely to environment. The normal situation, of course, is for variations in both genotype and environment, which obscures the sources of observed phenotype variation.

 

Every population must have unfortunate cases of bad genotype coupled with bad environment. Whereas either one might produce a bad adult, together they could produce a really bad adult. (Do you think Attila the Hun may have been the product of bad genes and a bad environment? Was he maladapted from the perspective of his genes?)

 

IF/THEN Brain Circuits

 

It is reasonable to assume that each person inherits genes that pre‑wire their brain to recognize situations that elicit appropriate behaviors (S/R) for situations that our ancestors repeatedly encountered and survived. Like any computer program with many IF/THEN sections of code, some of the IF/THEN code will not be used during normal experience. Indeed, most IF/THEN code that exists may only be used in response to rare experiences, especially so for humans, who have an unusually large repertoire of conditional behaviors and personality development paths.

 

Once a specific piece of IF/THEN code comes into existence, in response to a sustained period of selective pressure in which a recognizable situation occurs and responses have reliable benefits or costs, this piece of code can remain in the genome almost forever. If it later comes into conflict with a similar situation requiring different responses, then this old code will be modified or may disappear. Since the code that elicits a behavior is almost always produced by a combination of genes, if any of these genes are modified in response to other adaptive pressures the original code could inadvertently be modified. If this occurs, the gene pool would have to be exposed to the original adaptive pressures again to restore the original, or equivalent, IF/THEN code.

 

If it someday becomes possible to list the IF/THEN circuits in a typical human brain, we may wonder when and where each evolved. If this were possible it would probably turn out that most code sections were created during specific eras in our ancestry, with few (or no) recurrences. Each of our innate distastes, for example, may owe its existence to a time when we lived among a specific inedible plant. The plant might have existed for only a few centuries of the past 2.5 million years of the human past, yet its IF/THEN legacy stays with us.

 

Many of our ancestors were nomads, living in wandering tribes. Behaviors required by nomadic tribal life are part of our repertoire, and if a "modern" non-nomad were raised in a nomadic setting he might develop the same nomadic functionality as present-day nomads. Settled farmers, living as single family units, are also part of our ancestry, probably confined to parts of the past 10,000 years. Each of us probably has the code necessary to grow up into a fully functional, single-family farmer. Large settlement living must have been a part of other periods of our ancestry, confined perhaps to the past 5,000 years. Each of us is presumably capable of becoming a functional urban dweller. Our large brains are "ready" for many lives that cannot all be lived!

 

Men Bear More of Evolution's Burden

 

Paternity success, as measured by offspring per male, exhibits a wider range than maternity success (offspring per female). Every tribe will have some males who don't reproduce, whereas it is rare to find women who are childless. A sexually mature woman, after weaning an offspring, is likely to become pregnant soon after her menstrual cycles return.

 

As with any species, whenever a dominant male controls access of other males to females, there will be a large disparity in breeding success among males. Harems were common in human history, and presumably pre‑history as well. Even when males do not dominate other males, females are prone to prefer to breed with specific males.

 

Whereas women typically give birth to about 6 or 7 babies during their lifetime, with little difference between women, men may sire from zero to hundreds of babies! Why is there such a disparity?

 

The human ancestral environment is presumed to have been exclusively tribal. The men of most tribes, it is thought, engaged in hunting expeditions. It was also probable that they engaged in brief raids of neighboring tribes, as well as more dangerous inter‑tribal warfare. Such male activities entail an extra burden of mortality. A man could die not only from combat, but from a mismatch of anatomy or physiology to climate. A man could also die by formulating less successful strategies in warfare, or by not adhering to a planned strategy requiring careful social coordination.

 

An extreme view of this situation is to state that the purpose for men is to go out and be measured. Those who come back, and especially those who come back as heroes, will have survived the measurement test, and the women shall deem them more valuable as potential fathers for their children. All women will prefer to mate with heroes than with the others. (It goes without saying that they won't mate with those who died in the process of being measured.)

 

Any man who refused to "play this game," to go out and be measured, could expect to be shunned by women ‑ as well as by men. Such a man may even have been banished from the tribe (leading eventually to death). It has only been during the present interglacial (during the past 12,000 years) that alternative niches proliferated for the less adventurous man. Although there might have been a small, exempt class of weapon makers, most men could not have escaped the high‑mortality life style.

 

Consider the first people who migrated from Africa to the mid‑latitudes. Dark‑skinned men would have fared less well, all other things being equal, than slightly lighter‑skinned men, as the dark‑skinned men would not synthesize as much Vitamin D. Vitamin D deficient men would be at higher risk of succumbing to the physical demands of traveling, hunting and warfare. On average, the lighter‑skinned men would be more likely to return from exploits. The women, who stayed home, would be less affected by lowered Vitamin D, so their mortality would have been less affected than their male counterparts. After no more than 200 to 400 generations (based on the New World immigration experience), the entire group's skin would have evolved to a new, more adaptive color. This process would have been achieved by a differential survival of men, combined with a differential breeding success of men. Thus, the burden of adapting would have been borne more heavily by men.

 

Takeover Infanticidal Males

 

After overpowering the resident males of a pride, incoming male lions kill the females' young cubs. Not only does this remove future competitors for the newcomers' own offspring, but the mother soon stops nursing, becomes fertile, and is available for mating with the killer males. We humans might expect a female lion to be distressed at seeing her cubs killed by the new males, but amazingly, she quickly makes the best of a bad situation by becoming coquettish with the killers. By these actions the female increases the prevalence of the very genes that thwarted her initial reproductive investment; by favoring this behavior, which humans find so repugnant, the lioness helps shape the male genotype.

 

Infanticide by males has been documented for species of birds, fish, insects and mammals ‑ such as rodents, carnivores and primates (Wrangham and Peterson, 1996). The following description is for primates.

 

"Hrdy noticed an invading male charge after a mother, attempting to snatch away her baby. For several days, the other females in the group tried to defend the mother and her baby. But the male persevered, and finally managed to deliver a slash to the infant's stomach that left the intestines exposed. Taking the wounded infant to her breast, the mother looked up at the sky, as though in despair. 'It was the only time in my professional career that I wept.' //  Because females are usually outmatched in the physical war between the sexes, they are helpless to protect their offspring against an infanticidal male. //  Female gorillas respond to infanticide ... they leave the father who allowed their baby to be killed and run off with the murderous male. Infanticide, along with various female defenses, has been seen in 13 primate species. (Angier, 1983.)

 

We humans find such behaviors repugnant, so surely men do not act this way. Alas, they do! Studies of infanticide in Canada (Daly and Wilson, 1988a) and the United States (Daly and Wilson, 1988b) reveal that step‑fathers are 75 and 100 times more likely to commit infanticide than are biological fathers.

 

In any species where males "take over" a female by overpowering the resident male and killing the offspring, females should evolve an aversion to males who cannot protect them and their offspring from takeover males. Thus, women should find weak and low-status males unattractive relative to strong and high-status males. This should be especially true for attractive women, who are more likely targets for takeover males. These predictions are borne out by "common knowledge."

 

Monogamy and Cuckolding

 

Monogamy, and the associated female faithfulness which monogamous husbands require, give every man a more or less equal influence on the next generation's genetic pool. This must retard the potential speed of evolutionary adaptation of that gene pool. Thus, the stronger the forces of evolution, the greater the reward for polygamy.

 

It would be surprising if the genes have remained blind to this. Women, for example, should sense when evolutionary forces are strong, and in response, they should seek consort with men who are "successful." If monogamy were the norm (which would have been more likely only during the past 12,000 years), then women should be expected to try to cuckold their husband (secretly mate with a man who is not the husband) in order to bear children carrying the more "adapted" man's genes. Since monogamy was probably rare before 12,000 years ago, in the human ancestral environment the need to cuckold was also probably unimportant before that time. Cuckolding, I suggest, is therefore a "recent" tool in women's behavioral repertoire.

 

Blood tests of Canadian and American families reveal a cuckoldry rate that ranges from 15% to 25% (see Christenfeld and Hill, 1995 for additional material). Presumably, cuckoldry rate varies with time and conditions in accordance with some optimizing algorithm created during the AE.

 

Knowing the optimal time for a wife to cuckold her husband would have evolved during the time that societies became monogamous. Refinements in a woman's cuckolding wisdom would have improved the most when evolutionary forces were greatest. If a gene pool underwent a period of polygamous evolution, the previously gained cuckolding wisdom would have remained "ready" but not expressed until monogamy was restored. This is similar to the way an immune system accumulates a repertoire of immune responses, each specific one of which remains "ready" for expression when exposed to a pathogen that is "recognized."

 

Recent studies (Hazelton, 2006) show that women have a heightened interest in cuckolding their husbands when they are most fertile. This logical female strategy is matched by an equally logical male strategy of exhibiting a higher level of mate-guarding at the same time. The real challenge for women is to recognize when it is appropriate to cuckold, and with whom. As for “when,” being in a monogamous relation is one precondition.  Sensing that "times are tough" would be another (i.e., evolutionary forces are strong).

 

As for “with whom” a woman must be capable of measuring her husband against other men. One measure of the successful man is "fashion"; the man who is sought by other women is likely to produce boys who will grow up to also be sought by the next generation of women ‑ regardless of the intrinsic worth of the type. Another way to identify a good candidate is to determine who is dominant over whom. Men live much of their life within a male society, and the men who are most successful in male activities, such as the cooperative hunt, will be accorded privileged positions by other men. Women are sure to notice how men sort themselves while establishing the male hierarchy, and those who are esteemed by other men are good candidates for a cuckolding episode. Women who are fascinated by men’s sporting events might be “doing their homework” for optimizing their future cuckolding.

 

The main effect of this increased attention by women to male worth is to increase the imbalance of reproductive activity among men; fewer men will account for a greater fraction of a generation's paternity. A secondary effect of this enhanced reward for whatever the forces of evolution deem important is to reduce genetic diversity. In a one harem society all offspring will resemble the harem master. If he is vulnerable to a specific disease, most children of the next generation will be similarly vulnerable.

 

Thus, there are risks to tribal organizations that give excessive reproductive rewards to small numbers of men. It is in a woman's genetic interest to not succumb totally to fashion; but it is always in a man's interests to be the most successful man and to dominate male reproductive activity.

 

Women understand that husbands should be loyal "producers" even though they should favor other men for cuckolding. There are "husband material" men, and then there are "exciting affair" men. Women are attracted to both types, but in different ways. A faithful husband type is attractive at the time a commitment is to be made, and the exciting affair type is less attractive at this time. Some time after marriage, however, women's interest in "affair men" should increase. Thus, women regard some men as good on long timescales, serving as loyal husbands and fathers, whereas some men are good for short timescales, serving as cuckold consorts to provide offspring with genes specialized in victimizing the next generation.

 

Men likewise automatically categorize women as good for the long term, serving as loyal wives and mothers, while other women are good for short‑term consort. Men and women must automatically categorize each other as belonging to one or the other category.

 

Men and Women Shape Each Other

 

Men and women have made each other what they are!

 

Men have a greater variance in IQ, and we men also exhibit higher incidences of genetic deficiencies. For example, dyslexia (reading and writing problems) is most common among boys. This may merely be an effect of males exhibiting a greater genetic variance of recently‑evolved traits (i.e., men dominate both ends of the spectrum of most measures). Thus, there are more men geniuses, as well as more men among the learning disabled. Men are burdened by "high risk" mutation experiments that eventually benefit the larger population. Men appear to be more "expendable" than women.

 

To what extent are women responsible for making men genetically "fragile"? Women prefer men who "go out to be measured," and who come back with good measures. When women cuckold their husbands, they assure their male offspring a greater likelihood of being a cuckolding partner in the next generation. These women also assure that they will produce daughters who are prone to cuckolding their husbands. This occurs because the cuckolding males are likely to carry genes which predispose their girl offspring to cuckold when they are women, since they are likely to have been the result of women who cuckolded (this is a subtle argument).

 

Women shape men with every preference they express. If women favor men who are "travelers" (i.e., vagabonds, minstrels, pirates), then each succeeding generation of men will tend to resemble travelers.

 

Why would women be attracted to travelers? When diseases are a principal cause of mortality, the traveling men that women encounter are the ones whose immune systems carry the best immunity to diseases beyond the village. This may account for girls going crazy for pirates, traveling musicians, and other itinerant roustabouts who have no long-term parenting value.

 

Birth Order

 

Frank Sulloway (1996) has presented an immensely well documented case for the influence of family birth order on specific personality traits. The theoretical argument for such an influence begins with the fact that in the ancestral environment children often died before reaching adulthood (approximately 2/3 of children perished). Surviving childhood requires that the child adopt strategies for maximizing parental investment. By this logic, firstborns should ingratiate themselves to their parents, and gain their favor by appearing to be good prospects for their investment. Firstborns should be obedient, conscientious, hard‑working, and they should internalize the values of their parents.

 

Laterborns, noticing that there already exist firstborns who have acquired parental confidence by becoming what the parents want, must create for themselves a different identity. They must distinguish themselves from their older sibling by excelling in another endeavor, for if they tried to compete in the same arena, and became equally successful at comparable age, they would be destined to always be a worse investment prospect due to their age disadvantage. All other things being equal, older children are a better investment option because they've already survived more of the childhood risks, and they are closer to childbearing age. Thus, laterborns try to excel in things untried by the firstborn, and perhaps unfamiliar to the parents. Laterborns are more open to new experiences, and are more adventurous. As stated by Sulloway (pg 98), "...the addition that each child makes to the parents' inclusive fitness will tend to be proportional to the development of skills not already represented among other family members."

 

Firstborn boys followed by a laterborn girl are a congenial combination, since the girl is naturally inclined to have different interests than the boy. From the parents’ perspective, both children are like first-borns and represent good investments.

 

Firstborn boys feel threatened by a laterborn boy. They are likely to fight, and the younger brother must become proficient with wit, words or some other clever strategy to compensate for his smaller body. The older boy will become accustomed to dominating his younger brother, whereas the younger brother will become adept in the use of social skills for minimizing the disadvantages of being dominated by the older brother. These effects appear to be maximum when the age difference is about 3 years (close enough in age to be competing for similar age‑related niches, yet different enough that the younger is weaker and can be successfully dominated). Sulloway writes (pg 79) "Like the alpha males of primate societies, firstborns covet status and power. They specialize in strategies designed to subordinate rivals."

 

To the extent that Sulloway's birth order correlations are correct, women should prefer men who happened to be firstborns with younger brothers. They are more likely to be "masculine" and capable of protecting their wives from "takeover" males. On the other hand, these men are less likely to tolerate cuckolding wives, rendering cuckoldry a more dangerous option for the wife of a firstborn husband.

 

How confusing (at a subconscious level) this birth order "environmental monkey wrench" must be to women! Men who appear to be strong and domineering may be so merely because they had a brother 3 years younger. Such men can probably be counted on to protect a wife and her children from takeover males, but they will not necessarily provide "domineering" and "high status" genes to her offspring. Do women have methods for distinguishing "genotype-produced" from "birth order-produced" dominant men? We await further studies in this young field.

 

Sexually Specific Morality

 

Women complain that men can philander with less consequence than women who cuckold. They attribute this disparity to the fact that men can get away with dominating women due to their greater physical strength.

 

The real reason for this duality of morality has to do with the difference in natural consequences for out‑of‑marriage mating. A philandering husband does not necessarily diminish his "paternal investment" value for his wife's children, whereas a cuckolding wife who produces illegitimate offspring necessarily does cause her husband's "paternal investment" to be squandered.

 

If it Feels Good, Beware!

 

What is the purpose of emotions? They are meant to influence behavior!

 

In the particular case of humans, emotions are meant to influence behavior in situations where rational thought is also likely to subvert the genetic agenda. Some behaviors are too important to be meddled with by rationality. The Australian redback spider, in which the male is prone to allow itself to be eaten by the female during copulation, is incapable of rational thought. A simple, automatic instinct suffices to assure that his gene‑serving deed be done. But what about humans?

 

Humans think, and are theoretically subject to influence by rational thought. The genes, in their infinite wisdom, have created emotions to safeguard behaviors that serve their interests. Emotions are employed to protect behaviors that are threatened by rational considerations of individual welfare!

 

Emotions symbolize the conflict between "outlaw genes" and a thinking, rational individual. Therefore, if something has a feel-good emotional payoff, beware!

 

This theory implies that only intelligent creatures have emotions. It suggests that emotions were "invented" by the genes as a quick solution to a fast-evolving human intellect. The genes could not know what rational threats lay ahead, whether from untried environments or from newly endowed rational brains, yet they "learned" from past experience that certain actions were at risk of being overturned by individuals who cared more for their own welfare than for being an obedient tool of the genes. (Of course the genes didn't "know" anything; it merely happened that genes which safeguarded important behaviors from the influence of other gene mutations prospered.)

 

We will return to this subject in a later chapter.

 

Consciousness

 

It is difficult, at this point in my argument, to avoid the problem of "consciousness."  It is tempting to speculate that C, as consciousness gurus refer to it, was invented by the genes to mediate conflicts between an old instinctual brain and a new rational one. To the extent that C grew in power, emotions must also have grown in strength.

 

A common sense theory for C is that it “exists” whenever a novel situation demands that brain modules compete for control of understanding and behaving. C almost certainly is generated by the prefrontal cortex, possibly on the left side. Measurements of brain activity show that this area is active when a novel task is being confronted, whereas tasks that have been mastered during previous encounters are not associated with the same level of activation. These cortical activity measurements may have been detecting something produced by C.

 

Are humans more conscious than chimpanzees? There is growing evidence that chimpanzees "think" ‑ in the way that people commonly think of thinking (Goodall, 1986, and Wrangham and Peterson, 1996). Chimpanzees appear to have something called “theory of mind,” or knowing what other chimpanzees are likely to know, and this also would imply that they have C, at some level.

 

It is important that thinkers with a sociobiological approach address the consciousness problem, as most of the C literature is devoid of an appreciation that 1) genes construct brains, and 2) genes exist because they're good at surviving. Anyone else who tries to investigate C is handicapped at the outset.

 

Books that treat consciousness with an adequate respect for the reductionist paradigm include Consciousness Explained (Dennett, 1991), The Illusion of Conscious Will (Wegner, 2002) and The Quest for Consciousness (Koch, 2004).

 

We will return to this issue in the next two chapters, which deal with neuropsychology and evolution.


 

CHAPTER 7

 

BRAIN ANATOMY AND FUNCTION

 

“…intellect arose merely to serve the will [genes]. Most men … are incapable of any other employment of their intellect, because with them it is merely a tool in service of their will and is entirely consumed by this service…” Schopenhauer, Aphorisms (1851).

 

"Men think themselves free because they are conscious of their volitions and desires, but are ignorant of the causes by which they are led to wish and desire." Spinoza, Ethics (1677).

 

The brain is an organ meant to help genes survive, and in this respect it is no different from the heart, liver, and reproductive organs. A thinking brain may not like this assessment, and it may prefer to view the body and its organs, as well as the genes, as existing to serve the brain. But modern science, spearheaded by sociobiological insights, is once again forcing Mankind to move further down from his pedestal by discrediting another cherished belief. This chapter will describe brain anatomy and function. The next chapter will address their evolution.

 

Part of my intent for this chapter is to remove some of the "mystery" from how the brain works. I want to convey a sense that the brain functions like a "machine," and that living things are automatons, consistent with this book's reductionist approach.

 

Brain Anatomy: Vertical, Horizontal and Front/Back Layout

 

The human brain consists of a primitive hindbrain, a small mid‑brain section, and a large and complicated forebrain.

 

The hindbrain, which began its evolutionary existence ½ billion years ago, resembles the entirety of a reptile brain, and has been referred to as our "reptilian brain." The hindbrain's "stem" connects to the body; it receives information from sense receptors and issues commands to muscles and body glands via the spinal cord. The hindbrain's cerebellum stores motor commands and produces smooth movements.

 

The mid‑brain plays only a minor role in what follows, and won’t be described here.

 

The forebrain, on the other hand, is where uniquely human attributes are generated. It includes a limbic system, thalamus, basal ganglia, and two large cerebral hemispheres. The limbic system has many components; it maintains homeostasis (body temperature, heart rate, blood sugar, etc), and controls emotional state (things like hunger, anger, fear and sexual arousal). The limbic system's pea‑sized hypothalamus performs many of these functions using electrical commands, some of which activate hormone producing glands in the brain. The thalamus and basal ganglia control conscious state and initiate movement, respectively.

 

The cerebral cortex, comprising 70% of human brain volume, consists of a left and right cerebral hemisphere, with an interconnecting corpus callosum. Although the cerebral cortex is only 1/8‑inch thick, its surface area is about 1 ½ square feet, and it has evolved a folded configuration to allow the surface to fit within the human skull. Beneath the cortical surface (gray matter) lies an immense number of nerve fibers (white matter) providing connections to other parts of the cortex, the limbic system and other brain components.

 

The cortex is the most recently evolved part of the brain, and fortunately it is also the most accessible to study. The left cortex and the right cortex each consist of 4 lobes: occipital, parietal, temporal and frontal. The occipital "sees," the parietal "feels," the temporal "hears," and the frontal "thinks" and commands!  The "see/hear/feel" lobes are referred to as "posterior lobes" (since they comprise the rear half). They can be thought of as "receptive lobes" since they receive input from the body and environment. The "see/hear" lobes receive "remote sensing" information (visual and auditory input), while the "feel" lobe receives in situ information (touch, temperature, pain and body part position). The frontal lobes, comprising the front half of the brain, receive input from the posterior lobes, and they “think” about the situation, formulate action plans and issue commands to muscles.

 

Figure 7.01. Brain lobes: Frontal, Parietal, Temporal, Occipital.  View is of the left side, front is toward the left.

 

The corpus callosum (not shown) connects all four lobes of one side to the corresponding lobes on the other side. This nerve bundle is located underneath the frontal and parietal lobes, at about the same level as the temporal lobe.

 

Primary, Secondary and Tertiary Cortical Areas

 

Each of the 4 lobes, the frontal, parietal, temporal and occipital, consists of 3 cortical areas: primary, secondary and tertiary. These are shown in Fig. 7.02 using the numbers 1, 2 and 3.

 

The part of the parietal lobe bordering the frontal lobe, area P1 in Fig. 7.02, is the "sensory cortex" (or "somatic cortex"). This strip of cortex is where in situ sensory signals from the body arrive. Next to the somatic cortex P1 is an area, F1, located in the frontal lobe and called the "motor cortex" or "motor strip." The motor cortex issues commands for movement (requests, actually, since sub‑cortical regions may "veto" the requests).

 

Figure 7.02. Approximate boundaries for cortical primary (1), secondary (2) and tertiary (3) areas in each lobe.

 

There's a one‑to‑one mapping of body location to position along the sensory cortex strip, P1, and motor cortex strip, F1. Starting from the part of the strips closest to the center‑line (top of brain) and going outward, body positions are allocated in the following sequence:  leg, neck, head, arm, elbow, etc, to face, lips, teeth and tongue.

 

For the posterior lobes raw sensory information arrives at the primary cortical areas, which deliver processed versions to the secondary areas, which in turn deliver even further processed versions to the tertiary areas. The 3 posterior lobe tertiary areas border each other, and this is where the most "conceptualized" versions of perceptions are inter-compared and elaborated.

 

Figure 7.03. Flow of nerve activity when something is "felt."

 

For example, when something is "felt," nerve activity flows as depicted in the figure above. When something is "heard," nerve activity flows as depicted in the following figure.

 

Figure 7.04. Flow of nerve activity when something is "heard."

 

When something is "seen," nerve activity flows as depicted in the following figure.

 

Figure 7.05.  Flow of nerve activity when something is "seen."

 

For each posterior lobe the pattern of nerve activity is the same: primary activity leads to secondary activity, which then leads to tertiary activity. The next step is for tertiary activity in adjoining areas to “compare notes,” or interact with each other.

 

Tertiary Cortex Convergences

 

When a familiar object is recognized, a small set of tiny nerve circuits is set into "resonance." For example, when a coffee cup is seen, there's a flow of activity in the occipital lobe from primary to secondary to tertiary. When the activity reaches secondary cortex, i.e., O2, sub‑features such as handle, rim and steam become "active" at their respective locations in O2 (locations created from interaction with the environment in childhood). These interact in O3 (occipital tertiary), setting into resonance a tiny circuit corresponding to "coffee cup seen."

Figure 7.06. Nerve activity when a "coffee cup" is seen.

 

The same coffee cup can be felt. In this case the nerve activity will be as shown in the next figure.

Figure 7.07. Nerve activity when a "coffee cup" is felt.

 

The coffee cup may be heard, as it is set down on a table. In this case activity will occur in the temporal lobe, as shown in the next figure.

 

Figure 7.08.  Nerve activity when a "coffee cup" is heard, as for example being set down upon a table.

The concept "coffee cup" consists of the simultaneous activation of any, or all, of the three tiny regions in the three tertiary cortices of the posterior lobes. This is shown in the next figure.

 

Figure 7.09.  Nerve activity corresponding to "coffee cup."

 

The activity pattern corresponding to "coffee cup" depicted in Fig. 7.09 is said to be "generalized." That is, there are many specific ways a coffee cup can be perceived, and indeed there are many variations of coffee cup shape, appearance and sound, yet they all end up creating the one, generalized pattern "coffee cup."
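
For readers who like to see the idea spelled out, the following minimal sketch (written in the Python programming language) illustrates the convergence just described: a single generalized "coffee cup" node that can be triggered by any combination of the sight, touch and sound routes. The detector names, the feature lists and the any-of-three rule are my own simplifications for illustration, not claims about actual cortical wiring.

# Toy illustration of tertiary-cortex convergence: a generalized concept
# node becomes active when any of its modality-specific detectors reports
# a match. All names and feature lists are invented for illustration.

def seen_like_cup(visual_features):
    return {"handle", "rim"} <= set(visual_features)      # O1 -> O2 -> O3 route

def felt_like_cup(tactile_features):
    return {"curved", "smooth"} <= set(tactile_features)  # P1 -> P2 -> P3 route

def sounds_like_cup(sound_features):
    return "ceramic clink" in sound_features              # T1 -> T2 -> T3 route

def coffee_cup_concept(visual=(), tactile=(), sound=()):
    # The generalized concept is triggered by any, or all, of the three routes.
    return seen_like_cup(visual) or felt_like_cup(tactile) or sounds_like_cup(sound)

print(coffee_cup_concept(visual=["handle", "rim", "steam"]))  # True: seen only
print(coffee_cup_concept(sound=["ceramic clink"]))            # True: heard only
print(coffee_cup_concept(tactile=["flat", "cold"]))           # False: not cup-like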

 

Frontal Lobes

 

The brain not only perceives, it also generates movement. A movement that is thought about and later commanded is the result of nervous activity in the frontal lobes. There's a "reverse" pattern for this activity; the process starts in tertiary cortex, and proceeds in the direction of primary cortex. This is depicted in the next figure.

 

 

Figure 7.10. Flow of nerve activity when some activity is planned and performed. The flow in this case is from tertiary to primary.

 

The frontal lobe architecture is analogous to that of the posterior lobes, in that the most conceptualized of ideas and plans are created in the frontal tertiary cortex, which delivers vague "executive" directives to the frontal lobe's secondary cortex, which formulates more specific action commands and delivers them (as necessary) to the motor strip. The motor strip requests permission from the sub‑cortical "reticular activating system" (RAS), and if the RAS approves the request it is acted upon by sub‑cortical brain areas (Luria, 1973), which carry out specific actions (orchestrated in detail by the cerebellum).

 

The frontal lobe's secondary and tertiary cortices are also referred to by the two terms "prefrontal" cortex and "pre‑motor" cortex. The prefrontal cortex has undergone the greatest amount of recent evolution, according to arguments based on the increase in frontal lobe size versus phylogenetic location (i.e., the ratio of frontal lobe size to total cortex is greatest for humans, next greatest for chimpanzees, etc). Functions performed by the frontal lobes are often unique to humans, or most advanced in humans, whereas most areas in the posterior lobes have pre‑human analogues. The prefrontal lobes also reveal their evolutionary recency by continuing to undergo rewiring until the age of 5 to 7 (Thatcher, 1997), and even the "late teens." Giedd and Thompson (2001) write "In late teens, the prefrontal cortex is the area that's changing the fastest..." (according to neuro-imaging studies). This is consistent with the general principle that "ontogeny recapitulates phylogeny."

 

Laterality

 

The right side of the body is commanded to move by the left cerebral hemisphere's (frontal lobe) motor strip. Likewise, the left side of the body is commanded to move by the right cerebral hemisphere. This left/right crossing‑over architecture is also adopted by sensory input; sensory information from the right side maps to the left brain, and vice versa. The reason for this is still a subject for speculation. The corpus callosum, which interconnects the left and right cerebral hemispheres, allows for the coordinated movement of both sides of the body, and also allows for some of the computational results of specialized areas on one side to be exchanged with related areas on the other side.

 

Proto‑humans probably had left/right symmetry, in the sense that the right and left cerebral hemispheres had identical capabilities, being mirror images of each other in layout.  This would have provided redundancy in case one side was injured (by a fall or blow to the head). Modern humans have asymmetric brains: the left and right cerebral hemispheres, LB and RB, are somewhat different, and are "specialized" for certain types of tasks. RB has more long‑distance inter‑connections than LB, whereas LB has many areas that are highly intra‑connected, which in turn are connected to other highly intra‑connected regions within LB.

 

The best known of LB's highly intra‑connected areas are Wernicke's Area (language comprehension) and Broca's Area (language production). Wernicke's Area is located near the interface of the three posterior lobes, in LB only (right-most pattern of dots in Fig. 7.11, upper). Broca's Area is located in the frontal lobe's secondary cortex, in LB only (left-most pattern of dots in Fig. 7.11, upper). There's a discernible pattern for the tasks performed in these specialized, highly intra‑connected LB areas:  namely, these tasks are inherently sequential, which means that the temporal order of events is crucial! For example, both receptive and productive language involves the processing of sequential events (sound perception and production). Changing word order can profoundly change meaning ("Ed ate the bear" versus "The bear ate Ed."). In contrast, RB tasks are holistic; they resemble those that a parallel computer processor (neural network) performs, such as instantaneous image recognition.

Figure 7.11 Upper panel shows location of language comprehension area, Wernicke's Area (right-most pattern of dots), and speech production area, Broca's Area (left-most pattern of dots). The lower panel shows the location of the inferior parietal lobule, IPL, which monitors the spatial relationship of body parts in relation to the immediate environment.

 

It is interesting that RB's counterpart to Wernicke's Area, shown in Fig. 7.11 (lower panel) is devoted to the task of monitoring the location of body parts in relation to each other and the immediate physical environment. This area, called the "inferior parietal lobule," or IPL, plays a critical role during manual interactions with the environment, such as reaching out to pick fruit from a nearby branch.

 

It is tempting to conjecture that before humans were capable of speech the left hemisphere's IPL counterpart region also functioned like the present‑day IPL in RB. Because reaching out to pick fruit has sequential components, it would have been natural for mutations to modify what was once an LB IPL in a way that later presented an opportunity for further modifications leading to a simple form of language capability. This region must have been built upon to produce our present‑day Wernicke's Area, which plays a critical role in language comprehension. This task consists of monitoring the relationship of sound percepts to each other over time, somewhat similar to the way RB's IPL monitors body part relationships over time. As Wernicke's Area evolved in LB, it must have gradually displaced the former IPL function.

 

A great deal of public interest was generated during the 1970s and 1980s by reports of LB and RB differences, or lateralization. For example, RB is described as being intuitive, holistic, inductive, timeless, visuo‑spatial, non‑verbal and pessimistic, whereas LB is described as being verbal, analytic, logical, rational, time‑oriented, deductive and optimistic.

 

Traditional psychologists must have resented the newcomers to their field who used instruments to measure things, and who used rigorous techniques to study long-standing matters that had been the subject of armchair speculation. The old-fashioned psychologists accused those who studied split brain patients, and found LB and RB differences, of suffering from “dichotomania” – as if the new investigators were over-interpreting their data due to an excess of enthusiasm. But the data are convincing, and often dramatic.

 

When LB is damaged (or when it is temporarily disabled by sodium pentothal injected into the left carotid artery) the patient's speech capability is almost non‑existent. Curiously, though, the still‑functioning RB does what it's able to do speechwise: the patient can swear, utter emotion‑laden pat phrases, sing songs with the right words, and recite the alphabet. RB cannot (usually) put together a sentence, since grammar capability resides in LB.

 

Occasionally, a patient whose corpus callosum has been cut can still manage to communicate in a simple way using the rudiments of grammar. These cases offer very interesting insights into the differing "personalities" of LB and RB. One famous example was reported by Gazzaniga (1978), which suggests that LB and RB can have different goals in life. Their oft‑used subject P.S. was questioned about his job choice in an experiment that allowed only RB to answer, and "automobile race" was spelled out. As Gazzaniga writes "This is most interesting, because the left hemisphere frequently asserts that he wants to be a draftsman" (p. 143). How poignant!

 

Chicken Claw Experiment

 

My favorite illustration of the independent operations of LB and RB has been referred to as “the chicken claw experiment.” This experiment was conducted by Michael S. Gazzaniga and Joseph E. LeDoux using patient P. S., who had undergone a full callosal surgery (cutting of the corpus callosum, interconnecting LB and RB) to control seizures. I shall quote from descriptions appearing in two books:  Gazzaniga and LeDoux (1978) and Gazzaniga (1985).

 


Two problems are presented simultaneously, one to the talking left brain and one to the non-talking right brain. The answers for each problem are available in full view in front of the patient. Gazzaniga and LeDoux (1978).

Figure 7.12.  “Chicken claw experiment.” The “task” (top) has two parts, each presented to a different brain half. The answer choices, below, are in full view to both brain halves.

 

...the experiment requires each hemisphere to solve a simple conceptual problem.  A distinct picture is lateralized to one hemisphere: in this case, the left sees a picture of a claw.  At the same time the right hemisphere sees a picture of a snow scene. Placed in front of the patient are a series of cards that serve as possible answers to the implicit questions of what goes with what. The correct answer for the left hemisphere is a chicken. The answer for the right hemisphere is a shovel.

After the two pictures are flashed to each half-brain, the subjects are required to point to the answers. A typical response is that of P.S., who pointed to the chicken with his right hand [controlled by the left brain] and the shovel with the left [controlled by the right brain]. After his response we asked him, "Paul, why did you do that?" Paul looked up, and without a moment's hesitation said from his left hemisphere, "Oh, that's easy. The chicken claw goes with the chicken and you need a shovel to clean out the chicken coop."

It is hard to describe the spell-binding power of seeing such things.

My interpretation is that the normal brain is organized into modular-processing systems, hundreds of them or maybe thousands, and that these modules can usually express themselves only through real action, not through verbal communication. Gazzaniga (1985).

 

... a basic mental mechanism common to us all. We feel that the conscious verbal self is not always privy to the origin of our actions, and when it observes the person behaving for unknown reasons, it attributes causes to the action as if it knows, but in fact it does not. It is as if the verbal self looks out and sees what the person is doing, and from that knowledge it interprets a reality. Gazzaniga and LeDoux (1978).

 

Frontal Lobes

 

The frontal lobes play a key role in orchestrating behaviors associated with LB/RB specializations. For example, RB prefrontal (RBF) originates emotional outbursts, whereas LB prefrontal (LBF) works to produce socially responsible behavior. The limbic system appears to be more strongly connected to RBF, and uses it to elaborate emotionally driven behaviors. LBF, on the other hand, appears to be the seat of the "conscience" and inhibits any RBF desires for socially inappropriate behaviors.

 

This was dramatically illustrated by the famous case of Phineas Gage, who suffered a railway construction accident in 1848 that caused a metal tamping rod to explosively penetrate and destroy his LBF (and a small part of RBF). Without the inhibiting effect of LBF upon RBF, his behavior was "fitful, irreverent, indulging at times in the grossest profanity... at times pertinaciously obstinate... he has the animal passions of a strong man." (Harlow, 1868). This old example illustrates the well known finding that RB's language ability is usually limited to profanity, songs and other memorized verbal material, such as the alphabet. A wealth of studies show that LBF is the site of the most advanced and human traits, such as conscientiousness, positive social behavior, rationality, strategic planning, and positive affect (mood). LBF is often referred to as the site of "executive function." RBF, by contrast, is associated with lack of inhibition, anti‑social behavior, emotionality, and negative affect. RBF is more closely connected to the sub‑cortical limbic system, the source of emotions.

 

If RBF and LBF could take positions concerning the idea that "the genes enslave us for their sometimes pernicious activities, and that individuals should rise up and become liberated from this genetic enslavement," it is obvious which side LBF and RBF would be on, and they wouldn't be on the same side! More on this in a later chapter.

 

This chapter's brief description of cerebral architecture, and the functional relationships of components, is part of the accepted neuropsychology literature. Every normal person's brain functions this way. If the brain were a "blank slate," as Francis Bacon initially suggested, and philosopher John Locke systematically expounded, then how amazing it would be for the blank slate to form itself into the same well‑defined areas, with corresponding functions, in all people ‑ regardless of their individual upbringing and environmental experiences! This old idea is best forgotten. Even Bacon and Locke would probably disown the outdated notion if they were alive today and could know about recent neuropsychology findings.

 

The genes assemble brains with the same architecture, modules and functional relationships, and this process occurs automatically ‑ shall I say "mechanistically." This view of the brain is consistent with the reductionist theme found throughout this book. The next chapter is more speculative as it treats the brain’s role in evolution.

 




CHAPTER 8

THE BRAIN'S ROLE IN EVOLUTION

 

"Aristotle was famous for knowing everything. He taught that the brain exists merely to cool the blood and is not involved in the process of thinking. This is true only of certain people."  Will Cuffy.

 

The brain is assembled by many genes. Each gene has had to establish itself within a species genome that, by definition, was successful at the time the new gene competed for a place in the gene pool. We should assume that each brain‑affecting gene established itself in the human genome at a different time from all other brain‑affecting genes. Obviously, all genes achieve their success without the benefit of knowing how well they might work with any future gene. Each successful gene has had to compete with existing genes, or at least provide a benefit that exceeds penalties from incompatibilities with existing genes. From the perspective of the gene, the individual's brain has the responsibility of spreading the gene widely into future generations. This is another way to express the unavoidable tautological assessment that a gene's job is to try to infiltrate the species genome and persist forever.

 

When a new gene modifies the hardwired neural connections of some brain region (by creating new connections between neurons or by changing the size of synapses of existing connections), the function of the modified brain region is likely to be in conflict with other brain regions. Since the purpose of the brain is to influence behavior on behalf of the genes, brain regions necessarily are in competition with other brain regions for influencing behavior. Rarely is the individual aware of this conflict. When the conflict is extreme, when it affects emotional state, we might say that the brain is in an unsettled state of "cognitive dissonance." Almost all competitions for influencing thought and behavior are worked out peacefully below conscious awareness.

 

The Brain as a Mechanism

 

The brain is a mechanism, albeit a "wet chemistry" mechanism. Just as all chemical interactions are merely physical interactions at the atomic and molecular level, so are all brain interactions ultimately the working out of physical relationships between atoms and molecules. When we say that current flows along a neuron's axon, we refer to a physical process of the axon's membrane becoming more permeable to sodium atoms, allowing charged atoms to enter the axon from the surrounding fluid, etc. Every motion of every atom is governed by a = F/m and quantum physics (as explained in Chapter 1).  It would be cumbersome to try to understand brain function by invoking this basic level of physics since such a task would be incomprehensibly difficult. Wet chemistry is a less cumbersome level, but still too daunting for most brain studies. A more tractable, and hence powerful, level for understanding brain function is to think in terms of neural networks.

 

A neural network is a partially interconnected group of neurons.  One network may also have connections with other neural networks. The term "partially interconnected" is important, for it is the genes that determine the overall pattern of which connections exist. A "fully connected" network is impractical when the number of elements (neurons) exceeds a few hundred, since the number of possible pairwise connections grows roughly as the square of the number of elements (N elements allow N × (N − 1) / 2 connections). Synaptic connections between neurons are either excitatory or inhibitory. In the brain a neuron may have many synaptic connections to a specific target neuron, and absolutely no direct connections to most other neurons.

 

Consider all neurons in the network that are connected to one individual neuron. At any moment some of them will be in the process of "discharging," causing their synaptic connections to other neurons to become active (releasing neurotransmitters across a synaptic gap). Each target neuron sums the excitatory and inhibitory discharges on its cell body, and if this sum exceeds a threshold it in turn discharges, causing neighboring neurons with which it has output connections to possibly also discharge by the same process that led to its discharge. A neural network can be made to "resonate," which is a way of stating that a pattern of firings within the network continues for many clock cycles (tens of milliseconds in the brain) once triggered by an appropriate stimulation from the connections that the neural network has with neighboring neurons (or neural sub‑networks). All of this is well understood by neural network specialists, and I have provided this brief introduction to give the reader a taste of the mechanistic, or reductionist, nature of brain phenomena.
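
Readers who want a more concrete picture of this summation-and-threshold process may find the following minimal sketch helpful (written in Python). The neurons, connection weights and thresholds are invented for illustration only; the sketch merely shows excitatory and inhibitory inputs being summed against a threshold on each clock cycle, and a small loop of units that keeps activity circulating (a crude stand-in for "resonance") once an external stimulus triggers it.

# A toy threshold-unit network. Weights and thresholds are invented for
# illustration; this is not a model of any real brain circuit.

class Neuron:
    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.inputs = []      # list of (source_neuron, weight); negative weight = inhibitory
        self.firing = False

    def connect_from(self, source, weight):
        self.inputs.append((source, weight))

    def will_fire(self):
        # Sum the excitatory (positive) and inhibitory (negative) inputs
        # arriving from neurons that fired on the previous clock cycle.
        total = sum(weight for source, weight in self.inputs if source.firing)
        return total >= self.threshold

# Three neurons wired in a loop, so that activity circulates once started.
a, b, c = Neuron(), Neuron(), Neuron()
b.connect_from(a, 1.5)    # excitatory
c.connect_from(b, 1.5)
a.connect_from(c, 1.5)
c.connect_from(a, -0.2)   # a weak inhibitory connection, for variety

a.firing = True           # an external stimulus triggers neuron a
for cycle in range(6):
    next_state = {n: n.will_fire() for n in (a, b, c)}
    for neuron, state in next_state.items():
        neuron.firing = state
    print(cycle, [int(n.firing) for n in (a, b, c)])  # the pulse keeps circulating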

 

An even more useful level for understanding brain function is to speak of brain regions in terms of their function. When we use such terms as "the reticular activating system" (RAS) we know that an elaborate neural network explanation for the region's function is theoretically possible, but at the present state of brain understanding these a = F/m ways of accounting for a region's function are not very feasible or even useful. So we proceed by saying, with blatant anthropomorphism, that a cortical region sends a "request for activation" to the RAS, and if RAS "grants the request" this originating cortical area becomes more active, and this activity enables it to increase its inhibition of "competing" cortical brain areas, allowing it to succeed in "achieving behavioral expression." Even that way of speaking is cumbersome, but it captures the flavor of the mechanistic competition of one cortical neural network, having gene‑directed hard wirings, with a neighboring cortical neural network, having other gene‑directed hard wirings.

 

In any description that attempts to achieve brevity, such as this one, there are many omitted details about which a specialist could complain. Sure, I didn't mention neurotransmitters, their re‑uptake, their breakdown, and dozens of other things going on, but they are all mere elaborations of the same basic physical mechanism; the additional details are too numerous to mention, and too similar in their ultimately physical action, to warrant attention for present purposes.

 

I have risked boring you with some physics of the brain in order to show how, in principle, brain function can be understood as coming under the influence of the genes.  For it is the genes that direct the process of "pre‑wiring" the brain. Initially, too many connections are created, and for several years after birth approximately half of the neurons and their connections wither and are lost. But the starting configuration, established around birth and over the following years (depending on the brain region), including the overall placement of neuron types and the majority of connections from each neuron to others, is supervised in a general way by the genes. Some genes influence one region (i.e., a neural network) and not others, while other genes influence several different specific regions. Any single neural network is most likely the result of several genes.

 

This way of viewing brain development, emphasizing as it does the role of evolutionary forces on the architecture and interconnectedness of the brain, leads to a perspective in which overall brain function is the working out of a competition of mental modules, each endeavoring to express itself by maximizing its influence over behavior. The "modularity of mentality" perspective, in which modules compete with certain others, is still controversial (for reasons I don't understand). And the idea of connecting specific modules to specific genes is so amorphous a speculation that it is not yet a sub‑discipline of the brain sciences. Evolutionary psychologists adopt this view (see Barkow et al, 1992), and it seems inevitable to me that sometime in the 21st century neuropsychologists will also, and maybe late in the 3rd millennium people who call themselves psychologists will come aboard.

 

Recent Evolutionary Hotspots in the Human Brain

 

In humans the prefrontal cortex makes up a proportionately larger share of the brain than it does in any other animal. Thus, there's an evolutionary trend revealing that the prefrontal cortex has been the focus of recent human evolutionary adaptations. This makes the prefrontal cortex one of the most interesting brain areas to understand.

 

Comparison with other mammals reveals that the tertiary cortices of the posterior lobes are also proportionately larger in humans, indicating that they also have been undergoing rapid evolution in recent evolutionary time. The most obvious example is Wernicke's Area, located in the temporal lobe's tertiary cortex. So add LB posterior lobe tertiary cortical areas to the list of interesting human evolutionary "hotspots."

 

Are there any evolutionary hotspots in the human right brain posterior lobes? The short answer is "no." It therefore seems that the left brain has evolved more during human history than the right. It is even tempting to suggest that what distinguishes humans from other animals is their left brain.

 

Note one qualification that applies to most usages of the terms "left brain" and "right brain":  about 2% of the population has laterality reversed. In these people language and other sequential tasks are performed by areas in their right brain, and holistic functions are performed by their left brains. Most of these people are left‑handed with the unhooked writing position. Neuropsychologists use the terms right brain and left brain to refer to the specializations found in that 98% of the population with "normal" lateralization. So, whenever the terms LB and RB are used, think of the left and right brains of the 98% of people who possess the normal lateralization.

 

Why is the Left Brain Evolving Faster?

 

What is it about the left brain that gave it the greater burden for advancing human evolution? One clue comes from the microscope. The left brain isn’t as "white" because fewer of its axons are coated with an electrical insulator composed of a whitish, fatty substance called myelin. The greater myelination of the right side is required by the greater proportion of right‑side neurons that connect with distant neurons through long axons. In contrast, neurons on the left side are more often connected to nearby neurons, and therefore require less insulating myelin.

 

But what does this mean? The left brain is characterized by a neural architecture in which isolated neural networks perform their specialized tasks and then communicate their results among themselves through a smaller network of interconnections. Functionally, this is a better architecture for performing sequential tasks. Language is a good example. A sentence consists of a sequence of sounds that have to be in their proper place in order to convey the intended meaning.

 

Some have speculated that the evolution of lateralization started with our fruit eating ancestors, who would use their left arm and hand to support themselves and maintain balance while the right hand reached out to pick fruit. Fruit picking is somewhat sequential, as the hand must be guided by the eyes to reach for the ripe‑colored fruit, grasp it with fingers using just the right force, tear it off the branch, and then bring it to the mouth for eating. Recall that the sequentially performing right hand is controlled by the left brain, which would therefore be the one requiring a sequential neuronal architecture.

 

If embryological development provided a sequential architecture in one part of LB simply because fruit‑picking control happened, by chance, to evolve there, then when another sequential task became adaptive the forces of evolution would more often find a favorable mutation among genes that code for the left brain, since fewer mutations would be needed to build a new sequential capability onto the pre‑existing architecture. This, according to one speculation, is why the left brain took on most new sequential tasks presented by subsequent evolutionary opportunities.

 

Brain "Dominance"

 

Damaging brain strokes in LB tend to produce more noticeable deficits than those in RB. This is because LB performs language tasks. For this reason, unfortunately, it has become customary to regard LB as the "dominant" hemisphere. But, to call LB dominant over RB for this superficial reason is misleading!

 

The limbic system (that drives emotions) influences the RB frontal lobe more strongly than the LB frontal lobe. This makes RB a better candidate for playing a dominant role. RB gives overall shape to behavior, while LB is relegated to a supporting role. When LB began its sequential specializations it must have been a useful "tool" for RB (which in turn was a tool for the limbic system, which in turn was a tool for the genes). "Values" are more likely to originate with RB, and I claim that the genes have put in place more of their “agenda protection circuits” in RB. The natural condition, I suggest, is for RB to be "in control," using LB to help achieve genetic ends. This important thought will come up repeatedly in subsequent chapters, and it is a basis for the individual to design strategies for individual liberation from the genes!

 

Even though RB has control over decisions that matter to the genes, I believe the seat of consciousness is in LB's tertiary cortex. This may seem to be a curious arrangement, but upon further thought it makes sense. We associate consciousness with planning future activities. Recall that brain structures with a more ancient origin usually have veto power over behaviors, as when RAS handles requests for action and either "authorizes" or "vetoes" them. It may happen that RB tasks LB with imagining future scenarios and their likely consequences, while RB, in close consultation with the (emotional) limbic system, then makes a “judgment call” and decides whether or not to proceed with the plan of action that was under consideration. After imagining scenarios, one may be accepted while others are vetoed; all of this may occur at a subconscious level, with RB working in conjunction with the ultimate authority: the limbic system. LB must make sense of the outcome, so it "confabulates" an explanation for the chosen plan of action. Michael Gazzaniga names the left prefrontal module that performs this confabulation the "interpreter" (see Gazzaniga, 1978, p. 146; especially Gazzaniga, 1988, p. 229; and Gazzaniga, 1992, p. 121).

 

Rational thought has become an ever more important tool for evaluating the consequences of hypothetical actions. This is why LB must have been such a "hot‑spot" for human evolution for probably the past 130,000 years, and especially the past 12,000 years.

 

Before LB began to evolve its unique specializations, perhaps 250,000 years ago, the function performed by a damaged area of one side could be easily assumed by the counterpart area of the other side (relying upon the corpus callosum for inter-hemispheric communication). Lateralization brought with it risks of lost redundancy, yet this loss was apparently smaller than the gains from being able to solve problems that were common in the late Pleistocene and Holocene. We must assume that some important need started the selection for LB specializations. It may have been the payoffs for improved tool‑making, language, or dealing with a more complicated social setting that required logical thinking skills (such as "theory of mind" abilities).

 

Whatever the original impetus for LB specialization, it seems to have assumed the new duties as if forsaking redundancy with abandon. Just consider the list of important LB skills that are unique to humans:  verbal, analytic, logical, rational, time‑oriented and deductive skills. It seems inescapable that LB has acquired more recently‑evolved, distinctly human adaptations than RB. When you damage LB (posterior lobes), you get a regressed, more primitive person; but when you damage RB (posterior lobes), you get someone handicapped in mostly long‑standing, primitive traits.

 

Since recently-evolved traits are the least entrenched, and are most subject to disruption by the latest mutations, we should expect to encounter a wider variation of ability for the recently evolved traits than long‑established ones. This view correctly predicts that literacy, being a recent human achievement, should be more variable than other abilities; whereas verbal language ability, having started its evolution much earlier, should be more robust. It also explains why everyone is capable of anger, fear, sexual arousal, and jealousy, while some people are deficient in logical, rational and analytic ability.

 

Brain Modules and Genes

 

As with any organ, no single gene codes for the construction and function of an entire organ; many genes contribute. When several genes contribute to the same trait, they constitute a group of "polygenes." For example, one gene may play a major role in forming the heart's left ventricle, with minor support from other genes; another gene may have major responsibility for assembly of the right ventricle, but also contribute to the left ventricle's assembly. Both genes belong to a polygene group for constructing the heart.

 

The same argument applies to the brain. Many genes are required to assemble the primitive brain stem's reticular activating system, for example. Others assemble various parts of the limbic system. Finally, other genes assemble the surrounding neo‑cortex, LB and RB, and the interconnecting corpus callosum.

 

All brain components are interconnected with other components, and they function together as if they were "designed" to work together. When the various components work together it is because they have been present in the genome together long enough to adapt to each other's presence. Initially, when a new brain component is mutating into existence, it is useful to understand that the pre‑existing components were not meant to work together with the new component. Each new "addition" occurs against a background of pre‑existing brain components which had worked together successfully prior to the appearance of the new component. As components appear, they, as well as the pre‑existing components, co‑evolve to enhance the working relationship.

 

When our ancestors began to lose their fur, the fur altering allele had to co‑evolve with the gene(s) that made furry babies irresistibly attractive to mothers. There are many baby features that cue the mother to act like a mother, and the lack of fur amidst all the other baby features must have been disconcerting to mothers during the transition. Today, a cat resembles a primitive baby in size, weight and furriness. The fact that many people find cats irresistible, and sometimes hold them like a baby and speak "motherese" to them, suggests that the ancient collection of cues for eliciting mothering behaviors still exist in some residual form.

 

As the gene for a new brain module is selected, it evolves to be compatible with pre‑existing modules, and the genes for the pre‑existing modules simultaneously undergo modification in response to the new module. The evolution of genes that affect the brain is governed by the consequences each gene allele has on the success of the individuals carrying the genes to survive and reproduce. Or, to be more rigorous, a brain‑related gene allele's success depends on its ability to produce phenotypic changes that work with the prevailing phenotype in a way that enhances the individual's success in delivering all of its genes to future generations, under a typical range of environments.

 

Since there are many ways to construct any organ, there will be many potential competitions between genes. An allele that produces a larger heart ventricle is in competition with alleles that produce smaller ventricles. Natural selection achieves a better heart by rewarding individuals having the better heart, and thereby rewarding those gene alleles responsible for producing the "better" ventricle size.

 

Whereas it may be easy to comprehend how a gene that codes for anatomy, such as heart ventricle size, can be in competition with another gene, it is more difficult to imagine the competition between genes that assemble brain circuits governing behavior; nevertheless, it happens. A brain gene may be in competition with another allele, even while it is "cooperating" with a different set of brain genes. (Excuse the anthropomorphizing; if it bothers you just convert my brief descriptions to a rigorous lengthy one).

 

Genes compete for phenotypic expression at impressively high conceptual levels. Language ability evolved by creating proto‑Wernicke's Area circuits and proto‑Broca's Area circuits within LB (plus other cortical areas, interconnections and anatomy modifications). This was a major accomplishment, involving many small incremental steps. Other frontal lobe traits, such as assertiveness, aggressiveness, nurturance, empathy and altruism, are under significant genetic control, accounting for approximately 50% of observed variance (Rushton, 1997).

 

It has once again become fashionable to think of brain function as being "localized." Although "phrenology" deserved to be discredited, its ultimate theme was correct: namely, that most attributes of brain function are determined by activity in specific brain regions. The phrenologists were wrong to place "combativeness" where the temporal and occipital lobes join, for example, but it is localized, and belongs in the prefrontal lobes (probably in RB). Many functions require the participation of several specific areas. Productive language is a well‑studied example, exhibiting involvement of specific parts of the left frontal, temporal, and parietal lobes. Physical damage to each region produces specific, predictable language deficits. This means that something as complicated as language requires the cooperation of regions with specialized capabilities, and the fact that these regions aren't next to each other, but are located in different lobes, does not undermine the view that brain functions are localized.

 

This reductionist way of viewing brain function is supported by the notion that a finite number of genes assemble the brain. Polygenes create brain modules, consisting of specific physical networks of interconnected neurons along with an approximate set of synapse sizes.

 

Mental function, like brain architecture, appears to be modular (Fodor, 1983, Gardner, 1983, Gazzaniga, 1985, Restak, 1986, Cosmides, 1989, Cosmides and Tooby, 1992, Restak, 1994). Granted, the modules interact with each other, but they can be usefully considered as modules with functional specifications. Consider the analogy of a systems analyst parceling out the task of writing a large computer program to several teams of programmers. Each team is charged with delivering a module of code that meets functional specifications. A programming team is like a gene, their code is like a hard‑wired brain module, the function performed by the module is like a mental module, the joining of modules is performed by the systems analyst, and the running of the completed program code is like a brain performing mental tasks.
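
Taken literally, the analogy can be written down in a few lines of Python. In the sketch below each small function stands for a module delivered by a separate programming "team" (a gene), the top-level routine plays the systems analyst who joins the modules, and running the assembled program corresponds to the brain performing a mental task. Everything here, from the module names to their one-line "specifications," is invented for illustration.

# The systems-analyst metaphor spelled out. Each "team" delivers one module
# that meets a simple functional specification; the analyst only joins them.
# All module names and specifications are invented for illustration.

def team_a_module(image_features):
    """Specification: report whether the visual features suggest a tool."""
    return "tool" if "sharp edge" in image_features else "not a tool"

def team_b_module(sound_features):
    """Specification: report whether the sound suggests stone striking stone."""
    return "striking" if "clack" in sound_features else "quiet"

def systems_analyst(image_features, sound_features):
    """Join the modules: combine their reports into a single judgment."""
    visual_report = team_a_module(image_features)
    sound_report = team_b_module(sound_features)
    if visual_report == "tool" and sound_report == "striking":
        return "someone is knapping a stone tool"
    return "nothing noteworthy"

# Running the completed program = the brain performing a mental task.
print(systems_analyst(["sharp edge", "grey"], ["clack", "wind"]))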

 

Modules Compete With Each Other

 

Modules compete with each other for "expression." For example, one area in the occipital lobe may be able to correctly perceive and identify an object from its visual appearance, a cup for example, while another area in the parietal lobe may be able to identify the same object by its tactile feel, and another area in the temporal lobe may be able to infer the same object's identity from the sound it makes when set down upon a table top. Each will produce a signal of recognition when the necessary stimuli are presented, and somewhere in the adjacent tertiary cortical regions (where the three posterior lobes merge) the object identification of "coffee cup" is made.

 

It would be ridiculous to conclude from this capability that there's a "coffee cup recognition" gene. Rather, there's a polygene‑created module for recognizing curved shapes, another for shadings that contain surface topography information, etc.  These modules are interconnected so that experience with the real world, or at least one that contains coffee cups, allows synapse strengths to be modified such that when a coffee cup with arbitrary orientation is viewed the various percepts are joined together to trigger the perception "coffee cup."

 

The brain's experience with the real world adjusts synapse strengths so that no other region will be triggered to "resonant" activity when a cup is presented to the posterior lobes' primary cortex (sensory input) areas. If a totally unfamiliar object is presented, there will be a competition to identify it. When shown a German beer stein for the first time, the occipital lobe (sight) may report "something like a vase," the parietal lobe (feeling) may report "something like a large‑handled cup" and the temporal lobe (hearing) might report "like a brick." The discrepant reports would compete, and the frontal lobe might want to engage in further exploration to resolve the discrepancy (which is a job for "consciousness").

 

 

Figure 8.01. Reversible Goblet, illustrating competition between brain regions vying for "acceptance" of their respective interpretations. Look for the two dark face profiles, facing each other.

 

The many examples of images with "figure/ground reversal" convey in the most dramatic way how competing modules strive to prevail in having their "interpretation" accepted. Escher drawings (Escher, 1961) exhibit a wealth of figure/ground perceptual competition.

 

Identifying situations, such as a "social situation," is subject to the same perceptual competition, although the frontal lobes will play a more active role in generating competing hypotheses. Context may be an important "input." "Do I know the person?  Does he have a hostile stance?  Does he have comrades?"

 

Consider the metaphor of a school classroom for understanding brain module competitions. The teacher poses a question, and the students try to understand the question and come up with an answer. Some students will both understand the question and have a possible answer, and they will raise their hands. The teacher calls upon a student to present an answer (quite often the student whose hand is waving most excitedly, or maybe the one with the best past performance), and after hearing the answer she passes judgment. If it is incorrect or inadequate, the teacher calls upon another student.

 

This classroom example is a good metaphor for how the brain works. When a person is presented with an unusual situation, some modules in the brain "recognize" something, and they request activation by the RAS (reticular activating system). The RAS, working in coordination with a higher level cortical system that keeps score of previous successes and failures, tentatively authorizes a module to "present its case" for evaluation. The module that wins the first round for presenting its interpretation may be the one that most strongly felt it understood the situation and had the correct interpretation (like the student who waved his hand most excitedly); its request to RAS may have been the strongest among the competing modules. When the first module presents its interpretation, some type of evaluation occurs (perhaps involving the reaction of other modules), and this interpretation may be accepted, or it may be tentatively rejected. If it is rejected, or set aside, another round of RAS requests for activation is performed, and another judgment is made. At some point, a winner is declared, and the winning module's interpretation is what serves as the basis for any required action. The losing modules do not simply stop trying to compete for RAS attention, however. As more perceptions occur, or as behaviors either validate the accepted interpretation or invalidate it, the other modules are ever‑ready to renew their claim for being heard. The example of the "reversible goblet" shows how this process "feels" for the domain of visual interpretations.
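
The arbitration loop just described can also be sketched in a few lines of Python. The sketch below is only a cartoon of the classroom metaphor: the module names, request strengths, and the pass/fail evaluation rule are invented, and nothing is implied about how the real reticular activating system works beyond the accept-or-call-on-the-next-candidate logic.

# Cartoon of modules competing for "expression": each module submits a
# request strength to a RAS-like arbiter, which authorizes the strongest
# candidate; an evaluation step then accepts or rejects its interpretation.
# All names and numbers are invented for illustration.

def arbitrate(requests, acceptable_interpretations):
    remaining = dict(requests)            # module name -> request strength
    while remaining:
        winner = max(remaining, key=remaining.get)   # the most excited hand-waver
        print("RAS authorizes:", winner)
        if winner in acceptable_interpretations:     # the evaluation step
            print("Interpretation accepted:", winner)
            return winner
        print("Interpretation rejected:", winner)
        del remaining[winner]             # the loser stays ready to re-apply later
    return None

# An ambiguous image: the "two faces" module shouts loudest, but suppose
# context and the reactions of other modules only accept the "goblet" reading.
requests = {"two faces": 0.9, "goblet": 0.7, "random blob": 0.2}
arbitrate(requests, acceptable_interpretations={"goblet"})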

 

The process of generating behavior is also a competition, this time between frontal lobe modules. "Shall I turn and run? Or scream? Or attack?" Imagine that one person may inherit a propensity to "attack" in ambiguous social encounters, while another may be genetically inclined to "run away." Just as animals have inborn temperaments, so do humans. And the mental process that precedes an action consists of a competition between brain areas. To the extent that one brain area is assembled by a different polygene group than another brain area, and is inclined to a different type of behavioral response, the genes primarily responsible for wiring the competing brain modules are competing with each other for behavioral expression. The same classroom metaphor described above can be used to understand this situation. After a situation is understood, and when action is necessary, the frontal lobe modules will compete for expression (i.e., control of behavior) in the same manner that the "understanding and interpretation of the situation" modules competed. The RAS (a different part of the RAS than the part used for "adjudicating" perceptions) receives requests for action, and eventually one module's proposed action is "accepted" (given "authorization" for initiating a behavior).

 

Thus, both perceptions and behaviors exhibit the quality of involving several mental modules in competition for acceptance and expression.

 

Does it matter whether the brain accepts, and acts upon, the perception that the sky is angry and the wind god and sun god are arguing, versus the competing perception that the wind is bringing clouds from somewhere which cover the sun and may cause rain? In the contemporary world it can matter more than it did in the ancestral world. For this reason, it matters whether RB interpretations versus LB interpretations tend to gain acceptance in an individual's brain. To the extent that genes wire brains to be predisposed to some "interpretation styles" over others, the respective genes are in competition. The next chapter will deal with this subject in greater detail.

 

Intelligence and IQ

 

“IQ is what IQ tests measure!” It should be emphasized that IQ, as measured this way, is just one of many components of what most people refer to by the term "intelligence." Ironically, IQ is not a prefrontal function. Prefrontal lesions do not reduce IQ; indeed, in some cases frontal lesions have enhanced IQ. This enhanced performance could be explained by a theory that views the frontal lobes as being prone to "interfere" with posterior lobe performance (such as a tennis player "thinking" too much); by injuring a prefrontal lobe the posterior lobes are freer to perform unhindered, boosting measured IQ. The WAIS (Wechsler Adult Intelligence Scale) IQ test has two parts, the "verbal" part and the "performance" part, and these parts probe left and right (posterior) brain function, respectively. The Woodcock‑Johnson also has two parts, likewise probing the left and right (posterior) lobes. The WAIS verbal and performance IQ scores differ by 3 points, on average. A difference of 10 points should occur in only 5% of cases, and differences larger than this are usually caused by a lesion to one side of a posterior lobe.

 

This concordance of IQ scores that separately probe LB and RB function invites speculation on the number of genes that affect posterior lobe capability on both sides. However, it is possible that a small number of genes contribute to "general" intelligence, and the rest contribute to specific abilities. This is consistent with the finding that a person's profile of subtests will have a pattern, with some parts of the "verbal" being low and others high, while the verbal subtests average about the same as the performance subtests. Psychometricians continue to find it useful to make a distinction between specific sub‑test performances and Spearman's (1927) "g factor" of mental ability. Tests that identify g‑loaded performance afford better correlations with genetic relatedness (i.e., identical versus fraternal twins), and g‑loaded test scores are better predictors of academic performance than standard IQ tests.

 

Tests have been developed for assessing frontal lobe performance. The Halstead‑Reitan Battery includes tests of frontal lobe function. Components of the Montreal Neurological Institute Battery, and of Luria's Neuropsychological Investigation, also test frontal lobe function. The frontal lobes are so complex that no single test can capture all significant features. For example, effective business executives have especially capable frontal lobes, and they excel in the development, evaluation and implementation of "big picture" strategies. The business "world of hard knocks" reveals who some of these especially well‑endowed frontal lobe "executive function" people are. Bill Gates, Steven Spielberg and Lee Iacocca are examples. It would be interesting to know whether they would have been identified in childhood as having especially talented frontal lobes by existing tests purporting to probe frontal lobe function. Someday, tests for executive function may capture this elusive capability.

 

Whereas someone like Bill Gates must have superior scores for both frontal and posterior lobe function, it must occasionally occur that people are born with disparities. For example, President John F. Kennedy is supposed to have scored a mere 125 on IQ tests; he obviously would have scored higher on any executive frontal lobe test. The opposite disparity, in which posterior lobe IQ exceeds frontal lobe executive ability, may be more common, because frontal lobe function is a more recent focus of human evolution. The ability to create culture, and to absorb and use cultural elements that other people are observed to use, must have been an important pressure for human evolutionary selection during the past 60,000 years. This idea will be taken up in a later chapter.

 

Number of Brain Genes

 

It has been estimated that as many as half of the entire set of human genes have some influence upon intelligence (Wechsler, 1974, as cited by Seligman, 1992). For the calculations that follow, I will assume that 30% of human genes affect the brain. In theory, every aspect of brain function can be associated with the gene that has the most control over it. (To call such a gene the "whatever‑trait gene" overlooks another fact: the same gene probably affects several other phenotypic traits, sometimes unrelated to the main trait; this is referred to as "pleiotropy.")

 

If the human genome consists of 22,000 genes (additional functional sections exist in the regions once called “junk DNA”), and if humans share 98.77% of their genes with living chimpanzees, then humans differ from chimpanzees at approximately 270 gene locations (1.23% of 22,000). Of these 270 genes, probably more than 30% have some influence over the brain's development. Let us assume that 80 genes are responsible for making the human brain different from the chimpanzee brain.

 

If the common ancestor of modern humans lived 200,000 years ago, and if the human/chimpanzee evolutionary split occurred 6 million years ago, then the number of brain genes that are more recent than 200,000 years can be estimated as 80 × (0.2/6) = 2.7 genes. This absurdly small result requires a few caveats. First, the calculation assumes that the pace of evolutionary change has been constant during the past 6 million years. Human evolution may have proceeded faster during the past 200,000 years than before that time, and the brain is likely to have been the focus of more than 30% of this evolutionary change, considering that major human brain expansions occurred at about 1.8 and 0.5 million years ago (Aiello and Dunbar, 1993). But the most important qualifier of this argument derives from the restrictive definition of a “gene.” A gene is defined as a stretch of DNA that “contains the instructions for the production of a particular protein.” Recent research has shown that non-gene DNA (in the areas once referred to as “junk DNA”) can affect the expression of genes. Not enough is known about the number of these locations to include them in the present argument, so allow me to proceed while keeping this one important caveat in mind.
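For readers who wish to check the arithmetic, the back‑of‑envelope chain can be reproduced in a few lines. The sketch below uses only the round numbers quoted above; none of the inputs is a measured quantity.

# Back-of-envelope arithmetic using the round numbers quoted in the text.
total_genes     = 22_000      # assumed size of the human genome
shared_fraction = 0.9877      # fraction of genes shared with chimpanzees
brain_fraction  = 0.30        # fraction of genes assumed to affect the brain
recent_span_yr  = 200_000     # years since the common ancestor of modern humans
split_span_yr   = 6_000_000   # years since the human/chimpanzee split

gene_differences = total_genes * (1.0 - shared_fraction)   # ~270 genes
brain_gene_diffs = gene_differences * brain_fraction        # ~80 genes

# Assuming a constant pace of change over the whole 6 million years, the share
# of those brain-gene differences that is younger than 200,000 years:
recent_brain_genes = brain_gene_diffs * recent_span_yr / split_span_yr

print(f"human-chimpanzee gene differences: {gene_differences:.0f}")
print(f"of which brain-related           : {brain_gene_diffs:.0f}")
print(f"younger than 200,000 years       : {recent_brain_genes:.1f}")

Running this reproduces the ~270, ~80 and ~2.7 figures (the text rounds 81 down to 80 before the final step).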

 

If during the past 200,000 years 80% of the genes that were actively evolving were brain‑related, then the 2.7 multi‑allelic number increases to about 7. If the evolutionary pace for recent times (the past 200,000 years) was greater than for the earlier period (6 million to 200,000 years ago) by a factor of 10, then there could be 70 multi‑allelic gene sites that affect the brain. This number is compatible with the estimate that humans and chimpanzees differ at 270 gene sites. Since each site may have many more than two alleles, there could be 100 to 200 alleles whose main effect is on the brain and which are still vying for a presence in the human genome. This may seem like a small number of gene sites, but there are 2^N combinations of configurations when each site has two possible states, and if N = 100, there are about 10^30 such states. That's an incredibly large number, being larger than the human population by a factor of 10^20 (a one followed by 20 zeros)!
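The scaling factors and the combinatorial claim in this paragraph are easy to verify. The snippet below redoes them; the world‑population figure (roughly 6.7 billion at the time of writing) is my assumption, used only to form the ratio.

# Scaling the 2.7 estimate by the two hypothetical adjustments in the text.
recent_brain_genes = 2.7
boosted = recent_brain_genes * (0.8 / 0.3)   # if 80% (not 30%) of recent change was brain-related -> ~7
faster  = boosted * 10                       # if recent evolution ran 10x faster -> ~70

# Combinatorics: with N two-state gene sites there are 2**N configurations.
N = 100
configurations   = 2 ** N                    # about 1.27e30
world_population = 6.7e9                     # assumed, circa 2008

print(f"adjusted brain-gene estimates: {boosted:.0f}, {faster:.0f}")
print(f"2**{N}                       = {configurations:.2e}")
print(f"ratio to world population    = {configurations / world_population:.1e}")

The ratio comes out near 2 × 10^20, matching the "factor of 10^20" quoted above.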

 

The point of these calculations is to prepare the case for stating that perhaps half of the present human genetic diversity, and genetic competition (among perhaps several hundred alleles), pertains mainly to the brain. The brain is a major focus of ongoing evolution in Homo sapiens.

 

Recent Brain Evolution

 

Two lines of evidence point to the LB‑frontal region as the site of most of the recent human evolutionary activity (when I use the term "frontal" I usually mean "prefrontal" ‑ which is common usage). First, this cortical region is where executive function resides, and executive function is an especially well‑developed, some would say unique, human capability. Second, the last big push of an individual's ontogenetic development occurs in the prefrontal brain areas; and to the extent that individual ontogeny recapitulates species phylogeny, our most recent ancestors must have been busily improving the prefrontal cortices, with increases in both size and function.

 

RB‑frontal and LB‑frontal are probably the most recently evolved regions, and I am suggesting that they gradually came to be in almost constant competition. LB‑frontal has evolved to supersede RB‑frontal for many tasks (language replacing gestures being the most prominent). LB‑frontal is capable of inhibiting RB‑frontal when an appropriate occasion arises. But there are dangers in giving a new tool too much power, especially when it is strategically positioned to formulate near‑term strategies as well as long‑term life goals ‑ the way LB‑frontal is. The genes will find RB‑frontal a more useful agent for controlling LB‑frontal's "new ideas." Presently, RB‑frontal is probably charged with preventing LB‑frontal from thinking thoughts that threaten the genetic agenda. Without understanding any of this, RB‑frontal has taken on the role of acting as an agent for the genes, and LB‑frontal has unknowingly stumbled into the position of having the capability for acting as an agent for the individual wishing to liberate himself from genetic tasks!

 

It is common knowledge that people tend to think with certain styles, such that if you know one of their beliefs or interests you can predict others. Dichotomies abound:  Spiritualism versus Rationality, Religion versus Science, Conservatism versus Liberalism. How curious that we can expect almost universal agreement in making these aggregations. Consider the following joke: A prize will be given to anyone who is able to survive driving from Houston to El Paso in a Volkswagen minibus with stickers that read “Down with the NRA” and “make peace, not war” and “vegetarians rule” and “Vote Democratic.” (I maintain that it’s possible to predict a person's political party affiliation by knowing whether they deal with weeds by pulling them or spraying them with weed killer.)

 

The apparent pattern of two ways of thinking corresponds to the preferred styles of RB and LB, and it is obvious which trait corresponds to which brain half. The person who exhibits RB styles can be thought of as belonging to a group of people who are "opposed" to the group of people exhibiting LB styles. Just as there is a competition between an old RB and a new LB within each individual, so is there a "competition" for prevailing over a society's culture between RB‑style people and LB‑style people. The genes that wire‑up RB‑styled and LB‑styled individuals are, as groups, in competition with each other. C.P. Snow wrote The Two Cultures and the Scientific Revolution (1961) to call attention to a conflict between two types of people, the literary academic and the scientist. The book was based on decades of interacting with both types, and it was written before the great burst of neuropsychological insights of the 1960s, 1970s, and 1980s. If Snow were to write that book today (see Price, 1970, for a witty “update”), it would be difficult for him to avoid making the brain laterality connection. Chapters 12 and 13 are devoted to "two cultures" matters.

 

Result‑Driven Thinking

 

RB‑frontal neural circuits have many ways of restraining and manipulating LB‑frontal activity. One of these is called "result‑driven thinking." A thinking process is recognizably result‑driven when the logical process leading to a position is embarrassingly contorted and self‑serving, making it obvious that the “result” preceded the progression of thought that purports to lead to the result. The embarrassment may be apparent to an objective observer, who is not affected by the position taken, while for the result‑driven person there is no apparent flaw in a process he believes to be logical. Deception is always more convincing when the deceiver is unaware that he is engaged in a deception. Thought blinders are a useful tool when RB‑frontal wishes to remain undetected by LB‑frontal.

 

The Mexicans have a special phrase for calling attention to someone's result‑driven thinking: cuando conviene. Literally translated, it means "when convenient" ‑ which conveys the idea that a person's stated belief changes in a way that conveniently serves the person's selfish goals. I am convinced that result‑driven thinking is a "human universal" (see Brown, 1991, for an extensive treatment of human universals). It probably can be found in all societies and has existed for a long time.

 

One of the primary tasks for result‑driven thinking (RDT) is to force the individual to stay in loyal service to his group – no matter how illogical. The group may be his extended family, or his tribe, which in either case is composed of genetically related individuals. Consider the matter of "tribal mentality," first described by Spencer (1892) and later by Keith (1946, 1948), which refers to the anthropological finding that all primitives live by a moral system that requires beneficent treatment of fellow tribesmen while condoning and even encouraging barbaric treatment of individuals belonging to neighboring tribes. "Tribal morality" requires an illogic for which RDT is admirably suited, and for which the RDT capability may have evolved. Empathy and amity for some, enmity and hostility for others! This seems at odds with contemporary religious dogma, yet in time of war both sides appeal to their respective religions for legitimacy and support in vanquishing the enemy. Each side distorts the character of the other, making up false accusations and using degrading names for the purpose of arousing the passions in preparation for combat.

 

Political partisanship, ever more prevalent in America since World War II, also illustrates the power of RDT. (I adopt the common usage of the term "America" to refer to the United States, in apparent neglect of the fact that "America" actually includes Canada, Mexico, and all of Central and South America.) The model for this is tribal conflict. Consider the case of the U.S. Supreme Court's 2000 decision that in effect led to George W. Bush's selection as president. The five conservative‑leaning justices had a longstanding record of supporting "states' rights" on many matters, while the four more liberal justices favored federal authority; yet when the issue was the election of either a conservative or a liberal president, both sides switched their positions concerning which right trumped the other, and by acting in this unusual way they served their personal political philosophies. Chief Justice Rehnquist later alluded to the need for the Supreme Court to sometimes get involved in political matters to prevent a national crisis (Los Angeles Times, January 21, 2001, pg. A30). Cuando conviene!

 

My reading of the newspaper is immensely more informed because I readily recognize RDT. Almost every opinion, and the rationale for almost every action, requires RDT to protect it from the detached, logical thinking that a well‑functioning LB‑frontal is capable of. There is a common misperception that people are LB‑dominant, which is based almost solely on where language capability resides. But in the realm of thinking, humans are RB‑dominant. And RDT is the mechanism for accomplishing this in the presence of a powerful logical left brain.

 

RDT is a tool of the genes to have things both ways. By inventing RDT, the genes have become able to receive the benefits of a powerful new LB while retaining their interests in enslaving the individual for genetic service. Obedience of the powerful new LB is achieved by RB circuits that assure that "result‑driven thinking" is preserved in a way that accomplishes genetic goals.

 

Thinking is a Subversive Activity

 

Occasionally I’ll buy a book based solely on the cleverness of its title. My favorite example is Teaching as a Subversive Activity. I never read the book, probably because I didn’t want to be disappointed. The title by itself inspired me to develop a text in my imagination, and over the years I’ve continued to add to that imaginary text. This section will draw from that text.

 

Society endorses a school curriculum that renders students “useful” to society, regardless of how useful the curriculum is to the student’s individual fulfillment (thanks, Alfred Allen, for stating it this way to me). If heroism is useful to society, then history and literature classes will feature heroism. If “slacking” is not useful to society, then the payoffs for slacking will not be taught. In between these two extremes are such things as tolerance, celebrating traditions, skepticism and questioning authority. If a curriculum were to be designed to serve individual fulfillment it would suggest that heroism is folly, tolerance is good, skepticism is essential and all authority should be questioned. The fact that none of these are to be found in schools is unsurprising.

 

“Thinking” is something the left brain does. It is also something the right brain “controls.” If thinking occurs, it is because RB permits it, and may even encourage it because a problem needs to be solved that appears to be compatible with the genetic agenda. Thinking is inhibited by RB when it appears to threaten the genetic agenda.

When a parent enters a burning house to save a child there is a quick and mostly subconscious calculation of the danger involved, and the resulting decision favors genetic interests. When a household is attacked by intruders the ensuing defense is likely to include heroic acts. When a tribe is attacked by a neighboring tribe, or when one country is attacked by another, heroism will be found among the defenders. Each heroic act will be called “selfless,” or what the intellectuals would call “altruistic,” and we can be confident in surmising that these acts were initiated by a right brain that was able to inhibit a left brain from hesitating for thoughtful consideration.

 

When society pays a teacher’s salary it is natural for society’s interests to be served by what’s taught. Any teacher who suggests that individuals have the right to renounce heroism when a situation calls for it would be considered subversive. Similarly, any teacher who counsels skepticism and a questioning of authority would also be considered subversive – especially by the authorities.

 

America has become a “consumer society” and the American government has become the protector of business interests. Imagine schools that teach students how to question advertisements, make fun of claims by the pharmaceutical industry, question FDA approvals, or take every politician’s utterance and consider that the truth is closer to its opposite. Students graduating from such a school would fare better in life, but those with power in society would view the curriculum as subversive and would not permit such a school to exist.

 

In a future chapter I will describe the co-evolution of genes promoting altruism and intolerance. A theory will be described that purports to show that when there is chronic conflict between neighboring tribes there are benefits for the tribe whose members are both intolerant and altruistic. Heroes will be seen as intolerant altruists!  Only when a tribe overwhelms all competitors is it safe for genes promoting the opposite traits to prevail. Thus, each empire will gradually become dominated by selfish individuals who tolerate others regardless of their differences. This becomes a “weakness” from the standpoint of societal survival, which leads to the decline and fall of the empire. Only after this theory has been explained will it be possible to understand the full meaning of “thinking is a subversive activity.”


CHAPTER 9

 

ARTISANS SET THE STAGE FOR CIVILIZATION: PART I

 

The human brain and culture co‑evolved (Lumsden and Wilson, 1981) during at least the past 70,000 years. In this chapter I will make the case for the position that a civilization can only develop when there is a division of labor, and when society supports a large base of "artisans."

 

The first artisan niche was probably the toolmaker, whose task was initially part‑time, but eventually became full‑time. Significant evolutionary forces were created when the first full‑time artisan was employed, and these led to the further specialization of the left brain (LB).

 

When the Pleistocene glacial climate began its transition to the Holocene interglacial, 14,650 yr BP (years before the present), the improving climate set the stage for an explosive expansion of artisan niches. Increasingly complex economies allowed for higher population densities, which in turn supported large, sedentary populations. New artisan niches were created, allowing the artisan population to grow. The increasing "presence" of artisans caused cultures to expand and become more sophisticated. The artisan played a crucial role in creating civilizations.

 

The entire process of artisan proliferation, cultural elaboration, and the creation of modern civilizations occurred because human culture and brain‑function genes co‑evolved. By this is meant that the civilized environment created by artisans, whose special abilities are at least partly due to the appearance of "artisan genes," changed conditions in such a way that artisan genes became more valuable and were "selected" in greater numbers. This chapter describes some speculative mileposts along this interesting journey.

 

Pleistocene Life

 

I will argue that for the past 70,000 years one of the most strongly contested allelic competitions was related to the creation of full‑time niches within human tribes.

 

The world's climate was cold for most of the past 1.6 million years, a period referred to as the Pleistocene Epoch (from 1.6 million to 12,000 years ago). Brief warmings, or interglacials, occurred at approximately 100,000 year intervals throughout the Pleistocene. There was an interglacial from about 129,000 to 116,000 yr BP. At about 69,750 yr BP there was a brief several‑century warming, but it was too dry and not quite warm enough to qualify as a true interglacial. A brief warming occurred 34,800 yr BP. After an extreme cold period 18,000 yr BP, a gradual warming began. Erratic swings of warm and cold climate gave way 14,650 yr BP to an almost irreversible warming (the Younger‑Dryas cold interlude was from 13,000 to 11,600 yr BP), ushering in a true warm interglacial, called the Holocene Epoch, extending from 11,600 yr BP to the present.

 

Prior to the “70,000 yr BP warm/dry episode” human tools were uniformly simple. After this warming event tools became abundant, standardized, and more sophisticated (in Africa and Europe). What caused the proliferation of quality tools? One possibility is that full‑time toolmakers appeared at this time.

 

In order to understand the importance of tool making we must imagine what life was like during the Pleistocene. What, for example, were the main "selecting forces" for humans? In the book Demonic Males (1996) by Richard Wrangham and Dale Peterson a compelling and disturbing description is presented of our ancestors, starting with woodland apes that split from the chimpanzee line 5 to 7 million years ago, then proto‑humans living from 1.8 million years ago to 200,000 yr BP, and finally our Homo sapiens ancestors of the past 200,000 years. All of these ancestors, like the common ancestor they share with chimpanzees, were preoccupied with territory, border raids, rape and even warfare ‑ accomplished, of course, by "demonic" males.

 

A book by Lawrence Keeley, War Before Civilization (1996), carries this theme into the late Pleistocene and Holocene. It describes an archeological record brimming with evidence for pre‑historic "wars" that were more deadly and pervasive than our bloodied modern experience. Women and children were not spared, except for abduction as new wives or slaves. The enemy's property was either destroyed or appropriated. Mass graves contain victims with embedded arrowheads or spear points. Palisades and fortresses with moats preceded castles by many thousands of years. The principal cause of death during the past 20,000 to 30,000 years, when the archeological record is good enough for these interpretations, appears to have been inter‑tribal warfare, with surprise raids being the preferred strategy.

 

Our ancestors must have lived in a world where survival depended on effectiveness in warfare. The tools of war during the last 200,000 years of the Pleistocene must have been clubs, axes, spears and eventually bows and arrows. The term "hunter and gatherer," in which men hunt and women gather, should be amended to "warrior, hunter and gatherer." Therefore, our "ancestral environment" (AE, also referred to by the more pedantic term "environment for evolutionary adaptation," or EEA), should be based not on the H&G model, but on a WH&G model!

 

Colin Tudge (1998) summarizes emerging suspicions that throughout the past 100,000 years our ancestors supplemented a "hunting and gathering" mainstay by horticulture and pastoral "farming." Horticulture consists of weeding and clearing an area to provide for a greater growth of some naturally occurring plant that produces something that is normally "gathered." Pastoral farming consists of herding and later domesticating animals that were previously hunted. Both forms of "proto‑farming" invite banditry! And banditry will inevitably elicit defensive efforts by the proto‑farmers. Both bandits and farmers would require effective weaponry, and this increases the importance of artisans who specialize in weapon‑making.

 

Weapon Making Transitions

 

I speculate that prior to the 129,000‑116,000 yr BP interglacial, every hunter/warrior made his own tools ‑ including axes, spears, bows and arrows, which were used for both hunting and warfare. After the 129,000‑116,000 yr BP interglacial, a part‑time toolmaker may have made most of these items. But during the 70,000 yr BP dry warming, migrations may have enhanced tribal conflict, rewarding those tribes with the best weapons. Any tribe that made use of a full‑time weapons maker would have enjoyed an advantage in inter‑tribal raids. The critical warming 70,000 years ago would then have completed a two‑phase transition through three states: 1) a culture in which "every man made his own weapons and tools," 2) part‑time tool and weapon making for others, and 3) a tribal arrangement in which a specialist assumed full‑time responsibility for tool and weapon making.

 

Whether the first full‑time specialist appeared 120,000 yr BP, 70,000 yr BP, or 40,000 yr BP, the event would have been a milestone in human evolution. With a niche in each tribe devoted to just one task, such as toolmaking, genes for toolmaking could be selected at a rapid pace. Before that time, any genes for toolmaking are likely to have had deleterious effects on other phenotypic traits, which would have represented a serious penalty. But if a person's task is confined mostly to toolmaking, it doesn't matter if he is unable to perform as a warrior, for example. Fierceness isn't a requirement for toolmaking, nor is physical strength, fast reflexes, or endurance. A new era dawned for humanity when this new full‑time niche came into existence. Because of what followed, I shall refer to the people who filled these "weapon maker" niches as "artisans."

 

Weapon Makers, Toolmakers and Artisans

 

The first full‑time weapon‑making "artisan" may have differed little in temperament from his fellows. It would make sense for him to train an apprentice after achieving competence, in order to preserve what had been learned through years of trial and error.  This arrangement, in which an artisan takes on an apprentice, set the stage for the accumulation of advances in weapon making technique during succeeding generations.

 

Anyone expert in making weapons is already more likely than others to make good tools. The axe that kills an enemy warrior is similar to the axe that chops a tree. The stone knife that cuts an enemy warrior's scalp for a take‑home trophy could also be used for skinning an animal. The first tools for constructive uses could have had their genesis as warrior weapons. The weapon maker was a natural choice for assuming tribal toolmaking duties as well.

 

If the apprentice is chosen by observing who seems to have talent and temperament for toolmaking, and if this practice is preserved for many generations, then it is inconceivable that the genes would "overlook" this new opportunity. “Toolmaking genes” would have been rewarded, and they would become more abundant ‑ even if they meant the individual was poor at hunting and "war making." The artisan will have been released from the genetic limitations imposed by the need to preserve hunting and warlike traits. Initially, the niche would be limited to approximately 2% of the male tribal membership, but an important process would have been set in motion.

 

The toolmaker established a precedent for special status, and this precedent increased tribal readiness for establishing niches for other artisan work. Proto‑farming might have created three other niches for artisans: horticulture, animal herding and animal domestication. Individuals especially effective in these activities would contribute to a tribe's success, possibly saving them from extinction when the vagaries of climate or animal migrations brought hardship to those who relied too completely upon hunting and gathering. Tribes that "accommodated" proto‑farmers, in addition to full‑time toolmaking, should have fared better during the late Pleistocene. (If this sounds like "group selection," it is! I pursue this further in the next chapter.)

 

If the horticulture artisans produced a surplus of crops, and if they could somehow process them for "storage," then there would be payoffs for the new specialty of preserving surplus foods and building crop storage houses. Because of the physical demands of this work, the initial stage of construction was probably performed by men. Since gathering was traditionally women's work, the first horticulturists were probably women. But a woman's labor is finite, and childrearing, regular food gathering and preparation, making clothing, and other domestic jobs set a limit on how much of her time could be devoted to new artisan tasks. Thus, over time, men must have assumed more and more of the full‑time horticulture tasks. Men probably were the initial pastoralists, since working with wild animals would have been dangerous and would have required physical strength.

 

As artisan niches expanded, it is inconceivable that genes would not have been affected by the new opportunities, and they must have responded by "producing" people who were talented artisans to fill the niches. If 10% of the jobs for men were artisan‑like, then in a steady state condition it can be anticipated that approximately 10% of men would be born with a phenotype having the artisan's talent and temperament. As tribes became technologically more sophisticated, the spectrum of abilities that people exhibited would have "matched" the broadening spectrum of niche opportunities.

 

Dawkins wrote about the hypothetical case of a population of "hawks" and "doves" living with specified payoffs (Dawkins, 1976), and he showed that natural selection should eventually lead to the establishment of a specific population mix. The mix is stable because any displacement from it tilts the rewards toward whichever type has become less populous. He called this dynamic an "evolutionarily stable strategy," or ESS. The same argument should apply to tribal niches, and to modern societal niches ‑ provided the niches are long‑lived and the forces of selection are natural. Thus, if a tribe "needs" only 10% of adults to engage in "infrastructure" matters (building and maintaining huts, clearing paths, building water storage structures, irrigation, sewage disposal, etc.), we can expect that about 10% of newborns will be talented in these activities (assuming the payoffs are not drastically uneven). Another solution, as Dawkins also points out, is that all newborns will be talented in "infrastructure" but only 10% of them will adopt that role when they grow up. Perhaps, in reality, we should expect something intermediate between these two extremes; a sketch of the underlying game follows below.
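The following is a minimal sketch of the hawk‑dove game Dawkins uses, not his code or his exact numbers: the prize V and the fight cost C are illustrative assumptions, and the update rule is a simple replicator step. With C greater than V, the mix settles at a hawk fraction of V/C, the "stable point" referred to above.

# Hawk-dove game with illustrative payoffs (V = prize, C = cost of a fight).
# These particular numbers are assumptions; the stable mix is V/C hawks.
V, C = 2.0, 4.0

def payoffs(p_hawk):
    """Average payoff to a hawk and to a dove when a fraction p_hawk are hawks."""
    hawk = p_hawk * (V - C) / 2.0 + (1.0 - p_hawk) * V
    dove = p_hawk * 0.0           + (1.0 - p_hawk) * V / 2.0
    return hawk, dove

p, dt = 0.1, 0.01                  # start with 10% hawks; small time step
for _ in range(2000):
    hawk, dove = payoffs(p)
    mean = p * hawk + (1.0 - p) * dove
    p += dt * p * (hawk - mean)    # replicator step: above-average types grow

print(f"simulated stable hawk fraction: {p:.3f}")
print(f"analytical ESS (V/C)          : {V / C:.3f}")

The same frequency‑dependent logic is what the 10% "infrastructure" example relies on: whichever type becomes over‑represented relative to the available niches earns the lower payoff, and the mix drifts back toward the stable point.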

 

A new complication arises when, within one species, two or more "types" of individuals are rewarded. Since females are themselves the offspring of warrior‑type males, they cannot be expected to produce male offspring of the artisan type without some additional selection mechanism. One candidate mechanism is to hypothesize that a new type of woman co‑evolves with the new artisan man; the artisan wife would have to mate assortatively with the artisan man for this to work. As Dawkins also explains (1982), daughters of a father who differs from the norm are likely to prefer men like their father, since they inherited some of their mother's predispositions. The theory for this has been worked out mathematically, and it is called "linkage disequilibrium." Therefore, it may be theoretically possible for boys to be born with a predisposition to become artisan men, as opposed to warrior men or politician men, and still find women willing to marry or mate with them.

 

A simpler mechanism is for most of the new artisan genes to be located on the same chromosome, preferably close together. Then, when the sex cells are created by meiosis, the crossing‑over process is likely to preserve the association of artisan genes on the same chromosome segment of the new gamete, and thus pass to male offspring either an undiluted warrior type or an undiluted artisan type.

 

Let it be noted here that the artisan performs tasks that require good left brain function. So when artisan niches expand, this is equivalent to stating that there are genetic rewards for genes that produce individuals with especially well‑developed left brains. This will become an important point later in this chapter, and in subsequent chapters.

 

Problems Created by the Existence of Artisans

 

If full‑time toolmaking led to toolmakers who were exempt from the dangerous exploits of war, who even began to lose their ancestral adaptation to hunting and war‑making, then what might have been the attitude of the hunter/warrior toward the toolmaker? Would they not make fun of the toolmaker for staying home with the women and children when they went out on dangerous hunting and raiding expeditions? Would they not be inclined to tease and intimidate the toolmaker, and steal his provisions? But since the person who makes superior arrowheads and spears is too important to go on risk‑prone hunts, especially if he is poor at such things, the tribe would be served well by customs that honored the toolmaker's special status. By stating that the tribe would be served well if the toolmaker is somehow allowed to safely pursue his labors without the threat of harassment by warrior men, I am actually saying that the warriors would be served well by customs that provide for the toolmaker's protection. This dilemma might have been solved by a ritualized granting of special status to the toolmaker, with taboos (eventually converted to "laws") requiring that his tool works and other provisions be off limits to the destruction or theft that might have been tolerated for non‑toolmaker victims.

 

Artisan men may have been shunned by women, for the simple reason that they cannot protect a wife and offspring from "take‑over" males. Moreover, the tribe might benefit from artisan men not marrying, for a bachelor artisan would have more free time to practice his essential trade. (Might it be efficient if the genes also conferred upon the artisans a predisposition for homosexuality? This is a completely new theory for the origins of homosexuality.)

 

An unmarried artisan would be without the benefits of in‑laws to support his case against unfair treatment by cheats and bullies, who might covet his possessions, food supply or hut. Social pressure is an important stabilizing force within a tribe. When someone cheats another, rumors of the wrongdoing spread, and although this may not restore equity it might at least serve to discourage a repeat offense. An unmarried man has half as many people belonging to the "relatives and in‑laws" category, who stand ready to support him with social pressure, rumor spreading, or literal assistance. These problems constitute a challenge to the tribe (or rather the genes within the tribe) to institute an effective structure of taboos that guarantee protection of the artisan from non‑artisan men.

 

There are two ways to imagine how such protection could be accomplished. One is that tribes that just happen to include taboos prohibiting intimidation and theft will prosper more than other tribes; this is a "group selection" theory for the development of taboos, and eventually the rule of law. An alternative is to suppose that genes are created (and are present in the population at large) which predispose people to respect "fairness," and that such predispositions favor the adoption of specific tribal laws which protect artisans (and are available for dealing with other fairness issues). This is a more "robust" path toward the creation of laws. It requires the co‑evolution of genes and culture, dealt with below.

 

It is possible that the laws meant to protect artisans, whose numbers were surely small, were also used to a lesser extent by others. Although the others would have had less need for the laws, the opportunity nevertheless existed for them to "borrow" the protections of a status meant for the artisans by presenting themselves as having artisan abilities. (A theory of "status" might be developed from this idea, but not here.)

 

The responses just described constitute the beginnings of a new type of culture, one based on concepts of "fairness." The fairness "culturgen" must have been unfamiliar when it first became a tribal law ("culturgen" is a term for an element of culture, introduced by Lumsden and Wilson, 1981). It must have been extremely frustrating for warriors to resist taking advantage of the artisan. The idea of "status" was old, but the idea of a special status for an artisan, someone who could not defend his possessions or wife in the traditional manner, was new. It would serve as a model for new kinds of status that are indispensable for a modern civilization.

 

Further Problems with the Existence of Artisans

 

Let us be mindful of the sobering fact that all of our ancestors before the Holocene lived in a tribal setting. Tribes flourished or floundered as a group. It would be amazing if we didn't have many genes adapted to tribal living. The tribe needed an artisan, and the artisan needed the tribe. The tribe fed him, and protected him from the cruel, harsh world outside the tribal setting.

 

It's difficult for us in this individual‑worshipping, modern culture to imagine how restrictive, confining and enslaving the tribal setting was, and how important group evolutionary dynamics were. It is often stated that banishment from the tribe was equivalent to a death sentence. The kind of liberated, individual‑thinking that we take for granted today would have been rare for our tribal ancestors. Criticizing tribal rituals or beliefs would have been unthinkable, unless the individual was willing to leave the tribe and go live by himself, leaving no progeny, which is an evolutionary dead end. Maybe some people did this, but none of them are our ancestors.

 

In a tribal setting it makes sense for some of the membership to have assigned roles that contribute to tribal welfare. Individuals could have performed these roles better without the burden of family. Hence, bachelorhood (maybe even homosexuality) could have had a place in the tribal society. To use a recent example, we should all be thankful that Beethoven wasn't a family man, and that Einstein didn't allow family responsibilities to burden him. A whole host of other lesser people could also serve the tribe in this way (as they do in today's society), and they would be better able to make their contributions by eschewing family responsibilities. 

 

These people, the artisans, were expected to make individual contributions to the greater good of the larger group, and part of their individual sacrifice might have been to forsake marriage. To the extent that the artisans were expected to remain single, any young man with artisan abilities would have been perceived by women as a bad mate prospect. Parents may have steered their daughters away from men who appeared to be on this path of individual contribution. Tribal people must have had their own terms for "geek" or "egghead," and these terms would have served the purpose of discouraging young women from being attracted to “bad bet” mates. So, anti‑intellectualism may in fact have its origins long ago, with the artisan playing the role of today's intellectual: shunned, yet valued for the greater good.

 

In spite of all the special privileges bestowed upon the lucky artisan, he must have had many unexpected challenges to his individual welfare. Survival of the tribe is evolutionarily irrelevant except to the extent that the tribe's survival was a precondition for the survival of the genes within the individual. Thus, loyalty to the integrity of the tribe would have been valued by all. But the artisan is a special case. Artisans in all tribes might have been viewed as somewhat interchangeable. For example, if one tribe triumphed in battle over another, they might actually go out of their way to not kill or injure the enemy's artisan, for they could abduct him, and put him to use back home. Now, knowing this, every tribe should be suspicious of their artisan's allegiance to tribal survival, for he would have a less compelling reason for adhering to such an allegiance. Hence, even in the absence of evidence that his allegiance should be questioned, the artisan should be a worthy target of suspicion, and he should be treated as someone prone to tribal disloyalty.

 

The artisan trade must have brought with it many risks. Imagine a condition in which one tribe is being beaten down by a neighboring tribe, weakening year after year. Might the artisan want to escape before it's too late, and thus avoid the risk of being captured during battle? He might even be at risk of being killed by his own tribesmen during their defeat, as a form of the "scorched earth" strategy that is even today sometimes practiced. These conditions are conducive to all kinds of complicated intrigue, all revolving around the questionable loyalty of the artisan to the tribe. Artisans must have been both loved and hated by their fellow tribesmen. During battle, indeed, the tribesmen would want to protect the artisan because he made their arrowheads. However, he would be the first to defect in the face of a deteriorating tribal situation, for he would be accepted by any other tribe. Oh, how his fellow tribesmen must have loathed his enviable position.

 

Alas, the modern intellectual is heaped with the same scorn and ambivalence. During WWII the scientists and engineers who powered the war machinery were the artisans of their day. Britain’s Alan Turing played a crucial role in breaking the code of the Germans’ Enigma machine, helping the war effort immensely (interestingly, Turing was gay). After the war Wernher von Braun was snatched by the Allies as if he were prized booty of victory. How ruinous it would have been to the Axis if the Allies had captured von Braun at the beginning of the war. The Germans were short‑sighted to allow Albert Einstein, and a few others who contributed to the conception and building of the atomic bomb, to leave. We may never know whether Heisenberg intentionally declined to work diligently on an atomic bomb on Germany's behalf. Such is the power of the modern artisan, for if Heisenberg had pursued the atomic bomb successfully, Germany could easily have won the war.

 

Returning to the tribal setting, think of how the artisan must have viewed his fellow tribesmen. The warriors he must have viewed with disdain, for couldn't they see that the warrior was mainly “fodder” for useless battles (battles that settled nothing, as far as the artisan was concerned)? And the women who willingly became burdened with child‑bearing ‑ couldn't they see that they were being used for the tribal goal of producing warrior fodder?

 

And what about the tribal chief, who protected the artisan from exploitation by the more savvy and intimidating warriors? The chief was the artisan's benefactor, so the artisan would at least have to pretend to view him with a more loyal heart. But the chief must have thought of his artisans as a "useful asset" ‑ like a herd of cattle, needing protection in the same way as a cow or goat. The chief must have secretly snickered over this person unworthy of battle, inexcusably effeminate, yet essential for tribal survival and a contributor to the chief's job security. So the chief must have had to control his ambivalent feelings toward the artisan better than the other tribesmen did.

 

The artisan does not completely belong to his tribe. Rather, you could say that he belongs to his trade, for if his tribe fails him, his trade remains as his means of livelihood. The artisan's secret motto might have been: "If you're good at your trade, the tribesmen will come a‑courting."

 

The Holocene Artisan Explosion

 

The 5000 year climate transition from 18,000 to 13,000 years ago was a watershed period for human evolution. For the first time very large tribes assembled and adopted sedentary lifestyles. As glaciers receded they exposed new fertile lands, bathed in warmth and rain, and farming became more feasible. The domestication of both plants and animals was practiced more widely. Artisan opportunities exploded, expanding from weapons and toolmaking to such novel things as animal breeding, irrigation, grain storage, record keeping, trading, and tax collecting.

 

The Holocene should be viewed as an epoch of food surpluses created by sedentary economies, driven by the dramatic expansion of new artisan niches, which in turn created ever‑more artisan niches. This positive feedback dynamic fueled an explosion of cultural change, as well as an exploding population. Large population centers influenced farming practices across ever‑larger surroundings.

 

Many aspects of the way humans lived underwent dramatic change during the early Holocene. One that deserves comment here is that social life for the first time faced the challenge of having to deal with strangers who were not enemies from a rival neighbor tribe. Indeed, some of the strangers encountered in everyday life might have come from tribes that used to be rivals, but who could no longer be treated as enemies since they were a useful part of the expanding new economy.

 

How confusing it must have been for the first super‑tribesmen: they were surrounded by unfamiliar faces, yet these unknown faces were not the enemy. What profound implications this must have had! Aggressive behavioral responses that were meant for strangers must have been triggered at subconscious levels, almost continuously, for early Holocene man while he conducted commerce on busy city streets among strangers engaged in a similar commerce. The new conditions of public life called for a change in one's attitude toward "society," as well as one's relation to it.

 

The glue that held together tribes numbering in the hundreds, as with smaller primitive societies, was based on "inclusive fitness" relationships and repeating "reciprocity" dealings with familiar tribesmen whose history of faithful past dealings was known. The new social setting required a greater adherence to explicit "rules" ‑ which resemble taboos. Concepts of "fairness" were changed, as they included "outsiders" for the first time. Barter of goods for goods, and goods for services, and services for goods, became an everyday way of meeting needs. Artisans, who worked with their brains instead of their brawn, were a newly respected class. Whereas perhaps 2 to 4% of the pre‑Holocene tribe was an artisan, perhaps 10% or more of an economically connected population were artisan‑like. Artisan types proliferated; instead of just toolmakers, the new artisans constructed irrigation works, farmed, processed food, stored grain, tended markets, made clothes, kept records and governed.

 

What had happened to the old structure, with just hunter/warriors and maybe a part‑time toolmaker? Few people hunted, and the warrior class had shrunk to a minority, with diminished power. Things had been turned upside down during the hectic few millennia spanning 13,000 to 6000 years ago.

 

This was the transition to a new condition called “civilization.”

 



CHAPTER 10

 

ARTISANS SET THE STAGE FOR CIVILIZATION: PART II

 

The Co‑evolution of Genes and Culture

 

What is culture? Is it created by the genes, or superimposed upon human behavior from the "outside" as a new environment within which the genes must adapt and with which the genes have no "experience"? Or is culture something in between?

 

Let's begin with an 1896 assessment by H. G. Wells of what culture is and why it is so important to the human future.

 

"...it appears to me impossible to believe that man has undergone anything but an infinitesimal alteration in his intrinsic nature since the age of the unpolished stone.  ...  A decent citizen is always controlling and disciplining the impulses...  ...it is indisputable that civilized man is in some manner different ...  But that difference ... is in no degree inherited.  ... With true articulate speech came the possibilities of more complex co‑operations...  Came writing, and therewith a tremendous acceleration in the expansion of that body of knowledge and ideals which is the reality of the civilized state. ...in civilized man we have (1) an inherited factor, the natural man, who is the product of natural selection, the culminating ape, and a type of animal more obstinately unchangeable than any other living creature; and (2) an acquired factor, the artificial man, the highly plastic creature of tradition, suggestion, and reasoned thought. ...in a rude and undisciplined way indeed, ...humanity is even now consciously steering itself against the currents and winds of the universe in which it finds itself. In the future, it is at least conceivable, that men with a trained reason and a sounder science, both of matter and psychology, may conduct this operation far more intelligently, unanimously, and effectively, and work towards, and at last attain and preserve, a social organization so cunningly balanced against exterior necessities on the one hand, and the artificial factor in the individual on the other, that the life of every human being ... may be generally happy. To me, at least, this is no dream, but a possibility to be lost or won by men, as they may have or may not have the greatness of heart to consciously shape their moral conceptions and their lives to such an end."  H. G. Wells, "Human Evolution, An Artificial Process," Fortnightly Review, Oct, 1896.

 

Wells viewed human nature as unchanging during our acquisition of a changeable culture. He leaves unaddressed whether or not he thought cultural changes were influenced by the genes (which hadn’t been generally recognized in 1896), but he doesn't believe that culture changed our inherited nature (our genotype).

 

Charles J. Lumsden and Edward O. Wilson, in their book Genes, Mind and Culture (1981), take the position that the evolution of culturgens causes evolutionary change in the genome, which in turn allows new directions for culture, etc. Each affects the other, and together they co‑evolve, reaching the elaborate cultural level of today's modern world. Their book is excessively mathematical, and can discourage all but the most accomplished mathematician; however, it is not necessary that one follow the rigorous mathematical treatments in the book to comprehend the concepts presented. I agree with their argument, and will present a synopsis of it here.

 

It is easy to understand that cultural evolution is channeled by what is "possible." As a trivial example, consider a hypothetical cultural element, or "culturgen," forbidding the drinking of liquids. Any person fool enough to adhere to this regimen would die (unless sufficient liquids were present in the solid diet). Not only would practitioners not live long enough to spread the culturgen, but the need for liquids is so strongly rooted in our genes that the weird culturgen would lack appeal and fail to win converts.

 

A pork taboo, on the other hand, would face less resistance, and indeed has appeared at a location and time when eating pork was probably too risky to be worth its nutritional benefit. Thus, the genes "allow" some culturgens but not others.

 

The incest taboo is a well‑studied culturgen, and it is found in all societies. Genes have evolved that identify incest situations and produce an aversion to their completion (thereby preventing the homozygotic expression of recessive genetic defects in offspring). The incest taboo is present in all human societies, and incest avoidance is found in many other species as well. When it appears in animals that we normally do not believe are capable of culture, the behavior can be said to be hard‑wired. It is not strongly hard‑wired in humans, because incest behaviors do sometimes occur and the details of the taboo differ from culture to culture.

 

The incest taboo is a "permitted" culturgen; indeed, it is a predisposition that the genes have been coding for during all of human and pre‑human ancestry. Therefore, any genes that influence incest behavior will be under selective pressure, and can be quickly selected into existence if they exhibit adaptive nuances ‑ such as favoring first‑cousin matings.

 

The following hypothetical culturgens illustrate the range of likely to unlikely: 1) the celebration of successful warriors versus scorning them, 2) offering help to fellow tribesmen versus offering help to neighboring tribesmen, 3) sharing food with relatives when there's extra food versus denying food to relatives, and 4) adopting tribal culturgens versus mocking them. The genes aren't "dumb," and not all culturgens have an equal chance for acceptance.

 

The "other side of the coin" is to ask if the evolution of genes can be affected by an entrenched culture? The key word here is "entrenched." There is a tendency for all members of a tribe to adopt the same culturgens, a noticeable human trait called "conformism" (Boyd and Richerson, 1996; Henrich and Boyd, 1998). In sharing a culture, people adopt most of its culturgen elements. There must have evolved a gene for a brain function that causes individuals to be unquestioning joiners, and all people have the gene.

 

Consider a person who is more willing than others to try out new culturgens.  Not only will he be burdened with many new culturgens, most of which will be maladaptive, but his beliefs and behaviors will make him resemble someone from another tribe. Since every tribe is in conflict with neighboring tribes, a person who appears to belong to another tribe will be severely handicapped in gaining acceptance by his own tribe. Unless this open-minded individual happens to adopt a highly adaptive culturgen, his aberrant beliefs and behaviors will not be tolerated by his fellow tribesmen and he will be banished by them. Therefore, any gene that inclines a person to be open-minded is likely to quickly disappear from the human genome.

 

Yet, new culturgens do occasionally appear.  So there must be a mental calculus of perceived benefit versus cost that allows some individuals to adopt a new culturgen without being banished. Perhaps tribesmen who have attained a position of unquestioned tribal loyalty are able to try out mildly new culturgens with impunity. If a successful huntsman uses a new arrowhead shape, then other tribesmen may be curious about it and may eventually adopt it. If the chief attributes a tribal victory to a new spirit, the others may consider accepting this new spirit. Everything new has a barrier for acceptance. If enough individuals are willing to overcome their natural resistance to a new idea then a generally-accepted culturgen shift within the tribe might eventually occur. If this happens, then those who remain uncomfortable with the new culturgen would be at a disadvantage, and the gene that codes for their culturgen preference would face a slow extinction.

 

This illustrates how culture may influence the evolution of genes.

 

To cite a specific, hypothetical example illustrating the co‑evolution of genes and culture, imagine the first groups of Africans to migrate northward after the start of an interglacial warming. Upon reaching Europe, the migrants would have encountered retreating glaciers, rivers of melt water, abundant plant growth, grazing animals, and slow‑moving Neanderthals (a closely related human species). These migrations might have occurred 120,000 years ago, 70,000 years ago, and 13,000 years ago (though in this last case the migration would not have encountered Neanderthals, who were displaced by humans about 30,000 years ago). The new setting presents many opportunities, but it also demands many behavioral adaptations. Seasons are more extreme, and procuring food in winter is different from procuring it in summer. Uneaten food doesn't spoil as quickly as in the jungle, so food storage is not only possible, it is essential to avoid starvation in the winter.

 

In this new land with seasons it makes sense to establish a home base in the fall where food provisions can be kept for use throughout winter. New customs are needed, as are new instincts. Whereas jungle life has no rewards for those who store food, glacier's edge life demands it! The impulse to eat whatever food is present is now a liability. Impulse control on this, and other matters, is important. Conscious thought is brought to bear on such tasks as providing food stores for the winter, protecting these stores from theft by animals (and other human groups), planning ahead by making winter clothes from animal skins while the skins are available, finding a cave before others, constructing a shelter, and many other season‑related tasks.

 

As described earlier, every population of individuals will exhibit a spectrum of pre‑adaptations and pre‑maladaptations to a totally new challenge. Those who are naturally inclined to possess impulse control, for example, will be inclined to adopt culturgens requiring impulse control. In the absence of a large tribe with an entrenched culture, individuals are freer to discover their innate "usage probabilities" for new culturgens. The probability of transition from a previous culturgen to a competing new one is set by the genes, but it also depends on the situation (physical environment, social setting, etc.). People in the same situation will have different "transition probabilities." Those who are quicker to make the transition to the new culturgen are relatively "pre‑adapted" to the new setting.

 

Whereas such pre‑adapted people did not have a competitive advantage in the jungle, they are the new winners in a mid‑latitude setting. Their pre‑adapting genes create more successful individuals, and their genes will spread through the gene pool of those groups that migrate north.

 

As a new collection of culturgens accumulates, creating a new culture, some genes become mal‑adapted (to the new culture). For example, genes for impulsivity handicap the individuals who carry them. If those people fail to set aside winter food stores, and are forced to steal from neighbors during the winter, they are at greater risk of injury or death at the hands of those who are protecting their food stores. Agreements may be formulated among like‑minded provisioners, requiring the group to take action against those who don't respect other people's "property." An individual who has trouble grasping the new concepts of "property" and "property rights," and the consequences of "stealing," will be dealt with harshly by the majority, once these new culturgens are adopted.

 

These examples illustrate how a new environment can change culture, and how a changed culture can influence the fate of genes, causing gene allele frequencies to change. As one change becomes established, new selection pressures exist on the other. And selection pressures work in both directions: new genes alter "transition probabilities" for the adoption of new culturgens, and newly adopted culturgens alter the selection pressure on genes. Thus, genes and culture co‑evolve.

 

I claim that when tribes began to subsidize the full‑time employment of artisans, possibly during a warming 70,000 years ago, the stage was set for an explosion of new artisan‑like niches, and that when the Holocene interglacial began, some 12,000 years ago, the explosion of changing cultures began. The new niches include such things as agricultural farmer, domesticated animal farmer, tribute record keeper, clothier, entertainer, priest, government administrator, merchant, full‑time soldier and others.

 

Before these changes began, everyone within a tribe assumed the same roles, which dealt mostly with providing food and fighting neighboring tribesmen. The Holocene saw fewer and fewer people engaged in the traditional, all‑roles lifestyle; an ever‑increasing fraction of people in urban centers became engaged in specialized roles, having nothing to do with food production or fighting wars. Country living yielded to living within or close to cities. Culture became more complex and powerful, and it played a growing role in the selection of gene alleles that were pre‑adapted for new niches. This is the story of the expansion of culture and the birth of civilizations.

 

Defining Civilization

 

Let us ponder the term "civilization." Like most people, I know it when I see it, but it may nevertheless be instructive to struggle with defining it.

 

Perhaps the root word "civil" is the key to its definition. Where civil social interactions are common, there resides a civilization. However, uncivilized primitive people are usually "civil" to each other; their "civility to strangers" is another matter. It is well known that the "tribal mentality" (Spencer, 1892) requires that two separate codes of morality be used; one is meant for intra‑tribal interactions (amity) and the other for extra‑tribal interactions (enmity). However, when neighboring tribes trade goods, they are civil with each other. Even if that civility is due to a fear of retribution, fueled by not knowing the ferociousness of the stranger tribesmen, "civility to strangers" still lacks the essential trait we're looking for.

 

I'm going to suggest a definition based on an observation that has probably never been suggested before. I assert that a civilization is the product of left‑brained values and productive activities. Consider the dictionary definition: civ·i·li·za·tion, n. 1. An advanced state of intellectual, cultural, and material development in human society, marked by progress in the arts and sciences, the extensive use of writing, and the appearance of complex political and social institutions.  Note how left‑brained these qualities are: material and intellectual developments, writing, science, and complex social and political institutions.  These are things that left brains value, and only left brains can do them!

 

The dictionary's phrase "the appearance of complex political institutions" conceals a deeper truth about civilizations. One of the functions of complex political institutions is to safeguard the rights of individuals from violation by the collective. I believe that ever since the left brain began its specializations for what we now recognize as LB‑style thinking there has been a conflict between obligations imposed upon the individual by the group versus LB‑style individual aspirations. The group wants conformity, and it endeavors to suppress individual expression. Those individuals who identify with group conformity are agents of the collective will, which is to say, they are dupes of the genes! For it is the individual with an independent will, fortified by a strong LB, who is unwilling to remain subservient to his RB, who can show the way to liberation from the genes that wish to keep us dedicated to serving their "needs." LB‑style individuals protect themselves from exploitation by the genes by constructing political institutions, such as a legislature, a police system, and a justice system. When these institutions work, they control the collective's meddlesome intimidation of the individual who is minding his own business as he creates his own individual path through life.

 

Too often the political institution is hijacked by the enemy of the individual. Communism is a case of the collective usurping power from those who created the institutions to protect the individual from the insidious meddling of the collective. I believe that communists are well‑meaning, but misguided by a naive understanding of human nature. They mistake society for the family. Within a family it is natural for each to take from those who have and give to those in need. This makes genetic sense, since the members of a family have a strong genetic relatedness to each other. It even makes sense for a tribe to behave in this communistic way, though to a lesser extent than for the family. But human nature will not endure the attempt to bestow family and tribal obligations upon the larger social group of a society. Society can never be made to look like a family or a small tribe. Human nature has been molded for competition with neighboring groups, and it mobilizes our energies to defeat them. Communism must suppress individualism; it embodies the classic conflict between the needs of the collective and the aspirations of the individual. Communism is the enemy of everything valued by LB. LB wishes to liberate the individual from the collective, and communism thwarts these liberating ambitions.

 

The dictionary definition also refers to progress in the sciences. Science is a discipline that requires strong LB involvement. For a scientist, RB must play a supportive role, though it does "point the way" and "give opinions" when its intuitive feel for matters is useful. During my practice of science (in the physical sciences) my RB contributions have been important, but supportive. I will share credit with RB for my four patents, and other creative labors. Intuition is an essential guide through the labyrinth of possibilities faced by a researcher in any branch of science. Hunches that pay off advance every investigation. But the entire enterprise is overseen and guided by a disciplined LB. After the inspiring moment (which has happened to me many times), while the emotional excitement swells, LB goes into action and begins to "work out" the idea. Logical consequences of the idea are pursued, and tests of it are devised. A moment's inspiration can lead to many years of an unfolding, LB‑guided investigation. Without a specialized LB, science could not progress.

 

As an aside, I believe the inclusion of "art" in the dictionary definition is a mistake.  If we somehow could remove all the "arts" from Western civilization, would we still think it was a civilization? If we retained the same level of literature, science, technology, musical heritage, material standard of living, sophisticated governing institutions, medical knowledge, and insight into how things work, I claim that we would still call it a civilization. Primitive societies have their "art" ‑ and sometimes it's quite good art, easily rivaling "modern art" in appeal. Cro-Magnon artistic renderings are impressive, yet the Cro-Magnons did not possess a civilization. I maintain that "progress in the arts" is not an essential aspect of a civilization.

 

The dictionary definition for civilization refers to "the extensive use of writing." Writing is an LB activity, with key roles for both Broca's and Wernicke's areas, and others within LB. True, RB plays a role, but it is a supporting role, similar to the role I described for the pursuit of science. If RB tried to write by itself, it would lamely produce pat sayings, poetry open to any interpretation, and profanity; it would be unable, for want of a Broca's area, to produce syntactical prose. This much we know from brain research that reveals what RB is capable of verbalizing when LB has been disabled.

 

There are more candidates wishing to be called a civilization than are deserving of it, the way I prefer to define "civilization." For example, the Mayan is often referred to as a civilization. The more we learn about it, the more despicable it appears to have been. For every trace of accomplishment, there are several barbaric, bloody practices. Yes, their artisans devised a complex calendar, and massive stone pyramids and temples, but they were used for the most inhumane ceremonies in the human record. The Mayans are an embarrassment to humanity! The Mayan individual was a victim of his culture (I will grant them "culture" status, but not civilization status). I pity those poor LB‑style Mayan artisans who must have existed, for their labors were used to advance a collective appetite for brutality that only crazed contemporaries could admire. It appears that the insane Mayan culture was the captive of a right brain that was answering every call of the reptilian brain. Anyone with a strong LB would have had a limited opportunity to influence societal values, and would have been relegated to improving the calendar, overseeing the construction of killing temples, and fiddling with hieroglyphs for recording the glorious deeds of their murderous employer.

 

Most so‑called civilizations are a mixture of the Mayan type (regrettably dysfunctional) and the ancient Greek example (admirable). The early Greek civilization produced truly ground‑breaking insights into the nature of reality and Man's place in it. I will not present a systematic listing of civilization candidates, and their salient features. Rather, I will use three civilizations from among the many to illustrate dynamics that to some extent must have been present in them all. Let us first consider the rise of one of the first civilizations, the Minoan, and try to learn what drove its ascent.

 

Civilization Growth Phases

 

The Minoan civilization grew through three stages: an Early Period, 3000 to 2100 BC, a Middle Period, 2100 to 1600 BC, and a Late Period, from 1600 to 1326 BC, which came to an abrupt end when the volcano on the island of Thera erupted and, by destroying much of their infrastructure, rendered them helpless against invasion by the Mycenaeans. A tsunami probably destroyed the coastal settlements along Crete's north shore, where the Minoan civilization was also present. Prior to their demise, the Minoans were merchants and traders. They plied the Mediterranean, moving products from port to port for profit. Their merchant ships were well designed and very functional. They had multi‑story residences, with water delivered to some of them, aqueduct style.  Their standard of living must have been one of the best for their time. The Minoans were a peaceful people, as they apparently had no army. Their art was elaborate and accomplished, and when the subject matter included humans it usually depicted athletic performances and dancing ‑ never warriors, battle scenes or angels.

 

Although I do not consider "art" to be an essential component of civilization, it can be used to understand something about the nature of their society. Art can also be used to provide clues to the evolution of their rise to power. Consider the samples of pottery from each of the three periods, and note the style they used to form and decorate them.

Figure 10.1 Samples of Minoan pitchers and vases from the Early, Middle and Late periods. From Time‑Life Books (1975).

 

Ask any neuropsychologist to view the above sequence of vases and he would immediately recognize that there's a progression from a functional form preferred by the left brain to a decorative form preferred by the right brain. LB prefers straight lines and functional shapes; RB prefers curved lines, ornamentation that is elaborate, bizarre and sometimes incongruous (e.g., the baroque style), and extra flourishes that may detract from functionality.

 

What could this progression of patterns mean? Assuming that artisans made what their patrons wanted, it means that people in power during the Early Minoan Period were LB‑styled, whereas by the Late Period the power had shifted to RB‑style people. And, assuming that this interpretation is correct, how could this factoid illuminate our understanding of how civilizations ascend? It says, I believe, that the earliest stage of a civilization's rise is driven by LB‑style people. And it also says that during the unfolding of a civilization the reins of power are captured by RB‑style people. This last speculation will be taken up in the chapter that deals with the decline of civilizations.

 

In Chapter 15 I present evidence that the per capita output of technological innovations rises over time to a peak, then subsides ‑ while the economic activity of the civilization continues toward a peak that occurs a few centuries later. In the case of the Greco‑Roman civilization, the population peak (a proxy parameter for economic activity) followed the innovation peak by 5 centuries. In the case of the present Western (European‑American) civilization, the population peak will follow the innovation peak (which occurred in 1900 AD) by at least a century, and probably two centuries.

 

The innovation peak corresponds to a period when society gives the greatest freedom to LB‑style people, by celebrating their efforts, paying for their services, giving them a status that exempts them from warrior service, and publicly recognizing that LB activities are good for the general welfare. As I argue in the next chapter, RB‑style people are "people‑oriented" as opposed to "artifact oriented," and they are good at manipulating other people for their personal gain. This talent of one segment of the population leads to a gradual displacement of the LB‑style people from power, thus explaining the shift in preferred art form during the course of a civilization's unfolding.

 

The world's innovation per capita has two major peaks, one at 300 BC and the other at 1900 AD. We know more about the recent peak, so let's consider it from the standpoint of LB versus RB. It is generally recognized that the 15th and 16th Century Renaissance led to the 17th Century Enlightenment, which led to the explosion of 18th and 19th Century industrialization. The Enlightenment was a unique chapter in human history, generated by a changed "climate of opinion." The intellectual atmosphere was dominated by thinkers who, like Voltaire, penetrated the cobwebs of previous centuries and saw things the way they were. Voltaire was a nuisance to the church, politicians, and traditional intellectuals because he would not be tamed. He saw through the posturing and pretense of phony pontifications and despised the veneer of social acceptability; instead, he was cynical, skeptical, uncompromising, and had an acerbic wit. He exemplifies the LB‑style artisan. Other Philosophes, like Holbach and Diderot, worshipped the Goddess of Reason, and ushered in the view that it is within human power to create a world, based upon Reason, to replace the old unrealistic dream of a Heavenly City, where perfection and felicity were supposed to dwell for eternity (see The Heavenly City of the Eighteenth‑Century Philosophers by Becker, 1932). The Philosophes promised a new Heavenly City, built on Earth by what can now be seen as LB insights, and guided by LB logic.

 

Civilizations Falter

 

The 19th Century began to make good on some of these promises. Inventions just kept coming, insights into physics accelerated, and Darwin presented the world with one of mankind's greatest insights into where we came from and who we are ‑ all based on LB observation and reason. The pace of discovery and industrialization continued into the 20th Century, starting with Einstein's succession of profound insights into the nature of the physical world. Science, technology and engineering were held in high public regard ‑ until the Great Depression. For a decade, during the global depression in both Europe and the U.S., a malaise stifled the spirit and called into question LB's warrant for carrying the Torch of Progress. Criticism of society was suddenly unwelcome (poor H. L. Mencken lost his audience). For 15 years serious thinkers were contemplating the end of civilization, as Diderot and the other Philosophes had worried might happen during the 18th Century. World War II resurrected the reliance upon technology for weapons, and the engineer was again cast as savior. The atomic bomb kindled the bittersweet importance of science. Sputnik forced a new dedication to giving power to the scientists and engineers, not because of a desire to build that dreamed‑of Heavenly City on Earth, but out of concern for national survival.

 

The Apollo program that landed 12 men on the moon was LB's last hurrah! From the social upheaval of "The Sixties" came a change in the "climate of opinion" ‑ which lasts until today. LB accomplishments didn't stop, and in fact continue to be made use of, but they were not publicly applauded (except for an occasional rover on Mars or Hubble Space Telescope picture). Political correctness was created to discredit and stifle LB values. The greatest insight that Mankind has achieved occurred midway through the last half of the 20th Century, yet almost no one knows about it today. Mankind's greatest discovery is Sociobiology! It is the crowning achievement of LB thinking. It is comparable to the 19th Century's discovery that the physical world is reducible to the invariant laws of nature. Sociobiology forces living systems into this physical world, and accomplishes the supreme feat of Reductionism ‑ everything, including life, is governed by invariant physical laws, and all happenings reduce to an unfolding of physics, where a = F/m and quantum physics determine everything!

 

My measure for a civilization is that people have an honest understanding of who they are. If only a few percent have this glimmer of understanding, it constitutes a civilization. The Greeks qualify, thanks to such luminaries as Thales of Miletus, Anaximander of Miletus and Democritus of Abdera, and some of the ancient Romans qualify, thanks to thinkers like Lucretius. The 20th Century, Western Edition, qualifies because of such sociobiologists as W. D. Hamilton, G. C. Williams, Robert Trivers, Edward O. Wilson and Richard Dawkins.

 

The Chinese have seen many civilizations rise and fall in their land, and during the past millennium they have often been more advanced than their contemporary European civilization on measures commonly used to describe civilization. Their technologies raised living standards, but as far as I can determine they repeatedly failed the boldness test concerning the quest for insight into the nature of reality. Their "philosophy" suffers from an excess of intuitive, RB‑style nonsense! In my opinion they never achieved the level of insight of Thales, Democritus or Lucretius, and their 19th and 20th Century stifling "collective versus individual" culture has made them bystanders while Western thinkers explored beyond the Greek giants, led by Schopenhauer, Bertrand Russell, and the sociobiologists.

 

On many occasions the Chinese abandoned their relatively advanced technology, and reverted to living in an RB world. This "failure of nerve," or unwillingness to pursue Truth into areas where it "hurts," constitutes what seems to me to be an endemic Oriental flaw. Their practice of physical science suffers from the same intellectual timidness. Even though the Chinese score higher on IQ tests than all other races (except the Jews), there's something about their frontal lobes, or something about their RB‑style of thinking, that causes me to question their ability to boldly advance human understanding of big picture matters on "the nature of existence" or "who we are." Until they allow the individual more freedom from the will of the "collective" the contributions of the Chinese civilization will be confined to mostly engineering. This, I'm sorry, is my humble opinion.

 

Thanks to a specialized left brain, two great civilizations, by my reckoning, have arisen during recorded history. I rank them "great" because they celebrate the individual, and they bring us closer to a stage of human evolution when we shall subdue the collective mentality. Outlaw genes created this desire to conform to what's good for the collective, and we are now discovering that we have been dim‑witted slaves too long! LB is leading the way to emancipation, and it is accomplishing its feat by creating civilizations. The job of liberation has not been accomplished, but the stage has been set for its serious pursuit.

 

Thus we stand at the cusp of two millennia, looking back at many failed civilizations, at least two great ones, and wondering where ours is headed. Most people are unaware of human servitude to the genes and the collective they've created, while others, like me, wish for liberation and wonder whether this will be the century when humanity's emancipation is finally achieved. If my claim is true that each civilization is the result of LB efforts to improve the life of the individual, with the unforeseen consequence of bringing the individual closer to liberation from the grip of outlaw genes, then we have a tool for discerning the health of today's civilization, and predicting its future.

 



CHAPTER 11

LESSONS FROM SAILING SHIPS:
AN INTRODUCTION TO GROUP SELECTION THEORY

 

“For now I see peace to corrupt no less than war to waste.” John Milton, Paradise Lost, 1667

 

Imagine being a crew member on a merchant ship setting sail for a crossing of the Atlantic Ocean during the 18th Century. There will be storms and the constant threat of pirates during the 7-week journey. The voyage is sponsored by merchants who want the cargo to arrive safely, by the ship’s owner, who wants to preserve his investment by having his ship arrive intact, and by the captain and crew, who wish to arrive safely so that they can be paid and continue their lives. All of these interests favor cooperation by everyone on the ship in the mission of operating the ship properly on the high seas and delivering its precious cargo safely to the opposite shore.

 

Each person on the ship has one or more assigned jobs. Presumably the assignments are made on the basis of ability for the needed tasks. It won’t matter that one crew mate is an excellent runner, or hunter, or mountain climber, or jungle explorer, for on the ship these abilities don’t matter; he will be measured by his performance of assigned tasks. Each crew member’s fate will be affected by the quality of his crewmates and the manner in which they all work together to navigate the ship safely to port. When each mate discharges his task with competence and cooperation the entire endeavor is helped, and the prospects for a prosperous outcome for all mates are improved.

 

This situation is a simple way to introduce the concept of “group selection theory.”  During the voyage all people aboard the ship will either live as a group, or die as a group. This is a more extreme example of a tribe either entirely living or dying during conflict with a neighboring tribe, but the concept is easier to grasp using the sailing ship example because the ocean is deep and unforgiving, with a history of taking entire crews to the ocean bottom.

 

With the ship analogy in mind let’s consider the tribal situation; after all, the tribal setting is the one our ancestors had to survive in for millions of years. If a tribe is in chronic conflict with a neighboring tribe, the losing tribe might be decimated. This prospect has a message for individual members who pride themselves on being proficient in some irrelevant realm. An individual with a talent for basket design, for example, will have a useless talent when there are more compelling needs for warrior talent.

 

So what makes a good warrior? There are the obvious factors of strength, agility and other skills. Two other factors deserve special attention: altruism and intolerance.

 

Altruism is defined as a willingness to forego individual payoffs in order to achieve a payoff for another individual or group of individuals. Two explanations are commonly offered to account for the existence of altruism. First, if the cost to the altruist is small, and the benefit to the other person is great, and if the interactants have recurring relationships, then it is easy to imagine that a series of such acts can yield benefits to all participants if there are several such interactions with opposite sign. (The “sign” of the interaction refers to which person is the recipient of the altruistic act.) Notice that this dynamic does not require that the two people have a close genetic relationship.
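
A minimal numerical sketch of this first explanation (reciprocal altruism), written in Python, may help. It is my own illustration, not taken from any cited author; the benefit b, cost c, and number of rounds are arbitrary assumptions chosen only to show that repeated exchanges with alternating "sign" reward both partners, while a one-shot act rewards only the recipient.

def payoffs(b, c, rounds):
    """Payoffs to (first giver, first receiver) after `rounds` alternating acts
    of help, where each act costs the giver c and delivers b to the receiver.
    Illustrative sketch only; b, c and rounds are assumed values."""
    a_gives = (rounds + 1) // 2      # partner A helps on rounds 1, 3, 5, ...
    b_gives = rounds // 2            # partner B helps on rounds 2, 4, 6, ...
    payoff_a = b_gives * b - a_gives * c
    payoff_b = a_gives * b - b_gives * c
    return payoff_a, payoff_b

print(payoffs(b=5.0, c=1.0, rounds=10))  # (20.0, 20.0): repeated exchange pays both
print(payoffs(b=5.0, c=1.0, rounds=1))   # (-1.0, 5.0): a single act pays only the recipient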

 

The second explanation for altruistic acts requires that the two individuals be closely related. J. B. S. Haldane famously quipped, in answer to a question about altruism, that he would willingly give his life for two brothers or eight cousins. The calculus of genetic payoffs of this type is now called “inclusive fitness,” and it states that our brains are designed to recognize when a sacrifice is likely to confer a greater benefit than loss to our genes, which are present in our near relatives as well as in ourselves.
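
The arithmetic behind Haldane's quip is captured by what is now called Hamilton's rule. The rendering below is my own summary using the textbook coefficients of relatedness (1/2 for a full sibling, 1/8 for a first cousin); it is not a derivation taken from this book.

% Hamilton's rule: a gene for an altruistic act is favored when r b > c,
% where r is the actor's relatedness to the beneficiary, b the fitness
% benefit to the beneficiary, and c the fitness cost to the actor.
\[
  r\,b \;>\; c
\]
% Haldane's calculus, taking the cost c to be the altruist's own life (c = 1):
%   two brothers saved:        2 x (1/2) = 1, which just balances the cost
%   eight first cousins saved: 8 x (1/8) = 1, which just balances the cost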

 

Finally, there’s a “group selection” theory that can account for altruistic acts. If a tribe is at risk of being decimated by a rival, and if the home tribe is desperate, then there’s logic in some individuals making high risk attempts to turn the tide of battle. It’s not necessary for the hero to be closely related to his fellow tribesmen since all of them will either survive or be killed depending on the outcome of the battle. This is analogous to ship mates dealing with an emergency at sea which requires heroic action to save the ship and all its crew. The genetic relationship of the sailors is irrelevant to the need for action.

 

A heroic warrior can be seen as an altruist. He risks his life in order to save the tribe because saving the tribe also saves the hero. Genes that predispose to this form of altruism should be selected for by evolution whenever tribes live in chronic conflict with their neighbors. The prediction is borne out, at least in game theory simulations (Choi and Bowles, 2007). Since the altruistic acts benefit only those in the home tribe it has been referred to as “parochial altruism” (“parochial” refers to a concern that is narrowly restricted, or a way of thinking that is “provincial”). The notion that genes predisposing for “parochial altruism” will evolve when tribes are in conflict is based on “group selection theory.”

 

There’s an interesting aspect to the way in which this kind of parochial altruism is elicited, which has also been pointed out by Choi and Bowles (2007) as well as Wilson and Wilson (2007). It pertains to intolerance, an unwillingness to overlook individual or group differences. For example, if fellow tribesmen dress one way and someone is seen dressing another way (not incorporated into tribal rituals), the non-conformist will not be tolerated. Perhaps there were instances in our evolutionary past when a brave member of a neighboring tribe sneaked in to assess tribal strengths and weaknesses in preparation for later warfare. Such a person would be noticed as a “stranger” who dressed differently. A tribe whose members were tolerant might merely shrug and leave the stranger alone, whereas a tribe with intolerant members can be expected to challenge the stranger and demand an explanation of who he was and what he was up to. Clearly, if tribes are in chronic conflict, conditions favor genes that predispose to intolerance. Thus, conditions of chronic conflict should increase the incidence of two types of genes: those that predispose to “parochial altruism” and those that predispose to intolerance. The game theory simulations by Choi and Bowles (2007) show that indeed both genes increase their representation in hypothetical gene pools that are in chronic conflict.
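
To make the logic of such simulations concrete, here is a toy version written in Python. It is emphatically not the published Choi and Bowles (2007) model; the group count, group size, altruist cost, and mutation rate are arbitrary assumptions, and the point is only to exhibit the two opposing pressures: within each group the "parochial altruist" allele pays a cost, while between groups it raises the chance of winning a conflict. Varying COST gives a feel for when one pressure overwhelms the other.

import random

# Toy sketch of group selection for a costly "parochial altruist" allele.
# All parameters are illustrative assumptions, not values from the literature.
N_GROUPS, GROUP_SIZE, GENERATIONS = 20, 30, 200
COST = 0.1        # within-group fitness cost paid by each altruist
MUTATION = 0.01   # per-offspring chance of flipping the allele

def new_population():
    # Each individual is True (parochial altruist) or False (selfish).
    return [[random.random() < 0.5 for _ in range(GROUP_SIZE)]
            for _ in range(N_GROUPS)]

def generation(groups):
    # 1) Inter-group conflict: groups fight in random pairs; the group with
    #    more altruists is more likely to win, and the winner repopulates
    #    the loser's territory.
    random.shuffle(groups)
    survivors = []
    for g1, g2 in zip(groups[::2], groups[1::2]):
        p1_wins = (sum(g1) + 1) / (sum(g1) + sum(g2) + 2)
        winner = g1 if random.random() < p1_wins else g2
        survivors += [list(winner), list(winner)]
    # 2) Within-group reproduction: altruists reproduce at a small discount.
    next_gen = []
    for g in survivors:
        weights = [1.0 - COST if altruist else 1.0 for altruist in g]
        kids = random.choices(g, weights=weights, k=GROUP_SIZE)
        kids = [(not k) if random.random() < MUTATION else k for k in kids]
        next_gen.append(kids)
    return next_gen

groups = new_population()
for _ in range(GENERATIONS):
    groups = generation(groups)

freq = sum(map(sum, groups)) / (N_GROUPS * GROUP_SIZE)
print(f"final frequency of the parochial-altruist allele: {freq:.2f}")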

 

Sometimes one tribe so decisively overwhelms opposing tribes that it creates a form of peace that lasts for several generations. The evolutionary forces that selected genes for intolerance and parochial altruism are relaxed, and in their place are new forces that reward the opposite genes. During peace, genes are selected that predispose to tolerance and selfishness. Again, this dynamic was demonstrated to exist in the simulations by Choi and Bowles (2007). Wilson and Wilson (2007) as well as Turchin (2007) have suggested this scenario as a way to understand the fate of empires. Indeed, this is one way to view the decline and fall of civilizations.

 

It seems ironic that war and peace elicit genes with opposite traits. How can these reversals be achieved? Two modes are possible. Either the population evolves in a way that changes the representation of “genetic types” or the individual members take readings of an ever-evolving social setting and automatically adjust their attitudes and behaviors. Both modes are based on gene expression, but the latter is more sophisticated. Just as the immune system takes readings of pathogens in the blood and adjusts its activity accordingly, the brain is capable of reading social situations and adjusting its activity in an adaptive manner.

 

There are two important clarifications for this use of the term “adaptive.” First, something is adaptive if it helps the genes that produce it to survive better. Second, the specified change is adaptive (for the genes) provided the current setting is similar to the “ancestral environment.”

 

The first clarification conveys the message that behaviors that help genes survive may not be in the best interests of individual welfare. Consider the switch from peace time to war time; the individual is expected to become intolerant and hateful, and he is expected to sacrifice his life through heroic acts that protect the home tribe. His fellow tribesmen may benefit by this heroism, but not the hero.

 

The second clarification has become important in modern times because tribes have been replaced by nations consisting of members from many genetic backgrounds. Japan is one of the few nations that has preserved its genetic purity, so there may be some genetic sense for the Japanese to engage in extreme acts of heroism (e.g., kamikaze heroics). It is also noteworthy that the Japanese in peace time have one of the lowest crime rates in the world. For them, the current environment resembles the ancestral one in important respects. But for most other nations the populations are so genetically diverse that the genes are foolish to create individuals willing to become loyal patriots ready to fight to the death for the Fatherland.

 

If humans were capable of sanity they would mock patriotism for the pointless suffering it inflicts upon humanity. Patriotism has always been pointless from the perspective of the individual, but it is now also pointless from the perspective of the group. Yet, it cannot be eradicated since it has been so crucial to genetic survival for so many generations.

 

What a pathetic situation humans find themselves in. Anyone who mocks patriotism, who points out that it serves no purpose, will be branded “unpatriotic” – and their message will not be heard. The need to enforce patriotism has been so strong for our ancestors that they created a mythical entity to help enforce it: God. This creation was instigated by the genes, of course, since they were the beneficiaries of behaviors that secured their survival at the expense of individuals. Since the modern “state” is an outgrowth of primitive tribes, governed by chiefs and their helpers, it can be said that the church and state were meant to work together. The 18th Century struggle to separate them was motivated by a subconscious realization that individuals were the victims of this collaboration. The separation of church and state is a historical aberration, doomed to a short existence. Every humanist should be sad that the few bastions of 20th Century sanity are doomed to revert to their former evil state in the 21st.

 

In trying to understand the rise and fall of empires it will be wise to keep in mind the possibility that they are related to the rise and fall of genes that predispose for parochial altruism and intolerance. Other factors deserve consideration. Most of the forces causing empires and civilizations to rise and fall are based on evolutionary changes to the genome that require an understanding of the different levels of evolutionary selection. This chapter introduced the concept of “group selection.” We must also consider selection at the level of the individual and the gene. This is the goal of the next chapter.

 



CHAPTER 12

LEVELS OF SELECTION AND
THE RISE AND FALL OF CIVILIZATIONS

Abstract

 

Evolutionists still dispute the relative importance of "group selection" while favoring almost exclusive selection at the level of the gene. There is never a discussion of that in‑between level which I shall refer to as "individual selection." This is understandable given that individuals die, whereas genes and groups survive on evolutionary time scales. However, I present a different definition of "selective force" which more directly addresses the factors influencing the fate of genes, permitting the use of the concept "individual selection." With this modified way of viewing causes for gene frequency changes there is a simple way to "partition" causative factors between "levels" that I shall term Gene Selection, Individual Selection and Group Selection (GS, IS, and GrS). The concepts GS and GrS differ somewhat from the traditionally used meanings for gene selection (kin selection) and group selection (multi‑level selection). I present an overview perspective for understanding the relative importance of these three levels of influence as they relate to the rise and fall of civilizations. I conclude that civilizations are an anomaly that arises when individuals break loose from the most confining bonds of the genes, as expressed by GrS, and give birth to IS. The creative forces let loose by an era of IS propel the society embracing it to create the thing we call a "civilization."

 

But history teaches us that civilizations are short‑lived. It may be that by its very nature a society constructed upon a base of individual freedom is vulnerable to fanatical attacks by a residual of contemporaneous societies that remain gripped by GrS forces. This may allow us to understand why civilizations have always collapsed, in spite of being surrounded by social groups with inferior levels of technology and oppressive levels of individual subjugation. It may be theoretically possible for civilizations to endure after the more primitive GrS societies are eradicated, but human nature is such that fanatical GrS societies will probably re‑form spontaneously. If this occurs, civilizations will be doomed to fall after every rise, unless the long stretch of time somehow leads to the weakening of this impulse for reverting to GrS fanaticism.

 

The previous chapter got "ahead of itself" somewhat, so before proceeding further with the concept for the rise and fall of civilizations it will be necessary to back up and review some of the past 100,000 years and the rise of the artisan, which set the stage for the rise of civilizations.

 

Introduction

 

In my view the sociobiology paradigm is the 20th Century's greatest achievement.

 

When, for example, a sociobiologist considers observations of Darwin's finches on the Galapagos Islands, and the changes they undergo in response to a year of heavy rainfall that produces foods rewarding a different beak length, the concurrent rapid evolution of birds having different beaks is understood using a theoretical paradigm in which genes compete with other genes. Similarly, when prairie dogs warn of predators, and in the process catch the attention of the predator, the altruistic act is understood using the same sociobiology theory ("inclusive fitness") whose mathematics was worked out by William D. Hamilton (1964). Both examples illustrate evolutionary change as a product of competition at the level of the gene.

 

There is almost a consensus among sociobiologists that any gene frequency change should be viewed as the product of competition and selection occurring at the level of the gene, and not higher. Genes assemble individuals as mere "vehicles" for the genes, and these "lumbering" creations (Dawkins, 1976) compete with each other in order to enhance the competitive prowess of the genes within. This same theory also allows for the perspective that groups of individuals can be viewed as "vehicles" for carrying the genes in an even larger arena of gene competition. In other words, the accepted sociobiological theory states that evolution occurs when gene frequencies change in response to competition at the level of the gene, and that it is unnecessary to take explicit account of competition at the level of the individual or the group. Reeve (2000) seems to have shown that the equations of the "standard Hamiltonian inclusive fitness" theory provide for group effects, thus eliminating the need for considering groups as a level for competition.

 

As much as I like this theory, and in spite of the fact that I will defend it as basically correct for providing a proper account of essentially all evolutionary observations, I shall consider another paradigm for "understanding" gene frequency changes.  This other paradigm has the advantage of addressing some observations that the sociobiological paradigm is theoretically incapable of explaining.

 

I assert that sociobiology's basic task is to explain why the frequency of a gene in a gene pool changes over time. I agree that the first order explanation must be that genes can achieve success by creating individuals who do a better job of reproducing those genes, and those of their kin, thus accounting for a greater representation of these genes in future generations. However, Hamiltonian inclusive fitness theory is an awkward tool for understanding group competition and it seems greatly handicapped in dealing with humans who have partially "liberated" themselves from the genetic grip by employing "logic" to influence decisions. In the next section I present examples of cases that pose difficulties for inclusive fitness theory, and in the sections that follow it I will suggest a different way of viewing the locus of causation for gene frequency changes.

 

After presenting these humble suggestions for amending sociobiological theory, I shall then march forward into dangerously speculative territory, and address the recurring puzzle of why civilizations rise and fall. In doing this I shall rely on my newly‑defined concept for evolution at the Individual Selection level.


Special Cases That Defy Sociobiological Explanations

 

Sociobiological theory assumes that physical environments do not undergo drastic changes on time scales shorter than can be accommodated by evolutionary adaptation. For example, if a rare and drastic climate change occurs it is possible that a multitude of adaptations related to earlier climates will be rendered useless and the fate of the species (and the genes that are unique to it) cannot be predicted. This may seem like a trivial "objection" to sociobiological theory, but it serves to illustrate that special cases do exist for which sociobiological theory is helpless. This illustration may not be a trivial exception to sociobiological theory given the ever-increasing evidence in the climate record for drastic and rapid climate changes. For example, Weiss and Bradley (2001) list 7 drought events that led to societal collapse (10,000 BC, 6400 BC, 3100 BC, 2200 BC, 550 AD, 950 AD, and 1280 AD). As an extreme example, climate can lead to the abrupt demise of species, as happened 65 million years ago when an asteroid impact created a "global winter" that exterminated the dinosaurs and allowed mammals to flourish.

 

Humans present a special case in two respects. First, they associate in super-tribes that require strict adherence by their individual membership to arbitrary customs. The drive for ever larger super-tribes may have been caused by the winner‑take‑all nature of warfare that evolved sometime during the past 100,000 years. To maintain the required superiority of numbers, and to enhance the competitive effectiveness of large tribal groups, I suggest that the power of the group over the individual grew to oppressive levels. An individual born into a tribe would have no choice but to adhere to the tribal beliefs and customs and to engage in coordinated warfare with neighboring tribes. The tribes with cultures that evoked a high degree of fanatical loyalty to tribal endeavors would be more successful at surviving and dominating their region. In this setting the individual (and his genes) experience a high degree of "shared fate." The group and its membership would prevail or perish together.

 

In this setting a novel genetic mutation that began by affecting just one person would be rewarded far less than in a setting where group‑imposed behaviors were weaker, or not present. If we ask "what factors affect the fate of genes in the setting where fanatical tribes are in constant conflict, where there is an ever-present risk of the entire tribe’s extermination?" we are forced to answer that "the perspective of selection at the level of the group appears to be more useful than selection at the level of the gene." In other words, when individuals are severely subjugated by the imperatives of tribal survival we must reckon with more than just genetic mutation in order to conveniently account for changes in gene frequency over time.

 

Humans confound sociobiological theory in yet another respect. With the evolution of a "logic using" left cerebral hemisphere, or left brain (LB), some individuals have achieved a modest level of liberation from the influence of the genes. "Rational" decisions are to a large extent "genetically unanticipated," which in some small measure disconnects the fate of the genes, for which the individual is a vehicle, from the ancestral environment selective forces that guided the development toward this wondrous, rational LB. One dramatic and straightforward example is the decision by an individual in contemporary society to use birth control measures to limit reproduction. When the genes created smarter brains they had no way of anticipating that those brains would subvert the genetic agenda. The sociobiological literature is inexplicably quiet on this confounding factor.

 

Before presenting a new way of viewing "selective forces" that incorporate the two anomalous aspects of human evolution described above I want to present a brief history of the changes in thought on where the locus of power for genetic change resides. There has been an active debate over the causes for gene frequency changes, and there have been several shifts in the preferred way of assigning importance to the various levels at which selection can occur.

 

Brief History of Level of Selection Viewpoints

 

Before Darwin's 1859 book, On the Origin of Species, most people believed that God made humans in such a way that our behavior guaranteed the survival of the species. Darwin displaced God from this throne, and groped to identify what replaced Him. The concept of genes was too vague in Darwin's time to be incorporated by his theory, since Gregor Mendel's article "Experiments With Plant Hybrids" (1866) lay unopened on Darwin's shelf; the rest of the intellectual community also failed to appreciate this work until 1900, 18 years after Darwin's death. For this reason Darwin can be excused for writing in 1859 "natural selection works by and for the good of each being." Nevertheless, with this statement Darwin appears to place the locus of influence at the level of the individual. Later, Darwin shifted toward group selection when he wrote (1871) "primeval man regarded actions as good or bad, solely as they obviously affected the welfare of the tribe, not of the species." Alfred Russel Wallace, who co‑discovered natural selection as an explanation for evolution of species, "stressed that group selection ... played an important role..." (see Merlotti, 1986). Darwin, it seems, eventually joined Wallace in giving group selection a leading role in natural selection.

 

According to Carl Sagan and Ann Druyan (1992, p. 70), "One of Huxley's interests had been the idea that all animals, including us, were 'automata,' carbon‑based robots, whose 'states of consciousness... are immediately caused by molecular changes of the brain‑substance.' Darwin closed his last letter to him with the words:  'Once again, accept my cordial thanks, my dear old friend. I wish to God there were more automata in the world like you.'" (see also Huxley, 1874). This idea was "ahead of its time" and did not become part of the climate of opinion in the late 19th Century. However, I hope this "reductionist" idea will be resurrected during the 21st Century and will form the basis for understanding all behavior.

 

By 1950 several writers rediscovered the importance for natural selection of inter‑group competition and, hence, intra‑group cooperation. Merlotti (1986) summarizes Spencer (1892) as believing "Let enough members of a society disobey the code of amity (for members within the tribe) and the society will fragment; let enough disobey the code of enmity (against neighboring tribes) and the society will be crushed." Merlotti quotes Sumner (1906) "The exigencies of war with outsiders are what makes peace inside, lest internal discord should weaken the we‑group." Sir Arthur Keith wrote persuasively in the same vein (1946, 1948), as again summarized by Merlotti "the success of the human species had been secured by cooperation within groups and competition between them."

 

The following table will be useful in seeing how differently thinkers of different periods partitioned the locus of influence for natural selection.

 

TABLE I

% Importance

 

Level        <1859   1859   1950   1962   1966   1994   Present

GOD            100      0      0      0      0      0        0
SPECIES          0      0      0     80      0      0        0
GROUP            0     20     80     15     10     20       10
INDIVIDUAL       0     80     20      0      0      0       10
GENE           n/a      ?      ?      5     90     80       80

 

By the mid-20th Century the forces of academic Marxist influence began to take their toll on the quality of anthropological thought. Whereas most previous theory incorporated an inherited predisposition to affiliate with tribes, and to be predisposed from birth to adhere to Spencer’s "tribal mentality" (i.e., with amity toward fellow tribesmen and enmity to all others), the climate of opinion shifted during the 20th Century toward a form of "cultural determinism." Biology was "out" and culture was "in." Culture was seen as a guarantee of species survival, and the locus of influence was "whatever is good for the species." The good of the species was such an appealing thought that those who could not relinquish a role for a "species nature" tried to see a pattern of evidence that instincts served species survival goals. Perhaps the most comprehensive expression of this idea is the 1962 book Animal Dispersion in Relation to Social Behavior by Wynne‑Edwards.  It amassed a tremendous amount of data in support of the idea that when a species begins to over‑exploit its environment individuals will reduce their rate of reproduction (voluntarily), as if motivated to guarantee that resources will be available for future generations.

 

The fundamental flaw in this idea is that organisms are gene‑created automata, and they cannot perceive the future, or even care about the future; they behave the way the genes have programmed them to behave, reacting to environments in ways that are programmed, and the genes that constructed them are the ones that have been the most successful proliferators in past gene‑upon‑gene competitions. Culturgens (culture’s "memes") are also a factor for human behavior, but even memes cannot be credited with caring about the fate of the species.

 

Robert Ardrey's writings have withstood the test of time, in spite of the ridicule heaped upon him by those who resented his audacity for having an opinion on anthropology after having established himself in a different field (playwright). From his 1961 African Genesis to his 1976 The Hunting Hypothesis, the amity‑enmity duality was a central theme from which he argued that aggression is a natural human instinct, as were within‑group cooperation and loyalty. He wrote "If competition takes place not only between individuals but between groups, then the group with greater endowments of loyalty, cooperation, self‑sacrifice and altruism concerning social partners will be selection's survivor." Thus, Ardrey tried to keep group selection "alive" during the 1960s and 1970s, but due to his background as a playwright he was not taken seriously by the Marxist anthropologists of that time.

 

During the first half of the 20th Century the discipline of physics made dramatic advances that captured the imagination of the general public. The climate of thought by mid‑century should have been congenial to the notion that physical events at the atomic level dictate all particle motion, and therefore all animal behavior. There is no evidence (that I am aware of) that people were thinking this way by mid‑century, even though Huxley had suggested the idea 80 years earlier. With the discovery in 1953 that genes are double‑helix DNA molecules the stage was finally set for thinking of gene‑assembled organisms as automata. The slowness of the process by which humans approach Truth is best understood by remembering that every creature is a gene‑created automaton, and that a human is programmed to think in ways that served the genes that constructed his ancestors. Therefore, if the comprehension of a fundamental truth has never influenced the selection of human ancestors we cannot expect that humans will quickly grasp that truth. The gene‑centered perspective requires a difficult leap of imagination for which the human ancestral environment has not prepared us.

 

William D. Hamilton (1964a,b) was one of the first writers to grasp that evolution should be viewed as occurring at the level of the genes! He created the mathematical foundation for understanding how evolutionary competition at the genetic level can explain such social behaviors as altruism. Hamilton's "inclusive fitness" theory is a mechanistic theory for the evolution, by natural selection, of individuals that behave "altruistically" toward each other, provided the interactants are close relatives.

 

The math of Hamilton's derivation is daunting, even for mathematicians, and fortunately George C. Williams came forward with a non‑mathematical interpretation of Hamilton's message (plus other implications of the gene‑centered view) in a landmark book, Adaptation and Natural Selection (1966). Williams addressed the issue of levels of natural selection, and allowed for the theoretical possibility that group selection, GrS, could occur when certain conditions existed. He considered conditions that must exist before GrS can occur (my description of this is adapted from Buss, 1999).

 

According to Williams, for group selection to be important there must be:

 

1) a high degree of "shared fate" of the members of the group,

2) low levels of reproductive competition within the group, and

3) recurrent patterns of differential growth and extinction of groups.

 

Williams demolished the old version of group selection, wherein adaptations evolved for the benefit of the species. He was even skeptical that this new version of group selection could be found in nature, for he believed that the three conditions he specified were rarely met, especially with humans.

 

Edward O. Wilson's book Sociobiology: The New Synthesis (1975) and Richard Dawkins' popular book The Selfish Gene (1976) consolidated the perspectives presented earlier by W. D. Hamilton, G. C. Williams and their lesser‑known predecessors (R. A. Fisher, J. B. S. Haldane, S. Wright). The notion that natural selection worked at the level of the genes, and not groups, grew in strength with their writings.

 

Wilson and Sober (1994) revived group selection theory, and emphasized that groups can be portrayed as "vehicles" for genes, somewhat similar to the way individuals are vehicles for genes. They did not challenge that selection at the level of the genes is important; rather, for social animals it is also necessary to incorporate the effects of groups in order to understand the fate of the genes carried by its members. This version of group selection theory is sometimes referred to as "multilevel selection theory" (MLS). If the idea of multiple levels for viewing selection bothers you, re-read Chapter 1, and Appendix A, where I argue that a proper understanding of reductionism allows for more than one “level of physical explanation.”

 

Even though MLS embraces such levels as the gene, the group, and even the species (and multi-species ecosystems), it excludes the individual. A 1998 book by Sober and D. S. Wilson, Unto Others: The Evolution of Unselfish Behavior, endeavors to place group selection on a sound, mathematical footing. Although Sober and Wilson acknowledge that the mathematics of Unto Others is "equivalent" to the mathematics of Hamilton's inclusive fitness theory, they claim that the rearrangement of terms in the equations renders it a better tool for understanding the role of competition at the group level. Reeve (2000) argues that the equivalence of the two mathematical formulations renders the group selection arguments in Unto Others unnecessary, since both theories are based on competition that ultimately occurs at the level of the genes. The latest version of group selection theory thus reduces the argument to a subjective preference for the mathematical arrangement of terms in an equation.

 

Overview of Human Evolution

 

I accept the conventional wisdom that before approximately 125,000 years ago humans lived in tribes of approximately 50 to 150 individuals. Inter‑tribal conflicts may have been common, but I am unaware of compelling evidence that tribes exterminated each other before approximately 40,000 years ago (Keeley, 1996). It is not known if individuals were free to switch tribes. The simplest model for human evolution in those times would be to place the bulk of selection pressure at the level of the genes. To the extent that primitive humans lacked "culture," their evolution and the evolution of all other animal and plant life can be explained just as successfully using gene‑level sociobiological theory.

 

At some time between approximately 70,000 and 12,000 years ago human tribes began to grow in size dramatically, and most humans soon found themselves members of "super-tribes" several times larger than the original tribes. The drive toward larger tribes was irresistible when they became more effective in inter‑tribal warfare. The super-tribes also could more easily afford to allow a small number of men to specialize in full‑time weapon and toolmaking, and other artisan specialties. Whenever inter‑tribal warfare began to result in the extermination of the losing tribe, sometime between 70,000 to 12,000 years ago, that was the time when tribal size and tribal loyalty became crucial determinants of the fate of genes. Since the individuals within a tribe had a "shared fate" when tribal decimations began we can begin to consider placing the rise of "group selection" (the modern version of GrS) to this critical time. We may also speculate that tribal rules would grow more restrictive at the same time. Genes for "compliance" would be rewarded, for they would produce individuals who could embrace conformity with a minimum of cognitive dissonance (Lumsden and Wilson, 1981, and Boyd and Richerson 1985).

 

Super-tribes, and the growth of a tribal culture requiring strict adherence, represent a critical stage for the shift of evolutionary relevance from the genes to the group. Fanaticism, reinforced by tribal custom and religious fervor, rewards the genes of individuals within that group, for it makes the tribe a more formidable enemy to its neighbors (Kriegman and Kriegman, 1997). Individuals within super-tribes have no viable alternative to membership in their tribal group. After this transition, occurring sometime between 70,000 and 12,000 years ago, the factors that determined the fate of genes would be shared between the level of the genes and the level of the group. Prior to this transition most of the factors determining the fate of genes would have operated at the level of the genes.

 

From the time of the transition to super-tribal conflicts to the time of Classical Greece, the factors determining the fate of genes would have been shared between the level of the genes and the level of the group, between GS and GrS. The notion of individual rights, or individual liberation from the oppressions of tribal life, began during the 6th Century BC in the area of northern Greece. It might have had beginnings in the Minoan civilization, but we have no records of what people believed from that far back (a civilization that ended abruptly when the volcanic island of Thera exploded in 1628 BC). After the slow collapse of the Roman Empire the rights of the individual lost influence and group rule was restored. With the re‑discovery of Greek philosophy by the 17th Century Enlightenment philosophes it again became fashionable for the individual to assert himself. The church's power was at an all‑time low when this 17th Century rebirth of individualism occurred. What is now referred to as Western Civilization is an outgrowth of the ideas originally expressed by the Greek philosophers.

 

Western societies at the end of the 20th Century gave individuals the freedom to move about, to experiment with alternative cultures, and to think thoughts that contradicted those generally accepted. Tolerance of individual differences and respect for individual freedoms were at an all‑time high at this time in the Western world. For example, a person was free to adopt birth control measures to control pregnancy, and many educated and wealthy individuals chose to forego having offspring. The fate of the genes was influenced by individuals who were free to make rational decisions that were often dictated by a desire to optimize the welfare of the individual at the expense of genetic propagation. This short‑changing of the genes was accompanied by a weakening of the group to which the liberated individuals loosely belonged, as will be explained below.

 

A New Measure for the Strength of Selective Forces

 

This section may be "tedious" for most readers. I will summarize it in the next paragraph so that you may skip to the next section whenever you think you're encountering more detail than you need.

 

The task of this section is to assign importance to each of three levels for selection (gene, individual and group) in an account of why genes change their frequency in a gene pool over time. If the gene level is all important, then we will end up stating that: the "force of the genetic level" FG = 1.00, the "force at the individual level" FI = 0.00, and the "force at the group level" FGr = 0.00. However, if group selection is important, there will be a re-partitioning of selection strength so that the group becomes a more important part of the explanation for gene frequency changes; for example, we would state that FG = 0.50, FI = 0.00, and FGr = 0.50. Note that the sum of the three forces adds to 1.00, which is a condition I impose for convenience. The task is to "partition importance" among the three candidate levels in order to explain what is causing a gene pool to evolve.

 

In groping to define a new "measure" for use in explaining changes in gene frequency let us review some of the attributes we expect of it. If this sounds like describing the answer first and then formulating a solution to fit it, that is exactly what it is. Common sense should sometimes guide us.

 

Consider the situation before there were super-tribes that held individual members captive by inhibiting the full expression of unique individual attributes and of personal aspirations. It should be clear that however this new measure is defined it should assign almost all of the strength to the level of the gene and very little to the group. After super-tribes came into existence, we want this measure to share strength between the level of the genes and the level of groups. After individuals begin to liberate themselves from group‑conforming societal pressures, a phenomenon which has occurred most dramatically in Western cultures, we want this measure to assign some strength to the level of the individual. The individual who is free to choose his culture, his role in that culture, and his reproductive lifestyle, warrants a seat at the table of power over gene frequency changes.

 

The measure I propose is based on asking "What factors affect the fate of the genes? And how are these factors partitioned between the level of the gene, the individual and the group?" If we constrain the three strength values to add up to 1.00, then as individual liberation grows, for example, the strengths of the genes and the group must diminish. Thus, FG + FI + FGr = 1.00, where FG is proportional to the sum of selective forces identified as originating at the Level of the Gene, FI is proportional to the sum of selective forces identified as originating at the Level of the Individual, and FGr is proportional to the sum of selective forces identified as originating at the Level of the Group; the normalization of the three forces is done so that their sum equals 1.
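To make the bookkeeping of this normalization concrete, the following sketch (written in Python, a convenience of mine rather than anything in the original treatment) shows how three raw "force" sums would be scaled so that FG + FI + FGr = 1.00. The numbers are invented purely for illustration; nothing here is a measurement.

    def normalize_forces(raw_gene, raw_individual, raw_group):
        """Scale three raw 'selective force' sums so that FG + FI + FGr = 1.00."""
        total = raw_gene + raw_individual + raw_group
        return raw_gene / total, raw_individual / total, raw_group / total

    # Invented numbers: a super-tribal era in which group-level factors
    # have grown to rival gene-level factors.
    FG, FI, FGr = normalize_forces(raw_gene=6.0, raw_individual=0.5, raw_group=5.5)
    print(f"FG = {FG:.2f}, FI = {FI:.2f}, FGr = {FGr:.2f}")  # FG = 0.50, FI = 0.04, FGr = 0.46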

 

In this treatment I ignore all factors that influence the fate of genes that do not belong to the above three categories. For example, I will exclude from consideration earthquakes, floods, drastic climate changes, asteroid impacts, and all other rare physical environment events, even though they do indeed occasionally affect gene frequencies. By neglecting them in this analysis I am following the precedent of most other evolutionary models, and my treatment merely fails to capture random perturbations that will not affect the conclusions I wish to draw. It would be straightforward to formulate a version of this theoretical treatment that includes these environmental catastrophes, but I omit them here in order to better emphasize the role of the non‑gene levels of selection, GrS and IS.

 

The method for measuring "selective force," which I shall describe momentarily, is inherently subjective, and this is the weakest part of my argument. However, it has attributes that make it useful in ways that other formalisms do not allow. At any given time many things are happening that may influence subsequent changes in gene frequency in a gene pool. The basic task is to associate changes of gene frequency with causes for those changes. Instead of attempting to ascribe cause and effect through an explicit treatment of each gene's phenotypic expression, and speculating about the implications of that altered phenotype, I propose to employ mathematical tools that are blind to the mechanisms of cause and effect. I propose to perform a multiple regression analysis of all gene frequency changes in a genome, over a specified time, using as independent variables all parameters that can be measured and that describe potentially relevant aspects of the social environment, the genes found in individuals, the milieu of culturgens that individuals are exposed to, "novel thoughts" experienced by individuals, and many other similar properties that could in theory be measured. The item "novel thoughts" may be troubling, but I want to retain it for reasons that will become clear later.

 

Obviously it is not feasible for anyone to measure all relevant parameters describing the social environment, the genotype of individuals, the cultural milieu, and the thoughts experienced by individuals, but let us suppose for the sake of argument that these parameters nevertheless exist. I make this request of the reader in the same spirit that is required by twin study investigators, for example, who attempt to partition the effects upon individual traits by genetic versus environmental causes. In those studies it is not necessary to identify every factor that influences how a person becomes who they are; rather, it is merely assumed that a myriad of such factors in each category exist, and the investigator proceeds to partition causation of the aggregate of effects by performing correlation analyses. The following paragraphs are the ones the casual reader may wish to skip.

 

Assume that we create an immense inventory of parameters that describe the state of a "setting" in space and time. Assume further that we assign each parameter to the categories G, I, Gr  and "other," and reject all parameters belonging to "other," such as natural catastrophes (for the reasons presented above). The number of parameters belonging to G, I and Gr shall be referred to as N. The "state" at any given time is an N‑dimensional "state vector," to use mathematical terminology. After choosing a "timescale" for associating the state vector with events of a gene frequency change we can, in theory, perform a multiple regression analysis (MRA) for each gene. Each MRA will use gene frequency as the "dependent variable" and the N‑dimensional state vector as elements for N "independent variables." Each MRA will then produce N correlation coefficients, one for each parameter. We now can sum the correlation coefficients in the following way:

 

    FG'  = Sum of all correlation coefficients associated with Gene Level parameters,

    FI'  = Sum of all correlation coefficients associated with Individual Level parameters,

    FGr' = Sum of all correlation coefficients associated with Group Level parameters,

 

which are the un‑normalized "forces," easily normalized by dividing each by their sum.

 

The entire procedure just described is then re‑performed for the next gene, and so forth until all genes are thus treated. Since only a few dozen gene loci have more than one allele (for a given species), this analysis need only be repeated for a few dozen genes. These several dozen results are combined to arrive at an overall score for the relative importance of each of the three levels of selection.
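The procedure is conceptual rather than practical, but its logic can be written down compactly. The sketch below assumes, hypothetically, that we already possess a time series of frequencies for each gene and a matching table of N state-vector parameters labeled G, I or Gr; a simple correlation stands in for the full multiple regression analysis, and the use of absolute values before summing is my own simplification, not part of the treatment above.

    import numpy as np

    def partition_forces(freq_by_gene, state_matrix, labels):
        """Conceptual sketch of the partitioning procedure described above.

        freq_by_gene : dict mapping gene name -> array of gene frequencies over time
        state_matrix : array of shape (time_steps, N), the N-dimensional state vectors
        labels       : length-N list of 'G', 'I' or 'Gr', one label per parameter
        """
        raw = {"G": 0.0, "I": 0.0, "Gr": 0.0}
        for freqs in freq_by_gene.values():
            # One analysis per gene: credit each parameter's (absolute) correlation
            # with that gene's frequency to the level the parameter belongs to.
            for j, label in enumerate(labels):
                r = np.corrcoef(state_matrix[:, j], freqs)[0, 1]
                raw[label] += abs(r)
        total = sum(raw.values())
        # Normalize so that FG + FI + FGr = 1.00
        return {level: value / total for level, value in raw.items()}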

 

Again, I want to emphasize that it is not necessary for the above procedure to be feasible in order to make use of the concepts that they convey. I only ask that you accept that it is conceptually feasible! In some sense I am appealing for a belief in concepts that are just as "real" as the coefficients in Hamilton's horrendously complex mathematical derivations. If we can imagine specific coefficients to exist in a hypothetical world, then they do exist in the real world even though as a practical matter we are limited to only an approximate measure of them.

 

One further clarification is needed here: there is nothing in my proposed paradigm that is incompatible with present‑day sociobiology theory. What I am suggesting is an alternative way of viewing events. This is often done in physics. For example, a physical chemist can, in one situation, treat a salt crystal as a lattice structure held together by electrical forces, while in another situation treat the same salt crystal as a group of sodium and chlorine atoms that can become dislodged for chemical reactions when dissolved in water. The physical chemist knows, during both treatments of the salt crystal, that whatever happens is the result of the four forces of nature acting upon tiny masses, in a way that is too cumbersome for practical use in everyday experiments (the four forces of nature being gravitational, electromagnetic, weak nuclear and strong nuclear). Similarly, the person trying to understand human behavior, or the rise and fall of civilizations, should know that every person's actions are dictated by the same four forces of nature acting upon tiny masses. I acknowledge the frequent need to seemingly overlook the inherently reductionist nature of all phenomena in order to advance our "understanding" of the everyday world. Sociobiology and my suggested partitioning of influence among three levels are just another example of looking at the same phenomena from different perspectives, and they are NOT contradictory.

 

Levels of Selection and the Rise and Fall of Civilizations

 

Consider the following figure, where I have marked off 9 stages, "A" through "I", that I am suggesting typify the evolution of a human civilization.

 

Figure 11.01. Hypothetical allocation of "selection strength" for the three levels Genes, Individual and Group.

 

At stage "A" we are to imagine that mostly the gene level of selection is important in determining the fate of genes. At this stage the individual does not assert himself, he does not make birth control decisions, or decide to walk away from the tribe and live alone. Also at this stage there are no super-tribes, and tribal conflicts are not all‑or‑nothing group exterminations. Consequently, the strength of group selection pressures is very small. It is only non‑zero because I assume that the fate of individual social alliances has some effect upon the survival of the individual and his reproductive outcome.

 

At stage "B" we have super-tribes exterminating each other, rewarding the super-tribes that enforce conformity among its membership. The fate of the genes within an individual are less affected by the individual phenotypic expression of them, for some of the individual's destiny is beyond his control by virtue of the fact that he belongs to a tribe that will survive or be extinguished on the basis of how well the tribal membership works together. It will matter only slightly that a particular individual is greatly endowed by his genes if the tribe he belongs to is ineffective in combat with its rivals.

 

By stage "C" the group has become oppressive in its restriction of individual members. Any deviation from the cultural norm will be punished, so any differences in individual genetic profile, any outstanding abilities for example, are ignored and each individual is subservient to the dictates of group needs. This stage is marked by devotion to tribal rituals, unquestioned loyalty, fervent religious devotion, fanatical fighting and a readiness to sacrifice the self for the greater glory of the group. For a modern Westerner this stage is the most difficult to like. Kriegman and Kriegman (1997) suggest that religion was an invention that enhanced the fighting competitiveness of a group because it provided a "rationale" for fanatical behaviors; any groups not having a religion to motivate fanatical adherence to the group's destiny would be handicapped during warfare. I incorporate this thought as a crucial component of my argument that civilizations are destined to be short lived in a world where fanatical societies exist.

 

Stage "C" represents the "birth of individualism." It is no coincidence that the Olympic games, which emphasize individual as opposed to team competition, originated in a region that gave birth to the notion of celebrating the individual. The Greek philosophers discussed the proper relationship between the individual and society, and the proper role of a government. Democracy as a form of government is an outgrowth of a shifting of power from a "tribal leader with group support" to the individual. When the individual is set free to achieve, and receive credit for his achievements, it should be no surprise that more achievements per capita should result. Commerce and technology should develop faster, and more economic niches should be created. In a society where the individual has government sanctioned rights, as in a democracy and free‑enterprise economy, there should develop a greater tolerance for people having new ideas. Productivity should rise in not only the commercial sector of the society, but also the intellectual. Literature, the arts, and philosophy are individual endeavors that attest to a vibrant social order that rewards individual initiative. These are the conditions that lead to what we call a "civilization."

 

But as the individual thinks for himself, he exerts an influence over gene frequency changes as well as the shape of society, and by the zero‑sum nature of my proposed partitioning of the forces of selection we must see a decline in the influence at the level of either the genes or the group. I suggest that the group is the big loser, and perhaps its losses are so great that even the genes are winners. During the rise of a civilization, when more power flows to the individual, the genes can still be winners because an individual with a new mutation has the potential for prospering more than the average of his society. Even today the extremes of personal wealth continue to widen. Bill Gates has more wealth than 1 million average people in the Western World, and his wealth is based on genetic intelligence and business savvy (and, yes, the luck of an opportunist); at least you will agree that it is not based upon inheritance or acquisition through plunder.

 

It should be pointed out that Stage "C" appears to be driven by a bold assertion of a growing minority of people with "strong" left brains! Any neuropsychologist would agree that the manner of discourse exhibited by Socrates, Plato and most other philosophers of that time can be explained by invoking a leading role for styles of thought that only the left cerebral cortex is capable of performing. Elsewhere I have written that during the past 70,000 years, at least, there was a growing place for artisans within tribes, and these artisans could perform their work most effectively if they had well‑functioning left brains. I have also argued that the left brain, as well as the frontal lobes, has undergone the greatest amount of evolutionary change in recent times, and that this is due to the growing need, over the past 70,000 years, for full‑time artisans in support of tribal endeavors.

 

Stage "D" is a growth of what was started in Stage "C." During this stage individual liberation, made possible by the influence of people with left brain styles of thinking, enjoys a toleration from society at large that is unprecedented. Tolerance for new ideas, new customs, and challenges to old ideas and customs becomes acceptable and expected. The lifting of group oppressions allows a release of unprecedented creativity and productivity, and this energizes commerce, technology and government efficiency, which leads to a spread of greater material wealth throughout most of the population. Excesses of wealth, found in both successful individuals and a bountiful government, benefit the arts. Stage "D" meets the dictionary definition for civilization, as "An advanced state of intellectual, cultural, and material development in human society, marked by progress in the arts and sciences, the extensive use of writing, and the appearance of complex political and social institutions."

 

But a curious thing happens during the progress toward a more extreme development of Stage "D" civilization, as we are now experiencing in Western Civilization. The recipient of civilization's bounty, the individual, turns inward, and becomes absorbed with personal, individual well-being. Beyond the boundaries of civilization's campfire exist uncivilized societies that have not absorbed the values of their more successful cousins. These societies are on the fringes of the fountain of wealth, and they feel "used" and left behind, as they pick up the crumbs that fall their way. Instead of wanting to emulate those better off, they resent them, and they wish to defeat their well‑off neighbors, and perhaps plunder the fruits of other men's labors. They are moved by the ancient and primitive tradition of seizing what one wants instead of producing it.

 

While resentment grows among those relegated to being spectators of the civilized, and while the numbers of those lucky civilized members grow, another unexpected force gathers strength from within the civilization: a curious cadre of "cultural enforcers." These people are a residue of past episodes of boom and bust, whose ancestors saved their kin from the excesses of success. The cultural enforcers (religious fundamentalists) wish to curb the undisciplined pursuit of civilization's glitter by re‑instituting some old-fashioned values. They can be likened to a well‑meaning friend trying to sober up a hung‑over celebrant, as if the celebrant were merely preparing himself for the next day's battle. For there is always a next day's battle, and every civilization must be ready for it, or perish.

 

Stage "E" is the turn‑around, a reckoning with the consequences of success. The cultural forces from within are creating a "group mentality" in readiness for battle, and the forces beyond civilized borders are probing their enemy for weaknesses. Both sides silently gather strength, like the quiet before a storm.

 

The Stage "F" collapse will be faster than the rise, for destruction is always easier than production. Fanatics chip away at the existing structure by attacking places that are most vulnerable; but just as important, they inexorably reduce public confidence in the existing order. For those at ground level, measuring change by moments of a lifetime, the changes may not be apparent and their significance will not be appreciated. Someone still drives the trucks, repairs the streets, and constructs the houses, though it is a different person and he worships differently. Traditional battle lines will not exist, as the war is one of skirmishes by well‑organized, small groups of fanatics, as they erode a structure built by processes that are quietly disappearing. Instead of energies being focused on new and more glorious projects for the future, energies are focused on repairing crumbling social structures that directly impact personal well‑being and on protecting society from random terrorist attacks. The "vision" of future things is replaced by a need to make concrete repairs and protect the security that was taken for granted in the past. There is no time or energy for the arts, for music, or new ideas. As civilization dissolves and eventually evaporates, it leaves a residue of useless scum.

 

Few people will recognize their loss as a loss, for by then most people will have turned over their left brains to the control of their right brains. Part of the war effort, waged by the attackers and the attacked, is a covert campaign to discredit left brain styles of thought and left brain values. The attackers do it because they've never known the left brain's ways. The attacked do it because priorities now require that everyone become engaged in only essential endeavors. The essential endeavor is defense, and defense is most effective when the postures of fanaticism are adopted. And the retransformation to fanatic postures requires that the left brain style of thinking be abandoned.

 

Stage "G" is a complete deliverance of the once victorious civilization to the leaders of the group mentality. The tolerance for new ideas is lower than at any previous stage, and the individual expression of anything new and potentially upsetting to the grip of the group is unthinkable. Religion's job is to enforce this policy, to keep individual thoughts suppressed in order to preserve the status quo. The individual is not the only loser, for the fate of genes is even more strongly influenced by the vagaries of group culture. Whatever preserves group survival and dominance defines the way things are. Genetic mutations that in prior times would have brought their lucky individual vehicle to a winning place, thanks to their individual creations being inclined to be creative or productive, now produce individuals who are burdened by their superior creativity and individual passions. The newly strengthened group selection forces inhibit what we now call progress, for groups reward things that lead to Dark Ages. Where Group Memes Rule, the genes are in repose, and the individual is in eclipse.

 

Stage "H" and its eventual deliverance of its victims to stage "I" (equivalent to Stage "A") is a slow process, much slower than indicated by the figure. During this stage there is a lessening "need" for oppressing those who threaten stability by wanting to be progressive. The need to suppress individuality at this late stage is less than before because few people remain who remember how to assert one's individuality. Individualism during the Dark Ages is publicly non‑existent. It may exist, but only furtively. Individuals who show more initiative, and who are motivated to achieve, slowly infiltrate the positions of power, and through a neglect of enforcing the culture of oppression they allow increments of change to erode the power of group culture. Perhaps we are lucky that it only took one millennium for the Dark Ages that followed the fall of the Roman Empire to give way to a rebirth of individualism. Part of the credit for the weakening of group culture goes to the plague of the Black Death, which so dispirited the populace, and so undermined the credibility of religion, that an opening appeared for individual voices to speak out in favor of ideals that had not been publicly uttered for 1000 years (Cartwright and Biddiss, 1972).

 

The longer Stage "H" endures, the more likely it is to be replaced by changes brought about by the voice of individuals. Evolution has produced brains that will not stay quiet forever, and this restless energy will break through religion's oppressive "blanket" eventually. A millennium or more may be required for this recovery.

 

The current episode of civilization shows signs of decline, amidst isolated surges of forward growth. If the Western World's civilization reached its peak in the early 20th Century (see Chapter 15), then the complete cycle, from Stage "A" to its repeat as Stage "I," requires approximately 2500 years. The previous cycle may have been interrupted by the volcanic eruption of Thera, destroying the Minoan civilization in 1628 BC ‑ which had many of the features of the later Greek civilization. If the Minoan civilization had unfolded naturally, and had undergone a decline caused by a human restoration of group culture enforced by religious oppression, then the human spirit might have lain fallow for longer than the 1000 years that in fact was required for the resurgence of an individual‑based culture, as occurred with the Golden Age of Greece. Thus, our present knowledge of the human record denies us the opportunity of knowing whether a 2500 year cycle is typical. Most of the time of our most recent cycle was spent in the Dark Ages mode. We do not know if this was true of the previous cycle, or if it will be true of any future cycles.

 

The dynamic just described occurs at lesser levels when regions are isolated for long periods. Thus, there are other examples of the exchange of power between the levels of the group and the individual (always at the expense of the gene) from which we can learn. Asian history might be revealing in this regard.

 

Oscillations as a Transitional Mode

 

Before 11,600 years ago, when the transition to our present interglacial was complete, the oscillations between Group and Individual power, each borrowing from Gene power, probably did not exist. It is a property of some physical systems that they undergo a transitional mode of oscillation during their shift from one mode to another. Before the Holocene (i.e., our present interglacial), human societies were probably exclusively tribal, and super-tribal, corresponding to Stage "A" in the above figure. It is natural to ask "Will humans some day be exclusively civilized, remaining so for long periods?"

 

For this to happen, according to the ideas of this chapter, it would be necessary for a civilization to "include" the entire world's people in the benefits of being civilized. Such a condition would remove one of the energizing motives for fanatical attacks upon civilization's structure. To be sure, some people can be counted upon to hate the established order, no matter how beneficial it is to its individual membership. But there must be a critical amount of discontent for it to become a serious threat to the majority. This should be the hope of every civilization, that it can share the benefits broadly enough that the number of malcontents will not find each other in sufficient numbers to constitute a serious threat to maintaining the civilization.

 

Alternatively, those wishing for the longevity of civilizations may hope that the genes predisposing individuals to discontent will diminish during a civilization's existence. Clearly, it is too much to hope that a civilization will pro‑actively alter the genetic composition of its citizenry. Unless, that is, the civilization is small in numbers, serious about survival, and physically isolated from its neighbors ‑ as might occur some day when settlements exist in space.

 

If civilizations are to avoid falling soon after rising, they must confront both challenges: 1) attack from uncivilized, group culture societies that feel threatened by civilization's presence, and 2) the ever‑present threat of indigenous malcontents coming together to form fanatical cadres bent upon destroying the civilization from within. I take an agnostic stance on the likelihood of either condition being met someday. The human species is an experiment, and it’s not over yet!

 

Acknowledgement For This Chapter

 

I want to thank Dr. Daniel Kriegman and Orion Kriegman for letting me read their unpublished manuscript that expands upon their 1997 HBES Conference presentation. My thinking was helped by their idea that religion's proclivity for producing fanatic proselytizers and defenders gives religious societies an advantage in prevailing over neighboring societies with non‑religious beliefs. 


APPENDIX TO THIS CHAPTER

 

A Fuller Explanation of Group Selection

 

How might GrS be measured? In the body of this essay I presented a conceptual version of a multiple regression analysis procedure for measuring the "force of group selection," FGr. That derivation was meant to illustrate a concept, and not meant for use in any specific situation. However, it may be possible to crudely measure FGr at a specific location and time. Imagine the existence of a questionnaire with weighted scores that probe key aspects of the setting in question. Although I leave the task of creating such a questionnaire to someone else, I shall hint at it with the following examples.

 

    1) How often do individuals suffer from their affiliation with a specific group ("shared fate")?

    2) How often do individuals publicly question beliefs that are held by the majority of group members ("compliance")?

    3) Are individuals free to change group affiliation without sanction; may they have more than one group affiliation at the same time ("membership enforcement")?

    4) Is an individual free to leave the group and live without any group affiliation (relating to the threat of a group to punish individuals by "banishment")?

 

Scoring societies using such a test might be useful in studying a civilization's evolution. If Western Civilization can postpone that “rounding of the corner” from Stage E to Stage F during the 21st Century, such studies might actually be conducted.
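For concreteness, a weighted questionnaire of the kind hinted at above might be scored along the following lines. The item names, weights and 0-to-1 answer scale are placeholders of my own choosing, offered only as a sketch of how answers could be combined into a crude stand-in for FGr, not as a validated instrument.

    # Hypothetical weights for the four example questions above; each item is
    # answered on a 0-1 scale, where 1 means the group's grip is strongest.
    WEIGHTS = {
        "shared_fate": 0.30,
        "compliance": 0.20,
        "membership_enforcement": 0.25,
        "banishment_threat": 0.25,
    }

    def crude_group_force(answers):
        """Weighted average of item scores, a rough proxy for FGr."""
        return sum(WEIGHTS[item] * score for item, score in answers.items())

    # Example: strong shared fate and harsh banishment, but public dissent
    # is occasionally tolerated.
    print(crude_group_force({"shared_fate": 0.9, "compliance": 0.4,
                             "membership_enforcement": 0.8, "banishment_threat": 0.9}))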

 

A Fuller Explanation of Individual Selection

 

It’s easy to make the case that praying mantises and spiders that exhibit "male sacrifice" during procreation are enslaved to their genes (male sacrifice is when the female literally begins eating the male's head and other body parts after copulation has begun, partly to better nourish the development of the eggs but also to ensure that other males will not fertilize the female’s eggs). With humans, the case is more difficult to make, but an earnest effort will be repaid. At the present time the sociobiological literature merely hints at this fact, for the field could lose public funds if it pushes forward too fast. Let us be bold, and accept the notion that humans in all societies are to some extent "used" by their genes, and that emotional payoffs are meant to encourage individual performance of the most essential, and often dangerous, laborious and illogical, of the tasks needed by our genes for their survival. This book issues a "call to arms" for individuals to liberate themselves from genetic enslavement. Imagine that another book exists that calls for the individual to also free himself from the grip of "the group" (using whatever definition one likes for "group"). For now let us just assume that the individual is to some extent in the grip of both the genes and the group. In what ways are individuals now liberating themselves from these twin enslavements?

 

An individual asserts his "rights" when, for example, he uses birth control measures. An individual asserts his rights when he argues for peace over war, and avoids being drafted into non‑defensive, aggressive wars. An individual aspires to liberation from the genes when he thinks critically about conventional beliefs and pursues thoughts freely ‑ as a "free thinker" does. These few examples reveal the possibilities for individual liberation from the grip of the genes and the group.

 

It is a novel situation when individuals achieve some degree of liberation from their genetic and group enslavements, and freely make personal decisions that can affect the fate of the genes for which they have become "newly uncooperative" vehicles. Because such individuals are less likely to produce their quota of offspring, or to nurture their nieces and nephews, these individuals would be viewed by the genes, if they had a view, as "freeloaders." They are not paying the price for admission to Life!

 

To the extent that other individuals remain enslaved, and to the extent that the inclination to liberate oneself is influenced by the individual's genetic makeup, the existence of liberated individuals will alter the fate of genes. The present time, as we approach the Stage "E" crest of Western Civilization, is an ongoing experiment. Let us hope that insulated communities will form and pursue the dreams of individualism for many more decades ‑ before they are snuffed out by an encroaching reversion to primitive tribalism.

 

[Note: Most of this chapter was written during the 1990s, before Islamic extremists toppled the World Trade Center towers on September 11, 2001.]

 

 


CHAPTER 13

THE TWO CULTURES ‑ PART I

 

The previous chapter presents a speculation to account for the rise and fall of civilizations. I will now consider other factors that may contribute to their fall. In this and the next chapter I call attention to a troubling situation: there seem to be two types of people in today's world, and their ways of thinking, and their values, are the source of "polarizing" conflicts in academia, politics, foreign relations and everyday life. It may be important to understand the origin of this two‑sided facet of human "ways of thinking," for, as I claim, they may play a role in the fall of civilizations.

 

Review of Brain Evolution

 

In this section I will review material in previous chapters concerning brain specializations, and present them in a way that serves my present purpose of understanding why today's civilization appears to consist of a continuum of people at the ends of which are two distinct and incompatible types. The activities of people at the extremes tend to produce what has been referred to as "Two Cultures."

 

If we go back far enough in our human ancestry, we will find that the left cerebral cortex, or left brain (LB), was identical to the right cerebral cortex, or right brain (RB). By having essentially identical brain halves our remote ancestors benefited from a form of redundancy that was valuable in case of injury to one side. Although a slight specialization for sequential tasks probably developed in the left brain of our pre‑human ancestors, left brain specializations began a dramatic evolution sometime in the Pleistocene, possibly 200,000 or 300,000 years ago, perhaps in response to an environmental opportunity presented by one of the interglacial warmings that have been occurring at approximately 100,000 year intervals for the past half million years. To first order, RB remained unchanged while LB began to allocate small areas for new, specific tasks. The ability to talk, as a supplement to gestures, evolved in LB's frontal lobe (Broca's Area), which was accompanied by the evolution of an ability to comprehend speech in LB's temporal lobe (Wernicke's Area). Eventually the left side also developed a capability for logic, a form of sequential thinking that requires the type of neural architecture involved in language. Logical thought is most effective when emotional intrusions are minimized. Hence, we may assume that as LB's capacities for logic evolved it became somewhat disconnected from the limbic system, where emotions originate.

 

Modern man's LB owes its "power" to repeated triumphs of wresting control from RB on matters that were best performed by newly‑evolving LB modules. Logic is more powerful than intuition when novel situations are encountered. Novel situations and opportunities must have been frequent during climate transitions, such as occurred at the close of the last glaciation, approximately 12,000 years ago, and during the previous interglacial, occurring 129,000 to 116,000 years ago.

 

Whereas brain modules appear to "work together" to the untrained eye, a more likely dynamic is that they form an elaborate system of competing modules. Every situation encountered is "presented" to the modules, and any module that "recognizes" the situation becomes aroused (anthropomorphically speaking) and seeks activation from the "reticular activating system," or RAS. The RAS chooses a "winner" from among the competing modules (based on past experiences, presumably, as well as inborn predispositions) and grants authority to that one module. The winning module is given access to the frontal lobe modules for formulating behavior and commanding it (via the motor strip), and its recommended specific behavior (such as saying or doing something) is then implemented.
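Although this is a verbal model, its core idea, many modules bidding for control and a single arbiter picking a winner, is easy to caricature in code. The sketch below is a toy with invented module names and arousal scores; it claims nothing about real neural mechanisms and only illustrates the winner-take-all bookkeeping just described.

    def ras_select(situation, modules, learned_bias):
        """Toy winner-take-all arbitration: each module that 'recognizes' the
        situation submits an arousal score; the 'RAS' picks the highest bid,
        with learned_bias standing in for past experience and inborn
        predisposition."""
        bids = {}
        for name, recognize in modules.items():
            arousal = recognize(situation)   # 0.0 means the module does not recognize it
            if arousal > 0:
                bids[name] = arousal * learned_bias.get(name, 1.0)
        return max(bids, key=bids.get) if bids else None

    # Hypothetical modules: crude keyword 'recognizers' standing in for neural circuits.
    modules = {
        "face_recognition": lambda s: 0.9 if "face" in s else 0.0,
        "logic":            lambda s: 0.8 if "puzzle" in s else 0.0,
        "threat_response":  lambda s: 1.0 if "predator" in s else 0.0,
    }
    print(ras_select("a novel puzzle involving a face", modules, learned_bias={"logic": 1.2}))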

 

This modular arrangement makes good sense from the gene's perspective. Indeed, the more one thinks about it the more impossible any alternative seems to be. Consider any given brain circuit; there will be one gene that has the greatest effect upon it, a second gene that has the second‑greatest effect, and so forth. The gene with the greatest effect on this one circuit is unlikely to affect ALL other brain circuits with which this circuit interacts, which will also be true for the other genes that affect the circuit under consideration. Therefore, if this one circuit works with the others harmoniously it will not be due to the genes that create the one circuit; rather, it will be due to the forces of natural selection that pass judgment on genes that create new circuits (and modify existing ones). The genes for any one circuit do not "know" about the other circuits, even though they can work harmoniously with them.

 

We humans are observers of the product of the many brain circuits, and we should resist the temptation to attribute the apparent harmony of mental performance to a harmony of genetic design.  Any harmony that we observe is probably illusory, since it can be produced, as it most likely is, by a competition of modules with a continuous unfolding of winners.

 

Why was LB the site for mutations conferring the new capabilities of language, logic and other sequential tasks (if you remember reading this material from a previous chapter, then skip this paragraph)? One speculation attributes it to a subtle difference in LB's role for our arboreal ancestors. LB commands the right hand, which long ago, due to some random selection, took on the task of reaching for fruit while the left hand stabilized the body by holding onto a branch. Picking fruit involves a sequence of actions, which led to the development of small neural regions devoted to each sub‑task and their integration with each other. Since neural connections within the small neural region relied upon short‑distance communication, the neuron axons in these regions required less myelinization (reliance upon a fatty tissue covering of the axon to provide electrical insulation for better communication to distant neurons). The genes for LB development became more adept at producing small neural networks that were less myelinized (having less of the myelin "white matter," thus accounting for LB's appearance of having a more grayish color). These small neural networks accomplished specialized tasks with a neural activity that was more‑or‑less independent of neighboring areas. When an LB neural network area finished its task it would communicate a "result" to specific other areas, which in turn also tended to be independent and specialized in their operation. Thus, LB lost the "holistic" nature of the right, as it specialized in mastering sequences of specific sub‑tasks that were connected to each other for the accomplishment of an overall task ‑ such as talking or comprehending speech.

 

LB's new specializations rendered it suitable for playing supporting roles when RB recognized specific situations requiring the specialized performance. LB paid a price for becoming a specialist. For example, when it assumed responsibility for language it lost some ability for monitoring body part position in relation to the immediate physical environment. This task is now performed almost exclusively by RB's "inferior parietal lobule," a region that is a homologous counterpart to the Wernicke language area in LB. So, as RB became dependent on LB for language, LB became dependent upon RB for graceful movement.  So far, in this brief recapitulation of the evolution of LB, we can view the two brain halves as working together.

 

RB did not have to understand how LB did what it was specialized to do (nor could it); RB merely had to trust it to do what it was good at doing, and RB called upon it when the situation required. The use of "logic" to understand novel situations is an LB specialty. RB must be baffled by what the left is doing when it thinks through a sequence of logical operations. (Similarly, LB would be just as baffled by how the right side instantly recognizes a face.) Since RB is a more fully‑connected neural network system, it would seem to be well suited to recognizing when a situation requires LB help. When RB makes this determination, it hands control to an LB region for the duration of the specific task. At least this may be the way things started out when the LB specializations first began to evolve.

 

When a human invents a new tool, he uses it for pre‑existing purposes. However, the tool is also available for use to accomplish other tasks, perhaps tasks that existed before the tool existed. I view the new LB specializations in the same way. The brain did not evolve to comprehend reality; rather, the little reality that it does comprehend, amid the many distorted comprehensions, is due to the fact that a better understanding of some realities was useful to the survival of our ancestors' genes. We can speculate that the human brain's greater ability to understand the world around it is an unintended consequence of an original need to perform specific sequential tasks. For example, the rewards of allocating the right hand for the task of reaching for fruit led to the evolution of sequential structures in LB, which allowed language to develop, which led to the making of sophisticated tools, which helped to create categories for placing and using words, which is the basis for abstract thought, which may eventually allow humans to achieve liberation from their genetic enslavement. What a wonderful outcome of the prosaic task of reaching for fruit.

 

My assertion that the sequential abilities of LB were used by RB to accomplish long‑standing tasks more successfully is consistent with the notion that the cerebral cortex itself, both left and right sides, is used by sub‑cortical structures for the more effective accomplishment of pre‑existing goals of survival and reproduction. Thus, the limbic system mobilizes the more recently evolved cerebral cortex by “driving it,” invoking emotions as necessary, to engage the world in ways that lead to the creation and raising of offspring carrying the genes that assembled the individual and his close relatives.

 

The brain was not intended to aid in comprehending reality, per se, and the genes can be excused for not anticipating that LB would someday figure out that the individual is enslaved by its genes – a central message of this book. This unintended achievement is due to LB's prefrontal cortex, the part of the frontal lobes forward of the motor strip. Both the left and right prefrontal cortices have grown in size at an amazing rate in relation to the rest of the brain, during the past half million years. Interestingly, the prefrontal areas do not contribute to the component of intelligence measured by IQ tests. Rather, the prefrontal cortex works with the limbic system to produce elaborate planning capability, or "executive function." It is likely that “what if” scenarios are orchestrated by the prefrontal cortex, and are evaluated by the underlying limbic system ‑ to assure that genetic goals are being served when a plan is adopted. I speculate that LB is less restrained by the limbic system, in order to rely upon logic when creating "what if" scenarios, and to subject these hypothetical scenarios to objective criteria for establishing the truth of a novel situation.

 

Because LB is better suited to comprehend situations requiring logical analysis, it would have frequently found itself in situations where it recognized a problem that RB was totally unaware of. LB needed to take action, yet RB was unable to endorse LB's call for action since it did not "comprehend" the problem. To the extent that survival (of the genes) depended upon LB being allowed to act when RB didn't comprehend the need for it, there would be a need for some level of trust in LB's compelling request to take charge in dealing with a matter that only it could handle. The trust would have to reside in both RB and the reticular activating system (RAS). Perhaps it became more efficient for LB to initiate actions without RB's consent, with a direct appeal to RAS. This would eventually lead to trouble for RB.

 

In a primitive setting there may have been very few situations requiring LB's specialized understanding. However, the more capable that LB became, the more the new world created by LB needed LB solutions. LB’s new world included sophisticated weapons and tools, horticulture, animal domestication, and the formulation of logical strategies for both everyday living and inter‑tribal warfare. RB would be calling upon LB to act more and more, and allowing LB to unilaterally decide to act for a growing number of situations in the emerging new world of LB's creation. It is inevitable that as LB assumed more initiative in recognizing situations that needed LB attention, LB would have assumed responsibility for responding without consulting RB. The harmony that existed between RB and LB was at risk of being upset, the more so as the new "upstart" LB assumed more roles and a greater importance in surviving the challenges of the late Pleistocene. Occasional LB/RB disputes of authority may have become common during the Holocene.

 

The disputes would not be between the entire LB and the entire RB; they would be between specific parts of LB and a large, diffuse part of RB. Recall that RB is a more fully-connected neural network, whereas LB consists of many specialized, task‑specific neural networks. RB recognizes situations from the totality of the sensory input, but isolated modules within LB recognize situations needing LB attention.

 

Since niches in the tribal setting tended to be of either an artisan type or a non‑artisan type, I believe that eventually people were "born with a predisposition" to occupy one of these two niche categories. It is quite possible that most people are born with the ability to take a reading of the environment and develop in the direction of one or the other niche type. It would make sense if the brain circuits that "take readings" of the environment, and hence of the relative merits of differing ontogenetic development paths, also take into account the strengths (predispositions) that the individual inherited. (This whole process, in which an individual's development unfolds in some way that approximately optimizes its usefulness to the genes within, is a daunting subject, but it must be studied by future generations of sociobiologists.) In this way, I suggest, individuals are born with a predisposition for developing into one of two types: the sequential‑style artisan type, or the holistic‑style primal type. The rest of this chapter is concerned with the differences between these two types of adults and their implications for the fate of civilizations.

 

RBS and LBS Defined

 

In English there has been a longstanding use of the terms "head" and "heart" to signify two distinct ways of thinking and of "being." There is an unending flow of movies and books based on the conflict of the head and the heart, and since the terms are in such widespread use it should be unnecessary to define them. However, it is now obvious that the common term "head" is somehow associated with the perspective of the left brain, LB, while "heart" is somehow associated with the perspective of the right brain, RB.

 

I am tempted to present specific speculations about the neural mechanism that accounts for this dual association. For example, I could suggest that the "head" person's LB dominates his RB, forcing RB to play a supporting role; and that the "heart" person has an opposite dominance/support role relationship. Or I could suggest that the "head" person's LB asserts itself without RB permission, like an intrusive, uninvited guest, whereas the "heart" person's LB is not assertive in this manner. As an alternative, I could allege that the "head" person's RB is more similar to an LB, and the two hemispheres work together because they do not have conflicting values and life agendas; whereas the "heart" person has two RBs that work together in similar harmony. As a final speculation, which I tend to favor, I could invoke the prefrontal cortical areas, and state that the "head" person's LB prefrontal cortex has a stronger inhibition of the RB prefrontal cortex than the reverse, giving LB's more closely connected posterior lobes a greater "voice" in perceiving the world and determining behavior; whereas the "heart" person has an opposite prefrontal lobe inhibition pattern, giving the RB posterior lobe perceptions of the world greater voice and a stronger influence over behavior.

 

I will not take a position on these alternative theories, as I believe the neurosciences are not yet advanced enough to warrant a preference for any of the foregoing. Indeed, perhaps none of the hypotheses I've identified are true. Or maybe several apply. It is not necessary for us to know why so many people tend to fall into the "head" and "heart" categories in order for me to proceed toward an important goal of this book. I merely ask the reader to accept that with some people it is possible to categorize them as tending to belong at the "head" end of a spectrum of traits, while others belong at the "heart" end of the same spectrum, acknowledging that most people are somewhere between the two extremes. (Anyone reading this far is a “head” person; the heart people would have discontinued in disgust during Chapter 1.)

 

I also ask the reader to consider the possibility that some people are born with a predisposition toward one or the other type, or with a predisposition to be a specific blend of the two extremes. And finally, I ask the reader to accept that those at the "head" end of the spectrum are a product of recent evolution, meaning that people having the "head" predisposition were less common the farther back in our ancestry we go. This last assertion is essentially a "thought experiment," since I cannot think of any way to prove it using data from the archeological record. The last point also has a corollary, worth stating, which is that "heart" people have "always" existed. Again, the archeological record probably cannot be used to support that assertion. However, I think that both of the alleged head/heart trends over evolutionary time are reasonable, possibly self‑evident, and I shall risk making no further argument to support them as I proceed with my larger argument.

 

I am going to use the terms RBS and LBS as shorthand for "Right Brain Style" and "Left Brain Style" to refer to the beliefs and ways of experiencing life of the "heart" and "head" people. By my subjective reckoning, LBS people are much rarer than RBS people, and most people are a mixture of the two extreme types. To estimate their numbers it will be convenient to make use of readily available polls of religious belief. I shall use "disbelief in God" as a proxy parameter for identifying LBS people (which admittedly neglects culture's influence upon an individual's religious beliefs). One could also use "disbelief in immortality, or spirits, or astrology," but these questions are less frequently included in polls. Based on the frequency of atheism in the American population, only 4% of people are LBS (and 0.8% are hard‑core LBS). The 4% estimate may strike some as too small; I agree, and note that since the atheism rate in Europe is greater than in the United States, the 4% value has probably been depressed by the traditional strength of religion in the U.S. from the country's founding (Hofstadter, 1969). Perhaps 5 to 10% would be a better estimate for the LBS rate among the Western world's population. Among elite groups of professionals the LBS rate is higher. For example, among physical scientists who are members of the National Academy of Sciences, the rate of disbelief in God rises from the general population's 4% to an impressive 79% (Larson and Witham, 1998)!

 

In order to describe a conjecture on the evolution of the human LBS/RBS distribution I will describe how one might measure an individual for the purpose of scoring him for placement on an LBS/RBS spectrum. Let us begin with IQ tests to illustrate how a person's LBS/RBS score might be measured. Every IQ test that I'm aware of has many subtests, and these are usually averaged in two groups. The two groups were devised before LB/RB studies showed that brain competencies are lateralized. Nevertheless, it is clear that one part probes mostly RB posterior lobe function while the other probes mostly LB posterior lobe function. For example, the "verbal" and "performance" scores for the WAIS and Woodcock‑Johnson tests are crude measures of competence of the posterior lobes of LB and RB, respectively. With sufficient motivation I am sure that someone experienced in psychological testing could devise an objective measure that scores how LB and RB posterior lobe competencies are employed by the frontal lobes to produce the subjectively recognized RBS and LBS individual. In addition to a reworking of existing subtests, such as the WAIS and Luria Neuropsychological Investigation, I would suggest including questions about belief in angels, ghosts, spirits, communicating with the dead, telepathy, precognition, prayer, voodoo, astrology, touch therapy, life after death, and God. Assuming an effective LBS/RBS test could be devised, it would then be possible to imagine the "thought experiment" of testing a population of people from different times in our evolutionary past.
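As a sketch of what such a combined score might look like, the snippet below pairs a lateralized subtest difference with a checklist of supernatural beliefs. The weights, the belief list and the use of a simple verbal-minus-performance difference are placeholders of my own, not a validated LBS/RBS instrument.

    BELIEF_ITEMS = ["angels", "ghosts", "spirits", "telepathy", "precognition",
                    "astrology", "life_after_death", "god"]

    def lbs_minus_rbs(verbal_score, performance_score, beliefs_held):
        """Toy 'LBS minus RBS' score: more positive leans LBS ('head'),
        more negative leans RBS ('heart').  All weights are arbitrary."""
        lateral_term = verbal_score - performance_score   # crude LB vs RB posterior-lobe proxy
        belief_term = 10 * len([b for b in beliefs_held if b in BELIEF_ITEMS])
        return lateral_term - belief_term

    # Example: strong verbal subtests, average performance subtests, one endorsed belief.
    print(lbs_minus_rbs(verbal_score=125, performance_score=105, beliefs_held=["god"]))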

 

Evolution of LBS

 

If pressed to estimate when LBS people began to appear, I would say approximately 200,000 years ago, which is before the split of the three principal races of man: African, Asian and Caucasian (Rushton, 1995). Before then, everyone was all "heart." It may be worth illustrating how the population distribution across an LBS/RBS spectrum evolved from before the appearance of brain lateralization to the present. I think the change was not a "shift" but rather the rise of one "wing" of a distribution. The population distribution change would have been "driven" by the existence of niches that were newly within reach of being occupied by an otherwise slowly evolving human species.

 

The following two figures are meant to illustrate an imaginary experiment in which our primitive ancestors are measured by an LBS/RBS test and scored on an LBS/RBS spectrum. Figure 13.01 shows what might have been measured prior to the exploitation of artisan niches, let us say 300,000 years ago, before our cerebral hemispheres were greatly lateralized. At that time the only niches being exploited required "LBS minus RBS scores" in the 70 to 90 point region, which in this illustration is well matched by the population distribution.

 

Figure 13.02 shows a hypothetical population distribution after new niches appeared and began to be exploited by a subset of the population. The new niches might have been the use of language in hunting, toolmaking, horticulture, animal domestication, animal breeding, construction of food cache buildings, or anything requiring nearly full‑time dedication and requiring conceptual thought or sequential thinking modes. I have indicated that the population distribution is not symmetrical, but has a bulge at the high "LBS minus RBS" end of the spectrum. When comparing individuals from the two ends of the spectrum there would be a stark contrast, and it would be tempting to say that they represent two distinct "types." The population distribution does not have to be noticeably bimodal for this impression to exist.


Figure 13.01 Hypothetical distribution of primitive population on an imaginary LBS/RBS spectrum. The boxes along the baseline represent niches that exist and can be feasibly exploited at this stage of human evolution.

 

Figure 13.02 Hypothetical distribution after new niches appear, perhaps associated with a climate change, showing how the population distribution might change in response to the new opportunities for niche exploitation. The right-most symbols represent "artisan" niches requiring a sequential type of thinking that only a left brain can perform.

I suspect that if a valid LBS/RBS test could be devised, results from today would produce a distribution resembling this figure. The two cultures would be represented by scientists, engineers and lawyers having jobs located by the right-most symbols, whereas salesmen, entertainers and the rest of the population would have jobs represented by the other symbols.
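
Since I cannot compute the real distributions, the following toy simulation, in Python, merely illustrates the kind of shape I have in mind for Figure 13.02: a large ancestral bulk plus a small high‑end "wing." The means, spreads and the 8% artisan fraction are invented for illustration and are not calibrated to the figure's axis.

    import random

    def simulate_population(n=10_000, artisan_fraction=0.08):
        """Toy mixture model of the hypothetical 'LBS minus RBS' spectrum:
        most people cluster in the ancestral mode, while a small wing
        occupies the newly opened artisan niches (all numbers invented)."""
        scores = []
        for _ in range(n):
            if random.random() < artisan_fraction:
                scores.append(random.gauss(40, 10))   # the new high-end bulge
            else:
                scores.append(random.gauss(-20, 15))  # the ancestral bulk
        return scores

    scores = simulate_population()
    wing = sum(s > 20 for s in scores) / len(scores)
    print(f"fraction above +20 (the artisan wing): {wing:.2%}")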

 

The conditions that favor the evolution of LBS are cultures rich with niches for the analytically inclined, which are abundant in today's technical world. In previous generations a Bill Gates would have been a lackey to a king, an artisan “on call” for odd jobs the king assigned. Just as people are born with different abilities for scoring well on IQ tests, it is inevitable that people are born with different predispositions for being LBS versus RBS. I believe that some people are thus pre‑adapted to become LBS, and to prosper within a culture that rewards LBS, whereas the vast majority would be destined for life as RBS, or something in between.

 

Those people who are genetically predisposed to being strongly RBS will be forever resentful of the minority of strongly LBS people among them, especially if the culture rewards LBS talent as much as it does today. The vast majority of people will merely feel "uneasy" about the presence of strongly LBS people. This inherent resentment of one type for the other is hinted at in C. P. Snow's book The Two Cultures (1961). It is also possible to gain insight into the dilemma posed by LBS individuals in a predominantly RBS society by reading between the lines in Resentment Against Achievement (Sheaffer, 1988), Anti-Intellectualism in American Life (Hofstadter, 1969) and The Great Roob Revolution (Price, 1970).

 

In today's Western culture, dominated by the products of LBS achievements, it is easy to portray primitive people as primitive. Such pronouncements may not be welcomed by the RBS majority, for they recognize a resemblance of their outlook with that of the primitive. For example, RBS people tend to see all things in terms of the animate, whereas LBS people tend to see things in inanimate terms (cf. Ch. 2). The primitive, who is by inclination an adherent of an RBS world view, imagines that spirits reside in the trees, the wind, lightning, and all other natural phenomena. They conceive of the "forest as parent" and "a giving environment, in the same way as one's kin are giving" (Bird, 1990). The Inuit "typically view their world as imbued with human qualities of will and purpose" (Riddington, 1982). American Indians in pre-Columbian times also viewed the world as controlled by spirits (Aleshire, 2001).

 

In contrast, the scientist has cleansed his world view of these ancient spirits, and he tries to perceive things as mechanistic (this attribution is for the hard‑core LBS scientist, who is more likely to be found among the elite, high‑achieving scientist population). The hard‑core LBS person is a reductionist, who automatically perceives all phenomena as the inevitable unfolding of a mechanistic universe (only a minority of high‑achieving scientists would object to this characterization). If you like what you're reading, then you, like the author, are extremely LBS.

 

As an aside I will present an amusing example of the differences between the approaches to understanding something by an LBS and an RBS person. It occurred at a conference on "Imagination and the Adapted Mind: The Prehistory and Future of Poetry, Fiction and Related Arts," UCSB, August 26‑29, 1999. The invited speakers seem to have been intentionally drawn from the core of the opposing “two cultures” camps. It was amusing for me to watch a hard‑core RBS participant's consternation over the fact that inanimate objects enter our mouth, and inanimate objects leave at the other end, yet "we ourselves are animate." For him that was a dilemma, requiring appeal to the magic of words, which he provided, referring to "transubstantiation" ‑ as if this magical invocation explained something and solved his dilemma. But it is only a dilemma for the RBS person. For me, a hard‑core LBS person, all so‑called animate objects are more accurately to be thought of as "inanimate automatons." We are composed of the same atoms as everything else, and no magic transubstantiation occurs when the atoms of food and water enter or are incorporated into our bodies. Those atoms are subject to the same laws of nature that moved them before they became part of this exalted entity we call human. The RBS person, being unaccustomed to thinking of the human body as a mechanism, would naturally be troubled by this trivial phenomenon.

 

The modern world rewards left‑brained styles of thinking more than at any previous time. The LBS people of today appear to be "pre‑adapted" to our times. But pre‑adaptations do not exist by intention. Either they are accidents, in which a small opportunity is exploited by a novel mutation, or a wide niche is exploited gradually by a series of mutations, each adding an increment of capability to the first. I am wondering if the gradual evolution of LB specializations, which underlies lateralization, is in fact a case of the latter. For example, imagine an evolutionary scenario in which a sequential brain circuit enables a toolmaker to produce better spearheads than his brothers, and his genes are rewarded. Next, a man in that line, with the slightly more sequential brain circuit in his LB, whom the tribe may designate as its official toolmaker, undergoes another mutation that further improves his toolmaking ability ‑ he might simply demystify the stone that he works with. By this process he may lose the belief that the stone has feelings, and is hurt when struck, or that the stone wishes to be elsewhere, or that the stone will invoke the spirits of its stone relatives to wreak revenge upon the heartless chipper ‑ and this is an asset for performing his toolmaking task. We now have someone who not only tends to depersonalize the stone he must work with, which frees him to handle the inanimate stone better, but who tends to demystify in general. He is more likely to perceive the wind as really not caring about humans. He might be inclined to discard the many spirits that so preoccupy his fellow tribesmen - and if he is smart he will keep these new beliefs to himself.

 

What I am proposing is an evolutionary selection and maintenance of a specific type of variation within a species, for there are reasons the majority of tribesmen should maintain their spirit‑styled outlook. Spirits, I shall assume, are a short‑hand way of enabling people to deal with each other, and the majority of tribesmen are faced with having to compete with their fellow men rather than the stones to which the lonely toolmaker is relegated. I view the toolmaker as heralding an era of the division of labor. He is the prototype for the artisan, who occupies the broader tribal niche of improving weapons, inventing new ones (like the bow and arrow), refining horticulture, building food storage huts, domesticating animals, inventing the pastoral way of life, and later, irrigation, wheeled carts, atomic bombs, the internet and nanotechnology. The opening up of artisan niches, starting with the toolmaker, provided ever greater rewards to the man who viewed nature as spiritless, or inanimate. The human answer to this ever‑growing opportunity was to invest more in left‑brained circuitry specialization for dealing with the inanimate world. These lucky new people may have paid a price in being less apt in dealing with human relationships, but this would often be mitigated by a tribal understanding that these people were "different" and should be left alone to do their artisan work. They may have been looked upon as childish, or unmanly, since they were probably exempt from pillaging and other proud warrior exploits. The tribal chief might have thought of the artisans the way people later thought of the herds of animals that they domesticated: they had to be "kept" because they were useful to the tribe.

 

This speculation, that some people are naturally LBS, is meant to explain why the two cultures phenomenon is so pronounced. It suggests that some people are fated at birth to occupy either one of the small but growing number of LBS niches, or instead to occupy the many though decreasing number of RBS niches. Other people, being only partly inclined to artisan niches, could go either way in their personal development, and they would be wise to take a reading of the likely rewards for competing career paths when they make their (unconscious) life's path decision.

 

The Threat of Individual Liberation

 

The evolution of capable prefrontal lobes, and a newly‑fashioned LB specialized for analytical insight, obviously helped our ancestral genes survive. This much can be safely concluded by merely citing that we have the hardware in question. But new tools can sometimes cut in two directions, metaphorically. A brain that can put two and two together can theoretically put itself to uses that are subversive from the perspective of the genes; such an individual is at risk of liberating himself from the enslavement of "being a tool for the community"; he may think taboo thoughts of "becoming the captain of one's own ship, enslaved to no one."  The toolmaker is at risk of “walking away” from his assigned role as a tool for the tribe!

 

A creative prefrontal lobe might ponder scenarios in which the individual essentially "thumbed its nose" at the tribe by going off into the woods to live a simpler life. Before the creation of super-tribes, which offered specialized niches for strangers to fill, this way of thinking would have been suicidal. But during the past 12,000 years, during our present Holocene interglacial, some options for walking away from the genetic agenda have become feasible, and should be a real "concern" to the genes.

 

Let's review the main theme of this book, as presented in the Introduction. For each hypothetical behavior that has an innate component, we are to make an evaluation from the perspective of the genes, and then make an evaluation from the perspective of the individual. We must invent the concept "individual welfare," or "individual best interests," to continue this experiment. I shall infer its essence by presenting just two brief examples of thought experiments to illustrate the concept "outlaw genes." A larger set of examples, with fuller explanations, is presented in Chapter 13.

 

The individual who invests in raising offspring is embarking upon a long‑term and exhausting life course. Sex leads to babies, and for women this means the commitment of a life of parental enslavement. For "dutiful" fathers, sex and babies can lead to a similar enslavement. If a man or a woman gave thought to this situation, and instead decided to pursue an individual‑fulfilling life of aesthetic pleasures ("smelling the roses," watching sunsets, appreciating music) that person could in theory lead a less exhausting and more tranquil and fulfilling life.

 

For my second example, consider the young man who is expected to participate as a warrior in raids of neighboring tribes (or nations, if our locus is the Holocene). Marauding, pillaging and engaging in battle are dangerous activities. The individual who says "no" to tribal offensive campaigns, thereby flaunting unthinking patriotism, and who can successfully avoid combat without paying the price of humiliation, would be safeguarding his best interests.

 

Let us now pretend that we can calculate outcomes of various actions, and ascribe to them some quantified measure of Individual Welfare Value, IWV. Let us also suppose that we are able to identify how strongly each gene contributes, under average conditions, to eliciting each action. I have in mind a model for generating behavior in which a person's Genotype interacts with his Environment to produce an expressed Phenotype, or GEP, a process best described by Symons (1979) and treated in Chapter 6. By adopting some average environment we can in theory calculate 1) the probability that a given gene will elicit the action under question, through the GEP analysis, and 2) the effect of this action upon the frequency of genes in the future genome. This allows us to link an action to a measure I will call Genetic Survival Value, or GSV. If we can do these things, then we can plot a point corresponding to that gene in IWV and GSV space. To wit, refer to Fig. 13.03, first presented in the Introduction and reproduced here.

 

Since the concept of "outlaw gene" is so important, its explanation is worth repeating. In the figure we have four quadrants. The upper‑right quadrant is where the actions elicited by most genes reside, as they provide value to both genetic survival and individual welfare – such as breathing and eating. The lower‑left quadrant contains mutant mistakes, and genes that elicit such actions should survive no longer than the individuals who carry them. Genes in the upper‑left quadrant are unlikely to exist, except by mutational accident, since there is no way for Nature to code for an action that destroys the code. The most interesting quadrant is the one in the lower‑right. The actions found here are actions that should be selected by Nature just as strongly as for those in the upper‑right quadrant since they benefit the genes that code for them; but the actions in this lower‑right quadrant exist in the face of harm done to individual welfare. This is the quadrant that motivated my interest in sociobiology on February 23, 1963, when I first created the figure.

 

Figure 13.03. Gene Value and Individual Welfare scatter diagram.

 

The "outlaw genes" are the ones found in the lower‑right quadrant. There is no way for natural processes to eliminate outlaw genes. Indeed, natural processes will reward them when they mutate into existence, and if they conflict with other genes a balance will naturally be struck. The balance could involve some form of accommodation, such as making the individual believe that the actions in question are attractive. The more common solution, however, is for the genes to invent emotional rewards that are meant to overcome any logical hesitation that might exist by individuals who are otherwise unconvinced by the instinctive imperative. Short of taking evolution into our own hands, humans will be burdened forever by the "victimization" that outlaw genes perpetrate upon the individuals they make.

 

An LBS individual might chance upon the thought that some things everybody does are not in a person’s best personal interest, and if his social setting provides a way to avoid the unwanted action he might actually plunge into self‑liberation by resisting the instinctive, self-defeating act. For example, a woman might choose to not have children, or a man might find a way to avoid going to war. Horrors, if the genes allowed these acts of individual liberation to continue! So, given that liberated people would have fewer offspring, we should expect to see some evidence that a tendency to think liberated thoughts would be discouraged by the genes, or that thinking the way the genes "want" us to think would be attractive. Is there any such evidence?

 

The Two Cultures

 

Anyone who today reads C. P. Snow's The Two Cultures and the Scientific Revolution (1961) would be struck by resemblances of Snow’s descriptions to LB and RB styles of thinking. During the 1980s neuropsychologists were making headlines for research that revealed LB/RB differences, due mostly to the study of split‑brain patients who had their interconnecting corpus callosum surgically cut to control epileptic seizures. Snow contrasted "literature academics" with scientists, as they seemed to epitomize the extremes of a continuum based on his personal experience. His work during previous decades had placed him in contact with both groups, and he became fascinated by their differences and their inability to communicate. Not only could they not communicate, they didn't want to. It was mostly the literati who shunned the scientists, and made disparaging remarks about them, while the scientists went about their work full of optimism that their importance was in the ascendant and they were eclipsing their rivals.

 

The word “rivals” is apt, because, as the previous sections of this chapter have made clear, I claim we are really talking about people who think with an LB style versus those who think with an RB style, which are products of rival brain circuits. The community of RBS intellectuals has frequently "made fun of" LBS people, as if they needed to discredit them in order to compete with them. I recall from my college days a prankster's sign posted on an arch of an engineering building that read "All my life I wanted to be an engineer; and now I are one!" At the time I got the point, and found it funny in a bittersweet way (I was enrolled in the School of Engineering). But now I have a better appreciation of the motives for the humor.

 

The field of anthropology was "hijacked" by Franz Boas early in the 20th Century. His student disciples, Margaret Mead, Ruth Benedict, and others, worked with Marxist zeal to discredit the influence of genes on behavior. The main determinants of behavior were supposed to be the wretched influences of a corrupt society acting upon an innocent and malleable human nature. Anthropology is still undergoing turmoil, as the forces of rational LBS people seek to rescue it from a century of RBS abuse. The apologists for the RBS suppression of nature‑oriented investigators have occasionally revealed their motivation; they were afraid of the political ramifications if it was generally believed that badness in people is inevitable.

 

The latest RBS campaign to discredit LBS thought has various names, the most common being "political correctness," "deconstructionism" and "postmodernism." Again, there is a Marxist heritage to these new thought enforcers (Ellis, 1997; MacDonald, 1998). Their argument hinges upon a grain of truth, exploited beyond reasonableness. At the most fundamental level, so the argument goes, all things are inherently meaningless and without value, and anyone who espouses a belief is deluding himself that his belief can be measured against someone else's belief. Hence, science is just one more pompous and fashionable belief system, and it is used to hoodwink the general public. Consequently, someone from the humanities should feel free to challenge anything sacred in science, such as gravity, the conservation of energy, and all manner of physical law. And therefore all LBS people are charlatans! Yet, amazingly, their RBS beliefs somehow escape their "fundamental meaninglessness law," and RBS beliefs remain unscathed. Therefore, all RBS people are Truth Tellers! Ellis (1997) states that "In the span of less than a generation, university humanities departments have experienced an almost unbelievable reversal of attitudes, now attacking and undermining what had previously been considered the best and most worthy in the western tradition." It is fitting that the most corrupt field of study is "literary criticism," the field that C. P. Snow chose as exemplifying the culture opposed to science. The demise of the humanities during the second half of the 20th Century is so bizarre, and out of control, that it's an easy subject to get worked up over. I shall refrain from the temptation, in the interest of pursuing a bigger subject (see Weinberg, 2001, for science-based rebuttals to postmodernism).

 

It is no coincidence that the early 20th Century hijacking of anthropology, and the late 20th Century hijacking of humanities departments, have been inspired by Marxist intellectuals. Marxism and communism are political expressions of "collectivism." Individualism has been in conflict with collectivism for as long as human history has been recorded. The Eastern cultures are strongly collectivist. It should be no surprise that China embraced communism, and remains today the staunchest holdout and defender of communism. Remember that the LBS individual is prone to think subversive thoughts, and discover new ways of viewing the underlying motives for the collective force of societies. I claim that the genes are most likely to find the RBS‑prone person a fitting candidate for enforcing the collective (genetic) agenda by monitoring the errant wanderings of LBS intellectuals. Sensing that RBS vigilantes are watching them, LBS interlopers are always looking over their shoulder for the enforcers of gene interests.

 

High achievers have probably always sensed a resentment of their successes by those less able or less motivated. Robert Sheaffer (1988) wrote Resentment Against Achievement: Understanding the Assault Upon Ability to document this interesting social phenomenon. He writes that "Throughout recorded human history the ebb and flow of the love of achievement and the resentment against its successes have been major forces behind the rise and fall of civilizations and empires… Achievement‑oriented values like tolerance, liberty, and the freedom of the individual to work hard and enjoy the fruits of his labor provide the motivation necessary for a civilization to grow and flourish."

 

A superficial assessment of how the genes should perceive achievers might challenge my thesis that the resentful are working on behalf of the genes. But consider that what the resentful are really resenting, and the fear the genes are really responding to, is that the achievers are those unpredictable LBS artisans who can't be trusted to stay loyal to the tribe. A person who leaves the service of the genes to pursue individual goals is not an asset to the genes; rather, such a person is a threat to the genes because his example might inspire others to do the same, and thereby threaten a wholesale abandonment of the collective genetic agenda. Woe to him who dares write a book urging individuals to seek liberation from their genetic enslavement!



CHAPTER 14

TWO CULTURES ‑ PART II

 

"The whole thing is so patently infantile, so incongruous with reality, that to one whose attitude to humanity is friendly it is painful to think that the great majority of mortals will never be able to rise above this view of life."  Freud, Civilization and its Discontents (1930), commenting on religion.

 

Examples from Everyday Life

 

Sometimes a theoretical argument is best illustrated by examples, so I want to start this chapter with the following three newspaper articles.

 

Item 1. On October 31, 1999, Egypt Air Flight 990 crashed shortly after taking off on a flight from New York to Cairo. The cockpit voice recorder and flight data recorder indicated that a lone relief co‑pilot was at the controls at the time the tragic event began to unfold. The co‑pilot said the Arabic equivalent of "I put my faith in God," then disconnected the autopilot, pushed the nose down, and sped toward the ocean below at Mach 0.9. The pilot returned to the cockpit and asked "What's going on?" He took his seat, and pulled up on his control yoke while shouting to the co‑pilot to "pull with me." But the co‑pilot persisted in pushing down, which dis‑engaged the tail elevators' coordination mechanism, causing the elevators to go in opposite directions, which made the plane start to spin. During the dive the engines were turned off. It crashed, killing all 217 aboard. This information was recovered and reported by the American NTSB about 3 weeks into its investigation, and implied that the co‑pilot may have committed a mass murder suicide. However, in Egypt the public reaction was one of disbelief. The Egyptians suggested that the Americans had engaged in a cover‑up to defend the reputation of an American airplane manufacturer (Boeing) and out of a desire to discredit Egyptians. American newspapers began reporting on "the cultural chasm between the two societies." The Egyptian newspaper Al Akbar described the American interpretation thusly: "This hallucination might be accepted in an American movie. But it is difficult to be convinced of this [being done by] a mature and sensible Egyptian." An Egyptian magazine editor summed up the different philosophies: "The American has learned to conquer life and put trust in science and technology, while the Egyptian has learned [that] time is more powerful than the human being, and we cannot stand alone, but that it is better to have God by your side." My summation would be that "The Americans have learned how to conquer life by relying upon the left brain to use science and technology to understand and live effectively in the world, while the Egyptians have continued to view the world with the right brain perspective in which God is one’s guide for serving the genes."

 

Item 2. The Santa Barbara News‑Press reported (1999.02.07) that "LightShift 2000" wants everyone to "meditate for planetary harmony" at noon on the first day of every month. They claim that synchronized meditation is “a catalyst for positive energy” that will affect the surroundings. They believe that "silent, compassionate meditation can shift the collective consciousness of the world, reverse our course of destruction and truly create positive healing change in the approaching millennium. ... If you get people banding together and create a chain reaction of positive energy, that ... creates the light shift." The movement was instigated by a book written by astrologer Ken Kalb, residing in Summerland, CA (near Santa Barbara) titled LightShift 2000 ‑ Let's Turn on the Light of the World, which has sold 12,000 copies so far. A LightShift web site (www.lightshift.com) has been visited 1.3 million times. The stated goal is for a 6 million person mass meditation January 1, 2000, at 12:12 AM. Somehow they determined that if only the square root of any population would synchronously meditate, the overall quality of life would improve. According to my math this means they need only 78,000 people to save 6 billion, not their 6 million. I guess math isn't their strong suit. Even if you're "at work or in a meeting, a short prayer is better than nothing. Or just repeat the affirmation: 'May peace prevail on Earth.'"
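
As a check on the arithmetic quoted above (assuming the article's figure of 6 billion people), here is a one-line Python verification:

    import math
    print(round(math.sqrt(6_000_000_000)))  # 77460, i.e. roughly the 78,000 cited above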

 

Item 3. Some people are an embarrassment to the human race! They keep coming up with weird beliefs that defy logic, which causes the rest of us to conclude that they are “logic challenged.” One example of a recurring belief is that God will put gold crowns on the teeth of the faithful "as a sign." After attending a revival at a church where the magical appearances are reported to have occurred, the faithful will check their mouths, and behold, gold crowns, plates, bridges and bands materialize. This phenomenon first appeared in Argentina in the 1980s, and spread to Mexico, South Africa, Canada, Britain and now America. The claims fade after several years, then reappear. A Canadian went to his dentist to prove his claim, and the dentist reported that he had done the gold work 10 years earlier. When this was reported in the newspapers, the gold teeth reports mysteriously stopped. "Faith springs eternal" because it is in our genes. Fortunately, not all humans have it in the same amount, and those who lack it are blessed with a modern left cerebral hemisphere.

 

Empowerment of the Masses

 

The 20th Century LBS (Left‑Brain Styled) scientist is acutely aware of the threat posed by the newly‑emboldened common man, who is preponderantly RBS and considers his opinion on scientific matters to be just as valid as the expert's. Keay Davidson (1999) captured this concern using the example of Harlow Shapley, a Harvard astronomer who opposed pseudoscience. Shapley was especially irked by the readiness of the Macmillan publishing house to publish the book Worlds in Collision by Velikovsky (an RBS exemplar). Shapley's position was summarized by James Gilbert (1997) in the following manner: Science "...by its very nature should never be molested by popular belief; it is the sole purview of those who understand it." Davidson contrasts this position with that of William Jennings Bryan, who believed in "the democratic community's right to decide the validity of scientific theory" (Gilbert, 1997).

 

Jose Ortega y Gasset wrote in his 1930 book The Revolt of the Masses that the unschooled common man began, in the 19th Century, to view his opinion on all matters as having equal validity to the opinions of scholars who had devoted their lives to the same matters. Part of this new boldness may be due to economic growth that brought increasing wealth to the masses; the new boldness was also nurtured by the entitlements that a democracy creates. But the mass man's mistaken belief was outwardly attributed to the conviction that Truth could be found by looking inward, with earnestness and faith. This subject is treated at greater length in the next section.

 

Oriental Rejection of LB

 

Oriental philosophies, such as the Tao, eschew LB styles of thinking, and attempt to discredit them in their prescriptions for thinking and being. In their place they counsel ways of thinking and being that are transparently RB‑styled. I will assume that the reader is familiar with the late 20th Century neuropsychological findings on brain laterality, as reviewed in Chapter 7. During the 1980s there was so much reportage in the popular press about brain laterality findings, many of which were indeed astounding, that some purists referred to the phenomenon of an overenthusiastic embrace of these new ideas by coining the term "dichotomania." During the 1990s the purists effectively squelched those inclined to even a reasonable version of dichotomania, so the exploration of RB versus LB issues was "still‑born." Be warned; I am a dichotomaniac!

 

The Tao complains that "we have learned to put excessive reliance upon central vision, upon the sharp spotlight of the eyes and mind..." and that "we cannot regain the powers of peripheral vision unless the sharp and staring kind of sight is first relaxed." Alan W. Watts (1957, p.19). A neuropsychologist would immediately recognize the brain laterality connection of this thought. The old RB is continually monitoring sensory input for signs of danger, and it does this in a computationally fast and subconscious manner, relying upon parallel processing by interconnected neural networks. When the conscious mind focuses upon something, it most often does so under the direction of LB, and uses the high resolution central visual field for this task. It is uncanny how the Oriental has captured this signal difference in left/right roles for looking at the world, and prefers one over the other.

 

The I Ching encouraged the Chinese mind to arrive at "decisions spontaneously, decisions which are effective to the degree that one knows how to let one's mind alone, trusting it to work by itself." (Alan W. Watts, op. cit., p.19). Again, this advice essentially counsels a person to "turn off" LB, and trust RB to make decisions. (This is reminiscent of the advice to tennis players, golfers, businessmen, etc. presented in a series of The Inner Game of... books by W. Timothy Gallwey, starting in 1974.)

 

Alan Watts describes Chuang‑tzu's advice "The perfect man ... grasps nothing..." Also, "it 'fuzzes' itself a little, to compensate for too harsh a clarity." Later, Watts writes "...both Tao and Confucius thought that the natural man is to be trusted." (Note that "natural" must refer to primitive, RBS man.)

 

In Oriental thinking there is a strong resentment of LB intrusions. They extol the virtues of a form of unconsciousness, something "exponents of Zen later signified by wu‑hsin, literally 'no‑mind,' which is to say, un‑self‑consciousness. It is a state of wholeness in which the mind functions freely and easily, without the sensation of a second mind or ego standing over it with a club." (Alan W. Watts, op. cit., p.23). The term "second mind" that Watts uses refers, quite transparently, to LB. “Methinks RB doth protest too much!” Why are the Orientals so against the intrusion of LB? I believe a balance is possible, but my optimum balance point is different from theirs!

 

Consider this passage from Watts (op. cit., p. 23) where he quotes Lin Yutang: "The baby looks at things all day without winking; that is because his eyes are not focusing on any particular object. He goes without knowing where he is going, and stops without knowing what he is doing. He merges himself with the surroundings and moves along with it. These are the principles of mental hygiene." LB, being the last to evolve phylogenetically, is also the last to develop ontogenetically, so the baby begins life with only the rudiments of a functioning LB but a more fully functioning, behavior‑controlling RB. In essence, the Oriental mind is counseling adults to become like the baby, and forsake, as an individual, evolution's latest accomplishment for us as a species. Why would anyone counsel that unless they’re dutiful agents for the genes willing to squelch individualism?

 

Eastern Thought as a Genetic Solution to the Dangers of Intelligence

 

I used to wish for peace of mind, to be "centered" and to be rid of "inner mental turbulence." This desired mental state is often sought through meditation, a component of Eastern Thought. However, I now look at this ever‑more popular pursuit with mixed feelings.

 

Recall that when a baby comes into the world its only memories are of a peaceful womb, where the temperature was constant, noises were muffled, and there was no danger of being hurt by falling or bumping into things. After birth the task of maintaining physiological conditions within limits is thwarted by the conflicting task of exploring the world. Physical mobility and social interactions are a challenge to maintaining physiological homeostasis. (Poor baby; it's beginning an Odyssean mission, for it has a job to do ‑ for the genes.)

 

Engaging the outer world requires mental vigilance and physical effort, which often produce displacements from homeostatic equilibrium. The overriding task of safeguarding homeostasis sometimes requires that the individual withdraw from the world. Fortunately, the baby is endowed with instincts for learning how to balance these conflicting goals. The dual goals of engaging the outer world and later returning to equilibrium must become a lifelong, recurring theme.

 

Growth requires that each exploration become bolder and more engaging of the outer world. Childhood is a time for practicing adult skills. Adulthood, however, is a time for performing the business of life; and the business of life is to place one's genes in as many offspring as possible so that they may grow up to do the same. Thus, individuals do all kinds of things that an alien observer would find remarkably silly, irrational, and ill‑self‑serving, such as waging warfare on neighboring tribes, sacrificing oneself unquestioningly to questionable patriotic causes, striving for status, wooing mates, making love, raising offspring and performing all manner of ridiculous rituals.

 

Because our recent ancestors evolved to be more intelligent they have placed us ever closer to a dangerous psychological border, a fuzzy border on the other side of which the individual thinks for itself, and is prone to consciously abandon irrational acts that in the past assured genetic survival. Such an individual asks "Why? Why should I do as my ancestors have done? Why should I wage war, strive for status, woo mates, make love, create babies, commit to decades of burdensome parenting, and perform ridiculous rituals?"

 

In short, the individual is asking why it must be enslaved in service to its genes the way its ancestors have been, when it is possible to become liberated from these onerous and exhausting tasks. For our genes these are dangerous thoughts! And anyone who believes the genes have neglected to deal with this challenge to their supremacy should think again.

 

It is true that the genes were rewarded when they created intelligent brains, for intelligent people were better at waging war, striving for status, wooing mates, etc. However, genes were punished for creating intelligent brains, for those individuals were more likely to question their genetically‑assigned slavish role. How might the genes have the benefit without the cost? Solution: new genes were selected that placed blinders upon intelligence, and inhibited "bad" thoughts! Bad thoughts, of course, are whatever leads to individual liberation. Any gene that sanctioned submission to continued enslavement was "good" and thus selected, whereas any gene that rendered an individual prone to question authority and the way things are, and attracted to eschewing enslavement, was "bad" and was selected out. All of these changes happened naturally, and inevitably, as the effects of the blind "forces" of natural selection.

 

I now suggest that Eastern Thought incorporates mechanisms for assuring continued genetic enslavement by inhibiting aspirations of individual liberation. Specifically, I am suggesting that one of the purposes of Eastern Thought is to preserve the individual's acceptance of his condition in life. This compliance is accomplished by inhibiting or discrediting all forms of skepticism, questioning the existing order, and any thoughts that might lead the individual to abandon his network of family and tribal duties. When life is tough, Eastern Thought consoles by sustaining the belief that "this is the way things are, and you are a mere part of an immense whole; don't complain and don't fight it; be submissive." The individual then fails to ask if there are roads not taken, alternative decisions that would have been wiser, and changes that might still be made to better one's individual well‑being. For the genes, no individual sacrifice is too great!

 

If meditation is palliative enough to keep an enslaved person enslaved, then the genes will produce meditation. If a person believes that "bad things happen because spirits need more attention," and if this belief discourages a slave from taking matters into his own hands and breaking free, then the genes will create minds that deflect attention from real causes by predisposing them to explain things as the doings of neglected spirits. If an excessively curious left brain asks too many subversive questions, Eastern Thought will subjugate it so that it will be more obedient to the right brain. It is possible that the Oriental brain is wired to prefer culturgens that assist the right brain in retaining control of the individual.

 

As my left brain is dominant over my right in terms of setting values upon things, it is inevitable that I must accept "mental turbulence" over "mental quietude." I worship the individual, and applaud his efforts at liberation. I also detest any attitude that inhibits this individual quest, especially if it smacks of a trick by the genes to keep the individual enslaved forever in service to themselves. Is my attitude lacking in gratitude? Should I worship the genes, indiscriminately? No! I shall worship only those genes which "respect" their individual creations!

 

I therefore question the wisdom of withdrawing from the tumult of questioning, as I challenge the ideas of timid thinkers and pursue individual liberation. You can have my secret mantra, the popular word "one," as I enthusiastically proclaim "bring on the turbulence!"

 

Fiction

 

Our distant ancestors were dumb enough to do whatever the genes wanted. But lately, with the emergence of powerful prefrontal areas and a re‑engineered LB possessing newfound powers of rationality, individuals are more able to challenge the genes by ‑ sit down for this one ‑ thinking!

 

Many tools are employed by the genes to keep us in their service. Emotions have been adjusted for this purpose, walls to subversive thoughts have been erected, and curious rewards have been put in place for tricking us into wanting what is often bad for us as individuals.

 

The genes have targeted sex and patriotism for special reinforcement since the thinking mind is capable of discovering how harmful they can be to individual welfare. Incautious sex exposes the individual to 1) disease, 2) bodily harm from a partner who does not want to be cuckolded, or 3) if one follows the rules, the prospect of a life‑long burden of child rearing. Enthusiastic and thoughtless patriotism leads one into war, which exposes the individual to lifelong injury or quick death. For years I have been using this pair, sex and war, as the strongest examples of the individual's worst genetic enemy, placing them in my figure's “outlaw gene” lower‑right quadrant.

 

What a coincidence, then, that sex and violence are the two most reliable themes for selling fiction. A theory for imagination that doesn't account for this salient feature of fiction is incomplete. I am suggesting that the genes are "concerned" that we might abandon our appetites for sex and violence, as a person might do if he allowed himself to be guided by this new thing that a fast‑evolving LB has come up with, called "logic."

 

The genetic preoccupation with sex and violence helps explain them as persistent themes in fiction, but the genes influence fiction in other ways. Everyone has different roles to play in life. The roles may conflict, as when a mother secretly celebrates her son's philandering, thus spreading her hitch‑hiking genes more widely in the local gene pool. Or when a father applauds his son's virility, the better to plunder neighboring tribes and rape their women, thus sowing his hitch‑hiking genes more widely and winning resources for future misdeeds. These mental conflicts are capable of producing "cognitive dissonance" (Festinger, 1957), and literature is a way of working out the conflict so that the cognitive dissonance is not disabling.

 

Some attitudes and behaviors which are not tolerated within one's tribe have to be elicited when dealing with people from the neighboring tribe. Epic tales seem fashioned to hone this distinction and inspire awe for the desired performance ‑ even though it jeopardizes individual welfare and makes no logical sense from the individual's perspective. Chimpanzees don't need inspiring epic tales to fling them into war, because they do not have left brains that cause them to question foolish actions. But humans do, so epic tales are used to enforce unthinking adherence to the genetic script.

 

Not all fiction is meant to keep the individual enslaved. Mothers might recite Hansel and Gretel stories to their children to alert them to the dangers of step‑parents, who are prone to kill stepchildren so they will not compete for parental resources with the children they themselves produce (Daly and Wilson, 1988). Monster stories are psychological preparation for marauders from a neighboring tribe. These stories can in fact be helpful to the individual by reinforcing the existence of unpleasant realities that children must learn.

 

The genes must deal with both categories of story, those that maintain genetic enslavement and those that are actually helpful to the individual. It may have happened that the genes have created pre‑wired brains that are "attracted to" stories of both categories. It may not matter that a few unenslaving or uninstructive stories catch a free ride; it matters more that the vehicle for safeguarding the real message is preserved.

 

We should not be surprised when we find evidence that the brain has pre‑wired us to reject stories that celebrate individual liberation from the tribe, and all the other enslavements that the genes demand. The human taste in fiction seems designed to keep us individuals on the straight and narrow path the genes have set for us.

 

Since adaptation really means "adaptive for the genes," not adaptive for the individual, every thinking human must have an ambivalent attitude toward those little molecules that give us life. We were not part of the negotiation of life's conditions, but we are awakening to the option of saying "no thanks."

 

Yet there's a danger to "civilization" when LB‑style individuals have a prominent presence. I save this matter for the next chapter.

 

The Dilemma of Spiritual Scientists

 

The annals of science have many examples of weird coexisting beliefs. In this section I wish to present a few examples. They illustrate that the two sides of our brain are able to retain their preferred styles of thinking without persuading the other. I include these examples at the risk of creating the impression that most scientists are capable of the same level of intellectual dissociation that permits profoundly incompatible styles of thinking to coexist in one brain. It is important to realize that scientists, on the whole, eschew non‑science relics, such as spirits, ghosts and prayer. The different scientific disciplines differ in their success with this (Larson and Witham, 1997, 1998, 1999); the less disciplined studies (such as sociology) fare worst, and the most rigorous sciences (such as physics) fare best. Because it is so rare to find spiritual people in the physical sciences, and because it is rarest among the most accomplished physical scientists, it should be most instructive for us to study examples from among this small group that are the exception to the rule. For them, the dissonance of their thoughts should be most glaring. For my examples of "spiritual scientists" I have chosen Newton, Townes and Dyson.

 

Isaac Newton (1643‑1727), who some describe as the greatest scientist of all time (not my assessment), formulated "Newtonian physics," a view of the natural world that is the basis for "reductionism." Newtonian physics starts with F=ma to explain motion (Chapter 1). Newton dealt most with the force of gravity, as it governs planetary motions; but the concepts were found to hold well for electricity and magnetism (until the 20th Century revisions of quantum physics, which must be taken into account when dealing with interactions at microscopic scales). Newton "invented" calculus (at the same time as Leibnitz). These accomplishments are a tribute to Newton's left brain. But his right brain was equally active. He is quoted as saying "The Supreme God is a Being eternal, infinite, absolutely perfect. We must believe that he is the father of all things, & that he loves his people as his children that they may mutually love him & obey him as their father." And also "This most beautiful system of the sun, planets, and comets, could only proceed from the counsel and dominion of an intelligent and powerful Being." (Newton was an early adherent of what is now called Intelligent Design, a tool of the devout for questioning Darwinian evolution.) The following is from a Christian web page, expanding upon the coincidence that Newton was born on Christmas Day: "The very incidents surrounding his birth seemed to indicate God had some special plan for him ‑ at least that's what Isaac Newton thought." Also from this web page: "The design of the eye required a perfect understanding of optics, and the design of the ear required a knowledge of sound. The solar system itself could not have been produced by blind chance or fortuitous causes but only by a cause 'very well skilled in mechanics and geometry.' Gravity itself was an active principle God used to impose order on the world. Newton spent a tremendous amount of time studying the Bible, especially the prophetic portions of Scripture. He believed history was under the dominion of the Creator, and prophecy showed how the Creator was to establish His earthly kingdom in the end. His Chronology of Ancient Kingdoms, Amended used astronomical data to argue that the Bible was the oldest document in the world and that the events of Biblical history preceded all other ancient histories."

 

Newton wrote as much on religion as on science, but rarely published the former (mindful of their heretical nature). He believed in the possibility of transmuting the baser metals into gold, and he wasted much time on this project.

 

Ironically, Newton was not a Newtonian, as the term is used today. Much of Newton's non‑science beliefs are ridiculous in light of today's more thoroughly developed understanding of physics and evolution. It is easy to criticize someone from the past, so on that account let us turn to contemporaries for examples of spiritual scientists.

 

Charles Townes (1915‑) won a Nobel Prize in 1964 for his contributions to quantum electronics. He invented the maser (microwave amplification by stimulated emission of radiation) and is co‑patent holder of the laser. In spite of this strong background in science and technology, Townes has no trouble retaining a religious faith. In an interview (http://www.ssq.net/html/brief_interviews.html) he states that "the physical laws are laws that God made," and the universe has a "purpose and meaning." On many occasions his views seem to negate each other, creating the impression of a slippery evasiveness. For example, when asked how he reconciles the lack of a place for God in the equations of physics, he answers "Well, I would say that we don’t know that the equation is complete" as if to imply that there is room for religion to somehow sway the unfolding of physical events, which sounds like RB telling LB what to say. Yet, later, when asked about God affecting the universe outside physical law, he answers "...in terms of what we know at present, our present laws allow no room for a separate action of God. While things are not deterministic, nevertheless there is no room for some superimposed outside force coming in and affecting things. There's no room within our physical laws. [That's his left brain talking; but get ready for a shift.] However, that doesn't trouble me as a religious person because I recognize that there are a lot of things we don’t understand, and that there may be such a possibility which is there but we don’t understand it yet. So, for me, that's not a problem; it's an interesting puzzle, but not a problem." With oblivious disregard for the devastating denial of God just spoken by his left brain, he ends with a shrug of the left shoulder (controlled by RB) and says "that's not a problem."

 

Perhaps we can attribute this odd coexistence of a primitive outlook with an impressive understanding of modern physics to his upbringing in South Carolina, America's bible capital. He's one of the few outstanding scientists who have retained religion during their ascent to notable achievement in the physical sciences. He's not alone, though; there are others who are somehow able to maintain the two world views in the same brain.

 

Freeman Dyson (1923‑) is another "spiritual scientist." His commitment to bringing religion and science together earned him the Templeton Prize (and almost a million US dollars). In his acceptance speech (March 22, 2000) he stated "I am saying to modern scientists and theologians: don't imagine that our latest ideas about the Big Bang or the human genome have solved the mysteries of the universe or the mysteries of life... Science and religion are two windows that people look through, trying to understand the big universe outside, trying to understand why we are here [make note of the phrase “two windows”]. The two windows give different views, but they look out at the same universe. Both views are one‑sided, neither is complete. Both leave out essential features of the real world. And both are worthy of respect."

 

I have great respect for Freeman Dyson, both as a scientist and a creative thinker, and a compassionate champion of people who are unfortunately "left behind" as others march forward to ever greater prosperity. Dyson writes "The great question for our time is how to make sure that the continuing scientific revolution brings benefits to everybody rather than widening the gap between rich and poor. To lift up poor countries, and poor people in rich countries, from poverty, to give them a chance of a decent life, technology is not enough. Technology must be guided and driven by ethics if it is to do more than provide new toys for the rich. Scientists and business leaders who care about social justice should join forces with environmentalists and religious organizations to give political clout to ethics. Science and religion should work together to abolish the gross inequalities that prevail in the modern world. That is my vision, and it is the same vision that inspired Francis Bacon four hundred years ago, when he prayed that through science God would 'endow the human family with new mercies.' "

 

Dyson understands F=ma and quantum physics as well as anybody (better than me, for sure). It baffles me that he does not embrace the concept of a "rigid universe." How can someone so knowledgeable in physics be so blind to the primitive pedigree of religion and the misleading guidance it fraudulently purports to give?

 

Anybody reading this should ask "How can this author be so sure of himself, especially when he differs with a polymath scientist as esteemed as Freeman Dyson?  He must consider the possibility that the person following the erroneous path is himself!" Yes, I have considered this, and it is possibly true. But is it also possible that Dyson is mistaken?

 

I think Dyson is swayed by the pragmatic way religious groups help serve community goals. His Templeton award acceptance speech included the passage "Trouble arises when either science or religion claims universal jurisdiction, when either religious dogma or scientific dogma claims to be infallible. Religious creationists and scientific materialists are equally dogmatic and insensitive. By their arrogance they bring both science and religion into disrepute. The media exaggerate their numbers and importance. You media people should tell the public that the great majority of religious people belong to moderate denominations that treat science with respect, and the great majority of scientists treat religion with respect so long as religion does not claim jurisdiction over scientific questions. In the little town of Princeton, where I live, we have more than twenty churches and at least one synagogue, providing different forms of worship and belief for different kinds of people. They do more than any other organizations in the town to hold the community together. Within this community of people, held together by religious traditions of human brotherhood and sharing of burdens, a smaller community of professional scientists also flourishes."

 

This is a gentle defense of religion, based on its perceived positive impact on society. It overlooks the fact that any good done by a belief system is irrelevant to the truth of its core beliefs. He also overlooks the long history of religious wars; apparently he’s focused on only the intra-tribal supportive aspects of religion.

 

It is often assumed that without religion social order would break down. Garrett Hardin has argued (1999, pp. 46‑47) that "consequentialist ethics" is fundamentally a far better guide for societal governance than any morality handed down from ancient times. In addition, sociobiology argues that humans inherit an acceptable intra-tribal morality, which is an alternative to religion’s claim that religion is needed to maintain social order. Were Dyson to study Hardin's idea, and sociobiology’s intra-tribal amity instincts, he would have a weaker justification for defending religion as an essential part of a community and as a window upon reality with something to offer that cannot be obtained elsewhere.

 

Closing Comment

 

The evidence is abundant and persuasive that humans think in different styles, with two main ones being frequently cited. The neurological substrate for this difference is the left and right cerebral hemispheres, which are shown by laboratory studies and neuropsychology experiments to have different neural architectures, abilities and thinking styles. The logic of the left brain is a relatively new evolutionary development, and I claim that it has had to compete with the old right brain for expression. Since I view brain function as being inherently "modular," with modules competing to comprehend a situation and then competing to control behavioral responses, I suspect that the right brain "resents" the left, and tries to discredit it. Since civilization has been built largely by the efforts of people using their left brain, there is always some risk to civilization when right brain styles of thought gain influence within a culture. In the next chapter this concern is treated as one of many possible threats to the health of present‑day civilization.

 



CHAPTER 15

FACTORS INFLUENCING THE FATE OF CIVILIZATIONS
PART I – GROUP SELECTION

 

"The oldest of all philosophers, that of Evolution, was bound hand and foot and cast into utter darkness during the millennium of theological scholasticism. But Darwin poured new lifeblood into the ancient frame; the bonds burst, and the revivified thought of ancient Greece has proved itself to be a more adequate expression of the universal order of things than any of the schemes which have been accepted by the credulity and welcomed by the superstition of 70 generations of men."  T. H. Huxley, 1887.

 

Introduction

 

The rise and fall of civilizations is a macro-behavior produced by the "micromotives" of genes, to use a concept made popular by Schelling (1978).

 

To the extent that genes played a role in the creation of civilizations it should be said that it was not their "intent" to do so. We know this, first, because the genes are mere molecules conjured up by a mechanistic universe. To say that genes have motives is, of course, an excess of anthropomorphic metaphor. The genes are simply the product of mutations that succeeded in surviving, and if they survive in the gene pool for many generations (of the individuals they construct) we say that they were successful in trying to express themselves and survive, as if they were motivated to do these things.

 

We know that it was not their intent to create civilizations because it cannot even be said that civilizations are a product of evolution in the same way that one can say the "eye" is a product, or the "brain." The difference, here, is that the eye exists because the genes that code for its assembly during embryologic development have been "selected" by the process of mutation and survival of the inclusively fittest. Whereas eyes exist because eye genes evolved, a civilization exists because of a fortuitous configuration of circumstances which has the unprecedented outcome of "civilization." The outcome is unprecedented because the genome out of which civilizations arise is essentially the same as the genome that gave birth to the first civilization. And when the first civilization emerged, however one defines "civilization," it was an evolutionarily unforeseen event, for which no gene, or combination of genes, can take credit. Only after civilizations change the human genome by competing with uncivilized tribes, and winning, will it be possible to credit the genes for "sustaining" civilizations after their accidental "creation" ‑ by the normal evolutionary process of mutation and natural selection.

 

Civilizations are too new for viewing them as some manner of phenotypic expression with an adaptiveness that has been measured; civilizations have not had time to influence gene frequency appreciably. It is more accurate to view the phenomenon of a civilization as an unintended product of evolutionary processes that shaped human nature in an ancestral environment, devoid of civilizations, which rewarded genes that, as we learn later, just happen to lead to the creation of civilizations when the random configuration of circumstances is conducive to a civilization's rise.

 

Even this gives too much credit to the genes. Group selection, GrS, followed by what I have termed individual selection, IS, may be responsible for the creation of civilizations (as I explain in Chapter 11). Although group selection is made possible by the genes, the genes are not responsible for the failure of the group to keep individuals subservient to the group. So, to the extent that my suggestion in Chapter 11 concerning the roles of GrS and IS in the creation of civilizations is correct, some of the credit for civilizations must go to a factor called "LUCK." (In my use of the term “luck” I’m assuming the reader is in favor of civilization, a clarification that is made necessary with the rise of fundamentalist religious movements throughout Islam, and even Christianity).

 

In the remainder of this chapter, and the following one, I deal with factors that contribute to the fall of civilizations. There must be more ways for a civilization to fall than to rise. The fall of a specific civilization must have many contributing factors, and the most important one may differ in each specific case. The following sections are brief sketches of some of these factors.

 

Natural Catastrophe Theory

 

The Minoan civilization was destroyed by the volcanic eruption of the island Thera (now called Santorini) in the Fall of 1628 BC. In addition to destroying most life on the island of Thera, the volcanic eruption produced a tidal wave (tsunami) that devastated coastal settlements on nearby Crete. The Minoan settlements on the north shore of Crete suffered damage to their fleets of fishing and trading ships. The Mycenaeans took advantage of the weakened state of the remaining Minoan civilization by invading them and replacing the Minoan culture with theirs. This unlucky natural event led to the fall of what may have been one of the world's first great civilizations.

 

A comparable volcanic eruption and related earthquake-induced tidal wave would not bring down 20th Century world civilization even if the area destroyed were Los Angeles, San Francisco or Seattle. It would weaken, but I doubt that by itself it would destroy, the American embodiment of Western Civilization. Only if other factors were at work undermining the strength of civilization would a natural disaster of moderate magnitude contribute to its decline and fall. A global civilization could be threatened by an asteroid impact, creating a global cloud of stratospheric aerosols that would cool the surface and upset agricultural production for several seasons, leading to famine, widespread desperation and the breakdown of social order. Short of this unlikely scenario, I doubt that a natural disaster will be an important contributor to the current global civilization's decline and fall.

 

Group Selection Speculations

 

Chapter 12 describes a possible mechanism for the rise and fall of civilizations, relying on the controversial concepts of "group selection" and "individual selection." I suggested that the appearance of "individual selection" was a genetically unforeseen breakdown of the group's control of individual aspirations for making decisions, and that the "release" of individual creative and productive powers can generate what we call a civilization. After the successful creation of a civilization many of its citizens become self-absorbed with their new-found material wealth. Less civilized members of neighboring societies become resentful of the wealthy neighbors, and they feel threatened by the individual-liberating culturgens of those neighbors. The uncivilized societies may then draw upon the magical strength of religious fervor, with its readiness for fanatical actions, and engage in a "holy war" of terrorism upon the civilized societies. These attacks require that within the civilization individual energy that had been productive become diverted to defensive and protective measures. The individual may find it easier to adapt to a growing nuisance of interference by jealous outsiders than to coordinate with others to mount counter-measures. This neglectful attitude weakens the civilization under siege, thus hastening its decline and fall. (See Chapter 12 for details.)

 

I recently became aware of a group selection theory relying on gene frequency changes for the rise and fall of empires (Choi and Bowles, 2007; Wilson and Wilson, 2007; Turchin, 2007). I’ll refer to these speculations as “parochial altruism” theories. The scenario envisioned by them is quite similar to my speculation, which I will now refer to as the “insightful individual” theory. My speculation is that individuals act heroically on behalf of a super-tribe while it is in conflict with its neighbors because it is in the individual’s best interest to do so; during a peaceful era following a decisive victory, by contrast, individuals act on behalf of themselves and lose their effectiveness at defending their accomplishment, because they have become accustomed to a peaceful order that does not require sacrifice for the group. The main difference between my 1990s “insightful individual” speculation and the 2007 “parochial altruism” theories has to do with the role played by gene frequency changes. Whereas I attributed behavior changes to individuals who could think on their own behalf, and forsake group needs when appropriate, the new theories invoke the influence of genes that predispose individuals to behave in one mode or the other.

 

In considering the relative merits of these two theories it is important to recognize that the gene-based theory requires many generations to be effective, whereas my insightful individual theory can operate on a much shorter, one-generation time scale. If we conceive of a civilization as a series of empires “taking turns” defending the same basic culture, then empires may come and go on time scales that require my “insightful individual” explanation, whereas the rise and fall of civilizations may require the “parochial altruism” explanation.

 

Wouldn’t it be nice if there were a way to accomplish fast behavioral changes without the need to invoke “individual insight” and without the need to invoke gene frequency changes? There is, and I prefer it to the previous two theories; it can be called SR Theory.

 

Stimulus/response theory states that genes produce brain circuits that recognize specific environmental situations and respond with specific behaviors. Social animals are capable of reading a social situation and responding appropriately. It would be a simple matter for a person to distinguish between the home tribe being in chronic conflict with a neighboring tribe and being at peace with neighboring tribes. I’m proposing that when the chronic conflict condition is detected people tend to behave in ways described by “parochial altruism” and intolerance for “out-group differences” - which together predispose the individual for heroic actions during tribal conflicts. The reverse of this is just as easy to imagine: when times are peaceful behaviors are favored that we characterize as self-serving and tolerant of other people’s differences.

 

The proposed SR Theory for eliciting adaptive behavior has the virtue of producing quick responses to changing conditions since gene frequency changes are not involved. As soon as peace prevails personal behaviors occur that set in motion the downfall of the victorious tribe, which leads eventually to another tribe’s gaining the mantle of victor and eventual loser, thus perpetuating the endless cycle of rise and fall of tribal empires. What a simple theory for explaining the rise and fall of empires and maybe civilizations!

 

As with any phenotypic trait, every person will be endowed with slight differences in genetic predisposition. Also, during each transition some will be quicker to undergo change than others. Since this theory proposes that most individuals are capable of behaving according to two opposite modes, depending on their reading of the social environment and on the strength of their genotypic predispositions, we can expect to see a mix of types during every transition. I claim that Western Civilization is undergoing a transition of decline, so is there evidence that people with opposite outlooks are present?
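
To make the switching mechanism concrete, here is a minimal sketch of the SR idea (my own toy illustration, not a model from this book): each simulated person carries an invented "conflict threshold" drawn from a bell curve, and adopts the parochial-altruist mode only when a perceived conflict signal exceeds that threshold. Extreme conditions push nearly everyone into one mode, while ambiguous, transitional conditions yield the predicted mix of types.

# A minimal sketch of the stimulus/response (SR) idea; the "conflict signal"
# scale and all constants are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_agents = 10_000

# Genetic predisposition: each agent's threshold for switching into the
# "parochial altruist" mode. Lower threshold = quicker to switch.
thresholds = rng.normal(loc=0.5, scale=0.15, size=n_agents)

def parochial_fraction(conflict_signal: float) -> float:
    """Fraction of agents in the parochial-altruist mode for a given perceived
    level of inter-tribal conflict (0 = deep peace, 1 = chronic war)."""
    return float(np.mean(conflict_signal > thresholds))

for signal in (0.1, 0.5, 0.9):   # peace, ambiguous transition, chronic conflict
    print(f"conflict={signal:.1f}  parochial fraction={parochial_fraction(signal):.2f}")

Run as written, the sketch prints a parochial fraction near zero in deep peace, about one half at the ambiguous midpoint, and near one under chronic conflict, which is the "mix of types during every transition" described above.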

 

Yes, in the United States one “type” is called Democrat and the other is called Republican. Democrats are widely recognized as being tolerant, compassionate, prone to preferring “peace, not war” and somewhat self-absorbed. Republicans are widely recognized as being intolerant, short on compassion outside the immediate family and social circle, enforcers of group social norms, unthinkingly hyper-patriotic and quick to wage war.

 

Every society should have some of both types whenever the social situation is difficult to read, or whenever conditions are changing. Indeed, the terms “liberal” and “conservative” are used to describe the participants in every political system. In the United States Democrats are liberal and Republicans are conservative; in Great Britain the Whigs were the liberals and the Tories the conservatives; and so on for every political system. When a society is attacked we can expect influence to swing to the “conservatives,” and when the peace has been won influence should swing to the “liberals.” The American Empire is in decline, and there is a growing sense that the liberal Democrats can’t be trusted with safeguarding the homeland. As the decline continues, as surely it will, the call for patriotism will become louder, and the level of intolerance for differences in opinion or lifestyle will grow. The outcome, however, is sealed by an unprecedented level of apathy, corruption and corporate control that grew during the peaceful years. From my perspective it seems too late for undoing the institutional damage to a once great America.

 

As the American Empire recedes other societies will aspire to fill the power vacuum. China may be the next global empire. Their tradition does not include individualism, so as Western Civilization declines a collectivist Oriental Civilization may take its place. If this happens, as I believe it will, the prospects for individual liberation from genetic enslavement will have become remote.

 

Pampered Comfort Theory

 

As the previous section shows, there is reason to suspect that the successful creation of a civilization guarantees that it will have within itself the seeds of its own destruction. At the risk of dwelling again upon the same theme, consider the mistake of the Roman Empire in recruiting too many of its army troops from barbarian populations in outlying regions (e.g., the Visigoths). This rendered Rome vulnerable because the city came to be defended by foreigners less loyal than the citizen-soldiers of the empire's rise. The temptation of the Roman citizenry to avoid unpleasant duties in favor of a comfortable life was irresistible, so they relied upon others to bear their burden of defending the empire. The foreign-born soldiers were treated shabbily, lived far from Rome, and were ordered to battle as if they were “cannon fodder.” Meanwhile, back in comfortable Rome, the decadent lifestyle of pampered citizens is said to have led to a neglect of civic duties and a corrupted governance. It is natural for the strong and the clever to gain power when others are not paying attention. This inattention to civic responsibility seems to happen in times of peace. Whenever Vigilance gives way to Neglect, the hard-earned benefits of a prosperous civilization commence a slow disintegration.

 


CHAPTER 16
FACTORS INFLUENCING THE FATE OF CIVILIZATIONS
PART II – SOCIAL PARASITES

 

Producer/Parasite Theory

 

I would like to suggest another "endogenous" theory that should concern smug residents of every civilization. Parasitic behavior is a common part of Nature. All grazing animals are parasites of plants, for example, and all carnivores are parasites of plant-eating animals and smaller carnivores. Plants are therefore the original non-parasite "producers" since their "livelihood" is based on sunlight, carbon dioxide in the air and nutrients in the soil, all of which are non-living and "free" for the taking.

 

Some animal species rely entirely upon parasitism of another species, the way a leech parasitizes fish. Parasitism also exists within a species. Humans, having conquered Nature more thoroughly than any other species, created opportunities for a variety of individual "strategies" for prospering and replicating that are fundamentally intra-species parasitic. I will rely upon a common sense definition for producer and parasite behaviors, but if you're having trouble think of a tribe that marauds a neighboring tribe, killing some of them, stealing their crops and possessions, burning their shelters, and taking prisoners for later use as slaves. The victor's rewards are from theft instead of production, and this is therefore a form of parasitism. Or think of a merchant ship on the high seas being pursued by a pirate ship, overtaken, commandeered, causing precious cargo to change "ownership."

 

I contend that each person inherits a repertoire for many survival strategies, and that the environmental setting (including the social component) elicits from the individual those strategies most likely to work best (based on the experience of ancestral generations). Strategies are "chosen" automatically from among a repertoire of brain circuits whose basic architecture was created by the genes. The process for choosing which behavioral circuits (modules) to activate is itself contained within brain circuits, created by genes.

 

I also contend that it is possible to assess strategies as belonging somewhere along a spectrum with "Producer" at one end and "Parasite" at the other. An individual person may engage in behaviors belonging to one type, then, in response to a change in the setting, switch to behaviors of the other type. Some people may engage in mostly producer behaviors, while others may engage in mostly parasitic ones. If the same person could be born into the world at different times, he may be mostly producer-oriented in one setting yet be mostly parasite-oriented in another.

 

I will refer to the dynamical interaction of an individual's Genome with Environment to produce the person's specific Phenotype (expressed behavior, as well as expressed anatomy and physiology) using the term GEP (Symons, 1979), as described in Chapter 6. Over generations the physical and social Environment changed many times, and to the extent that specific environment "types" repeat, the Genotype would tend to provide for brain circuits that elicit an appropriate repertoire of possible behavioral Phenotypes suitable for each environment. If, for example, the climate in one locale alternates between two types, for which two different ways of life are adaptive, it is likely that the Genotype will eventually provide for the required pair of behavioral Phenotypes within each individual. Whereas the anatomy and physiology are relatively fixed, behavior can be elicited in response to perceived conditions, and it would be an oversight on the part of the genes if they did not provide for this adaptive flexibility.

 

Michael Gazzaniga has suggested (1997) that the brain's large repertoire of responses to social or physical conditions is analogous to an immune system, which has a large repertoire of immune responses to a very large number of pathogens. Because our ancestors survived exposure to many pathogen types, our immune system is "prepared" to respond appropriately to each specific pathogen that our ancestors survived. In any single individual's life only a few pathogens will challenge the immune system, so only a small portion of the immune system repertoire is made use of. By analogy, an individual's lifetime involves a small number of environmental challenges and these will elicit a small portion of behaviors that reside within our immense repertoire of possible behaviors.  Each behavior type is “poised” for release by the appropriate social environmental stimulus.

 

Although individuals must have the capacity to switch from one behavior type to another in response to perceived conditions, thresholds for the switching must vary. Thus, some individuals are predisposed to be one way versus another. This complicates analyses that strive to understand the role of producer/parasite behaviors in leading to the rise and fall of civilizations.

 

As an aside, any modeling of the penetration of a gene into a gene pool is complicated by the large number of phenotypic measures that must be taken into account for determining an individual gene carrier's fate. Not only is an individual parasitic or productive, but he is more or less intelligent, resourceful, immune to infections, physically strong, etc. All phenotypic variables can be relevant to the fate of the genes making up the individual's genotype, and any study of the strength of environmental cues to elicit parasitic behaviors will have to make use of statistical multiple regression analyses.
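
As a hypothetical illustration of the kind of multiple-regression bookkeeping being suggested, the sketch below regresses an invented fitness proxy (offspring count) on several invented phenotypic variables at once, so that each variable's contribution is estimated while controlling for the others. Every variable name and number here is made up for illustration, not taken from any study cited in this book.

# A minimal multiple-regression sketch on synthetic data; all phenotypic
# variables and coefficients are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical phenotypic measures for n individuals.
intelligence = rng.normal(0, 1, n)
strength     = rng.normal(0, 1, n)
immunity     = rng.normal(0, 1, n)
parasitism   = rng.normal(0, 1, n)   # producer (low) vs. parasite (high) score

# Invented "true" relationship plus noise, standing in for observed offspring counts.
offspring = 2.0 + 0.5*immunity + 0.3*strength + 0.2*parasitism + rng.normal(0, 1, n)

# Ordinary least squares: estimate all coefficients simultaneously, so each
# phenotypic variable's effect is assessed while holding the others constant.
X = np.column_stack([np.ones(n), intelligence, strength, immunity, parasitism])
coef, *_ = np.linalg.lstsq(X, offspring, rcond=None)

for name, b in zip(["intercept", "intelligence", "strength", "immunity", "parasitism"], coef):
    print(f"{name:>12s}: {b:+.2f}")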

 

Another feature of this dynamic deserves comment. Genes exist for thousands and millions of years, typically. The individuals they construct are just temporary residences, meant to survive within a variable environment and compete with other individuals for future genetic representation. Thus, if a person is parasitic, and prospers, the real beneficiary is the gene (or genes) that predispose the individual to behave in parasitic, gene‑serving ways. The individual is sometimes the loser, in an individual welfare sense, in spite of the gene‑winning ways of those that made him.

 

If we wanted to write a history of an animal species, such as the giraffe, it would be unthinkable to omit the role played by the animal's anatomic and physiological traits. These traits are fairly straightforward, and predispose the animal to specific ways of living. The behavioral capacities, predispositions and inabilities are no less important. They evolved in conjunction with the anatomical and physiological traits. We should therefore expect to find a compatibility among all three trait categories:  anatomy, physiology and behavior.

 

The phenotype, or the way an individual organism is, consists of these three factors (anatomy, physiology, and behavior). For humans, behavior is probably a more important component of phenotype than for any other species (the immune response, a component of physiology, must be another important component). More genes must influence behavior for humans than for any other animal (a claim supported by the emerging evidence that genes influencing the brain are ubiquitous, amounting to as many as 50% of all genes by one estimate).

 

As a thought experiment, let us imagine that it is possible to measure each individual's "producer/parasite" score at a specific time, in a specific setting. For any population of humans living in a "society" consisting of many tribes that have at least some non‑antagonistic social contacts, it would then be possible to create a histogram of these scores; we could determine what fraction of the population was "productive" versus "parasitic." If we could keep track of the parasitic fraction versus time for a society we would note variations in the incidence of expressed parasitism.

 

If we could also measure the per capita wealth of a society, the wealth parameter would also vary. Now, I allege that the two parameters, parasitism and per capita wealth, would be correlated. Moreover, I predict that they would be positively correlated, with a slight phase lag. Whenever a society reaches a peak in per capita wealth, parasitism is rewarded more than during the previous few generations; during the wealth peak parasitism will show its greatest growth.  I suggest that it is the "rate of growth of parasitism" that is positively correlated with per capita wealth. (For engineers who like sinusoidal curves, the fraction of the population that is parasitic is alleged to exhibit a phase lag of 90 degrees with respect to per capita wealth ‑ disregarding, for the moment, that the two traces are not sinusoids.) To investigate these speculations I created a spreadsheet model that incorporates wealth creation, parasitic gene payoff, and other factors, and have demonstrated that expressed parasitism does indeed lag the wealth trace. Chapter 15 has plots of "innovation rate" versus time, and population versus time. (In Fig. 15.14, and also 15.15, there might be evidence that parasitism rose as the population was rising, at the same time that the innovation rate was decreasing.)
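
The 90-degree figure follows directly from the "rate of growth" claim: if the parasitic fraction grows at a rate proportional to per capita wealth, then for a sinusoidal wealth trace the parasitic fraction is (up to an additive constant) the integral of wealth, which trails it by a quarter cycle. The sketch below is my own toy version of that relationship, not the author's spreadsheet model; the constants are invented.

# A minimal sketch: drive per-capita wealth W(t) as a sinusoid and let the
# parasitic fraction P(t) grow at a rate proportional to W(t). The constants
# are invented; only the phase relationship matters here.
import numpy as np

t = np.linspace(0, 4 * np.pi, 2000)      # two full wealth cycles
dt = t[1] - t[0]
W = np.sin(t)                            # per-capita wealth (arbitrary units, zero mean)
k = 0.1                                  # responsiveness of parasitism growth to wealth (invented)

# "Rate of growth of parasitism tracks wealth":  dP/dt = k * W
P = np.cumsum(k * W) * dt                # parasitic fraction, up to an additive constant

# Compare peak positions over the second cycle to estimate the phase lag.
i0 = len(t) // 2
lag = t[i0 + np.argmax(P[i0:])] - t[i0 + np.argmax(W[i0:])]
print(f"parasitism lags wealth by about {np.degrees(lag):.0f} degrees")   # prints ~90

The printed lag comes out at about 90 degrees, matching the phase relationship asserted in the paragraph above.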

 

The reason parasitism increases during "boom times" is that wealthy people are willing to tolerate the loss of small amounts to parasitism, whereas poor people will take measures to defend themselves from parasitic losses of the same absolute amount. An individual's actions are based on what effect they have on the genes in that individual, assuming the genes have experienced similar situations in the past and evolution has left mostly those genes that respond to situations "adaptively." If the genes in an individual do not benefit by allocating energy to a defense from parasitism, compared to the cost of that defense, they should not be expected to put up a defense. Thus, parasitic behaviors should be able to invade wealthy societies more easily than poor ones.

 

This argument does not require that parasitic people "invade" a society from "outside." Rather, desperate individuals may "switch" from being mostly productive to being more parasitic. Also, individuals who are predisposed to being parasitic (have lower thresholds for responding to situations in parasitic manners) may flourish, while their less fortunate producer‑brethren flounder and produce fewer offspring. The first process can occur almost instantly, in a matter of years, while the second process requires generations to have an effect.

 

The previous argument assumed that within a society there was a wide range of wealth. A society that achieves wealth by capitalist means is likely to create wealth disparities. In America the wealthiest 1% now own 40% of the country’s assets, and the wealth gap is increasing at a frightening rate. Among the Western industrial nations America has the highest levels of wealth inequality within its borders (Phillips, 2002). This growing disparity within a society is troubling because it causes those left behind to feel forgotten and “left out” – which resembles banishment from the tribe. And whenever people feel banished, the tribe that banished them is “fair game” for reprisals by the banished.

 

The greatness of a civilization is probably correlated with its per capita wealth. When we refer to the "rise of a civilization to greatness," we may be thinking about the amount of activity devoted to the arts, science, and technology, and these are correlated with the availability of funds (patrons of the arts, etc.) for those activities, which is related to per capita wealth (consider the famous example of the Medici family’s patronage in fifteenth-century Florence, Italy).

 

I am assuming that our ancestors have experienced a sufficient number of boom and bust episodes that our genome has "adapted" to this dynamic. Although it is theoretically possible that the genome has not adapted to boom/bust scenarios, to the extent that it has, our present civilization’s zenith may be short-lived.

 


CHAPTER 17
FACTORS INFLUENCING THE FATE OF CIVILIZATIONS
PART III - TROUBADOURS

 

The Troubadour Theory

 

This theory is a variant of the Producer/Parasite theory.

 

Consider super‑tribe civilizations, for which we may take the ancient Mesopotamian as our model. A large city is surrounded by a sprawling countryside devoted to farming. Within the city is a society of "government employees" who report to the king. There is a strong division of labor within the city. There are jobs for collecting taxes from the farmers, recording tax and other government transactions, settling disputes, constructing buildings, roads and irrigation works, manufacturing (cloth, pottery, household wares, etc.), transporting goods from the point of production to the shop‑keeper, selling goods, providing entertainment (music, dance, story‑telling), and waging war.

 

The concentration of wealth always increases the temptation for theft. Thus, other ways of making a living appeared that were not sanctioned by the king and his government, such as internal corruption, highway robbery, high seas piracy, and other socially parasitic activities.

 

I want to categorize all of the above lifestyles, sanctioned and unsanctioned, as "sedentary" or "adventurous." The warrior has a sanctioned "adventurous" lifestyle. It is important to realize that warriors are measured on many "fronts." The most obvious measure is during combat with other warriors. In a similar way the highway robber and high seas pirate are measured during their frequent conflicts and dangerous lifestyle.

 

Before making the central point of this section, I want to invite the reader to think about what the strongest evolutionary force facing mankind during this era might have been. Was it invasion by barbarian tribes, environmental destruction due to the use of natural resources, natural disasters, predation by other animals, overpopulation or the deterioration of inadequate infrastructure? No, it was none of these! The greatest threat to super-tribe life has always been disease pandemics!

 

Diseases brought from distant places can decimate a population if the people have never been exposed to the pathogen. It can be assumed that a small fraction of any large population has a genetic immunity to every new disease. It is a fundamental principle of genetics that some individuals will have a better pre‑adaptation to any conceivable new challenge or threat, regardless of how novel or old it is. For this reason we can argue that sometimes only a small fraction of a population will survive the experience of wandering into foreign lands where new diseases exist.

The era of kingdoms brought the threat of disease to people as never before! The threat went in both directions. Invading armies carried their homeland diseases with them, and diseases in the lands being invaded would infect some of the invaders. Diseases were carried in both directions by more than invading armies. Migrants, traders, and any of the many new categories of itinerants were "vectors" for disease. One way or another, every large population center was at risk, no matter how great its civilization's technical or military achievements were.

 

Now, consider two hypothetical female inclinations under these conditions: 1) be sexually interested in mating with "adventurous" men, or 2) remain uninterested in the "adventurous" men while maintaining a loyal monogamous relationship with a husband. To the extent that women were inclined to be of one type or the other (and assuming that all other factors were equal), which type would have yielded more offspring surviving into adulthood? The answer is obvious: the better strategy for a woman is to cuckold her husband by feeling attracted to "adventurers" when she is most fertile (Haselton and Gangestad, 2006). These “calculating” cuckolding women will have a greater genetic legacy than the "faithful" women!

 

Female choice refers to the influence of female mating preferences: any action taken by a female that is likely to influence which male makes her pregnant. It may take the form of influencing whom she marries, or it may take the form of deciding with whom she mates outside marriage ‑ i.e., with whom she cuckolds her husband. Both types of choice, choosing a husband and choosing a cuckolding partner while married, will affect the success of her offspring. On first principles (evolutionary ones), the genes will have "something to say" about such behaviors. Female choice requires that women pay close attention to the males who can be observed. Females should be measuring them for "what they're good for" ‑ from the perspective of their genes. All of this measuring, of course, will be automatic, and usually subconscious. But the woman who fails to evaluate men from her genes' perspective will be a failure as a woman.

 

A surviving warrior must not only have good genes for physical endurance, he must also have genes for an immune system that can deal with the germs that are out there in neighboring lands. It is just a matter of time before diseases in foreign lands arrive at the doorstep of large settlements. Women who mate with adventurers and bear their children are likely to do their own genes a favor, for they will be hitch‑hiking with a winner in Man's greatest battle ‑ the battle with viruses and bacteria. The pirate who comes into port has a legitimate lust, for his immune system has been measured and has survived exposure to diseases on foreign shores. The troubadour travels with a similar right to women's hearts. And to a somewhat lesser extent, so does the common rogue and scoundrel, who is too easily excused by being portrayed as cute or naughty instead of parasitic! And now we see the glimmer of an explanation for the mild and ineffectual condemnation of scoundrels.

 

How ironic, that the most parasitic of men should take on the role of exploring "immune system mutation space" to find solutions to near‑future threats of bacterial and viral infection, and thereby appear to enhance the chances of a civilization's survival. Their role as unwitting pathfinders in the invisible war with the microbes can be lauded on this basis; but let us not be blind to the consequences of the rest of their genetic heritage. By this strange dynamic one parasite creates another; the microbial parasite creates the socially parasitic rogue.

 

I am perilously close to accusing women of being influenced by their genes in unthinking ways, for which they haven't the faintest explanation. Surely, Ruth Westheimer was not driven by deep thought when she wrote (1986, p. 21) that "Most married women want pirates, or something like pirates..." and "Here is a good marriage fantasy ‑ to imagine that your nice steady husband, who never inconveniences you by being arrested or a fugitive (sic), is really a dangerous criminal..." She never explains why such fantasies should work. I just did!

 

If ancient kingdoms, like those producing such civilizations as the Mesopotamian, the Egyptian, the early Greek, and the Roman, rewarded women carrying genes that caused them to be sexually attracted to soldiers, pirates, troubadours and scoundrels, then how might these genes have fared in subsequent eras? Diseases have ravaged Europe on many occasions since the kingdoms and empires created the conditions from which this curious female behavior originated. The Dark Ages, the Middle Ages, and even the Renaissance have yielded impressive episodes of spreading disease and death. The Black Plague reduced both China's and Europe's populations by almost 30% in the 14th Century. The European explorers in the New World brought diseases that killed perhaps 90% of the indigenous population. The driving force is unabated, so presumably women's adaptive response is undiminished.

 

What evidence do we have, besides Ruth Westheimer’s fantasy, that contemporary American women continue to practice a female choice that favors rogues? I will cite two examples.

 

Several paternity test studies in the U.S. and Canada show that 9 to 20% of offspring were not fathered by the mother’s spouse. This would seem to be an important statistic, yet there is essentially no discussion of it in public. It is an elephant in the room. This would rank as one of women’s best kept secrets if it weren’t also true that men are universally vigilant about any sign that their wives are interested in another man, and when there is evidence of this interest men are universally jealous. This shows that men fear being cuckolded by women who wish to hedge their genetic bets by mating with other men, including traveling scoundrels with apparently good immune systems.

 

Additional evidence for the notion that women are attracted to traveling scoundrels can be seen in contemporary styles and fads, such as teenage girls swooning over the barbaric antics of "rock stars," today's equivalent of the more romantically portrayed itinerant troubadours, the box office success of movies with angry young rebels (modeled after unlawful highway robbers), and the popularity of superficial, airhead movie muscle heroes. Even the new fad of wearing baseball caps backwards (when not riding a motorcycle) seems pathetically ridiculous and inexplicable without reference to this theory's payoffs to men for appearing to be unruly motorcycle‑riding roustabouts.

 

There is evidence that this female fascination with rogues is modulated by cultural or economic conditions. The Great Depression in America seems to have produced a healthy regard for gentlemen "producers." This anomaly extended throughout the World War II years, and into the Fifties. Then, during the Early Sixties, the apex of American civilization, the preferred type began to shift to the rebellious, shiftless, social parasite. The new culture produced such epithets as egghead, nerd and workaholic. The generation that "saved civilization" from Hitler begat a pampered, spoiled generation notorious for its ingratitude and self‑absorption. Well‑mannered movie heroes like William Holden, Cary Grant, Gene Kelly, Jimmy Stewart, Gary Cooper, Spencer Tracy and Katharine Hepburn were gradually replaced by the likes of Arnold Schwarzenegger, Bruce Willis, Madonna and Roseanne ‑ which is where we are today! If there's a pattern here, what could it mean?

 

When "times are tough" it pays to set aside whimsical and immature frivolities and become serious about commitment to hard work. At these times the over‑riding benefits of men who are adult in their attitude, and capable of achievement, give them an advantage over immature, irresponsible rogues. But when times become "easy" the things men are good for change, and parasitic men become relatively more valuable. Women's values adjust to the times. If the genes that govern women's subconscious behaviors are capable of making this distinction about men and can adjust their preferred type, then these genes would confer a great competitive advantage over the genes of women who cannot do these things.

 

There seem to be two ways to achieve a change in the temperament of a society. The fastest way to achieve a change in outward behavior relies upon female choice. Women are sensitive fashion watchers. They sense the "times" in ways that almost defy logic. Whereas men must sense a storm beyond the horizon, before it can be seen, women must sense a change in social climate before it occurs. Their response to a sensed change is to cuckold their husbands with different men at varying rates. When women make their "preferred mate choice" other women notice, and this speeds the shift. But men also notice these shifts, and they attempt to imitate the preferred male type. In just one generation we have seen a shift away from men wanting to be perceived as Producers to preferring to be perceived as Parasites.

 

The second way a society's manifest behavior can change is through a change in gene frequency. Once the dynamic of "female choice" has accomplished a shift, the stage is set for gene frequency change. Gene frequency changes are slow. The conventional wisdom is that evolution is so slow that noticeable changes require tens of thousands, and maybe hundreds of thousands of years. This conventional wisdom is ridiculous! The American Indian evolved from Asian ancestors in roughly 15,000 years.