Children of Narcissus
An evolutionary analysis of narcissism

2. Baseline

Copyright © 2008, Paul Lutus

Introduction | Successful Adaptations | Historical Context | Adaptability and Uncertainty | Quantum Mechanics | The Self-Reference Problem | Belief and Reason | Evolution | Authority and Narcissism | Conclusion | References


 

Introduction

To place narcissism in its proper context, we need to compare narcissistic behavior to a baseline. I had considered using the word "normal" to describe the baseline, but I see the risk in using an obviously emotionally charged term. From an evolutionary perspective there is no "normal," because nature's requirements change over time and species must remain flexible just to keep up with nature. But at any particular time there is a baseline, an adaptation validated by its adoption by many successful competitors. This doesn't mean all successful individuals share the same behaviors — it is more that they adopt similar strategies.

Narcissists represent a parasitic adaptation, and they are not self-sustaining (they can't survive in an environment composed entirely of narcissists). The most useful definition of "parasitism" is that, if all the players were parasites, the system would collapse. Narcissists are classic parasites, parasites require hosts, and hosts are the baseline under discussion.

Successful Adaptations

Let's describe the baseline, the 99% of individuals who are not narcissists (provisionally accepting the published estimate that 1% of people are clinical narcissists). And a word of caution — what follows is a somewhat idealized portrayal of how people adapt to life's problems, and it is to some extent designed as a contrast to the narcissistic outlook.

There are behaviors and attitudes that greatly improve an individual's prospects for survival. In a study of Holocaust survivors, traits such as adaptability, initiative and cleverness were identified as contributing factors. Another factor often cited is a sense of purpose and drive that transcends conventional notions of morality. What these traits have in common is their rejection of fixed systems of rules and standards, of the notion that there is a single, broadly applicable, easily described set of rules for survival. They also emphasize individual over group values.

Here is a story about someone who survived a very dangerous situation through his ability to think quickly and unconventionally. A team of firefighters was battling a brush fire in a western state when the wind suddenly shifted and put them in danger. The firefighters began running up a hill to escape the approaching flames, each choosing a different escape route, and they were quickly separated. One of the men realized the hill was too steep and the wind-driven flames were moving faster than he was. So, thinking quickly, he dropped to the ground and set a fire in front of him. The fire he set quickly spread uphill away from him, and just as the main fire approached from below, he walked safely into the charred area created by his own fire.

This is a true story of personal ingenuity in the face of mortal danger (unfortunately I can't find a reference online). It shows many of the same traits identified in Holocaust survivors — ingenuity, self-reliance, and indifference to rigid notions of acceptable behavior. At the time of this story it was unheard of for a firefighter to deliberately set a fire, but this individual realized the area he ignited was minutes away from burning anyway, and by setting his own fire he made sure he wasn't part of the conflagration. Sadly, of his work crew, he was the only survivor.

The baseline is composed of people who exhibit adaptability, skepticism toward easy answers, a willingness to abandon failed ideas, and a perception that life is a problem to be solved with personal ingenuity. Again, this is to some extent an idealized description meant to show a contrast with narcissism. It is certainly a description of the most successful individuals in the group.

Historical Context

If we examine the sweep of human history since A.D. 1600, the single most important philosophical change has been the ascendancy of reason over dogma. I chose 1600 as my starting point because that was the year Giordano Bruno was burned at the stake. He was killed by the Church for his heretical claims that there might be other places much like Earth, with other life forms, and that all perspectives are relative (i.e. there is no special, preferred perspective). According to the dogma of the time, Earth was the center of the universe, Rome was the center of Earth, and the Church was the center of Rome. According to the intellectual tradition of the time, you either accepted what the Church believed or you were put to death.

Church officials cannot have known this, but Bruno's execution marked the beginning of the end of Church authority, and by the time of Galileo Galilei's heresy trial in 1633, the option of publicly executing a noted critic had pretty much evaporated (Galileo was placed under lifetime house arrest instead).

Both these men were far ahead of their time and both took great personal risks. Before the time in question the Church provided simple answers to the most complex questions, even though the answers were often quite wrong. Since that time an intellectual tradition has evolved that replaces dogma with reason and evidence.

Contrary to a popular misconception, the modern scientific outlook doesn't produce a greater level of certainty than dogma; indeed, about many topics it produces the opposite. Because of how science works, because of its emphasis on evidence, we are less emotionally comfortable than we would be under a system of dogmatic belief. We know which questions have not been answered, and we have come to realize some questions can never be satisfactorily answered. The difference is that the modern outlook is testable and verifiable in a way that dogma cannot be.

Under the old system, there were a handful of absolute beliefs, some true and some false, but no one dared test any of them against reality for fear of suffering the wrath of the Church. Under the new system, there are no absolute beliefs, only theories, and any theory can be challenged using evidence. Under the old system, we had diseases and plagues that resulted from God's displeasure at our sinful ways. Under the new system, we have vaccines that result from an open exploration of nature on her own terms.

An irony of modern times is that the new, irreverent, uncertain, skeptical intellectual tradition has produced much more reliable knowledge about the world than the old system, and these results flow directly from an acceptance of doubt and uncertainty.

Adaptability and Uncertainty

From an evolutionary perspective, it's obvious which intellectual strategy is more effective — it's a simple matter of counting noses among creative, well-paid workers. In modern times, people succeed by avoiding dogma, testing new ideas against reality, and remaining flexible and adaptable. The most prized trait in modern times is not the ability to make a firm decision when presented with firm information — a computer can do that. More highly valued is the ability to make a reasonable choice when confronted by imperfect, ambiguous information.

Isn't this adaptability a temporary adjustment because we haven't yet figured out how to acquire better information? Isn't it possible that this adaptability won't be required in the future when information is better? Well, it seems the answer is no. There is much more waiting to be discovered, but it is well-established that some things we expected to be able to know will never be resolved beyond a certain degree of precision. This uncertainty arises from two important discoveries — quantum mechanics, and a limitation to certainty created by self-reference.

Quantum Mechanics

Albert Einstein was the last important physicist who believed that perfect predictability could be had through perfect observation. Before his time, it was believed that more accurate observation resulted in more accurate predictions. By the 1920s, however, a new field of physics had sprung up that successfully married theory and observation and that banished the idea of absolute predictability. In the new field of quantum mechanics, events at a small scale are governed by statistics and probability, and one can only compute the probability of a particular event, not the event itself.

When Einstein first heard the new quantum ideas, he complained that God doesn't throw dice, but as it turns out, events on a very small scale simply don't have classical predictability. As we approach the scale of individual atoms, it is as though each atom is a coin being flipped randomly. We see the large-scale world as solid and predictable only because of an averaging effect:

  • If we flip a fair coin a million times, and if we count heads as 1 and tails as 0, the average of all the flips is very likely to lie near 1/2. The outcome, determined by probability, is reliable but not deterministic — a small possibility exists that the result might differ significantly from 1/2.
  • As we increase the number of atoms, the probability that the collective outcome lies near the average increases, because of the mathematics of averaging. So for large objects, the outcomes predicted by quantum mechanics and classical physics become more alike. This is why quantum effects went unrecognized for so long.
  • For the extreme case of a single atom, nature doesn't allow a firm prediction about how things are going to turn out. Under the rules of quantum mechanics we cannot know the outcome of the flip, nor specifically when the flip will take place — we can only compute probabilities.
  • Consider the implications of this idea. For a large collection of atoms, an average of 1/2 is very likely, and as the collection grows larger, this outcome becomes even more likely. For a single atom, an outcome of 1/2 is not even possible — it must be either 0 or 1 (see the short simulation after this list).
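
To make the averaging effect concrete, here is a minimal coin-flipping sketch in Python (my illustration, not part of the original text; the batch sizes are arbitrary choices):

  import random

  def average_of_flips(n):
      # Flip a fair coin n times (heads = 1, tails = 0) and return the average.
      return sum(random.randint(0, 1) for _ in range(n)) / n

  # A single flip can only produce 0 or 1, never 1/2.
  # As the number of flips grows, the average is increasingly likely to lie near 1/2.
  for n in (1, 100, 10_000, 1_000_000):
      print(f"{n:>9} flips: average = {average_of_flips(n):.4f}")

Run it a few times and the single-flip result jumps between 0 and 1, while the million-flip average barely moves from 0.5; this is the same averaging that hides quantum randomness in everyday objects.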

As strange as quantum ideas sound, they are now very well supported by evidence — in some cases there is agreement between theory and experiment to ten decimal places. But the theory's experimental confirmation doesn't make it less weird. Here's an example — inside a box we place a cat, a vial of poison, a Geiger counter and a radioactive sample. At some random time, the radioactive sample will emit a particle, the particle will trip the Geiger counter, and the Geiger counter will break the vial of poison, which will kill the cat. (This is a classic thought experiment called "Schrödinger's Cat".)

In classical physics, after we close the box and let some time pass, the cat is either alive or dead, and we can discover the outcome by opening the box. But in quantum mechanics, inside the closed box the cat is neither alive nor dead, and the outcome is produced when we open the box, through the act of observation. Until the box is opened, the cat is in a superposition of states, a so-called quantum wave function. When we open the box, by observation we collapse the quantum wave function and produce a classical result.

This thought experiment is intended to reveal something important about quantum mechanics — even though its strangest effects take place at very small scales, there are circumstances where quantum uncertainty can make its way into the large-scale world. For example, it has gradually occurred to meteorologists that quantum mechanics may forever prevent weather prediction beyond a certain degree of uncertainty, and the longer the time scale, the greater the uncertainty. The idea is that the quantum uncertainty of the microscopic world percolates up into the position or direction of a storm cloud, or even the size and location of the next hurricane. This is called the "butterfly effect", the idea that the flap of a butterfly's wings off the coast of Africa might be adequate to create or destroy a hurricane.
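
The quantum seed of this uncertainty can't be demonstrated in a few lines of code, but the amplification of tiny differences, the heart of the butterfly effect, can be sketched with a standard toy model of chaotic behavior called the logistic map (my illustration, not part of the original text; the starting values and the parameter r = 4 are arbitrary choices that put the map in its chaotic regime):

  def logistic(x, r=4.0):
      # One step of the logistic map, a classic toy model of chaotic dynamics.
      return r * x * (1.0 - x)

  # Two trajectories that begin one part in a billion apart.
  a, b = 0.400000000, 0.400000001
  for step in range(1, 51):
      a, b = logistic(a), logistic(b)
      if step % 10 == 0:
          print(f"step {step:2}: a = {a:.6f}  b = {b:.6f}  difference = {abs(a - b):.6f}")

Within a few dozen steps the two trajectories bear no resemblance to each other, even though they began essentially identical; a prediction that ignores the billionth soon becomes worthless.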

From a philosophical standpoint, the experimental confirmations of quantum mechanics have signaled the end of classical notions of certainty and predictability. Before, we didn't know what we didn't know. Now we know, to a very high degree of precision, what we cannot know.

The Self-Reference Problem

Between 1910 and 1913, Alfred North Whitehead and Bertrand Russell published Principia Mathematica (the title is borrowed from Isaac Newton), an important work that describes the foundations of mathematical reasoning. The ambition behind this work was to describe the roots of mathematical reasoning in a small set of axioms (principles taken as self-evidently true).

The assumption behind Principia Mathematica, that all mathematical reasoning could be shown to arise from an internally consistent set of axioms, had a rather short life. In his 1931 Incompleteness Theorems, Kurt Gödel showed that for any consistent formal system complex enough to describe basic arithmetic, there are statements that cannot be proven within the system. This doesn't mean the formal system described in Principia Mathematica is false; it means we cannot use a formal system to test some assertions about that system.

We can always construct a system B, of sufficient complexity, of which system A is a proper subset, and we can then use system B to prove statements about system A. But this only relocates the problem; it doesn't solve it — in this effort at a solution, there are now statements about system B that are unprovable.

Here is a simple statement of the intractable problem revealed by Gödel's work:
  1. Let's say that Principia Mathematica is a book meant to serve as a catalog of all the books in a library called "Mathematics."
  2. The new catalog includes all the books in the library, so it is judged to be complete and successful.
  3. The champagne has scarcely gone flat when an argumentative Austrian points out there is an important mathematical book not cataloged within Principia Mathematica — the catalog itself.
  4. This problem is solved by constructing a new, bigger library that encompasses the previous library plus its catalog.
  5. The new library doesn't have a catalog, a problem solved by step (1) above.

Repeat until exhausted.
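
The regress can be acted out literally. Here is a small sketch (mine, not the author's) that follows the numbered steps above, where every catalog that completes the library is itself a new mathematical book that nothing catalogs:

  def catalog_regress(rounds=4):
      # Act out the library/catalog regress described above.
      library = {"book 1", "book 2", "book 3"}   # the library called "Mathematics"
      for i in range(1, rounds + 1):
          catalog = f"catalog {i} (lists {len(library)} books)"
          library = library | {catalog}          # the bigger library: old books plus the catalog
          print(f"round {i}: the library now holds {len(library)} books, "
                f"but no catalog lists '{catalog}' itself")

  catalog_regress()

The loop never reaches a round in which the library is both complete and fully cataloged, which is the point of the analogy.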

At first glance this might seem to be a problem limited to complex formal systems, but all nontrivial systems of thought and reasoning possess the limitation identified by Gödel — to put it simply, for each formal system, there are certain questions it cannot answer. This doesn't mean the system's results are false; it means some of its statements cannot be proven either true or false from within the system. In exchange for applying the system, we must accept a degree of irreducible uncertainty.

As in the earlier example, we now know what we don't know, and what cannot be known.

Belief and Reason

People in the baseline group educate themselves about these firm limitations to reasoning. They don't react by abandoning reason, they simply learn that knowledge cannot ever become both complete and internally consistent.

This is the most important objection to rigid systems of belief. Were it not for the properties of nature described above (and a few others), one could collect all the information about nature, shape it into a formal description of reality, declare the thinking project complete, and impose the result on everyone else. In fact, this idea has been tested, repeatedly, but it's always defeated by systems that are open to change and revision.

We can explore nature for clues about effective strategies. Is there evidence that nature prefers fixed systems, or those that admit change? Well, it seems nature is very clearly biased toward systems that allow change.

Consider an abstract thinking being with no exposure to the real world, trying to guess what form life might take. He would very likely expect life to be composed of perpetual organisms with fixed processes and cells that regenerate forever. This picture agrees with basic physical principles — such an eternal organism would require little energy to sustain itself, physical systems naturally move to their lowest energy state, and an eternal organism is simpler than one that has the capacity to reproduce (nature prefers simple systems over complex ones).

Given this premise, it seems nature's preference for an energy-inefficient system of constant birth and death, of multiple mutable species, is meant to tell us something about reality. If we accept the premise that simple solutions are preferred, we must conclude that nature's present system is the simplest practical one that meets the requirements of physical reality.

According to present evidence, nature tests all solutions in parallel, in an open-air, free-form laboratory where successful experiments are called "survivors" and the laboratory's rules are called "evolution."

Evolution

Like many scientific discoveries, evolution confirms the principle (called "Occam's Razor") that simple solutions tend to prevail over complex ones. Ironically, even though the results of evolution can be very complex, the basic idea of evolution is simplicity itself — conduct a lot of experiments, let the experiments create their own experiments, and see which ones succeed.

Through evolution, nature tells us something — in a rich, complex environment with sufficient energy available, the simplest way to solve a problem with ten thousand possible solutions is to test them all at once, and the optimal solution will pop out of the results. And evolution's usefulness doesn't end with one result — in a constantly changing environment, evolution continues to pick the best solution, the "survivor," again and again.
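
To show what "test them all at once" looks like in miniature, here is a toy evolutionary search in Python (my illustration, not the author's; the target bit string stands in for the environment, and the population size, mutation rate and fitness measure are made-up parameters):

  import random

  TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]   # an arbitrary stand-in for the environment
  POP_SIZE, MUTATION_RATE, GENERATIONS = 60, 0.05, 40

  def fitness(candidate):
      # Count how many "traits" (bits) match what the environment requires.
      return sum(1 for c, t in zip(candidate, TARGET) if c == t)

  def mutate(candidate):
      # Copy a candidate, occasionally flipping a bit (a mutation).
      return [1 - bit if random.random() < MUTATION_RATE else bit for bit in candidate]

  # Begin with a population of random guesses: many experiments running in parallel.
  population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]

  for generation in range(1, GENERATIONS + 1):
      # The "survivors" are the better-adapted half; they alone produce the next generation.
      survivors = sorted(population, key=fitness, reverse=True)[:POP_SIZE // 2]
      population = [mutate(random.choice(survivors)) for _ in range(POP_SIZE)]
      best = max(population, key=fitness)
      if generation % 10 == 0 or fitness(best) == len(TARGET):
          print(f"generation {generation:2}: best fitness {fitness(best)}/{len(TARGET)}")
      if fitness(best) == len(TARGET):
          break

No individual candidate "knows" the target; the selection rule alone drives the population toward it, which is the bottom-up character described in the paragraphs that follow.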

Evolution, an efficient way to choose an optimal solution in the physical world, also works in the sphere of ideas. Ideas evolve over time just as species do, and an equivalent process of natural selection exists for ideas. The intellectual version of evolution is called science — science is to ideas what evolution is to nature. And the same basic rule applies — for ideas to evolve, the system must be open.

In all its forms, evolution is a bottom-up system — the source of variability lies at the lowest level and propagates upward. For physical species, a beneficial mutation originates within a single germ cell, and (if it passes nature's tests) the adaptation might eventually be shared by all members of that species (or give rise to a new species). In the realm of ideas, of science, the same bottom-up logic applies — ideas originate with individuals, and their general adoption depends on the outcome of a difficult series of tests (and most ideas fail the tests).

Mutations in nature can only succeed through tests against the environment, while ideas in science can only succeed through tests against evidence. Both physical and intellectual evolution meet the bottom-up criterion that any number of mutations might appear at the bottom, but only a handful survive to win general acceptance. And all forms of evolution share another property — there is no single, final answer. This is why members of species have limited lifetimes, and it is why science is composed only of theories, no truths or laws.

This rejection of permanent solutions — of belief and authority — is itself an idea that has evolved over time, and it passes nature's most demanding tests.

Authority and Narcissism

Authority is a system for imposing fixed ideas on a group from above. For those who understand which solutions nature prefers, this is three strikes in a ball game that hasn't even begun. There are any number of perfectly reasonable ideas that, if imposed from above, will fail on that ground alone.

Consider an individual impulse toward generosity, a willingness to help members of one's own group with the expectation of future reciprocation (an idea I call the Symmetry Principle). Perfectly natural. But when imposed by authority from above, the same idea is called "Communism," and ... it just doesn't work; indeed, it's one of history's more ghastly failures.

Consider education, originally a young person's exploration of the world, with a few helpful hints from older people along the way. When imposed by authority from above, it becomes something more akin to indoctrination: it is notoriously effective at snuffing out natural curiosity and produces ignorant drones. Many intelligent people eventually realize this, and resume their true education only after escaping the "educational system."

Obviously, like any interesting question, this issue has two legitimate points of view:

  • Without authority, we have anarchy (shudder).
  • With authority, we have hierarchy (shudder).

Take your pick, remembering that, because of nature's preference for disorder, authoritarian systems eventually become indistinguishable from anarchy.

But authority is enormously attractive to a particular kind of person, usually someone of "diminished intellectual capacity," and preferably one predisposed to clinical narcissism. Because of narcissism's roots in deep personal insecurity, a primary goal is to place oneself beyond the possibility of criticism, and one solution is to adopt a system of fixed beliefs, then acquire a position of authority in order to impose those beliefs on others.

This narcissistic strategy can never be more than a holding action, because:

  • Authority and fixed beliefs contradict nature in the most basic way.
  • Narcissists need enablers, and all but the dullest enablers eventually figure out what's being done to them.

Someone will doubtless ask why, if narcissism is so contrary to nature, there are so many narcissists. The answer is that the natural rate of mutation is high enough to produce a certain number of narcissists along with the Einsteins, Gandhis and Béla Bartóks that pop out of the system. Also, narcissists have children, and there is some evidence that narcissism runs in families, whether for genetic or environmental reasons.

Conclusion

Baseline individuals learn the limits of certainty, become increasingly humble with respect to what any single person can know, and raptly listen to what nature has to say. The most successful members of this group show adaptability, ingenuity, and a positive outlook that tends to be self-fulfilling. They expect to find their own way in the world. They accept responsibility for their failures, and earn credit for their successes.

Narcissists, by contrast (and not to give away too much of the content of the next page), reject the limits of certainty, become increasingly arrogant with respect to what they can assert, and refuse to listen to nature. They instead try to redirect your attention away from nature toward them. The secret of the narcissist is that, without enablers, his system collapses. But it seems for each narcissist there is an enabler, just as for each cockroach there is a grain of discarded rice.

References

 
