Caution: much ego strength recommended.

1 Preface 2897-2898
2 Encyclopaedia of Religion and Ethics 2899-2899
3 Much Ado About Nothing 2900-2900
4 Excerpts from the Internet 2901-2914
5 Preliminary Probes 2915-2916
6 Social Science Research Council 2917-2917
7 Mental Illness from Ancient Times 2918-2919
8 On the Matter of the Mind 2920-2922
9 Freud and the Neurosciences 2923-2927
10 Phantoms in the Brain 2928-2938


PREFACE [provisional] 12/24/2004 - 6/21/2005

          I (LS) was studying (early November, 2004) The System of Nature [see 2829], by Baron d'Holbach [1723 - 1789], 1770, Robinson translation, page 327 (see 2837): "it would be plunging them into a vacuum....than to float in the vacuum". Horror vacui!

Later, page 259 (see 2833): "Horrour of a vacuum"; page 84 (see 2829): "in that which is not matter, he can only see vacuum and emptiness".

          My previous interest in horror vacui, probably 1964, San Francisco, given as one explanation for graffiti [see 2906] on walls, etc.

          Study and reflection have resulted in the following preliminary examination—from disparate sources. I consider "mental horror vacui" ["MHV"] [see 2904, 7.], horror vacui originating in the mind (and probably not due to an external stimulus, such as a blank wall), a huge and useful psychological paradigm.

          My initial reflections, I later found, were sententiously confirmed by Jean-Paul Sartre [1905 - 1980] (see 2905): "NOTHINGNESS HAUNTS BEING."

          My guesses: All the Gods of Homo sapiens (erectus, et al.?), including the famous Gods of the Greeks, the reported 1200 deities of the Romans, the Gods and God of the Jews and Christians, Jesus Christ, other myths, folklore (Bibles, etc.), have been and are, "stuffing" for mental horror vacui.

          Atheism does not provide "stuffing" for the "superstition vacuum" (my term; a component of mental horror vacui)—hence its minor appeal—and, hence, more Agnosticism.

          Ten? years ago, a city planner, my friend (D. A.), alerted me to "self induced"—which especially applies when our behaviors have caused us troubles

(a medical student (later, pathologist and researcher), and old friend (E. W.), c. 1968, U.C. Medical Center, San Francisco, colloquially concocted, and smilingly alerted me to: "autofuck"—his neologism).

My guess: "self induced" is commonly "stuffing" for mental horror vacui.

          Is mental horror vacui a function of qualia (do qualia exist [see 2934-2937]?)?

          Is mental horror vacui—stimulus for reproduction? An old, and departed friend—nurseryman—Bromeliad expert (W. S., Escondido, California)—"cut back" some plants, to "scare them into reproducing". [see 2907-2908 (Floridi)].

Much of human behavior is motivated by mental horror vacui.

Midnight, rooftop, two people, about 1999, Rosarito Beach, Baja: my old friend and gem cutter, sadly said: "Me siento vacío [I feel empty]". Drug (including alcohol) habits renewed. Survived a bullet in the belly; the diseased organs?  [added 1/17/2006: Jose died December 8, 2005, age 43].


          Assuming mental horror vacui is an entity (physiological/psychological/etc.), can it be traced back to the "primordial ooze"?

Environment to biochemistry to physiology to prior Homos to amphibians to one-celled organisms to bacteria to genetics-proteins to gases to the "primordial ooze" (to origins of the solar system—as infinity)?

_____ _____ _____

          Tonight (4/17/2005)—now, 2:00 a.m. (4/18/05), more (mediocre) mentating, about the relationships of mental horror vacui and Fear of Death; both, variable by the subject and environment.

Mental horror vacui might incorporate Fear of Death.

Homo sapiens [the other primates? other animals?] have to cope with two very disturbing psychological entities—mental horror vacui and Fear of Death.

_____ _____ _____

          4/18/2005 Continuing cogitation (agitation), regarding mental horror vacui and Fear of Death.


Mental horror vacui: extends as infinity.

Fear of Death: the dying, and, dead body, is earthbound.

_____ _____ _____

          5/2/2005 Apparently "illegal drugs" (Marijuana, Cocaine, etc.), and "legal drugs" such as alcohol, Selective Serotonin Reuptake Inhibitors (Prozac, Paxil, etc.), etc.; sex; marriage; shopping; radio; television; overeating; Internet; telephones; religions; music; fiction; sports; holiday travel; pets; gambling; politics; war; hobbies; investing; clutter; "Man thrives on adversity"; newspapers and magazines; construction; education; procreation; etc.; function, in part, to diminish the sensations of mental horror vacui and Fear of Death. [See: 2866-2868].

_____ _____ _____


from: Encyclopaedia of Religion and Ethics, 1961, Volume VI, 796, 797.

          "HORROR.—Although, both in its subjective aspect and in its external expression, horror is usually regarded as an extreme form of fear (q.v.), there are certain features which almost amount to a difference of kind. It [horror] has two forms, which may be called the acute and the diffuse. The latter ["diffuse"] is felt mainly in regard to the sufferings or disasters of others rather than one's own....

Horror is...pre-eminently a social emotion; to the individual it is a shock, followed, almost as in cases of physical shock, by prolonged depression and to a certain extent a lowering of the mental and physical tone....

          The acute form of horror is more intense, more egoistic, and concerns situations of imminent danger to oneself....

The sense of sin, and of the imminence or vastness of the corresponding penalty, or, in more refined natures, the sense of isolation from man or from God which the consciousness of sin brings with it, has, as is well known, stirred up in many souls a sense of horror which has been the motive to religious conversion in some cases, to self-destruction in others....

          In horror the mental powers are affected similarly with the bodily; the senses are confused or defective, and among others the sensibility to pain seems mercifully lowered, as it is in animals also....

          Like physical shock, horror may cause death, when too extreme; and in predisposed natures it may cause insanity, whether of the obsessional or of the depressive, melancholic type. Burton collected a number of instances from the earlier literature; naturally such cases were more familiar in the Middle Ages, when superstition was more wide-spread.

          Probably the sympathetic form of horror is more frequent to-day than the egoistic; it certainly is aroused by sights and sounds which a century ago would have left men unmoved; WHAT EXCITES HORROR IN A PEOPLE MIGHT WELL BE TAKEN AS A CRITERION OF ITS DEGREE OF CIVILIZATION.

          LITERATURE.—C. Darwin, Expression of the Emotions, London, 1872; C. Bell, Anatomy of Expression6, do. 1872; A. Bain, Emotions and Will3, do. 1875; D. Hack Tuke, Influences of the Mind upon the Body2, 2 vols., do. 1884; C. Féré, Pathology of the Emotions, Eng. tr., do. 1899; R. Burton, Anatomy of Melancholy (ed. Bell, London, 1896), i. 386. J.L. McIntyre."

_____ _____ _____


from: Much Ado About Nothing, Theories of space and vacuum from the Middle Ages to the Scientific Revolution, Edward Grant, Professor of History and Philosophy of Science and History, Indiana University, Cambridge University Press, 1981.

'Few dicta are more inextricably linked with the Middle Ages than the declaration that "nature abhors a vacuum." Although the full significance of this famous principle would be described and explicated only in the fourteenth century, it had already emerged in the thirteenth, when expressions such as natura abhorret vacuum, horror vacui, and fuga vacui began to appear.1 The origin of the principle ["nature abhors a vacuum"] is, however, unknown....

There can be little doubt that Aristotle's [384 - 322 B.C.E.] vigorous attacks against the existence of any kind of vacuum constituted the broad background from which the principle that nature abhors a vacuum was ultimately derived.7 But Aristotle himself formulated no equivalent of the medieval principle ["nature abhors a vacuum"]. And yet one passage of his may be singled out as of particular significance because it provided the point of departure for the commentary by Averroes [1126 - 1198]. In De caelo (4.5.312b.4–12), Aristotle declared....

          Taken together, the ideas of Adelard [1075 - 1160], Philo [Philo (Judaeus) of Alexandria 13 B.C.E. - 45-50 C.E.], and Averroes clearly express the concept of nature's abhorrence of a vacuum. Only a name or phrase was needed to identify it for convenient reference, and at least three [see above] such were supplied in the thirteenth century.' [67, 68, 69].

          "Scholastics were concerned about two kinds of intramundane ["being or occurring within the material world" (Webster's Third)] vacua [vacuums], small and large. The first of these was conceived as a multitude of minute, interstitial, void pores lying between particles of matter and designated by expressions such as vacuum imbibitum,26 vacuum interceptum,27 vacuum diffusum,28 vacuum infusum,29 and vacuum immixtum;30 the second, commonly referred to as vacuum separatum,31 was described as an extended, separately existing space.32" [70-71]. [other vacuums, listed in the index (455): vacuum aeris, vacuum ignis, vacuum negativum, vacuum privativum].

Additional References

The Great Chain of Being, A Study of the History of an Idea, The William James Lectures Delivered at Harvard University, 1933, Arthur O. Lovejoy, Harvard University Press, 1978 (c1964) (c1936).

Biografía del Vacío, Su historia filosófica y científica desde la Antigüedad a la Edad Moderna, Albert Ribas, 2004. [only noted on the Internet].

_____ _____ _____




1.       from:

"Many of Erasistratus' [fl. c. 250 B.C.E.] physiological theories stemmed from his combined notions of 'corpuscularianism' and 'pneuma'. He proposed, correctly, that all tissues in the body were composed of nerves, veins and arteries. Key to his conjecture was the statement that "nature abhors a vacuum" (in horror vacui) (On the Natural Faculties [by Galen 129 - c. 199 C.E.]). This concept provided sound argument for such necessary processes as digestion and the subsequent circulation of nutriments to the body....the basic notion of in horror vacui was revolutionary. The concepts of respiration, diffusion, osmosis, and concentration gradients all draw upon this basic idea; these processes are vital to maintaining homeostasis and organ function within the body. Erasistratus was the first to apply in horror vacui to physiological function."


2.       from:

'horror vacui [middle Latin (O.E.D.)] - The compulsion to make marks in every space. Horror vacui is indicated by a crowded design. In Latin, it is literally, "fear of empty space" or "fear of emptiness." Some consider horror vacui one of the principles of design. Those who exclude it from their list of principles apparently interpret it as possessing an undesirable, perhaps obsessive quality, in contrast to the desirable, controled [controlled] principle of limitation, or perhaps of that of emphasis or dominance. (pr. [pronounced] horror vack'wee)'


3.       from:

"National Association of Practising Psychiatrists"


4.       from:

"Friedrich Nietzsche (1844-1900) German

Nietzsche's life:

          Son of strict, intensely patriotic Lutheran minister who went mad when Nietzsche was 4

          Nietzsche was raised in a household of cold and prudish women (He was later to display a great contempt for women)


          Health was poor -- brief military service

          Distinguished classical philologist; appointed professor at Basel at age 24

          Resigned 10 years later for health reasons

          The wanderer, writing

          Madness 1889

Nietzsche is often classed as an existentialist:

Existentialism—the attempt, increasingly prominent in Western thought after the late 19th century, to make philosophical sense of life in an apparently absurd or meaningless world. Existentialism usually emphasizes individual freedom and responsibility in the face of alienation, absurdity, despair and the threat of nihilism."

"Genealogy of Morals"

"Third Essay: What is the Meaning of Ascetic Ideals?"

'Essay begins with what Nietzsche calls "THE BASIC FACT OF HUMAN WILL" ... "ITS HORROR VACUI. IT NEEDS A GOAL—AND IT WILL RATHER WILL NOTHINGNESS THAN NOT WILL." (331) This statement is repeated again as the last line of the essay: "MAN WOULD RATHER WILL NOTHINGNESS THAN NOT WILL." (373)'

[from: Article #7, 197 (Edward O. Wilson): "Men would rather believe than know, have the void as purpose, as Nietzsche said, than be void of purpose."]

"No one who thinks deeply today can avoid asking the questions Nietzsche asked, even though nearly everyone rejects his answers."

5.       from:

"Mad Max

Max Beckmann [1884 - 1950] at the Museum of Modern

Art, Queens, New York

By Joseph Phelan"

'....As an art student he read Arthur Schopenhauer [1788 - 1860], who is best known as the philosopher of pessimism. Carl Jung [1875 - 1961] explained the widespread appeal of this resolutely non-academic thinker to so many nineteenth century artists:


He [Schopenhauer] was the first to speak of the suffering of the world, which visibly and glaringly surrounds us, and of confusion, passion, evil -- all those things which the [other philosophers] hardly seemed to notice and always tried to resolve into all-embracing harmony and comprehensibility. Here at last was a philosopher who had the courage to see that all was not for the best in the fundamentals of the universe.

[Memories, Dreams, Reflections, Vintage Books, 1961, p. 69]


Schopenhauer inevitably left him [Max Beckmann] open to an even greater philosophical influence: Frederic [Friedrich] Nietzsche [1844 - 1900]. Nietzsche's first book, The Birth of Tragedy, caused a sensation throughout Europe by sweeping away the old formula about the Greeks of Goethe and Winckelmann....'

'His [Beckmann's] service as a medical corpsman in the trenches of Flanders nearly drove him mad, and he was invalided out of the army in 1915, suffering from fits of hallucination and unbearable depression. As the famous Self-Portrait with Red Scarf of 1917 shows, he was ferociously concentrated, willfully intense, and deeply unhappy.

At this time Beckmann began to speak of "THE INFINITE SPACE" whose foreground has always got to be "filled with some rubbish or other, so as to disguise its dreadful depth." This element in his own theory of art was but a reflection of the fact that Beckmann himself was now prone to a keen feeling of total abandonment and desolation. He [MAX BECKMANN] WAS IN A CONSTANT STRUGGLE TO OVERCOME HIS OWN PERSONAL HORROR VACUI. With a view to winning this struggle he decided to become a recorder of the unofficial history - the nightmare of history, so to speak - of a Europe gone mad with cruelty, ideological murder, and deprivation, as in Family Picture (1920) and in his masterpiece of the period, The Night (1918-19).'


6.       from:

'What also helps explain the death of Ornament is the fading of that horror vacui which used to belong to Western sensibility in what seems a special way, a way that has to do with that "heavy" pictorial art of the West which I've already mentioned. (The Greeks might have been somewhat less affected by horror vacui, but not the Romans, who appear to have felt it in a decided way too.) Modernist visual art has tended on the whole towards lightness and openness. It tolerates increasingly, nay it demands increasingly, empty spaces and blank surfaces. That's been its long-term if not consistent tendency. Economy: "less is more." Orientalization? Not exactly. Islam, India, China have their own kinds of horror vacui, even if these haven't been as unrelenting as those of the West and the Greco-Romans. The Japanese have, on the evidence, "suffered" from horror vacui least (and their culture is the only one that can match the West's when it comes to rationalizing); it's thus no accident—as Bolshevik Marxists used to say—that Japanese art once had such an encouraging influence on Modernist painting, architecture, and poetry: more than that of any other exotic tradition.

But as I've indicated, Modernism hasn't been consistent in its overcoming of horror vacui. Many of its heroes went on feeling it and acting on it: Picasso [Pablo Picasso 1881 - 1973], Proust [Marcel Proust 1871 - 1922], Joyce, Pollock, Stravinsky [Igor Stravinsky 1882 - 1971]; others felt it but resisted it, or resisted it off and on: Valéry, David Smith, Bonnard, Mallarmé, Stefan George. But I don't want to make too much of horror vacui; it's too hard to say where it begins and where it ends....'


7.         from: [Hubertus Fremerey] [found 11/13/2004, searching for my seeming brainstorm: "mental horror vacui"]

'There is a mental "horror vacui": What we don't know we fill with hypotheses of all sorts to make sense of the world we live in and of our actions in this world. We need a house to live in, we don;t [sic] like to sleep in the woods under stars and storms like animals.

And this explains why people often resist placing hypotheses dear to them by "mere factual knowledge" or by "doubts". If you make them abandon all unproven assumptions then you tear their house apieces and drive them to the woods. We all live on some assumptions every day—even the sceptic when drinking and eating.'


8.       from:

"The absolute horror of the media professional is the interrupted broadcast. In the TV format it is sometimes witnessed in an ultimately brief interval as a traumatic black screen. In radio the despair of silence is even greater than the absence of the image on TV. Horror Vacui is replaced here by an electronic form of Horror Silentiae.

The silence of the faded radio signal and the blackness of the imploded TV screen does not merely mark the absence of a signal. The horror implied is the immanent destruction of the illusion of the seamless media surface, which requires the continuous suggestion of immediacy and connection that gives the viewer the reassuring impression of the transparency of the media screen." [compare: deaf persons, blind persons].

9.       from:

"What are my plans for 2002? To continue the previous threads, obviously, and polish some hidden orders of chaos and doom lurking in half-cooked thoughts and spelling mistakes made throughout the years

(my god, I'm insane managing that large a web site...years! it's years! all work and no play...I bet I'm just fighting horror vacui here...). [the substance of my (LS) comments to myself, before finding this]

Also, this site's got a parent and a sister site, and New playgrounds. What else? Some more writing, maybe painting. And yes, how about a relationship. But I guess, that cannot be planned on the drawing board. So I'll stick with transforming all that loneliness and desperation and sexual frustration and fear of mortality into this web site and what surrounds and encompasses it. What else is there to do? Better build your own prison than be encaged by somebody or something else. And yes, keep the key. You might need it someday..."


10.     from:

'Horror vacui

[translation from Spanish] [Óscar Reyes] One of the reasons people tend not to change their habits is horror vacui: the fear of chaos, of emptiness. Homo sapiens bases its survival not on instincts, like the bees, which are programmed for all the activities of their short life, but on its capacity to order the cosmos that surrounds it, to make instincts. Studying the harvests, the rains, and the habits of prey has allowed us to be almost as successful as the virus or the cockroaches'.


11.     from:


From Henri Nouwen:

"We are afraid of emptiness. Spinoza [1632 - 1677] speaks about our "horror vacui," our horrendous fear of vacancy. We like to occupy—fill up—every empty time and space. We want to be occupied. And if we are not occupied we easily become preoccupied; that is, we fill the empty spaces before we have even reached them. We fill them with our worries, saying, "But what if..."

It is very hard to allow emptiness to exist in our lives. Emptiness requires a willingness not to be in control, a willingness to let something new and unexpected happen. It requires trust, surrender, and openness to guidance. God wants [was invented] to dwell in our emptiness."'


12.     from:

'Sartre [Jean-Paul Sartre 1905 - 1980], too, was possessed with a sort of horror vacui.

he wrote in the treatise [book] aptly entitled Being and Nothingness.'

["The men of that time [Cicero's, 106 - 43 B.C.E.] knew, just as we do, THAT DISCONTENT WITH THE PRESENT AND THAT UNCERTAINTY OF THE MORROW[,] WHICH DO NOT ALLOW US TO ENJOY TRANQUILLITY OR REPOSE." (Gaston Boissier, Cicero and His Friends, Ward, Lock, 1895? (1865 French), 390-391)].


13.     from:, page 1767

"As nature abhors a vacuum, so early Christians were reluctant to leave unidentified this or that person who is mentioned but not named in the pages of the New Testament....TRADITION PROVIDED NAMES FOR ALL OF THESE—sometimes several different names...."



14.     from:


'Stress, words, anger, resentment, overwork, and dozens of other "space-fillers" take up huge amounts of room in relationships, crowding out connection and intimacy. What if you find yourself with some extra time or space or silence around you? Do you find yourself filling it with words, sounds, thoughts, things, [pets] or people? How do these space fillers serve as distractions and barriers to closeness? This workshop will get you thinking differently about "space," "space needs," and "space fillers."'


15.     from:

"Against Despair and Boredom the Certainty[?] of Hope"

'....Despair is when man gives in to nothingness. This is the decisive evil of our time. In a certain sense, then, fear is useful. It points to


When the backbone of the external structures that people feared collapses [is something feared—sought? A serotonin producer?]—and I'm thinking of Communism—everything seems empty and devoid of value. It's the much-quoted "lack of values." If there is no justice, no tension to the ideal, society can collapse into disorder and criminality [2907, 16.]. This is why, today, politicians, even non-believers, seek the support of the historical Churches for the re-birth of national institutions, of culture and the moral structure of society. In Romania, they are building a thousand churches at the cost of the State. In Russia, the old and much-persecuted Orthodox Church is supported in many ways by the public authorities, even by non-believers.'

[Comment (LS (11/9/2004)): reference to Horror Vacui:

RELIGION IS GRAFFITI—ON THE MIND! Christianity, the graffiti of a masochistic-sadistic gang].



16.     from:

'The idea that Nazism was motivated primarily by a typically Leftist hunger for change and excitement and rejection of the status quo is reinforced by the now famous account of life in Nazi Germany given by a young "Aryan" who lived through it. Originally written [1939] before World War II, [Sebastian] Haffner's (2002 [Defying Hitler, a memoir, 70]) account of why Hitler rose to power stresses the boring nature of ordinary German life and observes that the appeal of the Nazis lay in their offering of relief from that:


"The great danger of life in Germany has always been emptiness and boredom...The menace of monotony hangs, as it has always hung, over the great plains of northern and eastern Germany, with their colorless towns and their all too industrious, efficient, and conscientious business and organizations. With it comes a horror vacui and the yearning for 'salvation': through alcohol, through superstition, or, best of all, through a vast, overpowering, cheap mass intoxication." [see 2858-2859]

So he [Sebastian Haffner (pseudonym for: Raimund Pretzel)] too saw the primary appeal of Nazism as its offering of change, novelty and excitement.'


17.     from (Luciano Floridi):

'In order to emerge and flourish, the mind needs to make sense of its environment by continuously investing data (constraining affordances, see Chapter 4) with meaning. MENTAL LIFE IS THUS THE RESULT OF A SUCCESSFUL REACTION TO A PRIMARY HORROR VACUI semantici [(provisional) Italian: semantics]: meaningless (in the non-existentialist sense of "not-yet-meaningful") chaos threatens to tear the Self asunder ["Into separate parts" (], to drown it in an alienating otherness perceived by the Self as nothingness. THIS PRIMORDIAL DREAD OF ANNIHILATION URGES THE SELF TO GO ON FILLING ANY SEMANTICALLY EMPTY SPACE WITH WHATEVER MEANING THE SELF CAN MUSTER, as successfully as inventiveness and the cluster of contextual constraints, affordances, and the development of culture permit. This semanticization of being, or reaction of the Self to the non-Self (to phrase it in Fichtean terms), consists in the inheritance and further elaboration, maintenance, and refinement of factual narratives (personal identity, ordinary experience, community ethos, family values, scientific theories, common-sense-constituting beliefs, etc.) that are logically and contextually (and hence sometimes fully) constrained and constantly challenged by the data that they need to accommodate, mold, and explain.


Historically, the evolution of this process is directed towards an ever-changing, richer and robust framing of the world. Schematically, it is the result of four conceptual thrusts:

1) a metasemanticization of narratives....

2) a delimitation of culture....

3) a dephysicalization of nature....

4) a hypostatization [see Article #3, 45, 46, 67; Addition 34, 1559] (embodiment) of the conceptual environment designed and inhabited by the mind....'

"....every intellectual movement generates the conditions of its own senescence and replacement."

"Philosophy belongs to the centuries, not to the day."

"Philosophy is not a conceptual aspirin, a superscience, or the manicure of language. It is the last stage of reflection, where the semanticization of being is pursued and kept open (Russell [Bertrand Russell, The Problems of Philosophy] 1912: ch. 15)."

18.     from:

'Alas the wars of the 1990's have proven me all too correct. It is something that is written in fire over the last century, it is the fact that in the age of imperialism there exists a "horror vacui" - a fear of the empty space - in the system of hegemony. Where a fallen power leaves a power vacuum, other powers will contend to fill that temporary emptiness with their own power....'

19.     from: [Hubertus Fremerey] [11/11/2004]

'There is a strong tendency in humans to make meaning of what they see and hear. We try to live in a meaningful world. Thus if the meaning is not evident, we try to construct one. There is some "horror vacui" (fear of void) in philosophy, and

MOST PEOPLE PREFER a plausible [ALMOST ANY] ANSWER TO NO ANSWER AT ALL ["The subjective need to know" (forgot (old) source)]. [see 2901-2902 (Nietzsche)]

Which explains a bit why so many strange [Gods and] demigods and monsters and "forces" have been invoked at all times to explain strange events. Why do bright children always ask "why"? Because they try to build up a meaningful world.'


20.     from:

[excerpt from commentator #1] '"the world is full of people...who need to pry into the "secrets" of the universe and unfold it in the name of progress and longevity.


in the meantime we begin to forget how to live. this is dangerously close to nihilism, and i believe that "progress" will ultimately destroy our species. if I dared, i might used this much-maligned "logic" to assert that what we know as progress and rationality are ultimately detrimental to the health and existence of the human race."

[response by commentator #2, to the above comment] Then why would a species be programmed with its own self-destruction? It seems that we are if what we strive for is that which is most likely to cause our own destruction? Are we meant to last only so long? Why? Or is that perhaps an unfortunate side effect of a programming that was designed to ensure our self-preservation. We seek knowledge so that we can survive—or perhaps that is how it was "meant." And I really think that is how most people do perceive the search for knowledge.

So what then? We crawl out of caves, build condos on the moon and then explode because we have reached our maximum capacity? Is all this striving for nothing in the end? *shakes head* You know, I am going to wave the "look at the drooling fool" flag here and voice another belief I have—because it sort of fits here. I believe our purpose is to die. We exist so that we may be used as part of the renewal process for the earth. WE ARE FERTILIZER-IN-WAITING [a new favorite!]. All organic material eventually becomes that. And perhaps everything we do accomplish is just to fill the time in between birth and death. And perhaps we do "seek to fill the void that results from a necessarily empty existence by filling it with knowledge and facts." Which makes me wonder...why?'

[response by commentator #1, to the above comment] 'how can you say that our "purpose" is to die? that's just physics. its's free radicals passing through cells and degrading the DNA strands within their chromatin envelopes. its a logical outcome to the things that religion hints at: freedom from suffering, ultimate bliss. what could possibly be more blissful than non-existence?

you are atoms, molecules, tissues, electricity. nothing more. physics. to assign a "purpose" to that seems to me to be needlessly sentimental.'



21.     from:

"The Worn-out Skin

Reflections on the Uraga Sutta

["Discourse of the Snake's Skin" ("On the Snake's Skin")] ["sutta: Buddhism. A scriptural narrative, especially a text traditionally regarded as a discourse of the Buddha[? (see 2752) (one source: 560 - 480 B.C.E.)]]." (].

by Nyanaponika Thera [1901 - 1994]"

"Reflections on the Verses"


          '[Verse] 3.     HE WHO ENTIRELY CUTS OFF HIS CRAVING,




Craving (tanha) is the mighty stream of desire that flows through all existence, from the lowest microbes up to those sublime spheres free from coarse materiality. Craving is threefold: craving for sensuality, for continued existence, and for annihilation or destruction[?].

Sensuous craving (kama-tanha) within that mighty river of which our verse speaks, is a powerful whirlpool dragging everything into its depth. The infinity of all craving appears here as the bottomless abyss which vainly longs for fullness and fulfillment. But though it ceaselessly sucks into itself the objects of desire, it can never find safety and peace. For like the hunger for food, this perpetual hunger of the senses daily craves afresh for gratification: "THE SENSES ARE GREEDY EATERS."

[see 2806]


stands dark and threatening behind each sensual craving as an additional driving force. We see starkly THE PARTNERSHIP OF FEAR AND DESIRE in the pathological avarice, the hectic grasping and clinging, of those old people so masterly described by Moliere [1622 - 1673] and Balzac [Honoré de Balzac 1799 - 1850].



We strive to absorb into our ego what is non-ego or "alien"; we chase hectically and insatiably after sense enjoyment, possessions or power; we yearn to be loved, envied or feared. In short, we try to build up our "personality"--a persona, a hollow mask. But such attempts to satisfy sensual craving must fail.






Excursus (reference to Nyanaponika Thera): from: Abhidhamma Studies, Buddhist Explorations of Consciousness and Time, Nyanaponika Thera [1901 - 1994], edited with an introduction by Bhikkhu Bodhi, Wisdom Publications • Boston, in collaboration with the Buddhist Publication Society, Kandy • Sri Lanka, 1998 (1949).


"Editor's Introduction


In his preface to this book Nyanaponika Thera explains that these studies originated while he was engaged in translating into German the Dhammasanganī and the Atthasālinī, respectively the first book of the Pāli Abhidhamma Pitaka and its authorized commentary. He translated these works during the trying years of World War II, while residing in the British civilian internment camp at Dehra Dun, in north India (1941–46). Unfortunately, these two translations, made with such keen understanding and appreciation of their subject, remain unpublished. The Dhammasanganī appeared only in a very limited cyclostyle edition (Hamburg, 1950), long unavailable. The Atthasālinī has been in preparation for the press since the mid 1980s, but it is still uncertain whether it will ever see the light of day.

The investigations stimulated by this translation work, however, have enjoyed a happier fate. Soon after returning to Sri Lanka following the war, Ven. [Venerable] Nyanaponika recorded his reflections on the Abhidhamma in a set of four essays, which became the first version of this book, entitled Abhidhamma Studies: Researches in Buddhist Psychology. The manuscript must have been completed by 15 March 1947, the date of the preface, and was published in a series called Island Hermitage Publications (Frewin & Co. Ltd., Colombo, 1949). This imprint emanated from the Island Hermitage at Dodanduwa, a monastic settlement chiefly for Western Buddhist monks founded in 1911 by Ven. Nyanaponika's teacher, Ven. Nyanatiloka Mahāthera (1878–1957). Ven. Nyanatiloka, also from Germany, was the first Theravāda bhikkhu from continental Europe in modern times. Ordained in Burma in 1903, he soon established himself as an authority on the Abhidhamma, and it was from him that Ven. Nyanaponika acquired his deep respect for this abstruse branch of Buddhist learning...." [VII].


'Appendix 1

The Authenticity of the Anupada Sutta


Mrs. C. A. F. Rhys Davids, in the preface to her translation of the Dhammasanganī, throws doubt on the authenticity of the Anupada Sutta (MN No. 111) as a genuine discourse of the Buddha: "The sutta, as are so many, is an obvious patchwork of editorial compiling, and dates, without reasonable doubt, long after Sāriputta has preceded his Master in leaving this world. We have first a stock formula of praise spoken not once only of Sāriputta. Then, ex abrupto, this tradition of his fortnight of systematic introspection. Then, ex abrupto, three more formulas of praise. And that is all. The sutta, albeit put into the mouth of the Founder ["the Buddha"], is in no way a genuine discourse."74


So Mrs. Rhys Davids, we do not agree at all....' [115]. [Reminiscent of the Jesus Seminar (see Article #3, 65-66, 344.-348.). Of course, I (LS), guess Mrs. Rhys Davids is correct].

'About the Author


Ven. [VENERABLE] NYANAPONIKA THERA was one of the foremost interpreters of Theravāda Buddhism in modern times. Born into a working class Jewish family in Hanau, Germany, in 1901, with the name SIEGMUND FENIGER, he became a Buddhist by self-conviction before his twentieth year. In 1936 he left Germany for Sri Lanka, where he entered the Buddhist monastic order as a pupil of Ven. Nyanatiloka Mahāthera, the first Theravāda Buddhist monk from Germany.


Ven. Nyanaponika participated in the Sixth Buddhist Council in Yangon (1954–56) and was a cofounder of the Buddhist Publication Society in Kandy, which he served as its longtime president and editor. At the time of his death in 1994 he was one of the four "Living Ornaments of the Teaching" in the Amarapura Nikāya, the monastic fraternity into which he had been ordained. His other publications in English include The Heart of Buddhist Meditation and The Vision of Dhamma.' [145].


End of Excursus.



Excursus (reference to Nyanaponika Thera): from: Numerical Discourses of the Buddha, An anthology of Suttas from the Anguttara Nikāya, Selected and translated from the Pāli by Nyanaponika Thera and Bhikkhu Bodhi, AltaMira Press, A Division of Rowman & Littlefield Publishers, Inc., Walnut Creek • Lanham • New York • Oxford, 1999.



'This book is an anthology of discourses by the Buddha selected from the Anguttara Nikāya, the Collection of Numerical Discourses. The Anguttara Nikāya belongs to the Pāli Canon, the authorized recension of the Buddha's Word for followers of Theravāda Buddhism, the form of Buddhism that prevails in the Buddhist countries of southern Asia: Sri Lanka, Myanmar, Thailand, Cambodia and Laos. The texts included in the anthology are called sutta (Skt [Sanskrit]: sūtra), literally, "threads", a Pāli word denoting the Buddha's own words (or those of his most eminent direct disciples) as distinguished from expository and commentarial works, which lack the same degree of authority....' [XV].

"About the Translators


Nyanaponika Thera (1901–94) was one of the foremost interpreters of Theravāda Buddhism in our time. Born in Germany, he entered the Buddhist Order in Sri Lanka in 1936 and passed 58 years as a monk until his death in late 1994. He was the founding-president and longtime editor of the Buddhist Publication Society. His books include The Heart of Buddhist Meditation, The Vision of Dhamma, and Abhidhamma Studies [see 2912]." [331].


End of Excursus.

● ● ● ● ●




'Neuroanatomical hypothesis of panic disorder, revised.

Gorman JM, Kent JM, Sullivan GM, Coplan JD

Am J Psychiatry 2000 Apr;157(4):493-505

RESULTS: There appears to be a remarkable similarity between (a) the physiological and behavioral consequences of response to a conditioned fear stimulus and (b) a panic attack. In animals, these responses are mediated by a "fear network" in the brain that is centered in the amygdala [see 2932-2933] and involves its interaction with the hippocampus and medial prefrontal cortex. Projections from the amygdala to hypothalamic and brainstem sites explain many of the observed signs of conditioned fear responses. It is speculated that a similar network is involved in panic disorder. A convergence of evidence suggests that both inheritable factors and stressful life events, particularly in early childhood, are responsible for the onset of panic disorder.

CONCLUSIONS: Medications, particularly those that influence the serotonin system, are hypothesized to desensitize the fear network from the level of the amygdala through its projections to the hypothalamus [see 2932-2933] and the brainstem. Effective psychosocial treatments (i.e. cognitive-behavioral therapy, CBT) may also reduce contextual fear and cognitive misattributions at the level of the prefrontal cortex and hippocampus. Neuroimaging studies should help clarify whether these hypotheses are correct.' [end of entry].

_____ _____ _____



"Harvard Magazine"

"Neural Insulation

Cushioning Hard Memories"

'For those who suffer from post-traumatic stress disorder (PTSD), vivid recollections of the horrific events they survived or witnessed—wars, rapes, accidents, injuries, concentration camp internments—often return relentlessly for years, evoking the same fear, helplessness, horror, and consequent anguish that accompanied the initial experience. This creates a disabling cycle that can be difficult, if not impossible, to break.

But encouraging new research suggests that the beta-blocker drug propranolol, by inhibiting the release of certain stress-related hormones, may stop such unwanted memories from being reinforced in our brains. Unlike the creepy device that erases undesirable recalls just like files on a computer in the recent film Eternal Sunshine of the Spotless Mind, propranolol won't cause PTSD sufferers to forget their ghastly memories, "but it can take out the sting," says professor of psychiatry Roger K. Pitman....

The biological reason why we never forget significant experiences involves the amygdala [see 2932-2933], an almond-shaped portion of the temporal lobe. Highly emotional events stimulate the amygdala to release so-called stress hormones, such as adrenaline, into our hippocampus. These hormones strengthen the recollections, gruesome or lovely, of the events that prompted their release. In PTSD, graphic memories—frequently including flashbacks and nightmares—not only remain intense over time, but are self-perpetuating. Each time a sufferer relives the traumatic experiences, the amygdala re-releases stress hormones into the brain, and consequently reinforces already unwanted memories. But propranolol interferes with the amygdala's receptors and "takes it off-line," Pitman says. "It blocks the consolidation of memory."

Since the amygdala doesn't release stress hormones in response to ordinary situations, it's not surprising we forget where we placed our keys or parked our car. "You are likely to remember in fair detail what you were doing on the morning of September 11, 2001," says Pitman. "But do you remember what you were doing on the morning of September 10?" This reaction, he maintains, is firmly based in natural selection. "If a primitive hominid decided to take a new route to a watering hole and on her way encountered a crocodile," he says, "should she fail to remember in the future that a crocodile inhabited that route, she would be more likely to take it again and be eliminated from the gene pool."'

"Catherine Dupree"

● ● ● ● ●




"Symbols of Destruction

Elemer Hankiss, Director, Hungarian Academy of Sciences Institute of Sociology"

"Horror vacui. When the clouds of dust started to settle, it was a painful shock to see the absurd gap, the vacuum, the absence of something that had been there a couple of minutes before, in its full power and reality. We were staring into the invisible depths of non-existence, the existentialists' Néant. With the so-called stop-trick, filmmakers can make objects and persons disappear from a scene in a trice but in reality we have never witnessed this sudden annihilation of life."

'Death, triumphant. With a certain degree of exaggeration one could say that, together with the towers, the illusion of immortality collapsed as well on September 11. In what sense? We, people living in our contemporary consumer civilization, believe, and want to believe, so strongly in the power of the human being to solve the problems of life that we have almost come to believe that even the ultimate problem of human existence, mortality, can be solved. Or, at least, it can and should be eliminated from human consciousness.

Several scholars have argued that the "denial of death" is one of the main characteristics of citizens of contemporary western civilization, the civilization of consumption.3 ["3The expression comes from Ernest Becker's The Denial of Death [see 2975] (New York: The Free Press, 1973). He applied this concept only to people in America."] The cult of youth, the joy of life, success, wealth, the exclusion of death from civilized, politically correct conversation, the tactful separation of the elderly from the world of the young, the teeming of angels, spirits, time travelers, those returned from the land of death on the TV screen, the American soldier who must win the war without letting himself get killed: it is beyond doubt that - at least on the surface -- our contemporary civilization turns much less about the idea of mortality and death than traditional European civilization (and most other traditional civilizations) have ever done.

And on September 11 [2001], we were suddenly and rudely confronted with the fragility of human life. And we could not avert our eyes from the terrible sight. We could not ignore any more the unacceptable fact of death. Even if only temporarily, death has moved into our hearts.

All these may have been among the factors that have made September 11 [2001] the day of the symbols of destruction.'

● ● ● ● ●


from: Masters of the Mind, Exploring the Story of Mental Illness from Ancient Times to the New Millennium, Theodore Millon; with contributions by Seth D. Grossman and Sarah E. Meagher; portraits by Theodore Millon and Carrie N. Millon, John Wiley & Sons, c2004.


'Among advances in thought between 700 B.C. and 400 B.C. were the speculations of a number of philosopher-physicians, most notably Thales, Pythagoras, Aesculapius, Alcmaeon, and Empedocles. These forerunners laid the groundwork for the great Greek physician Hippocrates and the great Greek philosophers Socrates, Plato, and Aristotle.

          In the earliest periods of Greek civilization, insanity was considered to be a divine punishment, a sign of guilt for minor or major transgressions. Therapy sought to combat madness by various expiatory rites that removed impurities, the cause of the psychic disorder. Priests mediated the ill person's prayers to the gods to assure his or her cure. Thus, with divine help, the person's heart could be purified of its evil.

          Albeit slowly, Greek scholars realized that little of a rational nature characterized their thinking about mental pathology. To them, external, but unseen, agents could no longer serve as a logical basis for a genuine understanding of mentally troublesome phenomena. A fundamental shift began to take place, not merely in describing different types of mental disorder, but in providing a sounder basis for thinking about ways to alter these aberrant behaviors. To treat mental disorders, they began to recognize the necessity of understanding how and why mental disorders were expressed in the natural world: Only then could they successfully deal therapeutically with the tangible symptoms of everyday mental life. Instead of leaving the treatment of mental disorders to the supernatural and mystical, a more concretely oriented perspective began to emerge. This transition was led by imaginative thinkers in the fifth and sixth centuries B.C.

          A central intellectual effort of Greek philosophers was the desire to reduce the vastness of the universe to its fundamental elements. Most proposed that complexities could be degraded to one element—be it water, air, or fire. Their task was to identify the unit that composed all aspects of the universe. Among the first philosopher-scientists to tackle this task was Thales (652–588 B.C.), born in the seventh century B.C. What little we know of Thales comes largely from the writings of later Greek philosophers, notably Aristotle, Plato, and the historian Herodotus. This nimble-witted Greek believed that the fundamental unit of the universe was a tangible and identifiable substance—water. Some philosophers disagreed with the notion that the universe was composed of a simple and permanent element. Heraclitus (530–470 B.C.), for example, proposed that fire was the component that constitutes all nature. He asserted, however, that the universe was composed of no lasting substance: nothing stable, solid, or enduring. Things real and tangible inevitably vanish, change their form, even become their very opposites.

          In a similar manner, Anaxagoras (500–428 B.C.) asserted that a reduction to the basic elements could not explain the universe. He differed from Heraclitus in that he did not believe the universe lacked an enduring substance. He asserted that there was an endless number of qualitatively different elements. It was the organization or arrangement of these diverse elements that was central to the structure of the universe. Anaxagoras's novel belief that the character of these constituents could not be explained except through the action of human thought is similar to the view of the phenomenologists and the gestaltists. Some centuries later, they claimed that the structure of objective matter was largely in the interpretive eye of the perceiver.

          Later, the philosopher Democritus (460–362 B.C.), following Leucippus (ca. 445 B.C.), proposed that the universe was made of variously shaped atoms. These small particles of matter were in constant motion, differing in size and form, but always moving and combining into the many complex components that composed the universe. This innovative speculation endures to the present time. Extending the theme proposed a century earlier by Anaxagoras, Democritus stressed that all truths are relative and subjective. As noted, he asserted that matter consisted of invisible particles called atoms, a term coined by Leucippus, who had proposed the concept some half-century earlier. Each atom was composed of different shapes that combined and were linked in numerous ways. This purely speculative idea remains essentially correct to this day. The physical thesis of contemporary times known as the Heisenberg principle finds its origins in the surmise of Democritus.

          Returning to Thales for the moment, it should be noted that though he was not the prime forerunner of a modern understanding of mental processes, he was a radical thinker. He redirected attention away from mysticism, recognizing that psychic disorders were natural events that should be approached from a scientific perspective. As a pivotal figure in his time, he ushered in an alternative to earlier supernatural beliefs. Equally significant was Thales's view that scientific thinkers should try to uncover the underlying principles on which overt phenomena were based. Oriented to find these principles in physical studies and "geometric proportions," he turned to "magnetic" phenomena, convinced that the essential element of all life was its animating properties. To Thales, action and movement, based on balanced or disarrayed magnetic forces, was what distinguished human frailty. He further derogated the view that external supernatural forces intruded on the psyche; instead, the source of pathology was inherent within persons themselves.

          Paralleling the views of Thales, Pythagoras (582—510 B.C.) reasserted the importance of identifying the underlying scientific principles that may account for all forms of behavior. He differed from Thales in that he retrogressively used ethics and religion as the basis for deriving his scientific principles. More progressively, however, he was the first philosopher to claim that the brain was the organ of human intellect, as well as the source of mental disturbances. He adopted an early notion of biological humors, or naturally occurring bodily liquids, as well as positing the concept of emotional temperament to aid in decoding the origins of aberrant passions and behavior. The mathematical principles of balance and ratio served to account for variations in human characterological styles (e.g., degrees of moisture or dryness, the proportion of cold or hot). Balances and imbalances among humoral fundamentals would account for whether health or disease would be present. Possessing a deep regard for his "universal principles," he applied his ideas to numerous human, ethical, and religious phenomena. Though he believed in immortality and the transmigration of souls, this did not deter Pythagoras from making a serious effort to articulate the inner equilibrium of human anatomy and health....' [10-12].

● ● ● ● ●


from: Bright Air, Brilliant Fire, On the Matter of the Mind, Gerald M. Edelman [see 2924], BasicBooks, a Division of HarperCollins, c1992.

"To the memory of two intellectual pioneers,

Charles Darwin [1809 - 1882] and Sigmund Freud [1856 - 1939].

In much wisdom, much sadness."

"Putting Psychology on a

Biological Basis" [33].


'To a great extent, the philosophy of mind has pitched its inquiries without concerning itself (except anecdotally) with the body or the brain. We have already seen that the first modern philosopher, Descartes [1596 - 1650], based his form of rationalism on thought itself using his well-known "method of doubt," which he outlined in the Discourse on the Method: ....' [34].

          "The most ruthless and skeptical of the empiricists, Hume [David Hume 1711 - 1776], concluded that no knowledge could be secure given that it is all based on sense impressions. Even scientific knowledge appeared to be shaken by his analysis of cause and effect as no more than mental correlation based on the repetition of these sense impressions. But as we will see later, Immanuel Kant [1724 - 1804] (figure 4–1), whose background in physics and astronomy was greater than in biology, put the matter in a larger perspective. He answered Hume by pointing out the existence of categories a priori in the mind, thus assuring their coexistence with sensory experience. But while the existence of a priori categories appears in better accord with modern evidence on ethologically determined action patterns and on the neuro-physiological properties of brain cells, it is not strictly consistent with developmental studies of how babies gain a sense of space, or even with the physics of relativity. Ignorant as he had to be of modern developments in biology and physics, Kant is to be forgiven for not understanding what constraints there might be on the a priori.

          I could give other examples, but these should suffice to indicate that, in philosophy, a knowledge of psychology based on experiment and an understanding of neurology and evolution are useful to guard against extreme errors. But all this knowledge is a recent acquisition, and one can only admire the courage and persistence of these great thinkers in keeping important questions alive." [35].


"Psychology itself has not fared very well in the absence of knowledge of the brain and nervous system [?]. This is not to say that an enormous amount of useful and important information has not been accumulated since William James at Harvard in 1878 and Wilhelm Wundt in Leipzig in 1879 founded the first laboratories of experimental physiological psychology. Instead of a unified theory of the mind, however, a series of schools subsequently sprang up, each with different views on behavior, consciousness, and on the relative significance of perception, memory, language, and thought." [36].

"as modern methods of measuring brain function developed and an increased understanding of brain biochemistry emerged, it became clear that psychology could not be pursued without being increasingly grounded in biology. At best it could be provisionally pursued (as it always has been) while awaiting biological interpretation.

          Once one arrives at this conclusion, however, there is no escaping an even more fundamental one: The phenomena of psychology depend on the species in which they are seen, and the properties of species depend on natural selection. This view, taken by ethologists such as Nikolaas Tinbergen [1907 - 1988] and Konrad Lorenz [1903 - 1989] and also by most modern psychologists, inexorably links psychology to biology. That linkage demonstrates the importance of evolutionary origins in the behavior of species." [40].

          "While the ideas of philosophers and of different psychological schools must be taken into account in any consideration of the matter of the mind, such ideas have only lately come to grips with the key issues of biology itself. The message boils down to this: The fundamental basis for all behavior and for the emergence of mind is animal and species morphology (anatomy) and how it functions. Natural selection acts on individuals as they compete within and between species. From studying the paleontological record it follows that what we call mind emerged only at particular times during evolution (and rather late at that).

          These terse comments can be used as the basis for a research program to connect psychology with biology—a program to account for embodiment. Given the record of the history of the philosophy of mind and of psychology, the continued avoidance of the biological underpinnings of such a program is not likely to enhance our understanding of how the human mind emerged and how it functions. Errors continue to arise when psychology is pursued without strong connections to biology; I discuss some of them in the Postscript.

          The center of any connection between psychology and biology rests, of course, with the facts of evolution. It was Darwin [Charles Robert Darwin 1809 - 1882] who first recognized that natural selection had to account even for the emergence of human consciousness. Let us turn to some of his insights and their consequences." [41].

Excursus (Los Angeles Times, 9/19/2005: "The devolution of a believer", John Darnton): "by the time he [Charles Darwin] was in his 60s, ... he readily described himself as a nonbeliever.... Darwin was not afraid to look deeply into the VOID." End of Excursus.


"Chapter 15

A Graveyard of Isms:

Philosophy and Its Claims"


'Philosophy is a graveyard of "isms."' [158].

"A Critical Postscript"

"....We must incorporate biology into our theories of knowledge and language...."

[Seems strange!, this encouragement is still needed c. 1992]. [252].

Excursus: from: Diderot [1713 - 1784] and The Encyclopaedists, John Morley [1838 - 1923], Vol. I. (of two), Macmillan, 1914 (1878). [See: 2813-2814].

'Diderot exclaims, "Ah, madam, how different is the morality of a blind man from ours; and how the morality of the deaf would differ from that of the blind; and if a being should have a sense more than we have, how wofully imperfect would he find our morality!" This is plainly a crude and erroneous way of illustrating the important truth of the strict relativity of ethical standards and maxims. Diderot speaks as if they were relative simply and solely to our five wits, and would vary with them only. Everybody now has learnt that morality depends not merely on the five wits, but on the mental constitution within, and on the social conditions without. It is to these rather than to the number of our senses, that moral ideas are relative.' [89].

"[John Morley] The question whether a blind man has as good reasons for believing in the existence of God as a man with sight can find, was of more vivid interest." [90].

'Diderot proceeds to consider man as distributed into as many distinct and separate beings as he has senses. "My idea would be to decompose a man, so to speak, and to examine what he derives from each of the senses with which he is endowed. I have sometimes amused myself with this kind of metaphysical anatomy; and I found that of all the senses, the eye was the most superficial; the ear, the proudest; smell, the most voluptuous; taste, the most superstitious and the most inconstant; touch, the profoundest, and the most of a philosopher. It would be amusing to get together a society, each member of which should have no more than one sense; there can be no doubt that they would all treat one another as out of their wits."

          This is interesting, because it was said at the time to be the source of one of the most famous fancies in the philosophical literature of the century, the Statue in Condillac's Treatise on the Sensations....' [105-106]. End of Excursus.

● ● ● ● ●


from: Freud [Sigmund Freud 1856 - 1939] and the Neurosciences, From Brain Research to the Unconscious, edited by Giselher Guttmann and Inge Scholz-Strasser, with contributions by Oliver W. Sacks, Giselher Guttmann, Harald Leupold-Löwenthal, Malcolm Pines, Cornelius Borck, Morris N. Eagle, and Detlef B. Linke, Verlag Der Österreichischen Akademie der Wissenschaften, Vienna 1998. [Many facsimiles of beautiful histology drawings, by Sigmund Freud, are included].



'For I am actually not at all a man of science, not an observer, not an experimenter, not a thinker. I am by temperament nothing but a conquistador – an adventurer, if you want it translated – with all the curiosity, daring, and tenacity characteristic of a man of this sort.

Sigmund Freud to Wilhelm Fliess

February 1, 1900' ["7"].

'Oliver W. Sacks

Sigmund Freud: The Other Road


It is making severe demands on the unity of the personality to try and make me identify myself with the author of the paper on the spinal ganglia of the petromyzon. Nevertheless I must be he, and I think I was happier about that discovery than about others since.

Sigmund Freud to Karl Abraham

September 21, 1924

Everyone knows Freud as the father of psychoanalysis, but most people know little about the twenty years (1876–1896) when he was primarily a neurologist and anatomist; Freud himself rarely referred to them in later life. Yet this "other," neurological, life was the precursor to his psychoanalytic one, and perhaps an essential key to it. An early and enduring passion for Darwin [Charles Darwin 1809 - 1882], Freud [1856 - 1939] tells us in his Autobiography (allied with Goethe's [1749 - 1832] essay on Nature), made him decide to become a medical student; and already in his first year at university he was eagerly attending courses on "Biology and Darwinism" as well as lectures by the physiologist Ernst Wilhelm von Brücke [1819 - 1892]. Two years later, eager to do some real, hands-on research, Freud asked Brücke if he could work in his laboratory. Although, as Freud was later to write, he already felt that the human brain and mind might be the ultimate subject of his explorations, he was intensely curious, after reading Darwin, about the early forms and origins of nervous systems, and wished to get a sense of their slow evolution first.


          Brücke suggested that Freud look at the nervous system of a very primitive fish, Petromyzon, the lamprey – in particular at the curious "Reissner" cells clustered about the spinal cord; these cells had attracted attention since Brücke's own student days forty years earlier, but their nature and function had never been understood. Freud was able to detect the precursors of these cells in the singular larval form of the lamprey, and to show they were homologous with the posterior spinal ganglia cells of higher fish – a significant discovery. (This so-called Ammocoetes larva of Petromyzon is so different from the mature form that it was long considered to be a separate genus, Ammocoetes.) He then turned to looking at an invertebrate nervous system, that of the crayfish. And while it was believed at this time that the nerve elements of invertebrate nervous systems were radically different from those of vertebrate ones, Freud was able to show that they were, in fact, morphologically identical – and thus that it was not the cellular elements which were different in primitive or advanced animals, but their organization. Thus there emerged, even in Freud's earliest researches, a sense of a Darwinian evolution whereby, using the most conservative means (the same basic anatomic cellular elements) more and more complex nervous systems could be built.1" ["11"-12].

'....[Freud] was to make one final, highly theoretical attempt to delineate the neural basis of mental states - in his Project for a Scientific Psychology - and he [FREUD] NEVER GAVE UP THE NOTION THAT THERE MUST ULTIMATELY BE A BIOLOGICAL "BEDROCK" TO ALL PSYCHOLOGICAL CONDITIONS AND THEORIES. But for practical purposes he felt he could, and must, put these aside for a time....' [17].

          'In the last third of the twentieth century, the whole tenor of neurology and neuroscience has itself been moving to such a dynamic and constructional view of the brain, a sense that even at the most elementary levels – as, for example, in the "filling in" of a blind spot or scotoma, or the seeing of a visual illusion, as both Richard Gregory and V.S. Ramachandran have demonstrated – the brain constructs a plausible hypothesis or pattern or scene. Gerald Edelman [see 2920], above all – drawing on the data of current neuroanatomy and neurophysiology, of embryology and evolutionary biology, of clinical and experimental work, and of synthetic neural modelling – has been creating a most detailed neurobiological model of the mind. And in this, the brain's central role is precisely one of constructing categories – first perceptual, then conceptual – and of an ascending process, a "bootstrapping," where through repeated recategorization at higher and higher levels, consciousness is finally achieved. Thus every perception is a creation for Edelman, and every memory, all remembering is re-categorization, recreation.7 [see footnote, 2925]

          Such categories, for Edelman, depend on the "values" of the organism, those biases or dispositions (partly innate, partly learned) which, for Freud, were characterized as "drives," "instincts," and "affects." Thus "retranscription" becomes the model for the brain-mind's most fundamental activity. The attunement here between Freud's views and Edelman's is striking – and here, at least, one has the sense that psychoanalysis and neurobiology can be fully at home with one another, congruent and mutually supportive. And it may be that in this equation of "Nachträglichkeit" [complex. yours to define] with "recategorization" we see a hint of how the two seemingly disparate universes – the universes of human meaning and of natural science – may come together.

          Ernest Jones [1879 - 1958] spoke of Freud as "The Darwin of the mind," and Edelman [see 2920], in his latest book on neural darwinism, dedicates it to the memory of Darwin [Charles Darwin 1809 - 1882] and Freud [Sigmund Freud 1856 - 1939]. And this is not "just" the Freud of psychoanalysis, but the Freud who spent his first adult twenty years as a neuroanatomist, a clinical neurologist, and neuro-theorist – and laid the foundations upon which psychoanalysis could arise.' [20-21] ["References", follow].

          '[footnote] 7There are, of course, innumerable areas in neuroscience and neurobiology besides that of memory where Freud's influence, direct or indirect, has been profound. There are marked analogies between psychoanalysis and neuropsychology, as discussed by Solms and Saling. A.R. Luria himself was fascinated by Freud's work as a very young man, and wrote to him in 1922 regarding the new Psychoanalytic Society, which he had founded in Kazan. Luria was thrilled, he wrote in his autobiography, The Making of Mind, to receive a courteous reply from the great man, addressing him as "Mr. President," and giving him permission to translate some of his works into Russian.' [21].

'Cornelius Borck

Visualizing Nerve Cells and Psychical Mechanisms

The Rhetoric of Freud's Illustrations*


"Psycho-analysis is related to psychiatry approximately as histology is to anatomy: the one studies the external forms of the organs, the other studies their construction out of tissues and cells. It is not easy to imagine a contradiction between these two species of study, of which one is a continuation of the other. To-day, as you know, anatomy is regarded by us as the foundation of scientific medicine. But there was a time when it was as much forbidden to dissect the human cadaver in order to discover the internal structure of the body as it now seems to be to practise psycho-analysis in order to learn about the internal mechanism of the mind."1 [see footnote, below]

          It is no coincidence that in 1916 Freud used the comparison between histology and anatomy to illustrate the situation of psychoanalysis and its relationship to the scientifically established related discipline of psychiatry. After all, his own scientific career had begun with studies of the histological anatomy of the nervous system....' ["57"].

          "[footnote (see above)] 1Freud, S. (1963). Introductory Lectures on Psycho-Analysis. In: Standard Edition, vol. XVI, London: Hogarth Press, pp. 254–255." ["57"].


          'Freud's intense concern with the technical aspects of histology can be seen in the fact that he experimented systematically with different staining techniques, devoting several publications to this topic (1879a, 1884b–d, 1887g). He [Freud] felt he was particularly successful in developing a method of staining nervous tissue with gold chloride in such a way that the cellular organization of the tissue could be studied by means of the nerve tracts. Freud considered this work so important that he published it in three different places, among them in an English version in the journal Brain (1884c). This gold chloride staining was also an important methodological prerequisite for his studies of the anatomy of the medulla oblongata, the transition from the spinal cord to the brain (1885d, 1886b–c). Furthermore as Brücke's assistant, Freud was obligated to prepare the anatomical specimens and histological object carriers for the lectures at the university. Freud's scientific activity during those years consisted of working on visualizations; and this was true even after he left Brücke's laboratory. When working with Meynert [Theodor Meynert 1833 - 1892], Freud [1856 - 1939] continued his microscopic studies, publishing e.g. his studies on the medulla oblongata: "In a certain sense I nevertheless remained faithful to the line of work which I had originally started. The subject which Brücke had proposed for my investigations had been the spinal cord of one of the lowest of the fishes (Ammocoetes Petromyzon); and I now passed on to the human central nervous system."8 [see footnote, below] With the exception of his studies on cocaine, all of Freud's scientific publications from his first ten years of research are directly related to histology and techniques of visualization.' [60-61].

          "[footnote (see above)] 8Freud, S. (1959). An Autobiographical Study. In: Standard Edition, vol. XX, London: Hogarth Press, p. 10." [61].


"Morris N. Eagle

Freud's Legacy

Defenses, Somatic Symptoms and Neurophysiology


I think a most fateful and fortunate step in Freud's intellectual life was his decision to abandon grand speculative neurological schemes following the Project for a Scientific Psychology and to limit his writings to psychological and to ontologically neutral concepts and constructs. The decision to abandon what might be called proto-neurology enabled Freud [Sigmund Freud 1856 - 1939] to describe and formulate psychological phenomena and principles and not especially concern himself with their neurophysiological or biological underpinnings. If the former were sufficiently valid and clear, the latter would ultimately be uncovered. This remains a useful and constructive strategy.

          Freud's (1950 [1895]) Project is primarily of interest because of its anticipation and proto-neurological versions of his central psychological, i.e., psychoanalytical ideas. I believe a similar relationship holds between Freud's early neurological writings and his psychoanalytic writings. That is, apart from the solid contributions they may have made to such topics as aphasia, localization, and infantile cerebral paralyses, these writings are mainly of a more enduring and special interest to the extent that they suggest precursors and hints of Freud's psychoanalytic thinking. It is the latter, after all, that constitutes the main claim that Freud's ideas have on posterity." ["87"].

● ● ● ● ●


from: Phantoms in the Brain, Probing the Mysteries of the Human Mind, V.S. Ramachandran, M.D., Ph.D., and Sandra Blakeslee, Quill, An Imprint of HarperCollins, Pb, 1999 (c1998).

          'I'd also like to say a word about speculation, a term that has acquired a pejorative connotation among some scientists. Describing someone's idea as "mere speculation" is often considered insulting. This is unfortunate. As the English biologist Peter Medawar [1915 - 1987] has noted, "An imaginative conception of what might be true is the starting point of all great discoveries in science." Ironically, this is sometimes true even when the speculation turns out to be wrong. Listen to Charles Darwin [1809 - 1882]:

"False facts are highly injurious to the progress of science for they often endure long; but false hypotheses do little harm, as everyone takes a salutary pleasure in proving their falseness; and when this is done, one path toward error is closed and the road to truth is often at the same time opened."

          Every scientist knows that the best research emerges from a dialectic between speculation and healthy skepticism. Ideally the two should co-exist in the same brain, but they don't have to. Since there are people who represent both extremes, all ideas eventually get tested ruthlessly.' [xv-xvi].

'If you snip away a section of brain, say, from the convoluted outer layer called the neocortex and peer at it under a microscope, you'll see that it is composed of neurons or nerve cells—the basic functional units of the nervous system, where information is exchanged. At birth, the typical brain probably contains over one hundred billion neurons, whose number slowly diminishes with age.

          Each neuron has a cell body and tens of thousands of tiny branches called dendrites, which receive information from other neurons. Each neuron also has a primary axon (a projection that can travel long distances in the brain) for sending data out of the cell, and axon terminals for communication with other cells.

          If you look at Figure 1.1, you'll notice that neurons make contacts with other neurons, at points called synapses. Each neuron makes anywhere from a thousand to ten thousand synapses with other neurons. These can be either on or off, excitatory or inhibitory. That is, some synapses turn on the juice to fire things up, whereas others release juices that calm everything down, in an ongoing dance of staggering complexity. A piece of your brain the size of a grain of sand would contain one hundred thousand neurons, two million axons and one billion synapses, all "talking to" each other. Given these figures, it's been calculated that the number of possible brain states—the number of permutations and combinations of activity that are theoretically possible—exceeds the number of elementary particles in the universe. Given this complexity, how do we begin to understand the functions of the brain? Obviously, understanding the structure of the nervous system is vital to understanding its functions5—and so I will begin with a brief survey of the anatomy of the brain, which, for our purposes here, begins at the top of the spinal cord....' [8-9].
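          [The combinatorial claim above, that the number of possible brain states exceeds the number of elementary particles in the universe, is easy to sanity-check. The sketch below is only a back-of-the-envelope illustration, not Ramachandran's own calculation: it crudely treats each neuron as a binary on/off unit (real neurons are not binary) and uses the common order-of-magnitude estimate of 10^80 particles in the observable universe.]

```python
# Back-of-the-envelope check of the combinatorial claim above.
# Crude assumption: each neuron is a binary on/off unit, so a
# network of n neurons has 2**n possible activity patterns.

PARTICLES_IN_UNIVERSE = 10 ** 80  # common order-of-magnitude estimate

# Find the smallest n for which the number of on/off patterns
# already exceeds the particle count.
n = 0
while 2 ** n <= PARTICLES_IN_UNIVERSE:
    n += 1

print(n)  # -> 266: fewer than 300 binary "neurons" suffice
```

          [Even under this drastically simplified model, fewer than 300 on/off units out-combine the particle count; the brain's roughly one hundred billion neurons, each with graded rather than binary activity, put the true figure unimaginably further beyond it.]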


Excursus: from: Article #3, 71, 379., 96, 379.:


'"Q: What is the soul?

          A: The soul is a living being* without a body, having reason and free will.


--Roman Catholic catechism"


The Astonishing Hypothesis is that "You," your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules. As Lewis Carroll's Alice might have phrased it: "You're nothing but a pack of neurons." This hypothesis is so alien to the ideas of most people alive today that it can truly be called astonishing.'


"Francis Crick [Winner of Nobel Prize], The Astonishing Hypothesis The Scientific Search for the Soul [See: "confabulation", etc.], Scribner's Sons, 1994, 3." [See: 2933, 2934]. End of Excursus.

'denial is something we do all our lives, whether we are temporarily ignoring the bills accumulating in our "to do" tray or defiantly denying the finality and humiliation of death.' [137].

          'The "model" of denial that we considered earlier provides a partial explanation for both the subtle forms of denial that we all engage in, as well as the vehement protests of denial patients. It rests on the notion that the left hemisphere attempts to preserve a coherent worldview at all costs, and, to do that well, it has to sometimes shut out information that is potentially "threatening" to the stability of self.

          But what if we could somehow make this "unpleasant" fact more acceptable—more nonthreatening to a patient's belief system? Would he then be willing to accept that his left arm is paralyzed? In other words, can you "cure" his denial by simply tampering with the structure of his beliefs? ....' [151].

'What we call rational grounds for our beliefs are often

extremely irrational attempts to justify our instincts.

                                           —THOMAS HENRY HUXLEY [1825 - 1895]

          When I began this research about five years ago, I had no interest whatsoever in Sigmund Freud [1856 - 1939]. (He might have said I was in denial. [charming!]) And like most of my colleagues I was very skeptical of his ideas. The entire neuroscience community is deeply suspicious of him because he touted elusive aspects of human nature that ring true but that cannot be empirically tested. But


after I had worked with these patients, it soon became clear to me that even though Freud wrote a great deal of nonsense, there is no denying that he was a genius, especially when you consider the social and intellectual climate of Vienna at the turn of the century. Freud was one of the first people to emphasize that human nature could be subjected to systematic scientific scrutiny, that one could actually look for laws of mental life in much the same way that a cardiologist might study the heart or an astronomer study planetary motion. We take all this for granted now, but at that time it was a revolutionary insight. No wonder his name became a household word.

          Freud's most valuable contribution was his discovery that your conscious mind is simply a facade and that you are completely unaware of 90 percent of what really goes on in your brain. (A striking example of the zombie in Chapter 4.) And with regard to psychological defenses, Freud was right on the mark. Can anyone doubt the reality of the "nervous laugh" or "rationalizations"? Remarkably, although you are engaging in these mental tricks all the time, you are completely unaware of doing so and you'd probably deny it if it were pointed out to you. Yet when you watch someone else doing it, it is comically conspicuous—often embarrassingly so. Of course, all this is quite well known to any good playwright or novelist (try reading Shakespeare [1564 - 1616] or Jane Austen [1775 - 1817]), but Freud surely deserves credit for pointing out the pivotal role of psychological defenses in helping us organize our mental life. Unfortunately, the theoretical schemes he constructed to explain them were nebulous and untestable. He relied all too often on an obscure terminology and on an obsession with sex to explain the human condition. Furthermore, he never did any experiments to validate his theories.

          But in denial patients you can witness these mechanisms evolving before your very eyes, caught in flagrante delicto. One can make a list of the many kinds of self-deception that Sigmund [1856 - 1939] and Anna Freud [(Daughter of Sigmund Freud) 1895 - 1982] described and see clear-cut, amplified examples of each of them in our patients. It was seeing this list that convinced me for the first time of the reality of psychological defenses and the central role that they play in human nature.

          Denial: ....


          Repression: .... [caution!—with involvements and interpretations]


          Reaction formation: This is the propensity to assert the exact opposite of what one suspects to be true of oneself....

          Rationalization: ....

          Humor: ....

          Projection: ....

          So here we have patients engaging in precisely the same types of Freudian defense mechanisms—denial, rationalization, confabulation, repression, reaction formation and so forth—that all of us use every day of our lives. I've come to realize that they present us with a fantastic opportunity to test Freudian theories scientifically for the first time. The patients are a microcosm of you and me but "better," in that


their defense mechanisms occur on a compressed time scale and are amplified tenfold. Thus we can carry out experiments that Freudian analysts have only dreamed of. For example, what determines which particular defense you use in a given situation? Why would you use an outright denial in one case and a rationalization or reaction formation in another? Is it your (or the patient's) personality type that determines which defense mechanisms you use? Or does the social context determine which one you muster? Do you use one strategy with a superior and another with social inferiors? In other words, what are the "laws" of psychological defense mechanisms? We still have a long way to go before we can address these questions,16 but, for me, it's exciting to contemplate that we scientists can begin encroaching on territory that until now was reserved for novelists and philosophers.' [152-153, 154, 155].

          'Freud bashing is a popular intellectual pastime these days (although he still has his fans in New York and London). But, as we can see in this chapter, he did have some valuable insights into the human condition, and, when talking about psychological defenses, he was right on target, although he had no idea why they evolved or what neural mechanisms might mediate them. A less well known, but equally interesting idea put forward by Freud was his claim that he had discerned the single common denominator of all great scientific revolutions: Rather surprisingly, all of them humiliate or dethrone "man" as the central figure in the cosmos [to me (LS), probably just attempted descriptions of Homo sapiens].

          The first of these, he said, was the Copernican revolution, in which a geocentric or earth-centered view of the universe was replaced with the idea that earth is just a speck of dust in the cosmos.

          The second was the Darwinian revolution, which holds that we are puny, hairless neotenous apes that accidentally evolved certain characteristics that have made us successful, at least temporarily.

          The third great scientific revolution, he claimed (modestly), was his own discovery of the unconscious and the corollary notion that the human sense of "being in charge" is illusory. He claimed that everything we do in life is governed by a cauldron of unconscious emotions, drives and motives and that what we call consciousness is just the tip of the iceberg, an elaborate post hoc rationalization of all our actions.

          I believe Freud correctly identified the common denominator of great scientific revolutions. But he doesn't explain why this is so—why would human beings actually enjoy being "humiliated" or dethroned? What do they get in return for accepting the new worldview that belittles humankind? [these last two sentences are strange; strike me as unscientific sentimental projections]

          Here we can turn things around and provide a Freudian interpretation of why cosmology, evolution and brain science are so appealing, not just to specialists but to everyone.


But the study of cosmology gives us a sense of timelessness, of being part of something much larger. The fact that your own personal life is finite is less


frightening when you know you are part of an evolving universe—an ever-unfolding drama. This is probably the closest a scientist can come to having a religious experience.

          The same goes for the study of evolution, for it gives you a sense of time and place, allowing you to see yourself as part of a great journey. And likewise for the brain sciences. In this revolution, we have given up the idea that there is a soul separate from our minds and bodies. Far from being terrifying, this idea is very liberating. If you think you're something special in this world, engaging in a lofty inspection of the cosmos from a unique vantage point, your annihilation becomes unacceptable. But if you're really part of the great cosmic dance of Shiva, rather than a mere spectator, then your inevitable death should be seen as a joyous reunion with nature rather than as a tragedy [a superb statement!].


Brahman is all. From Brahman comes appearances, sensations, desires, deeds. But all these are merely name and form. To know Brahman one must experience the identity between him and the Self, or Brahman dwelling within the lotus of the heart. Only by so doing can man escape from sorrow and death and become one with the subtle essence beyond all knowledge.

                     —Upanishads, 500 B.C.'

[157-158] [end of Chapter 7: "The Sound of One Hand Clapping"].

"Chapter 9



Excursus: from:



          From Wikipedia, the free encyclopedia.


The limbic system is a group of brain structures that are involved in various emotions such as aggression, fear, pleasure and also in the formation of memory. The limbic system affects the endocrine system and the autonomic nervous system. It consists of several subcortical structures located around the thalamus:


                   hippocampus: involved in the formation of long-term memory

                   amygdala: involved in aggression and fear

                   cingulate gyrus (the circular shape of the cingulate gyrus resembles that of a "limb", hence the name)

                   fornicate gyrus


                   hypothalamus: controls the autonomic nervous system and regulates blood pressure, heart rate, hunger, thirst and sexual arousal. Connected to the pituitary gland and thus regulates the endocrine system. (Not all authors regard the hypothalamus as part of the limbic system.)


The limbic system is among the oldest parts of the brain in evolutionary terms: it can be found in fish, amphibians, reptiles and mammals.


The pleasure center is located in the limbic system. It is involved in sexual arousal and in the "high" derived from certain recreational drugs. Dopamine acts here. Rats with electrodes implanted into their limbic system will self-stimulate in preference over food and will eventually die of exhaustion.


The limbic system is tightly connected to the prefrontal cortex. It has been conjectured that this connection is related to the pleasure obtained from solving problems. To cure severe emotional disorders, this connection was sometimes surgically severed, a procedure of psychosurgery. Patients who underwent this procedure often became passive and lacked all motivation.' [end of entry].


End of Excursus.

          'When the Canadian psychologist Dr. Michael Persinger got hold of a similar device ["transcranial magnetic stimulator" (174)] a few years ago, he chose instead to stimulate parts of his temporal lobes. And he found to his amazement that he [Dr. Michael Persinger] experienced God for the first time in his life. [Note: an adult male, Christian first name, experienced "God". Not Gods, Goddess, Goddesses, Devil, Buddha, Allah, Santa Claus, etc.—but: "God". Culture, age, gender, etc.—at work]

          I first heard about Dr. Persinger's strange experiment from my colleague, Patricia Churchland, who spotted an account of it in a popular Canadian science magazine. She phoned me right away. "Rama, you're not going to believe this. There's a man in Canada who stimulated his temporal lobe and experienced God. What do you make of it?"

          "Does he have temporal lobe seizures?" I asked.

          "No, not at all. He's a normal guy."

          "But he stimulated his own temporal lobes?"

          "That's what the article said."

          "Hmmmm, I wonder what would happen if you tried stimulating an atheist's brain. Would he experience God?" I smiled to myself and said, "Hey, maybe we should try the device on Francis Crick."' [175].

'....One wonders whether this technique [galvanic skin response, etc.] could be useful as a sort of "piety index" to distinguish religious dabblers or frauds ("closet atheists") from true believers. The absolute zero on the scale could be set by measuring Francis Crick's galvanic skin response.' [186].

[The above references to Francis Crick [see 2929, 2934], are delightful and amusing].


          'I find great irony in the fact that every time someone smiles at you she [or he, or "it"] is in fact producing a half threat by flashing her canines. When Darwin [Charles Robert Darwin 1809 - 1882] published On the Origin of Species [1859] he delicately hinted in his last chapter that we too may have evolved from apelike ancestors. The English statesman Benjamin Disraeli [1804 - 1881] was outraged by this and at a meeting held in Oxford he asked a famous rhetorical question: "Is man a beast or an angel?" To answer this, he need only have looked at his wife's canines as she smiled at him, and he'd have realized that in this simple universal human gesture of friendliness lies concealed a grim reminder of our savage past.

          As Darwin himself concluded in The Descent of Man:


BUT WE ARE NOT HERE CONCERNED WITH HOPES AND FEARS, ONLY WITH TRUTH. We must acknowledge, as it seems to me, that man with all his noble qualities, with sympathy which he feels for the most debased, with benevolence which extends not only to other men but to the humblest creature, with his God-like intellect which has penetrated into the movements and constitution of the solar system—with all these exalted powers—man still bears in his bodily frame the indelible stamp of his lowly origin.' [211].

'the circuitry that embodies the vivid subjective quality of consciousness resides mainly in parts of the temporal lobes (such as the amygdala, septum, hypothalamus and insular cortex) and a single projection zone in the frontal lobes—the cingulate gyrus. And the activity of these structures must fulfill three important criteria, which I call (with apologies to Isaac Newton, who described the three basic laws of physics) the "three laws of qualia" ("qualia" simply means the raw feel of sensations such as the subjective quality of "pain" or "red" or "gnocchi with truffles"). My goal in identifying these three laws and the specialized structures embodying them is to stimulate further inquiry into the biological origin of consciousness.' [228-229].

"....qualia or subjective sensation. How can the flux of ions and electrical currents in little specks of jelly—the neurons in my brain—generate the whole subjective world of sensations like red, warmth, cold or pain? By what magic is matter transmuted into the invisible fabric of feelings and sensations? This problem is so puzzling that not everyone agrees it is even a problem. I will illustrate this so-called qualia riddle with two simple thought experiments of the kind that philosophers love to make up. Such whimsical pretend experiments are virtually impossible to carry out in real life. My colleague Dr. Francis Crick [see 2829, 2933] is deeply suspicious of thought experiments, and I agree with him that they can be very misleading because they often contain hidden question-begging assumptions. But they can be used to clarify logical points, and I will use them here to introduce the problem of qualia in a colorful way...." [229].

          "....These examples clearly state the problem of why qualia are thought to be essentially private...." [231].


          'Why did qualia—subjective sensation—emerge in evolution? Why did some brain events come to have qualia? Is there a particular style of information processing that produces qualia, or are there some types of neurons exclusively associated with qualia? (The Spanish neurologist Ramón y Cajal calls these neurons the "psychic neurons.") Just as we know that only a tiny part of the cell, namely, the deoxyribonucleic acid (DNA) molecule, is directly involved in heredity and other parts such as proteins are not, could it be that only some neural circuits are involved in qualia and others aren't? Francis Crick and Christof Koch have made the ingenious suggestion that qualia arise from a set of neurons in the lower layers of the primary sensory areas, because these are the ones that project to the frontal lobes where many so-called higher functions are carried out. Their theory has galvanized the entire scientific community and served as a catalyst for those seeking biological explanations for qualia. Others have suggested that the actual patterns of nerve impulses (spikes) from widely separated brain regions become "synchronized" when you pay attention to something and become aware of it.5 In other words, it is the synchronization itself that leads to conscious awareness. There's no direct evidence for this yet, but it's encouraging to see that people are at least trying to explore the question experimentally.

          These approaches are attractive for one main reason, namely, the fact that reductionism has been the single most successful strategy in science....' [233-234].

' is based on the fallacy that because you can imagine something to be logically possible, therefore it is actually possible....

you cannot use statements that begin, "After all, I can imagine" to draw conclusions about any natural phenomenon.' [235].

"Once a qualia-laden perception has been created, you're stuck with it. (A good example of this is the dalmatian dog in Figure 12.2. Initially, as you look, it's all fragments. Then suddenly everything clicks and you see the dog. Loosely speaking, you've now got the dog qualia. The next time you see it, there's no way you can avoid seeing the dog. Indeed, we have recently shown that neurons in the brain have permanently altered their connections once you have seen the dog.)8

          These examples demonstrate an important feature of qualia—it must be irrevocable. But although this feature is necessary, it's not sufficient to explain the presence of qualia. Why? ...." [238].

          "There is a third important feature of qualia. In order to make decisions on the basis of a qualia-laden representation, the representation needs to exist long enough for you to work with it. Your brain needs to hold the representation in an intermediate buffer or in so-called immediate memory. (For example, you hold the phone number you get from the information operator just long enough to dial it with your fingers.) Again this condition is not enough in itself to generate qualia...." [238-239].


          "To summarize thus far—for qualia to exist, you need potentially infinite implications (bananas, jaundice, teeth) but a stable, finite, irrevocable representation in your short-term memory as a starting point (yellow). But if the starting point is revocable, then the representation will not have strong, vivid qualia...." [241].

          'What is the functional or computational advantage to making qualia irrevocable? One answer is stability. If you constantly changed your mind about qualia, the number of potential outcomes (or "outputs") would be infinite; nothing would constrain your behavior. At some point you need to say, "this is it" and plant a flag on it, and it's the planting of the flag that we call qualia....' [241].

"Qualia are irrevocable in order to eliminate hesitation and to confer certainty to decisions.9 And this, in turn, may depend on which particular neurons are firing, how strongly they're firing and what structures they project to." [242].

          'One of the attributes of the self-representation system is that the person will confabulate to try to cover up deficits in it. The main purposes of doing this, as we saw in Chapter 7, are to prevent constant indecisiveness and to confer stability on behavior. But another important function may be to support the sort of created or narrative self that the philosopher Dan Dennett [see below] talks about—that we present ourselves as unified in order to achieve social goals and to be understandable to others. We also present ourselves as acknowledging our past and future identity, enabling us to be seen as part of society. Acknowledging and taking credit or blame for things we did in the past help society (usually kin who share our genes) incorporate us effectively in its plans, thereby enhancing the survival and perpetuation of our genes.17


Excursus: from:


"Daniel Dennett [see above]


From Wikipedia, the free encyclopedia.


Daniel Clement Dennett (born March 28, 1942) is an American philosopher. Dennett's research centers on philosophy of mind and philosophy of science, particularly as those fields relate to evolutionary biology and cognitive science. He is currently (January 2005) employed as Austin B. Fletcher Professor of Philosophy and director of the Center for Cognitive Studies at Tufts University.


Dennett is the author of several major books on evolution and consciousness. He is a leading proponent of the theory known by some as Neural Darwinism (see also greedy reductionism).


Dennett is also well known for his argument against qualia, which claims that the concept is so confused that it cannot be put to any use or understood in


any non-contradictory way, and therefore does not constitute a valid refutation of physicalism. This argument was presented most comprehensively in his book Consciousness Explained." End of Excursus.


Excursus: from:


"....Philosophers who deny that there are qualia often have in mind qualia, as the term is used in the senses specified in this section. Sometimes their target is qualia, conceived of as in the opening paragraph of the entry, but with the additional assumption (often not explicitly stated) that qualia are ineffable or nonphysical or 'given' to their subjects incorrigibly (without the possibility of error). Thus, announcements by philosophers who declare themselves opposed to qualia (e.g., Dennett [see 2936] 1987, 1991) need to be treated with some caution. One can agree that there are no qualia in the more restricted senses I have explained, and also agree that there are no ineffable or incorrigibly presented or non-physical qualities possessed by our mental states, while still endorsing qualia, in the standard broad sense [is this a world record for qualifying the abstract—qualia? Dennett's response could be fun].


In the rest of this entry, I shall use the term 'qualia' in the standard broad way I did at the beginning of the entry. So, I shall take it for granted that there are qualia...."


Additional (qualia) Reference:


End of Excursus.

          If you doubt the reality of the social self, ask yourself the following question: Imagine that there is some act you've committed about which you are extremely embarrassed (love letters and Polaroid photographs from an illicit affair). Assume further that you now have a fatal illness and will be dead in two months. If you know that people rummaging through your belongings will discover your secrets, will you do your utmost to cover your tracks? If the answer is yes, the question arises, Why bother? After all, you know you won't be around, so what does it matter what people think of you after you're gone? This simple thought experiment suggests that the idea of the social self and its reputation is not just an abstract yarn. On the contrary, it is so deeply ingrained in us that we want to protect it even after death. Many a scientist has spent his entire life yearning obsessively for posthumous fame—sacrificing everything else just to leave a tiny scratchmark on the edifice.

          So here is the greatest irony of all: that the self that almost by definition is entirely private is to a significant extent a social construct—a story you make up for others. In our discussion on denial, I suggested that confabulation and self-deception evolved mainly as by-products of the need to impose stability, internal consistency and coherence on behavior. But an added important function might stem from the need to conceal the truth from other people.


          The evolutionary biologist Robert Trivers18 has proposed the ingenious argument that self-deception evolved mainly to allow you to lie with complete conviction, as a car salesman can. After all, in many social situations it might be useful to lie—in a job interview or during courtship ("I'm not married"). But the problem is that your limbic system often gives the game away and your facial muscles leak traces of guilt. One way to prevent this, Trivers suggests, may be to deceive yourself first. If you actually believe your lies, there's no danger your face will give you away. And this need to lie efficiently provided the selection pressure for the emergence of self-deception.

          I don't find Trivers's idea convincing as a general theory of self-deception, but there is one particular class of lies for which the argument carries special force: lying about your abilities or boasting. Through boasting about your assets you may enhance the likelihood of getting more dates, thereby disseminating your genes more effectively [I (LS) compare: the mother of my two sons]. The penalty you pay for self-deception, of course, is that you may become delusional. For example, telling your girlfriend that you're a millionaire is one thing; actually believing it is a different thing altogether, for you may start spending money you don't have! On the other hand, the advantages of boasting successfully (reciprocation of courtship gestures) may outweigh the disadvantage of delusion—at least up to a point. Evolutionary strategies are always a matter of compromise.' [254-255].

'Science—cosmology, evolution and especially the brain sciences—is telling us that we have no privileged position in the universe and that our sense of having a private nonmaterial soul "watching the world" is really an illusion (as has long been emphasized by Eastern mystical traditions like Hinduism and Zen Buddhism).

Once you realize that far from being a spectator, you are in fact part of the eternal ebb and flow of events in the cosmos, this realization is very liberating. Ultimately this idea also allows you to cultivate a certain humility—the essence of all authentic religious experience.' [256]. [A favorite statement].