virus: God Module

TheHermit (
Wed, 31 Mar 1999 01:43:33 -0600

Joseph Heller, "Catch-22"

"What the hell are you getting so upset about? I thought you didn't believe
in God."
"I don't," she sobbed, bursting violently into tears, but the God I don't
believe in is a good God, a just God, a merciful God. He's not the mean and stupid God you make Him out to be."

Searching for "gods" - in all the right places.

Irrational beliefs are not entirely independent of intelligence and education. The more intelligent you are, the less likely you are to believe in religion (B.P. Beckwith, Free Inquiry, 1986, vol. 6, p. 46). I note that this is not a promise. You still need to think. The best, maybe the only, defense against religious memes is a thinking mind.

"I have certain, positive knowledge from my own direct experience. I can't
put it any plainer than that. I have seen God face to face." With these words, the fictional theologian Palmer Joss defends his religious convictions in Carl Sagan's 1985 novel, Contact. Joss argues for the existence of his Christian god on the basis of personal revelation. And while the Vatican may explain that this is invalid, Joss is not alone. Many religionists rest their faith on the apparently "solid foundation" of
"personal religious experiences". Some receive visions. Others hear a
comforting voice. Almost all experience a "sense of presence" or a feeling of "unity with the universe." Such episodes typically bring catharsis, joy, and calm.

Importantly, it is not solely people suffering from brain damage or mental illness who report these experiences. I can tell you that I have occasionally felt sure that I have experienced these feelings myself, as a result of religious experience, drugs, bio-feedback sessions and meditation (particularly while performing so-called "Yogic Flying"). Yet I do not ascribe them to anything other than aberrant brain processes. Such subjective experiences lack all the necessary qualities of scientific evidence, such as reproducibility and openness to consensual validation or critique. The religionist invariably resorts to claims that while his belief may not be scientifically justifiable, he knows it to be true nonetheless, because of his private religious revelation. I cannot deny his experience (back to the cave), as I have experienced things that I class as experientially indistinguishable from what he is describing, or in some cases as more or less weird than he is describing, but which, in my opinion, are symptoms of abnormal or at least "not normal" brain function, not of "gods". Telling him that is only likely to lead to the kind of anger and miscommunication that every atheist attempting to discuss religion with "the other side" has experienced. A person who knows that you suspect he has a few cogs loose is not likely to react positively anyway.

So what can we do to counter this argument for the existence of gods while avoiding a belligerent backlash from baffled believers [Alliteration's artful aid again!]? An alternative explanation which shows that these sensations, or to use the word of the week, noëtic experiences, are generated internally, are explicable using purely physical processes and do not require any "god like action" would be a good preliminary step. Of course, any alternative explanations need to be testable, repeatable and verifiable. At last, researchers in the fields of psychology and neuroscience have begun to uncover the biological mechanisms that might give rise to feelings of "revelation" in healthy adults. Some of the answers seem more than a little interesting. Here is a brief summary of the status quo.

Neuroscience, the science of the brain and the central nervous system, is on the threshold of a unified theory that will have an impact at least as powerful as that of Darwinism a hundred years ago. If you would like to know more, Edward O. Wilson's "The Insect Societies" and "Sociobiology: The New Synthesis" are both seminal and accessible. E.O. Wilson teaches zoology at Harvard and created and named the field of sociobiology. He has compressed its underlying premise into a single paragraph. "Every human brain," he says, "is born not as a blank tablet (a tabula rasa) waiting to be filled in by experience but as 'an exposed negative waiting to be slipped into developer fluid.' You can develop the negative well or you can develop it poorly, but either way you are going to get precious little that is not already imprinted on the film. The print is the individual's genetic history, over thousands of years of evolution, and there is not much anybody can do about it." Furthermore, says Wilson, genetics determine not only things such as temperament, role preferences, emotional responses, and levels of aggression, but also many of our most revered "moral" choices, which are not choices at all in any free-will sense but tendencies imprinted in the hypothalamus and limbic regions of the brain, a concept expanded upon in 1993 in a much-talked-about book, The Moral Sense, by James Q. Wilson (no kin to Edward O.).

The "God" Module

In a letter to New Scientist published on 17 May 1997, Stephen Moreton, of Cheshire, suggested that natural selection may have given advantages to humans who had a sense of religion. He says it has been known that neurological disorders such as temporal lobe epilepsy can cause
"hyper-religiosity". St Paul, Joan of Arc, Mohammed and others are cited as
examples. Researchers at the University of Minnesota have discovered that if one twin is deeply religious, the other tends to be too (Waller et al., Psychological Science, 1990, vol. 1, p. 138). This says nothing about the truth or otherwise of religious belief, but it does suggest that a propensity for religious belief can be hard-wired, genetically, into the brain, and that it had a selective advantage in the past. Waller's research at the University of Minnesota in 1990 apparently used identical twins raised apart (a common research technique when attempting to determine whether a characteristic is primarily genetic or cultural), and it offers an alternative hypothesis to those who use the indisputable evidence of the survival of religious belief as evidence that a God must exist for people to believe in.

One common way to hunt for a module in the brain is to examine patients with various kinds of brain damage, hoping to find a localized form of damage that correlates with changes in the behavior of interest. In this way, one may discover relationships between certain circuits in the brain and certain behavioral functions. This was the strategy taken by the scientists in San Diego reported below. They decided to focus on temporal lobe epilepsy (TLE) patients, who exhibit inter-ictal behavior syndrome (IBS). These patients are prone to excessive activity in their temporal lobes, causing seizures during which they report powerful religious experiences. Importantly, clinicians have previously reported that such TLE patients are also often fanatically religious, even during the long periods between seizures. The question asked by Ramachandran and his colleagues was, why do such seizures often lead to enhanced religiosity? They entertained three possibilities:

  1. Strange sensory experiences that arise during seizure are rationally interpreted as signs of paranormal powers.
  2. The strong and widespread electrical activity that defines seizures strengthens connections between temporal lobe sensory areas and the amygdala (a brain area associated with emotion). This causes patients to see "deep cosmic significance" in everything.
  3. There is a system in the temporal lobe devoted to mediating emotional responses of a religious nature. Seizures can selectively strengthen the connections in this system.

The researchers dismissed the first option on the grounds that other kinds of neurological and psychiatric disorders result in strange hallucinations without causing the development of specifically religious propensities. To distinguish between the remaining two options, the scientists tested TLE-IBS patients to see if they had stronger emotional responses to everything in the world or only to religious stimuli. The degree of emotional response was measured through a physiological correlate, skin conductance response (SCR). By measuring small rapid changes in perspiration, the researchers hoped to show that TLE-IBS patients were particularly aroused by religion. Indeed, that was exactly what was found. The TLE-IBS patients showed preferential emotional arousal when presented with religious words as opposed to words with sexual or violent connotations. Unlike the patients, age-matched healthy control subjects responded most strongly to the sexual words.

While these results seem to indicate that there is something distinctly religious about some of the circuitry in the temporal lobes, there are some reasons to be cautious about this conclusion. First, this study involved only three patients, and the preference for religious words was not equally robust in all three. Also, TLE-IBS patients sometimes exhibit changes in sexuality, becoming obsessed with the topic or bored by it. This symptom could have impacted the patients' responses to the sexual stimuli. In short, these results should be seen as preliminary.
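The logic of the skin-conductance comparison above can be sketched in a few lines. Everything below is invented for illustration; the real stimuli and SCR measurements belong to Ramachandran's experiment, and this is only a toy mock-up of the comparison, not his analysis.

```python
# Hypothetical sketch of the SCR comparison described above. The word
# categories and response values are invented; the numbers merely mimic
# the reported pattern.
from statistics import mean

def category_arousal(responses):
    """Average skin conductance response (SCR) per stimulus category.

    responses: dict mapping category -> list of SCR peak amplitudes
    (in microsiemens) recorded when words of that category were shown.
    """
    return {cat: mean(vals) for cat, vals in responses.items()}

# Invented data: a TLE-IBS patient responds most to religious words,
# a healthy control most to sexual words.
patient = {"religious": [1.9, 2.1, 2.4], "sexual": [0.8, 0.7], "violent": [0.9, 1.0]}
control = {"religious": [0.5, 0.6, 0.4], "sexual": [1.8, 1.6], "violent": [1.1, 0.9]}

pa = category_arousal(patient)
ca = category_arousal(control)
print(max(pa, key=pa.get))  # religious
print(max(ca, key=ca.get))  # sexual
```

The interesting claim is precisely this dissociation: the patients' peak arousal category differs from the controls', which is what points at religion-specific circuitry rather than global emotionality.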

In October 1997, a presentation boldly titled "The Neural Basis of Religious Experience" was given at the annual conference of the Society for Neuroscience by neuropsychologist Dr. V. S. Ramachandran and his colleagues. Here is what was reported.

"GOD MODULE: Study Ponders: Are Humans Hard-Wired For Religious Bent?" BY

No one knows why humanity felt its first religious stirrings, but researchers at the University of California, San Diego, have reported that the human brain may be hard-wired to hear the voice of heaven, in what researchers said was the first effort to address the neural basis of religious expression. In a provocative experiment with patients suffering from an unusual form of epilepsy, researchers at the UC San Diego brain and perception laboratory determined that parts of the brain's temporal lobe, which the scientists quickly dubbed the "God module", may affect how intensely a person responds to religious beliefs.

People suffering this type of seizure have long reported intense mystical and religious experiences as part of their attacks but also are unusually preoccupied with mystical thoughts between seizures. That led this team to use these patients as a way of investigating the relationship between the physical structure of the brain and spiritual experiences. Where, Not Why: In a carefully designed experiment, the researchers determined that one effect of the patients' seizures was to strengthen their brain's involuntary response to religious words, leading the scientists to suggest a portion of the brain was naturally attuned to ideas about a supreme being.

"It is not clear why such dedicated neural machinery . . . for religion may
have evolved," the team reported last week at a meeting of the Society for Neuroscience in New Orleans. One possibility, the scientists suggested, was to encourage tribe loyalty or reinforce kinship ties or the stability of a closely-knit clan. The scientists emphasized that their findings in no way suggest that religion is simply a matter of brain chemistry. "These studies do not in any way negate the validity of religious experience or God," the team cautioned. "They merely provide an explanation in terms of brain regions that may be involved." [La La La - TheHermit]

Craig Kinsley, an expert in psychology and neuroscience at the University of Richmond in Virginia, said "the implications are fascinating." "People have been tickling around the edges of consciousness and this sort of research plunges in," Kinsley said. "There is the quandary of whether the mind created God or God created the mind. This is going to shake people up, but [any conclusion] is very premature." Vilayanur S. Ramachandran, the senior scientist involved in the experiment, said, "We are skating on thin ice. We are only starting to look at this. The exciting thing is that you can even begin to contemplate scientific experiments on the neural basis of religion and God."

Well, much water has flowed since 1997. Ramachandran had no idea how much publicity was about to descend onto his head, and backed down dramatically.
"In the end, this experiment suggests only that TLE-IBS patients do, indeed,
display religion-specific symptoms. This, in turn, suggests that the brain's temporal lobe is involved in religious experience. The degree to which religion is a distinct or genetically determined part of our neural architecture has yet to be determined."

Dr. Michael Persinger of Laurentian University has performed years of research into the neurological basis of religious experiences. Over this time, he has constructed and refined a rather detailed account of the neural processes that may underlie feelings of supernatural contact (Michael A. Persinger, Neuropsychological Bases of God Beliefs. New York: Praeger, 1987). In brief, religious experiences are seen as the result of "temporal lobe transients" (TLT)--short-lived rate increases and instability in the firing patterns of neurons in the temporal lobe. These transients are seen as miniature versions of the seizures experienced by temporal lobe epileptics, and they are thought to occasionally arise in healthy people.

Persinger has speculated as to why such TLT events would produce the particular configuration of experiences reported as religious revelations. He sees a critical part of our "sense of self" as being maintained by systems in the left hemisphere temporal cortex. Most of the time, there is
"matched" activity in the analogous places in the right hemisphere. However,
when activity on the right gets out of sync with activity on the left, as during a TLT event, the left hemisphere interprets the mismatched activity as "another self" or a "sensed presence": the mind of God. In conjunction with this experience comes excessive stimulation of subcortical areas in the temporal lobe, particularly the amygdala (associated with emotion) and the hippocampus (associated with autobiographical memory). Excitation of these areas results in the attribution of personal meaning to the experience. These powerful TLT events may naturally result from a number of factors, including increased sensitivity or lability of right temporal areas, loss of oxygen to the brain, and changes in blood sugar. These biological conditions may be caused by crisis situations, prolonged anxiety, near-death contingencies, high altitudes, starvation and fasting, diurnal shifts, and other physiological stressors.

A variety of correlational studies on healthy adults have been conducted by scientists in Persinger's lab. Assuming that people who are prone to TLT events will show subtle signs of a tendency towards hemispheric mismatch even when not experiencing a "micro-seizure," Persinger and his colleagues have examined the "brain waves" of a large number of healthy subjects and have compared these results with reports of religious experiences. They have found that a particular low-frequency component of one's electroencephalogram (EEG) trace, known as the theta rhythm, can partially predict the likelihood of having religious experiences (C. Munro and Michael A. Persinger, "Relative Right Temporal Lobe Theta Activity Correlates with Vingiano's Hemispheric Quotient and the 'Sensed Presence'." Perceptual and Motor Skills, 75 (1992): 899-903). Across healthy subjects, hemispheric mismatch in the theta component correlates with reports of previous "sensed presence" experiences. Furthermore, signs of specifically subcortical (limbic) mismatch in the temporal lobes are correlated with belief in paranormal phenomena, whereas indications of mismatch in the cortex are correlated with previous "sensed presence" experiences (Michael A. Persinger, "Paranormal and Religious Beliefs May Be Mediated Differently by Subcortical and Cortical Phenomenological Processes of the Temporal (Limbic) Lobes." Perceptual and Motor Skills, 76 (1993): 247-51). In short, there is good correlational evidence that one's tendency to have religious experiences involves interhemispheric circuits in the temporal lobe.
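To make the analysis above concrete, here is a minimal sketch, with invented numbers, of the kind of computation such a study implies: estimate theta-band (roughly 4-8 Hz) power in an EEG trace, form a per-subject mismatch score, and correlate it with reported experiences. None of this is Persinger's actual code, data, or method; it only illustrates the statistical logic.

```python
# Illustrative sketch: theta-band power from a sampled signal, plus a
# Pearson correlation between invented mismatch scores and invented
# counts of "sensed presence" reports.
import math
from statistics import mean, stdev

def theta_power(samples, fs, lo=4.0, hi=8.0):
    """Crude theta-band power via a direct discrete Fourier transform."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        if lo <= k * fs / n <= hi:  # keep only bins inside the theta band
            re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(samples))
            im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(samples))
            power += (re * re + im * im) / n
    return power

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / ((len(xs) - 1) * stdev(xs) * stdev(ys))

fs = 64  # sampling rate in Hz; one second of synthetic data per trace
theta_wave = [math.sin(2 * math.pi * 6 * i / fs) for i in range(fs)]   # 6 Hz: inside the band
beta_wave = [math.sin(2 * math.pi * 20 * i / fs) for i in range(fs)]   # 20 Hz: outside it

# The 6 Hz trace carries far more theta power than the 20 Hz trace.
print(theta_power(theta_wave, fs) > theta_power(beta_wave, fs))  # True

# Invented per-subject data: right-minus-left theta mismatch score vs.
# number of reported "sensed presence" episodes.
mismatch = [0.1, 0.4, 0.2, 0.8, 0.6]
episodes = [0, 2, 1, 5, 3]
print(round(pearson(mismatch, episodes), 2))  # → 0.99
```

A high correlation on such toy data is of course built in; the point is only that "theta mismatch predicts sensed-presence reports" is an ordinary, testable statistical claim, not a mystical one.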

Persinger has now done much better. He has devised a machine that generates a particular kind of magnetic field around the head, producing "micro-seizures" in the temporal lobes of the brain. Healthy people who have experienced this induced brain activity have reported such things as feelings of floating, deformations of the body, strong emotions, a "sensed presence," and specifically religious dreamlike hallucinations!

While the results above are interesting, Persinger's work involving the actual generation of religious experiences is much more striking. In a typical experiment the subject is isolated from sound and the eyes are covered. A helmet equipped with solenoids is strapped to the head. While the subject reclines in this state of partial sensory deprivation, currents are induced in the subject's brain through the generation of patterned, extremely low frequency, milligauss magnetic fields in the solenoids. The subject is asked to describe any experiences aloud and this monologue is recorded. By manipulating the magnetic field the experimenter has some control over the location and pattern of induced current in the brain. When subcortical (limbic) areas in the temporal lobes are targeted, subjects often report distortions in their body images, senses of forced motion, and strong emotional reactions. For example, Dr. Susan Blackmore entered Persinger's experimental chamber and reported a sense of swaying motion, a feeling of being yanked into an upright position, a sense that her leg had been stretched halfway to the ceiling, a period of intense anger, and a flash of terror (Susan Blackmore, "Alien Abduction: The Inside Story." New Scientist, 19 November 1994: pp. 29-31).

When temporal cortical areas are targeted for stimulation, subjects report dreamlike visions (often with mystical or religious content), a "sense of presence," and strong emotions. Journalist Ian Cotton, for example, reported highly detailed visions of his childhood home, a dreamlike visit to the monks of a Tibetan temple, and an emotional "realization" that he too was, and always had been, a Tibetan monk (Ian Cotton, "Dr. Persinger's God Machine." Free Inquiry, 17(1) (1996/97): 47-51). Visions are particularly sensitive to suggestion, with the content being influenced by, say, the presence of a crucifix or the playing of distinctly Eastern music.

With these experimental results in mind, let's see if we can make a start at defining a non-religious explanation which meets our requirements. The primary questions posed are: Is there a "religion instinct" that is genetically "hard-wired" into our brains? (Innateness) What brain circuits are involved in "religious" experiences? (Circuitry) Does the brain contain a special module dedicated to "religious" experience? (Modularity) I think you will see that the indications are looking fairly positive. Persinger attempts to reply to them (Persinger, Neuropsychological Bases of God Beliefs, p. 138) in a critical examination of his theory and of his data.

Persinger holds that subcortical temporal lobe systems contribute to paranormal experiences and paranormal belief. Cortical areas in the temporal lobes participate in the "sense of self" and, during periods of hemispheric mismatch, in the "sensed presence." His correlational and experimental data both support the notion that temporal circuits are central to religious experience. With regard to the question of modularity, Persinger's theory specifically denies the existence of a distinct "God module." In his view, the brain areas responsible for religious experience are exactly those areas that also mediate "sense of self," general emotional responses, and autobiographical memory. While his experimental work does not bear on this question, his correlational data support this distributed view. The likelihood of having religious experiences is systematically related to these other properties of cognition.

Persinger's position on the question of innateness is more ambiguous. In his writings, he frequently points out that religious experiences can have positive effects. He sees TLT events as a remedy for extreme anxiety. The God Experience has had survival value. It has allowed the human species to live through famine, pestilence, and untold horrors. When temporal lobe transients occurred, men and women who might have sunk into a schizophrenic stupor continued to build, plan, and hope. While such "survival value" may have facilitated the incorporation of a feature into the genome, the utility of a behavior is not enough to ensure such fixation in DNA. For example, the making of bread is a skill with great survival value, but it is unlikely that this skill is genetically encoded. Still, Persinger seems to lean towards a largely nativist account. Unfortunately, the data that have emerged from Persinger's lab do not really address the question of innateness.

Note that, even if a tendency towards experiencing TLT events was found to be influenced by one's genes, this would not necessarily mean that religious experiences have been favored by natural selection. For example, it might be the case that temporal lobe lability contributes to imagination and creativity, and this lability also accidentally results in religious experiences. In short, the question of a "religion instinct" is far from settled by Persinger's work. Persinger's investigations have yet to fully confirm his views on the neurological bases of religious experience, but he has made tremendous progress. Unlike Ramachandran's work with TLE-IBS patients, Persinger has focused on healthy adults. He has shown that particular activity patterns in the temporal lobes of healthy brains can give rise to experiences that are very similar to the spontaneous religious experiences reported by many.

Seeing the "gods" at Work (and at play)

Until recently, most neuroscientists confined their inquiries to research aimed at alleviating the medical problems that affect the brain's health, and to attempts to fathom its fundamental neural mechanisms. Emboldened by their growing understanding of how the brain works, scientists now dare to investigate the relationship between the brain, human consciousness and a range of intangible mental experiences. And I might have been thinking along these lines when I said that "we (...) have a number of strong indications that a capacity for faith is genetically programmed.", but I was actually thinking of something even more exciting: fMRI and other brain imaging techniques, which allow measurement and localization of brain responses to word lists, and careful examination of genetic codes to find common similarities and differences.

Until recently, all of neuroscience was deductive, based on extrapolation from other species or on programs of observation of "brain damaged" patients and analysis, such as that performed by Dr Ramachandran, to determine what brain functionality is impaired. So neuroscientific theory was based on indirect evidence, from studies of animals or of how a normal brain changes when it is invaded (by accidents, disease, radical surgery, or experimental needles). As examples, Edward O. Wilson, mentioned in my introduction, had only a limited direct knowledge of the human brain. He is a zoologist, not a neurologist, and his theories were extrapolations from the exhaustive work he had performed in his specialty, the study of insects. The French surgeon Paul Broca discovered Broca's area, one of the two speech centers of the left hemisphere of the brain, only after one of his patients suffered a stroke. Much of the limited knowledge we have today was gained by monitoring gross brain currents using the electroencephalograph.

Now however, neuroscience has some new tools: tools for brain imaging which are improving all the time. Brain imaging refers to techniques for watching the human brain as it functions, in real time. The most advanced forms currently are three-dimensional electroencephalography using mathematical models; the more familiar PET scan (positron-emission tomography); the new fMRI (functional magnetic resonance imaging), which shows brain blood-flow patterns, and MRS (magnetic resonance spectroscopy), which measures biochemical changes in the brain; and the even newer PET reporter gene/PET reporter probe, which is, in fact, so new that it still has that length of heavy lumber for a name. Used so far only in animals and a few desperately sick children, the PET reporter gene/PET reporter probe pinpoints and follows the activity of specific genes. On a scanner screen you can actually see the genes light up inside the brain.

While the PET scan and the PET reporter gene/PET reporter probe are technically medical invasions, since they require the injection of chemicals or viruses into the body, they offer glimpses of what the noninvasive imaging of the future will probably look like. A neuroradiologist can read a list of topics out loud to a person being given a PET scan, topics pertaining to sports, music, business, history, whatever, and when he finally hits one the person is interested in, a particular area of the cerebral cortex actually lights up on the screen. Eventually, as brain imaging is refined, the picture may become as clear and complete as those see-through exhibitions, at auto shows, of the inner workings of the internal combustion engine. At that point it may become obvious to everyone that ***all*** we are looking at is a piece of machinery, an analog chemical computer, that processes information from the environment. "***all***", since you can look and look and you will not find any ghostly self inside, or any mind, or any soul.

Brain imaging was invented for medical diagnosis. But its far greater importance is that it may very well confirm, in ways too precise to be disputed, certain theories about "the mind," "the self," "the soul," and
"free will" that are already accepted by scholars in what is now the hottest
field in the academic world, neuroscience. Granted, all those skeptical quotation marks are enough to put anybody on the qui vive right away, but Ultimate Skepticism is part of the brilliance of this dawn.

Neuroscience - Politically Incorrect

Of course, not everybody is comfortable with these ideas. As Saint Augustine regarded mathematics, "The good Christian should beware of mathematicians and all those who make empty prophecies. The danger already exists that mathematicians have made a covenant with the devil to darken the spirit and confine man in the bonds of Hell.", so the general public regards neuroscience and "genetic determinism."

Just think about intelligence, as measured by IQ tests. Privately--not many care to speak out--the vast majority of neuroscientists believe the genetic component of an individual's intelligence is remarkably high. Your intelligence can be improved upon, by skilled and devoted mentors, or it can be held back by a poor upbringing--i.e., the negative can be well developed or poorly developed--but your genes are what really make the difference. The recent ruckus over Charles Murray and Richard Herrnstein's The Bell Curve is probably just the beginning of the bitterness the subject is going to create.

It wasn't simply that no one believed you could derive IQ scores from brainwaves; it was that nobody wanted to believe it could be done. Nobody wanted to believe that human brainpower is that hardwired. Nobody wanted to learn in a flash that "the genetic fix" is in. Nobody wanted to learn that he was "a hardwired genetic mediocrity" and that the best he could hope for in this "Trough of Mortal Error" was to live out his mediocre life as a stress-free dim bulb. Barry Sterman of UCLA, chief scientist for a firm called Cognitive Neurometrics, who has devised his own brain-wave technology for market research and focus groups, regards brain-wave IQ testing as possible--but in the current atmosphere you "wouldn't have a Chinaman's chance of getting a grant" to develop it.

So Neurometrics sought out investors, developed and demonstrated their unit, and tried to market their amazing but simple device known as the IQ Cap. It worked! They provided a way of testing intelligence that is free of
"cultural bias," one that does not force anyone to deal with words or
concepts that might be familiar to people from one culture but not to people from another. The IQ Cap records only brain waves; and a computer, not a potentially biased human test-giver, analyzes the results.

It was based on the work of neuroscientists such as E. Roy John, who is now one of the major pioneers of electroencephalographic brain imaging; Duilio Giannitrapani, author of The Electrophysiology of Intellectual Functions; and David Robinson, author of The Wechsler Adult Intelligence Scale and Personality Assessment: Toward a Biologically Based Theory of Intelligence and Cognition and many other monographs famous among neuroscientists. One researcher, who devised an IQ Cap himself by replicating an experiment described by Giannitrapani in The Electrophysiology of Intellectual Functions, described its use as follows:

It was not a complicated process. You attached sixteen electrodes to the scalp of the person you wanted to test. You had to muss up his hair a little, but you didn't have to cut it, much less shave it. Then you had him stare at a marker on a blank wall. This particular researcher used a raspberry-red thumbtack. Then you pushed a toggle switch. In sixteen seconds the Cap's computer box gave you an accurate prediction (within one-half of a standard deviation) of what the subject would score on all eleven subtests of the Wechsler Adult Intelligence Scale or, in the case of children, the Wechsler Intelligence Scale for Children--all from sixteen seconds' worth of brain waves. There was nothing culturally biased about the test whatsoever. What could be cultural about staring at a thumbtack on a wall? The savings in time and money were breathtaking. The conventional IQ test took two hours to complete; and the overhead, in terms of paying test-givers, test-scorers, test-preparers, and the rent, was $100 an hour at the very least. The IQ Cap required about fifteen minutes and sixteen seconds--it took about fifteen minutes to put the electrodes on the scalp--and about a tenth of a penny's worth of electricity.
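Whatever the Cap's "computer box" actually did, the underlying idea is ordinary statistics: calibrate a mapping from brain-wave features to measured test scores, then apply it to new subjects. Here is a toy, hypothetical stand-in using a single invented EEG feature and simple least-squares regression; the real device reportedly used sixteen electrodes and predicted all eleven Wechsler subtests, and nothing below reproduces its actual method or data.

```python
# Toy stand-in for the IQ Cap's prediction step: fit a least-squares
# line from one invented EEG feature to measured IQ, then predict.
from statistics import mean

def fit_line(xs, ys):
    """Closed-form simple linear regression: returns (slope, intercept)."""
    mx, my = mean(xs), mean(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Invented calibration pairs: (EEG feature value, measured full-scale IQ).
feature = [2.0, 3.5, 5.0, 6.5, 8.0]
iq = [85, 95, 100, 110, 120]

slope, intercept = fit_line(feature, iq)
predict = lambda x: slope * x + intercept

# Predict the score of a new subject whose feature value is 5.0.
print(round(predict(5.0)))  # → 102
```

The controversy described in this section is not about this arithmetic, which is trivial, but about what it would mean if such a calibration actually held up across people.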

Neurometrics's investors were rubbing their hands and licking their chops. Unfortunately, nobody wanted their IQ Cap! Visit their company web site: not a mention of it there. Visit E. Roy John's web site and see the cautious phrasing that surrounds his work today.

The neuroscientific view of life has become the strategic high ground in the academic world, and a political hot potato without peer! The battle for it has already spread well beyond the scientific disciplines and, for that matter, out into the general public. Both liberals and conservatives without a scientific bone in their bodies are busy trying to seize the terrain. The gay rights movement, for example, has fastened onto a study published in July of 1993 by the highly respected Dean Hamer of the National Institutes of Health, announcing the discovery of "the gay gene." Obviously, if homosexuality is a genetically determined trait, like left-handedness or hazel eyes, then laws and sanctions against it are attempts to legislate against Nature. Conservatives, meantime, have fastened upon studies indicating that men's and women's brains are wired so differently, thanks to the long haul of evolution, that feminist attempts to open up traditionally male roles to women are the same thing: a doomed violation of Nature.

Wilson himself has wound up in deep water on this score; or maybe I should say, cold water. In his personal life Wilson is a conventional liberal, PC, as the saying goes--he is, after all, a member of the Harvard faculty--concerned about environmental issues and all the usual things. But he has said "forcing similar role identities" on both men and women "flies in the face of thousands of years in which mammals demonstrated a strong tendency for sexual division of labor. Since this division of labor is persistent from hunter-gatherer through agricultural and industrial societies, it suggests a genetic origin. We do not know when this trait evolved in human evolution or how resistant it is to the continuing and justified pressures for human rights." "Resistant" was the neuroscientist, speaking. "Justified" was the PC Harvard liberal. He was not PC or liberal enough. Feminist protesters invaded a conference where Wilson was appearing, dumped a pitcher of ice water, cubes and all, over his head, and began chanting, "You're all wet! You're all wet!" The most prominent feminist in America, Gloria Steinem, went on television and, in an interview with John Stossel of ABC, insisted that studies of genetic differences between male and female nervous systems should cease forthwith.

But that turned out to be mild stuff in the political panic over neuroscience. In February of 1992, Frederick K. Goodwin, a renowned psychiatrist, head of the federal Alcohol, Drug Abuse, and Mental Health Administration, and a certifiable virgin in the field of public relations, made the mistake of describing, at a public meeting in Washington, the National Institute of Mental Health's ten-year-old Violence Initiative. This was an experimental program whose hypothesis was that, as among monkeys in the jungle--Goodwin was noted for his monkey studies--much of the criminal mayhem in the United States was caused by a relatively few young males who were genetically predisposed to it. In short, people who are hardwired for violent crime. Out in the jungle, among mankind's closest animal relatives, the chimpanzees, it seemed that a handful of genetically twisted young males were the ones who committed practically all of the wanton murders of other males and the physical abuse of females. What if the same were true among human beings? What if, in any given community, it turned out to be a handful of young males with toxic DNA who were pushing statistics for violent crime up to such high levels? The Violence Initiative envisioned identifying these individuals in childhood, somehow, some way, someday, and treating them therapeutically with drugs. The notion that crime-ridden urban America was a
"jungle," said Goodwin, was perhaps more than just a tired old metaphor.

That did it. That may have been the stupidest single talk uttered by an American public official in the year 1992 - and that was the year after the Gulf War and the year of Quayle's "What a waste it is to lose one's mind. Or not to have a mind is being very wasteful. How true that is"! The outcry against neuroscience and its practitioners was immediate. Senator Edward Kennedy of Massachusetts (yes, that Kennedy) and Representative John Dingell of Michigan (who, it became obvious later, suffered from hydrophobia when it came to science projects) not only condemned Goodwin's remarks as racist but also delivered their scientific verdict: Research among primates "is a preposterous basis" for analyzing anything as complex as "the crime and violence that plagues our country today." [La La La - I guess neither of them had ever heard of NASA building the space program around primate studies.] The Violence Initiative was compared to Nazi eugenic proposals for the extermination of undesirables. Dingell's Michigan colleague, Representative John Conyers, then chairman of the Government Operations Committee and senior member of the Congressional Black Caucus, demanded Goodwin's resignation--and got it two days later, whereupon the government, with the Department of Health and Human Services now doing the talking, denied that the Violence Initiative had ever existed. It disappeared down the memory hole, to use Orwell's term.
When a University of Maryland legal scholar named David Wasserman tried to reassemble the troops on the quiet, as it were, in a hall all but hidden from human purview in a hamlet called Queenstown in the foggy, boggy boondocks of Queen Anne's County on Maryland's Eastern Shore, the NIH, proving it was a slow learner, quietly provided $133,000 for the event, but only after Wasserman promised to fireproof the proceedings by also inviting scholars who rejected the notion of a possible genetic genesis of crime and scheduling a cold-shower session dwelling on the evils of the eugenics movement of the early twentieth century. No use, boys! An army of protesters found the poor cringing devils anyway and stormed into the auditorium chanting, "Maryland conference, you can't hide--we know you're pushing genocide!" It took two hours for them to get bored enough to leave, and the conference ended in a complete muddle with the specially recruited fireproofing PC faction issuing a statement that said: "Scientists as well as historians and sociologists must not allow themselves to provide academic respectability for racist pseudoscience." Today, at the NIH, the term Violence Initiative is a synonym for taboo.

The last hiding place of the "gods"

Neuroscience today resembles that moment in the Dark Ages when the Catholic Church forbade the dissection of human bodies, for fear that what was discovered inside might cast doubt on the Christian doctrine that God created man in his own image. Here we begin to sense the chill that emanates from the hottest field in the academic world. The unspoken and largely unconscious premise of the wrangling over neuroscience's strategic high ground is: We now live in an age in which science is a court from which there is no appeal. And the issue this time around, at the end of the twentieth century, is not the evolution of the species, which can seem a remote business, but the nature of our own inner selves. The elders of the field, such as Wilson, are well aware of all this and are cautious, or cautious compared to the new generation. The new generation of neuroscientists is not! In their private conversations, the bull sessions, as it were, that create the mental atmosphere of any hot new science, they express an uncompromising determinism.

They start with the most famous statement in all of modern philosophy, Descartes's "Cogito ergo sum," "I think, therefore I am," which they regard as the essence of "dualism," the old-fashioned notion that the mind is something distinct from its mechanism, the brain and the body. (I will get to the second most famous statement in a moment.) This is also known as the
"ghost in the machine" fallacy, the quaint belief that there is a ghostly
"self" somewhere inside the brain that interprets and directs its
operations. Neuroscientists involved in three-dimensional electroencephalography will tell you that there is not even any one place in the brain where consciousness or self-consciousness (Cogito ergo sum) is located. This is merely an illusion created by a medley of neurological systems acting in concert. The young generation takes this yet one step further. Since consciousness and thought are entirely physical products of your brain and nervous system--and since your brain arrived fully imprinted at birth--what makes you think you have free will? Where is it going to come from? What "ghost," what "mind," what "self," what "soul," what anything that will not be immediately grabbed by those scornful quotation marks, is going to bubble up your brain stem to give it to you? I have heard neuroscientists theorize that, given computers of sufficient power and sophistication, it would be possible to predict the course of any human being's life moment by moment, including the fact that the poor devil was about to shake his head over the very idea. I doubt that any Calvinist of the sixteenth century ever believed so completely in predestination as these, the hottest and most intensely rational young scientists in the United States at the end of the twentieth.

Which brings us to the second most famous statement in all of modern philosophy: Nietzsche's "God is dead." The year was 1882. (The book was Die Fröhliche Wissenschaft [The Gay Science].) Nietzsche said this was not a declaration of atheism, although he was in fact an atheist, but simply the news of an event. He called the death of God a "tremendous event," the greatest event of modern history. The news was that educated people no longer believed in God, as a result of the rise of rationalism and scientific thought, including Darwinism, over the preceding 250 years. Of course, there was a dark side to this, "The story I have to tell," wrote Nietzsche, "is the history of the next two centuries." He predicted (in Ecce Homo) that the twentieth century would be a century of "wars such as have never happened on earth," wars catastrophic beyond all imagining. And why? Because human beings would no longer have a god to turn to, to absolve them of their guilt; but they would still be racked by guilt, since guilt is an impulse instilled in children when they are very young, before the age of reason. As a result, people would loathe not only one another but themselves. The blind and reassuring faith they formerly poured into their belief in God, said Nietzsche, they would now pour into a belief in barbaric nationalistic brotherhoods: "If the doctrines...of the lack of any cardinal distinction between man and animal, doctrines I consider true but deadly", he says in an allusion to Darwinism in Untimely Meditations, "are hurled into the people for another generation...then nobody should be surprised when...brotherhoods with the aim of the robbery and exploitation of the non-brothers...will appear in the arena of the future."

Nietzsche's view of guilt, incidentally, is also that of neuroscientists a century later. They regard guilt as one of those tendencies imprinted in the brain at birth. In some people the genetic work is not complete, and they engage in criminal behavior without a twinge of remorse--thereby intriguing criminologists, who then want to create Violence Initiatives and hold conferences on the subject.

Nietzsche said that mankind would limp on through the twentieth century "on the mere pittance" of the old decaying God-based moral codes. But then, in the twenty-first, would come a period more dreadful than the great wars, a time of "the total eclipse of all values" (in The Will to Power). This would also be a frantic period of "revaluation," in which people would try to find new systems of values to replace the osteoporotic skeletons of the old. But you will fail, he warned, because you cannot believe in moral codes without simultaneously believing in a god who points at you with his fearsome forefinger and says "Thou shalt" or "Thou shalt not."

Why should we bother ourselves with a dire prediction that seems so far-fetched as "the total eclipse of all values"? Because of man's track record, I should think. After all, in Europe, in the peaceful decade of the 1880s, it must have seemed even more far-fetched to predict the world wars of the twentieth century and the barbaric brotherhoods of Nazism and Communism. Ecce vates! Ecce vates! Behold the prophet! How much more proof can one demand of a man's powers of prediction? A hundred years ago those who worried about the death of God could console one another with the fact that they still had their own bright selves and their own inviolable souls for moral ballast and the marvels of modern science to chart the way. But what if, as seems likely, the greatest marvel of modern science turns out to be brain imaging? And what if, ten years from now, brain imaging has proved, beyond any doubt, that not only Edward O. Wilson but also the young generation are, in fact, correct?

The elders, such as Wilson himself and Daniel C. Dennett, the author of Darwin's Dangerous Idea: Evolution and the Meanings of Life, and Richard Dawkins, author of The Selfish Gene and The Blind Watchmaker, insist that there is nothing to fear from the truth, from the ultimate extension of Darwin's dangerous idea. They present elegant arguments as to why neuroscience should in no way diminish the richness of life, the magic of art, or the righteousness of political causes, including, if one may add, political correctness at Harvard or Tufts, where Dennett is Director of the Center for Cognitive Studies, or Oxford, where Dawkins is something called Professor of Public Understanding of Science. (Dennett and Dawkins, every bit as much as Wilson, are earnestly, feverishly, politically correct.) Despite their best efforts, however, neuroscience is not rippling out into the public on waves of scholarly reassurance. It is trickling out in five-minute clips on CNN and brief articles in New Scientist and Scientific American. Not the hope. Not the joy. Not the beauty. In my opinion the next stop is Popular Science with "We're all hardwired!", and then I expect to see something like "JonBenet Ramsey's mother is innocent: Don't blame me! I'm wired wrong!" as a headline on the cover of National Enquirer at the local supermarket.

While we have seen immense developments in computing, medicine and genetics, and missed great opportunities in space technology and atomic research, this sudden switch from a belief in "Nurture", in the form of social conditioning, to "Nature", in the form of genetics and brain physiology, is, in my opinion, the great intellectual event, to borrow Nietzsche's term, of the late twentieth century. Up to now the two most influential philosophical ideas of the century have been Marxism and Freudianism. Both were founded upon the premise that human beings and their "ideals" [Marx and Freud knew about quotation marks, too] are completely molded by their environment. To Marx, the crucial environment was one's social class; "ideals" and "faiths" were notions foisted by the upper orders upon the lower as instruments of social control. To Freud, the crucial environment was the Oedipal drama, the unconscious sexual plot that was played out in the family early in a child's existence. The "ideals" and "faiths" you prize so much are merely the parlor furniture you feature for receiving your guests, said Freud; I will show you the cellar, the furnace, the pipes, the sexual steam that actually runs the house. By the mid-1950s even anti-Marxists and anti-Freudians had come to assume the centrality of class domination and Oedipally conditioned sexual drives. On top of this came Pavlov, with his "stimulus-response bonds," and B. F. Skinner, with his "operant conditioning," turning the supremacy of conditioning into something approaching a precise form of engineering.

Ironically, said Nietzsche, this unflinching eye for truth, this zest for skepticism, is the legacy of Christianity. Then he added one final and perhaps ultimate piece of irony in a fragmentary passage in a notebook shortly before he lost his mind (to the late nineteenth century's great venereal scourge, syphilis). He predicted that eventually modern science would turn its juggernaut of skepticism upon itself, question the validity of its own foundations, tear them apart, and self-destruct.

Right now, we seem to be in the throes of an anti-science, back-to-the-Dark-Ages, belief-based thrust of mentality. Are we marching to Nietzsche's drum? In the darkest hours of the night, I sometimes fear so; yet if I were a college student today, I don't think I could resist going into neuroscience. Here we have the two most fascinating riddles of the twenty-first century: the riddle of the human mind and the riddle of what happens to the human mind when it comes to know itself absolutely. In any case, we live in an age in which it is impossible and pointless to avert your eyes from the truth. Thereupon, in the year 2006 or 2026, some new Nietzsche will step forward to announce: "The self is dead" - except that, being prone to the poetic (Joe, are you listening?), he will probably say something along the lines of "The soul is dead." He will say that he is merely bringing the news, the news of the greatest event of the millennium:
"The soul, that last refuge of values, is dead, because educated people no
longer believe it exists." Unless the assurances of the Wilsons and the Dennetts and the Dawkinses also start rippling out, the lurid carnival that will ensue may make the phrase "the total eclipse of all values" seem tame.

Modern science is beginning to understand the neurological mechanisms that give rise to the religious experiences of the believer. Given these results, the skeptic may present the believer with a simple question: How do you know that your religious experience is not a simple trick of your brain--the unfolding of a perfectly natural temporal lobe transient? How can you trust such an experience when, through science, we can convincingly mimic the face of God? Hopefully the answers to these questions will help us to discuss these "religious" questions more rationally. Of course, one question which needs to be asked - though many researchers will deny it as a motivation [They lie - Heh! Heh!] - is whether, once we have answers to these questions, the gods will still have a place left to hide.

This is but a draft, and like Blaise Pascal (1656) I can only ask you to excuse me: “I have made this letter longer because I did not have time to make it shorter.”


P.S. If you think after reading this that we are dealing with a minefield, for the ultimate in non-PC try Rushton on Gould at ( )

R.A. Heinlein The Cat Who Walks Through Walls
"The hardest part about gaining any new idea is sweeping out the false idea
occupying that niche. As long as that niche is occupied, evidence and proof and logical demonstration get nowhere. But once the niche is emptied of the wrong idea that has been filling it -- once you can honestly say, 'I don't know,' then it becomes possible to get at the truth."

TheHermit on IRC
"It is time to kick some gods out of their accustomed Nietzsches"