The idea is something like this:
Human beings are prone to superstition, to the invention of myth and legend based on the exaggeration of real past events, and to unwavering belief in anything asserted by parent figures.
Humans are also pretty rational beings, so they tend to integrate all of these myths, superstitions, and traditions into what might be called a worldview. This worldview tends to explain the origin of humanity and the world it inhabits, prescribe proper behavior of individuals toward one another (as well as toward the ubiquitous imagined anthropomorphic forces controlling the world), and address the terrifying concept of the inevitability of death.
However, despite the human tendencies mentioned above, these worldviews and their constituent parts are not invulnerable. Humans are clever, and very good at pattern-recognition. When the patterns of the worldview become inconsistent with the observed patterns in the world, humans will often reject all or part of the worldview. For example, do you believe that spilling salt is bad luck, or saying "Bloody Mary" repeatedly in a darkened room with a mirror will bring forth some unholy being to torment you? If not, why don't you?
As a young child, I tested the Bloody Mary one myself. My friends insisted it was true, but I was skeptical. I was also terrified, because I'm human and I tend to accept these superstitions naturally, on some level. Still, I was not so afraid that I wouldn't go into the bathroom (half-petrified) and test it thoroughly. Once experience had failed to confirm the purported truth of Bloody Mary, I came to assume the falsehood of this particular myth.
The idea of a mind-virus is that these ideas, these myths and superstitions, are passed from person to person easily. However, the human mind has defenses against false ideas (skepticism, a rational approach to reality). The ideas resilient to these defenses tend not to be widely abandoned, but to be passed on further. The ideas mutate and grow over time, and the mutations most favorable to the survival of the idea are passed on, while the others are quickly abandoned. This, as I'm sure you've seen, is the source of the term "mind-virus," and I find that the behavior of these ideas is actually very analogous to the evolution of an actual contagious virus.
After all, a virus is just RNA in a protein shell, and is little more than information. This is hauntingly close to what an idea consists of. Viruses do not reproduce like cellular organisms; they simply deliver the information, which has evolved through selection to hijack existing cells for the reproduction and distribution of the virus.
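The mutation-and-selection dynamic described above can be sketched as a toy simulation. This is my own illustrative model, not anything from the analogy itself: each "idea" gets a made-up transmissibility score, ideas are retold with occasional mutation, and catchier variants are accepted more often, so the population drifts toward more transmissible ideas over generations.

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

def step(ideas):
    """One round of person-to-person transmission with occasional mutation."""
    new = []
    for _ in range(len(ideas)):
        # An idea is accepted with probability equal to its transmissibility,
        # so catchier variants are over-represented in the next generation.
        candidate = random.choice(ideas)
        while random.random() > candidate:
            candidate = random.choice(ideas)
        # Small chance the idea mutates slightly in the retelling.
        if random.random() < 0.05:
            candidate = min(1.0, max(0.01, candidate + random.uniform(-0.1, 0.1)))
        new.append(candidate)
    return new

# Start with 300 mediocre ideas and run 100 generations of retelling.
ideas = [random.uniform(0.05, 0.5) for _ in range(300)]
start = sum(ideas) / len(ideas)
for _ in range(100):
    ideas = step(ideas)
end = sum(ideas) / len(ideas)
print(f"mean transmissibility: {start:.2f} -> {end:.2f}")
```

The acceptance loop is simple fitness-proportional selection; no individual idea "tries" to survive, yet the average catchiness of what circulates rises anyway, which is the whole point of the viral analogy.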
So, I have been musing over the potential dangers to a mind-virus posed by a human being, and the possible defenses the virus might evolve.
Experimental refutation of an idea. The mind, as I said, can go out and test ideas, and will usually abandon an idea when thorough testing fails to produce the predicted results.
-If the idea cannot be tested, it cannot be experimentally disproved.
-If the idea includes anecdotal accounts of strong evidence for its veracity, it may weaken refutation by apparent (but unverified) counterexample.
-If the idea is tied strongly to another idea which is accepted, the accepted idea can anchor the attached idea by way of the unwillingness to accept the falsehood of the accepted idea (which in many cases would indeed be shown to be false if the connected idea were disproved.)
Rational inconsistency of parts of an idea or worldview. Human beings are adept at finding the incongruities of two or more assertions, and will often abandon at least one idea in order to escape from the illogic.
-If an appeal to cognitive dissonance is packaged with the idea, a person might well rationalize away or ignore the obvious inconsistencies. Often this takes the form of "At least try it for a while."
-If an appeal to human ignorance is made, a person might accept inconsistent assertions, pending further data. This is actually the virus commandeering the person's skepticism; it is very true that knowledge gained later might clarify the apparent inconsistency. A strengthening factor often included is a hint or assertion that such knowledge is had somewhere, and may well be on its way.
Experience and a realistic world-view. Humans begin to expect the world to act as observed once they have gained enough experience. Skepticism is the natural reaction to claims inconsistent with that pattern.
-If the virus comes packaged with some rationalization that seems logical, the inconsistency can be made to seem trivial. Jonathan Swift's satirical A Modest Proposal is a demonstration of how preposterous ideas can be cast in a light that makes them seem perfectly reasonable.
-If a virus appeals to deep-seated anxieties common to human beings, the strong desire to escape to safety from these anxieties might cause a person to strongly accept an idea immediately, making skeptical investigation less likely. Unemployment, disease, death, loneliness, etc., can cause people to go so far as to buy spring water from Tibet and to vote for sketchy candidates making sweeping promises.
An understanding of uncertainty, probability, and statistics. This is related to the experienced, realistic world-view, but is more rational than intuitive. The unlikelihood of the truth of an assertion compared to its alternatives is extremely damaging to an idea. If you flip a coin and heads comes up twice, there is little reason to assume a biased coin. If you flip a coin 1000 times in a row and get heads every time, suspecting the coin is the logical reaction. It is not that the alternative is impossible, but it is much less likely.
-If an idea comes with an argument containing bad math or unverified statistics, it can still lend the idea apparent credibility. "Most Americans prefer green" is far less believable than "60% of Americans prefer green," because the quantitative assertion seems authoritative and precise.
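The coin-flip intuition above can be made concrete with a few lines of arithmetic (the helper name and the two-headed-coin comparison are my own illustration):

```python
def prob_heads_run(n, p=0.5):
    """Probability of n heads in a row for a coin whose heads-probability is p."""
    return p ** n

# Two heads in a row is completely ordinary for a fair coin.
print(prob_heads_run(2))  # 0.25

# A thousand heads in a row is astronomically unlikely for a fair coin
# (roughly 9.3e-302), but certain for a two-headed coin, so the
# "biased coin" hypothesis wins by an overwhelming likelihood ratio.
fair = prob_heads_run(1000)
biased = prob_heads_run(1000, p=1.0)
print(fair, biased)
```

This is why the skeptic's reaction differs between the two cases: it is never that the fair-coin explanation is impossible, only that the alternative explains the evidence incomparably better.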
Now, I ask, what types of worldviews and ideas are most resilient? Things that turn out correct, for one thing. But also, religions (the ones that last) tend to have almost every beneficial mind-virus characteristic, and so they become prolific and resilient.
Religions make claims on subjects that cannot or cannot yet be experimentally examined.
-eg The existence of god.
Religions perpetuate unverifiable anecdotes.
-eg The resurrection of Jesus.
Religions tie ideas together firmly to anchor less-secure ideas.
-eg If you have faith, you can believe.
Religions tend to make appeals to emotions in order to promote cognitive dissonance.
-eg Just try it for a while, for (family member)'s sake.
Religion regularly implies higher intelligence and a grand scheme beyond human comprehension.
-eg We cannot understand (inconsistent doctrine) in this life, but all will be made clear in heaven.
Religions employ flowery rhetoric which seems sound, but actually contains assumptions and fallacies.
-eg All things have a beginning, so nothing could exist unless there was something (god) to start it.
Religions constantly appeal to almost-universal human anxieties.
-eg Answers addressing "Where did I come from, why am I here, where am I going?"
Religions quote false or unverified statistics and use bad math constantly.
-eg The watchmaker argument.
You'll notice that these are in the same order as the bullet points I listed above enumerating the various viral survival mechanisms. Religion nearly universally contains many, many of each of these mechanisms.
But this makes sense: religions only survive when they are resilient to skepticism and disproof. Religion has existed most likely since the dawn of human culture, and has evolved over the millennia to be what it is now: very well-suited to the human mind. Modern religion is a nearly perfect mind-virus.
Any person will agree, I think, that most religions make many false claims (since, given their mutually exclusive doctrines, at most one of them can possibly be correct). The fact that these religions remain prolific, and their followers faithful, indicates the human vulnerability to viral (and sometimes virulent) ideas.