Cult Indoctrination – and the Road to Recovery

Updated October 15, 2019


Sun Myung Moon: “split the person apart”

Significance Of The Training Session
Reverend Sun Myung Moon
Third Directors’ Conference
Master Speaks    May 17, 1973  Translated by Mrs. Won-Bok Choi

“Good morning! Sit down!
I am going to speak about the significance of a training session like this. Master’s intention is to have the State Representatives, Commanders, and the Itinerary Workers pass the examination, getting at least 70 points. I will continue this until the last one of the responsible members has passed the examination.

“For fallen men it is their duty to pass through three stages of judgment! Judgment of words, judgment of personality, and judgment of love or heart. All through history, mankind has been in search of the truth, true words. The truth is the standard by which all the problems of mankind can be solved. We know man somehow fell in the beginning, and to fall means to fall into the bondage of Satan. So, in order for us to return to the original position, we have to get rid of the bondage of Satan. For fallen people, there is no other message which is more hopeful and desirable than the message of restoration to the original position. To be restored is, in another sense, to be liberated from Satanic bondage – and this is the gospel of gospels for fallen men.

Then what is judgment? Judgment is the measurement of the standard on which all our acts are judged. If our acts cannot come in accordance with the original rule or measurement, we must be judged or punished.

Through 40 days you will have six cycles of Divine Principle lectures. If you study hard, after the sixth cycle of lectures – or in the course of them – you can imagine what will come next when the lecturer gives you a certain chapter. You can even analyze or criticize President Kim’s lecture. You may think, “The last time I came he gave a dynamic lecture, but he is tired this time; when I give the lecture I will never be tired,” etc. In your own way, you can organize your lecture. In order for you to be a dynamic lecturer, you must know the knack of holding and possessing the listeners’ hearts. If there appears a crack in the man’s personality, you wedge in a chisel, and split the person apart. For the first few lectures, you will just memorize. But after that, you will study the character of your audience, and adapt your lecture. If he is a scientist, you will approach him differently than a commercial man, artist, etc. The audience as a whole will have a nature, and you must be flexible.

At least two weeks – you must experience flower selling – two weeks to 30 days. Whether in two weeks or in one full month, until you raise 80 dollars a day; then you go to rallies, witnessing, and then if you cannot bring in three persons in one month’s time, you cannot go. That’s the formula you have to go through….

http://www.tparents.org/moon-talks/sunmyungmoon73/SM730517.htm


1. VIDEO: Why do people join cults? – Janja Lalich

2. VIDEO: The BITE model of Steve Hassan / The Influence Continuum

3. Robert Jay Lifton’s Eight Criteria of Thought Reform

4. VIDEO: The Wrong Way Home
An analysis of Dr Arthur J. Deikman’s book on cult behavior

5. Cult Indoctrination through Psychological Manipulation by Professor Kimiaki Nishida

6. Towards a Demystified and Disinterested Scientific Theory of Brainwashing (extracts) by Benjamin Zablocki

7. Psyching Out the Cults’ Collective Mania by Louis Jolyon West and Richard Delgado

8. Book: Take Back Your Life by Janja Lalich and Madeleine Tobias (2009)

9. VIDEO: Paul Morantz on Cults, Thought Reform, Coercive Persuasion and Confession

10. PODCAST: Ford Greene, Attorney and former UC member, on Sun Myung Moon

11. VIDEO: Steve Hassan interviewed by Chris Shelton

12. VIDEO: Conformity by TheraminTrees

13. VIDEO: Instruction Manual for Life by TheraminTrees

14. The Social Organization of Recruitment in the Unification Church – PDF by David Frank Taylor, M.A., July 1978, Sociology

15. Socialization techniques through which UC members were able to exert influence, by Geri-Ann Galanti, Ph.D.

16. VIDEO: Recovery from RTS (Religious Trauma Syndrome) by Marlene Winell

17. VIDEO: ICSA – After the cult

18. “How do you know I’m not the world’s worst con man or swindler?” – Sun Myung Moon

19. Bibliography


2. VIDEO: The B.I.T.E. model by Steven Hassan



3. Robert Jay Lifton’s Eight Criteria of Thought Reform

“I wish to suggest a set of criteria against which any environment may be judged — a basis for answering the ever-recurring question: ‘Isn’t this just like “brainwashing”?’”
– Robert Jay Lifton

“Ideological Totalism” is Chapter 22 of Robert Jay Lifton’s book, Thought Reform and the Psychology of Totalism: A Study of ‘Brainwashing’ in China.

Dr. Lifton, a psychiatrist and author, has studied the psychology of extremism for decades. He is renowned for his studies of the psychological causes and effects of war and political violence and for his theory of thought reform. Lifton testified at the 1976 bank robbery trial of Patty Hearst about the theory of “coercive persuasion.”

His theories — including the often-cited eight criteria described below — are used and expanded upon by many cult experts.

First published in 1961, his book was reprinted in 1989 by the University of North Carolina Press. From Chapter 22:

8 CRITERIA AGAINST WHICH ANY ENVIRONMENT MAY BE JUDGED:

  • Milieu Control – The control of information and communication.
  • Mystical Manipulation – The manipulation of experiences that appear spontaneous but in fact were planned and orchestrated.
  • The Demand for Purity – The world is viewed as black and white and the members are constantly exhorted to conform to the ideology of the group and strive for perfection.
  • The Cult of Confession – Sins, as defined by the group, are to be confessed either to a personal monitor or publicly to the group.
  • The Sacred Science – The group’s doctrine or ideology is considered to be the ultimate Truth, beyond all questioning or dispute.
  • Loading the Language – The group interprets or uses words and phrases in new ways so that often the outside world does not understand.
  • Doctrine over Person – The member’s personal experiences are subordinated to the sacred science and any contrary experiences must be denied or reinterpreted to fit the ideology of the group.
  • The Dispensing of Existence – The group has the prerogative to decide who has the right to exist and who does not.

 

Eight Conditions of Thought Reform
as presented in 
Thought Reform and the Psychology of Totalism, Chapter 22.

1. Milieu Control
The most basic feature of the thought reform environment, the psychological current upon which all else depends, is the control of human communication. Through this milieu control the totalist environment seeks to establish domain over not only the individual’s communication with the outside (all that he sees and hears, reads and writes, experiences, and expresses), but also — in its penetration of his inner life — over what we may speak of as his communication with himself. It creates an atmosphere uncomfortably reminiscent of George Orwell’s 1984…. (Page 420.)
Purposeful limitation of all forms of communication with the outside world.

The control of human communication through environment control.

The cult doesn’t just control communication between people, it also controls people’s communication with themselves, in their own minds.

2. Mystical Manipulation
The inevitable next step after milieu control is extensive personal manipulation. This manipulation assumes a no-holds-barred character, and uses every possible device at the milieu’s command, no matter how bizarre or painful. Initiated from above, it seeks to provoke specific patterns of behavior and emotion in such a way that these will appear to have arisen spontaneously from within the environment. This element of planned spontaneity, directed as it is by an ostensibly omniscient group, must assume, for the manipulated, a near-mystical quality. (Page 422.)
The potential convert is convinced of the higher purpose within the special group.

Everyone is manipulating everyone, under the belief that it advances the “ultimate purpose.”

Experiences are engineered to appear to be spontaneous, when, in fact, they are contrived to have a deliberate effect.

People mistakenly attribute their experiences to spiritual causes when, in fact, they are concocted by human beings.

3. The Demand for Purity
The experiential world is sharply divided into the pure and the impure, into the absolutely good and the absolutely evil. The good and the pure are of course those ideas, feelings, and actions which are consistent with the totalist ideology and policy; anything else is apt to be relegated to the bad and the impure. Nothing human is immune from the flood of stern moral judgements. (Page 423.)
The philosophical assumption underlying this demand is that absolute purity is attainable, and that anything done to anyone in the name of this purity is ultimately moral.

The cult demands self-sanctification through purity.

Only by pushing toward perfection, as the group views goodness, will the recruit be able to contribute.

The demand for purity creates a milieu of guilt and shame by holding up standards of perfection that no human being can attain.

People are punished and learn to punish themselves for not living up to the group’s ideals.

4. The Cult of Confession
Closely related to the demand for absolute purity is an obsession with personal confession. Confession is carried beyond its ordinary religious, legal, and therapeutic expressions to the point of becoming a cult in itself. (Page 425.)
Public confessional periods are used to get members to verbalize and discuss their innermost fears and anxieties as well as past imperfections.

The environment demands that personal boundaries are destroyed and that every thought, feeling, or action that does not conform with the group’s rules be confessed.

Members have little or no privacy, physically or mentally.

5. Aura of Sacred Science
The totalist milieu maintains an aura of sacredness around its basic dogma, holding it out as an ultimate moral vision for the ordering of human existence. This sacredness is evident in the prohibition (whether or not explicit) against the questioning of basic assumptions, and in the reverence which is demanded for the originators of the Word, the present bearers of the Word, and the Word itself. While thus transcending ordinary concerns of logic, however, the milieu at the same time makes an exaggerated claim of airtight logic, of absolute “scientific” precision. Thus the ultimate moral vision becomes an ultimate science; and the man who dares to criticize it, or to harbor even unspoken alternative ideas, becomes not only immoral and irreverent, but also “unscientific”. In this way, the philosopher kings of modern ideological totalism reinforce their authority by claiming to share in the rich and respected heritage of natural science. (Pages 427-428.)
The cult advances the idea that the cult’s laws, rules and regulations are absolute and, therefore, to be followed automatically.

The group’s belief is that their dogma is absolutely scientific and morally true.

No alternative viewpoint is allowed.

No questioning of the dogma is permitted.

6. Loading the Language
The language of the totalist environment is characterized by the thought-terminating cliché. [Slogans] The most far-reaching and complex of human problems are compressed into brief, highly reductive, definitive-sounding phrases, easily memorized and easily expressed.
The cult invents a new vocabulary, giving well-known words special new meanings, making them into trite clichés. The clichés become “ultimate terms”, either “god terms”, representative of ultimate good, or “devil terms”, representative of ultimate evil. Totalist language, then, is repetitiously centered on all-encompassing jargon, prematurely abstract, highly categorical, relentlessly judging, and to anyone but its most devoted advocate, deadly dull: the language of non-thought. (Page 429.)

Controlling words helps to control people’s thoughts.

The group uses black-or-white thinking and thought-terminating clichés.

The special words constrict rather than expand human understanding.

Non-members simply cannot comprehend what cult members are talking about.

7. Doctrine over Person
Another characteristic feature of ideological totalism: the subordination of human experience to the claims of doctrine. (Page 430.)
Past experience and values are invalid if they conflict with the new cult morality.

The value of individuals is insignificant when compared to the value of the group.

Past historical events are retrospectively altered, wholly rewritten, or ignored to make them consistent with doctrinal logic.

No matter what a person experiences, it is belief in the dogma which is important.

Group belief supersedes individual conscience and integrity.

8. Dispensed Existence
The totalist environment draws a sharp line between those whose right to existence can be recognized, and those who possess no such right.
Lifton gave a Communist example:
In thought reform, as in Chinese Communist practice generally, the world is divided into “the people” (defined as “the working class, the peasant class, the petite bourgeoisie, and the national bourgeoisie”), and “the reactionaries” or “the lackeys of imperialism” (defined as “the landlord class, the bureaucratic capitalist class, and the KMT reactionaries and their henchmen”). (Page 433.)

The group decides who has a right to exist and who does not.

The group has an elitist world view — a sharp line is drawn by the cult between those who have been saved, chosen, etc. (the cult members) and those who are lost, in the dark, etc. (the rest of the world).

Former members are seen as “weak,” “lost,” “evil,” and “the enemy”.

The cult insists that there is no legitimate alternative to membership in the cult.


The full text of Chapter 22 appears HERE courtesy of Dr. Robert Jay Lifton.



4. The Wrong Way Home

An analysis of Dr Arthur J. Deikman’s book on cult behaviour



5. Cult Indoctrination through Psychological Manipulation

by Professor Kimiaki Nishida 西田 公昭 of Rissho University in Tokyo.

[A diagram appears here in the original article; an explanation of it is given below.]

This is a shortened version of an article entitled Development of the Study of Mind Control in Japan, first published in 2005.

Recently, psychologists in Japan have been examining a contemporary social issue — certain social groups recruit new members by means of psychologically manipulative techniques called “mind control.” They then exhort their members to engage in various antisocial behaviors, from deceptive sales solicitation and forcible donation to suicide and murder [e.g. Tokyo sarin gas attack by Aum Shinrikyō in 1995]. We classify such harmful groups as “cults” or even “destructive cults.” Psychologists concerned with this problem must explain why ordinary, even highly educated people devote their lives to such groups, fully aware that many of their activities deviate from social norms, violate the law, and may injure their health. Psychologists are now also involved in the issue of facilitating the recovery of distressed cult members after they leave such groups.

Background
In the 1970s, hardly anyone in Japan was familiar with the term “destructive cult.” Even if they had been informed of cult activities, such as the 1978 Jonestown tragedy, in which 912 members of the Guyana-based American cult were murdered or committed suicide, most Japanese people would have thought the incident a sensational, curious, and inexplicable event. Because the events at Jonestown occurred overseas, Japanese people, except possibly those worried parents whose child had joined a radical cult, would not have shown any real interest.

In the 1980s, a number of Japanese, including journalists and lawyers, became concerned about the “unethical” activities of the Unification Church, whose members worshiped their so-called True Father, the cult’s Korean founder Sun Myung Moon, who proclaimed himself to be the Second Advent of Christ. One of the group’s activities entailed shady fund-raising campaigns. Another unethical activity of the cult in the 1980s was Reikan-Shôhô, a swindle in which they sold spiritual goods, such as lucky seals, Buddhist rosaries, lucky-tower [pagoda] ornaments, and so on. The goods were unreasonably expensive, but intimidated customers bought them to avoid possible future misfortune [or to liberate their deceased loved ones from the ‘hell’ they were told they were suffering in].

The first Japanese “anti-cult” organization was established in 1987 to stop the activities of the Unification Church. The organization consisted of lawyers who helped Reikan-Shôhô victims all over Japan (see Yamaguchi 2001). According to their investigation, the lawyers’ organization determined that the Unification Church in Japan engaged in three unethical practices. First, large amounts of money were collected through deceptive means. Under duress, customers desperate to improve their fortunes bankrupted themselves through buying the cult’s “spiritual” goods. Second, members participated in mass marriages arranged by the cult without the partners getting to know each other, after the partners were told by the cult leader that their marriage would save their families and ancestors from calamity. Third, the church practiced mind control, restricting members’ individual freedom, and employing them in forced labor, which often involved illegal activity. Mind-controlled members were convinced their endeavors would liberate their fellow beings.

The 1990s saw studies by a few Japanese psychological researchers who were interested in the cult problem. By the mid-1990s, Japanese courts had already acknowledged two Unification Church liabilities during proceedings the lawyers had brought against the cult; namely, mass marriage and illegal Reikan-shôhô. (see Judgment by the Fukuoka [Japan] District Court on the Unification Church 1995). The lawyers’ main objective, however, had been that the court confirm the Unification Church’s psychological manipulation of cultists, a ruling that would recognize these members as being under the duress of forced labor.

What Is Mind Control?
Early in the study of mind control, the term was equated with the military strategy of brainwashing. Mind control initially was referred to in the United States as “thought reform” or “coercive persuasion” (Lifton 1961; Schein, Schneier, and Barker 1961). Currently, however, mind control is considered to be a more sophisticated method of psychological manipulation that relies on subtler means than physical detention and torture (Hassan 1988).

In fact, people who have succumbed to cult-based mind control consider themselves to have made their decision to join a cult of their own free will. We presume that brainwashing is a behavioral-compliance technique in which individuals subjected to mind control come to accept fundamental changes to their belief system. Cult mind control may be defined as temporary or permanent psychological manipulation by people who recruit and indoctrinate cult members, influencing their behavior and mental processes in compliance with the cult leadership’s desires, a control of which the members remain naive (Nishida 1995a).

After the Aum attacks, Ando, Tsuchida, Imai, Shiomura, Murata, Watanabe, Nishida, and Genjida (1998) surveyed almost 9,000 Japanese college students. The questionnaire was designed to determine: whether the students had been approached by cults and, if so, how they had reacted; their perception of alleged cult mind-control techniques; and how their psychological needs determined their reactions when the cults had attempted to recruit them.

Ando’s survey results showed that about 20% of respondents’ impressions of the recruiter were somewhat favorable, in comparison with their impressions of salespersons. However, their compliance level was rather low. The regression analysis showed that the students tended to comply with the recruiter’s overture when:

• they were interested in what the agent told them;
• they were not in a hurry;
• they had no reason to refuse;
• they liked the agent; or
• they were told that they had been specially selected, could gain knowledge of the truth, and could acquire special new abilities.

When asked to evaluate people who were influenced or “mind controlled” by a cult, respondents tended to think it was “inevitable” those people succumbed, and they put less emphasis on members’ individual social responsibility. When mind control led to a criminal act, however, they tended to attribute responsibility to the individual. More than 70% of respondents answered in the affirmative when asked whether they themselves could resist being subjected to mind control, a result that confirms the students’ naiveté about their own personal vulnerability. The respondents’ needs or values had little effect on their reactions to, interest in, and impressions of cult agents’ attempts to recruit them.

Mind Control as Psychological Manipulation of Cult Membership
Nishida (1994, 1995b) investigated the process of belief-system change caused by mind control as practiced by a religious cult. His empirical study evaluated a questionnaire administered to 272 former group members, content analysis of the dogma in the group’s publications, videotapes of lectures on dogma, the recruiting and seminar manuals, and supplementary interviews with former members of the group.

Cult Indoctrination Process by Means of Psychological Manipulation
In one of his studies, Nishida (1994) found that recruiters offer the targets a new belief system, based on five schemas. These schemas comprise:

1. notions of self concerning one’s life purpose (Self Beliefs);

2. ideals governing the type of individual, society, and world there ought to be (Ideal Beliefs);

3. goals related to correct action on the part of individuals (Goal Beliefs);

4. notions of causality, or which laws of nature operate in the world’s history (Causality Beliefs); and 

5. trust that authority will decree the criteria for right and wrong, good and evil (Authority Beliefs) ▲.

Content analysis of the group’s dogma showed that its recruitment process restructures the target’s belief-system, replacing former values with new ones advocated by the group, based on the above schemas.

Abelson (1986) argues that beliefs are metaphorically similar to possessions. He posits that we collect whatever beliefs appeal to us, as if working in a room where we arrange our favorite furniture and objects. He proposes that we transform our beliefs into a new cognitive system of neural connections, which may be regarded as the tools for decision making.

Just as favorite tools are often placed in the central part of a room, or in a harmonious place, it appears that highly valued beliefs are located for easy access in cognitive processing. Meanwhile, much as worn-out tools are often hidden from sight in corners or storerooms, less-valued beliefs are relocated where they cannot be easily accessed for cognitive processing. An individual change in belief is illustrated by the replacement of a single piece of furniture, while a complete belief-system change is represented as exchanging all of one’s furniture and goods, and even the design and color of the room. The belief-system change, such as occurs during the recruitment and indoctrination process, is metaphorically represented in Figure 1 (below), starting with a functional room with its hierarchy of furniture or tools, and progressing through the stages of recruitment and indoctrination to the point at which the functional room has been replaced by a new set of furniture and tools that represent the altered belief system.

Step 0. The figure shows the five schemas as the set of thought tools that potential recruits hold prior to their contact with the group.

Step 1. Governed by their trust in authority, targets undergoing indoctrination remain naive about the actual group name, its true purpose, and the dogma that is meant to radically transform the belief system they have held until their contact with the group. At this stage of psychological manipulation, because most Japanese are likely to guard against religious solicitation, the recruiter puts on a good face. The recruiter approaches the targets with an especially warm greeting and assesses their vulnerabilities in order to confound them.

Step 2. While the new ideals and goals are quite appealing to targets, their confidence level in the new notions of causality also rises; some residual beliefs may remain at this stage.

The targets must be indoctrinated in isolation so that they remain unaware that the dogma they are absorbing is a part of cult recruitment. Thus isolated, they cannot sustain their own residual beliefs through observing the other targets; the indoctrination environment tolerates no social reality (Festinger 1954). The goal for this stage is for the targets to learn the dogma by heart and embrace it as their new belief, even if it might seem strange or incomprehensible.

Step 3. At this stage, the recruiter’s repeated lobbying for the new belief system entices the targets to “relocate” those newly absorbed beliefs that appeal to them into the central area in their “rooms.” By evoking the others’ commitment, the recruiter uses group pressure to constrain each target. This approach seems to induce both a collective lack of common sense (Allport 1924) and individual cognitive dissonance (Festinger 1957).

Step 4. As the new recruits pass through a period of concentrated study, the earlier conversion of particular values extends to their entire belief system. By the end, they have wholly embraced the new belief system. The attractive new beliefs are gradually “relocated” from their “room’s” periphery into its center, replacing older beliefs. Formerly held beliefs are driven to the room’s periphery, thoroughly diminished; the new, now-central beliefs coalesce, blending with the few remaining older notions.

Shunning their former society, the targets begin to spend most of their time among group members. Their new social reality raises the targets’ conviction that the new beliefs are proper. At this time, the targets feel contentedly at home because the recruiters are still quite hospitable.

Step 5. The old belief system has become as useless as dilapidated furniture or tools. With its replacement, the transformation of the new recruits’ belief systems results in fully configured new beliefs, with trust in authority at their core, and thus with that authority an effective vehicle for thought manipulation.

At the final stage of psychological manipulation, during the recruitment and indoctrination process, the recruiters invoke the charismatic leader of the group ▲, equating the mortal with god. The recruiters instill a profound fear in the targets, fear that misfortune and calamity will beset them should they leave the cult.

Figure 1. Metamorphosis of the belief system change by cultic psychological manipulation.

Each ellipse represents the working space for decision making. The shapes colored black in the ellipse represent the newly inputted beliefs. The large shapes are developed beliefs, and the shapes in the middle represent beliefs that are highly valued by the individual. ▲ represents the authority of the charismatic leader of the group.
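Purely as an illustrative toy sketch (our own construction; the labels, numbers, and update rule below are invented and are not part of Nishida’s or Abelson’s model), the furniture metaphor in Steps 0 to 5 can be written as a small data structure in which each belief carries a centrality score that the indoctrination cycle rewrites:

# Toy illustration of the "room of furniture" metaphor: each belief has a
# centrality score, and each indoctrination cycle moves group-supplied
# beliefs toward the centre of the room while pushing prior beliefs
# toward the periphery. All values here are invented for illustration.

from dataclasses import dataclass

@dataclass
class Belief:
    label: str
    source: str        # "prior" or "group"
    centrality: float  # 1.0 = centre of the room, 0.0 = storeroom

def indoctrination_step(beliefs, shift=0.25):
    """One cycle of 'relocation': group-supplied beliefs gain centrality,
    previously held beliefs lose it (bounded between 0 and 1)."""
    for b in beliefs:
        if b.source == "group":
            b.centrality = min(1.0, b.centrality + shift)
        else:
            b.centrality = max(0.0, b.centrality - shift)
    return beliefs

# Step 0: the recruit's original belief system, with prior beliefs central.
room = [
    Belief("my life purpose (prior)", "prior", 0.9),
    Belief("trust in my own judgement (prior)", "prior", 0.8),
    Belief("group's ideal world (new)", "group", 0.1),
    Belief("group leader as ultimate authority (new)", "group", 0.1),
]

# Steps 1-5: repeated exposure gradually relocates the furniture.
for step in range(1, 6):
    indoctrination_step(room)
    print(f"Step {step}:",
          sorted(((b.label, round(b.centrality, 2)) for b in room),
                 key=lambda x: -x[1]))

Each pass of indoctrination_step stands in for one stage of “relocation”: group-supplied beliefs drift toward the centre of the room while previously held beliefs are pushed toward the periphery, mirroring Steps 2 through 5 above.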


Cult Maintenance and Expansion through Psychological Manipulation

Nishida (1995b) studied one cult’s method of maintaining and expanding its membership by means of psychological manipulation, or cult mind control. The results of factor analysis of his survey data revealed that cult mind-control techniques induced six situational factors that enhanced and maintained members’ belief-systems: (1) restriction of freedom, (2) repression of sexual passion, (3) physical exhaustion, (4) punishment for external association, (5) reward and punishment, and (6) time pressure.

Studies also concluded that four types of complex psychological factors influence, enhance, and maintain members’ belief systems: (1) behavior manipulation, (2) information-processing manipulation, (3) group-processing manipulation, and (4) physiological-stress manipulation.

Behavior Manipulation
Behavior manipulation includes the following factors:

1  Conditioning. The target members were conditioned to experience deep anxiety if they behaved against cult doctrine. During conditioning, they would often be given small rewards when they accomplished a given task, but strong physical and mental punishment would be administered whenever they failed at a task.

2  Self-perception. A member’s attitude to the group would become fixed when the member was given a role to play in the group (Bem 1972; Zimbardo 1975).

3  Cognitive dissonance. Conditions are quite rigorous because members have to work strenuously and are allowed neither personal time nor money, nor to associate with “outsiders.” It seems that they often experienced strong cognitive dissonance (Festinger 1957).

Information-Processing Manipulation
Information-processing manipulation factors include the following:

1  Gain-loss effect. Swings between positive and negative attitudes toward the cult became fixed as more positive than negative (Aronson and Linder 1965). Many members had negative attitudes toward cults prior to contact with their group.

2  Systemization of belief-system. In general, belief has a tenacious effect, even when experience shows it to be erroneous (Ross, Lepper, and Hubbard 1975). Members always associate each experience with group dogma; they are indoctrinated to interpret every life event in terms of the cult’s belief-system. 

3  Priming effect. This is the cognitive phenomenon whereby repeatedly rehearsed messages guide information processing in a specific direction (Srull and Wyer 1980). The members listen to the same lectures and music frequently and repeatedly, and they pray or chant many times every day.

4  Threatening messages. Members are inculcated with strong fears of personal calamity [e.g. illnesses such as cancer, accidents, the influence of evil spirits, being “restored after satan”], and so on.

Group-Processing Manipulation
Group-processing manipulation components include:

1  Selective exposure to information. Members avoid negative reports, but search for positive feedback once they make a commitment to the group (Festinger 1957). It should also be added that many group members continue to live in the locale in which they exited their society. Even so, new members are forbidden to have contact with out-of-group people, or access to external media.

2  Social identity. Members identify themselves with the group because the main goal or purpose of their activity is to gain personal prestige within the group (Turner, Hogg, Oakes, Reicher, and Wetherell 1987). Therefore, they look upon fellow members as elite, acting for the salvation of all people. Conversely, they look on external critics as either wicked persecutors or pitiful, ignorant fools. This “groupthink” makes it possible for the manipulators to provoke reckless group behavior among the members (Janis 1971; Wexler 1995).

Physiological-Stress Manipulation
It has been established that physiological stress factors reinforce constraint within the group; examples include the following:

1  urgent individual need to achieve group goals,
2  fear of sanction and punishment,
3  monotonous group life,
4  sublimation of sexual drive in fatiguing hard work,
5  sleep deprivation,
6  poor nutrition,
7  extended prayer and / or [study sessions].


Post-Cult Residual Psychological Distress
Over the past few decades, a considerable number of studies have been completed on the psychological problems former cult members experience after leaving the cult, as compared with studies of the mind-control process itself.

It is important to note that most former members continue to experience discontent, although its cause remains controversial (Aronoff, Lynn, and Malinoski 2000). A few studies on cult phenomena have been conducted so far in Japan, notably by Nishida (1995a, 1998), and by Nishida and Kuroda (2003, 2004), who investigated ex-cultists’ post-exit problems, based mainly on questionnaires administered to former members of two different cults.

In a series of studies, Nishida and Kuroda (2003) surveyed 157 former members of the Unification Church and Aum Shinrikyō. Using factor analysis, the studies posited eleven factors that contribute to ex-members’ psychological problems. These factors can be classified into three main groups: (1) emotional distress, (2) mental distress, and (3) interpersonal distress. The eleven factors are (1) tendencies to depression and anxiety, (2) loss of self-esteem, (3) remorse and regret, (4) difficulty in maintaining social relations and friendships, (5) difficulty in family relationships, (6) floating or flashback to cultic thinking and feeling, (7) fear of sexual contact, (8) emotional instability, (9) hypochondria, (10) secrecy of cult life, and (11) anger toward the cult. These findings seem to have a high correlation with previous American studies.

Moreover, Nishida and Kuroda (2004) deduced from their analysis of variance of the 157 former members surveyed that depression and anxiety, hypochondria, and secrecy of cult involvement decreased progressively, with the help of counseling, after members left the cult. However, loss of self-esteem and anger toward the cult increased as a result of counseling.

Furthermore, Nishida (1998) found clear gender differences in the post-exit recovery process. Although female ex-cultists’ distress levels were higher than those of the males immediately after they left the cults, the women experienced full recovery more quickly than the men. The study also found that counseling by non-professionals works effectively with certain types of distress, such as anxiety and helplessness, but not for others, such as regret and self-reproof.


Conclusion
It can be concluded from Japanese studies on destructive cults that the psychological manipulation known as cult mind control is different from brainwashing or coercive persuasion. Based on my empirical studies, conducted from a social psychology point of view, I concluded that many sets of social influence are systematically applied to new recruits during the indoctrination process, influences that facilitate ongoing control of cult members. My findings agree with certain American studies, such as those conducted by Zimbardo and Anderson (1993), Singer and Lalich (1995), and Hassan (1988, 2000). The manipulation is powerful enough to make a vulnerable recruit believe that the only proper action is to obey the organization’s leaders, in order to secure humanity’s salvation, even though the requisite deed may breach social norms. Furthermore, it should be pointed out that dedicated cult veterans are subject to profound distress over the extended period of their cult involvement.


This chapter is a reprint of an article originally published in Cultic Studies Review, 2005, Volume 4, Number 3, pages 215-232.


Kimiaki Nishida, Ph.D., a social psychologist in Japan, is Associate Professor at Rissho University 立正大学 in Tokyo and a Director of the Japan Cult Recovery Council. He is a leading Japanese cultic studies scholar and the editor of the Japanese Journal of Social Psychology. His studies on psychological manipulation by cults were awarded prizes by several academic societies in Japan, and he has been called on by several courts to explain “cult mind control.”


統一協会の伝道とマインド・コントロール (Proselytizing by the Unification Church and Mind Control)



6. Towards a Demystified and Disinterested Scientific Theory of Brainwashing

by Benjamin Zablocki

from Misunderstanding Cults: Searching for Objectivity in a Controversial Field
Edited by Benjamin Zablocki and Thomas Robbins

Please note that, for the sake of brevity, more than half of the following important book chapter has been cut. The whole chapter is available on the internet.

Nobody likes to lose a customer, but religions get more touchy than most when faced with the risk of losing devotees they have come to define as their own. Historically, many religions have gone to great lengths to prevent apostasy, believing virtually any means justified to prevent wavering parishioners from defecting and thus losing hope of eternal salvation. In recent centuries, religion in our society has evolved from a system of territorially based near-monopolies into a vigorous and highly competitive faith marketplace in which many churches, denominations, sects, and cults vie with one another for the allegiance of ‘customers’ who are free to pick and choose among competing faiths. Under such circumstances, we should expect to find that some of the more tight-knit and fanatical religions in this rough-and-tumble marketplace will have developed sophisticated persuasive techniques for holding on to their customers. Some of the most extreme of these techniques are known in the literature by the controversial term ‘brainwashing.’ This chapter is devoted to a search for a scientific definition of brainwashing and an examination of the evidence for the existence of brainwashing in cults. I believe that research on this neglected subject is important for a fuller understanding of religious market dynamics.1 And, ultimately, research on this subject may yield a wider dividend as well, assisting us in our quest for a fuller understanding of mass charismatic movements such as Fascism, Nazism, Stalinism, and Maoism.

Do We Need to Know Whether Cults Engage in Brainwashing?

The question of why people obey the sometimes bizarrely insane commands of charismatic leaders, even unto death, is one of the big unsolved mysteries of history and the social sciences. If there are deliberate techniques that charismatic leaders (and charismatically led organizations) use to induce high levels of uncritical loyalty and obedience in their followers, we should try to understand what these techniques are and under what circumstances and how well they work.

This chapter is about nothing other than the process of inducing ideological obedience in charismatic groups. Many people call this process brainwashing, but the label is unimportant. What is important is that those of us who want to understand cults develop models that recognize the importance that some cults give to strenuous techniques of socialization designed to induce uncritical obedience to ideological imperatives regardless of the cost to the individual. …

But given the fact that only a small proportion of the human population ever join cults, why should we care? The answer is that the sociological importance of cults extends far beyond their numerical significance. Many cults are harmless and fully deserving of protection of their religious and civil liberties. However, events of recent years have shown that some cults are capable of producing far more social harm than one might expect from the minuscule number of their adherents. The U.S. State Department’s annual report on terrorism for the year 2000 concludes that ‘while Americans were once threatened primarily by terrorism sponsored by states, today they face greater threats from loose networks of groups and individuals motivated more by religion or ideology than by politics’ (Miller 2000:1).

In his recent study of a Japanese apocalyptic cult [Aum Shinrikyō, of 1995 Tokyo sarin gas attack infamy], Robert Jay Lifton (1999: 343) has emphasized this point in the following terms:

“Consider Asahara’s experience with ultimate weapons … With a mad guru and a few hundred close followers, it is much easier to see how the very engagement with omnicidal weapons, once started upon, takes on a psychological momentum likely to lead either to self-implosion or to world explosion … Asahara and Aum have changed the world, and not for the better. A threshold has been crossed. Thanks to this guru, Aum stepped over a line that few had even known was there. Its members can claim the distinction of being the first group in history to combine ultimate fanaticism with ultimate weapons in a project to destroy the world. Fortunately, they were not up to the immodest task they assigned themselves. But whatever their bungling, they did cross that line, and the world will never quite be the same because, like it or not, they took the rest of us with them.” …

Brainwashing is the most commonly used word for the process whereby a charismatic group systematically induces high levels of ideological obedience. It would be naively reductionistic to try to explain cultic obedience entirely in terms of brainwashing. Other factors, such as simple conformity and ritual, induce cultic obedience as well. But it would be an equally serious specification error to leave deliberate cultic manipulation of personal convictions out of any model linking charismatic authority to ideological obedience.

However, the current climate of opinion, especially within the sociology of new religious movements, is not receptive to rational discussion of the concept of brainwashing, and still less to research in this area. Brainwashing has for too long been a mystified concept, and one that has been the subject of tendentious writing (thinly disguised as theory testing) by both its friends and enemies. My aim in this chapter is to rescue for social science a concept of brainwashing freed from both mystification and tendentiousness. I believe it is important and long overdue to restore some detachment and objectivity to this field of study.

The goal of achieving demystification will require some analysis of the concept’s highly freighted cultural connotations, with particular regard to how the very word brainwash became a shibboleth in the cult wars. It is easy to understand how frightening it may be to imagine that there exists some force that can influence one down to the core level of basic beliefs, values, and worldview. Movies like The Manchurian Candidate have established in the popular imagination the idea that there exists some mysterious technique, known only to a few, that confers such power. Actually, as we will see, the real process of brainwashing involves only well-understood processes of social influence orchestrated in a particularly intense way. It still is, and should be, frightening in its intensity and capacity for extreme mischief, but there is no excuse for refusing to study something simply because it is frightening.

The goal of establishing scientific disinterest will require the repositioning of the concept more fully in the domain of behavioral and social science rather than in its present domain, which is largely that of civil and criminal legal proceedings. …

My own thirty-six years of experience doing research on new religious movements has convinced me beyond any doubt that brainwashing is practised by some cults some of the time on some of their members with some degree of success. Even though the number of times I have used the vague term some in the previous sentence gives testimony to the fact that there remain many still-unanswered questions about this phenomenon, I do not personally have any doubt about brainwashing’s existence. But I have also observed many cults that do not practise brainwashing, and I have never observed a cult in which brainwashing could reasonably be described as the only force holding the group together. My research (Zablocki 1971; 1991; 1996; Zablocki and Aidala 1991) has been ethnographic, comparative, and longitudinal. I have lived among these people and watched the brainwashing process with my own eyes. I have also interviewed people who participated in the process (both as perpetrators and subjects). I have interviewed many of these respondents not just one time but repeatedly over a course of many years. My selection of both cults and individuals to interview has been determined by scientific sampling methods (Zablocki 1980: app. A), not guided by convenience nor dictated by the conclusions I hoped to find. Indeed, I have never had an axe to grind in this field of inquiry. I didn’t begin to investigate cults in the hope of finding brainwashing. I was surprised when I first discovered it. I insist on attempting to demonstrate its existence not because I am either for or against cults but only because it seems to me to be an incontrovertible, empirical fact.

Although my own ethnographic experience leads me to believe that there is overwhelming evidence that brainwashing is practised in some cults, my goal in this chapter is not to ‘prove’ that brainwashing exists, but simply to rescue it from the world of bogus ideas to which it has been banished unfairly, and to reinstate it as a legitimate topic of social science inquiry. My attempt to do so in this chapter will involve three steps. First, I will analyse the cultural misunderstandings that have made brainwashing a bone of contention rather than a topic of inquiry. Second, I will reconstruct the concept in a scientifically useful and empirically testable form within the framework of social influence theory. Third, I will summarize the current state of evidence (which seems to me to be quite compelling) that some cults do in fact engage in brainwashing with some degree of success.

Cultural Contention over the Concept of Brainwashing

That Word ‘Brainwashing’

The word brainwashing is, in itself, controversial and arouses hostile feelings. Since there is no scientific advantage in using one word rather than another for any concept, it may be reasonable in the future to hunt around for another word that is less polemical. We need a universally recognized term for a concept that stands for a form of influence manifested in a deliberately and systematically applied traumatizing and obedience-producing process of ideological resocialization.

Currently, brainwashing is the generally accepted term for this process, but I see no objection to finding another to take its place. There are in fact other terms, historically, that have been used instead, like ‘thought reform’ and ‘coercive persuasion.’ Ironically, it has been those scholars who complain most about ‘the B-word’ who have also been the most insistent that none of the alternatives is any better. As long as others in the field insist on treating all possible substitute constructions as nothing more than gussied-up synonyms for a mystified concept of brainwashing (see, for example, Introvigne 1998: 2), there is no point as yet in trying to introduce a more congenial term. …

The Reciprocal Moral Panic

Study of brainwashing has been hampered by partisanship and tendentious writing on both sides of the conflict. In one camp, there are scholars who very badly don’t want there to be such a thing as brainwashing. Its nonexistence, they believe, will help assure religious liberty, which can only be procured by defending the liberty of the most unpopular religions. If only the nonexistence of brainwashing can be proved, the public will have to face up to the hard truth that some citizens choose to follow spiritual paths that may lead them in radical directions. This camp has exerted its influence within academia. But, instead of using its academic skills to refute the brainwashing conjecture, it has preferred to attack a caricature of brainwashing supplied by anticult groups for litigational rather than scientific purposes.

In the other camp, we find scholars who equally badly do want there to be such a thing as brainwashing. Its existence, they believe, will give them a rationale for opposition to groups they consider dangerous. A typical example of their reasoning can be found in the argument put forth by Margaret Singer that ‘Despite the myth that normal people don’t get sucked into cults, it has become clear over the years that everyone is susceptible to the lure of these master manipulators’ (Singer 1995: 17). Using a form of backward reasoning known as the ecological fallacy, she argues from the known fact that people of all ages, social classes, and ethnic backgrounds can be found in cults to the dubious conclusion that everyone must be susceptible. These scholars must also share some of the blame for tendentious scholarship. Lacking positions of leadership in academia, scholars on this side of the dispute have used their expertise to influence the mass media, and they have been successful because sensational allegations of mystical manipulative influence make good journalistic copy. …

… the fact is that I am not presenting some new revised theory of brainwashing but simply a restatement of Robert Lifton’s (1989, 1999) careful and rigorous theory in sociological terms.

There are, I believe, six issues standing in the way of our ability to transcend this reciprocal moral panic. Let us look closely at each of these issues with an eye to recognizing that both sides in this conflict may have distorted the scientifically grounded theories of the foundational theorists – Lifton (1989), Sargant (1957), and Schein (1961) – as they apply to cults.

1. The Influence Continuum [omitted]

2. Unidirectional versus Bi-directional Influence [omitted]

3. Condemnatory Label versus Contributory Factor [omitted]

4. Obtaining Members versus Retaining Members [omitted]

The fourth issue has to do with a confusion over whether brainwashing explains how cults obtain members or how they retain them. Some cults have made use of manipulative practices like love-bombing and sleep deprivation (Galanti 1993), with some degree of success, in order to obtain new members. A discussion of these manipulative practices for obtaining members is beyond the scope of this chapter. Some of these practices superficially resemble techniques used in the earliest phase of brainwashing. But these practices, themselves, are not brainwashing. This point must be emphasized because a false attribution of brainwashing to newly obtained cult recruits, rather than to those who have already made a substantial commitment to the cult, figures prominently in the ridicule of the concept by NRM scholars. …

Why should the foundational theorists, concerned as they were with coercive state-run institutions like prisons, ‘re-education centres,’ and prisoner-of-war camps have any interest in explaining how participants were obtained? Participants were obtained at the point of a gun. The motive of these state enterprises was to retain the loyalties of these participants after intensive resocialization ceased. As George Orwell showed so well in his novel 1984, the only justification for the costly indoctrination process undergone by Winston Smith was not that he love Big Brother while Smith was in prison, but that Big Brother be able to retain that love after Smith was deployed back into society. Nevertheless, both ‘cult apologists’ and ‘cult bashers’ have found it more convenient to focus on the obtaining function.

If one asks why a cult would be motivated to invest resources in brainwashing, it should be clear that this can not be to obtain recruits, since these are a dime a dozen in the first place, and, as Barker (1984) has shown, they don’t tend to stick around long enough to repay the investment. Rather, it can only be to retain loyalty, and therefore decrease surveillance costs for valued members who are already committed. In small groups bound together only by normative solidarity, as Hechter (1987) has shown, the cost of surveillance of the individual by the group is one of the chief obstacles to success. Minimizing these surveillance costs is often the most important organizational problem such groups have to solve in order to survive and prosper. Brainwashing makes sense for a collectivity only to the extent that the resources saved through decreased surveillance costs exceed the resources invested in the brainwashing process. For this reason, only high-demand charismatic groups with totalistic social structures are ever in a position to benefit from brainwashing.6
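A compact way to restate this cost-benefit condition (the notation below is an illustrative gloss on Hechter’s argument as Zablocki uses it, not the authors’ own formula) is:

\[
\Delta S \;=\; S_{\text{unbrainwashed}} - S_{\text{brainwashed}} \;>\; C_{\text{bw}}
\]

where \(S_{\text{unbrainwashed}}\) is the ongoing cost of keeping an ordinary committed member under surveillance, \(S_{\text{brainwashed}}\) is the much lower surveillance cost for a member who has been through the process, and \(C_{\text{bw}}\) is the cost of putting that member through the brainwashing process. Only a high-demand, totalistic group can plausibly make the left-hand side large enough to exceed the right.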

This mistaken ascription of brainwashing to the obtaining function rather than the retaining function is directly responsible for two of the major arguments used by the ‘cult apologists’ in their attempts to debunk brainwashing. One has to do with a misunderstanding of the role of force and the other has to do with the mistaken belief that brainwashing can be studied with data on cult membership turnover. …

… Lifton and Schein have both gone on public record as explicitly denying that there is anything about their theories that requires the use of physical force or threat of force. Lifton has specifically argued (‘psychological manipulation is the heart of the matter, with or without the use of physical force’ [1995: xi]) that his theories are very much applicable to cults.7 The difference between the state-run institutions that Lifton and Schein studied in the 1950s and 1960s and the cults that Lifton and others study today is in the obtaining function not in the retaining function. In the Chinese and Korean situations, force was used for obtaining and brainwashing was used for retaining. In cults, charismatic appeal is used for obtaining and brainwashing is used, in some instances, for retaining.

A related misconception has to do with what conclusions to draw from the very high rate of turnover among new and prospective recruits to cults. Bainbridge (1997), Barker (1989), Dawson (1998), Introvigne (forthcoming), and Richardson (1993) have correctly pointed out that in totalistic religious organizations very few prospective members go on to become long-term members. They argue that this proves that the resocialization process cannot be irresistible and therefore it cannot be brainwashing. But nothing in the brainwashing model predicts that it will be attempted with all members, let alone successfully attempted. In fact, the efficiency of brainwashing, operationalized as the expected yield of deployable agents8 per 100 members, is an unknown (but discoverable) parameter of any particular cultic system and may often be quite low. For the system to be able to perpetuate itself (Hechter 1987), the yield need only produce enough value for the system to compensate it for the resources required to maintain the brainwashing process.
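Expressed schematically (again our own gloss, not Zablocki’s notation), the perpetuation condition reads:

\[
E \;=\; \frac{\mathbb{E}[\text{deployable agents}]}{100\ \text{members}},
\qquad
E \times V_{\text{agent}} \;\ge\; C_{\text{maintain}}
\]

where \(E\) is the unknown efficiency parameter, \(V_{\text{agent}}\) is the value a deployable agent returns to the group, and \(C_{\text{maintain}}\) is the cost of maintaining the brainwashing process for those 100 members. \(E\) can be quite small and still satisfy the inequality if \(V_{\text{agent}}\) is high enough.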

Moreover, the high turnover rate in cults is more complex than it may seem. While it is true that the membership turnover is very high among recruits and new members, this changes after two or three years of membership when cultic commitment mechanisms begin to kick in. This transition from high to low membership turnover is known as the Bainbridge Shift, after the sociologist who first discovered it (Bainbridge 1997: 141-3). After about three years of membership, the annual rate of turnover sharply declines and begins to fit a commitment model rather than a random model.9
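One hedged way to picture the Bainbridge Shift is as a change of regime in the annual exit (hazard) rate at roughly the three-year mark; the notation is ours, not Bainbridge’s:

\[
h(t) \;\approx\;
\begin{cases}
h_{\text{recruit}}, & t < t^{*} \\
h_{\text{committed}}(t), & t \ge t^{*}
\end{cases}
\qquad h_{\text{committed}} \ll h_{\text{recruit}}, \quad t^{*} \approx 3\ \text{years}
\]

where \(h_{\text{recruit}}\) is the high, roughly constant exit rate among recruits and new members (the “random model”) and \(h_{\text{committed}}(t)\) is the much lower rate that obtains once commitment mechanisms kick in.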

Membership turnover data is not the right sort of data to tell us whether a particular cult practises brainwashing. The recruitment strategy whereby many are called but few are chosen is a popular one among cults. In several groups in which I have observed the brainwashing process, there was very high turnover among initial recruits. Brainwashing is too expensive to waste on raw recruits. Since brainwashing is a costly process, it generally will not pay for a group to even attempt to brainwash one of its members until that member has already demonstrated some degree of staying power on her own.10

5. Psychological Traces

The fifth issue has to do with the question of whether brainwashing leaves any long-lasting measurable psychological traces in those who have experienced it. …

6. Separating the Investigative Steps

The final issue is a procedural one. There are four sequential investigative steps required to resolve controversies like the one we have been discussing. These steps are concerned with attempt, existence, incidence, and consequence. A great deal of confusion comes from nothing more than a failure to recognize that these four steps need to be kept analytically distinct from one another. …

Brainwashing as a Scientific Concept

What I am presenting here is not a ‘new’ theory of brainwashing but a conceptual model of the foundational theory developed in the mid-twentieth century by Lifton, Schein, and Sargant as it applies to charismatic collectivities. Because its scientific stature has been so frequently questioned, I will err on the side of formality by presenting a structured exposition of brainwashing theory in terms of eight definitions and twelve hypotheses. Each definition includes an operationalized form by which the trait may be observed. If either of the first two hypotheses is disconfirmed, we must conclude that brainwashing is not being attempted in the cult under investigation. If any of the twelve hypotheses is disconfirmed, we must conclude that brainwashing is not successful in meeting its goals within that cult.

I do not pretend that the model outlined here is easy to test empirically, particularly for those researchers who either cannot or will not spend time immersing themselves in the daily lives of cults, or for those who are not willing, alternatively, to use as data the detailed retrospective accounts of ex-members. However, it should be clear that the model being proposed here stays grounded in what is empirically testable and does not involve mystical notions such as loss of free will or information disease (Conway and Siegelman 1978) that have characterized many of the extreme ‘anticult models.’

Nor do I pretend that this model represents the final and definitive treatment of this subject. Charismatic influence is still a poorly understood subject on which much additional research is needed. With few exceptions, sociology has treated it as if it were what engineers call a ‘black box,’ with charismatic inputs coming in one end and obedience outputs going out the other. What we have here is a theory that assists in the process of opening this black box to see what is inside. It is an inductive theory, formed largely from the empirical generalizations of ethnographers and interviewers. The model itself presents an ideal-type image of brainwashing that does not attempt to convey the great variation among specific obedience-inducing processes that occur across the broad range of existing cults. Much additional refinement in both depth and breadth will certainly be needed.

Definitions

D1. Charisma is defined, using the classical Weberian formula, as a condition of ‘devotion to the specific and exceptional sanctity, heroism, or exemplary character of an individual person, of the normative patterns or order revealed or ordained by him’ (Weber 1947: 328). Being defined this way, as a condition of devotion, leads us to recognize that charisma is not to be understood simply in terms of the characteristics of the leader, as it has come to be in popular usage, but requires an understanding of the relationship between leader and followers. In other words, charisma is a relational variable. It is defined operationally as a network of relationships in which authority is justified (for both superordinates and subordinates) in terms of the special characteristics discussed above.

D2. Ideological Totalism is a sociocultural system that places high valuation on total control over all aspects of the outer and inner lives of participants for the purpose of achieving the goals of an ideology defined as all important. Individual rights either do not exist under ideological totalism or they are clearly subordinated to the needs of the collectivity whenever the two come into conflict. Ideological totalism has been operationalized in terms of eight observable characteristics: milieu control, mystical manipulation, the demand for purity, the cult of confession, ‘sacred science,’ loading the language, doctrine over person, and the dispensing of existence (Lifton 1989: chap. 22).[11]

D3. Surveillance is defined as keeping watch over a person’s behavior, and, perhaps, attitudes. As Hechter (1987) has shown, the need for surveillance is the greatest obstacle to goal achievement among ideological collectivities organized around the production of public goods. Surveillance is not only costly, it is also impractical for many activities in which agents of the collectivity may have to travel and act autonomously and at a distance. It follows from this that all collectivities pursuing public goals will be motivated to find ways to decrease the need for surveillance. Resources used for surveillance are wasted in the sense that they are unavailable for the achievement of collective goals.

D4. A deployable agent is one who is uncritically obedient to directives perceived as charismatically legitimate (Selznick 1960). A deployable agent can be relied on to continue to carry out the wishes of the collectivity regardless of his own hedonic interests and in the absence of any external controls. Deployability can be operationalized as the likelihood that the individual will continue to comply with hitherto ego-dystonic demands of the collectivity (e.g., mending, ironing, mowing the lawn, smuggling, rape, child abuse, murder) when not under surveillance.

D5. Brainwashing is an observable set of transactions between a charismatically structured collectivity and an isolated agent of the collectivity, with the goal of transforming the agent into a deployable agent. Brainwashing is thus a process of ideological resocialization carried out within a structure of charismatic authority.

The brainwashing process may be operationalized as a sequence of well-defined and potentially observable phases. These hypothesized phases are (1) identity stripping, (2) identification, and (3) symbolic death/rebirth. The operational definition of brainwashing refers to the specific activities attempted, whether or not they are successful, as they are either observed directly by the ethnographer or reported in official or unofficial accounts by members or ex-members. Although the exact order of phases and specific steps within phases may vary from group to group, we should always expect to see the following features, or their functional equivalents, in any brainwashing system: (1) the constant fluctuation between assault and leniency; and (2) the seemingly endless process of confession, re-education, and refinement of confession.

D6. Hyper-credulity is defined as a disposition to accept uncritically all charismatically ordained beliefs. All lovers of literature and poetry are familiar with ‘that willing suspension of disbelief for the moment, which constitutes poetic faith’ (Coleridge 1970: 147). Hyper-credulity occurs when this state of mind, which in most of us is occasional and transitory, is transformed into a stable disposition. Hyper-credulity falls between hyper-suggestibility on the one hand and stable conversion of belief on the other.[12] Its operational hallmark is plasticity in the assumption of deeply held convictions at the behest of an external authority. This is an other-directed form of what Robert Lifton (1968) has called the protean identity state.

D7. Relational Enmeshment is a state of being in which self-esteem depends upon belonging to a particular collectivity (Bion 1959; Bowen 1972; Sirkin and Wynne 1990). It may be operationalized as immersion in a relational network with the following characteristics: exclusivity (high ratio of in-group to out-group bonds), interchangeability (low level of differentiation in affective ties between one alter and another), and dependency (reluctance to sever or weaken ties for any reason). In a developmental context, something similar to this has been referred to by Bowlby (1969) as anxious attachment.
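The exclusivity component of D7's operationalization implies a simple measurement procedure: count a member's affective ties and compare in-group to out-group bonds. The following minimal Python sketch is added purely for illustration; the function name and tie counts are hypothetical and do not come from the source.

# Minimal sketch (illustrative, not the author's instrument): scoring the
# "exclusivity" component of relational enmeshment as the ratio of in-group
# to out-group affective ties. The tie counts below are hypothetical.

def exclusivity_ratio(in_group_ties: int, out_group_ties: int) -> float:
    """Higher values indicate a relational network dominated by in-group bonds."""
    if out_group_ties == 0:
        return float("inf")  # every remaining affective tie is inside the group
    return in_group_ties / out_group_ties

print(exclusivity_ratio(in_group_ties=12, out_group_ties=6))   # 2.0 (hypothetical pre-cult network)
print(exclusivity_ratio(in_group_ties=14, out_group_ties=1))   # 14.0 (hypothetical enmeshed network)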

D8. Exit Costs are the subjective costs experienced by an individual who is contemplating leaving a collectivity. Obviously, the higher the perceived exit costs, the greater will be the reluctance to leave. Exit costs may be operationalized as the magnitude of the bribe necessary to overcome them. A person who is willing to leave if we pay him $1,000 experiences lower exit costs than one who is not willing to leave for any payment less than $1,000,000. With regard to cults, the exit costs are most often spiritual and emotional rather than material, which makes measurement in this way more difficult but not impossible.

Figure 1: The Effect of Charismatic Influence on Uncritical Obedience

Hypotheses

Not all charismatic organizations engage in brainwashing. We therefore need a set of hypotheses that will allow us to test empirically whether any particular charismatic system attempts to practise brainwashing and with what effect. The brainwashing model asserts twelve hypotheses concerning the role of brainwashing in the production of uncritical obedience. These hypotheses are all empirically testable. A schematic diagram of the model I propose may be found in figure 1.

This model begins with an assumption that charismatic leaders are capable of creating organizations that are easy and attractive to enter (even though they may later turn out to be difficult and painful to leave). There are no hypotheses, therefore, to account for how charismatic cults obtain members. It is assumed that an abundant pool of potential recruits to such groups is always available. The model assumes that charismatic leaders, using nothing more than their own intrinsic attractiveness and persuasiveness, are initially able to gather around them a corps of disciples sufficient for the creation of an attractive social movement. Many ethnographies (Lofland 1966; Lucas 1995) have shown how easy it is for such charismatic movement organizations to attract new members from the general pool of anomic ‘seekers’ that can always be found within the population of an urbanized mobile society.

The model does attempt to account for how some percentage of these ordinary members are turned into deployable agents. The initial attractiveness of the group, its vision of the future, and/or its capacity to bestow seemingly limitless amounts of love and esteem on the new member are sufficient inducements in some cases to motivate a new member to voluntarily undergo this difficult and painful process of resocialization.

H1. Ideological totalism is a necessary but not sufficient condition for the brainwashing process. Brainwashing will be attempted only in groups that are structured totalistically. However, not all ideologically totalist groups will attempt to brainwash their members. It should be remembered that brainwashing is merely a mechanism for producing deployable agents. Some cults may not want deployable agents or have other ways of producing them. Others may want them but feel uncomfortable about using brainwashing methods to obtain them, or they may not have discovered the existence of brainwashing methods.

H2. The exact nature of this resocialization process will differ from group to group, but, in general, will be similar to the resocialization process that Robert Lifton (1989) and Edgar Schein (1961) observed in Communist re-education centres in the 1950s. For whatever reasons, these methods seem to come fairly intuitively to charismatic leaders and their staffs. Although the specific steps and their exact ordering differ from group to group, their common elements involve a stripping away of the vestiges of an old identity, the requirement that repeated confessions be made either orally or in writing, and a somewhat random and ultimately debilitating alternation of the giving and the withholding of ‘unconditional’ love and approval. H2 further states that the maintenance of this program involves the expenditure of a measurable quantity of the collectivity’s resources. This quantity is known as C, where C equals the cost of the program and should be measurable at least at an ordinal level.

This resocialization process has baffled many observers, in my opinion because it proceeds simultaneously along two distinct but parallel tracks, one involving cognitive functioning and the other involving emotional networking. These two tracks lead to the attainment of states of hyper-credulity and relational enmeshment, respectively. The group member learns to accept with suspended critical judgment the often shifting beliefs espoused by the charismatic leader. At the same time, the group member becomes strongly attached to and emotionally dependent upon the charismatic leader and (often especially) the other group members, and cannot bear to be shunned by them.

H3. Those who go through the process will be more likely than those who do not to reach a state of hyper-credulity. This involves the shedding of old convictions and the assumption of a zealous loyalty to the charismatically ordained beliefs of the moment, uncritically seized upon, so that all such beliefs become not mere ‘beliefs’ but deeply held convictions.

Under normal circumstances, it is not easy to get people to disown their core convictions. Convictions, once developed, are generally treated not as hypotheses to test empirically but as possessions to value and cherish. There are often substantial subjective costs to the individual in giving them up. Abelson (1986: 230) has provided convincing linguistic evidence that most people treat convictions more as valued possessions than as ways of testing reality. Cognitive dissonance theory predicts with accuracy that when subject to frontal attack, attachment to convictions tends to harden (Festinger, Riecken et al. 1956; O’Leary 1994). Therefore, a frontal attack on convictions, without first undermining the self-image foundation of these convictions, is doomed to failure. An indirect approach through brainwashing is often more effective.

The unconventional beliefs that individuals adopt when they join cults will come to be discontinuous with the beliefs they held in precult life. What appears to happen is a transformation from individually held to collectively held convictions. This is a well-known phenomenon that Janis (1982) has called groupthink. Under circumstances of groupthink, the specific content of one’s convictions becomes much less important than achieving the goal that all in the group hold the same convictions. In elaboration likelihood terms we can say that the subject undergoes a profound shift from message processing to source processing in the course of resocialization (Petty and Wegener 1998).

When the state of hyper-credulity is achieved, it leaves the individual strongly committed to the charismatic belief of the moment but with little or no critical inclination to resist charismatically approved new or contradictory beliefs in the future and little motivation to attempt to form accurate independent judgments of the consequences of assuming new beliefs. The cognitive track of the resocialization process begins by stripping away the old convictions and associating them with guilt, evil, or befuddlement. Next, there is a traumatic exhaustion of the habit of subjecting right-brain convictions to left-brain rational scrutiny. This goes along with an increase in what Snyder (1974) has called self-monitoring, implying a shift from central route to peripheral route processing of information in which the source rather than the content of the message becomes all important.

H4. As an individual goes through the brainwashing process, there will be an increase in relational enmeshment with measurable increases occurring at the completion of each of the three stages. The purging of convictions is a painful process and it is reasonable to ask why anybody would go through it voluntarily. The payoff is the opportunity to feel more connected with the charismatic relational network. These people have also been through it, and only they really understand what you are going through. So cognitive purging leads one to seek relational comfort, and this comfort becomes enmeshing. The credulity process and the enmeshing process depend on each other.

The next three hypotheses are concerned with the fact that each of the three phases of brainwashing achieves plateaus in both of these processes. The stripping phase creates the vulnerability to this sort of transformation. The identification phase creates realignment, and the rebirth phase breaks down the barrier between the two so that convictions can be emotionally energized and held with zeal, while emotional attachments can be sacralized in terms of the charismatic ideology. The full brainwashing model actually provides far more detailed hypotheses concerning the various steps within each phase of the process. Space constraints make it impossible to discuss these here. An adequate technical discussion of the manipulation of language in brainwashing, for example, would require a chapter at least the length of this one. Figure 2 provides a sketch of the steps within each phase. Readers desiring more information about these steps are referred to Lifton (1989: chap. 5).

H5. The stripping phase. The cognitive goal of the stripping phase is to destroy prior convictions and prior relationships of belonging. The emotional goal of the stripping phase is to create the need for attachments. Overall, at the completion of the stripping phase, the situation is such that the individual is hungry for convictions and attachments and dependent upon the collectivity to supply them. This sort of credulity and attachment behavior is widespread among prisoners and hospital patients (Goffman 1961).

H6. The identification phase. The cognitive goal of the identification phase is to establish imitative search for conviction and bring about the erosion of the habit of incredulity. The emotional goal of the identification phase is to instill the habit of acting out through attachment. Overall, at the completion of the identification phase the individual has begun the practice of relying on the collectivity for beliefs and for a cyclic emotional pattern of arousal and comfort. But, at this point this reliance is just one highly valued form of existence. It is not yet viewed as an existential necessity.

H7. The symbolic death and rebirth phase. In the death and rebirth phase, the cognitive and emotional tracks come together and mutually support each other. This often gives the individual a sense of having emerged from a tunnel and an experience of spiritual rebirth.[13] The cognitive goal of this phase is to establish a sense of ownership of (and pride of ownership in) the new convictions. The emotional goal is to make a full commitment to the new self that is no longer directly dependent upon hope of attachment or fear of separation. Overall, at the completion of the rebirth phase we may say that the person has become a fully deployable agent of the charismatic leader. The brainwashing process is complete.

H8 states that the brainwashing process results in a state of subjectively elevated exit costs. These exit costs cannot, of course, be observed directly. But they can be inferred from the behavioral state of panic or terror that arises in the individual at the possibility of having his or her ties to the group discontinued. The cognitive and emotional states produced by the brainwashing process together bring about a situation in which the perceived exit costs for the individual increase sharply. This closes the trap for all but the most highly motivated individuals, and induces in many a state of uncritical obedience. As soon as exit from the group (or even from its good graces) ceases to be a subjectively palatable option, it makes sense for the individual to comply with almost anything the group demands – even to the point of suicide in some instances. Borrowing from Sartre’s insightful play of that name, I refer to this situation as the ‘no exit’ syndrome. When demands for compliance are particularly harsh, the hyper-credulity aspect of the process sweetens the pill somewhat by allowing the individual to accept uncritically the justifications offered by the charismatic leader and/or charismatic organization for making these demands, however farfetched these justifications might appear to an outside observer.

H9 states that the brainwashing process results in a state of ideological obedience in which the individual has a strong tendency to comply with any behavioral demands made by the collectivity, especially if motivated by the carrot of approval and the stick of threatened expulsion, no matter how life-threatening these demands may be and no matter how repugnant such demands might have been to the individual in his or her pre-brainwashed state.

H10 states that the brainwashing process results in increased deployability. Deployability extends the range of ideological obedience in the temporal dimension. It states that the response continues after the stimulus is removed. This hypothesis will be disconfirmed in any cult within which members are uncritically obedient only while they are being brainwashed but not thereafter. The effect need not be permanent, but it does need to result in some measurable increase in deployability over time.

H11 states that the ability of the collectivity to rely on obedience without surveillance will result in a measurable decrease in surveillance. Since surveillance involves costs, this decrease will lead to a quantity S, where S equals the savings to the collectivity due to diminished surveillance needs and should be measurable at least to an ordinal level.

H12 states that S will be greater than C. In other words, the savings to the collectivity due to decreased surveillance needs is greater than the cost of maintaining the brainwashing program. Only where S is greater than C does it make sense to maintain a brainwashing program. Cults with initially high surveillance costs, and therefore high potential savings due to decreased surveillance needs [S], will tend to be more likely to brainwash, as will cults structured so that the cost of maintaining the brainwashing system [C] is relatively low.
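Read as a decision rule, H11 and H12 reduce to a break-even comparison between the savings S and the cost C. The minimal Python sketch below makes that comparison explicit; the numbers are hypothetical ordinal scores chosen only for illustration and appear nowhere in the source.

# Minimal sketch (illustrative only): the cost-benefit condition asserted by H12.
# S = savings from reduced surveillance needs; C = cost of maintaining the
# brainwashing program. Both are assumed measured on a common ordinal scale.

def brainwashing_pays_off(surveillance_savings: float, program_cost: float) -> bool:
    """Return True only when S exceeds C, the condition under which H12 predicts
    a group will find it rational to maintain a brainwashing program."""
    return surveillance_savings > program_cost

print(brainwashing_pays_off(surveillance_savings=8, program_cost=5))  # True: high surveillance costs to be saved
print(brainwashing_pays_off(surveillance_savings=3, program_cost=5))  # False: surveillance is already cheap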

Characteristics of a Good Theory [omitted]

Evidence for Brainwashing in Cults [omitted]

Ethnographic Accounts [omitted]

Leader Accounts [omitted]

Ex-Member Accounts [omitted]

Incidence and Consequences [omitted]

Conclusions

We can conclude from all of the above that those who claim that cultic brainwashing does not exist and those who claim it is pandemic to cults are both wrong. Brainwashing is an administratively costly and not always effective procedure that some cults use on some of their members. A few cults rely heavily on brainwashing and put all their members through it. Other cults do not use the procedure at all. During periods of stressful confrontation, either with external enemies or among internal factions, or in attempts to cope with failed apocalyptic prophecies, it is not uncommon for brainwashing suddenly to come to play a central role in the cult’s attempts to achieve order and social control. At such times, risk of uncritically obedient violent aggression or mass suicide may be heightened.

Hopefully, it will be clear from this chapter that brainwashing has absolutely nothing to do with the overthrow of ‘free will’ or any other such mystical or nonscientific concept. People who have been brainwashed are ‘not free’ only in the sense that all of us, hemmed in on all sides as we are by social and cultural constraints, are not free. The kinds of social constraints involved in brainwashing are much more intense than those involved in socializing many of us to eat with knives and forks rather than with our hands. But the constraints involved differ only in magnitude and focus, not in kind. Any brainwashed cult member always retains the ability to leave the cult or defy the cult as long as he or she is willing to pay the mental and emotional price (which may be considerable) that the cult is able to exact for so doing.

As I finish this chapter, a number of European nations are debating the advisability of anti-brainwashing laws, some of which eventually may be used to inhibit freedom of religious expression. In light of this trend a number of colleagues have criticized me, not on the grounds that my facts are incorrect, but that my timing is unfortunate. One socked me with the following, particularly troubling, complaint: ‘Ben, if you had discovered evidence, in 1942, of a higher prevalence among Jews than among non-Jews of the Tay-Sachs genetic defect, would you have published your findings in a German biology journal?’ Ultimately, although I respect the sentiments behind my colleagues’ concerns, I must respectfully disagree with their fastidious caution. It never works to refuse to look at frightening facts. They only become larger, more frightening, and more mystically permeated when banished to one’s peripheral vision. A direct, honest acknowledgment of the limited but significant role that brainwashing plays in producing uncritical obedience in some cults will serve, in the long run, to lessen paranoid reactions to ‘the threat of the cults,’ rather than to increase them.

Notes [omitted]



7. Psyching Out the Cults’ Collective Mania

by Louis Jolyon West and Richard Delgado

Los Angeles Times  November 26, 1978

Louis Jolyon West is director of UCLA’s Neuropsychiatric Institute. Richard Delgado is a visiting professor of law at UCLA.

Just a week ago yesterday, the ambush of Rep. Leo J. Ryan and three newsmen at a jungle airstrip set off a terrible sequence of events that left many hundreds of people dead in the steamy rain forests of Guyana. The horrible social mechanism that ground into motion in the Peoples Temple camp that day seems inexplicable to many and has focused attention on the murky world of cults, both religious and nonreligious.

Historically, periods of unusual turbulence are often accompanied by the emergence of cults. Following the fall of Rome, the French Revolution and again during the Industrial Revolution, numerous cults appeared in Europe. The westward movement in America swept a myriad of religious cults toward California. In the years following the Gold Rush, at least 50 utopian cults were established here. Most were religious and lasted, on the average, about 20 years; the secular variety usually endured only half that long.

The present disturbances in American culture first welled up during the 1960s, with the expansion of an unpopular war in Southeast Asia, massive upheavals over civil rights and a profound crisis in values in response to unprecedented affluence, on the one hand, and potential thermonuclear holocaust, on the other. Our youth were caught up in three rebellions: red (the New Left), against political and economic monopolies; black, against racial injustice; and green (the counterculture), against materialism in all its manifestations, including individual and institutional struggles for power.

Drug abuse and violent predators took an awful toll among the counterculture’s hippies in the late 1960s. Many fled to form colonies, now generally called communes. Others turned to the apparent security of paternalistic religious and secular cults, which have been multiplying at an astonishing rate ever since.

Those communes that have endured—perhaps two or three thousand in North America—can generally be differentiated from cults in three respects:

— Cults are established by strong charismatic leaders of power hierarchies controlling resources, while communes tend to minimize organizational structure and to deflate or expel power seekers.

— Cults possess some revealed “word” in the form of a book, manifesto or doctrine, whereas communes vaguely invoke general commitments to peace, libertarian freedoms and distaste for the parent culture’s establishments.

— Cults create fortified boundaries confining their members in various ways and attacking those who would leave as defectors, deserters or traitors; they recruit new members with ruthless energy and raise enormous sums of money, and they tend to view the outside world with increasing hostility and distrust as the organization ossifies. In contrast, communes are like nodes in the far-flung network of the counterculture. Their boundaries are permeable membranes through which people come and go relatively unimpeded, either to continue their pilgrimages or to return to a society regarded by the communards with feelings ranging from indifference to amusement to pity. Most communes thus defined seem to pose relatively little threat to society. Many cults, on the other hand, are increasingly perceived as dangerous both to their own members and to others.

Recent estimates suggest that more than 2 million Americans, mostly aged 18 to 25, are in some way affiliated with cults and, by using the broadest of definitions, there may be as many as 2,500 cults in America today. If the total seems large, consider that L. Ron Hubbard’s rapidly expanding Church of Scientology claimed 5.5 million members worldwide in 1972; the Unification Church of Rev. Sun Myung Moon boasts of 30,000 members in the United States alone.

These enterprises may seem rich, respectable and secure compared to the Rev. Jim Jones’ tragic Peoples Temple, with its membership of only 2,000 to 3,000. However, the Church of Scientology, the Unification Church and other organizations such as Chuck Dederich’s Synanon, have all been under recent investigation by government agencies. Other large religious cults, such as the Divine Light Mission, the International Society for Krishna Consciousness and the Children of God are being carefully scrutinized by the public. For the public is alarmed by what it knows of some cults’ methods of recruitment, exploitation of members, restrictions on members’ freedom, retaliation against defecting members, struggles with members’ families engaged in rescue operations (including so-called “deprogramming”), dubious fiscal practices and the like. Lately, death threats against investigative reporters, leaked internal memoranda justifying violence, the discovery of weapons caches, such incidents as the rattlesnake attack against the Los Angeles attorney Paul Morantz last month, the violent outburst of the Hanafi Muslims in Washington, D.C., last year and now the gruesome events in Guyana have served to increase the public’s concern.

[The 1977 Hanafi Siege occurred on March 9-11, 1977 when three buildings in Washington, D.C. were seized by 12 Hanafi Muslim gunmen. The gunmen were led by Hamaas Abdul Khaalis, who wanted to bring attention to the murder of his family in 1973. They took 149 hostages and killed radio journalist Maurice Williams. Police officer Mack Cantrell also died.]

Some cults (for instance, Synanon) are relatively passive about recruitment (albeit harsh when it comes to defections). Others, such as the Unification Church, are tireless recruiters. Many employ techniques that in some respects resemble those used in the forceful political indoctrination prescribed by Mao Tse-tung during the communist revolution and its aftermath in China. These techniques, described by the Chinese as “thought reform” or “ideological remolding,” were labeled “brainwashing” in 1950 by the American journalist Edward Hunter. Such methods were subsequently studied in depth by a number of western scientists and Edgar Schein summarized much of this research in a monograph, “Coercive Persuasion,” published in 1961.

Successful indoctrination of a recruit by a cult is likely to require most of the following elements:

— Isolation of the recruit and manipulation of his environment;
— Control over channels of communication and information;
— Debilitation through inadequate diet and fatigue;
— Degradation or diminution of the self;
— Early stimulation of uncertainty, fear and confusion, and joy and certainty as rewards for surrendering self to the group;
— Alternation of harshness and leniency in the context of discipline;
— Peer pressure, often applied through ritualized “struggle sessions,” generating guilt and requiring open confessions;
— Insistence by seemingly all-powerful hosts that the recruit’s survival –physical or spiritual– depends on identifying with the group;
— Assignment of monotonous tasks or repetitive activities, such as chanting or copying written materials;
— Acts of symbolic betrayal or renunciation of self, family and previously held values, designed to increase the psychological distance between the recruit and his previous way of life.

As time passes, the new member’s psychological condition may deteriorate. He may become incapable of complex, rational thought; his responses to questions may become stereotyped and he may find it difficult to make even simple decisions unaided. His judgement about the events in the outside world will likely be impaired. At the same time, there may be such a reduction of insight that he fails to realize how much he has changed.

After months or years of membership, such a former recruit may emerge from the cult –perhaps “rescued” by friends or family, but more likely having escaped following prolonged exploitation, suffering and disillusionment. Many such refugees appear dazed and confused, unable to resume their previous way of life and fearful of being captured, punished and returned to the cult. “Floating” is a frequent phenomenon, with the ex-cultist drifting off into dissociated states of altered consciousness. Other frequent symptoms of the refugees include depression, indecisiveness and a general sense of disorientation, often accompanied by frightening impulses to return to the cult and throw themselves on the mercy of the leader.

This suggests that society may well wish to consider ways of preventing its members, particularly the young, from unwittingly becoming lost in cults that use psychologically and even physically harmful techniques of persuasion. Parents can inform themselves and their children about cults and the dangers they pose; religious and educational leaders can teach the risks of associating with such groups. However, when prevention fails and intervention assumes an official character –as through legislation or court action– it is necessary to consider the potential impact of such intervention on the free exercise of religion as guaranteed by the First Amendment.

Under the U.S. Constitution, religious liberty is of two types –freedom of belief and freedom of action. The first is, by its nature, absolute. An individual may choose to believe in a system that others find bizarre or ludicrous; society is powerless to interfere. Religiously motivated conduct, however, is not protected absolutely. Instead, it is subject to a balancing test, in which courts weigh the interest of society in regulating or forbidding the conduct against the interest of the group in carrying it out.

How can society best protect the individual from physical and psychological harm, from stultification of his ability to act autonomously, from loss of vital years of his life, from dehumanizing exploitation –all without interfering with his freedom of choice in regard to religious practices? And, while protecting religious freedom, how can society protect the family as a social institution from the menace of the cult as a competing super-family?

A number of legal cases involving polygamy, blood transfusions for those who object to them on religious grounds and the state’s interest in protecting children from religious zealotry suggest that the courts will hold these interests to be constitutionally adequate to check the more obvious abuses of the cults. Furthermore, the cults’ interest is likely to be found weakened by a lack of “sincerity,” a requirement deriving from conscientious-objector and tax-exemption cases, and lack of “centrality,” or importance of the objectionable practices to such essential religious functions as worship.

To be protected by the First Amendment, religious conduct must stem from theological or moral motives rather than avarice, personal convenience, or a desire for power. Such conduct must also constitute a central or indispensable element of the religious practice.

Many religious cults demonstrate an extreme interest in financial or political aggrandizement, but little interest in the spiritual development of the faithful. Because their religious or theological core would not seem affected by a prohibition against deceptive recruiting methods and coercive techniques to indoctrinate and retain members, it is likely the courts would consider the use of such methods neither “sincere” nor “central.”

Thus the constitutional balance appears to allow intervention, though it could be objected that obnoxious practices which might otherwise justify intervention should not be considered harmful if those experiencing them do so voluntarily and do not see them as harmful at the time.

But is coercive persuasion in the cults inflicted on persons who freely choose to undergo it —who decide to be unfree— or is it imposed on persons who do not truly choose it of their own free will? The decision to join a cult and undergo drastic reformation of one’s thought and behavioral processes can be seen as similar in importance to decisions to undergo surgery, psychotherapy and other forms of medical treatment. Accordingly, it should be protected in the same manner and to the same degree as we protect the decision to undergo medical treatment. This means the decision must be fully consensual. This entails, at a minimum, that those making such decisions do so with both full mental “capacity” and with a complete “knowledge” of the choices offered them. In other words, they should give “fully informed consent” before the process of indoctrination can be initiated.

A review of legislative reports, court proceedings (including cases involving conservatorships, or the “defense of necessity” in kidnaping prosecutions), and considerable clinical material makes clear that the cult joining process is often not fully consensual. It is not fully consensual because “knowledge” and “capacity” —the essential elements of legally adequate consent— are not simultaneously present. Until cults obtain fully informed consent from prospective members giving permission in advance to apply the procedures of indoctrination, and warning of the potential risks and losses, it appears that society may properly take measures to protect itself against cultist indoctrination without violating the principle, central to American jurisprudence, that the state should not interfere with the voluntarily chosen religious behavior of adult citizens.

Most young people who are approached by cultist recruiters will have relatively unimpaired “capacity”. They may be undergoing a momentary state of fatigue, depression, or boredom; they may be worried about exams, a separation from home or family, the job market, or relations with the opposite sex —but generally their minds are intact. If the recruiter were to approach such a person and introduce himself or herself as a recruiter for a cult, such as the Unification Church, the target person would likely be on guard.

But recruiters usually conceal the identity of the cult at first, and the role the recruit is expected to play in it, until the young person has become fatigued and suggestible. Information is imparted only when the target’s capacity to analyze it has become low. In other words, when the recruit’s legal “capacity” is high, his “knowledge” is not; later the reverse obtains. Consent given under such circumstances should not deserve the respect afforded ordinary decisions of competent adults.

If intervention against cults that employ coercive persuasion is consistent with the First Amendment, a line must be drawn between cults and other organizations. But is it possible to impose restrictions on the activities of cults that use coercive persuasion without imposing the same restraints upon other societal institutions —TV advertising, political campaigns, army training camps, Jesuit seminaries— that use influence, persuasion and group dynamics in their normal procedures?

Established religious orders may sequester their trainees to some extent. Military recruiters and Madison Avenue copywriters use exaggeration, concealment and “puffing” to make their product appear more attractive than it is. Revivalists invoke guilt. Religious mystics engage in ritual fasting and self-mortification. It has been argued that the thought-control processes used by cults are indistinguishable from those of more socially accepted groups.

Yet it is possible to distinguish between cults and other institutions —by examining the intensity and pervasiveness with which mind-influencing techniques are applied. For instance, Jesuit seminaries may isolate the seminarian from the rest of the world for periods of time, but the candidate is not deliberately deceived about the obligations and burdens of the priesthood; in fact, he is warned in advance and is given every opportunity to withdraw.

In fact, few, if any, social institutions claiming First Amendment protection use conditioning techniques as intense, deceptive, or pervasive as those employed by many contemporary cults. A decision to intervene and prevent abuses of cult proselytizing and indoctrinating does not by its logic alone dictate intervention in other areas where the abuses are milder and more easily controlled.

To turn again to the sad case of the Peoples Temple, it seemed to be, for some years, a relatively small and, in its public stance, moderate cult. Its members differed from those of most cults: Many were older people, many were black, many were enlisted in family units. Nevertheless, from its origins, based on professed ideals of racial harmony and economic equality, the cult gradually developed typical cultist patterns of coercive measures, harsh practices, suspicions of the outside world and a siege mentality.

It may be that these developments comprise an institutional disease of cults. If so, the recent events in Guyana pose a new warning of continuing dangers from cults. For as time passes, leaders may age and sicken. The cult’s characteristically rigid structure and its habitual deference to the leader as repository of all authority leave the membership vulnerable to the consequences of incredible errors of judgment, institutional paranoia and even deranged behavior by the cult’s chief.

Perhaps the tragedy of Jim Jones’ Peoples Temple will lead to more comprehensive and scientific studies of cult phenomena. Perhaps it will lead our society to a more reasoned public policy of prevention and intervention against further abuses by cults in the name of freedom of religion. If so, then perhaps the disaster in Guyana will have some meaning after all.



8. Take Back Your Life by Janja Lalich and Madeleine Tobias


Paperback: 374 pages
Publisher: Bay Tree Publishing; 2nd edition (September 10, 2009)

Cult victims and those who have experienced abusive relationships often suffer from fear, confusion, low self-esteem, and post-traumatic stress. Take Back Your Life explains the seductive draw that leads people into such situations, provides insightful information for assessing what happened, and hands-on tools for getting back on track. Written for victims, their families, and professionals, this book leads readers through the healing process.


About the Authors

Janja Lalich, Ph.D., is Associate Professor of Sociology at California State University, Chico. She has been studying the cult phenomenon since the late 1980s and has coordinated local support groups for ex-cult members and for women who were sexually abused in a cult or abusive relationship. She is the author of Bounded Choice: True Believers and Charismatic Cults, and co-author, with Margaret Singer, of Cults in Our Midst.

Madeleine Tobias, M.S., R.N., C.S., is the Clinical Coordinator and a psychotherapist at the Vet Center in White River Junction, Vermont, where she treats veterans who experienced combat and/or sexual trauma while in the military. Previously she had a private practice in Connecticut and was an exit counselor helping ex-members of cultic groups and relationships.


“If you buy one book on cults, this could be top of the list.”

Here are three reviews of an earlier edition:

An essential roadmap to recovery

For me, the special usefulness of this book came in the form of material directed at children who grew up in a cult, who have no other frame of reference to go back to.

The information I gleaned here gave me that frame of reference, and helped me to “detox” from the environment which was so seductively calling me back. It explains and makes sense of some very bewildering and deceptive manipulation techniques. And it has helped my therapy by outlining the kinds of issues that children coming out of cults usually face.

This book has a universal appeal for all cult escapees because it focuses not on beliefs or practices, but rather on manipulations and psychological pressures which are commonly brought to bear in cults. I found it easy to identify experientially with the material, without being challenged and put off by attacks on my strange belief system which I was still disengaging from.

It’s been a big part of my recovery. My thanks to the authors!

____________

A must-read for former cult members by Troy Waller:

I wish I had found this book immediately after leaving the cult I was involved in.

This book offers invaluable assistance to those who have been involved with a destructive cult, whether it be religious, political or psycho-therapeutic. The text gives former members indications of what to expect in recovery as well as practical assistance to cope with their recovery.

The text also gives a breakdown of how and why cults operate as they do; how and why people get recruited into cults; and how and why people leave cults.

This book is truly a gift from the authors’ heart, experiences and study. Thanks to them.

____________

Sane Advice for Those Leaving Cults by D. L. Barnett

We don’t hear much these days about the Branch Davidians, Heaven’s Gate or even Jim Jones. It’s tempting to think that the cult movement has faded and that the world’s attention is on more pressing matters – like suicide bombers. But they are all of a piece, according to Chico State University Associate Professor of Sociology Janja Lalich.

In “Take Back Your Life: Recovering from Cults and Abusive Relationships”, Lalich and co-author Madeleine Tobias, a Vermont psychotherapist, make clear that modern day cults have not disappeared. “If there is less street recruiting today, it is because many cults now use professional associations, campus organizations, self-help seminars, and the Internet as recruitment tools” to entice the unwary.

Who gets sucked into a cult? “Although the public tends to think, wrongly, that only those who are stupid, weird, crazy and aimless get involved in cults, this is simply untrue. … We know that many cult members went to the best schools in the country, have advanced academic or professional degrees and had successful careers and lives prior to their involvement in a cult or cultic abusive relationship. But at a vulnerable moment, and we all have plenty of those in our lives (a lost love, a lost job, rejection, a death in the family and so on), a person can fall under the influence of someone who appears to offer answers or a sense of direction.”

For the authors, “a group or relationship earns the label ‘cult’ on the basis of its methods and behaviors – not on the basis of its beliefs. Often those of us who criticize cults are accused of wanting to deny people their freedoms, religious or otherwise. But what we critique and oppose is precisely the repression and stripping away of individual freedoms that tends to occur in cults. It is not beliefs that we oppose, but the exploitative manipulation of people’s faith, commitment, and trust.”

Written for those coming out of cults, as well as for family members and professionals, “Take Back Your Life” deals with common characteristics of myriad cult types: Eastern, religious and New Age cults; political, racist and terrorist cults; psychotherapy, human potential, mass transformational cults; commercial, multi-marketing cults; occult, satanic or black-magic cults; one-on-one family cults; and cults of personality. …

The book features riveting personal accounts from ex-cult members and offers a wide range of resources for the person who is trying to retrieve his or her “pre-cult” personality. Education looms large, for that can begin to break down the narrow black-and-white thinking cult members often display. Many cults redefine common terms or introduce special vocabulary, making it difficult for members to make sense of the outside world or even of their own inner aspirations.

The authors are also concerned about those in the education and helping professions who don’t see the dangers posed by cults both to the individual and the larger community. Part of the purpose of the book is to make a credible case that any course of therapy needs to take into account a patient’s cult associations.

“Take Back Your Life” is a book of hope, an excellent starting point for those thinking of exiting a cult and for those who are taking back their lives, one day at a time.


Contents

Acknowledgments ix
Introduction 1

Part One – The Cult Experience 7

1. Defining a Cult 9

2. Recruitment 18

3. Indoctrination and Resocialization 36

4. The Cult Leader 52

5. Abusive Relationships and Family Cults 72

Part Two – The Healing Process 87

6. Leaving a Cult 89

7. Taking Back Your Mind 104

8. Dealing with the Aftereffects 116

9. Coping with Emotions 127

10. Building a Life 151

11. Facing the Challenges of the Future 166

12. Healing from Sexual Abuse and Violence 180

13. Making Progress by Taking Action 196

14. Success Is Sweet: Personal Accounts 212

Part Three – Families and Children in Cults 239

15. Born and Raised in a Cult 241

16. Our Lives to Live: Personal Accounts 259

17. Child Abuse in Cults 280
Nori J. Muster

Part Four – Therapeutic Concerns 287

18. Therapeutic Issues 289

19. The Therapist’s Role 305
Shelly Rosen

20. Former Cult Members and Post-Traumatic Stress Disorder 314

Appendixes

A. Characteristics Associated with Cultic Groups 327
Janja Lalich and Michael Langone

B. On Being Savvy Spiritual Consumers 329
Rosanne Henry and Sharon Colvin

C. Resources 332

D. Recommended Reading 336

Notes 345

Author Index 359

Subject Index 363


Introduction

Take Back Your Life: Recovering from Cults and Abusive Relationships gives former cult members, their families, and professionals an understanding of common cult practices and their aftereffects. This book also provides an array of specific aids that may help restore a sense of normalcy to former cult members’ lives.

About twelve years ago, we wrote our first book on this topic: Captive Hearts, Captive Minds: Freedom and Recovery from Cults and Abusive Relationships. Over the years, we received mounds of positive feedback about that book in the form of letters, phone calls, postcards, emails, faxes, and personal contact at conferences and in our professional lives. Former cult members, families, therapists, and exit counselors continually told us that Captive Hearts, Captive Minds was always their number-one book. That positive reception (and the need to provide up-to-date information) was the impetus for this new book. We are delighted to offer this new resource to people who want to evaluate, understand, and, in many cases, recover from the effects of a cult experience. We hope this book will help you take back your life.

Cults did not fade away (as some would like to believe) with the passing of the sixties and the disappearance of the flower children. In fact, cult groups and relationships are alive and thriving, though many groups have matured and “cleaned up their act.” If there is less street recruiting today, it is because many cults now use professional associations, campus organizations, self-help seminars, and the Internet as recruitment tools. Today we see people of all ages— even multigenerational families—being drawn into a wide variety of groups and movements focused on everything from therapy to business ventures, from New Age philosophies to Bible-based beliefs, and from martial arts to political change.

Most cults don’t stand up to be counted in a formal sense. Currently, the best estimates tell us that there are about 5,000 such groups in the United States, some large, some remarkably small. Noted cult expert and clinical psychologist Margaret Singer estimated “about 10 to 20 million people have at some point in recent years been in one or more of such groups.”(1) Before its enforced demise, the national Cult Awareness Network reported receiving about 20,000 inquiries a year.(2)

A cult experience is often a conflicted one, as those of you who are former members know. More often than not, leaving a cult environment requires an adjustment period so that you can put yourself and your life back together in a way that makes sense to you. When you first leave a cult situation, you may not recognize yourself. You may feel confused and lost; you may feel both sad and exhilarated. You may not know how to identify or tackle the problems you are facing. You may not have the slightest idea about who you want to be or what you want to believe. The question we often ask children, “What do you want to be when you grow up?” takes on new meaning for adult ex-cult members.

Understanding what happened to you and getting your life back on track is a process that may or may not include professional therapy or pastoral counseling. The healing or recovery process varies for each of us, with ebbs and flows of progress, great insight, and profound confusion. Also, certain individual factors will affect your recovery process. One is the length and intensity of your cult experience. Another is the nature of the group or person you were involved with—or where your experience falls on a scale of benign to mildly harmful to extremely damaging. Recovering from a cult experience will not end the moment you leave the situation (whether you left on your own or with the help of others). Nor will it end after the first few weeks or months away from your group. On the contrary, depending on your circumstances, aspects of your cult involvement may require some attention for the rest of your life.

Given that, it is important to find a comfortable pace for your healing process. In the beginning, particularly, your mind and body may simply need a rest. Now that you are no longer on a mission to save the world or your soul, relaxation and rest are no longer sinful. In fact, they are absolutely necessary for a healthy, balanced, and productive life.

Reentering the non-cult world (or entering it for the first time if you were born or raised in a cult) can be painful and confusing. To some extent, time will help. Yet the passage of time and being physically out of the group are not enough. You must actively and of your own initiative face the issues of your involvement. Let time be your ally, but don’t expect time alone to heal you. We both know former cult members who have been out of their groups for many years but who have never had any counseling or education about cults or the power of social-psychological influence and control. These individuals live in considerable emotional pain and have significant difficulties due to unresolved conflicts about their group, their leader, or their own participation. Some are still under the subtle (or not so subtle) effects of the group’s systems of influence and control.

A cult experience is different for each person, even for members of the same group, family, or situation. Some former members may have primarily positive impressions and memories, while others may feel hurt, used, or angry. The actual experiences and the degree or type of harm suffered may vary considerably. Some people may leave cults with minimum distress, and adjust rather rapidly to the larger society, while others may suffer severe emotional trauma that requires psychiatric care. Still others may need medical attention or other care. The dilemmas can be overwhelming and may require thoughtful attention. Many have likened this period to being on an emotional roller coaster.

First of all, self-blame (for joining the cult or participating in it, or both) is a common reaction that tends to overshadow all positive feelings. Added to this is a feeling of identity loss and confusion over various aspects of daily life. If you were recruited at any time after your teens, you already had a distinct personality, which we call the “pre-cult personality.” While you were in the cult, you most likely developed a so-called new personality in order to adapt to the demands and ambiance of cult life. We call this the “cult personality.” Most cults engage in an array of social-psychological pressures aimed at indoctrinating and changing you. You may have been led to believe that your pre-cult personality was all bad and your adaptive cult personality all good. After you leave a cult, you don’t automatically switch back to your pre-cult self; in fact, you may often feel as if you have two personalities or two selves. Evaluating these emotions and confronting this dilemma—integrating the good and discarding the bad—is a primary task for most former cult members, and is a core focus of this book.

As you seek to redefine and reshape your identity, you will want to address the psychological, emotional, and physical consequences of living in or around a constrained, controlled, and possibly abusive environment. And as if all that weren’t enough, many basic life necessities and challenges will need to be met and overcome. These may include finding employment and a place to live, making friends, repairing old relationships, confronting belief issues, deciding on a career or going back to school, and most likely catching up with a social and cultural gap.

If you feel like “a stranger in a strange land,” it may be consoling to know that you are not the first person to have felt this way. In fact, the pervasive and awkward sense of alienation that both of us felt when we left our cults motivated us to write this book. We hope that the information here will not only help you get rid of any shame or embarrassment you might feel, but also ease your integration into a positive and productive life.

We were compelled to write this book because more often than not, people coming out of cults have tremendous difficulty finding practical information. We, too, experienced that obstacle. Both of us faced one roadblock after another as we searched for useful information and helping professionals who were knowledgeable about cults and post-cult trauma.

A matter we hope to shed light on in this book is the damage wrought by the so-called cult apologists. These individuals (mostly academics) allege that cults do no harm, and that reports of emotional or psychological damage are exaggerations or even fabrications on the part of disgruntled former members. Naturally we disagree. It is unfortunate that there is still so little public understanding of the potential danger of some cults. Certainly there are risks and harmful consequences for individuals involved in these closed, authoritarian groups and abusive relationships. If there weren’t, there would be no need for cult research and information organizations, or for books such as this. Added to individual-level consequences, there are documented dangers to society as a whole from cults whose members carry out their beliefs in antisocial ways— sometimes random, sometimes planned—through fraud, terrorist acts, drug dealing, arms trading, enforced prostitution of members, sexual exploitation, and other violent or criminal behaviors.

From our perspective, a group or relationship earns the label “cult” on the basis of its methods and behaviors—not on the basis of its beliefs. Often those of us who criticize cults are accused of wanting to deny people their freedoms, religious or otherwise. But what we critique and oppose is precisely the repression and stripping away of individual freedoms that tends to occur in cults. It is not beliefs that we oppose, but the exploitative manipulation of people’s faith, commitment, and trust. Our society must not shy away from exposing and holding accountable those social systems (whether they be communities, organizations, families, or relationships) that use deception, manipulation, coercion, and persuasion to attract, recruit, convert, hold on to, and ultimately exploit people.

Also, it’s important to note that there are many non-cult organizations to which people can dedicate their lives and in which they may experience personal transformation. Many religious and self-help institutions, as well as mainstream political parties and special-interest groups, are examples of such non-cult organizations. We do not call them cults because they are publicly known institutions that are usually accountable to some higher body or to society in general. When people join, they have a clear idea of these organizations’ structures and goals. Deceptive or coercive practices are not integral to the growth of these organizations or their ability to retain their members.

In contrast, cult membership is less than fully voluntary. Often it is the result of intense social-psychological influence and control, sometimes called coercive persuasion. Cults tend to assault and strip away a person’s independence, critical-thinking abilities, and personal relationships, and may have a less-than-positive effect on the person’s physical, spiritual, and psychological state of being.

We wrote this book for the many individuals who have experienced harm or trauma in a cult or an abusive relationship. Because it is awkward to continually repeat the phrase “cult or cultic relationship,” in many instances throughout this book we simply shortened it to “cult” or “group,” which are meant to be inclusive of all types of cultic involvements. In the same vein, while we recognize the existence of many one-on-one cultic relationships and family cults, we tend to use simply “cult leader” or “leader” rather than always specifying “leader or abusive partner.” Also, we tend to use masculine pronouns when referring to cult leaders in general. This is not to ignore the fact that there are many female cult leaders, but merely to acknowledge that most cult leaders tend to be men. However, whether male or female, most are equal-opportunity victimizers, drawing men, women, and children of all ages into their webs of influence.

We have included case examples and personal accounts throughout the chapters to illustrate the specifics of involvement, typical aftereffects, and the healing process. Some examples are composites based on interviews and our personal and professional experiences with many hundreds of former cult members. Some former members made specific contributions or allowed us to quote them and use their real names, while others asked for pseudonyms to protect their privacy. These latter, as well as the case examples, are indicated in the text by the use of first name and last initial on the first mention of that name.

If you are a former cult member, you may identify personally with some of the experiences, emotions, challenges, and difficulties discussed here. Other topics may appear quite foreign and unrelated to your experience. It may be helpful to look them over anyway, as there may be lessons or suggestions that could be useful for your situation.

The keys to recovery are balance and moderation, both of which were quite likely absent in the cult. Now you can create a program for recovery that addresses your needs and wants, and you can change it at will to adapt to any new circumstances or needs. The important thing is to do what feels right. Most cults teach you to squelch your gut instincts, but you can now let your self speak to you—and this time, you can listen and act. From now on, only you are responsible for setting and achieving your goals. Our hope is that this book will be useful to you in your recovery process, and we wish you well.


pages 15-17

Cults as Power Structures

In Bounded Choice: True Believers and Charismatic Cults, I (Janja Lalich) present my most recent findings from an in-depth study of cultic structures and dynamics:

A cult can be either a sharply bounded social group or a diffusely bounded social movement held together through a shared commitment to a charismatic leader. It upholds a transcendent ideology (often but not always religious in nature) and requires a high level of commitment from its members in words and deeds.(8)

Four interlocking dimensions make up the framework of a cult’s social system and dynamics. You can use this framework to examine your own cult experience. These four dimensions are clearly separated here for analytical purposes so that former cult members (whose memories of cult experiences are often confused and conflicting) can more easily deconstruct and understand each phase of indoctrination and control:

Charismatic authority. This is the emotional bond between a leader and his followers. It lends legitimacy to the leader and grants authority to his actions while at the same time justifying and reinforcing followers’ responses to the leader and/or the leader’s ideas and goals. Charisma is the hook that links a devotee to a leader and/or his ideas.

The general purpose of charismatic authority is to provide leadership. The specific goal is for the leader to be accepted as the legitimate authority and to offer direction. This is accomplished through privilege and command. The desired effect, of course, is that members will believe in and identify with the leader.

Transcendent belief system. This is the overarching ideology that binds adherents to the group and keeps them behaving according to the group’s rules and norms. It is transcendent because it offers a total explanation of past, present, and future, including the path to salvation. Most importantly, the leader/group also specifies the exact methodology (or recipe) for the personal transformation necessary to travel on that path.

The goal of the transcendent belief system is to provide a worldview that offers meaning and purpose through a moral imperative. This imperative requires each member to subject himself to a process of personal transformation. The desired effect is for the member to feel a sense of connection to a greater goal while aspiring to salvation. This effect is solidified through the internalization of the belief system and its accompanying behaviors and attitudes.

Systems of control. This is the network of acknowledged—or visible—regulatory mechanisms that guide the operation of the group. It includes the overt rules, regulations, and procedures that guide and control members’ behavior.

The purpose of the systems of control is quite simply to provide organizational structure. The specific goal is to create a behavioral system and disciplinary code through rules, regulations, and sanctions. The effect is compliance, or better still, obedience.

Systems of influence. This is the network of interactions and social influence that resides in the group’s social relations. This interaction and group culture teach members to adapt their thoughts, attitudes, and behaviors in relation to their new beliefs.

The purpose of the systems of influence is to shape the group culture. The specific goal is to create institutionalized group norms and an established code of conduct by which members are expected to live. This is accomplished by various methods of peer and leadership pressure, and through social-psychological influence and modeling. The desired effect is conformity and the self-renunciation that is required not only to be part of the group but also to achieve the professed goal.(9)

This combination of a transcendent belief system, all-encompassing systems of interlocking structural and social controls, and highly charged charismatic relationships between leader(s) and adherents results in a self-sealing system that exacts a high degree of commitment (as well as expressions of that commitment) from its core members. A self-sealing system is one that is closed in on itself, allowing no consideration of disconfirming evidence or alternative points of view. In the extreme, a self-sealed group is exclusive and its belief system is all inclusive, in the sense that it provides answers to everything. Typically the quest of such groups is to attain a far-reaching ideal. However, a loss of sense of self is all too often the by-product of that quest.(10)

Over the years, some people have used alternative terms or adjectives to identify cult groups, such as high-demand, high-control, totalistic, totalitarian, closed charismatic, ultra-authoritarian, and so on. In academia, some rather acrimonious debate has arisen over the use of the word cult, with some academicians and researchers using their influence to dissuade scholars, legal and helping professionals, the media, and others from identifying any group as a cult. Recent work addressing these debates and arguments can be found in Misunderstanding Cults: Searching for Objectivity in a Controversial Field, edited by Benjamin Zablocki and Thomas Robbins.(11)

Frankly we prefer to use the term cult because we feel that it has historical meaning and value. Whatever one decides to call these groups, one must not ignore the structural and behavioral patterns that have been identified through years of study and research, or through the voluminous accounts of people who successfully exited from cult groups and relationships. To sweep cults under the rug or to call them by another name won’t make cults go away—nor will it aid us in understanding these complex social systems. Most importantly, cover-ups and whitewashing won’t help former cult members evaluate or recover from their experiences in a whole and healthful manner.


from pages 26-27

Contract for Membership in a Cultic Group or Relationship

In the medical profession, ethical contracts ensure that patients have given “fully informed consent.” That is, if a doctor fails to inform a patient about the risks, side effects, and options for treatment, the uninformed patient is entitled to sue for maltreatment. Below is a mock contract for cult membership. Ask yourself if you gave informed consent at the time of your recruitment, or if you would have joined had you known your participation would involve the following conditions.

I, _______________________________ hereby agree to join
_______________________________ . I understand that my life will change in the following ways. I know what I am getting into and agree to all of the following conditions:
1. My good feelings about who I am will stem from being liked by other group members and/or my leader, and from receiving approval from the group/leader.
2. My total mental attention will focus on solving the group’s/leader’s problems and making sure that there are no conflicts.
3. My mental attention will be focused on pleasing and protecting the group/leader.
4. My self-esteem will be bolstered by solving group problems and relieving the leader’s pain.
5. My own hobbies and interests will gladly be put aside. My time will be spent however the group/leader wants.
6. My clothing and personal appearance will be dictated by the desires of the group/leader.
7. I do not need to be sure of how I feel. I will only be focused on what the group/leader feels.
8. I will ignore my own needs and wants. The needs and wants of the group/leader are all that is important.
9. The dreams I have for the future will be linked to the group/leader.
10. My fear of rejection will determine what I say or do.
11. My fear of the group’s / leader’s anger will determine what I say or do.
12. I will use giving as a way of feeling safe with the group/leader.
13. My social circle will diminish or disappear as I involve myself with the group/leader.
14. I will give up my family as I involve myself with the group / leader.
15. The group’s/leader’s values will become my values.
16. I will cherish the group’s / leader’s opinions and ways of doing things more than my own.
17. The quality of my life will be in relation to the quality of group life, not the quality of life of the leader.
18. Everything that is right and good is due to the group’s belief, the leader, or the teachings.
19. Everything that is wrong is due to me.
20. In addition, I waive the following rights to:
• Leave the group at any time without the need to give a reason or sit through a waiting period
• Maintain contact with the outside world
• Have an education, career, and future of my choice
• Receive reasonable health care and have a say in my health care
• Have a say in my own and my family’s discipline, and to expect moderation in disciplinary methods
• Have control over my body, including choices related to sex, marriage, and procreation
• Expect honesty in dealings with authority figures in the group
• Expect honesty in any proselytizing I am expected to do
• Have any complaints heard and dealt with fairly with an impartial investigation
• Be supported and cared for in my old age in gratitude for my years of service


Janja Lalich presentation:
Re-forming the Self: The Impact and Consequences of Institutional Abuse

WEBSITE: http://cultresearch.org/



9. VIDEO: Paul Morantz, Attorney and cult expert
Cults, Thought Reform, Coercive Persuasion and Confession (7 minutes)

Los Angeles attorney and cult expert Paul Morantz has devoted his professional life to fighting cults. But in the late 1970s that life almost came to an abrupt end when one of the cults he litigated against planted a live rattlesnake in his mailbox. Paul’s health (and speech) has been affected ever since.

University of California Television (UCTV):
The Lawyer Synanon Tried to Kill – Legally Speaking

Paul Morantz speaks with California Lawyer editor Martin Lasden about his career and the dangers he faced. Series: “Legally Speaking”

WEBSITE: http://www.paulmorantz.com/



10. Podcast: Ford Greene, Attorney and Former Moonie, on Sun Myung Moon

Peter B. Collins podcast:
Ford Greene, an expert on religious cults including Scientology and the Unification Church, returns to talk about the death of Rev. Moon. Greene, once a Moonie himself, talks about the impact of Rev. Moon’s death. We touch on my podcast with Archbishop Stallings in early September, and the spin he put on the cult behaviors of Moon and his followers. Greene has deprogrammed many Moonies, and sued the church on behalf of former members. His own sister remains a member of the church. We talk about the CIA connections of Moon and his underlings, Moon’s role in right-wing politics in the US, including his operation of the Washington Times. Greene also speculates about the future of the business empire and Moon’s brainwashed followers. While Greene has not seen The Master yet, he comments on the aggressive legal tactics of Scientology.

At 37:20, Gary Chew reviews the new film The Master, which is based on the early life of Scientology founder L. Ron Hubbard. Gary Chew offers a somewhat cryptic view of The Master, starring Philip Seymour Hoffman and Joaquin Phoenix.

https://www.peterbcollins.com/2012/09/28/ford-greene-attorney-and-moonie-de-programmer-on-the-death-of-rev-moon-gary-chew-reviews-the-master-maxine-doogan-tells-californians-no-on-prop-35/


Ford Greene and the Moonies

Ford Greene is featured in the book Moonwebs

The book was made into the movie, Ticket to Heaven

Billet pour le ciel – by Josh Freed (French edition)


11. Video: Steve Hassan interviewed by Chris Shelton

Sensibly Speaking Podcast #77: Dealing with Destructive Cults ft. Steve Hassan

This week I interview Steve Hassan, a cult recovery specialist and licensed mental health counselor who has written on the subject of cults and published three books, including Combatting Cult Mind Control, which is an excellent breakdown of how destructive cults work, what undue influence is, how to recover from a cult experience and what family and friends can do for their loved ones who may be stuck in a cult situation.

Steve’s website: http://freedomofmind.com



12. Video about Conformity by TheraminTrees

We each possess one of the most powerful tools in the known universe: the human brain — capable of the extremes of insight and ignorance; of productiveness and destructiveness; of detection and projection; of rationality and rationalisation; of liberation and oppression.

13. Video: Instruction Manual for Life by TheraminTrees


Charismatic Authority (authority is not the same as power)



14. The Social Organization of Recruitment in the Unification Church – PDF

by David Frank Taylor, M.A., July 1978, Sociology

The purpose of this study is to provide an empirical description of recruitment into the Unification Church. The Unification Church is one of many new religious movements that have appeared in America during the 1970s. The methods Church members use to attract and secure the commitment of individuals to the Church have generated controversy in recent years.

The research was initiated under the assumption that these recruitment strategies could be understood through the use of qualitative field methods. As an ethnographic treatment of religious indoctrination, the study is based on participant observation of the recruitment process and is grounded in the interaction and language usage of participants. Close attention is given to the daily life of Church members and prospective members, where members help in a cooperative effort to persuade individuals to join their movement.

University of Montana
ScholarWorks at University of Montana
Theses, Dissertations, Professional Papers  –  Graduate School

LINK:
http://scholarworks.umt.edu/cgi/viewcontent.cgi?article=6585&context=etd

TABLE OF CONTENTS
Abstract
Acknowledgements
Chapter I. INTRODUCTION TO THE STUDY
Chapter II. HISTORY, BELIEFS, AND STRUCTURE OF THE UNIFICATION CHURCH
    History and Beliefs
    Organizational Structure
    Controversies Surrounding the Church
Chapter III. A DESCRIPTION OF RECRUITMENT
    The Encounter
    The Elephant Bus to Boonville
    “The Greatest Weekend”
    The Keynote Lecture: Falling in Love, Together
    Understanding God’s Situation
    “A Universal Point of View”
    “Truth and Righteousness”
    Another Great Day
    Sunday’s Finale
Chapter IV. RECRUITMENT: A SOCIALLY ORGANIZED ACCOMPLISHMENT
    Finding Prospective Members
    The Choreography of Total Participation
    Groups
    Loving
    Control of Communication
    Making a Positive Evaluation
    “We Can Be New People”
    Lecture Reinforcement: Groups, Testimonies and Songs
    Dreams and Destiny
    Testimonies and Skits
    Restoration of Value
    We Want to Be Those People
    Consensual Validation
    Expressions of Self-Fulfillment
    Sustaining Group Unity and Brotherhood
    Following God’s Will
    Guiding Prospects Towards Truth and Transformation
    Following Center
    The True Parents
    Idolization and Emulation of Leaders as Role Models
    Testimonies of Transition
    Overcoming Doubt and Negativity
    Symbols of Commitment
    Dramatic Commitment Scenarios
    Accomplished Commitment
Chapter V. AN OVERVIEW
    Sincere Performance
    Trust
    Legitimized Control
    Enthrallment
    Assuming the Role
Bibliography



“Socialization techniques through which the UC members were able to influence”

by Geri-Ann Galanti, Ph.D.

Abstract
This article reports on the experiences and thoughts of an anthropologist who, under an assumed identity, participated in a 3-day Unification Church workshop.  Although the author’s expectation that she would encounter “brainwashing” techniques was not met, she was, nevertheless, struck by the subtle, yet powerful, socialization techniques through which the UC members were able to influence her.  She concludes that, to be effective, preventive education in this area must address the subtleties of the socialization processes that can bring about major personality changes.


I recently had an encounter with what has been termed “brainwashing,” when I spent a weekend at Camp K, a Moonie training camp in Northern California [in circa 1981-83?].  As a result of my experience there, I would like to offer a few comments on the nature of brainwashing from the perspective of an anthropologist.  I went to the camp to do research for a project on deprogramming.  I thought it was important to see what the “programming” was all about.  I pretended, however, to be a young woman who wandered into their church by chance, and who knew little about Rev. Moon or Moonies.

To begin with, I was allowed plenty of sleep and given a sufficient amount of protein. Both mornings, I got out of bed around 8:30 or 9:00 – when I was tired of lying around. No one made me get up early. We were given eggs, fish, tuna, something that looked like “chicken spam,” lasagna (meatless, but plenty of cheese) and other foods. We were constantly being fed – three meals and about two snacks per day. Most people looked a bit overweight. In any case, the two things I was looking for that might “brainwash” me were not present.

I was further disarmed by the fact that the group let me know right up front that they were the Unification Church, and followers of the Reverend Moon. The San Francisco Bay area center had earned a rather bad reputation for hiding that fact until a new recruit was already well entrenched in the group. Apparently, this is no longer true. I walked into the church on Bush Street in San Francisco on a Friday evening, and the first thing that was said to me was “You understand that this is the Unification Church and that we’re followers of the Reverend Moon?” They also had a permanent sign on the front of their building stating “Unification Church.” The first evening at Bush Street, after showing some interest in the Church, I was shown a videotape about the Church and Reverend Moon. In order to go to their camp for the weekend, I had to sign a release, which clearly stated that I was going with the Unification Church. However, the fact that they were now being honest about who they were, in contrast to their past deceptiveness, served to weaken my defense.

The first night, I heard the word “brainwashing” used four or five times, always in a joking context. I finally asked John, my “spiritual father,” why that word kept cropping up so often. He said it was because people often accuse them of being brainwashed. The explanation I heard several times that weekend in this regard is that “people are so cynical and they can’t believe that we can be happy and want to help other people and love God and each other. So they think that we must be brainwashed to feel this way. Ha! Ha!” I was also told by two different Moonies about a recent psychological study comparing Moonies with young adults from other mainstream religious groups. They told me that Moonies came out much better in terms of independence, aggressiveness, assertiveness, and other positive characteristics. The group is apparently meeting the criticism leveled at them head on. Their explanations seemed so reasonable. They would ask, “We don’t look brainwashed, do we?” And they didn’t.

I somehow expected to see glassy-eyed zombies.  I didn’t.  There was one new member – he’d been in the group only a month and a half – who seemed to fit that stereotype.  When I talked to him, his gaze wandered, his eyes not fixed on anything.  But everyone else seemed perfectly normal.  They were able to laugh and joke (about everything except themselves, which I’ll discuss later) and talk seriously about things.  The only thing that really struck me as strange was a kind of false over-enthusiasm.  Any time anyone performed, which was often, everyone would clap and cheer wildly.  They were good, but not that good.  During lectures, they would underscore points with a hearty “yeah!”  I must admit, however, that by the end of the weekend, much of the enthusiasm seemed more charming than odd.

Since the issue was brainwashing, I was constantly monitoring my mental state. During lectures (three per day, each lasting about an hour to an hour and a half), I would sit there and smugly critique the lecture (to myself) as it was presented.  My intellectual faculties were as sharp as ever.  I was able to note the kinds of techniques they were using as well.  Immediately before each lecture, we would sing songs from their songbook, to the accompaniment of a guitar.  Their songs are very beautiful, and the lyrics always upbeat.  As a result, you start off the lecture feeling good from the singing.  The lectures are always ended by singing a few more songs.  This puts a whole aura of “goodness” around the lectures.

The lectures were carefully orchestrated so as to create a feeling in the listener that they must be “learned,” rather than analyzed.  I could discuss this in greater detail, but for now, I will return to the issue of brainwashing.  Despite the use of questionable and manipulative educational techniques, I was constantly aware of the functioning of my intellect and of my beliefs, and at no time did I feel that they were being influenced.  This may not be the case with an individual who has not spent 13 years in college, but, as will become clear, it only underscores the power of brainwashing.  As an anthropologist, I found their beliefs interesting; as an individual, I found them ridiculous.  Nor did I experience any altered states of consciousness to indicate that I was being hypnotized in any way.  So I thought I was safe.

What I didn’t realize is that the “brainwashing” – or to use a better term, “mind control” – doesn’t come until later. And what is really being talked about is a process of socialization, one which goes on in every household around the world. Human beings are not born with ideas. Ideas are learned. Anthropologists, more than any other group, perhaps, are aware of the variety of beliefs that are held by people around the world. We acquire these beliefs through a process that involves observation, imitation, and testing. Beliefs that are acquired in childhood are generally the strongest, although they may be changed through experience as one grows older. When we have experiences that conflict with our world view, we either rationalize the experience (e.g., I couldn’t find my necklace in the jewelry box yesterday, but today it’s there – I must have overlooked it, or someone must have taken it and put it back), leaving our beliefs intact (e.g., objects don’t magically disappear and reappear), or, if it happens too often and we are presented with an alternative world view which accounts for it, we may change our beliefs. (This is the stuff that Kuhn writes about in his classic book, The Structure of Scientific Revolutions.) It is possible to explain the same event in many ways. What cults do is to offer an alternative way of looking at things. When everyone holds the same belief but you, their view starts to make sense. Society, especially the smaller-scale societies we had throughout most of human evolution, could not operate smoothly if everyone were to hold a different belief about the nature of reality. Millions of years of evolution have selected for a human tendency to be influenced by the beliefs of others. If this were not the case, how could any child be socialized to be a member of the group? There are, of course, rebels and visionaries, people who do not accept the beliefs of the group. But they are much fewer in number. Furthermore, adolescence seems to be a major time for group conformity. Teenagers appear to have a strong need to belong, to look and act like one of the group. And it is these adolescents and post-adolescents who are most strongly attracted to cults.

How does mind control work? Let me rephrase that. Even “mind control” is too strong a term – for it, too, conjures up visions of men reaching invisible fingers into your brain, controlling your thoughts and actions like a puppeteer. I think of it more as a socialization process in which one is led to think like the rest of the group. Robert Lifton, in his seminal book Thought Reform and the Psychology of Totalism: A Study of Brainwashing in China, outlines the eight conditions that result in ideological totalism: milieu control, mystical manipulation, need for purity, personal confession, acceptance of basic group dogma as sacred, loading the language, subordination of person to doctrine, and dispensing of existence. As I see it, all of these features conspire to do two things: (1) isolate the person within a particular cultural context so that that context becomes the only reality, and (2) make the individual feel that if he becomes a member of the group, he will be special. These features are an inherent part of any culture, and not necessarily purposefully contrived to achieve particular aims. Let me give an example.

Several years ago, I spent a summer doing fieldwork in Guatemala.  After a month in the field, I couldn’t remember a lot of things about home, e.g., my husband’s voice.  He was back in the U.S.  Reality was where I was, in Guatemala. One regret I have is not buying more of the beautiful Indian weavings.  The reason I didn’t was that they were “too expensive.”  The finest cost approximately $30.  To buy something similar here would cost well over $100.  But I had internalized the Guatemalan standard of money.  That summer, no one was purposely trying to control my environment.  It was controlled by virtue of the fact that I was spending most of my time in a small rural village. Though I retained most of my American ways and beliefs, my sense of reality was slowly changing, and Guatemala became the standard by which I tested reality.

Regarding the notion that ideological totalism functions to make an individual feel that if he joins the group, he will somehow be better than everyone who is not a member – this is not a new concept.  All cultures promote this idea about themselves.  The attitude is called “ethnocentrism.”  Everything we do is right and natural; everything outsiders do is unnatural, barbaric, etc.  The names that most small scale societies use to refer to themselves generally translate into something meaning “the people” or “human beings,” implying that everyone who is not a member of the group is somehow less than human.  Perhaps I am overstating the case, but what I saw the Moonies do was to do on a smaller scale what all cultures do with their members.

The techniques they use are for the most part, not very sinister.  They are things we encounter in everyday life.  They are how we become socialized.  The cult becomes a total subculture.

Which brings me to what I think is the most important part. In the beginning, they don’t influence you by changing your beliefs. As I said earlier, they did not affect mine in the least in that short weekend. (Although I should point out that my beliefs are very clear and strong. Most people who join the church are self-described “searchers”: they’re looking for answers.) The way they get to you is emotionally. If you stay with an isolated group of people long enough, you will eventually begin to think like they do, act like they do, see the world as they do. It’s part of human nature. It’s what we anthropologists mean when we talk about enculturation. The degree of enculturation (taking on the culture of another group) will depend upon the relative amount of time you associate with people from your own culture and from the new culture, among other factors. If you associate only with members of the new culture, acculturation will generally be much more rapid.

So how do they get you to stay? By giving you a good time, by being likeable, by being happy. Of all the things I expected to happen that weekend, the last thing I expected was to have a good time. Except for the lectures, which I found rather boring and insulting (I thought they were aimed at about a third grade level), I really had fun. We sang a lot, people performed songs and poems, we put on a group talent show, we played volleyball. We became children again, with no responsibilities. It was like being at camp; in fact, it was called camp: Camp K. The setting was beautiful – in the mountains, along a creek, with lots of trees.

They also make you feel really good about yourself.  One of the famous Moonie techniques is “love bombing,” which basically consists of giving someone a lot of positive attention.  For example, one morning, Jane said to me, “You know, you’re really one of the most open people I’ve ever met.  You don’t put up any defenses.  You’re really open.  I think that’s so great.”  When she said this, part of my mind went “flash.  Love-bombing, love bombing.”  But the other part of me went, “Yeah, but it’s really true.  (Don’t we all like to believe the best about ourselves?)  She probably really means it.”  In any case, it made me feel good.  Despite my intellectual recognition of what she was doing, emotionally, I bought it.

Another technique they use is to make you feel part of the group. New recruits were constantly encouraged to take part in the many performances that were put on. During one of the initial group sessions, when we were introducing ourselves, I mentioned that I like to dance. That night, when we were making up our presentation for the “talent show,” everyone kept urging me to choreograph our musical number. I felt a bit shy about it, but then figured, why not? I had never seen a more supportive group in my life. There was no way to fail – except not to take part. I had about 5 minutes to make up and teach a number to a group of 15. Needless to say, my “dance” was simple and rather silly. But it was all in fun and didn’t matter. It made me feel a part of the group. It also gave them ample opportunity for more love-bombing. After the show and all the next day, at least a dozen people came up to tell me what a “great” dance it was. Despite the fact that I knew it wasn’t, it still felt good to have people compliment me on something that is important to me. I was made to feel good by being part of the group.

They also made me feel that I was a lot like individual members of the group. Part of my “cover” was that I was a third grade school teacher. (I did teach 3rd grade for 10 weeks once.) When I told this to my “spiritual father,” he replied, “I used to be a school teacher too.” He kept emphasizing how much alike we are. (We’re not.) He also told me how much I remind him of a close friend of his. Someone else told me how much I reminded her of her sister-in-law. Other people told me that I look “so familiar.” It was rather transparent to me that this was merely a technique to make me feel that we were not so different and I could be a part of them. (Actually, this technique was too obvious and not effective on me.)

Socialization also works through subtle peer pressure. At the end of Saturday evening, we once again got in our groups to discuss “what we liked best about the day.” As we went around the circle, people mentioned things like the lecture we had on Rev. Moon, or the movie about the Unification Church, or something that was said in the lecture. As it was coming around to me, I was thinking, “My honest answer would be the volleyball game. I really had a great time playing volleyball. But if I say that, I’m going to sound really shallow compared to everybody else. And I know I’m not shallow.” So I chose something that was also true, though less so, but which sounded much better. When my turn came, I said, “I really enjoyed meeting a lot of really nice people.” Because of a general human tendency to try to create a positive image of ourselves, I was slowly becoming socialized into the ways of the group. If this were a group that valued physical activity, my true response would have been appropriate. But this was a group that valued God, love, ideals, and so I found myself shaping myself in a way that emphasized the aspects of my being that were most acceptable to the values and standards of the group. We are all multi-faceted. It is a common experience to find that different people or groups of friends bring out different aspects of our personality. Generally, we change subtly as we interact with each group, thus emphasizing all aspects of our personality. In a totalist group like the Moonies, however, the group values are so strong and so consistent that only one side of ourselves is elicited and reinforced. We thus shape our personality as we become socialized into the group.

The most powerful aspect of the whole experience was the personal relationships.  At the beginning of the weekend, I remember thinking that there really wasn’t anyone there that I would want to be friends with.  But by the end of 2 ½ very intense days, I had developed a few attachments, especially to two of the women, Susan and Jane.  I also felt very guilty about deceiving them regarding who I was and why I was there.  Yet I couldn’t tell them the truth because then I couldn’t be sure that they weren’t treating me differently from others – non-researchers.  Even though I knew they were deceiving me in subtle ways and that the ultimate goal that was shaping their behavior toward me was the desire to get me to join the group, I still felt guilty.  I honestly liked them.  They seemed so open and honest with me, although I still don’t know how open and honest that really was.  They seemed to like me.  My ego wants to believe they did. The whole cult issue is very clouded in my mind.  It is exceedingly complex.  If their main motive was to get me to join the group, it was because they believed that by doing so, they were helping to save the world and my soul.  Is that so dishonest?  Yet how honest is it to consciously use those very effective techniques?  I see them as both victims and victimizers.  Simultaneously.

They presented a lifestyle alternative that was very appealing.  Community, love, idealism.  They presented a picture of true happiness. Yet we learn from ex-members (who admittedly have their own biases) that this picture is false.  Or at least, only part of the picture.  What is left out is the fear and guilt and the loss of self.

What the “brainwashing” is all about, in my view, is grabbing you emotionally.  Giving you a good time, showing you others, like yourself, who are fulfilled.  People who, like you, were searching for answers to life’s basic questions and found them.  Why not stay a little longer, and learn a little more about them?  You don’t have to believe in the doctrine right away.  You can still think critically at the end of the weekend, when you make the decision to stay on for the 7-day seminar.  But you’ve begun to develop emotional ties that will keep you there.  To learn a little more.  Until they have finally socialized you into their way of life.  They grab you emotionally until they can keep you long enough to completely socialize you.

I am writing this article because I think it is important to understand what is going on. I know that I didn’t understand, despite having done a lot of reading and talking to people about it. I think it is because most of us have too many strong associations with the words “brainwashing” and “mind control.” They seem so overt. They’re not. The process can be extremely subtle. But because we have such strong associations, we do not recognize the process in its other manifestations. I think that in part it is because it is so familiar. It is something that happens every day to every child that is born on this planet. Society is possible only because socialization techniques are effective. Socialization isn’t sinister. The problem I see with the cults is the context. As an anthropologist, I am aware of the existence of what we would term cults in other societies. I think that cults have a greater and more damaging impact in our culture because we value the individual so highly. From discussions with ex-members, it appears that one of the most negative effects of cult involvement is a loss of self. Many other societies value the group over the individual. Although I am not a psychiatrist, I would guess that it is not so damaging to the psyche to give up your individual identity to the group (the cult), if you have always been raised to value the group over the self. But in our culture, where the opposite is true, this can be devastating to many individuals.

I think it was the contrast between my expectations and my experience that allowed the weekend to have such a strong emotional effect on me. I was looking for something big and evil and what I found was very subtle and friendly, so I didn’t recognize its power. I was also mistaken in believing that the socialization process (or the influence process) was intellectual. It’s not. It’s emotional, and thus touches a deeper and more central part of one’s brain. When I left at the end of the weekend, a friend who had been in the Moonies and worked for a while as a deprogrammer picked me up. One of the first things I said to him was, “I had a great time. Remind me again what’s so bad about the Moonies.”

The next day I was interviewing a former deprogrammer. About halfway through the interview I asked her to describe exactly what she did during the deprogramming. She looked me directly in the eye and said, “Exactly what I’ve been doing with you.” This shocked me, because I didn’t think I needed any deprogramming. I didn’t buy their doctrine. They didn’t brainwash me. But they did get to me. I had forgotten all of the organization’s abuses of church members: the long hours of fund-raising, sometimes in dangerous areas, late at night; the lack of proper nutrition; the suicide training; the fear and guilt; the relative poverty the members live in, while the leaders live in splendor; the munitions factory owned by a church which is supposedly striving for world peace; the divisions created between family members; the deception; all of the horrors. Part of me remembered them, because I remember asking questions about what exactly the church does to make the world better, knowing that most members spend their time selling flowers. But that knowledge didn’t seem important. The people seemed good, so by association, the group did too. I had been influenced. The emotional truth was so much stronger than the intellectual one that it was the only one that seemed important.

I have mixed feelings about the use of the term “brainwashing” with regard to cult indoctrination. Because of the general effectiveness of the techniques in influencing a person’s thoughts and actions, I can understand the persistence of its use. If someone like Patty Hearst is going to be defended on such a basis, it needs to be recognized as a powerful and legitimate technique (although degree of susceptibility will vary). However, if the goal is to keep people out of cults, I am afraid the contrast between the stereotypic notion of brainwashing (which I don’t think we can escape) and the experience a new recruit has is so sharp that people are disarmed and no longer aware of the techniques being used on them. Instead, I would advocate seeing the brainwashing process in the context of socialization. This is something with which we are all familiar and which carries few, if any, negative connotations. At the same time, it is something whose power we recognize. I would contend that the process of “brainwashing” can best be understood as an intensified socialization experience. I may be quibbling over semantics, but given the fact that the words in question are so loaded, I feel that semantics are important here. The Moonies take the raw material of our human needs – to be loved and to be accepted – and use the same techniques that for centuries cultures have used to shape individuals into members of the culture: peer pressure, reward and punishment, and the experience of being surrounded by individuals who all view the world in the same way.

My weekend with the Moonies was intended to answer some questions I had.  Instead, it raised many more.  The most solid thing I came away with, however, and my reason for writing this, is a new understanding of brainwashing.  If we are to avoid it, we must first learn to recognize it.


Geri-Ann Galanti, Ph.D., is a medical anthropologist, and lecturer at the UCLA School of Medicine. Dr. Galanti was formerly on the faculty of California State University’s Department of Anthropology and California State University’s School of Nursing, where she developed the curriculum for the BSN program’s Cultural Diversity in Healthcare course. Dr. Galanti is a consultant to Civility Mutual.

Geri-Ann Galanti

This article is an electronic version of an article originally published in
Cultic Studies Journal, 1984, Volume 1, Number 1, pages 27-36. 



Sun Myung Moon’s theology used to control members


Japanese woman recruited and sold by FFWPU to a Korean farmer

A 20-year-old woman, recruited by the Family Federation for World Peace and Unification / UC in Japan, was sold to an older Korean farmer in an “apology marriage”.


Una mujer japonesa fue reclutada por la Federación de Familias y luego vendida a un granjero coreano (article in Spanish)

[Translation: A 20-year-old woman, recruited in Japan by the Family Federation for World Peace and Unification / UC, was sold to a Korean farmer in an “apology marriage”.]


Allen Tate Wood on Sun Myung Moon and the Unification Church


16. VIDEO: Recovery from RTS (Religious Trauma Syndrome) by Marlene Winell


17. VIDEO: ICSA – After the cult


18. “How do you know I’m not the world’s worst con man or swindler?” Sun Myung Moon

ORDER NUMBER 77-01-02

REVEREND SUN MYUNG MOON SPEAKS ON
LET US MEET OPPORTUNITY WELL
January 2, 1977 
World Mission Center 
Translator – Bo Hi Pak

Let’s say God promised you something. Is it an empty promise or will it be delivered? Once God makes a promise, once God makes a decision, He always keeps that promise even if it takes thousands of years for it to be fulfilled. Time after time He has fulfilled His promises. We all need this God.

How about human promises? We promise each other quite a bit. Sometimes we even make promises knowing that we will never fulfill them. In other words, we lie. We have all lied; none of you has been perfect. In human affairs everybody lives like that. This is the honest situation.

Human lies are everywhere because lies are very convenient. Without lies, commercial people practically couldn’t continue in business. What about God? He does not lie, and does not hear lies because He knows they are lies. He sees through them. Often we listen to lies without knowing it, but not God. We are exposed as we really are before God because our lies cannot hide anything from Him. This means you cannot even trust me 100%. I have human weaknesses. That’s an honest and frank statement. However, I am introducing you to a person you can trust 100%: God. My mortal body will live only one generation here on earth, but God will remain here forever.

The same thing is true for all mankind. You may often think the same thing, “I wish there was no God. He bothers me too much. Oh, God go away somewhere. I want to do my own thing.” I’m sure some of you are thinking right now, “Oh, I wish Rev. Moon were not here. He bothers me too much. He pushes me too much!”

How do you know I’m not the world’s worst con man or swindler? Have you seen my heart? No one can see it. How can you trust me? You can only trust me by experiencing life together with me. Maybe your future experience will be entirely different. Regardless, are you ready to go?

I tell you one thing: You will never lose yourself; you will never be harmed by going this way. Suppose I were telling you lies, but you took my lies as the truth, as God’s words and lived them 100%. God knows very well what is true or false. In that case, God may condemn me, but God would never condemn you. God may not give me the blessing but He definitely would not withdraw His blessing from you. Actually this is a challenge. Even if I am telling you lies, if you take them seriously as the word of God, and you fulfill them, actually there is a chance that you could become the real Rev. Moon. You can’t lose. When you take things seriously and live the teaching, not for your own sake but for God’s sake, God will never abolish your deeds.

How can you take my word 100% seriously? You got up this morning for pledge service, didn’t you? Tell me very honestly, were you willing to do that this morning? Don’t tell me big lies! You did it because you had to! Even I didn’t want to get up at 4:30. Are you different? Why do I do it if I don’t want to? Because there’s someone upstairs watching me.

Nobody really wants to go out there selling peanuts and flowers. Nobody wants to go out there on the street, acting like a crazy man trying to grab people and witness. You act almost like servants to the people, trying to win their hearts, trying to talk to them. When you think of it, the amount of work you have to do to win one person’s heart is incredible. And you have to do it day in and day out. You go out fundraising every day, from early morning to late night. Are we really fond of doing it? We do it because we have to.

Respected people outside will say, “How crazy you are. Why did you become a slave of Rev. Moon? I have never seen such a fool.” Do you have the courage to overcome that kind of reaction?

Let me tell you one episode from my past. Many times in North Korea and one time [1955] in South Korea I was in jail. There was one ardent member following me around that time, but he became tired and left. Then he read in the newspaper that I was going to jail. At that time many members were trying to encourage me, saying, “Don’t worry, Father. You just wait; we shall do 1000 times more than you.” But this particular person came to the prison, curious to see how I looked. He happened to be in such a position that I met him face to face, and I will never, never forget that man’s statement. He said to my face, “You fool, are you still doing this?”

[The main investigation into Sun Myung Moon in 1955 was into his sex rites with many students from Ewha Womans University, and some married women.] LINK


19. Bibliography

Alstad, Diana and Kramer, Joel (1993) The Guru Papers. Berkeley, Calif.: North Atlantic Books  WEBSITE

Atack, Jon. (2016) Opening Minds: the secret world of manipulation, undue influence and brainwashing. Second Edition. Colchester, England: Open Minds Foundation Trentvalley Ltd. WEBSITE

Bale, Jeffrey M. The Unification Church and the KCIA – ‘Privatizing’ covert action: the case of the UC  Lobster, May 1991

Blake, Mariah (November 25, 2013) “The Fall of the House of Moon” New Republic pages 28-37
https://newrepublic.com/article/115512/unification-church-profile-fall-house-moon

Blake, Mariah (December 9, 2013) “Meet the Love Child Rev. Sun Myung Moon Desperately Tried to Hide” Mother Jones http://www.motherjones.com/politics/2013/12/reverend-moon-unification-church-washington-times-secret-son

Boettcher, Robert (with Freedman, Gordon L.) (1980) Gifts of Deceit, Sun Myung Moon and the Tongsun Park Korean Scandal. New York: Holt, Rinehart and Winston

Case, Thomas W. (1995) Moonie, Buddhist, Catholic: A Spiritual Odyssey. Cincinnati, OH, USA: White Horse Press

Cialdini, Robert (2009) Influence: Science and Practice (5th edition). Boston, MA, USA: Pearson Education, Inc.  VIDEO

Choe, Joong-Hyun (1993) The Korean War and messianic groups: Two cases in contrast (Unification Church and the Olive Tree Movement). PhD. thesis, Syracuse University, USA.

Choi, Syn-duk 崔信德 (1967) “Korea’s Tong-il Movement” in Transactions of the Royal Asiatic Society, Volume XLIII, pages 101-113.
http://www.raskb.com/transactions/VOL43/Vol043-6.docx

Chun, Young Bok (1976) The Korean Background of Unification Church: A New Religion. pp 14-18 in Japanese Religions Vol. 9 July 1976 No. 2. A magazine issued by the NCC Center for the Study of Japanese Religions Kyoto, Japan

Deikman, Arthur J., (2009) Them and Us: Cult Thinking and Terrorist Threat. Bay Tree Publishing

De Mente, Boyé Lafayette (2018) The Korean Mind. Understanding Contemporary Korean Culture. North Clarendon, Vermont, USA: Tuttle

Durham, Deanna (1981) Life Among the Moonies: three years in the Unification Church. Plainfield, New Jersey, USA: Logos International

Edwards, Christopher (1979) Crazy for God. Englewood Cliffs, New Jersey, USA: Prentice Hall Inc.

Elkins, Chris (1980) Heavenly Deception. Wheaton, Illinois, USA: Tyndale House Publishers, Inc.

Ford, Wendy (1990) Recovery from Abusive Groups, American Family Foundation

Freed, Josh (1980) Moonwebs, Journey into the Mind of a Cult. Canada: Dorset Publishing Inc. (Hardback), Virago (Paperback).

Goldberg, Lorna; Goldberg, William; Henry, Rosanne; Langone, Michael (2017) Cult Recovery – a clinician’s guide to working with former members and families. Bonita Springs, Florida, USA: ICSA

Gorenfeld, John (2008) Bad Moon Rising (how the Reverend Sun Myung Moon created the Washington Times, seduced the religious right, and built his Kingdom). Sausalito, CA, USA: PoliPoint Press

Guisso, Richard W.I. and Yu, Chai-shin (1988) Shamanism: The Spirit World of Korea. Berkeley, Calif.: Asian Humanities Press

Hassan, Steven (2015) Combating Cult Mind Control, Freedom of Mind Press

Herman, Judith (1992, 2015) Trauma and Recovery. The aftermath of violence – from domestic abuse to political terror. New York: Basic Books

Hoffer, Eric (1951) The True Believer, Thoughts on the Nature of Mass Movements. New York: Harper Perennial Modern Classics

Hong, Nansook (1998) In the Shadow of the Moons: My Life In The Reverend Sun Myung Moon’s Family. Boston, USA: Little, Brown and Company.

Horowitz, Irving Louis (1978) Science, Sin, and Scholarship: The Politics of Reverend Moon and the Unification Church. Cambridge, USA, and London: The MIT Press

Kim, Chong-sun (1978) Rev. Sun Myung Moon. Washington, D.C.: University Press of America (Rowman & Littlefield)

Lalich, Janja and Landau Tobias, Madeleine (2006) Take Back Your Life: Recovering from Cults and Abusive Relationships. Berkeley, Calif.: Bay Tree Publishing   WEBSITE

Langone, Michael – Editor (1993) Recovery from Cults: Help for Victims of Psychological and Spiritual Abuse. New York: W.W. Norton and Company

Lifton, Robert Jay (1961) Thought Reform and the Psychology of Totalism. Chapel Hill, NC, USA: University of North Carolina Press

Lofland, John (1971) Doomsday Cult: A Study of Conversion, Proselytization, and Maintenance of Faith. Enlarged edition. 362pp (first published by Prentice Hall) New York: Irvington Publishers, ISBN-10: 0-8290-0095-X

Mook, Jane Day (May 1974) “New Growth on Burnt-Over Ground” in A.D. pages 30-36

Naylor, R.T. (2004) Hot Money and the Politics of Debt, Third Edition (536pp) Montreal and Kingston, Canada: McGill-Queen’s University Press ISBN 0-7735-2743-5

Nevalainen, Kirsti L. (2011) Change of Blood Lineage through Ritual Sex in the Unification Church (162pp) ISBN 978-1439261538

Paden, William E. (1994 edition) Religious Worlds, the Comparative Study of Religion. Boston, MA: Beacon Press

Parke, Jo Anne and Stoner, Carol (1977) All God’s Children: The Cult Experience—Salvation or Slavery? Radnor, Pennsylvania: Chilton Book Company

Reiss, Steven (2015) The 16 Strivings for God. Macon, Georgia, USA: Mercer University Press

Rice, Berkeley (1976) “The pull of Sun Moon”

Shermer, Michael (2012) The Believing Brain: From Spiritual Faiths to Political Convictions – How We Construct Beliefs and Reinforce Them as Truths      VIDEO

Singer, Margaret Thaler with Lalich, Janja (Feb 15, 1995) Cults in Our Midst: The Hidden Menace in Our Everyday Lives. San Francisco: Jossey-Bass Social and Behavioral Science Series. Hardcover

Soh, C. Sarah (2008) The Comfort Women, Sexual Violence and Postcolonial Memory in Korea and Japan. Chicago and London: The University of Chicago Press

Stein, Alexandra (2017) Terror, Love and Brainwashing – Attachment in Cults and Totalitarian Systems. London and New York: Routledge
A primer on how cults and ideologically extremist groups work    WEBSITE

Tahk Myeong-hwan

Underwood, Barbara and Underwood, Betty (1979) Hostage to Heaven. New York: Clarkson N. Potter, Inc.

Walker, Pete (2013) Complex PTSD: From Surviving to Thriving: a guide and map for recovering from childhood trauma. Berkeley, Calif.: An Azure Coyote book (self-published)

Winell, Marlene (2007) Leaving the Fold. Berkeley, Calif.: Apocryphile Press  VIDEO

Yamamoto, J. Isamu (1977) The Puppet Master: An Inquiry into Sun Myung Moon and the Unification Church. Downers Grove, Illinois, USA: InterVarsity Press

Zablocki, Benjamin and Robbins, Thomas (Editors) (2001) Misunderstanding Cults: Searching for Objectivity in a Controversial Field. University of Toronto Press, Scholarly Publishing Division

Zieman, Bonnie (2017) Cracking the Cult Code for Therapists: What Every Cult Victim Wants Their Therapist to Know. CreateSpace Independent Publishing Platform