Cult Indoctrination – and the Road to Recovery

Updated April 23, 2021 – under construction


Sun Myung Moon: “split the person apart”

Significance Of The Training Session
Reverend Sun Myung Moon
Third Directors’ Conference
Master Speaks    May 17, 1973  Translated by Mrs. Won-bok Choi

“Good morning! Sit down!
I am going to speak about the significance of a training session like this. Master’s intention is to have the State Representatives, Commanders, and the Itinerary Workers pass the examination, getting at least 70 points. I will continue this until the last one of the responsible members has passed the examination.

For fallen men it is their duty to pass through three stages of judgment! Judgment of words, judgment of personality, and judgment of love or heart. All through history, mankind has been in search of the truth, true words. The truth is the standard by which all the problems of mankind can be solved. We know man somehow fell in the beginning, and to fall means to fall into the bondage of Satan. So, in order for us to return to the original position, we have to get rid of the bondage of Satan. For fallen people, there is no other message which is more hopeful and desirable than the message of restoration to the original position. To be restored is, in another sense, to be liberated from Satanic bondage – and this is the gospel of gospels for fallen men.

Then what is judgment? Judgment is the measurement of the standard on which all our acts are judged. If our acts cannot come in accordance with the original rule or measurement, we must be judged or punished.

Through 40 days you will have six cycles of Divine Principle lectures. If you study hard, after the sixth cycle of lectures – or in the course of them – you can imagine what will come next when the lecturer gives you a certain chapter. You can even analyze or criticize President Kim’s lecture. You may think, “The last time I came he gave a dynamic lecture, but he is tired this time; when I give the lecture I will never be tired,” etc. In your own way, you can organize your lecture. In order for you to be a dynamic lecturer, you must know the knack of holding and possessing the listeners’ hearts. If there appears a crack in the man’s personality, you wedge in a chisel, and split the person apart. For the first few lectures, you will just memorize. But after that, you will study the character of your audience, and adapt your lecture. If he is a scientist, you will approach him differently than a commercial man, artist, etc. The audience as a whole will have a nature, and you must be flexible.

At least two weeks – you must experience flower selling – two weeks to 30 days. Whether in two weeks or in one full month, until you raise 80 dollars a day; then you go to rallies, witnessing, and then if you cannot bring in three persons in one month’s time, you cannot go. That’s the formula you have to go through….”

http://www.tparents.org/moon-talks/sunmyungmoon73/SM730517.htm


CONTENTS

1. VIDEO: Why do people join cults? – Janja Lalich

2. PODCAST: The Cult Vault – Introduction to the Study of Cults

3. VIDEO: The BITE model of Steve Hassan / The Influence Continuum

4. Father Kent Burtner on manipulation of the emotions by the UC

5. VIDEO: Terror, Love and Brainwashing ft. Alexandra Stein

6. Robert Jay Lifton’s Eight Criteria of Thought Reform

7. VIDEO: The Wrong Way Home
An analysis of Dr Arthur J. Deikman’s book on cult behavior

8. Cult Indoctrination through Psychological Manipulation by Professor Kimiaki Nishida

9. Towards a Demystified and Disinterested Scientific Theory of Brainwashing by Benjamin Zablocki  [This item has been expanded and moved HERE]

10. Psyching Out the Cults’ Collective Mania by Louis Jolyon West and Richard Delgado

11. Book: Take Back Your Life by Janja Lalich and Madeleine Tobias (2009)

12. VIDEO: Paul Morantz on Cults, Thought Reform, Coercive Persuasion and Confession

13. PODCAST: Ford Greene, Attorney and former UC member, on Sun Myung Moon

14. VIDEO: Steve Hassan interviewed by Chris Shelton

15. VIDEO: Conformity by TheraminTrees

16. VIDEO: Instruction Manual for Life by TheraminTrees

17. The Social Organization of Recruitment in the Unification Church – PDF by David Frank Taylor, M.A., July 1978, Sociology

18. Mind Control: Psychological Reality Or Mindless Rhetoric? by Philip G. Zimbardo, Ph.D., President, American Psychological Association

19. Socialization techniques through which Moon church members were able to exert influence, by Geri-Ann Galanti, Ph.D.

20. VIDEO: Recovery from RTS (Religious Trauma Syndrome) by Marlene Winell

21. VIDEO: ICSA – After the cult

22. “How do you know I’m not the world’s worst con man or swindler?” – Sun Myung Moon

23. VIDEO: What Is A Cult? CuriosityStream

24. VIDEO: The Space Between Self-Esteem and Self Compassion: Kristin Neff

25. Bibliography


1. VIDEO: Why do people join cults? – Janja Lalich

2. PODCAST: The Cult Vault – Introduction to the Study of Cults

This episode is an introduction to myself, Kaycee, and this podcast. An excellent look at the definition of a cult, and what makes some of them harmful.

3. VIDEO: The B.I.T.E. model by Steven Hassan



4. Father Kent Burtner on manipulation of the emotions by the Unification Church / FFWPU of Sun Myung Moon and Hak Ja Han


Father Kent Burtner, right, discusses Rev. Sun Myung Moon and his teaching with two former Moon followers. Burtner, a Catholic priest in Eugene, Oregon, has helped “deprogram” many Moonies over decades.

By Dave Horsman – South Idaho Press Writer

There’s no universal formula for “deprogramming” a Moonie, according to Fr. Kent Burtner, a Catholic priest who first tangled with the Unification Church in 1969.

“The thing that’s crucial is that you consider who you’re talking to. You have to meet him where he’s at,” he said. In young Milton Esquibel’s case, “the process (of withdrawing from his Moonie experience) had already begun because he had so much time to spend with his family. Our time spent with him was to help reinforce the rehabilitation of his emotional faculties.”

Esquibel attended an all-day deprogramming session December 17 in Portland. It had been nearly a month since his parents abducted him from a Moonie group in Los Angeles and brought him home.

“Once they are out of the cult, their emotional life is restored to them rather dramatically,” Burtner said. “In cases where we begin deprogramming soon after the person is brought home, that restoration happens very, very suddenly.”

Burtner currently works at the Newman Center on the University of Oregon campus at Eugene. He previously was in the campus ministry at St. Mary’s College in Moraga, California, where he earned his reputation as an expert on the Unification Church.

His introduction to the sect was in 1969 during his seminary training. “A daughter of our secretary got involved and I was asked to meet with her. She invited me to come to a lecture series in Berkeley. I could see then that the process by which the people got involved was much more significant than the doctrine.”

The American movement was in its infancy then. “It wasn’t until 1972 that they began the series of weekend seminars and 21-day workshops” that Esquibel attended.

Burtner has since helped dozens of young people shed the “emotional amnesia” produced by Unification indoctrination. He prefers “not to work in any extra-legal way — I don’t believe in kidnapping people.”

Gus and Gladys Esquibel, however, were “acting under the law” when they retrieved their 15-year-old son.

The “marvelous thing about deprogramming is that Moonies are led to believe it includes a variety of tortures,” he said, “when in fact we just sit down and talk. We offer a supportive environment, usually with the family and close friends present.”

Deprogramming “can be done a lot of different ways,” he added. But it usually “gives an individual a chance to look at the aspects of his life in the cult that he wasn’t able to examine when he was in it. We give the person an opportunity to see new information that the cult itself would not have divulged. And it undoes two things that have been done to the individual. The first thing is that their critical faculties (their ability to observe and assess) have been put into a state of ‘suspended animation.’ The second thing is that they are made to feel guilty for having emotions.”

Burtner explained the latter condition: “Their feelings are things that appear to them as being evil or in some way the cause of their fallen condition. That gives the cult the power to control the person. It results in a repression of the emotional life and an inappropriate sense of responsibility for the state of the world. Once they have gotten into that mind set, everything outside the group appears to be evil and Satanic and everything on the inside is where God is. So their parents and their old friends are perceived to be agents of Satan.”

“We’re talking about a systematic program that denies a person his individual freedom without his being aware of it,” Burtner said. “I don’t believe there’s anything in the (Unification Church) that essentially relates to religious commitment. It uses very, very high pressure techniques of coercion and basically places the person in a conditioned neurotic state. When you get the person out of that environment and help them see what was going on, their attachment vanishes.”

Even the diet of Moonies is part of their conditioning, according to Burtner. They get very little protein and “without at least 70 grams of protein per day the cerebral cortex is unable to function adequately. They can’t reason normally.”

People who have been Moonies only a short time are often harder to deprogram, according to Burtner. “They have a real idealism. They haven’t realized they are going to spend all of their time either fundraising or recruiting new members.”

Esquibel, although he was with the Moonies only about a month, responded well in the December 17 session.

“A young woman and myself chatted with him over coffee and tea to start with. We got him to talk about his experience and we shared some of our own experiences and helped him to understand some of what he had encountered. Later on he started asking the questions. He had to have time to kind of reestablish his emotional contacts and start living his real existence again.”

Each person is different, Burtner said. “Sometimes he is quite belligerent and you have to state your case quite boldly in the beginning.”

He offers the following advice to young people who may be drawn to the Moonies: “First, learn to accept and deal with your emotional life. Second, remember you have the right to ask a lot of questions when someone wants you to get involved.”

He suggests that parents “maintain open, honest lines of communication. If there are difficulties in the family, deal with them in a straightforward way.”

While Burtner despises the Unification movement, he respects its power. He encourages people to examine the church first hand, but asks them to leave a written statement with the police allowing them “to come and get you after a week.” Granting another person power of attorney to assure your return is another possibility, he added.

“I had a case like that in Eugene, involving a young woman who wanted to visit a Moonie camp. She was against the (church) after seeing what one of her friends who had been involved had gone through” and wanted to expose its practices.

“She gave me power of attorney and I had to use it after she had been there one weekend. I called (the church) and threatened to turn the story over to the Associated Press if she didn’t come home to talk to me. She returned and went through a day of deprogramming.”



5. VIDEO: Terror, Love and Brainwashing ft. Alexandra Stein

Sensibly Speaking Podcast with Chris Shelton #154

This week I have Dr. Alexandra Stein, social psychologist and author of the book Terror, Love and Brainwashing. We discuss various aspects of cult behavior and psychology.

Comments:

1. This is an outstanding piece of work that I will go back to several times. It has provided me with a degree of clarity hitherto unknown. Now, what do I do with it?

2. Some brilliant information, again, thank you. I found the part about separating from one’s emotions and not allowing yourself to process them particularly interesting. Having been brought up in a [cult] family, I still struggle with dealing with (“low tone”) emotions.



6. Robert Jay Lifton’s Eight Criteria of Thought Reform

“I wish to suggest a set of criteria against which any environment may be judged — a basis for answering the ever-recurring question: ‘Isn’t this just like “brainwashing”?’”
– Robert Jay Lifton

“Ideological Totalism” is Chapter 22 of Robert Jay Lifton’s book, Thought Reform and the Psychology of Totalism: A Study of “Brainwashing” in China.

Dr. Lifton, a psychiatrist and author, has studied the psychology of extremism for decades. He is renowned for his studies of the psychological causes and effects of war and political violence and for his theory of thought reform. Lifton testified at the 1976 bank robbery trial of Patty Hearst about the theory of “coercive persuasion.”

His theories — including the often-referred to 8 criteria described below — are used and expanded upon by many cult experts.

First published in 1961, his book was reprinted in 1989 by the University of North Carolina Press. From Chapter 22:

8 CRITERIA AGAINST WHICH ANY ENVIRONMENT MAY BE JUDGED:

  • Milieu Control – The control of information and communication.
  • Mystical Manipulation – The manipulation of experiences that appear spontaneous but in fact were planned and orchestrated.
  • The Demand for Purity – The world is viewed as black and white and the members are constantly exhorted to conform to the ideology of the group and strive for perfection.
  • The Cult of Confession – Sins, as defined by the group, are to be confessed either to a personal monitor or publicly to the group.
  • The Sacred Science – The group’s doctrine or ideology is considered to be the ultimate Truth, beyond all questioning or dispute.
  • Loading the Language – The group interprets or uses words and phrases in new ways so that often the outside world does not understand.
  • Doctrine over person – The member’s personal experiences are subordinated to the sacred science and any contrary experiences must be denied or reinterpreted to fit the ideology of the group.
  • The Dispensing of Existence – The group has the prerogative to decide who has the right to exist and who does not.

 

Eight Conditions of Thought Reform
as presented in 
Thought Reform and the Psychology of Totalism, Chapter 22.

1. Milieu Control
The most basic feature of the thought reform environment, the psychological current upon which all else depends, is the control of human communication. Through this milieu control the totalist environment seeks to establish domain over not only the individual’s communication with the outside (all that he sees and hears, reads and writes, experiences, and expresses), but also — in its penetration of his inner life — over what we may speak of as his communication with himself. It creates an atmosphere uncomfortably reminiscent of George Orwell’s 1984…. (Page 420.)
Purposeful limitation of all forms of communication with outside world.

The control of human communication through environment control.

The cult doesn’t just control communication between people, it also controls people’s communication with themselves, in their own minds.

2. Mystical Manipulation
The inevitable next step after milieu control is extensive personal manipulation. This manipulation assumes a no-holds-barred character, and uses every possible device at the milieu’s command, no matter how bizarre or painful. Initiated from above, it seeks to provoke specific patterns of behavior and emotion in such a way that these will appear to have arisen spontaneously from within the environment. This element of planned spontaneity, directed as it is by an ostensibly omniscient group, must assume, for the manipulated, a near-mystical quality. (Page 422.)
Potential convert is convinced of the higher purpose within the special group.

Everyone is manipulating everyone, under the belief that it advances the “ultimate purpose.”

Experiences are engineered to appear to be spontaneous, when, in fact, they are contrived to have a deliberate effect.

People mistakenly attribute their experiences to spiritual causes when, in fact, they are concocted by human beings.

3. The Demand for Purity
The experiential world is sharply divided into the pure and the impure, into the absolutely good and the absolutely evil. The good and the pure are of course those ideas, feelings, and actions which are consistent with the totalist ideology and policy; anything else is apt to be relegated to the bad and the impure. Nothing human is immune from the flood of stern moral judgements. (Page 423.)
The philosophical assumption underlying this demand is that absolute purity is attainable, and that anything done to anyone in the name of this purity is ultimately moral.

The cult demands self-sanctification through purity.

Only by pushing toward perfection, as the group views goodness, will the recruit be able to contribute.

The demand for purity creates a guilty milieu and a shaming milieu by holding up standards of perfection that no human being can attain.

People are punished and learn to punish themselves for not living up to the group’s ideals.

4. The Cult of Confession
Closely related to the demand for absolute purity is an obsession with personal confession. Confession is carried beyond its ordinary religious, legal, and therapeutic expressions to the point of becoming a cult in itself. (Page 425.)
Public confessional periods are used to get members to verbalize and discuss their innermost fears and anxieties as well as past imperfections.

The environment demands that personal boundaries are destroyed and that every thought, feeling, or action that does not conform with the group’s rules be confessed.

Members have little or no privacy, physically or mentally.

5. Aura of Sacred Science
The totalist milieu maintains an aura of sacredness around its basic dogma, holding it out as an ultimate moral vision for the ordering of human existence. This sacredness is evident in the prohibition (whether or not explicit) against the questioning of basic assumptions, and in the reverence which is demanded for the originators of the Word, the present bearers of the Word, and the Word itself. While thus transcending ordinary concerns of logic, however, the milieu at the same time makes an exaggerated claim of airtight logic, of absolute “scientific” precision. Thus the ultimate moral vision becomes an ultimate science; and the man who dares to criticize it, or to harbor even unspoken alternative ideas, becomes not only immoral and irreverent, but also “unscientific”. In this way, the philosopher kings of modern ideological totalism reinforce their authority by claiming to share in the rich and respected heritage of natural science. (Pages 427-428.)
The cult advances the idea that the cult’s laws, rules and regulations are absolute and, therefore, to be followed automatically.

The group’s belief is that their dogma is absolutely scientific and morally true.

No alternative viewpoint is allowed.

No questioning of the dogma is permitted.

6. Loading the Language
The language of the totalist environment is characterized by the thought-terminating cliché. [Slogans] The most far-reaching and complex of human problems are compressed into brief, highly reductive, definitive-sounding phrases, easily memorized and easily expressed.
The cult invents a new vocabulary, giving well-known words special new meanings, making them into trite clichés. The clichés become “ultimate terms”, either “god terms”, representative of ultimate good, or “devil terms”, representative of ultimate evil. Totalist language, then, is repetitiously centered on all-encompassing jargon, prematurely abstract, highly categorical, relentlessly judging, and to anyone but its most devoted advocate, deadly dull: the language of non-thought. (Page 429.)

Controlling words helps to control people’s thoughts.

The group uses black-or-white thinking and thought-terminating clichés.

The special words constrict rather than expand human understanding.

Non-members simply cannot comprehend what cult members are talking about.

7. Doctrine over Person
Another characteristic feature of ideological totalism: the subordination of human experience to the claims of doctrine. (Page 430.)
Past experience and values are invalid if they conflict with the new cult morality.

The value of individuals is insignificant when compared to the value of the group.

Past historical events are retrospectively altered, wholly rewritten, or ignored to make them consistent with doctrinal logic.

No matter what a person experiences, it is belief in the dogma which is important.

Group belief supersedes individual conscience and integrity.

8. Dispensed Existence
The totalist environment draws a sharp line between those whose right to existence can be recognized, and those who possess no such right.
Lifton gave a Communist example:
In thought reform, as in Chinese Communist practice generally, the world is divided into “the people” (defined as “the working class, the peasant class, the petite bourgeoisie, and the national bourgeoisie”), and “the reactionaries” or “the lackeys of imperialism” (defined as “the landlord class, the bureaucratic capitalist class, and the KMT reactionaries and their henchmen”). (Page 433.)

The group decides who has a right to exist and who does not.

The group has an elitist world view — a sharp line is drawn by cult between those who have been saved, chosen, etc., (the cult members) and those who are lost, in the dark, etc., (the rest of the world).

Former members are seen as “weak,” “lost,” “evil,” and “the enemy”.

The cult insists that there is no legitimate alternative to membership in the cult.


The full text of Chapter 22 appears HERE courtesy of Dr. Robert Jay Lifton.



7. VIDEO: The Wrong Way Home

An analysis of Dr Arthur J. Deikman’s book on cult behavior



8. Cult Indoctrination through Psychological Manipulation

by Professor Kimiaki Nishida 西田 公昭 of Rissho University in Tokyo.

(There is an explanation of this diagram below.)

This is a shortened version of an article entitled Development of the Study of Mind Control in Japan, first published in 2005.

Recently, psychologists in Japan have been examining a contemporary social issue — certain social groups recruit new members by means of psychologically manipulative techniques called “mind control.” They then exhort their members to engage in various antisocial behaviors, from deceptive sales solicitation and forcible donation to suicide and murder [e.g. Tokyo sarin gas attack by Aum Shinrikyō in 1995]. We classify such harmful groups as “cults” or even “destructive cults.” Psychologists concerned with this problem must explain why ordinary, even highly educated people devote their lives to such groups, fully aware that many of their activities deviate from social norms, violate the law, and may injure their health. Psychologists are now also involved in the issue of facilitating the recovery of distressed cult members after they leave such groups.

Background
In the 1970s, hardly anyone in Japan was familiar with the term “destructive cult.” Even if they had been informed of cult activities, such as the 1978 Jonestown tragedy, in which 912 members of the Guyana-based American cult were murdered or committed suicide, most Japanese people would have thought the incident a sensational, curious, and inexplicable event. Because the events at Jonestown occurred overseas, Japanese people, except possibly those worried parents whose child had joined a radical cult, would not have shown any real interest.

In the 1980s, a number of Japanese, including journalists and lawyers, became concerned about the “unethical” activities of the Unification Church, whose members worshiped their so-called True Father, the cult’s Korean founder Sun Myung Moon, who proclaimed himself to be the Second Advent of Christ. One of the group’s activities entailed shady fund-raising campaigns. Another unethical activity of the cult in the 1980s was Reikan-Shôhô, a swindle in which they sold spiritual goods, such as lucky seals, Buddhist rosaries, lucky-tower [pagoda] ornaments, and so on. The goods were unreasonably expensive, but the intimidated customers bought them to avoid possible future misfortune [or to liberate their deceased loved ones from the ‘hell’ they were told they were suffering in].

The first Japanese “anti-cult” organization was established in 1987 to stop the activities of the Unification Church. The organization consisted of lawyers who helped Reikan-Shôhô victims all over Japan (see Yamaguchi 2001). According to their investigation, the lawyers’ organization determined that the Unification Church in Japan engaged in three unethical practices. First, large amounts of money were collected through deceptive means. Under duress, customers desperate to improve their fortunes bankrupted themselves through buying the cult’s “spiritual” goods. Second, members participated in mass marriages arranged by the cult without the partners getting to know each other, after the partners were told by the cult leader that their marriage would save their families and ancestors from calamity. Third, the church practiced mind control, restricting members’ individual freedom, and employing them in forced labor, which often involved illegal activity. Mind-controlled members were convinced their endeavors would liberate their fellow beings.

The 1990s saw studies by a few Japanese psychological researchers who were interested in the cult problem. By the mid-1990s, Japanese courts had already acknowledged two Unification Church liabilities during proceedings the lawyers had brought against the cult; namely, mass marriage and illegal Reikan-shôhô. (see Judgment by the Fukuoka [Japan] District Court on the Unification Church 1995). The lawyers’ main objective, however, had been that the court confirm the Unification Church’s psychological manipulation of cultists, a ruling that would recognize these members as being under the duress of forced labor.

What Is Mind Control?
Early in the study of mind control, the term was equated with the military strategy of brainwashing. Mind control initially was referred to in the United States as “thought reform” or “coercive persuasion” (Lifton 1961; Schein, Schneier, and Barker 1961). Currently, however, mind control is considered to be a more sophisticated method of psychological manipulation that relies on subtler means than physical detention and torture (Hassan 1988).

In fact, people who have succumbed to cult-based mind control consider themselves to have made their decision to join a cult of their own free will. We presume that brainwashing is a behavioral-compliance technique, whereas individuals subjected to mind control come to accept fundamental changes to their belief system. Cult mind control may be defined as temporary or permanent psychological manipulation by people who recruit and indoctrinate cult members, influencing their behavior and mental processes in compliance with the cult leadership’s desires, while the members remain naive of this control (Nishida 1995a).

After the Aum attacks, Ando, Tsuchida, Imai, Shiomura, Murata, Watanabe, Nishida, and Genjida (1998) surveyed almost 9,000 Japanese college students. The questionnaire was designed to determine: whether the students had been approached by cults and, if so, how they had reacted; their perception of alleged cult mind-control techniques; and how their psychological needs determined their reactions when the cults had attempted to recruit them.

Ando’s survey results showed that about 20% of respondents’ impressions of the recruiter were somewhat favorable, in comparison with their impressions of salespersons. However, their compliance level was rather low. The regression analysis showed that the students tended to comply with the recruiter’s overture when:

• they were interested in what the agent told them;
• they were not in a hurry;
• they had no reason to refuse;
• they liked the agent; or
• they were told that they had been specially selected, could gain knowledge of the truth, and could acquire special new abilities.

When asked to evaluate people who were influenced or “mind controlled” by a cult, respondents tended to think it was “inevitable” those people succumbed, and they put less emphasis on members’ individual social responsibility. When mind control led to a criminal act, however, they tended to attribute responsibility to the individual. More than 70% of respondents answered in the affirmative when asked whether they themselves could resist being subjected to mind control, a result that confirms the students’ naiveté about their own personal vulnerability. The respondents’ needs or values had little effect on their reactions to, interest in, and impressions of cult agents’ attempts to recruit them.

Mind Control as Psychological Manipulation of Cult Membership
Nishida (1994, 1995b) investigated the process of belief-system change caused by mind control as practiced by a religious cult. His empirical study evaluated a questionnaire administered to 272 former group members, content analysis of the dogma in the group’s publications, videotapes of lectures on dogma, the recruiting and seminar manuals, and supplementary interviews with former members of the group.

Cult Indoctrination Process by Means of Psychological Manipulation
In one of his studies, Nishida (1994) found that recruiters offer the targets a new belief system, based on five schemas. These schemas comprise:

1. notions of self concerning one’s life purpose (Self Beliefs);

2. ideals governing the type of individual, society, and world there ought to be (Ideal Beliefs);

3. goals related to correct action on the part of individuals (Goal Beliefs);

4. notions of causality, or which laws of nature operate in the world’s history (Causality Beliefs); and 

5. trust that authority will decree the criteria for right and wrong, good and evil (Authority Beliefs).

Content analysis of the group’s dogma showed that its recruitment process restructures the target’s belief-system, replacing former values with new ones advocated by the group, based on the above schemas.

Abelson (1986) argues that beliefs are metaphorically similar to possessions. He posits that we collect whatever beliefs appeal to us, as if working in a room where we arrange our favorite furniture and objects. He proposes that we transform our beliefs into a new cognitive system of neural connections, which may be regarded as the tools for decision making.

Just as favorite tools are often placed in the central part of a room, or in a harmonious place, it appears that highly valued beliefs are located for easy access in cognitive processing. Meanwhile, much as worn-out tools are often hidden from sight in corners or storerooms, less-valued beliefs are relocated where they cannot be easily accessed for cognitive processing. Individual changes in belief are illustrated with the replacement of a piece of the furniture while a complete belief-system change is represented as exchanging all of one’s furniture and goods, and even the design and color of our room. The belief-system change, such as occurs during the recruitment and indoctrination process, is metaphorically represented in Figure 1 (below), starting with a functional room with its hierarchy of furniture or tools, and progressing through the stages of recruitment and indoctrination to the point at which the functional room has been replaced by a new set of furniture and tools that represent the altered belief system.

Step 0. The Figure shows the five schemas as a set of the thought tools that potential recruits hold prior to their contact with the group.

Step 1. Governed by their trust in authority, targets undergoing indoctrination remain naive about the actual group name, its true purpose, and the dogma that is meant to radically transform the belief system they have held until their contact with the group. At this stage of psychological manipulation, because most Japanese are likely to guard against religious solicitation, the recruiter puts on a good face. The recruiter approaches the targets with an especially warm greeting and assesses their vulnerabilities in order to confound them.

Step 2. As the new ideals and goals become quite appealing to targets, their confidence in the new notions of causality also rises; some residual beliefs may remain at this stage.

The targets must be indoctrinated in isolation so that they remain unaware that the dogma they are absorbing is a part of cult recruitment. Thus isolated, they cannot sustain their own residual beliefs through observing the other targets; the indoctrination environment tolerates no social reality (Festinger 1954). The goal for this stage is for the targets to learn the dogma by heart and embrace it as their new belief, even if it might seem strange or incomprehensible.

Step 3. At this stage, the recruiter’s repeated lobbying for the new belief system entices the targets to “relocate” those newly absorbed beliefs that appeal to them into the central area in their “rooms.” By evoking the others’ commitment, the recruiter uses group pressure to constrain each target. This approach seems to induce both a collective lack of common sense (Allport 1924) and individual cognitive dissonance (Festinger 1957).

Step 4. As the new recruits pass through a period of concentrated study, the earlier conversion of particular values extends to their entire belief system. By the end, they have wholly embraced the new belief system. The attractive new beliefs are gradually “relocated” from the “room’s” periphery into its center, replacing older beliefs. Formerly held beliefs are driven to the room’s periphery, thoroughly diminished; new, now-central beliefs coalesce, blending with the few remaining older notions.

Shunning their former society, the targets begin to spend most of their time among group members. Their new social reality raises the targets’ conviction that the new beliefs are proper. At this time, the targets feel contentedly at home because the recruiters are still quite hospitable.

Step 5. The old belief system has become as useless as dilapidated furniture or tools. With its replacement, the transformation of the new recruits’ belief systems results in fully configured new beliefs, with trust in authority at their core, making that authority an effective vehicle for thought manipulation.

At the final stage of psychological manipulation, during the recruitment and indoctrination process, the recruiters invoke the charismatic leader of the group ▲, equating the mortal with god. The recruiters instill a profound fear in the targets, fear that misfortune and calamity will beset them should they leave the cult.

Figure 1. Metamorphosis of the belief system change by cultic psychological manipulation.

Each ellipse represents the working space for decision making. The shapes colored black in the ellipse represent the newly input beliefs. The large shapes are developed beliefs, and the shapes in the middle represent beliefs that are highly valued by the individual. ▲ represents the authority of the charismatic leader of the group.


Cult Maintenance and Expansion through Psychological Manipulation

Nishida (1995b) studied one cult’s method of maintaining and expanding its membership by means of psychological manipulation, or cult mind control. The results of factor analysis of his survey data revealed that cult mind-control techniques induced six situational factors that enhanced and maintained members’ belief-systems: (1) restriction of freedom, (2) repression of sexual passion, (3) physical exhaustion, (4) punishment for external association, (5) reward and punishment, and (6) time pressure.

Studies also concluded that four types of complex psychological factors influence, enhance, and maintain members’ belief systems: (1) behavior manipulation, (2) information-processing manipulation, (3) group-processing manipulation, and (4) physiological-stress manipulation.

Behavior Manipulation
Behavior manipulation includes the following factors:

1  Conditioning. The target members were conditioned to experience deep anxiety if they behaved against cult doctrine. During conditioning, they would often be given small rewards when they accomplished a given task, but strong physical and mental punishment would be administered whenever they failed at a task.

2  Self-perception. A member’s attitude to the group would become fixed when the member was given a role to play in the group (Bem 1972; Zimbardo 1975).

3  Cognitive dissonance. Conditions are quite rigorous because members have to work strenuously and are allowed neither personal time nor money, nor to associate with “outsiders.” It seems that they often experienced strong cognitive dissonance (Festinger 1957).

Information-Processing Manipulation
Information-processing manipulation factors include the following:

1  Gain-loss effect. Many members had negative attitudes toward cults prior to contact with their group; their swings between negative and positive attitudes toward the cult became fixed as more positive than negative (Aronson and Linder 1965).

2  Systemization of belief-system. In general, belief has a tenacious effect, even when experience shows it to be erroneous (Ross, Lepper, and Hubbard 1975). Members always associate each experience with group dogma; they are indoctrinated to interpret every life event in terms of the cult’s belief-system. 

3  Priming effect. This is the cognitive phenomenon in which frequently rehearsed messages guide information processing in a specific direction (Srull and Wyer 1980). The members listen to the same lectures and music frequently and repeatedly, and they pray or chant many times every day.

4  Threatening messages. Members are inculcated with strong fears of personal calamity: illnesses such as cancer, accidents, the influence of evil spirits, being “restored after satan,” and so on.

Group-Processing Manipulation
Group-processing manipulation components include:

1  Selective exposure to information. Members avoid negative reports, but search for positive feedback once they make a commitment to the group (Festinger 1957). It should also be added that many group members continue to live in the locale in which they exited their society. Even so, new members are forbidden to have contact with out-of-group people, or access to external media.

2  Social identity. Members identify themselves with the group because the main goal or purpose of their activity is to gain personal prestige within the group (Turner, Hogg, Oakes, Reicher, and Wetherell 1987). Therefore, they look upon fellow members as elite, acting for the salvation of all people. Conversely, they look on external critics as either wicked persecutors or pitiful, ignorant fools. This “groupthink” makes it possible for the manipulators to provoke reckless group behavior among the members (Janis 1971; Wexler 1995).

Physiological-Stress Manipulation
It has been established that physiological stress factors facilitate this constraint within the group. Examples include:

1  urgent individual need to achieve group goals,
2  fear of sanction and punishment,
3  monotonous group life,
4  sublimation of sexual drive in fatiguing hard work,
5  sleep deprivation,
6  poor nutrition,
7  extended prayer and/or [study sessions].


Post-Cult Residual Psychological Distress
Over the past few decades, a considerable number of studies have been completed on the psychological problems former cult members experience after leaving the cult, as distinct from studies of the mind-control process itself.

It is important to note that most former members continue to experience discontent, although its cause remains controversial (Aronoff, Lynn, and Malinoski 2000). A few studies on cult phenomena have been conducted so far in Japan, notably by Nishida (1995a, 1998), and by Nishida and Kuroda (2003, 2004), who investigated ex-cultists’ post-exit problems, based mainly on questionnaires administered to former members of two different cults.

In a series of studies, Nishida and Kuroda (2003) surveyed 157 former members of the Unification Church and Aum Shinrikyō. Using factor analysis, the studies posited eleven factors that contribute to ex-members’ psychological problems. These factors can be classified into three main groups: (1) emotional distress, (2) mental distress, and (3) interpersonal distress. The eleven factors are (1) tendencies to depression and anxiety, (2) loss of self-esteem, (3) remorse and regret, (4) difficulty in maintaining social relations and friendships, (5) difficulty in family relationships, (6) floating or flashback to cultic thinking and feeling, (7) fear of sexual contact, (8) emotional instability, (9) hypochondria, (10) secrecy of cult life, and (11) anger toward the cult. These findings seem to have a high correlation with previous American studies.

Moreover, Nishida and Kuroda (2004) deduced from their analysis of variance of the 157 former members surveyed that depression and anxiety, hypochondria, and secrecy of cult involvement decreased progressively, with the help of counseling, after members left the cult. However, loss of self-esteem and anger toward the cult increased as a result of counseling.

Furthermore, Nishida (1998) found clear gender differences in the post-exit recovery process. Although female ex-cultists’ distress levels were higher than those of the males immediately after they left the cults, the women experienced full recovery more quickly than the men. The study also found that counseling by non-professionals works effectively with certain types of distress, such as anxiety and helplessness, but not for others, such as regret and self-reproof.


Conclusion
It can be concluded from Japanese studies on destructive cults that the psychological manipulation known as cult mind control is different from brainwashing or coercive persuasion. Based on my empirical studies, conducted from a social psychology point of view, I concluded that many sets of social influence are systematically applied to new recruits during the indoctrination process, influences that facilitate ongoing control of cult members. My findings agree with certain American studies, such as those conducted by Zimbardo and Anderson (1993), Singer and Lalich (1995), and Hassan (1988, 2000). The manipulation is powerful enough to make a vulnerable recruit believe that the only proper action is to obey the organization’s leaders, in order to secure humanity’s salvation, even though the requisite deed may breach social norms. Furthermore, it should be pointed out that dedicated cult veterans are subject to profound distress over the extended period of their cult involvement.


This chapter is a reprint of an article originally published in Cultic Studies Review, 2005, Volume 4, Number 3, pages 215-232.


Kimiaki Nishida, Ph.D., a social psychologist in Japan, is Associate Professor at the Rissho University 立正大学 in Tokyo and a Director of the Japan Cult Recovery Council. He is a leading Japanese cultic studies scholar and the editor of Japanese Journal of Social Psychology. His studies on psychological manipulation by cults were awarded prizes by several academic societies in Japan. He has also been called upon by courts to explain “cult mind control.”


統一協会の伝道とマインド·コントロール [Proselytizing and Mind Control by the Unification Church]



9. Towards a Demystified and Disinterested Scientific Theory of Brainwashing

by Benjamin Zablocki

from Misunderstanding Cults: Searching for Objectivity in a Controversial Field
Edited by Benjamin Zablocki and Thomas Robbins

[This item has been expanded and moved HERE ]

Nobody likes to lose a customer, but religions get more touchy than most when faced with the risk of losing devotees they have come to define as their own. Historically, many religions have gone to great lengths to prevent apostasy, believing virtually any means justified to prevent wavering parishioners from defecting and thus losing hope of eternal salvation.  …



10. Psyching Out the Cults’ Collective Mania

by Louis Jolyon West and Richard Delgado

Los Angeles Times  November 26, 1978

Louis Jolyon West is director of UCLA’s Neuropsychiatric Institute. Richard Delgado is a visiting professor of law at UCLA.

Just a week ago yesterday, the ambush of Rep. Leo J. Ryan and three newsmen at a jungle airstrip set off a terrible sequence of events that left many hundreds of people dead in the steamy rain forests of Guyana. The horrible social mechanism that ground into motion in the Peoples Temple camp that day seems inexplicable to many and has focused attention on the murky world of cults, both religious and nonreligious.

Historically, periods of unusual turbulence are often accompanied by the emergence of cults. Following the fall of Rome, the French Revolution and again during the Industrial Revolution, numerous cults appeared in Europe. The westward movement in America swept a myriad of religious cults toward California. In the years following the Gold Rush, at least 50 utopian cults were established here. Most were religious and lasted, on the average, about 20 years; the secular variety usually endured only half that long.

The present disturbances in American culture first welled up during the 1960s, with the expansion of an unpopular war in Southeast Asia, massive upheavals over civil rights and a profound crisis in values in response to unprecedented affluence, on the one hand, and potential thermonuclear holocaust, on the other. Our youth were caught up in three rebellions: red (the New Left), against political and economic monopolies; black, against racial injustice, and green (the counterculture), against materialism in all its manifestations, including individual and institutional struggles for power.

Drug abuse and violent predators took an awful toll among the counterculture’s hippies in the late 1960s. Many fled to form colonies, now generally called communes. Others turned to the apparent security of paternalistic religious and secular cults, which have been multiplying at an astonishing rate ever since.

Those communes that have endured—perhaps two or three thousand in North America—can generally be differentiated from cults in three respects:

— Cults are established by strong charismatic leaders of power hierarchies controlling resources, while communes tend to minimize organizational structure and to deflate or expel power seekers.

— Cults possess some revealed “word” in the form of a book, manifesto or doctrine, whereas communes vaguely invoke general commitments to peace, libertarian freedoms and distaste for the parent culture’s establishments.

— Cults create fortified boundaries confining their members in various ways and attacking those who would leave as defectors, deserters or traitors; they recruit new members with ruthless energy and raise enormous sums of money, and they tend to view the outside world with increasing hostility and distrust as the organization ossifies. In contrast, communes are like nodes in the far-flung network of the counterculture. Their boundaries are permeable membranes through which people come and go relatively unimpeded, either to continue their pilgrimages or to return to a society regarded by the communards with feelings ranging from indifference to amusement to pity. Most communes thus defined seem to pose relatively little threat to society. Many cults, on the other hand, are increasingly perceived as dangerous both to their own members and to others.

Recent estimates place more than 2 million Americans, mostly aged 18 to 25, in some way affiliated with cults and, by using the broadest of definitions, there may be as many as 2,500 cults in America today. If the total seems large, consider that L. Ron Hubbard’s rapidly expanding Church of Scientology claimed 5.5 million members worldwide in 1972; the Unification Church of Rev. Sun Myung Moon boasts of 30,000 members in the United States alone.

These enterprises may seem rich, respectable and secure compared to the Reverend Jim Jones’ tragic Peoples Temple, with its membership of only 2,000 to 3,000. However, the Church of Scientology, the Unification Church and other organizations such as Chuck Dederich’s Synanon, have all been under recent investigation by government agencies. Other large religious cults, such as the Divine Light Mission, the International Society for Krishna Consciousness and the Children of God are being carefully scrutinized by the public. For the public is alarmed by what it knows of some cults’ methods of recruitment, exploitation of members, restrictions on members’ freedom, retaliation against defecting members, struggles with members’ families engaged in rescue operations (including so-called “deprogramming”), dubious fiscal practices and the like. Lately, death threats against investigative reporters, leaked internal memoranda justifying violence, the discovery of weapons caches, such incidents as the rattlesnake attack against the Los Angeles attorney Paul Morantz last month, the violent outburst of the Hanafi Muslims in Washington, D.C., last year and now the gruesome events in Guyana have served to increase the public’s concern.

[The 1977 Hanafi Siege occurred on March 9-11, 1977 when three buildings in Washington, D.C. were seized by 12 Hanafi Muslim gunmen. The gunmen were led by Hamaas Abdul Khaalis, who wanted to bring attention to the murder of his family in 1973. They took 149 hostages and killed radio journalist Maurice Williams. Police officer Mack Cantrell also died.]

Some cults (for instance, Synanon) are relatively passive about recruitment (albeit harsh when it comes to defections). Others, such as the Unification Church, are tireless recruiters. Many employ techniques that in some respects resemble those used in the forceful political indoctrination prescribed by Mao Tse-tung during the communist revolution and its aftermath in China. These techniques, described by the Chinese as “thought reform” or “ideological remolding,” were labeled “brainwashing” in 1950 by the American journalist Edward Hunter. Such methods were subsequently studied in depth by a number of western scientists and Edgar Schein summarized much of this research in a monograph, “Coercive Persuasion,” published in 1961.

Successful indoctrination by a cult of a recruit is likely to require most of the following elements:

— Isolation of the recruit and manipulation of his environment;
— Control over channels of communication and information;
— Debilitation through inadequate diet and fatigue;
— Degradation or diminution of the self;
— Early stimulation of uncertainty, fear and confusion, and joy and certainty as rewards for surrendering self to the group;
— Alternation of harshness and leniency in the context of discipline;
— Peer pressure, often applied through ritualized “struggle sessions,” generating guilt and requiring open confessions;
— Insistence by seemingly all-powerful hosts that the recruit’s survival –physical or spiritual– depends on identifying with the group;
— Assignment of monotonous tasks of repetitive activities, such as chanting or copying written materials;
— Acts of symbolic betrayal or renunciation of self, family and previously held values, designed to increase the psychological distance between the recruit and his previous way of life.

As time passes, the new member’s psychological condition may deteriorate. He may become incapable of complex, rational thought; his responses to questions may become stereotyped and he may find it difficult to make even simple decisions unaided. His judgement about the events in the outside world will likely be impaired. At the same time, there may be such a reduction of insight that he fails to realize how much he has changed.

After months or years of membership, such a former recruit may emerge from the cult –perhaps “rescued” by friends or family, but more likely having escaped following prolonged exploitation, suffering and disillusionment. Many such refugees appear dazed and confused, unable to resume their previous way of life and fearful of being captured, punished and returned to the cult. “Floating” is a frequent phenomenon, with the ex-cultist drifting off into dissociated states of altered consciousness. Other frequent symptoms of the refugees include depression, indecisiveness and a general sense of disorientation, often accompanied by frightening impulses to return to the cult and throw themselves on the mercy of the leader.

This suggests that society may well wish to consider ways of preventing its members, particularly the young, from unwittingly becoming lost in cults that use psychologically and even physically harmful techniques of persuasion. Parents can inform themselves and their children about cults and the dangers they pose; religious and educational leaders can teach the risks of associating with such groups. However, when prevention fails and intervention assumes an official character –as through legislation or court action– it is necessary to consider the potential impact of such intervention on the free exercise of religion as guaranteed by the First Amendment.

Under the U.S. Constitution, religious liberty is of two types –freedom of belief and freedom of action. The first is, by its nature, absolute. An individual may choose to believe in a system that others find bizarre or ludicrous; society is powerless to interfere. Religiously motivated conduct, however, is not protected absolutely. Instead, it is subject to a balancing test, in which courts weigh the interest of society in regulating or forbidding the conduct against the interest of the group in carrying it out.

How can society best protect the individual from physical and psychological harm, from stultification of his ability to act autonomously, from loss of vital years of his life, from dehumanizing exploitation –all without interfering with his freedom of choice in regard to religious practices? And, while protecting religious freedom, how can society protect the family as a social institution from the menace of the cult as a competing super-family?

A number of legal cases involving polygamy, blood transfusions for those who object to them on religious grounds and the state’s interest in protecting children from religious zealotry suggest that the courts will hold these interests to be constitutionally adequate to check the more obvious abuses of the cults. Furthermore, the cults’ interest is likely to be found weakened by a lack of “sincerity,” a requirement deriving from conscientious-objector and tax-exemption cases, and lack of “centrality,” or importance of the objectionable practices to such essential religious functions as worship.

To be protected by the First Amendment, religious conduct must stem from theological or moral motives rather than avarice, personal convenience, or a desire for power. Such conduct must also constitute a central or indispensable element of the religious practice.

Many religious cults demonstrate an extreme interest in financial or political aggrandizement, but little interest in the spiritual development of the faithful. Because their religious or theological core would not seem affected by a prohibition against deceptive recruiting methods and coercive techniques to indoctrinate and retain members, it is likely the courts would consider the use of such methods neither “sincere” nor “central.”

Thus the constitutional balance appears to allow intervention, though it could be objected that obnoxious practices which might otherwise justify intervention should not be considered harmful if those experiencing them do so voluntarily and do not see them as harmful at the time.

But is coercive persuasion in the cults inflicted on persons who freely choose to undergo it —who decide to be unfree— or is it imposed on persons who do not truly choose it of their own free will? The decision to join a cult and undergo drastic reformation of one’s thought and behavioral processes can be seen as similar in importance to decisions to undergo surgery, psychotherapy and other forms of medical treatment. Accordingly, it should be protected in the same manner and to the same degree as we protect the decision to undergo medical treatment. This means the decision must be fully consensual. This entails, at a minimum, that those making such decisions do so with both full mental “capacity” and with a complete “knowledge” of the choices offered them. In other words, they should give “fully informed consent” before the process of indoctrination can be initiated.

A review of legislative reports, court proceedings (including cases involving conservatorships, or the “defense of necessity” in kidnaping prosecutions), and considerable clinical material makes clear that the cult joining process is often not fully consensual. It is not fully consensual because “knowledge” and “capacity” —the essential elements of legally adequate consent— are not simultaneously present. Until cults obtain fully informed consent from prospective members giving permission in advance to apply the procedures of indoctrination, and warning of the potential risks and losses, it appears that society may properly take measures to protect itself against cultist indoctrination without violating the principle, central to American jurisprudence, that the state should not interfere with the voluntarily chosen religious behavior of adult citizens.

Most young people who are approached by cultist recruiters will have relatively unimpaired “capacity”. They may be undergoing a momentary state of fatigue, depression, or boredom; they may be worried about exams, a separation from home or family, the job market, or relations with the opposite sex —but generally their minds are intact. If the recruiter were to approach such a person and introduce himself or herself as a recruiter for a cult, such as the Unification Church, the target person would likely be on guard.

But recruiters usually conceal the identity of the cult at first, and the role the recruit is expected to play in it, until the young person has become fatigued and suggestible. Information is imparted only when the target’s capacity to analyze it has become low. In other words, when the recruit’s legal “capacity” is high, his “knowledge” is not; later the reverse obtains. Consent given under such circumstances should not deserve the respect afforded ordinary decisions of competent adults.

If intervention against cults that employ coercive persuasion is consistent with the First Amendment, a line must be drawn between cults and other organizations. But is it possible to impose restrictions on the activities of cults that use coercive persuasion without imposing the same restraints upon other societal institutions —TV advertising, political campaigns, army training camps, Jesuit seminaries— that use influence, persuasion and group dynamics in their normal procedures?

Established religious orders may sequester their trainees to some extent. Military recruiters and Madison Avenue copywriters use exaggeration, concealment and “puffing” to make their product appear more attractive than it is. Revivalists invoke guilt. Religious mystics engage in ritual fasting and self-mortification. It has been argued that the thought-control processes used by cults are indistinguishable from those of more socially accepted groups.

Yet it is possible to distinguish between cults and other institutions —by examining the intensity and pervasiveness with which mind-influencing techniques are applied. For instance, Jesuit seminaries may isolate the seminarian from the rest of the world for periods of time, but the candidate is not deliberately deceived about the obligations and burdens of the priesthood; in fact, he is warned in advance and is given every opportunity to withdraw.

In fact, few, if any, social institutions claiming First Amendment protection use conditioning techniques as intense, deceptive, or pervasive as those employed by many contemporary cults. A decision to intervene and prevent abuses of cult proselytizing and indoctrinating does not by its logic alone dictate intervention in other areas where the abuses are milder and more easily controlled.

To turn again to the sad case of the Peoples Temple, it seemed to be, for some years, a relatively small and, in its public stance, moderate cult. Its members differed from those of most cults: Many were older people, many were black, many were enlisted in family units. Nevertheless, from its origins, based on professed ideals of racial harmony and economic equality, the cult gradually developed typical cultist patterns of coercive measures, harsh practices, suspicions of the outside world and a siege mentality.

It may be that these developments comprise an institutional disease of cults. If so, the recent events in Guyana pose a new warning of continuing dangers from cults. For as time passes, leaders may age and sicken. The cult’s characteristically rigid structure and its habitual deference to the leader as repository of all authority leave the membership vulnerable to the consequences of incredible errors of judgment, institutional paranoia and even deranged behavior by the cult’s chief.

Perhaps the tragedy of Jim Jones’ Peoples Temple will lead to more comprehensive and scientific studies of cult phenomena. Perhaps it will lead our society to a more reasoned public policy of prevention and intervention against further abuses by cults in the name of freedom of religion. If so, then perhaps the disaster in Guyana will have some meaning after all.



11. Take Back Your Life by Janja Lalich and Madeleine Tobias


Paperback: 374 pages
Publisher: Bay Tree Publishing; 2nd edition (September 10, 2009)

Cult victims and those who have experienced abusive relationships often suffer from fear, confusion, low self-esteem, and post-traumatic stress. Take Back Your Life explains the seductive draw that leads people into such situations, provides insightful information for assessing what happened, and hands-on tools for getting back on track. Written for victims, their families, and professionals, this book leads readers through the healing process.


About the Authors

Janja Lalich, Ph.D., is Associate Professor of Sociology at California State University, Chico. She has been studying the cult phenomenon since the late 1980s and has coordinated local support groups for ex-cult members and for women who were sexually abused in a cult or abusive relationship. She is the author of Bounded Choice: True Believers and Charismatic Cults, and co-author, with Margaret Singer, of Cults in Our Midst.

Madeleine Tobias, M.S., R.N., C.S., is the Clinical Coordinator and a psychotherapist at the Vet Center in White River Junction, Vermont, where she treats veterans who experienced combat and/or sexual trauma while in the military. Previously she had a private practice in Connecticut and was an exit counselor helping ex-members of cultic groups and relationships.


“If you buy one book on cults, this could be top of the list.”

Here are three reviews of an earlier edition:

An essential roadmap to recovery

For me, the special usefulness of this book came in the form of material directed at children who grew up in a cult, who have no other frame of reference to go back to.

The information I gleaned here gave me that frame of reference, and helped me to “detox” from the environment which was so seductively calling me back. It explains and makes sense of some very bewildering and deceptive manipulation techniques. And it has helped my therapy by outlining the kinds of issues that children coming out of cults usually face.

This book has a universal appeal for all cult escapees because it focuses not on beliefs or practices, but rather on manipulations and psychological pressures which are commonly brought to bear in cults. I found it easy to identify experientially with the material, without being challenged and put off by attacks on my strange belief system which I was still disengaging from.

It’s been a big part of my recovery. My thanks to the authors!

____________

A must-read for former cult members by Troy Waller:

I wish I had found this book immediately after leaving the cult I was involved in.

This book offers invaluable assistance to those who have been involved with a destructive cult, whether it be religious, political or psycho-therapeutic. The text gives former members indications of what to expect in recovery as well as practical assistance to cope with their recovery.

The text also gives a breakdown of how and why cults operate as they do; how and why people get recruited into cults; and how and why people leave cults.

This book is truly a gift from the authors’ heart, experiences and study. Thanks to them.

____________

Sane Advice for Those Leaving Cults by D. L. Barnett

We don’t hear much these days about the Branch Davidians, Heaven’s Gate or even Jim Jones. It’s tempting to think that the cult movement has faded and that the world’s attention is on more pressing matters – like suicide bombers. But they are all of a piece, according to Chico State University Associate Professor of Sociology Janja Lalich.

In “Take Back Your Life: Recovering from Cults and Abusive Relationships,” Lalich and co-author Madeleine Tobias, a Vermont psychotherapist, make clear that modern day cults have not disappeared. “If there is less street recruiting today, it is because many cults now use professional associations, campus organizations, self-help seminars, and the Internet as recruitment tools” to entice the unwary.

Who gets sucked into a cult? “Although the public tends to think, wrongly, that only those who are stupid, weird, crazy and aimless get involved in cults, this is simply untrue. … We know that many cult members went to the best schools in the country, have advanced academic or professional degrees and had successful careers and lives prior to their involvement in a cult or cultic abusive relationship. But at a vulnerable moment, and we all have plenty of those in our lives (a lost love, a lost job, rejection, a death in the family and so on), a person can fall under the influence of someone who appears to offer answers or a sense of direction.”

For the authors, “a group or relationship earns the label ‘cult’ on the basis of its methods and behaviors – not on the basis of its beliefs. Often those of us who criticize cults are accused of wanting to deny people their freedoms, religious or otherwise. But what we critique and oppose is precisely the repression and stripping away of individual freedoms that tends to occur in cults. It is not beliefs that we oppose, but the exploitative manipulation of people’s faith, commitment, and trust.”

Written for those coming out of cults, as well as for family members and professionals, “Take Back Your Life” deals with common characteristics of myriad cult types: Eastern, religious and New Age cults; political, racist and terrorist cults; psychotherapy, human potential, mass transformational cults; commercial, multi-level marketing cults; occult, satanic or black-magic cults; one-on-one family cults; and cults of personality. …

The book features riveting personal accounts from ex-cult members and offers a wide range of resources for the person who is trying to retrieve his or her “pre-cult” personality. Education looms large, for that can begin to break down the narrow black-and-white thinking cult members often display. Many cults redefine common terms or introduce special vocabulary, making it difficult for members to make sense of the outside world or even of their own inner aspirations.

The authors are also concerned about those in the education and helping professions who don’t see the dangers posed by cults both to the individual and the larger community. Part of the purpose of the book is to make a credible case that any course of therapy needs to take into account a patient’s cult associations.

“Take Back Your Life” is a book of hope, an excellent starting point for those thinking of exiting a cult and for those who are taking back their lives, one day at a time.


Contents

Acknowledgments ix
Introduction 1

Part One – The Cult Experience 7

1. Defining a Cult 9

2. Recruitment 18

3. Indoctrination and Resocialization 36

4. The Cult Leader 52

5. Abusive Relationships and Family Cults 72

Part Two – The Healing Process 87

6. Leaving a Cult 89

7. Taking Back Your Mind 104

8. Dealing with the Aftereffects 116

9. Coping with Emotions 127

10. Building a Life 151

11. Facing the Challenges of the Future 166

12. Healing from Sexual Abuse and Violence 180

13. Making Progress by Taking Action 196

14. Success Is Sweet: Personal Accounts 212

Part Three – Families and Children in Cults 239

15. Born and Raised in a Cult 241

16. Our Lives to Live: Personal Accounts 259

17. Child Abuse in Cults 280
Nori J. Muster

Part Four – Therapeutic Concerns 287

18. Therapeutic Issues 289

19. The Therapist’s Role 305
Shelly Rosen

20. Former Cult Members and Post-Traumatic Stress Disorder 314

Appendixes

A. Characteristics Associated with Cultic Groups 327
Janja Lalich and Michael Langone

B. On Being Savvy Spiritual Consumers 329
Rosanne Henry and Sharon Colvin

C. Resources 332

D. Recommended Reading 336

Notes 345

Author Index 359

Subject Index 363


Introduction

Take Back Your Life: Recovering from Cults and Abusive Relationships gives former cult members, their families, and professionals an understanding of common cult practices and their aftereffects. This book also provides an array of specific aids that may help restore a sense of normalcy to former cult members’ lives.

About twelve years ago, we wrote our first book on this topic: Captive Hearts, Captive Minds: Freedom and Recovery from Cults and Abusive Relationships. Over the years, we received mounds of positive feedback about that book in the form of letters, phone calls, postcards, emails, faxes, and personal contact at conferences and in our professional lives. Former cult members, families, therapists, and exit counselors continually told us that Captive Hearts, Captive Minds was always their number-one book. That positive reception (and the need to provide up-to-date information) was the impetus for this new book. We are delighted to offer this new resource to people who want to evaluate, understand, and, in many cases, recover from the effects of a cult experience. We hope this book will help you take back your life.

Cults did not fade away (as some would like to believe) with the passing of the sixties and the disappearance of the flower children. In fact, cult groups and relationships are alive and thriving, though many groups have matured and “cleaned up their act.” If there is less street recruiting today, it is because many cults now use professional associations, campus organizations, self-help seminars, and the Internet as recruitment tools. Today we see people of all ages— even multigenerational families—being drawn into a wide variety of groups and movements focused on everything from therapy to business ventures, from New Age philosophies to Bible-based beliefs, and from martial arts to political change.

Most cults don’t stand up to be counted in a formal sense. Currently, the best estimates tell us that there are about 5,000 such groups in the United States, some large, some remarkably small. Noted cult expert and clinical psychologist Margaret Singer estimated “about 10 to 20 million people have at some point in recent years been in one or more of such groups.”(1) Before its enforced demise, the national Cult Awareness Network reported receiving about 20,000 inquiries a year.(2)

A cult experience is often a conflicted one, as those of you who are former members know. More often than not, leaving a cult environment requires an adjustment period so that you can put yourself and your life back together in a way that makes sense to you. When you first leave a cult situation, you may not recognize yourself. You may feel confused and lost; you may feel both sad and exhilarated. You may not know how to identify or tackle the problems you are facing. You may not have the slightest idea about who you want to be or what you want to believe. The question we often ask children, “What do you want to be when you grow up?” takes on new meaning for adult ex-cult members.

Understanding what happened to you and getting your life back on track is a process that may or may not include professional therapy or pastoral counseling. The healing or recovery process varies for each of us, with ebbs and flows of progress, great insight, and profound confusion. Also, certain individual factors will affect your recovery process. One is the length and intensity of your cult experience. Another is the nature of the group or person you were involved with—or where your experience falls on a scale of benign to mildly harmful to extremely damaging. Recovering from a cult experience will not end the moment you leave the situation (whether you left on your own or with the help of others). Nor will it end after the first few weeks or months away from your group. On the contrary, depending on your circumstances, aspects of your cult involvement may require some attention for the rest of your life.

Given that, it is important to find a comfortable pace for your healing process. In the beginning, particularly, your mind and body may simply need a rest. Now that you are no longer on a mission to save the world or your soul, relaxation and rest are no longer sinful. In fact, they are absolutely necessary for a healthy, balanced, and productive life.

Reentering the non-cult world (or entering it for the first time if you were born or raised in a cult) can be painful and confusing. To some extent, time will help. Yet the passage of time and being physically out of the group are not enough. You must actively and of your own initiative face the issues of your involvement. Let time be your ally, but don’t expect time alone to heal you. We both know former cult members who have been out of their groups for many years but who have never had any counseling or education about cults or the power of social-psychological influence and control. These individuals live in considerable emotional pain and have significant difficulties due to unresolved conflicts about their group, their leader, or their own participation. Some are still under the subtle (or not so subtle) effects of the group’s systems of influence and control.

A cult experience is different for each person, even for members of the same group, family, or situation. Some former members may have primarily positive impressions and memories, while others may feel hurt, used, or angry. The actual experiences and the degree or type of harm suffered may vary considerably. Some people may leave cults with minimum distress, and adjust rather rapidly to the larger society, while others may suffer severe emotional trauma that requires psychiatric care. Still others may need medical attention or other care. The dilemmas can be overwhelming and may require thoughtful attention. Many have likened this period to being on an emotional roller coaster.

First of all, self-blame (for joining the cult or participating in it, or both) is a common reaction that tends to overshadow all positive feelings. Added to this is a feeling of identity loss and confusion over various aspects of daily life. If you were recruited at any time after your teens, you already had a distinct personality, which we call the “pre-cult personality.” While you were in the cult, you most likely developed a so-called new personality in order to adapt to the demands and ambiance of cult life. We call this the “cult personality.” Most cults engage in an array of social-psychological pressures aimed at indoctrinating and changing you. You may have been led to believe that your pre-cult personality was all bad and your adaptive cult personality all good. After you leave a cult, you don’t automatically switch back to your pre-cult self; in fact, you may often feel as if you have two personalities or two selves. Evaluating these emotions and confronting this dilemma—integrating the good and discarding the bad—is a primary task for most former cult members, and is a core focus of this book.

As you seek to redefine and reshape your identity, you will want to address the psychological, emotional, and physical consequences of living in or around a constrained, controlled, and possibly abusive environment. And as if all that weren’t enough, many basic life necessities and challenges will need to be met and overcome. These may include finding employment and a place to live, making friends, repairing old relationships, confronting belief issues, deciding on a career or going back to school, and most likely catching up with a social and cultural gap.

If you feel like “a stranger in a strange land,” it may be consoling to know that you are not the first person to have felt this way. In fact, the pervasive and awkward sense of alienation that both of us felt when we left our cults motivated us to write this book. We hope that the information here will not only help you get rid of any shame or embarrassment you might feel, but also ease your integration into a positive and productive life.

We were compelled to write this book because more often than not, people coming out of cults have tremendous difficulty finding practical information. We, too, experienced that obstacle. Both of us faced one roadblock after another as we searched for useful information and helping professionals who were knowledgeable about cults and post-cult trauma.

A matter we hope to shed light on in this book is the damage wrought by the so-called cult apologists. These individuals (mostly academics) allege that cults do no harm, and that reports of emotional or psychological damage are exaggerations or even fabrications on the part of disgruntled former members. Naturally we disagree. It is unfortunate that there is still so little public understanding of the potential danger of some cults. Certainly there are risks and harmful consequences for individuals involved in these closed, authoritarian groups and abusive relationships. If there weren’t, there would be no need for cult research and information organizations, or for books such as this. Added to individual-level consequences, there are documented dangers to society as a whole from cults whose members carry out their beliefs in antisocial ways— sometimes random, sometimes planned—through fraud, terrorist acts, drug dealing, arms trading, enforced prostitution of members, sexual exploitation, and other violent or criminal behaviors.

From our perspective, a group or relationship earns the label “cult” on the basis of its methods and behaviors—not on the basis of its beliefs. Often those of us who criticize cults are accused of wanting to deny people their freedoms, religious or otherwise. But what we critique and oppose is precisely the repression and stripping away of individual freedoms that tends to occur in cults. It is not beliefs that we oppose, but the exploitative manipulation of people’s faith, commitment, and trust. Our society must not shy away from exposing and holding accountable those social systems (whether they be communities, organizations, families, or relationships) that use deception, manipulation, coercion, and persuasion to attract, recruit, convert, hold on to, and ultimately exploit people.

Also, it’s important to note that there are many non-cult organizations to which people can dedicate their lives and may experience personal transformation. Many religious and self-help institutions, as well as mainstream political parties and special-interest groups, are examples of such non-cult organizations. We do not call them cults because they are publicly known institutions that are usually accountable to some higher body or to society in general. When people join, they have a clear idea of these organizations’ structures and goals. Deceptive or coercive practices are not integral to the growth of these organizations or their ability to retain their members.

In contrast, cult membership is less than fully voluntary. Often it is the result of intense social-psychological influence and control, sometimes called coercive persuasion. Cults tend to assault and strip away a person’s independence, critical-thinking abilities, and personal relationships, and may have a less-than-positive effect on the person’s physical, spiritual, and psychological state of being.

We wrote this book for the many individuals who have experienced harm or trauma in a cult or an abusive relationship. Because it is awkward to continually repeat the phrase “cult or cultic relationship,” in many instances throughout this book we simply shortened it to “cult” or “group,” which are meant to be inclusive of all types of cultic involvements. In the same vein, while we recognize the existence of many one-on-one cultic relationships and family cults, we tend to use simply “cult leader” or “leader” rather than always specifying “leader or abusive partner.” Also, we tend to use masculine pronouns when referring to cult leaders in general. This is not to ignore the fact that there are many female cult leaders, but merely to acknowledge that most cult leaders tend to be men. However, whether male or female, most are equal-opportunity victimizers, drawing men, women, and children of all ages into their webs of influence.

We have included case examples and personal accounts throughout the chapters to illustrate the specifics of involvement, typical aftereffects, and the healing process. Some examples are composites based on interviews and our personal and professional experiences with many hundreds of former cult members. Some former members made specific contributions or allowed us to quote them and use their real names, while others asked for pseudonyms to protect their privacy. These latter, as well as the case examples, are indicated in the text by the use of first name and last initial on the first mention of that name.

If you are a former cult member, you may identify personally with some of the experiences, emotions, challenges, and difficulties discussed here. Other topics may appear quite foreign and unrelated to your experience. It may be helpful to look them over anyway, as there may be lessons or suggestions that could be useful for your situation.

The keys to recovery are balance and moderation, both of which were quite likely absent in the cult. Now you can create a program for recovery that addresses your needs and wants, and you can change it at will to adapt to any new circumstances or needs. The important thing is to do what feels right. Most cults teach you to squelch your gut instincts, but you can now let your self speak to you—and this time, you can listen and act. From now on, only you are responsible for setting and achieving your goals. Our hope is that this book will be useful to you in your recovery process, and we wish you well.


pages 15-17

Cults as Power Structures

In Bounded Choice: True Believers and Charismatic Cults, I (Janja Lalich) present my most recent findings from an in-depth study of cultic structures and dynamics:

A cult can be either a sharply bounded social group or a diffusely bounded social movement held together through a shared commitment to a charismatic leader. It upholds a transcendent ideology (often but not always religious in nature) and requires a high level of commitment from its members in words and deeds.(8)

Four interlocking dimensions make up the framework of a cult’s social system and dynamics. You can use this framework to examine your own cult experience. These four dimensions are clearly separated here for analytical purposes so that former cult members (whose memories of cult experiences are often confused and conflicting) can more easily deconstruct and understand each phase of indoctrination and control:

Charismatic authority. This is the emotional bond between a leader and his followers. It lends legitimacy to the leader and grants authority to his actions while at the same time justifying and reinforcing followers’ responses to the leader and/or the leader’s ideas and goals. Charisma is the hook that links a devotee to a leader and/or his ideas.

The general purpose of charismatic authority is to provide leadership. The specific goal is for the leader to be accepted as the legitimate authority and to offer direction. This is accomplished through privilege and command. The desired effect, of course, is that members will believe in and identify with the leader.

Transcendent belief system. This is the overarching ideology that binds adherents to the group and keeps them behaving according to the group’s rules and norms. It is transcendent because it offers a total explanation of past, present, and future, including the path to salvation. Most importantly, the leader/group also specifies the exact methodology (or recipe) for the personal transformation necessary to travel on that path.

The goal of the transcendent belief system is to provide a worldview that offers meaning and purpose through a moral imperative. This imperative requires each member to subject himself to a process of personal transformation. The desired effect is for the member to feel a sense of connection to a greater goal while aspiring to salvation. This effect is solidified through the internalization of the belief system and its accompanying behaviors and attitudes.

Systems of control. This is the network of acknowledged—or visible—regulatory mechanisms that guide the operation of the group. It includes the overt rules, regulations, and procedures that guide and control members’ behavior.

The purpose of the systems of control is quite simply to provide organizational structure. The specific goal is to create a behavioral system and disciplinary code through rules, regulations, and sanctions. The effect is compliance, or better still, obedience.

Systems of influence. This is the network of interactions and social influence that resides in the group’s social relations. This interaction and group culture teach members to adapt their thoughts, attitudes, and behaviors in relation to their new beliefs.

The purpose of the systems of influence is to shape the group culture. The specific goal is to create institutionalized group norms and an established code of conduct by which members are expected to live. This is accomplished by various methods of peer and leadership pressure, and through social-psychological influence and modeling. The desired effect is conformity and the self-renunciation that is required not only to be part of the group but also to achieve the professed goal.(9)

This combination of a transcendent belief system, all-encompassing systems of interlocking structural and social controls, and highly charged charismatic relationships between leader(s) and adherents results in a self-sealing system that exacts a high degree of commitment (as well as expressions of that commitment) from its core members. A self-sealing system is one that is closed in on itself, allowing no consideration of disconfirming evidence or alternative points of view. In the extreme, a self-sealed group is exclusive and its belief system is all-inclusive, in the sense that it provides answers to everything. Typically the quest of such groups is to attain a far-reaching ideal. However, a loss of sense of self is all too often the by-product of that quest.(10)

Over the years, some people have used alternative terms or adjectives to identify cult groups, such as high-demand, high-control, totalistic, totalitarian, closed charismatic, ultra-authoritarian, and so on. In academia, some rather acrimonious debate has arisen over the use of the word cult, with some academicians and researchers using their influence to dissuade scholars, legal and helping professionals, the media, and others from identifying any group as a cult. Recent work addressing these debates and arguments can be found in Misunderstanding Cults: Searching for Objectivity in a Controversial Field, edited by Benjamin Zablocki and Thomas Robbins.(11)

Frankly we prefer to use the term cult because we feel that it has historical meaning and value. Whatever one decides to call these groups, one must not ignore the structural and behavioral patterns that have been identified through years of study and research, or through the voluminous accounts of people who successfully exited from cult groups and relationships. To sweep cults under the rug or to call them by another name won’t make cults go away—nor will it aid us in understanding these complex social systems. Most importantly, cover-ups and whitewashing won’t help former cult members evaluate or recover from their experiences in a whole and healthful manner.


from pages 26-27

Contract for Membership in a Cultic Group or Relationship

In the medical profession, ethical contracts ensure that patients have given “fully informed consent.” That is, if a doctor fails to inform a patient about the risks, side effects, and treatment options, the uninformed patient is entitled to sue for malpractice. Below is a mock contract for cult membership. Ask yourself whether you gave informed consent at the time of your recruitment, or whether you would have joined had you known your participation would involve the following conditions.

I, _______________________________ hereby agree to join
_______________________________ . I understand that my life will change in the following ways. I know what I am getting into and agree to all of the following conditions:
1. My good feelings about who I am will stem from being liked by other group members and/or my leader, and from receiving approval from the group/leader.
2. My total mental attention will focus on solving the group’s/leader’s problems and making sure that there are no conflicts.
3. My mental attention will be focused on pleasing and protecting the group/leader.
4. My self-esteem will be bolstered by solving group problems and relieving the leader’s pain.
5. My own hobbies and interests will gladly be put aside. My time will be spent however the group/leader wants.
6. My clothing and personal appearance will be dictated by the desires of the group/leader.
7. I do not need to be sure of how I feel. I will only be focused on what the group/leader feels.
8. I will ignore my own needs and wants. The needs and wants of the group/leader are all that is important.
9. The dreams I have for the future will be linked to the group/leader.
10. My fear of rejection will determine what I say or do.
11. My fear of the group’s / leader’s anger will determine what I say or do.
12. I will use giving as a way of feeling safe with the group/leader.
13. My social circle will diminish or disappear as I involve myself with the group/leader.
14. I will give up my family as I involve myself with the group / leader.
15. The group’s/leader’s values will become my values.
16. I will cherish the group’s / leader’s opinions and ways of doing things more than my own.
17. The quality of my life will be in relation to the quality of group life, not the quality of life of the leader.
18. Everything that is right and good is due to the group’s belief, the leader, or the teachings.
19. Everything that is wrong is due to me.
20. In addition, I waive the following rights to:
• Leave the group at any time without the need to give a reason or sit through a waiting period
• Maintain contact with the outside world
• Have an education, career, and future of my choice
• Receive reasonable health care and have a say in my health care
• Have a say in my own and my family’s discipline, and to expect moderation in disciplinary methods
• Have control over my body, including choices related to sex, marriage, and procreation
• Expect honesty in dealings with authority figures in the group
• Expect honesty in any proselytizing I am expected to do
• Have any complaints heard and dealt with fairly with an impartial investigation
• Be supported and cared for in my old age in gratitude for my years of service


Janja Lalich presentation:
Re-forming the Self: The Impact and Consequences of Institutional Abuse

WEBSITE: http://cultresearch.org/



12. VIDEO: Paul Morantz, Attorney and cult expert
Cults, Thought Reform, Coercive Persuasion and Confession    (7 minutes)

Los Angeles attorney and cult expert Paul Morantz has devoted his professional life to fighting cults. But in the late 1970s that life almost came to an abrupt end when one of the cults he litigated against planted a live rattlesnake in his mailbox. Paul’s health (and speech) has been affected ever since.

VIDEO: University of California Television (UCTV)
The Lawyer Synanon Tried to Kill – Legally Speaking

Paul Morantz speaks with California Lawyer editor Martin Lasden about his career and the dangers he faced. Series: “Legally Speaking”

WEBSITE: http://www.paulmorantz.com/



13. Podcast: Ford Greene, Attorney and Former Moonie, on Sun Myung Moon

Peter B. Collins podcast:
Ford Greene, an expert on religious cults including Scientology and the Unification Church, returns to talk about the death of Rev. Moon. Greene, once a Moonie himself, talks about the impact of Rev. Moon’s death. We touch on my podcast with Archbishop Stallings in early September, and the spin he put on the cult behaviors of Moon and his followers. Greene has deprogrammed many Moonies and sued the church on behalf of former members; his own sister remains a member of the church. We talk about the CIA connections of Moon and his underlings, and Moon’s role in right-wing politics in the US, including his operation of the Washington Times. Greene also speculates about the future of the business empire and Moon’s brainwashed followers. While Greene has not yet seen The Master, he comments on the aggressive legal tactics of Scientology.

At 37:20, Gary Chew reviews the new film The Master, which is based on the early life of Scientology founder L. Ron Hubbard. Gary Chew offers a somewhat cryptic view of The Master, starring Philip Seymour Hoffman and Joaquin Phoenix.

https://www.peterbcollins.com/2012/09/28/ford-greene-attorney-and-moonie-de-programmer-on-the-death-of-rev-moon-gary-chew-reviews-the-master-maxine-doogan-tells-californians-no-on-prop-35/


Ford Greene and the Moonies

Ford Greene is featured in the book Moonwebs

The book was made into the movie, Ticket to Heaven

VIDEO:

Billet pour le ciel (Ticket to Heaven) – by Josh Freed (in French)


14. VIDEO: Steve Hassan interviewed by Chris Shelton

Sensibly Speaking Podcast #77: Dealing with Destructive Cults ft. Steve Hassan

This week I interview Steve Hassan, a cult recovery specialist and licensed mental health counselor who has written three books on cults, including Combatting Cult Mind Control, an excellent breakdown of how destructive cults work, what undue influence is, how to recover from a cult experience, and what family and friends can do for loved ones who may be stuck in a cult situation.

Steve Hassan’s website: http://freedomofmind.com



15. VIDEO:  Conformity by TheraminTrees

We each possess one of the most powerful tools in the known universe: the human brain — capable of the extremes of insight and ignorance; of productiveness and destructiveness; of detection and projection; of rationality and rationalization; of liberation and oppression.

16. VIDEO: Instruction Manual for Life by TheraminTrees


Charismatic Authority (authority is not the same as power)



17. The Social Organization of Recruitment in the Unification Church – PDF

by David Frank Taylor, M.A., July 1978, Sociology

The purpose of this study is to provide an empirical description of recruitment into the Unification Church. The Unification Church is one of many new religious movements that appeared in America during the 1970s. The methods Church members use to attract and secure the commitment of individuals to the Church have generated controversy in recent years.

The research was initiated under the assumption that these recruitment strategies could be understood through the use of qualitative field methods. As an ethnographic treatment of religious indoctrination, the study is based on participant observation of the recruitment process and is grounded in the interaction and language usage of participants. Close attention is given to the daily life of Church members and prospective members, where members help in a cooperative effort to persuade individuals to join their movement.

University of Montana
ScholarWorks at University of Montana
Theses, Dissertations, Professional Papers  –  Graduate School

LINK:
http://scholarworks.umt.edu/cgi/viewcontent.cgi?article=6585&context=etd

TABLE OF CONTENTS
Abstract …………………………………………………………… ii
Acknowledgements ……………………………………………… iii
Chapter I. INTRODUCTION TO THE STUDY …………………… 1
Chapter II. HISTORY, BELIEFS, AND STRUCTURE OF THE UNIFICATION CHURCH … 13
History and Beliefs ………………………………………………… 13
Organizational Structure …………………………………………… 20
Controversies Surrounding the Church …………………………… 23
Chapter III. A DESCRIPTION OF RECRUITMENT …………………… 31
The Encounter ………………………………………………………… 31
The Elephant Bus to Boonville ……………………………………… 36
“The Greatest Weekend” …………………………………………… 38
The Keynote Lecture: Falling in Love, Together …………………… 40
Understanding God’s Situation ……………………………………… 43
“A Universal Point of View” …………………………………………… 47
“Truth and Righteousness”…………………………………………… 50
Another Great Day …………………………………………………… 54
Sunday’s Finale ……………………………………………………… 58
Chapter IV. RECRUITMENT: A SOCIALLY ORGANIZED ACCOMPLISHMENT … 62
Finding Prospective Members ……………………………………… 62
The Choreography of Total Participation …………………………… 67
Groups ………………………………………………………………… 71
Loving ………………………………………………………………… 76
Control of Communication ………………………………………… 82
Making a Positive Evaluation ……………………………………… 84
“We Can Be New People” ………………………………………… 87
Lecture Reinforcement: Groups, Testimonies and Songs ……… 90
Dreams and Destiny ……………………………………………… 92
Testimonies and Skits …………………………………………… 94
Restoration of Value …………………………………………………… 97
We Want to Be Those People …………………………………… 99
Consensual Validation …………………………………………… 101
Expressions of Self-Fulfillment ………………………………… 106
Sustaining Group Unity and Brotherhood ……………………… 108
Following God’s Will ……………………………………………… 112
Guiding Prospects Towards Truth and Transformation ………… 115
Following Center ………………………………………………… 120
The True Parents ………………………………………………… 123
Idolization and Emulation of Leaders as Role Models ………… 125
Testimonies of Transition ………………………………………… 128
Overcoming Doubt and Negativity ……………………………… 131
Symbols of Commitment ………………………………………… 135
Dramatic Commitment Scenarios ………………………………… 137
Accomplished Commitment ……………………………………… 143
Chapter V. AN OVERVIEW ………………………………………… 146
Sincere Performance ……………………………………………… 148
Trust ………………………………………………………………… 149
Legitimized Control ………………………………………………… 151
Enthrallment ………………………………………………………… 153
Assuming the Role ………………………………………………… 155
Bibliography ………………………………………………………… 160



18. Mind Control: Psychological Reality Or Mindless Rhetoric?

by Philip G. Zimbardo, Ph.D., President, American Psychological Association

One of the most fascinating sessions at APA’s Annual Convention featured presentations by former cult members. Several participants challenged our profession to form a task force on extreme forms of influence, asserting that the underlying issues inform discourses on terrorist recruiting, on destructive cults versus new religious movements, on social-political-“therapy” cults, and on human malleability or resiliency when confronted by authority power.

That proposal is intriguing. At one level of concern are academic questions of the validity of the conceptual framework for a psychology of mind control. However, at broader levels, we discover a network of vital questions:

Does exposing the destructive impact of cults challenge the principle of religious freedom of citizens to mindfully join nontraditional religious groups?

When some organizations that promote religious or self-growth agendas become rich enough to wield power to suppress media exposés, influence legal judgments, or publicly defame psychology, how can they be challenged?

What is APA’s role in establishing principles for treating those who claim to have suffered abuse by cults, for training therapists to do so, and for establishing guidelines for expert testimony?

Personal Freedoms
A basic value of the profession of psychology is promoting human freedom of responsible action, based on awareness of available behavioral options, and supporting an individual’s rights to exercise them. Whatever we mean by “mind control” stands in opposition to this positive value orientation.

Mind control is the process by which individual or collective freedom of choice and action is compromised by agents or agencies that modify or distort perception, motivation, affect, cognition, and/or behavioral outcomes. It is neither magical nor mystical, but a process that involves a set of basic social psychological principles.

Conformity, compliance, persuasion, dissonance, reactance, guilt and fear arousal, modeling and identification are some of the staple social influence ingredients well studied in psychological experiments and field studies. In some combinations, they create a powerful crucible of extreme mental and behavioral manipulation when synthesized with several other real-world factors, such as charismatic, authoritarian leaders, dominant ideologies, social isolation, physical debilitation, induced phobias, and extreme threats or promised rewards that are typically deceptively orchestrated, over an extended time period in settings where they are applied intensively.

A body of social science evidence shows that when systematically practiced by state-sanctioned police, military or destructive cults, mind control can induce false confessions, create converts who willingly torture or kill “invented enemies,” and engage indoctrinated members to work tirelessly, give up their money—and even their lives—for “the cause.”

Power Struggles
It seems to me that at the very heart of the controversy over the existence of mind control is a bias toward believing in the power of people to resist the power of situational forces, a belief in individual will power and faith to overcome all evil adversity. It is Jesus modeling resistance against the temptations of Satan, and not the vulnerability of Adam and Eve to deception. More recently, examples abound that challenge this person-power misattribution.

From the 1930s on, there are many historical instances of state power dominating individual beliefs and values. In Stalin’s Moscow show trials, his adversaries publicly confessed to their treasons. Catholic Cardinal Mindszenty similarly gave false confessions favoring his communist captors. During the Korean War, American airmen confessed to engaging in germ warfare after intense indoctrination sessions. The Chinese Thought Reform Program achieved massive societal conversions to new beliefs. It has also been reported that the CIA put into practice nearly 150 projects—collectively termed MKULTRA—to develop various forms of exotic mind control, including the use of LSD and hypnosis. More than 900 US citizens committed suicide or murdered friends and family at the persuasive bidding of their Peoples Temple cult leader, Jim Jones.

The power of social situations to induce “ego alien” behavior over even the best and brightest of people has been demonstrated in a variety of controlled experiments, among them, Stanley Milgram’s obedience to authority studies, Albert Bandura’s research on dehumanization, my Stanford Prison Experiment, and others on deindividuation.

Understanding the dynamics and pervasiveness of situational power is essential to learning how to resist it and to weaken the dominance of the many agents of mind control who ply their trade daily on all of us behind many faces and fronts.

This article was originally published in the Monitor on Psychology, November 2002.
Philip G. Zimbardo, Ph.D., 2002 President of the American Psychological Association and Professor of Psychology at Stanford University, is one of the nation’s most distinguished psychologists. He has conducted extensive research on the processes of social influence and control.

https://www.icsahome.com/articles/mind-control-zimbardo


19. “Socialization techniques through which the UC members were able to influence”

by Geri-Ann Galanti, Ph.D.

Abstract
This article reports on the experiences and thoughts of an anthropologist who, under an assumed identity, participated in a 3-day Unification Church workshop. Although the author’s expectation that she would encounter “brainwashing” techniques was not met, she was, nevertheless, struck by the subtle, yet powerful, socialization techniques through which the UC members were able to influence her. She concludes that, to be effective, preventive education in this area must address the subtleties of the socialization processes that can bring about major personality changes.


I recently had an encounter with what has been termed “brainwashing,” when I spent a weekend at Camp K, a Moonie training camp in Northern California [circa 1981-83]. As a result of my experience there, I would like to offer a few comments on the nature of brainwashing from the perspective of an anthropologist. I went to the camp to do research for a project on deprogramming. I thought it was important to see what the “programming” was all about. I pretended, however, to be a young woman who had wandered into their church by chance, and who knew little about Rev. Moon or Moonies.

To begin with, I was allowed plenty of sleep and given a sufficient amount of protein. Both mornings, I got out of bed around 8:30 or 9:00 – when I was tired of lying around. No one made me get up early. We were given eggs, fish, tuna, something that looked like “chicken spam,” lasagna (meatless, but plenty of cheese) and other foods. We were constantly being fed – three meals and about two snacks per day. Most people looked a bit overweight. In any case, the two things I was looking for that might “brainwash” me, sleep deprivation and inadequate food, were not present.

I was further disarmed by the fact that the group let me know right up front that they were the Unification Church, and followers of the Reverend Moon. The San Francisco Bay area center had earned a rather bad reputation for hiding that fact until a new recruit was already well entrenched in the group. Apparently, this is no longer true. I walked into the church on Bush Street in San Francisco on a Friday evening, and the first thing that was said to me was “You understand that this is the Unification Church and that we’re followers of the Reverend Moon?” They also had a permanent sign on the front of their building stating “Unification Church.” The first evening at Bush Street, after showing some interest in the Church, I was shown a videotape about the Church and Reverend Moon. In order to go to their camp for the weekend, I had to sign a release, which clearly stated that I was going with the Unification Church. However, the fact that they were now being honest about who they were, in contrast to their past deceptiveness, served to weaken my defenses.

The first night, I heard the word “brainwashing” used four or five times, always in a joking context. I finally asked John, my “spiritual father,” why that word kept cropping up so often. He said it was because people often accuse them of being brainwashed. The explanation I heard several times that weekend in this regard is that “people are so cynical and they can’t believe that we can be happy and want to help other people and love God and each other. So they think that we must be brainwashed to feel this way. Ha! Ha!” I was also told by two different Moonies about a recent psychological study comparing Moonies with young adults from other mainstream religious groups. They told me that Moonies came out much better in terms of independence, aggressiveness, assertiveness, and other positive characteristics. The group is apparently meeting the criticism leveled at them head on. Their explanations seemed so reasonable. They would ask, “We don’t look brainwashed, do we?” And they didn’t.

I somehow expected to see glassy-eyed zombies. I didn’t. There was one new member – he’d been in the group only a month and a half – who seemed to fit that stereotype. When I talked to him, his gaze wandered, his eyes not fixed on anything. But everyone else seemed perfectly normal. They were able to laugh and joke (about everything except themselves, which I’ll discuss later) and talk seriously about things. The only thing that really struck me as strange was a kind of false over-enthusiasm. Any time anyone performed, which was often, everyone would clap and cheer wildly. They were good, but not that good. During lectures, they would underscore points with a hearty “yeah!” I must admit, however, that by the end of the weekend, much of the enthusiasm seemed more charming than odd.

Since the issue was brainwashing, I was constantly monitoring my mental state. During lectures (three per day, each lasting about an hour to an hour and a half), I would sit there and smugly critique the lecture (to myself) as it was presented. My intellectual faculties were as sharp as ever. I was able to note the kinds of techniques they were using as well. Immediately before each lecture, we would sing songs from their songbook, to the accompaniment of a guitar. Their songs are very beautiful, and the lyrics always upbeat. As a result, you start off the lecture feeling good from the singing. The lectures are always ended by singing a few more songs. This puts a whole aura of “goodness” around the lectures.

The lectures were carefully orchestrated so as to create a feeling in the listener that they must be “learned,” rather than analyzed. I could discuss this in greater detail, but for now, I will return to the issue of brainwashing. Despite the use of questionable and manipulative educational techniques, I was constantly aware of the functioning of my intellect and of my beliefs, and at no time did I feel that they were being influenced. This may not be the case with an individual who has not spent 13 years in college, but, as will become clear, it only underscores the power of brainwashing. As an anthropologist, I found their beliefs interesting; as an individual, I found them ridiculous. Nor did I experience any altered states of consciousness to indicate that I was being hypnotized in any way. So I thought I was safe.

What I didn’t realize is that the “brainwashing” – or to use a better term, “mind control” – doesn’t come until later. And what is really being talked about is a process of socialization, one which goes on in every household around the world. Human beings are not born with ideas. Ideas are learned. Anthropologists, more than any other group, perhaps, are aware of the variety of beliefs that are held by people around the world. We acquire these beliefs through a process that involves observation, imitation, and testing. Beliefs that are acquired in childhood are generally the strongest, although they may be changed through experience as one grows older. When we have experiences that conflict with our world view, we either rationalize the experience (e.g., I couldn’t find my necklace in the jewelry box yesterday, but today it’s there – I must have overlooked it, or someone must have taken it and put it back), leaving our beliefs intact (e.g., objects don’t magically disappear and reappear), or, if it happens too often and we are presented with an alternative world view which accounts for it, we may change our beliefs. (This is the stuff that Kuhn writes about in his classic book, The Structure of Scientific Revolutions.) It is possible to explain the same event in many ways. What cults do is to offer an alternative way of looking at things. When everyone holds the same belief but you, their view starts to make sense. Society, especially the smaller scale societies we had throughout most of human evolution, could not operate smoothly if everyone were to hold a different belief about the nature of reality. Millions of years of evolution have selected for a human tendency to be influenced by the beliefs of others. If this were not the case, how could any child be socialized to be a member of the group? There are, of course, rebels and visionaries, people who do not accept the beliefs of the group. But they are much fewer in number. 
Furthermore, adolescence seems to be a major time for group conformity. Teenagers appear to have a strong need to belong, to look and act like one of the group. And it is these adolescents and post-adolescents who are most strongly attracted to cults.

How does mind control work? Let me rephrase that. Even “mind control” is too strong a term – for it, too, conjures up visions of men reaching invisible fingers into your brain, controlling your thoughts and actions like a puppeteer. I think of it more as a socialization process in which one is led to think like the rest of the group. Robert Lifton, in his seminal book Thought Reform and the Psychology of Totalism: A Study of Brainwashing in China, outlines the eight conditions that result in ideological totalism: milieu control, mystical manipulation, need for purity, personal confession, acceptance of basic group dogma as sacred, loading the language, subordination of person to doctrine, and dispensing of existence. As I see it, all of these features conspire to do two things: (1) isolate the person within a particular cultural context so that that context becomes the only reality, and (2) make the individual feel that if he becomes a member of the group, he will be special. These features are an inherent part of any culture, and not necessarily purposefully contrived to achieve particular aims. Let me give an example.

Several years ago, I spent a summer doing fieldwork in Guatemala. After a month in the field, I couldn’t remember a lot of things about home, e.g., my husband’s voice. He was back in the U.S. Reality was where I was, in Guatemala. One regret I have is not buying more of the beautiful Indian weavings. The reason I didn’t was that they were “too expensive.” The finest cost approximately $30. To buy something similar here would cost well over $100. But I had internalized the Guatemalan standard of money. That summer, no one was purposely trying to control my environment. It was controlled by virtue of the fact that I was spending most of my time in a small rural village. Though I retained most of my American ways and beliefs, my sense of reality was slowly changing, and Guatemala became the standard by which I tested reality.

Regarding the notion that ideological totalism functions to make an individual feel that if he joins the group, he will somehow be better than everyone who is not a member – this is not a new concept. All cultures promote this idea about themselves. The attitude is called “ethnocentrism.” Everything we do is right and natural; everything outsiders do is unnatural, barbaric, etc. The names that most small scale societies use to refer to themselves generally translate into something meaning “the people” or “human beings,” implying that everyone who is not a member of the group is somehow less than human. Perhaps I am overstating the case, but what I saw the Moonies do was to do on a smaller scale what all cultures do with their members.

The techniques they use are for the most part, not very sinister. They are things we encounter in everyday life. They are how we become socialized. The cult becomes a total subculture.

Which brings me to what I think is the most important part. In the beginning, they don’t influence you by changing your beliefs. As I said earlier, they did not affect mine in the least in that short weekend. (Although I should point out that my beliefs are very clear and strong. Most people who join the church are self-described “searchers”: they’re looking for answers.) The way they get to you is emotional. If you stay with an isolated group of people long enough, you will eventually begin to think like they do, act like they do, see the world as they do. It’s part of human nature. It’s what we anthropologists mean when we talk about enculturation. The degree of acculturation (taking on the culture of another group) will depend upon the relative amount of time you associate with people from your own culture and from the new culture, among other factors. If you associate only with members of the new culture, acculturation will generally be much more rapid.

So how do they get you to stay? By giving you a good time, by being likeable, by being happy. Of all the things I expected to happen that weekend, the last thing I expected was to have a good time. Except for the lectures, which I found rather boring and insulting (I thought they were aimed at about a third grade level), I really had fun. We sang a lot, people performed songs and poems, we put on a group talent show, we played volleyball. We became children again, with no responsibilities. It was like being at camp; in fact, it was called camp: Camp K. The setting was beautiful – in the mountains, along a creek, with lots of trees.

They also make you feel really good about yourself. One of the famous Moonie techniques is “love bombing,” which basically consists of giving someone a lot of positive attention. For example, one morning, Jane said to me, “You know, you’re really one of the most open people I’ve ever met. You don’t put up any defenses. You’re really open. I think that’s so great.” When she said this, part of my mind went, “Flash: love-bombing, love-bombing.” But the other part of me went, “Yeah, but it’s really true. (Don’t we all like to believe the best about ourselves?) She probably really means it.” In any case, it made me feel good. Despite my intellectual recognition of what she was doing, emotionally, I bought it.

Another technique they use is to make you feel part of the group. New recruits were constantly encouraged to take part in the many performances that were put on. During one of the initial group sessions, when we were introducing ourselves, I mentioned that I like to dance. That night, when we were making up our presentation for the “talent show,” everyone kept urging me to choreograph our musical number. I felt a bit shy about it, but then figured, why not? I had never seen a more supportive group in my life. There was no way to fail – except not to take part. I had about 5 minutes to make up and teach a number to a group of 15. Needless to say, my “dance” was simple and rather silly. But it was all in fun and didn’t matter. It made me feel a part of the group. It also gave them ample opportunity for more love-bombing. After the show and all the next day, at least a dozen people came up to tell me what a “great” dance it was. Despite the fact that I knew it wasn’t, it still felt good to have people compliment me on something that is important to me. I was made to feel good by being part of the group.

They also made me feel that I was a lot like individual members of the group. Part of my “cover” was that I was a third grade school teacher. (I did teach 3rd grade for 10 weeks once.) When I told this to my “spiritual father,” he replied, “I used to be a school teacher too.” He kept emphasizing how much alike we are. (We’re not.) He also told me how much I remind him of a close friend of his. Someone else told me how much I reminded her of her sister-in-law. Other people told me that I look “so familiar.” It was rather transparent to me that this was merely a technique to make me feel that we were not so different and I could be a part of them. (Actually, this technique was too obvious and not effective on me.)

Socialization also works through subtle peer pressure. At the end of Saturday evening, we once again got in our groups to discuss “what we liked best about the day.” As we went around the circle, people mentioned things like the lecture we had on Rev. Moon, or the movie about the Unification Church, or something that was said in the lecture. As it was coming around to me, I was thinking, “My honest answer would be the volleyball game. I really had a great time playing volleyball. But if I say that, I’m going to sound really shallow compared to everybody else. And I know I’m not shallow.” So I chose something that was also true, though less so, but which sounded much better. When my turn came, I said, “I really enjoyed meeting a lot of really nice people.” Because of a general human tendency to try to create a positive image of ourselves, I was slowly becoming socialized into the ways of the group. If this were a group that valued physical activity, my true response would have been appropriate. But this was a group that valued God, love, ideals, and so I found myself shaping myself in a way that emphasized the aspects of my being that were most acceptable to the values and standards of the group. We are all multi-faceted. It is a common experience to find that different people or groups of friends bring out different aspects of our personality. Generally, we change subtly as we interact with each group, thus expressing, across our various relationships, all aspects of our personality. In a totalist group like the Moonies, however, the group values are so strong and so consistent that only one side of ourselves is elicited and reinforced. We thus shape our personality as we become socialized into the group.

The most powerful aspect of the whole experience was the personal relationships. At the beginning of the weekend, I remember thinking that there really wasn’t anyone there that I would want to be friends with. But by the end of 2 ½ very intense days, I had developed a few attachments, especially to two of the women, Susan and Jane. I also felt very guilty about deceiving them regarding who I was and why I was there. Yet I couldn’t tell them the truth because then I couldn’t be sure that they weren’t treating me differently from others – non-researchers. Even though I knew they were deceiving me in subtle ways and that the ultimate goal that was shaping their behavior toward me was the desire to get me to join the group, I still felt guilty. I honestly liked them. They seemed so open and honest with me, although I still don’t know how open and honest that really was. They seemed to like me. My ego wants to believe they did. The whole cult issue is very clouded in my mind. It is exceedingly complex. If their main motive was to get me to join the group, it was because they believed that by doing so, they were helping to save the world and my soul. Is that so dishonest? Yet how honest is it to consciously use those very effective techniques? I see them as both victims and victimizers. Simultaneously.

They presented a lifestyle alternative that was very appealing. Community, love, idealism. They presented a picture of true happiness. Yet we learn from ex-members (who admittedly have their own biases) that this picture is false. Or at least, only part of the picture. What is left out is the fear and guilt and the loss of self.

What the “brainwashing” is all about, in my view, is grabbing you emotionally. Giving you a good time, showing you others, like yourself, who are fulfilled. People who, like you, were searching for answers to life’s basic questions and found them. Why not stay a little longer, and learn a little more about them? You don’t have to believe in the doctrine right away. You can still think critically at the end of the weekend, when you make the decision to stay on for the 7-day seminar. But you’ve begun to develop emotional ties that will keep you there. To learn a little more. Until they have finally socialized you into their way of life. They grab you emotionally until they can keep you long enough to completely socialize you.

I am writing this article because I think it is important to understand what is going on. I know that I didn’t understand, despite having done a lot of reading and talking to people about it. I think it is because most of us have too many strong associations with the words “brainwashing” and “mind control.” They seem so overt. They’re not. The process can be extremely subtle. But because we have such strong associations, we do not recognize the process in its other manifestations. I think that in part it is because it is so familiar. It is something that happens every day to every child that is born on this planet. Society is possible only because socialization techniques are effective. Socialization isn’t sinister. The problem I see with the cults is the context. As an anthropologist, I am aware of the existence of what we would term cults in other societies. I think that cults have a greater and more damaging impact in our culture because we value the individual so highly. From discussions with ex-members, it appears that one of the most negative effects of cult involvement is a loss of self. Many other societies value the group over the individual. Although I am not a psychiatrist, I would guess that it is not so damaging to the psyche to give up your individual identity to the group (the cult), if you have always been raised to value the group over the self. But in our culture, where the opposite is true, this can be devastating to many individuals.

I think it was the contrast between my expectations and my experience that allowed the weekend to have such a strong emotional effect on me. I was looking for something big and evil and what I found was very subtle and friendly, so I didn’t recognize its power. I was also mistaken in believing that the socialization process (or the influence process) was intellectual. It’s not. It’s emotional, and thus touches a deeper and more central part of one’s brain. When I left at the end of the weekend, a friend who had been in the Moonies and worked for a while as a deprogrammer picked me up. One of the first things I said to him was, “I had a great time. Remind me again what’s so bad about the Moonies.”

The next day I was interviewing a former deprogrammer. About half-way through the interview I asked her to describe exactly what she did during the deprogramming. She looked me directly in the eye and said, “Exactly what I’ve been doing with you.” This shocked me, because I didn’t think I needed any deprogramming. I didn’t buy their doctrine. They didn’t brainwash me. But they did get to me. I had forgotten all of the organization’s abuses of church members: the long hours of fund-raising, sometimes in dangerous areas, late at night; the lack of proper nutrition; the suicide training; the fear and guilt; the relative poverty the members live in, while the leaders live in splendor; the munitions factory owned by a church which is supposedly striving for world peace; the divisions created between family members; the deception; all of the horrors. Part of me remembered them, because I remember asking questions about what exactly the church does to make the world better, knowing that most members spend their time selling flowers. But that knowledge didn’t seem important. The people seemed good, so by association, the group did too. I had been influenced. The emotional truth was so much stronger than the intellectual one that it was the only one that seemed important.

I have mixed feelings about the use of the term “brainwashing” with regard to cult indoctrination. Because of the general effectiveness of the techniques in influencing a person’s thoughts and actions, I can understand the persistence of its use. If someone like Patty Hearst is going to be defended on such a basis, it needs to be recognized as a powerful and legitimate technique (although degree of susceptibility will vary). However, if the goal is to keep people out of cults, I am afraid the contrast between the stereotypic notion of brainwashing (which I don’t think we can escape) and the experience a new recruit has is so sharp that people are disarmed and no longer aware of the techniques being used on them. Instead, I would advocate seeing the brainwashing process in the context of socialization. This is something with which we are all familiar and to which we attach few, if any, negative connotations. At the same time, it is something whose power we recognize. I would contend that the process of “brainwashing” can best be understood as an intensified socialization experience. I may be quibbling over semantics, but given the fact that the words in question are so loaded, I feel that semantics are important here. The Moonies take the raw material of our human needs – to be loved and to be accepted – and use the same techniques that for centuries cultures have used to shape individuals into members of the culture: peer pressure, reward and punishment, and the experience of being surrounded by individuals who all view the world in the same way.

My weekend with the Moonies was intended to answer some questions I had. Instead, it raised many more. The most solid thing I came away with, however, and my reason for writing this, is a new understanding of brainwashing. If we are to avoid it, we must first learn to recognize it.


Geri-Ann Galanti, Ph.D., is a medical anthropologist and lecturer at the UCLA School of Medicine. Dr. Galanti was formerly on the faculty of California State University’s Department of Anthropology and its School of Nursing, where she developed the curriculum for the BSN program’s Cultural Diversity in Healthcare course. Dr. Galanti is a consultant to Civility Mutual.

Geri-Ann Galanti

This article is an electronic version of an article originally published in
Cultic Studies Journal, 1984, Volume 1, Number 1, pages 27-36. 



Sun Myung Moon’s theology used to control members


Japanese woman recruited and sold by the Moon church to a Korean farmer

A 20-year-old woman, recruited by the Family Federation for World Peace and Unification / UC in Japan, was sold to an older Korean farmer in an “apology marriage”.



Allen Tate Wood on Sun Myung Moon and the Unification Church


20. VIDEO: Recovery from RTS (Religious Trauma Syndrome) by Marlene Winell


21. VIDEO: ICSA – After the cult


22. “How do you know I’m not the world’s worst con man or swindler?” Sun Myung Moon

ORDER NUMBER 77-01-02

REVEREND SUN MYUNG MOON SPEAKS ON
LET US MEET OPPORTUNITY WELL
January 2, 1977 
World Mission Center 
Translator – Bo Hi Pak

Let’s say God promised you something. Is it an empty promise or will it be delivered? Once God makes a promise, once God makes a decision, He always keeps that promise even if it takes thousands of years for it to be fulfilled. Time after time He has fulfilled His promises. We all need this God.

How about human promises? We promise each other quite a bit. Sometimes we even make promises knowing that we will never fulfill them. In other words, we lie. We have all lied; none of you has been perfect. In human affairs everybody lives like that. This is the honest situation.

Human lies are everywhere because lies are very convenient. Without lies, commercial people practically couldn’t continue in business. What about God? He does not lie, and does not hear lies because He knows they are lies. He sees through them. Often we listen to lies without knowing it, but not God. We are exposed as we really are before God because our lies cannot hide anything from Him. This means you cannot even trust me 100%. I have human weaknesses. That’s an honest and frank statement. However, I am introducing you to a person you can trust 100%: God. My mortal body will live only one generation here on earth, but God will remain here forever.

The same thing is true for all mankind. You may often think the same thing, “I wish there was no God. He bothers me too much. Oh, God go away somewhere. I want to do my own thing.” I’m sure some of you are thinking right now, “Oh, I wish Rev. Moon were not here. He bothers me too much. He pushes me too much!”

How do you know I’m not the world’s worst con man or swindler? Have you seen my heart? No one can see it. How can you trust me? You can only trust me by experiencing life together with me. Maybe your future experience will be entirely different. Regardless, are you ready to go?

I tell you one thing: You will never lose yourself; you will never be harmed by going this way. Suppose I were telling you lies, but you took my lies as the truth, as God’s words and lived them 100%. God knows very well what is true or false. In that case, God may condemn me, but God would never condemn you. God may not give me the blessing but He definitely would not withdraw His blessing from you. Actually this is a challenge. Even if I am telling you lies, if you take them seriously as the word of God, and you fulfill them, actually there is a chance that you could become the real Rev. Moon. You can’t lose. When you take things seriously and live the teaching, not for your own sake but for God’s sake, God will never abolish your deeds.

How can you take my word 100% seriously? You got up this morning for pledge service, didn’t you? Tell me very honestly, were you willing to do that this morning? Don’t tell me big lies! You did it because you had to! Even I didn’t want to get up at 4:30. Are you different? Why do I do it if I don’t want to? Because there’s someone upstairs watching me.

Nobody really wants to go out there selling peanuts and flowers. Nobody wants to go out there on the street, acting like a crazy man trying to grab people and witness. You act almost like servants to the people, trying to win their hearts, trying to talk to them. When you think of it, the amount of work you have to do to win one person’s heart is incredible. And you have to do it day in and day out. You go out fundraising every day, from early morning to late night. Are we really fond of doing it? We do it because we have to.

Respected people outside will say, “How crazy you are. Why did you become a slave of Rev. Moon? I have never seen such a fool.” Do you have the courage to overcome that kind of reaction?

Let me tell you one episode from my past. Many times in North Korea and one time [1955] in South Korea I was in jail. There was one ardent member following me around that time, but he became tired and left. Then he read in the newspaper that I was going to jail. At that time many members were trying to encourage me, saying, “Don’t worry, Father. You just wait; we shall do 1000 times more than you.” But this particular person came to the prison, curious to see how I looked. He happened to be in such a position that I met him face to face, and I will never, never forget that man’s statement. He said to my face, “You fool, are you still doing this?”

[The main investigation into Sun Myung Moon in 1955 was into his sex rites with many students from Ewha Womans University, and some married women.] LINK


23. VIDEO: The Space Between Self-Esteem and Self Compassion: Kristin Neff

The importance of reconnecting with our emotions and self care.


24. VIDEO: What Is A Cult? CuriosityStream  July 2020

When people hear the word cult, a certain image tends to burst into their brain: brainwashed devotees worshipping a crazed narcissist, ready to sacrifice anything for them.

While Jonestown, Heaven’s Gate, and the Solar Temple have given cults a violent reputation, most cults keep a much lower profile.

Cults are everywhere. There’s probably a cult in your town. And the tricks they use to lure people in aren’t complicated but rather tried and tested social manipulation tactics that anyone can fall for.

So what is a cult, how do they trap people, and how do people end up believing that volcano alien ghosts are living on their bodies? Well, Let’s Find Out.


25. Bibliography

(A bibliography section on the UC / FFWPU / Moon church of Japan is below)

Ashamed to be a Korean: Raised in Sun Myung Moon’s cult

Alstad, Diana and Kramer, Joel (1993) The Guru Papers. Berkeley, Calif.: North Atlantic Books  WEBSITE

Anderson, Scott and Anderson, Jon Lee. (1986) Inside the League. New York: Dodd Mead and Co.
__________  Chapter Five (all)
__________ Chapter Five (extract – Sasakawa and Kodama)

Atack, Jon. (2016) Opening Minds: the secret world of manipulation, undue influence and brainwashing. Second Edition. Colchester, England: Open Minds Foundation Trentvalley Ltd.
__________ WEBSITE

Bale, Jeffrey M. The Unification Church and the KCIA – ‘Privatizing’ covert action: the case of the UC  Lobster, May 1991

Blake, Mariah (November 25, 2013) “The Fall of the House of Moon” New Republic pages 28-37
https://newrepublic.com/article/115512/unification-church-profile-fall-house-moon

__________ (December 9, 2013) “Meet the Love Child Rev. Sun Myung Moon Desperately Tried to Hide” Mother Jones http://www.motherjones.com/politics/2013/12/reverend-moon-unification-church-washington-times-secret-son

Boettcher, Robert (with Freedman, Gordon L.) (1980) Gifts of Deceit, Sun Myung Moon and the Tongsun Park Korean Scandal. New York: Holt, Rinehart and Winston

Case, Thomas W. (1995) Moonie, Buddhist, Catholic: A Spiritual Odyssey. Cincinnati, OH, USA: White Horse Press

Cialdini, Robert (2009) Influence, Science and Practice (5th edition). Boston, MA, USA: Pearson Education, Inc.  VIDEO

Choe, Joong-Hyun (1993) The Korean War and messianic groups: Two cases in contrast (Unification Church and the Olive Tree Movement). PhD thesis, Syracuse University, USA.

Choi, Syn-duk 崔信德 (1967) Korea’s Tong-il Movement. in Transactions of the Royal Asiatic Society No. 43 (1967) Volume XLIII pages 101-113.
LINK to a PDF of the magazine

Chun, Young Bok (1976) The Korean Background of Unification Church: A New Religion. pp 14-18 in Japanese Religions Vol. 9 July 1976 No. 2. A magazine issued by the NCC Center for the Study of Japanese Religions Kyoto, Japan

Clarkson, Frederick (1997) Eternal Hostility: Struggle Between Theocracy and Democracy. Monroe, Maine, U.S.: Common Courage Press

Clarkson, Frederick (2012) Missing Pieces of the Story of Sun Myung Moon


Deikman, Arthur J., (2009) Them and Us: Cult Thinking and Terrorist Threat. Bay Tree Publishing  VIDEO SUMMARY    INTERVIEW  

De Mente, Boyé Lafayette (2018) The Korean Mind. Understanding Contemporary Korean Culture. North Clarendon, Vermont, USA: Tuttle

Durham, Deanna (1981) Life Among the Moonies: three years in the Unification Church. Plainfield, New Jersey, USA: Logos International

Edwards, Christopher (1979) Crazy for God. Englewood Cliffs, New Jersey, USA: Prentice Hall Inc.

Elkins, Chris (1980) Heavenly Deception. Wheaton, Illinois, USA: Tyndale House Publishers, Inc.

Ford, Wendy (1990) Recovery from Abusive Groups, American Family Foundation

Freed, Josh (1980) Moonwebs, Journey into the Mind of a Cult. Canada: Dorset Publishing Inc. (Hardback), Virago (Paperback).

Freed, Josh (1980) Billet pour le ciel (French)

Goldberg, Lorna; Goldberg, William; Henry, Rosanne; Langone, Michael (2017) Cult Recovery – a clinician’s guide to working with former members and families. Bonita Springs, Florida, USA: ICSA   WEBSITE

Gorenfeld, John (2008) Bad Moon Rising (how the Reverend Sun Myung Moon created the Washington Times, seduced the religious right, and built his Kingdom). Sausalito, CA, USA: PoliPoint Press

Greene, Ford (2005)  Ford Greene: Attorney at odds

__________ (2012) PODCAST. Ford with Peter B. Collins September 2012

Guisso, Richard W.I. and Yu, Chai-shin (1988) Shamanism: The Spirit World of Korea. Berkeley, Calif.: Asian Humanities Press

Harvey, Todd. (1995) My experience in the Unification Church
__________   Eight reasons why I got out of the UC

Hassan, Steven (2015) Combating Cult Mind Control, Freedom of Mind Press
__________  WEBSITE

Herman, Judith (1992, 2015) Trauma and Recovery. The aftermath of violence – from domestic abuse to political terror. New York: Basic Books

Herman, Judith (1998) Recovery from psychological trauma. Psychiatry and Clinical Neurosciences

Hoffer, Eric (1951) The True Believer, Thoughts on the Nature of Mass Movements. New York: Harper Perennial Modern Classics

Hong, Nansook (1998) In the Shadow of the Moons: My Life In The Reverend Sun Myung Moon’s Family. Boston, USA: Little, Brown and Company.
__________ Interview with Mike Wallace on ‘60 minutes’
__________ Interview with Herbert Rosedale
__________ Interview – The Dark Side of the Moons

Hong, Nansook (1998) « L’ombre de Moon » (French)
__________ Interview – J’ai arraché mes enfants à Moon (French)

Hong, Nansook (2020) A la Sombra de los Moon (Spanish)

Hong, Nansook (2000) Ich schaue nicht zurück (German)

Horowitz, Irving Louis (1978) Science, Sin, and Scholarship: The Politics of Reverend Moon and the Unification Church. Cambridge, USA, and London: The MIT Press

Hose, Teddy:
__________ VIDEO: Over the Moon – Escaping the Unification Church
__________ VIDEO: Secrets of the Moonies

Junas, Daniel (1991) The Moon Organization Academic Network

Kaycee PODCAST The Cult Vault: Introduction to the Study of Cults
__________ PODCAST #3 The Unification Church AKA The Moonies
__________ PODCAST #45 Unification Church – Revisited

Kiaba, Jen. The Purity Knife

Kim, Chong-sun (1978) Rev. Sun Myung Moon. Washington, D.C.: University Press of America (Rowman & Littlefield)

Kohn, Lisa (2018) To the Moon and Back: A Childhood Under the Influence New York: Heliotrope Books LLC
__________ (2019) Lisa Kohn Interview on Generation Cult

Lalich, Janja and Tobias, Madeleine (2006) Take Back Your Life: Recovering from Cults and Abusive Relationships. Berkeley, Calif.: Bay Tree Publishing.
Lalich, Janja (September 2019)  Interview: Are we all cult members now?
__________  VIDEO
__________  WEBSITE

Langone, Michael – Editor (1993) Recovery from Cults: Help for Victims of Psychological and Spiritual Abuse. New York: W.W. Norton and Company   WEBSITE

Lifton, Robert Jay (1961) Thought Reform and the Psychology of Totalism. Chapel Hill, NC, USA: University of North Carolina Press

__________ (2019) Losing Reality (on cults, cultism and the mindset of political and religious zealotry). New York: The New Press

Lofland, John (1971) Doomsday Cult: A Study of Conversion, Proselytization, and Maintenance of Faith. Enlarged edition. 362pp (first published by Prentice Hall) New York: Irvington Publishers, ISBN-10: 0-8290-0095-X

Mazer, Cathryn (1993) VIDEO: NBC Today Show: Cathryn Mazer and the Unification Church (11/15/93)

__________ (2001) VIDEO: Cathryn Mazer was encouraged to lie by the UC

__________ (2019) PODCAST Cathryn Mazer and her mother with Rachel Bernstein: Whisked Away By The Moonies with Cynthia Lilley and Cathryn Mazer

__________ (2019) PODCAST Cathryn Mazer and her mother with Rachel Bernstein: A Bond That Was Life-Saving with Cynthia Lilley and Cathryn Mazer, ex-Unification Church

Mook, Jane Day (May 1974) “New Growth on Burnt-Over Ground” in A.D. pages 30-36

Naylor, R.T. (2004) Hot Money and the Politics of Debt – What links the Mafia, the Vatican, the Moonies, the CIA, cocaine barons, banks – and you? Third Edition (536pp) Montreal and Kingston, Canada: McGill-Queen’s University Press

Nevalainen, Kirsti L. (2011) Change of Blood Lineage through Ritual Sex in the Unification Church (162pp)

Paden, William E. (1994 edition) Religious Worlds, the Comparative Study of Religion. Boston, MA: Beacon Press

Park, Sam (2014) Testimony of one of Moon’s secret sons

Parke, Jo Anne and Stoner, Carol (1977) All God’s Children: The Cult Experience—Salvation or Slavery? Radnor, Pennsylvania: Chilton Book Company

Parry, Robert.   Index of Articles     consortiumnews.com

Reiss, Steven (2015) The 16 Strivings for God. Mercer University Press

Rice, Berkeley (1976) “The pull of Sun Moon”

Shaw, Daniel (2014) Traumatic Narcissism, Relational Systems of Subjugation. Hove, East Sussex, UK: Routledge

__________ (2020) ICSA: A Prison of Shame & Fear: Understanding the Role of Shame in Cult Indoctrination & Recovery with Dan Shaw, LCSW   VIDEO

Shermer, Michael (2012) The Believing Brain: From Spiritual Faiths to Political Convictions – How We Construct Beliefs and Reinforce Them as Truths      VIDEO

Singer, Margaret Thaler with Lalich, Janja (Feb 15, 1995) Cults in Our Midst: The Hidden Menace in Our Everyday Lives. San Francisco: Jossey-Bass Social and Behavioral Science Series.

Singer, Margaret Thaler Cults VIDEO

Soh, C. Sarah (2008) The Comfort Women, Sexual Violence and Postcolonial Memory in Korea and Japan. Chicago and London: The University of Chicago Press

Stein, Alexandra (2017) Terror, Love and Brainwashing – Attachment in Cults and Totalitarian Systems. London and New York: Routledge
__________ How totalism works – The brainwashing methods of isolation, engulfment and fear
__________  WEBSITE
__________  VIDEO: Talk Beliefs
__________  VIDEO: Interview with Chris Shelton

Tahk Myeong-hwan  Testimony

Underwood, Barbara and Underwood, Betty (1979) Hostage to Heaven. New York: Clarkson N. Potter, Inc.

Walker, Pete (2013) Complex PTSD: From Surviving to Thriving: a guide and map for recovering from childhood trauma. Berkeley, Calif.: An Azure Coyote book (self-published)  WEBSITE

Winell, Marlene (2007) Leaving the Fold. Berkeley, Calif.: Apocryphile Press  VIDEO

Wood, Allen Tate. Interviews and extract from his 1979 book, Moonstruck

Yamamoto, J Isamu (1977) The Puppet Master. An Inquiry into Sun Myung Moon and the Unification Church. Downers Grove, Illinois, USA: Intervarsity Press

Zablocki, Benjamin and Robbins, Thomas (Editors) (2001) Misunderstanding Cults: Searching for Objectivity in a Controversial Field. University of Toronto Press, Scholarly Publishing Division

Zieman, Bonnie (2017) Cracking the Cult Code for Therapists: What Every Cult Victim Wants Their Therapist to Know. CreateSpace Independent Publishing Platform   VIDEO



Bibliography – section on the UC / FFWPU / Moon church of Japan

Kang, Wi Jo (1976) The Unification Church: Christian Church or Political Movement?


Tragedy of the Six Marys VIDEO in Japanese translated transcript

Tragedy of the Six Marys VIDEO

Moon church of Japan 1998 VIDEO: Demand for all property

Shocking video of the UC of Japan demanding money – English transcript

Soejima, Yoshikazu (1984) This is the secret part of the ‘Unification Church’
some information in English available in a Washington Post article

Why did a Japanese Moon church member kill her Korean husband?

Suicide of a Japanese money mule for Moon in Uruguay

Japanese woman recruited and sold by the Moon church to a Korean farmer

The Papasan Choi Japanese origins of Boonville in California

Hiroko Yamasaki (Olympic athlete) joined and left the UC in Japan

A huge Moon church financial scam in Japan is revealed

Moon extracted $500 million from Japanese female members

6,500 women missing from the Moon church mass weddings

The Unification Church of Japan used members for profit, not religious purposes

The Atsuko Kumon Hong “suicide / murder” of August 2013

How Sun Myung Moon bought protection in Japan

“Apology marriages” made by Japanese UC members to Korean men

The Comfort Women controversy. This issue is used to manipulate the Japanese members


The Making of a Moonie – choice or brainwashing? (1984)
by Eileen Barker

page 265 note 31.  “In conversations with scores of non-Unificationist Koreans the first information I have been given about the Unification Church has, in almost every instance, been that Moon engages (or has engaged) in immoral sexual practices with his followers.”