Cult Indoctrination by Kimiaki Nishida

Cult Indoctrination through Psychological Manipulation

by Professor Kimiaki Nishida 西田 公昭 of Rissho University in Tokyo.


This is a shortened version of an article entitled Development of the Study of Mind Control in Japan, first published in 2005.

Recently, psychologists in Japan have been examining a contemporary social issue — certain social groups recruit new members by means of psychologically manipulative techniques called “mind control.” They then exhort their members to engage in various antisocial behaviors, from deceptive sales solicitation and forcible donation to suicide and murder [e.g. Tokyo sarin gas attack by Aum Shinrikyō in 1995]. We classify such harmful groups as “cults” or even “destructive cults.” Psychologists concerned with this problem must explain why ordinary, even highly educated people devote their lives to such groups, fully aware that many of their activities deviate from social norms, violate the law, and may injure their health. Psychologists are now also involved in the issue of facilitating the recovery of distressed cult members after they leave such groups.

Background
In the 1970s, hardly anyone in Japan was familiar with the term “destructive cult.” Even if they had been informed of cult activities, such as the 1978 Jonestown tragedy, in which 912 members of the Guyana-based American cult were murdered or committed suicide, most Japanese people would have thought the incident a sensational, curious, and inexplicable event. Because the events at Jonestown occurred overseas, Japanese people, except possibly those worried parents whose child had joined a radical cult, would not have shown any real interest.

In the 1980s, a number of Japanese, including journalists and lawyers, became concerned about the “unethical” activities of the Unification Church, whose members worshiped their so-called True Father, the cult’s Korean founder Sun Myung Moon, who proclaimed himself to be the Second Advent of Christ. One of the group’s activities entailed shady fund-raising campaigns. Another unethical activity of the cult in the 1980s was Reikan-Shôhô, a swindle in which they sold spiritual goods, such as lucky seals, Buddhist rosaries, lucky-tower [pagoda] ornaments, and so on. The goods were unreasonably expensive, but intimidated customers bought them to avoid possible future misfortune [or to liberate their deceased loved ones from the ‘hell’ they were told they were suffering in].

The first Japanese “anti-cult” organization was established in 1987 to stop the activities of the Unification Church. The organization consisted of lawyers who helped Reikan-Shôhô victims all over Japan (see Yamaguchi 2001). According to their investigation, the lawyers’ organization determined that the Unification Church in Japan engaged in three unethical practices. First, large amounts of money were collected through deceptive means. Under duress, customers desperate to improve their fortunes bankrupted themselves through buying the cult’s “spiritual” goods. Second, members participated in mass marriages arranged by the cult without the partners getting to know each other, after the partners were told by the cult leader that their marriage would save their families and ancestors from calamity. Third, the church practiced mind control, restricting members’ individual freedom, and employing them in forced labor, which often involved illegal activity. Mind-controlled members were convinced their endeavors would liberate their fellow beings.

The 1990s saw studies by a few Japanese psychological researchers who were interested in the cult problem. By the mid-1990s, Japanese courts had already acknowledged two Unification Church liabilities during proceedings the lawyers had brought against the cult; namely, mass marriage and illegal Reikan-Shôhô (see Judgment by the Fukuoka [Japan] District Court on the Unification Church, 1995). The lawyers’ main objective, however, had been that the court confirm the Unification Church’s psychological manipulation of cultists, a ruling that would recognize these members as being under the duress of forced labor.

What Is Mind Control?
Early in the study of mind control, the term was equated with the military strategy of brainwashing. Mind control initially was referred to in the United States as “thought reform” or “coercive persuasion” (Lifton 1961; Schein, Schneier, and Barker 1961). Currently, however, mind control is considered to be a more sophisticated method of psychological manipulation that relies on subtler means than physical detention and torture (Hassan 1988).

In fact, people who have succumbed to cult-based mind control consider themselves to have made their decision to join a cult of their own free will. We presume that whereas brainwashing is a behavioral-compliance technique, individuals subjected to mind control come to accept fundamental changes to their belief system. Cult mind control may be defined as temporary or permanent psychological manipulation by people who recruit and indoctrinate cult members, influencing their behavior and mental processes in compliance with the cult leadership’s desires, a manipulation of which the controlled members remain naive (Nishida 1995a).

After the Aum attacks, Ando, Tsuchida, Imai, Shiomura, Murata, Watanabe, Nishida, and Genjida (1998) surveyed almost 9,000 Japanese college students. The questionnaire was designed to determine: whether the students had been approached by cults and, if so, how they had reacted; their perception of alleged cult mind-control techniques; and how their psychological needs determined their reactions when the cults had attempted to recruit them.

Ando’s survey results showed that about 20% of respondents’ impressions of the recruiter were somewhat favorable, in comparison with their impressions of salespersons. However, their compliance level was rather low. The regression analysis (see the illustrative sketch following this list) showed that the students tended to comply with the recruiter’s overture when:

• they were interested in what the agent told them;
• they were not in a hurry;
• they had no reason to refuse;
• they liked the agent; or
• they were told that they had been specially selected, could gain knowledge of the truth, and could acquire special new abilities.
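The original regression model and its coefficients are not reported in this summary. Purely as an illustrative sketch, using synthetic data rather than anything from Ando et al. (1998), the following Python snippet shows the general shape of such an analysis: a compliance indicator regressed on binary predictors like those listed above. All variable names, weights, and values are hypothetical.

```python
# Hypothetical illustration only: synthetic data standing in for survey responses.
# The predictors mirror the factors listed above; no values come from the actual study.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500  # imaginary respondents

predictors = ["interested", "not_in_hurry", "no_reason_to_refuse", "liked_agent", "felt_selected"]
X = rng.integers(0, 2, size=(n, len(predictors)))  # 0/1 answers to each item

# Simulate a "complied with the overture" outcome loosely driven by assumed weights.
logits = X @ np.array([1.2, 0.6, 0.8, 1.0, 0.9]) - 2.5
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)
for name, coef in zip(predictors, model.coef_[0]):
    print(f"{name}: {coef:+.2f}")  # positive coefficients -> higher odds of compliance
```

In the published study the predictors, model form, and estimates of course differ; the sketch is meant only to make concrete the reported relationship between these factors and compliance.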

When asked to evaluate people who were influenced or “mind controlled” by a cult, respondents tended to think it was “inevitable” those people succumbed, and they put less emphasis on members’ individual social responsibility. When mind control led to a criminal act, however, they tended to attribute responsibility to the individual. More than 70% of respondents answered in the affirmative when asked whether they themselves could resist being subjected to mind control, a result that confirms the students’ naiveté about their own personal vulnerability. The respondents’ needs or values had little effect on their reactions to, interest in, and impressions of cult agents’ attempts to recruit them.

Mind Control as Psychological Manipulation of Cult Membership
Nishida (1994, 1995b) investigated the process of belief-system change caused by mind control as practiced by a religious cult. His empirical study drew on a questionnaire administered to 272 former group members; content analysis of the dogma in the group’s publications, videotaped lectures on dogma, and the recruiting and seminar manuals; and supplementary interviews with former members of the group.

Cult Indoctrination Process by Means of Psychological Manipulation
In one of his studies, Nishida (1994) found that recruiters offer the targets a new belief system, based on five schemas. These schemas comprise:

1. notions of self concerning one’s life purpose (Self Beliefs);

2. ideals governing the type of individual, society, and world there ought to be (Ideal Beliefs);

3. goals related to correct action on the part of individuals (Goal Beliefs);

4. notions of causality, or which laws of nature operate in the world’s history (Causality Beliefs); and 

5. trust that authority will decree the criteria for right and wrong, good and evil (Authority Beliefs) ▲.

Content analysis of the group’s dogma showed that its recruitment process restructures the target’s belief-system, replacing former values with new ones advocated by the group, based on the above schemas.

Abelson (1986) argues that beliefs are metaphorically similar to possessions. He posits that we collect whatever beliefs appeal to us, as if working in a room where we arrange our favorite furniture and objects. He proposes that we transform our beliefs into a new cognitive system of neural connections, which may be regarded as the tools for decision making.

Just as favorite tools are often placed in the central part of a room, or in a harmonious place, it appears that highly valued beliefs are located for easy access in cognitive processing. Meanwhile, much as worn-out tools are often hidden from sight in corners or storerooms, less-valued beliefs are relocated where they cannot be easily accessed for cognitive processing. An individual change in belief is illustrated by the replacement of a single piece of furniture, while a complete belief-system change is represented as exchanging all of one’s furniture and goods, and even the design and color of the room. The belief-system change, such as occurs during the recruitment and indoctrination process, is metaphorically represented in Figure 1 (below), starting with a functional room with its hierarchy of furniture or tools, and progressing through the stages of recruitment and indoctrination to the point at which the functional room has been replaced by a new set of furniture and tools that represent the altered belief system.

Step 0. The Figure shows the five schemas as a set of the thought tools that potential recruits hold prior to their contact with the group.

Step 1. Governed by their trust in authority, targets undergoing indoctrination remain naive about the actual group name, its true purpose, and the dogma that is meant to radically transform the belief system they have held until their contact with the group. At this stage of psychological manipulation, because most Japanese are likely to guard against religious solicitation, the recruiter puts on a good face. The recruiter approaches the targets with an especially warm greeting and assesses their vulnerabilities in order to confound them.

Step 2. While the new ideals and goals are quite appealing to targets, their confidence level in the new notions of causality also rises; some residual beliefs may remain at this stage.

The targets must be indoctrinated in isolation so that they remain unaware that the dogma they are absorbing is a part of cult recruitment. Thus isolated, they cannot sustain their own residual beliefs through observing the other targets; the indoctrination environment tolerates no social reality (Festinger 1954). The goal for this stage is for the targets to learn the dogma by heart and embrace it as their new belief, even if it might seem strange or incomprehensible.

Step 3. At this stage, the recruiter’s repeated lobbying for the new belief system entices the targets to “relocate” those newly absorbed beliefs that appeal to them into the central area in their “rooms.” By evoking the others’ commitment, the recruiter uses group pressure to constrain each target. This approach seems to induce both a collective lack of common sense (Allport 1924) and individual cognitive dissonance (Festinger 1957).

Step 4. As the new recruits pass through a period of concentrated study, the earlier conversion of particular values extends to their entire belief system. By the end, they have wholly embraced the new belief system. The attractive new beliefs are gradually “relocated” from their “room’s” periphery into its center, replacing older beliefs. Formerly held beliefs are driven to the room’s periphery, thoroughly diminished; new, now-central beliefs coalesce, blending with the few remaining older notions.

Shunning their former society, the targets begin to spend most of their time among group members. Their new social reality raises the targets’ conviction that the new beliefs are proper. At this time, the targets feel contentedly at home because the recruiters are still quite hospitable.

Step 5. The old belief system has become as useless as dilapidated furniture or tools. With its replacement, the new recruits’ belief systems are fully reconfigured around the new beliefs, with trust in authority at their core, thus making that authority an effective vehicle for thought manipulation.

At the final stage of psychological manipulation, during the recruitment and indoctrination process, the recruiters invoke the charismatic leader of the group ▲, equating the mortal with god. The recruiters instill a profound fear in the targets, fear that misfortune and calamity will beset them should they leave the cult.

Figure 1. Metamorphosis of the belief system change by cultic psychological manipulation.

Each ellipse represents the working space for decision making. The shapes colored black in the ellipse represent the newly inputted beliefs. The large shapes are developed beliefs, and the shapes in the middle represent beliefs that are highly valued by the individual. ▲ represents the authority of the charismatic leader of the group.


Cult Maintenance and Expansion through Psychological Manipulation

Nishida (1995b) studied one cult’s method of maintaining and expanding its membership by means of psychological manipulation, or cult mind control. The results of factor analysis of his survey data revealed that cult mind-control techniques induced six situational factors that enhanced and maintained members’ belief-systems: (1) restriction of freedom, (2) repression of sexual passion, (3) physical exhaustion, (4) punishment for external association, (5) reward and punishment, and (6) time pressure.
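The factor analysis itself is only named above. As a purely illustrative sketch, again with synthetic data and invented item labels rather than Nishida’s (1995b) questionnaire, the following Python snippet shows how such an analysis reduces many questionnaire items to a small number of underlying situational factors.

```python
# Hypothetical illustration only: exploratory factor analysis on synthetic questionnaire data.
# Items, loadings, and factor count are invented; nothing here reproduces the original survey.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n_respondents, n_items, n_factors = 300, 12, 3

# Simulate Likert-style item scores driven by a few latent situational factors
# (e.g., restriction of freedom, physical exhaustion, time pressure).
latent = rng.normal(size=(n_respondents, n_factors))
loadings = rng.normal(scale=0.8, size=(n_factors, n_items))
items = latent @ loadings + rng.normal(scale=0.5, size=(n_respondents, n_items))

fa = FactorAnalysis(n_components=n_factors, random_state=0).fit(items)
print(np.round(fa.components_, 2))  # rows = extracted factors, columns = item loadings
```

Grouping items by their strongest loadings is what allows a researcher to label factors; the six situational factors reported above would have been identified and named through a similar inspection of the real survey data.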

Studies also concluded that four types of complex psychological factors influence, enhance, and maintain members’ belief systems: (1) behavior manipulation, (2) information-processing manipulation, (3) group-processing manipulation, and (4) physiological-stress manipulation.

Behavior Manipulation
Behavior manipulation includes the following factors:

1  Conditioning. The target members were conditioned to experience deep anxiety if they behaved against cult doctrine. During conditioning, they would often be given small rewards when they accomplished a given task, but strong physical and mental punishment would be administered whenever they failed at a task.

2  Self-perception. A member’s attitude to the group would become fixed when the member was given a role to play in the group (Bem 1972; Zimbardo 1975).

3  Cognitive dissonance. Conditions are quite rigorous because members have to work strenuously and are allowed neither personal time nor money, nor to associate with “outsiders.” It seems that they often experienced strong cognitive dissonance (Festinger 1957).

Information-Processing Manipulation
Information-processing manipulation factors include the following:

1  Gain-loss effect. Many members had negative attitudes toward cults prior to contact with their group; the swing from negative to positive attitudes toward the cult became fixed as more positive than negative (Aronson and Linder 1965).

2  Systemization of belief-system. In general, belief has a tenacious effect, even when experience shows it to be erroneous (Ross, Lepper, and Hubbard 1975). Members always associate each experience with group dogma; they are indoctrinated to interpret every life event in terms of the cult’s belief-system. 

3  Priming effect. The priming effect is a cognitive phenomenon in which frequently rehearsed messages guide information processing in a specific direction (Srull and Wyer 1980). The members listen to the same lectures and music frequently and repeatedly, and they pray or chant many times every day.

4  Threatening messages. Members are inculcated with strong fears of personal calamity [illnesses such as cancer, accidents, the influence of evil spirits, being “restored after satan”], and so on.

Group-Processing Manipulation
Group-processing manipulation components include:

1  Selective exposure to information. Once they make a commitment to the group, members avoid negative reports and search for positive feedback (Festinger 1957). It should also be added that many group members continue to live in the same locale in which they left the wider society. Even so, new members are forbidden to have contact with out-of-group people, or access to external media.

2  Social identity. Members identify themselves with the group because the main goal or purpose of their activity is to gain personal prestige within the group (Turner, Hogg, Oakes, Reicher, and Wetherell 1987). Therefore, they look upon fellow members as elite, acting for the salvation of all people. Conversely, they look on external critics as either wicked persecutors or pitiful, ignorant fools. This “groupthink” makes it possible for the manipulators to provoke reckless group behavior among the members (Janis 1971; Wexler 1995).

Physiological-Stress Manipulation
It has been established that physiological stress facilitates this constraint within the group. Examples of such stress factors include:

1  urgent individual need to achieve group goals,
2  fear of sanction and punishment,
3  monotonous group life,
4  sublimation of sexual drive in fatiguing hard work,
5  sleep deprivation,
6  poor nutrition,
7  extended prayer and / or [study sessions].


Post-Cult Residual Psychological Distress
Over the past few decades, a considerable number of studies have been completed on the psychological problems former cult members experience after leaving the cult, as distinct from studies of the mind-control process itself.

It is important to note that most former members continue to experience discontent, although its cause remains controversial (Aronoff, Lynn, and Malinoski 2000). A few studies on cult phenomena have been conducted so far in Japan, notably by Nishida (1995a, 1998), and by Nishida and Kuroda (2003, 2004), who investigated ex-cultists’ post-exit problems, based mainly on questionnaires administered to former members of two different cults.

In a series of studies, Nishida and Kuroda (2003) surveyed 157 former members of the Unification Church and Aum Shinrikyō. Using factor analysis, the studies posited eleven factors that contribute to ex-members’ psychological problems. These factors can be classified into three main groups: (1) emotional distress, (2) mental distress, and (3) interpersonal distress. The eleven factors are (1) tendencies to depression and anxiety, (2) loss of self-esteem, (3) remorse and regret, (4) difficulty in maintaining social relations and friendships, (5) difficulty in family relationships, (6) floating or flashback to cultic thinking and feeling, (7) fear of sexual contact, (8) emotional instability, (9) hypochondria, (10) secrecy of cult life, and (11) anger toward the cult. These findings seem to correspond closely with those of previous American studies.

Moreover, Nishida and Kuroda (2004) deduced from their analysis of variance of the 157 former members surveyed that depression and anxiety, hypochondria, and secrecy of cult involvement decreased progressively, with the help of counseling, after members left the cult. However, loss of self-esteem and anger toward the cult increased as a result of counseling.

Furthermore, Nishida (1998) found clear gender differences in the post-exit recovery process. Although female ex-cultists’ distress levels were higher than those of the males immediately after they left the cults, the women experienced full recovery more quickly than the men. The study also found that counseling by non-professionals works effectively with certain types of distress, such as anxiety and helplessness, but not for others, such as regret and self-reproof.


Conclusion
It can be concluded from Japanese studies on destructive cults that the psychological manipulation known as cult mind control is different from brainwashing or coercive persuasion. Based on my empirical studies, conducted from a social psychology point of view, I concluded that many sets of social influence are systematically applied to new recruits during the indoctrination process, influences that facilitate ongoing control of cult members. My findings agree with certain American studies, such as those conducted by Zimbardo and Anderson (1993), Singer and Lalich (1995), and Hassan (1988, 2000). The manipulation is powerful enough to make a vulnerable recruit believe that the only proper action is to obey the organization’s leaders, in order to secure humanity’s salvation, even though the requisite deed may breach social norms. Furthermore, it should be pointed out that dedicated cult veterans are subject to profound distress over the extended period of their cult involvement.


This chapter is a reprint of an article originally published in Cultic Studies Review, 2005, Volume 4, Number 3, pages 215-232.


Kimiaki Nishida, Ph.D., a social psychologist in Japan, is Associate Professor at Rissho University 立正大学 in Tokyo and a Director of the Japan Cult Recovery Council. He is a leading Japanese cultic studies scholar and the editor of the Japanese Journal of Social Psychology. His studies on psychological manipulation by cults have been awarded prizes by several academic societies in Japan, and he has been called to testify in several court cases to explain “cult mind control.”


統一協会の伝道とマインド・コントロール [Proselytizing and Mind Control by the Unification Church]


Los Angeles Times  November 26, 1978

Psyching Out the Cults’ Collective Mania

by Louis Jolyon West and Richard Delgado

Louis Jolyon West is director of UCLA’s Neuropsychiatric Institute. Richard Delgado is a visiting professor of law at UCLA.

Just a week ago yesterday, the ambush of Rep. Leo J. Ryan and three newsmen at a jungle airstrip set off a terrible sequence of events that left many hundreds of people dead in the steamy rain forests of Guyana. The horrible social mechanism that ground into motion in the Peoples Temple camp that day seems inexplicable to many and has focused attention on the murky world of cults, both religious and nonreligious.

Historically, periods of unusual turbulence are often accompanied by the emergence of cults. Following the fall of Rome, the French Revolution and again during the Industrial Revolution, numerous cults appeared in Europe. The westward movement in America swept a myriad of religious cults toward California. In the years following the Gold Rush, at least 50 utopian cults were established here. Most were religious and lasted, on the average, about 20 years; the secular variety usually endured only half that long.

The present disturbances in American culture first welled up during the 1960s, with the expansion of an unpopular war in Southeast Asia, massive upheavals over civil rights and a profound crisis in values in response to unprecedented affluence, on the one hand, and potential thermonuclear holocaust, on the other. Our youth were caught up in three rebellions: red (the New Left), against political and economic monopolies; black, against racial injustice, and green (the counterculture), against materialism in all its manifestations, including individual and institutional struggles for power.

Drug abuse and violent predators took an awful toll among the counterculture’s hippies in the late 1960s. Many fled to form colonies, now generally called communes. Others turned to the apparent security of paternalistic religious and secular cults, which have been multiplying at an astonishing rate ever since.

Those communes that have endured—perhaps two or three thousand in North America—can generally be differentiated from cults in three respects:

— Cults are established by strong charismatic leaders of power hierarchies controlling resources, while communes tend to minimize organizational structure and to deflate or expel power seekers.

— Cults possess some revealed “word” in the form of a book, manifesto or doctrine, whereas communes vaguely invoke general commitments to peace, libertarian freedoms and distaste for the parent culture’s establishments.

— Cults create fortified boundaries confining their members in various ways and attacking those who would leave as defectors, deserters or traitors; they recruit new members with ruthless energy and raise enormous sums of money, and they tend to view the outside world with increasing hostility and distrust as the organization ossifies. In contrast, communes are like nodes in the far-flung network of the counterculture. Their boundaries are permeable membranes through which people come and go relatively unimpeded, either to continue their pilgrimages or to return to a society regarded by the communards with feelings ranging from indifference to amusement to pity. Most communes thus defined seem to pose relatively little threat to society. Many cults, on the other hand, are increasingly perceived as dangerous both to their own members and to others.

Recent estimates place more than 2 million Americans, mostly aged 18 to 25, in some way affiliated with cults and, by using the broadest of definitions, there may be as many as 2,500 cults in America today. If the total seems large, consider that L. Ron Hubbard’s rapidly expanding Church of Scientology claimed 5.5 million members worldwide in 1972; the Unification Church of Rev. Sun Myung Moon boasts of 30,000 members in the United States alone.

These enterprises may seem rich, respectable and secure compared to the Rev. Jim Jones’ tragic Peoples Temple, with its membership of only 2,000 to 3,000. However, the Church of Scientology, the Unification Church and other organizations such as Chuck Dederich’s Synanon, have all been under recent investigation by government agencies. Other large religious cults, such as the Divine Light Mission, the International Society for Krishna Consciousness and the Children of God are being carefully scrutinized by the public. For the public is alarmed by what it knows of some cults’ methods of recruitment, exploitation of members, restriction on members’ freedom, retaliation against defecting members, struggles with members’ families engaged in rescue operations (including so-called “deprogramming”), dubious fiscal practices and the like. Lately, death threats against investigative reporters, leaked internal memoranda justifying violence, the discovery of weapons caches, such incidents as the rattlesnake attack against the Los Angeles attorney Paul Morantz last month, the violent outburst of the Hanafi Muslims in Washington, D.C., last year and now the gruesome events in Guyana have served to increase the public’s concern.

[The 1977 Hanafi Siege occurred on March 9-11, 1977 when three buildings in Washington, D.C. were seized by 12 Hanafi Muslim gunmen. The gunmen were led by Hamaas Abdul Khaalis, who wanted to bring attention to the murder of his family in 1973. They took 149 hostages and killed radio journalist Maurice Williams. Police officer Mack Cantrell also died.]

Some cults (for instance, Synanon) are relatively passive about recruitment (albeit harsh when it comes to defections). Others, such as the Unification Church, are tireless recruiters. Many employ techniques that in some respects resemble those used in the forceful political indoctrination prescribed by Mao Tse-tung during the communist revolution and its aftermath in China. These techniques, described by the Chinese as “thought reform” or “ideological remolding,” were labeled “brainwashing” in 1950 by the American journalist Edward Hunter. Such methods were subsequently studied in depth by a number of western scientists and Edgar Schein summarized much of this research in a monograph, “Coercive Persuasion,” published in 1961.

Successful indoctrination of a recruit by a cult is likely to require most of the following elements:

— Isolation of the recruit and manipulation of his environment;
— Control over channels of communication and information;
— Debilitation through inadequate diet and fatigue;
— Degradation or diminution of the self;
— Early stimulation of uncertainty, fear and confusion, and joy and certainty as rewards for surrendering self to the group;
— Alternation of harshness and leniency in the context of discipline;
— Peer pressure, often applied through ritualized “struggle sessions,” generating guilt and requiring open confessions;
— Insistence by seemingly all-powerful hosts that the recruit’s survival –physical or spiritual– depends on identifying with the group;
— Assignment of monotonous tasks or repetitive activities, such as chanting or copying written materials;
— Acts of symbolic betrayal or renunciation of self, family and previously held values, designed to increase the psychological distance between the recruit and his previous way of life.

As time passes, the new member’s psychological condition may deteriorate. He may become incapable of complex, rational thought; his responses to questions may become stereotyped and he may find it difficult to make even simple decisions unaided. His judgement about the events in the outside world will likely be impaired. At the same time, there may be such a reduction of insight that he fails to realize how much he has changed.

After months or years of membership, such a former recruit may emerge from the cult –perhaps “rescued” by friends or family, but more likely having escaped following prolonged exploitation, suffering and disillusionment. Many such refugees appear dazed and confused, unable to resume their previous way of life and fearful of being captured, punished and returned to the cult. “Floating” is a frequent phenomenon, with the ex-cultist drifting off into dissociated states of altered consciousness. Other frequent symptoms of the refugees include depression, indecisiveness and a general sense of disorientation, often accompanied by frightening impulses to return to the cult and throw themselves on the mercy of the leader.

This suggests that society may well wish to consider ways of preventing its members, particularly the young, from unwittingly becoming lost in cults that use psychologically and even physically harmful techniques of persuasion. Parents can inform themselves and their children about cults and the dangers they pose; religious and educational leaders can teach the risks of associating with such groups. However, when prevention fails and intervention assumes an official character –as through legislation or court action– it is necessary to consider the potential impact of such intervention on the free exercise of religion as guaranteed by the First Amendment.

Under the U.S. Constitution, religious liberty is of two types –freedom of belief and freedom of action. The first is, by its nature, absolute. An individual may choose to believe in a system that others find bizarre or ludicrous; society is powerless to interfere. Religiously motivated conduct, however, is not protected absolutely. Instead, it is subject to a balancing test, in which courts weigh the interest of society in regulating or forbidding the conduct against the interest of the group in carrying it out.

How can society best protect the individual from physical and psychological harm, from stultification of his ability to act autonomously, from loss of vital years of his life, from dehumanizing exploitation –all without interfering with his freedom of choice in regard to religious practices? And, while protecting religious freedom, how can society protect the family as a social institution from the menace of the cult as a competing super-family?

A number of legal cases involving polygamy, blood transfusions for those who object to them on religious grounds and the state’s interest in protecting children from religious zealotry suggest that the courts will hold these interests to be constitutionally adequate to check the more obvious abuses of the cults. Furthermore, the cults’ interest is likely to be found weakened by a lack of “sincerity,” a requirement deriving from conscientious-objector and tax-exemption cases, and lack of “centrality,” or importance of the objectionable practices to such essential religious functions as worship.

To be protected by the First Amendment, religious conduct must stem from theological or moral motives rather than avarice, personal convenience, or a desire for power. Such conduct must also constitute a central or indispensable element of the religious practice.

Many religious cults demonstrate an extreme interest in financial or political aggrandizement, but little interest in the spiritual development of the faithful. Because their religious or theological core would not seem affected by a prohibition against deceptive recruiting methods and coercive techniques to indoctrinate and retain members, it is likely the courts would consider the use of such methods neither “sincere” nor “central.”

Thus the constitutional balance appears to allow intervention, though it could be objected that obnoxious practices which might otherwise justify intervention should not be considered harmful if those experiencing them do so voluntarily and do not see them as harmful at the time.

But is coercive persuasion in the cults inflicted on persons who freely choose to undergo it —who decide to be unfree— or is it imposed on persons who do not truly choose it of their own free will? The decision to join a cult and undergo drastic reformation of one’s thought and behavioral processes can be seen as similar in importance to decisions to undergo surgery, psychotherapy and other forms of medical treatment. Accordingly, it should be protected in the same manner and to the same degree as we protect the decision to undergo medical treatment. This means the decision must be fully consensual. This entails, at a minimum, that those making such decisions do so with both full mental “capacity” and with a complete “knowledge” of the choices offered them. In other words, they should give “fully informed consent” before the process of indoctrination can be initiated.

A review of legislative reports, court proceedings (including cases involving conservatorships, or the “defense of necessity” in kidnaping prosecutions), and considerable clinical material makes clear that the cult joining process is often not fully consensual. It is not fully consensual because “knowledge” and “capacity” —the essential elements of legally adequate consent— are not simultaneously present. Until cults obtain fully informed consent from prospective members giving permission in advance to apply the procedures of indoctrination, and warning of the potential risks and losses, it appears that society may properly take measures to protect itself against cultist indoctrination without violating the principle, central to American jurisprudence, that the state should not interfere with the voluntarily chosen religious behavior of adult citizens.

Most young people who are approached by cultist recruiters will have relatively unimpaired “capacity”. They may be undergoing a momentary state of fatigue, depression, or boredom; they may be worried about exams, a separation from home or family, the job market, or relations with the opposite sex —but generally their minds are intact. If the recruiter were to approach such a person and introduce himself or herself as a recruiter for a cult, such as the Unification Church, the target person would likely be on guard.

But recruiters usually conceal the identity of the cult at first, and the role the recruit is expected to play in it, until the young person has become fatigued and suggestible. Information is imparted only when the target’s capacity to analyze it has become low. In other words, when the recruit’s legal “capacity” is high, his “knowledge” is not; later the reverse obtains. Consent given under such circumstances should not deserve the respect afforded ordinary decisions of competent adults.

If intervention against cults that employ coercive persuasion is consistent with the First Amendment, a line must be drawn between cults and other organizations. But is it possible to impose restrictions on the activities of cults that use coercive persuasion without imposing the same restraints upon other societal institutions —TV advertising, political campaigns, army training camps, Jesuit seminaries— that use influence, persuasion and group dynamics in their normal procedures?

Established religious orders may sequester their trainees to some extent. Military recruiters and Madison Avenue copywriters use exaggeration, concealment and “puffing” to make their product appear more attractive than it is. Revivalists invoke guilt. Religious mystics engage in ritual fasting and self-mortification. It has been argued that the thought-control processes used by cults are indistinguishable from those of more socially accepted groups.

Yet it is possible to distinguish between cults and other institutions —by examining the intensity and pervasiveness with which mind-influencing techniques are applied. For instance, Jesuit seminaries may isolate the seminarian from the rest of the world for periods of time, but the candidate is not deliberately deceived about the obligations and burdens of the priesthood; in fact, he is warned in advance and is given every opportunity to withdraw.

In fact, few, if any, social institutions claiming First Amendment protection use conditioning techniques as intense, deceptive, or pervasive as those employed by many contemporary cults. A decision to intervene and prevent abuses of cult proselytizing and indoctrinating does not by its logic alone dictate intervention in other areas where the abuses are milder and more easily controlled.

To turn again to the sad case of the Peoples Temple, it seemed to be, for some years, a relatively small and, in its public stance, moderate cult. Its members differed from those of most cults: Many were older people, many were black, many were enlisted in family units. Nevertheless, from its origins, based on professed ideals of racial harmony and economic equality, the cult gradually developed typical cultist patterns of coercive measures, harsh practices, suspicions of the outside world and a siege mentality.

It may be that these developments comprise an institutional disease of cults. If so, the recent events in Guyana pose a new warning of continuing dangers from cults. For as time passes, leaders may age and sicken. The cult’s characteristically rigid structure and its habitual deference to the leader as repository of all authority leave the membership vulnerable to the consequences of incredible errors of judgment, institutional paranoia and even deranged behavior by the cult’s chief.

Perhaps the tragedy of Jim Jones’ Peoples Temple will lead to more comprehensive and scientific studies of cult phenomena. Perhaps it will lead our society to a more reasoned public policy of prevention and intervention against further abuses by cults in the name of freedom of religion. If so, then perhaps the disaster in Guyana will have some meaning after all.




Japanese woman recruited and sold by FFWPU to a Korean farmer

A 20-year-old woman, recruited by the Family Federation for World Peace and Unification / UC in Japan, was sold to an older Korean farmer in an “apology marriage”.

