Misunderstanding Cults

Misunderstanding Cults: Searching for Objectivity in a Controversial Field

1st Edition

Edited by Benjamin Zablocki and Thomas Robbins

Misunderstanding Cults provides a uniquely balanced contribution to what has become a highly polarized area of study. Working towards a moderate ‘third path’ in the heated debate over new religious movements (NRMs) or cults, this collection includes contributions both from scholars who have been characterized as ‘anticult’ and from those characterized as ‘cult apologists.’ The study incorporates diverse viewpoints as well as a variety of theoretical and methodological orientations, with the stated goal of depolarizing the discussion over alternative religious movements. A large portion of the book focuses explicitly on the issue of scholarly objectivity and the danger of partisanship in the study of cults.
The collection also includes contributions on the controversial and much misunderstood topic of brainwashing, as well as discussions of cult violence, child rearing within unconventional religious movements, and the conflicts between NRMs and their critics. Thorough and wide-ranging, this is the first study of new religious movements to address the main points of controversy within the field while attempting to find a middle ground between opposing camps of scholarship.

About the Authors

Benjamin Zablocki is a professor in the Department of Sociology at Rutgers University.

Thomas Robbins is an independent scholar and lives in Rochester, Minnesota.


Series: Heritage
Paperback: 538 pages
Publisher: University of Toronto Press, Scholarly Publishing Division; 1st edition (December 1, 2001)
Language: English
ISBN-10: 0802081886
ISBN-13: 978-0802081889


Contents

Preface ix
Caveat xiii

Introduction: Finding a Middle Ground in a Polarized Scholarly Arena 3
Benjamin Zablocki and Thomas Robbins

PART ONE: HOW OBJECTIVE ARE THE SCHOLARS?

1 ‘O Truant Muse’: Collaborationism and Research Integrity 35
Benjamin Beit-Hallahmi

2 Balance and Fairness in the Study of Alternative Religions 71
Thomas Robbins

3 Caught Up in the Cult Wars: Confessions of a Canadian Researcher 99
Susan J. Palmer

4 Pitfalls in the Sociological Study of Cults 123
Janja Lalich

PART TWO: HOW CONSTRAINED ARE THE PARTICIPANTS?

5 Towards a Demystified and Disinterested Scientific Theory of Brainwashing 159
Benjamin Zablocki

6 Tactical Ambiguity and Brainwashing Formulations: Science or Pseudo Science 215
Dick Anthony

7 A Tale of Two Theories: Brainwashing and Conversion as Competing Political Narratives 318
David Bromley

8 Brainwashing Programs in The Family/Children of God and Scientology 349
Stephen A. Kent

9 Raising Lazarus: A Methodological Critique of Stephen Kent’s Revival of the Brainwashing Model 379
Lorne L. Dawson

10 Compelling Evidence: A Rejoinder to Lorne Dawson’s Chapter 401
Stephen A. Kent

PART THREE: HOW CONCERNED SHOULD SOCIETY BE?

11 Child-Rearing Issues in Totalist Groups 415
Amy Siskind

12 Contested Narratives: A Case Study of the Conflict Between a New Religious Movement and Its Critics 452
Julius H. Rubin

13 The Roots of Religious Violence in America 478
Jeffrey Kaplan

Appendix 515

Contributors 521


Contributors

Benjamin Beit-Hallahmi received a PhD in clinical psychology from Michigan State University in 1970, and since then has held clinical, research, and teaching positions in academic institutions in the United States, Europe, and Israel. He is currently professor of psychology at the University of Haifa. Among his best-known publications are Despair and Deliverance (1992), The Psychology of Religious Behaviour, Belief, and Experience (1997), and the Illustrated Encyclopedia of Active New Religions (1998).

Janja Lalich specializes in the study of charismatic relationships, ideology, and social control, and issues of gender and sexuality. She received her PhD from the Fielding Institute in Santa Barbara, California, and currently teaches in the Department of Sociology at California State University, Chico. Her works include ‘Crazy’ Therapies; Cults in Our Midst; Captive Hearts, Captive Minds; and Women Under the Influence: A Study of Women’s Lives in Totalist Groups. Her forthcoming book, Bounded Choice: True Believers and Charismatic Commitment (University of California Press), is based on a comparative study of Heaven’s Gate, the group that committed collective suicide in 1997, and the Democratic Workers Party.

Benjamin D. Zablocki is a professor of sociology at Rutgers University. He received his PhD from Johns Hopkins University and has taught at the University of California – Berkeley, California Institute of Technology, and Columbia University. He has published two books on cults, The Joyful Community (University of Chicago Press 1971) and Alienation and Charisma (The Free Press 1980). He has been studying religious movements for thirty-six years, with sponsorship from the National Institutes of Health and the National Science Foundation. Currently he is working on a twenty-five-year longitudinal study of religious belief and ideology.

Stephen A. Kent is a professor in the Department of Sociology, University of Alberta. He received his BA in sociology from the University of Maryland (College Park) in 1973, an MA in the History of Religions from American University in 1978, and an MA (1980) and PhD (1984) in Religious Studies from McMaster University (Hamilton, Ontario). From 1984 to 1986 he held an Izaak Walton Killam Postdoctoral Fellowship in the Department of Sociology. He has published articles in Philosophy East and West, Journal of Religious History, British Journal of Sociology, Sociological Inquiry, Sociological Analysis, Canadian Journal of Sociology, Quaker History, Comparative Social Research, Journal of Religion and Health, Marburg Journal of Religion, and Religion. His current research concentrates on nontraditional and alternative religions.

etc.


Preface

We deliberately gave this book an odd title. Misunderstanding Cults is not, of course, a guidebook on how to misunderstand cults. Rather it is a book about what makes cults (or ‘new religious movements’ as they are sometimes called) so hard to understand. Its purpose is to better comprehend why these groups are so often comically or tragically misunderstood by ‘experts’ as well as by the general public. Specifically, we have focused on the problem of academic misunderstanding and its correlative polarization of academic experts into opposing camps holding mutually hostile points of view. Our hope is to make a contribution towards overcoming this polarization and introducing a greater degree of cooperation and humility into the study of a subject matter that would be difficult to comprehend even under more collegial investigatory conditions.

Polarization in the study of cults has fostered a toxic level of suspicion among scholars working in this field. This polarization, for the most part, is between those focusing on ‘macro-meso’ issues and those focusing on ‘meso-micro’ issues. Social scientists tend to distinguish three levels of social analysis. The macro level is concerned with the largest social aggregates: governments, societies, social classes, and so on. The micro level is concerned with the smallest social units: individuals and very small groups such as nuclear families. The meso level is concerned with intermediate-sized social groupings such as neighbourhoods, cities, business firms, denominations, sects, and cults. Unfortunately, it is rare for social scientific theories to span all three of these levels simultaneously, although just such breadth is what is called for by the puzzle of cults. Between the macro-meso specialists, whose chief concern has been the problem of repressive over-regulation of cults by government and society, and the meso-micro specialists, whose chief concern has been the problem of cultic exploitation of individual devotees, there has been little trust and little mutual respect. The historic reasons for this tension will become clear to anyone reading this book.

There is a need to shake people out of comfortable oversimplifications. Squabbles at the level of ‘Cults are evil!’ ‘No! Cults are OK,’ do nothing to further our understanding of these complex sociocultural phenomena. Cults are a genuine expression of religious freedom deserving toleration. At the same time, they are opportunities for unchecked exploitation of followers by leaders, deserving civic scrutiny. As fragile new belief systems, they need the protective cover of benign neglect by the state. But it is always possible that a few of them may turn into incubators of terrorism or other forms of crime and abuse.

This situation made it a challenge for us, as editors, to assemble a dozen authors willing to write chapters for the book from a wide range of viewpoints. We recognize that many of these authors have had to endure criticism from some of their colleagues for ‘sleeping with the enemy,’ as it were. A few scholars who originally intended to write chapters for this volume actually dropped out of the project because of its controversial nature. We therefore gratefully acknowledge the courage of our authors in enduring this criticism in pursuit of the higher goals of cooperation and collegiality, as well as of answers to the intriguing puzzles posed by the cult phenomenon.

In the early 1990s, Thomas Robbins was loosely affiliated with the macro-meso scholars and shared their concerns about the dangers of statist religious repression. Benjamin Zablocki was loosely affiliated with the meso-micro scholars and shared their concerns about the dangers of economic, physical, and psychological abuse of cult members by cult leaders. But the two of us found that we shared a worry about the unusually high degree of polarization that plagued our field. Through many long discussions and exchanges of letters and ‘position papers,’ the two of us were gradually able to move to a more tolerant understanding of each other’s concerns. This book grew directly out of our enthusiasm about the positive effects of our private dialogue, and out of a desire to take this dialogue ‘wholesale’ by promoting the value of a moderate and inclusive perspective with our colleagues and with the interested general public.

This book itself cannot entirely overcome the polarization that has long blighted our field of study. Although most of our authors have tried to modulate their perspectives, we are painfully aware that almost every reader will find a chapter that offends. At most we have made a beginning: to paraphrase Joni Mitchell, ‘We’ve looked at cults from both sides now, from up and down, but still, somehow, it’s mostly cults’ illusions that we’re stuck with.’ Further progress in understanding this subject matter will require both patience and a great deal of additional collaboration. It will also require receptive listening to the viewpoints of others with whom we may initially disagree.

We would like to acknowledge the help of several colleagues with whom we discussed our project. These include William Bainbridge, Rob Balch, Eileen Barker, Michael Barkun, Jayne Docherty, Mimi Goldman, Massimo Introvigne, Michael Langone, Anna Looney, Phillip Lucas, John Levi Martin, James Richardson, Jean Rosenfeld, Ramon Sender, Thomas Smith, and Lisa Zablocki. We don’t mean to imply that all of these people completely endorsed this project. Some were highly critical and some made suggestions that were ignored. So it is more than a matter of ‘preface boilerplate’ to state that none of them is in any way responsible for the points of view expressed in these pages. But all of them did help us approach the task of editing this volume with a richer and more inclusive perspective.

We also wish to acknowledge the assistance of a number of people who helped us in various ways. Jean Peterson provided valuable clerical and computing assistance to Thomas Robbins during this project. Melissa Edmond, Lauren O’Callaghan, and Maria Chen provided diligent editorial assistance to Benjamin Zablocki. Virgil Duff, our editor at the University of Toronto Press, has been supportive and helpful from the beginning. Two anonymous readers offered constructive suggestions, many of which we have attempted to incorporate into our revisions, and which we believe have strengthened the organization of the volume.


Caveat

Of necessity, given its aims, this is a controversial book. Be warned that almost every reader will take issue with at least one of the essays we have included. The principal aim of the book is to restore a moderate perspective to the social scientific study of cults. Our strategy for achieving this goal has been to invite essays from as wide a range of scholarly points of view as possible, not only from moderates but from the polarized extremes as well. We believe that only by giving public voice to controversy can some degree of consensus and compromise begin to emerge. Hopefully, therefore, it will be understood that the opinions expressed in these chapters are those of each author, and do not necessarily reflect the views of the editors.

An additional issue of fairness arises in those chapters in which scholars take aim, not at other scholars, but at specific cults or anticult organizations. Two points need to be made about such criticisms. The first is that all the data reported in this book are historical, and therefore none of the criticisms of specific organizations should be taken to apply necessarily to any of these organizations at the present time. The second is that, even so, a fair-minded reader may very well wish to learn the point of view of the organization being criticized before evaluating the plausibility of the criticism. We, the editors, strongly urge readers to take the trouble to do so. One point upon which we both wholeheartedly agree is that, ultimately, nothing but good can come from exposure to the widest variety of intellectual perspectives. As an aid to the reader in gaining access to these points of view, we have included an appendix listing publications and websites written by or maintained by the various cults and anticult organizations discussed in this book.

Finally, we wish to emphasize a point we repeat at greater length in the introduction. The word cult in this volume is not meant to be evaluative. The word existed as an analytic category in the social sciences long before it was vulgarized in the mass media as an epithet. In our opinion, simply describing an organization as a cult does not, in itself, imply that we believe that the organization is good or bad or that its ideology is authentic or inauthentic. Indeed we consider these sorts of judgments outside the analytic realm of competence of the social scientist.

Benjamin Zablocki
Thomas Robbins



Introduction: Finding a Middle Ground in a Polarized Scholarly Arena

Benjamin Zablocki and Thomas Robbins

Every once in a while, cults make news in a big way.1 Jonestown, Waco, Aum Shinrikyo, and Heaven’s Gate are only some of the keywords that remind us of the capacity of religious and other ideological movements to act in ways that leave much of the public thunderstruck. When bewildering events happen, there is a natural tendency to turn to ‘experts’ to explain what at first seems inexplicable. This is a well-established role of the academic expert in our society. But the striking thing about cult events is that the experts rarely agree. This is a field with little or no convergence: the more channels you turn to on your TV set, the more contradictory opinions you run into. Eventually, the public loses interest and goes away, either with pre-existing prejudices reinforced or with the conclusion that some things are just beyond explanation and cults are one of them. This book is an attempt to discern what it is about religious cults that makes them so intractable to expert analysis and interpretation. What is it about cults that makes it so easy for even the experts to misunderstand them?

This book has developed out of a series of conversations between its editors. Both of us have long deplored the divisive polarization, which, at least until recently, has plagued the academic study of religious movements.2 This polarization, into a clique of academic ‘cult bashers’ on the one hand and a clique of academic ‘cult apologists’ on the other, has impeded learning and has added confusion rather than clarity to a class of phenomena already beset with more than its share of confusion and misunderstanding. It is the goal of the editors of this book to encourage and facilitate the carving out of a moderate middle ground for scholars who wish to see charismatic religious movements in shades of grey rather than as either black or white. To aid in this effort, we have deliberately recruited contributors to this book from both extremes,3 as well as from scholars whose work is already considered more moderate.

Most books about cults, whether monographs or collections of essays, represent a single point of view or a narrow band on the viewpoint spectrum. Even when a contrarian voice is solicited, the context is clearly one of tokenism. One divergent point of view helps to set off and define the points of view of all the rest. This book is very different in two ways: (1) we have invited essays from scholars representing all of the various viewpoints within the social sciences; (2) we have urged all of our contributing essayists to eschew polemics and treat perspectives other than their own with respect and seriousness. Although this book by itself cannot overcome the residual polarization that still lingers in the study of cults, it may accomplish two important prerequisites. First, we hope it will get scholars talking to one another who in the past have always avoided reading each other’s work. Second, we hope it will enable the informed public to understand that the reason we misunderstand cults is not that they are intrinsically beyond comprehension, but rather that they pose challenges that have thus far divided scholars but which careful research may help to overcome.

Academic Polarization
We have made an assertion that perhaps will not seem immediately evident to many: that the academic study of new religious movements has been sharply divided into two opposed camps in a way that is highly detrimental to intellectual progress in the field. We probably need to document this assertion before attempting to draw conclusions from it. There is one cluster of scholars, labelled (by their opponents) ‘cult apologists,’ who have generally taken a tolerant attitude of qualified support towards these groups. There is another cluster, labelled (again by their opponents) ‘cult bashers,’ who have generally taken a negative and critical attitude towards these same groups.

Until a few years ago, there was little alternative but to be a part of one or the other of these groupings. In recent years, however, a moderate interdisciplinary position has slowly and painfully begun to develop. Examples of this can be seen in the recent work of Robert Balch and John Hall in sociology, Michael Barkun in political science, and Marc Galanter in psychiatry, among others. So it should be emphasized that we use the terms ‘cult apologist’ and ‘cult basher’ in this book mainly in a historical sense;4 thus, the use of these terms is not indicative of our validation of the stigma embodied in them. Nevertheless, because the present situation cannot be understood without understanding the roots of this historical polarization, we will continue to use these terms in referring to these two intellectual clusters.

Evidence for the existence of these clusters can be seen in the very terms ‘cult’ and ‘new religious movement.’ The use of either of these terms is a kind of shibboleth by which one has been able to know, with some degree of accuracy, how to classify a scholar in this field. In the past, it was only the ‘apologists’ who tended to use the latter term; the bashers preferred the former term. The difference of opinion is not just a matter of linguistic style. The term cult is an insult to those who are positively disposed towards these groups or who feel that it is important to actively support their right to exist even while perhaps deploring some of their practices. The term new religious movement is a misleading euphemism to those who are negatively disposed. It is also thought to be misleading in that it ignores political and psychotherapeutic cults, implying, as it does, that all such groups are religious in nature.

We will try to display our own moderate colours by referring to these groups sometimes as cults and sometimes as new religious movements (NRMs). In neither case is it our intention to be judgmental. Historically the word cult has been used in sociology to refer to any religion held together more by devotion to a living charismatic leader who actively participates in the group’s decision-making than by adherence to a body of doctrine or prescribed set of rituals. By such a definition, many religions would be accurately described as cults during certain phases of their history, and as sects, denominations, or churches at other times. The mass media sometimes make a distinction between ‘genuine religions’ and cults, implying that there is something non-genuine about the latter by definition. We do not share the implicit bias that seems to be embedded in this usage. Nor, by calling a group an NRM, do we necessarily imply that the group must be benign.

Polarizing issues in this field are not limited to the cult versus NRM controversy. Attitudes towards the concept of ‘brainwashing’ and towards the methodological device of making use of ex-member (apostate) accounts as data are two of many other issues that divide scholars into these two camps. The advisability of scholars accepting financial support from NRMs is still another issue upon which opinions are sharply divided.

The historical reasons for the development of this polarization are too complex to be reviewed here, especially as they have been discussed extensively elsewhere by ourselves and others (Anthony and Robbins 1995; Zablocki 1997). Much of it has to do with a quarter-century of involvement by scholars from both camps in high-stakes litigation involving these religious groups. The law courts, with their need for absolutes and their contempt for scholarly ambivalence, helped to push both those who started with mildly positive dispositions towards cults into the extreme posture of the ‘cult apologist’ and those who started with mildly negative dispositions towards these same NRMs into the extreme posture of the ‘basher.’5

If these events had merely produced a tendency towards a bipolar distribution of attitudes in this scholarly subdiscipline, the results would have been bad enough. But even worse was the crystallization of these two loosely affiliated clusters of scholars into what Fleck has called ‘thought communities’ (Fleck 1979). Through one group’s involvement with the organized anticult movement and the other’s attempt to establish or sustain hegemony in key scholarly organizations of social and behavioural scientists, these clusters gradually crystallized into mutually reinforcing, self-perpetuating scholarly communities. Rather than combining perspectives to get closer to the truth, these communities came to define themselves, increasingly, in terms of words that could or could not be uttered and ideas that could or could not be thought about. Hardened positions on such issues as brainwashing or apostasy, for example, exemplify Fleck’s notion of the ‘fact’ as ‘a signal of resistance (by a thought community) opposing free arbitrary thinking’ (101). Dialogue practically ceased between the two camps for a while, as each preferred to talk mainly to those who shared the same perspective.

In the manner of insular thought communities throughout history, these sought to reinforce solidarity not only by mutual intellectual congratulation of comrades in the same camp but by vilification of those in the other camp. In this way, the rivalry came to take on a bitter emotional dimension that served to energize and exacerbate the initial cognitive disagreements (Allen 1998).

Each side has had its poster child depicting the horrors that the other side was somehow able to callously condone. For the ‘apologists,’ it was the image of the sincere religious seeker kidnapped by unscrupulous deprogrammers and thrust into a dark basement of an anticult-movement safe house to be inquisitorially pressured to renounce her faith. The fact that the most notorious of the early coercive deprogrammers happened to be a husky African-American male and the archetypical religious kidnappee was generally depicted as a frail, sincere, but very frightened white female helped to assure that the revulsion caused by this portrait was never overly tepid, although this, of course, was never mentioned out loud. For the ‘bashers’ the poster image was just as heart-rending: a little girl looking trustingly up at her hopelessly brainwashed daddy while he feeds her poisoned Kool-Aid at the behest of his ranting paranoid prophet, or a little boy being beaten half to death by the community elders for his inability to memorize the weekly portion of the Bible.

We don’t mean to be dismissive of these emotional concerns. Overblown as the symbols have become, each has its roots in instances of very real suffering and injustice. The problem for the academic discipline is to be found not in the emotional sympathy of its practitioners, which is commendable, but in the curious fact that these two emotional stimuli have come to be seen as mutually exclusive. Caring about one required that you be callous about the other. In fact, our own personal litmus test in our quest for scholars who could be called ‘moderates’ is precisely the capacity to be moved to sympathy by the poster children of each of the two thought communities, i.e., to engage in what Robbins, in his chapter in this volume, calls ‘pluralistic compassion.’ Gradually, a critical mass of such moderate scholars has begun to emerge.

Through twenty-five years of wrangling, both in journals and in courtrooms, the two thought communities we have been discussing have worked out internally consistent theoretical and methodological positions on a wide variety of issues regarding cult research. Although a number of scholars have come forth professing to be moderates (Bromley 1998, 250), it is not yet clear how such a moderate stance will eventually come to be defined in this field.

Fortunately, we have a good role model to help us get started. The field of new religious movements is not the only area in the social sciences that has been plagued by such divisions. In fact, the larger and more general subdiscipline known as ‘social movements’ gives us some clues concerning the repairs we must make. Almost a decade ago, the sociologist John Lofland published a paper taking his colleagues in the social movements field to task for counterproductive tendencies similar to the ones we have been discussing here (Lofland 1993). The situation he describes is not identical, of course; the role of litigation was much less of a factor in his field, for example. Nevertheless, the suggestions he makes can be helpful to us.

Lofland distinguishes two alternative mind-sets for studying social movements. He calls one of them the ‘theory-bashing’ mind-set and the other the ‘answer-improving’ mind-set. The theory-bashing mind-set is defined as: ‘a set of contending “theories” whose respective merits must be assessed; a set of constructs that must be pitted against one another; [and] a field of contenders in which one professes allegiance to one, has alliances with others, and zealously pursues campaigns to discredit and banish yet others’ (Lofland 1993: 37). In contrast, the answer-improving mind-set is defined as one in which the study of social movements is constructed as ‘a set of questions for which we are trying to provide ever-improved answers through processes of successive revision in order to delete erroneous aspects of answers and to incorporate more accurate elements into answers. Rather than aiming to discredit or vindicate a “theory” one aims to construct a more comprehensive, accurate, and powerful answer to a question’ (37-8).

A Moderate Agenda
Our argument is that just such a shift in mind-set is precisely what is needed to create and sustain a moderate third path for scholars studying new religious movements. Although easy to envision, such a shift will be tricky to implement. The effort is worth making, however, for two reasons. First, the study of religion is a very difficult business and none of us has all the answers. We will make more progress once we recognize that none of our paradigms comes close to qualifying as the master paradigm. We work within a multi-paradigmatic discipline precisely because every single one of our research paradigms is severely limited. Second, as David Bromley (1998) has pointed out, the study of new religious movements has been marginalized by the rest of sociology precisely because of our lack of consensus on so many key issues. Establishing a moderate alternative is essential if we expect our area of research to be taken seriously by colleagues outside the field.

It seems to us that five steps are involved in such a process. The first is a move towards paradigmatic toleration, including a recognition that no one paradigmatic approach can hope to capture the full complexity of religious movements. The second is a move towards greater consensus and precision in conceptual vocabulary. The third is a move towards agreement on a set of principles regarding respect for scholarly privacy and the demand for scholarly accountability in the research process. The fourth is a move towards agreement on a set of principles regarding respect for the privacy of the religious movements we study, and the legitimacy of the demand for their accountability. The fifth, and perhaps most controversial, is a move towards disestablishing the primacy that policy issues have assumed over intellectual issues in our field. That is not to say that policy issues and policy advocacy should be declared illegitimate, but rather that they should be relegated to their traditional position as secondary to our primary academic function, which is to observe and to record.

Paradigmatic Toleration
Of the five steps that we mentioned, the most important is the move towards paradigmatic toleration. It is generally well established that there is no master paradigm that effectively organizes theoretical inquiry in sociology. Rather, sociology is acknowledged to be a multi-paradigmatic discipline at this point in its evolution (Effrat 1972; Friedrichs 1970). It has been argued that in most areas of sociology these multiple paradigms are nevertheless unified by what Charles Lemert (1979: 13) has called homocentrism, ‘the … idea which holds that man is the measure of all things.’ But the sociology of religion is perhaps the discipline’s only redoubt of nonhomocentric paradigms as well, making the multiplicity that we have to deal with even richer and more bewildering.

How does one cope with working within a multiparadigmatic discipline? Does one treat it as a burden or an opportunity? Is the idea to strive for hegemony for one’s own paradigm or an atmosphere of mutual toleration or even of cooperation? It was Robert Merton’s contention that there need not be contention among our various paradigms. He argued that they are ‘opposed to one another in about the same sense as ham is opposed to eggs: they are perceptively different but mutually enriching’ (Merton 1975: 30). This is a notion upon which we hope that a moderate position in our field can crystallize. Since none of us has made all that impressive progress in understanding NRMs from within our own paradigms, maybe coming at them with multiple cognitive approaches will allow us to do better.

The specific paradigms that do battle within the sociology of cults are too numerous and the various alliances too complex to be dealt with here comprehensively. We will have to make do with just one example. No doubt the most egregious example of the paradigmatic intolerance and conflict that has plagued the study of religion is that between the positivists and the phenomenologists. (The former perspective is sometimes embodied in controversial ‘rational choice’ models of religious behaviour.) Each has a reputation for being a pretty arrogant bunch. However, think for a minute what it means to be studying religion, to be trying to understand the actions of people who are motivated by their relation to the sacred, to be attempting to participant-observe the ineffable, or at least the consequences of the ineffable, and then to report on it. Such undertakings ought to make us humble. They ought to make us understand that it is highly unlikely that either a pure positivist approach or a pure phenomenological approach will come away with all the answers. Such undertakings seem to us to cry out for all that the right brain can tell the left brain, and vice versa. Under these circumstances, it is not unreasonable for each of us to consider giving up our own allegiance to paradigmatic chauvinism.

Conceptual Precision and Consensus
A symptom of the extreme polarization of this field is that certain words have become emotionally charged to an abnormally high degree. Although scholars in all fields tend to argue about conceptual definitions of terms, the extent to which vocabulary choice determines status in the NRM field would delight an expectation-states theorist. To paraphrase Henry Higgins, ‘The moment that a cult scholar begins to speak he makes some other cult scholar despise him.’ Why do people get so worked up over the question of whether to call certain groups cults or NRMs, or certain processes brainwashing or resocialization, or certain people apostates or ex-members?

Choices among these words are extremely important to some people. Their use is often taken as the external sign of membership in one or another rigid thought community. Therefore, it follows that a moderate thought community needs its own identifying vocabulary. We would have hoped that Stark and Bainbridge (1996) might have at least partially satisfied that need with the dozens of painstaking definitions they offered in their comprehensive theory of religion. But a shared vocabulary is of no value unless people agree to use it, and this has not yet happened.

We speculate that the structural resistance to the adoption of a consensually accepted moderate vocabulary is to be found in large part in the cosiness of using the vocabulary of one of the two polarized thought communities. When Zablocki speaks of brainwashing he immediately has a hundred allies in the anticult movement, all of whom are inclined to speak favorably of his work even if they have never read it (and even though some of them might be horrified if they ever actually did read it). When Robbins calls a group an NRM instead of a cult he is thereby assured that he is recognized as a member in good standing of that valiant confraternity that has pledged itself to the defense of religious liberty. Such warm fuzzies are difficult to relinquish merely for the sake of increased intellectual vitality.

It might just be possible to adopt a ‘bureaucratic’ solution to this problem. We would be happy to abide by a set of rational standards governing conceptual vocabulary, and we imagine that many of our colleagues would as well. Many of our colleagues have told us that convening a committee to propose a set of standards on the use of conceptual terms in the study of NRMs is not practical. We don’t see why. If we restrict ourselves for the moment to those writing in English, there are probably not many more than a couple of hundred scholars actively studying cults at the present time. And probably fewer than fifty of these have any serious interest in actively creating the kind of moderate thought community that we are proposing. It seems to us that these numbers are small enough to allow us to hope that a working committee of four or five representatives might be able to speak for them.

Respect for Scholarly Privacy and Demand for Scholarly Accountability
One of the most painful consequences of the polarization of the NRM field is the lack of trust that has developed among scholars in opposing camps. Generally this is expressed in the form of questioning the motivations of specific writings or specific research projects. In extreme cases, the charge of selling out for money may also be levelled.

Honest differences of opinion regarding professional norms tend to become amplified in their stridency because the arguments tend to be expressed as ad hominem attacks, and they therefore evoke strong emotional responses. It’s hard for two scholars to even talk to each other if one feels accused of something as crass as selling out for money. On the other hand, it is hard to discuss objectively the possible distortive effects of large amounts of money coming into the field without some people feeling personally attacked.

A closely related issue has to do with the affiliations of scholars with government agencies that regulate religious activities or with non-government organizations that play a role in controversies regarding cults. All sorts of rumors abound concerning the power, wealth, influence, and backing of organizations on both ‘sides,’ such as INFORM (Information Network Focus on Religious Movements), AWARE (Association of World Academics for Religious Education), CESNUR (Center for Studies on New Religions), CAN (Cult Awareness Network), and AFF (American Family Foundation, now called ICSA, the International Cultic Studies Association). Scholars who have worked hard for these organizations may come to identify with them and to consider an attack on their organization as an attack on themselves.

It seems clear to us that a moderate position on these issues has to be based on two complementary principles: the freedom to choose, which needs to be respected, and the responsibility to disclose, which needs to be demanded. It is very unlikely that even a small, moderate group of scholars will ever be able to reach consensus on the issue of from whom and for what one should accept fees. The same is true for the organizations that a scholar chooses to work with. At the same time, in a highly polarized field like this, disclosure is particularly important. If we want to write a book on Scientology, we are the only ones in a position to decide if we feel we can (or wish to) remain objective in this task if we take financial help for this project from Scientology itself, or from the ‘anticult’ AFF. On the other hand, our colleagues have a right to know if we have received support from either of these agencies (or others) so that each of them can decide how to assess possible impacts on our objectivity.

If a cohort of moderate scholars begins the practice of voluntarily adhering to a norm requiring frank disclosure of sources of financial support and organizational affiliations, this will put a lot of pressure on others to do likewise.

Respect for NRM Privacy and Demand for NRM Accountability
In some ways, the greatest gulf between the ‘cult apologists’ and the ‘cult bashers’ is to be found in the question of where to draw the line between the privacy rights and the accountability duties of religious groups. There is consensus only on the rather obvious norm that religious groups must obey the laws of the land and that religious conviction cannot be used as an excuse for criminal behavior.

Once we move beyond this, however, the issues quickly get murkier. Are religious movements more like large extended families, with the presumption of comprehensive privacy rights that by custom adhere to kin groups? Or are they more like business corporations or government bureaucracies, with the presumption that a wide variety of investigatory probes and regulatory fact-finding demands are simply part of the cost of doing business? The problem is that neither model fits very well. In some respects, cults are like families, and, as long as they mind their own business and keep their lawns mowed, they are entitled to conduct their affairs in as much privacy as they care to have. To the extent, however, that cults proselytize for new members, solicit funds, manage businesses based upon government-approved tax-exempt status, or raise young children in households where the lines of authority are extra-parental, it may be argued that there needs to be some degree of secular accountability.

It is not, of course, the responsibility of the academic community to draw these lines, nor are we competent to do so. Even so local an issue as private versus public schooling raises problems of enormous complexity such as can only be worked out gradually, mostly by trial and error, between the cult and the society (Keim 1975).

But, as scholars, we do have a responsibility first of all to recognize that religious movements are not all identical in this respect and to learn to distinguish those that fall closer to the private end of the continuum from those that fall closer to the public end. We need further to discuss and work out at least a rough code of ethics regarding the limits of scholarly intrusiveness for religions at various points on this continuum.

As editors of this book, we do not claim to have the answers to these questions. We are simply suggesting that the emerging thought community of moderate cult scholars discuss these issues and consider whether it might be feasible to work out some rough guidelines. For what it’s worth, our thoughts on a few of these matters are as follows. First, we rather incline to the view that scholarly infiltration is not justified for either private or public religions. A scholar should always be up-front and candid about her research intentions from the time of very first contact.6 Second, fair use of religious documents should be interpreted broadly in the direction of full disclosure for public religions. This remains true even if these documents are regarded as restricted by the movement itself. Stealing documents is of course wrong. But if, as usually happens in cases like these, an apostate or dissident decides to break her vow of secrecy and share esoteric documents with a scholar, then the scholar is not obligated to refuse to receive them. Third, public religions are still religions and are thus inherently more fragile than business corporations or government agencies. The scholar needs to understand that the general public can very easily misunderstand many benign religious practices and lash out defensively against them if they are presented out of context, especially to the media.

De-emphasis on Policy Issues
It was initially an overemphasis on policy issues that polarized this field, and only de-emphasis of these issues will allow it to become depolarized. By this we do not mean that scholars should cease to be interested in the policy outcomes that relate to NRMs, only that their research programs and their writing should not be dominated by policy considerations. As Weber pointed out, scholarship and politics are closely related pursuits, and, for this very reason, they must be kept separate from each other (Weber 1946a, 1946b).7 Those scholars who are moved to work actively in defense of freedom of religious expression need to avoid having this work prevent them as scholars from delving into exploitative, intolerant, and manipulative aspects of the groups they study. Those who are moved to work actively to alert society to cultic excesses need to avoid having this work prevent them as scholars from delving into more attractive aspects of the groups they study and the rewarding and/or volitional dimension of devotees’ involvement.

We do not believe that all of the policy issues dividing scholars in this field are amenable to mediation. The best one can hope for is that some people will come to recognize the validity of the concerns of others even if they think their own are more important. But severe policy disagreements in the NRM area will be slow to disappear. And, inevitably, such policy disputes will spill over into debates over purely academic questions.

It is for this reason that we are arguing that the moderate camp has got to be composed primarily of scholars whose desire to find out the answers to academic questions is significantly greater than their desire to win policy battles. Ironically, such a coalition, by giving up the opportunity to influence cult policy in the short run, may be in the best position to help shape a wise enduring policy towards these groups in the long run by actually improving our understanding of what makes cults tick.

Many older scholars have had their perspectives so decisively shaped by ideological issues that it is doubtful they will ever be willing to have collegial relations with those whose ideologies are on the other side. But as younger people whose attitudes were not shaped during the cult wars of the last twenty-five years enter this field, it is vital that they confront three rather than two options for collegial affiliation. We think it important that a group of scholars, however small at first, whose interests are guided primarily by the answer-improving mind-set that we discussed earlier, be available as an alternative to the two warring factions.

Significance of Cult Controversies
Before discussing the specific plan of our book we want to briefly present our view that the issues raised by controversies over cults possess a fundamental sociocultural significance, which would remain salient even if the particular movements to which these issues currently pertain were to decline. James Beckford, an English sociologist of religion, wrote in 1985 that contemporary sociological conflicts over new religious movements raise questions which ‘are probably more significant for the future of Western societies than the NRMs themselves. Even if the movements were suddenly to disappear, the consequences of some of their practices would still be left for years to come’ (1985:11).

Beckford’s comments are reminiscent of some ideas expressed earlier by Roland Robertson (1979: 306-7), who noted that authoritarian sects contravene the ‘Weberian principle of consistency’ in that they demand autonomy from the state while arguably denying substantial autonomy to their individual participants. ‘It is in part because of “inconsistency” … [that such groups] … apparently create the necessity for those who claim to act on behalf of society to formulate principles of consistent societal participation’ [emphasis in original]. Authoritarian new religious movements, notes Beckford, represent ‘an extreme situation, which, precisely because it is extreme, throws into sharp relief many of the assumptions hidden behind legal, political, and cultural structures.’ The controversial practices of some NRMs have in effect ‘forced society to show its hand and declare itself’ (Beckford 1985:11).

What Beckford, Robertson, and others (Robbins and Beckford 1993) appear to be putting forward is a sort of Durkheimian argument to the effect that controversial authoritarian and ‘totalistic’ cults, and the reactions they are eliciting, are serving to throw into sharper relief, and possibly to shift, the moral boundaries of contemporary Western societies. Controversies over cults and what to do about them may thus produce a situation in which normative expectations that are generally merely implicit and half-submerged come to be explicitly articulated and extrapolated, and may be transformed in the process. Cults and their critics may articulate and extrapolate differing versions of implicit moral boundaries (Beckford 1985).

One example of the way in which cult controversies help to highlight such implicit moral boundaries is to be found in the assumptions that cults often challenge about the complex three-way relationships that exist among individuals, communities, and the state (Robertson 1979). A generally unstated value of Western culture is that individual persons ought to be able to manifest autonomous inner selves which transcend their confusing multiple social roles (Beckford 1985). Contemporary ‘greedy organizations’ (Coser 1974), including many cults, are perceived by many as contravening modern norms of individualism and personal autonomy. In extreme instances, the devotee of a tight-knit sectarian enclave may appear to the public at large as enslaved and dehumanized, as something less than a culturally legitimate person. Such a devotee may be deemed incapable of rational self-evaluation or autonomous decision-making (Delgado 1977, 1980, 1984). Defenders of high-demand religious movements argue to the contrary that people can find real freedom through surrender to a transcendent religious goal, and that it is really the overbearing ‘therapeutic state’ that threatens true individual choice by discouraging people from choosing religious abnegation of self (Robbins 1979; Shapiro 1983; Shepard 1985).

Another way of looking at the impact of cult controversies on moral boundaries, and on the linchpin issue of personal autonomy, is in terms of a conflict between the covenantal ethos of traditional close-knit religious sects and early churches, which entails broad and diffuse obligations between individuals and groups, and the modern contractual ethos, which implies more limited, conditional, and functionally specific commitments to groups (Bromley and Busching 1988). Some scholars have argued that much of the recent litigation and many of the legislative initiatives bearing upon ‘cults’ have entailed attempts to impose a modern contractual model on close-knit and high-demand religious groups’ relations with their members (Delgado 1982; Heins 1981; Richardson 1986).

The involvement of scholars in cult-related litigation has been one of the principal sources of polarization in the field (Pfeifer and Ogloff 1992; Richardson 1991; Robbins and Beckford 1993; Van Hoey 1991). The present volume, however, does not deal specifically with legal issues and developments, except to the extent that the litigiousness of some cults is able to intimidate some scholars from freely publishing their results. But the essays in this volume do touch on a range of substantive, ethical, and methodological issues arising from the practices of high-demand religious, political, and therapeutic movements. Such issues include, but are not limited to, the following: sexual exploitation of female devotees (Boyle 1998), child rearing and child abuse in totalist milieux (Boyle 1999; Palmer and Hardman 1999), compensation sought for induced emotional trauma, fraud, and psychological imprisonment in totalist groups (Anthony and Robbins 1992, 1995; Delgado 1982), child custody disputes pitting members against former members and non-members (Greene 1989), violence erupting in or allegedly perpetrated by totalist millennialist sects (Robbins and Palmer 1997), and attempts to ‘rescue’ adult devotees through coercive methods (Shepard 1985).

The problem for objectivity is that scholars have gotten involved as expert witnesses (on both sides) in court cases in which such issues have been raised. It has been very difficult for these scholars to then turn around and look at brainwashing (or any of these other issues) from a disinterested scientific perspective, apart from the confrontational needs of pending litigation. Disagreements among scholars in this area have been sharp and acrimonious, perhaps irrationally and dysfunctionally so (Allen 1998). As editors of this volume we believe that the explosion of litigation in these areas, however justified in terms of either combating cultist abuses or defending religious freedom, has had a net deleterious effect on scholarship and has led to an extreme polarization which has undermined both objectivity and collegiality. Far too much research and theorizing has been done by scholars under the pressure of participation in pending litigation. Scholarship in other areas is often permeated with disputes, polarization, and recrimination. But polemical excess in this realm has been egregious. To the extent that the litigational perspective continues to dominate, it threatens to make a mockery of the enterprise of scholarship in this field.

The Structure of This Book
In designing this book, we first identified what seem to us the three major sources of confusion and misunderstanding surrounding charismatic religious movements.

The first of these is the role confusion of scholars and helping professionals studying cults. How can they (and should they even try to) investigate groups that specialize in the construction and maintenance of alternate views of reality without being influenced by those views of reality? Can they ever really hope to understand groups with beliefs and practices so fundamentally different from their own?

The second is a type of confusion sometimes found among those participating in religious movements. Are they doing so solely out of their own prior motives or are they being manipulatively influenced by the charismatic organizations themselves? And if the latter, is this influence of the same order (even if perhaps of greater intensity) as that practised routinely in school classrooms and television advertising, and in the preaching of conventional religion? Or is it sufficiently more intense to warrant being called brainwashing or thought reform? Do movements employing manipulative methods of indoctrination and commitment-building represent a significant contemporary social problem and menace?

It should be noted that the controversy over the existence of cultic brainwashing has been the issue which has most sharply divided many of the polarized commentators in this field. They have exhibited little consensus, among themselves, even about what they are fighting over (the existence of a social process or the existence of a social outcome). The one thing they all seem to agree on is that the resolution of this question – Does brainwashing really happen in cults or is it a paranoid and bogus invention of the ‘anticultists’? – is critical for our understanding of cults and their role in our society. For this reason we have devoted by far the greatest amount of space in this volume to a discussion of this subject.

The third is confusion among the public as to what, if anything, to do about these charismatic religious groups. Do they deserve constitutional protection on the same basis as all mainstream religions in the Western world? Or do they require special kinds of surveillance and ‘consumer protection’ in order to protect innocent seekers and innocent bystanders?

This book is organized around groupings of essays devoted to each of these three topic areas. Although all of these issues have been addressed in other books, this volume is unique in two respects. First, it is the only book which brings together these three interrelated sources of confusion within one volume. This is important because the cloud of misunderstanding surrounding these religious movements can best be understood in terms of the interplay of all three of these types of confusion. Second, most other collections of essays have been distinctly inclined towards one or the other of the two divisive academic poles mentioned above. At most there has been a token representation of a scholar from ‘the other side’ and often not even that. This book, by way of contrast, has deliberately solicited contributions entailing a wide range of scholarly perspectives.

The contributors to this volume are mainly drawn from the ranks of senior scholars, with a few promising younger scholars included for generational balance. Together, the authors have published a cumulative total of over thirty books, and they have been actively involved for years in research on controversial movements. They include one scholar who is a member of an esoteric (but not particularly controversial or authoritarian) group, one who is an ex-member of a cult, and one who grew up in a cult.

We, the editors, are delighted that a group of scholars with such divergent views were willing to come together within the covers of a single volume. However, we quickly recognized that if we allowed continuing rejoinders and counter-rejoinders until all were in agreement, this book would never have been published. We therefore adopted a strict rule that each author’s text would have to stand on its own merits and that no comments by authors on other chapters would be included. It follows from this that each author in the book is responsible only for his or her own chapter, and that no endorsement of the views of other authors is implied by the mutual willingness of each to be published in the same volume. One set of unique circumstances did require us to relax our ‘no rejoinders’ rule in a single instance, which is explained below.

The first section of our volume deals with issues of scholarly objectivity, methodology, and professional norms. Allegations of partisanship, bias, and the shallow quality of fieldwork have recently been debated in a number of published articles (Allen 1998; Balch and Langdon 1998; Introvigne 1998; Kent and Krebs 1998, 1999; Zablocki 1997).8 The section is divided into two sets of paired essays. The first pair (chapters 1 and 2) concerns the perspectives, motivations, and objectivity of scholars investigating cults. Benjamin Beit-Hallahmi, an Israeli psychologist of religion, presents a hard-hitting critique of the irresponsibility of scholars of religion who appear frequently to have been overly solicitous towards the controversial movements they have dealt with, and too often to have been animated by an imperative of defending and vindicating putatively persecuted cults. These issues are also discussed by a sociologist, Thomas Robbins, who relates the intensity of recent conflict among ‘experts’ on cults to the earlier heated controversy over physically coercive ‘deprogramming’ and to the role of experts in an adversarial system of law and policy-making. Employing a multifaceted analogy between the contemporary ‘cult wars’ and agitation in previous decades against domestic communist subversion, the author maintains that rigid orientations of apologetic defensiveness, as well as crusading ‘countersubversive’ perspectives, tend ultimately to sacrifice objectivity.

The second pair of essays in our first section (chapters 3 and 4) deals with the actual process of field research in the context of somewhat manipulative, close-knit, and often authoritarian and secretive groups. First, Susan Palmer, a researcher of esoteric movements, is frank and humorous in her discussion of the manipulative ploys of some esoteric sects and their desire to co-opt or domesticate the researcher. Evidently aware of the many pitfalls, Palmer defends the ultimate necessity of first-hand observation of esoteric groups and of interviews with current participants. In the second essay, the pitfalls of persuasive impression management on the part of a manipulative group are strongly emphasized by Janja Lalich, a recent PhD and a former member of an authoritarian and regimented Marxist cult. The author draws upon her own past experience of arranging to mislead outside observers and of constructing facades for their bemusement, and warns observers what to watch out for.

The second section of our volume is concerned with brainwashing. The literature on cultic ‘brainwashing’ is lengthy, acrimonious, and polarized (Anthony 1996; Anthony and Robbins 1994; Barker 1984; Bromley and Richardson 1983; Introvigne 1998; Katchen 1997; Lifton 1991; Martin et al. 1998; Melton forthcoming; Ofshe 1992; Ofshe and Singer 1986; Richardson 1998; Zablocki 1998; Zimbardo and Anderson 1993). The basic conceptual and evidentiary issues are discussed from three different perspectives by Benjamin Zablocki, Dick Anthony, and David Bromley.

In chapter 5, Zablocki returns to the original mid-century definition in developing a concept of brainwashing that is less polarized, less partisan, less ‘mystical’ and more scientific than those that became popular during the cult wars of the 1980s. Zablocki’s chapter seeks to demonstrate the epistemological validity of the concept and the empirical evidence for its existence in cult settings. He sees brainwashing as an objectively defined social influence mechanism, useful for understanding the dynamics of religious movements, not for making value judgments about them. He argues that brainwashing is not about free will versus determinism, but rather about how socialization places constraints on the willingness of individuals to make choices without regard for the consequences of social disapproval. In this context, he sees brainwashing as nothing more than a highly intense form of ideological socialization.

In chapter 6, Dick Anthony sharply criticizes this approach, as embodied in earlier books and papers by Zablocki. He develops a concept, which he calls ‘tactical ambiguity,’ to explain how brainwashing theorists have attempted to avoid empirical tests of their arguments by continually shifting the grounds of their key assumptions. Using this concept he argues that, disguised behind a veneer of pseudo-scientific jargon, Zablocki has essentially resurrected the old, thoroughly discredited United States Central Intelligence Agency model of brainwashing from the 1950s. This CIA model held that it was possible to perfect a formula that would rapidly and reliably remold ideologically any targeted subject against his or her will, turning that subject into an effective secret agent of the United States government. Anthony further argues that part of Zablocki’s theory can be dismissed as double-talk, and that the rest is unscientific dogma (disguising Zablocki’s personal scepticism about the authenticity of innovative religion) because it is based on the intrinsically untestable notion that a subject’s free will can be overwhelmed by irresistible external psychological forces. Anthony concludes that the term brainwashing has such sensationalist connotations that its use prejudices any scientific discussion of patterns of commitment in religious movements.

In chapter 7, David Bromley argues that the whole controversy over cultic brainwashing is essentially ideological and political – not scientific. From Bromley’s perspective, competing ‘narratives’ in this area are not really susceptible to a definitive empirical resolution. Bromley aims at neutrality between conflicting ‘ideological’ perspectives. His relativistic approach is not directly critical of either Zablocki’s or Anthony’s position. However, it undercuts both the position of the more absolutist crusaders who deny any utility to the brainwashing concept as a tool for understanding cults, and the position of those who see brainwashing as the single key to understanding these movements.

Section 2 has a second subsection, which revolves around evidence advanced by sociologist Stephen Kent that the Church of Scientology in the recent past, and the Children of God (now called The Family) in the 1970s, employed rigorous programs for resocializing deviant members – programs which, he argues, not only qualify as brainwashing but also contain an element of physical coercion or captivity (chapter 8). Kent’s notion of brainwashing is convergent with Zablocki’s treatment in seeing brainwashing not so much as a way of initially converting or recruiting devotees, but as an intensive (and costly) method of sustaining commitment and orthodoxy among existing high-level converts in danger of straying into heterodoxy, dissent, or defection.

Kent’s argument is subjected here to a methodological critique by sociologist Lorne Dawson (chapter 9). Dawson argues that Kent’s analysis is one-sided, since it uses ex-member accounts as a source of data but does not attempt to balance their perspective by obtaining countervailing accounts from the cults themselves. Dawson maintains that Kent fails to demonstrate that his explanation of the data is more convincing than a number of other explanations that can be found in the literature on new religious movements.

The Kent-Dawson dialogue departs from our rule that authors would not be given additional space in the book to reply to other authors with whom they might disagree. We made an exception here, allowing Kent to write a brief rejoinder to Dawson’s critique (chapter 10). The reason for this exception is that, at the time we commissioned Kent’s chapter, we were not aware that Dawson had written a critique of Kent’s paper that he wished to publish in our book. By the time we received Dawson’s critique, Kent had already submitted his chapter, unaware that it might be followed by a critique. As a courtesy to Kent, therefore, we asked his permission to include the Dawson critique, and Kent agreed, provided that he be allowed to append a brief rejoinder. Since both authors were happy with this arrangement, we agreed to make an exception in this one case.

The final section of our volume is structured somewhat differently from the preceding two sections. Instead of sets of paired or clearly interrelated essays, section 3 consists of three papers, each of which explores a separate topic relevant to questions of public policy with regard to cults. Since it was not possible, in a volume of this size, to cover all the relevant public policy issues, we devote a chapter to each of three such issues: the problems of children growing up in cults, cultic pressures on scholars to suppress evidence, and problems raised when new religious movements become violent in their relations with society.

The treatment and experiences of children brought up in cults have attracted significant critical attention and are now producing increasing scholarly research (Boyle 1999; Palmer and Hardman 1999). Sociologist Amy Siskind (1994) grew up in a radical, communal therapeutic movement. Her present contribution (chapter 11) seeks to conceptualize totalistic child-rearing systems and to describe and compare five communal groups in this area. Each group featured an ‘inspired’ charismatic leader/theoretician. In each group parents were to some degree displaced as caretakers and disciplinarians of their children by movement leaders. In two of the groups, patterns of child rearing clearly shifted significantly over time and were influenced by several key variables. Siskind argues that children growing up in regimented totalistic groups may be susceptible to certain psychological or developmental problems and will probably face difficult problems of adjustment if they later leave the group. She affirms that child rearing in totalist groups should be investigated objectively, without preconceptions.

Sociologist Julius Rubin has studied and developed a critical perspective on the communal Bruderhof sect. His views, and his expression of them, are strongly resented by the movement, which, in Rubin’s view, does not tolerate criticism. The group’s retaliatory tactics, Rubin argues, exemplify the ability of some manipulative cults to use the law courts to intimidate or punish ‘enemies,’ thereby suppressing criticism and encouraging what he regards as a form of implicit censorship. As Rubin reports (chapter 12), what is particularly dangerous about this practice is the ability and willingness of some of the wealthier cults to litigate against critical scholars even in cases they cannot win, knowing that this tactic may very well intimidate both authors and academic publishing houses without deep pockets.

In this millennial period, increasing scholarly attention is being directed to catastrophic outbreaks of collective homicide/suicide associated with totalist cults such as the People’s Temple, the Branch Davidians, the Solar Temple, Aum Shinrikyo, and Heaven’s Gate. There is concern over the volatility of various millennialist movements as well as over their persecution (Bromley and Melton forthcoming; Maaga 1999; Robbins and Palmer 1997; Wessinger 2000; Wright 1995, 1999; Young 1990). Much of this growing literature revolves around a duality in which intrinsic or endogenous sources of group volatility, such as apocalyptic worldviews, charismatic leadership, totalism, and ‘mind control,’ are played off against extrinsic or exogenous sources related to external opposition, persecution, and the rash and blundering provocations of hostile officials.

Jeffrey Kaplan’s earlier study of Christian Identity, ‘Nordic’ neopaganism, and other militant millennialist movements, Radical Religion in America (Kaplan 1997), inclined somewhat towards the relational or extrinsic perspective and emphasized the dynamic whereby negative stereotypes of controversial movements, disseminated by hostile watchdog groups, tend eventually to become self-fulfilling prophecies. In his present contribution (chapter 13), Kaplan presents an overview of millenarian violence in American history, with a particular focus on Christian Identity paramilitarists, extreme antiabortion militants, nineteenth-century violence employed by and against the early Mormons, violence arising within a youthful Satanism subculture, and the violence of cults such as the Branch Davidians led by David Koresh and UFO movements such as Heaven’s Gate. With regard to cult-related violence, Kaplan rejects the extreme popular scenario of members ‘going unquestioning to their deaths for a charismatic leader,’ as well as the apologetic ‘counter-scenario … that if only the group had been left entirely to its own devices, all would have been well.’ Simplistic, polarized stereotypes of lethal cult episodes are misleading, in part because, as Kaplan notes, there are salient differences among several recent situations in which cults have been implicated in large-scale violence. Extreme violence erupts in only a small percentage of millenarian groups; moreover, sensational cult episodes such as Jonestown, Waco, or Heaven’s Gate account for only a fraction of the violence related to some form of millenarianism. Yet our understanding of such episodes ‘remains at best incomplete.’

Conclusions
We are under no illusion that the chapters in this book will serve to completely dispel misunderstanding of cults. Nor do we seek to disguise the many serious and acrimonious disagreements that still sharply divide scholars working in this field. After decades of polarization, it is an important first step that these authors have been willing to appear side by side in the same book. In the future we would encourage them to continue to debate these ideas in other media.

We are optimistic that the dialogue begun in these pages will have the kind of momentum that will weaken the rigid ‘thought communities’ that have polarized this field of study in the past. Somewhat more cautiously, we are hopeful that this weakening of the two extreme positions will lead to the growth and vitality of a flexible and open-minded, moderate scholarly cluster.

Notes
1 For purposes of this introductory essay we, the editors, will use the term ‘cult’ to denote a controversial or esoteric social movement which is likely in most cases to elicit a label such as ‘alternative religion,’ or ‘new religious movement.’ Such groups generally tend to be small, at least in comparison with large churches, and are often aberrant in beliefs or practices. They are sometimes very close-knit and regimented (‘totalist’) and manifest authoritarian, charismatic leadership. They may be strongly stigmatized. Although the term cult has become somewhat politicized and has taken on definite negative connotations, we do not intend to employ this term in a manner which settles contested issues by definition (i.e., we do not consider violence, ‘brainwashing,’ or criminality to be automatic or necessary attributes of cults). We also do not imply that all cults will appear to all observers to be specifically ‘religious.’ The term has sometimes been applied to groups which do not claim religious status, such as Transcendental Meditation, or which have a contested religious status, such as the Church of Scientology. Some commentators have referred to ‘therapy cults’ or ‘political cults.’ However, most well-known, controversial cults such as the Unification Church (Moonies), Hare Krishna, The Family (formerly The Children of God), the Branch Davidians, or Heaven’s Gate tend to be distinctly religious or at least supernaturalist. Our neutral use of the term cult precludes the view that cults, being pernicious, cannot be authentic religions.

2 As co-editors of this volume, we have not been and are not now entirely neutral or non-partisan in the conflicted discourse over cults. Professor Zablocki is sympathetic to organizations concerned with abuses perpetrated by authoritarian groups employing manipulative indoctrination. Robbins has long been associated with avowed defenders of religious liberty and the rights of religious minorities. We thus remain divided over a number of crucial issues. But we are united in deploring the partisan excesses of past discourse, and in our desire to encourage depolarization and the development of a ‘moderate’ agenda among scholars concerned with controversial religious movements.

3 Although we speak of two extremes, we are mindful that the misunderstanding of cults is made even more convoluted by the fact that voices in the debate also include representatives of institutionalized religions. These are also polarized, but over very different issues and with quite different agendas. Matters of doctrine (and possible heresy) are important to some of those within what is known as the ‘counter-cult’ movement, so called to distinguish it from the more secular ‘anticult’ movement. But other representatives of organized religion are energized by the felt need to defend even the most offensive of the new religions, out of concern with the domino effect that might follow once religions come to be suppressed. This book deals only tangentially with the very different set of issues raised about cults by organized religions. But their influence is impossible to ignore, even by secular scholars, if only because of alliances with these groups that influence public stands on cult policy.

4 The term ‘cult apologist’ is in fact frequently employed by critics of cults to devalue scholars who are deemed to be too sympathetic towards or tolerant of objectionable groups. However, the term ‘cult basher’ is somewhat less ubiquitous. Scholars who disdain the views of strong critics of cults are usually more likely to refer to ‘anticultists’ or the ‘anticult movement.’ Deceptively mild, these terms can actually convey a significant stigma in an academic context. There is a clear implication that anticultists are movement activists, crusaders, and moral entrepreneurs first, and only secondly, if at all, are they scholars and social analysts; this is in contrast to putatively objective scholars who labor to dispel the myths and stereotypes disseminated by anticultists. However, since the term cult basher has an appealing surface equivalence to the more ubiquitous cult apologist, we will continue to use cult basher to denote one pole of the controversial discourse on cults.

5 Robbins’s chapter in this volume discusses the way in which growing litigation in this area, and the interaction of the ‘cult of expertise’ with the underlying adversarial system for resolving questions of law and policy in the United States, operate to polarize conflicts involving cults.

6 This is a difficult issue, and we will pause before adopting an absolutist position. Some kinds of information may be accessible only to an observer who is trusted as an ‘insider’ (i.e., full participant). This may be the case not only with respect to ‘secret’ arrangements and practices, but also with regard to the thought patterns, verbal styles, and demeanor of devotees, which may vary according to whether they believe they are relating to insiders or outsiders (Balch 1980). What does seem objectionable is when an observer has used deception to gain access to a group, but with the hidden purpose of hurting or embarrassing the group, i.e., what might be termed the ‘Linda Tripp mode of participant observation.’ But must ‘undercover’ researchers always be obliged subsequently to be supportive of the (sometimes seriously flawed or objectionable) groups they have investigated?

7 We do not wish to appear unduly naive or as ivory tower isolationists. We realize that policy concerns often drive social research. The study of particular ‘social problems’ has often been constituted by social movements which have addressed or even constructed social issues. Thus a current boom in research on aspects of memory has responded in part to controversies over recovered memory syndrome and false memory syndrome. But this example may also illustrate the pitfalls of researchers becoming too partisan and committed to policy orientations. A prominent participant in recovered (vs. implanted) memory controversies has recently warned against researchers becoming too beholden to a priori commitments and thus tending to surrender the residual operational neutrality which sustains the authenticity of the research process and the credibility of emergent findings (Loftus and Ketcham 1994).

8 Some of these issues were raised earlier in the 1983 symposium Scholarship and Sponsorship. The centerpiece of the symposium was a somewhat accusatory essay by the eminent sociologist Irving Horowitz (1983), who called for ‘neutral’ non-religious funding in the sociology of religion, particularly with respect to controversial movements. The divisive impact of Horowitz’s critique may have been muted because Horowitz was neither a scholar specializing in religion, nor a member of an ‘anticult’ organization, nor a supporter of the conceptualization of commitment-building within cults in terms of ‘brainwashing.’ The issue of partisanship and objectivity is closely related to the substantive scholarly conflict over brainwashing because allegations of ‘procult’ and apologetic bias are employed to explain why brainwashing notions are so intensely resisted by professional students of religion (Allen 1998; Zablocki 1997, 1998).

References
Allen, Charlotte. 1998. ‘Brainwashed: Scholars of Religion Accuse Each Other of Bad Faith.’ Lingua Franca (December/January): 26-37.
Anthony, Dick. 1996. ‘Brainwashing and Totalitarian Influence: An Exploration of Admissibility Criteria For Testimony in Brainwashing Trials.’ PhD diss. Graduate Theological Union, Berkeley, Calif.
Anthony, Dick, and Thomas Robbins. 1992. ‘Law, Social Science, and the “Brainwashing” Exception to the First Amendment.’ Behavioral Sciences and the Law 10: 5-29.
— 1994. ‘Brainwashing and Totalitarian Influence.’ In Encyclopedia of Human Behavior. New York: Academic Press.
— 1995. ‘Negligence, Coercion, and the Protection of Religious Belief.’ Journal of Church and State 37: 509-36.
Balch, Robert W. 1980. ‘Looking Behind the Scenes in a Religious Cult: Implications for the Study of Conversion.’ Sociological Analysis 41: 137-43.
Balch, Robert W., and Stephan Langdon. 1998. ‘How the Problem of Malfeasance Gets Overlooked in the Study of New Religious Movements.’ In Wolves Among the Fold, edited by A. Shupe. New Brunswick, N.J.: Rutgers University Press.
Barker, Eileen. 1984. The Making of a Moonie: Choice or Brainwashing? Oxford: Basil Blackwell.
Beckford, James A. 1985. Cult Controversies: The Societal Response to the New Religious Movements. London: Tavistock.
Boyle, Robin. 1998. ‘Women, the Law, and Cults.’ Cultic Studies Journal 15: 1-32.
— 1999. ‘How Children in Cults May Use Emancipation Laws to Free Themselves.’ Cultic Studies Journal 16: 1-32.
Bromley, David. 1998. ‘Listing (in Black and White) Some Observations on (Sociological) Thought Reform.’ Nova Religio 1: 250-66.
Bromley, David, and Bruce Busching. 1988. ‘Understanding the Structure of Covenants.’ Sociological Analysis 49: 15-32.
Bromley, David, and J. Gordon Melton. Forthcoming. New Religious Cults and Violence in Contemporary Society. Cambridge: Cambridge University Press.
Bromley, David G., and James T. Richardson. 1983. The Brainwashing/Deprogramming Controversy: Sociological, Psychological, Legal and Historical Perspectives. New York: Edwin Mellen.
Coser, Lewis A. 1974. Greedy Institutions: Patterns of Undivided Commitment. New York: Free Press.
Davis, Derek. 1998. ‘Religious Persecution in Today’s Germany.’ Journal of Church and State 40: 741-56.
Delgado, Richard. 1977. ‘Religious Totalism: Gentle and Ungentle Persuasion under the First Amendment.’ Southern California Law Review 51.
— 1980. ‘Religious Totalism as Slavery.’ Review of Law and Social Change 4: 51-68.
— 1982. ‘Cults and Conversion.’ Georgia Law Review 16: 533-74.
— 1984. ‘When Religious Exercise Is Not Free.’ Vanderbilt Law Review 37: 22-9.
Effrat, Andrew. 1972. ‘Power to the Paradigms.’ Sociological Inquiry 42: 3-33.
Fleck, Ludwik. 1979. Genesis and Development of a Scientific Fact. Chicago: University of Chicago Press.
Friedrichs, Robert. 1970. A Sociology of Sociology. New York: Free Press.
Greene, Ford. 1989. ‘Litigating Child Custody with Religious Cults.’ Cultic Studies Journal 6: 69-75.
Heins, Marjorie. 1981. ‘Other Peoples’ Faith: The Scientology Litigation and the Justiciability of Religious Fraud.’ Hastings Constitutional Law Quarterly 4: 241-57.
Hexham, Irving, and Karla Poewe. 1999. ‘Verfassungsfeindlich: Church, State, and New Religions in Germany.’ Nova Religio 2: 208-27.
Horowitz, Irving. 1983. ‘Universal Standards Not Uniform Beliefs: Further Reflections on Scientific Method and Religious Sponsors.’ Sociological Analysis 44: 179-82.
Introvigne, Massimo. 1998. ‘Blacklisting or Greenlisting? A European Perspective on the New Cult Wars.’ Nova Religio 2: 16-23.
Kaplan, Jeffrey. 1997. Radical Religion in America. Syracuse: Syracuse University Press.
Katchen, Martin H. 1997. ‘The Rate of Dissociation and Dissociative Disorders in Former Members of High Demand Religious Movements.’ PhD diss., Department of Sociology, Sydney University, Sydney, Australia.
Keim, Albert N. 1975. Compulsory Education and the Amish: The Right Not to Be Modern. Boston: Beacon.
Kent, Stephen A., and Theresa Krebs. 1998. ‘Academic Compromise in the Social Scientific Study of Alternative Religions.’ Nova Religio 2: 44-54.
— 1999. ‘When Scholars Know Sin: Alternative Religions and Their Academic Supporters.’ Skeptic 6: 36-43.
Lemert, Charles. 1979. Sociology and the Twilight of Man: Homocentrism and Discourse in Sociological Theory. Carbondale: Southern Illinois University Press.
Lifton, Robert Jay. 1991. ‘Cult Formation.’ The Harvard Mental Health Letter 7 (8), February: 1-2.
Lofland, John. 1993. ‘Theory-bashing and Answer-improving in the Study of Social Movements.’ The American Sociologist 24: 37-57.
Loftus, Elizabeth, and Katherine Ketcham. 1994. The Myth of Repressed Memory: False Memories and Allegations of Sexual Abuse. New York: St Martin’s.
Maaga, Mary. 1999. Hearing the Voices of Jonestown. Syracuse: Syracuse University Press.
Martin, Paul, Lawrence Pile, Ron Burks, and Stephen Martin. 1998. ‘Overcoming Bondage and Revictimization: A Rational/Empirical Defense of Thought Reform.’ Cultic Studies Journal 15: 151-91.
Melton, J. Gordon. Forthcoming. The Brainwashing Controversy: An Anthology of Essential Documents. Stanford, Calif.: Center For Academic Publication.
Merton, Robert K. 1975. ‘Structural Analysis in Sociology.’ In Approaches to the Study of Social Structure, edited by P. Blau. New York: Free Press.
Ofshe, Richard. 1992. ‘Coercive Persuasion and Attitude Change.’ In The Encyclopedia of Sociology, edited by E. Borgatta and M. Borgatta. New York: Macmillan.
Ofshe, Richard, and Margaret Singer. 1986. ‘Attacks on Peripheral versus Central Elements of Self and the Impact of Thought Reform Techniques.’ Cultic Studies 3(1): 3-24.
Palmer, Susan, and Charlotte Hardman. 1999. Children in New Religions. New Brunswick: Rutgers University Press.
Pfeifer, Jeffrey, and James Ogloff. 1992. ‘Cults and the Law.’ Behavioral Sciences and the Law 10.
Richardson, James. 1986. ‘Consumer Protection and Deviant Religion.’ Review of Religious Research 28: 168-79.
— 1991. ‘Cult/Brainwashing Cases and Freedom of Religion.’ Journal of Church and State 33: 55-74.
— 1998. ‘The Accidental Expert.’ Nova Religio 2: 31-43.
Robbins, Thomas. 1979. ‘Cults and the Therapeutic State.’ Social Policy 5: 42-6.
Robbins, Thomas, and James Beckford. 1993. ‘Religious Movements and Church-State Issues.’ Religion and the Social Order 3B: 199-218.
Robbins, Thomas, and Susan Palmer. 1997. Millennium, Messiahs, and Mayhem. New York: Routledge.
Robertson, Roland. 1979. ‘Religious Movements and Modern Societies.’ Sociological Analysis 40: 297-314.
Shapiro, Robert. 1983. ‘Of Robots, Persons, and the Protection of Religious Beliefs.’ Southern California Law Review 56: 1277-1318.
Shepard, William. 1985. To Secure the Blessings of Liberty: American Constitutional Law and the New Religious Movements. Chico, Calif.: Scholars’ Press.
Siskind, Amy. 1994. The Sullivan Institute/Fourth Wall Community. Religion and the Social Order 4: 51-78.
Stark, Rodney, and William Sims Bainbridge. 1996. A Theory of Religion. New Brunswick, N.J.: Rutgers University Press.
Van Hoey, Sara. 1991. ‘Cults in Court.’ Cultic Studies Journal 8: 61-79.
Weber, Max. 1946a. ‘Politics as a Vocation.’ In From Max Weber: Essays in Sociology, edited by H. Gerth and C.W. Mills. New York: Oxford University Press.
— 1946b. ‘Science as a Vocation.’ In From Max Weber: Essays in Sociology, edited by H. Gerth and C.W. Mills. New York: Oxford University Press.
Wessinger, Catherine. 2000. How the Millennium Comes Violently. Chappaqua, N.Y.: Seven Bridges Press.
Wright, Stuart. 1995. Armageddon in Waco. Chicago: University of Chicago Press.
— 1999. ‘Anatomy of a Government Massacre.’ Terrorism and Political Violence 11: 39-68.
Young, Thomas. 1990. ‘Cult Violence and the Christian Identity Movement.’ Cultic Studies Journal 7: 150-57.
Zablocki, Benjamin D. 1997. ‘The Blacklisting of a Concept: The Strange History of the Brainwashing Conjecture in the Sociology of Religion.’ Nova Religio 1: 96-121.
— 1998. ‘Exit Cost Analysis: A New Approach to the Scientific Study of Brainwashing.’ Nova Religio 1: 216-49.
Zimbardo, Philip, and Susan Anderson. 1993. ‘Understanding Mind Control: Exotic and Mundane Mental Manipulations.’ In Recovery from Cults: Help for Victims of Psychological and Spiritual Abuse, edited by M.D. Langone. New York: Norton.




1. ‘O Truant Muse’: Collaborationism and Research Integrity

Benjamin Beit-Hallahmi

In early May 1995, as Japanese law-enforcement authorities were collecting evidence linking the Aum Shinrikyo NRM to the 20 March poison gas attack which killed twelve commuters on the Tokyo subway, and preparing what they thought was a strong case, they discovered to their utter surprise that they were under attack from an unexpected direction. Four Americans arrived in Tokyo to defend Aum Shinrikyo against charges of mass terrorism. Two of them were scholars whose names are well known in the NRM research community, thanks to their many scholarly activities. But on this trip they were acting both as super-sleuths and as defenders of religious freedom. They stated that Aum Shinrikyo could not have produced the sarin gas used in the attack, and called on Japanese police not to ‘crush a religion and deny freedom.’ These statements, made at two news conferences, were met with open disbelief in the Japanese media. The fact that all travel expenses for the U.S. experts were covered by Aum Shinrikyo did not help (Reader 1995; Reid 1995). Later, one of the U.S. visitors published an account of the 1995 gas attack which claimed that the North Korean secret services were behind it. Aum Shinrikyo was notorious in Japan long before the 1995 events. Its belief system made it likely to attract less than friendly attention: children in its schools were taught to regard Adolf Hitler as a living hero, and its official publications carried stories of a Jewish plan to exterminate most of humanity (Kowner 1997). Still, all of us would agree that unusual beliefs, however distasteful to some, should be no justification for state action against religious groups. As students of religion, we deal with a huge variety of unusual beliefs. Was Aum Shinrikyo a victim of majority prejudice against NRMs?

Reliable reports since 1995 have indicated that the Japanese authorities were actually not just overly cautious, but negligent and deferential, if not protective, regarding criminal activities by Aum, because of its status as an NRM. ‘Some observers wonder what took the Japanese authorities so long to take decisive action. It seems apparent that enough serious concerns had been raised about various Aum activities to warrant a more serious police inquiry prior to the subway gas attack’ (Mullins 1997: 321). The group can only be described as extremely and consistently violent and murderous. ‘Thirty-three Aum followers are believed to have been killed between … 1988 and … 1995 … Another twenty-one followers have been reported missing’ (320). Among non-members, there have been twenty-four murder victims. There were at least nine germ-warfare attacks by Aum Shinrikyo in the early 1990s, most of which had no effect (Reuters 1998). The group also committed a triple murder in 1989 and another poison gas attack in 1994, which killed seven, as well as less serious crimes which the police were not too eager to investigate (Beit-Hallahmi 1998; Haworth 1995; Mullins 1997). So it is safe to conclude that religious freedom was not the issue in this case. Nor is it likely, as some Aum apologists among NRM scholars have claimed, that this lethal record (77 deaths on numerous occasions over 7 years) and other non-lethal criminal activities were the deeds of a few rogue leaders. Numerous individuals must have been involved in, and numerous others aware of, these activities. As of May 1998, the Japanese authorities had charged 192 Aum Shinrikyo members with criminal activities (Reuters 1998).

Another claim by the Aum apologists is that the trip to Japan was initiated and financed by Aum ‘dissidents,’ shocked by the acts of their leaders. The reality is that the trip was initiated by the NRM scholars involved, who contacted Aum to offer their help, and that there are no Aum dissidents. As of 1999, Aum Shinrikyo is alive and well, one and indivisible, its members united in their loyalty to Shoko Asahara, and this includes the alleged dissidents who hosted our colleagues in 1995. Let me make clear at the outset that no one in his right mind should even hint that NRM scholars were knowingly defending the Aum murderers. They were assuming Aum Shinrikyo was another NRM worthy of defence. That is why they were ready to take Aum Shinrikyo’s money. They did not think they were taking unreasonable risks. They were acting out of a certain mind-set, which to them was not just reasonable but commendable. The Aum episode was symptomatic of an ideology that sees our world as a place in which NRMs are maliciously attacked by forces of the state, and are always presumed to be innocent.

Are we stunned by this grotesque and surrealistic episode? Do we look away? Do we raise our eyebrows? Do we shrug our shoulders? Are we shocked by the involvement of NRM researchers in this tragic story? Some NRM scholars have suggested that the trip to Japan, as reported in the media, caused the field an image problem (Reader 1995). Let me make clear right away that my concern here is not with images, but with the reality of scholarship. I am afraid that in this case, as in many others, the reality may be actually worse than the image. Is it just an isolated case of bad judgment? After all, only two NRM scholars were involved, and they were both independent scholars. Given the climate and culture of the NRM research community, however, and earlier demonstrations of support for NRMs in trouble, the Aum case is not some statistical outlier.

Something like a party line has developed among NRM scholars, and much of the discourse in NRM research over the past twenty years has been characterized by a happy consensus on the question of the relations between NRMs and their social environment, especially in situations of overt conflict. This consensus is responsible for a new conformity, which necessarily puts strict limits on researchers’ curiosity. The level of conformity to the reigning consensus has been remarkable. Conformity is not problematic in itself, and does occur in various fields, but here it has also led to advocacy – as in the cases of Aum Shinrikyo and David Koresh – that is, to public expressions of support for an NRM and its vested interests in any conflict with its social environment. NRM researchers engaged in advocacy are expressing a feeling, and a reality, of partnership and collaboration with NRMs in a common cultural struggle. If there had been only some isolated incidents, they would need to be discussed only as rare exceptions to the prevailing norms; but what we have observed is a clear pattern, expressed in a total mobilization for the cause. The problem with the party line is not just that it has undermined scholarly credibility, but that it has crippled our main effort, which should be to understand and explain, rather than defend, the phenomenon under study. Discussions of the ethics of research with human subjects have to do with protecting those defined as research subjects (or objects) from abuse by researchers. Quite correctly, it is assumed that in the encounter between humans and those who study human behavior, the former have to be defended from the latter, who are seen as quite powerful (Weisstub 1998).

What should be the proper and desirable relationship between scholars and the groups they study? Naturally, this relationship must be problematic, marked by tension on both sides. No one likes to be under scrutiny of any kind, and we are all sensitive to the self-serving ways in which humans, scholars included, present themselves to others. The outsider, or even the insider, who chooses to report behind-the-scenes realities in any organization or culture is likely to produce an embarrassing exposé (Alper 1997). All academic research threatens all religions, and in particular new and weak NRMs. The minute we claim the role of the researcher in human groups, we adopt a peculiar position, which is judgmental, uninvolved, and alienated from raw experience and from true believers. The basic position is voyeuristic and reductionist; otherwise all I do is share the experience. A critical attitude and an interpretive bent are the marks of the scholar, who is unlikely to take messages from the subjects of his study at face value. Even as voyeurs we are quite peculiar, and this is exploited in a fictional satire on sociologists observing a failure of prophecy, which proves how little is really known about social science and its real dilemmas (Lurie 1967/1991). Things are a lot more serious and more complicated than the novelist Lurie ever imagined.

Credibility must be negotiated and earned by both informants and scholars, and what is at issue here is the credibility of NRM research. The official ideology of an academic field – i.e., its claims to being autonomous, expert-oriented, and accountable only to its own authorities – is a way of preventing attacks on its basic theoretical conception. Looking at the question of integrity and credibility in NRM scholarship must lead into a broader critical discussion of the field, because ideology clearly dictates research directions and questions. It is not just a problem of integrity, but of basic knowledge. This is an occasion for stocktaking or, to borrow a religious term, soul-searching. The role of ideology in research is not a startling, novel issue – not only because of the impact of the sociology of knowledge, and not only because of Spengler’s (1926) radical theory of the historical nature of science. Awareness of the primacy of ideology in social science has been commonplace throughout this century (Beit-Hallahmi 1974a, 1974b, 1975, 1976, 1977a, 1977b, 1981, 1987, 1989, 1991, 1992b, 1995), and the suggestion that all inquiry about the social world is value laden is universally accepted (Keita 1998). The classical study by Pastore (1949) on the nature-nurture debate in psychology showed that ideology clearly precedes action in academic work. What we have here is a case study of a peculiar ideological bias affecting academic work and non-academic advocacy.

We, as students of religion and members of the academic community, are all biased (Robbins 1983). Our differing ideological commitments do not prevent us from communicating and collaborating as colleagues. Scholars are expected to be sophisticated consumers of their colleagues’ work. They detect errors, bias, and oversights, and separate valuable gold nuggets from slag. In the study of religion, bias and religious commitments should not necessarily undermine scholarship; they may only set its limits. There have been religion scholars with strong religious commitments who were still great scholars (Beit-Hallahmi 1989, 1991, 1996b). Our conflicting biases should naturally lead to debates and controversy. It is indeed baffling when we observe in a particular research network the strange silence of conformity. Scholars in perfect agreement are like the dog that didn’t bark. They should make us curious, if not suspicious. I must say that until fairly recently I had what was a ‘naive faith in the professional impartiality of one’s colleagues’ (Lurie 1967/1991: 185). Having to deal with this issue and blow the whistle on my colleagues made me uncomfortable, and then I realized that I myself had gone through many years of denying what was literally staring me in the face. I should also mention that I have shared earlier versions of this chapter with colleagues in psychology, sociology, and the humanities. All reacted with shock and disbelief.

In recent years, the NRM research community has displayed a general agreement on a hierarchy of credibility (Becker 1967), according to which self-presentation by NRMs is epistemologically and logically superior to all outside accounts and observations. The NRM research community will give more credence to the claims of NRM members and leaders than to claims by former members, outside observers (e.g., the media), and government officials (especially law-enforcement officials). This has led, over the past twenty years, to a pattern of collaboration with NRMs, reaching its culmination, and logical conclusion, in the Aum episode reported above. I would like to look at the origins of the conceptual consensus and its derivatives. What might be the muse that has inspired it, and, if we follow the allusion to the Shakespearean sonnet, how do beauty and truth match? To use the sonnet’s idiom, I would like to suggest that what we have here is a case of the neglect of truth in favor of beauty. The happy consensual discourse, shared by colleagues whose scholarship I admire and to whom I am in debt, turns out to be, on closer examination, a rhetoric and a logic of advocacy, apologetics, and propaganda. When the advocacy and apologetics agenda defines the rhetoric and the logic of research, it creates an impoverished discourse, which denies the madness, passion, and exploitation involved in NRMs, and leads to an intellectual dead end. I would like to discuss some critical incidents which will serve as case studies in the esthetics and ethics of NRM scholarship, and then attempt some explanations for the development of the happy consensus.

The essence of the consensus has been described in a most elegant way by two leading sociologists of religion as follows: ‘The pattern of various debates and positions adopted appear to represent something of a consensus that where there is a significant erosion of traditional religious liberties, litigation is likely to turn on evidence which conflicts with the prevailing corpus of knowledge represented by the professional societies, individual and collective activism is potentially appropriate’ (Robbins and Bromley 1991: 199). The prevailing consensus has created norms around the tasks to be performed by NRM scholars and the rewards to be allocated to them. The felicitous phrasing hints at two separate issues. The first, more general, had to do with conflicts between NRMs and society; the second had to do with court cases in which the (groundless) ‘mind control’ arguments had been raised.

The assumption of a ‘significant erosion of religious liberties’ which warrants ‘activism’ seems to cover the whole planet earth. No national boundaries are mentioned, though the context is clearly the United States. The automatic reaction on the part of some scholars in the cases of David Koresh and Aum Shinrikyo proves the seriousness with which this assumption is held. If we look around us and check the status of NRMs in 1970 and today, and the level of tolerance they have enjoyed over the years, we will be forced to conclude that this erosion of religious liberties is nonexistent. Not only do we find hundreds of unmolested NRMs in North America; in Europe, Africa, and Asia, in countries which lack U.S. constitutional guarantees, NRMs are not merely surviving in the face of the supposed ‘significant erosion’ – some are thriving, a development which NRM scholars are happy to report on. It is clear that an organization such as the Solar Temple was not subject to any such ‘erosion’ in Europe or Canada, despite the attention it received from anti-NRM groups and the criminal activities of its members. Globally, it is the rise in tolerance which has enabled NRMs to expand and survive. Intolerance is very much part of life everywhere, but today its impact is being checked by economic and cultural globalization.

There has been little explicit discussion of scholarly activism and collaborationism. One leading scholar has issued a striking appeal for greater activism which goes even beyond collaboration, when he stated:

I propose that we become religious engineers … Sociologists of religion are among the most ethical and high-minded of scholars, and there is no reason why they should not apply their knowledge to the creation of new religions. The world needs them. We have roles to play as consultants to existing new religions, helping them solve problems that our research has permitted us to understand. But we must be prepared to launch cults of our own invention, a task I admit is both hazardous to one’s own welfare and outrageous in the eyes of people who refuse to admit that all religions are human creations. But it is far better for honest religious engineers to undertake the creation of new religions for the sake of human betterment than to leave the task to madmen and wealth-hungry frauds. (Bainbridge 1993: 289)

This suggestion from an eminent scholar runs counter to the party line. Here a possible conflict between the role of researcher and that of consultant is simply eliminated. As far as I know, this call has gone largely unheeded. NRM scholars have continued to serve as consultants to the same NRMs that Bainbridge regards as being led by ‘madmen and wealth-hungry frauds,’ but they have refrained from starting their own religions.

‘Activism’ in Action – The Consensus

The Aum Shinrikyo incident can only be explained by assuming that an overall ideological commitment is involved. What we witness is a sentiment of solidarity with NRMs. Some scholars seem to be saying, paraphrasing John F. Kennedy in 1961: ‘We will go anywhere, make any sacrifice’ to defend NRMs in trouble. Looking at the history of collaboration with NRMs over the past thirty years takes us from the curious to the bizarre and then to the disgraceful performance described earlier. But it all started with benevolence and mutual curiosity, as researchers and NRMs were eyeing each other. The consensus started developing back in the 1970s, when some NRMs were fighting for recognition and legitimacy, and were eager to gain academic allies. Some NRM researchers willingly obliged, joined their side, and supported their claims. The mere fact of being recognized as a religious movement worthy of study seemed like an achievement for some groups. For other groups, the ‘religious’ label was crucial. As Greil (1996: 49) has suggested, being considered a religious movement is ‘a cultural resource over which competing interest groups may vie …’ giving ‘privileges associated in a given society with the religious label.’ Moreover, ‘the right to the religious label is a valuable commodity’ (52). By applying the religion label consistently and generously to any group that asked for it, NRM scholars provided support that was not forthcoming from any other quarters. Not only is the ‘religion’ label worth money; so is support from sociologists who offer PR advice (see below). Harnessing scholarly prestige and enlisting scholars for unpleasant court cases was in the groups’ clear interest.

When we take a closer look at the practices involved, we must realize that the issue is not our relations with groups we have done research on, but most often the collaboration with groups we have never done any research on. Normally, when we do research on human behavior, the issue may be one of empathy, sympathy, or the lack of both. Here we have not only sympathy and empathy, but complete identification. What we have is a practice of collaboration and an ideology of collaborationism, a perceived and experienced symbiosis, an alliance of mutual dependence. It seems that the strong and total identification of scholars with NRMs or groups claiming to be NRMs is deeply tied to their own identity as scholars. This description of the way collaborationist scholars feel is based on their own statements.

NRM PR Campaigns
What is PR? PR differs from the direct approach to selling products that goes under the heading of advertising. Here the action is subtle: selling people on something intangible, such as an image or a reputation, and doing it gently, imperceptibly. Bits of information are manipulated to shape public perceptions and public opinion. The language of public relations includes such products (sold by PR agencies) as ‘reputation protection systems,’ vulnerability assessment, re-branding, ‘image rescue,’ and crisis management. We can speculate that if a religious movement decides that it is faced with an ‘image problem,’ a natural response would be a religious one (e.g., praying to the gods for an improvement). What can we say about an NRM (or a group claiming such a label) which engages in a PR campaign, sometimes with paid help from professionals? Just that it probably feels the need for reputation protection and crisis management, and that it is quite atypical. Most NRMs have never engaged in such efforts, and have never tried to enlist scholars in their PR efforts. What is also clear is that in those rare cases in which NRMs have chosen to start PR campaigns, scholars were enthusiastic in their collaboration. We cannot generalize about NRMs, and any theoretical generalization about NRMs must be an empty abstraction, but it is clear that most NRMs do not spend money on publicity campaigns. At the same time, generalizing about NRM scholars is quite easy. When publicity campaigns have been engaged in, these scholars have come to the aid of the groups initiating them. The problem, as has become evident over the past few years, is that PR has become professionalized, while scholars have remained rank amateurs. The age of professionalism demands much more than academic imprimaturs.

The Unification Church in Search of Respectability: A Case Study
The ‘Moonies’ were the first to appreciate the value of having professors on their side. Beginning in the 1970s, they organized a variety of front organizations and held numerous conferences, of which the best known were the Unity of Science conferences (International Conference on the Unity of the Sciences, ICUS). At these conferences academics from all over the world met to discuss what united them, which was obviously the readiness to accept a free trip, no questions asked. Most academics attending the conferences were not religion scholars, but came from all fields. Those who had a scholarly interest in religion were clearly aware of their worth to the Moonies in the coin of legitimacy and respectability. There was criticism of academics who were ready to provide recognition to the Moonies by attending the conferences (Horowitz 1978), but these critical voices were decisively ignored by NRM scholars. There is a red thread that connects the cozy relationship with the Moonies in the 1970s and the events of the 1990s. In 1977, an organization calling itself Americans for the Preservation of Religious Liberty (APRL) was founded, one of many front organizations financed by groups seeking legitimacy as religions. The same groups that started it – the Unification Church, The Family, and Scientology – have remained the most active in spending money and cultivating contacts with the NRM research community. One does not need access to the financial records of these groups to realize that many millions of dollars have been spent to achieve legitimacy and goodwill for such groups, through numerous front organizations, publications, public conferences, and less public transfers of funds. Has this spending been justified? This kind of cost-benefit analysis has to be done by the groups involved, using their own criteria. We can only state here that the money spent has been highly correlated with the level of friendly contacts with scholars. One may wonder about the financial resources which enable some NRMs, and not others, to spend many millions of dollars on publicity efforts. In terms of spending, the Unification Church and Scientology are probably the leaders, followed by The Family.

The Family (formerly Children of God) and Scholars
The case of The Family PR campaign is quite interesting. It seems that in the early 1990s, David Moses Berg decided that his group was in need of a PR face-lift in the form of an image change. (We can speculate that what was involved was not just a superficial PR change, but a change of heart as well, as Berg was removed from active leadership.) Contacts were developed with the NRM research community, and money was spent on publications, ‘media houses’ (i.e., the only Family dwellings which outsiders may enter), mass mailings, and visits to academic conferences. The amounts spent must have been significant. One may note that spending by The Family has brought about much in the way of goodwill and positive testimonials: favorable writings, one film in 1994, appearances at scholarly conferences, and public statements in support of the group at such conferences. I myself attended a presentation of the film The Love Prophet and the Children of God at the 1998 meeting of the SSSR (Society for the Scientific Study of Religion). The film was followed by a discussion in which three scholars who had benefited from The Family’s largesse spoke. Only one disclosed a financial relationship on that occasion. The Family is of course much different from Aum Shinrikyo, but the principle is the same: what Shils called ‘publicists and political activities’ (1978: 183).

The historical continuity in the collaborationist network is quite evident. Some of the same scholars who were happy to enjoy the hospitality of the Unification Church in the 1970s are happy to appear in a videotape for The Family in 1994. What is most impressive is that no group is ever refused the scholarly imprimatur, and the usual suspects are always there. In the case of the Family video, the scholars giving their imprimatur were Bryan Wilson, Eileen Barker, Charlotte Hardman, Lawrence Lilliston, Gary Shepherd, Anson Shupe, J. Gordon Melton, James Lewis, and Evelyn Oliver. We run across the same names in the Church Universal and Triumphant (CUT) (Lewis and Melton 1994a) and The Family (Lewis and Melton 1994b) volumes, as well as in other NRM public relations efforts. To the best of my knowledge, most of these scholars have never done any research on the Family. The problem is, of course, that the usual suspects are our distinguished colleagues, those who define the field of NRM research.

An Operative Consensus
It seems that the operative consensus started forming in the late 1970s, and was well in place by the early 1980s. Leading scholars in the field decided to take a stand in the propaganda war over the legitimacy and reputation of certain NRMs (or groups claiming to be NRMs), and to work together with them in order to give them much-needed public support. It was felt that in the struggle for legitimacy, anything perceived as weakening the public stand and harming the public image of NRMs should be avoided. Public opinion is formed in many ways, and influenced by the deliberate efforts of government, the media, political leaders, cultural productions, educational institutions, and scholarship. In this case scholars tried their hand at influencing events outside the ivory tower. A defensive discourse has grown to protect any seeming indiscretion or transgression. Fifty years from now, when the archives are opened up and private letters read, historians will be better able to answer the questions raised here and to explain the development of the late twentieth-century consensus among NRM scholars. In the meantime we can work only on the basis of public documents, but from time to time confidential documents see the light of day and provide additional insights into the behavior of our colleagues.

To illustrate and discuss the ideology of collaborationism we are going to look at a couple of pages from the memo written by Jeffrey K. Hadden on 20 December 1989. This memo has been widely circulated and can be found on the Internet, but I thought it worthwhile to present it here. It is significant that this memo was sent to numerous colleagues, and was not kept secret. The author’s assumption was that there was nothing to hide, because of the overwhelming support for his point of view. I know that some of our colleagues do prefer a no-name policy, and would just want me to say ‘a prominent sociologist of religion.’ I have used this kind of language before, but today I have decided that I must use full names, because scholars should be held accountable for their actions. One important reason to look at this text is that Jeffrey Hadden is by no means a marginal figure. Some of his colleagues have been trying to tell me that he is some kind of a loose cannon … At the same time, Hadden has not directly researched some of the groups he is willing to defend.

This emblematic document reports on a series of meetings and activities involving several leading NRM scholars (two of whom have been presidents of the Society for the Scientific Study of Religion), NRM attorneys, NRM leaders, and some other scholars. Many future plans are discussed, most of which never materialized. The agenda and the commitments expressed are very clear. The memo proves, beyond a shadow of a doubt, that there exist not only behind-the-scenes contacts between scholars and NRMs, but also a coordinated effort on the part of leading NRM scholars to work with NRMs. What is striking is the clear sense in which the leading members of the NRM research network regarded NRMs as allies, not subjects of study. Three distinguished NRM scholars met privately with NRM representatives, not to discuss a research project or to interview them, but to discuss what was presented as a common cause. We find here the essence of collaborationism: the notion that scholars and NRMs are allies, and that there is no role conflict between being an ally and being a researcher.

It seems that the scholars were more eager than the NRMs to lead the fight for NRM legitimacy. ‘Our meetings with the members of the Unification Church confirmed our earlier impressions that while they may assent to the value of a long-range strategy for dealing with the anticultist and their forensic consultants, their response is very substantially confined to ad hoc responses to crises. I pressed them on the question of whether it might be possible for the UC in collaboration with several other NRMs to raise a significant amount of money – no strings attached – to an independent group, which in turn would entertain proposals and fund research on NRMs.’ NRMs were less than enthusiastic, the writer thought, and ‘The cooperative funding of the American Conference on Religious Freedom would appear to be about as far as they are prepared to go at this time’ (Hadden 1989: 4). The ideology of collaborationism, as revealed here, is somewhat paternalistic. We know what’s good for the NRMs, which cannot recognize their own best interest. And it is in their own best interest to defend every single group claiming to be an NRM.

In addition to the idea of creating an NRM-funded research organization, ‘… we spent a good deal of time considering whether the time might be right to import… INFORM, or create a U.S. organization that would perform a similar function … In spite of having some bad experiences with the media … INFORM has taken a very significant step in neutralizing anticult movements in the U.K.’ (Hadden 1989: 5). Over the years, INFORM has been accused of acting as an NRM apologist (e.g., Usher 1997), and this memo seems to confirm some of the suspicions. Another striking aspect of the activities discussed in the memo is the orientation towards litigation: ‘On the issue of the value of research and litigation, our legal consultant … was not particularly sanguine about the prospects of social scientists coming up with findings that would be of great value. In so many words, he told us that the most important think [sic] we could do is prepare a statement that refutes the claim that social science can be helpful. I interpreted this as the agnostic statement we discussed in Salt Lake. Which brings us back to the question of a resolution for ASA Council consideration’ (Hadden 1989: 6). We have not seen any similar documents written since 1989, but the public record over the past few years does indicate that leading NRM scholars have been involved in litigation as expert witnesses, consultants, or authors of amicus curiae briefs, always on the side of NRMs. These leading scholars became available to those groups that were interested in their support and services. The NRM advocacy consensus led to a broad propaganda effort, and had a significant effect on research. It looks as if anti-NRM groups became a major interest, and for some scholars more effort went into studying anti-NRM groups than into studying NRMs themselves. This started quite early on (Shupe and Bromley 1980; Shupe, Bromley, and Oliver 1984).

Three years after the meeting reported in the 1989 memorandum, in 1992, the Association of World Academics for Religious Freedom (AWARE), which described itself (Lewis 1994a: 94) as ‘… an information centre set up to propagate objective information about non-traditional religions,’ came on the scene. Each and every NRM scholar undoubtedly considers him- or herself an information centre propagating objective information about non-traditional religions, so there must be some real reasons for the creation of another such centre. ‘The primary goal of AWARE is to promote intellectual and religious freedom by educating the general public about existing religions and cultures, including, but not limited to, alternative religious groups … AWARE’s orientation is scholarly and non-sectarian; the organization is not religious and does not have a religious ideology to propagate … AWARE also educates the scholarly community and the general public about the severe persecution that religious and cultural minorities experience in the United States as well as abroad, and to support the United States government in its efforts to heal the prejudice that exists in our country and in the world’ (214).

AWARE, led by James R. Lewis, has become a contractor for operations that no longer bear any resemblance to research. One symptomatic product of the post-Waco NRM consensus is the Lewis volume titled From the Ashes: Making Sense of Waco (1994a). It is a typical apologetic pamphlet: a collection of 47 statements, authored by 46 individuals and 3 groups. Of the 46 individuals, 34 hold a PhD degree, and 19 are recognized NRM scholars. One cannot claim that this collection of opinion-pieces is unrepresentative of the NRM research network; quite the contrary. Most of the top scholars are here. The most significant fact is the participation by so many recognized scholars in this propaganda effort. In addition to From the Ashes we now have Church Universal and Triumphant in Scholarly Perspective (Lewis and Melton 1994a) and Sex, Slander, and Salvation: Investigating the Children of God / The Family (Lewis and Melton 1994b). The last two are clearly made-to-order PR efforts (with a few scholarly papers which got in by honest mistakes on the part of both authors and editors). The Family and Church Universal and Triumphant were interested in academic character witnesses, and many NRM scholars were happy to oblige.

Balch and Langdon (1996) provide an inside view of how AWARE operates by offering a report on the fieldwork, if such a term can be used, which led to the AWARE 1994 volume on CUT (Lewis and Melton 1994a). What is described is a travesty of research, much worse than anybody could have imagined: a real sellout by recognized NRM scholars. Among the contributors to The Family volume we find Susan J. Palmer, James T. Richardson, David G. Bromley, Charlotte Hardman, Massimo Introvigne, Stuart A. Wright, and John A. Saliba. The whole NRM research network is involved: the names we have known over the past thirty years, individuals with well-deserved reputations, lend their support to this propaganda effort. There must be some very good reasons (or explanations, at least) for this behavior. The PR documents produced for groups such as Church Universal and Triumphant or The Family are but extreme examples of the literature of apologetics which has dominated NRM research for many years. Another aspect of these cases is that the reporting of financial arrangements is less than truthful. The fact that CUT financed the whole research expedition to Montana is not directly reported; we learn instead that CUT provided only room and board, while AWARE covered all other costs (Lewis 1994b). The fact that The Family volume was financed by the group itself is never reported anywhere, although it is clear to the reader that the whole project was initiated by Family leaders (Lewis 1994c). The Family volume has been recognized for what it is: a propaganda effort, pure and simple, paid for by the group (Balch 1996).

Lewis (1994c: vii), reporting on The Family, states that he ‘found the young adults to be balanced, well-integrated individuals, and the children to be exceptionally open and loving.’ Reporting on the Church Universal and Triumphant, Lewis (1994b: ix) states that he ‘found the adults to be balanced, well-integrated individuals, and the children to be exceptionally bright and open.’ As a copywriter, James Lewis clearly leaves something to be desired, but then so many copywriters have used the innocence of children to sell us anything from carpeting to laundry detergents. As these lines are being written, another volume of scholarly propaganda is being produced by the Lewis-Melton team, with the help of some distinguished scholars. The subject is the group known as MSIA (Movement of Spiritual Inner Awareness), led by John-Roger. Readers can expect another encounter with well-integrated adults and exceptionally open children. But what is the intended audience for such publications? Who is being addressed? In the case of the Lewis-Melton volumes, it is quite clear that the intended audience is not strictly academic. The non-academic audience in mind is revealed through the press-release style of the introductions. That audience is very clearly the leaders of CUT and The Family, those who are paying for the product.

The issue of the conflict of interest created by financial support from corporations is one of the hottest in academic life today. The conflict is obvious to all except NRM researchers, who claim to be immune to it, and who suggest that receiving money does not affect their activities or commitments in any way (e.g., Barker 1983). Such claims certainly sound strange when coming from social scientists. The norm of reciprocity is considered one of the building blocks of all human interaction, and it is clear that gifts are nearly always given with an expectation of reciprocity in mind (Arrow 1972). This norm, observed by most humans and conceptualized by some of the greatest names in sociology and anthropology (Gouldner 1960; Homans 1974; Levi-Strauss 1965; Mauss 1954; Titmuss 1970), is suddenly ignored when the time comes to explain the behavior of real sociologists in the real world. Early theorists (Exod. 23: 8; Deut. 16: 19) were just as decisive in their views concerning the possible effects of gifts on scholarly judgment: ‘And thou shalt take no gift: for the gift blindeth the wise, and perverteth the words of the righteous.’ This ancient opinion, possibly 2,500 years old, has not been heeded.

The Great Denial
The problem with the new dogma, and the new Inquisition guarding it, is that it obviously stifles curiosity and is likely to have a debilitating impact on research. The practice of apologetics is tied, causally or not, to the Great Denial. It starts with an emphatic denial of the pain, suffering, and hardships which characterize not only NRMs, but much of human existence. Pathology, deception, and exploitation can be found in many human interactions. In NRMs they may be more or less prevalent, but they are surely present. Does the literature reflect that? Not very often. What we deny is the madness, the oppression, and the deception, which create the totalitarian nature of many NRMs. The Great Denial offers those who accept it a sanitized, euphemistic, and sterilized view of humanity, an exclusion of tragedy, passion, sadness, frustration, brutality, and death. This is the great fear. It goes without saying that psychopathology and deception are excluded as explanations of any acts or events in NRMs.

How do we explain religious conversion? Conversion is an exceptional (some may say anomalous) behavior, occurring in a tiny proportion of religious believers. In old religions it has become rare; in new religions it is even rarer. It is also a mostly modern phenomenon, but even in modern times this behavior is highly unusual (Argyle and Beit-Hallahmi 1975; Beit-Hallahmi and Argyle 1997). Exceptional acts may be connected to exceptional situations, populations, personalities, or personality traits (cf. Anthony and Robbins 1997). Various motivations have been proposed to account for conversion, including serious psychopathology (e.g., borderline conditions, schizophrenia, depression) or vulnerability in the seeker (Beit-Hallahmi 1992a), as well as ‘normal’ psychodynamic factors of narcissism, dependency needs, or counterdependent impulses. Some of the individuals involved may be judged by clinicians to be ‘deeply disturbed,’ ‘seriously paranoid,’ or ‘having an immature ego.’ It has often been argued, not only in the case of religion but also in the case of other significant ideologies, that early psychic wounds and the need for healing and compensation are the engines behind strong personal commitment. The question is that of judging whether the individual resorts to adaptive or maladaptive coping strategies, and whether the cognitive processes involved may be abnormal. Not too surprisingly, psychiatrists, psychoanalysts, and clinical psychologists have tended to mention biographical factors and psychopathology more often. Thus, Laing (1959) regarded those who would join NRMs as engaged in a desperate defence against psychotic breakdown. Similar suggestions have been made by Linn and Schwartz (1958) and Schimel (1973). Observations regarding psychopathology in NRM members, referring to groups by name, have been appearing since the first half of this century (Bender and Spalding 1940; Yarnell 1957).

The Great Denial starts with denying the reality of distress in the seekers who join NRMs. While sociologists ignore psychopathology as an explanation for any behavior in their formal writings, they refer to it in the real world. In most of the writings, conversion becomes a rational, intellectual process of creating (or acquiring) a conscious ideology. There are no personal or personality factors. No love, madness, or pathology is encountered on the road to salvation. It should be noted that the use of psychopathology labels is never an explanation in itself; it is only a starting point and a guide to further analysis. Indeed, the language of psychopathology is too limited even to describe the reality of life in groups such as Aum Shinrikyo or Heaven’s Gate.

Viewing a cognitive search for a differing world view as the basis for the process of conversion is a starting point which creates a sterilized view of NRMs. The Great Denial naturally continues with descriptions of the internal workings of NRMs as virtually untouched by any pathology, and by any motives except religious beliefs. There is little emotion or psychological upheaval. There is no drama, only abstract social forces and conscious decisions by individuals. The deprivations and tragedies of the real world are left behind. The Black women who were the majority of the victims at Jonestown, and a significant minority at Waco, are ignored by NRM researchers (with some welcome exceptions; cf. Anthony and Robbins 1997), who should be most alert to their presence.

Any discussion in terms of psychopathology or deception, it is feared, will stigmatize all NRMs and cause incalculable damage to their public image. Terms having to do with psychodynamics, psychopathology, or psychoanalysis frighten sociologists. But we are not talking about psychosis, about pathologizing everybody, or about ‘medicalization.’ Psychoanalysis is a theory of ‘normal’ universal motives: we are all subject to the same pressures and needs. We all have narcissistic ideas and apocalyptic dreams – various needs leading to what Charles Rycroft (1991) has called pathological idealization. It is possible to hold on desperately to this pathological idealization in order to avoid disillusionment. In everyday life, and in the lyrics of the blues, we call that crazy love.

The denial of madness, so common among sociologists, is the denial of crazy love. The followers of David Koresh and Marshall Applewhite were not just following a totalitarian ideology. They were madly in love with their leaders. The denial of madness in general is tied to the denial of pathology in groups, but the pathology is visible to all, except NRM researchers. Marshall Applewhite was crazy, as was David Koresh. This has to be noted, though it is not a complete explanation of either group’s behavior. How did a madman become a group leader? We have to look at his followers, few in number, whose pathology miraculously matched his. Why did Applewhite castrate himself, with six of his disciples following suit? Rich (1998) suggested that this was a radical and total reaction to homosexual urges. That this insight came from a journalist demonstrates again the poverty of scholarly discussion of the reality of NRMs, so often filled with horrors and anxieties. And this reality isn’t new. Applewhite was a late twentieth-century version of George Rapp (1757-1847), founder of the Harmony Society, who castrated himself and his own son, and forced several of his followers to be castrated. And the idea of castration and genital mutilation is nothing new in the history of religion.

Defending NRMs and denying psychopathology in the seekers are connected. Denying evil deeds ascribed to NRMs is tied to denying the stigma of pathology in NRM members. The important idea of social constructivism may be taken too far, towards a radical critique which suggests (à la Michel Foucault) that all pathology (physical or mental) is a social construction, another instrument of oppression in the service of an evil establishment. Unfortunately, schizophrenia is not a social construction, and can be found in all human societies at a fairly constant rate of around 1 in 100. Other serious psychopathologies add to this prevalence rate, which means that the human psyche is less resilient than we would wish to believe. As a result, in every generation and in every culture, a significant minority of humanity does reach a point of self-destructive regression in its attempts to cope with reality.

Nevertheless, the denial goes on. Some NRM researchers have suggested that mental health professionals in the United States serve as social control agents in suppressing NRMs, through ‘the medicalization of deviant religious behavior’ (Hadden 1990; Robbins and Anthony 1982). Richardson (1994: 36) stated that the ‘… effort to “medicalize” participation in newer religious groups has been relatively successful.’ And in the bluntest presentation of this thesis, Hadden (1990) compared the actions of these professionals to Soviet psychiatrists acting against political dissidents, and to Nazi atrocities. He even went on to suggest that ‘the time is right for the United States to be visited by an international team of psychiatrists and social scientists for the purpose of investigating the role of American mental health professionals in their dealings with New Religious Movements’ (13). Such a research team would easily discover that the vast majority of mental health professionals in the United States (and elsewhere) have little interest in NRMs, and even less knowledge. The ‘medicalization of NRMs’ never took place, and Hadden himself admits in a footnote that ‘the proportion who are actively engaged in anticult activities [sic] is almost certainly very small’ (20). As we all know, it was a mere handful of psychiatrists, and a few clinical psychologists, who were ever active in looking at NRMs and their members. Benign neglect is the attitude of 99 per cent of the world’s mental health professionals towards NRMs, for two simple reasons: first, NRMs involve a small minority of the general population, and a minority of mental health clients; second, these professionals are extremely individualistic in orientation. Despite the fact that religion (old or new) and psychotherapy offer individuals competing models for conceiving of meaning and personal destiny (Beit-Hallahmi 1992a; Kilbourne and Richardson 1984), most psychotherapists seem to be happily oblivious to the existence of NRMs. The medicalization thesis is simply irrelevant to reality.

Another critical claim regarding the use of psychopathology labels suggests that there is a common anti-religious prejudice in psychiatric nosology and diagnosis (Richardson 1993). If such a prejudice does exist, it has not affected the research on the relationship between religiosity and psychopathology, which neither assumes nor finds this prejudicial connection (Argyle and Beit-Hallahmi 1975; Beit-Hallahmi and Argyle 1997; Lowenthal 1995). Houts and Graham (1986), among others, have demonstrated that in clinical practice religiosity is not judged as being tied to pathology; the opposite may be true (Shafranske 1997). Many researchers point instead to the therapeutic effects of religiosity, especially in the case of NRMs (Beit-Hallahmi and Argyle 1997). Early in this century it was a Viennese physician named Sigmund Freud who noted that it was not ‘… hard to discern that all the ties that bind people to mystico-religious or philosophico-religious sects and communities are expressions of crooked cures of all kinds of neuroses’ (Freud 1921: 142), and whether the cures are temporary or permanent is indeed a researchable issue. The opposite of the therapeutic effect is the escalation of madness within the group, mentioned above in reference to Koresh and Applewhite. Assessing psychopathology in the seekers who join NRMs is problematic, because it requires the use of control groups as well as retrospective and prospective biographical assessment. Comparing seekers and converts to matched controls, similar in all other respects save conversion, has rarely been reported (Beit-Hallahmi and Nevo 1987; Ullman 1989). In retrospective life-history interviews, however, one can assess such things as drug addiction, prior hospitalizations, bereavement, and other relevant events, which can then be verified. Much better than any biographical reconstruction is prospective research, in which a cohort of individuals is assessed at a starting point and then followed up over time. Life events, such as joining an NRM, are recorded as they happen.

The study by Stone (1992) is one of a handful of relevant prospective studies. It traced the histories of 520 mostly borderline and schizophrenic patients, hospitalized between 1963 and 1976, following them up ten to twenty-five years after discharge. This study was not designed to look at religious involvements, and it came as a surprise to the researcher that 7 of the 520 individuals had converted to an old religion, while 25 had joined NRMs, for a total conversion rate of 6 per cent. This confirms many anecdotal observations and some systematic surveys (Beit-Hallahmi and Argyle 1997) about the involvement of former mental patients in NRMs. Stone’s (1992) interpretation of the data, by the way, is wholly sympathetic to religion, NRMs, and their therapeutic benefits for psychiatric patients.
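For clarity, the 6 per cent figure follows directly from the counts just cited; this is a simple arithmetic check, not part of Stone’s own presentation:

\[
\frac{7 + 25}{520} \;=\; \frac{32}{520} \;\approx\; 0.062,
\]

or roughly 6 per cent of the cohort.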

‘So Many Rascals?’: Deception, Money, and the Great Denial

An issue that is even more sensitive than that of psychopathology in NRMs is a rarely discussed aspect of charisma: the reality of cynical manipulation on the part of NRM entrepreneurs, hustlers engaged in self-invention in the service of greed. Charisma is, among other things, an odourless and colourless substance that enables con artists to prey on wealthy widows, and religious hustlers to exploit those seeking, and finding, salvation. You can be both deranged and a hustler, as shown by Luc Jouret. He taught people that there was more to life than money, and that to prove their triumph over materialism they had to turn over their money to him, a little like David Koresh carrying the burden of sexuality all by himself for all (male) members of the Branch Davidians. In S., a roman à clef by John Updike (1988) about the Rajneesh movement, a similar point is made about material possessions: they are clearly a burden, and the group is ready to shoulder this burden for its members by asking them to sign over their financial assets.

In response to a question about the motivation of ‘gurus,’ Idries Shah, a man who has been often called a guru, says: ‘Some are frankly phonies, and they don’t try to hide it from me. They think I am one too, so when we meet they begin the most disturbing conversations. They want to know how I get money, how I control people, and so on … They actually feel there is something wrong with what they are doing, and they feel better if they talk to somebody else who is doing it’ (Hall 1975: 53). Are NRM researchers being conned by NRM entrepreneurs, or are they in reality willing shills for the con men (and women) at the top (cf. Zablocki 1997)? There are just a few references in the NRM literature to deception, manipulation, and greed.

What I have personally observed in some NRMs are two aspects of financing which create a great deal of opposition but are rarely discussed in the NRM literature. The first is deception in the solicitation of funds in public places. I can recall four incidents over twenty-five years, during which I have observed members of ISKCON (International Society for Krishna Consciousness) using false claims to collect money from innocent members of the public. The first was in Colorado in 1975, the second on the New York City subway in 1986, the third at the Los Angeles Airport in 1991, and the fourth was on my own campus in Haifa in 1998. On three occasions the claim was made that funds were being collected for homeless children, and on one occasion for ‘retired priests.’ The second aspect of financing which creates much opposition is the large amounts of money paid by NRM members as membership fees. In the case of one NRM I know, Emin, it has been observed that the group’s demands for money have forced group members to become highly responsible and productive members of society. Most outside observers, however, have still regarded the exorbitant sums as evidence of cynical exploitation.

We may want to differentiate between tactical and strategic deception in groups claiming the NRM label. When a member of ISKCON (in street clothes) approaches us and asks for a donation for a shelter housing homeless children, we may dismiss the incident, if we wish, as simply instrumental or tactical deception. It is simply a way of getting money out of kind strangers, and it works better than getting into a discussion about the Vedas. It is also a non-threatening entrée to a recruitment effort. Some have regarded this kind of deception as trivial. I do not, because the way a group presents itself to outsiders must be symptomatic. One becomes suspicious of groups or individuals who choose to present themselves to the world fraudulently even on selected occasions. But beyond these behaviors, encountered in public places in many cities around the world, there is also the spectre of strategic deception, in which the very existence of a group presenting itself as an NRM is a way of getting money out of kind strangers (some of whom become members). The notion of strategic deception is quite common in non-academic discussions of NRMs, but in academic discourse it is never considered, though there are references there to tactical deception. Zaretsky and Leone (1974) described ‘spiritual leaders’ as the last frontier for entrepreneurs in the ‘helping’ professions, as no diplomas were necessary, but the concept of the entrepreneur has been absent from recent NRM literature. Samuel L. Clemens, an observer of the U.S. scene in the nineteenth century, reported on the lucrative practices of ‘workin’ camp-meetin’s, and missionaryin’ around’ (Twain 1884/1965: 107). Little seems to have changed since then, and such practices can still earn you a decent living.

NRM researchers give all groups claiming the religion label their imprimatur, and it is quite interesting to note that no group has ever been refused this seal of approval. As far as I know, there has never been a case where a claim to be recognized as an NRM was rejected by scholars. The one axiom uniting all researchers (at least publicly) is that all groups that claim to be NRMs are indeed NRMs. The possibility of strategic deception is never considered by NRM researchers, even when the commercial nature of a group’s operations is highlighted by the use of registered trademarks and official claims of ‘trade secrets’ (Behar 1991; Passas and Castillo 1992). A naive observer, upon encountering the web site of such a group (for example, MSIA), with its slick merchandising, its open use of the language of ‘goods and services,’ and its offerings of books, courses, booklets, workshops, trinkets, and videotapes, may think that the site represents a commercial enterprise. It takes a sophisticated NRM scholar, with a PhD and numerous publications, to figure out that behind this commercial glimmer hides the Sacred, with a capital S. Most of us will keep wondering why a religious movement feels it has to hide behind a commercial facade. Is it because of persecution?

Someone once said that the only two sure things in life are death and taxes. All religions promise us eternal life, and thus deal with the first existential issue facing humanity. Some so-called NRMs are doing something about the second, seemingly insurmountable, issue of taxes. How can one assume that among thousands of groups worldwide there is not a single case of strategic deception? It takes some faith, and yet this logically improbable assumption underlies the reality of granting legitimacy to any group wishing to enjoy the benefits of tax-free operations. In one of the most memorable scenes from one of the greatest films of all time, The Third Man (1949, directed by Carol Reed, screenplay by Graham Greene), on the Ferris wheel above Vienna, Harry Lime (played by Orson Welles) proclaims what I have dubbed ‘Harry Lime’s First Law of Business Dynamics’: ‘Free of income tax, old man, free of income tax. The only way you can save money nowadays.’ Harry Lime’s Law has been put to good use by such moral paragons as Roy Cohn, and one must wonder whether it has occurred to anybody in the world of tax-free NRMs. Barker (1991: 11) noted the ‘considerable economic advantages to be gained from being defined as a religion,’ but has not suggested that this may motivate any specific NRMs or their leaders. Considerable economic advantages are also to be gained from the support offered by NRM researchers to groups under fire, and these groups may show their appreciation by rewarding the researchers with some of their profits.

I have always thought that Anton Szandor LaVey’s Church of Satan, which was actually liquidated as part of his divorce, was a commercial enterprise, and it was not included in my reference works on NRMs (Beit-Hallahmi 1993, 1998). When LaVey died in November 1997 there was at least one obituary, by someone with an intimate knowledge of the man, that described the Church of Satan as ‘a cash cow’ (Cabal 1997: 16). LaVey apparently did not inspire much respect, nor did he seek it. Anton Szandor LaVey was an operator. He made up his biography as a lion tamer and carnival barker (Knipfel 1998) and did not try to hide the fact that he was just out to make a living by attacking conventional wisdom. In terms of faith he was clearly an atheist, and yet such an eminent scholar as Melton (1989), in the most authoritative source on religions in the United States, was ready to take seriously, and to list as a ‘religion,’ the Church of Satan and its various offshoots.

In the real world, far from NRM scholars, numerous works of fiction (and some nonfiction) portray NRM founders and leaders as cynical hustlers and fakes. Adams (1995: 50) states: ‘The career of Aimee Semple McPherson provided new chapters in the history of American hypocrisy, and Bruce Barton taught his readers to combine (however improbably) the principles of Jesus Christ with those of hucksterism.’ This kind of analysis is a great U.S. cultural tradition. Henry James, an early critic of NRMs, does a good job of mocking both NRMs and the ‘New Age’ (yes, New Age) in The Bostonians (James 1886/1956). As we know, many cultural precedents regarding both NRMs and their critics (such as Mark Twain) were set in the nineteenth century, in the absence of academic researchers. The literary tradition of mockery and critique directed at both old and new religions seems to be alive a century later, without much effect on academics. Elmer Gantry (Lewis 1927) seems to be reborn and technologically amplified in the shape of modern televangelists. Reborn, a 1979 grade-B film by Bigas Luna, describes a joint scheme by the mafia and a televangelist, and gives a backstage view of televangelism as hustling.

In a New Yorker article discussing the positive role of inner-city churches in the United States in saving youngsters from a life of crime and self-destruction, a call is made for greater support of these churches. In addition to much well-deserved agreement from several observers of the inner-city scene, a cautionary note is sounded by Representative Tony Hall, described as ‘a Democrat who has been active with religious charities.’ ‘We do have to be careful,’ he says. ‘There are some real thieves in this business’ (Klein 1997: 46). Gates (1989) referred, in passing, to no less a figure than Father Divine as ‘that historic con man of the cloth’ (44). This assessment radically challenges social science and history of religion accounts (cf. Burnham 1979; Cantril 1941; Fauset 1944; Harris 1971; Parker 1937; Weisbrot 1983). Another observer stated that ‘… a Spiritualist cult is a house of religious prostitution where religion is only the means for the end of commercialization’ (Washington 1973: 115). The U.S. NRM known as the United Church and Science of Living Institute, founded in 1966 by former Baptist minister Frederick Eikerenkoetter II, also known as Reverend Ike, is widely regarded as a con game. Most members are African Americans. In these instances, then, the reality of hustling is well acknowledged in the life of poor African Americans (cf. Baer 1981). We must wonder about its prevalence in other groups which cater to clients in search of religious salvation.

But then Martin Marty (1987) called Rajneesh a con man, and the Attorney General of Oregon said that the Rajneesh organization had committed the number-one crimes in several categories in the history of the United States: the largest network of fraudulent marriages, the most massive scheme of wiretapping, and the largest mass poisoning (Gordon 1987). Updike (1988), in his thinly veiled roman à clef, gives us a description of Rajneesh as a lecherous hustler preying on the easy marks of the Western bourgeoisie, and provides a glossary of Hindu terms, in case any reader wants to start an ashram somewhere. The novel is a literary failure, but probably a reliable ethnography. If we look at the world of NRMs according to Updike or Mehta (1990), we must ask ourselves whether they are such unsophisticated observers, or whether they are more sophisticated than the crowds of scholars who have observed the same world and found only idealism and the sacred. Why is there such a distance between the non-academic literary observers and NRM researchers? The former always describe a combination of sincere illusions (and delusions) together with the cynical manipulation of easy marks.

Even NRM researchers must be aware of the existence of groups that owe their survival to fraudulent claims of miracle cures. The group known as the Cultural Minorities Spiritual Fraternization Church of the Philippines, founded by Philip S. Malicdan in the 1960s, is a business which sells ‘psychic surgery’ to gullible (and terminally ill) foreigners. And then there is a book titled 218 Tax Havens (Kurtz 1993), which offers advice, according to its publisher, on ‘What it takes to set up your own church or – even better – open up an independent branch of an existing church in California – no matter in what part of the world you’re living – and subsequently do all your business under the tax number of that church (DO-1599959) – thus staying completely tax free’ (Privacy Reports 1993). Where did Mr Kurtz get these ideas?

Prediction and Retrodiction

‘The prediction/retrodiction may be “testable” in some sense, i.e., experiment or future observations may test the prediction, the acquisition of additional information about the past may test the retrodiction’ (Saperstein 1997: 7). Recent NRM tragedies allow us to test the consensus through retrodictions. In these cases we have more data, and we can perform an in-depth examination. Here tragedy becomes an opportunity. When disaster strikes, the backstage is exposed: not just the tragic ending itself, but the whole mechanics and history of the group, its hidden anatomy and physiology. The tragedies offer us a unique vantage point, that of the behavioral autopsy. Following each of them we get more information about the inner workings of NRMs, and sometimes we obtain almost complete histories. Just as in a medical autopsy, these cases can demonstrate where observers went wrong in their diagnosis and treatment.

The brief history of the Order of the Solar Temple by Hall and Schuyler (1997) ties us immediately to the real world of salvation hustlers and their victims. Here members of the francophone bourgeoisie, in scenes which could have been staged by Luis Buñuel and Federico Fellini, seek escape by any means from their lives of quiet desperation. There is not only demand but also some supply, and various con artists are right there to take advantage of the growing market. Hall and Schuyler (289) cite Max Weber, who wrote that Joseph Smith ‘may have been a very sophisticated swindler.’ In the case of the Solar Temple, we may be dealing with less sophisticated swindlers, but they could still get away with a lot. Its two leaders, Joseph Di Mambro and Luc Jouret, had between them a wide repertoire of fraudulent practices, from bad checks to homeopathy. The official belief system of the group, combining claims about ‘ancient Egypt,’ ‘energy fields,’ reincarnation, and the ‘Age of Aquarius,’ is so widely offered in hundreds of groups all over the world (cf. Beit-Hallahmi 1992a) as to be banal and harmless.

Most of those who make a living marketing this well-worn merchandise will never commit violence. But this was a high-involvement group, not just a series of lectures. By the late 1980s, the Solar Temple was a target for anti-NRM groups. In 1993, it became a target for police attention (on illegal-weapons charges) and for sensational media reports in both Canada and Australia. In July 1993 Jouret and two associates received light sentences from a judge in Quebec for their attempts to buy pistols with silencers. The early warnings were not heeded. The most sensational media reports, calling the Solar Temple a ‘doomsday cult,’ turned out to be on the mark. It is possible that Di Mambro, terminally ill, wanted to take as many with him as he could. It is clear that many of the dead at this going-away party were murdered, some for revenge, while others were willing victims. Of course, Jouret and Di Mambro were more than just hustlers. It is likely that they actually believed in at least some of the ideas they propagated (Mayer 1999).

There are other NRM disasters which remain less well known, even though they involve real horrors. A case in point is that of the Faith Assembly, led by Hobart E. Freeman. In this group, 100 members and their children were reported to have died between 1970 and 1990 because of their refusal to seek medical care. Freeman himself, a scholar of Hebrew and Greek and the author of ten books which were well received within the evangelical Christian community, has been described as schizophrenic, but he was able to control his followers and persuade them to continue risking their own and their children’s lives. He died in 1984, but his ideas live on among some of his followers (Beit-Hallahmi 1998; Hughes 1990). Freeman’s victims are less known than those of other deranged leaders, but some of his followers were convicted and sent to prison for what some observers called infanticide. Richardson and Dewitt (1992: 561) note that public opinion ‘… seems to favor protection of children over parental rights and freedom of religion,’ and it certainly should. The protection of powerless children must take priority over any other consideration.

Another NRM tragedy which has received little notice is the story of Osel Tendzin (formerly Thomas Rich of Passaic, N.J.), the Vajra Regent who died in 1990 after knowingly infecting some of his followers with HIV. He is rarely mentioned in studies of this group (cf. Beit-Hallahmi 1998; Cartwright and Kent 1992; Rubin 1997). The story is quite astounding and deserves close scrutiny, even though the NRM involved refuses to undertake any, and remembers Osel Tendzin only in his status as a great teacher, ‘Radiant Holder of the Teachings.’ Osel Tendzin became leader of Vajradhatu in 1987, upon the death (from cirrhosis of the liver) of Chogyam Trungpa, who had appointed him as successor. Both leaders shared the habit of enjoying sex and alcohol with followers. Osel Tendzin had known since 1985 that he was infected with HIV, and this was known also to Chogyam Trungpa and to at least two other members of the Vajradhatu board. In December 1988, the world, together with the group’s members, learned of Osel Tendzin’s trail of death by sex (Butler 1990). He died in 1990, still nominally the leader, surrounded by admiring disciples. The case of Osel Tendzin provides not quite an autopsy, but a dramatic backstage glimpse. Here not only a Buñuel and a Fellini but also an Ingmar Bergman and an Orson Welles would have been required to stage the scenes. There are less tragic cases in which no deaths are involved, but criminal activities and various scandals provide openings into backstage activities. In the case of the Love Family (Balch 1988) no lives were lost, but some lives were misspent, wasted, and destroyed.

Melton and Moore (1982: 171) criticized anti-NRM sources as ‘shallow and full of errors.’ Following the Waco tragedy we have much knowledge about everyday life among the Branch Davidians, just as we have gained some insider views of the Solar Temple and Heaven’s Gate. Not only that, but it is now quite clear that without these horrendous tragedies we would never have known about the reality of backstage life among the Branch Davidians or in the Solar Temple. Recent historical-behavioral autopsies enable us to realize that, in case after case, allegations by hostile outsiders, critics, and detractors have been closer to the reality than any other accounts. The party line has been that ‘… defectors are involved in either conscious or unconscious self-serving behavior’ (Richardson 1980: 247), as opposed to members and leaders, who are totally selfless. Ever since Jonestown, statements by ex-members have turned out to be just as accurate as, or more accurate than, those of apologists and NRM researchers. While it is true that organizations (and individuals) look less than perfectly appetizing with their entrails exposed, the reality revealed in the cases of the People’s Temple, the Nation of Yahweh, Rajneesh International, the Branch Davidians, the Solar Temple, or Heaven’s Gate is much more than unattractive; it is positively horrifying, confirming our worst nightmares.

Each one of these cases may be unrepresentative, deviating from the norm in the majority of NRMs, but the cumulative record must be considered. Aum Shinrikyo was an aberration, and the Solar Temple was an aberration, and David Koresh was an aberration, and Osel Tendzin was an aberration, and so on and on. But the aberrations are piling up, and we do have a representative sample, because the groups involved come from differing backgrounds and traditions. One group is in Japan; another is a branch of a mainstream Protestant denomination; another is a schismatic millenarian group; another is UFO-oriented; one is Tibetan Buddhist; and another is Rosicrucian. Our overall failure in retrodiction is tied to our bias and ignorance, and the NRM consensus seems totally out of touch with the realities uncovered in these autopsies.

In every case of NRM disaster over the past fifty years, starting with Krishna Venta (Beit-Hallahmi 1993), we encounter a hidden reality of madness and exploitation, a totalitarian, psychotic reality which is actually worse than the detractors’ allegations. The dynamics we discover in these social movements are not just those of ideological and organizational totalism, but of totalitarianism and fascism. Exposing the inside workings of an NRM reveals a leader even more deranged than anybody could have imagined, at the head of a small-scale dictatorship similar to the well-known dictatorships of the twentieth century. The questions we ask about these groups should be similar to the ones raised about the historical totalitarian regimes of the twentieth century (Adorno et al. 1950; Fromm 1941). We are now better informed about the leaders who have been exposed, but this invites some sobering thoughts about leaders in groups whose inner workings are still hidden from sunlight. Of course, most religious organizations are undemocratic by definition (and by claimed revelation), but in recent times not all have been totalitarian dictatorships.

Accuracy in the Mass Media and in Scholarly Writings

The moral of recent well-publicized NRM tragedies, such as the Branch Davidians case or the Heaven’s Gate case, as far as reporting and analysis are concerned, is that reading Newsweek, Time, or the New York Times may be just as profitable as reading scholarly works, or more so. Media reporting in general is quite limited, but investigative reporting by major media, when time and effort are put in, and wire-service reports are worth at least taking seriously. There are sometimes real errors in the media, but this happens in academic works too. Reading the scholarly literature, with a few notable and commendable exceptions, does not result in a better understanding of what the Branch Davidians at Waco were all about. The most basic human questions are left unanswered. Newsweek reported quite accurately on who David Koresh really was, and at the same time was severely critical of U.S. government authorities and their outrageous mishandling of the case. If you were to rely just on Newsweek to understand what happened at Waco, you would reach the conclusion that a group of deranged victims, led by a manipulative madman, was decimated by superior federal firepower in a senseless and tragic confrontation ignited by the illegal acts of some group members. Would you miss out on some fine points of scholarly analysis? Not if such analysis consisted of a scholarly defence of indefensible behavior. One way of looking at the Mt Carmel tragedy is by comparing it to the many tragic confrontations between deranged individuals and insensitive or unaware members of police forces around the world. The only acceptable defence for David Koresh may be the insanity defence, and this is something you will gather soon enough from the media, which have proven themselves just as accurate in the cases of Krishna Venta, Benjamin Purnell, Aum Shinrikyo, the People’s Temple, the Solar Temple, and Heaven’s Gate. As we have demonstrated in the cases of the Branch Davidians and Aum Shinrikyo, sensational media reports never die; they don’t even fade away. They just come back reincarnated as scholarly data, to be interpreted or euphemized.

Note

I wish to thank L.B. Brown, Yoram Carmeli, Avner Falk, Mark Finn, Maxine Gold, A.Z. Guiora, Stephen A. Kent, Roslyn Lacks, Michael Langone, James R. Lewis, J. Gordon Melton, Jean-François Mayer, Dan Nesher, Tom Robbins, Leonard Saxe, Israel Shahak, Zvi Sobel, Roger O’Toole, and Benjamin Zablocki for generous help and advice. Whatever faults remain in this chapter are the author’s sole responsibility.

References

Adams, R.M. 1995. ‘Wonderful Town?’ The New York Review of Books, 20 April.

Adorno, T., et al. 1950. The Authoritarian Personality. New York: Harper & Row.

Alper, G. 1997. The Dark Side of the Analytic Moon. Bethesda, Md.: International Scholars Publications.

Anthony, D., and T. Robbins. 1997. ‘Religious Totalism, Exemplary Dualism, and the Waco Tragedy.’ In Millennium, Messiahs, and Mayhem, edited by T. Robbins and S.J. Palmer. New York: Routledge.

Argyle, M., and B. Beit-Hallahmi. 1975. The Social Psychology of Religion. London: Routledge & Kegan Paul.

Arrow, K.J. 1972. ‘Gifts and Exchanges.’ Philosophy and Public Affairs 1: 343-62.

Baer, H.A. 1981. ‘Prophets and Advisors in Black Spiritual Churches: Therapy, Palliative, or Opiate?’ Culture, Medicine and Psychiatry 5: 145-70.

Bainbridge, W.S. 1993. ‘New Religions, Science and Secularization.’ Religion and the Social Order 3A: 277-92.

Balch, R.W. 1988. ‘Money and Power in Utopia: An Economic History of the Love Family.’ In Money and Power in the New Religions, edited by J.T. Richardson. Lewiston, N.Y.: Edwin Mellen.

Balch, R.W. 1996. Review of Sex, Slander, and Salvation: Investigating the Children of God/The Family, edited by J.R. Lewis and J.G. Melton. Journal for the Scientific Study of Religion 36: 35, 72.

Balch, R.W., and S. Langdon. 1996. ‘How Not to Discover Malfeasance in New Religions: An Examination of the AWARE Study of the Church Universal and Triumphant.’ University of Montana, unpublished.

Barker, E. 1983. ‘Supping with the Devil: How Long a Spoon Does the Sociologist Need?’ Sociological Analysis 44: 197-205.

— 1991. ‘But Is It a Genuine Religion?’ Report from the Capitol (April): 10-11, 14.

Becker, H.S. 1967. ‘Whose Side Are We On?’ Social Problems 14: 239-47.

Behar, R. 1991. ‘The Thriving Cult of Greed and Power.’ Time, 6 May, 32-9.

Beit-Hallahmi, B. 1974a. ‘Salvation and Its Vicissitudes: Clinical Psychology and Political Values.’ American Psychologist 29: 124-9.

— 1974b. ‘Psychology of Religion 1880-1930: The Rise and Fall of a Psychological Movement.’ Journal of the History of the Behavioral Sciences 10: 84-90.

— 1975. ‘Encountering Orthodox Religion in Psychotherapy.’ Psychotherapy: Theory, Research and Practice 12: 357-9.

— 1976. ‘On the “Religious” Functions of the Helping Professions.’ Archiv für Religionspsychologie 12: 48-52.

— 1977a. ‘Humanistic Psychology – Progressive or Reactionary?’ Self and Society 12: 97-103.

— 1977b. ‘Curiosity, Doubt and Devotion: The Beliefs of Psychologists and the Psychology of Religion.’ In Current Perspectives in the Psychology of Religion, edited by H.N. Malony. Grand Rapids, Mich.: Eerdmans Publishing.

— 1981. ‘Ideology in Psychology: How Psychologists Explain Inequality.’ In Value Judgment and Income Distribution, edited by R. Solo and C.H. Anderson. New York: Praeger.

— 1987. ‘The Psychotherapy Subculture: Practice and Ideology.’ Social Science Information 26: 475-92.

— 1989. Prolegomena to the Psychological Study of Religion. Lewisburg, Pa.: Bucknell University Press.

— 1991. ‘More Things in Heaven and Earth: Sharing the Psychoanalytic Understanding of Religion.’ International Journal of the Psychology of Religion 1: 91-3.

— 1992a. Despair and Deliverance: Private Salvation in Contemporary Israel. Albany, N.Y.: SUNY Press.

— 1992b. ‘Between Religious Psychology and the Psychology of Religion.’ In Object Relations Theory and Religion: Clinical Applications, edited by M. Finn and J. Gartner. New York: Praeger.

— 1993. The Annotated Dictionary of Modern Religious Movements. Danbury, Conn.: Grolier.

— 1995. ‘Ideological and Philosophical Bases for the Study of Personality and Intelligence.’ In International Handbook of Personality and Intelligence, edited by D.H. Saklofske and M. Zeidner. New York: Plenum.

— 1996a. ‘Religion as Pathology: Exploring a Metaphor.’ In Religion, Mental Health, and Mental Pathology, edited by H. Grzymala-Moszczynska and B. Beit-Hallahmi. Amsterdam: Rodopi.

— 1996b. Psychoanalytic Studies of Religion: Critical Assessment and Annotated Bibliography. Westport, Conn.: Greenwood Press.

— 1998. The Illustrated Encyclopedia of Active New Religions. New York: Rosen Publishing.

Beit-Hallahmi, B., and M. Argyle. 1977. ‘Religious Ideas and Psychiatric Disorders.’ International Journal of Social Psychiatry 23: 26-30.

— 1997. The Psychology of Religious Behaviour, Belief, and Experience. London: Routledge.

Beit-Hallahmi, B., and B. Nevo. 1987. ‘“Born-again” Jews in Israel: The Dynamics of an Identity Change.’ International Journal of Psychology 22: 75-81.

Bender, L., and M.A. Spalding. 1940. ‘Behavior Problems in Children from the Homes of Followers of Father Divine.’ Journal of Nervous and Mental Disease 91: 460-72.

Burnham, K.E. 1979. God Comes to America. Boston: Lambeth.

Butler, K. 1990. ‘Encountering the Shadow in Buddhist America.’ Common Boundary (May/June): 14-22.

Cabal, A. 1997. ‘The Death of Satan.’ New York Press (12-18 November).

Cantril, H. 1941. The Psychology of Social Movements. New York: Wiley.

Cartwright, R.H., and S.A. Kent. 1992. ‘Social Control in Alternative Religions: A Familial Perspective.’ Sociological Analysis 53: 345-61.

Fauset, A.H. 1944. Black Gods of the Metropolis. Philadelphia: University of Pennsylvania Press.

Freud, S. 1921. ‘Group Psychology and the Analysis of the Ego.’ The Standard Edition of the Complete Psychological Works of Sigmund Freud 18: 69-143.

Fromm, E. 1941. Escape from Freedom. New York: Rinehart.

Gates, H.L., Jr. 1989. ‘Whose Canon Is It, Anyway?’ New York Times Book Review, 26 February.

Gordon, J.S. 1987. The Golden Guru: The Strange Journey of Bhagwan Shree Rajneesh. Lexington, Mass.: Stephen Greene Press.

Gouldner, A. 1960. ‘The Norm of Reciprocity: A Preliminary Statement.’ American Sociological Review 25: 161-78.

Greil, A.L. 1996. ‘Sacred Claims: The “Cult Controversy” as a Struggle over the Right to the Religious Label.’ Religion and the Social Order 6: 47-63.

Hadden, J.K. 1989. ‘Memorandum’ (20 December).

— 1990. ‘The Role of Mental Health Agents in the Social Control of Religious Deviants: A Comparative Examination of the U.S.S.R. and the United States.’ Paper prepared for presentation at a conference on Religion, Mental Health, and Mental Pathology. Cracow, Poland (December 1990).

Hall, E. 1975. ‘A Conversation with Idries Shah.’ Psychology Today (July).

Hall, J.R., and P. Schuyler. 1997. ‘The Mystical Apocalypse of the Solar Temple.’ In Millennium, Messiahs, and Mayhem, edited by T. Robbins and S.J. Palmer. New York: Routledge.

Harris, S. 1971. Father Divine. New York: Macmillan.

Haworth, A. 1995. ‘Cults: Aum Shinrikyo.’ The Guardian (14 May).

Homans, G.C. 1974. Social Behavior: Its Elementary Forms. New York: Harcourt Brace Jovanovich.

Horowitz, I.L., ed. 1978. Science, Sin, and Scholarship: The Politics of Reverend Moon and the Unification Church. Cambridge, Mass.: MIT Press.

Houts, A.C., and K. Graham. 1986. ‘Can Religion Make You Crazy? Impact of Client and Therapist Values on Clinical Judgments.’ Journal of Consulting and Clinical Psychology 54: 267-71.

Hughes, R.A. 1990. ‘Psychological Perspectives on Infanticide in a Faith Healing Sect.’ Psychotherapy 27: 107-15.

James, H. 1886/1956. The Bostonians. New York: Modern Library.

Keita, L. 1998. The Human Project and the Temptations of Science. Amsterdam: Rodopi.

Kilbourne, B., and J.T. Richardson. 1984. ‘Psychotherapy and New Religions in a Pluralistic Society.’ American Psychologist 39: 237-51.

Klein, J. 1997. ‘In God They Trust.’ New Yorker, 16 June.

Knipfel, J. 1998. ‘The Devil, You Say!’ New York Press (4-10 Nov.).

Kowner, R. 1997. ‘On Ignorance, Respect and Suspicion: Current Japanese Attitudes towards Jews.’ Acta No. 11. Jerusalem: Vidal Sassoon International Center for the Study of Antisemitism, Hebrew University of Jerusalem.

Kurtz, G. 1993. 218 Tax Havens. Hong Kong: Privacy Reports.

Laing, R.D. 1959. The Divided Self. London: Tavistock.

Levi-Strauss, C. 1965. ‘The Principle of Reciprocity.’ In Sociological Theory, edited by L.A. Coser and B. Rosenberg. New York: Macmillan.

Lewis, J.R. 1994b. ‘Introduction: On Tolerance, Toddlers, and Trailers: First Impressions of Church Universal and Triumphant.’ In Church Universal and Triumphant in Scholarly Perspective, edited by J.R. Lewis and J.G. Melton. Stanford, Calif.: Center for Academic Publication.

— 1994c. ‘Introduction.’ In Sex, Slander, and Salvation: Investigating the Children of God/The Family, edited by J.R. Lewis and J.G. Melton. Stanford, Calif.: Center for Academic Publication.

— 1995. ‘Self-Fulfilling Stereotypes: The Anticult Movement, and the Waco Confrontation.’ In Armageddon in Waco, edited by S.A. Wright. Chicago: University of Chicago Press.

Lewis, J.R., ed. 1994a. From the Ashes: Making Sense of Waco. Lanham, Md.: Rowman & Littlefield.

Lewis, J.R., and J.G. Melton, eds. 1994a. Church Universal and Triumphant in Scholarly Perspective. Stanford, Calif.: Center for Academic Publication.

— 1994b. Sex, Slander, and Salvation: Investigating the Children of God / The Family. Stanford, Calif.: Center for Academic Publication.

Lewis, S. 1927. Elmer Gantry. New York: Harcourt Brace.

Linn, L., and L.W. Schwartz. 1958. Psychiatry and Religious Experience. New York: Random House.

Lowenthal, K.M. 1995. Mental Health and Religion. London: Chapman & Hall.

Lurie, A. 1967/1991. Imaginary Friends. New York: Avon Books.

Marty, M. 1987. Religion and Republic. Boston: Beacon.

Mauss, M. 1954. The Gift. Glencoe, Ill.: The Free Press.

Mayer, J.F. 1999. ‘Les chevaliers de l’apocalypse: L’Ordre du Temple Solaire et ses adeptes.’ In Sectes et Société, edited by F. Champion and M. Cohen. Paris: Seuil.

Mehta, G. 1990. Karma Cola: Marketing the Mystic East. New York: Vintage.

Melton, J.G. 1989. Encyclopedia of American Religions. Detroit: Gale.

Melton, J.G., and R. Moore. 1982. The Cult Experience. New York: Pilgrim Press.

Mullins, M.R. 1997. ‘Aum Shinrikyo as an Apocalyptic Movement.’ In Millennium, Messiahs, and Mayhem, edited by T. Robbins and S.J. Palmer. New York: Routledge.

Parker, R.A. 1937. The Incredible Messiah. Boston: Little, Brown.

Passas, N., and M.E. Castillo. 1992. ‘Scientology and Its “Clear” Business.’ Behavioral Sciences and the Law 10: 103-16.

Pastore, N. 1949. The Nature-Nurture Controversy. New York: King’s Crown Press.

Privacy Reports, Inc. 1993. How the Rich Get Richer: Their Secrets Revealed (flyer).

Reid, T.R. 1995. ‘U.S. Visitors Boost Cause of Japanese Cult.’ Washington Post, 9 May.

Reader, I. 1995. ‘Aum Affair Intensifies Japan’s Religious Crisis: An Analysis.’ Religion Watch (July/August): 1-2.

Reuters News Service. 1998. ‘Life Sentence to Aum Member for the Poison Gas Attack.’ 27 May.

Rich, F. 1998. ‘Lott’s Lesbian Ally.’ New York Times, 22 July, A19.

Richardson, J.T. 1980. ‘People’s Temple and Jonestown: A Corrective Comparison and Critique.’ Journal for the Scientific Study of Religion 19: 239-55.

— 1993. ‘Religiosity as Deviance: Negative Religious Bias and Its Misuse of the DSM-III.’ Deviant Behavior 14: 1-21.

— 1994. ‘Update on “The Family”: Organizational Change and Development in a Controversial New Religious Group.’ In Sex, Slander, and Salvation: Investigating the Children of God/The Family, edited by J.R. Lewis and J.G. Melton. Stanford, Calif.: Center for Academic Publication.

Richardson, J.T., and J. Dewitt. 1992. ‘Christian Science, Spiritual Healing, the Law, and Public Opinion.’ Journal of Church and State 34: 549-61.

Robbins, T. 1983. ‘The Beach Is Washing Away.’ Sociological Analysis 44: 207-13.

Robbins, T., and D. Anthony. 1982. ‘Deprogramming, Brainwashing: The Medicalization of Deviant Religious Groups.’ Social Problems 29: 283-97.

Robbins, T., and D.G. Bromley. 1991. ‘New Religious Movements and the Sociology of Religion.’ Religion and Social Order 1: 183-205.

Rubin, J.B. 1997. ‘Psychoanalysis Is Self-Centered.’ In Soul on the Couch: Spirituality, Religion, and Morality in Contemporary Psychoanalysis, edited by C. Spezzano and G. Garguilo. Hillsdale, N.J.: Analytic Press.

Rycroft, C. 1991. Psychoanalysis and Beyond. London: Hogarth.

Saperstein, A.M. 1997. ‘Dynamical Modeling in History.’ Physics and Society 26, no. 4: 7-8.

Schimel, J.L. 1973. ‘Esoteric Identification Processes in Adolescence and Beyond.’ The Journal of the American Academy of Psychoanalysis 1: 403-15.

Shafranske, E. 1997. Religion and the Clinical Practice of Psychology. Washington, D.C.: APA.

Shils, E. 1978. ‘The Academic Ethos.’ The American Scholar (spring): 165-90.

Shupe, A., and D. Bromley. 1980. The New Vigilantes: The Anti-Cult Movement in America. Beverly Hills: Sage.

Shupe, A., D.G. Bromley, and D. Oliver. 1984. The Anti-Cult Movement in America: A Bibliographical and Historical Survey. New York: Garland.

Spengler, O. 1926. The Decline of the West. New York: Knopf.

Stone, M.H. 1992. ‘Religious Behavior in the Psychiatric Institute 500.’ In Object Relations Theory and Religion: Clinical Applications, edited by M. Finn and J. Gartner. Westport, Conn.: Praeger.

Titmuss, R. 1970. The Gift Relationship. London: Allen & Unwin.

Twain, M. 1884/1965. The Adventures of Huckleberry Finn. New York: Harper & Row.

Ullman, C. 1989. The Transformed Self: The Psychology of Religious Conversion. New York: Plenum.

Updike, J. 1988. S. New York: Fawcett Crest.

Usher, R. 1997. ‘Cult Control.’ Time (27 January).

Washington, J., Jr. 1973. Black Sects and Cults. Garden City, N.Y.: Doubleday.

Weisbrot, R. 1983. Father Divine and the Struggle for Racial Equality. Urbana: University of Illinois Press.

Weisstub, D.N. 1998. Research on Human Subjects. Amsterdam: Pergamon.

Yarnell, H. 1957. ‘An Example of the Psychopathology of Religion: The Seventh Day Adventist Denomination.’ Journal of Nervous and Mental Disease 125: 202-12.

Zablocki, B. 1997. ‘Distinguishing Front-Stage from Back-Stage Behavior in the Study of Religious Communities.’ Paper presented at the annual meeting of the Society for the Scientific Study of Religion, San Diego.

Zaretsky, I.I., and M.P. Leone, eds. 1974. Religious Movements in Contemporary America. Princeton: Princeton University Press.




5. Towards a Demystified and Disinterested Scientific Theory of Brainwashing

Benjamin Zablocki

 

Nobody likes to lose a customer, but religions get more touchy than most when faced with the risk of losing devotees they have come to define as their own. Historically, many religions have gone to great lengths to prevent apostasy, believing virtually any means justified to prevent wavering parishioners from defecting and thus losing hope of eternal salvation. In recent centuries, religion in our society has evolved from a system of territorially based near-monopolies into a vigorous and highly competitive faith marketplace in which many churches, denominations, sects, and cults vie with one another for the allegiance of ‘customers’ who are free to pick and choose among competing faiths. Under such circumstances, we should expect to find that some of the more tight-knit and fanatical religions in this rough-and-tumble marketplace will have developed sophisticated persuasive techniques for holding on to their customers. Some of the most extreme of these techniques are known in the literature by the controversial term ‘brainwashing.’ This chapter is devoted to a search for a scientific definition of brainwashing and an examination of the evidence for the existence of brainwashing in cults. I believe that research on this neglected subject is important for a fuller understanding of religious market dynamics.(1) And, ultimately, research on this subject may yield a wider dividend as well, assisting us in our quest for a fuller understanding of mass charismatic movements such as Fascism, Nazism, Stalinism, and Maoism.

Do We Need to Know Whether Cults Engage in Brainwashing?

The question of why people obey the sometimes bizarrely insane commands of charismatic leaders, even unto death, is one of the big unsolved mysteries of history and the social sciences. If there are deliberate techniques that charismatic leaders (and charismatically led organizations) use to induce high levels of uncritical loyalty and obedience in their followers, we should try to understand what these techniques are and under what circumstances and how well they work.

This chapter is about nothing other than the process of inducing ideological obedience in charismatic groups. Many people call this process brainwashing, but the label is unimportant. What is important is that those of us who want to understand cults develop models that recognize the importance that some cults give to strenuous techniques of socialization designed to induce uncritical obedience to ideological imperatives regardless of the cost to the individual.

The systematic study of obedience has slowed down considerably within the behavioural sciences. Early laboratory studies of obedience-inducing mechanisms got off to a promising start in the 1960s and 1970s, but were correctly criticized by human rights advocates for putting laboratory subjects under unacceptable levels of stress (Kelman and Hamilton 1989; Milgram 1975; Zimbardo 1973). Permission to do obedience-inducing experiments on naive experimental subjects became almost impossible to obtain, and these sorts of laboratory experiments virtually ceased. However, large numbers of charismatic cultic movements appeared on the scene just in time to fill the vacuum left by the abandoned laboratory studies. As naturally occurring social ‘experiments,’ such groups allowed obedience-induction to be studied ethnographically without provoking the ethical objections that had been raised concerning laboratory studies.

Social theorists are well aware that an extremely high degree of obedience to authority is a reliably recurring feature of charismatic cult organizations (Lindholm 1990; Oakes 1997). But most social scientists interested in religion declined this opportunity. For reasons having more to do with political correctness than scientific curiosity, most of them refused to design research focused on obedience-induction. Many even deny that deliberate programs of obedience-induction ever occur in cults.

The existence of a highly atypical form of obedience to the dictates of charismatic leaders is not in question. Group suicides at the behest of a charismatic leader are probably the most puzzling of such acts of obedience (Hall 2000; Lalich 1999; Weightman 1983), but murder, incest, child abuse, and child molestation constitute other puzzling examples for which credible evidence is available (Bugliosi and Gentry 1974; Lifton 1999; Rochford 1998). Moreover, the obedience reported is not limited to specific dramatic actions or outbursts of zeal. Less dramatic examples of chronic long-term ego-dystonic behaviour,(2) such as criminal acts, abusive or neglectful parenting, and promiscuous sexual behaviour, have also been documented (Carter 1990; Hong 1998; Layton 1998; Rochford 1998; Williams 1998). However, agreement on these facts is not matched, as we shall see, by agreement on the causes of the obedience, its pervasiveness among cult populations, or the rate at which it decays after the influence stimuli are removed.

But given the fact that only a small proportion of the human population ever join cults, why should we care? The answer is that the sociological importance of cults extends far beyond their numerical significance. Many cults are harmless and fully deserving of protection of their religious and civil liberties. However, events of recent years have shown that some cults are capable of producing far more social harm than one might expect from the minuscule number of their adherents. The U.S. State Department’s annual report on terrorism for the year 2000 concludes that ‘while Americans were once threatened primarily by terrorism sponsored by states, today they face greater threats from loose networks of groups and individuals motivated more by religion or ideology than by politics’ (Miller 2000:1).

In his recent study of a Japanese apocalyptic cult, Robert Jay Lifton (1999: 343) has emphasized this point in the following terms:

Consider Asahara’s experience with ultimate weapons … With a mad guru and a few hundred close followers, it is much easier to see how the very engagement with omnicidal weapons, once started upon, takes on a psychological momentum likely to lead either to self-implosion or to world explosion … Asahara and Aum have changed the world, and not for the better. A threshold has been crossed. Thanks to this guru, Aum stepped over a line that few had even known was there. Its members can claim the distinction of being the first group in history to combine ultimate fanaticism with ultimate weapons in a project to destroy the world. Fortunately, they were not up to the immodest task they assigned themselves. But whatever their bungling, they did cross that line, and the world will never quite be the same because, like it or not, they took the rest of us with them.

Potentially fruitful scientific research on obedience in cultic settings has been stymied by the well-intentioned meddling of two bitterly opposed, but far from disinterested, scholarly factions. On the one hand, there has been an uncompromising outcry of fastidious naysaying by a tight-knit faction of pro-religion scholars. Out of a fear that evidence of powerful techniques for inducing obedience might be used by religion’s enemies to suppress the free expression of unpopular religions, the pro-religion faction has refused to notice the obvious and has engaged in a concerted (at times almost hysterical) effort to sweep under the rug any cultic-obedience studies not meeting impossibly rigorous controlled experimental standards (Zablocki 1997). On the other hand, those scholars who hate or fear cults have not been blameless in the pathetic enactment of this scientific farce. Some of them have tried their best to mystically transmute the obedience-inducing process that goes on in some cults from a severe and concentrated form of ordinary social influence into a magic spell that somehow allows gurus to snap the minds and enslave the wills of any innocent bystander unlucky enough to come into eye contact. By so doing, they have marginalized themselves academically and provided a perfect foil for the gibes of pro-religion scholars.

Brainwashing is the most commonly used word for the process whereby a charismatic group systematically induces high levels of ideological obedience. It would be naively reductionistic to try to explain cultic obedience entirely in terms of brainwashing. Other factors, such as simple conformity and ritual, induce cultic obedience as well. But it would be an equally serious specification error to leave deliberate cultic manipulation of personal convictions out of any model linking charismatic authority to ideological obedience.

However, the current climate of opinion, especially within the sociology of new religious movements, is not receptive to rational discussion of the concept of brainwashing, and still less to research in this area. Brainwashing has for too long been a mystified concept, and one that has been the subject of tendentious writing (thinly disguised as theory testing) by both its friends and enemies. My aim in this chapter is to rescue for social science a concept of brainwashing freed from both mystification and tendentiousness. I believe it is important and long overdue to restore some detachment and objectivity to this field of study.

The goal of achieving demystification will require some analysis of the concept’s highly freighted cultural connotations, with particular regard to how the very word brainwash became a shibboleth in the cult wars. It is easy to understand how frightening it may be to imagine that there exists some force that can influence one down to the core level of basic beliefs, values, and worldview. Movies like The Manchurian Candidate have established in the popular imagination the idea that there exists some mysterious technique, known only to a few, that confers such power. Actually, as we will see, the real process of brainwashing involves only well-understood processes of social influence orchestrated in a particularly intense way. It still is, and should be, frightening in its intensity and capacity for extreme mischief, but there is no excuse for refusing to study something simply because it is frightening.

The goal of establishing scientific disinterest will require the repositioning of the concept more fully in the domain of behavioural and social science rather than in its present domain, which is largely that of civil and criminal legal proceedings. It is in that domain that it has been held hostage and much abused for more than two decades. The maxim of scholarly disinterest requires the researcher to be professionally indifferent as to whether confidence in any given theory (always tentative at best) is increased or decreased by research. But many scholarly writers on this subject have become involved as expert witnesses, on one side or the other, in various law cases involving allegations against cult leaders or members (where witnesses are paid to debate in an arena in which the only possible outcomes are victory or defeat). This has made it increasingly difficult for these paid experts to cling to a disinterested theoretical perspective.

In my opinion, the litigational needs of these court cases have come, over the years, to drive the scientific debate to an alarming degree. There is a long and not especially honourable history of interest groups that are better armed with lawyers than with scientific evidence, and that use the law to place unreasonable demands on science. One need only think of the school segregationists’ unreasonable demands, fifty years ago, that science prove that any specific child was harmed in a measurable way by a segregated classroom; or the tobacco companies’ demands, forty years ago, that science demonstrate the exact process at the molecular level by which tobacco causes lung cancer. Science can serve the technical needs of litigation, but, when litigation strategies set the agenda for science, both science and the law are poorer for it.

My own thirty-six years of experience doing research on new religious movements has convinced me beyond any doubt that brainwashing is practised by some cults some of the time on some of their members with some degree of success. Even though the number of times I have used the vague term some in the previous sentence gives testimony to the fact that there remain many still-unanswered questions about this phenomenon, I do not personally have any doubt about brainwashing’s existence. But I have also observed many cults that do not practise brainwashing, and I have never observed a cult in which brainwashing could reasonably be described as the only force holding the group together. My research (Zablocki 1971; 1991; 1996; Zablocki and Aidala 1991) has been ethnographic, comparative, and longitudinal. I have lived among these people and watched the brainwashing process with my own eyes. I have also interviewed people who participated in the process (both as perpetrators and subjects). I have interviewed many of these respondents not just one time but repeatedly over a course of many years. My selection of both cults and individuals to interview has been determined by scientific sampling methods (Zablocki 1980: app. A), not guided by convenience nor dictated by the conclusions I hoped to find. Indeed, I have never had an axe to grind in this field of inquiry. I didn’t begin to investigate cults in the hope of finding brainwashing. I was surprised when I first discovered it. I insist on attempting to demonstrate its existence not because I am either for or against cults but only because it seems to me to be an incontrovertible, empirical fact.

Although my own ethnographic experience leads me to believe that there is overwhelming evidence that brainwashing is practised in some cults, my goal in this chapter is not to ‘prove’ that brainwashing exists, but simply to rescue it from the world of bogus ideas to which it has been banished unfairly, and to reinstate it as a legitimate topic of social science inquiry. My attempt to do so in this chapter will involve three steps. First, I will analyse the cultural misunderstandings that have made brainwashing a bone of contention rather than a topic of inquiry. Second, I will reconstruct the concept in a scientifically useful and empirically testable form within the framework of social influence theory. Third, I will summarize the current state of evidence (which seems to me to be quite compelling) that some cults do in fact engage in brainwashing with some degree of success.

Cultural Contention over the Concept of Brainwashing

That Word ‘Brainwashing’

The word brainwashing is, in itself, controversial and arouses hostile feelings. Since there is no scientific advantage in using one word rather than another for any concept, it may be reasonable in the future to hunt around for another word that is less polemical. We need a universally recognized term for a concept that stands for a form of influence manifested in a deliberately and systematically applied traumatizing and obedience-producing process of ideological resocialization.

Currently, brainwashing is the generally accepted term for this process, but I see no objection to finding another to take its place. There are in fact other terms, historically, that have been used instead, like ‘thought reform’ and ‘coercive persuasion.’ Ironically, it has been those scholars who complain most about ‘the B-word’ who have also been the most insistent that none of the alternatives is any better. As long as others in the field insist on treating all possible substitute constructions as nothing more than gussied-up synonyms for a mystified concept of brainwashing (see, for example, Introvigne 1998: 2), there is no point as yet in trying to introduce a more congenial term.

An overly literal reading of the word brainwashing (merely a literal translation of the accepted Chinese term hsi nao) could be misleading, as it seems to imply the ability to apply some mysterious biochemical cleanser to people’s brains. However, the word has never been intended as a literal designator but as a metaphor. It would be wise to heed Clifford Geertz’s (1973: 210) warning in this connection, to avoid such a ‘flattened view of other people’s mentalities [that] more complex meanings than [a] literal reading suggests [are] not even considered.’

Thus, please don’t allow yourself to become prejudiced by a visceral reaction to the word instead of attending to the underlying concept. There is a linguistic tendency, as the postmodernist critics have taught us, for the signified to disappear beneath the signifier. But the empirically based social sciences must resist this tendency by defining terms precisely. The influence of media-driven vulgarizations of concepts should be resisted. This chapter argues for the scientific validity of a concept, not a word. If you are interested in whether the concept has value, but you gag on the word, feel free to substitute a different word in its place. I myself have no particular attachment to the word brainwashing.

But if all we are talking about is an extreme form of influence, why do we need a special name for it at all? The name is assigned merely for convenience. This is a common and widely accepted practice in the social sciences. For example, in economics a recession is nothing more than a name we give to two consecutive quarters of economic contraction. There is nothing qualitatively distinctive about two such consecutive quarters as opposed to one or three. The label is assigned arbitrarily at a subjective point at which many economists begin to get seriously worried about economic performance. This label is nevertheless useful as long as we don’t reify it by imagining that it stands for some real ‘thing’ that happens to the economy when it experiences precisely two quarters of decline. Many other examples of useful definitions marking arbitrary points along a continuum could be cited. There is no objective way to determine the exact point at which ideological influence becomes severe and encompassing enough, and its effects long lasting enough, for it to be called brainwashing. Inevitably, there will be marginal instances that could be categorized either way. But despite the fact that the boundary is not precisely defined, it demarcates a class of events worthy of systematic study.
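
The arbitrariness of such threshold labels is easy to make concrete. Here is a minimal sketch (in Python, with invented quarterly growth figures) of the recession convention just described; note that nothing in the underlying series changes at the point where the label starts to apply:

    def is_recession(quarterly_growth):
        # Apply the conventional label: two consecutive quarters of
        # contraction. The cutoff is a convention, not a natural kind.
        return any(quarterly_growth[i] < 0 and quarterly_growth[i + 1] < 0
                   for i in range(len(quarterly_growth) - 1))

    print(is_recession([0.8, -0.2, 0.5, 0.3]))   # False: one bad quarter
    print(is_recession([0.8, -0.2, -0.1, 0.3]))  # True: two in a row

The point of the sketch is only that the label demarcates a conventionally chosen region of a continuum; the same holds for the point at which intense ideological influence earns the label brainwashing.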

The Reciprocal Moral Panic

Study of brainwashing has been hampered by partisanship and tendentious writing on both sides of the conflict. In one camp, there are scholars who very badly don’t want there to be such a thing as brainwashing. Its nonexistence, they believe, will help assure religious liberty, which can only be procured by defending the liberty of the most unpopular religions. If only the nonexistence of brainwashing can be proved, the public will have to face up to the hard truth that some citizens choose to follow spiritual paths that may lead them in radical directions. This camp has exerted its influence within academia. But, instead of using its academic skills to refute the brainwashing conjecture, it has preferred to attack a caricature of brainwashing supplied by anticult groups for litigational rather than scientific purposes.

In the other camp, we find scholars who equally badly do want there to be such a thing as brainwashing. Its existence, they believe, will give them a rationale for opposition to groups they consider dangerous. A typical example of their reasoning can be found in the argument put forth by Margaret Singer that ‘Despite the myth that normal people don’t get sucked into cults, it has become clear over the years that everyone is susceptible to the lure of these master manipulators’ (Singer 1995: 17). Using a form of backward reasoning known as the ecological fallacy, she argues from the known fact that people of all ages, social classes, and ethnic backgrounds can be found in cults to the dubious conclusion that everyone must be susceptible. These scholars must also share some of the blame for tendentious scholarship. Lacking positions of leadership in academia, scholars on this side of the dispute have used their expertise to influence the mass media, and they have been successful because sensational allegations of mystical manipulative influence make good journalistic copy.

It’s funny in a dreary sort of way that both sides in this debate agree that it is a David and Goliath situation, but each side fancies itself to be the David courageously confronting the awesome power of the opposition. Each side makes use of an exaggerated fear of the other’s influence to create the raw materials of a moral panic (Cohen 1972; Goode and Ben-Yehuda 1994). Thus, a disinterested search for truth falls victim to the uncompromising hostility created by each side’s paranoid fear of the power of the other.

The ‘cult apologists’ picture themselves as fighting an underdog battle against hostile lords of the media backed by their armies of ‘cult-bashing’ experts. The ‘cult bashers’ picture themselves as fighting an underdog battle for a voice in academia in which apologists seem to hold all the gatekeeper positions. Each side justifies its rhetorical excesses and hyperbole by reference to the overwhelming advantages held by the opposing side within its own arena. But over the years a peculiar symbiosis has developed between these two camps. They have come to rely on each other to define their own positions. Each finds it more convenient to attack the positions of the other than to do the hard work of finding out what is really going on in cults. Thomas Robbins (1988: 74) has noted that the proponents of these two models ‘tend to talk past each other since they employ differing interpretive frameworks, epistemological rules, definitions … and underlying assumptions.’ Most of the literature on the subject has been framed in terms of rhetorical disputes between these two extremist models. Data-based models have been all but crowded out.

Between these two noisy and contentious camps, we find the curious but disinterested scientist who wants to find out if there is such a thing as brainwashing but will be equally satisfied with a positive or a negative answer. I believe that there can and should be a moderate position on the subject. Such a position would avoid the absurdity of denying any reality to what thousands of reputable ex-cult members claim to have experienced – turning this denial into a minor cousin of holocaust denial. At the same time, it would avoid the mystical concept of an irresistible and overwhelming force that was developed by the extremist wing of the anticult movement.

One of the most shameful aspects of this whole silly affair is the way pro-religion scholars have used their academic authority to foist off the myth that the concept of brainwashing needs no further research because it has already been thoroughly debunked. Misleadingly, it has been argued (Introvigne forthcoming; Melton forthcoming) that the disciplines of psychology and sociology, through their American scholarly associations, have officially declared the concept of brainwashing to be so thoroughly discredited that no further research is needed. Introvigne, by playing fast and loose with terminology, attempts to parlay a rejection of a committee report into a rejection of the brainwashing concept by the American Psychological Association. He argues that ‘To state that a report “lacks scientific rigor” is tantamount to saying that it is not scientific’ (Introvigne 1998: 3), gliding over the question of whether the ‘it’ in question refers to the committee report or the brainwashing concept.(3) Conveniently, for Introvigne, the report in question was written by a committee chaired by Margaret Singer, whose involuntarist theory of brainwashing is as much a distortion of the foundational concept as Introvigne’s parody of it.

The truth is that both of these scholarly associations (American Psychological Association and American Sociological Association) were under intense pressure by a consortium of pro-religion scholars (a.k.a. NRM scholars) to sign an amicus curiae brief alleging consensus within their fields that brainwashing theory had been found to be bunk. This was in regard to a case concerning Moonie brainwashing that was before the United States Supreme Court (Molko v Holy Spirit Association, Supreme Court of Calif. SF 25038; Molko v Holy Spirit Association, 762 P.2d 46 [Cal. 1988], cert. denied, 490 U.S. 1084 [1989]). The bottom line is that both of the associations, after bitter debate, recognized that there was no such consensus and refused to get involved. Despite strenuous efforts of the NRM scholars to make it appear otherwise, neither professional association saw an overwhelming preponderance of evidence on either side. Both went on record with a statement virtually identical to my argument in this chapter: that not nearly enough is known about this subject to be able to render a definitive scientific verdict, and that much more research is needed. A few years later, the Society for the Scientific Study of Religion went on record with a similar statement, affirming ‘the agnostic position’ on this subject and calling for more research (Zablocki 1997: 114).

Although NRM scholars have claimed to be opposed only to the most outrageously sensationalized versions of brainwashing theory, the result, perhaps unintended, of their campaign has been to bring an entire important area of social inquiry to a lengthy halt. Evidence of this can be seen in the fact that during the period 1962 to 2000, a time when cults flourished, not a single article supportive of brainwashing has been published in the two leading American journals devoted to the sociology of religion, although a significant number of such articles have been submitted to those journals and more than a hundred such articles have appeared in journals marginal to the field (Zablocki 1998: 267).

The erroneous contention that brainwashing theory has been debunked by social science research has been loudly and frequently repeated, and this ‘big lie’ has thus come to influence the thinking of neutral religion scholars. For example, even Winston Davis, in an excellent recent article on suicidal obedience in Heaven’s Gate, expresses characteristic ambivalence over the brainwashing concept:

Scholarship in general no longer accepts the traditional, simplistic theory of brainwashing … While the vernacular theory of brainwashing may no longer be scientifically viable, the general theory of social and psychological conditioning is still in rather good shape … I therefore find nothing objectionable [sic] in Benjamin Zablocki’s revised theory of brainwashing as ‘a set of transactions between a charismatically led collectivity and an isolated agent of the collectivity with the goal of transforming the agent into a deployable agent.’ The tale I have to tell actually fits nicely into several of Robert Lifton’s classical thought reform categories. (Davis 2000: 241-2)

The problem with this all too typical way of looking at things is the fact that I am not presenting some new revised theory of brainwashing but simply a restatement of Robert Lifton’s (1989, 1999) careful and rigorous theory in sociological terms.

There are, I believe, six issues standing in the way of our ability to transcend this reciprocal moral panic. Let us look closely at each of these issues with an eye to recognizing that both sides in this conflict may have distorted the scientifically grounded theories of the foundational theorists – Lifton (1989), Sargant (1957), and Schein (1961) – as they apply to cults.

The Influence Continuum

The first issue has to do with the contention that brainwashing is a newly discovered form of social influence involving a hitherto unknown social force. There is nothing about charismatic influence and the obedience it instills that is mysterious or asks us to posit the existence of any new force. On the contrary, everything about brainwashing can be explained entirely in terms of well-understood scientific principles. As Richard Ofshe has argued: ‘Studying the reform process demonstrates that it is no more or less difficult to understand than any other complex social process and produces no results to suggest that something new has been discovered. The only aspect of the reform process that one might suggest is new, is the order in which the influence procedures are assembled and the degree to which the target’s environment is manipulated in the service of social control. This is at most an unusual arrangement of commonplace bits and pieces’ (1992: 221-2).

Would-be debunkers of the brainwashing concept have argued that brainwashing theory is not just a theory of ordinary social influence intensified under structural conditions of ideological totalism, but is rather a ‘special’ kind of influence theory that alleges that free will can be overwhelmed and individuals brought to a state of mind in which they will comply with charismatic directives involuntarily, having surrendered the capability of saying no. Of course, if a theory of brainwashing really did rely upon such an intrinsically untestable notion, it would be reasonable to reject it outright.

The attack on this so-called involuntarist theory of brainwashing figures prominently in the debunking efforts of a number of scholars (Barker 1989; Hexham and Poewe 1997; Melton forthcoming), but is most closely identified with the work of Dick Anthony (1996), for whom it is the linchpin of the debunking argument. Anthony argues, without a shred of evidence that I have been able to discover, that the foundational work of Lifton and Schein, as well as my own more recent theory (Zablocki 1998) and those of Richard Ofshe (1992) and Stephen Kent (Kent and Krebs 1998), are based upon what he calls the ‘involuntarism assumption.’ It is true that a number of prominent legal cases have hinged on the question of whether the plaintiff’s free will had been somehow overthrown (Richardson and Ginsburg 1998). But nowhere in the scientific literature has there been such a claim. Foundational brainwashing theory has not claimed that subjects are robbed of their free will. Neither the presence nor the absence of free will can ever be proved or disproved. The confusion stems from the difference between the word free as it is used in economics as an antonym for costly, and as it is used in philosophy as an antonym for deterministic. When brainwashing theory speaks of individuals losing the ability to freely decide to disobey, the word is being used in the economic sense. Brainwashing imposes costs, and when a course of action has costs it is no longer free. The famous statement by Rousseau (1913: 3) that ‘Man is born free, and everywhere he is in chains’ succinctly expresses the view that socialization can impose severe constraints on human behaviour. Throughout the social sciences, this is accepted almost axiomatically. It is odd that only in the sociology of new religious movements is socialization’s ability to constrain largely ignored.

Unidirectional versus Bi-directional Influence

The second issue has to do with controversy over whether there are particular personality types drawn to cults and whether members are better perceived as willing and active seekers or as helpless and victimized dupes, as if these were mutually exclusive alternatives. Those who focus on the importance of the particular traits that recruits bring to their cults tend to ignore the resocialization process (Anthony and Robbins 1994).(4) Those who focus on the resocialization process often ignore personal predispositions (Singer and Ofshe 1990).

All this reminds me of being back in high school when people used to gossip about girls who ‘got themselves pregnant.’ Since that time, advances in biological theory have taught us to think more realistically of ‘getting pregnant’ as an interactive process involving influence in both directions. Similarly, as our understanding of totalistic influence in cults matures, I think we will abandon unidirectional explanations of cultic obedience in favour of more realistic, interactive ones. When that happens, we will find ourselves able to ask more interesting questions than we do now. Rather than asking whether it is the predisposing trait or the manipulative process that produces high levels of uncritical obedience, we will ask just what predisposing traits of individuals interact with just what manipulative actions by cults to produce this outcome.

A number of the debunking authors use this artificial and incorrect split between resocialization and predisposing traits to create a divide between cult brainwashing theory and foundational brainwashing theory as an explanation for ideological influence in China and Korea in the mid-twentieth century. Dick Anthony attempts to show that the foundational literature really embodied two distinct theories. One, he claims, was a robotic control theory that was mystical and sensationalist. The other was a theory of totalitarian influence that was dependent for its success upon pre-existing totalitarian beliefs of the subject which the program was able to reinvoke (Anthony 1996: i). Anthony claims that even though cultic brainwashing theory is descendant from the former, it claims its legitimacy from its ties to the latter.

The problem with this distinction is that it is based upon a misreading of the foundational literature (Lifton 1989; Schein 1961). Lifton devotes chapter 5 of his book to a description of the brainwashing process. In chapter 22 he describes the social structural conditions that have to be present for this process to be effective. Anthony misunderstands this scientific distinction. He interprets it instead as evidence that Lifton’s work embodies two distinct theories: one bad and one good (Anthony and Robbins 1994). The ‘bad’ Lifton, according to Anthony, is the chapter 5 Lifton who describes a brainwashing process that may have gone on in Communist reindoctrination centres, but which, according to Anthony, has no applicability to contemporary cults. The ‘good’ Lifton, on the other hand, describes in chapter 22 a structural situation that Anthony splits off and calls a theory of thought reform. Anthony appears to like this ‘theory’ better because it does not involve anything that the cult actually does to the cult participant (Anthony and Robbins 1995). The cult merely creates a totalistic social structure that individuals with certain predisposing traits may decide that they want to be part of.

Unfortunately for Anthony, there are two problems with such splitting. One is that Lifton himself denies any such split in his theory (Lifton 1995, 1997). The second is that both an influence process and the structural conditions conducive to that process are necessary for any theory of social influence. As Lifton demonstrates in his recent application of his theory to a Japanese terrorist cult (Lifton 1999), process cannot be split off from structure in any study of social influence.

Condemnatory Label versus Contributory Factor

The third issue has to do with whether brainwashing is meant to replace other explanatory variables or work alongside them. Bainbridge (1997) and Richardson (1993) worry about the former, complaining that brainwashing explanations are intrinsically unifactoral, and thus inferior to the multifactoral explanations preferred by modern social science. But brainwashing theory has rarely, if ever, been used scientifically as a unifactoral explanation. Lifton (1999) does not attempt to explain all the obedience generated in Aum Shinrikyo by the brainwashing mechanism. My explanation of the obedience generated by the Bruderhof relies on numerous social mechanisms of which brainwashing is only one (Zablocki 1980). The same can be said for Ofshe’s explanation of social control in Synanon (1976). Far from being unifactoral, brainwashing is merely one essential element in a larger strategy for understanding how charismatic authority is channelled into obedience.

James Thurber once wrote a fable called The Wonderful O (1957), which depicted the cultural collapse of a society that was free to express itself using twenty-five letters of the alphabet but was forbidden to use the letter O for any reason. The intellectual convolutions forced on Thurber’s imaginary society by this ‘slight’ restriction are reminiscent of the intellectual convolutions forced on the NRM scholars by their refusal to include brainwashing in their models. It is not that these scholars don’t often have considerable insight into cult dynamics, but the poor mugs are, nevertheless, constantly getting overwhelmed by events that their theories are unable to predict or explain. You always find them busy playing catch-up as they scramble to account for each new cult crisis as it develops on an ad hoc basis. The inadequacy of their models cries out ‘specification error’ in the sense that a key variable has been left out.

The Thurberian approach just does not work. We have to use the whole alphabet of social influence concepts from Asch to Zimbardo (including the dreaded B-word) to understand cultic obedience. Cults are a complex social ecology of forces involving attenuation effects (Petty 1994), conformity (Asch 1951), crowd behaviour (Coleman 1990), decision elites (Wexler 1995), deindividuation (Festinger, Pepitone et al. 1952), extended exchange (Stark 1999), groupthink (Janis 1982), ritual (Turner 1969), sacrifice and stigma (Iannaccone 1992), situational pressures (Zimbardo and Anderson 1993), social proof (Cialdini 1993), totalism (Lifton 1989), and many others. Personally, I have never seen a cult that was held together only by brainwashing and not also by other social psychological factors, as well as genuine loyalty to ideology and leadership.

Arguments that brainwashing is really a term of moral condemnation masquerading as a scientific concept have emerged as a reaction to the efforts of some anticultists (not social scientists) to use brainwashing as a label to condemn cults rather than as a concept to understand them. Bromley (1998) has taken the position that brainwashing is not a variable at all but merely a peremptory label of stigmatization – a trope for an ideological bias, in our individualistic culture, against people who prefer to live and work more collectivistically. Others have focused on the obverse danger of allowing brainwashing to be used as an all-purpose moral excuse (It wasn’t my fault. I was brainwashed!), offering blanket absolution for people who have been cult members – freeing them from the need to take any responsibility for their actions (Bainbridge 1997; Hexham and Poewe 1997; Introvigne forthcoming; Melton forthcoming). While these allegations represent legitimate concerns about potential abuse of the concept, neither is relevant to the scientific issue. A disinterested approach will first determine whether a phenomenon exists before worrying about whether its existence is politically convenient.

Obtaining Members versus Retaining Members

The fourth issue has to do with a confusion over whether brainwashing explains how cults obtain members or how they retain them. Some cults have made use of manipulative practices like love-bombing and sleep deprivation (Galanti 1993), with some degree of success, in order to obtain new members. A discussion of these manipulative practices for obtaining members is beyond the scope of this chapter. Some of these practices superficially resemble techniques used in the earliest phase of brainwashing. But these practices, themselves, are not brainwashing. This point must be emphasized because a false attribution of brainwashing to newly obtained cult recruits, rather than to those who have already made a substantial commitment to the cult, figures prominently in the ridicule of the concept by NRM scholars. A typical straw man representation of brainwashing as a self-evidently absurd concept is as follows: ‘The new convert is held mentally captive in a state of alternate consciousness due to “trance-induction techniques” such as meditation, chanting, speaking in tongues, self-hypnosis, visualization, and controlled breathing exercises … the cultist is [thus] reduced to performing religious duties in slavish obedience to the whims of the group and its authoritarian or maniacal leader’ (Wright 1998: 98).

Foundational brainwashing theory was not concerned with such Svengalian conceits, but only with ideological influence in the service of the retaining function. Why should the foundational theorists, concerned as they were with coercive state-run institutions like prisons, ‘re-education centres,’ and prisoner-of-war camps, have any interest in explaining how participants were obtained? Participants were obtained at the point of a gun.(5) The motive of these state enterprises was to retain the loyalties of these participants after intensive resocialization ceased. As George Orwell showed so well in his novel 1984, the only justification for the costly indoctrination process undergone by Winston Smith was not that Smith love Big Brother while in prison, but that Big Brother be able to retain that love after Smith was deployed back into society. Nevertheless, both ‘cult apologists’ and ‘cult bashers’ have found it more convenient to focus on the obtaining function.

If one asks why a cult would be motivated to invest resources in brainwashing, it should be clear that this cannot be to obtain recruits, since these are a dime a dozen in the first place, and, as Barker (1984) has shown, they don’t tend to stick around long enough to repay the investment. Rather, it can only be to retain loyalty, and therefore decrease surveillance costs for valued members who are already committed. In small groups bound together only by normative solidarity, as Hechter (1987) has shown, the cost of surveillance of the individual by the group is one of the chief obstacles to success. Minimizing these surveillance costs is often the most important organizational problem such groups have to solve in order to survive and prosper. Brainwashing makes sense for a collectivity only to the extent that the resources saved through decreased surveillance costs exceed the resources invested in the brainwashing process. For this reason, only high-demand charismatic groups with totalistic social structures are ever in a position to benefit from brainwashing.(6)
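
The economic logic of the preceding paragraph can be stated as a simple break-even condition. The sketch below (in Python) is only illustrative; the variable names are hypothetical, and none of these quantities has, to my knowledge, been measured for any actual group:

    def brainwashing_pays(deployable_yield, saving_per_agent, program_cost):
        # The condition stated in the text: resources saved through decreased
        # surveillance of committed, deployable members must exceed the
        # resources invested in the brainwashing process itself.
        return deployable_yield * saving_per_agent > program_cost

    # Invented numbers: 5 deployable agents, each saving 20 units of
    # surveillance effort, against a program costing 60 units.
    print(brainwashing_pays(5, 20, 60))  # True: the investment pays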

This mistaken ascription of brainwashing to the obtaining function rather than the retaining function is directly responsible for two of the major arguments used by the ‘cult apologists’ in their attempts to debunk brainwashing. One has to do with a misunderstanding of the role of force and the other has to do with the mistaken belief that brainwashing can be studied with data on cult membership turnover.

The widespread belief that force is necessary for brainwashing is based upon a misreading of Lifton (1989) and Schein (1961). A number of authors (Dawson 1998; Melton forthcoming; Richardson 1993) have based their arguments, in part, on the contention that the works of foundational scholarship on brainwashing are irrelevant to the study of cults because the foundational literature studied only subjects who were forcibly incarcerated. However, Lifton and Schein have both gone on public record as explicitly denying that there is anything about their theories that requires the use of physical force or threat of force. Lifton has specifically argued (‘psychological manipulation is the heart of the matter, with or without the use of physical force’ [1995: xi]) that his theories are very much applicable to cults.(7) The difference between the state-run institutions that Lifton and Schein studied in the 1950s and 1960s and the cults that Lifton and others study today is in the obtaining function not in the retaining function. In the Chinese and Korean situations, force was used for obtaining and brainwashing was used for retaining. In cults, charismatic appeal is used for obtaining and brainwashing is used, in some instances, for retaining.

A related misconception has to do with what conclusions to draw from the very high rate of turnover among new and prospective recruits to cults. Bainbridge (1997), Barker (1989), Dawson (1998), Introvigne (forthcoming), and Richardson (1993) have correctly pointed out that in totalistic religious organizations very few prospective members go on to become long-term members. They argue that this proves that the resocialization process cannot be irresistible and therefore it cannot be brainwashing. But nothing in the brainwashing model predicts that it will be attempted with all members, let alone successfully attempted. In fact, the efficiency of brainwashing, operationalized as the expected yield of deployable agents(8) per 100 members, is an unknown (but discoverable) parameter of any particular cultic system and may often be quite low. For the system to be able to perpetuate itself (Hechter 1987), the yield need only produce enough value for the system to compensate it for the resources required to maintain the brainwashing process.

Moreover, the high turnover rate in cults is more complex than it may seem. While it is true that the membership turnover is very high among recruits and new members, this changes after two or three years of membership when cultic commitment mechanisms begin to kick in. This transition from high to low membership turnover is known as the Bainbridge Shift, after the sociologist who first discovered it (Bainbridge 1997: 141-3). After about three years of membership, the annual rate of turnover sharply declines and begins to fit a commitment model rather than a random model.(9)
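
The qualitative shape of the Bainbridge Shift can be illustrated with a toy two-phase survival model. The hazard rates below are invented for illustration; only the sharp drop at about year three is taken from Bainbridge’s finding:

    def retention_curve(years, early_hazard=0.5, late_hazard=0.05, shift=3):
        # Two-phase model: a high, roughly constant annual dropout hazard
        # in the first years (a 'random' model), falling sharply once
        # commitment mechanisms kick in (a 'commitment' model).
        remaining, curve = 1.0, []
        for year in range(1, years + 1):
            hazard = early_hazard if year <= shift else late_hazard
            remaining *= 1 - hazard
            curve.append(round(remaining, 3))
        return curve

    print(retention_curve(6))  # [0.5, 0.25, 0.125, 0.119, 0.113, 0.107]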

Membership turnover data is not the right sort of data to tell us whether a particular cult practises brainwashing. The recruitment strategy whereby many are called but few are chosen is a popular one among cults. In several groups in which I have observed the brainwashing process, there was very high turnover among initial recruits. Brainwashing is too expensive to waste on raw recruits. Since brainwashing is a costly process, it generally will not pay for a group to even attempt to brainwash one of its members until that member has already demonstrated some degree of staying power on her own.(10)

Psychological Traces

The fifth issue has to do with the question of whether brainwashing leaves any long-lasting measurable psychological traces in those who have experienced it. Before we can ask this question in a systematic way, we have to be clear about what sort of traces we should be looking for. There is an extensive literature on cults and mental health. But whether cult involvement causes psychological problems is a much more general question than whether participation in a traumatic resocialization process leaves any measurable psychological traces.

There has been little consensus on what sort of traces to look for. Richardson and Kilbourne (1983: 30) assume that brainwashing should lead to insanity. Lewis (1998: 16) argues that brainwashing should lead to diminished IQ scores. Nothing in brainwashing theory would lead us to predict either of these outcomes. In fact, Schein points out that ‘The essence of coercive persuasion is to produce ideological and behavioral change in a fully conscious, mentally intact individual’ (1959: 437). Why in the world would brainwashers invest scarce resources to produce insanity and stupidity in their followers? However, these aforementioned authors (and others) have taken the absence of these debilitative effects as ‘proof’ that brainwashing doesn’t happen in cults. At the same time, those who oppose cults have had an interest, driven by litigation rather than science, in making exaggerated claims for mental impairment directly resulting from brainwashing. As Farrell has pointed out, ‘From the beginning, the idea of traumatic neurosis has been accompanied by concerns about compensation’ (1998: 7).

Studies of lingering emotional, cognitive, and physiological effects on ex-members have thus far shown inconsistent results (Katchen 1997; Solomon 1981; Ungerleider and Wellisch 1983). Researchers studying current members of religious groups have found no significant impairment or disorientation. Such results have erroneously been taken as evidence that the members of these groups could not, therefore, possibly have been brainwashed. However, these same researchers found the responses of current members contaminated by elevations on the ‘Lie’ scale, exemplifying ‘an intentional attempt to make a good impression and deny faults’ (Ungerleider and Wellisch 1983: 208). On the other hand, studies of ex-members have tended to show ‘serious mental and emotional dysfunctions that have been directly caused by cultic beliefs and practices’ (Saliba 1993: 106). The sampling methods of these latter studies have been challenged (Lewis and Bromley 1987; Solomon 1981), however, because they have tended to significantly over-sample respondents with anti-cult movement ties. With ingenious logic, this has led Dawson (1998: 121) to suggest in the same breath that cult brainwashing is a myth but that ex-member impairment may be a result of brainwashing done by deprogrammers.

All this controversy is not entirely relevant to our question, however, because there is no reason to assume that a brainwashed person is going to show elevated scores on standard psychiatric distress scales. In fact, for those for whom making choices is stressful, brainwashing may offer psychological relief. Galanter’s research has demonstrated that a cult ‘acts like a psychological pincer, promoting distress while, at the same time, providing relief’ (1989: 93). As we shall see below, the brainwashing model predicts impairment and disorientation only for people during some of the intermediate stages, not at the end state. The popular association of brainwashing with zombie or robot states comes out of a misattribution of the characteristics of people going through the traumatic brainwashing process to people who have completed the process. The former really are, at times, so disoriented that they appear to resemble caricatures of zombies or robots. The glassy eyes, inability to complete sentences, and fixed eerie smiles are characteristics of disoriented people under randomly varying levels of psychological stress. The latter, however, are, if the process was successful, functioning and presentable deployable agents.

Establishing causal direction in the association between cult membership and mental health is extremely tricky, and little progress has been made thus far. In an excellent article reviewing the extensive literature in this area, Saliba (1993: 108) concludes: ‘The study of the relationship between new religious movements and mental health is in its infancy.’ Writing five years later, Dawson (1998: 122) agrees that this is still true, and argues that ‘the inconclusive results of the psychological study of members and ex-members of NRMs cannot conceivably be used to support either the case for or against brainwashing.’ Saliba calls for prospective studies that will establish baseline mental health measurements for individuals before they join cults, followed by repeated measures during and afterward. While this is methodologically sensible, it is impractical because joining a cult is both a rare and unexpected event. This makes the general question of how cults affect mental health very difficult to answer.

Fortunately, examining the specific issue of whether brainwashing leaves psychological traces may be easier. The key is recognizing that brainwashing is a traumatic process, and, therefore, those who have gone through it should face an increased likelihood in later years of post-traumatic stress disorder. The classic clinical symptoms of PTSD – avoidance, numbing, and increased arousal (American Psychiatric Association 1994: 427) – have been observed in many ex-cult members regardless of their mode of exit and current movement affiliations (Katchen 1997; Zablocki 1999). However, these soft and somewhat subjective symptoms should be viewed with some caution, given recent controversies over the ease with which symptoms such as these can be iatrogenically implanted, as, for example, false memories (Loftus and Ketcham 1994).

In the future, avenues for more precise neurological tracking may become available. Judith Herman (1997: 238) has demonstrated convincingly that ‘traumatic exposure can produce lasting alterations in the endocrine, autonomic, and central nervous systems … and in the function and even the structure of specific areas of the brain.’ It is possible in the future that direct evidence of brainwashing may emerge from brain scanning using positron emission tomography. Some preliminary research in this area has suggested that, during flashbacks, specific areas of the brain involved with language and communication may be inactivated (Herman 1997: 240; Rauch, van der Kolk et al. 1996). Another promising area of investigation of this sort would involve testing for what van der Kolk and McFarlane (1996) have clinically identified as ‘the black hole of trauma.’ It should be possible to determine, once measures have been validated, whether such traces appear more often in individuals who claim to have gone through brainwashing than in a sample of controls who have been non-brainwashed members of cults for equivalent periods of time.

Separating the Investigative Steps

The final issue is a procedural one. There are four sequential investigative steps required to resolve controversies like the one we have been discussing. These steps are concerned with attempt, existence, incidence, and consequence. A great deal of confusion comes from nothing more than a failure to recognize that these four steps need to be kept analytically distinct from one another.

To appreciate the importance of this point, apart from the heat of controversy, let us alter the scene for a moment and imagine that the scientific conflict we are trying to resolve is over something relatively innocuous – say, vegetarianism. Let us imagine that on one side we have a community of scholars arguing that vegetarianism is a myth, that nobody would voluntarily choose to live without eating meat and that anyone who tried would quickly succumb to an overpowering carnivorous urge. On the other side, we have another group of scholars arguing that they had actually seen vegetarians and observed their non-meat-eating behaviour over long periods of time, and that, moreover, vegetarianism is a rapidly growing social problem with many new converts each year being seduced by this enervating and debilitating diet.

It should be clear that any attempt to resolve this debate scientifically would have to proceed through the four sequential steps mentioned above. First, we would have to find out if anybody ever deliberately attempts to be a vegetarian. Maybe those observed not eating meat were simply unable to obtain it. If nobody could be found voluntarily attempting to follow a vegetarian diet, we would have to conclude that vegetarianism is a myth. If, however, we find at least one person attempting to follow such a diet, we would next have to observe him carefully enough and long enough to find out whether he succeeds in abstaining from meat. If we observe even one person successfully abstaining from meat, we would have to conclude that vegetarianism exists, increasing our confidence in the theory of the second group of researchers. But the first group could still argue, well, maybe you are right that a few eccentric people here and there do practise vegetarianism, but not enough to constitute a social phenomenon worth investigating. So, the next step would be to measure the incidence of vegetarianism in the population. Out of every million people, how many do we find following a vegetarian diet? If it turns out to be very few, we can conclude that, while vegetarianism may exist as a social oddity, it does not rise to the level of being a social phenomenon worthy of our interest. If, however, we find a sizeable number of vegetarians, we still need to ask, ‘So what?’ This is the fourth of our sequential steps. Does the practice of vegetarianism have any physical, psychological, or social consequences? If so, are these consequences worthy of our concern?

Each of these investigative steps requires attention focused on quite distinct sets of substantive evidence. For this reason, it is important that we not confuse them with one another as is so often done in ‘apologist’ writing about brainwashing, where the argument often seems to run as follows: Brainwashing doesn’t exist, or at least it shouldn’t exist, and even if it does the numbers involved are so few, and everybody in modern society gets brainwashed to some extent, and the effects, if any, are impossible to measure. Such arguments jump around, not holding still long enough to allow for orderly and systematic confirmation or disconfirmation of each of the steps.

Once we recognize the importance of keeping the investigative steps methodologically distinct from one another, it becomes apparent that the study of brainwashing is no more problematic (although undoubtedly much more difficult) than the study of an advertising campaign for a new household detergent. It is a straightforward question to ask whether or not some charismatic groups attempt to practise radical techniques of socialization designed to turn members into deployable agents. If the answer is no, we stop because there can be no brainwashing. If the answer is yes, we go on to a second question: Are these techniques at least sometimes effective in producing uncritical obedience? If the answer to this question is yes (even for a single person), we know that brainwashing exists, although it may be so rare as to be nothing more than a sociological oddity. Therefore, we have to take a third step and ask, How frequently is it effective? What proportion of those who live in cults are subjected to brainwashing, and what proportion of these respond by becoming uncritically obedient? And, finally, we need to ask a fourth important question: How long do the effects last? Are the effects transitory, lasting only as long as the stimulus continues to be applied, or are they persistent for a period of time thereafter, and, if so, how long? Let us keep in mind the importance of distinguishing attempt from existence, from incidence, from consequences.
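The gated character of these four steps can be made concrete in a short procedural sketch. The following Python fragment is purely illustrative: the predicate functions, the population list, and the incidence threshold are hypothetical placeholders rather than measures proposed in this chapter.

    # Illustrative only: each investigative step gates the next, mirroring the
    # attempt -> existence -> incidence -> consequence sequence described above.
    def investigate(population, attempts, succeeds, has_consequences,
                    incidence_threshold=0.001):
        # Step 1 (attempt): does anyone deliberately try the practice?
        attempters = [p for p in population if attempts(p)]
        if not attempters:
            return 'myth: nobody even attempts the practice'

        # Step 2 (existence): does at least one attempter succeed?
        successes = [p for p in attempters if succeeds(p)]
        if not successes:
            return 'attempted, but never successfully'

        # Step 3 (incidence): is success frequent enough to matter socially?
        incidence = len(successes) / len(population)
        if incidence < incidence_threshold:
            return 'exists only as a social oddity'

        # Step 4 (consequence): does the practice have effects worth concern?
        if not any(has_consequences(p) for p in successes):
            return 'widespread but inconsequential'

        return 'a social phenomenon with consequences worth studying'

Note that the procedure never reaches a later step when an earlier one fails, which is precisely the discipline that the ‘jumping around’ criticized above refuses to observe.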

Brainwashing as a Scientific Concept

What I am presenting here is not a ‘new’ theory of brainwashing but a conceptual model of the foundational theory developed in the mid-twentieth century by Lifton, Schein, and Sargant as it applies to charismatic collectivities. Because its scientific stature has been so frequently questioned, I will err on the side of formality by presenting a structured exposition of brainwashing theory in terms of eight definitions and twelve hypotheses. Each definition includes an operationalized form by which the trait may be observed. If either of the first two hypotheses is disconfirmed, we must conclude that brainwashing is not being attempted in the cult under investigation. If any of the remaining ten is disconfirmed, we must conclude that brainwashing, even where attempted, is not successful in meeting its goals within that cult.

I do not pretend that the model outlined here is easy to test empirically, particularly for those researchers who either cannot or will not spend time immersing themselves in the daily lives of cults, or for those who are not willing, alternatively, to use as data the detailed retrospective accounts of ex-members. However, it should be clear that the model being proposed here stays grounded in what is empirically testable and does not involve mystical notions such as loss of free will or information disease (Conway and Siegelman 1978) that have characterized many of the extreme ‘anticult models.’

Nor do I pretend that this model represents the final and definitive treatment of this subject. Charismatic influence is still a poorly understood subject on which much additional research is needed. With few exceptions, sociology has treated it as if it were what engineers call a ‘black box,’ with charismatic inputs coming in one end and obedience outputs going out the other. What we have here is a theory that assists in the process of opening this black box to see what is inside. It is an inductive theory, formed largely from the empirical generalizations of ethnographers and interviewers. The model itself presents an ideal-type image of brainwashing that does not attempt to convey the great variation among specific obedience-inducing processes that occur across the broad range of existing cults. Much additional refinement in both depth and breadth will certainly be needed.

Definitions

D1. Charisma is defined, using the classical Weberian formula, as a condition of ‘devotion to the specific and exceptional sanctity, heroism, or exemplary character of an individual person, and of the normative patterns or order revealed or ordained by him’ (Weber 1947: 328). Being defined this way, as a condition of devotion, leads us to recognize that charisma is not to be understood simply in terms of the characteristics of the leader, as it has come to be in popular usage, but requires an understanding of the relationship between leader and followers. In other words, charisma is a relational variable. It is defined operationally as a network of relationships in which authority is justified (for both superordinates and subordinates) in terms of the special characteristics discussed above.

D2. Ideological Totalism is a sociocultural system that places high valuation on total control over all aspects of the outer and inner lives of participants for the purpose of achieving the goals of an ideology defined as all important. Individual rights either do not exist under ideological totalism or they are clearly subordinated to the needs of the collectivity whenever the two come into conflict. Ideological totalism has been operationalized in terms of eight observable characteristics: milieu control, mystical manipulation, the demand for purity, the cult of confession, ‘sacred science,’ loading the language, doctrine over person, and the dispensing of existence (Lifton 1989: chap. 22).(11)

D3. Surveillance is defined as keeping watch over a person’s behaviour, and, perhaps, attitudes. As Hechter (1987) has shown, the need for surveillance is the greatest obstacle to goal achievement among ideological collectivities organized around the production of public goods. Surveillance is not only costly, it is also impractical for many activities in which agents of the collectivity may have to travel and act autonomously and at a distance. It follows from this that all collectivities pursuing public goals will be motivated to find ways to decrease the need for surveillance. Resources used for surveillance are wasted in the sense that they are unavailable for the achievement of collective goals.

D4. A deployable agent is one who is uncritically obedient to directives perceived as charismatically legitimate (Selznick 1960). A deployable agent can be relied on to continue to carry out the wishes of the collectivity regardless of his own hedonic interests and in the absence of any external controls. Deployability can be operationalized as the likelihood that the individual will continue to comply with hitherto ego-dystonic demands of the collectivity (e.g., mending, ironing, mowing the lawn, smuggling, rape, child abuse, murder) when not under surveillance.

D5. Brainwashing is an observable set of transactions between a charismatically structured collectivity and an isolated agent of the collectivity, with the goal of transforming the agent into a deployable agent. Brainwashing is thus a process of ideological resocialization carried out within a structure of charismatic authority.

The brainwashing process may be operationalized as a sequence of well-defined and potentially observable phases. These hypothesized phases are (1) identity stripping, (2) identification, and (3) symbolic death/rebirth. The operational definition of brainwashing refers to the specific activities attempted, whether or not they are successful, as they are either observed directly by the ethnographer or reported in official or unofficial accounts by members or ex-members. Although the exact order of phases and specific steps within phases may vary from group to group, we should always expect to see the following features, or their functional equivalents, in any brainwashing system: (1) the constant fluctuation between assault and leniency; and (2) the seemingly endless process of confession, re-education, and refinement of confession.

D6. Hyper-credulity is defined as a disposition to accept uncritically all charismatically ordained beliefs. All lovers of literature and poetry are familiar with ‘that willing suspension of disbelief for the moment, which constitutes poetic faith’ (Coleridge 1970: 147). Hyper-credulity occurs when this state of mind, which in most of us is occasional and transitory, is transformed into a stable disposition. Hyper-credulity falls between hyper-suggestibility on the one hand and stable conversion of belief on the other.(12) Its operational hallmark is plasticity in the assumption of deeply held convictions at the behest of an external authority. This is an other-directed form of what Robert Lifton (1968) has called the protean identity state.

D7. Relational Enmeshment is a state of being in which self-esteem depends upon belonging to a particular collectivity (Bion 1959; Bowen 1972; Sirkin and Wynne 1990). It may be operationalized as immersion in a relational network with the following characteristics: exclusivity (high ratio of in-group to out-group bonds), interchangeability (low level of differentiation in affective ties between one alter and another), and dependency (reluctance to sever or weaken ties for any reason). In a developmental context, something similar to this has been referred to by Bowlby (1969) as anxious attachment.
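A minimal sketch of how the first two of these indicators might be scored from a respondent’s roster of affective ties is given below. The data format (pairs of tie strength and an in-group flag) is a hypothetical illustration, not an instrument drawn from the studies cited; dependency, being behavioural, would require longitudinal observation and is not computed.

    from statistics import pstdev

    # Illustrative scoring of D7's first two network indicators for one
    # respondent. ties: list of (strength, is_ingroup) pairs for each bond.
    def enmeshment_indicators(ties):
        ingroup = [s for s, flag in ties if flag]
        outgroup = [s for s, flag in ties if not flag]
        # Exclusivity: ratio of in-group to out-group bonds (higher = more
        # enmeshed).
        exclusivity = len(ingroup) / max(len(outgroup), 1)
        # Interchangeability: low spread in affective-tie strength means one
        # alter is as good as another (lower differentiation = more enmeshed).
        differentiation = pstdev(ingroup) if len(ingroup) > 1 else 0.0
        return exclusivity, differentiation

    # Example: nine equally strong in-group ties and one out-group tie yield
    # high exclusivity and zero differentiation.
    print(enmeshment_indicators([(5, True)] * 9 + [(2, False)]))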

D8. Exit Costs are the subjective costs experienced by an individual who is contemplating leaving a collectivity. Obviously, the higher the perceived exit costs, the greater will be the reluctance to leave. Exit costs may be operationalized as the magnitude of the bribe necessary to overcome them. A person who is willing to leave if we pay him $1,000 experiences lower exit costs than one who is not willing to leave for any payment less than $1,000,000. With regard to cults, the exit costs are most often spiritual and emotional rather than material, which makes measurement in this way more difficult but not impossible.

Hypotheses

Not all charismatic organizations engage in brainwashing. We therefore need a set of hypotheses that will allow us to test empirically whether any particular charismatic system attempts to practise brainwashing and with what effect. The brainwashing model asserts twelve hypotheses concerning the role of brainwashing in the production of uncritical obedience. These hypotheses are all empirically testable. A schematic diagram of the model I propose may be found in figure 1.

This model begins with an assumption that charismatic leaders are capable of creating organizations that are easy and attractive to enter (even though they may later turn out to be difficult and painful to leave). There are no hypotheses, therefore, to account for how charismatic cults obtain members. It is assumed that an abundant pool of potential recruits to such groups is always available. The model assumes that charismatic leaders, using nothing more than their own intrinsic attractiveness and persuasiveness, are initially able to gather around them a corps of disciples sufficient for the creation of an attractive social movement. Many ethnographies (Lofland 1966; Lucas 1995) have shown how easy it is for such charismatic movement organizations to attract new members from the general pool of anomic ‘seekers’ that can always be found within the population of an urbanized mobile society.

The model does attempt to account for how some percentage of these ordinary members are turned into deployable agents. The initial attractiveness of the group, its vision of the future, and/or its capacity to bestow seemingly limitless amounts of love and esteem on the new member are sufficient inducements in some cases to motivate a new member to voluntarily undergo this difficult and painful process of resocialization.

H1. Ideological totalism is a necessary but not sufficient condition for the brainwashing process. Brainwashing will be attempted only in groups that are structured totalistically. However, not all ideologically totalist groups will attempt to brainwash their members. It should be remembered that brainwashing is merely a mechanism for producing deployable agents. Some cults may not want deployable agents or have other ways of producing them. Others may want them but feel uncomfortable about using brainwashing methods to obtain them, or they may not have discovered the existence of brainwashing methods.

Figure 1: The Effect of Charismatic Influence on Uncritical Obedience

H2. The exact nature of this resocialization process will differ from group to group, but, in general, will be similar to the resocialization process that Robert Lifton (1989) and Edgar Schein (1961) observed in Communist re-education centres in the 1950s. For whatever reasons, these methods seem to come fairly intuitively to charismatic leaders and their staffs. Although the specific steps and their exact ordering differ from group to group, their common elements involve a stripping away of the vestiges of an old identity, the requirement that repeated confessions be made either orally or in writing, and a somewhat random and ultimately debilitating alternation of the giving and the withholding of ‘unconditional’ love and approval. H2 further states that the maintenance of this program involves the expenditure of a measurable quantity of the collectivity’s resources. This quantity is known as C, where C equals the cost of the program and should be measurable at least at an ordinal level.

This resocialization process has baffled many observers, in my opinion because it proceeds simultaneously along two distinct but parallel tracks, one involving cognitive functioning and the other involving emotional networking. These two tracks lead to the attainment of states of hyper-credulity and relational enmeshment, respectively. The group member learns to accept with suspended critical judgment the often shifting beliefs espoused by the charismatic leader. At the same time, the group member becomes strongly attached to and emotionally dependent upon the charismatic leader and (often especially) the other group members, and cannot bear to be shunned by them.

H3. Those who go through the process will be more likely than those who do not to reach a state of hyper-credulity. This involves the shedding of old convictions and the assumption of a zealous loyalty to the beliefs of the moment, uncritically seized upon, so that such beliefs become not mere ‘beliefs’ but deeply held convictions.

Under normal circumstances, it is not easy to get people to disown their core convictions. Convictions, once developed, are generally treated not as hypotheses to test empirically but as possessions to value and cherish. There are often substantial subjective costs to the individual in giving them up. Abelson (1986: 230) has provided convincing linguistic evidence that most people treat convictions more as valued possessions than as ways of testing reality. Cognitive dissonance theory predicts with accuracy that when subject to frontal attack, attachment to convictions tends to harden (Festinger, Riecken et al. 1956; O’Leary 1994). Therefore, a frontal attack on convictions, without first undermining the self-image foundation of these convictions, is doomed to failure. An indirect approach through brainwashing is often more effective.

The unconventional beliefs that individuals adopt when they join cults will come to be discontinuous with the beliefs they held in precult life. What appears to happen is a transformation from individually held to collectively held convictions. This is a well-known phenomenon that Janis (1982) has called groupthink. Under circumstances of groupthink, the specific content of one’s convictions becomes much less important than achieving the goal that all in the group hold the same convictions. In elaboration likelihood terms we can say that the subject undergoes a profound shift from message processing to source processing in the course of resocialization (Petty and Wegener 1998).

When the state of hyper-credulity is achieved, it leaves the individual strongly committed to the charismatic belief of the moment but with little or no critical inclination to resist charismatically approved new or contradictory beliefs in the future and little motivation to attempt to form accurate independent judgments of the consequences of assuming new beliefs. The cognitive track of the resocialization process begins by stripping away the old convictions and associating them with guilt, evil, or befuddlement. Next, there is a traumatic exhaustion of the habit of subjecting right-brain convictions to left-brain rational scrutiny. This goes along with an increase in what Snyder (1974) has called self-monitoring, implying a shift from central route to peripheral route processing of information in which the source rather than the content of the message becomes all important.

H4. As an individual goes through the brainwashing process, there will be an increase in relational enmeshment with measurable increases occurring at the completion of each of the three stages. The purging of convictions is a painful process and it is reasonable to ask why anybody would go through it voluntarily. The payoff is the opportunity to feel more connected with the charismatic relational network. These people have also been through it, and only they really understand what you are going through. So cognitive purging leads one to seek relational comfort, and this comfort becomes enmeshing. The credulity process and the enmeshing process depend on each other.

The next three hypotheses are concerned with the fact that each of the three phases of brainwashing achieves plateaus in both of these processes. The stripping phase creates the vulnerability to this sort of transformation. The identification phase creates realignment, and the rebirth phase breaks down the barrier between the two so that convictions can be emotionally energized and held with zeal, while emotional attachments can be sacralized in terms of the charismatic ideology. The full brainwashing model actually provides far more detailed hypotheses concerning the various steps within each phase of the process. Space constraints make it impossible to discuss these here. An adequate technical discussion of the manipulation of language in brainwashing, for example, would require a chapter at least the length of this one. Figure 2 provides a sketch of the steps within each phase. Readers desiring more information about these steps are referred to Lifton (1989: chap. 5).

Figure 2: The Stages of Brainwashing and Their Effect on Hyper-credulity and Emotional Enmeshment

H5. The stripping phase. The cognitive goal of the stripping phase is to destroy prior convictions and prior relationships of belonging. The emotional goal of the stripping phase is to create the need for attachments. Overall, at the completion of the stripping phase, the situation is such that the individual is hungry for convictions and attachments and dependent upon the collectivity to supply them. This sort of credulity and attachment behaviour is widespread among prisoners and hospital patients (Goffman 1961).

H6. The identification phase. The cognitive goal of the identification phase is to establish an imitative search for conviction and to erode the habit of incredulity. The emotional goal of the identification phase is to instill the habit of acting out through attachment. Overall, at the completion of the identification phase the individual has begun the practice of relying on the collectivity for beliefs and for a cyclic emotional pattern of arousal and comfort. But at this point, this reliance is just one highly valued form of existence. It is not yet viewed as an existential necessity.

H7. The symbolic death and rebirth phase. In the death and rebirth phase, the cognitive and emotional tracks come together and mutually support each other. This often gives the individual a sense of having emerged from a tunnel and an experience of spiritual rebirth.(13) The cognitive goal of this phase is to establish a sense of ownership of (and pride of ownership in) the new convictions. The emotional goal is to make a full commitment to the new self that is no longer directly dependent upon hope of attachment or fear of separation. Overall, at the completion of the rebirth phase we may say that the person has become a fully deployable agent of the charismatic leader. The brainwashing process is complete.

H8 states that the brainwashing process results in a state of subjectively elevated exit costs. These exit costs cannot, of course, be observed directly. But they can be inferred from the behavioral state of panic or terror that arises in the individual at the possibility of having his or her ties to the group discontinued. The cognitive and emotional states produced by the brainwashing process together bring about a situation in which the perceived exit costs for the individual increase sharply. This closes the trap for all but the most highly motivated individuals, and induces in many a state of uncritical obedience. As soon as exit from the group (or even from its good graces) ceases to be a subjectively palatable option, it makes sense for the individual to comply with almost anything the group demands – even to the point of suicide in some instances. Borrowing from Sartre’s insightful play of that name, I refer to this situation as the ‘no exit’ syndrome. When demands for compliance are particularly harsh, the hyper-credulity aspect of the process sweetens the pill somewhat by allowing the individual to accept uncritically the justifications offered by the charismatic leader and/or charismatic organization for making these demands, however farfetched these justifications might appear to an outside observer.

H9 states that the brainwashing process results in a state of ideological obedience in which the individual has a strong tendency to comply with any behavioral demands made by the collectivity, especially if motivated by the carrot of approval and the stick of threatened expulsion, no matter how life-threatening these demands may be and no matter how repugnant such demands might have been to the individual in his or her pre-brainwashed state.

H10 states that the brainwashing process results in increased deployability. Deployability extends the range of ideological obedience in the temporal dimension: the response continues after the stimulus is removed. This hypothesis will be disconfirmed in any cult within which members are uncritically obedient only while they are being brainwashed but not thereafter. The effect need not be permanent, but it does need to result in some measurable increase in deployability over time.

H11 states that the ability of the collectivity to rely on obedience without surveillance will result in a measurable decrease in surveillance. Since surveillance involves costs, this decrease will lead to a quantity S, where S equals the savings to the collectivity due to diminished surveillance needs and should be measurable at least at an ordinal level.

H12 states that S will be greater than C. In other words, the savings to the collectivity due to decreased surveillance needs are greater than the cost of maintaining the brainwashing program. Only where S is greater than C does it make sense to maintain a brainwashing program. Cults with initially high surveillance costs, and therefore high potential savings due to decreased surveillance needs [S], will tend to be more likely to brainwash, as will cults structured so that the costs of maintaining the brainwashing system [C] are relatively low.
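Read together, H11 and H12 reduce to a simple cost-benefit condition, sketched below with invented ordinal scores; the chapter requires only that S and C be measurable at an ordinal level.

    # Illustrative only: H12 predicts that a brainwashing program persists
    # where the surveillance savings S exceed the program's maintenance cost C.
    def brainwashing_sustainable(S, C):
        return S > C

    # Hypothetical scores: a high surveillance burden (large potential S) plus
    # a cheap resocialization program (small C) is the predicted brainwashing
    # case.
    print(brainwashing_sustainable(S=10, C=3))  # True: program pays for itself
    print(brainwashing_sustainable(S=2, C=3))   # False: program is abandoned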

Characteristics of a Good Theory

There is consensus in the social sciences that a good inductive qualitative theory(14) is one that is falsifiable, internally consistent, concrete, potentially generalizable, and has a well-defined dependent variable (King, Keohane et al. 1994). I think it should be clear from the foregoing that this theory meets all of these conditions according to prevailing standards in the social and behavioral sciences. However, since brainwashing theory has received much unjustified criticism for its lack of falsifiability and its lack of generalizability, I will briefly discuss the theory from these two points of view.

The criterion of falsifiability, as formulated primarily by Popper (1968), is the essence of what separates theory from dogma in science. Every theory must be able to provide an answer to the question of what evidence would falsify it. If the answer is that there is no possible evidence that would lead us to reject a so-called theory, we should conclude that it is not really a theory at all but just a piece of dogma.

Although Dawson (1998) and Richardson (1993) have included the falsifiability problem in their critiques of brainwashing, this criticism is associated mainly with the work of Dick Anthony (1996). Anthony’s claim that brainwashing theory is unfalsifiable is based upon two related misunderstandings. First, he argues that it is impossible to prove that a person is acting with free will so, to the extent that brainwashing theory rests on the overthrow of free will, no evidence can ever disprove it. Second, he applies Popper’s criterion to cults in a way more appropriate for a highly developed deductive theoretical system. He requires that either brainwashing explain all ego-dystonic behaviour in cults or acknowledge that it can explain none of it. But, as we have seen, brainwashing is part of an inductive multifactorial approach to the study of obedience in cults and should be expected to explain only some of the obedience produced in some cults.

With regard to generalizability, cultic brainwashing is part of an important general class of phenomena whose common element is what Anthony Giddens has called ‘disturbance of ontological security’ in which habits and routines cease to function as guidelines for survival (Cohen 1989: 53). This class of phenomena includes the battered spouse syndrome (Barnett and LaViolette 1993), the behaviour of concentration camp inmates (Chodoff 1966), the Stockholm Syndrome (Kuleshnyk 1984; Powell 1986), and, most importantly, behaviour within prisoner of war camps and Communist Chinese re-education centres and ‘revolutionary universities’ (Lifton 1989; Sargant 1957; Schein 1961). There exist striking homologies in observed responses across all of these types of events, and it is right that our attention be drawn to trying to understand what common theme underlies them all. As Oliver Wendell Holmes (1891: 325) attempted to teach us more than a century ago, the interest of the scientist should be guided, when applicable, by ‘the plain law of homology which declares that like must be compared with like.’

Evidence for Brainwashing in Cults

I have attempted to test the model as much as possible with the limited data that currently exist. I have relied on three sources of evidence. The first and most important of these consists of ethnographic studies of a wide variety of contemporary American charismatic cults conducted by myself and others. The first-hand opportunities I have had to watch (at least the public face of) charismatic resocialization in numerous cult situations have convinced me of the need to theorize about this phenomenon. The second source of data consists of interviews with former leaders of charismatic groups. Although I have only a handful of such interviews, they are particularly valuable for elucidating the process from the perspective of ‘management,’ rather than from the perspective of the subjects. The third source of data consists of reports of ex-members of cults, drawing heavily on scientifically sampled interviews that my students and I have conducted. Most of these respondents were interviewed at least twice over a roughly twenty-five-year period.

Because evidence in this field of study tends to be so bitterly contested, it is perhaps necessary to point out that my own studies in this area were all subject to rigorous and competitive peer review. Five of my studies were reviewed and funded by three organizations – the National Institute of Mental Health (2), the National Science Foundation (2), and the National Institutes of Health (1) – over a period extending from 1964 to 2001. On all of these I was the principal investigator, and the research designs are in the public record. During this same period, other research of mine in this same field of study was funded by peer-reviewed faculty research grants from all of the universities with which I have been affiliated: the University of California at Berkeley, the California Institute of Technology, Columbia University, and Rutgers University. It is a strange anomaly that this body of work seems to be generally respected throughout the social and behavioural sciences, with the exception of a small field, the sociology of new religious movements, where some try their best to hold it up to ridicule and disesteem.

Ethnographic Accounts

Bainbridge (1997) has argued that most ethnographic studies of cults have failed to find evidence of brainwashing. But it is more accurate to say that ethnographers have been divided on this subject. Lalich, Ofshe, Kent, and I have found abundant evidence of it (Kent and Krebs 1998; Lalich 1993; Ofshe, Eisenberg et al. 1974; Zablocki 1980). Even Barker, Beckford, and Richardson, who are among the most hostile to the brainwashing conjecture, have found evidence of attempted brainwashing, although they have claimed that these attempts are largely or entirely unsuccessful (Barker 1984; Beckford 1985; Richardson, Harder et al. 1972). Still other ethnographers (Balch 1985; Rochford, Purvis et al. 1989) seem ambivalent on the subject and not sure what to make of the evidence. Others such as Palmer (1994) and Hall (1987, 2000) have been fairly clear about the absence of brainwashing in their observations.

Such disparity is to be expected. There is no reason to believe that all cults practise brainwashing any more than that all cults are violent or that all cults make their members wear saffron robes. Most ethnographers who did discover evidence of brainwashing in the cults they investigated were surprised by the finding. The fact that evidence of this sort has been repeatedly discovered by researchers who were not particularly looking for it suggests that the process really exists in some cults. I have observed fully developed brainwashing processes in some cults, partially developed ones in others, and none whatsoever in others. As ethnographic work in cults continues to accumulate, we should expect to find a similar degree of heterogeneity in published reports. Certainly, there is abundant evidence of uncritically obedient behaviour in charismatic cults (Ayella 1990; Davis 2000; Katchen 1997; Lalich 1999; Lifton 1999; Wallis 1977), and this behaviour needs to be explained. The presence or absence of brainwashing may ultimately turn out to contribute to such an explanation.

When I first studied the Bruderhof thirty-five years ago, using ethnographic methods, I noticed a strong isomorphism between the phases of Bruderhof resocialization and the phases of brainwashing in Chinese re-education centres described by Lifton. Since I could think of no other reason why the Bruderhof would support such a costly and labour-intensive resocialization program if it were not to create deployable agents with long-term loyalty to the community, I hypothesized that something akin to brainwashing must be going on. My observations over the next thirty-five years have only strengthened my confidence in the correctness of this hypothesis. Bruderhof members were never kept from leaving by force or the threat of force. But the community put a lot of time and energy into ensuring that defections would be rare and difficult, by instilling in its members an uncritical acceptance of the teachings of the community and a terror of life outside the community.(15)

Some (but not all) of the other cultic groups I have lived with as a participant-observer have shown signs of a brainwashing process at work. Individuals plucked suddenly out of the workaday routine of the group, haggard from prolonged lack of sleep, secretive and agitated, subjected to alternating periods of shunning and warm communal embrace: all of this suggests the presence of such a process. Some of these people, years later, having left the cult, have confirmed to me that such a process is what they went through when I observed them under this stress. According to my ethnographic observations, some sort of fully or partially developed brainwashing process figures in the resocialization of at least half of the cults I have studied, during at least some phases of their history.

Leader Accounts

A second source of evidence may be found in reports given by people who were actually responsible for practising brainwashing with their fellow cult members. Several cult leaders who left their groups have since apologized to other ex-members for having subjected them to brainwashing methods. One such former cult leader put it this way:

“What you have to understand is that, for us, breaking the spirit… emptying out the ego, is very very important. And any means to that end … well, we would have said it was justified. And over the years we developed [by trial and error] ways of accomplishing this [task]. It was only after I was finished with [the cult] and living in the world again that I did some reading and realized how similar [our techniques] were to what the Communists did – to brainwashing. I think you would have to say that what we did was a kind of brainwashing even if we didn’t mean it to be so.”

In another case I interviewed the widow of a cult leader who had died and whose cult had disbanded soon thereafter. She said the following:

“Those kinds of things definitely happened [on quite a few occasions]. It’s not like we ever sat down and said, hey we’re going to brainwash everybody. That would have been crazy. It’s more like we knew how important our mission was and how [vulnerable it was] to treachery. I think we got a little paranoid about being overcome by treachery within, especially after Gabe and Helen left and started saying those things about us. So everybody had to be tested. I had to be tested. Even he [the leader] had to be tested. We all knew it and we all [accepted it]. So we would pull a person out of the routine and put him in solitary for awhile. Nobody could talk to him except [my husband] and maybe a few others. I couldn’t even talk to him when I brought him his meals. That was usually my job … At first it was just isolation and observation and having deep long talks far into the night about the mission. We didn’t know anything about brainwashing or any of that stuff. But gradually the things you describe got in there too somehow. Especially the written confessions. I had to write a bunch of them towards the end when [X] was sick. Whatever you wrote was not enough. They always wanted more, and you always felt you were holding out on them. Finally your confessions would get crazy, they’d come from your wildest fantasies of what they might want. At the end I confessed that I was killing [my husband] by tampering with his food because I wanted to – I don’t know – be the leader in his place I guess. All of us knew it was bullshit but somehow it satisfied them when I wrote that … And, even though we knew it was bullshit, going through that changed us. I mean I know it changed me. It burned a bridge … [T]here was no going back. You really did feel you changed into being a different person in a weird sort of way.”

Perhaps the closest thing I have found to a smoking gun in this regard has to do with a sociology professor who became a charismatic cult leader. Two of this cult leader’s top lieutenants independently spoke to me on this subject. Both of these respondents described in great detail how they assisted in concerted campaigns to brainwash fellow cult members. Both felt guilty about this and found the memory painful to recount. One of them indicated that the brainwashing attempt was conscious and deliberate:

“During her years in academia, Baxter became very interested in mass social psychology and group behavior modification. She studied Robert Jay Lifton’s work on thought reform; she studied and admired ‘total’ communities such as Synanon, and directed methods of change, such as Alcoholics Anonymous. She spoke of these techniques as positive ways to change people.” (Lalich 1993: 55)

In this cult, which has since disbanded, there seems to be general consensus among both leaders and followers that systematic brainwashing techniques were used on a regular basis and were successful in their aim of producing deployable agents.

Ex-Member Accounts

Our third source of evidence is the most controversial. There has been a misguided attempt to deny the validity of negative ex-member accounts as a source of data about cults. They have been condemned as ‘atrocity tales’ (Richardson 1998: 172), and Johnson (1998: 118) has dismissed them categorically by alleging that ‘the autobiographical elements of apostate narratives are further shaped by a concern that the targeted religious groups be painted in the worst possible light.’

The apostate role has been defined by Bromley (1997) largely in terms of the content of attitudes towards the former cult. If these attitudes are negative and expressed collectively in solidarity with other negatively disposed ex-members, they constitute evidence that the person must be not an ordinary ex-member but an ‘apostate.’ This is a direct violation of Robert Merton’s (1968) admonition that role sets be defined in terms of shared structural characteristics, not individual attitudes. What if this same logic were used to denigrate abused spouses who choose to be collectively vocal in their complaints? Nevertheless, this perspective on so-called ‘apostate accounts’ has been widely influential among cult scholars.

David Bromley is a sociological theorist of great personal integrity but limited field experience. I think that if Bromley and his followers could just once sit down with a few hundred of these emotionally haunted ex-members whom they blithely label ‘apostates,’ and listen to their stories, and see for themselves how badly most of them would like nothing more than to be able to put the cult experience behind them and get on with their lives, they would be deeply ashamed of the way they have subverted role theory to deny a voice to a whole class of people.

Dawson (1995) has correctly pointed out that there are methodological problems involved in using accounts of any kind as data. We need to be careful not to rely only on ex-member accounts. Triangulation of data sources is essential. But even the reports of professional ethnographers are nothing more than accounts, and thus subject to the same sort of limitations. Ex-member accounts have been shown to have reliability and validity roughly equivalent to the accounts given by current cult members (Zablocki 1996).

Solomon (1981) has provided some empirical support for the argument that those with stormy exits from cults and those with anticult movement affiliations are more likely to allege that they have been brainwashed than those with relatively uneventful exits and no such affiliation. ‘Cult apologists’ have made much of the finding that ex-members affiliated with anticult organizations are more likely to allege brainwashing than those who are not. Their hatred of the anticult movement has blinded them to two important considerations: (1) The causal direction is by no means obvious – it is at least as likely that those who were brainwashed are more likely to seek out anticult organizations as support groups as that false memories of brainwashing are implanted by anticult groups into those ex-members who fall into their clutches; and (2) Although the percentages may be lower, some ex-members who don’t affiliate with anticult groups still allege brainwashing.

Many ex-members of cults find brainwashing the most plausible explanation of their own cult experiences. While some might be deluding themselves to avoid having to take responsibility for their own mistakes, it strains credulity to imagine that all are doing so. Here, just by way of example, are excerpts from interviews done with five ex-members of five different cults. None of these respondents was ever affiliated, even marginally, with an anticult organization:

‘They ask you to betray yourself so gradually that you never notice you’re giving up everything that makes you who you are and letting them fill you up with something they think is better and that they’ve taught you to believe is something better.’

‘What hurts most is that I thought these people were my new friends, my new family. It wasn’t until after that I realized how I was manipulated little step by little step. Just like in Lifton; it’s really amazing when you think of it … couldn’t just be a coincidence … I don’t know if you can understand it, but what hurts most is not that they did it but realizing that they planned it out so carefully from the beginning. That was so cold.’

‘I’ve never been able to explain it to people who weren’t there. I don’t really understand it myself. But black was white, night was day, whatever they told us to believe, it was like a test. The more outrageous the idea the greater the victory, when I could wrap my mind around it and really believe it down to my toes. And, most important, be prepared to act on it just like if it was proven fact. That’s the really scary part when I look back on it.’

‘In the frame of mind I was in [at the time], I welcomed the brainwashing. I thought of it like a purge. I needed to purge my old ways, my old self. I hated it and I felt really violent toward it … I wanted to wash it all away and make myself an empty vehicle for [the guru’s] divine plan … [Our] ideal was to be unthinking obedient foot soldiers in God’s holy army.’

Many wax particularly eloquent on this subject when interviewed in the aftermath of media events involving cultic mass suicides or murders. The fifth respondent said the following:

‘It makes me shudder and … thank God that I got out when I did. ‘Cause that could have been me doing that, could have been any of us. [I have] no doubt any one of us would have done that in the condition we all were in – killed ourselves, our kids, any that [the leaders] named enemies.’

I have quoted just five ex-members because of limitations of space. Many more could be found. Thousands of ex-members of various groups (only a small minority of whom have ever been interviewed by me) have complained of being brainwashed. Contrary to the allegations of some ‘cult apologists,’ very few of these are people who had been deprogrammed (and presumably brainwashed into believing that they had been brainwashed). The accounts of these people often agree on the particulars of what happened to them, even though they may never have talked with one another.

Another striking aspect of these brainwashing accounts by ex-members is that they are held to consistently for many years. I have interviewed many ex-cult members twenty to thirty years after leaving the cult, and have yet to encounter a single case in which a person who alleged brainwashing immediately after leaving later recanted and said it wasn’t true after all. More than anything else, this consistency over extended periods of time convinces me that ex-member accounts often may be relied on. Even if some of the details have been forgotten or exaggerated with the passage of time, the basic outline of what happened to them is probably pretty accurate. All in all, therefore, I think it is fair to conclude, both from accumulated ethnographic and ex-member data, that brainwashing happens to at least some people in some cults.

Incidence and Consequences

Finally, we come to the aspect of brainwashing theory for which our data are sketchiest, the one most in need of further research. How often does brainwashing actually occur (incidence)(16) and how significant are its consequences?

Defining what we mean by incidence is far from a simple matter. In the reporting of brainwashing there are numerous false positives and false negatives, and no consensus as to whether these errors lead to net underestimation or net overestimation. Several factors can produce false positives. Unless the term is precisely defined to respondents, some answers will reflect folk definitions of the term. It might mean little more to them than that they believe they were not treated nicely by their former cults. Other respondents may share our definition of the term, but answer falsely out of a desire to lay claim to the victim role or out of anger towards the cult. False negatives also can occur for several reasons. Most significantly, current members (as well as ex-members who still sympathize with the cult) may deny brainwashing to protect the cult. Others may understand the term differently than do the interviewers, and still others may be embarrassed to admit that they had been brainwashed. These errors can be minimized but hardly eliminated by in-depth interviewing in which respondents are asked not merely to label but to describe the process they went through.

There is insufficient space in this chapter to discuss these important methodological issues. I will therefore merely state the criteria upon which I base my own measurement. I treat incidence as a ratio of X to Y. In Y are included all those who were fully committed members of a cult for a year or more, but who are currently no longer affiliated with any cult.(17) In X are included those members of the Y set who both claim to have been brainwashed and who are able to give evidence of the particulars of their own brainwashing experience (at least through phase 2) consistent with those discussed in the previous section of this chapter.
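As a sketch, the ratio just defined might be computed from coded interview records along the following lines. The field names are hypothetical stand-ins for the coding criteria described above, not variables from an actual data set.

    # Illustrative computation of the incidence ratio X/Y defined above.
    def incidence(respondents):
        # Y: ex-members with at least a year of full commitment, currently
        # unaffiliated with any cult.
        Y = [r for r in respondents
             if r['years_committed'] >= 1 and not r['currently_affiliated']]
        # X: members of Y who claim brainwashing and can evidence the process
        # at least through phase 2 (identification).
        X = [r for r in Y
             if r['claims_brainwashing'] and r['evidenced_phase'] >= 2]
        return len(X) / len(Y) if Y else float('nan')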

In the handful of systematic studies that have been done, estimates of brainwashing incidence seem to cluster around 10 per cent (plus or minus 5 per cent) of former cult members (Katchen 1997; Wright 1987; Zablocki, Hostetler et al. in press). However, there is tremendous variation in the estimates given by people working in this field. Ignoring those scholars who deny that brainwashing is ever attempted or ever successful, I have heard ethnographers offer anecdotal estimates ranging from under 0.1 per cent to 80 per cent.

Stuart Wright’s (1987) data on voluntarily exiting ex-members indicate that 9 per cent say they had been brainwashed. This study is noteworthy because it examined ex-members of a variety of different cults rather than just one. It relied, however, on each respondent’s own definition of what it meant to be brainwashed.

My national longitudinal study (Zablocki 1980) relied primarily on a two-stage sampling procedure in which geographical regions were first selected and groups then sampled within these regions. I have followed 404 cases, most of them surveyed at least twice over intervals extending up to twenty-five years. Of those who were interviewed, 11 per cent meet the criteria for having been brainwashed discussed above. Interestingly, all those in my sample who claim to have been brainwashed stick to their claims even after many years have passed. My own study is the only one that I know of that has repeatedly interviewed members and former members over several decades.
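For readers unfamiliar with the design, a two-stage sample of this kind can be sketched as follows; the sampling frame and the stage sizes are invented placeholders, not the actual parameters of the 1980 study.

    import random

    # Illustrative two-stage (cluster) sampling: draw geographical regions
    # first, then draw groups within each sampled region.
    def two_stage_sample(groups_by_region, n_regions, n_groups_per_region,
                         seed=0):
        rng = random.Random(seed)
        regions = rng.sample(sorted(groups_by_region),
                             min(n_regions, len(groups_by_region)))  # stage 1
        sample = []
        for region in regions:                                       # stage 2
            groups = groups_by_region[region]
            sample.extend(rng.sample(groups,
                                     min(n_groups_per_region, len(groups))))
        return sample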

Another issue is whether overall incidence among the ex-member population is the most meaningful statistic to strive for given the heterogeneity among cults and types of cult member. Cults vary in the proportion of their members they attempt to brainwash from 0 per cent to 100 per cent. Since brainwashing significantly increases exit costs (according to hypothesis 8), it follows that examples of brainwashed individuals will be somewhat over-represented among current cult members and somewhat under-represented among ex-members.

The incidence, among ex-members, is higher (24 per cent in my sample) when the relevant population is confined to a cult’s ‘inner circle,’ the core membership surrounding the leader. In an important and neglected article, Wexler (1995) makes the point that it is simplistic to think of a cult as comprising only a leader and a homogeneous mass of followers. Most cults have a third category of membership, a corps of lieutenants surrounding the leader, which Wexler refers to as a ‘decision elite.’ It follows from the hypotheses discussed earlier that we should expect attempts to brainwash to be concentrated among members in this category.

One study suggests that incidence is also higher among adults who grew up in cults (Katchen 1997). My own ethnographic observation supports this last point, and further suggests that cults under extreme stress become more likely to engage in brainwashing or to extend already existing brainwashing programs to a much wider circle of members.

With regard to consequences, we must distinguish between obedience consequences and traumatic consequences. Uncritical obedience is extinguished rapidly, certainly within a year of exiting if not sooner. The popular idea that former cult members can be programmed to carry obedience compulsions for specific acts to be performed long after membership in the cult has ceased is, in my opinion, wholly a myth based largely on a movie, The Manchurian Candidate. I know of nobody who has ever seen even a single successful instance of such programming. However, many brainwashed ex-members report that they would not feel safe visiting the cult, fearing that old habits of obedience might quickly be reinstilled.

There is evidence, in my data set, of persistent post-traumatic effects. The majority of those who claim to have been brainwashed say that they never fully get over the psychosocial insult, although its impact on their lives diminishes over time. The ability to form significant bonds with others takes a long time to heal, and about a third wind up (as much as a quarter of a century later) living alone with few significant social ties. This is more than double the proportion of controls (cult participants who appeared not to have been brainwashed) that are socially isolated twenty-five years later. Visible effects also linger in the ability to form new belief commitments. In about half there is no new commitment to a belief community after two years. By twenty-five years, this has improved, although close to twenty-five per cent still have formed no such commitment. Occupationally, they tend to do somewhat better, but often not until having been separated from the cult for five to ten years.

Conclusions

We can conclude from all of the above that those who claim that cultic brainwashing does not exist and those who claim it is pandemic to cults are both wrong. Brainwashing is an administratively costly and not always effective procedure that some cults use on some of their members. A few cults rely heavily on brainwashing and put all their members through it. Other cults do not use the procedure at all. During periods of stressful confrontation, either with external enemies or among internal factions, or in attempts to cope with failed apocalyptic prophecies, it is not uncommon for brainwashing suddenly to come to play a central role in the cult’s attempts to achieve order and social control. At such times, risk of uncritically obedient violent aggression or mass suicide may be heightened.

Hopefully, it will be clear from this chapter that brainwashing has absolutely nothing to do with the overthrow of ‘free will’ or any other such mystical or nonscientific concept. People who have been brainwashed are ‘not free’ only in the sense that all of us, hemmed in on all sides as we are by social and cultural constraints, are not free. The kinds of social constraints involved in brainwashing are much more intense than those involved in socializing many of us to eat with knives and forks rather than with our hands. But the constraints involved differ only in magnitude and focus, not in kind. Any brainwashed cult member always retains the ability to leave the cult or defy the cult as long as he or she is willing to pay the mental and emotional price (which may be considerable) that the cult is able to exact for so doing.

As I finish this chapter, a number of European nations are debating the advisability of anti-brainwashing laws, some of which eventually may be used to inhibit freedom of religious expression. In light of this trend a number of colleagues have criticized me, not on the grounds that my facts are incorrect, but that my timing is unfortunate. One socked me with the following, particularly troubling, complaint: ‘Ben, if you had discovered evidence, in 1942, of a higher prevalence among Jews than among non-Jews of the Tay-Sachs genetic defect, would you have published your findings in a German biology journal?’ Ultimately, although I respect the sentiments behind my colleagues’ concerns, I must respectfully disagree with their fastidious caution. It never works to refuse to look at frightening facts. They only become larger, more frightening, and more mystically permeated when banished to one’s peripheral vision. A direct, honest acknowledgment of the limited but significant role that brainwashing plays in producing uncritical obedience in some cults will serve, in the long run, to lessen paranoid reactions to ‘the threat of the cults,’ rather than to increase them.

Notes

1 Most of the examples in this chapter will be drawn from studies of religious cults because these are the ones with which I am most familiar through my research. But it should be noted that cults need not be religious, and that there are plenty of examples of brainwashing in political and psychotherapeutic cults as well.

2 When I speak of ego dystonic behaviour, I refer to behaviour that was ego dystonic to the person before joining the cult and after leaving the cult.

3 I have no doubt that Introvigne, who is a European attorney, is sincere in his desire to stifle brainwashing research out of a fear that any suggestion that brainwashing might possibly occur in cults will be seized on by semiauthoritarian government committees eager to suppress religious liberty. Personally, I applaud Introvigne’s efforts to protect the fragile tree of religious freedom of choice in the newly emerging democracies of Eastern Europe. But I don’t appreciate his doing so by (perhaps inadvertently) sticking his thumb on the scales upon which social scientists attempt to weigh evidence.

4 The Anthony and Robbins article cited demonstrates how little we really know about traits that may predispose people to join cults. They say: ‘… some traditionally conservative religious groups attract people who score highly on various measures of totalitarianism, e.g., the F scale or Rokeach’s Dogmatism scale … It seems likely that these results upon certain Christian groups would generalize to alternative religious movements or cults, as many of them have theological and social beliefs that seem similar to those in some fundamentalist denominations’ (1994: 470). Perhaps, but perhaps not. No consensus has yet emerged from numerous attempts to find a cult personality type, but this seems a promising area for continued research.

5 Some, it is true, were nominally volunteers into re-education programs. However, the power of the state to make their lives miserable if they did not volunteer cannot be ignored.

6 Unfortunately, however, uncritical obedience can be wayward and dangerous. It can be useful to a cult leader when the cult is functioning well. But it often has been perverted to serve a destructive or self-destructive agenda in cults that have begun to disintegrate.

7 Some confusion on this subject has emerged from the fact that Lifton has distanced himself from those attempting to litigate against cults because of alleged brainwashing. He has consistently argued (and I wholeheartedly agree) that brainwashing, in and of itself, where no force is involved, should not be a matter for the law courts.

8 Formal definitions for this and other technical terms will be presented in the next section of this chapter.

9 In other words, the probability of a person’s leaving is inversely dependent upon the amount of time he or she has already spent as a member.

10 The ‘cult-basher’ version of brainwashing theory has played into this misunderstanding by confounding manipulative recruitment techniques (like sleep deprivation and ‘love-bombing’) with actual brainwashing. While there may be some overlap in the actual techniques used, the former is a method for obtaining new members, whereas brainwashing is a method for retaining old members.

11 Because of space limitations, I cannot give this important subject the attention it deserves in this chapter. Readers not familiar with the concept are referred to the much fuller discussion of this subject in the book by Robert Lifton as cited.

12 Students of cults have sometimes been misled into confusing this state of hyper-credulity with either hyper-suggestibility on the one hand or a rigid ‘true belief’ system on the other. But at least one study has shown that neither the hyper-suggestible, easily hypnotized person nor the structural true believer is a good candidate for encapsulation in a totalist cult system (Solomon 1981: 111-12). True believers (often fundamentalists who see in the cult a purer manifestation of their own world view than they have seen before) do not do well in cults, and neither do dyed-in-the-wool sceptics who are comfortable with their scepticism. Rather, it is those lacking convictions but hungering for them who are the best candidates.

13 Hopefully, no reader will think that I am affirming the consequent by stating that all experiences of spiritual rebirth must be caused by brainwashing. This model is completely compatible with the assumption that most spiritual rebirth experiences have nothing to do with brainwashing. The reasoning here is identical to that connecting epilepsy with visions of the holy. The empirical finding that epileptic seizures can be accompanied by visions of the holy does not in any way imply that such visions are always a sign of epilepsy.

14 The theory outlined here is basically a qualitative one, although it does call for the measurement of two quantities, C and S. However, it will frequently be sufficient if these two quantities can be measured at just an ordinal level, and indeed that is generally all that will be possible in most circumstances.

15 Bruderhof members, particularly those in responsible positions, are never fully trusted until they have gone through the ordeal of having been put into the great exclusion (being sent away) and then spiritually fought their way back to the community. Such exclusion serves as the ultimate test of deployability. Is the conversion deep enough to hold even when away from daily reinforcement by participation in community life? The degree to which the Bruderhof stresses the importance of this ideal serves as additional evidence that the creation of deployable agents is a major aim of the socialization process.

16 A related question is what portion of those a cult attempts to brainwash actually get brainwashed. No data have been collected on this issue to the best of my knowledge.

17 I do not distinguish between voluntary and involuntary modes of exit in my measure because my sample includes only an insignificant number (less than one-half of one per cent) who were deprogrammed out of their cults.

References

Abelson, R. 1986. ‘Beliefs are Like Possessions.’ Journal for the Theory of Social Behaviour 16: 223-50.

American Psychiatric Association. 1994. Diagnostic and Statistical Manual of Mental Disorders 4th ed. Washington D.C.: American Psychiatric Association.

Anthony, D. 1996. ‘Brainwashing and Totalitarian Influence: An Exploration of Admissibility Criteria For Testimony in Brainwashing Trials.’ Berkeley, Calif.: Graduate Theological Union.

Anthony, D., and T. Robbins. 1994. ‘Brainwashing and Totalitarian Influence.’ In Encyclopedia of Human Behavior. Vol. 1. New York: Academic Press.

– 1995. ‘Religious Totalism, Violence, and Exemplary Dualism: Beyond the Extrinsic Model.’ Terrorism and Political Violence 7 (3): 10-50.

Asch, S. 1951. ‘Effects of Group Pressure upon the Modification and Distortion of Judgments.’ In Groups, Leadership, and Men, edited by H. Guetzkow. Pittsburgh: Carnegie.

Ayella, M. 1990. ‘“They Must Be Crazy”: Some of the Difficulties in Researching “Cults.”’ American Behavioral Scientist 33 (5): 562-77.

Bainbridge, W.S. 1997. The Sociology of Religious Movements. New York: Routledge.

Balch, R.W. 1985. ‘“When the Light Goes Out, Darkness Comes”: A Study of Defection from a Totalistic Cult.’ In Religious Movements: Genesis, Exodus, and Numbers, edited by R. Stark. New York: Paragon.

Barker, E. 1984. The Making of a Moonie: Choice or Brainwashing? Oxford: Basil Blackwell.

– 1989. New Religious Movements: A Practical Introduction. London: Her Majesty’s Stationery Office.

Barnett, O.W., and A.D. LaViolette. 1993. It Could Happen to Anyone: Why Battered Women Stay. Newbury Park, Calif.: Sage.

Beckford, J.A. 1985. Cult Controversies: The Societal Response to the New Religious Movements. London: Tavistock.

Bion, W.R. 1959. Experiences in Groups. New York: Basic Books.

Bowen, M. 1972. ‘Toward the Differentiation of a Self in One’s Own Family.’ In Family Interaction, edited by J. Framo. New York: Springer.

Bowlby, J. 1969. Attachment and Loss. Vol. 1, Attachment. New York: Basic Books.

Bromley, D. 1997. ‘The Social Construction of Religious Apostasy.’ In The Politics of Religious Apostasy, edited by D.G. Bromley. New York: Praeger.

– 1998. ‘Listing (in Black and White) Some Observations on (Sociological) Thought Reform.’ Nova Religio 1: 250-66.

Bugliosi, V., and C. Gentry. 1974. Helter Skelter: The True Story of the Manson Murders. New York: Norton.

Carter, L.F. 1990. Charisma and Control in Rajneeshpuram: The Role of Shared Values in the Creation of a Community. New York: Cambridge University Press.

Chodoff, P. 1966. ‘Effects of Extreme Coercive and Oppressive Forces: Brainwashing and Concentration Camps.’ In American Handbook of Psychiatry. Vol. 3, edited by S. Arieti, 384-405. New York: Basic Books.

Cialdini, R. 1993. Influence: The Psychology of Persuasion. New York: William Morrow.

Cohen, I. 1989. Structuration Theory. New York: St Martin’s.

Cohen, S. 1972. Folk Devils and Moral Panics. Oxford: Basil Blackwell.

Coleman, J.S. 1990. Foundations of Social Theory. Cambridge, Mass.: Harvard University Press.

Coleridge, S.T. 1970. Biographia Literaria. New York: Random House.

Conway, F., and J. Siegelman. 1978. Snapping: America’s Epidemic of Sudden Personality Change. Philadelphia: Lippincott.

Davis, W. 2000. ‘Heaven’s Gate: A Study of Religious Obedience.’ Nova Religio 3: 241-67.

Dawson, L. 1995. ‘Accounting for Accounts: How Should Sociologists Treat Conversion Stories.’ International Journal of Comparative Religion and Philosophy 1: 51-68.

– 1998. Comprehending Cults: The Sociology of New Religious Movements. New York: Oxford University Press.

Farrell, K. 1998. Post-traumatic Culture. Baltimore: Johns Hopkins University Press.

Festinger, L., A. Pepitone, et al. 1952. ‘Some Consequences of Deindividuation in a Group.’ Journal of Abnormal and Social Psychology 47: 382-9.

Festinger, L., H. Riecken, et al. 1956. When Prophecy Fails. Minneapolis: University of Minnesota Press.

Galanter, M. 1989. Cults: Faith, Healing, and Coercion. New York: Oxford University Press.

Galanti, G.-A. 1993. ‘Reflections on “Brainwashing.”’ In Recovery From Cults: Help for Victims of Psychological and Spiritual Abuse, edited by M.D. Langone. New York: Norton.

Geertz, C. 1973. The Interpretation of Cultures. New York: Basic Books.

Goffman, E. 1961. Asylums. Garden City, N.Y.: Anchor Books.

Goode, E., and N. Ben Yehudah. 1994. Moral Panics. Cambridge, Mass.: Blackwell.

Hall, J.R. 1987. Gone from the Promised Land. New Brunswick, N.J.: Transaction.

– 2000. Apocalypse Observed: Religious Movements and Violence in North America, Europe, and Japan. New York: Routledge.

Hechter, M. 1987. Principles of Group Solidarity. Berkeley: University of California Press.

Herman, J. 1997. Trauma and Recovery: The Aftermath of Violence – From Domestic Abuse to Political Terror. New York: Basic Books.

Hexham, I., and K. Poewe. 1997. New Religions as Global Cultures: Making the Human Sacred. Boulder, Colo.: Westview.

Holmes, O.W. 1891. Crime. Boston: Houghton Mifflin.

Hong, N. 1998. In the Shadow of the Moons: My Life in the Reverend Sun Myung Moon’s Family. Boston: Little, Brown.

Iannaccone, L.R. 1992. ‘Sacrifice and Stigma: Reducing Free-Riding in Cults, Communes, and Other Collectives.’ Journal of Political Economy 100: 271-91.

Introvigne, M. 1998. ‘Liar, Liar: Brainwashing, CESNUR, and APA,’ Center for Studies on New Religions: http://www.cesnur.org/testi/se_brainwash.htm

– Forthcoming. ‘“Brainwashing”: Career of a Myth in the United States and Europe.’ In The Brainwashing Controversy: An Anthology of Essential Documents, edited by J.G. Melton and M. Introvigne. Stanford, Calif.: Center for Academic Publication.

Janis, I. 1982. Groupthink: Psychological Studies of Policy Decisions and Fiascos. Boston: Houghton Mifflin.

Johnson, D.C. 1998. ‘Apostates Who Never Were: The Social Construction of Absque Facto Apostate Narratives.’ In The Politics of Apostasy: The Role of Apostates in the Transformation of Religious Movements, edited by D.G. Bromley. Westport, Conn.: Praeger.

Katchen, M.H. 1997. ‘The Rate of Dissociation and Dissociative Disorders in Former Members of High Demand Religious Movements.’ PhD diss., Department of Sociology, Sydney University, Sydney, Australia.

Kelman, H.C., and V.L. Hamilton. 1989. Crimes of Obedience: Toward a Social Psychology of Authority and Responsibility. New Haven: Yale University Press.

Kent, S.A., and T. Krebs. 1998. ‘Academic Compromise in the Social Scientific Study of Alternative Religions.’ Nova Religio 2: 44-54.

King, G., R.O. Keohane, et al. 1994. Designing Social Inquiry: Scientific Inference in Qualitative Research. Princeton: Princeton University Press.

Kuleshnyk, I. 1984. ‘The Stockholm Syndrome: Toward an Understanding.’ Social Action and the Law 10: 37-42.

Lalich, J. 1993. ‘A Little Carrot and a Lot of Stick: A Case Example.’ In Recovery From Cults: Help for Victims of Psychological and Spiritual Abuse, edited by M.D. Langone. New York: Norton.

– 1999. ‘Bounded Choice: The Fusion of Personal Freedom and Self-Renunciation in Two Transcendent Groups.’ PhD diss., Human and Organizational Systems, Fielding Institute, Santa Barbara, Calif.

Layton, D. 1998. Seductive Poison: A Jonestown Survivor’s Story of Life and Death in the Peoples Temple. New York: Doubleday.

Lewis, J.R. 1998. Cults in America. Santa Barbara, Calif.: ABC-CLIO.

Lewis, J.R., and D.G. Bromley. 1987. ‘The Cult Withdrawal Syndrome: A Case of Misattribution of Cause?’ Journal for the Scientific Study of Religion 26: 508-22.

Lifton, R.J. 1968. ‘Protean Man.’ Partisan Review 35: 13-27.

– 1989. Thought Reform and the Psychology of Totalism. Chapel Hill: University of North Carolina Press.

– 1995. Foreword to Cults in Our Midst, by M. Singer. San Francisco: Jossey-Bass.

– 1997. ‘Reflections on Aum Shinrikyo.’ In The Year 2000: Essays on the End, edited by C.B. Strozier and M. Flynn. New York: New York University Press.

– 1999. Destroying the World to Save It: Aum Shinrikyo, Apocalyptic Violence, and the New Global Terrorism. New York: Henry Holt.

Lindholm, C. 1990. Charisma. Cambridge: Basil Blackwell.

Lofland, J. 1966. Doomsday Cult: A Study of Conversion, Proselytization and Maintenance of Faith. Englewood Cliffs, N.J.: Prentice Hall.

Loftus, E., and K. Ketcham. 1994. The Myth of Repressed Memory: False Memories and Allegations of Sexual Abuse. New York: St Martin’s.

Lucas, P.C. 1995. The Odyssey of a New Religion: The Holy Order of MANS from New Age to Orthodoxy. Bloomington, Ind.: Indiana University Press.

Melton, J.G. Forthcoming. ‘Brainwashing and the Cults: The Rise and Fall of a Theory.’ In The Brainwashing Controversy: An Anthology of Essential Documents, edited by J.G. Melton and M. Introvigne. Stanford, Calif.: Center for Academic Publication.

Merton, R.K. 1968. Social Theory and Social Structure. New York: Free Press.

Milgram, S. 1975. Obedience to Authority. New York: Harper & Row.

Miller, J. 2000. ‘South Asia Called Major Terror Hub in a Survey by U.S.’ New York Times, 30 April, 1.

Oakes, L. 1997. Prophetic Charisma: The Psychology of Revolutionary Religious Personalities. Syracuse: Syracuse University Press.

Ofshe, R. 1976. ‘Synanon: The People Business.’ In The New Religious Consciousness, edited by C. Glock and R. Bellah, 116-37. Berkeley: University of California Press.

– 1992. ‘Coercive Persuasion and Attitude Change.’ In The Encyclopedia of Sociology, edited by E. Borgatta and M. Borgatta. New York: Macmillan.

Ofshe, R., N. Eisenberg, et al. 1974. ‘Social Structure and Social Control in Synanon.’ Journal of Voluntary Action Research 3: 67-76.

O’Leary, S.D. 1994. Arguing the Apocalypse: A Theory of Millennial Rhetoric. New York: Oxford University Press.

Palmer, S.J. 1994. Moon Sisters, Krishna Mothers, Rajneesh Lovers: Women’s Roles in New Religions. Syracuse: Syracuse University Press.

Petty, R. 1994. ‘Two Routes to Persuasion: State of the Art.’ In International Perspectives on Psychological Science, edited by G. d’Ydewalle, P. Eelen, and P. Bertelson. Hillsdale, N.J.: Erlbaum.

Petty, R.E., and D.T. Wegener 1998. ‘Attitude Change: Multiple Roles For Persuasion Variables.’ In The Handbook of Social Psychology. Vol. 1, edited by D.T. Gilbert, S.T. Fiske, and G. Lindzey, 323-90. New York: McGraw-Hill.

Popper, K.R. 1968. The Logic of Scientific Discovery. New York: Harper & Row.

Powell, J.O. 1986. ‘Notes on the Stockholm Syndrome.’ Studies in Symbolic Interaction 7: 353-65.

Rauch, S.L., B. van der Kolk, et al. 1996. ‘A Symptom Provocation Study of Post-traumatic Stress Disorder Using Positron Emission Tomography and Script-Driven Imagery.’ Archives of General Psychiatry 53: 380-7.

Richardson, J.T. 1993. ‘A Social Psychological Critique of “Brainwashing” Claims about Recruitment to New Religions.’ Pt. B of Religion and the Social Order: The Handbook on Cults and Sects in America, edited by D.G. Bromley and J.K. Hadden. Greenwich, Conn.: JAI.

Richardson, J.T. 1998. ‘Apostates, Whistleblowers, Law, and Social Control.’ In The Politics of Apostasy: The Role of Apostates in the Transformation of Religious Movements, edited by D.G. Bromley. Westport, Conn.: Praeger.

Richardson, J.T., and G. Ginsburg. 1998. ‘A Critique of “Brainwashing” Evidence in Light of Daubert: Science and Unpopular Religions.’ In Law and Science: Current Legal Issues. Vol. 1, edited by H. Reece, 265-88. New York: Oxford University Press.

Richardson, J.T., and B. Kilbourne. 1983. ‘Classical and Contemporary Applications of Brainwashing Models: A Comparison and Critique.’ In The Brainwashing/Deprogramming Controversy: Sociological, Psychological, Legal, and Historical Perspectives, edited by D.G. Bromley and J.T. Richardson. New York: Edwin Mellen.

Richardson, J.T., M. Harder, et al. 1972. ‘Thought Reform and the Jesus Movement.’ Youth and Society 4: 185-202.

Robbins, T. 1988. Cults, Converts, and Charisma: The Sociology of New Religious Movements. Beverly Hills: Sage.

Rochford, E.B., Jr., S. Purvis, et al. 1989. ‘New Religions, Mental Health, and Social Control.’ In Research in the Social Scientific Study of Religion. Vol. 1, edited by M.L. Lynn and D.O. Moberg, 57-82. Greenwich, Conn.: JAI.

– 1998. ‘Child Abuse in the Hare Krishna Movement: 1971-1986.’ ISKCON Communications Journal 6: 43-69.

Rousseau, J. 1913. The Social Contract and Discourses. London: J.M. Dent.

Saliba, J.A. 1993. ‘The New Religions and Mental Health.’ Pt. B of Religion and the Social Order: The Handbook on Cults and Sects in America, edited by D.G. Bromley and J.K. Hadden. Greenwich, Conn.: JAI.

Sargant, W. 1957. Battle for the Mind: A Physiology of Conversion and Brainwashing. Westport, Conn.: Greenwood.

Schein, E. 1959. ‘Brainwashing and Totalitarianization in Modern Society.’ World Politics 2: 430-41.

– 1961. Coercive Persuasion. New York: Norton.

Selznick, P. 1960. The Organizational Weapon. Glencoe, Ill.: Free Press.

Singer, M. 1995. Cults in Our Midst. San Francisco: Jossey-Bass.

Singer, M., and R. Ofshe. 1990. ‘Thought Reform Programs and the Production of Psychiatric Casualties.’ Psychiatric Annals 20: 188-93.

Sirkin, M., and L. Wynne. 1990. ‘Cult Involvement as a Relational Disorder.’ Psychiatric Annals 20: 199-203.

Snyder, M. 1974. ‘The Self-Monitoring of Expressive Behavior.’ Journal of Personality and Social Psychology 30: 526-37.

Solomon, T. 1981. ‘Integrating the “Moonie” Experience: A Survey of Ex-Members of the Unification Church.’ In In Gods We Trust, edited by T. Robbins and D. Anthony. New Brunswick, N.J.: Transaction.

Stark, R. 1999. ‘Micro Foundations of Religion: A Revised Theory.’ Sociological Theory 17: 264-89.

Thurber, J. 1957. The Wonderful O. New York: Simon & Schuster.

Turner, V.W. 1969. The Ritual Process. Harmondsworth, England: Penguin Press.

Ungerleider, J.T., and D.K. Wellisch. 1983. ‘The Programming (Brainwashing)/Deprogramming Religious Controversy.’ In The Brainwashing/Deprogramming Controversy: Sociological, Psychological, Legal and Historical Perspectives. Vol. 5, edited by D.G. Bromley and J.T. Richardson, 205-11. New York: Edwin Mellen.

van der Kolk, B.A., and A.C. McFarlane. 1996. ‘The Black Hole of Trauma.’ In Traumatic Stress: The Effects of Overwhelming Experience on Mind, Body, and Society, edited by B.A. van der Kolk, A.C. McFarlane, and L. Weisaeth. New York: Guilford.

Wallis, R. 1977. The Road to Total Freedom: A Sociological Analysis of Scientology. New York: Columbia University Press.

Weber, M. 1947. The Theory of Social and Economic Organization. New York: Free Press.

Weightman, J.M. 1983. Making Sense of the Jonestown Suicides. New York: Edwin Mellen.

Wexler, M. 1995. ‘Expanding the Groupthink Explanation to the Study of Contemporary Cults.’ Cultic Studies Journal 12: 49-71.

Williams, M. 1998. Heaven’s Harlots: My Fifteen Years as a Sacred Prostitute in the Children of God Cult. New York: William Morrow.

Wright, S. 1987. Leaving Cults: The Dynamics of Defection. Washington, D.C.: Society for the Scientific Study of Religion.

– 1998. ‘Exploring Factors That Shape the Apostate Role.’ In The Politics of Apostasy: The Role of Apostates in the Transformation of Religious Movements, edited by D.G. Bromley. Westport, Conn.: Praeger.

Zablocki, B.D. 1971. The Joyful Community. Chicago: University of Chicago Press.

– 1980. Alienation and Charisma: A Study of Contemporary American Communes. New York: Free Press.

– 1996. ‘Reliability and Validity of Apostate Accounts in the Study of Religious Communities.’ Paper presented at the annual meeting of the Association for the Sociology of Religion, New York.

– 1997. ‘The Blacklisting of a Concept: The Strange History of the Brainwashing Conjecture in the Sociology of Religion.’ Nova Religio 1: 96-121.

– 1998. ‘Exit Cost Analysis: A New Approach to the Scientific Study of Brainwashing.’ Nova Religio 1: 216-49.

– 1998. ‘Reply to Bromley.’ Nova Religio 1: 267-71.

– 1999. ‘Hyper Compliance in Charismatic Groups.’ In Mind, Brain, and Society: Toward a Neurosociology of Emotion, edited by D. Franks and T. Smith. Stamford, Conn.: JAI.

Zablocki, B.D., A. Aidala, et al. 1991. ‘Marijuana Use, Introspectiveness, and Mental Health.’ Journal of Health and Social Behavior 32: 65-89.

Zablocki, B.D., J. Hostetler, et al. In press. Religious Totalism. Syracuse: Syracuse University Press.

Zimbardo, P. 1973. ‘On the Ethics of Investigation in Human Psychological Research: With Special Reference to the Stanford Prison Experiment.’ Cognition 2: 243-56.

Zimbardo, P., and S. Anderson. 1993. ‘Understanding Mind Control: Exotic and Mundane Mental Manipulations.’ In Recovery From Cults: Help for Victims of Psychological and Spiritual Abuse, edited by M.D. Langone. New York: Norton.