Lita Linzer Schwartz, Ph.D.
Pennsylvania State University
This article examines the use of persuasion and control techniques in various historical periods. Parallels are drawn between religious and political conversion campaigns, and between historical and modern cases. The importance of examining the real purposes behind such control practices, especially in today’s cults, is emphasized.
The sociologist Thomas Robbins, in his book Cults, Converts and Charisma (1988), introduces his discussion of conversion techniques by stating:
Persons who actually “live” their religion in a thorough and totalistic manner, particularly when marginal exotic groups and deviant perspectives are involved, are perceived as having undergone an unnatural metamorphosis. This perception engenders a compelling inquiry: how did they get this way? (p. 63)
We might begin by asking the same question.
Conversion incorporates a number of elements, the most notable of which are proselytization and commitment. Persuasion operates chiefly in the proselytization phase, while control is exercised in both stages by cults and other groups. These are not new techniques.
Historical Uses of Persuasive Techniques
If we look back at the Inquisition, the persuasive technique used was blunt: “Convert or be burned!” Today’s cults are much more subtle, at least initially. As far as control is concerned, totalitarian governments long anticipated Orwell’s 1984 by having family members spy upon one another and neighbors denounce neighbors for heretical or nonconformist thoughts, speech, and behavior. Instances of similar control can be found among the Jews of first-century Palestine (Hankoff, 1983) and in the rigidity of some European communities between then and the Reformation. In American history, too, among the Puritans one conformed or was banished from colonial New England (Pattison & Ness, 1989). Even today, the Amish “shun” nonconformists, thereby maintaining the sect’s traditional beliefs and practices.
Persuasion can be a long-term effort, as it was with Theodore Ratisbonne in the early 19th century, when Abbé Bautain and others tried to convince Theodore of the correctness of their philosophy and theology. Beginning in 1823, when Theodore took a course in philosophy with Bautain, the young man was gradually convinced that “Christian dogmas are the development, the application, the accomplishment of the announced truths of Judaism” (Ratisbonne, 1904, pp. 60-61). Some four years later he was converted to Catholicism and secretly baptized (Isser & Schwartz, 1988). Apart from the emotional problems that had led him to a search for the meaning of life and for “truth,” this kind of step-by-step, low-key persuasion typically leads to a deeper and more lasting commitment to the newly found answers. It is an approach used effectively by both the Church of Latter-Day Saints (Stark & Bainbridge, 1980) and the Jehovah’s Witnesses — both regarded as cults in their formative periods, but both of which have since attained the status of sects. Acceptance by the members of a new group, and success as a functionary of the group — be it religious or political — reinforce the commitment.
This type of effort, however, is far too slow for many proselytizers and missionaries. As a result, the emphasis shifts to isolation from the familiar and, quite literally, from the family, either of which might dissuade the prospective convert from the desired change. (An aside: In August 1990, there were items in the newspaper reporting that Boy George, who had been heavily involved in drugs, had become a member of the Hare Krishna movement, much to his family’s relief. In the early years of the modern cult movements, many families felt this way...until their children were cut off from them.) In our study of a number of cases involving children and adolescents in the 19th and 20th centuries, Natalie Isser and I found that the youngsters were kept from their families in convents and seminaries and soon succumbed to the pressures applied by the adults in these settings to convert (Isser & Schwartz, 1988). They were, for the most part, afraid of the unknown, afraid of being alone, and afraid of being abandoned, as it appeared they had been by their parents.
Involuntary conversions were and are most effective “with those who are the most vulnerable — the young, the naive, the weak, and the neurotic” (Isser & Schwartz, 1988, p. 114). The cases we studied reflected these characteristics, from Edgardo Mortara, removed from his family in 1858 and kept apart from them from age 5 to late adolescence, to the Finaly brothers, whose lives were saved during World War II by the directress of a municipal nursery in France, who then refused for eight years to surrender them to their surviving relatives. The isolation of these children from their families is analogous to, and has the same effect as, the separation from family and friends practiced by many cultic groups.
Comparisons With Modern Uses
It was a combination of the children’s cases and those of several adults, including Theodore Ratisbonne and, later, the instantaneous and apparently “miraculous” conversion of his younger brother Alphonse in 1842, that led us to a more extensive study of the conversion techniques used by the Chinese Communists during the Korean War and then by cults. What we found was that there were similarities and parallels between the historical cases and the modern ones. The major difference was in the “refinement” of the techniques.

William Sargant, a British physician, studied the ancient Greeks to identify the techniques of persuasion and control used in the initiation rites of the religions of that era. In his description of how confessions were obtained in later periods, whether during the Inquisition, under the czarist police, or under Stalin and most of his successors, it is easy to perceive some parallels to the techniques used by cults:

To elicit confessions, one must try to create feelings of anxiety and guilt, and induce states of mental conflict if these are not already present. Even if the accused person is genuinely guilty, the normal functioning of his brain must be disturbed so that judgment is impaired. If possible he must be made to feel a preference for punishment — especially if combined with a hope of salvation when it is over — rather than a continuation of the mental tension already present, or now being induced by the examiner. (Sargant, 1957, pp. 185-186)
As we know, in some of the allegedly religious groups and in the so-called therapeutic groups such as est, self-criticism and even self-contempt are used as prerequisites to receiving the “salvation” or special “knowledge” held out as the reinforcement for membership and devotion to the group’s precepts.
Sargant’s primary interest was in the physiological changes in the brain promoted by persistent tension, anxiety, and the frequently changing attitudes of the examiners. These led, he concluded, to greatly heightened suggestibility as well as physical debilitation and mental exhaustion. It might be noted that this approach is used regularly by police personnel in questioning suspects. (See Ofshe, 1989, for a description of law-enforcement abuses.) We’re all familiar (from television) with the starkly bare interrogation room that promotes tension and anxiety, and with the “good cop, bad cop” team of questioners.
The hope of salvation mentioned by Sargant was also the “carrot” used by such eminent preachers as Jonathan Edwards in colonial New England and John Wesley in 18th-century Britain — after they had aroused anxiety, feelings of guilt, conflicting loyalties, and heightened group suggestibility. Again, this technique became familiar through the portrayal of “Elmer Gantry.” In fact, Sargant asserted, the techniques used by the Communists in Russia and Korea could be better understood if these religious uses were studied. Of course, physical terror was used by these political groups in addition to the massive psychological assault, as the latter process alone would have been too slow to obtain control of hundreds of millions of people.
Both Sargant and Joost Meerloo, author of The Rape of the Mind (1956), point out, moreover, that these techniques are also related to Pavlov’s simple conditioning techniques. One might also cite the use of Skinner’s operant conditioning theory. In both situations, reinforcement is heavily dependent upon the organism’s behavior, with positive reinforcement (from a simple affirmative nod to the excessive “love bombing” employed by some cult members) given for all acceptable behavior and no reinforcement for contrary behavior.
Sargant also quotes Richard Walker, author of China Under Communism, as identifying six steps in the training of party workers who will transmit the message between the party and the masses:
1. Training in an area isolated from family and friends;
2. Fatigue — no opportunity for relaxation or reflection;
3. Tension;
4. Uncertainty — about the fate of those who didn’t measure up;
5. Use of vicious language;
6. Seriousness of the process — humor is forbidden (Sargant, 1957, pp. 165-166).
Does this sound suspiciously like what happened to those recruited by many of the cults in the 1960s, ’70s, and ’80s? The first four steps certainly have been reported and documented by thousands of former cult devotees. To some extent, military drill instructors use modifications of these same techniques with new recruits in basic training. One ex-DI told me, for example, that a major message conveyed in the training course for DIs was that for the first two weeks no recruit can do anything correctly — even if he makes his bed perfectly, exceeds the quota of push-ups, and so on. (You may recall seeing such scenes in the film “An Officer and a Gentleman.”)
Depersonalization, Deindividualization, and the Search for Happiness
An essential part of cult indoctrination has been a kind of depersonalization, that is, breaking down one’s persona or identity in order to create a new one in the image of the group and its leader. It’s interesting to note how this was carried out by a Charles Manson as well as a Jim Jones (Lindholm, 1990) and also, in earlier generations, by some convents and seminaries with their novices. In the latter situations, however, novices usually not only entered an order voluntarily and with knowledge of what was to come, but also had the opportunity to withdraw if they found that they could not adapt to the strict religious life.
Discussing the common meal served to potential recruits, Halperin observes that “groups that place a great emphasis on denigrating individuality must inevitably create a food which denies individual taste” (1983, p. 227). The assumption of a common “uniform,” as with the Hare Krishnas today, or the pooling of all clothing so that no individual is associated with a particular item that would make him or her stand out among the others, is another aspect of deindividualization.
Another approach, one that may be traced back to some of the early followers of Jesus and other religious figures, is less threatening. It is, in fact, an appeal to the very human desire to be happy. Snow and Machalek (1982) describe the appeal:

Two couples standing outside of a Los Angeles restaurant are asked by a neatly dressed Caucasian female if they have ever heard of “Nam-Myoho-Renge-Kya.” They look at her as if to say, “What are you talking about?” Noting their confusion, the proselytizer asks if they want to be happy and fulfill their dreams. They respond that they are quite content. The proselytizer emphasizes that they could get whatever they want — mentally, physically, or spiritually — if only they chanted. She then indicates how chanting has provided her with greater meaning and purpose, enabled her to get better grades in school, and improved her relationship with her parents. Their response was still one of disinterest, so the proselytizer moves on in search of other prospects. (1982, p. 15)
Perhaps the unique feature of this description is the readiness with which the proselytizer appeared to move on. As in any other field that relies on what are essentially sales techniques, some practitioners are more persistent than others. Whether an itinerant salesman, a Hare Krishna devotee seeking donations, or one of today’s street beggars, some will move on after one rebuff, while others hang on, hoping to change the target’s mind through a new argument or simple repetition.
We must consider as well the nature of the target. In times of stress and uncertainty, more people are amenable to persuasion by someone who appears to have “the answers,” even if this means accepting an undue measure of control. On a national scale, we have seen this happen time and time again, from the period of the Exodus to modern-day Iran. On the individual level, vulnerability to techniques of persuasion and control is particularly noted at several points in development — late adolescence, early adulthood, and, today, very late adulthood. These are the times in the life cycle of major change and numerous options. For individuals who have not yet determined a direction, and who may have no experience making decisions in an amorphous situation, the invitation to a solution that promises happiness and fulfillment is tempting indeed. Drawn in by the very effective persuasive techniques of the recruiters and the “already committed,” and increasingly controlled by the practitioners of the organization, they, too, throw themselves into their new group in a “thorough and totalistic manner,” with all the zeal of the Crusaders of old.
Part of the attraction of cults (and other totalistic groups) may be their exclusivity: “Only we will be saved!” or “We have the solutions to the world’s problems!” Another attraction, at least for social isolates, is the social network that is gained at once upon acceptance; yet, for the cult, it is also a means of controlling the new recruit’s behavior.
However, acceptance by the group is dependent on total commitment to the group’s ideas and on the demonstration of such commitment by seeking to “reform,” that is, proselytize, others. The refinement of the persuasive techniques, used without the physical terror imposed by an Inquisitor, a Stalin, a Hitler, or an Imam, is such that it would be envied by a Mary Kay distributor or a Madison Avenue advertising executive.
In conclusion, let me suggest that most of the techniques used by cults are neither new nor exclusive to these groups. Throughout history religious leaders and orders, shamans in aboriginal and native tribes, charismatic political leaders, the military, and overzealous salesmen have used these techniques very effectively. Our concern is aroused when those using undue pressure, proselytization, and control practices do so for their own enhancement or to meet their own psychological needs rather than for the true benefit of the individuals being drawn in or for the society in which the techniques are used and the group functions. Thus, we cannot fault the techniques of persuasion and control (except the extreme forms) for they have been necessary at many points in history, but we can and should question the purposes for which they are employed.
Notes

1. Cult has been defined as “a group or movement exhibiting a great or excessive devotion or dedication to some person, idea, or thing and employing unethically manipulative techniques of persuasion and control...designed to advance the goals of the group’s leaders, to the actual or possible detriment of members, their families, or the community” (Cultism: A Conference for Scholars and Policy Makers, 1986, pp. 119-120).
2. A sect is a religious group that is not characterized by the exploitative manipulation of cults and is usually an offshoot of a mainstream religion or has moved toward accommodation with the mainstream. Sects may originate as cults or sometimes deteriorate into cults.
3. Theodore Ratisbonne was an older son in a prominent Alsatian Jewish family. After his conversion to Catholicism, he founded the order of Notre Dame de Sion, dedicated to educating and converting Jewish girls. He was involved in some major scandals where Jewish girls were hidden from their families while he beguiled them into conversion.
4. The difference in status reflects a shift in perception of the group by the larger society as well as some modification of the group’s accommodation to the laws of the larger society.
5. The Mortara boy had been secretly baptized by a Catholic servant and, under the laws of the Vatican States at that time, was taken from his Jewish family and raised in seminaries. He subsequently became a priest. The scandal of this event united Jewish communities across Europe and the United States in protest.
References

Cultism: A conference for scholars and policy makers. (1986). Cultic Studies Journal, 3(1), 117-134.
Halperin, D.A. (1983). Group processes in cult affiliation and recruitment. In D.A. Halperin (Ed.), Psychodynamic perspectives on religion, sect and cult (pp. 223-234). Boston: John Wright, PSG Inc.
Hankoff, L.D. (1983). Religious innovation in the Jewish revolt against Rome. In D.A. Halperin (Ed.), Psychodynamic perspectives on religion, sect and cult (pp. 1-30). Boston: John Wright, PSG Inc.
Isser, N., & Schwartz, L.L. (1988). The history of conversion and contemporary cults. New York: Peter Lang.
Lindholm, C. (1990). Charisma. Cambridge, MA: Basil Blackwell.
Meerloo, J.A.M. (1956). The rape of the mind. Cleveland: World Publishing.
Ofshe, R. (1989). Coerced confessions: The logic of seemingly irrational action. Cultic Studies Journal, 6(1), 1-15.
Pattison, E.M., & Ness, R.C. (1989). In M. Galanter (Ed.), Cults and new religious movements (pp. 43-83). Washington: American Psychiatric Association.
Ratisbonne, M.T. (1904). Père Marie-Théodore Ratisbonne, fondateur de la société des prêtres et de la congrégation des religieuses de Notre-Dame de Sion (2 vols.). Paris.
Robbins, T. (1988). Cults, converts and charisma. Beverly Hills, CA: Sage.
Sargant, W. (1957). Battle for the mind. Garden City, NY: Doubleday.
Snow, D.A., & Machalek, R. (1982). On the presumed fragility of unconventional beliefs. Journal for the Scientific Study of Religion, 21, 15-26.
Stark, R., & Bainbridge, W.S. (1980). Networks of faith: Interpersonal bonds and recruitment to cults and sects. American Journal of Sociology, 85, 1376-1395.
Lita Linzer Schwartz, Ph.D., is Professor of Educational Psychology at the Ogontz Campus of Pennsylvania State University. She has written numerous books and articles on cults and conversion. This paper was originally presented at the American Family Foundation annual meeting at Stony Point, NY, in September 1990.