Razor’s Edge Indeed: A Deprogrammer’s View of Harmful Cult Activity
A character named Diamond (played by James Earl Jones) spoke those words in a British film called Signs and Wonders, released in 1994. The film was significant because it was, to my knowledge, the first major film production about cults and deprogramming to make the distinction between deprogrammer and exit counselor. Deprogramming, as the term appeared in the English language in the 1970s, referred to actions taken to persuade a person to abandon allegiance to a controversial group or cult. The neologism exit counseling appeared in the early 1980s to distinguish a non-coercive, educational approach to cult intervention from the coercive, oppositional deprogramming model.
Note that in this paper I may use cult in the narrower, pejorative sense adopted by ICSA, although that usage departs from the primarily neutral, academic application of the word.
In secular society, devotional cults form around sports teams, and these cults have some radical fans. The moderating influence of the team and the surrounding society prevents a radical fan from controlling the team and most of the other fans. In harmful cults that operate within self-sealing or closed systems, those moderating influences fade, leaving few effective social and psychological controls over the power and often malignant narcissism of the leaders [see Figure 5 later in this article for the “Cult (healthy type)” description].
With my model, I do not impugn every group that has a closed milieu; but history and experience tell me that the more closed a social system becomes, the greater the potential for deceit and abuse of power. In concert, the four elements, or facets, noted above create a matrix or process for some degree of potentially harmful cult activity. Each element is a red flag, so to speak. If all four appear as described, then the red flags should be waving. If there is harm, the degree of harm can be subjective, objective, or both. Subjective harm includes how much an ex-member has lost in perception, perspective, and self-esteem. Objective harm concerns loss in investments, health, relationships, education, and employment. While some former cult members have to start over alone and broke, others have careers and families intact. In every case, what I look for as a deprogrammer before I attempt to cut short a true believer’s cult membership is reflected in the following model. Although I pared the facets or elements down to four, they could easily extend to eight or sixteen; but experience with audiences has taught me to economize any definition of cult and to elaborate from there.
The cult member in this model is proverbially stuck in the chrysalis stage, seeking to transcend the normal, boring, or limited self. The cult doctrine and leader will tell him that without the effort to transmute the self, his service to the purpose cannot gain power and reach perfection. But who knows how much effort is sufficient? By what evidence is the devotee free, saved, or enlightened? Does anyone in the group ever get to fly? Who really benefits? Is the member a chrysalis, or merely a bug wrapped in a spider’s cocoon? Is he merely fodder for a predatory leader and a parasitic system?
If anything defines mind control, it is circular ideation or a fixed mindset. Tethered ideologically to a leader’s revelation, the devotee will adjust his or her thoughts and impulses to sustain the least resistance and to stay in the flow. Leadership or cult management can jerk the chain of influence to get the member back in line if he drifts too far. Management can snap the whip if the devotee comes too close. The leader’s domain of authority is inside the circle—no one else is allowed there without permission. Think of a dressage trainer with a young or untamed horse in a ring (I was introduced to training horses this way, so I have some idea). In contrast, a well-trained or experienced horse will hardly tug on the lead and needs only a subtle movement of the whip to guide it.
“To leave this path is like a dog returning to eat its vomit.”
“Now that I’ve found God, Satan is everywhere trying to take me back.”
“Satan acts through the ones you love.”
“Now that you are on the Path, dark forces will attempt to dissuade and harm you. Your own mind will rebel with doubts and counterarguments.”
In choosing to defect, the group member must come to grips with what happened. Spiritual rape is a common description. For some, rape may be too strong an image; but it is nevertheless very unsettling to realize that they have shared their most intimate selves for years with a deceitful, perhaps nutcase guru who is incapable of truly guiding them. Then there is the peril of facing their own functional integrity. A body of believers can participate wholeheartedly in the delusions of the lead person—cults, like persons, can have personality disorders with delusional features (for example, grandiosity), albeit shared ones, as in folie de groupe. Thus, doubting devotees may feel imperiled by the very possibility of recognizing that their behavior in the cult was madness in action.
Deprogramming works when it reduces the perils of the exit process. Deprogrammers do this by reality testing questionable beliefs and perceptions with the client. Insights from the lives of ex-members from the same and other cults help: “If they survived and thrived under worse perils, then so can you.” Undermining the authority of the leader with solid scholarship and accurate history brings the ex-member not only to eye level with the leader, but also takes the leader out of the center of the circle, thus removing the illusion that she is transcendent.
Starting from the left side of Figure 1, one common symbolic structure for cult formation is the triangle. It indicates a dominant management force or leader at the top, overseeing or lording over a sequence of social layers, with a mass of subservient devotees on the bottom. Next is the square, which symbolizes being “boxed in” by facets that experts in the field have labeled in various ways. Arthur Deikman posited four sides of this box: compliance with a group, dependence on a leader, avoiding dissent, and devaluing outsiders. Janja Lalich, with her “bounded choice” theory, defines a cult with four attributes that form a self-sealing system: charismatic authority, transcendent belief system, systems of control, and systems of influence. Steven Hassan offers four attributes of “cult mind control” in his BITE model: behavior control, information control, thought control, and emotional control.
The circle is perhaps the most obvious and elegant illustration, as both a symbol and a metaphor for cult formation: circle of friends, inner circle, sphere of influence, encircled, and so on. The model I propose expands on the circle to help me explain the reality of harmful cult experience. In Figure 2, a conical shape illustrates the “ideal path” that seekers are led to imagine when they enter a transcendent belief system that promises total freedom, enlightenment, or a way out of mundane or sinful earthly life. The ideal path appears to spiral up, up, and away into heaven, infinity, or nirvana. The guru is already “there.” The devotee strives to make his way to salvation guided by the guru. In harmful systems, the devotee feels progress in the beginning but soon gets stuck between the perilous “fall” back to where he started and the impossible or inaccessible space ostensibly occupied by the leader. The devotee remains on a narrow ledge (mimicking the razor’s edge of Buddhism), feeling the tension and excitement of being on a “high” path.
Looking at the illustration of the seeker and the leader in Figure 2 from above, we see something like the models in Figures 3, 4, and 5. Figure 3 offers the actual view an outsider or critic will have of someone in a harmful cult. The circular path that the group member believes is a spiral upward is actually a pit or rut, wherein a restrictive lifestyle keeps the member sealed off from both the social surround and the sacred domain of the leader.
In Figure 4, the unhealthy cult devotee perceives the path as progressing toward enlightenment and perfection while rising “up” to spiritual freedom, ascension, and immortality. Circular movement gives the illusion of progress.
Moving to a healthier cult system in Figure 5, we see an expansion into the surround with less restriction. We find that devotees recognize a more democratic relationship with leadership, with mechanisms to replace leadership when necessary. The transcendent reality remains as transcendent to a living leader as it does to devotees—all can fall, all can rise equally, all can find inspiration, none are “God.” After writing an earlier draft, I came upon the same concept in the work of David Johnson and Jeff Van Vonderen (1991) in their analysis of abuse in Christian churches. Their illustration depicts the same democratic relationship that leaders and members should have with the transcendent (Jesus in their illustration) and with one another.
Figure 4 shows the harmful cult system as closed around the membership if members are to sustain a transpersonal purpose and avoid peril. Doing rituals, transformational sessions, recruiting, and fundraising always trump the discursive activity of examining internal doubts and entertaining surrounding criticism. In Figure 5, the expansive, second model of a healthy cult (yes, there is such a thing), we clearly see an enclosed arena of activity that nevertheless sustains easy access, both socially and intellectually, with the surround (the social and intellectual environment). I borrow the term surround as applied by self-psychologists who follow the work of Heinz Kohut (1913–1981), who significantly advanced Freud’s analytic approach to psychotherapy. “Narrowly conceived, self-psychology consists of ideas of Heinz Kohut, ideas that apply to the understanding and treatment of narcissistic disorders.”
Narcissism as both a behavior trait and a disorder appears in discussions about cults and cult leaders; thus, my interest in Kohut and how he used the surround to augment assessments of self. In Kohut’s psychology, some measure of narcissism may be healthy, just as in my discussion here, certain cult formations can be healthy. Malignant narcissism appears in totalist systems that harm participants and society. Cults as closed authoritarian systems create perceptions about the social environment and greatly influence interactions with that environment. In that process, a manipulative cult will tap and feed the narcissistic tendencies of recruits with grandiose transpersonal causes and infect the recruits with flawed perceptions of peril projected onto the surround. I believe Kohut’s insights regarding the self as part of an interactive social structure can add value to this discussion. Here I only wish to alert the reader to why I use surround in my illustrations.
In the healthier version, the group member has ease of contact with the surround, as well as reasonable entry and exit, with no hidden agendas either way. The transpersonal purpose is not confused with the person of the living leader or guru. In other words, until the leader is dead and gone, he is just as human as his followers, albeit with a special role. He must serve the purpose, not have the purpose serve him as if he were God or a god. There is no such thing as a living god. Gods are spirits, if they exist at all. Even in Christianity, a religion that claims a living deity in the historical Jesus, we read of the struggle among the Apostles to recognize “God” as Jesus until after his death and reported resurrection. Similarly, the avatars of Hinduism exist as divine creatures only in Hindu scripture and on some devotional levels. Any claim by a living guru to be the tenth or Kalki Avatar, for example, is bogus until he dies and “earns” that designation through a living testament to the fruits of his labor. The quality of the tradition is what we can criticize when the “divine” person is gone.
For example, the Self-Realization Fellowship (SRF) founded by Swami Yogananda in 1925 posits the mysterious, ahistorical Babaji as the divine root inspiration for the lineage of SRF gurus. Babaji can function as the traditional embodiment of the transpersonal, much like Jesus does in Christianity, but the embodied or living leader cannot, in my view. The extent to which any devotee sees the living guru as having achieved a transpersonal state is the extent to which the devotee risks living in a closed system controlled by the guru. The only humans I know who handle divine power well are those who can hold molten steel in their bare hands indefinitely.
I remind the reader that these are my models that assist me in helping my clients assess their group experience. I do not make exceptions regarding the God confusion. No living leader is God or a god. Many traditions have deified a living leader, such as Caesar; but during Roman triumphs, a slave stood behind the triumphant general and chanted, “Memento mori [remember, thou art mortal; remember, you will die].” In a similar vein, during papal coronations a plain Catholic monk holds a pole on which burns a common piece of flax. Once the flax stops burning, the monk thrice repeats, “Pater sancte, sic transit gloria mundi [Holy Father, thus passes the glory of the world].” Cult leaders and dictators who take center stage as objects of devotion tend to avoid this admonition.
As self-object to his adoring throngs within a cult circle, any leader can be caught up in a divinization mood. If that leader already has unfulfilled needs for adulation, a disorder of malignant narcissism, then the totalist cult emerges readily; but the fan or devotee is just as responsible for the deification. This is always a two-way process. Robert Lifton called this “ideological totalism,” wherein the immoderate desires of a group meet the grandiose ideas of a leader. The leader presents a convincing possibility that he has attained transcendence or embodies the transcendental purpose, and the group says, “We want that, too. Show us the way.” This meeting ground takes on a “momentum of its own,” says Lifton, beyond the initial goals envisioned by the leader or the followers.
Feeding his narcissism, the leader accommodates the devotion and finds ways to control the play of forces that surround his position. The more cult members get “caught up” in the irrational nets of devotion, the less likely it is that they will have their rational feet on the ground. This is an unstable position because humans are not gods. To sustain the god illusion, the group and guru must devise strategies to frame perception. Phobias and paranoid responses inevitably arise due to conflicts with reality, thus creating the circle of peril. Critical responses from the surround inadvertently feed the peril by fulfilling cult-induced perceptions of an enemy ready to destroy the seeker’s soul by creating doubt and inviting defection from the divine path.
Acknowledging the transcendental goal does not mean it is achievable, necessary, or even desirable. How we live with God may be more valid and viable than becoming God. To use another metaphor, we can acknowledge that the sun is necessary for our existence, but that does not mean that being closer to the sun increases our existence. Narcissistic leaders would have us believe that their techniques can hurtle us toward that sun of transcendence, while skeptics ridicule their antics and opponents curse their lies. One image of pseudo-transcendence comes from Transcendental Meditation devotees (TMers) who claim to “fly” as they hop around while holding a seated lotus position. The group members call this the first stage of “yogic flying” and will produce many pseudo-scientific studies to support their sacred claims. Manipulative cults have come up with a wide variety of Towers of Babel for millennia.
To expose the false beliefs, the deprogrammer must not only convince the cult member that his perceptions are not defective but also point out how the group managers and leader have manipulated those very perceptions and behaviors. Moreover, a cult member will choose to defect only with the realization that a less restricted mind offers better options for a better life. My job, the deprogrammer’s job, is to reinforce healthy ways of using information. Brain science indicates that a healthy brain is one that continues to stop and think. As Kathleen Taylor indicates in her book Brainwashing, a healthily functioning brain is not stuck in rigid thought patterns or bound to a flawed organizing principle, such as an addiction to a drug, a false belief, or a highly constrained social system.
In effect, the deprogrammer’s job is to raise the seeker’s awareness back to eye level with reality, thereby both reducing the perception of exit peril and exposing the false authority of a leader. He does this to some extent by repeating the process that got the cult member into the closed system. After gaining rapport, the deprogrammer unveils new ways of seeing cult experience and behavior. New information may surprise, intrigue, and attract the cult member to want to hear more. The result is a wider frame of reference, with clearer options for choice. With access to reliable, reasonable evidence and insight into better options, the member can navigate safely through an exit and beyond, as Figure 6 illustrates.
Interventions vary in intent and intensity based on need and the current status of the cult member. I am not about to describe the process of exit counseling in depth here. For that, the reader can turn to other sources (Giambalvo, Hassan, Langone). My purpose is merely to suggest that, to better advise a client regarding intervention approach, an exit counselor must determine which stage a cult member is in.
The seeker expresses curiosity after he has read literature, attended one meeting, or tried a new technique for the first time. He has an attraction to, but does not yet express any identification with, the group or movement. Locus of control remains in the self, which continues to make choices with a wide frame of reference to the environment, family, and friends. Intervention at this stage is relatively less intense. A good Internet exposé of the cult, a critical book about it, or a conversation on the phone with an ex-member or exit counselor can all work to curtail that attractive “leap” of faith or entry by the individual into a manipulated experience at a workshop or service.
The seeker has gone to a weekend or week-long intensive and comes back glowing with affirmation. The seeker-cum-member engages in positive talk about the group and makes efforts to recruit. The member deflects any negative information and may not engage in argument. At this “honeymoon” stage, the exit counselor will advise the concerned persons against argument or sharing negative information with the new member. A formal intervention will require significant preparation of the concerned persons prior to any meeting between the exit counselor and the cult member. “Preps” vary according to an exit counselor’s style and approach. Some counselors may require several days of therapeutic sessions and even months of effort to regain rapport with the cult member before intervention. Of course, noncoercive access to a meeting with the cult member must be possible to arrange. Typically, if it is to succeed, the actual exit session can last two or more days, and perhaps a week.
At this stage, the cult member has been in the group long enough (usually years) to have reached a saturation point regarding what the group actually offers. The member may even be one of the sub-leaders or elders, and she has seen and experienced much of the inside conflict and abuse but will not define it as such. She may even be aware of many ex-member stories, yet she lacks the emotional empathy and intellectual integrity to see any criticism as significant. As above, the concerned persons must refrain from argument with the member about the group for intervention to be possible. Preparation with the exit counselor and intervention team is crucial.
Giambalvo, Carol. Family Interventions for Cult-Affected Loved Ones (originally published as Exit Counseling: A Family Intervention). Available as a PDF document at http://store.icsahome.com/merchant.mvc?
Hassan, Steven (1988). Combatting Cult Mind Control (Rochester, VT: Park Street Press).
Johnson, David and Van Vonderen, Jeff (1991). The Subtle Power of Spiritual Abuse (Minneapolis, MN: Bethany House).
Lalich, Janja (2007). Bounded Choice: True Believers and Charismatic Cults (Berkeley, CA: University of California Press).
Langone, Michael, editor (1995). Recovery from Cults: Help for Victims of Psychological and Spiritual Abuse (New York, NY: W. W. Norton & Company).
Zablocki, Benjamin and Robbins, Thomas (2001). Misunderstanding Cults: Searching for Objectivity in a Controversial Field (Toronto, Canada: University of Toronto Press).
Szimhart, Joseph (2004). “Persistence of ‘Deprogramming’ Stereotypes in Film.” Cultic Studies Journal. http://www.icsahome.com/infoserv_articles/szimhart_joseph_persistenceofdeprogrammingstereotypes_abs.htm
Johnson and Van Vonderen, The Subtle Power of Spiritual Abuse, p. 230.
Lifton, Robert J. (1961). Thought Reform and the Psychology of Totalism: A Study of “Brainwashing” in China (New York, NY: W. W. Norton). See Chapter 22.