How Does One Define Cult?
There are many definitions of cult, but for our purpose, one cited by ICSA is useful: “an ideological organization held together by charismatic relations and demanding total commitment.” (1) This definition is compatible with some definitions of new religious movements (NRMs), but cult can also refer to nonreligious organizations. As defined here, cults (on the high-demand/high-control end of the social influence spectrum—see below) are at risk of abusing members, but do not necessarily do so:
Although cultic groups vary a great deal, a huge body of clinical evidence and a growing body of empirical research indicate that some groups harm some people sometimes, and that some groups may be more likely to harm people than other groups. (2)
However, ICSA’s research focus today is not on cults per se, but rather on the intensity of psychosocial influence within groups. After many years of international research on cultic groups, ICSA finds there are often too many variables to produce accurate lists of so-called dangerous cults. In addition, history has documented that sometimes “one man’s religion is another man’s cult”—occasionally with tragic consequences. (3) Nevertheless, it is possible to discern when the normal processes of social influence become extreme or harmful in a group; this shift can lead to observable psychological trauma in some individuals. (4) The cause of this harm is often the above-normal level of demand and social control in the group; this intense process is sometimes called the cultic dynamic.
The Cultic Dynamic
It is well known that groups use social-influence processes to create and maintain norms of belief and behavior. (5) These processes are necessary to maintain a group identity and to distinguish the in-group from outsiders; indeed, they are a fundamental feature of all groups and cultures.
However, groups tend to align themselves along a social-influence continuum that runs from low control/low demand at one end to high control/high demand at the other. Groups at the high end of the spectrum run a greater risk of being cultic in their social-influence processes. This risk is particularly high when a group engages in deceptive advertising, misinformation, or censoring of information; when inner circles hold secret beliefs and behaviors that differ from the publicly affirmed norms; when an extremely narcissistic leader operates without a functioning system of checks and balances; when no outside oversight is in place; when the group lacks transparency in economic matters; and so on. It should be noted, however, that even perceived “strangeness” or “dangerous beliefs” do not automatically create a cultic dynamic—even though these elements may increase the possibility of such a dynamic eventually coming into play. No matter how much we may dislike or disapprove of a particular group’s beliefs, our disapproval does not make the group a cult.
Are All Members of Cultic Groups Damaged by Those Groups?
Even in cultic groups that score at the high end of the control/demand continuum, not all members are abused or equally affected. (6) Members who are totally invested in a particular group or movement are more likely to suffer severe negative psychological consequences than more peripheral members. Developmental psychologists such as Erik Erikson and Mary Ainsworth (7) also describe how individual differences in personality (e.g., trust-versus-mistrust or secure-versus-insecure attachment issues) shape how vulnerable the core of the individual is later in life. These differences may create social-influence vulnerabilities, perhaps “setting one up” for later cultic-group involvement. Differences in ego-defense mechanisms also render some individuals more susceptible to unethical psychosocial demands and control practices. (8) Basic personality issues may likewise predispose some people to be more vulnerable than others to charismatic and prophetic leaders and groups. For example, since many cultic leaders have narcissistic personality traits, followers often have codependent character traits (a pairing echoed in the ancient Greek myth of Narcissus and Echo, from which Freud drew his concept of narcissism). (9) Nevertheless, these individual variables do not by themselves determine who becomes involved in cults; that circumstance may be simple bad luck: being in the wrong place at the wrong time. And because everyone has weaknesses, individual vulnerabilities cannot be the only cause of cultic involvement. Once one is in a cultic group, however, these personality variables, in combination with the intense cultic dynamic, do affect the nature and extent of one’s suffering and trauma when one leaves. (10)
In general, some people in the same cultic group will be hurt more than others, some may not be affected at all, and some may actually benefit. Groups change over time and from one branch or subgroup to another; leaders’ personalities change, as do the personalities of various members. (11) Even persons with secure and intelligent personalities may encounter problems at times, especially during times of transition and crisis—and they may become vulnerable to unethical psychosocial influence and control.
As a result of all these interwoven variables, it is very difficult to say that a particular group, in all branches, at all times, affects all members in a particular way. Nevertheless, trained social workers and therapists recognize a dangerous cultic group environment when they encounter it—and they treat former members who suffer in varying degrees. (12) These helping professionals know that the intense psychosocial dynamic of these high-control/high-demand cultic groups and their charismatic (and often narcissistic) leaders lies at the core of their clients’ sense of abuse and trauma.
(1) B. Zablocki, Cults: Theory and Treatment Issues (paper presented to a conference, May 31, 1997, in Philadelphia, Pennsylvania).
(2) M. Langone, Cults, Psychological Manipulation, & Society (paper presented at AFF Annual Conference, University of Minnesota, St. Paul Campus, May 14, 1999; published in Cultic Studies Journal, 18, 2001, pp. 1–12, para. 10).
(3) In Christian history, for example, tens of thousands were killed for belonging to “heretic” sects/cults such as the Cathars, a Gnostic branch (Albigensian Crusade, 1209–1229). More recently, in 1993, many died in the siege of the Branch Davidians near Waco, Texas (led by David Koresh, whose branch had broken away from the Seventh-day Adventists).
(4) There are many case studies—e.g., Robert Lifton, Destroying the World to Save It: Aum Shinrikyo (1999); Jim Guerra, From Dean’s List to Dumpster: Why I Left Harvard to Join a Cult (2000); Mark Laxer, Take Me for a Ride: Coming of Age in a Destructive Cult (1993); Margaret Singer and Janja Lalich, Cults in Our Midst (1996); Jayanti Tamm, Cartwheels in a Sari (2009); and more.
(5) See, for example, Robert Cialdini, Influence: Science and Practice (2009), for the six common social-influence processes—reciprocity, consistency, social proof, authority, liking, and scarcity—which use social influence and peer pressure to control and modify member behavior. These processes are often subtle and gradual, reducing followers’ ability to use conscious cognitive functions such as independent and critical thinking. Cognitive dissonance, for example, pushes our thinking and actions toward congruence with each other at a precognitive level. See especially Leon Festinger, When Prophecy Fails (1956). There is also an emerging field of evolutionary psychology, which looks at genetic and epigenetic changes and the hundreds of thousands of years of primate social inheritance.
(6) Janja Lalich, Bounded Choice: True Believers and Charismatic Cults (2004). See also Steven Hassan, Freedom of Mind (2013), for an overview of these and other cultic dynamics.
(7) Erik Erikson’s stages of psychosocial development are trust vs. mistrust (infancy), autonomy vs. shame and doubt (toddler years), initiative vs. guilt (preschool), industry vs. inferiority (elementary school), identity vs. role confusion (adolescence), intimacy vs. isolation (early adulthood), generativity vs. stagnation (middle adulthood), and integrity vs. despair (late adulthood) (in The Life Cycle Completed, 1997). John Bowlby, Attachment and Loss, Vol. I (1982), and Mary Ainsworth, “Infant-Mother Attachment,” American Psychologist, 34(10), pp. 932–937 (1979), both find that early caregiver-child attachment problems can lead to insecure or anxious personality formation. Erikson’s trust vs. mistrust stage dovetails with Bowlby and Ainsworth’s attachment theory. In terms of potential cultic-group involvement, the transition from each of these life-crisis stages to the next is a stressful time, and a time when individuals are vulnerable to the intense and seductive influence processes of cultic groups.
(8) See, for example, Daniel Goleman’s Vital Lies, Simple Truths: The Psychology of Self-Deception (2005). Individuals’ ego-defense mechanisms may keep them from acknowledging deeply disturbing contradictions, deceptions, or misdeeds (pp. 117–123). Freud describes the following mechanisms: repression (forgetting, and forgetting that one has forgotten); denial and reversal (reaction formation: what is so is denied, and its opposite is affirmed); projection (what is inside is cast outside); isolation (events without feelings); rationalization (I give myself a cover story); sublimation (replacing the threatening with the safe); selective inattention (I don’t see what I don’t like); and automatism (I don’t notice what I do).
(9) For example, Len Oakes, Prophetic Charisma: The Psychology of Revolutionary Religious Personalities (1997); see also Charles Lindholm, Charisma (2002, PDF version online). There is a whole literature on the congruence between narcissism and charisma (see Oakes). The classic collection Max Weber on Charisma and Institution Building (1968, ed. S. N. Eisenstadt) describes charisma as an energizing, galvanizing force and cults as the core of every religion. Another powerful aspect of social influence is described in the classic “obedience to authority” experiments of Stanley Milgram (begun in 1961). He showed how a leader (e.g., a “cultic” group leader), once perceived as having authority, tends to be followed blindly. (Weber described three types of authority: rational-legal, traditional, and, especially relevant here, charismatic.) Once a member is involved in a group and the leader is perceived to have authority, powerful psychosocial pressures come into play, sometimes overriding an individual’s own impulses or values. The prisoner’s dilemma, also called the Faustian bargain in game theory (Merrill Flood and Melvin Dresher, 1950), and Philip Zimbardo’s Stanford prison experiment (1971) likewise describe how readily people may conform, under the right conditions, to group and leader pressure and expectations.
(10) For example, Daniel Shaw, Traumatic Narcissism: Relational Systems of Subjugation (2014).
(11) See, for example, Dr. Eileen Barker, “Ageing in New Religions: The Variations of Later Experiences” (2013). In K. Baier & F. Winter (Eds.), Altern in den religion (pp. 227–60). Vienna, Austria: LIT Verlag. [Also available in E. Barker, “Ageing in New Religions: The Varieties of Later Experiences” (2013), Diskus, The Journal of the British Association for the Study of Religions, 12 (2011), pp. 1–23 (online access via religiousstudiesproject.com/DISKUS/index.php/DISKUS/article/view/21/20).]
(12) Dr. Eileen Barker, “Ageing in New Religions: The Variations of Later Experiences” (2013). In K. Baier & F. Winter (Eds.), Altern in den religion (pp. 227–60). Vienna, Austria: LIT Verlag. [Also available in E. Barker, “Ageing in New Religions: The Varieties of Later Experiences” (2013), Diskus, The Journal of the British Association for the Study of Religions, 12 (2011), pp. 1–23 (online access via religiousstudiesproject.com/DISKUS/index.php/DISKUS/article/view/21/20).]