Kimiaki Nishida, Ph.D.
Japanese studies of psychological manipulation, or “cult mind control,” have developed over the past decade from a social psychological perspective. This paper reviews empirical examinations of the indoctrination process; membership maintenance and expansion; antisocial activity; and post-cult psychological distress. These studies show that many sets of social influences are systematically applied to new recruits during the indoctrination process, and that these influences facilitate ongoing control of cult members. The mechanisms of influence and the psychological consequences of cult psychological manipulation are discussed.
Recently, psychologists in Japan have been examining a contemporary social issue—certain social groups recruit new members by means of psychologically manipulative techniques called “mind control.” They then exhort their members to engage in various antisocial behaviors, from deceptive sales solicitation and forcible donation to suicide and murder. We classify such harmful groups as “cults” or even “destructive cults.” Psychologists concerned with this problem must explain why ordinary, even highly educated people devote their lives to such groups, fully aware that many of their activities deviate from social norms, violate the law, and may injure their health. Psychologists are now also involved in the issue of facilitating the recovery of distressed cult members after they leave such groups.
In the 1970s, hardly anyone in Japan was familiar with the term “destructive cult.” Even if they had been informed of cult activities, such as the 1978 Jonestown tragedy, in which 912 members of the Guyana-based American cult were murdered or committed suicide, most Japanese people would have thought the incident a sensational, curious, and inexplicable event. Because the events at Jonestown occurred overseas, Japanese people, except possibly those worried parents whose child had joined a radical cult, would not have shown any real interest.
In the 1980s, a number of Japanese, including journalists and lawyers, became concerned about the “unethical” activities of the Unification Church, whose members worshiped their so-called True Father, the cult’s Korean founder Sun Myung Moon, who proclaimed the Second Advent of Christ. One of the group’s activities entailed a shady fund-raising campaign. Another unethical activity of the cult in the 1980s was Reikan-shoho, a swindle in which they sold spiritual goods, such as lucky seals, Buddhist rosaries, lucky-tower ornaments, and so on. The goods were unreasonably expensive, but the intimidated customers bought them to avoid possible future misfortune.
The first Japanese “anticult” organization was established in 1987 to stop the activities of the Unification Church. The organization consisted of lawyers who helped Reikan-shoho victims all over Japan (see Yamaguchi, 2001). According to their investigation, the lawyers’ organization determined that the Unification Church in Japan engaged in three unethical practices. First, large amounts of money were collected through deceptive means. Under duress, customers desperate to improve their fortunes bankrupted themselves buying the cult’s “spiritual” goods. Second, members participated in mass marriages arranged by the cult without the partners getting to know each other, after the partners were told by the cult leader that their marriage would save their families and ancestors from calamity. Third, the church practiced mind control, restricting members’ individual freedom, and employing them in forced labor, which often involved illegal activity. Mind-controlled members were convinced their endeavors would liberate their fellow beings.
The 1990s saw studies by a few Japanese psychological researchers who were interested in the cult problem. By the mid-1990s, Japanese courts had already acknowledged two Unification Church liabilities during proceedings the lawyers had brought against the cult; namely, mass marriage and illegal Reikan-shoho (see Judgment by the Fukuoka [Japan] District Court on the Unification Church, 1995). The lawyers’ main objective, however, had been that the court confirm the Unification Church’s psychological manipulation of cultists, a ruling that would recognize these members as being under the duress of forced labor.
Around the same period, Aum Shinrikyo, a nihilist Japanese Buddhist sect established by Guru Asahara Syoko, also became involved in many crimes. However, almost nobody outside some members’ families knew about the cult’s activities. Those families, apprehensive that the cult’s activities endangered both its members and society, established an anti-Aum organization, which appealed to cult members to leave the group. In spite of those efforts, Aum’s membership and power expanded over the next ten years. Even the police and media were unable to expose the cult’s covert crimes because Aum skillfully hid behind its legal right to religious freedom.
On March 20, 1995, Aum mounted a sarin gas attack in a Tokyo subway, which killed eleven people and injured about five thousand. Consequently, Tokyo police initiated a compulsory investigation of the cult. Following the attack, the Japanese began to learn about cults and mind control (see Hirata, 2001). Although most of the criminals associated with the sarin attack were arrested, the cult’s motives remain unclear. A decade later, Guru Asahara has yet to divulge his rationale.
Since the onset of Aum Shinrikyo’s terrorist activities, Japanese society has still not reestablished its precult harmony. Moreover, we have begun to discover other cults that may pose a threat to the concord of our society.
What Is Mind Control?
Early in the study of mind control, the term was equated with the military strategy of brainwashing. In the United States, mind control initially was referred to as thought reform or coercive persuasion (Lifton, 1961; Schein, Schneier, & Barker, 1961). Currently, however, mind control is considered a more sophisticated method of psychological manipulation, one that relies on subtler means than physical detention and torture (Hassan, 1988).
In fact, people who have succumbed to cult-based mind control consider themselves to have made the decision to join the cult of their own free will. Whereas brainwashing is primarily a behavioral-compliance technique, individuals subjected to mind control come to accept fundamental changes to their belief system. Cult mind control may be defined as temporary or permanent psychological manipulation by those who recruit and indoctrinate cult members, influencing members’ behavior and mental processes to comply with the cult leadership’s desires, a control of which the members remain naïve (Nishida, 1995a).
After the Aum attacks, Ando, Tsuchida, Imai, Shiomura, Murata, Watanabe, Nishida, and Genjida (1998) surveyed almost 9,000 Japanese college students. The questionnaire was designed to determine whether the students had been approached by cults and, if so, how they had reacted; how they perceived alleged cult mind-control techniques; and how their psychological needs shaped their reactions when cults attempted to recruit them.
Ando’s survey results showed that about 20 percent of respondents reported somewhat favorable impressions of the recruiter, in comparison with their impressions of salespersons. However, their compliance level was rather low. The regression analysis showed that the students tended to comply with the recruiter’s overture when
- they were interested in what the agent told them;
- they were not in a hurry;
- they had no reason to refuse;
- they liked the agent; or
- they were told that they had been specially selected, could gain knowledge of the truth, and could acquire special new abilities.
When asked to evaluate people who were influenced or “mind controlled” by a cult, respondents tended to think it was “inevitable” those people succumbed, and they put less emphasis on members’ individual social responsibility. When mind control led to a criminal act, however, they tended to attribute responsibility to the individual. More than 70 percent of respondents answered in the affirmative when asked whether they themselves could resist being subjected to mind control, a result that confirms the students’ naïveté about their own personal vulnerability. The respondents’ needs or values had little effect on their reactions to, interest in, and impressions about cult agents’ attempts to recruit them.
Mind Control as Psychological Manipulation of Cult Membership
Nishida (1994, 1995b) investigated the process of belief-system change caused by mind control as practiced by a religious cult. His empirical study evaluated a questionnaire administered to 272 former group members, content analysis of the dogma in the group’s publications, videotapes of lectures on dogma, the recruiting and seminar manuals, and supplementary interviews with former members of the group.
Cult Indoctrination Process by Means of Psychological Manipulation
In one of his studies, Nishida (1994) found that recruiters offer the targets a new belief system, based on five schemas. These schemas comprise
- Notions concerning one’s life purpose;
- Ideals governing the type of individual, society, and world there ought to be;
- Goals related to correct action on the part of individuals;
- Notions of causality, or which laws of nature operate in the world’s history; and
- Trust that authority will decree the criteria for right and wrong, good and evil.
Content analysis of the group’s dogma showed that its recruitment process restructures the target’s belief-system, replacing former values with new ones advocated by the group, based on the above schemas.
Abelson (1986) argues that beliefs are metaphorically similar to possessions. He posits that we collect whatever beliefs appeal to us, as if working in a room where we arrange our favorite furniture and objects. He proposes that we transform our beliefs into a new cognitive system of neural connections, which may be regarded as the tools for decision making.
Just as favorite tools are often placed in the central part of a room, or in a harmonious place, highly valued beliefs appear to be located for easy access in cognitive processing. Meanwhile, much as worn-out tools are often hidden from sight in corners or storerooms, less-valued beliefs are relocated where they cannot be easily accessed for cognitive processing. A change in an individual belief is illustrated by the replacement of a piece of furniture, while a complete belief-system change is represented as exchanging all of one’s furniture and goods, and even the design and color of one’s room. The belief-system change that occurs during the recruitment and indoctrination process is metaphorically represented in Figure 1, starting with a functional room and its hierarchy of furniture or tools, and progressing through the stages of recruitment and indoctrination to the point at which the functional room has been refurnished with a new set of furniture and tools that represent the altered belief system.
Step 0 in the figure shows the five schemas as a set of the thought tools that potential recruits hold prior to their contact with the group.
Step 1 Governed by their trust in authority, targets undergoing indoctrination remain naïve about the group’s actual name, its true purpose, and the dogma that is meant to radically transform the belief system they have held until their contact with the group. At this stage of psychological manipulation, because most Japanese are likely to guard against religious solicitation, the recruiter puts on a good face, approaching the targets with an especially warm greeting and assessing their vulnerabilities in order to confound them.
Step 2 While the new ideals and goals are quite appealing to targets, their confidence in the new notions of causality also rises; some residual beliefs may remain at this stage. The targets must be indoctrinated in isolation so that they remain unaware that the dogma they are absorbing is part of cult recruitment. Thus isolated, they cannot sustain their own residual beliefs through observing the other targets; the indoctrination environment tolerates no social reality (Festinger, 1954). The goal for this stage is for the targets to learn the dogma by heart and embrace it as their new belief, even if it might seem strange or incomprehensible.
Step 3 At this stage, the recruiter’s repeated lobbying for the new belief system entices the targets to “relocate” those newly absorbed beliefs that appeal to them into the central area in their “rooms.” By evoking the others’ commitment, the recruiter uses group pressure to constrain each target. This approach seems to induce both a collective lack of common sense (Allport, 1924) and individual cognitive dissonance (Festinger, 1957).
Step 4 As the new recruits pass through a period of concentrated study, the earlier conversion of particular values extends to their entire belief system. By the end, they have wholly embraced the new belief system. The attractive new beliefs gradually are “relocated” from their “room’s” periphery into its center, replacing older beliefs. Formerly held beliefs are driven to the room’s periphery, thoroughly diminished; the new, now-central beliefs coalesce, blending with the few remaining older notions.
Shunning their former society, the targets begin to spend most of their time among group members. Their new social reality raises the targets’ conviction that the new beliefs are proper. At this time, the targets feel contentedly at home because the recruiters are still quite hospitable.
Step 5 The old belief system has become as useless as dilapidated furniture or tools. With its replacement, the transformation of the new recruits’ belief systems results in fully configured new beliefs, with trust in authority at their core, and thus with that authority an effective vehicle for thought manipulation.
At the final stage of psychological manipulation during the recruitment and indoctrination process, the recruiters invoke the charismatic leader of the group, equating the mortal with god. The recruiters instill a profound fear in the targets, fear that misfortune and calamity will beset them should they leave the cult.
Cult Maintenance and Expansion through Psychological Manipulation
Nishida (1995b) studied one cult’s method of maintaining and expanding its membership by means of psychological manipulation, or cult mind control. The results of factor analysis of his survey data revealed that cult mind-control techniques induced six situational factors that enhanced and maintained members’ belief-systems: 1) restriction of freedom, 2) repression of sexual passion, 3) physical exhaustion, 4) sanction of external association, 5) reward and punishment, and 6) time pressure. Studies also concluded that four types of complex psychological factors influence, enhance, and maintain members’ belief systems: 1) behavior manipulation, 2) information-processing manipulation, 3) group-processing manipulation, and 4) physiological-stress manipulation.
Behavior manipulation includes the following factors:
- Conditioning. The target members were conditioned to experience deep anxiety if they behaved against cult doctrine. During conditioning, they often would be given small rewards when they accomplished a given task, but strong physical and mental punishment would be administered whenever they failed at a task.
- Self-perception. A member’s attitude toward the group would become fixed when the member was given a role to play in the group (Bem, 1972; Zimbardo, 1975).
- Cognitive dissonance. Conditions are quite rigorous because members have to work strenuously and are allowed neither personal time nor money, nor to associate with “outsiders.” It seems that they often experienced strong cognitive dissonance (Festinger, 1957).
Information-processing-manipulation factors include the following:
- Gain-loss effect. Many members held negative attitudes toward cults prior to contact with their group; attitudes that swung from negative to positive toward the cult became fixed as more strongly positive than consistently positive attitudes would have (Aronson & Linder, 1965).
- Systemization of belief-system. In general, a belief has a tenacious effect even when experience reveals it as erroneous (Ross, Lepper, & Hubbard, 1975). Members always associate each experience with group dogma; they are indoctrinated to interpret every life event in terms of the cult’s belief-system.
- Priming effect. Repeatedly rehearsed messages guide information processing in a specific direction (Srull & Wyer, 1980). Members listen to the same lectures and music frequently and repeatedly, and they pray or chant many times every day.
- Threatening messages. Members are inculcated with strong fears of personal calamity from occult powers, nuclear war, and so on.
Group-processing manipulation components include:
- Selective exposure to information. Members avoid negative reports but seek positive feedback once they make a commitment to the group (Festinger, 1957). It should also be added that many group members continue to live communally, apart from the society they have exited. Moreover, new members are forbidden to have contact with people outside the group or access to external media.
- Social identity. Members identify themselves with the group because the main goal or purpose of their activity is to gain personal prestige within the group (Turner, Hogg, Oakes, Reicher, & Wetherell, 1987). Therefore, they look upon fellow members as elite, acting for the salvation of all people. Conversely, they look on external critics as either wicked persecutors or pitiful, ignorant fools. This “groupthink” makes it possible for the manipulators to provoke reckless group behavior among the members (Janis, 1971; Wexler, 1995).
It has been established that physiological stress factors such as the following facilitate this constraint within the group:
- Urgent individual need to achieve group goals
- Fear of sanction and punishment
- Monotonous group life
- Sublimation of sexual drive in fatiguing hard work
- Sleep deprivation
- Poor nutrition
- Extended prayer and/or meditation
The Consequences of Mind Control Induced by Cultic Psychological Manipulation
Psychological manipulation used by cults for mind control has two crucial consequences. One is extreme antisocial behavior, as exemplified by mass suicides of cult members. The other is the post-exit psychological instability that former cultists experience, although it remains unclear whether this instability was caused by abusive cult practices or by coercive exit stresses, such as deprogramming.
The Psychology of Antisocial Behavior As Cult Terrorism
Nishida (2001) identified the criminal terrorist behavior of Aum Shinrikyo as representative of destructive cults in Japan. The purpose of his study was to investigate the psychological processes of Aum members who had committed crimes ranging from producing weapons without a license to the deadly sarin attack. For the study, four Aum Shinrikyo defendants were interviewed in jail. The defendants, labeled A, B, C, and D, had committed crimes, including terrorist acts such as murder, using VX and sarin gases. At the time of their interviews, three of the defendants had decided to leave the group. As well, three of the interviewees (A, B, and C) were among the seventy-six former group members who completed questionnaires designed to examine their experiences and lives within the cult.
It has been shown that the profound devotion to Guru Asahara was as unswerving among the highly ranked criminal defendants as it was among lower-ranking, innocent cultists (see Table 1). Basic statistics and factor-pattern analysis of Aum’s psychological manipulations also indicate that the defendants’ responses were stronger than innocent members’ responses (see Table 2). It can be concluded from the analysis that Aum believers were unconditionally deferential to Asahara’s authority, and that his psychological manipulation had a more profound effect on those members who committed serious crimes than on members who were innocent of criminal activity.
As the defendants’ answers further revealed, members were warned repeatedly that the only correct manner to execute their guru’s assignments was to do so unequivocally. The analysis confirms that Asahara carefully monitored cultists to determine who was sufficiently obedient to commit even the most heinous crimes at his command. The higher one’s status in the cult, the more deeply influenced by Asahara one was. Psychological manipulation was most effective among highly placed devotees, as compared with cult members of lower status.
The findings regarding the defendants’ psychological processes during the commission of their crimes must be emphasized in the analysis of the study: they committed their crimes in obedience to an authority, an obedience that arose from their perception that the cult’s dogma was superior. In the defendants’ discussions of the grave matter of their terrorist murders, they looked on the killings as their personal “practice” that guaranteed “salvation” for the victims. Because they had lived under conditions of extreme physical and mental duress, however, it was difficult for them to understand why their guru had been able to dictate such terrible crimes. As such, their answers may not indicate what they truly thought, but rather what they imagined their own thoughts to have been at the time of the attack.
There is another, more sinister reason why some defendants committed such dire acts. Some were terrified that they themselves would be killed if they disobeyed Asahara’s grim orders. Other cult members almost certainly committed numerous crimes for fear of his wrath; indeed, they had seen disobedient fellow members killed, and, in fact, had been regularly threatened by Asahara with death.
Post-Cult Residual Psychological Distress
Over the past few decades, a considerable number of studies have been completed on the psychological problems former cult members have experienced after leaving the cult, as compared with the mind-control process itself. It is important to note that most former members continue to experience discontent, although its cause remains controversial (Aronoff, Lynn, & Malinoski, 2000). A few studies on cult phenomena have been conducted so far in Japan, notably by Nishida (1995a, 1998), and by Nishida and Kuroda (2003, 2004), who investigated ex-cultists’ post-exit problems, based mainly on questionnaires administered to former members from two different cults.
In a series of studies, Nishida and Kuroda (2003) surveyed 157 former members of the Unification Church and Aum Shinrikyo. Using factor analysis, the studies posited eleven factors that contribute to ex-members’ psychological problems. These factors can be classified into three main groups: 1) emotional distress, 2) mental distress, and 3) interpersonal distress. The eleven factors are 1) tendencies to depression and anxiety, 2) loss of self-esteem, 3) remorse and regret, 4) difficulty in maintaining social relations and friendships, 5) difficulty in family relationships, 6) floating or flashback to cultic thinking and feeling, 7) fear of sexual contact, 8) emotional instability, 9) hypochondria, 10) secrecy of cult life, and 11) anger toward the cult. These findings seem to correspond closely with those of previous American studies.
Moreover, Nishida and Kuroda (2004) deduced from their analysis of variance of the 157 former members surveyed that depression and anxiety, hypochondria, and secrecy of cult involvement decreased progressively, with the help of counseling, after members left the cult. However, loss of self-esteem and anger toward the cult increased as a result of counseling.
Furthermore, Nishida (1998) found clear gender differences in the post-exit recovery process. Although female ex-cultists’ distress levels were higher than those of the males immediately after they left the cults, the women experienced full recovery more quickly than the men. The study also found that counseling by nonprofessionals works effectively with certain types of distress, such as anxiety and helplessness, but not for others, such as regret and self-reproof.
It can be concluded from Japanese studies on destructive cults that the psychological manipulation known as cult mind control is different from brainwashing or coercive persuasion. Based on my empirical studies, conducted from a social psychology point of view, I concluded that many sets of social influence are systematically applied to new recruits during the indoctrination process, influences that facilitate ongoing control of cult members. My findings agree with certain American studies, such as those conducted by Zimbardo and Anderson (1993), Singer and Lalich (1995), and Hassan (1988, 2000). The manipulation is powerful enough to make a vulnerable recruit believe that the only proper action is to obey the organization’s leaders, in order to secure humanity’s salvation, even though the requisite deed may breach social norms. Furthermore, it should be pointed out that dedicated cult veterans are subject to profound distress over the extended period of their cult involvement.
References
Abelson, R. P. (1986). Beliefs are like possessions. Journal for the Theory of Social Behavior, 16(3), 223-250.
Allport, F. H. (1924). Social psychology. Boston: Houghton Mifflin.
Ando, K., Tsuchida, S., Imai, Y., Shiomura, K., Murata, K., Watanabe, N., Nishida, K., & Genjida, K. (1998). College students and cults in Japan: How are they influenced and how do they perceive the group members? Japanese Psychological Research, (4), 207-220. [Reprinted in Cultic Studies Review, 4(1), 2005.]
Aronoff, J., Lynn, S., & Malinoski, P. (2000). Are cultic environments psychologically harmful? Clinical Psychology Review, (1), 91-111.
Aronson, E., & Linder, D. (1965). Gain and loss of esteem as determinants of interpersonal attractiveness. Journal of Experimental Social Psychology, 1, 156-171.
Bem, D. J. (1972). Self-perception theory. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 6). New York: Academic Press.
Festinger, L. (1954). A theory of social comparison processes. Human Relations, 7, 117-140.
Festinger, L. (1957). A theory of cognitive dissonance. Stanford: Stanford University Press.
Hassan, S. (1988). Combating cult mind control. Rochester: Park Street Press.
Hassan, S. (2000). Releasing the bonds. Somerville: Freedom of Mind Press.
Hirata, H. (2001). The crimes and teachings of Aum Shinrikyo. Cultic Studies Review, 18, 36-42.
Janis, I. L. (1971). Groupthink. Psychology Today, 5, 43-46.
Judgment by the Fukuoka (Japan) District Court on the Unification Church. (1995). Cultic Studies Journal, 12(1), 72-102.
Lifton, R. J. (1961). Thought reform and the psychology of totalism. New York: W. W. Norton.
Nishida, K. (1994). A study of belief formation and its change (3): Process of belief-system change by cult mind control. Research in Social Psychology, (2), 131-144. (In Japanese)
Nishida, K. (1995a). What is mind control? Tokyo: Kinokuniyasyoten. (In Japanese)
Nishida, K. (1995b). A study of belief formation and its change (4): Analysis of belief-system enhancement/maintenance by cult mind control. Research in Social Psychology, (1), 18-29. (In Japanese)
Nishida, K. (1998). Science on “beliefs as thought tools”: The social psychology of belief systems and mind control. Tokyo: Saiensusya. (In Japanese)
Nishida, K. (2001). A social psychology analysis of Aum Shinrikyo’s criminal behavior. Japanese Journal of Social Psychology, (3), 170-183. (In Japanese)
Nishida, K., & Kuroda, F. (2003). A study of psychological problems after leaving destructive cults: Progress during the period after leaving and the effect of counseling. Japanese Journal of Social Psychology, (3), 192-203. (In Japanese)
Nishida, K., & Kuroda, F. (2004). The influence of life in “destructive cults” on ex-cultists’ psychological problems after leaving the cults. Japanese Journal of Psychology, (1), 9-15. (In Japanese)
Ross, L., Lepper, M., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32, 880-892.
Schein, E., Schneier, I., & Barker, C. H. (1961). Coercive persuasion. New York: W. W. Norton.
Singer, M., & Lalich, J. (1995). Cults in our midst. San Francisco: Jossey-Bass Publishers.
Srull, T. K., & Wyer, R. S. (1980). Category accessibility and social perception: Some implications for the study of person memory and interpersonal judgments. Journal of Personality and Social Psychology, 841-856.
Turner, J. C., Hogg, M. A., Oakes, P. J., Reicher, S. D., & Wetherell, M. S. (1987). Rediscovering the social group: A self-categorization theory. Oxford: Blackwell.
Wexler, M. L. (1995). Expanding the groupthink explanation to the study of contemporary cults. Cultic Studies Journal, (1), 49-71.
Yamaguchi, H. (2001). Cults in Japan: Legal issues. Cultic Studies Review, 18, 43-68.
Zimbardo, P. G. (1975). On transforming experimental research into advocacy for social change. In M. Deutsch & H. Hornstein (Eds.), Applying social psychology: Implications for research, practice, and training. Hillsdale, NJ: Erlbaum.
Zimbardo, P. G., & Anderson, S. (1993). Understanding mind control: Exotic and mundane mental manipulations. In M. D. Langone (Ed.), Recovery from cults. New York: W. W. Norton.