Brainwashing, also known as thought reform or re-education, is the application of persuasive techniques to change the beliefs or behavior of one or more people, usually for political or religious purposes. Whether any techniques exist that actually work to change thought and behavior to the degree that the term "brainwashing" connotes is a controversial and at times hotly debated question.

Origin of the term

The term brainwashing is relatively new in the English language; it did not exist before 1950. Earlier forms of coercive persuasion had been seen during the Inquisition, in the show trials against "enemies of the state" in the Soviet Union, and elsewhere, but no specific term emerged until the methodologies of these earlier movements were systematized during the early decades of the People's Republic of China for use in its struggles against internal class enemies and foreign invaders. Until that time, accounts were limited to descriptions of specific techniques.

The term xǐ nǎo (洗腦, literally "to wash the brain") was first applied to methodologies of coercive persuasion used in the "reconstruction" (改造 gǎi zào) of the so-called feudal (封建 fēng jiàn) thought patterns of Chinese citizens raised under prerevolutionary regimes. The term came into use in the United States during the Korean War of the 1950s to describe those same methods as applied by the Chinese communists in attempts to produce deep and permanent behavioral changes in foreign prisoners, and especially to disrupt the ability of captured United Nations troops to organize effectively and resist their imprisonment.

It was consequently used in the United States to explain why, compared to earlier wars, a relatively high percentage of American GIs defected to the Communists after becoming prisoners of war. Later analysis determined that some of the primary methodologies employed on them during their imprisonment included sleep deprivation and other intense psychological manipulations designed to break down the autonomy of individuals. American alarm at the new phenomenon of substantial numbers of U.S. troops switching their allegiances to the enemy was ameliorated after prisoners were repatriated and it was learned that few of them retained allegiance to the Marxist and "anti-American" doctrines that had been inculcated during their incarcerations. The key finding was that when rigid control of information was terminated and the former prisoners' natural methods of reality testing could resume functioning, the superimposed values and judgments were rapidly attenuated.

Although the use of brainwashing on United Nations prisoners during the Korean War produced some propaganda benefits, its main utility to the Chinese lay in the fact that it significantly increased the maximum number of prisoners that one guard could control, thus freeing other Chinese soldiers to go to the battlefield.

In later usage the term "brainwashing" came to apply to other methods of coercive persuasion, and even to the effective use of ordinary propaganda and indoctrination. In the formal discourse of the Chinese Communist Party, the more clinical-sounding term sī xiǎng gǎi zào (思想改造, "thought reform") came to be preferred.

Present use of the term

Many people have come to use the terms "brainwashing" or "mind control" to explain the otherwise intuitively puzzling success of some methodologies for the religious conversion of inductees to new religious movements (including cults).

The term "brainwashing" is not widely used in psychology and other sciences, because of its vagueness and history of being used in propaganda, not to mention its association with hysterical fears of people being taken over by foreign ideologies. It is often more helpful to analyze "brainwashing" as a combination of manipulations to promote persuasion and attitude change such as propaganda, coercion, capture-bonding, and restriction of access to neutral sources of information. Note that many of these techniques are more subtly used (usually unconsciously) by advertisers, governments, schools, parents and peers, so the aura of exoticism around "brainwashing" is undeserved. At the same time, nuanced forms of indoctrination and propaganda in religious, political and commercial venues may occasion wider and deeper impacts than do outright coercive tactics. Mirroring George Orwell's doublespeak, strategists of indoctrination and propaganda frequently disguise themselves as promoters of freedom and liberation.

Thought reform is the alteration of a person's basic attitudes and beliefs by outside manipulation. The term usually relates closely to brainwashing and mind control.

One of the first published uses of the term thought reform occurred in the title of the book by Robert Jay Lifton (a professor of psychology and psychiatry at John Jay College and at the Graduate Center of the City University of New York): Thought Reform and the Psychology of Totalism: A Study of 'Brainwashing' in China (1961). (Lifton also testified at the 1976 trial of Patty Hearst.) In that book he used the term thought reform as a synonym for brainwashing, though he preferred the former term. The elements of thought reform as published in that book are sometimes used as a basis for cult checklists; they are as follows. [1] [2]

  • Milieu Control
  • Mystical Manipulation
  • The Demand For Purity
  • Confession
  • Sacred Science
  • Loading the Language
  • Doctrine Over Person
  • Dispensing of Existence

Benjamin Zablocki sees brainwashing as a "term for a concept that stands for a form of influence manifested in a deliberately and systematically applied traumatizing and obedience-producing process of ideological resocializations" and states that this same concept has historically also been called thought reform and coercive persuasion.

Popular speech continues to use the word brainwashed informally and pejoratively to describe persons subjected to intensive influence resulting in the rejection of old beliefs and in the acceptance of new ones; or to account for someone who holds strong ideas considered to be implausible and that seem resistant to evidence, common sense, experience, and logic. Such popular usage often implies a belief that the ideas of the allegedly brainwashed person developed under some external influence such as books, television programs, television commercials (as producing brainwashed consumers), video games, religious groups, political groups, or other people. Mind control expresses a conception only mildly less dramatic than brainwashing, with thought control slightly milder again. With thought reform and coercion we start to move into acceptably neutral academic jargon and into the areas of propaganda, influence and persuasion.

Political brainwashing

Studies of the Korean War

The Communist Party of China used the phrase "xǐ nǎo" ("brain rinse") to describe its methods of persuasion in ensuring that members who did not conform to the Party message were brought into orthodoxy. The phrase was a play on "xǐ xīn" (洗心, "wash heart"), an admonition found in many Daoist temples exhorting the faithful to cleanse their hearts of impure desires before entering.

In September 1950, the Miami Daily News published an article by Edward Hunter (1902-1978) titled "'Brain-Washing' Tactics Force Chinese into Ranks of Communist Party." It contained the first printed use of the English-language term "brainwashing," which quickly became a stock phrase in Cold War headlines. Hunter, a CIA propaganda operator [3] who worked undercover as a journalist, turned out a steady stream of books and articles on the subject. An additional article by Hunter on the same subject appeared in New Leader magazine in 1951. In 1953 Allen Welsh Dulles, the CIA director at that time, explained that "the brain under [Communist influence] becomes a phonograph playing a disc put on its spindle by an outside genius over which it has no control."

In his 1956 book "Brain-Washing: The Story of the Men Who Defied It", Edward Hunter described "a system of befogging the brain so a person can be seduced into acceptance of what otherwise would be abhorrent to him." According to Hunter, the process is so destructive of physical and mental health that many of his interviewees had not fully recovered after several years of freedom from Chinese captivity.

Later, two studies of the Korean War defections, by Robert Lifton and Edgar Schein, concluded that brainwashing had a transient effect when used on prisoners of war. Lifton and Schein found that the Chinese did not engage in any systematic re-education of prisoners, but generally used their techniques of coercive persuasion to disrupt the prisoners' ability to organize, to maintain their morale, and to attempt escape. The Chinese did, however, succeed in getting some of the prisoners to make anti-American statements by placing them under harsh conditions of physical and social deprivation and disruption, and then offering them more comfortable situations such as better sleeping quarters, better food, warmer clothes or blankets. Nevertheless, the psychiatrists noted that even these measures of coercion proved quite ineffective at changing basic attitudes for most people. In essence, the prisoners did not actually adopt Communist beliefs; rather, many of them behaved as though they did in order to avoid the plausible threat of extreme physical abuse. Moreover, the few prisoners influenced by Communist indoctrination apparently succumbed as a result of the confluence of the coercive persuasion and of motives and personality characteristics that had existed before imprisonment. In particular, individuals with very rigid systems of belief tended to snap and realign, whereas individuals with more flexible systems of belief tended to bend under pressure and then restore themselves when the external pressures were removed.

Working independently, Lifton and Schein discussed coercive persuasion in their analyses of the treatment of Korean War POWs. They defined coercive persuasion as a mixture of social, psychological and physical pressures applied to produce changes in an individual's beliefs, attitudes, and behaviors. Both concluded that such coercive persuasion can succeed in the presence of a physical element of confinement, "forcing the individual into a situation in which he must, in order to survive physically and psychologically, expose himself to persuasive attempts." They also concluded that such coercive persuasion succeeded only with a minority of POWs, and that the end result of such coercion remained very unstable, as most of the individuals reverted to their previous condition soon after they left the coercive environment.

The use of coercive persuasion techniques in China

Following the armistice that interrupted hostilities in the Korean War, a large group of intelligence officers, psychiatrists, and psychologists was assigned to debrief United Nations soldiers being repatriated. The government of the United States wanted to understand the unprecedented level of collaboration, the breakdown of trust among prisoners, and other indications that the Chinese were doing something new and effective in their handling of prisoners of war. Formal studies in academic journals began to appear in the mid-1950s, as did some first-person reports from former prisoners. In 1961, two books were published by specialists in the field who synthesized these studies for non-specialists concerned with issues of national security and social policy: Edgar H. Schein's Coercive Persuasion and Robert J. Lifton's Thought Reform and the Psychology of Totalism. Both books were primarily concerned with the techniques called "xǐ nǎo" or, more formally, "sī xiǎng gǎi zào" (reconstructing or remodeling thought). The following discussion is based in large part on their studies.

Although American attention came to bear on thought reconstruction or brainwashing as a result of the Korean War, the techniques had already been used on ordinary Chinese citizens after the establishment of the People's Republic of China. The PRC had refined and extended techniques earlier used in the Soviet Union to prepare prisoners for show trials, and the Soviets in turn had learned much from the Inquisition. In the Chinese context, these techniques had multiple goals that went far beyond the simple control of subjects in the prison camps of North Korea. They aimed to produce confessions, to convince the accused that they had indeed perpetrated anti-social acts, to make them feel guilty of these crimes against the state, to make them desirous of a fundamental change in outlook toward the institutions of the new communist society, and, finally, to actually accomplish these desired changes in the recipients of the brainwashing/thought-reform. To that end, the practitioners sought techniques that would break down the psychic integrity of the individual with regard to information processing, information retained in the mind, and values. Chosen techniques included dehumanization of individuals by keeping them in filth, sleep deprivation, partial sensory deprivation, psychological harassment, inculcation of guilt, and group social pressure. The ultimate goal that drove these extreme efforts was the transformation of an individual with a "feudal" or capitalist mindset into a "right thinking" member of the new social system or, in other words, the transformation of what the state regarded as a criminal mind into what the state could regard as a non-criminal mind.

The methods of thought control proved extremely useful when they came to be employed for gaining the compliance of prisoners of war. Key elements in their success included tight control of the information available to the individual and tight control over the behavior of the individual. When, after repatriation, close control of information ceased and reality testing could resume, former prisoners fairly quickly regained a close approximation of their original picture of the world and of the societies from which they had come. Furthermore, prisoners subject to thought control often had simply behaved in ways that pleased their captors, without changing their fundamental beliefs. So the fear of brainwashed sleeper agents, such as that dramatized in the novel and the films The Manchurian Candidate, never materialized.

Terrible though the process frequently seemed to individuals imprisoned by the Chinese Communist Party, these attempts at extreme coercive persuasion ended with a reassuring result: they showed that the human mind has an enormous ability to adapt to stress and a powerful homeostatic capacity. John Clifford, S.J., gives an account of one man's adamant resistance to brainwashing in In the Presence of My Enemies that substantiates the picture drawn from the studies of large groups reported by Lifton and Schein. Allyn and Adele Rickett [4] wrote a more penitent account of their imprisonment (Allyn Rickett had, by his own admission, broken PRC laws against espionage) in Prisoners of the Liberation, but it too details techniques such as the "struggle groups" described in other accounts. Between these opposite reactions to attempts by the state to reform them, experience showed that most people would change under pressure and would change back when the pressure was removed. The other interesting result was that some individuals derived benefit from these coercive procedures, because the interactions, perhaps as an unintended side effect, actually promoted insight into dysfunctional behaviors that were then abandoned.

Mass brainwashing

In societies where the government maintains tight control of both the mass media and education system and uses this control to disseminate propaganda on a particularly intensive scale, the overall effect can be to brainwash large sections of the population. This is particularly effective where nationalist or religious sentiment is invoked and where the population is poorly educated and has limited access to independent or foreign media. For example, the Chinese and North Korean governments have often been accused of brainwashing their people.

Refutation of political brainwashing

According to research and forensic psychologist Dick Anthony, the CIA invented the brainwashing ideology as a propaganda strategy to undercut communist claims that American POWs in Korean communist camps had voluntarily expressed sympathy for communism; he states that definitive research demonstrated that collaboration by western POWs had been caused by fear and duress, and not by brainwashing. He argues that the CIA brainwashing theory was pushed to the general public through the books of Edward Hunter, who was a secret CIA "psychological warfare specialist" passing as a journalist. He further asserts that for twenty years, starting in the early 1950s, the CIA and the Defense Department conducted secret research (notably including Project MKULTRA) in an attempt to develop practical brainwashing techniques (possibly to counteract the brainwashing efforts of the Chinese), and that their attempt was a failure.

Brainwashing controversy in new religious movements and cults

The main disputes regarding brainwashing lie in the field of cults and NRMs. The controversy over the existence of cultic brainwashing is one of the most polarizing issues separating the camps of cult sympathizers and cult critics. There is no agreement either about the existence of a social process of attempted coercive influence or about the existence of the social outcome, namely that people are influenced against their will.

The issue is further complicated by the existence of several definitions of brainwashing, some of them almost strawman caricatures, and by the introduction of the similarly controversial mind control concept in the 1990s, which is at times used interchangeably with brainwashing and at other times differentiated from it. Additionally, some authors refer to brainwashing as a recruitment method (Barker), while others refer to it as a method of retaining existing members (Kent 1997, Zablocki 2001).

Another factor is that brainwashing theories have been debated in court, where experts have had to present their views to juries in simpler terms than those used in academic publications, and where the issue has had to be framed in rather black-and-white terms to make a point in the case. Such cases, along with their black-and-white framing, have then been taken up by the media.

In 1984 the British sociologist Eileen Barker reported in her book The Making of a Moonie: Choice or Brainwashing?, which was based on her first-hand studies of British Unification Church members, that she had found no extraordinary persuasion techniques being used to recruit or retain members.

It is reported that "In his article in Nova Religio, Zablocki was worried less about those academics who may stretch the brainwashing concept than about those, like Bromley, who reject it altogether. And in advancing his case, he took a hard look at such scholars’ intentions and tactics. (His title is deliberately provocative: 'The Blacklisting of a Concept: The Strange History of the Brainwashing Conjecture in the Sociology of Religion.')"[1] In his book Combatting Cult Mind Control, the American psychologist Steven Hassan describes the extraordinary persuasion techniques that, in his opinion, were used to accomplish his own recruitment into and retention by the Unification Church.

Philip Zimbardo writes that "Mind control is the process by which individual or collective freedom of choice and action is compromised by agents or agencies that modify or distort perception, motivation, affect, cognition and/or behavioral outcomes. It is neither magical nor mystical, but a process that involves a set of basic social psychological principles." (Zimbardo, 2002)

The APA, DIMPAC, and the brainwashing theories

In the early 1980s, some U.S. mental health professionals became controversial figures owing to their involvement as expert witnesses in court cases against new religious movements. In their testimony, they stated that anti-cult theories of brainwashing, mind control, or coercive persuasion were generally accepted concepts within the scientific community. In 1983 the American Psychological Association (APA) asked Margaret Singer, one of the leading proponents of coercive persuasion theories, to chair a taskforce, the APA Taskforce on Deceptive and Indirect Techniques of Persuasion and Control (DIMPAC), to investigate whether brainwashing or "coercive persuasion" did indeed play a role in recruitment by such movements. Before the taskforce had submitted its final report, however, the APA filed an amicus curiæ brief in an ongoing case on February 10, 1987. The brief stated that

[t]he methodology of Drs. Singer and Benson has been repudiated by the scientific community, that the hypotheses advanced by Singer were little more than uninformed speculation, based on skewed data and that "[t]he coercive persuasion theory ... is not a meaningful scientific concept."[5]

The brief characterized the theory of brainwashing as not scientifically proven and suggested the hypothesis that cult recruitment techniques might prove coercive for certain sub-groups while not affecting others coercively. On March 24, 1987, the APA filed a motion to withdraw its signature from this brief, as it considered the conclusion premature in view of the ongoing work of the DIMPAC taskforce.[6] The amicus brief as such remained in place, since only the APA withdrew its signature; the co-signing scholars, among them Jeffrey Hadden, Eileen Barker, David Bromley and J. Gordon Melton, did not. On May 11, 1987, the APA Board of Social and Ethical Responsibility for Psychology (BSERP) rejected the DIMPAC report on the grounds that "the brainwashing theory espoused lacks the scientific rigor and evenhanded critical approach necessary for APA imprimatur", and concluded: "Finally, after much consideration, BSERP does not believe that we have sufficient information available to guide us in taking a position on this issue."

The rejection memo was accompanied by two letters from external advisers to the APA that reviewed the report. One of the letters, from Professor Benjamin Beit-Hallahmi of the University of Haifa, stated amongst other comments that "lacking psychological theory, the report resorts to sensationalism in the style of certain tabloids" and that "the term 'brainwashing' is not a recognized theoretical concept, and is just a sensationalist 'explanation' more suitable to 'cultists' and revival preachers. It should not be used by psychologists, since it does not explain anything", and asked that the report should not be made public. The second letter, from Professor of Psychology Jeffrey D. Fisher, Ph.D., said that the report "[...] seems to be unscientific in tone, and biased in nature. It draws conclusions, which in many cases do not mesh well with the evidence presented. At times, the reasoning seems flawed to the point of being almost ridiculous. In fact, the report sometimes seems to be characterized by the use of deceptive, indirect techniques of persuasion and control - the very thing it is investigating". [2]

When her findings were rejected by the APA's BSERP, Singer sued the APA in 1992 for "defamation, frauds, aiding and abetting and conspiracy" and lost in 1994. [3]

Several scholars in the NRM-sympathizer camp have since interpreted this to mean that the APA had rejected the brainwashing theories and that there was no scientific support for them (e.g. Introvigne, 1998; Bromley and Hadden in their 1993 Handbook of Cults and Sects in America).

Zablocki (1997) and Amitrani (2001) cite APA boards and scholars on the subject and conclude that there is no unanimous decision of the APA regarding this issue. They also write that Margaret Singer, despite the rejection of the DIMPAC report, continued her work and remained respected in the psychological community, which they corroborate by noting that in the 1987 edition of the peer-reviewed Merck's Manual, Singer was the author of the article "Group Psychodynamics and Cults" (Singer, 1987).

Benjamin Zablocki, professor of sociology and one of the reviewers of the rejected DIMPAC report, wrote in 1997:

"Many people have been misled about the true position of the APA and the ASA with regard to brainwashing. Like so many other theories in the behavioral sciences, the jury is still out on this one. The APA and the ASA acknowledge that some scholars believe that brainwashing exists but others believe that it does not exist. The ASA and the APA acknowledge that nobody is currently in a position to make a Solomonic decision as to which group is right and which group is wrong. Instead they urge scholars to do further research to throw more light on this matter. I think this is a reasonable position to take."

APA Division 36 (then "Psychologists Interested in Religious Issues", today "Psychology of Religion") approved the following resolution at its 1990 annual convention:

"The Executive Committee of the Division of Psychologists Interested in Religious Issues supports the conclusion that, at this time, there is no consensus that sufficient psychological research exists to scientifically equate undue non-physical persuasion (otherwise known as "coercive persuasion", "mind control", or "brainwashing") with techniques of influence as typically practiced by one or more religious groups. Further, the Executive Committee invites those with research on this topic to submit proposals to present their work at Divisional programs." (PIRI Executive Committee Adopts Position on Non-Physical Persuasion Winter, 1991, in Amitrano and Di Marzio, 2001)

In 2002, the APA's then president, Philip Zimbardo, wrote in Monitor on Psychology:

"A body of social science evidence shows that when systematically practiced by state-sanctioned police, military or destructive cults, mind control can induce false confessions, create converts who willingly torture or kill "invented enemies," engage indoctrinated members to work tirelessly, give up their money--and even their lives--for "the cause." (Zimbardo, 2002)

Other voices

In the often-quoted Fishman case, the court concluded:

"At best, the evidence establishes that psychiatrists, psychologists, and sociologists disagree as to whether or not there is agreement regarding the Singer-Ofshe thesis."

Social scientists who study new religious movements, such as Jeffrey K. Hadden (see References), understand the general proposition that religious groups can have considerable influence over their members, and that that influence may have come about through deception and indoctrination. Indeed, many sociologists observe that "influence" occurs ubiquitously in human cultures, and some argue that the influence exerted in "cults" or new religious movements does not differ greatly from the influence present in practically every domain of human action and of human endeavor.

The Association of World Academics for Religious Education states that "... without the legitimating umbrella of brainwashing ideology, deprogramming -- the practice of kidnapping members of NRMs and destroying their religious faith -- cannot be justified, either legally or morally."

F.A.C.T.net states that "Forced deprogramming was sometimes successful and sometimes unsuccessful, but is not considered an acceptable, legal, or ethical method of rescuing a person from a cult."[4]

The American Civil Liberties Union (ACLU) published a statement in 1977 related to brainwashing and mind control. In this statement the ACLU opposed certain methods of "depriving people of the free exercise of religion." The ACLU also rejected, under certain conditions, the idea that claims of the use of "brainwashing" or of "mind control" should overcome the free exercise of religion.

In the 1960s, after coming into contact with new religious movements (NRMs, a subset of which are popularly referred to as "cults"), some young people suddenly adopted faiths, beliefs, and behavior that differed markedly from their previous lifestyles and seemed at variance with their upbringings. In some cases, these people neglected or even broke contact with their families. All of these changes appeared very strange and upsetting to their families. To explain these phenomena, the theory was advanced that these young people had been brainwashed by the new religious movements: by isolating them from their family and friends (inviting them to an end-of-term camp after university, for example), arranging a sleep-deprivation program (3 a.m. prayer meetings), and exposing them to loud and repetitive chanting. Another alleged technique of religious brainwashing involved love bombing rather than torture.

James Richardson, a Professor of Sociology and Judicial Studies at the University of Nevada, states that if NRMs had access to powerful brainwashing techniques, one would expect them to have high growth rates, whereas in fact most have not had notable success in recruitment, most adherents participate for only a short time, and success in retaining members has been limited. This claim has been rejected by Langone, who compared the figures of various movements, some of which by common consent do not use brainwashing and others of which are reported by some authors to use it. (Langone, 1993)

In their Handbook of Cults and Sects in America, Bromley and Hadden present one possible ideological foundation of brainwashing theories, which they argue demonstrates the theories' lack of scientific support: they contend that the simplistic perspective inherent in the brainwashing metaphor appeals to those attempting to locate an effective social weapon to use against disfavored groups, and that any relative success of such efforts at social control should not obscure the lack of scientific basis for such opinions.

Philip Zimbardo, Professor Emeritus of Psychology at Stanford University, writes: "Whatever any member of a cult has done, you and I could be recruited or seduced into doing -- under the right or wrong conditions. The majority of 'normal, average, intelligent' individuals can be led to engage in immoral, illegal, irrational, aggressive and self destructive actions that are contrary to their values or personality -- when manipulated situational conditions exert their power over individual dispositions." (Zimbardo, 1997)

Some religious groups, especially those of Hindu and Buddhist origin, openly state that they seek to improve the natural human mind by spiritual exercises. Intense spiritual exercises have an effect on the mind, for example by leading to an altered state of consciousness. These groups also state that they do not condone the use of coercive techniques to acquire or to retain converts.

On the other hand, several scholars in sociology and psychology have in recent years stated that there is among many scholars of NRMs a bias to deny any brainwashing possibility and to disregard actual evidence. (Zablocki 1997, Amitrani 1998, Kent 1998, Beit-Hallahmi 2001)

Psychologist Steven Hassan, author of the book Combatting Cult Mind Control, has suggested that the influence of sincere but misled people can provide a significant factor in the process of thought reform. Many scholars in the field of new religious movements do not accept Hassan's BITE model for understanding cults.


References

  • Amitrani, Alberto et al.: Blind, or just don't want to see? "Brainwashing", mystification and suspicion, 1998, [7]
  • Amitrani, Alberto et al.: Blind, or just don't want to see? "Mind Control" in New Religious Movements and the American Psychological Association, 2001, Cultic Studies Review [8]
  • Anthony, Dick. 1990. "Religious Movements and 'Brainwashing' Litigation" in Dick Anthony and Thomas Robbins, In Gods We Trust. New Brunswick, NJ: Transaction. Excerpt
  • APA Amicus curiae, February 11, 1987 [9]
  • APA Motion to withdraw amicus curiae March 27, 1987[10]
  • APA Board of Social and Ethical Responsibility for Psychology, Memorandum on Brainwashing: Final Report of the Task Force, May 11, 1987 [11]
  • Bardin, David, Mind Control ("Brainwashing") Exists, in Psychological Coercion & Human Rights, April 1994, [12]
  • Benjamin Beit-Hallahmi: Dear Colleagues: Integrity and Suspicion in NRM Research, 2001 [13]
  • David Bromley, A Tale of Two Theories: Brainwashing and Conversion as Competing Political Narratives in Benjamin Zablocki and Thomas Robbins (ed.), Misunderstanding Cults, 2001, ISBN 0-8020-8188-6
  • Hadden, Jeffrey K., The Brainwashing Controversy, [14] November 2000
  • Hadden, Jeffrey K., and Bromley, David, eds. (1993), The Handbook of Cults and Sects in America. Greenwich, CT: JAI Press, Inc., pp. 75-97
  • Hassan, Steven Releasing The Bonds: Empowering People to Think for Themselves, 2000. ISBN 0-9670688-0-0.
  • Hindery, Roderick, Indoctrination and Self-deception or Free and Critical Thought? 2001.
  • Huxley, Aldous, Brave New World Revisited. Perennial (2000); ISBN 0-06-095551-1
  • Introvigne, Massimo, “Liar, Liar”: Brainwashing, CESNUR and APA, 1998 [15]
  • Kent, Stephen A., "Brainwashing in Scientology's Rehabilitation Project Force (RPF)", November 7, 1997 [16]
  • Stephen A. Kent and Theresa Krebs: When Scholars Know Sin, Skeptic Magazine (Vol. 6, No. 3, 1998). [17]
  • Kent, Stephen A.: Brainwashing Programs in The Family/Children of God and Scientology, in Benjamin Zablocki and Thomas Robbins (ed.), Misunderstanding Cults, 2001, ISBN 0-8020-8188-6
  • Langone, Michael: Recovering from Cults, 1993
  • Robert J. Lifton, Thought Reform and the Psychology of Totalism (1961), ISBN 0-8078-4253-2
  • Marks, John, "The Search for the Manchurian Candidate", 1978 [18]
  • Richardson, James T., "Brainwashing Claims and Minority Religions Outside the United States: Cultural Diffusion of a Questionable Concept in the Legal Arena", Brigham Young University Law Review circa 1994
  • Scheflin, Alan W and Opton, Edward M. Jr., The Mind Manipulators. A Non-Fiction Account, (1978), p. 437
  • Schein, Edgar H. et al., Coercive Persuasion: A socio-psychological analysis of the "brainwashing" of American civilian prisoners by the Chinese Communists (1961)
  • Singer, Margaret "Group Psychodynamics", in Merck's Manual, 1987.
  • Wakefield, Hollida, M.A. and Underwager, Ralph, Ph.D., Coerced or Nonvoluntary Confessions, Institute for Psychological Therapies, 1998
  • West, Louis J., "Persuasive Techniques in Religious Cults", 1989
  • Zablocki, Benjamin: The Blacklisting of a Concept: The Strange History of the Brainwashing Conjecture in the Sociology of Religion. Nova Religio, October 1997
  • Zablocki, Benjamin, Towards a Demystified and Disinterested Scientific Theory of Brainwashing, in Benjamin Zablocki and Thomas Robbins (ed.), Misunderstanding Cults, 2001, ISBN 0-8020-8188-6
  • Zablocki, Benjamin, "Methodological Fallacies in Anthony's Critique of Exit Cost Analysis", ca. 2002, [20]
  • Zimbardo, Philip, What messages are behind today's cults? in Monitor on Psychology, May 1997
  • Zimbardo, Philip, Mind Control: Psychological Reality or Mindless Rhetoric? in Monitor on Psychology, November 2002

See also

theory of conversion
exit tactics
brainwashing
coercive persuasion
love bombing
mind control
personality alteration
religious conversion
snapping
thought reform
deprogramming
exit counseling
intervention (counseling)
post-cult trauma
psychotherapy


Bibliography

  • Anthony, Dick, Brainwashing and Totalitarian Influence. An Exploration of Admissibility Criteria for Testimony in Brainwashing Trials, Ph.D. Diss., Berkeley (California): Graduate Theological Union, 1996, p. 165.
  • Barker, Eileen, The Making of a Moonie: Choice or Brainwashing?, Oxford, UK: Blackwell Publishers, 1984, ISBN 0-631-13246-5
  • Committee on Un-American Activities (HUAC), Communist Psychological Warfare (Brainwashing), United States House of Representatives, Washington, D. C., Tuesday, March 13, 1958
  • Hassan, Steven. Releasing The Bonds: Empowering People to Think for Themselves, 2000. ISBN 0-9670688-0-0.
  • Hunter, Edward, Brain-Washing in Red China. The Calculated Destruction of Men’s Minds, New York: The Vanguard Press, 1951; 2nd expanded ed.: New York: The Vanguard Press, 1953
  • Robert J. Lifton, Thought Reform and the Psychology of Totalism (1961), ISBN 0-8078-4253-2
  • Sargant, William, Battle for the Mind: A Physiology of Conversion and Brainwashing, 1996, ISBN 1-883536-06-5
  • Taylor, Kathleen, Brainwashing: The Science Of Thought Control, 2005, ISBN 0-19-280496-0
  • Benjamin Zablocki and Thomas Robbins (ed.), Misunderstanding Cults, 2001, ISBN 0-8020-8188-6
  • Philip Zimbardo, Mind control: psychological reality or mindless rhetoric? Monitor on Psychology, Volume 33, No. 10 November 2002


This page uses Creative Commons Licensed content from Wikipedia.
  1. Brainwashed! Scholars of Cults Accuse Each Other of Bad Faith, Lingua Franca, December 1998.
  2. APA memo and two enclosures
  3. Case No. 730012-8, Margaret Singer v. American Psychological Association
  4. Use of Forced Deprogramming, F.A.C.T.net