How Good People & Well-Intentioned Groups Can Go Bad


The ways cults and noxious movements can turn relatively decent people into irrational bigots who fear and/or loathe outgroup members, and the ways well-meaning movements can turn into pits of vipers or go completely corrupt after a while, have long fascinated me. It's also something I've watched go down far, far too many times. So for those of you interested in writing this sort of thing, I present this article.


Having the right buttons pushed and the right traits encouraged

Just about everyone would like to be special or exceptional in some way, and if a group can make someone feel that way, that can be a very powerful motivator to join the group. (EG, "Are you a member of [demographic]? Then join us and take your rightful place!") Likewise, so is making a person feel wanted, needed, and useful. (EG, "We need your help to make paradise happen!") Most people want to be part of something bigger than themselves and belong to a definable group, and they want to strive for something good. (EG, "Join us, child of God, and help us bring the nation to purity and righteousness!") And by joining up with a group, we can gain an identity as well as new friends and allies who will support us in our endeavors and beliefs. All of these are potentially very powerful draws that can bypass critical and rational thinking, as they appeal to some of our most primal needs and desires.

Someone with mild paranoid-xenophobic tendencies can maintain a relatively healthy worldview if the people around them provide counterpoints to allay those fears and worries. On the other hand, if the people this person hangs around with reinforce those fears, that person can soon slide into an irrational and bigoted worldview. In fact, almost anyone can spiral down into irrational hatred if there are enough people around feeding and nurturing that person's pet prejudices and biases.


Getting hooked on the superiority high

Joining a group or cause one perceives as good comes with a huge emotional rush. One is suddenly on the right side of things and now has a life purpose and something to fight for. It's a seductive, addictive feeling - we feel energized and vitalized, and every time we get into a confrontation or argument with someone who doesn't see things the way we do, no matter how angry and frustrated we get at this person, we can always leave the argument knowing that we've got the moral high ground. Seeing those who agree with our beliefs speak out passionately (or even ragefully) about them can give us the same feeling, and boy does it feel good.

And here's the problem: it doesn't matter whether the group or cause actually is good; the only thing that matters is that the person who joins it perceives it as good. People can potentially get the same rush of righteousness arguing for a preferred electronics brand or a favored soap opera as they can arguing for civil rights or free water access. Hypothetically, you could have a group that believes in purging the planet of all cruciferous vegetables, and as long as its members believe the cause is good and righteous, they could get a righteousness rush off of trying to warn people against the evils of cabbage and mustard, or from listening to their fellow anticruciferans preaching against turnips and broccoli.

Furthermore, the sense of superiority the person has can create the illusion that because they're on the right side, they're better than everyone else and so can do no wrong. Thus, you had Harry Potter fans acting just as rudely (if not more so at times) as the Twilight fans they believed themselves to be superior to. On a more serious scale, this can potentially mean vicious and even deadly attacks against members of an outgroup for small or even merely perceived slights against the group. (See also: the Ideology Sue.)

And if you have a group whose leaders and/or members constantly reinforce that perceived superiority to each other, things can get nasty even faster.


Gradual extremization or indoctrination

A group that began as something innocuous or even positive can turn into something toxic and harmful over time, particularly if its more extreme members are the loudest and most powerful. This is especially likely if moderate members who could have helped keep the group grounded and stable are bullied and shamed by the extremists for dissenting. If this happens, moderate members will either leave or be driven out, keep their opinions to themselves, or get caught up in the extremists' fervor. Before long, you can end up with a group where the most powerful and influential members are the extremists, and any dissent from the party line is met with swift retribution, be it social (shaming tactics, shunning, etc.) or physical (banning/blocking, corporal punishment, execution, etc.).

Toxic groups that need new blood may keep their more controversial bits hidden from new members - eg, stuff that's obviously hateful or just plain wacky. Instead, new members are gradually eased in and given access to new levels of truth as senior members deem fit. By the time they get to the truly nasty or kooky stuff, they may be so brainwashed that they accept it with little to no questioning, or they may be so involved in or dependent on the group that they couldn't leave even if they wanted to, or the group or its cause may have become such an integral part of their identities that they don't know who they'd be or what they'd do without it. Once they're deeply involved in the group's way of thinking and/or define themselves by being members of the group, all sorts of actions and behaviors that would have seemed horrific and unthinkable before may seem rational and justifiable.

A noxious group can present itself to potential new members as something harmless or even beneficial - such as a group that simply wants to help people and bring about world peace, or bring about equality, or share a new spiritual truth. Once the newbies have made some form of commitment to the group, feel a sense of loyalty and belonging, and generally agree with the basic stuff, older members might start sharing some of the heavier material - perhaps some of the more xenophobic beliefs (EG, "It's a fact that [negative trait] is an inborn trait of [outgroup demographic]" or "All [outgroup demographic] are [negative trait]!") or morally questionable practices (EG, "If people don't want to donate to our cause, it's okay to make them think they're donating to a cause they like instead, because it's all for the greater good."). And so on and so forth.

Note that at no point can the new truths contradict the person's core goals or beliefs: For example, if someone joins a group that purports to want to bring the country back to its former glory because that's what xe wholeheartedly wants to do, xe'll most likely balk and back out if xe suddenly finds out that the group actually plans to put the country under the power of a hostile nation. Likewise, a pacifist who joins a religious order because xe was told that the gods were peaceful and nonviolent is not likely to react with simple compliance if told that the gods actually demand blood sacrifices.

Impatience and frustration are also potential causes of extremization in both groups and individuals. It's all too easy to become angry and embittered when one's hard work doesn't really seem to be paying off, or when it isn't paying off as fast as one thinks it should. All of this built-up frustration makes it easy to start lashing out at people who appear to be a part or cause of the problem, and if this behavior goes unchecked, the person or group can spiral into increasingly repugnant and reprehensible behaviors - such as profiling, stereotyping, and "shoot first, ask questions later" and "guilty until proven innocent" attitudes and policies.


Blocking out input from those who disagree or dissent

This can happen in multiple ways in any size of group or organization. We're probably all familiar with how dictatorships can regulate and block undesirable materials from entering a country, but this sort of thing can happen on as small a scale as a religious or ideological movement on the Internet. So, let's take a look at the various ways and reasons people can censor and block out contradictory information:

1. Individuals ignore dissenting opinions because acknowledging them - and possibly even admitting that their opponents might have a point - would mean admitting that they were wrong about something. This could potentially mean having to admit that they'd wasted a lot of time and/or money and possibly wronged a lot of people… or even having to admit that their entire worldview was wrong, thereby losing their sense of place and purpose in the world. The cause has become more important than the truth, and this is the worst position anyone fighting for anything can be in.

2. They've been told by authority figures that dissenting materials and opinions from outsiders can't be trusted. Common reasons given are that the materials exist only to undermine the group and the good or righteous work it does, that they're full of falsehoods because they're written by people who have been brainwashed to believe lies, or that dissenting opinions will plant dangerous ideas in members' heads that will ultimately lead them to damnation or destruction. Once they're sufficiently afraid of outside opinions, members can usually be counted upon to police themselves. Furthermore, once the group believes that outsiders are on the whole hateful and malicious and will oppose their beliefs with ridicule and contempt as a matter of course, any encounter with someone who does exactly this will only validate their beliefs and solidify their loyalty to the group.

3. The authority figures take it upon themselves to censor what members have access to, cutting off dissenting material before members can reach it in the first place. This can potentially mean banning or blocking people from posting dissenting opinions in Internet communities, moving everyone out to a commune or compound where bringing in dissenting materials is prohibited and even punishable, or banning certain materials from being published or imported on a nationwide scale.

Dismissing outside ideas as dangerous or outright blocking them, while building up the perception of outgroups as intolerant and dangerous, can be a powerful means of brainwashing people and taking control of their thoughts. Without access to outside opinions, people's perceptions of reality are relatively easy to mold into whatever the leaders or the most extreme members want them to be - with nothing to tell them otherwise, they'll have no real way to know differently. And if members of the outgroups are painted as inhuman monsters who seek nothing but the group's subjugation or destruction because they're irredeemably corrupt or inherently evil, something like wiping them all out can start to seem pretty justifiable…


Denying or rationalizing away the bad parts… from the minor to the horrific

People don't want to think of themselves as bad, so when faced with the unavoidable prospect of doing something they'd normally consider morally objectionable, they often start finding ways to justify it. Likewise, if other people in a group they're part of start doing things they would ordinarily consider morally objectionable, and they feel a strong emotional attachment to the group, have no real way to leave it, or see no alternative to it, they may start finding ways to justify the group's behavior or simply refuse to acknowledge it.

A hypothetical prison guard ordered to beat prisoners for acting out for understandable reasons (eg, the prisoners were convicted on flimsy or questionable charges and are being treated horribly) might come to justify xir actions by convincing xirself that the prisoners must have been put in there for a good reason, and that if the prisoners didn't want to be punished, then they should have behaved. Or, a hypothetical member of a religious group who hears about how several members of the group have been abusing others and spreading toxic doctrine may simply dismiss them as "fake" members who can be ignored rather than dealt with - and if abusive or toxic members are left unchecked long enough, they can soon swell in number and/or influence to the point where they overwhelm or overpower the rest.

Here are some other potential justifications and denialisms:

See also Mindsets & Rationales That Lend Well To Villainy for more potential rationalizations for awful actions.


The Milgram Effect

In short, people often tend to do what perceived authority tells them to do… even when you might think that things like common sense and a basic moral compass should tell them otherwise. In 1961, Stanley Milgram conducted an experiment to see just how much weight the "I was just following orders!" excuse actually held… and it turned out to be quite a bit.

Milgram's experiment involved a setup where one participant (the "teacher") would ask a second participant (the "learner") a series of questions. If a question was answered incorrectly, the teacher was to deliver an electrical shock to the learner. The shocks increased in intensity throughout the experiment; as they did, the learner would beg to be let out and would eventually cease to answer at all. A researcher observing the experiment would instruct the teacher to deliver the shocks anyway, and if the teacher asked whether to continue despite the learner's distress, the researcher would tell them to continue on. All of the teachers continued up to the 300-volt shocks, and 65% went all the way to giving the learners 450-volt shocks.

...Okay, in reality, the "learners" weren't actually being shocked; they were simply pretending to be, and the "researcher" was just an actor. But the point was, these perfectly ordinary people complied disturbingly easily with orders to torture someone because they believed the orders were coming from someone with authority. Repeats of this experiment showed that this wasn't just a one-time fluke or a quirk of this particular sample, either - different tests got pretty much the same results.

So, what happens when an authority figure tells a squad of soldiers to shoot everybody in a village because they might all be terrorists, or when an authority figure starts telling parents that they need to beat the living daylights out of their children for each and every perceived misdeed…?

Yeah...


You know a group has gone well and truly rotten when any of these are present...


If you liked this, you might also be interested in:


Things About Brainwashing Writers Need To Know
Things About Moral Panics Writers Should Know
Basic Tips To Write Better Abuse Victims & Abuse Situations
Basic Tips To Create Better Characters With Tragic & Traumatic Backstories
Changing Alignments, Allegiances, & Loyalties More Believably
On Writing Sympathetic Morally-Ambiguous Characters
Basic Tips To Write Better & More Despicable Villains
Creating & Writing Fictional Organizations
A Beginner's Guide To Spotting Cranky Websites

External Resources


Propaganda & Debate Techniques
The Cult Test
Why Do People Become Islamic Extremists? (YouTube)


