Tuesday, March 10, 2009

Understanding Cognitive Dissonance

The Collection of Posts about Understanding Cognitive Dissonance from the Under Much Grace Blog 

The term cognitive dissonance is used to describe the feeling of discomfort that results from holding two conflicting beliefs. When there is a discrepancy between beliefs and behaviors, something must change in order to eliminate or reduce the dissonance.

Part I:

Robert Lifton’s Thought Reform model remains the gold standard for describing how manipulative groups can cause a person to transform, accepting beliefs and practices that they would otherwise reject. Steven Hassan describes an important aspect of manipulation that Lifton's model does not address: the role that cognitive dissonance plays in the process.

Hassan states that there are three intrinsic aspects of the self: a person’s thoughts, their emotions and their behavior. He also adds information as a fourth factor, but for now, consider the three internal aspects of a person.


Human beings need a degree of consistency, and the aspects of self must all be congruent or in agreement. Consider this simple example: If a person thinks that it is wrong to ride a bike, they generally will express negative emotions concerning bike riding, and they will not engage in bike riding, likely doing things that discourage people from riding bikes. Now imagine that this person is forced to ride a bike for some reason. This creates a great deal of psychological stress for them, because their thoughts and emotions oppose their behavior. If compelled to ride a bike for very long, the person will have to create some justification to ease their emotions as well as their thoughts to decrease the stress that they experience.

According to Hassan, if a force, group or person can gain some degree of control over just one of these aspects of an individual's self, they have a very high likelihood of converting the other aspects of the person. In terms of our example, compelling our anti-bike exemplar to ride bikes will eventually make a bike believer out of him. The individual, because of the nature of how the mind works, will have to shift to accommodate the behavior. The stress of cognitive dissonance is so powerful and psychologically painful that most people will follow the path of least resistance, converting the remaining elements of self to accommodate the element that is out of sync.



Hassan includes the aspect of “information” as another element of self, because when a person is bombarded with information, this also produces painful dissonance and acts just like another element of self. Information can cause a person to shift beliefs as well.

Consider an example of how cognitive dissonance can cause a person to change their mind. I like the example of a car salesman and an impulsive decision to purchase a car. Imagine that you have taken your vehicle in for service at the dealership where you purchased it. You have no real interest in buying a car, and you go to the dealership with no such intent. In fact, it is not in your best interest at the time to purchase a car, and it behooves you to just use the one you have. But as you drive in, a new car catches your eye. The new model really is nice, and you like the way it looks as you drive by to get to the service area.


Then someone comes out to you from the service department to explain that your car’s warranty has expired, and they have discovered that a particularly expensive component of the car needs to be replaced. Your emotions have been stimulated. Not only have your emotions been engaged by how much you like the look of the new car, you’ve actually thought about how nice it would be to have one while you were waiting. The dealer has the part you need in stock, and it will just be another hour to complete the repairs. You decide to stay to wait on the car, but you are emotionally engaged because you did not anticipate such a high repair bill, which creates a great deal of stress for you.


You have another hour to wait, but you finished reading all the material you brought with you. You pick up some literature on the new model, and you wander out into the showroom to actually look at it. You are very stressed and confused about the high repair bill, and this does not fit with what you anticipated. It doesn’t make sense to you that the car should need repairing at this point, and it certainly should not cost so much. Feeling a little trapped, you have just begun to feel cognitive dissonance, because your feelings and your thoughts about your old car no longer agree. Your emotions and the situation have just influenced your behavior, and you have taken action. Now you have engaged two aspects of the self.


As you stand in the showroom, you think about how you don’t need a new car. You should just pay the repair bill. Cars have problems, and they show the signs of wear and tear. It’s really not in your best interest to buy a car at this time, and the one you have now meets your needs. But as you stand there in the showroom under the guise of occupying yourself while you wait, you are approached by a salesman. He reads the signs of stress on your face, and he may even have asked the service department about your circumstances.

He approaches you, and he offers to let you take a test drive while you’re waiting. You figure that you might as well drive the car, since you really did like the way it looked and wondered how it handled on the road. Bingo. That salesman knows that he is well on the way to making a sale. The more he engages your emotions and the more small shows of behavioral compliance he can elicit from you, the greater his chances of selling you the car. The salesman knows what he’s doing, and he appeals to Cialdini’s principle of scarcity to engage your emotions even more. He tells you the car that you want is actually the very last one on their lot. They also have a limited-time offer on financing, and he would have to check, but the deal may be gone tomorrow, along with the car itself.

From an emotional standpoint now, you really do not want to think about your old car. It has become a problem for you, and the fact that you did not anticipate problems is painful. At the same time, the idea of buying a new car that you like is quite pleasant. You have a choice as to which emotions you will entertain, even though it does not feel much like a choice because it is so subtle. Who wants to occupy their time considering that life is not really fair, that things wear out and break down, and that even when you take good care of your belongings, they may not last? How much better to think about something pleasant instead? You can entertain the idea of how much you like that new model in the showroom. And your behavior has now further reinforced your shifting emotions, because you really did like how the car felt when you drove it. Your behavior and emotions have begun to shift into a state of agreement, making your thoughts the odd element out.

The salesman knows that he has a fish that is getting quite interested in the bait he’s put out. Now he will focus on getting your cooperation, because even small acts of compliance will dramatically increase his chances of getting you to change your mind. If he can get you to give him your name and phone number, he has just dramatically increased his chances of selling you the car, even if you still do not want to buy it. The process of cognitive dissonance has begun. The individual has the choice to walk away and rethink his behavior and emotions to bring them into alignment with his thoughts (No New Cars!). Entertaining the emotions involved and continuing the behavior that demonstrates interest in a new car creates stress if a person’s thoughts do not correspond to these emotions and behaviors.


The salesman also knows that he is highly likely to make a sale if he has engaged emotion and secured compliance, because these two factors reinforce one another. Actually, the fact that he has your name and phone number may be of some practical use to him, but it matters mainly because he has secured your cooperation. You then call the bank to see exactly how much money you have in your savings account.


You started out your day planning only to have your car repaired, and now you are considering buying a new car. What has happened?

Let's look at the process again.



You started out with no dissonance. Your thoughts, emotions and behavior all corresponded, and you were comfortable.






Your emotions became engaged, challenging what you believed and what you anticipated in a negative way. You also saw something that appealed to your sense of pleasant emotion at the same time. You allowed the stress and the pleasure to overflow into another facet of self, and you changed your behavior, though the change seemed quite insignificant.





The process progresses, and you have been engaged even further. The emotional stress spills over into your behavior as you try to get information in order to make a decision about what to think.




At this point, you have put yourself into a position where you will need to make some kind of decision about the information you have learned. You are subject to emotional stress, but your behaviors have introduced some new emotions – ones of pleasure that counter your stress. Your problem of confusion over the stress of having a high car repair bill that you did not anticipate makes the emotions associated with buying a new car quite pleasant. It seems to solve a problem for you. You can shed the old car and buy a new one, even though it is not exactly in your best interest from a thoughtful perspective.




Discontinuity among thoughts, emotions and behaviors creates a great deal of psychological stress, and engaging and controlling one element of the self almost always causes all three elements to shift in some way. The individual must choose whether they will follow the path of least resistance or whether they will work to bring their self back into alignment. If you do nothing at this point, your thoughts will shift to accommodate your behavior and your emotion. Either way, you need to make a change. If you choose not to buy the car, you will have to accept the situation and stop entertaining the emotions and behaviors that correspond with buying a new car. If you choose to follow the path of least resistance, you will change your thinking and buy the car. You cannot float indefinitely in this dissonance and discontinuity.

Part II:

In any manipulative group that practices surreptitious techniques to influence members, one of the primary means of establishing and maintaining control of the community and environment comes through what Robert Lifton called “Milieu Control.”
For most groups, this involves not only control of the flow of information into / within a group but also the generation of information to keep those within the group from engaging critical thought. Any information from outside this closed system (or sometimes from a disgruntled member) that challenges the group must be silenced, because people will then begin to think for themselves and will question the veracity of the information that the group has communicated to them. The outside information will magnify the flaws in the group’s dogma. Most groups communicate the idea that only information from certain sources may be considered, and other sources will be portrayed as false or evil.

For this reason, Steven Hassan includes information along with the three elements of the self (thought, emotion and behavior) as a means of dominating a person for the purposes of surreptitious manipulation. When the group can gain control of one aspect of the self, it is highly likely that the person will follow the path of least resistance to reduce psychological stress, allowing the unaffected aspects to change as well (cognitive dissonance). Personally, I like to consider information as a separate factor because, though it can be manipulated in order to convert a person, it is external to the person themselves. However, in terms of influence, it is just as powerful a means of establishing control.

Within a manipulative or harmful group, leadership uses information to affect all of the aspects of self in order to reinforce the identity of members. They isolate group members from information outside of the group, replacing it with their own perceptions and message. In order to overcome pre-existing thoughts or ideas that might be held by members, it may be necessary to use logical fallacies and propaganda techniques to convey and reinforce their messages (both of which also work on the emotions). Information may also be used to convey messages of shame and fear to coerce members, but groups can also appeal to desires by promising solutions to troubling problems. Groups actually create a perception of problems and then provide an irresistible solution to those problems, and naturally, the solutions can only be found by actively participating with the group. Cialdini in particular outlines the social pressure (see Asch), authority (see Milgram), and systems of positive and negative reinforcement (see Spiritual Abuse) utilized to influence members in order to gain behavioral compliance.

Most groups use some method of isolating members from external information that contradicts the group’s message, or the group may allow select information to be filtered in after it has been altered or tainted with negative connotation. Lifton says that isolationism or withdrawal often serve as unavoidable adaptations. Group members learn rather quickly which sources and which information should be avoided, because they will experience cognitive dissonance when faced with the truth. Any message that scrutinizes the group or that contradicts the groupthink will produce great discomfort in the follower, so groups actually have to do very little to warn their members. (Because members experience discomfort whenever they review challenging information that brings the group or its teachings under scrutiny, the milieu becomes a self-reinforcing system; when groups do formally issue warnings to their followers, it makes one wonder whether they have become particularly threatened.)

If the group is unable to stop information that challenges its ideology, the sacred science of the group, and the doctrine over person at work in a manipulative system, it predictably resorts to various measures of damage control. Groups marginalize or discredit sources of information and the information itself through typical propaganda techniques. "Poisoning the well," reductio ad Hitlerum, ad hominem arguments, straw men, red herrings and other typical tactics are used to discredit the source of information (refer to the sidebar information). Groups often employ fear mongering, communicating to followers that review of information deemed questionable or dangerous will place their eternal souls in great jeopardy. Most members will not be Bereans about this type of information, because reviewing the challenging information while holding on to the group dogma will place them in harm's way with the group and may not seem worth the trouble.
Bill Gothard propagates a particularly powerful means of establishing and maintaining milieu control with his “umbrella of protection” teachings. Gothard makes submission and acts of selfless humility a type of sacrament necessary for earning the grace that enables Christians to live safe and protected lives. Going against the standard set by the group exposes believers to harm, so that under Gothard’s ideology, failure to observe the rules that maintain milieu control exposes one to God’s wrath and opens one up to satanic attack. This serves to further isolate the group and polarize members to reject information from outside of the system out of fear of losing their salvation as well as their physical well-being.

Most interesting to watch in closed systems and manipulative groups is their response to disgruntled members. Thought reform and mind control are very effective, but they are not a sure thing. Most people walk away from cultic groups of their own volition and are not usually dismissed or disfellowshipped. As group members become dissatisfied, the same types of techniques used to influence members to reject information from outside the system are used to isolate members who demonstrate non-compliance. Groups devote many damage control measures to containing information propagated by problematic members who become wise to the deception. Groups often distort and exaggerate information to cast these disgruntled members in an unfavorable light, so that if they do communicate problematic or thought-provoking information, members will be highly inclined to ignore or dismiss it.




Before moving on to Part III of Understanding Cognitive Dissonance, I wanted to explain a bit more about what Steve Hassan describes as Information Control.

From Information Control as listed under "Mind Control: the BITE Model" under the Freedom of Mind website Articles and Links Section, authored by Steve Hassan:

1. Use of deception
a. Deliberately holding back information
b. Distorting information to make it acceptable
c. Outright lying
2. Access to non-cult sources of information minimized or discouraged
a. Books, articles, newspapers, magazines, TV, radio
b. Critical information
c. Former members
d. Keep members so busy they don't have time to think
3. Compartmentalization of information; Outsider vs. Insider doctrines
a. Information is not freely accessible
b. Information varies at different levels and missions within pyramid
c. Leadership decides who "needs to know" what
4. Spying on other members is encouraged
a. Pairing up with "buddy" system to monitor and control
b. Reporting deviant thoughts, feelings, and actions to leadership
5. Extensive use of cult generated information and propaganda
a. Newsletters, magazines, journals, audio tapes, videotapes, etc.
b. Misquotations, statements taken out of context from non-cult sources
6. Unethical use of confession
a. Information about "sins" used to abolish identity boundaries
b. Past "sins" used to manipulate and control; no forgiveness or absolution

~~~


Note About the Ethical Use of Confession


Hassan is not decrying all confession of sin before God as unethical. He is referring to the thought reform practice of confession wherein information confessed is used to shame, berate and manipulate followers in order to attain compliance with the group standard. God offers us forgiveness when we repent. Spiritual abusers and Pharisees do not.

We are all very human, and God works a miracle, even with our sins. Somehow, he works even the events that we find to be quite hopeless into opportunities to minister to others. When we REPENT, God forgives us and gives us an opportunity to grow in holiness and change our ways. Sometimes, we forget about the Blood that was shed for us and easily become blind to our own sin. Consider how we are told in 2 Samuel 12 that Nathan went to King David, describing David’s own sins to him, yet David did not recognize them until Nathan pronounced God’s message of judgment. My own sins hurt those that I love and they hurt the body, just as all of our sins bring reproach upon the Lord. When we sin, we forget about the wounds of our Lord Jesus, those He suffered to buy our redemption. Our sins are inconvenient; we stop seeing them, but others do not.

One of the most Christ-like people I ever knew said something provocative in front of a group of people. He said, "My life is full of sin." In context, he was lamenting the difficulties and seeming paradox of wanting to be pure and holy before God, desiring to do God's will always, yet inevitably realizing daily that we fail. That which we want to do, we do not do, and no good dwells in our flesh, a condition to which all believers remain somewhat subject until we leave this life (Romans 7). And we get this so mixed up, because the key to being a Christian rests not in our ability to be sinless (something we don't have) but in our realization of the forgiveness that God offers to us. Our strength is not found in our ability to be without sin but in our remaining in Christ.

If we have sinned, God offers us a very simple solution: repentance. This differs vastly from what Hassan refers to as the unethical use of confession to men, a form of sacerdotalism in which a human being acts as mediator. In cultic groups, leadership never forgives sins but uses them for purposes of manipulation. This is something vastly different from confession of one's sins unto God, sins that He remembers no more. Glory to God for his new mercies every morning! He is full of compassion and of great mercy, giving us so many opportunities to repent. Our shame becomes His glory when we do (Ps 4), and his strength is made perfect in our weakness (2 Cor 12:9).


Part III:

Responding to Cognitive Dissonance





In the last three posts, we established that cognitive dissonance is the psychological stress experienced when a person is presented with information, asked to do something, or encouraged to feel an emotion that contradicts the other aspects of the self, including thoughts, emotions and behaviors.



Steve Hassan adds “information” to these aspects of the self as another means of bringing about cognitive dissonance, since information has the power to produce the same kind of painful psychological stress that manifests when the aspects of self are challenged. If a manipulator can gain influence over one of the aspects of self or over the information that a person receives, the mind almost always shifts the remaining elements of self to conform to the aspect that the manipulator has affected, all in order to avoid great psychological stress. A salesman who makes you feel guilty or who can entice you to try a free sample has a much better chance of convincing you to change your mind about purchasing his product.

The first key to resisting this type of surreptitious manipulation and coercion is realization of how the process works.

Please also note this important point: Any aspect of the self and of information can cause a shift of all of the other aspects of self, given the right conditions. 



The more pressure that is applied to a person under stress, the more likely it is that thought reform can occur, if the person is unaware of the tactics of manipulation and their inner resources are depleted. The process of converting someone to new thoughts, emotions and behaviors is termed thought reform, a process identified by objective, predictable criteria (Lifton’s Thought Reform criteria, Henke’s Spiritual Abuse criteria, etc.). Manipulative groups employ these methods and techniques in order to make new converts as well as keep existing followers under control in a closed social and ideological system. This blog post will concern itself specifically with how members of manipulative groups, or even individuals generally, receive, attribute and process information that contradicts the group’s position.



When a member of a manipulative group encounters information that contradicts the closed group’s positions, or any information that challenges the lofty status of leadership and authority within the group, cognitive dissonance ensues. This process is not pathologic, but it indicates that your “self” has been challenged in some way. When a person who has been drawn into a manipulative group is faced with information that will help liberate them from the milieu control that the group has established over them, they will definitely experience cognitive dissonance as well. The process itself is not negative; it is simply the process of conversion and realization. What is objectionable is the covert use of these tactics, along with the induction of cognitive dissonance, to circumvent informed consent, both about the covert nature of the conversion process itself and about the true doctrines of the belief systems employed by these manipulative and shame-based groups. These tactics are particularly objectionable when employed in the name of God in order to coerce, shame and manipulate earnest, trusting and unsuspecting people.

When presented with information that contradicts that of the manipulative group, the follower responds to the cognitive dissonance in predictable ways. To take an example from my own past, I believed my own pastor to be very Christ-like and upright, and I admired him greatly. Over the course of time, I heard him say several things that were very wrong and not Scriptural, but I ignored them, rationalizing that I must have misunderstood something he said. It created cognitive dissonance for me, so I used a type of denial to shield myself from the unpleasant realization that my pastor had said something very doctrinally unsound. I also denied a behavior that he told me about himself, one that I just could not believe, again telling myself that I did not have enough information and must have misunderstood. I learned later that I had understood everything perfectly but had rationalized and filtered out the painful aspects of the situation.

People experiencing cognitive dissonance can respond in a whole host of ways, but primarily they reduce to three basic categories.
  1. Deny the information (a type of withdrawal or isolation)
  2. Filter the information or rationalize it in some way (altering the information to control the amount of induced cognitive dissonance)
  3. Receive the information (inducing cognitive dissonance which may result in a change of opinion about the group)

Part IV:

Likely Denial Unless the Spiritual Abuse Becomes Personal



As noted in the previous post, people experiencing cognitive dissonance in response to challenging information can respond in a whole host of ways, but primarily they reduce to three basic categories: (1) denial, (2) filtering/rationalizing the information to limit cognitive dissonance or (3) actually receiving the information, making no effort to escape the cognitive dissonance.

The first two options demonstrate the thought-squelching processes necessary for manipulative groups to maintain milieu control, and the third option stimulates critical thinking, which will ultimately guide a person away from the group's influence. This post will deal primarily with denial but will also discuss why some people might be more predisposed to denial than others.





Denial, a form of isolation or withdrawal, is fairly self-explanatory. (It is also employed as a result of confirmation bias, which I will discuss in a later post, but I include denial here as an independent factor.) The most obvious way that people can compensate is to avoid the information through isolation, though eventually one does encounter dissonance, even when immersed in the neotribal subculture. When messages do penetrate the group’s barriers, one can deem the messages lies so that they do not have to be weighed and evaluated. The veracity of the information is denied and the information rejected, because the pain of the truth becomes too great for the person to easily process. It is easier to abandon the new information.
Frankly, the whole list of informal logical fallacies and propaganda techniques can be employed for this purpose, so that members can find reasons to reject the challenging, dissonant information. This relieves them of having to pay any significant attention to the information at all. It may initially disturb individuals who encounter the information, but they have been trained well to preserve the milieu of the group at all costs. Members know that they will be punished by the group in various ways if they entertain the information, and the group will nearly always resort to damage control. Group leadership and other willing members, actively staving off their own discomfort, go to great lengths to refute that which challenges them. The group will richly reward with positive reinforcement those members who labor to reject the dissonant information.

Many other factors affect whether a person will choose denial to shield themselves from the stress of challenging and dissonant information. If new information depicts a scenario that is very close to a person’s individual experience prior to group involvement, for example, the individual may choose to entertain the information. But generally, if a group member is reasonably far removed from the direct effects of a situation and is not personally acquainted with anyone involved, they will naturally trust their presuppositions about the group. When any reasonable person learns negative information about someone that they trust and know only to be virtuous (and they are satisfied with the relationship), they will also experience a degree of dissonance and will scrutinize the dissonant information. They have more cause to trust their satisfaction with what they know than cause to doubt.




But the person who is not part of a closed and manipulative system will not be pressured or punished for reviewing the information, and their standing before God will not be drawn into question as a consequence. A person within a closed, manipulative system not only has to process their own normal internal stressors regarding the new information itself, they also have to overcome the conditioning and the dynamics of their group. The pressure affects not only their internal mental and emotional state; any dissident behavior threatens their standing within the group (the loss of one’s emotional and relational support) and likely their entire faith. Apart from the group, a person will experience the challenge of that which contradicts what they know, but the group’s milieu control measures make the process all the more intense for those in a manipulative group. These individuals have very little motivation to resist denial and consider the dissonant information, and they will suffer if they do. It is not worth the effort.



Cialdini writes about how we tend to disbelieve unpleasant things about people that we like. Expert manipulators exploit this tendency and capitalize upon our human trait of “liking” through predictable tactics. Physically attractive people are often attributed with more virtue, based merely on their appearance, creating a “halo effect.” Human beings also tend to cooperate with people who are more like them or seem familiar to them, and with those who build positive associations through praise and care. What man dying of thirst does not have a favorable impression of someone who gives him water until he is restored to health? These forces prove to be incredibly powerful under any circumstances.

However, when the negative consequences of mistreatment become personal, additional pressures create their own dissonance. Unlike the individual who has some personal distance from the effects of misbehavior and spiritual abuse, the person who has been personally impacted by the negative consequences or the abuse itself has more cause to question the milieu. If someone you never met offers a complaint against your minister, for example, you have no cause to place a great deal of emphasis on this information. You love your minister, and most of what you know about him is quite pleasant and positive. But if you, your friends or your family suffer directly because of mistreatment from this same minister, you no longer find yourself in the previously impersonal situation where you still experience a high degree of satisfaction.





If your friends or family have been harmed, you have a high degree of personal stress and will likely develop dissatisfaction. If a minister within a manipulative group harms you, the stress created by the problem also becomes more significant than any concerns you may have had about the consequences of behaving like a dissident within the group. The preservation of the group milieu pales in comparison to such personal circumstances. Often, this is the only time that some people will consider the possibility that their group leader, minister, or confidant could demonstrate hurtful behavior.
Most people who emerge from spiritually abusive situations actually state that they never would have believed that spiritual abuse was possible had they not witnessed and experienced the mistreatment personally. (They dismissed warnings such as Paul's warning to Timothy to avoid Alexander the Coppersmith as noted in 2 Timothy 4:14 until they experienced their own unpleasant dealings with their own real-life Alexanders. Or perhaps no one made efforts to warn them at all.)


Part V

Confirmation Bias



Confirmation bias describes the human tendency to interpret new information in such a way as to (subconsciously) confirm what one already believes, a type of selective thinking. When confirmation bias manifests, information that contradicts one’s preconceived ideas and assumptions fails to catch the attention of the individual, or it is ignored or downplayed. Errors in logic, faulty statistical analysis and interpretation, and errors in memory result from this type of bias. New information that confirms one’s favored hypothesis is considered good data, while any information that disproves one’s hypothesis is defined as faulty data.




~~In this diagram and for the purposes of this discussion of the overall phenomenon of cognitive dissonance, I have depicted confirmation bias only as a filter that does allow some information in through the barrier. Please note, however, that confirmation bias can just as easily result in a complete and total rejection of all new information (denial), not only a partial filtering of information. I have singled out “denial” separately, but confirmation bias also plays a role in both denial and in filtering out and accepting only selected information.~~


The remarkable skill of the human mind is its desire to find meaning in things, noting patterns in events and seeing only what we wish to see. Sometimes I think that much of the Christian life involves reviewing all that we’ve been working so hard to ignore, as God works in us to purge, correct and redeem. Our true skill as human beings (particularly when following our flesh) is not really our ability to realize truth as much as it is our ability to evade it. A few years ago, I reviewed a scientific article that presented research about a particular issue, and it was so biased and flawed that I could barely believe anyone had bothered to submit it for publication. The author had selected only the data that supported what he hoped to find. As Blaise Pascal once wrote, “The heart has its reasons that reason knows not.”

This reminds me of a quote from an evolutionist that I will paraphrase (their name escapes me at the moment):
We believe in evolution, not because of scientific fact or evidence, but because the only alternative is creation, and that is unthinkable.

Leo Tolstoy also made this comment, one of several similar statements found within his writings:
I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life.


For the Christian, we should definitely practice some bias, but our bias manifests not as the confirmation bias described here but as a bias we adopt after we have looked squarely at all of the truth and evidence that we find without ignoring any of it. Our bias should manifest in how we put all of the evidence we find into perspective, not through denying that which seems to challenge us. Christians should engage the world truthfully without intimidation and then apply the Word of God to what we encounter, bringing our thoughts captive to Christ. I’m reminded of the Serenity Prayer that says, “Taking, as He did, this sinful world as it is, not as I would have it.”

In summary, I have to agree with Dr. Albert Mohler as he expressed the Christian’s responsibility in “Confirmation Bias in a Fallen World”:
The reality of confirmation bias and its threat to intellectual integrity is one reason that Christian thinkers must read widely and think carefully. We must not limit ourselves to reading material from those who agree with us, fellow Christians who share a common worldview and perspective. Instead, we have to "read the opposition" as well -- and read opposing viewpoints with fairness and care.

Part VI:




In a previous post, we discussed confirmation bias: the human tendency to interpret new information in such a way as to (subconsciously) confirm what one already believes, a type of selective thinking.

Within manipulative groups, spiritually abusive churches, and cults, the system of control capitalizes upon this human trait in order to surreptitiously manipulate the membership and maintain milieu control. The free flow of information, both within the group and from outside of it, is tightly controlled so that members have no cause to doubt group teachings, which are thereby reinforced in members' minds. When groups cannot suppress information that provokes thought and reveals truth that will challenge the follower, the group will strive to keep the information from permeating the entire group.

Attaching a negative connotation to information or a source of information will discourage members from seriously considering these challenging ideas. Some have already been mentioned here, but I wanted to note the power that connotation has. We’ve already established that information within the manipulative group generally targets one of the aspects of the “self,” either the thoughts, the emotions or the behaviors of a person, in order to establish control of the entire personality or self, thus establishing the group mindset within the follower. I would like to point out that informal logical fallacies tend to be very effective because they target two aspects of the self at one time through appeal to both thought and emotion. These fallacies also prove to be very subtle, and as a general rule, unless you are prepared to spot them, they also prove quite powerful. Knowledge of how these fallacies are employed provides the best defense against the power of the covert manipulation that the fallacies impose.

One of the more obvious means of discrediting information utilized by manipulative groups involves the many variations on “appeals to prejudice.” Information that is actually neutral but points out holes in the group position will stimulate the follower to start thinking independently of the group, requiring problem-solving. The group survives because people who do not want to be burdened with the moral weight of thinking for themselves will adopt the group opinion, and problem-solving and analytical thought make that kind of deference nearly impossible. Appeals to prejudice not only present an argument with some factual information, they also strike directly at strong emotion at the same time. Groups tag neutral information with emotive terms that attribute some degree of moral flaw or turpitude to the information, so that people will “get stuck” in the emotion of the association and will not consider the veracity of the argument or the information itself.

One such example of this is the word “secular” itself. Secular refers to that which is not religious, but it is not evil. The profession of plumbing, for example, is not a religious tradition (though I could make quite a few jokes about it, which is probably why I tend to use it as an example). Indoor plumbing is neither holy nor profane, but it is something from which all people reap great benefit and blessing. It is not inherently religious, and the study of the dynamics and physics of water and pipes has nothing to do with anything religious. Yet the term secular is often associated with that which is evil when discussing religion. For many Christians, use of the term carries a negative connotation and denotes something that stands in opposition to Christianity in some capacity, depending on the context of the discussion. Secular humanism, for instance, is an expressly non-Christian worldview, and many manipulators capitalize on the connotation of the term.
 
Another very obvious example of this type of appeal to prejudice is reductio ad Hitlerum, or transference. Considering that Hitler represents one of the most deplorable figures of recent history, making reference to Adolf Hitler as a comparable figure to one’s opponent casts the opponent in the worst possible moral and ethical light. The association becomes difficult to overlook, and most people will stop processing information thoughtfully, using problem-solving, for a moment in order to process the emotional connotation. Few people will be able to follow the rest of the argument, because evoking the name of Hitler will draw them into an emotional trap. Most people will shift at this point from an EEG pattern of beta waves directly into an alpha state, becoming highly programmable and open to suggestion. They will cease to think critically and will be more inclined to absorb groupthink and other arguments that they would otherwise reject.

Connotation of this type opens up a distraction for the listener, and the tactics mentioned thus far become “red herrings.” The red herring fallacy describes a broad category of subfallacies that are particularly effective, as they direct the listener away from the essential and significant facts or issues and onto trivial or peripheral issues that have no bearing on the true nature of the matter at hand. Hounds will follow a scent when tracking an animal or a person, but if they smell a red herring, the pungent odor will throw them off the trail. People are likewise thrown off the trail of the holes in an argument when new information is tagged or tainted with something offensive or peripheral in order to obscure rational thought and evade detection of flaws.

Members of a manipulative group may encounter information that is damning to the group, but if only a small and strategic portion of the information can be tainted with negative connotation, the dissonance created by the challenging information will intensify as a result. Most group members will get caught up in the emotional component of the dissonance, which impedes and inhibits clear thought. They may comprehend the argument, but the negative connotation will hang over the matter like a thick, dark, disturbing cloud. That will be enough for most members to reject consideration of the information altogether.

In addition to tainting the message itself, groups also usually reinforce rejection of the information by attacking its source, “killing the messenger.” This usually involves a layering of many different logical fallacies that appeal to both emotion and reason in order to manipulate opinion and ward people away from data that would cause them to think rationally about the new and challenging information. Argumentum ad hominem (“argument against the man”) focuses on red herrings of personal information that are peripheral to the information but will also distract the listener from rational, critical thought. The “abusive” subtype draws in personal qualities (“They have brown hair and they laugh too loud”), while the “circumstantial” subtype of ad hominem argument draws experience into the discussion (“Their opinion regarding the assault of a person is irrelevant because they were once assaulted themselves and immigrated from a country with which our country is now at war”).

Part VII:
An Ideal Environment for Covert Influence?





Previous posts discussed how cognitive dissonance works, but they did not really discuss what this might look like in a religious setting, when the individual enters a place with tremendous pressures designed to support a particular state of mind. Entering into a setting like this is fine if it is done without coercion and with the individual’s full and informed consent.

It becomes objectionable, however, when manipulators make use of techniques of influence in order to covertly exploit individuals. Previous discussions examined single examples and elements of manipulation and thought conversion, but they did not discuss how individuals become bombarded by this information and can find it overwhelming when experiencing nearly every type of influence concurrently. Techniques of manipulation intensify in effectiveness when applied in social settings and when they are layered upon one another.
~~~
Imagine that you walk into the sanctuary of a church. You hear music playing, but you do not realize that this music’s meter, rhythm, chords and patterns induce a particularly prayerful state of mind. Though you do not realize it, your body and mind synchronize with this music, shifting you from the state of mind where rational, critical problem-solving takes place into a more emotional and experiential state. The service begins, and the songs seem simplistic and repetitive; they do not focus on doctrine or confession but upon emotions and experience. It is not balanced, and the music minister repeats the chorus of the last song 12 times before concluding the singing portion of the worship service. You feel a bit out of touch, particularly after the repeating chorus at the end.

You don’t realize how much you follow the patterns of the people seated around you. Social mentoring sets the standard for appropriate behavior in the church that evening. You feel rather weary from the long day, and you follow right along with the group. Your attention drifts for a moment, and suddenly, everyone around you stands up. The man seated nearby looks at you in your still-seated position, and you feel a bit embarrassed. You aren’t quite sure why everyone is standing, but the critical gazes around you feel uncomfortable, so you stand up also. We tend to determine what is correct, at least in terms of behavior in this setting, by conforming to the group standard. A moment ago, you looked over and noticed that the man you consider to be the wisest and most respected person in the group was standing, and he also noticed that you were still seated. You figured that if he was standing, that was a good indication that you should be standing, too. So you stood up, though you don’t quite know why everyone else did; you just don’t want to be impolite.

Now feeling a bit more self-conscious, you notice that you are not dressed like everyone else in the group. You feel a bit uncomfortable and think that you should really be respectful and dress more appropriately next time you come to the mid-week service. You are dressed in your formal business attire, and everyone else is a bit more casual.
Someone in the front has started making some announcements, and they have asked for volunteers. You again feel the gaze of others on you, and a friend looks over at you and makes eye contact with you as the plea for a particular volunteer is made. The person in the front asks specifically for people in your profession to rise to the occasion, and they mention how the help will benefit the unfortunate. The person asks for a show of hands of willing volunteers. You feel self-conscious again, and against your better judgment, you raise your hand. Your current schedule for the next month makes such volunteer work difficult because of a certain project, and you consider the option of telling the person after the service that you really now have second thoughts, but you did make a commitment, and you did so in front of the group. If you had not been so fatigued by the long day before coming to church and if you had not been a bit self-conscious already, you would not have volunteered. You are disappointed in yourself. Oh, well.

The minister gets up to speak, and you smile, considering how much you really enjoy the company of this pastor. You admire his attire; he is your ideal image of a pastor. He has the nicest family, too. He strikes you as a very honest man, and you just find him to be very likable.

It is the midweek service, so things are a bit informal in comparison to the Sunday services. The pastor reads a section of Scripture and stops three times to say, “Repeat this verse with me….” So the congregation repeats the short verses with the pastor as requested. You don’t particularly like that practice, but you participate. After all, it is the informal service, and it is Scripture. This echoing of the pastor does get your attention. The platform is elevated, and you find yourself looking up at about a 40 degree angle to see the pastor. (You do not realize that when you gaze upward like this, your brain shifts out of a problem-solving mode into a more relaxed mode that actually encourages you to accept information without question.) He also says, “Can you say, ‘Amen’?” You chime in with the group, repeating “Amen.”

Last week when you taught Sunday School, you asked the pastor to assist you in a demonstration before the class. He was quite helpful, and you are always honored when he sits in on your class. This evening, at this more informal service, your pastor asks for a male volunteer for a little object lesson, and he motions over toward the area where you are seated. No one seems too interested in cooperating, but since the pastor was so agreeable last week, you stand up and offer to participate. The pastor asks you to go up to the front to assist him with holding something, a very simple request. It doesn’t really make sense to you, but he is the pastor, after all. He’s the one preaching at the moment, so you show yourself a good Indian to his chief. He thanks you in front of everyone for being so helpful. You consider again how much you like the pastor as you return to your seat. You really hate getting up before crowds like this, particularly in church, but you did so because this is what was expected and what was asked of you.

You notice that there was a bit of a political twist in the sermon this evening. Something was said about a political figure’s behavior, and the pastor said, “Well, why should we expect otherwise? Of course, that’s what we should expect him to say, considering his background.” He also made an argument that doesn’t quite make logical sense: if the end times are coming, then there will be increased war and decreased peace; we are at war with another nation; therefore, the pastor says, it is surely the end times. You consider that you heard this same thing when you were very young, 30 or 40 years earlier. You wonder how long the end times really are. A straw man argument was used to characterize an entire political party, and the connotations used to identify this party were neither favorable nor terribly accurate. But everyone hopped on the bandwagon and really responded to the message.

At the end of the service, the front of the church seems quieter and looks a little more focused to you. You almost feel inspired, as if you can see more clearly. As an Assemblies of God pastor once shared with me about his own church, your pastor has a remote in his jacket pocket that allows him to control the lighting. The pastor has gradually been dimming the lights in the periphery around the front of the sanctuary, making them brighter in the center where he is standing, and the house lights that were quite dim have begun to brighten. To some people in the congregation, the gradual increase in the level of light above them corresponds with the zenith of the sermon, and this enhances the sense of epiphany for them…

And as you drive home, you find yourself humming one of the repetitive choruses that you sang that evening, and you didn’t even realize that you remembered the words to it. Hmmm.

~~~
You do not realize it, but you were intentionally placed in an alpha state of awareness (highly suggestible) with the use of music and lighting. Just having to gaze upward to see the pastor on the platform causes a physiologic response of going even deeper into this alpha state. Every time you felt an emotional response of embarrassment or self-consciousness, your critical thought was suspended for several minutes, making you highly open to suggestion. When called upon to volunteer by the person making announcements, you responded because of your tendency toward consistency, desiring people to think of you in the way that you like to think of yourself. You also responded to social pressure, something intensified because this pressure was applied in a very public forum.

You responded to your pastor, though you did not really want to participate and don’t enjoy that sort of thing. You complied in response to the appeal to authority, social pressure/proof, consistency, reciprocity and liking (see Cialdini). The pastor was able to gain your compliance several times, both with the group and individually. You are now conditioned to respond to other, more significant requests for compliance in the future, because he has built your trust and increased your familiarity via the small requests.
Though some things about his message did not sit well with you, you failed to realize that the pastor actually threw several complex logical fallacies at you in a very short period of time. The rapid use of these arguments did not allow you sufficient time to consider their validity, so you were swayed by the situation. The mention of the end times capitalizes upon the principle of scarcity and the desire to survive. You found yourself agreeing with the pastor on points that you otherwise would have rejected and taken issue with under other, less pressured circumstances. Now consider that all of these factors and “weapons of influence” were layered upon one another. Also consider that you were very fatigued, and so you were much more compliant than you would have been otherwise.

In and of themselves, these things do not constitute spiritual abuse. But consider that these powerful influences, all layered upon one another in complex ways, could be used against you to manipulate and exploit you. If the pastor promised and advocated one set of ideas and secured your trust, how would you feel and what would you think if the pastor was not honest about his objectives? What if he kept his true purposes for the congregation concealed until he had gained its trust and love? What if he misused his authority and position? What if he started teaching subtle error? Consider how easy it would be for you to be pressured and exploited.


Part VIII: In Summary
Revisiting old posts from
Under Much Grace, posted in 2007

Closing the Ideological Sale

Indoctrination and reinforcement of the changed aspects of self that have been set off balance through cognitive dissonance are often accomplished through social mentoring, and this helps solidify the transformation. So if I can get you to feel something, the more quickly I can get you to behave in accordance with your feelings, the more solid the transformation becomes.

"Can you say, Amen?" That's a "three-fer." I've invited you to think in agreement with me, I've asked you to respond with an action of repeating me, and I've likely engaged your emotion. The quicker that I can get you to reinforce the shift or change, the better. This is great to know when buying a car. The salesman wants you to get in the car, drive it, love it and want it. If he can get your name and number (if you weren't absolutely determined to consider buying the car), and you like him, he's much closer to closing the sale. The quicker that he can facilitate this, all the better.


The double bind is a type of cognitive dissonance wherein you are "damned if you do and damned if you don't." Jesus referred to the technique of the “double bind” when he chastised the Pharisees for their “thought stopping” riddle regarding swearing by the temple, which meant nothing, versus the more binding oath of swearing by the gold of the temple.

Matthew 23:16-17 (King James Version)

Woe unto you, ye blind guides, which say, Whosoever shall swear by the temple, it is nothing; but whosoever shall swear by the gold of the temple, he is a debtor! Ye fools and blind: for whether is greater, the gold, or the temple that sanctifieth the gold?

Contradictory and/or complex information communicated congruently and simultaneously causes temporary confusion and disorientation (e.g., The harder I try to understand, the more I will never understand. Understand?). The Pharisees were intimidating (emotion) and demanded that others agree. They also wanted to solicit behavior and modeled it for others.
Such a divisive presentation of information in a controlled environment and in pressured conversation induces most people to respond with a temporary suspension of thought. There you are, in front of a Pharisee, and you are under pressure. They are also dangling your eternal fate over your own head, baiting you to comply. Resistance isn't futile, but it is often difficult under certain circumstances. This causes adaptation of behavior to fit the circumstance, a circumstance created to establish dominance and manipulative control over the will of the individual.

That's one way that thought reform can go to church or meet you at the door when you arrive.




Simply stated, cognitive dissonance occurs any time something "Does Not Compute." We feel pressure and discomfort because things do not add up. Our emotions become engaged so that these emotions conflict with our thoughts and our behavior. Or someone says something that does not make sense in context (e.g., when my former pastor who seemed to be doctrinally orthodox told me that people had horrible things happen to them when they left the church against the blessing and better judgment of the elders -- not because they made a necessarily unwise choice but primarily because they did not let the elders make their decisions for them). Biblically orthodox Protestant pastors do not make such statements unless they believe just as strongly in extra-Biblical doctrines as well. The context or the circumstances do not match whatever new thought, emotion or behavior has been introduced.

Behaviors can also throw us into cognitive dissonance. If we are asked to perform a task that does not match the context of the situation, this can temporarily suspend our critical thinking and our understanding of the environment, which is disconcerting if not actually painful. Biderman points out that small and seemingly insignificant requests for behavioral compliance set up a pattern of compliance between the target and the manipulator. Over time, the requests can be made more relevant, but this also serves to sear the conscience. One becomes desensitized to the disturbance of the cognitive dissonance created by requests that don't seem to have any purpose or logical end. Hassan also adds information into the mix, as information that does not correspond to a person's continuity of self (the internal agreement and consistency of thought, emotion and behavior) does induce a state of cognitive dissonance.

Even dissonance lasting only a few seconds changes a person's state of consciousness. A person can shift from a predominantly thoughtful, problem-solving state of mind wherein they make their own decisions and think logically (a beta state) into a much slower state wherein one becomes indiscriminate and non-critical (a predominantly alpha state of awareness).

When in an alpha state, it becomes much more difficult to think logically, discerning exactly what is meant and whether one wants to agree. Matters are accepted with less discernment when the brain slows down to process the cognitive dissonance. The alpha state is the ideal state that hypnotherapists and stage hypnotists like to place people into so as to plant suggestions and ideas in the mind of the subject. This can be a beneficial effect, such as when we are worshiping the Lord or when we are listening to the loving encouragement of our supportive family and friends. When in the presence of a manipulator, this process can become our undoing. It is the mind and body's physiologic response to "Does Not Compute."


Lalich states this in "Bounded Choice: True Believers and Charismatic Cults":
Cognitive dissonance theory recognizes that a person experiences and is motivated by a type of psychological discomfort produced by two thoughts that do not follow, or by conflicting views of reality. When there is such inconsistency and/or conflict, a person tends to experience internal tension and is motivated to reduce the uncomfortable feeling. Yet people will continue to hold their beliefs and behave accordingly even when a particular perception butts up against a different context or a different reality… The basis of this theory is that humans will tend to reduce the uncomfortable feeling caused by the dissonance by bringing their attitude in line with their behavior rather than changing their behavior…

It is about attitude and behavior, about the internal thought process of the individual faced with the dilemma of reconciling external and internal realities.
(pg. 249)
Cognitive dissonance makes us vulnerable and easier to manipulate. We need to be alert to it and aware of it when it occurs so we can avoid manipulation and exploitation. Knowing how the process works can make us more deliberate in our choices. We might want to allow ourselves to be influenced, and this might be the best option for us, but we should be astute and discerning about the process. Covert influence relies heavily upon this type of dissonance, and we can disarm it by increasing our self-awareness.

Would I comply with this if I were under other circumstances? Am I pressured right now? Is someone forcing me to choose between a drastically limited number of options when more options may be available elsewhere? Can I go home and sleep on things before I make a decision? Am I tired and just agreeing with something to get away from the social pain? Would I necessarily agree with this statement if I was not being directly asked about it in a high-pressured situation? Is someone projecting shame or suggesting that I am not consistent with qualities that I would like them to associate with me in order to get me to think differently or act in a way they would prefer? Is the group opinion really all that important to me for me to act in the way that a manipulator would desire?

When you feel cognitive dissonance, take a step back. Clear your head. Consider your behavior. Ask yourself what you think and feel.

If you don't have the opportunity to process these ideas and feelings, there's a good chance that you are being manipulated. Know that you don't have to be. Most people can usually find a way to slow down the situation or to step back from it. Be assertive and protect yourself.


All Rights Reserved

Please feel free to use original material presented here on this blog, attributing the site.

Copyrighted works are made available here under the 'fair use' exception of U.S. copyright law, for research and educational purposes only.