Thought Reform
Don't drink the Kool-Aid! Social engineering, cult indoctrination, and a bit of social media.
Introduction
I remember a quote from Maajid Nawaz from many years ago, and I think there is something very wrong today . . . isn’t there?
No idea is above scrutiny, and no people are beneath dignity.
Maajid Nawaz
Now it seems there are some ideas that are above scrutiny and there are people beneath dignity. The polarisation of our discourse seems to point to a problem. Many well-meaning people have sent us down a dark path; the path to hell is paved with good intentions, after all.
A few years ago, I started looking into social engineering, or human hacking. That led me through literature covering negotiation, influence, empathy, body language and so on. Eventually I arrived at the more ‘extreme’ end: models of thought reform, which describe how ideas are transmitted to and adopted by others. Social engineers will sometimes make the distinction that the difference between influence and manipulation is intent. This is an unfulfilling characterisation, as matters of good or bad are subjective, and the argument leaves itself open to quite harsh critique.
Thought reform is a more fundamental style of manipulation that goes beyond developing rapport and using tactical empathy; it is the manipulation of values using cynical methods. At the extremes it is the mechanics of cults, but it is also the mechanics of special interest grievance groups. A cult leader might consider themselves to be acting for a greater good, and this might even be a sincere conviction. The go-to defence of social engineering, intent, thus becomes somewhat untenable.
The mechanisms of thought reform are everywhere. It might not have escaped your attention that people are acting in more and more unhinged ways. Like many things, the discourse has its genesis within online spaces and is then reflected in interactions in the real world. You might notice that, now more than ever, people are quick to characterise and reduce those they disagree with to a stereotype. People are vilified based on the associations those labels carry, with little to no regard for any nuance or understanding. This isn’t a new problem, but it is exacerbated by the prevalence of and access to online content. In this sordid debacle these problems are not confined to a single group; rather, they extend to all. There is an overtly technological bent to this: what was created to connect us cynically divides us through our own choosing.
You might ask what this has to do with security. What I am talking about here is human behaviour, which directly relates to our feelings of security. It is inevitable that you will encounter people holding dogmatic ideas in a work context, and having some understanding of how people arrive at these conclusions is useful, even just from the standpoint of Machiavellian power hierarchies. Of course, there is a broader anchoring in social engineering, and understanding some of these concepts will help within organisations and communities.
Let me ask you some questions.
Were you outraged the last time someone spoke about racists, homophobes, illegal immigrants, fascists, communists, the deep state, grooming gangs, pro-life, pro-choice, terrorism, traitors, quislings, Israel, or Palestine? The list could go on.
Do these words elicit a strong emotional response?
Did you speak out in condemnation of such things?
What was the price for your outrage?
Did you feel affirmed in your grievance when like-minded people supported your outrage?
Then maybe your price was the willingness to empathise, to understand the perspective of another person. Or worse, maybe the truth was overlooked to justify such moral indignation.
The Process of Thought Reform
How does this happen exactly? There has been much discussion since the 1950s, when the term ‘brain-washing’ was first coined by a journalist called Edward Hunter in an article, “Brain-washing Tactics Force Chinese Into Ranks of Communist Party”. In it, Hunter interviews an acquaintance in Hong Kong who had been subjected to Maoist brainwashing techniques.
There are established pathways that lead to extreme perspectives, and many models describing them, such as the BITE model of authoritarian control. Here I discuss a model that has three key phases, and extend it into concepts outlined by Robert Jay Lifton. The three stages are:
Initiation - this generally relates to the validation of a pre-existing grievance.
Indoctrination - this is about explaining the reasons for the grievance and introducing the elements of ideology to start altering perspective.
Reprogramming - this is where values are replaced and the identity of the individual is subjugated to a group identity.
Due to how information can be proliferated online through text and video, active recruiters are not required in the way they once were. The majority of these steps can be performed by the person themselves, through the media they engage with and how they select that media.
The following diagram is taken from notes I made many years ago and outlines a fairly typical pathway from a legitimate grievance to an ideologically driven world view. I’m not saying it’s complete but it is a reasonable interpretation.
Where we discuss vulnerability, we do so from a psychological standpoint, not the common usage in IT. In this context we might define it as an inability to protect against emotional harm from external stressors. Where this is coupled with a legitimate grievance, there is a way in that forms the base from which the rest follows.
Milieu Control
The most basic feature of the thought reform environment, the psychological current upon which all else depends, is the control of human communication.
Robert Jay Lifton
Milieu control is described by Robert J. Lifton in his book Thought Reform and the Psychology of Totalism. In the book, Lifton interviews those who had undergone Maoist brain-washing, similar to the earlier article by Hunter. The term milieu essentially means the social setting or environment that a person is in. We know that social setting will influence behaviour, as was observed in the Milgram Experiments and the Stanford Prison Experiment (SPE). Individual identity can be subordinated to group identity in fairly short order. This is as true in wider society as it is within organisations; the latter I have previously discussed in the following article.
Lifton clarifies that milieu control is control of communication within an environment, enforcing the prevailing ideology and creating the polarised concepts of pure and impure, good and evil. In this type of scenario, where the individual identity is subordinated to a group identity, adherents are freed from the subtle nuances of truth. We see these patterns manifest in social media and traditional media, which have themselves become the mechanisms to achieve milieu control. These formats are driven by engagement or by sales, so those mechanisms inevitably reward polarised perspectives, further entrenching polarisation.
A consequence of social media is how people coalesce into groups of the similarly minded. They tend to self-select using mechanisms such as recommended connections or contacts, like and share mechanisms, and follow or connect features. In conjunction with block features, this enables a curated internet experience where a person’s right order of the world is informed by a mutually shared group identity. People have created their own environment, their own milieu.
The immediacy of social media, available on devices that are always with us, provides a continuous information stream nudging perspectives into alignment with the milieu. In many ways it is an automation of the Maoist method of reaffirming information over a protracted timeline that Hunter describes in his article in the Miami Sunday News. It also serves as a proxy for the group setting Hunter describes, which enforces the milieu.
Purity
A totalising ideology will bifurcate all dialogue into extreme positions of pure and impure, absolute good and absolute evil. This fosters strong ingroup preferences and vehement dismissal of any perspective that does not conform to the established paradigm.
This can be, and frequently is, elicited using any number of logical fallacies, such as false dichotomies, strawman arguments, or ad hominem. In itself, the framing of concepts of purity is a form of all of these fallacies. To be clear, there will always remain an element of doubt in the adherent to this paradigm, and we must understand that there are degrees of conformance.
We can see from virtue ethics that altering the underlying values or character of a person will inform their perspective on what is right or wrong and drive their decisions through the deontological ethics they adopt. The change of an individual’s values over time will alter how they behave. It doesn’t need to go this far, as individuals can operate using different ethical standards that are environmentally contextual, such as in the SPE. But as Lifton notes, milieu control pushes people towards the integration of the internal and external milieux so that any inconsistencies between them are removed.
We see purity statements in the support of various causes. Examples will be along the lines of:
I am a [insert group label] and I am appalled and ashamed by [some action] by [another group with tacit commonality]. This does not represent me or [insert group label].
This does a lot of heavy lifting. Statements of this nature are fealty to an ideological perspective. They also serve to define the ingroup and the outgroup and to deliver condemnation. There is, of course, the mutually reaffirming aspect and the dopamine hit from others within the ingroup, which is the allure of such technology.
Contained within this is another concept, the cult of confession, described by Lifton as the demand that one confess to crimes one has not committed. These types of affirmation statements concede guilt for an action to which the person is not a party. Lifton describes this as an act of symbolic self-surrender. Extending from this to more traditional principles of influence, we can see that people making such statements are pliable to commitment and consistency as described by Robert Cialdini.
The Drama Triangle
There is a sinister element manifest through the encouragement of purity statements. It reinforces a psychologically unhealthy model: Karpman’s Drama Triangle. This casts a victim, a persecutor, and a rescuer role. It is a slightly separate conversation, but it is useful in identifying dysfunctional social interactions. It is also worth noting that people can move between these roles depending on the situation.
A Victim will see themselves as inferior to the situation. They are characterised by self-pity, feelings of unworthiness, avoidance of confrontation, a lack of responsibility, and a feeling of having no control or power. They will seek a persecutor.
A Rescuer will consider themselves superior to the victim. They are characterised by a lack of respect for someone’s ability to think for or help themselves, and will also subconsciously keep the victim dependent on them.
A Persecutor will persecute the victim and be controlling and manipulative. They are characterised by feelings of superiority, controlling and dominating actions, and the use of manipulative techniques including shaming and blaming.
These roles are described in the following diagram, taken from my notes from many years ago, but it articulates the point.
Where someone adopts the role of a rescuer, they need a victim to rescue and a persecutor to protect them from. When a persecutor is identified, that person takes on the characteristics of the role in the mind of the rescuer, irrespective of their involvement. The problem with this type of socially dysfunctional interaction is that the perceived persecutor may not have done anything at all, yet they are mischaracterised as such. Clearly this upholds other methods of polarisation and the perspectives of totalism.
Loading the Language
Totalising language is used to reaffirm the purity concepts, enforce group affiliation, and serve as a cognitive shortcut. Terms dichotomised to the pure and impure paradigm are not neutral. They are ‘thick’ concepts, meaning the terms are loaded with connotation, innuendo, and associated meaning. The language will be definitive and highly reductive. Groups that have defined an outgroup will adopt such language to describe it, and the terms used will typically elicit disgust or outrage. Lifton gives examples from Maoist China such as ‘capitalist’, ‘imperialist’, and ‘bourgeois’. He characterises this type of language as constrictive. These terms often become derisive and are used to silence dissenting opinion. Being labelled with such a term will evoke the disgust response from the ingroup, and the terms might be considered thought-terminating clichés.
The linguistic alignment elicits neural resonance within the ingroup, essentially reinforcing an empathetic connection. Dopamine is likely to play a significant role here in further cementing the ingroup preference. Chris Voss describes this through a process called mirroring; a byproduct of creating a specific vernacular is that it evokes these mechanisms. These factors, combined with mirroring being a mechanism in language acquisition and learning, make the establishment of exclusive terms desirable to the process of thought reform.
When describing Carl Jung, Jordan Peterson states, “People don't have ideas. Ideas have people.” This concise line articulates the sentiment that people act on the ideas that have been imparted to them. By observing conversations in online spaces, you become familiar with the vernacular that various groups use. Familiarity with the origin of certain phrases leads to understanding what news people consume, from what platforms, who they follow, and the associated political views and beliefs. There is a lineage of ideas that manifests through language.
Richard Dawkins is credited with coining the term ‘meme’, which is common parlance today, but the original conceptualisation suggests something a little more insidious.
When you plant a fertile meme in my mind you literally parasitize my brain, turning it into a vehicle for the meme’s propagation in just the way that a virus may parasitize the genetic mechanism of a host cell.
N. K. Humphrey, quoted in The Selfish Gene – Richard Dawkins
Why is this a problem?
The media, politicians, and online agitators use these mechanisms to consolidate perception within groups in a way that is preferable to their agenda. It’s no coincidence that rhetoric was described by Aristotle thousands of years ago; this has been with us for some time, but in a modern context the problem is exacerbated. If you have subordinated your voice to that of a group, then you are effectively a vassal for the opinions of others.
Conclusion
Well, I could be wrong, but maybe the whole world is going insane.
This subject could be discussed in extenso. I have only discussed a couple of points in Lifton’s model, but they are the ones I think are most pertinent to what I see happening in online spaces at the moment. It’s not one group; it’s many of them.
Perhaps I’ll put some questions to you to consider. Maybe think about some of the following the next time you are reading posts on the internet. If you see any of this in what you or others around you say, then it could well be time for some self-reflection.
Do you recognise terms that have loaded meanings that vilify another group?
Does this language prevent you from engaging with the arguments of another group?
How would those around you react if you challenged the ideas of the group?
Does what you are saying frame you as the Rescuer?
When did you last seriously engage with the ideas of someone you disagreed with?
Perhaps it’s time we all brushed up on our critical thinking. Then there might be a discussion worth having.