Cognitive Dissonance: In the original experiment (Festinger and Carlsmith, 1959, “Cognitive Consequences of Forced Compliance”), the experimental subject completed a boring task, then randomly received either $1 or $20 as an inducement to lie about the ‘fun’ experiment to ‘the next subject.’ Those paid $20 to lie were more likely to rate the whole experiment as boring, as it was designed to be “an experience about which he [the experimental subject] would have a somewhat negative opinion.” Those paid $1 had better opinions of it… after all, who would lie just to obtain one dollar? If the $1 folks actually liked the one-handed empty-spool loading/unloading and the turning of the pegs, there was no lie and no cognitive dissonance. Those $1 folks were also significantly more likely than those receiving $20 to volunteer for more such experiments. Attitudes were measured by a questionnaire afterwards. Perhaps the $1 folks didn’t like thinking that they were hypocrites merely because some talking, nearly-furless ape in a lab coat asked it of them… did some of them reprogram themselves to eliminate the hypocrisy? (However, authority often wields great power… see the summary of Dr. Milgram’s famous experiment a couple of paragraphs down.) In one sentence: a person’s thoughts and beliefs guide their behavior, but their behavior can also change their thoughts and beliefs.
In his autobiography, Ben Franklin explains how he dealt with the animosity of a rival legislator: “Having heard that he had in his library a certain very scarce and curious book, I wrote a note to him, expressing my desire of perusing that book, and requesting he would do me the favour of lending it to me for a few days. He sent it immediately, and I return’d it in about a week with another note, expressing strongly my sense of the favour. When we next met in the House, he spoke to me (which he had never done before), and with great civility; and he ever after manifested a readiness to serve me on all occasions, so that we became great friends, and our friendship continued to his death.” Classic ‘forced’ compliance used to induce CD. A possible confound: receiving two letters from Ben Franklin may have been reward enough, as the man could write well.
Some modern studies report that asking subjects to fake a smile or to nod their heads will positively change their attitudes toward whatever they are exposed to. In a way, this can be subliminal CD, especially if the subjects think they are nodding or shaking their heads merely to test the headphones they are wearing.
I have heard a rumor that Mall-Wart ‘management trainees’ are encouraged to publicly shout corporate slogans over and over again. Sometimes they receive trivial rewards for especially enthusiastic responses.
Perhaps this is why a high-ranking human learns to make subordinate humans wait when there is a meeting, even if the high-ranked human has summoned the subordinate human.
I have heard that the millions of old pots and pans collected during a world war wound up in landfills. Perhaps if a person donated unwanted cookware, they were more likely to support the war in other ways.
A rumor has it that some therapists will agree with many small statements that the patient makes. The patient is then asked to agree with a statement the therapist makes, such as admitting a history of drug addiction or drunk driving.
Sometimes salespeople will ask an indecisive buyer for some ID for a bogus credit-check or some such. (A smart talking ape would keep physical control of his ID if he complied with that at all, since allowing the salesperson to hold the ID gives the salesperson additional ‘leverage.’)
CD explains why the useful idiots who say stupid things like ‘radiation is good for you’ are given copious coverage on corpwhore media. These useful idiots supply the sheeple with false facts that support the notion that ‘they’ wouldn’t willingly endanger huge numbers of life-forms for the sake of profit. http://ex-skf.blogspot.com/2011/10/japanese-critic-plans-hotel.html
If, in the course of human events, you encounter someone who actually says something like ‘I Do Not Want To Believe This!’ upon learning of a nastiness, the best thing to do (imo) is instantly AGREE WITH THEM. ENTHUSIASTICALLY. Use statements like ‘I know what you mean, I might take a billion dollar bribe myself, but that’s why we have laws, right?’ Seek further ‘agreements’ because denial is the most common mental path used to reduce CD.
The Milgram experiment that tested obedience to authority: The experimental subject answered an ad, went to the lab, and was paid $4.50 (in 1963) to keep no matter what. The subject then met two people: a white-lab-coat guy (a paid actor, for standardization purposes) and another guy who claimed to be another ad-answerer (but was more of the hired help). In what appeared to be a random assignment, the subject was given the task of teaching word pairs to the hired-help guy and was asked to apply electric shocks to punish wrong answers. First, the subject watched the hired-help guy get strapped into a chair, then went with the lab-coated actor into another room that shared a common wall (so he could hear the strapped-in guy). The huge device for applying electric shocks was built to be impressive and bore various labels (at 450 V, it read DANGER SEVERE SHOCK). Each shock-activating switch could only be thrown once, and the voltages increased each time (30 V to 450 V).
The subject did not know the true purpose of the experiment until the debriefing at the end. The impressive machine delivered no electrical shocks, and a tape recorder (not a common device in 1963) contained the audio-only responses to the faux electric shocks.
The first ‘shocks’ were mild. But the word task got harder, the taped responses were wrong more often, and the faux shocks increased in severity. (The boiling-frog metaphor comes to mind.) At higher voltages, the tape recorder mixed a few complaints in with the responses. Predictably, they increased in severity with the voltage. Twenty switches later, at three hundred volts, the actor/recorder demanded to be released and did not answer any more questions. Since no response was to be treated as a wrong answer, there were ten more switches, up to 450 volts, clearly labeled as dangerous… sixty-five percent walked that path, often with great mental distress, and flicked the last switch. Thirty-five percent, or fourteen out of the original forty males recruited for this study, called it quits before things got that far. This study has been replicated and permuted many times. Females = Males. If the subject cannot hear or see the results of the switch-flicking, the obedience rate is around 93% despite clearly marked warnings on the ‘control panel.’ If the subject has to physically hold the guy’s hand to a faux shock plate, 70% drop out, leaving 30% who obey and apply the final 450 volts. The physical presence of an ‘authority’ compels obedience, and the closer, the better. If the odious task is split up into many small tasks, thereby diffusing the responsibility, obedience is practically guaranteed.
This is one reason why you always see so many TSA agents in one place. TPTB need the bent brass watching the wage-slave agents do their (-) karma job, otherwise the sheeple might not get irradiated and/or groped.
This is an unconfirmed story, but I have heard that the experimental subject who stopped applying shocks at the lowest voltage claimed to be an electrical engineer (a rare profession in 1963) with knowledge of the effects of electric shocks. Perhaps he also had plenty of experience with tape recorders and/or stereo systems, and could recognize bogus control panels and/or lo-fi speakers too. The moral is obvious: the difference between people and sheeple is the difference between knowledge and ignorance.
I have also heard that when Dr. Milgram first started doing this research, the learner/victim/actor was seen only as a shadow on a small window and the first fifteen subjects all obeyed to the 450V level. Milgram subsequently changed the experiment to allow for some learner/victim feedback.
This is a true story… when subjects were allowed to pick any voltage themselves, nobody picked higher than 15 V or 30 V.
The effectiveness of social pressure (conformity) was quantified by Asch (1955). In his experiment, the subject, along with seven others, was asked to do some very easy visual discrimination tasks. The seven others appeared to be fellow undergrads but were working for Dr. Asch, and occasionally all of them would supply the same wrong answer when asked. The experimental design allowed the subject to hear their answers before answering, and about three-quarters of the subjects also gave the wrong answer at least once. If one of the seven picked the correct answer against six identical wrong answers, then only 5% of the subjects gave the conforming answer at least once. Groups of two or three were not found to be as effective as groups of six or seven in producing conformity. If the subject feels like a fish out of water (socially), is unsure of what is expected, or is in other ‘ambiguous’ situations, conformity increases. Certain ‘collectivist’ cultures (e.g., Japan or India) were found to induce an additional stay-with-the-herd bias.
Imo, this explains why TPTB make sure that nobody expresses dissent when their puppets dance. Everyone always applauded for Stalin or Hitler or Roosevelt. Even today, the local demopubs carefully screen their respective audiences and ensure that nobody laughs, boos, or throws shoes, and will forcefully remove anyone who does not treat their puppets with deference. The puppets talk about approved topics in approved ways and push towards a BAU consensus. But times change–imo, modern media would quickly (rather than slowly) lose credibility with an important part of their audience if they never televised dissenting opinions. However, the media talking heads cannot take unpleasant truths seriously. Belittle, diminish, and if at all possible, get a laugh from the audience when they should be listening respectfully. The dissonance thus created lessens the likelihood of the sheeple seriously listening to those humans brave enough to tell truth to Empyre.
I would also like to point out that modern publicly available research does not stress its subjects as much as Dr. Milgram accidentally did. Nobody knows how much the [CLASSIFIED TOP SECRET] research takes subject stress levels into consideration.
The Mere Exposure Effect: Repeatedly seeing something will enhance the viewer’s attitude toward it. For example, experimental subjects who viewed nonsense words repeatedly tended to have positive feelings towards them and rated them higher than unexposed subjects did. This effect could still be detected even when subjects were repeatedly shown nonsense words for mere milliseconds at a time (subliminally). (When subjects were exposed to the word ‘beef’ for 5 milliseconds at a time, they reported feeling more hunger than controls did.)
Nonsense words are not very potent stimuli, especially when compared with consumer goods. Is it any wonder that advertisements are everywhere in a consumer society? Imo, this is why dictators put their self-portraits and self-aggrandizing statues everywhere, and why politicians try to get their mugs in front of as many sheeplefaces as they can as often as they can.
Subliminal stimuli: Some short phrases shown to subjects are modestly effective in changing behavior. More effective are faces showing emotion, though quantifying emotional transmittance is not something science does well, imo. More effective still are unnoticed but liminal stimuli. For example, Mullen and his grad students took 2.5-second snips of video of newscasters during the 1984 presidential race. He used 37 of those segments after removing all the audio and any video that mentioned candidate names (minimizing lip-reading potential). Then he and his grad students showed the segments to people and asked the subjects to rate the facial expressions. Two newscasters, Rather and Brokaw, were rated neutral. However, Peter Jennings of ABC was rated positively while speaking of both candidates, but more positively for one of them. Mullen and his grad students reported that TV viewers who watched ABC were more likely to vote for Mr. Jennings’s favorite by a significant margin. Mullen was able to repeat this research in 1988 with essentially the same results.
It is well known that humans often ‘mirror’ emotions. (Proverb: Smile and the whole world smiles with you. Thought experiment: a man is watching another man get kicked in a ‘sensitive’ area. What is the expression shared by both men?) This mirroring effect is enhanced if subjects watch a trusted and admired human, and it is extensively used in advertising. Some think that this ‘mirroring’ of emotions helps keep violence to a minimum and aids compassionate behaviors, especially since most humans are unaware of this tendency. (BTW, a good actor can effectively ‘play the audience.’ A psychopath who can do this is dangerous, as it is said they do not subconsciously respond to audience feedback like most humans do.)
This trick really works: If students smile frequently and pay close attention to a teacher only when the teacher stands on one side of the room, the teacher often will unknowingly teach mostly from the good-vibes side of the room. This can work even if the majority of students do not participate, and can work on professors who already know about this trick (if the students are clever enough).