Confirmation bias, also called myside bias, is the tendency of people to favor information that confirms their preconceptions or hypotheses, regardless of whether the information is true. As a result, people gather evidence and recall information from memory selectively, and interpret it in a biased way. The biases appear in particular for emotionally significant issues and for established beliefs. For example, in reading about gun control, people usually prefer sources that affirm their existing attitudes. They also tend to interpret ambiguous evidence as supporting their existing position.
Biased search, interpretation and/or recall have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series) and illusory correlation (when people falsely perceive an association between two events or situations).
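The last of these effects, illusory correlation, can be illustrated with a short simulation. The sketch below is only a toy model under invented assumptions: the two events ("rain" and "joint pain"), their 30% base rates, and the 40% recall rate for non-confirming days are all hypothetical. It shows how a memory biased toward confirming co-occurrences can manufacture an association between two events that are in fact independent.

```python
import random

random.seed(42)

# Hypothetical data: "rain" and "joint pain" each occur on 30% of days,
# completely independently of each other.
days = [(random.random() < 0.3, random.random() < 0.3) for _ in range(20000)]

def recalled(day):
    rain, pain = day
    # Biased memory: the vivid, confirming "rain AND pain" days are always
    # remembered; every other kind of day is remembered only 40% of the time.
    return (rain and pain) or random.random() < 0.4

memory = [d for d in days if recalled(d)]

def p_pain_given(rain_value, sample):
    """Fraction of days with joint pain, among days matching rain_value."""
    matching = [pain for rain, pain in sample if rain == rain_value]
    return sum(matching) / len(matching)

# In the full record, pain is about equally likely with or without rain (~0.3).
# In the biased memory, pain looks noticeably more likely on rainy days,
# even though no real association exists.
print(p_pain_given(True, days), p_pain_given(False, days))
print(p_pain_given(True, memory), p_pain_given(False, memory))
```

The point of the design is that nothing about the underlying data changes; only which observations survive into memory does, and that alone is enough to create a spurious correlation.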
A series of experiments in the 1960s suggested that people are biased towards confirming their existing beliefs. Later work explained these results in terms of a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives. In combination with other effects, this strategy can bias the conclusions that are reached. Explanations for the observed biases include wishful thinking and the limited human capacity to process information. Another proposal is that people show confirmation bias because they are pragmatically assessing the costs of being wrong, rather than investigating in a neutral, scientific way.
Confirmation biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. Hence they can lead to disastrous decisions, especially in organizational, social and political contexts. Confirmation biases are effects in information processing; they are distinct from the behavioral confirmation effect, also called "self-fulfilling prophecy", in which people behave so as to make their expectations come true. Some psychologists use "confirmation bias" to refer to any way in which people avoid rejecting a belief, whether in searching for evidence, interpreting it, or recalling it from memory.
Confirmation bias has been described as an internal "yes man", echoing back a person's beliefs like Charles Dickens' character Uriah Heep. Experiments have repeatedly found that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with the hypothesis they hold at a given time. Rather than searching through all the relevant evidence, they ask questions that are phrased so that an affirmative answer supports their hypothesis. They look for the consequences that they would expect if their hypothesis were true, rather than what would happen if it were false. For example, someone who is trying to identify a number using yes/no questions and suspects that the number is 3 might ask, "Is it an odd number?" People prefer this sort of question, called a "positive test", even when a negative test such as "Is it an even number?" would yield exactly the same information. However, this does not mean that people seek tests that are guaranteed to give a positive answer. In studies where subjects could select either such pseudo-tests or genuinely diagnostic ones, they favored the genuinely diagnostic.
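The claim that the positive and negative tests "yield exactly the same information" can be made concrete. In the sketch below, the 1-to-10 range for the candidate numbers is an assumption added for illustration (the original example specifies only the suspected number 3); it computes the expected information, in bits, provided by each question under a uniform prior, and the two come out identical.

```python
import math

def question_information(candidates, predicate):
    """Expected information (bits) from a yes/no question, assuming the
    secret number is uniformly distributed over the candidates. This is
    the entropy of the answer's yes/no distribution."""
    p_yes = sum(predicate(n) for n in candidates) / len(candidates)
    entropy = 0.0
    for p in (p_yes, 1 - p_yes):
        if p > 0:
            entropy -= p * math.log2(p)
    return entropy

candidates = list(range(1, 11))  # assumed range for the secret number

positive = question_information(candidates, lambda n: n % 2 == 1)  # "Is it odd?"
negative = question_information(candidates, lambda n: n % 2 == 0)  # "Is it even?"
print(positive, negative)  # each question yields exactly 1.0 bit
```

Because "odd" and "even" split the candidates into the same two halves, a yes to one question carries the same evidential weight as a no to the other; the preference for the positive phrasing is purely psychological, not informational.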
However, in conjunction with other effects, this strategy can confirm existing beliefs or assumptions, independently of whether they are true. In real-world situations, evidence is often complex and mixed. For example, various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior. Thus any search for evidence in favor of a hypothesis is likely to succeed. One illustration of this is the way the phrasing of a question can significantly change the answer. For example, people who are asked, "Are you happy with your social life?" report greater satisfaction than those asked, "Are you unhappy with your social life?"
Even a small change in the wording of a question can affect how people search through available information, and hence the conclusions they reach. This was shown using a fictional child custody case. Subjects read that Parent A was moderately suitable to be the guardian in multiple ways. Parent B had a mix of salient positive and negative qualities: a close relationship with the child but a job that would take him or her away for long periods. When asked, "Which parent should have custody of the child?" the subjects looked for positive attributes and a majority chose Parent B. However, when the question was, "Which parent should be denied custody of the child?" they looked for negative attributes, but again a majority answered Parent B, implying that Parent A should have custody.
Another experiment gave subjects a particularly complex rule-discovery task involving moving objects simulated by a computer. Objects on the computer screen followed specific laws, which the subjects had to figure out. They could "fire" objects across the screen to test their hypotheses. Despite making many attempts over a ten-hour session, none of the subjects worked out the rules of the system. They typically sought to confirm rather than falsify their hypotheses, and were reluctant to consider alternatives. Even after seeing evidence that objectively refuted their working hypotheses, they frequently continued doing the same tests. Some of the subjects were instructed in proper hypothesis-testing, but these instructions had almost no effect.
"Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons."—Michael Shermer. Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased. A team at Stanford University ran an experiment with subjects who felt strongly about capital punishment, with half in favor and half against. Each of these subjects read descriptions of two studies: a comparison of U.S. states with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the subjects were asked whether their opinions had changed. They then read a much more detailed account of each study's procedure and had to rate how well-conducted and convincing that research was. In fact, the studies were fictional. Half the subjects were told that one kind of study supported the deterrent effect and the other undermined it, while for other subjects the conclusions were swapped.
The subjects, whether proponents or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Subjects described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways. Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time", while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented". The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias", has been supported by other experiments.
An MRI scanner allowed researchers to examine how the human brain deals with unwelcome information. A study of biased interpretation took place during the 2004 US presidential election, and involved subjects who described themselves as having strong feelings about the candidates. They were shown apparently contradictory pairs of statements, either from Republican candidate George W. Bush, Democratic candidate John Kerry or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether or not each individual's statements were inconsistent. There were strong differences in these evaluations, with subjects much more likely to interpret statements by the candidate they opposed as contradictory.
Biased interpretation is not restricted to emotionally significant topics. In another experiment, subjects were told a story about a theft. They had to rate the evidential importance of statements arguing either for or against a particular character being responsible. When they hypothesized that character's guilt, they rated statements supporting that hypothesis as more important than conflicting statements.
Even if people have sought and interpreted evidence in a neutral manner, they may still remember it selectively to reinforce their expectations. This effect is called "selective recall", "confirmatory memory" or "access-biased memory". Psychological theories differ in their predictions about selective recall. Schema theory predicts that information matching prior expectations will be more easily stored and recalled. Some alternative approaches say that surprising information stands out more and so is more memorable.
Predictions from both these theories have been confirmed in different experimental contexts, with no theory winning outright. In one study, subjects read a profile of a woman which described a mix of introverted and extraverted behaviors. They later had to recall examples of her introversion and extraversion. One group was told this was to assess the woman for a job as a librarian, while a second group was told it was for a job in real estate sales. There was a significant difference between what these two groups recalled, with the "librarian" group recalling more examples of introversion and the "sales" group recalling more extraverted behavior. A selective memory effect has also been shown in experiments that manipulate the desirability of personality types. In one of these, a group of subjects were shown evidence that extraverted people are more successful than introverts. Another group was told the opposite. In a subsequent, apparently unrelated, study, they were asked to recall events from their lives in which they had been either introverted or extraverted. Each group of subjects provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.
Taken together, these effects help explain why presenting contrary evidence so rarely changes a strongly held belief: the evidence is selectively sought, interpreted and recalled in ways that preserve the existing position, on all sides of a debate.