Living Post-Truth: Feelings Over Facts

What is Post-Truth?

According to the Cambridge Dictionary, post-truth refers to a situation in which people are more likely to accept an argument based on their feelings and beliefs than one based on facts, logic, and reason. In today’s world, the clearest examples of this can be seen in politicized arguments over science.

For example, many people believe that climate change is a hoax, even in the face of overwhelming scientific evidence to the contrary. A 2013 survey of some 4,000 peer-reviewed papers that took a stance on climate change found that 97% agreed human activity was causing global warming. A later review of the dissenting 3% found that those papers had little in common beyond contradicting one another and sharing “methodological flaws like cherry picking, curve fitting, ignoring inconvenient data, and disregarding known physics.” Despite this evidence that climate change is caused by human activity, people still deny it. Some have an economic or political stake in denial, such as oil companies and the politicians they back. For everyone else, it is their emotions and beliefs that are on the line. Knowing this, the question becomes, “Why do people accept arguments based on their feelings instead of facts?”


Why People Are More Likely to Accept Arguments Based on How They Feel

As humans, we seek balance among our beliefs, attitudes, and behaviors; when they conflict, we experience cognitive dissonance. Dissonance also occurs when our beliefs conflict with facts. To resolve the conflict, we should alter our beliefs, attitudes, or behavior, but instead we may simply ignore what is right in front of us. This concept is explored in Lee McIntyre’s Post-Truth, part of the MIT Press Essential Knowledge Series, in which McIntyre cites Leon Festinger’s 1957 book A Theory of Cognitive Dissonance.


In Festinger’s experiments, two groups performed the same menial, boring task, but one group was paid only $1 while the other was paid $20. When asked afterward to report how enjoyable the task had been, the group that was paid less said it was more enjoyable. Having received only $1, they needed some other justification for what they had done. In this case, the dissonance arose from a conflict between their negative attitude toward the task and their behavior (completing the task), and they resolved it by changing their attitude.


Festinger’s results show how powerfully internal conflict shapes the way people perceive reality, but McIntyre takes this further, using Solomon Asch’s Opinions and Social Pressure to show that external conflicts have similar effects. Asch found that we may change our beliefs to avoid dissonance not only within ourselves but also with those around us, and that to do so we may discount even the most obvious evidence.


Asch’s experiments involved gathering a group of seven to nine people, all but one of whom were in on the experiment; these insiders were dubbed “confederates.” The experimental subject, the one who wasn’t a confederate, was always seated so as to answer last. The group was shown a card with a single line and another card with three lines; one of the three lines was the same length as the line on the first card, while the other two were of “substantially different” lengths. The experimenter would go around the table and ask each person to identify which line matched. For the first few trials, both the confederates and the experimental subject reported correctly. The confederates would then begin to report that one of the two obviously wrong lines was the match. By the time it was the subject’s turn to answer, they were rightfully perplexed and astonished, yet subjects yielded to the majority opinion on 37% of these trials. This shows that just as we seek psychic harmony, we seek harmony in social situations.


Confirmation bias, the tendency to seek out or interpret information in ways that are consistent with one’s existing beliefs, also plays a role in why people reason with their emotions.


Peter Cathcart Wason produced On the Failure to Eliminate Hypotheses in a Conceptual Task (1960) to investigate the extent to which people use confirming evidence alone (enumerative induction) or both confirming and disconfirming evidence (eliminative induction) when drawing conclusions in a simple conceptual task. Wason gathered 29 subjects, all psychology undergraduates, and presented them with a sequence of three numbers: 2, 4, 6. They were tasked with discovering the rule the sequence conformed to by writing down further sets of three numbers, along with their reasons for choosing them, after which the experimenter would tell them whether each set conformed to the rule. When a subject announced a rule, the experiment ended if the rule was correct; if not, the process continued until they found the correct rule, gave up, or exceeded 45 minutes.


Many subjects then came up with series like “8, 10, 12” and, after being told that these conformed to the rule, concluded that the rule must be “increasing intervals of two.” Few tested other possibilities that might fit the actual rule but not the rule they had come up with. This is likely due to the reinforcement of their beliefs at the start of the experiment: being told that their instances of numbers increasing by two were correct reaffirmed their belief that all valid instances followed this rule, so they did not attempt to disprove their rule by producing disconfirming instances.


The correct rule was any three numbers in ascending order.
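
To see why confirming instances alone could never uncover this rule, consider a minimal sketch in Python (a hypothetical illustration of the logic, not Wason’s actual materials or procedure): it encodes the true rule alongside a typical subject’s guess and checks which test triples can actually tell the two apart.

    # Hypothetical illustration of the 2-4-6 task (not Wason's procedure).

    def true_rule(triple):
        # The experimenter's rule: any three numbers in ascending order.
        a, b, c = triple
        return a < b < c

    def guessed_rule(triple):
        # A typical subject's hypothesis: each number increases by exactly two.
        a, b, c = triple
        return b - a == 2 and c - b == 2

    # Enumerative induction: instances chosen to confirm the guess.
    confirming = [(8, 10, 12), (20, 22, 24), (1, 3, 5)]

    # Eliminative induction: instances chosen to probe the guess's limits.
    probing = [(1, 2, 3), (5, 10, 100), (2, 4, 7)]

    for triple in confirming + probing:
        says = true_rule(triple)        # the experimenter's yes/no answer
        predicts = guessed_rule(triple)
        verdict = "discriminates" if says != predicts else "uninformative"
        print(f"{triple}: experimenter says {says}, guess predicts {predicts} -> {verdict}")

Every confirming triple satisfies both rules, so the experimenter’s “yes” answers never challenge the guess; only triples on which the two rules disagree, such as (1, 2, 3), reveal that the true rule is broader. That elimination step is precisely what most of Wason’s subjects skipped.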


In the end, only six of the 29 subjects announced the correct rule on their first attempt. These subjects tended to produce more disconfirming instances (eliminative induction) than those who announced an incorrect rule. Of the remaining subjects, 10 reached the correct rule on their next attempt after producing more disconfirming sequences and relying more on elimination than on simple enumeration.


Wason’s experiment found that the ability to produce the correct rule came from using eliminative induction and testing one’s own beliefs, and that “very few intelligent young adults spontaneously test their beliefs in a situation which does not appear to be of a ‘scientific’ nature.” This lack of readiness to question and deconstruct our beliefs applies not just in the lab but in our daily lives. On this point, Wason wrote:

[...] the readiness (as opposed to the capacity) to think and argue rationally in an unsystematized area of knowledge is presumably related to other factors besides intelligence, in so far as it implies a disposition to refute, rather than vindicate assertions, and to tolerate the disenchantment of negative instances. And certainly these qualities are no less important for thinking in general than the more obvious cognitive functions associated with purely deductive reasoning.

We must be ready to challenge our beliefs in order to think logically, rather than fall victim to confirmation bias, which blinds us to possibilities that are right in front of us.


How All of This Applies to Us

We may hate to admit it, but all of us are prone to cognitive dissonance, social pressure, and confirmation bias. No one is immune. The best we can do is be aware of our biases and be prepared not only to have our beliefs challenged by others but to challenge them ourselves. If we refuse to open ourselves up to change, there will be no escaping the post-truth era.
