Protecting Ourselves from Sense



Asheley Landrum recently wrote an article about the bullshit of bullshit. In it, she critiques studies of bullshit as being bullshit themselves, because their authors were predisposed towards information that supported the very bullshit effect they set out to find.

One of the studies in question looked at the relationship between conservative beliefs and finding profundity in ‘bullshit statements’.

The problem was the study authors’ identity-protective cognition. Being liberals themselves, they were perhaps a little too adamant that a link between ‘having conservative opinions’ and ‘believing bullshit’ existed, and this was reflected in their methodology.

For example, the sample contained far fewer self-identifying conservatives than self-identifying liberals. Estimates drawn from small samples are noisy: the smaller the group, the further its statistics can swing towards the extremes by chance alone, so it isn’t surprising that an effect turned up. This was just one of many methodological flaws that Landrum found.

Source: http://dx.doi.org/10.1371/journal.pone.0153419
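To make the small-sample point concrete, here is a minimal sketch (not from Landrum’s critique; the sample sizes and seed are illustrative) that repeatedly samples from the same population and compares how much the small-sample averages wander versus the large-sample ones:

```python
import random
import statistics

# Illustrative only: repeatedly sample from one population and
# compare how much the sample means vary at different sample sizes.
random.seed(42)

def sample_means(sample_size, trials=1000):
    """Mean of each of `trials` samples drawn from a standard normal."""
    return [
        statistics.fmean(random.gauss(0, 1) for _ in range(sample_size))
        for _ in range(trials)
    ]

for n in (20, 200):
    means = sample_means(n)
    # Spread of the sample means shrinks roughly as 1/sqrt(n),
    # so small groups produce far more extreme averages by chance.
    print(f"n={n:4d}: sd of sample means = {statistics.stdev(means):.3f}")
```

Run it and the spread of the means for the small group comes out roughly three times that of the large group (about 1/√n), which is exactly the room a spurious group difference needs to appear.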

She concluded that the study was media-bait because it provided such black-and-white findings, which are exceptionally rare in science, and could have the added effect of turning conservatives away from social science and, counterproductively, toward less scientific opinions. This, in itself, is a form of identity-protective cognition.

So why is this such a big deal? 

Identity-protective cognition is, at its core, a way of holding onto beliefs we have internalised, propped up by a variety of common biases such as confirmation bias and post-rationalisation.

The danger comes when we hold beliefs so strongly that we tie them to our sense of self. Humans are naturally drawn to things that make us feel self-actualised, and sometimes this inclination can override our ability to separate the sensible from the nonsensical.

How else do you explain this?

If there’s one thing that years of decision research has found, it’s that we are exceptionally good at denying the truths of things that make us uncomfortable or threaten our way of looking at the world.

A particularly damaging outcome of this is the knowledge transference fallacy: people who are genuinely expert in one domain come to believe they are equally expert in domains where they are not. The classic example is politicians dismissing decades of scientific literature on topics such as climate change or vaccines on the strength of a single article they read in a Saturday newspaper.

We’re easily influenced, whether we want to believe it or not.

Landrum’s article can be found here. It’s a good read for anyone interested in bullshit, and in how data can be framed to support multiple conclusions, not all of which are accurate.

a.ce
