Blind Spots is all about exploring ideas or concepts that blind us to things that are there to be seen in plain sight, and that cause us to make far worse decisions than we have the potential to make. Here I’ve finally found the words to describe something I’ve been trying to describe for about fifteen years.
I mentioned the pandemic and the human race’s response to it in my last piece. Purely for the sake of demonstrating again how delusionally closed-minded we sometimes have the capacity to be, I’d like to point out two things. (This piece is not about Covid, I swear – but some people don’t like to believe humanity’s closed-mindedness can be as bad as it can be. I don’t blame them.)
The first is that, since medieval times, humanity seems never to have gone 200 to 300 years without a pandemic. And as humanity’s transport and communication networks have developed, pandemics have become more severe. With the last one 102 years ago, it’s fair to say the probability of this one happening eventually was over 95%. We had everything we needed to know this would come, but we denied it anyway.
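That “over 95%” figure can be sanity-checked with a toy model. A minimal sketch, under my own assumption (not a claim from the essay) that pandemics arrive independently, like a Poisson process, at an average rate of roughly one per century:

```python
import math

# Assumption (mine, not the essay's data): pandemics arrive independently,
# Poisson-style, at an average rate of roughly one per 100 years.
rate_per_year = 1 / 100

def p_at_least_one(years):
    """Probability of at least one arrival within `years` years."""
    return 1 - math.exp(-rate_per_year * years)

print(round(p_at_least_one(102), 2))   # the 102 years since the 1918 flu
print(round(p_at_least_one(300), 2))   # the essay's 200-300 year horizon
```

Under this rough assumption, the chance of at least one pandemic in the 102 years since 1918 comes out around 64%, and over a 300-year horizon it passes 95% – consistent with the essay’s point that another pandemic was effectively inevitable.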
The second is that, as of this Easter weekend, April 2020, there are still Covid deniers out there. It is still illegal to say ‘Coronavirus’ in Turkmenistan, and some fairly popular (although quite fringe) commentators in the US are still convinced that it’s a talking point that Democrats are using to damage Trump politically; even as Fox News themselves are reporting on the mass graves being dug outside New York. Once it did come, many of us had the capacity to persist with denial.
I’ve always been obsessed with the importance of open-mindedness, but this pandemic is giving me more reason than ever before to believe that the single biggest thing humanity can do to make the world a better place is to bring looking and listening back into fashion, and to promote the intake of information to the top spot in the decision-making process.
How is it that we can have the capacity to be so closed-minded? Here I’d like to explore what I believe are four of the biggest things that cause us to close ourselves off from incoming information.
If you liked my Requiem for Notions, this is my attempt at the Postmortem.
I’m going to use a dirty word here. ‘Philosophy’ is something that is associated with lofty academics and other bullshitting navel-gazers. This is partly because philosophy academics have painted it as something purely intellectual and inaccessible. It’s also partly down to the fact that, well, if you’ve been an academic all your life, what would you have to apply this philosophical knowledge to anyway?
Practical philosophy is concerned with how, and not what, we think. It defines how we interpret information, what it is we need to experience before we can say something is true, and ultimately, how we make decisions.
If the subjects, professions, and opinions we have learned are the ‘apps’, the philosophy is the operating system that hosts and runs those apps. Crucially, even though they are so deeply embedded we’re unaware of them, philosophies are the foundation language or ‘code’ in which all Western academic fields and professions are thought – especially in universities.
Some philosophies, like the ones I want to get into here, are accidental philosophies. They are things that you can be accused of being. In these cases, it is extremely rare for someone to say “I’m an X-ist”. But that person may still be using ‘X’ as their system of thought or values.
A person with zero interests outside of reality TV, celebrity culture, social media, clothes, or cosmetics is unlikely to consciously think “I’m a consumerist” to themselves. They’re likely to not know what consumerism is. That makes them no less a consumerist. You don’t have to know you subscribe to a way of thinking for it to dominate the way you think.
The over-application of these four philosophies is detrimental to the world. They are also very commonplace, yet hardly anyone knows that they’ve been raised to think like this. I want to talk about them because I think they need to be mainstreamed. Awareness of them would be more useful outside of academia, among people whose rubber actually touches the road. Merely having a label for something allows you to work with it and communicate it. Without the vocabulary to describe something, you can be helplessly subjected to it without even being aware of it.
Here are the four dominant but faulty philosophies:
#1 – Platonism
Like styles of music that build and iterate on one another and so evolve over time, philosophies usually have a ‘lineage’ that they trace back to an ‘ancestor’.
Before getting into the specifics of how this philosophy works, the more important thing to note first is that this is the dominant code that serves as the foundation for pretty much every profession and academic field outside of physics, chemistry, mathematics, computer science, engineering, and medicine. If you studied and/or work in any of the arts, humanities, social sciences, in business, or in law, the very foundations and assumptions that underpin your education come from the philosophy of Plato. Plato’s philosophy is the DNA out of which the University system evolved in medieval Europe. It is the dominant mentality on which most of Western Culture is built.
The philosophy of Plato, often called ‘Rationalism’, is based on the belief that the best indicator of truth is logic. The idea is that by using reason, we can imagine a more perfect (and crucially, more true) version of the imperfect world that actually is there.
In practice, this means that the things that make most sense on paper are also the most likely to be true.
Plato and his followers believed in theories, models, and frameworks as being the ultimate forms of learning and the strongest indicators of truth. It’s a very understandable reaction to read something that sounds logical and think that “this is the way it must be”.
It is very easy to slip from trying to figure out what is true to thinking about what makes sense, without even realising it. Please read that last sentence again.
It’s natural that we would want the world to be reasonable because we are capable of picturing a reasonable world in our minds’ eyes.
Rationalism’s faultiness is evident when there’s an inconsistency between, on the one hand, what we think or the theories we have been taught in our fields or professions, and, on the other, what we are actually experiencing. In these circumstances, Rationalism has raised our culture to often reject what we are experiencing in favor of what makes sense on paper or what our teachers and textbooks have told us is true.
Platonism’s flaws are most exposed in its habit of teaching people that the map is more real than the territory, and that when there is a mistake on the map, we should point and shout at the mountain and tell it that it shouldn’t be there.
In finance circles there have been more cases than anybody can count of fund managers sustaining massive losses despite following academically accepted investment management theories, only for those fund managers to say that the market was wrong instead of questioning the effectiveness of the methods they had learned in college. Long Term Capital Management’s loss of $4.6bn in 15 weeks using these academically accepted theories has done very little to dislodge belief in them, because the fundamental basis of Platonism is that what we can see (including instances of the theories not working) is of little importance compared to what we have decided is logical. The value of a theory isn’t based on its applicability to real life in the first place. Can you see the double bind this can cause? There’s a circular knot tied in the learning process here, because the entire thing is built on the assumption that the theory cannot be wrong, and so any clash between the theory and the reality casts the reality into doubt. It’s easy to see how this results in locked steering wheels on collision courses.
In his work on Actively Open-Minded Thinking, the psychologist Jonathan Baron suggests that there are two very different fundamental approaches to thinking and deciding. Some feel it is a sign of weakness to change your mind when confronted with contrary evidence; others see it as a sign of weakness not to. Some seem to believe that the best way to think is as an advocate or lawyer, making the strongest, most convincing, and most logical cases to consolidate what we already think and believe. Others have a more detective- or researcher-like way of thinking, where finding the truth is what matters most. The problem with the first approach is that you’re likely to commit every time to whatever you happened to think or learn first. This gives you great odds of crashing versus someone who lets themselves steer during the race.
What’s clear to me is that this contrast may be a hardwired biological or psychological difference within us, one that gives rise to the tension between those who believe that what they already know is most important, and those who feel that what there is yet to know matters more. Which is more real: what is already in our heads, or what is out there in reality?
The expression “That’s all very well in practice, but how does that work in theory?” sums up the priority Rationalism gives to ideas ahead of things that can be experienced.
Plato’s Rationalism is an appeal to use our minds instead of our senses to decide what’s true. I believe this is a dangerous system to use because it relegates sensory experience and encourages the thinking that what is in our own minds is more important than what is really out there, outside of our heads, in reality. It strikes me as somewhat arrogant, as if nature cares what we think.
Academic fields and the professions aside, the fact that most of us learn about what is happening in the world via journalistic news, which is itself descended from this philosophy, gives Plato’s Rationalism even more outsized influence over how we think the world works.
#2 – Postmodernism

Since 2017, a few controversies in the American university system have thrown up some very strange debates. A social justice movement, focussed mainly on issues of minority rights and the combatting of patriarchal privilege, has become very assertive. They hold a set of beliefs with which, they feel, any disagreement makes you a bigot by default. The reason I believe these debates – called the ‘culture wars’ in the US – are of interest to us is that, when their beliefs come into conflict with hard science, they appear to have a habit of branding the scientific findings as racist or bigoted, as if the messenger were generating the reality they report on.
For illustrative purposes I’ll just mention one such case of this. It is an article of faith within the woke social justice community that gender is a purely social construct. The most influential proponent of this belief is Judith Butler, a Berkeley professor of comparative literature. What is dangerous about debating this point with them, or trying to claim that there is a certain degree of objective biological reality behind gender, is that they will accuse you of hatred of LGBTQ+ identities. This is in line with their numerous claims that several robust scientific findings in the fields of evolutionary biology and psychology are politically incorrect.
Indeed, the fact that a literature professor is seen by the community as a suitable thought-leader ahead of biologists or neuroscientists gives us many clues into how they believe the world works.
A quick delve into what Postmodernism is and where it came from. In the latter half of the 1960s, as the various Christian churches’ grip over social and moral standards began to loosen, many of the theretofore quite solid ideas on what was morally acceptable behaviour began to be realised for what they were – tradition. People realised that things like whether or not it was OK to be gay, which had before been as debatable as granite, were nothing more than social constructs formulated on the words contained in religious books and accumulated cultural customs. So many moral norms that had been confused with objective factual truths for centuries came out in the wash as social constructs. This is the original core of Postmodernism: a scepticism of the imaginary structures that tell us what is socially or morally right or wrong. In this realm, Postmodernism has been very valuable in helping those who fell outside of our medieval Christian values graduate into fully fledged human beings, who are entitled to the same dignity and respect as everyone else, without having to pretend or hide who or what they are.
But then Postmodernism seems to have gradually metastasized into a form of batshit-crazy uber-subjectivism. Particularly in the more fundamentalist and woke social justice warrior community, they seem to have done away with a belief in objective truth altogether, preferring to say that all of reality is itself a social construct. The only solid truth that exists, they say, is power. This is to a certain extent understandable given the capacity power and wealth have always had to distort and shroud truth to their own ends. We laugh at the left now when they say that reality is subject to power but the fact is that the right have operated on this basis since time immemorial.
The leading scholar on Postmodernism, Stephen Hicks (himself not a postmodernist – he studies it but doesn’t subscribe to it), states that at its very core, Postmodernism says that there is no solid reality until we linguistically breathe it into life by describing it. They are certain that, if a tree falls in the forest, and no one is there to describe it, it does not make a noise.
If a community believes that the written and spoken world is what gives birth to reality (instead of, hopefully, the rest of us, that think words just describe things that were there already), it’s not hard to see why that community might see more credibility in the theories of a literature professor ahead of the more rigorous work of a biologist or neuroscientist.
The Woke movement is an extreme example of how far it can go when it is suggested that everything is a matter of opinion, but for decades our culture had already been infiltrated with the idea that opinions deserve automatic respect even if devoid of supporting evidence. It’s probably a mild side effect of the freedoms of speech, thought, and expression we have luckily enjoyed since WWII.
Postmodernism has been chipping away at the idea of a truth that exists whether we like it or not for decades. But this social justice movement is the moment that Postmodernism’s iceberg begins to peek above the waves and come into popular view for how ridiculous it is.
I’ve said already that Postmodernism has been beneficial in the fight for social justice and in overthrowing the tyranny of social intolerance to variance in race, gender, and sexuality. To wrap up Postmodernism I’d like to explain why I think it’s so harmful when it dominates outside of the realms in which it’s appropriate and starts spreading its tentacles into the territory of hard science.
Some realms in life are subjective and others are objective. Subjective realms are ones where human opinion alters how the dynamics of that area work. If everyone believes that oil will be worth 20% less tomorrow, oil will be worth a lot less tomorrow as the panic will induce panic selling. Taste in music or any other arts is subjective – you cannot quantify who the best rock band of the 1990s were. It is a matter of perspective. We have discovered that social and moral norms belong natively to this subjective category. The things that are OK or not OK to be or do are defined by us, and there is significant variance from culture to culture. There still exist warrior cultures where strength is worshipped as the ultimate virtue, and where violent behaviours that would make the average Westerner sick are perceived to be natural and justified.
Then there are realms of life that behave the way they do regardless of our perspective. Gravity is not a social construct. Our opinions have zero impact on the orbits of the other planets around our sun. If the subjective realms are the human world, the objective ones are Mother Nature’s world. Nature’s dynamics do not rely on our permission to do their thing. Peter Pan’s claim that “Anything is possible if you wish hard enough” only works in fiction; Mother Nature will not rearrange herself to accommodate our notions.
The Postmodern claim that truth itself is a social construct is a dangerous pandora’s box that prevents us from being able to tell the difference between subjective and objective realms. The fault of Postmodernism is not its claim that many of our norms are subjective social constructs. Its fault is that it oversubscribes to subjectivity, spreads outside the limits within which it is useful to us, and in turn diminishes our ability to categorise the questions we come up against as subjective or objective. If we don’t develop a strong ability to tell these apart, our prospects of solving some of the horrific economic and environmental challenges ahead of us do not look good.
| Subjective Questions | Objective Questions |
| --- | --- |
| Beatles vs. Rolling Stones | 2+2 |
| Acceptable / unacceptable social norms | How to build a bridge that will not collapse |
| Food & drink preferences | Whether or not a killer pandemic is a thing |
It’s important to be able to tell which category the questions we encounter belong to.
#3 – Denialism

How is it that we have popular movements claiming, against all evidence, that the Earth is flat, that we haven’t really been to the moon, that the Holocaust didn’t happen, or that vaccines don’t work?
Our culture has had a millennium of Platonism raising us to prioritise ideas over experience.
Add to this the fact that Postmodernism, with its claim that there is no such thing as complete truth and that every perspective has a certain amount of automatic validity, has been the dominant Western idea of the past fifty years. It’s hardly surprising that belief in truth has been undermined.
In May 1977, Richard Nixon was asked if his actions during the scandal that removed him from office two and a half years earlier were illegal. His answer was: “Well, when the President does it, that means that it is not illegal”. This concept of Presidential Infallibility has been wheeled out when needed by both George W. Bush and Donald Trump since then.
It’s a dangerous, denialist, blank cheque to write for yourself. Again it relegates anything that can be experienced to a level of less importance than the initial assumptions that you’ve chosen to buy into.
Keith Kahn-Harris identifies two very important dynamics in denialism.
The first is that plain old-fashioned denial isn’t enough by itself. Being in denial acknowledges that there is something to deny in the first place. What’s needed is something more formal; an ability to “turn our everyday capacity to deny into an organised attempt to undermine our collective ability to understand the world and change it for the better”.
The second element is that desire for things not to be true is the main driver behind Denialism. Surely this relates back to how some people naturally think more like advocates and some like detectives. For the latter, the very idea of having desires about what is true or not can seem very strange. It seems to somehow miss the point.
In subjective matters like business, politics, or public relations, Denialism can probably be useful. It can allow you to undermine your rival’s narrative (even if it’s the truth) by turning your denial into a belief-system that your followers can identify with. No amount of scans of Obama’s Hawaiian birth cert deflated the conspiracy theory that he was born outside of the USA. In objective realms though, covering your eyes with your hands and hoping a killer pandemic won’t see you is not going to be of any benefit to you. But in a Postmodern culture, where the ability to tell subjective realms from objective ones is seriously undermined, Denialism can flourish.
#4 – Solipsism
If I could choose for one term to be popularised and mainstreamed it would be Solipsism. Apparently more of an accusation than a belief amongst ancient Greek philosophers – technically, Solipsism is the claim that the only thing we know for certain exists is ourselves. It has since come to be used as a description of behaviours or attitudes that are displayed by people who act like the universe revolves around them.
If our own existence is the only thing that we can be certain is real, then the door is opened to believing that perhaps everyone and everything else is an imaginary sprite generated in a dream of our own making. It gets amusing to reason out from there that everyone is walking around generating their own realities.
Very few people in their right minds would consciously say that they believe this, but very many people in their moderately right minds seem to unconsciously operate on the basis that this is how life works.
I’m sure many of us have had contact with people who seem to believe that life will rearrange itself to accommodate what it is they want to think. Solipsism is the closest thing I’ve found to something that helps me understand this tendency.
I believe that these four blindfolds have evolved in a layered way where the beliefs of one enable the next one to emerge. Platonism trains us to believe in theories more than experiences. Postmodernism undermines our senses further by saying that all things, even hard scientific facts, are subjective social constructs that only exist because we believe they do. Building on that, Denialism derives permission from the last two layers to believe that reality only exists if you give it consent to exist. It’s only a short hop from this belief to a point of solipsistically believing that reality is subject to and dependent on your mind and not the other way around.
I call these philosophies The Four Blindfolds of the Apocalypse because I really believe that they could be the death of us. They rob us of the ability to make good decisions based on what is right in front of our noses. Worse, they commit us to paths that are not good for us and can cause us to be too stubborn to course-correct because, if it’s not working, it must be the world’s fault for not playing along. How can we hope to solve the health, economic, and environmental challenges that are stacked against us if it’s so fashionable to think our ideas matter more than the world around us? It is a quadruple-locked Failure Mechanism.
The other way, of course, would be to give a bit more credence to our senses and a bit less credence to our ideas. Just a bit of a rebalance.
It’s very telling that the saying “things aren’t always what they seem” seems a far more popular expression than “don’t believe everything you think”. Surely the things we can actually experience are deserving of more credibility than ideas flitting through our minds?
Plato’s arch-rivals were the Empiricists. This school of philosophy works off the claim that sights, sounds, and other things that can be experienced, while not perfect, are the best indicators of truth. Empiricism happens to dominate some very important fields, and it is amazing that its successes there haven’t led to it becoming more dominant elsewhere.
Look at the progress made in chemistry, physics, medicine, biomedical science, microbiology, computer science, and engineering.
Compare this with Economics (tragically a field that is more Platonic than Empirical), where economists are still unable to successfully foresee or prevent crashes. The causes of the 2008 economic crash and the 1929 one are almost identical. And because the field holds certain notions to be sacred, the conditions for the next one are not being strongly mitigated, since some of those conditions are seen as a good thing.
There are two layers of Empiricism. I’ll use an analogy from medicine to explain them.
The first layer is one that there seems to be reasonably popular awareness of. Drugs, treatments, and other therapies must be trialed for efficacy so that an overwhelmingly compelling body of evidence is built up to prove that they can cure or prevent illnesses.
The second layer, less popularly understood, is more subtle but, I believe, more important. When a doctor first approaches a patient’s bed or takes on a case, the first thing they must do is read the chart or study the case background.
But decision making in Politics, Business, and Economics doesn’t have as strong a heritage in doing this. It is much more common for decision makers to have answers in mind before questions are read. For example, most economic problems, depending on the specific case, tend to require either state intervention or increased market influence. The state and the market are tools, not beliefs that should be identified with. But the vast majority of economists are predictably and generically loyal to this tool or that tool every time, regardless of what the particular case calls for.
A doctor that identified as a Penicillin-ist, and that didn’t routinely check to see if this particular patient was allergic to Penicillin, would be as ridiculous as a carpenter that called themselves a ‘Saw-ist’, and who believed in using the saw so strongly that they always thought it best, regardless of the job.
This is the second, less visible layer of Empiricism: routinely taking stock of this particular case before deciding on what actions will match the scenario. It brings to mind the fact that we exclaim “Look Out!” when we are warning someone to be careful. We don’t exclaim “Look in!”
Empirical ways of thinking can disrupt all four blindfolds as blindfolds two, three, and four are all enabled by our Platonic worship of ideas.
If we could rebalance the importance we assign to our thoughts and our senses, perhaps we could escape this habit we have of confusing our search for the truth with arguments over whether Man Utd. are better than Liverpool.
If we conducted our relationships, business, economics, and politics in the same way that we conduct our science; we would be dangerous.