The Modern Survival Guide #4
This is an entry in the Modern Survival Guide series, a guidebook I’m writing for things I think people need to know about living in the modern world. These are my opinions, and you read further at your own peril of absorbing my worldview. You have been warned… and that goes extra for this entry, because this time we’re talking about brains and their limits.
We humans like to believe that we’re smart. I know I sure do. Look at me writing stuff down, I must be super intelligent! I can even use onomatopoeia in a sentence and spell it right on the second try! Sadly, your brain and my brain and, frankly, everyone’s brains are wired to conspire against us. And as George Carlin said a few times, this produces bullshit that is bad for us.
We’re pretty much obliged to use our brains a lot more than people in the past really had to, or at least in a different way than they had to. A modern person gets bombarded with an endless stream of information, facts, figures, lies, propaganda, rumors, and analysis from multiple different angles of the moral, political, and social spectrum. We get more information in a day than people a century ago got in a year. And our brains can’t take it. We develop cognitive biases to cope, and this leads us further and further from the main seam of reality.
There are a LOT of cognitive biases. So, to survive and thrive in this sea of thought, here are a few things to watch out for: heuristics, pattern-recognition fallacies, logical fallacies, and knee-jerk worldview defense mechanisms. These are the ones I think come up the most, and are the most damaging. Feel free to disagree; maybe I’m biased, after all.
Heuristics are nasty, sneaky buggers. A heuristic is a mental shortcut; it’s your brain using past experiences to construct a vision of the future, usually with some emotional guidance attached. These things are how we make snap decisions. Heuristics evolved back in the days when all we needed to know was “Saber-Tooth Tiger: Bad!” Sadly, heuristics aren’t great at logic or at figuring out the actual causes behind events. They’re really good at pattern-based generalization, though.
Racism is a heuristic. It’s someone’s brain taking one piece of information and constructing a broad, sweeping generalization out of it. Again, this works great for tigers. All tigers have pretty much the same motivation: Kill things and eat them. Sleep. Wake. Mate. Kill things and eat them. Repeat. This doesn’t work as well for people. People have more motivational options than tigers. People can change their stripes.
Heuristics are a problem because they’re lazy and self-reinforcing, thanks to the other stuff we’ll discuss in a minute. They encourage a simplistic worldview that isn’t always correct but feels like it is. They feel “truthy.” Your brain will actively throw heuristic interpretations at you (all black people are criminals, all politicians are liars, all men are sexist, all women are sluts, all bats have rabies), and they can be hard to fight. You probably have bunches of them you don’t even know about, because you won’t notice them until someone challenges you on a concept… or until you work to spot one.
Here’s how you spot a heuristic: keep an eye on stuff you think and say until you hit a concept that you believe bone-deep. Then ask yourself why you think that. If you can point to an article you read or something similarly evidence-based, you’re clear(ish). If your brain’s response is something like “well, everyone knows,” or, “all of them are like that,” congratulations. You’ve probably found a heuristic.
Heuristics feed on pattern recognition fallacies, and these things are really hard to fight through. Basically, our brains are pattern recognition and assessment engines. We’re really, really good at spotting shapes and assigning meaning to them. This helped a lot back in the saber-tooth tiger days, and it’s still plenty useful… but your brain doesn’t quite know where to stop. It can and will assign false patterns to everyday phenomena that are not connected.
These days, false pattern recognition has a name: apophenia. The most common example is the man in the moon; you see a pattern in the moon that looks kind of like a face, and your brain says “hey, that’s a face!” But really… it’s just a series of discolorations on the moon’s surface. The moon does not have a face; the pattern has no meaning. Another example is the gambler’s fallacy, the belief that past random outcomes change the odds of future ones, so a run of reds at the roulette table means black is “due.” The pattern doesn’t actually exist, but the gambler’s mind wants it to, so that the gambler can know when to bet. Other examples include faces in clouds, Jesus in toast, and miracles. If you seek miracles you find miracles. This doesn’t mean miracles exist; it does mean people have a tendency to see things they want to see. This is called confirmation bias.
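The coin-flip version of this is easy to check for yourself. Here’s a minimal Python sketch (the streak length and flip count are arbitrary choices of mine, not anything official): simulate a pile of fair coin flips and see whether tails becomes any more likely right after a run of heads. The coin has no memory, and the numbers show it.

```python
import random

random.seed(42)  # fixed seed so the illustrative run is repeatable

# 100,000 fair coin flips; True = heads
flips = [random.random() < 0.5 for _ in range(100_000)]

# After 5 heads in a row, is tails "due"? Record what actually comes next.
after_streak = []
streak = 0
for flip in flips:
    if streak >= 5:          # the previous 5+ flips were all heads
        after_streak.append(flip)
    streak = streak + 1 if flip else 0

heads_rate = sum(after_streak) / len(after_streak)
print(f"P(heads right after 5 heads in a row) ~ {heads_rate:.3f}")
# ...which stays close to 0.5; the streak changes nothing
```

The gambler’s brain sees the streak and invents a pattern; the arithmetic sees independent events.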
Pattern recognition fallacies wreck our brains by giving us a false picture of reality. It’s a series of short steps from “the crops failed” to “the crops failed when I wore my old socks” to “old socks cause crops to fail,” and this is a path humans have walked many, many, many times. But it doesn’t help us; socks do not affect crops.
You don’t believe people do this kind of thing? Check your friends at the next ballgame. See who’s wearing their lucky socks.
You defeat pattern recognition fallacies with analysis. Most of science is set up to do this — to prove that one event causes or, more often, does not cause a particular response. This is the whole basis of the scientific method, and it’s why I’m typing this entry on a computer and not inscribing it with a quill pen on parchment. The more complex things get, the less our innate pattern recognition abilities help us, and the more we have to rely on fact-based data.
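You can even watch the socks-and-crops effect happen in random noise. This is a hedged sketch (the counts of seasons and “omens” are numbers I made up for illustration): generate purely random crop failures, test a hundred equally random omens against them, and notice that the best-scoring omen still “predicts” failure impressively, by pure chance. Check enough socks, and some pair will look lucky.

```python
import random

random.seed(7)  # fixed seed so the illustrative run is repeatable

seasons = 50
# Random crop failures: roughly 3 bad years in 10, no cause at all
crops_failed = [random.random() < 0.3 for _ in range(seasons)]

# 100 irrelevant "omens" (old socks, red moons, barking dogs...), all pure noise
best_match = 0.0
for _ in range(100):
    omen_present = [random.random() < 0.5 for _ in range(seasons)]
    # fraction of seasons where the omen and the crop outcome agreed
    agreement = sum(o == c for o, c in zip(omen_present, crops_failed)) / seasons
    best_match = max(best_match, agreement)

print(f"best chance agreement across 100 omens: {best_match:.0%}")
# noticeably better than the ~50% any single random omen would average
```

That’s the whole trick behind lucky socks: if you hunt through enough candidate patterns, randomness alone hands you a convincing one, which is exactly why controlled analysis beats eyeballing.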
Heuristics and pattern recognition fallacies often lead to logical fallacies. Logical fallacies are arguments that seem like they might make sense, but don’t. They are failures of common sense. Straw man arguments, appeals to fear, mistaking cause and effect, guilt by association, and appeals to ignorance are all common logical fallacies that sort of kind of make sense if someone says them in a confident tone of voice. I’ve fallen for them before; hell, I’ve perpetrated them before. Odds are, so have you.
Logical fallacies exist in nature because brains are great at making connections from available patterns, but not that great at ensuring that these connections follow a logical flow. The slippery slope is a classic example of such a problem. A slippery slope argument happens every time someone talks about drug legislation — you can’t legalize marijuana, because it’s a gateway drug to cocaine, and cocaine leads to crack, and crack leads to meth, and then the user is dead. We don’t want people to die of drug overdoses, so don’t do pot.
That chain of events seems to make sense. It works for the brain because the brain knows drugs are bad, m’kay, and therefore doing drugs should have consequences (note how alcohol rarely makes it into that conversation, though). So it seems like doing one kind of proscribed drug ought to lead to other, worse drugs. Trouble is, finding solid proof for that argument is very hard. Most of the studies that attempt to link marijuana to harder drugs are rooted in correlative logic, and correlation does not equal causation. But the argument persists regardless. The slippery slope is even harder to climb out of than it is to start down.
Partly this is the fault of cognitive bias, but partly it’s also the result of defense mechanisms in our brains that protect us from having to admit we did something wrong. The worst of these, in my opinion, is rationalization: the brain’s tendency to invent good-sounding reasons for whatever it already believes.
This is why smart people sometimes believe the most asinine things — it’s easier to rationalize if you have more computing power to work with. This is also why so few college professors in the same field agree with each other. The world is big and complicated, and there’s enough room in it for multiple competing explanations, but only one can be right. Most of the time, anyway. So the more expertise you have, the more ammo you have to defend your own personal worldview.
This doesn’t mean all scientists are wrong, by the way; it’s just the explanation for why their work gets peer-reviewed.
Defense mechanisms are also why you get things like religious wars, political impasses, and nasty Facebook fights. Your brain sets a lot of store by you being right; back in the old days, being right meant surviving. The people who were certain those berries were poisonous didn’t eat the berries, and didn’t get poisoned (note that the berries don’t have to be poisonous for this to work). So once you had a “correct” operating pattern, you lived longer. Add in the modern social pressure to defend yourself and your worldview in public, and being “right” is often more important than being factually accurate. And that’s how we get things like “alternative facts.”
Heuristics lead us to poor conclusions. Pattern recognition fallacies make us see things that aren’t there. Logical fallacies lead us to incorrect solutions to problems. Defense mechanisms keep us from changing our minds. Put all these things together, and you have one stubborn, stupid primate.
But we don’t have to be. Identifying all the ways our minds can mess with us is step 1 in unmessing our minds. Examining motivations, double-checking accepted conventions, looking for facts, asking “why” more than once when confronted with a new idea, and acknowledging at least once per day that we are all fallible humans are all good steps to take to defeat our brains’ worst impulses. These are all mindful ways to defend against walking down the path of the stubborn and dumb.
Ultimately, this helps us survive in the modern world by helping us identify and solve real problems, keeping us from wasting effort, and providing real answers to help us plan for the next event. Otherwise we’re all just a couple of steps away from praying to the next passing comet to ensure the sun will come up.