
Risk assessment, weasel style


Rick Rostrom commented on the last post that Al Gore’s statement about the relationship between reason and fear — while garbled as only Algore can — was based on some real research. Indeed it is.

Specifically, imaging of the brain has recently taught us that the pathways from our emotion gland to our logic lobe are much larger than the pathways leading back the other way. From this (near as I can figure it) Al deduces that it’s easier to frighten people into thinking than it is to think people into being scared.

This is why stupid people shouldn’t be allowed to handle facts (not you Rick — I know you know Al knows nuffink). Analogies about pipelines and highways and streams can only get you so far, and then they drop you over a cliff into a kettle of fish. The “size” of a neurological connection doesn’t necessarily speak to how “easy” it is for information to move. It is likely to have entirely different implications. Say, speed.

Like, I’m a hell of a lot more frightened of getting cancer (logical; there’s a lot of it in my family) than I am being crushed by a grand piano falling from a great height. But if I see a Steinway hurtling toward my head, I’m going to need to jump sideways really, really fast. And then figure out how the fuck a crane got in here, with the low ceilings and all.

I have worked with people who assess risk for engineering projects. They scoff at what they believe is the emotional, irrational way people evaluate personal risk. There’s a sort of math-makes-it-science prejudice about sticking with pure probabilities and leaving sphincter-clenching horror out of the equation. But is that really more sciencier?

Okay, you’re like a willionty-jillion times more likely to die in a car crash than a plane crash. So why do people sweat flying in a way they don’t sweat driving? Wellll…most of us have personal experience of traffic accidents; they range from the truly fucking awful to the merely annoying. A plane wreck, on the other hand — son, you’re going to die. And before you do, you’re probably going to see it coming. Good and hard. Trapped in a small metal box. With a bunch of screaming strangers. And your pants on fire.

Yeah, I think even Spock would add that into his risk evaluation alongside pure numerical probability.

So, how likely a thing is does have to count the most. But other factors do and should count, as well. How horrible it would be. Whether you could prevent it. How predictable it is. How much warning you’re likely to have.
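If you wanted to make that weighing explicit, it might look something like this toy sketch. To be clear, this is my own illustration, not anything from actuarial science or from the post; every weight and score here is an arbitrary assumption, chosen only to show probability counting the most while the dread factors still count.

```python
def risk_score(probability, horror, preventability, predictability, warning):
    """Combine probability with the 'other factors' listed above.

    All inputs are on a 0-1 scale. Higher horror means worse; higher
    preventability, predictability, and warning mean less dread.
    The weights are illustrative assumptions, nothing more.
    """
    # Dread bundles the four non-probability factors into one 0-4 number.
    dread = horror + (1 - preventability) + (1 - predictability) + (1 - warning)
    # Probability still "counts the most": it gets half the total weight.
    return 0.5 * probability + 0.5 * (dread / 4)

# Driving: likely, but familiar, partly preventable, fairly predictable.
driving = risk_score(probability=0.7, horror=0.4, preventability=0.6,
                     predictability=0.7, warning=0.5)

# Flying: very unlikely, but maximum horror, no control, little warning.
flying = risk_score(probability=0.01, horror=1.0, preventability=0.0,
                    predictability=0.1, warning=0.2)
```

With these made-up numbers, driving still scores higher overall, but flying's dread terms close most of the gap that raw probability alone would leave, which is roughly the point of the paragraph above.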

Have you heard the argument that terrorism should be WAY down in our list of priorities because the death count is so small? That there is some serious stupid masquerading as science. Terrorism adds human malice into the equation: a bunch of somebodies aiming all their brainal capacity at sneaking past every safeguard to do something of maximum horror, pain, visibility and surprise. I want a buttload of resources thrown at that creepy shit no matter how much more likely I am to be hit by lightning.

Emotional considerations are a kind of a logic. Thinking is not the opposite of feeling. They can elbow each other out of the way, but they aren’t two different states of the same element.

And poor old Al Gore, who thinks he can use the one to prop up the other, doesn’t have either on his side.

July 9, 2009 — 7:33 pm