RE: virus: The Prisoner's Dilemma

Deron Stewart
Fri, 19 Feb 1999 10:15:23 -0800

I had hoped not to "drop the ball" but I guess I did in a big way...I ended up communicating nothing. Not the first time this has happened in cyberspace, I guess...

I'll try adding a glossary, I don't know if this will help... (I'm tempted to just write this one off as a failed attempt)...but if you're interested, please reread the original post with these comments in mind...


"big picture" -- taking into account the full context of a situation including any paradoxical or self-referential elements that would make a simplistic "rational" approach less than optimal. No universal or galactic considerations or anything so grand.

"hypothetical situation" -- just a single situation that an individual may find themselves in. Not a statement about whether the universe is inherently rational or knowable or anything dramatic like that. I could have used a concrete example like a "dollar auction" or "Newcomb's Paradox" but I wanted to avoid discussing the particular situation because that wasn't the point.
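Since the thread's subject line is the Prisoner's Dilemma, that game is perhaps the most familiar concrete instance of "rational" choices losing to "irrational" ones. A minimal sketch, using the conventional payoff numbers (my own illustration, not anything from the original post):

```python
# Standard Prisoner's Dilemma payoffs (conventional T > R > P > S values,
# chosen by me for illustration). Defection is the narrowly "rational"
# move -- it dominates -- yet two defectors do worse than two cooperators.

PAYOFF = {  # (my_move, their_move) -> my payoff
    ("C", "C"): 3,  # mutual cooperation (reward, R)
    ("C", "D"): 0,  # sucker's payoff (S)
    ("D", "C"): 5,  # temptation (T)
    ("D", "D"): 1,  # mutual defection (punishment, P)
}

def best_reply(their_move):
    """The narrowly 'rational' move: maximize my payoff given theirs."""
    return max("CD", key=lambda my: PAYOFF[(my, their_move)])

# Defection dominates no matter what the other player does...
assert best_reply("C") == "D"
assert best_reply("D") == "D"

# ...yet the joint "rational" outcome is worse than the joint
# "irrational" one -- each defector gets 1 where each cooperator gets 3.
assert PAYOFF[("D", "D")] < PAYOFF[("C", "C")]
```

The dollar auction and Newcomb's Paradox mentioned above have the same flavor: the locally dominant move leads everyone to a worse place.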

"non-rational" -- irrational, a-rational, whatever word you want to use for a strategy that isn't "rational". (Whatever we choose to call it, not all strategies are rational, so this is the word describing those that aren't.)

"new name" -- for example "transrational", or "meta-strategy" or something to refer to the Rational use of irrational strategies ("Level 3"?)

"denounce them in the harshest terms" -- those that announce the "meta-strategy" are labelled as irrational, or mystics...

At the end of the day, the best strategy is "rational" almost by definition, right? This circularity of definition makes discussion difficult. (it's like the old "proof" that all scientists are interesting because if you make a list from most interesting to least interesting then someone has to be on the bottom. But to be the "least interesting scientist" is a pretty interesting thing to be, so that person moves up the list, leaving a new "least interesting scientist" on the bottom, and so on...)

The point for me is that people who are steeped in math, science, logic, game theory and critical thinking tend to be blind to some of the "creative" strategies that aren't accessible to that way of thinking. Or at least I know that I have been (and probably continue to be).

Critical thinking is like a big heavy club. Very powerful tool, satisfying and easy to use, but it isn't the right tool for every job. (I guess it feels like a razor sharp blade to the wielder, but looks like a big club to everyone else...). There was a time in my life when I didn't realize other tools existed -- or they all seemed frail and wimpy beside the mighty club of critical thinking. These days I try to have more of a "right tool for the right job" kind of attitude...but it is hard to break old habits.



p.s. De Bono's "six hats" idea may be a good example of a useful "toolkit".

I couldn't find a good link on the web, but a summary description from a not so good link is:

White Hat -- information gathering
Red Hat -- expressing emotions (without need for justification)
Black Hat -- caution and pointing out risks and shortcomings (critical thinking)
Green Hat -- proposing and being creative
Blue Hat -- thinking about the process itself
Yellow Hat -- finding value

The black hat is the most powerful. And the most abused...

-----Original Message-----
From: Deron Stewart <>
Date: Thursday, February 18, 1999 10:22 AM

>I think what you are saying here is that the truly rational person with
>vision will choose to do what's best from a "big picture" point of view,
>regardless of what is narrowly "rational" in a given situation. Is that a
>fair restatement?

Yes, but with the caveat that a bigger picture isn't necessarily better. At some point (global? galactic? universal?) you start encountering diminishing returns. I'm not sure what the "right" scope is, and I must confess I haven't given it much thought.

>If so then I agree completely. (This feels sooooo close to being a
>breakthrough that I hope I don't drop the ball here...).
>I want to posit a hypothetical situation in which every "rational" choice
>is inferior to some "irrational" choice. (i.e. ignorant and whimsical
>people are getting higher "payouts" than learned and logical people in
>this situation. And what's worse is that the more the learned people think
>about the problem the worse they do!)

This is automatically true given what I said above. If you pick your scope at some silly large level (e.g. galactic), then every "rational" choice will be inferior to some "irrational" (human-scale) choice.

>Eventually. After a very long time. A few of the learned people figure out
>that there is no "rational" solution to this particular problem and with
>the benefit of this larger perspective they adopt a "non-rational" approach
>to this situation which improves their payout.

How is non-rational different from irrational in this context? Apologies if this is a tangent; I don't think I understand what you are implying.

>They give this type of solution a new name to distinguish it from those
>which are "rational" in a simple way, and with great enthusiasm they
>announce this breakthrough to their learned colleagues.

>The colleagues then denounce them in the harshest terms...