COOPERATION

The "Ecological" Prisoner's Dilemma

After deriving some preliminary conclusions from this result, Axelrod tried an even more interesting innovation. For this new round, Axelrod publicly requested submissions from any source; there were 62 entrants plus one of his own (RANDOM), for a total of 63. All these strategies were then pitted against one another in a giant free-for-all tournament. The winner was Tit for Tat, submitted again by Rapoport (but, oddly, by no one else). Again, it had the highest average score of payoffs.

Axelrod scored the results of the tournament as a 63x63 matrix showing how each strategy had fared against every other strategy. An analysis of the strategies played revealed six strategies that best represented all the others. Since the 63x63 matrix showed how each strategy played against all others, Axelrod was able to calculate the results of six hypothetical "replays" in which one of the six representative strategies was initially dominant. Tit for Tat scored first in five of the six replays, and second in the sixth.

Then came the cleverest innovation yet. Suppose, Axelrod reasoned, we performed a hypothetical replay in which all strategies were pitted against each other, and in each round the "losers" were replaced by copies of the "winning" strategies, thus altering the population of players. Each strategy's score--already known from the 63x63 matrix--could be treated as a measure of "fitness" against other strategies in a kind of "ecological" tournament. The results left no doubt. The lowest-ranked strategies, which tended to be "not-nice" (in other words, which tried to defect occasionally to see what they could get away with), were extinct within 200 rounds.
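The mechanics of the ecological replay can be sketched in a few lines: each generation, a strategy's share of the population grows in proportion to its average score against the current population. This is a minimal illustration, not Axelrod's actual data--the three strategy names and their pairwise scores here are hypothetical stand-ins for the real 63x63 matrix.

```python
# Minimal sketch of an "ecological" replay. The payoff table is
# hypothetical: payoff[a][b] is strategy a's average score per game
# against strategy b (Axelrod used the real 63x63 tournament matrix).
payoff = {
    "TitForTat": {"TitForTat": 3.0, "AllDefect": 1.0, "Tester": 2.8},
    "AllDefect": {"TitForTat": 1.2, "AllDefect": 1.0, "Tester": 1.1},
    "Tester":    {"TitForTat": 2.9, "AllDefect": 1.0, "Tester": 2.0},
}

def step(shares):
    """One generation: each strategy's share grows in proportion to
    its average payoff ("fitness") against the current population."""
    fitness = {s: sum(shares[t] * payoff[s][t] for t in shares)
               for s in shares}
    total = sum(shares[s] * fitness[s] for s in shares)
    return {s: shares[s] * fitness[s] / total for s in shares}

# Start with an even mix and run 200 generations.
shares = {"TitForTat": 1/3, "AllDefect": 1/3, "Tester": 1/3}
for _ in range(200):
    shares = step(shares)
```

With these illustrative numbers, the exploitative strategies dwindle generation by generation while Tit for Tat comes to dominate the population--the same qualitative pattern Axelrod observed.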
Only one not-nice strategy (which had been ranked eighth in the original 63x63 competition) lasted past 400 rounds, but by then the population of surviving strategies consisted only of those which replied to defections with immediate retaliation. With no more strategies left to take advantage of, the not-nice strategy began a precipitous decline. By the thousandth round, it too was virtually eliminated. And the winning strategy? Once again it was Tit for Tat, which was not only the most prevalent strategy at the end of 1000 rounds, but the strategy with the highest rate of growth. Tit for Tat was not merely successful, it was robust--it did well in all kinds of environments.

Why did Tit for Tat do so well? How could such a simple strategy perform so capably in such a broad mix of more complex strategies? More to the essential point, how could Tit for Tat do so well even when surrounded by strategies which depended on defecting and so would supposedly tend to earn better payoffs? It appeared that a strategy which cooperated by default was able not only to survive but actually to thrive amidst a sea of defectors. In other words, cooperation evolved over time in a world dominated by uncooperative players. If this simulation bears any relation to the real world of humans, there could be some important lessons in it for us.
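The puzzle of Tit for Tat prospering among defectors can be made concrete with a small round-robin. The sketch below uses the standard Prisoner's Dilemma payoffs (mutual cooperation 3 each, mutual defection 1 each, lone defector 5, its victim 0) and 200-move games; the tiny five-player field is illustrative, not Axelrod's actual tournament.

```python
# Round-robin sketch: two Tit for Tat players among three Always
# Defect players. Standard payoffs; 200 moves per game.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_history, their_history):
    # Cooperate first, then copy the opponent's previous move.
    return their_history[-1] if their_history else "C"

def always_defect(my_history, their_history):
    return "D"

def play(strat_a, strat_b, moves=200):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(moves):
        a = strat_a(hist_a, hist_b)
        b = strat_b(hist_b, hist_a)
        pa, pb = PAYOFF[(a, b)]
        hist_a.append(a); hist_b.append(b)
        score_a += pa; score_b += pb
    return score_a, score_b

players = [tit_for_tat, tit_for_tat,
           always_defect, always_defect, always_defect]
totals = [0] * len(players)
for i in range(len(players)):
    for j in range(i + 1, len(players)):
        si, sj = play(players[i], players[j])
        totals[i] += si
        totals[j] += sj
```

Head to head, Always Defect always beats Tit for Tat by a few points (204 to 199 here), yet each Tit for Tat player ends the round-robin with a higher total (1197 versus 808): the cooperators earn 600 points from each other while the defectors grind out only 200 per game among themselves. Tit for Tat never "wins" a single game, but it wins the tournament.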