You weren't listening. In the example given, the fact of reality that
one /cannot/ capture the gazelle alone was a premise, and you can't
change that. Reality is like that in many places: there exist millions
of ways to benefit yourself by cooperating and /incidentally/ benefiting
others that simply /cannot/ be done alone. This is a fact, and can't
be argued away. I cannot build my own car--I don't have that choice.
And even if I did, it would probably be cheaper for me in terms of my
time to just buy it, thereby incidentally benefiting its manufacturer.
But I don't buy it with the purpose of filling Mr. Iacocca's pockets;
I buy it with the purpose of driving to work to fill mine, and only
mine. The fact that I aided someone else was just a consequence.
In a computer simulation the idea is clearer: each of Axelrod's
programs, for instance, was selected entirely on the basis of one
single linear measure of its own wealth; it did not even have any
way to know anything about anyone else's wealth. It was, by hard-
wired electronic design, completely selfish. Yet those programs that
found their selfish ends best served by cooperation with others did
well in the simulation against more naive predatory programs. He
showed that cooperation can evolve naturally as a more effective
strategy than predation, even when the /only goal/ of the strategy--
hard-wired and immutable--was selfish gain.
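
If it helps to see it concretely, here is a toy sketch of that kind of
tournament (mine, in Python, not Axelrod's actual code; the 5/3/1/0
payoffs and the two strategy names are just the textbook prisoner's-
dilemma setup). Each program sees only its own history with the
current opponent and is ranked /solely/ by its own accumulated score,
yet the cooperative one finishes ahead of the naive predator.

# Toy round-robin in the spirit of Axelrod's tournament (a sketch,
# not his code).  Every program meets every program, including a
# copy of itself; each accumulates only its *own* payoff and is
# ranked only by that total.

PAYOFF = {                      # (my move, their move) -> my points
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(my_moves, their_moves):
    # Cooperate first, then copy the opponent's previous move.
    return "C" if not their_moves else their_moves[-1]

def always_defect(my_moves, their_moves):
    # The naive predator: defect no matter what.
    return "D"

def play_match(a, b, rounds=200):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = a(hist_a, hist_b)
        move_b = b(hist_b, hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

def tournament(strategies, rounds=200):
    totals = {s.__name__: 0 for s in strategies}
    for a in strategies:                 # every ordered pairing,
        for b in strategies:             # self-play included
            own_score, _ = play_match(a, b, rounds)
            totals[a.__name__] += own_score
    return sorted(totals.items(), key=lambda kv: -kv[1])

print(tournament([tit_for_tat, always_defect]))
# -> [('tit_for_tat', 799), ('always_defect', 404)]

Of course which program comes out on top depends on the field of
entrants; the striking result was that in Axelrod's much larger field,
the cooperative strategy won anyway.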