I would like to argue that altruistic behavior is derived from the valuing of a good that one has shared with the one for whose sake one acts. The problem, however, seems to be that the altruist doesn't hope to enjoy that common good: that is, if you give your life to save your children, then once you're dead you are no longer enjoying the good of community. But at the moment you are acting for their good, you are in some sense sharing that good with them, and possibly doing so more abundantly than you otherwise could. So you not only act so that they will live well later on, but to be with them-later-on right now.
Integral to Dembski's idea of specified complexity (SC) is the notion that something extrinsic to evolution is the source of the specification in how it develops. He compares SC to the message sent by space aliens in the movie "Contact." In that movie, earthbound scientists determine that radio waves originating from somewhere in our galaxy are actually a signal being sent by space aliens. What convinces the scientists that these waves are a signal is the fact that they indicate prime numbers in a way that a random occurrence would not. What is interesting to me is the fact that Dembski relies upon an analogy with a sign rather than a machine. Like a machine, signs are produced by an intelligent being for the sake of something beyond themselves. Machines, if you will, have a meaning. Signs, if you will, produce knowledge. But the meaning/knowledge is in both cases something other than the machine/sign itself. Both signs and machines are purposeful or teleological...