echo: evolution
to: All
from: Wirt Atmar
date: 2004-08-16 13:26:00
subject: Re: Dawkins gives incorre

Tim writes:

>I observe that there are some simple factual errors in:
>
>''The Information Challenge''
>
>
>http://www.world-of-dawkins.com/Dawkins/Work/Articles/1998-12-04infochallange.s$
>
>The above will likely trigger the s.b.e long URL bug - so:
>
>http://tinyurl.com/4eqbh
>
>This bit:
>
>``Mutation is not an increase in true information content, rather the
>  reverse, for mutation, in the Shannon analogy, contributes to increasing
>  the prior uncertainty.''
>
>...is not correct.  Mutation typically *increases* the information in the 
>genome, by increasing its surprise value.
>
>Similarly this bit:
>
>``natural selection is by definition a process whereby information is fed
>  into the gene pool of the next generation.''
>
>...is also not correct - natural selection usually /eliminates/ variation,
>and thus /destroys/ information.
>
>``But now we come to natural selection, which reduces the "prior 
>  uncertainty" and therefore, in Shannon's sense, contributes
>  information to the gene pool.''
>
>...and...
>
>``natural selection feeds information into gene pools''
>
>...are also not correct - for the same reason: natural
>selection usually eliminates information from gene pools -
>by destroying individuals that carry it.
>
>This area is a critical point in the essay.  Dawkins apparently gives
>completely the wrong answer to the question his essay is addressing.
>
>Dawkins' stated position appears to be not remotely defensible -
>it is completely mistaken - he totally reverses the roles of
>mutation and natural selection, as far as their effect on
>information content of genomes is concerned.
>
>It appears that St Richard is fallible after all ;-)

I'm not so sure that Shannon would think that Dawkins is all that wrong.
Shannon saw the "surprise" measure of "information" that you're mentioning
in an inverse manner, as a departure from the monotonous signal. It's the
"e" in this signal that carries the "information," not the "z's":

     zzzzzzzzzzzzzzzzzzzzzzzzzezzzzzzzzzzzzzzzz

The constant repetition of an infinite string of "z's" tells you nothing
about the world. However, when discussed in this way, both the "z's" and
"e's" were implied to have signal value (i.e., they "meant" something).
The symbols are not meant to represent meaningless noise.

But more importantly to the discussion at hand, there are two definitions of
information, and they should not be confused. The first is the more common:
"information is that quality that encodes behavior," and is the definition
that should always take precedence. No better illustration of this definition
exists than Richard Lewontin’s diagram of the evolution of a population
through a single generation as a series of transformational mappings that
appears in his 1974 book, "The Genetic Basis of Evolutionary
Change," where he
defines two state spaces, one a behavioral (phenotypic) space, and the other a
coding (genotypic) state space, bound together by a series of mappings.

As I read the Dawkins quotes above, it seems clear that he is using the term
"information" in the standard manner.

Shannon defined "information" in a second, nonobvious way (which could be
castigated as a poor choice of words), as the metric:

     I = -log(p_i)

in direct imitation of Ludwig Boltzmann’s earlier definition of entropy:

     S = k log W

where p_i is the probability of occurrence of the ith symbol in the symbol
set, S is entropy, k is Boltzmann’s constant, and W the probability of a
collection of system states. By defining his "information metric" in this
manner, what Shannon was explicitly defining was the unexpectedness of the
ith symbol, or the level of surprise that accompanies its appearance, but
nothing else.
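
For concreteness, here is a small sketch of that metric (in Python, with
made-up probabilities for the z/e signal above, not taken from the essay):
the common symbol carries almost no "information" in Shannon's sense, while
the rare one carries a great deal.

     import math

     def surprisal_bits(p):
         # Shannon's metric I = -log2(p), in bits, for a symbol of probability p
         return -math.log2(p)

     # Hypothetical probabilities for the z/e signal above (illustrative only)
     p_z = 41 / 42   # the monotonous, expected symbol
     p_e = 1 / 42    # the rare, surprising symbol

     print(f"I(z) = {surprisal_bits(p_z):.3f} bits")   # ~0.035 bits
     print(f"I(e) = {surprisal_bits(p_e):.3f} bits")   # ~5.4 bits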

When taken out of context, the mathematics of the equation draws no distinction
as to whether the ith symbol is noise or signal, but I feel fairly strongly
that Shannon originally meant the symbol to imply signal, not noise, if for no
other reason than these thoughts were the basis of his nearly simultaneous work
on compression algorithms, which operate by removing all of the redundancy from
a transmission, allowing the line above to be reduced to something like:

     30z1e16z

(which is the output of a simple run-length encoder, and which is not
necessarily the most efficient compression scheme).
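
Purely as an illustration (the essay does not specify any particular
encoder), a toy run-length encoder in Python along those lines might look
like this:

     def run_length_encode(s):
         # Collapse each run of a repeated character into "<count><char>"
         if not s:
             return ""
         out = []
         current, count = s[0], 1
         for ch in s[1:]:
             if ch == current:
                 count += 1
             else:
                 out.append(f"{count}{current}")
                 current, count = ch, 1
         out.append(f"{count}{current}")
         return "".join(out)

     # Illustrative counts, echoing the z...e...z line above
     signal = "z" * 25 + "e" + "z" * 16
     print(run_length_encode(signal))   # prints "25z1e16z"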

When all of the redundancy has been sucked out of a completely meaningful
signal, in a lossless manner, the compression algorithm is said to have
achieved "maximum entropy," but that term is not meant to imply that any
part of that "entropy" is necessarily the result of meaningless noise.
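
To put a rough number on that, here is a sketch (again in Python, treating
the observed symbol frequencies of the z/e line as the probabilities) of the
average Shannon entropy per symbol. The highly redundant signal comes in far
below the 1 bit per symbol a random z/e stream would carry, and that gap is
exactly the redundancy a lossless compressor removes:

     import math
     from collections import Counter

     def entropy_bits_per_symbol(s):
         # H = -sum_i p_i * log2(p_i), using observed frequencies as the p_i
         counts = Counter(s)
         n = len(s)
         return -sum((c / n) * math.log2(c / n) for c in counts.values())

     signal = "z" * 25 + "e" + "z" * 16   # the redundant z...e...z line
     print(f"{entropy_bits_per_symbol(signal):.3f} bits/symbol")   # ~0.162 bits/symbol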

Mutation (meaningless error) does increase the Shannonian entropy of an
encoding program, but it also quite obviously decreases the "true
information content" of the program, to use Dawkins' phrase above.

Wirt Atmar
