echo: evolution
to: All
from: Tim Tyler
date: 2004-08-20 13:06:00
subject: Re: Dawkins gives incorre

CurtAdams  wrote or quoted:
> tim{at}tt1lock.org
> 
> >This bit:
> 
> >``Mutation is not an increase in true information content, rather the
> >  reverse, for mutation, in the Shannon analogy, contributes to increasing
> >  the prior uncertainty.''
> 
> >...is not correct.  Mutation typically *increases* the information in the
> >genome, by increasing its surprise value.
> 
> I'm with Wirt Atmar on this one.  I agree that Dawkins seems to have
> tripped up in the application of Shannon's definition of "information".
> The problem is that Shannon's definition of "information", while a
> mighty handy and useful concept, isn't close to the common-sense
> meaning of "information".  "Data" would be a far better term.
> 
> Consider a video of pure random white noise vs. a video of a movie.  The movie
> will be highly compressible, and by the Shannon definition contains less
> "information".
> But ask almost anybody how much information they contain, and they'll say the
> movie has some and the white noise none. [...]

I agree.  The technical term and common usage are not in very good agreement.
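
For concreteness, here's a quick sketch of the Shannon sense of
"surprise" (Python is just my own choice of illustration here, not
anything from the thread): per-symbol surprisal is -log2(p), so the
rarer the symbol, the more Shannon information it carries.

  import math

  def surprisal_bits(p):
      """Shannon self-information, in bits, of an event with probability p."""
      return -math.log2(p)

  # A toy receiver that expects 'A' nearly all the time:
  expected = {'A': 0.97, 'C': 0.01, 'G': 0.01, 'T': 0.01}
  for base, p in expected.items():
      print(base, round(surprisal_bits(p), 2), "bits")
  # 'A' carries ~0.04 bits; each rare base carries ~6.64 bits.
  # A mutation away from the expected symbol *raises* surprisal.

That's why uniform noise maximises Shannon information - every symbol
is maximally unexpected - and why the term jars with common usage.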

> Saying the white noise has more information is nonsense in the common
> sense of the word.  "Data" is also vague, but there's a commonsense
> distinction between "data" and "information" which corresponds
> passably to the distinction between Shannon-sense "information" and
> common-sense information.  Shannon-sense "information" is a reasonable
> formalization of common-sense "data" by this interpretation.

If someone asks "how much" data there is, I'm normally inclined to answer
with the storage capacity of the medium required to represent the data
without compression.

I think this is /usually/ what they want to know - and it gives an
objective answer.  Otherwise - for example - if you get into Shannon 
information, you have to get into what sort of data they are expecting, 
what compression technology they have available - and things become more 
messy.

So I would say:

a) 000000000000000000000000000000000000000000000000000000000000000000000
b) 10010101010100101010011101010010100010101001010101011010100001010101

a) has more data than b) - but b) has greater complexity and
higher information content (assuming an observer with fairly
simple expectations of the data).
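
In compressor terms (a quick Python sketch of my own - zlib standing
in for whatever compression technology the observer has to hand):

  import zlib

  a = "000000000000000000000000000000000000000000000000000000000000000000000"
  b = "10010101010100101010011101010010100010101001010101011010100001010101"

  for name, s in (("a", a), ("b", b)):
      raw = len(s.encode())
      packed = len(zlib.compress(s.encode(), 9))
      print(name, raw, "bytes raw ->", packed, "bytes compressed")
  # a) collapses to a few bytes of run-length; b) stays much
  # closer to its raw size.

Swap in a different compressor and the exact numbers change - which is
the "what compression technology do they have" mess mentioned above.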

> If you want a mathematical formalization of common-sense "information",
> Bayesian/likelihood "support" is much closer.  "Support" tells you to
> what extent a given message is more compatible with one model than
> another.  Messages equally compatible with all models, ie noise, carry
> no "support", and hence no information.  Messages exclusively
> compatible with only one model carry a lot of information, even if
> quite short (say, a measurement "proving" a scientific theory).  But
> the terms are already out there and I'm not sure what's to be done.

If you are modelling the contents of the message (based on your
current knowledge), then this sounds a lot like Shannon information.
If you are modelling something else, then it introduces a third variable 
into the equation - information content then depends on the message, your 
current knowledge - *and* on what you are modelling (or on what models 
you are considering).

There are some pros and cons to doing this.
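
Here's the "support" arithmetic as I read it (a toy Python sketch;
both models are made up for illustration): score a message by its
log-likelihood ratio between two models, and a message equally
compatible with both scores zero.

  import math

  def support_bits(message, model_a, model_b):
      """Log-likelihood ratio, in bits: positive favours model_a,
      negative favours model_b, zero supports neither."""
      la = sum(math.log2(model_a[c]) for c in message)
      lb = sum(math.log2(model_b[c]) for c in message)
      return la - lb

  mostly_zeros = {'0': 0.9, '1': 0.1}
  mostly_ones  = {'0': 0.1, '1': 0.9}

  print(support_bits("000000", mostly_zeros, mostly_ones))  # ~19 bits
  print(support_bits("010101", mostly_zeros, mostly_ones))  # ~0 - "noise"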

FWIW, there's been another attempt to deal with the issue that 
(rather counter-intuitively) random noise has a high information content.

This is called "structural complexity", as popularised by Ben
Goertzel in his (online) book "The Evolving Mind" (Chapter 1).

http://www.goertzel.org/books/mind/contents.html

> If you consider a likelihood definition of "information", Dawkins'
> statement makes perfect sense.

Dawkins spends much of his essay talking about Shannon information.

I think he's using Shannon's definition of information throughout.
-- 
__________
 |im |yler  http://timtyler.org/  tim{at}tt1lock.org  Remove lock to reply.
