echo: evolution
to: All
from: Tim Tyler
date: 2004-09-14 21:52:00
subject: Re: Dawkins gives incorre

Guy Hoelzer  wrote or quoted:
> in article chvng2$2hqs$1{at}darwin.ediacara.org, Tim Tyler at tim{at}tt1lock.org:
> > Guy Hoelzer  wrote or quoted:
> >> in article chsg65$1hqg$1{at}darwin.ediacara.org, Tim Tyler at tim{at}tt1lock.org:
> >>> Guy Hoelzer  wrote or quoted:

> >> Are you arguing that treating p_i as frequency is almost never done, 
> >> or that this practice has not increased in frequency?  Or are you 
> >> just arguing that you don't think it has become sufficiently common 
> >> to call it a transition?
> > 
> > p_i is /always/ the probability of the i'th symbol arising.
> > 
> > Sometimes the probabilities are determined completely by symbol frequencies
> > - but the p_i's are never frequencies.
> 
> If they are "determined completely by symbol frequencies" then they are
> frequencies.

A frequency is normally a measurement of the number of times that a 
repeated event occurs per unit time.
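
To make the distinction concrete, here is a minimal Python sketch (my
own illustration - the names are not from any particular source) of
estimating the p_i from symbol counts.  The resulting p_i are
dimensionless probabilities that sum to 1.0 - not frequencies in the
events-per-unit-time sense:

    # Estimate p_i as relative frequencies of symbols, then compute
    # Shannon entropy. The p_i are dimensionless and sum to 1.0.
    from collections import Counter
    from math import log2

    def entropy(symbols):
        counts = Counter(symbols)
        n = len(symbols)
        p = [c / n for c in counts.values()]    # p_i: estimated probabilities
        assert abs(sum(p) - 1.0) < 1e-9         # the p_i sum to 1.0
        return -sum(pi * log2(pi) for pi in p)  # entropy in bits per symbol

    print(entropy("AAABBC"))  # ~1.459 bits per symbol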

> I must say I am quite surprised at your continuing insistence that this
> model of information is unlike anything in the minds of scientists publishing
> in this area.  Are you unaware of the current debate over the meanings of
> both entropy and information in the context of order/disorder (dispersion)?

That question seems too vague to answer.  People debate all kinds of
stuff all the time.  AFAIK, the meanings of the terms "entropy" and 
"information" are not especially controversial at the moment.

> How do you explain the information-theoretic methods of analysis, 
> such as the Akaike Information Criterion, that have been growing 
> fast in application?  It is fundamental to these methods that they 
> yield precisely the same result in the hands of every scientist, so 
> that they are repeatable and verifiable.  The role of the perceiver, 
> which was Shannon's initial concern, has been dropped from information 
> theory by many.

I'm not sure about the Akaike Information Criterion, but - as far as
I can tell - it escapes observer-dependence by completely specifying
a particular hypothetical observer (its model) and then asking how 
effective that observer is at predicting the data.

In other words, the term "information" in its title appears to refer
not to the information gained by someone measuring its value - but to
the information that can be expected to be gained by a completely-
specified hypothetical observer witnessing the data stream.
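
As a minimal sketch - assuming the standard formula AIC = 2k - 2 ln(L),
with k the number of fitted parameters and L the maximised likelihood -
the calculation is completely mechanical, so every scientist applying
it to the same model and data gets the same number (the log-likelihood
values below are made up purely for illustration):

    def aic(log_likelihood, num_params):
        # AIC = 2k - 2 ln(L); lower scores indicate better models.
        return 2 * num_params - 2 * log_likelihood

    # Two hypothetical models of the same data set:
    print(aic(-120.3, 4))  # model A: 248.6
    print(aic(-118.9, 7))  # model B: 251.8 - the extra parameters don't pay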

> > They always add up to 1.0 - like probabilities do.
> 
> Like frequencies always do.

Frequencies are usually measured in hertz (Hz) - and never add up to a 
dimensionless quantity such as 1.0.

Indeed, adding frequencies together is usually meaningless: superposing 
a 1 Hz signal and a 2 Hz signal does not produce a 3 Hz signal.

Have you encountered Chaitin's algorithmic information theory?

It doesn't appear to be what you are talking about - but it shares
the element of observer-independence (though it tends to become
language-dependent in the process).
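
A crude way to see both points at once: true algorithmic information
content is uncomputable, but a general-purpose compressor gives an
upper bound on description length - and the bound you get depends on
which compressor (i.e. which description language) you pick.  A toy
Python illustration, using whatever the standard-library compressors
happen to produce:

    # Approximate description length with two different compressors.
    # The figures differ: the measure is observer-independent but
    # language-dependent.
    import bz2, zlib

    data = b"ABABABABABABABABABABABABABABAB"
    print(len(zlib.compress(data)))  # one description length...
    print(len(bz2.compress(data)))   # ...and a different one, same string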
-- 
__________
 |im |yler  http://timtyler.org/  tim{at}tt1lock.org  Remove lock to reply.