Subject: Re: Characterizing comple
Guy Hoelzer wrote or quoted:
> Tim Tyler at tim{at}tt1lock.org wrote on 8/8/04 3:22 PM:
> > Guy Hoelzer wrote or quoted:
> >> in article ceqv4u$2r7j$1{at}darwin.ediacara.org, Tim Tyler at tim{at}tt1lock.org
> >>> Guy Hoelzer wrote or quoted:
> >>>> I agree. Information is about pattern and structure (it can be static),
> >>>
> >>> This is not true of Shannon's information. Shannon's definition of
> >> information refers to whether the data is surprising - and makes no
> >>> mention of whether it refers to pattern or structure.
> >>
> >> I mostly agree. I often find Shannon information to be a useful measure,
> >> but I don't think it is the same as physical information for the reason you
> >> point out. However, it can be a useful measure because it ought to be
> >> positively correlated with the extent of physical information in general.
> >>
> >> I disagree with your interpretation of Shannon information as being about
> >> how surprising the data are. That is what statistics are for. There is no
> >> indication of rarity or inconsistency with prior beliefs in Shannon's
> >> information measure. It is, instead, a summary of data variance
> >> that places an upper limit on the amount of physical information
> >> that might be represented by the data.
> >
> > Shannon's information *is* conventionally a measure of its "surprise value".
> >
> > E.g.:
> >
> > ``The intuitive feeling that the greatness of an equation reflects the
> > element of surprise receives a kind of confirmation in Shannon's
> > equation I = -p log2 p, which forms the basis of information theory.
> > Igor Aleksander describes the message it conveys as follows:
> >
> > ...[T]he amount of information I depends on the surprise that the
> > message holds. This is because the mathematical way of expressing
> > surprise is as a probability p; the less probable an event is, the
> > more surprising it is and the more information it conveys.''
> >
> > http://americanscientist.org/template/BookReviewTypeDetail/assetid/21209
> > [I hope this long URL survived the s.b.e URL mangler!]
> >
> > ``Shannon sees things a little differently. He defines information as a
> > measure of uncertainty as to the next message to be received in the
> > communication or messaging event. The higher the uncertainty or
> > surprise, the greater the information, and the greater the entropy.''
> >
> > - http://www.mgtaylor.com/mgtaylor/jotm/winter97/infotheory.htm
> >
> > Inconsistency with prior knowledge is *exactly* what Shannon's
> > information metric is all about.
> >
> > Information is *totally* subjective in Shannon's model. A message
> > that one agent finds informative may contain no information if it is
> > encountered by another agent.
> >
> > Shannon's information is most certainly *not* an upper limit on the amount
> > of physical information that might be represented by some specified data.
> >
> > That might be termed "theoretical information capacity" or "data storage
> > size".
>
> Well, one of us should be able to learn something here. My understanding of
> Shannon information is clearly at odds with yours, and with the (IMHO
> misleading) quotes you provided. If you are correct that Shannon
> information has anything whatsoever to do with subjective interpretation or
> the "surprise" factor, then you ought to be able to show me how the
> observer's viewpoint is represented in Shannon's equation. It appears to be
> utterly absent to me, and thus would play no role at all in the meaning of
> Shannon information.
The subjective element is in the probability.
Different agents will typically have different estimates of the
probability of different symbols arising in a stream - based on
factors such as their knowledge of the history of the stream
and their understanding of the source that's producing it.
For example, an agent with a complete understanding of the source
would be able to predict each symbol with a high degree of
certainty. His estimate of the probability of each symbol arising
will vary at each step - and will be close to either 1.0 or 0.0 for
each symbol. Witnessing the stream will give this agent little or
no information.
A different agent without such knowledge might make very different
estimates of the probabilities - and witnessing the stream will
provide that agent with much more information.
In general, different agents are likely to produce different
figures for the quantity of information carried by a message -
since their values for the probability of each symbol occurring
will differ, those estimates being based on their knowledge
of the stream.
That knowledge can vary dramatically between agents - from complete
understanding of the source, to complete ignorance of it.
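
To make that concrete, here is a small numerical sketch (my own
illustration, not anything from Shannon's papers - the two "agents"
and their probability figures are made up) of how different
probability models yield different information quantities for the
same stream, using per-symbol surprisal -log2(p) and its expectation:

```python
import math

def surprisal(p):
    # Information (in bits) an agent gains on witnessing a symbol
    # to which it had assigned probability p.
    return -math.log2(p)

def entropy(probs):
    # Expected information per symbol under an agent's model -
    # the average surprisal, weighted by the model's probabilities.
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

# Hypothetical two-symbol stream. An agent who understands the
# source assigns each next symbol a probability near 1.0 or 0.0:
well_informed = {'a': 0.99, 'b': 0.01}

# An agent ignorant of the source treats the symbols as equiprobable:
ignorant = {'a': 0.5, 'b': 0.5}

print(entropy(well_informed))  # ~0.08 bits/symbol - little information
print(entropy(ignorant))       # 1.0 bits/symbol - the maximum for 2 symbols
print(surprisal(0.01))         # ~6.64 bits - a rare symbol is very surprising
```

Same stream, different observers, different information figures -
because the p in the formula is the observer's estimate, not a
property of the data alone.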
--
__________
|im |yler http://timtyler.org/ tim{at}tt1lock.org Remove lock to reply.
---
þ RIMEGate(tm)/RGXPost V1.14 at BBSWORLD * Info{at}bbsworld.com
---
* RIMEGate(tm) V10.2 * RelayNet(tm) NNTP Gateway * MoonDog BBS
* RgateImp.MoonDog.BBS at 8/10/04 5:34:32 PM
* Origin: MoonDog BBS, Brooklyn,NY, 718 692-2498, 1:278/230
SOURCE: echomail via fidonet.ozzmosis.com