| echo: | |
|---|---|
| to: | |
| from: | |
| date: | |
| subject: | Re: Characterizing comple |
Guy Hoelzer wrote or quoted:
> Hi Tim,
>
> in article ceqv4u$2r7j$1{at}darwin.ediacara.org, Tim Tyler at tim{at}tt1lock.org
> wrote on 8/4/04 8:25 AM:
>
> > Guy Hoelzer wrote or quoted:
> >> dkomo at dkomo871{at}comcast.net wrote on 8/2/04 10:00 AM:
> >>> Olivier d'ANHOFFRE wrote:
> >
> >>>> We can define "complexity" as "organized information".
> >>>
> >>> The term "organized information" seems redundant. I can't envision
> >>> "disorganized information." If you can, please give an example.
> >>> Otherwise, I agree that complexity and information are correlated, but
> >>> have different meanings.
> >>
> >> I agree. Information is about pattern and structure (it can be static),
> >
> > This is not true of Shannon's information. Shannon's definition of
> > information refers to whether the data is surprising - and makes no
> > mention of whether it refers to pattern or structure.
>
> I mostly agree. I often find Shannon information to be a useful measure,
> but I don't think it is the same as physical information for the reason you
> point out. However, it can be a useful measure because it ought to be
> positively correlated with the extent of physical information in general.
>
> I disagree with your interpretation of Shannon information as being about
> how surprising the data are. That is what statistics are for. There is no
> indication of rarity or inconsistency with prior beliefs in Shannon's
> information measure. It is, instead, a summary of data variance that places
> an upper limit on the amount of physical information that might be
> represented by the data.
Shannon's information *is* conventionally a measure of a message's
"surprise value".
E.g.:
``The intuitive feeling that the greatness of an equation reflects the
element of surprise receives a kind of confirmation in Shannon's
equation I = -p log2 p, which forms the basis of information theory.
Igor Aleksander describes the message it conveys as follows:
...[T]he amount of information I depends on the surprise that the
message holds. This is because the mathematical way of expressing
surprise is as a probability p; the less probable an event is, the
more surprising it is and the more information it conveys.''
http://americanscientist.org/template/BookReviewTypeDetail/assetid/21209
[I hope this long URL survived the s.b.e URL mangler!]
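To make the "surprise" reading concrete, here is a minimal Python sketch
(my own illustration, not from either quoted source) of surprisal
I = -log2(p) and of Shannon entropy as the expected surprisal:

```python
import math

def surprisal(p):
    """Information content (in bits) of an event with probability p:
    I = -log2(p).  The rarer the event, the more bits it carries."""
    return -math.log2(p)

def entropy(dist):
    """Shannon entropy H = -sum(p * log2(p)) over a probability
    distribution -- the average surprisal per message."""
    return sum(-p * math.log2(p) for p in dist if p > 0)

# A certain event carries no information; a rare one carries a lot.
print(surprisal(1.0))    # 0.0 bits
print(surprisal(0.5))    # 1.0 bit
print(surprisal(0.125))  # 3.0 bits

# A fair coin is maximally surprising on average; a biased one is not.
print(entropy([0.5, 0.5]))  # 1.0 bit per toss
print(entropy([0.9, 0.1]))  # about 0.469 bits per toss
```

Note the sign convention: since p <= 1, log2(p) is non-positive, so the
leading minus makes both quantities non-negative.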
``Shannon sees things a little differently. He defines information as a
measure of uncertainty as to the next message to be received in the
communication or messaging event. The higher the uncertainty or
surprise, the greater the information, and the greater the entropy.''
- http://www.mgtaylor.com/mgtaylor/jotm/winter97/infotheory.htm
Inconsistency with prior knowledge is *exactly* what Shannon's
information metric is all about.
Information is *totally* subjective in Shannon's model. A message
that one agent finds informative may contain no information if it is
encountered by another agent.
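That receiver-dependence falls straight out of the formula: the same
message carries different numbers of bits under different probability
models. A toy sketch (the agents and probabilities are made up for
illustration):

```python
import math

def surprisal(p):
    # Bits of information in a message the receiver assigns probability p.
    return -math.log2(p)

# Hypothetical: the same message, "it snowed today", reaches two agents
# whose prior expectations differ (probabilities invented for the example).
p_alaskan = 0.5    # an agent who half-expects snow anyway
p_saharan = 0.001  # an agent who considers snow nearly impossible

print(surprisal(p_alaskan))  # 1.0 bit -- mildly informative
print(surprisal(p_saharan))  # ~9.97 bits -- highly informative
```

Same message, same data; the information content differs because the
probabilities live in the receiver's model, not in the message itself.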
Shannon's information is most certainly *not* an upper limit on the amount
of physical information that might be represented by some specified data.
That might be termed "theoretical information capacity" or "data storage
size".
--
__________
|im |yler http://timtyler.org/ tim{at}tt1lock.org Remove lock to reply.
---
þ RIMEGate(tm)/RGXPost V1.14 at BBSWORLD * Info{at}bbsworld.com
---
* RIMEGate(tm) V10.2 * RelayNet(tm) NNTP Gateway * MoonDog BBS
* RgateImp.MoonDog.BBS at 8/8/04 9:40:51 PM
* Origin: MoonDog BBS, Brooklyn,NY, 718 692-2498, 1:278/230 (1:278/230)
SOURCE: echomail via fidonet.ozzmosis.com