| echo: | |
|---|---|
| to: | |
| from: | |
| date: | |
| subject: | Re: Is `LIFE` the result |
Robert Karl Stonjek wrote in message
news:b6us2p$1ark$1@darwin.ediacara.org...
> TIM:
> So Shannon is talking about thermodynamic entropy in his paper
> "A Mathematical Theory of Communication" found at
> http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf
> You could try reading the following webpages at
> http://www.panspermia.com/seconlaw.htm, which are not necessarily
> accurate but do show that scientists use the word "entropy" to refer
> to a number of distinct concepts that are not equivalent.
>
> RKS:
> The essence of entropy lies in its statistical nature. This was first
> discovered in thermodynamics, where heat was already known to behave
> statistically: the general flow is from hot to cold, but some heat
> always passes the "wrong" way (from cold to hot), though nowhere near
> as much as in the other direction.
>
> There are plenty of errors in the web page you mention. It is not
> accurate to say that "entropy never falls in a closed system," because
> at equilibrium entropy rises and falls; only the average entropy
> remains the same. Just as the average entropy cannot rise, it also
> cannot fall. So the accurate statement is: "The average entropy of a
> closed system can neither rise nor fall." That is because heat can
> neither be gained nor lost.
>
> He also mentions the Feynman Lectures on Physics and makes errors
> there as well. The two chapters are separate, as he mentions: there is
> section 44-6, "Entropy," and section 46-5, "Order and Entropy." What
> is not mentioned is section 46-4, "Irreversibility," where Feynman
> starts off with thermodynamic entropy, considers the molecular
> version, then switches from fast and slow atoms (corresponding to
> heat) to imaginary black and white atoms. He shows that one can move
> from "thermodynamic" entropy to "logical" entropy without having to
> change the theory at all, because the underlying entropy is the same.
>
> Statistical mechanics is the discipline Feynman uses to bridge the
> gap, as I mentioned before.
>
> --
> Kind Regards,
> Robert Karl Stonjek.
>
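Robert's two claims -- that at equilibrium the entropy of a closed system fluctuates about a constant average, and that thermodynamic and logical entropy are the same statistical quantity -- can both be seen in a toy simulation. The following Python sketch is my own illustration, not from either post: it runs an Ehrenfest urn model (particles hopping between two halves of a box) and counts entropy as log2 of the number of microstates.

```python
import math
import random

def mixing_entropy(k, n):
    """Entropy (in bits) of having k of n labelled particles in the left
    half, counted as the log of the number of microstates: log2 C(n, k)."""
    return math.log2(math.comb(n, k))

def ehrenfest(n=100, steps=5000, seed=1):
    """Ehrenfest urn model: each step, one randomly chosen particle hops
    to the other half. Returns the trace of instantaneous entropy."""
    rng = random.Random(seed)
    left = n  # start fully sorted: all particles on the left, S = 0
    trace = []
    for _ in range(steps):
        if rng.random() < left / n:
            left -= 1   # the chosen particle was on the left; it hops right
        else:
            left += 1   # the chosen particle was on the right; it hops left
        trace.append(mixing_entropy(left, n))
    return trace

trace = ehrenfest()
s_max = mixing_entropy(50, 100)       # maximum possible entropy
late = trace[len(trace) // 2:]        # second half: near equilibrium
print(f"max S = {s_max:.1f} bits, equilibrium mean = {sum(late)/len(late):.1f}")
```

At equilibrium the instantaneous entropy rises and falls from step to step, but its average sits just below the maximum of log2 C(100, 50), about 96.3 bits -- exactly the "average entropy can neither rise nor fall" statement. And per particle that maximum approaches Shannon's H(1/2) = 1 bit as n grows, whether the two kinds of particle are "fast and slow" (heat) or "black and white" (logic), which is Feynman's bridge.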
I am coming to think that much too much is made of "Life!" Lately
I have been trying out the concept that life is no different in
principle from "non-life"--the differences are only quantitative
(ranging from zero to very large values) in terms of certain
properties we take as somewhat sacred in life... perhaps a very
anthropomorphic view. If one takes a strongly deterministic view of
the universe, it becomes clear that this is so. If one assumes lesser
degrees of determinism it must still be so--though to lesser degrees.
Think about it! We (and all life forms) are no different in principle
(or importance) from the smallest rock twirling around the universe.
'Tis all in the subjective viewpoint. All is subject to the same laws
of entropy--and all other physical laws!
.....tonyC
---
þ RIMEGate(tm)/RGXPost V1.14 at BBSWORLD * Info@bbsworld.com
---
* RIMEGate(tm) V10.2 * RelayNet(tm) NNTP Gateway * MoonDog BBS
* RgateImp.MoonDog.BBS at 4/9/03 8:47:02 PM
* Origin: MoonDog BBS, Brooklyn, NY, 718 692-2498, 1:278/230

SOURCE: echomail via fidonet.ozzmosis.com