A Space & astronomy forum. SpaceBanter.com


WTF Is Going On Here?!



 
 
  #11  
Old May 29th 09, 07:56 PM posted to sci.physics,sci.astro,sci.math,sci.crypt,comp.theory
Martin Michael Musatov
external usenet poster
 
Posts: 2
Default WTF Is Going On Here?!



Daryl McCullough wrote:
Bluuuue Rajah says...

[about definitions of entropy]

The one from my statistical mechanics text (Reif) says S=k*ln(Omega),
where Omega is the density of states. The one from my Information
Theory book (Reza) says S=-Sum{P(i)*ln[P(i)]}, where P(i) are
probabilities of the various different system states.


In classical statistical mechanics, you partition phase space
into tiny little hypercubes with the same volume, v_0. This makes
the state space discrete (two states are considered the same if
they are in the same tiny hypercube). The condition of thermal
equilibrium at constant energy E (with energy measured to accuracy
delta-E) is that all states with energy between E and E+delta-E
are equally likely. The number of such states is given by
N = Omega(E) delta-E/v_0. If all states are equally likely,
then for each allowed state i, P(i) = 1/N. So the information-theoretic
entropy gives:

S = - sum P(i) log(P(i)) = - sum P(i) log(1/N)
= log(N)
= log (Omega delta-E/v_0)
= log(Omega) + log(delta-E/v_0)

So the information-theoretic entropy agrees with the
statistical mechanical entropy up to the additive
constant log(delta-E/v_0). Typically only differences
of entropy are measurable.

(There is a matter of the constant k, but that's just a units
thing. You can choose units to make k whatever you like.)
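The derivation above can be checked numerically. Here is a minimal Python sketch (not from the thread): it computes the information-theoretic entropy S = -sum P(i) ln P(i) for a uniform distribution over N microstates and confirms it reduces to ln(N), which is the statistical-mechanical entropy up to the additive constant discussed above.

```python
import math

def shannon_entropy(probs):
    """Information-theoretic entropy S = -sum_i P(i) * ln(P(i))."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# N accessible microstates, all equally likely (thermal equilibrium
# at fixed energy), so P(i) = 1/N for each state.
N = 1000
uniform = [1.0 / N] * N

S = shannon_entropy(uniform)
# For a uniform distribution the sum collapses to ln(N).
assert math.isclose(S, math.log(N))
```

Any non-uniform distribution over the same N states gives a strictly smaller entropy, which is why the equal-likelihood condition characterizes equilibrium.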

--
Daryl McCullough
Ithaca, NY


Web results 1 - 10 of about 17,000 for "proof entropy PN=P" (0.30 seconds):

1. Minimizing the relative entropy in a face
http://www.springerlink.com/index/Q1248X470360258G.pdf
2. Information Theory and Martingales
http://www.stat.yale.edu/~arb4/publi...maringales.pdf
3. A. Vourdas, C. Bendjaballah, entropy for quantum states (after von Neumann)
http://pm1.bu.edu/~tt/qcl/pdf/ohya___m1993777b7715.pdf
4. Journal of Mathematical Analysis and Applications: The stability ...
http://linkinghub.elsevier.com/retrieve/pii/S0022247X08005362
5. Citebase - Relative Entropy: Free Energy Associated with ...
http://www.citebase.org/abstract%3Fidentifier%3Doai%253AarXiv.org%253Amath-ph%252F0007010%26action%3Dcite****s%26cite****s%3Dcites
6. Random Coding Strategies for Minimum Entropy
http://ieeexplore.ieee.org/iel5/18/22685/01055416.pdf
7. The Semicircle Law, Free Random Variables and Entropy
8. Spectrum of multidimensional dynamical systems with positive entropy
http://matwbn.icm.edu.pl/ksiazki/sm/sm108/sm10817.pdf
9. Limit Sets of S-Unimodal Maps with Zero Entropy
http://projecteuclid.org/DPubS/Repository/1.0/Disseminate%3Fhandle%3Deuclid.cmp/1104159401%26view%3Dbody%26content-ty
10. Information dynamics and open systems: classical and quantum approach

--
Martin Musatov
Los Angeles, CA
  #12  
Old May 30th 09, 07:01 AM posted to sci.physics,sci.astro,sci.math,sci.crypt,comp.theory
BradGuth
external usenet poster
 
Posts: 21,544
Default WTF Is Going On Here?!

On May 28, 10:09 am, Sanny wrote:
WTF is going on here? Why is Reif's definition of entropy different
from the other two? Entropy is entropy, right? Shouldn't they all be
the same equation?


There are many types of Entropies.

1. Chemical Energy Entropy

2. Gas Laws Entropy

3. Matter Entropy [Thermodynamics]

Entropy is used in many places and has a different meaning in each.

It's complex and confusing, but you have to memorize those equations by
heart to pass the exams.

Bye
Sanny


Is memory the same thing as intelligence? If so, I have a parrot
that's smarter than Einstein.

~ BG
  #13  
Old May 30th 09, 08:15 AM posted to sci.physics,sci.astro,sci.math,sci.crypt,comp.theory
Sanny[_2_]
external usenet poster
 
Posts: 11
Default WTF Is Going On Here?!

Entropy is used in many places and has a different meaning in each.

It's complex and confusing, but you have to memorize those equations by
heart to pass the exams.


Is memory the same thing as intelligence? If so, I have a parrot
that's smarter than Einstein.


Yes, memory is essential for any intelligence.

Can you imagine a computer without a hard disk?

A person with Alzheimer's has a bad memory, so his intelligence fails.

Say you are asked to find the area of a circle and you forget the
equation for the area, pi*r^2. Then you will never be able to find the
area of the circle.

A large memory is needed for more intelligence.

Intelligence works by analyzing data. To store data you need memory.

Ants have little memory, so they have low intelligence.

Dogs have more memory, so they are more intelligent.

Humans have even more memory, so they are the most intelligent.

So without memory, intelligence cannot work.

Area of a circle?

Area of a square?

Pythagoras' theorem.

Newton's laws.

How to change gears in a car while driving.

You have to memorize them, or else you cannot use them.

Bye
Sanny

=============================================
Play Chess: http://www.GetClub.com/Chess.html
A Decision-making Game

Enjoy & Chat: http://www.GetClub.com
Talk with Computer

Earn $1600/ month: http://www.getclub.com/salesjob.html
Get $200-$400 per sale.

Business Planning Software: http://www.softtanks.com/
3 Versions of Software
1. Small Version: for Owner of Small Shops/ Companies
2. Medium Version: For: Managers working in Small/ Medium sized
Companies or CEO/ Business Owners.
3. Large Version: For: Executives in Large Companies/ Business Owners
=============================================




  #14  
Old May 30th 09, 04:06 PM posted to sci.physics,sci.astro,sci.math,sci.crypt,comp.theory
Benj
external usenet poster
 
Posts: 267
Default WTF Is Going On Here?!

On May 30, 3:15 am, Sanny wrote:

Is memory the same thing as intelligence? If so, I have a parrot
that's smarter than Einstein.


Yes memory is essential for any intelligence.

[snip]

You have to memorize them else you cannot use them.


Sanny, you are an idiot. Please go memorize the entropy formulas so you
can pass your exams. Eventually they may give you a piece of paper. It
is that PAPER that proves you are intelligent! Nothing more. Keep
kissing ass to get that paper!

 





Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 SpaceBanter.com.
The comments are property of their posters.