Freethought & Rationalism Archive
04-23-2002, 10:42 AM | #11 |
Junior Member
Join Date: Apr 2002
Location: In your Imagination
Posts: 69
Hangover? Nasty... I always try to avoid it by drinking loads of water, and drinking vodka (fewer impurities).
Thanks a lot liquid, that is clearing it up, although a specific example certainly would be helpful (and what's a corollary?). 9% beer? Could you please inform me of a name so I could obtain a sample (approx. 20 bottles)*... *Don't worry, I'm in the UK, it's perfectly legal. [ April 23, 2002: Message edited by: Skepticwithachainsaw ]
04-23-2002, 12:03 PM | #12 |
Regular Member
Join Date: Sep 2000
Location: Pasadena, CA, USA
Posts: 455
Chainsaw: 1) Are the definitions of entropy all ways of looking at the same thing? For example, is the statistical explanation a direct consequence of the loss of energy available to form new potential states? If so, could you explain this (or point me in the right direction)? I think I've got a vague idea of how this works but I'd like something more concrete.
While all of the thermodynamic definitions of entropy are fundamentally equivalent, it is probably easier not to worry over that point just yet.

In the classical definition (S = Q/T, or dS = dQ/T), it's all about energy available to do work. So dQ measures the amount of energy that goes into the "heat reservoir" of the system, while the rest of the energy (if any), the part that is not in Q, has gone into some form of mechanical work. The guys who discovered this were engineers interested in the conversion of mechanical work into heat, specifically in the process of boring out cannon barrels.

In the statistical definition it's all about microstates versus macrostates. The energy or the temperature of a system is a "macrostate", which means it is a variable that makes sense only when applied to the system as a whole, and not necessarily to some arbitrary part of the system. A pressurized tank of gas has an energy associated with it. But there are lots of ways to distribute the gas molecules, each one with its own position and kinetic energy, so as to produce the same macrostate energy. If all you do is measure the energy of the gas tank (via pressure & temperature), you really have no idea which one of those many possible distributions the molecules are actually in. Each of the possible distributions is a "microstate", and each microstate has a probability associated with it.

In the special case where all of the possible microstates are equally probable, the entropy S is defined such that S = k*log(N), where N is the total number of possible microstates, each of which results in the same macrostate. In the more general case, where the microstates are not all equally probable, a more explicit sum is necessary: S = -k*sum{P(i)*log(P(i))}, where the sum runs over all possible microstates, and P(i) is the probability of the microstate indicated by the index counter "i". (Note the minus sign: each log(P(i)) is negative, so the sign keeps the entropy positive.)

Now note that the entropy is defined for the whole system, just like the temperature or the energy. Maybe the real microstate has everything all smoothed out, or maybe the real microstate has clumps in it. If both produce the same macrostate, then the issue is all probability: how likely is the real microstate, either as an individual microstate, or perhaps as a member of a class of similar microstates? Right away you see that the word "impossible" can never properly be applied to an entropy-based argument; only the word "improbable" will do. Most people colloquially equate "impossible" with "improbable enough to be practically impossible", but in that case you really need to get serious about showing explicitly what the probability is.

Now think about why the 2nd law specifies "isolated" systems. Entropy, like everything else, flows "downhill". If there is more entropy over there than there is over here, then entropy will tend to flow from over there to over here, until the "entropy landscape" is as flat as a pancake. That's when entropy stops flowing. But entropy & energy are coupled together, in the opposite sense, so that low entropy usually means high energy, and low energy means high entropy (remember that energy is defined as "the ability to do work", while classical entropy is associated with "the inability to do work"). So what the second law tells you is that the entropy of an isolated system always increases (for irreversible processes; we'll worry about reversibility later). If the entropy is at a maximum, then the energy must be at a minimum.
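Here is a minimal numerical sketch of those two statistical formulas in Python; the microstate count and probability values are made up purely for illustration, not taken from any physical system:

[code]
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

# Special case, all N microstates equally probable: S = k*log(N).
N = 10**6  # hypothetical number of microstates
S_boltzmann = k * math.log(N)

# General case: S = -k*sum{P(i)*log(P(i))} over all microstates.
# A made-up, non-uniform probability distribution over four microstates:
P = [0.4, 0.3, 0.2, 0.1]
S_gibbs = -k * sum(p * math.log(p) for p in P)

print(f"S (10^6 equal microstates): {S_boltzmann:.3e} J/K")
print(f"S (skewed 4-state example): {S_gibbs:.3e} J/K")

# Check: for a uniform distribution the general sum reduces to k*log(N).
uniform = [0.25] * 4
S_uniform = -k * sum(p * math.log(p) for p in uniform)
assert math.isclose(S_uniform, k * math.log(4))
[/code]

The final check is the point of the exercise: the general sum and the simple k*log(N) formula agree whenever the microstates are equally probable, which is why the two definitions are one and the same.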
The 2nd law is just another way of saying that all natural systems tend towards their lowest energy (highest entropy) state. When the confused stack of papers on your desk finally falls to the floor, it's not just seeking a lower energy state, but a higher entropy state too.

If the system in question is not "isolated", but rather is "open", that means that energy and entropy can flow freely across the boundary between the system and its surroundings. Energy and entropy can cross the boundary independently, and not necessarily as bound opposites. Liquid water freezes into ice because both energy and entropy can flow away (ice is colder and more "ordered" than the liquid).

A creationist will argue that "order" can't appear in the universe of a big bang cosmology because the 2nd law does not allow it; the entropy has to increase. But any small part of the universe is an open system, and both entropy and energy can flow freely across the arbitrary boundary. So when a cloud of cosmic stuff condenses into a galaxy cluster, or a galaxy, or a star, or a planet, it's just entropy & energy at work. The entropy expelled by such a process increases the entropy of the isolated universe by a larger amount than the decrease inside the condensation. The 2nd law is satisfied on a cosmic scale, while structure formation goes on.

It takes time for the energy & entropy of a system to smooth out the entropy landscape that I mentioned earlier; it takes time for any system to spread its entropy around. That time can be calculated, at least in round numbers, by looking at the basic physics involved in spreading entropy around in an expanding universe (I've put a rough numerical sketch at the end of this post). For instance, a star like our sun will sit around for maybe 10^10 years before it finally runs down. But a very small red dwarf star will take 10^14 years before it runs down. White dwarfs & neutron stars will contribute to the bumpy entropy landscape of the universe for as long as 10^1500 years or so. And black holes, along with their Hawking radiation, will keep the entropy landscape bumpy for a staggering 10^(10^76) years, even while the universe expands.

So the trick to the creationist's game is time. The universe, in a big bang cosmology, is only about 1.5 or 2 x 10^10 years old, hardly the first step in the 10^14-year red dwarf universe, and nothing compared to the time scale set by black holes. Creationists insist that the big bang can't be true, because the universe should have "wound down" by now. But when we replace the propaganda with the proper numbers, we see that we have plenty of time to go before the "heat death" of the universe (assuming that the concept of a "heat death" is valid, which is not obvious either). I write & talk too much, but hopefully that gets the big point across.

Chainsaw: 2) How does this relate to irreversible reactions? Is it that the entropy involved in a reaction/interaction always increases, or that it only increases for irreversible reactions?

Reversibility is a good thing to forget about. It is highly improbable; it's one of those things physicists use to define the "ideal world" that is only approximated by cold, hard reality. A truly reversible process will not increase (or decrease) the entropy of an isolated system. But you will have a hard time finding any real process that is truly reversible, so don't fuss over it too much.

Now, a short P.S. It is commonly believed that the universe is an "isolated" system, and it may well be. But it may well also not be.
If some of the esoteric ideas of string theory are true, such as the big bang being the symptom of two colliding 5-dimensional "branes", then that which we call "space-time" may well not be isolated at all. Cheers.
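P.P.S. For the promised rough numbers on the timescale point, here is a back-of-the-envelope sketch in Python. It works in log10(years), since quantities like 10^1500 overflow an ordinary floating-point number; all figures are the order-of-magnitude estimates quoted above.

[code]
import math

# Order-of-magnitude timescales from the post, as log10(years).
log_age_universe = math.log10(2e10)  # universe: ~1.5-2 x 10^10 yr old
log_sun          = 10.0              # sun-like star runs down: ~10^10 yr
log_red_dwarf    = 14.0              # small red dwarf: ~10^14 yr
log_degenerate   = 1500.0            # white dwarfs & neutron stars: ~10^1500 yr
log_black_hole   = 1e76              # black hole evaporation: ~10^(10^76) yr

# Fraction of the red dwarf era that has elapsed so far:
frac = 10 ** (log_age_universe - log_red_dwarf)
print(f"red dwarf era elapsed: {frac:.0e}")  # ~2e-04, i.e. about 0.02%
[/code]

Even against the red dwarfs alone, the universe is only about 0.02% of the way to "winding down", and the black hole timescale is so large that its logarithm won't fit in the margin.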
04-23-2002, 03:20 PM | #13 |
Regular Member
Join Date: Apr 2002
Location: USA
Posts: 153
[Not Worthy] !!!Wow!!! [Not Worthy]
I am glad to be here, and thought I'd mention it. Liquid & Tim: Could you recommend a good introductory to intermediate (2 semester) college text as well as any general books discussing/teaching the aforementioned subject matter?
04-24-2002, 07:19 AM | #14 |
Veteran Member
Join Date: Jan 2001
Location: UK
Posts: 1,440
Got to rush so will do example later.
The beer is Delirium Tremens. I live in Belgium some of the time, which has the best beers in the world (with the Czech Republic a close competitor). There are lots of beers here, and even normal beers are typically 1 to 2% stronger than those in the UK. We also don't have bitter or rubbish 'real' ales. The other special beers (as they are known) I would recommend are Duvel (12%) and Leffe (which isn't so strong but tastes v. good). The latter can be bought in most UK supermarkets. Oh yes, and wit beer... try chilled Hoegaarden on a hot summer day. And it isn't pronounced "Ho-garden", I'll tell you that much!

A corollary is simply an alternative, equivalent implication of a statement. For instance, if I say 'my dog is black and my cat is the opposite colour', a corollary would be 'my cat is white and my dog is the opposite colour'.

Second semester thermodynamics? Well, from an engineering point of view:

Applied Thermodynamics for Engineering Technologists, 5th Edition, Eastop & McConkey, Longman, ISBN 0582091934
Engineering Thermodynamics: Work & Heat Transfer, 4th Edition, Rogers & Mayhew, Longman, ISBN 0582045665

They address the basic theory, and then expand it into real-world application and implications. To go in the other direction, and look at the fairly obscure stuff, you are probably better off asking a pure physicist.
04-24-2002, 07:31 AM | #15 |
Regular Member
Join Date: Apr 2002
Location: USA
Posts: 153
Quote:
I would probably do better with an introductory physics class. I have some serious catching up to do... |
04-24-2002, 10:13 AM | #16 |
Veteran Member
Join Date: Jan 2001
Location: UK
Posts: 1,440
Well, those two books cover all the 'foundation' thermodynamics, from the top end of what you would learn in high school (if you learnt any) to some of the stuff in your third year of university. As mentioned, they have an engineering bias, but the difference doesn't really begin until you get into statistical thermodynamics and micro-thermodynamics.
Right, now for an example of equivalent corollaries. 2nd Law...

Kelvin-Planck Statement (KPS): It is impossible to construct an engine which, operating in a cycle, will produce no other effect than the extraction of heat from a single heat reservoir and the performance of an equivalent amount of work, i.e. 100% efficiency is impossible.

Corollary 1 (C1): Heat cannot pass spontaneously, i.e. without the assistance of an external agency, from a lower temperature body to a higher temperature body.

Showing why C1 can be proved in terms of the KPS... Consider a high temperature reservoir T1, a low temperature reservoir T2, and three situations:

T1 (hot) ==========================================
    ^              | Q2              | Q2-Q1
    | Q1           v                 v
 [pump]        [engine]-> W      [engine]-> W
    ^              |
    | Q1           v Q1
T2 (cold) =========================================
  (left)        (centre)          (right)

[In both the centre and right situations, Wnet = Q2 - Q1.]

In the left-hand situation, Q1 heat units are transferred spontaneously from the low-temp to the high-temp reservoir, in violation of Corollary 1 (incidentally, this is the Clausius statement). In the central situation, Q2 heat units are transferred from the high-temp reservoir to a heat engine, Q1 units being rejected to the low-temp sink. This results in a work output Wnet = Q2 - Q1 (assuming Q2 > Q1).

If the left and central situations are combined, the net heat absorbed from the low-temp reservoir is zero, while that absorbed from the high-temp reservoir is Q2 - Q1, resulting in a useful work output Wnet = Q2 - Q1. Therefore, the resultant situation (the one at the right) is one in which an engine produces no other effect than the extraction of heat from a single reservoir and the performance of the equivalent amount of work. The right situation thus contravenes the KPS, so the Clausius statement, C1, which was violated in the left situation, must be true.

In layman's terms, the two statements (which can be and are defined mathematically) say the same thing in a different way.

[ April 24, 2002: Message edited by: liquid ]
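And for the numerically inclined, a minimal bookkeeping sketch of the combination argument in Python. Q1 and Q2 are arbitrary illustrative values with Q2 > Q1, not measurements of anything:

[code]
# Sign convention: positive = heat drawn FROM that reservoir by the device.
Q1, Q2 = 30.0, 100.0  # arbitrary heat quantities, Q2 > Q1

# Left: hypothetical device moving Q1 from cold to hot (violates C1/Clausius).
left   = {"from_hot": -Q1, "from_cold": +Q1, "work_out": 0.0}
# Centre: ordinary heat engine, rejecting Q1 to the cold sink.
centre = {"from_hot": +Q2, "from_cold": -Q1, "work_out": Q2 - Q1}

# Right: run both at once and add up the flows.
combined = {key: left[key] + centre[key] for key in left}
print(combined)  # {'from_hot': 70.0, 'from_cold': 0.0, 'work_out': 70.0}

# Net exchange with the cold reservoir is zero: the composite device draws
# Q2-Q1 from a SINGLE reservoir and turns all of it into work -- exactly
# what the Kelvin-Planck statement forbids.
assert combined["from_cold"] == 0.0
assert combined["work_out"] == combined["from_hot"]
[/code]

Whatever values you pick for Q1 and Q2, the cold-reservoir term always cancels, which is the whole force of the proof.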
04-24-2002, 11:37 AM | #17 |
Regular Member
Join Date: Sep 2000
Location: Pasadena, CA, USA
Posts: 455
Chainsaw: Could you recommend a good introductory to intermediate (2 semester) college text as well as any general books discussing/teaching the aforementioned subject matter?
I think a good choice would be Fundamentals of Classical and Statistical Thermodynamics, by Bimalendu N. Roy, John Wiley & Sons, 2002. I paid $45.00 for it at the Caltech bookstore, which is probably a pretty good price for textbooks these days. My copy is paperback, 743 pages including 16 chapters, 7 appendices & index. As the title suggests, it covers both classical & statistical thermodynamics. There are 2 chapters devoted to chemical thermodynamics; the section on spontaneous & nonspontaneous processes is useful reading, considering the creationist tendency to argue about how things can't happen spontaneously. There is also a full treatment of the 2nd law.

The American Institute of Physics carries a recommendation for two books that I have not seen: Thermal Physics, by Ralph Baierlein, Cambridge U. P., 1999, and An Introduction to Thermal Physics, by Daniel V. Schroeder, Addison-Wesley, 2000 (<a href="http://www.aip.org/pt/vol-53/iss-8/p44.html" target="_blank">see a review of both books here</a>).

As for "general books", my recommendation is to not bother. My experience is that "general books" just don't cut it, and aren't worth the effort to read. If you're going to try to learn this stuff, do it right, with a real textbook. It's more work, but in the end you actually wind up knowing something.

But there are some good sources for further reading. One important reference, in my opinion, is Insight Into Entropy, by Daniel F. Styer, American Journal of Physics 68(12): 1090-1096 (December 2000). Entropy is the easiest thing in the world to misunderstand, and even a lot of scientists do exactly that. Styer does a good job of pointing out misperceptions, and giving the reader a better view of what entropy really is.

For advanced reading, the classic work is The Principles of Statistical Mechanics, by Richard C. Tolman, Oxford University Press, 1938. It is currently available through Dover, as part of their extensive reprint series of older, major science books. It's more advanced than Roy's book, but about the same number of pages, and deals only briefly with classical thermodynamics. It's one of my favorite books on the subject, and does not suffer any ill effects from being over 60 years old.

Of course, I also recommend my own <a href="http://www.tim-thompson.com/entropy.html" target="_blank">Adventures in Entropy</a>. It is perhaps incomplete at the moment, but the basic treatment of entropy in particular should help, or so I hope.