According to Claude Shannon, von Neumann gave him very useful advice on what to call his measure of information content:

> I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'

What I am curious about is what von Neumann meant with his last point. I find it particularly surprising given that he axiomatised what we now call the von Neumann entropy in quantum mechanics around two decades prior to Shannon's development of classical information theory. Might modern information theorists know of the specific difficulties he had in mind and whether these have been suitably addressed?

References:

- Shannon, C.E. (1948). "A Mathematical Theory of Communication." Bell System Technical Journal 27: 379-423.
- von Neumann, J. (1932). Mathematische Grundlagen der Quantenmechanik (Mathematical Foundations of Quantum Mechanics). Princeton University Press.
- Jaynes, E.T. (1957). "Information Theory and Statistical Mechanics." Physical Review 106: 620.
- Tribus, M. & McIrvine, E.C. (1971). "Energy and Information." Scientific American 225(3): 179-190.
- Wheeler, J.A. (1990). "Information, physics, quantum: The search for links." In W. Zurek (ed.), Complexity, Entropy, and the Physics of Information.
- Rioul, O. "This is IT: A Primer on Shannon's Entropy and Information."

One reply: it seems to me that it is actually the first reason put forward by von Neumann (that the uncertainty function had already been used in statistical mechanics under that name) that fully justifies Timothy Chow's remark concerning the second one: surely von Neumann's comment was partially a joke, so it may be misguided to try to interpret it too literally.
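For background, here is a short sketch of the standard textbook definitions at play in the anecdote (conventions for log base and units vary by author; these formulas are supplied for context and are not part of the original exchange):

```latex
% Shannon's "uncertainty function" (1948) for a distribution p = (p_1, ..., p_n),
% measured in bits:
H(p) = -\sum_{i=1}^{n} p_i \log_2 p_i

% The Gibbs entropy of statistical mechanics, the earlier use of the name
% that von Neumann's first reason points to (k_B is Boltzmann's constant):
S = -k_B \sum_i p_i \ln p_i

% The von Neumann entropy (1932) of a density matrix \rho:
S(\rho) = -\operatorname{Tr}(\rho \ln \rho)
```

Up to units and the choice of logarithm, the three expressions coincide: for a density matrix whose eigenvalues are p_i, the von Neumann entropy reduces to the Shannon/Gibbs sum, which is why the question treats the 1932 definition as a quantum predecessor of Shannon's 1948 measure.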