## Thursday, November 17, 2011

### S = k ln W

*Herr B's gravestone in Vienna, Austria*
The title of this essay is the Boltzmann equation. It is carved upon his gravestone, in the same Vienna cemetery where Beethoven, Brahms, Schubert, and the Strauss family are also buried. The form of the equation inscribed on the tomb was actually first written down by Max Planck, but that's fine: the original equation Herr B wrote in 1872 is full of partial derivatives, which are hard to carve in stone.

The terms: S is the entropy of a system, k is the Boltzmann constant (about 1.38 × 10⁻²³ joules per kelvin), and W is the number of microstates accessible to that system.

In plain English, the entropy of a system is determined by the number of ways that the system's components can be arranged. If we are dealing with a gas, it is the number of ways the gas particles can float around, interact, and collide with each other. It's a very, very, very, very large number, so to get it into manageable form (read: numbers small enough for tiny brains to handle), you take its logarithm. This equation links entropy to probability. And, in a roundabout fashion I have no time to explicate, it also has something to do with thermodynamics and the so-called 2nd Law.
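As a toy illustration (mine, not Herr B's), the equation is one line of Python; the microstate count W below is an arbitrary made-up number, chosen only to show how the logarithm tames it:

```python
import math

# Boltzmann constant in joules per kelvin (approximate CODATA value)
k = 1.380649e-23

def boltzmann_entropy(W: float) -> float:
    """S = k ln W, with W the number of accessible microstates."""
    return k * math.log(W)

# One accessible microstate means zero entropy...
print(boltzmann_entropy(1.0))    # 0.0
# ...and even an astronomically large W yields a modest S,
# because the logarithm shrinks the number down to size.
print(boltzmann_entropy(1e25))   # ≈ 7.95e-22 J/K
```

Real microstate counts for a mole of gas are vastly larger than 10²⁵, which is exactly why the logarithm is there.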

Yeah, and so what? Well, this equation, along with Boltzmann's contribution of a statistical approach to the kinetic theory of gases, and still more, supplies the very broad shoulders upon which Max Planck et al. would stand. It pointed the way for the development of Quantum Mechanics - by far the most successful set of theories ever developed.

And there's more. Once the American electrical engineer Claude Shannon grabbed ahold of Boltzmann's concepts, developed his own version of Herr B's H-function, and borrowed the term entropy, the whole field of information theory opened up. (Although it should be noted that Mr. Shannon's claim that information-theoretic entropy is the same as statistical entropy is disputed in some circles.)
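A minimal sketch of Shannon's version (my illustration, not his notation): the entropy of a message is H = -Σ pᵢ log₂ pᵢ, where pᵢ is the frequency of each symbol.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """H = -sum(p_i * log2(p_i)), in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# A repetitive message carries no information per symbol...
print(shannon_entropy("aaaaaaaa"))   # 0.0
# ...while one using all its symbols equally often maximizes it.
print(shannon_entropy("abcdabcd"))   # 2.0
```

Same shape as Herr B's formula: a logarithm over a count of arrangements, weighted by probability.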

Finally we get to the weird stuff. When Shannon investigated the informational aspects of communications, he found that a message transmitted with optimal efficiency over a channel of limited bandwidth looks exactly like random noise. Not surprising when you think about it. Optimal encoding squeezes out every redundant pattern, so the message drawn from the largest possible choice of arrangements is the one carrying the most information per symbol. So, the more random it looks, the closer it is to getting the most through the channel (whatever the obstacle - noise, interference, etc.).
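You can watch the redundancy vanish with a few lines of Python (a sketch of the idea only - I'm using zlib as a stand-in for an "optimal" encoder, which it isn't, but it gestures the right way):

```python
import math
import zlib
from collections import Counter

def bits_per_byte(data: bytes) -> float:
    """Empirical Shannon entropy of a byte stream (maximum 8.0)."""
    counts = Counter(data)
    n = len(data)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# Redundant English text: low entropy per byte.
text = b"the rain in spain stays mainly in the plain " * 200
# After compression the redundancy is gone, and the byte stream
# looks far more like random noise: entropy per byte climbs toward 8.
print(bits_per_byte(text), bits_per_byte(zlib.compress(text, 9)))
```

The compressed stream's entropy per byte is higher than the original's - the closer an encoder gets to optimal, the closer its output gets to indistinguishable-from-static.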

*Black-body power curve*
When a message is broadcast over the electromagnetic spectrum (i.e. using an EM transmission medium), the most information-efficient format is indistinguishable from noise or 'static' (or, if you prefer, it resembles black-body radiation).

Okay. And um, so...? So, uh, what do we know is all static? Shit on the TV? How about quasars? Or the cosmic microwave background? Could it be we are getting messages beamed at us all the time, and we are just too stupid to figure it out?

(Oh, as far as my contention that intelligence as such does not exist? That stupidity is a real universal force? And that, just as 'cold' is the absence of heat, so 'intelligence' is the absence of stupid - well, do I really need to go on? Like Shannon entropy, there are very few ways to do the smart thing, but a very, very, very, very large number of ways to fuck things up. QED.)