Yesterday I posted a chart like this. Someone made a joke, and I didn't get it. I thought they were pointing out an error. I checked... and all my numbers were off by a factor of 10²³.
Here are the right numbers, I hope. In penance, I'm going to explain entropy.
(1/n)
The entropy in quantum statistical mechanics is often called von Neumann entropy. So, any thorough explanation of entropy has got to explain this.
Okay:
The von Neumann entropy of a density matrix ρ is
−tr(ρ ln ρ)
Yay, I'm done! 🎉
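(For anyone who wants to actually compute this: here's a minimal NumPy sketch, with the function name and eigenvalue cutoff being my own choices. Since ρ is Hermitian, −tr(ρ ln ρ) equals −Σᵢ λᵢ ln λᵢ over its eigenvalues, with 0 ln 0 taken to be 0.)

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -tr(rho ln rho), via the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)       # rho is Hermitian, so use eigvalsh
    eigvals = eigvals[eigvals > 1e-12]      # drop (numerically) zero eigenvalues: 0 ln 0 = 0
    return float(-np.sum(eigvals * np.log(eigvals)))

# Example: a qubit in an equal mixture of |0> and |1> has entropy ln 2
rho = np.array([[0.5, 0.0],
                [0.0, 0.5]])
print(von_neumann_entropy(rho))  # ≈ 0.693 = ln 2
```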
@johncarlosbaez Why not try explaining something simpler, like the number 24? :-P
@bstacey @johncarlosbaez it's a lot like four sixes... or even six fours
Hush! Last time he spent years illustrating stuff from group theory, division rings, algebraic topology, and shooting cannon balls at 24.
Why is a box filled with gas, where the atoms conspire for a short time to look like a picture of Charlie Chaplin, said to be in a low-entropy state? Why can't such an unlikely TV set show something new for a change? And how is that not reversing time? Does knowing Chaitin's constant change the entropy of strings...
@johncarlosbaez NB, your emoji / characters in "Then came ????? information theory..." don't appear for all clients.
You might want to clarify.
@johncarlosbaez ... speaking of entropy and information loss ;-)
I've never wanted to explain entropy on Twitter, since it's very subtle, and people have lots of crazy ideas about it, which they insist on talking about. I'd much rather explain algebraic topology, where that doesn't happen. But people prefer entropy.
(2/n)