Yesterday I posted a chart like this. Someone made a joke, and I didn't get it. I thought they were pointing out an error. I checked... and all my numbers were off by a factor of 10²³.

Here are the right numbers, I hope. In penance, I'm going to explain entropy.

(1/n)

I've never wanted to explain entropy on Twitter, since it's very subtle, and people have lots of crazy ideas about it, which they insist on talking about. I'd much rather explain algebraic topology, where that doesn't happen. But people prefer entropy.

(2/n)

I can't explain entropy in just one tweet! For starters, it means different things... which turn out to be related.

First came classical thermodynamics, where you can measure changes in something called entropy by doing experiments.

(3/n)

Then came classical statistical mechanics, where entropy was defined using a combination of classical mechanics and probability theory.

Then came quantum statistical mechanics, which gave a 𝘯𝘦𝘸 mathematical definition of entropy.

(4/n)

Then came information theory, which showed that random strings of symbols have an 'information' closely related to the entropy in classical statistical mechanics.

Then came 𝘢𝘭𝘨𝘰𝘳𝘪𝘵𝘩𝘮𝘪𝘤 information theory, which lets you define an information of a single string!

(5/n)

In fact, all 5 of these entropy concepts can be seen as special cases of 𝘰𝘯𝘦: the entropy in quantum statistical mechanics!

This is also the entropy we need if we want to deeply understand the entropy of matter, or of the cosmic microwave background radiation.

(6/n)

The entropy in quantum statistical mechanics is often called von Neumann entropy. So, any thorough explanation of entropy has got to explain this.

Okay:

The von Neumann entropy of a density matrix ρ is

-tr(ρ ln ρ)

Yay, I'm done! 🎉

(7/n)

Just kidding.
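(For the numerically inclined: since ρ is Hermitian with nonnegative eigenvalues summing to 1, the formula above reduces to -∑ pᵢ ln pᵢ over the eigenvalues pᵢ of ρ. Here's a minimal sketch in Python using NumPy — the function name and the tolerance cutoff are my own choices, not anything official.)

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -tr(rho ln rho), in nats.

    Computed from the eigenvalues p_i of rho as -sum_i p_i ln p_i,
    using the convention 0 ln 0 = 0.
    """
    p = np.linalg.eigvalsh(rho)  # eigenvalues of a Hermitian matrix
    p = p[p > 1e-12]             # drop (numerically) zero eigenvalues
    return float(-np.sum(p * np.log(p)))

# A pure state has entropy 0; the maximally mixed state on n levels has ln n.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])   # density matrix of a pure state
mixed = np.eye(2) / 2           # maximally mixed qubit state
print(von_neumann_entropy(pure))   # essentially 0
print(von_neumann_entropy(mixed))  # essentially ln 2
```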

I'm still not sure what I'll say about entropy, but obviously that was NOT an explanation.

I should probably post a bunch of fun tweets, one every day or two, and hope they add up to something useful.

(8/n, n = 8)

@johncarlosbaez Why not try explaining something simpler, like the number 24? :-P

@bstacey @johncarlosbaez it's a lot like four sixes... or even six fours

Hush! Last time he spent years illustrating stuff from group theory, division rings, algebraic topology, and shooting cannonballs at 24.

Why is a box filled with gas where the atoms conspire for a short time to look like a picture of Charlie Chaplin said to be in a low-entropy state? Why can't such an unlikely TV set show something new for a change? And how is that not reversing time? Does knowing Chaitin's constant change the entropy of strings...

@johncarlosbaez NB: your emoji / special characters in "Then came ????? information theory..." don't appear for all clients.

You might want to clarify.

@johncarlosbaez ... speaking of entropy and information loss ;-)