Comment by DramShopLaw on 11/07/2020 at 18:26 UTC*

1 upvote, 1 direct reply

View submission: What’s the relationship between entropy and compressible information?

You can describe information entropy in two ways. The first is classical Shannon entropy, which measures how much uncertainty remains about the next sample, i.e., how much one occurrence constrains the probabilities of what comes next. The shuffled deck has more entropy because the only information you get from pulling one card is that that specific card won't reappear. If the cards are ordered, then one card also gives you clues about the ordering, in addition to telling you that card won't reappear, so each draw tells you more about the rest of the deck and less uncertainty remains.
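To make that concrete, here's a minimal sketch (my illustration, not from the thread) of the per-draw Shannon entropy in the two cases, assuming a standard 52-card deck:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Well-shuffled deck: before the first draw, all 52 cards are equally likely.
uniform = [1 / 52] * 52
print(shannon_entropy(uniform))   # ~5.70 bits: maximal uncertainty per draw

# Fully ordered deck whose ordering we know: the next card is certain.
ordered = [1.0] + [0.0] * 51
print(shannon_entropy(ordered))   # 0.0 bits: nothing left to learn
```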

Then you go into what I've seen described as "algorithmic entropy" (Kolmogorov complexity). This measures how short a description, or program, could reproduce the sample set you've received. If the cards are truly random, they have higher algorithmic entropy, because the only way to reproduce the pattern is to list every card you pulled, in order. But if they have been ordered in some way, then there's some iterative process you could write down that generates each next occurrence from the prior ones, and that rule is shorter than listing everything in order.
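A common computable stand-in for algorithmic entropy is compressed size: a rule-generated sequence compresses to roughly the size of its rule, while random data barely compresses at all. A rough sketch (my illustration, using zlib as the compressor):

```python
import os
import zlib

n = 10_000
# "Deal an ordered deck over and over": a short rule generates all n bytes.
rule_based = bytes(i % 52 for i in range(n))
# Truly random bytes: no rule shorter than the data itself.
random_src = os.urandom(n)

print(len(zlib.compress(rule_based)))  # small: the generating rule is short
print(len(zlib.compress(random_src)))  # ~n bytes: must list everything
```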

I believe you have merged the two, but the two can in fact be merged. I read a paper arguing that physical (thermodynamic) entropy can be seen as the sum of the two, where we treat thermodynamic entropy as the amount of information we lose about a system's initial conditions as it evolves to equilibrium. Equivalently, we can see thermodynamic entropy as the amount of excess information, beyond macroscopic quantities like temperature, pressure, or density, that we'd need to describe the present arrangement of the atoms and molecules.
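A hedged sketch of what that sum might look like, in my own notation (the comment doesn't give the paper's formula):

```latex
% Sketch of the proposed decomposition; the notation is assumed,
% not taken from the paper the comment describes.
% d        : the macroscopic data we hold (temperature, pressure, etc.)
% K(d)     : algorithmic entropy of that data itself
% H(\rho_d): Shannon entropy of the distribution over microstates still
%            compatible with d, i.e. the missing information
\[
  S_{\text{thermo}} \;\approx\; K(d) + H(\rho_d)
\]
```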
