Entropy (Some Notes)

The Relationship of Entropy to Message Size in Binary Symbol Strings http://www.michaelharold.com/downloads/The%20Relationship%20of%20Entropy%20to%20Message%20Size%20in%20Binary%20Symbol%20Strings.pdf

Entropy of a Distribution

I had a conversation with Sahu about calculating the entropy of a distribution.

Used Shannon's entropy equation: H = -Sum p(x)*log(p(x))/log(2), i.e. log base 2. (Note: the worked values below use the natural log instead, so they are in nats rather than bits; the comparison between distributions comes out the same either way.)
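As a minimal sketch, the equation above can be computed over the empirical distribution of a list of numbers. (The function name and the `base` argument are my own; base 2 matches the /log(2) form of the equation.)

```python
import math
from collections import Counter

def entropy(values, base=2):
    """Shannon entropy -sum(p * log(p)) of the empirical
    distribution of `values`, in the given log base."""
    n = len(values)
    return sum(-(c / n) * math.log(c / n, base)
               for c in Counter(values).values())

print(entropy([0, 1] * 5))  # five 0s and five 1s (fair coin) -> 1.0 bit
```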

Excerpt from chat

me: Okay, let's say I have two distributions. Each distribution has 10 numbers, and I want to see how much entropy is in each distribution. For the first distribution [11:41 PM] let's say the sequence is 3,3,3,5,5,5,8,8,8,10. Calculating the entropy of this distribution we get -(3/10*ln(3/10) + 3/10*ln(3/10) + 3/10*ln(3/10) + 1/10*ln(1/10)) = 1.31. Now for the next distribution we have [11:42 PM] 1,2,3,4,5,6,7,8,9,10.
anshuman: yes
me: calculating the entropy we get
anshuman: this is correct
me: -(1/10*ln(1/10))*10 = 2.30, which is higher than the entropy for the distribution with grouped numbers. I think this is all making sense now.
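The two calculations from the chat can be checked directly (natural log, which is where the 1.31 and 2.30 figures come from; the helper name is mine):

```python
import math
from collections import Counter

def entropy_nats(values):
    """Shannon entropy of the empirical distribution, natural log."""
    n = len(values)
    return sum(-(c / n) * math.log(c / n)
               for c in Counter(values).values())

grouped = [3, 3, 3, 5, 5, 5, 8, 8, 8, 10]
spread  = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(round(entropy_nats(grouped), 2))  # 1.31
print(round(entropy_nats(spread), 2))   # 2.3
```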

Alright, I was also thinking about what kinds of distributions lead to the best entropy for life. Complete entropy is not good for life, and total complete order is not good for life either. Given our example of 10 numbers, each with a value from 0-10:

Lowest entropy distribution: 1,1,1,1,1,1,1,1,1,1
Highest entropy distribution: 0,1,2,3,4,5,6,7,8,9
Some intermediate entropy: 3,3,3,5,5,5,8,8,8,10

So what types of distributions would have the most usable information content, or be the best for life? I was thinking maybe something where there is the greatest number of groups with different values belonging to them. For 10 items, some possible ways of splitting up the groups could be:

10 (like the lowest entropy distribution, where all 10 items have the same value)
5 5
2 3 3 2
7 2 1
2 3 5
6 4
6 2 2

So here the distributions that have the most groups with different values are 7 2 1 (example: 3,3,3,3,3,3,3,8,8,5) and 2 3 5 (example: 3,3,8,8,8,5,5,5,5,5).
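For reference, the natural-log entropies of those two example groupings, computed the same way as the earlier distributions (helper name is mine):

```python
import math
from collections import Counter

def entropy_nats(values):
    """Shannon entropy of the empirical distribution, natural log."""
    n = len(values)
    return sum(-(c / n) * math.log(c / n)
               for c in Counter(values).values())

g721 = [3] * 7 + [8] * 2 + [5]        # group sizes 7, 2, 1
g235 = [3] * 2 + [8] * 3 + [5] * 5    # group sizes 2, 3, 5

print(round(entropy_nats(g721), 2))  # 0.8
print(round(entropy_nats(g235), 2))  # 1.03
```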

I wonder if there is any significance in the ratio of the entropy of this type of distribution to the maximum entropy for the given number of items. I actually did compute this ratio for the 7 2 1 example and got something near 1/(golden ratio) (around 0.62). Cool! hahaha. This might be stretching it a bit; maybe I should look more at this later.