The entropy of a set of messages $S$ is calculated by the following equation:

$H(S)=\sum_{s\in S}p(s)\log_{2}(1/p(s))$

where $s$ is an element of $S$, i.e., a message drawn from the set with probability $p(s)$.
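As a quick sketch, the formula translates directly into code (the function name `entropy` is illustrative, not from the text):

```python
import math

def entropy(probabilities):
    """Entropy H(S) in bits: the sum of p(s) * log2(1/p(s)) over all messages s."""
    # Messages with zero probability contribute nothing, by the usual convention.
    return sum(p * math.log2(1 / p) for p in probabilities if p > 0)

# A uniform distribution over four messages carries log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```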
The term $\log_{2}(1/p(s))$ is called the self-information of the message $s$: it measures, in bits, how much information a single message conveys, so rarer messages carry more information.
For example, consider a set of three messages with probabilities $\{0.25, 0.25, 0.5\}$:

$H(S)=0.25\log_{2}(1/0.25) + 0.25\log_{2}(1/0.25) + 0.5\log_{2}(1/0.5)$

$H(S)=0.5\log_{2}(4) + 0.5\log_{2}(2) = 1.5$
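The arithmetic can be checked numerically; a minimal sketch, assuming the worked example's probabilities are $\{0.25, 0.25, 0.5\}$:

```python
import math

# Probabilities taken from the worked example above.
probs = [0.25, 0.25, 0.5]

# Sum p * log2(1/p): 0.25*2 + 0.25*2 + 0.5*1 = 1.5 bits.
h = sum(p * math.log2(1 / p) for p in probs)
print(h)  # → 1.5
```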