Hartley determined (in 1928) that the amount of information in a message of $n$ symbols drawn from an alphabet of $s$ symbols is

$H = n \log s$
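
As a quick check, a minimal Python sketch (the function name hartley_information is my own, not from the text), using base-2 logarithms so the result comes out in bits:

    import math

    def hartley_information(n, s):
        # Hartley's measure: a message of n symbols drawn from an
        # alphabet of s equally likely symbols; log base 2 gives bits.
        return n * math.log2(s)

    print(hartley_information(8, 26))   # an 8-letter message: 8 * log2(26) = ~37.6 bits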
the entropy $H$ of the information source is

$H = -\sum_{i=1}^{n} p_i \log_2 p_i$

where $p_i$ is the probability of the $i$-th symbol.
For a coin toss, with $p_H$ the probability of heads and $p_T$ the probability of tails,

$H = -p_H \log_2 p_H - p_T \log_2 p_T$

so for a fair coin ($p_H = p_T = 0.5$):

$H = -0.5 \log_2 0.5 - 0.5 \log_2 0.5 = -0.5(-1) - 0.5(-1) = 1$ bit
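
The same calculation in a minimal Python sketch (the function name entropy is mine), confirming the fair-coin value of 1 bit:

    import math

    def entropy(probs):
        # Shannon entropy H = -sum(p_i * log2(p_i)), in bits.
        # Zero-probability terms are skipped, since p*log2(p) -> 0 as p -> 0.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
    print(entropy([0.9, 0.1]))   # biased coin: ~0.47 bits, less uncertainty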
it is possible to send information at a rate of $C/H - \epsilon$ symbols per second, where $\epsilon$ is arbitrarily small, but no faster.
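For instance (hypothetical numbers, not from the text): a channel with capacity $C = 100$ bits per second carrying a source with entropy $H = 5$ bits per symbol can be driven at just under $100/5 = 20$ symbols per second.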
Relative frequencies of letters in English text, most to least frequent:

e 0.117   i 0.086   s 0.081   a 0.079   r 0.074
n 0.074   t 0.067   o 0.059   l 0.053   c 0.041
d 0.038   u 0.032   g 0.028   p 0.027   m 0.027
h 0.021   b 0.020   y 0.016   f 0.013   v 0.010
k 0.008   w 0.008   z 0.004   x 0.002   j 0.002
q 0.001
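
Feeding this table into the entropy sketch above gives the single-letter entropy of English (approximate, since the listed frequencies sum to about 0.988 rather than 1):

    import math

    freqs = [0.117, 0.086, 0.081, 0.079, 0.074,
             0.074, 0.067, 0.059, 0.053, 0.041,
             0.038, 0.032, 0.028, 0.027, 0.027,
             0.021, 0.020, 0.016, 0.013, 0.010,
             0.008, 0.008, 0.004, 0.002, 0.002,
             0.001]

    H = -sum(p * math.log2(p) for p in freqs)
    print(H)   # ~4.15 bits per letter, vs. log2(26) = ~4.70 for equiprobable letters

By the $C/H - \epsilon$ result above, this entropy bounds how many English letters per second a channel of a given capacity can carry.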