# Information Theory

## Quantifying Information

### Information (I)

Let $S$ be a random variable with $S \in \{s_1, s_2, s_3, \dots\}$, where $p(S = s_k) = p_k$ and $\sum_{k} p_k = 1$.

$I(S_k)\equiv \log_2\left(\frac{1}{p_k}\right)=-\log_2(p_k)$
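The definition above can be sketched directly in Python (the function name `self_information` is my own choice for illustration):

```python
import math

def self_information(p):
    """Self-information I(s_k) = -log2(p_k), measured in bits."""
    return -math.log2(p)

# A rarer outcome carries more information:
# p = 1/2 gives 1 bit, p = 1/8 gives 3 bits.
```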

### Entropy

$H(S)=E[I(S_k)] =\sum_{k}^{}p_k I(S_k)$

$= \sum_{k}^{}p_k \log_2\left(\frac{1}{p_k}\right)$
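A minimal sketch of the entropy formula, skipping zero-probability symbols since $p \log_2(1/p) \to 0$ as $p \to 0$:

```python
import math

def entropy(probs):
    """Shannon entropy H(S) = sum_k p_k * log2(1/p_k), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair coin has maximal uncertainty for two symbols: H = 1 bit.
```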

#### Properties

1. $H(S^n) = nH(S)$ (for the $n$-fold i.i.d. extension of the source)

2. $0 \le H(S) \le \log_2 N$, where $N$ is the number of source symbols
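The bounds in property 2 can be checked numerically: a deterministic source attains the lower bound $H = 0$, and the uniform distribution attains the upper bound $H = \log_2 N$ (a sketch under those assumptions):

```python
import math

def entropy(probs):
    """Shannon entropy in bits, skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 4
uniform = [1.0 / N] * N          # maximizes entropy: H = log2(N) = 2 bits
deterministic = [1.0, 0, 0, 0]   # minimizes entropy: H = 0 bits
```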

#### Example

P(face 1) = 1/5,

P(face 2) = 2/5,

P(face 3) = 2/5

$H(S)=\frac{1}{5}\log_2 (5)+\frac{2}{5}\log_2\left(\frac{5}{2}\right)+\frac{2}{5}\log_2\left(\frac{5}{2}\right)$
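Evaluating the example numerically:

```python
import math

p = [1/5, 2/5, 2/5]
H = sum(pk * math.log2(1.0 / pk) for pk in p)
# H = (1/5)*log2(5) + 2 * (2/5)*log2(5/2) ≈ 1.522 bits
```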

### Mutual Information

$I(X; Y) = H(X) + H(Y) - H(X, Y)$

$H(X, Y) = - \sum_{x, y}^{} p(x, y) \log p(x, y)$
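The two formulas above can be combined into a short sketch that computes mutual information from a joint distribution table (the perfectly correlated joint distribution below is a hypothetical example of my own):

```python
import math

def entropy(probs):
    """Shannon entropy in bits, skipping zero-probability entries."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint distribution p(x, y): X and Y perfectly correlated (hypothetical).
joint = [[0.5, 0.0],
         [0.0, 0.5]]

px = [sum(row) for row in joint]           # marginal p(x): row sums
py = [sum(col) for col in zip(*joint)]     # marginal p(y): column sums

H_X  = entropy(px)
H_Y  = entropy(py)
H_XY = entropy([p for row in joint for p in row])
I_XY = H_X + H_Y - H_XY   # 1 bit: knowing X fully determines Y here
```

For independent variables the joint entropy equals $H(X) + H(Y)$, so $I(X; Y) = 0$; the correlated table above instead yields the maximum possible 1 bit.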
