# Rate–distortion theory

## Distortion functions

A distortion function measures the cost of representing a source symbol $x$ by a reproduction symbol $\hat{x}$:

${\displaystyle d:{\mathcal {X}}\times {\hat {\mathcal {X}}}\rightarrow \mathbb {R} _{+}}$

### Hamming distortion

${\displaystyle d(x,{\hat {x}})={\begin{cases}0,&{\text{if }}x={\hat {x}}\\1,&{\text{if }}x\neq {\hat {x}}\end{cases}}}$

### Squared-error distortion

${\displaystyle d(x,{\hat {x}})=(x-{\hat {x}})^{2}}$
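The two distortion measures above can be written directly in code; a minimal sketch (function names are illustrative, not from the source):

```python
def hamming_distortion(x, x_hat):
    """0 if source and reproduction symbols agree, 1 otherwise."""
    return 0.0 if x == x_hat else 1.0

def squared_error_distortion(x, x_hat):
    """Squared difference between source and reproduction values."""
    return (x - x_hat) ** 2
```

Hamming distortion suits discrete alphabets (it counts symbol errors), while squared error is the usual choice for continuous-valued sources.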

## The rate–distortion function

The rate–distortion function is the infimum of the mutual information over all test channels $Q_{Y|X}$ whose expected distortion $D_{Q}$ does not exceed the target $D^{*}$:

${\displaystyle \inf _{Q_{Y|X}(y|x)}I_{Q}(Y;X)\ {\mbox{subject to}}\ D_{Q}\leq D^{*}.}$

${\displaystyle I(Y;X)=H(Y)-H(Y|X)\,}$

${\displaystyle H(Y)=-\int _{-\infty }^{\infty }P_{Y}(y)\log _{2}(P_{Y}(y))\,dy}$
${\displaystyle H(Y|X)=-\int _{-\infty }^{\infty }\int _{-\infty }^{\infty }Q_{Y|X}(y|x)P_{X}(x)\log _{2}(Q_{Y|X}(y|x))\,dx\,dy.}$

Equivalently, one can minimize the expected distortion subject to a rate constraint (the distortion–rate formulation):

${\displaystyle \inf _{Q_{Y|X}(y|x)}E[D_{Q}[X,Y]]\ {\mbox{subject to}}\ I_{Q}(Y;X)\leq R.}$
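This constrained minimization rarely has a closed form; for discrete sources it is typically evaluated numerically with the Blahut–Arimoto algorithm, which alternates between the optimal test channel and the output marginal. A minimal sketch, assuming a finite alphabet and a slope parameter `beta` that traces out the curve (names are illustrative):

```python
import numpy as np

def blahut_arimoto(p_x, d, beta, n_iter=500):
    """Return one point (D, R) on the rate-distortion curve.

    p_x  : source distribution over the alphabet of X
    d    : distortion matrix d[x, y]
    beta : slope parameter > 0; larger beta yields smaller distortion
    """
    n_y = d.shape[1]
    q_y = np.full(n_y, 1.0 / n_y)          # initial output marginal
    A = np.exp(-beta * d)                   # kernel exp(-beta * d(x, y))
    for _ in range(n_iter):
        Q = A * q_y                         # unnormalized Q(y|x)
        Q /= Q.sum(axis=1, keepdims=True)   # normalize each row
        q_y = p_x @ Q                       # re-estimate output marginal
    D = np.sum(p_x[:, None] * Q * d)                  # expected distortion
    R = np.sum(p_x[:, None] * Q * np.log2(Q / q_y))   # I(X;Y) in bits
    return D, R
```

Sweeping `beta` over positive values traces out the full $R(D)$ curve; each fixed `beta` selects the point where the curve has slope $-\beta/\ln 2$.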

### Memoryless (independent) Gaussian source

${\displaystyle R(D)={\begin{cases}{\frac {1}{2}}\log _{2}(\sigma _{x}^{2}/D),&{\mbox{if }}0\leq D\leq \sigma _{x}^{2}\\0,&{\mbox{if }}D>\sigma _{x}^{2}.\end{cases}}}$[1]
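This closed form is straightforward to evaluate; a small helper (the function name is illustrative) for a zero-mean Gaussian source under squared-error distortion:

```python
import numpy as np

def gaussian_rd(variance, D):
    """R(D) in bits per sample for a memoryless N(0, variance)
    source under squared-error distortion."""
    if D >= variance:
        return 0.0          # the zero-rate reproduction x_hat = 0 suffices
    return 0.5 * np.log2(variance / D)
```

For example, halving the distortion from $\sigma_x^2/2$ to $\sigma_x^2/4$ costs exactly half a bit per sample, reflecting the 6 dB-per-bit rule familiar from quantization.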

### Binary source

${\displaystyle R(D)={\begin{cases}H(p)-H(D),&0\leq D\leq \min\{p,1-p\}\\0,&D>\min\{p,1-p\}\end{cases}}}$
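Here $H(\cdot)$ is the binary entropy function, and the formula applies to a Bernoulli($p$) source under Hamming distortion. A direct sketch (function names are illustrative):

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def binary_rd(p, D):
    """R(D) in bits for a Bernoulli(p) source under Hamming distortion."""
    if D >= min(p, 1 - p):
        return 0.0          # guessing the more likely symbol achieves D
    return binary_entropy(p) - binary_entropy(D)
```

At $D=0$ this reduces to lossless compression, $R(0)=H(p)$, and the rate falls to zero once the tolerated bit-error rate reaches $\min\{p,1-p\}$.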

### Parallel Gaussian sources

Consider $m$ independent Gaussian sources ${\displaystyle X_{1},X_{2},\dots ,X_{m}}$ with ${\displaystyle X_{i}\sim N(0,\sigma _{i}^{2})}$. The rate–distortion function is then

${\displaystyle R(D)=\sum _{i=1}^{m}{\frac {1}{2}}\log _{2}{\frac {\sigma _{i}^{2}}{D_{i}}}}$

${\displaystyle D_{i}={\begin{cases}\lambda ,&{\text{if }}{\lambda }<{\sigma _{i}^{2}}\\\sigma _{i}^{2},&{\text{if }}{\lambda }\geq {\sigma _{i}^{2}}\end{cases}}}$

where ${\displaystyle \lambda }$ must satisfy the constraint:

${\displaystyle \sum _{i=1}^{m}D_{i}=D}$
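This is the reverse water-filling allocation: every component whose variance exceeds the water level $\lambda$ is quantized down to distortion $\lambda$, while weaker components are discarded entirely ($D_i=\sigma_i^2$, zero rate). Since $\sum_i \min(\lambda,\sigma_i^2)$ is nondecreasing in $\lambda$, the level can be found by bisection. A sketch (the function name is illustrative):

```python
import numpy as np

def reverse_waterfill(variances, D, tol=1e-12):
    """Find the water level lambda with sum_i min(lambda, sigma_i^2) = D,
    and return the per-component distortions and the total rate in bits."""
    variances = np.asarray(variances, dtype=float)
    assert 0 < D <= variances.sum(), "D must lie in (0, sum of variances]"
    lo, hi = 0.0, variances.max()
    while hi - lo > tol:
        lam = 0.5 * (lo + hi)
        if np.minimum(lam, variances).sum() < D:
            lo = lam                    # water level too low
        else:
            hi = lam                    # water level too high
    lam = 0.5 * (lo + hi)
    D_i = np.minimum(lam, variances)    # per-component distortions
    R = 0.5 * np.log2(variances / D_i).sum()
    return D_i, R
```

For equal variances the distortion budget splits evenly; with unequal variances the strongest components receive the most rate.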

## Notes

1. Thomas M. Cover, Joy A. Thomas. *Elements of Information Theory*. John Wiley & Sons, New York, 2006.