
Notes on Some Concepts (Information Theory)


A record of some common concepts.

(Shannon) Entropy

Information entropy is the average rate at which information is produced by a stochastic source of data.
\[H[x]=-\sum_{x} p(x) \log _{2} p(x)\]
More: https://en.wikipedia.org/wiki/Entropy_(information_theory)
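As a quick sketch of the formula above (plain Python, base-2 logarithm, so the result is in bits):

```python
import math

def entropy(p, base=2):
    """Shannon entropy H[x] = -sum_x p(x) log p(x); terms with p(x)=0 contribute 0."""
    return -sum(px * math.log(px, base) for px in p if px > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
fair = entropy([0.5, 0.5])
biased = entropy([0.9, 0.1])
assert biased < fair
```

Skipping zero-probability outcomes implements the usual convention 0 log 0 = 0.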

Cross Entropy

\[H(p,q)=-\sum _{x\in {\mathcal {X}}}p(x)\,\log q(x)=H(p)+D_{\mathrm {KL} }(p\|q)\]


More: https://en.wikipedia.org/wiki/Cross_entropy
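A minimal numeric check of the identity above, H(p, q) = H(p) + D_KL(p‖q), for discrete distributions (plain Python, base-2 logs):

```python
import math

def entropy(p):
    return -sum(px * math.log(px, 2) for px in p if px > 0)

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) log q(x)."""
    return -sum(px * math.log(qx, 2) for px, qx in zip(p, q) if px > 0)

def kl(p, q):
    """D_KL(p || q) = sum_x p(x) log(p(x) / q(x))."""
    return sum(px * math.log(px / qx, 2) for px, qx in zip(p, q) if px > 0)

p, q = [0.5, 0.5], [0.9, 0.1]
# Cross entropy decomposes into entropy plus KL divergence.
assert abs(cross_entropy(p, q) - (entropy(p) + kl(p, q))) < 1e-12
```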

Kullback–Leibler divergence

Also called relative entropy.
\[D_{\text{KL}}(P\parallel Q)=\int _{-\infty }^{\infty }p(x)\log \left({\frac {p(x)}{q(x)}}\right)\,dx\]

  • Asymmetric: in general \(D_{\text{KL}}(P\parallel Q)\neq D_{\text{KL}}(Q\parallel P)\)
  • Non-negative: \(D_{\text{KL}}(P\parallel Q)\geq 0\), with equality iff \(P=Q\)
  • Convex in the pair \((p, q)\)

More: https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
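The asymmetry and non-negativity listed above can be checked directly on a discrete example (plain Python, natural log, so values are in nats):

```python
import math

def kl(p, q):
    """D_KL(P || Q) = sum_x p(x) log(p(x) / q(x)) for discrete P, Q."""
    return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)

p, q = [0.4, 0.6], [0.7, 0.3]
assert kl(p, q) >= 0 and kl(q, p) >= 0   # non-negative (Gibbs' inequality)
assert abs(kl(p, q) - kl(q, p)) > 1e-6   # asymmetric in general
assert kl(p, p) == 0                     # zero iff the distributions coincide
```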

Jensen–Shannon divergence

Also called the information radius, or total divergence to the average.
\[\mathrm {JSD}(P\parallel Q)={\frac {1}{2}}D(P\parallel M)+{\frac {1}{2}}D(Q\parallel M)\]
where \(M={\frac {1}{2}}(P+Q)\)

  • Bounded: \(0\leq \mathrm{JSD}(P\parallel Q)\leq \log 2\)
  • Symmetric: \(\mathrm{JSD}(P\parallel Q)=\mathrm{JSD}(Q\parallel P)\)

More: https://en.wikipedia.org/wiki/Jensen%E2%80%93Shannon_divergence
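A small sketch of the definition, built from the KL divergence against the mixture M (plain Python, natural log), confirming the symmetry and boundedness properties:

```python
import math

def kl(p, q):
    return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)

def jsd(p, q):
    """JSD(P || Q) = (1/2) D(P || M) + (1/2) D(Q || M), M = (P + Q) / 2."""
    m = [(px + qx) / 2 for px, qx in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p, q = [0.4, 0.6], [0.7, 0.3]
assert abs(jsd(p, q) - jsd(q, p)) < 1e-12   # symmetric
assert 0 <= jsd(p, q) <= math.log(2)        # bounded by ln 2 in nats
```

Unlike the KL divergence, JSD stays finite even where one distribution assigns zero probability, because the mixture M is nonzero wherever either P or Q is.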

f-divergence

\[D_{f}(P\parallel Q)\equiv \int _{\Omega }f\left({\frac {dP}{dQ}}\right)\,dQ.\]


More: https://en.wikipedia.org/wiki/F-divergence
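In the discrete case the integral above becomes a sum, D_f(P‖Q) = Σ_x q(x) f(p(x)/q(x)). As a sketch, choosing the generator f(t) = t log t recovers the KL divergence (plain Python, natural log):

```python
import math

def f_divergence(p, q, f):
    """D_f(P || Q) = sum_x q(x) f(p(x) / q(x)) for discrete P, Q."""
    return sum(qx * f(px / qx) for px, qx in zip(p, q) if qx > 0)

def kl(p, q):
    return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)

p, q = [0.4, 0.6], [0.7, 0.3]
# f(t) = t log t is the generator of the KL divergence.
d = f_divergence(p, q, lambda t: t * math.log(t))
assert abs(d - kl(p, q)) < 1e-12
```

Other standard generators give other members of the family, e.g. f(t) = (t − 1)² for the χ² divergence and f(t) = |t − 1| / 2 for total variation.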
