X-Git-Url: https://fleuret.org/cgi-bin/gitweb/gitweb.cgi?a=blobdiff_plain;f=inftheory.tex;h=0724c0d1a067ef6b4503533be2477693dd574ef7;hb=3d2e95025caa9692fa75535e365b429de0edbc04;hp=33ccfe5cfb1e214b2e0e51edda186f0c719ae404;hpb=1e87b0a30c1b32eb50429af1340ea9706e3ccab6;p=tex.git

diff --git a/inftheory.tex b/inftheory.tex
index 33ccfe5..0724c0d 100644
--- a/inftheory.tex
+++ b/inftheory.tex
@@ -116,7 +116,7 @@ that quantifies the amount of information shared by the two variables.
 
 \section{Conditional Entropy}
 
-Okay given the visible interest for the topic, an addendum: Conditional entropy is the average of the entropy of the conditional distribution:
+Conditional entropy is the average of the entropy of the conditional distribution:
 %
 \begin{align*}
 &H(X \mid Y)\\
@@ -126,7 +126,9 @@ Okay given the visible interest for the topic, an addendum: Conditional entropy
 Intuitively it is the [minimum average] number of bits required to
 describe X given that Y is known.
 
-So in particular, if X and Y are independent
+So in particular, if X and Y are independent, getting the value of $Y$
+does not help at all, so you still have to send all the bits for $X$,
+hence
 %
 \[
 H(X \mid Y)=H(X)
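
A quick numeric check of the two facts in the patched text (the averaging definition and the independence identity), added here as an illustration and not part of the patch itself: assume $X$ and $Y$ are independent fair bits, so $p(x,y)=1/4$ for every pair and each conditional distribution $p(x \mid y)$ is uniform over two values. Then
%
\[
H(X \mid Y)
= \sum_{y} p(y) \, H(X \mid Y{=}y)
= \tfrac{1}{2} \cdot 1 + \tfrac{1}{2} \cdot 1
= 1 = H(X).
\]
%
Conversely, if $Y = X$, every conditional distribution is a point mass, so $H(X \mid Y{=}y) = 0$ for both values of $y$, hence $H(X \mid Y) = 0$: knowing $Y$ leaves no bits to send for $X$.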