From: François Fleuret <francois@fleuret.org>
Date: Thu, 18 Jan 2024 16:22:53 +0000 (+0100)
Subject: Update.
X-Git-Url: https://fleuret.org/cgi-bin/gitweb/gitweb.cgi?a=commitdiff_plain;h=3d2e95025caa9692fa75535e365b429de0edbc04;p=tex.git

Update.
---

diff --git a/inftheory.tex b/inftheory.tex
index 33ccfe5..0724c0d 100644
--- a/inftheory.tex
+++ b/inftheory.tex
@@ -116,7 +116,7 @@ that quantifies the amount of information shared by the two variables.
 
 \section{Conditional Entropy}
 
-Okay given the visible interest for the topic, an addendum: Conditional entropy is the average of the entropy of the conditional distribution:
+Conditional entropy is the average, over the conditioning variable, of the entropy of the conditional distribution:
 %
 \begin{align*}
 &H(X \mid Y)\\
@@ -126,8 +126,17 @@ Okay given the visible interest for the topic, an addendum: Conditional entropy
 
 Intuitively, it is the minimum average number of bits required to describe $X$ given that $Y$ is known.
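+
+For example, if $X=Y$, then $H(X \mid Y)=0$: once $Y$ is known, no
+additional bits are needed to describe $X$.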
 
-So in particular, if X and Y are independent 
+So in particular, if $X$ and $Y$ are independent, getting the value
+of $Y$ does not help at all, so you still have to send all the bits
+for $X$, hence
 %
 \[
   H(X \mid Y)=H(X)
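 \]
+%
+For example, if $X$ and $Y$ are two independent fair coin flips, then
+$H(X \mid Y)=H(X)=1$ bit: observing $Y$ tells you nothing about $X$,
+so one full bit is still needed to describe it.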