From 3d2e95025caa9692fa75535e365b429de0edbc04 Mon Sep 17 00:00:00 2001
From: =?utf8?q?Fran=C3=A7ois=20Fleuret?=
Date: Thu, 18 Jan 2024 17:22:53 +0100
Subject: [PATCH] Update.

---
 inftheory.tex | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/inftheory.tex b/inftheory.tex
index 33ccfe5..0724c0d 100644
--- a/inftheory.tex
+++ b/inftheory.tex
@@ -116,7 +116,7 @@ that quantifies the amount of information shared by the two variables.
 
 \section{Conditional Entropy}
 
-Okay given the visible interest for the topic, an addendum: Conditional entropy is the average of the entropy of the conditional distribution:
+Conditional entropy is the average of the entropy of the conditional distribution:
 %
 \begin{align*}
 &H(X \mid Y)\\
@@ -126,7 +126,9 @@ Okay given the visible interest for the topic, an addendum: Conditional entropy
 
 Intuitively it is the [minimum average] number of bits required to describe X given that Y is known.
 
-So in particular, if X and Y are independent
+So in particular, if X and Y are independent, getting the value of $Y$
+does not help at all, so you still have to send all the bits for $X$,
+hence
 %
 \[
 H(X \mid Y)=H(X)
-- 
2.20.1
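
As a sanity check on the claim this patch adds, here is a minimal sketch of the independence derivation. It assumes the standard definition $H(X \mid Y) = \sum_y p(y)\, H(X \mid Y{=}y)$; the hunk context truncates the document's own \begin{align*} body, so that exact form is an assumption, not a quote from inftheory.tex.

% Sketch only; assumes the standard definition of H(X | Y),
% since the patched file's own align* body is elided in the hunk above.
\begin{align*}
H(X \mid Y) &= \sum_y p(y)\, H(X \mid Y{=}y)\\
&= -\sum_y p(y) \sum_x p(x \mid y) \log_2 p(x \mid y)\\
&= -\sum_y p(y) \sum_x p(x) \log_2 p(x)
  && \text{since $p(x \mid y) = p(x)$ by independence}\\
&= -\sum_x p(x) \log_2 p(x)
  && \text{since $\textstyle\sum_y p(y) = 1$}\\
&= H(X).
\end{align*}

The middle step is where the intuition in the patch shows up formally: once $p(x \mid y)$ collapses to $p(x)$, the inner sum no longer depends on $y$, so knowing $Y$ cannot shorten the description of $X$.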