projects / tex.git / commitdiff (parent: 1e87b0a)
Update.

author    François Fleuret <francois@fleuret.org>
          Thu, 18 Jan 2024 16:22:53 +0000 (17:22 +0100)
committer François Fleuret <francois@fleuret.org>
          Thu, 18 Jan 2024 16:22:53 +0000 (17:22 +0100)
inftheory.tex
diff --git a/inftheory.tex b/inftheory.tex
index 33ccfe5..0724c0d 100644
--- a/inftheory.tex
+++ b/inftheory.tex
@@ -116,7 +116,7 @@
that quantifies the amount of information shared by the two variables.
\section{Conditional Entropy}
-Okay given the visible interest for the topic, an addendum: Conditional entropy is the average of the entropy of the conditional distribution:
+Conditional entropy is the average of the entropy of the conditional distribution:
%
\begin{align*}
&H(X \mid Y)\\
@@ -126,7 +126,9 @@ Okay given the visible interest for the topic, an addendum: Conditional entropy
Intuitively it is the [minimum average] number of bits required to describe X given that Y is known.
-So in particular, if X and Y are independent
+So in particular, if X and Y are independent, getting the value of $Y$
+does not help at all, so you still have to send all the bits for $X$,
+hence
%
\[
H(X \mid Y)=H(X)
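As a quick numerical sanity check of the identity the patch adds (that if $X$ and $Y$ are independent, $H(X \mid Y)=H(X)$), here is a minimal sketch. It is not part of the commit; the helper names and example distributions are illustrative. It computes $H(X \mid Y)$ exactly as the patched text defines it, as the average over $y$ of the entropy of the conditional distribution of $X$ given $Y=y$.

```python
import math
from collections import Counter

def entropy(p):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def conditional_entropy(joint):
    """H(X|Y) = sum_y p(y) H(X | Y=y), with joint given as {(x, y): probability}."""
    p_y = Counter()
    for (x, y), p in joint.items():
        p_y[y] += p
    h = 0.0
    for y, py in p_y.items():
        # Conditional distribution of X given Y=y.
        cond = {x: p / py for (x, yy), p in joint.items() if yy == y}
        h += py * entropy(cond)
    return h

# Independent X and Y: a fair coin X and a biased coin Y.
p_x = {0: 0.5, 1: 0.5}
p_y = {0: 0.25, 1: 0.75}
joint = {(x, y): px * py for x, px in p_x.items() for y, py in p_y.items()}

print(conditional_entropy(joint))  # equals H(X) = 1.0, since X and Y are independent
print(entropy(p_x))
```

With a joint distribution that factorizes, every conditional distribution of $X$ given $Y=y$ equals the marginal of $X$, so the average collapses to $H(X)$, matching the "getting the value of $Y$ does not help at all" remark in the diff.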