Merge pull request RemyDegenne#150 from RemyDegenne/blueprint
Blueprint: mark the two main `sorry` as `\notready`
RemyDegenne authored Sep 10, 2024
2 parents 63d463e + a3d0d15 commit 7066b7a
Showing 5 changed files with 27 additions and 6 deletions.
2 changes: 2 additions & 0 deletions blueprint/src/macros/print.tex
@@ -12,8 +12,10 @@

% Dummy macros that make sense only for web version.
\newcommand{\lean}[1]{}
\newcommand{\discussion}[1]{}
\newcommand{\leanok}{}
\newcommand{\mathlibok}{}
\newcommand{\notready}{}
% Make sure that arguments of \uses and \proves are real labels, by using invisible refs:
% latex prints a warning if the label is not defined, but nothing is shown in the pdf file.
% It uses LaTeX3 programming, this is why we use the expl3 package.
3 changes: 2 additions & 1 deletion blueprint/src/sections/convex.tex
@@ -28,8 +28,9 @@ \chapter{Convex functions of Radon-Nikodym derivatives}
%\lean{}
%\leanok
%\uses{}
\notready
For $f$ convex, $f\left(\mu [g \mid m]\right) \le \mu [f \circ g \mid m]$.
\end{theorem}

-\begin{proof}
+\begin{proof} %\leanok
\end{proof}
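
The theorem marked `\notready` above is the conditional Jensen inequality. A Lean 4 sketch of how it might be stated (declaration name, hypotheses, and exact form are hypothetical illustrations, not the repository's actual code):

```lean
-- Hypothetical sketch of a conditional Jensen inequality; the blueprint's
-- actual formalization may use different hypotheses (e.g. on integrability
-- or on the domain of convexity of f).
theorem condexp_jensen {α : Type*} {mα m : MeasurableSpace α}
    (hm : m ≤ mα) (μ : Measure α) [IsFiniteMeasure μ]
    {f : ℝ → ℝ} (hf : ConvexOn ℝ Set.univ f) (hf_cont : Continuous f)
    {g : α → ℝ} (hg : Integrable g μ) (hfg : Integrable (f ∘ g) μ) :
    ∀ᵐ x ∂μ, f ((μ[g | m]) x) ≤ (μ[f ∘ g | m]) x := by
  sorry  -- presumably one of the two main `sorry`s this commit marks
```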
2 changes: 2 additions & 0 deletions blueprint/src/sections/dpi_div.tex
@@ -7,6 +7,8 @@

Another important concept is the \emph{conditional divergence} of two kernels $\kappa, \eta : \mathcal X \rightsquigarrow \mathcal Y$ given a measure $\mu \in \mathcal M(\mathcal X)$. This is the value $D(\mu \otimes \kappa, \mu \otimes \eta)$.

TODO: two possible concepts of conditional divergence are $D(\mu \otimes \kappa, \mu \otimes \eta)$ and $\mu\left[x \mapsto D(\kappa(x), \eta(x))\right]$. They are equal for $f$-divergences (which also gives convexity of the divergence), but not in general.
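
For $f$-divergences, the equality of the two notions can be sketched as follows (a sketch under the simplifying assumption $\mu \otimes \kappa \ll \mu \otimes \eta$, so that the Radon-Nikodym derivative factorizes as $\frac{d(\mu \otimes \kappa)}{d(\mu \otimes \eta)}(x, y) = \frac{d\kappa(x)}{d\eta(x)}(y)$ for $\mu$-almost every $x$; the full statement has to handle the singular part):
\begin{align*}
D_f(\mu \otimes \kappa, \mu \otimes \eta)
&= \int_{(x,y)} f\left(\frac{d(\mu \otimes \kappa)}{d(\mu \otimes \eta)}(x, y)\right) \partial(\mu \otimes \eta)
\\
&= \int_x \int_y f\left(\frac{d\kappa(x)}{d\eta(x)}(y)\right) \partial\eta(x) \partial\mu
= \mu\left[x \mapsto D_f(\kappa(x), \eta(x))\right] \: .
\end{align*}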

\section{Generic divergences}

\begin{definition}[Divergence]
24 changes: 21 additions & 3 deletions blueprint/src/sections/f_divergence.tex
@@ -578,6 +578,24 @@ \subsection{Integral representation of $f$-divergences}
\end{proof}


\begin{theorem}
\label{thm:integration_by_parts}
%\lean{}
%\leanok
\notready
\uses{}
If $f$ and $g$ are two Stieltjes functions with associated measures $\mu_f$ and $\mu_g$, and $f$ is continuous on $[a, b]$, then
\begin{align*}
\int_{x \in (a, b]} f(x) \partial\mu_g = f(b)g(b) - f(a)g(a) - \int_{x \in (a, b]} g(x) \partial\mu_f \: .
\end{align*}
\end{theorem}

\begin{proof} \notready
\uses{}

\end{proof}
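
As an illustrative sanity check of the formula (an example added here, not part of the blueprint): take $a = 0$, $b = 1$ and $f(x) = g(x) = x$, so that $\mu_f = \mu_g$ is the Lebesgue measure. Both integrals equal $\int_0^1 x \, \mathrm{d}x = 1/2$, and the formula reads
\begin{align*}
\frac{1}{2} = f(1)g(1) - f(0)g(0) - \frac{1}{2} \: ,
\end{align*}
which indeed holds.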


\begin{lemma}
\label{lem:convex_taylor}
\lean{ConvexOn.convex_taylor}
@@ -592,13 +610,13 @@ \subsection{Integral representation of $f$-divergences}
\end{lemma}

\begin{proof}\leanok
-\uses{}
+\uses{thm:integration_by_parts}
Let $\Lambda$ be the Lebesgue measure and let $x < y$. Since $f$ has right derivative $f'_+$ in $(x,y)$ and $f'_+$ is integrable on that interval,
\begin{align*}
f(y) - f(x) = \int_{z \in (x, y]} f'_+(z) \partial \Lambda
\: .
\end{align*}
-We now integrate by parts, using Theorem TODO. $\Lambda$ is the measure obtained from the Stieltjes function $z \mapsto z - y$.
+We now integrate by parts, using Theorem~\ref{thm:integration_by_parts}. $\Lambda$ is the measure obtained from the Stieltjes function $z \mapsto z - y$.
\begin{align*}
\int_{z \in (x, y]} f'_+(z) \partial \Lambda
&= - \int_{z \in (x,y]} (z - y)\partial \gamma_f + f'_+(y)(y - y) - f'_+(x)(x - y)
@@ -928,7 +946,7 @@ \subsubsection{Sigma-algebras}
\end{lemma}

\begin{proof}%\leanok
-\uses{lem:fDiv_map_eq_fDiv_trim_of_ac}
+\uses{lem:fDiv_map_eq_fDiv_trim_of_ac, thm:fDiv_trim_le}
By Lemma~\ref{lem:fDiv_map_eq_fDiv_trim_of_ac}, $D_f(g_* \mu, g_* \nu) = D_f(\mu_{| g^* \mathcal Y}, \nu_{| g^* \mathcal Y})$. Then apply Theorem~\ref{thm:fDiv_trim_le}.
\end{proof}

2 changes: 0 additions & 2 deletions blueprint/src/sections/risk.tex
@@ -528,8 +528,6 @@ \subsection{Bayes risk increase}

\section{Simple binary hypothesis testing}

TODO: rework this section to use the Bayes risk increase?

In this section, for a measure $\xi$ on $\{0,1\}$, we write $\xi_0$ for $\xi(\{0\})$ and $\xi_1$ for $\xi(\{1\})$. We sometimes write $(a,b)$ for the measure $\xi$ on $\{0,1\}$ with $\xi_0 = a$ and $\xi_1 = b$.

\begin{lemma}
