
Commit

Added Unit Tests documentation
kyleabeauchamp committed Sep 3, 2013
1 parent 05743d1 commit 9613b37
Showing 9 changed files with 933 additions and 0 deletions.
1 change: 1 addition & 0 deletions unit_tests/notes.te
@@ -0,0 +1 @@

43 changes: 43 additions & 0 deletions unit_tests/notes.tex
@@ -0,0 +1,43 @@

\subsection*{Derivation of the Jeffreys Prior}

This section derives the Jeffreys prior for the BELT likelihood. We do not recommend using this prior: it is expensive to compute and provides no regularization. Most readers can skip this section; we include it only for completeness.

The Jeffreys prior dictates that

$$P(\alpha) \propto \det(I(\alpha))^\frac{1}{2}$$
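Equivalently, taking logarithms, one evaluates the log prior up to an additive constant,

$$\log P(\alpha) = \frac{1}{2}\log\det I(\alpha) + \mathrm{const},$$

which makes the cost explicit: each evaluation requires assembling the full Fisher information matrix and computing its determinant.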

The Fisher information matrix, $I(\alpha)$, is given by

$$I_{ab}(\alpha) = E_\alpha\!\left[\frac{d\log P(F|\alpha)}{d\alpha_{a}}\,\frac{d\log P(F|\alpha)}{d\alpha_{b}}\right]$$

First, we examine the log likelihood (dropping terms independent of $\alpha$) and calculate its derivative:

$$LL = \log P(F|\alpha) = - \frac{1}{2}\sum_i \left(\frac{F_i - \langle F_i\rangle_\alpha}{\sigma_i}\right)^2$$

$$\frac{d(LL)}{d\alpha_a} = -\sum_i \frac{F_i - \langle F_i \rangle_\alpha}{\sigma_i}\,\frac{d}{d\alpha_a}\!\left(\frac{F_i - \langle F_i \rangle_\alpha}{\sigma_i}\right) = \sum_i \frac{1}{\sigma_i^2}(F_i - \langle F_i \rangle_\alpha) \frac{d\langle F_i \rangle_\alpha}{d\alpha_a}$$

When we insert this derivative into the expectation, only the factors $(F_i - \langle F_i \rangle_\alpha)$ depend on the data $F$; the remaining terms can be pulled outside the expectation:

$$I_{ab} = \sum_{ij} \frac{1}{\sigma_i^2 \sigma_j^2}\,\frac{d\langle F_i \rangle_\alpha}{d\alpha_a}\,\frac{d\langle F_j \rangle_\alpha}{d\alpha_b}\, E_\alpha\!\left[(F_i - \langle F_i \rangle_\alpha)(F_j - \langle F_j \rangle_\alpha)\right]$$

Because the conditional likelihood is a diagonal multivariate normal, $E_\alpha\!\left[(F_i - \langle F_i \rangle_\alpha)(F_j - \langle F_j \rangle_\alpha)\right] = \sigma_i^2 \delta_{ij}$, leading to

$$I_{ab} = \sum_{i} \frac{1}{\sigma_i^2}\,\frac{d\langle F_i \rangle_\alpha}{d\alpha_a}\,\frac{d\langle F_i \rangle_\alpha}{d\alpha_b}$$

Now, we know that

$$\frac{d\langle F_i \rangle_\alpha}{d\alpha_a} = \sum_k f_{ik} \frac{d\pi_k}{d\alpha_a}$$

where $f_{ik} = f_i(x_k)$ is the value of observable $i$ in conformation $k$. Similarly, we can show that

$$\frac{d\pi_k}{d\alpha_a} = \pi_k (\langle F_a \rangle_{\alpha} - f_{ak})$$
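For completeness, the preceding identity follows from the exponentially tilted form of the BELT populations. Assuming $\pi_k(\alpha) = \pi_k^{(0)} \exp(-\sum_a \alpha_a f_{ak}) / Z(\alpha)$ with $Z(\alpha) = \sum_j \pi_j^{(0)} \exp(-\sum_a \alpha_a f_{aj})$ (the reference populations $\pi_k^{(0)}$ and the sign convention are assumptions made here for the sake of the check), differentiating gives

$$\frac{d\pi_k}{d\alpha_a} = -f_{ak}\,\pi_k - \pi_k \frac{d\log Z}{d\alpha_a} = \pi_k\left(\langle F_a \rangle_\alpha - f_{ak}\right),$$

since $\frac{d\log Z}{d\alpha_a} = -\sum_j \pi_j f_{aj} = -\langle F_a \rangle_\alpha$.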


Putting all of this together, we can show that

$$I = S^T S$$

where

$$S_{ia} = \frac{1}{\sigma_i}\frac{d\langle F_i \rangle_\alpha}{d\alpha_a} = \frac{1}{\sigma_i}\sum_k \pi_k \left(\langle F_a \rangle_\alpha - f_{ak}\right) f_{ik}$$

so that $I_{ab} = \sum_i S_{ia} S_{ib}$.
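
To make the final expression concrete, here is a minimal NumPy sketch that assembles $S$ and evaluates the Jeffreys log-prior $\frac{1}{2}\log\det I(\alpha)$ for a toy ensemble. The function name, array names, and shapes are illustrative assumptions and do not correspond to the BELT code itself.

import numpy as np

def jeffreys_log_prior(populations, predictions, sigma):
    """Return 0.5 * log det I(alpha) for the derivation above.

    populations : (n_conformations,) current weights pi_k, summing to one
    predictions : (n_observables, n_conformations) matrix with f[i, k] = f_i(x_k)
    sigma       : (n_observables,) experimental uncertainties sigma_i
    """
    # Ensemble averages <F_a>_alpha = sum_k pi_k f[a, k]
    averages = predictions @ populations

    # S[i, a] = (1 / sigma_i) * sum_k pi_k (<F_a>_alpha - f[a, k]) f[i, k]
    deviations = averages[:, None] - predictions          # indexed [a, k]
    S = (predictions * populations) @ deviations.T        # indexed [i, a]
    S = S / sigma[:, None]

    # Fisher information I = S^T S and its log-determinant
    fisher = S.T @ S
    sign, logdet = np.linalg.slogdet(fisher)
    return 0.5 * logdet

# Toy ensemble: 3 conformations, 2 observables
populations = np.array([0.5, 0.3, 0.2])
predictions = np.array([[1.0, 2.0, 3.0],
                        [0.5, 0.1, 0.9]])
sigma = np.array([0.2, 0.1])
print(jeffreys_log_prior(populations, predictions, sigma))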
1 change: 1 addition & 0 deletions unit_tests/notes.tex~
@@ -0,0 +1 @@

10 changes: 10 additions & 0 deletions unit_tests/tests.aux
@@ -0,0 +1,10 @@
\relax
\@writefile{toc}{\contentsline {section}{\numberline {1}Introduction}{1}}
\@writefile{toc}{\contentsline {section}{\numberline {2}Reweighting a 1D Gaussian}{1}}
\@writefile{toc}{\contentsline {section}{\numberline {3}Reweighting a 2D Gaussian}{1}}
\@writefile{toc}{\contentsline {section}{\numberline {4}1D Gaussian BELT}{2}}
\@writefile{toc}{\contentsline {section}{\numberline {5}1D Gaussian BELT with Maxent Prior}{2}}
\@writefile{toc}{\contentsline {section}{\numberline {6}1D Gaussian BELT with MVN prior}{3}}
\@writefile{toc}{\contentsline {section}{\numberline {7}Uniform RV on [0,1]}{4}}
\@writefile{toc}{\contentsline {section}{\numberline {8}Exponential RV}{4}}
\@writefile{toc}{\contentsline {section}{\numberline {9}Summary:}{4}}
