forked from rohinarora/EECE5644-Machine_Learning

Commit a7dcd74 (1 parent: 28e2005)
Authored and committed by Rohin arora on Sep 26, 2019
Showing 24 changed files with 195 additions and 3 deletions.
New file (2 lines):

    meta.md
    .DS_Store
New file (an empty Jupyter notebook skeleton):

```json
{
  "cells": [],
  "metadata": {},
  "nbformat": 4,
  "nbformat_minor": 2
}
```
#### Answer 1.

1. Variance(x) <-> $Var(x)$

$Var(x)$
$= E[(x-\mu)^2]$
$= E[x^2 + \mu^2 - 2\mu x]$
$= E[x^2] + E[\mu^2] - 2E[\mu x]$ (by linearity of expectation)
$= E[x^2] + \mu^2 - 2\mu E[x]$ ($E[\mu^2] = \mu^2$ and $E[\mu x] = \mu E[x]$, since $\mu$ is a constant)
$= E[x^2] + \mu^2 - 2\mu^2$ ($\mu = E[x]$, by definition of the mean)
$= E[x^2] - \mu^2$

<div align="right">
<b>
QED
</b>
</div>

2. Variance($\vec x$) <-> $Var(\vec x)$

$Var(\vec x)$
$= E[(\vec x-\mu)(\vec x-\mu)^T]$
$= E[\vec x\vec x^T - \vec x\mu^T - \mu\vec x^T + \mu\mu^T]$
$= E[\vec x\vec x^T] - E[\vec x\mu^T] - E[\mu\vec x^T] + E[\mu\mu^T]$ (by linearity of expectation)
$= E[\vec x\vec x^T] - E[\vec x]\mu^T - \mu E[\vec x^T] + \mu\mu^T$ ($\mu$ is a constant vector)
$= E[\vec x\vec x^T] - \mu\mu^T - \mu\mu^T + \mu\mu^T$ ($E[\vec x] = \mu$)
$= E[\vec x\vec x^T] - \mu\mu^T$

<div align="right">
<b>
QED
</b>
</div>

##### References

1. https://en.wikipedia.org/wiki/Variance
2. https://www.wolframalpha.com/
3. https://atom.io
4. https://www.python.org/
5. https://www.scipy.org/
6. https://jupyter.org/
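As a quick numerical sanity check (a sketch, not part of the original assignment), both identities can be verified with NumPy on random samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar case: Var(x) = E[x^2] - mu^2
x = rng.normal(loc=3.0, scale=2.0, size=100_000)
mu = x.mean()
lhs = np.mean((x - mu) ** 2)        # E[(x - mu)^2]
rhs = np.mean(x ** 2) - mu ** 2     # E[x^2] - mu^2
assert np.allclose(lhs, rhs)

# Vector case: Cov(x) = E[x x^T] - mu mu^T
X = rng.normal(size=(100_000, 3))   # rows are samples of a 3-d vector
mu_vec = X.mean(axis=0)
lhs_mat = (X - mu_vec).T @ (X - mu_vec) / X.shape[0]       # E[(x-mu)(x-mu)^T]
rhs_mat = X.T @ X / X.shape[0] - np.outer(mu_vec, mu_vec)  # E[x x^T] - mu mu^T
assert np.allclose(lhs_mat, rhs_mat)
```

Both identities hold exactly for the empirical expectations, since the algebra above never uses any distributional assumption.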
Binary file added (+6.05 MB): L1/L00_Sup_Closas_LinearAlgebra_UPC_2005_ThisIsForAnEntireCourse_CourtesyOfProfPauClosas.pdf
New file (links):

https://stats.stackexchange.com/questions/86487/what-is-the-meaning-of-the-density-of-a-distribution-at-a-point

https://math.stackexchange.com/questions/1412015/intuitive-meaning-of-the-probability-density-function-at-a-point
Binary file added (+71.7 MB): ...lOptimization_ThisIsForAnEntireCourse_WeWillReferToItWhenReviewingOptimizationMethods.pdf
* Think of the gradient as the first derivative of a simple function like a parabola, and the Hessian as its second derivative. The intuition transfers directly when $f$ is a function of a vector: the first derivative becomes the gradient (a column vector), and the second derivative becomes the Hessian (a matrix). The roots of the gradient ($\nabla f = 0$) are the critical points.
* Global and local minima
* Equality constraints are almost always active. They are inactive only in the rare case when the equality-constraint surface passes through the unconstrained optimum.
* Inequality constraints may or may not be active. In general, a constraint is active if it plays an "active" role in preventing the solution from reaching the unconstrained minimum.
* All constraints become inactive in the EM algorithm (when fitting a GMM).
* When training an SVM, some constraints are active; most are inactive.
* For active constraints, the Lagrange multiplier is positive; otherwise it is 0.
* One can inspect the values of the Lagrange multipliers to decide whether constraints could be relaxed.
* 2nd-order conditions for constrained optimization skipped
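The active/inactive distinction can be made concrete with a tiny hypothetical example (not from the lecture): minimize $f(x)=x^2$ subject to $x \ge c$. By KKT stationarity ($2x - \lambda = 0$), an active constraint carries a positive multiplier, an inactive one a zero multiplier:

```python
def solve(c):
    """Minimize f(x) = x^2 subject to x >= c; return (x*, lambda*)."""
    # If the unconstrained minimum x = 0 already satisfies the
    # constraint, the constraint is inactive and its multiplier is 0.
    if 0.0 >= c:
        return 0.0, 0.0
    # Otherwise the constraint is active: x* sits on the boundary
    # (x* = c), and stationarity 2x - lambda = 0 gives lambda = 2c > 0.
    return c, 2.0 * c

print(solve(-1.0))  # x >= -1 is inactive -> (0.0, 0.0)
print(solve(1.0))   # x >= 1 is active    -> (1.0, 2.0)
```

The positive multiplier (2.0) also signals how much the optimum would improve per unit of constraint relaxation, matching the "could this constraint be relaxed?" reading above.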
##### PCA as an optimization problem with equality constraints

* "eigenfaces"
* The n principal components form a new basis for the original vector $\vec{x}$

* LDA (next lecture): an optimization problem without constraints
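The constrained-optimization view of PCA above can be sketched numerically: maximizing $w^T S w$ subject to the equality constraint $\|w\| = 1$ yields the top eigenvector of the sample covariance $S$, and the eigenvectors together form the new orthonormal basis for $\vec{x}$ (a NumPy sketch; the data and variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Correlated 2-d data (illustrative)
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])
Xc = X - X.mean(axis=0)            # center the data
S = Xc.T @ Xc / (len(Xc) - 1)      # sample covariance

# Maximizing w^T S w subject to ||w|| = 1 gives the eigenvector of S
# with the largest eigenvalue; the Lagrange multiplier of the equality
# constraint equals that eigenvalue.
eigvals, eigvecs = np.linalg.eigh(S)   # eigenvalues in ascending order
w = eigvecs[:, -1]                     # first principal component

# The principal components form an orthonormal basis...
assert np.allclose(eigvecs.T @ eigvecs, np.eye(2), atol=1e-8)
# ...and projecting re-expresses each sample in that new basis.
Z = Xc @ eigvecs
```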
* decision boundary given by SVM, Fisher LDA, logistic regression
* generalized eigenvalue/eigenvector problem
* 2nd derivative to find maxima
* how to choose $\gamma$
* do we not need any distance metric in the error function?
* want:
  * min $\sigma_1^{2}$ and $\sigma_2^{2}$
  * same as min $\sigma_1^{2}+\sigma_2^{2}$
  * same as max $1/(\sigma_1^{2}+\sigma_2^{2})$
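Fisher LDA combines the objective above (small projected within-class variances $\sigma_1^2+\sigma_2^2$) with large separation of the projected class means; the maximizing direction is $w \propto S_W^{-1}(m_1-m_2)$. A sketch on hypothetical two-class Gaussian data (names and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
# Two hypothetical Gaussian classes
X1 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(200, 2))
X2 = rng.normal(loc=[3.0, 2.0], scale=1.0, size=(200, 2))

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
# Within-class scatter S_W = S_1 + S_2
S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

# Fisher's direction maximizes (w^T (m1 - m2))^2 / (w^T S_W w):
# between-class separation over summed within-class variance.
w = np.linalg.solve(S_W, m1 - m2)
w /= np.linalg.norm(w)

# Along w, the projected means should be far apart relative to the
# summed projected within-class variances (sigma_1^2 + sigma_2^2).
p1, p2 = X1 @ w, X2 @ w
sep = (p1.mean() - p2.mean()) ** 2
spread = p1.var() + p2.var()
assert sep > spread
```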
* Put the assignment code on GitHub. Write the assignment up on Overleaf.