From 94132a13e3f1dfbe4c8dc4af15ad570535674ea5 Mon Sep 17 00:00:00 2001
From: yuxuanjerrychen01
Date: Wed, 10 Apr 2024 21:40:15 +0000
Subject: [PATCH] typo fix

---
 notes/optimization.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/notes/optimization.md b/notes/optimization.md
index 2614e0b..530748d 100644
--- a/notes/optimization.md
+++ b/notes/optimization.md
@@ -393,7 +393,7 @@
 $$\bf{x_0} = \textbf{starting guess}$$
 $$\bf{x_{k+1}} = x_k - \frac{f'(x_k)}{f''(x_k)}$$
 The method typically **converges quadratically**, provided that $$x_k$$ is
-sufficiently close to the local minimum. In other words, Newton's method has local convergence, and may ay fail to converge, or converge to a maximum or point of inflection.
+sufficiently close to the local minimum. In other words, Newton's method has local convergence, and may fail to converge, or converge to a maximum or point of inflection.
 For Newton's method for optimization in 1-D, we evaluate $$f'(x_k)$$ and $$f''(x_k)$$, so it requires 2 function
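
The hunk above describes Newton's method for 1-D optimization, $$x_{k+1} = x_k - \frac{f'(x_k)}{f''(x_k)}$$. A minimal sketch of that iteration is below; the test function and its derivatives are illustrative choices, not taken from the patched notes:

```python
def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=100):
    """Newton's method for 1-D optimization: x_{k+1} = x_k - f'(x_k)/f''(x_k).

    Converges quadratically when x0 is sufficiently close to the minimum;
    with only local convergence, it may fail to converge, or converge to a
    maximum or an inflection point. Each step evaluates f' and f'' once,
    i.e. 2 function evaluations per iteration.
    """
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x


# Illustrative example: minimize f(x) = x**4 - 3*x**3 + 2.
# f'(x) = 4x^3 - 9x^2 and f''(x) = 12x^2 - 18x; the local minimum is x = 9/4.
x_star = newton_minimize(lambda x: 4 * x**3 - 9 * x**2,
                         lambda x: 12 * x**2 - 18 * x,
                         x0=3.0)
```

Starting from x0 = 3.0 this converges to 9/4 = 2.25; starting near x = 0 (where f''(x) = 0 and f'(x) = 0) the step is undefined or misleading, illustrating the local-convergence caveat in the patched text.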