
Commit 643b684

build based on 7cc8328
Documenter.jl committed Mar 23, 2024
1 parent c19e7a0 commit 643b684
Showing 32 changed files with 32 additions and 32 deletions.
2 changes: 1 addition & 1 deletion dev/.documenter-siteinfo.json
@@ -1 +1 @@
-{"documenter":{"julia_version":"1.10.2","generation_timestamp":"2024-03-21T13:54:23","documenter_version":"1.3.0"}}
+{"documenter":{"julia_version":"1.10.2","generation_timestamp":"2024-03-23T00:50:42","documenter_version":"1.3.0"}}
2 changes: 1 addition & 1 deletion dev/LICENSE/index.html

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion dev/algo/adam_adamax/index.html
@@ -5,4 +5,4 @@
 epsilon=1e-8)</code></pre><p>where <code>alpha</code> is the step length or learning parameter. <code>beta_mean</code> and <code>beta_var</code> are exponential decay parameters for the first and second moment estimates. Setting these closer to 0 makes past iterates matter less for the current step, and setting them closer to 1 emphasizes past iterates more. <code>epsilon</code> should rarely be changed; it exists only to avoid division by 0.</p><pre><code class="language-julia hljs">AdaMax(; alpha=0.002,
 beta_mean=0.9,
 beta_var=0.999,
-epsilon=1e-8)</code></pre><p>where <code>alpha</code> is the step length or learning parameter. <code>beta_mean</code> and <code>beta_var</code> are exponential decay parameters for the first and second moment estimates. Setting these closer to 0 makes past iterates matter less for the current step, and setting them closer to 1 emphasizes past iterates more.</p><h2 id="References"><a class="docs-heading-anchor" href="#References">References</a><a id="References-1"></a><a class="docs-heading-anchor-permalink" href="#References" title="Permalink"></a></h2><p>Kingma, Diederik P., and Jimmy Ba. &quot;Adam: A method for stochastic optimization.&quot; arXiv preprint arXiv:1412.6980 (2014).</p></article><nav class="docs-footer"><a class="docs-footer-prevpage" href="../particle_swarm/">« Particle Swarm</a><a class="docs-footer-nextpage" href="../cg/">Conjugate Gradient »</a><div class="flexbox-break"></div><p class="footer-message">Powered by <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> and the <a href="https://julialang.org/">Julia Programming Language</a>.</p></nav></div><div class="modal" id="documenter-settings"><div class="modal-background"></div><div class="modal-card"><header class="modal-card-head"><p class="modal-card-title">Settings</p><button class="delete"></button></header><section class="modal-card-body"><p><label class="label">Theme</label><div class="select"><select id="documenter-themepicker"><option value="auto">Automatic (OS)</option><option value="documenter-light">documenter-light</option><option value="documenter-dark">documenter-dark</option></select></div></p><hr/><p>This document was generated with <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> version 1.3.0 on <span class="colophon-date" title="Thursday 21 March 2024 13:54">Thursday 21 March 2024</span>. Using Julia version 1.10.2.</p></section><footer class="modal-card-foot"></footer></div></div></div></body></html>
+epsilon=1e-8)</code></pre><p>where <code>alpha</code> is the step length or learning parameter. <code>beta_mean</code> and <code>beta_var</code> are exponential decay parameters for the first and second moment estimates. Setting these closer to 0 makes past iterates matter less for the current step, and setting them closer to 1 emphasizes past iterates more.</p><h2 id="References"><a class="docs-heading-anchor" href="#References">References</a><a id="References-1"></a><a class="docs-heading-anchor-permalink" href="#References" title="Permalink"></a></h2><p>Kingma, Diederik P., and Jimmy Ba. &quot;Adam: A method for stochastic optimization.&quot; arXiv preprint arXiv:1412.6980 (2014).</p></article><nav class="docs-footer"><a class="docs-footer-prevpage" href="../particle_swarm/">« Particle Swarm</a><a class="docs-footer-nextpage" href="../cg/">Conjugate Gradient »</a><div class="flexbox-break"></div><p class="footer-message">Powered by <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> and the <a href="https://julialang.org/">Julia Programming Language</a>.</p></nav></div><div class="modal" id="documenter-settings"><div class="modal-background"></div><div class="modal-card"><header class="modal-card-head"><p class="modal-card-title">Settings</p><button class="delete"></button></header><section class="modal-card-body"><p><label class="label">Theme</label><div class="select"><select id="documenter-themepicker"><option value="auto">Automatic (OS)</option><option value="documenter-light">documenter-light</option><option value="documenter-dark">documenter-dark</option></select></div></p><hr/><p>This document was generated with <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> version 1.3.0 on <span class="colophon-date" title="Saturday 23 March 2024 00:50">Saturday 23 March 2024</span>. Using Julia version 1.10.2.</p></section><footer class="modal-card-foot"></footer></div></div></div></body></html>
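
For readers of the Adam/AdaMax page touched above, a minimal sketch of how these optimizers are invoked through Optim.jl's optimize function. The objective, starting point, and iteration budget below are illustrative assumptions, not part of this commit:

    using Optim

    # Rosenbrock test function (an illustrative choice, not from this commit)
    f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
    x0 = zeros(2)

    # Adam and AdaMax take the keyword arguments documented above; stochastic
    # gradient methods typically need a generous iteration budget.
    res_adam   = optimize(f, x0, Adam(alpha=0.001, beta_mean=0.9, beta_var=0.999),
                          Optim.Options(iterations=10_000))
    res_adamax = optimize(f, x0, AdaMax(alpha=0.002), Optim.Options(iterations=10_000))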
2 changes: 1 addition & 1 deletion dev/algo/brent/index.html

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion dev/algo/cg/index.html
@@ -42,4 +42,4 @@
 * stopped by an increasing objective: false
 * Reached Maximum Number of Iterations: false
 * Objective Calls: 53
-* Gradient Calls: 53</code></pre><p>We see that for this objective and starting point, <code>ConjugateGradient()</code> requires fewer gradient evaluations to reach convergence.</p><h2 id="References"><a class="docs-heading-anchor" href="#References">References</a><a id="References-1"></a><a class="docs-heading-anchor-permalink" href="#References" title="Permalink"></a></h2><ul><li>W. W. Hager and H. Zhang (2006) Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Transactions on Mathematical Software 32: 113-137.</li><li>W. W. Hager and H. Zhang (2013), The Limited Memory Conjugate Gradient Method. SIAM Journal on Optimization, 23, pp. 2150-2168.</li></ul></article><nav class="docs-footer"><a class="docs-footer-prevpage" href="../adam_adamax/">« Adam and AdaMax</a><a class="docs-footer-nextpage" href="../gradientdescent/">Gradient Descent »</a><div class="flexbox-break"></div><p class="footer-message">Powered by <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> and the <a href="https://julialang.org/">Julia Programming Language</a>.</p></nav></div><div class="modal" id="documenter-settings"><div class="modal-background"></div><div class="modal-card"><header class="modal-card-head"><p class="modal-card-title">Settings</p><button class="delete"></button></header><section class="modal-card-body"><p><label class="label">Theme</label><div class="select"><select id="documenter-themepicker"><option value="auto">Automatic (OS)</option><option value="documenter-light">documenter-light</option><option value="documenter-dark">documenter-dark</option></select></div></p><hr/><p>This document was generated with <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> version 1.3.0 on <span class="colophon-date" title="Thursday 21 March 2024 13:54">Thursday 21 March 2024</span>. Using Julia version 1.10.2.</p></section><footer class="modal-card-foot"></footer></div></div></div></body></html>
+* Gradient Calls: 53</code></pre><p>We see that for this objective and starting point, <code>ConjugateGradient()</code> requires fewer gradient evaluations to reach convergence.</p><h2 id="References"><a class="docs-heading-anchor" href="#References">References</a><a id="References-1"></a><a class="docs-heading-anchor-permalink" href="#References" title="Permalink"></a></h2><ul><li>W. W. Hager and H. Zhang (2006) Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Transactions on Mathematical Software 32: 113-137.</li><li>W. W. Hager and H. Zhang (2013), The Limited Memory Conjugate Gradient Method. SIAM Journal on Optimization, 23, pp. 2150-2168.</li></ul></article><nav class="docs-footer"><a class="docs-footer-prevpage" href="../adam_adamax/">« Adam and AdaMax</a><a class="docs-footer-nextpage" href="../gradientdescent/">Gradient Descent »</a><div class="flexbox-break"></div><p class="footer-message">Powered by <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> and the <a href="https://julialang.org/">Julia Programming Language</a>.</p></nav></div><div class="modal" id="documenter-settings"><div class="modal-background"></div><div class="modal-card"><header class="modal-card-head"><p class="modal-card-title">Settings</p><button class="delete"></button></header><section class="modal-card-body"><p><label class="label">Theme</label><div class="select"><select id="documenter-themepicker"><option value="auto">Automatic (OS)</option><option value="documenter-light">documenter-light</option><option value="documenter-dark">documenter-dark</option></select></div></p><hr/><p>This document was generated with <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> version 1.3.0 on <span class="colophon-date" title="Saturday 23 March 2024 00:50">Saturday 23 March 2024</span>. Using Julia version 1.10.2.</p></section><footer class="modal-card-foot"></footer></div></div></div></body></html>
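
The claim above, that ConjugateGradient() needs fewer gradient evaluations, can be checked with a sketch along these lines; the test function and the GradientDescent() baseline are assumptions for illustration, not taken from this commit:

    using Optim

    f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
    x0 = zeros(2)

    res_cg = optimize(f, x0, ConjugateGradient())
    res_gd = optimize(f, x0, GradientDescent())

    # g_calls reads the gradient-call counter reported in the summaries above.
    println("CG: ", Optim.g_calls(res_cg), " vs GD: ", Optim.g_calls(res_gd))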
2 changes: 1 addition & 1 deletion dev/algo/complex/index.html
@@ -62,4 +62,4 @@
 * Stopped by an increasing objective: false
 * Reached Maximum Number of Iterations: false
 * Objective Calls: 48
-* Gradient Calls: 48</code></pre><p>Automatic differentiation support for complex inputs may come when <a href="https://github.com/JuliaDiff/Capstan.jl">Capstan.jl</a> is ready.</p><h2 id="References"><a class="docs-heading-anchor" href="#References">References</a><a id="References-1"></a><a class="docs-heading-anchor-permalink" href="#References" title="Permalink"></a></h2><ul><li>Sorber, L., Barel, M. V., &amp; Lathauwer, L. D. (2012). Unconstrained optimization of real functions in complex variables. SIAM Journal on Optimization, 22(3), 879-898.</li><li>Kreutz-Delgado, K. (2009). The complex gradient operator and the CR-calculus. arXiv preprint arXiv:0906.4835.</li></ul></article><nav class="docs-footer"><a class="docs-footer-prevpage" href="../precondition/">« Preconditioners</a><a class="docs-footer-nextpage" href="../manifolds/">Manifolds »</a><div class="flexbox-break"></div><p class="footer-message">Powered by <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> and the <a href="https://julialang.org/">Julia Programming Language</a>.</p></nav></div><div class="modal" id="documenter-settings"><div class="modal-background"></div><div class="modal-card"><header class="modal-card-head"><p class="modal-card-title">Settings</p><button class="delete"></button></header><section class="modal-card-body"><p><label class="label">Theme</label><div class="select"><select id="documenter-themepicker"><option value="auto">Automatic (OS)</option><option value="documenter-light">documenter-light</option><option value="documenter-dark">documenter-dark</option></select></div></p><hr/><p>This document was generated with <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> version 1.3.0 on <span class="colophon-date" title="Thursday 21 March 2024 13:54">Thursday 21 March 2024</span>. Using Julia version 1.10.2.</p></section><footer class="modal-card-foot"></footer></div></div></div></body></html>
+* Gradient Calls: 48</code></pre><p>Automatic differentiation support for complex inputs may come when <a href="https://github.com/JuliaDiff/Capstan.jl">Capstan.jl</a> is ready.</p><h2 id="References"><a class="docs-heading-anchor" href="#References">References</a><a id="References-1"></a><a class="docs-heading-anchor-permalink" href="#References" title="Permalink"></a></h2><ul><li>Sorber, L., Barel, M. V., &amp; Lathauwer, L. D. (2012). Unconstrained optimization of real functions in complex variables. SIAM Journal on Optimization, 22(3), 879-898.</li><li>Kreutz-Delgado, K. (2009). The complex gradient operator and the CR-calculus. arXiv preprint arXiv:0906.4835.</li></ul></article><nav class="docs-footer"><a class="docs-footer-prevpage" href="../precondition/">« Preconditioners</a><a class="docs-footer-nextpage" href="../manifolds/">Manifolds »</a><div class="flexbox-break"></div><p class="footer-message">Powered by <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> and the <a href="https://julialang.org/">Julia Programming Language</a>.</p></nav></div><div class="modal" id="documenter-settings"><div class="modal-background"></div><div class="modal-card"><header class="modal-card-head"><p class="modal-card-title">Settings</p><button class="delete"></button></header><section class="modal-card-body"><p><label class="label">Theme</label><div class="select"><select id="documenter-themepicker"><option value="auto">Automatic (OS)</option><option value="documenter-light">documenter-light</option><option value="documenter-dark">documenter-dark</option></select></div></p><hr/><p>This document was generated with <a href="https://github.com/JuliaDocs/Documenter.jl">Documenter.jl</a> version 1.3.0 on <span class="colophon-date" title="Saturday 23 March 2024 00:50">Saturday 23 March 2024</span>. Using Julia version 1.10.2.</p></section><footer class="modal-card-foot"></footer></div></div></div></body></html>
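
Since automatic differentiation is not yet available for complex inputs, a gradient has to be supplied by hand. A minimal sketch, assuming the optimize(f, g!, x0, method) pattern Optim documents for real-valued functions of complex variables; the objective and the constant c are illustrative, not part of this commit:

    using Optim

    c = 1.0 + 2.0im
    f(z) = sum(abs2, z .- c)        # real-valued function of a complex vector

    function g!(G, z)
        G .= 2 .* (z .- c)          # hand-written gradient of |z - c|^2
        return G
    end

    res = optimize(f, g!, [0.0 + 0.0im], ConjugateGradient())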