
Commit

Fix outdated info and typos
twinkarma committed Feb 22, 2024
1 parent 4d645ea commit 5384f1c
Showing 3 changed files with 11 additions and 10 deletions.
slides/00_overview.md (13 changes: 7 additions & 6 deletions)
@@ -2,7 +2,9 @@
title: "0. Overview"
---

# RSES: Introduction to Deep Learning Course
# Deep Learning Demystified
## Foundations for Non-Computer Scientists


Twin Karmakharm

@@ -19,16 +21,15 @@ Twin Karmakharm
* Grant support
* Software optimisation, GPU and HPC
* Training, outreach and education activities
* Led by Dr. Paul Richmond
* Led by Dr. Romain Thomas
* Visit us at [https://rse.shef.ac.uk](https://rse.shef.ac.uk)
* [Apply for up to 12 days of RSE time on AI/HPC related project](https://rse.shef.ac.uk/collaboration/tier2)
---

### Course Materials

All course materials can be found at:
Course materials can be found at:

[https://rses-dl-course.github.io/](https://rses-dl-course.github.io/)
[https://rse.shef.ac.uk/dl-demystified-course](https://rse.shef.ac.uk/dl-demystified-course)


---
@@ -141,7 +142,7 @@ A platform to run lab code interactively
`Ctrl+Space` to get code suggestion

---

<!-- .slide: data-visibility="hidden" -->

### Practical Labs Using Jupyter Notebooks
A platform to run lab code interactively
slides/01_introduction_to_deep_learning.md (2 changes: 1 addition & 1 deletion)
@@ -375,7 +375,7 @@ There's likely a framework for a language that you know e.g.:
* Matlab - Deep Learning Toolbox

Or import from python
* 'keras' package for R - Used for R version of the course
* 'keras' package for R

---

slides/02_neural_networks.md (6 changes: 3 additions & 3 deletions)
@@ -189,7 +189,7 @@ Note:

### Error and Loss

We need a way good way to quantify how far our prediction is from our goal.
We need a good way to quantify how far our prediction is from our goal.

<object type="image/svg+xml" data="assets/img/neuralnetwork-initialisation.svg" style="background: white; width: 80%; margin-top: 1em">
</object>
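For illustration (not part of the original slides), a minimal sketch of one common choice, mean squared error, computed by hand; the sample values are made up:

```python
import numpy as np

# Mean squared error: the average of the squared differences between
# predictions and targets, one common way to quantify "how far off" we are.
def mse(predictions, targets):
    return np.mean((np.asarray(predictions) - np.asarray(targets)) ** 2)

print(mse([0.9, 0.2, 0.4], [1.0, 0.0, 0.0]))  # 0.07
```

Lower values mean the prediction is closer to the target; training tries to drive this number down.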
@@ -415,7 +415,7 @@ Too small learning rate ($\eta$) means our network takes longer to converge.

How do we know the 'optimal' learning rate?

We won't know until actually training the network! <!-- .element: class="fragment" -->
We won't know until we train the network! <!-- .element: class="fragment" -->

Start with the default values provided. <!-- .element: class="fragment" -->
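
As an illustration (not from the slides), a minimal Keras sketch where the learning rate is passed to the optimiser explicitly; the layer sizes and input shape are placeholder assumptions, and 0.001 happens to be Adam's default:

```python
from tensorflow import keras

# Placeholder model; the point is only where the learning rate is set.
model = keras.Sequential([
    keras.layers.Input(shape=(10,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),
])

# Start from the optimiser's default learning rate (0.001 for Adam),
# then adjust only if training diverges or converges too slowly.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001), loss="mse")
```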

@@ -460,7 +460,7 @@ The algorithm for computing the gradients efficiently and adjusting the weights
1. Calculate loss - compare prediction with target
1. Back propagation - find gradients for each weight and bias
1. Repeat for all samples
1. Average gradients of weights and bias then update <!-- .element: class="fragment" -->
1. Average the delta of weights and bias then update <!-- .element: class="fragment" -->
1. Start again from 1 <!-- .element: class="fragment" -->
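
A minimal sketch of that loop (an assumed toy example, not the course lab code) for a single linear neuron trained with mean squared error:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=20)
y_true = 3.0 * x + 0.5                           # target relationship to learn

w, b, eta = 0.0, 0.0, 0.1                        # initial weight, bias, learning rate
for epoch in range(200):                         # start again from 1
    y_pred = w * x + b                           # forward pass: make a prediction
    loss = np.mean((y_pred - y_true) ** 2)       # calculate loss vs. the target
    grad_w = np.mean(2 * (y_pred - y_true) * x)  # gradients averaged over
    grad_b = np.mean(2 * (y_pred - y_true))      # all samples
    w -= eta * grad_w                            # update weight and bias
    b -= eta * grad_b

print(w, b)  # approaches 3.0 and 0.5
```

In a real network the per-weight gradients come from back propagation rather than a hand-derived formula, but the update step is the same.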

---
