Glossary
Term | Our definition of it |
---|---|
Neural network | An analysis network whose structure was inspired by the networks found in the brain, namely interconnected nodes and the communication channels between them |
Notebook | A notebook refers to a Jupyter notebook. Notebooks allow Python code to be run interactively, in particular in the online environment provided by Google Colab |
Epoch | An epoch is a particular stage during the training of a network. Usually, a sequence of epochs is run, during each of which a subset of the training dataset is used to improve the network's performance. Typically, at the end of each epoch the loss function is calculated on the validation dataset (see the code sketch below this table) |
Training | The training of a network is the stage at which the network is presented with a so-called training dataset, from which it learns to perform the task at hand efficiently. During training, the network adjusts its own internal parameters and improves its performance. |
Training dataset | The dataset presented to the network during training, from which the network learns to perform its task. It usually represents the largest portion of the available data. |
Validation dataset | A portion of the data kept aside from training and used to evaluate the loss function (typically at the end of each epoch), in order to monitor how well the network generalises to data it has not been trained on. |
Loss function | A function that quantifies the difference between the network's prediction and the corresponding target (ground truth). Training aims to minimise the loss by adjusting the network's internal parameters. |
Batches (and number thereof) | A batch is the group of training examples (e.g. image patches) processed together in a single training step, after which the network parameters are updated. The batch size sets how many examples each batch contains. |
Patches (and number thereof) | Patches are smaller sub-images cropped from the training images, both to fit into memory and to increase the number of training examples. The number of patches sets how many such crops are generated. |
Steps (and number thereof) | A step is a single update of the network parameters, performed using one batch. The number of steps per epoch is typically the number of training patches divided by the batch size. |
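
To make the relationship between these terms concrete, here is a minimal, hypothetical Keras sketch (not taken from any of the ZeroCostDL4Mic notebooks): the data shapes, the toy two-layer model and all parameter values are illustrative assumptions only. It shows where the training dataset, validation dataset, loss function, epochs, batch size and steps appear in a typical training call.

```python
# Minimal sketch illustrating the glossary terms in a typical Keras training call.
import numpy as np
from tensorflow import keras

# Hypothetical data: 64 training patches and 8 validation patches of 64x64 pixels.
X_train = np.random.rand(64, 64, 64, 1).astype("float32")
Y_train = np.random.rand(64, 64, 64, 1).astype("float32")
X_val = np.random.rand(8, 64, 64, 1).astype("float32")
Y_val = np.random.rand(8, 64, 64, 1).astype("float32")

# A toy two-layer network standing in for a real architecture (U-Net, CARE, ...).
model = keras.Sequential([
    keras.layers.Conv2D(16, 3, padding="same", activation="relu",
                        input_shape=(64, 64, 1)),
    keras.layers.Conv2D(1, 3, padding="same"),
])

# The loss function quantifies the mismatch between prediction and target.
model.compile(optimizer="adam", loss="mse")

# One epoch = one pass over the training dataset.
# Each epoch is made of steps; each step processes one batch of patches
# (here 64 patches / batch size 8 = 8 steps per epoch).
# The loss on the validation dataset is reported at the end of every epoch.
model.fit(
    X_train, Y_train,
    epochs=5,
    batch_size=8,
    validation_data=(X_val, Y_val),
)
```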