Commit

Merge branch 'main' into dependabot/pip/tqdm-4.66.3
anaismoller authored Nov 1, 2024
2 parents ce3b48c + 6d986f6 commit a2de9d6
Showing 69 changed files with 5,257 additions and 3,714 deletions.
8 changes: 5 additions & 3 deletions .github/workflows/publish.yml
@@ -39,9 +39,11 @@ jobs:
# Uncomment if you need mpi
# - name: Set-up MPI
# uses: mpi4py/setup-mpi@v1

- name: Set-up Cuda Toolkit
run: sudo apt-get install nvidia-cuda-toolkit nvidia-cuda-toolkit-gcc

# - name: Set-up Cuda Toolkit
# run: |
# sudo apt-get update
# sudo apt-get install nvidia-cuda-toolkit nvidia-cuda-toolkit-gcc

- name: Set-up Poetry
uses: snok/install-poetry@v1
45 changes: 33 additions & 12 deletions .github/workflows/pull_request.yml
@@ -20,6 +20,13 @@ jobs:

run_tests:
runs-on: ubuntu-latest
strategy:
matrix:
include:
- conda_env: supernnova
conda_env_file: env/conda_env.yml
- conda_env: supernnova-cuda
conda_env_file: env/conda_gpu_env.yml

steps:

@@ -34,50 +41,64 @@ jobs:
# uses: mpi4py/setup-mpi@v1

- name: Set-up Cuda Toolkit
run: sudo apt-get install nvidia-cuda-toolkit nvidia-cuda-toolkit-gcc
run: |
sudo apt-get update
sudo apt-get install nvidia-cuda-toolkit nvidia-cuda-toolkit-gcc
- name: Set-up Poetry
uses: snok/install-poetry@v1
- name: Setup miniconda
uses: conda-incubator/setup-miniconda@v3
with:
virtualenvs-create: true
virtualenvs-in-project: true
installer-parallel: true
activate-environment: ${{matrix.conda_env}}
environment-file: ${{matrix.conda_env_file}}

- name: Set-up Python
uses: actions/setup-python@v4
with:
python-version: '3.11'
cache: 'poetry'
- name: Verify Environment
shell: bash -l {0}
run: |
conda info
conda list
- name: Verify cuda support
if: ${{ matrix.conda_env == 'supernnova-cuda' }}
shell: bash -l {0}
run: python env/verify_cuda_support.py

# Configure project
- name: Set project version
run: poetry version $(git describe --tags --abbrev=0)
shell: bash -l {0}
run: poetry version $(git describe --tags --match "v[0-9]*" --abbrev=0)

# Install the project (we need some of the tools installed here for linting etc.)
- name: Install the project
shell: bash -l {0}
run: poetry install --no-interaction --extras "docs dev"

# Enforce code formatting standards
- name: Enforce linting
shell: bash -l {0}
run: poetry run ruff .

- name: Enforce formatting
shell: bash -l {0}
run: poetry run black .

# Make sure the Poetry project is properly maintained
- name: Enforce maintenance of Poetry project
shell: bash -l {0}
run: |
poetry check
poetry lock --check
# Run tests
- name: Generate build
shell: bash -l {0}
run: poetry build

- name: Code tests
shell: bash -l {0}
run: poetry run pytest

- name: Documentation build test
shell: bash -l {0}
run: |
cd docs
poetry run make html
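
The new `Verify cuda support` step above runs `env/verify_cuda_support.py`, whose contents are not shown in this diff. A minimal sketch of such a check, assuming PyTorch provides the CUDA backend, could look like this:

```python
# Hypothetical sketch of env/verify_cuda_support.py (the real script is not
# shown in this diff). Assumes PyTorch is the CUDA backend: it checks that the
# installed torch build was compiled with CUDA and exits non-zero otherwise,
# which would fail the CI step.
import sys

import torch

if torch.version.cuda is None:
    sys.exit("This torch build was compiled without CUDA support.")

print(f"torch {torch.__version__} built against CUDA {torch.version.cuda}")
print(f"CUDA runtime available: {torch.cuda.is_available()}")
```
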
2 changes: 1 addition & 1 deletion .gitignore
@@ -7,7 +7,7 @@ supernnova/**/.ipynb_checkpoints
science_modules/**/*.pyc
rfsn/**/*.pyc
charnokmoss/**/*.pyc
*.csv
# *.csv
*.h5
.cache
/data
1 change: 1 addition & 0 deletions .pre-commit-config.yaml
@@ -12,6 +12,7 @@ repos:
hooks:
- id: check-merge-conflict
- id: check-yaml
exclude: archive/|tmp/
- id: check-added-large-files
- id: no-commit-to-branch
args: ['--branch', 'main'] # Commits to main only allowed VIA PR for this project
6 changes: 6 additions & 0 deletions Makefile
@@ -0,0 +1,6 @@
USERNAME = $(USER)
USER_ID = $(shell id -u)
USER_GID = $(shell id -g)

%:
@DOCKER_BUILDKIT=1 docker build -f env/Dockerfile --build-arg 'TARGET=$*' --build-arg 'USERNAME=$(USERNAME)' --build-arg 'USER_ID=${USER_ID}' --build-arg 'USER_GID=${USER_GID}' -t rnn-$* .
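
With this pattern rule, `make cpu` builds an image tagged `rnn-cpu` and `make gpu` builds `rnn-gpu` (matching the Docker instructions in the README below); the host user's name, UID and GID are forwarded as build arguments, presumably so that files written from the container keep host ownership.
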
137 changes: 26 additions & 111 deletions README.md
@@ -7,135 +7,50 @@
[![Build Status](https://travis-ci.org/supernnova/SuperNNova.svg?branch=master)](https://travis-ci.org/supernnova/SuperNNova)


### What is SuperNNova (SNN)

SuperNNova is an open-source photometric time-series classification framework.

The framework includes different RNN architectures (LSTM, GRU, Bayesian RNNs) and can be trained with simulations in `.csv` and `SNANA FITS` format. SNN is part of the [PIPPIN](https://github.com/dessn/Pippin) end-to-end cosmology pipeline.

You can train your own model for time-series classification (binary or multi-class) using photometry and additional features.


Please include the full citation if you use this material in your research: [A Möller and T de Boissière,
MNRAS, Volume 491, Issue 3, January 2020, Pages 4277–4293.](https://academic.oup.com/mnras/article-abstract/491/3/4277/5651173)

### Read the documentation
For the main branch:
[https://supernnova.readthedocs.io](https://supernnova.readthedocs.io/en/latest/)

The paper branch differs slightly from the master branch. Take a look at "changelog_paper_to_new_branch" or [Build the docs for this branch](#docs).

### Installation
Clone this repository (preferred)
```bash
git clone https://github.com/supernnova/supernnova.git
```
or install the pip module (check versioning)
```bash
pip install supernnova
```
and configure the environment using this [documentation](https://supernnova.readthedocs.io/en/latest/installation/python.html)

### Read the paper

Links to the publication: [MNRAS](https://academic.oup.com/mnras/article-abstract/491/3/4277/5651173), [ArXiv](https://arxiv.org/abs/1901.06384). All results quoted in this publication were produced using the branch "paper", which is frozen for reproducibility.
### Read the papers

Please include the full citation if you use this material in your research: [A Möller and T de Boissière,
MNRAS, Volume 491, Issue 3, January 2020, Pages 4277–4293.](https://academic.oup.com/mnras/article-abstract/491/3/4277/5651173)

To reproduce [Möller & de Boissière, 2019 MNRAS](https://academic.oup.com/mnras/article-abstract/491/3/4277/5651173), switch to the `paper` branch and build the documentation.

## Table of contents
1. [Repository overview](#overview)
2. [Getting Started](#start)
    0. [Use Poetry in new releases]
    1. [With Conda](#conda)
    2. [With Docker](#docker)
    3. [Usage](#usage)
3. [Reproduce paper](#paper)
4. [Pipeline Description](#pipeline)
5. [Running tests](#test)
6. [Build the docs](#docs)

## Repository overview <a name="overview"></a>

    ├── supernnova --> main module
        ├── data --> scripts to create the processed database
        ├── visualization --> data plotting scripts
        ├── training --> training scripts
        ├── validation --> validation scripts
        ├── utils --> utilities used throughout the module
    ├── tests --> unit tests to check data processing
    ├── sandbox --> WIP scripts

## Getting started <a name="start"></a>

### With Conda <a name="conda"></a>

    cd env

    # Create conda environment
    conda create --name <env> --file <conda_file_of_your_choice>

    # Activate conda environment
    source activate <env>
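
For reference, the CI matrix added in `.github/workflows/pull_request.yml` above uses `env/conda_env.yml` for the `supernnova` environment and `env/conda_gpu_env.yml` for the `supernnova-cuda` environment.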

### With Docker <a name="docker"></a>

    cd env

    # Build docker images
    make cpu # cpu image
    make gpu # gpu image (requires NVIDIA Drivers + nvidia-docker)

    # Launch docker container
    python launch_docker.py (--use_gpu to run GPU based container)

To reproduce the Dark Energy Survey analyses, use commit `fcf8584b64974ef7a238eac718e01be4ed637a1d`:
- [Möller et al. 2022 MNRAS](https://ui.adsabs.harvard.edu/abs/2022MNRAS.514.5159M/abstract)
- [Möller et al. 2024 MNRAS](https://ui.adsabs.harvard.edu/abs/2024MNRAS.533.2073M/abstract)
- [Vincenzi et al. 2023 MNRAS](https://ui.adsabs.harvard.edu/abs/2023MNRAS.518.1106V/abstract)
- [DES Collaboration 2024 ApJ](https://ui.adsabs.harvard.edu/abs/2024ApJ...973L..14D/abstract)

For more detailed instructions, check the full [setup instructions](https://supernnova.readthedocs.io/en/latest/installation/python.html).
To reproduce Fink analyses until 2024, use commit `fcf8584b64974ef7a238eac718e01be4ed637a1d` and check [Fink's GitHub](https://github.com/astrolabsoftware/fink-science).


## Usage <a name="usage"></a>

When cloning this repository:

    # Create data
    python run.py --data --dump_dir tests/dump --raw_dir tests/raw --fits_dir tests/fits

    # Train a baseline RNN
    python run.py --train_rnn --dump_dir tests/dump

    # Train a variational dropout RNN
    python run.py --train_rnn --model variational --dump_dir tests/dump

    # Train a Bayes By Backprop RNN
    python run.py --train_rnn --model bayesian --dump_dir tests/dump

    # Train a RandomForest
    python run.py --train_rf --dump_dir tests/dump

When using pip, a full example is available at [https://supernnova.readthedocs.io](https://supernnova.readthedocs.io/en/latest/):

    # Python
    import supernnova.conf as conf
    from supernnova.data import make_dataset

    # get config args
    args = conf.get_args()

    # create database
    args.data = True                 # conf: making new dataset
    args.dump_dir = "tests/dump"     # conf: where the dataset will be saved
    args.raw_dir = "tests/raw"       # conf: where raw photometry files are saved
    args.fits_dir = "tests/fits"     # conf: where salt2fits are saved
    settings = conf.get_settings(args)  # conf: set settings
    make_dataset.make_dataset(settings) # make dataset

## Reproduce paper results <a name="paper"></a>
Please change to the `paper` branch:

    python run_paper.py

## General pipeline description <a name="pipeline"></a>

- Parse raw data in FITS format
- Create processed database in HDF5 format
- Train Recurrent Neural Networks (RNN) or Random Forests (RF) to classify photometric lightcurves
- Validate on the test set (see the sketch below)
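
A minimal sketch chaining these steps via the documented `run.py` commands; the `--validate_rnn` flag is an assumption, so check `python run.py --help` for the exact validation option:

```python
# Sketch of the pipeline above, chaining the documented run.py commands.
# Assumption: the validation step is exposed as --validate_rnn.
import subprocess

steps = [
    # 1. Parse raw data and create the processed HDF5 database
    ["python", "run.py", "--data", "--dump_dir", "tests/dump",
     "--raw_dir", "tests/raw", "--fits_dir", "tests/fits"],
    # 2. Train a baseline RNN classifier
    ["python", "run.py", "--train_rnn", "--dump_dir", "tests/dump"],
    # 3. Validate the trained model on the test set (flag assumed)
    ["python", "run.py", "--validate_rnn", "--dump_dir", "tests/dump"],
]

for cmd in steps:
    subprocess.run(cmd, check=True)
```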


## Running tests with py.test <a name="tests"></a>

    PYTHONPATH=$PWD:$PYTHONPATH pytest -W ignore --cov supernnova tests


## Build docs <a name="docs"></a>
### Build docs <a name="docs"></a>

    cd docs && make clean && make html && cd ..
    firefox docs/_build/html/index.html


### ADACS
This package has been updated to a recent PyTorch version through the [ADACS Merit allocation program](https://adacs.org.au/merit-allocation-program) in 2023-2024.