Train model for docs using GAP (#236)
train model for docs
---------

Co-authored-by: Davide Tisi <[email protected]>
Co-authored-by: Philip Loche <[email protected]>
3 people authored Jun 7, 2024
1 parent 6baedb3 commit c50a1ff
Showing 9 changed files with 21 additions and 20 deletions.
2 changes: 0 additions & 2 deletions .gitignore
@@ -166,8 +166,6 @@ cython_debug/
# model output directories
outputs/
examples/basic_usage/*.xyz
!docs/static/*.pt
!examples/ase/*.pt

# sphinx gallery
docs/src/examples
4 changes: 4 additions & 0 deletions .readthedocs.yml
@@ -11,6 +11,9 @@ build:
tools:
python: "3.12"
rust: "1.75"
jobs:
pre_build:
- set -e && cd examples/ase && bash train.sh


# Build documentation in the docs/ directory with Sphinx
@@ -28,5 +31,6 @@ python:
- method: pip
path: .
extra_requirements:
- gap
- soap-bpnn
- requirements: docs/requirements.txt
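For reference, a sketch of how the relevant parts of .readthedocs.yml fit together after this change; it is stitched from the two hunks above, and any key not shown in the diff (such as build.os) is an assumption.

build:
  os: ubuntu-22.04           # assumed; not visible in the diff
  tools:
    python: "3.12"
    rust: "1.75"
  jobs:
    pre_build:
      # train the small GAP model used by the ASE example before the docs are built
      - set -e && cd examples/ase && bash train.sh

python:
  install:
    - method: pip
      path: .
      extra_requirements:
        - gap
        - soap-bpnn
    - requirements: docs/requirements.txt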
2 changes: 2 additions & 0 deletions docs/static/qm9/options.yaml
@@ -1,6 +1,8 @@
# architecture used to train the model
architecture:
name: experimental.soap_bpnn
training:
num_epochs: 5 # a very short training run

# Mandatory section defining the parameters for system and target data of the
# training set
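Taken together with the unchanged part of the file, the documentation's QM9 options now read roughly as below. The training_set block is an assumption (the diff only shows the architecture section); the "U0" target key is taken from the usage script further down.

# architecture used to train the model
architecture:
  name: experimental.soap_bpnn
  training:
    num_epochs: 5  # a very short training run

# Mandatory section defining the parameters for system and target data of the
# training set
training_set:
  systems: "qm9_reduced_100.xyz"  # assumed dataset file
  targets:
    energy:
      key: "U0"                   # assumed; matches the target mentioned in usage.sh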
Binary file removed examples/ase/model.pt
10 changes: 2 additions & 8 deletions examples/ase/options.yaml
@@ -1,13 +1,7 @@
device: cpu

architecture:
name: experimental.soap_bpnn
training:
batch_size: 16
num_epochs: 100
learning_rate: 0.01
name: experimental.gap

# Section defining the parameters for system and target data
# training set section
training_set:
systems: "ethanol_reduced_100.xyz"
targets:
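After switching the architecture to GAP, the full ASE example options file reads roughly as below. GAP is a kernel model fit in closed form, so the epoch, batch-size and learning-rate settings of the old SOAP-BPNN block simply go away; the contents of the targets block are an assumption, since the diff is truncated there.

device: cpu

architecture:
  name: experimental.gap  # hyperparameters are left at their defaults

# training set section
training_set:
  systems: "ethanol_reduced_100.xyz"
  targets:
    energy:
      key: "energy"  # assumed key name; not visible in the diff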
3 changes: 1 addition & 2 deletions examples/ase/train.sh
@@ -1,4 +1,3 @@
#!/bin/bash

metatrain train options.yaml

mtt train options.yaml
7 changes: 5 additions & 2 deletions examples/basic_usage/usage.sh
@@ -3,14 +3,17 @@
mtt train options.yaml

# The function saves the final model `model.pt` to the current output folder for later
# evaluation. All command line flags of the train sub-command can be listed via
# evaluation. An `extensions/` folder, which contains the compiled extensions for the model,
# might also be saved depending on the architecture.
# All command line flags of the train sub-command can be listed via

mtt train --help

# We now evaluate the model on the training dataset, where the first argument specifies the
# trained model and the second an options file containing the path of the dataset for evaluation.
# The extensions of the model, if any, can be specified via the `-e` flag.

mtt eval model.pt eval.yaml
mtt eval model.pt eval.yaml -e extensions/

# The evaluation command predicts those properties the model was trained against; here
# "U0". The predictions together with the systems have been written in a file named
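The eval.yaml used above is not part of this commit; a minimal sketch of such an evaluation options file, assuming the model is evaluated on the same QM9 file it was trained on (the exact schema is an assumption, see the metatrain documentation):

# dataset to run the trained model on
systems: "qm9_reduced_100.xyz"  # assumed path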
2 changes: 1 addition & 1 deletion src/metatrain/experimental/gap/model.py
@@ -209,7 +209,7 @@ def load_checkpoint(cls, path: Union[str, Path]) -> "GAP":
def export(self) -> MetatensorAtomisticModel:
capabilities = ModelCapabilities(
outputs=self.outputs,
atomic_types=self.dataset_info.atomic_types,
atomic_types=sorted(self.dataset_info.atomic_types),
interaction_range=self.hypers["soap"]["cutoff"],
length_unit=self.dataset_info.length_unit,
supported_devices=["cuda", "cpu"],
11 changes: 6 additions & 5 deletions tox.ini
@@ -136,12 +136,13 @@ deps =
-r docs/requirements.txt
allowlist_externals =
bash
extras = soap-bpnn # this model is used in the documentation
extras = # these models are used in the documentation
gap
soap-bpnn
commands =
# Run .sh scripts in the example folder.
#bash -c "set -e && cd {toxinidir}/examples/basic_usage && bash usage.sh"
# Disable slow training at runtime for ase example to speed up CI for now.
#bash -c "set -e && cd {toxinidir}/examples/ase && bash train_export.sh"
# Run example and usage scripts.
bash -c "set -e && cd {toxinidir}/examples/basic_usage && bash usage.sh"
bash -c "set -e && cd {toxinidir}/examples/ase && bash train.sh"
sphinx-build \
{posargs:-E} \
--builder html \
