Merge pull request #503 from materialsproject/explicit-warning-stacklevels


d80f17d unpin and update pre-commit hooks
3f607e1 add explicit stacklevel to warnings (fixes flake8 B028)
d3c690d fix dead dynamic wf doc link reported in openjournals/joss-reviews#5995 (comment)
7666ca6 add a link checker CI to catch dead doc links automatically.
janosh authored Dec 8, 2023
2 parents 8ffcf52 + 7666ca6 commit 3b7919a
Showing 12 changed files with 55 additions and 21 deletions.
29 changes: 29 additions & 0 deletions .github/workflows/link-check.yml
@@ -0,0 +1,29 @@
+name: Check Links
+
+on:
+  push:
+    branches: [main]
+  pull_request:
+    branches: [main]
+  workflow_dispatch:
+
+jobs:
+  check_links:
+    runs-on: ubuntu-latest
+
+    steps:
+      - name: Checkout code
+        uses: actions/checkout@v4
+
+      - name: Set up Python
+        uses: actions/setup-python@v5
+        with:
+          python-version: "3.11"
+
+      - name: Install dependencies
+        run: |
+          pip install pytest-check-links nbconvert
+      - name: Run link check
+        run: |
+          pytest --check-links **/**/*.md **/**/*.ipynb --check-links-ignore "https://www.gauss-centre.eu"
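The same check can be run locally before pushing. A minimal sketch driving pytest from Python with the exact flags used in the workflow (README.md is just an illustrative target; CI globs all Markdown files and notebooks):

    import pytest

    # Hypothetical local equivalent of the CI step above.
    pytest.main([
        "--check-links",  # enabled by the pytest-check-links plugin
        "README.md",  # illustrative; the workflow checks **/**/*.md and **/**/*.ipynb
        "--check-links-ignore", "https://www.gauss-centre.eu",
    ])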
18 changes: 9 additions & 9 deletions .pre-commit-config.yaml
@@ -3,7 +3,7 @@ default_language_version:
 exclude: "^src/atomate2/vasp/schemas/calc_types/"
 repos:
   - repo: https://github.com/charliermarsh/ruff-pre-commit
-    rev: v0.1.3
+    rev: v0.1.7
     hooks:
       - id: ruff
         args: [--fix]
@@ -23,18 +23,18 @@ repos:
         additional_dependencies: [black]
         exclude: ^(README.md|paper/paper.md)$
   - repo: https://github.com/pycqa/flake8
-    rev: 6.0.0
+    rev: 6.1.0
     hooks:
       - id: flake8
         entry: pflake8
         files: ^src/
         additional_dependencies:
-          - pyproject-flake8==6.0.0
-          - flake8-bugbear==22.12.6
-          - flake8-typing-imports==1.14.0
-          - flake8-docstrings==1.6.0
-          - flake8-rst-docstrings==0.3.0
-          - flake8-rst==0.8.0
+          - pyproject-flake8
+          - flake8-bugbear
+          - flake8-typing-imports
+          - flake8-docstrings
+          - flake8-rst-docstrings
+          - flake8-rst
   - repo: https://github.com/pre-commit/pygrep-hooks
     rev: v1.10.0
     hooks:
@@ -43,7 +43,7 @@ repos:
       - id: rst-directive-colons
       - id: rst-inline-touching-normal
   - repo: https://github.com/pre-commit/mirrors-mypy
-    rev: v1.6.1
+    rev: v1.7.1
     hooks:
       - id: mypy
         files: ^src/
4 changes: 2 additions & 2 deletions docs/tutorials/3-defining-jobs.ipynb
@@ -103,7 +103,7 @@
    "id": "fatal-bible",
    "metadata": {},
    "source": [
-    "Jobs also have an index. This tracks the number of times the job has been \"replaced\" (replacing is covered in detail in the [Dynamic and nested Flows tutorial](dynamic-flows)).\n"
+    "Jobs also have an index. This tracks the number of times the job has been \"replaced\" (replacing is covered in detail in the [Dynamic and nested Flows tutorial](5-dynamic-flows.html)).\n"
    ]
   },
   {
@@ -233,7 +233,7 @@
    "source": [
     "from jobflow.managers.local import run_locally\n",
     "\n",
-    "response = run_locally(add(1,2))"
+    "response = run_locally(add(1, 2))"
    ]
   },
2 changes: 1 addition & 1 deletion paper/paper.md
@@ -206,6 +206,6 @@ Naturally, the summary presented in this article constitutes only a small subset
 
 # Acknowledgements
 
-This work was primarily funded and intellectually led by the Materials Project, which is funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract no. DE-AC02-05-CH11231: Materials Project program KC23MP. A.S.R. acknowledges support via a Miller Research Fellowship from the Miller Institute for Basic Research in Science, University of California, Berkeley. J.G would like to acknowledge the Gauss Centre for Supercomputing e.V. ([www.gauss-centre.eu](http://www.gauss-centre.eu/)) for funding workflow-related developments by providing generous computing time on the GCS Supercomputer SuperMUC-NG at Leibniz Supercomputing Centre ([www.lrz.de](http://www.lrz.de/)) (Project pn73da). J.R. acknowledges support from the German Academic Scholarship Foundation (Studienstiftung). M.L.E. thanks the BEWARE scheme of the Wallonia-Brussels Federation for funding under the European Commission's Marie Curie-Skłodowska Action (COFUND 847587). G.P. and D.W. acknowledge Umicore for the financial support in developing the remote execution mode of jobflow. D.W. and G.M.R. acknowledge funding from the European Union’s Horizon 2020 research and innovation program under the grant agreement No 951786 (NOMAD CoE). A.M.G. is supported by EPSRC Fellowship EP/T033231/1.
+This work was primarily funded and intellectually led by the Materials Project, which is funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract no. DE-AC02-05-CH11231: Materials Project program KC23MP. A.S.R. acknowledges support via a Miller Research Fellowship from the Miller Institute for Basic Research in Science, University of California, Berkeley. J.G would like to acknowledge the Gauss Centre for Supercomputing e.V. (<https://www.gauss-centre.eu>) for funding workflow-related developments by providing generous computing time on the GCS Supercomputer SuperMUC-NG at Leibniz Supercomputing Centre ([www.lrz.de](http://www.lrz.de/)) (Project pn73da). J.R. acknowledges support from the German Academic Scholarship Foundation (Studienstiftung). M.L.E. thanks the BEWARE scheme of the Wallonia-Brussels Federation for funding under the European Commission's Marie Curie-Skłodowska Action (COFUND 847587). G.P. and D.W. acknowledge Umicore for the financial support in developing the remote execution mode of jobflow. D.W. and G.M.R. acknowledge funding from the European Union’s Horizon 2020 research and innovation program under the grant agreement No 951786 (NOMAD CoE). A.M.G. is supported by EPSRC Fellowship EP/T033231/1.
 
 # References
3 changes: 2 additions & 1 deletion src/jobflow/core/flow.py
@@ -274,7 +274,8 @@ def output(self, output: Any):
                 f"Flow '{self.name}' contains a Flow or Job as an output. "
                 f"Usually the Flow output should be the output of a Job or "
                 f"another Flow (e.g. job.output). If this message is "
-                f"unexpected then double check the outputs of your Flow."
+                f"unexpected then double check the outputs of your Flow.",
+                stacklevel=2,
             )
 
         # check if the jobs array contains all jobs needed for the references
3 changes: 2 additions & 1 deletion src/jobflow/core/job.py
@@ -359,7 +359,8 @@ def __init__(
                 f"Job '{self.name}' contains an Flow or Job as an input. "
                 f"Usually inputs should be the output of a Job or an Flow (e.g. "
                 f"job.output). If this message is unexpected then double check the "
-                f"inputs to your Job."
+                f"inputs to your Job.",
+                stacklevel=2,
             )
 
     def __repr__(self):
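The flow.py hunk above and this job.py hunk apply the same fix, as do the settings.py and graph.py hunks below: flake8's bugbear rule B028 asks for an explicit stacklevel. A standalone sketch of what stacklevel=2 changes (the validate helper is hypothetical, not jobflow code):

    import warnings

    def validate(value):  # hypothetical helper
        if value is None:
            # The default stacklevel=1 would point the warning at this line
            # inside validate(); stacklevel=2 attributes it to the caller,
            # which is the line a user can actually fix.
            warnings.warn("got None, falling back to a default", stacklevel=2)

    validate(None)  # the emitted warning now cites this call site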
2 changes: 1 addition & 1 deletion src/jobflow/core/store.py
@@ -282,7 +282,7 @@ def update(
 
         from jobflow.utils.find import find_key, update_in_dictionary
 
-        if save is None or save is True:
+        if save in (None, True):
            save = self.save
 
         save_keys = _prepare_save(save)
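A subtlety of this refactor: tuple membership tests each element with identity and then equality, so the two forms agree for the bool/None values save takes here but would diverge for a truthy integer (assuming save is only ever None or a bool, which the diff itself does not state):

    save = 1
    save is None or save is True  # False: pure identity checks
    save in (None, True)          # True: 1 == True in Python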
2 changes: 1 addition & 1 deletion src/jobflow/managers/local.py
@@ -156,7 +156,7 @@ def _run(root_flow):
             response, jobflow_stopped = _run_job(job, parents)
 
             encountered_bad_response = encountered_bad_response or response is None
-            if jobflow_stopped is True:
+            if jobflow_stopped:
                 return False
 
         return not encountered_bad_response
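Dropping "is True" here is safe provided _run_job returns a plain bool for the stop flag, as the name suggests; a non-bool truthy value would change behavior:

    jobflow_stopped = "stopped"  # hypothetical non-bool flag
    jobflow_stopped is True      # False
    bool(jobflow_stopped)        # True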
5 changes: 3 additions & 2 deletions src/jobflow/settings.py
@@ -107,7 +107,7 @@ class JobflowSettings(BaseSettings):
     JOB_STORE: JobStore = Field(
         default_factory=lambda: JobStore(
             MemoryStore(),
-            additional_stores=defaultdict(lambda: _default_additional_store()),
+            additional_stores=defaultdict(_default_additional_store),
         ),
         description="Default JobStore to use when running locally or using FireWorks. "
         "See the :obj:`JobflowSettings` docstring for more details on the "
@@ -137,7 +137,8 @@ def load_default_settings(cls, values):
         if Path(config_file_path).exists():
             if Path(config_file_path).stat().st_size == 0:
                 warnings.warn(
-                    f"An empty JobFlow config file was located at {config_file_path}"
+                    f"An empty JobFlow config file was located at {config_file_path}",
+                    stacklevel=2,
                 )
             else:
                 try:
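The defaultdict change drops a redundant lambda: default_factory accepts any zero-argument callable, so the function can be passed directly. A self-contained sketch (make_store stands in for jobflow's _default_additional_store):

    from collections import defaultdict

    def make_store():  # stand-in for _default_additional_store
        return {}

    d1 = defaultdict(lambda: make_store())  # extra lambda, one more indirection
    d2 = defaultdict(make_store)            # equivalent and more direct
    assert d1["x"] == d2["x"] == {}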
2 changes: 1 addition & 1 deletion src/jobflow/utils/enum.py
@@ -12,7 +12,7 @@ def __str__(self):
 
     def __eq__(self, other):
         """Compare to another enum for equality."""
-        if type(self) == type(other) and self.value == other.value:
+        if type(self) is type(other) and self.value == other.value:
             return True
         return str(self.value) == str(other)
 
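flake8 flags type(x) == type(y) as E721: since classes are singleton objects, an identity check is the precise comparison and cannot be intercepted by a metaclass overriding equality. A quick sketch:

    a, b = "x", "y"
    type(a) == type(b)  # True, but flagged by E721
    type(a) is type(b)  # True, and the idiomatic identity check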
2 changes: 1 addition & 1 deletion src/jobflow/utils/find.py
@@ -233,7 +233,7 @@ def get_root_locations(locations):
     >>> _get_root_locations([["a", "b"], ["a"], ["c", "d"]])
     [["a"], ["c", "d"]]
     """
-    sorted_locs = sorted(locations, key=lambda x: len(x))
+    sorted_locs = sorted(locations, key=len)
     root_locations = []
     for loc in sorted_locs:
         if any(loc[: len(rloc)] == rloc for rloc in root_locations):
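The lambda wrapped len without changing it, so the builtin can be passed as the sort key directly, as the function's own doctest illustrates:

    locs = [["a", "b"], ["a"], ["c", "d"]]
    sorted(locs, key=len)  # [['a'], ['a', 'b'], ['c', 'd']]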
4 changes: 3 additions & 1 deletion src/jobflow/utils/graph.py
@@ -46,7 +46,9 @@ def itergraph(graph: nx.DiGraph):
     subgraphs = [graph.subgraph(c) for c in nx.weakly_connected_components(graph)]
 
     if len(subgraphs) > 1:
-        warnings.warn("Some jobs are not connected, their ordering may be random")
+        warnings.warn(
+            "Some jobs are not connected, their ordering may be random", stacklevel=2
+        )
 
     for subgraph in subgraphs:
         yield from nx.topological_sort(subgraph)
