Merge branch 'master' into support-AHF-ramses
cphyc committed Apr 19, 2022
2 parents 49ae220 + f887de1 commit 0277954
Showing 205 changed files with 1,316 additions and 1,360 deletions.
5 changes: 2 additions & 3 deletions .github/workflows/integration-test.yaml
@@ -53,13 +53,13 @@ jobs:
if: steps.cache-test-datasets.outputs.cache-hit != 'true'
working-directory: test_tutorial_build
run: |
wget -T 60 -nv ftp://zuserver2.star.ucl.ac.uk/app/tangos/mini_tutorial_test.tar.gz
tar -xzvf mini_tutorial_test.tar.gz
- name: Build test database
working-directory: test_tutorial_build
run: bash build.sh

- uses: actions/upload-artifact@v2
with:
name: Tangos database
@@ -69,4 +69,3 @@ jobs:
working-directory: test_tutorial_build
run: tangos diff data.db reference_database.db --ignore-value-of gas_map gas_map_faceon gas_map_sideon uvi_image uvi_image_sideon uvi_image_faceon
# ignore-value-of above is a clunky fix for the use of 'approximate fast' images. Better solution would be good in the long term.

12 changes: 5 additions & 7 deletions .github/workflows/pypi.yaml
@@ -14,12 +14,12 @@ jobs:
environment: PyPI
steps:
- uses: actions/checkout@v2

- name: Determine version tag
run: |
echo "SETUP_VERSION=`python setup.py --version`" >> $GITHUB_ENV
echo "VERSION_TAG=`git describe --tags | cut -c 2-`" >> $GITHUB_ENV
- name: Verify version naming is consistent
run: |
if [ "${{ env.VERSION_TAG }}" == "${{ env.SETUP_VERSION }}" ]; then
@@ -30,17 +30,15 @@ jobs:
echo setup.py-derived version is ${{ env.SETUP_VERSION }}
exit 1;
fi
- name: Install twine
run: pip3 install twine

- name: Build source distribution
run: python3 setup.py sdist

- name: Twine upload
env:
TWINE_USERNAME: __token__
TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
run: twine upload dist/tangos-${{ env.VERSION_TAG }}.tar.gz


44 changes: 44 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,44 @@
minimum_pre_commit_version: "1.15.0"

ci:
autofix_prs: false

repos:
- repo: https://github.com/asottile/setup-cfg-fmt
rev: v1.20.1
hooks:
- id: setup-cfg-fmt
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.2.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: no-commit-to-branch
args: [--branch, master]
- id: check-shebang-scripts-are-executable
- id: check-executables-have-shebangs
- id: check-yaml
- repo: https://github.com/asottile/pyupgrade
rev: v2.32.0
hooks:
- id: pyupgrade
args: [--py37-plus]
- repo: https://github.com/pre-commit/pygrep-hooks
rev: v1.9.0
hooks:
- id: rst-backticks
- repo: https://github.com/nbQA-dev/nbQA
rev: 1.3.1
hooks:
- id: nbqa-pyupgrade
args: [--py37-plus]
- id: nbqa-isort

- repo: https://github.com/PyCQA/isort
rev: 5.10.1
hooks:
- id: isort
name: isort (python)
- id: isort
name: isort (cython)
types: [cython]
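
The hooks above run under the standard `pre-commit` tool. A minimal sketch of using it locally, assuming `pre-commit` is not already installed in your environment:

```bash
# Install the pre-commit tool and register the hooks from .pre-commit-config.yaml
pip install pre-commit
pre-commit install

# Optionally, run every hook against the whole repository once
pre-commit run --all-files
```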
2 changes: 1 addition & 1 deletion CODE_OF_CONDUCT.md
@@ -47,7 +47,7 @@ threatening, offensive, or harmful.

This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include acting as a representative at an
online or offline event. Representation of a project may be
further defined and clarified by project maintainers.

27 changes: 13 additions & 14 deletions README.md
@@ -3,38 +3,37 @@ Tangos - The agile numerical galaxy organisation system

[![Build Status](https://github.com/pynbody/tangos/actions/workflows/build-test.yaml/badge.svg?branch=master)](https://github.com/pynbody/tangos/actions) [![DOI](https://zenodo.org/badge/105990932.svg)](https://zenodo.org/badge/latestdoi/105990932)

_Tangos_ lets you build a database (along the lines of [Eagle](http://icc.dur.ac.uk/Eagle/database.php)
or [MultiDark](https://www.cosmosim.org/cms/documentation/projects/multidark-bolshoi-project/))
for your own cosmological and zoom simulations.

It's a modular system for Python 3.6+, capable of generating and querying databases. _Tangos_:

- is designed to store and manage results from your own analysis code;
- provides web and python interfaces;
- allows users to construct science-focussed queries, including across entire merger trees,
without requiring any knowledge of SQL;

When building databases, _tangos_:

- manages the process of populating the database with science data, including auto-parallelising
your analysis;
- can be customised to work with multiple python modules such as
[pynbody](http://pynbody.github.io/pynbody/) or [yt](http://yt-project.org) to
process raw simulation data;
- can use your favourite database as the underlying store, thanks to [sqlalchemy](https://www.sqlalchemy.org).
By default, _tangos_ uses the file-based database [sqlite](https://sqlite.org), but it is also routinely
tested against the server-based MySQL.
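
As an illustration of this last point, the database backend can be chosen via an environment variable; the sketch below assumes the `TANGOS_DB_CONNECTION` variable, and the MySQL URL shown is purely illustrative:

```bash
# Default backend: a file-based sqlite database (path shown is illustrative)
export TANGOS_DB_CONNECTION=~/tangos_data.db

# Any sqlalchemy connection URL can be used instead, e.g. a MySQL server
# (hypothetical credentials, for illustration only)
export TANGOS_DB_CONNECTION="mysql+pymysql://user:password@localhost/tangos"
```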


Getting started
---------------

For information on getting started refer to the [tutorials on our github pages](https://pynbody.github.io/tangos/).
These tutorials are also available in markdown format [within the tangos repository](docs/index.md).


Acknowledging the code
----------------------
When using _tangos_, please acknowledge it by citing the release paper:
Pontzen & Tremmel, 2018, ApJS 237, 2. [DOI 10.3847/1538-4365/aac832](https://doi.org/10.3847/1538-4365/aac832); [arXiv:1803.00010](https://arxiv.org/pdf/1803.00010.pdf). Optionally you can also cite the Zenodo DOI for the specific version of _tangos_ that you are using, which may be found [here](https://doi.org/10.5281/zenodo.1243070).

2 changes: 1 addition & 1 deletion bluewaters.md
@@ -1,6 +1,6 @@
Running the database on the BlueWaters computer
---------------------------------------------------
To run on BlueWaters, you will need to use the BlueWaters version of halo_database, currently located in the mjt_BW branch (this will likely be merged with the master branch eventually). This branch uses mpi4py rather than pypar, which is currently the default; the latter does not work well with the MPI installation on BlueWaters. All necessary code (except for pynbody) should be installed on BlueWaters under the module called bwpy. Just as with any other computer, simply clone the halo_database repository and change branches to mjt_BW.

To submit a job doing a database calculation, you need a submission script with the following general syntax. Notice that on BlueWaters aprun is used instead of mpirun, and we are required to load all necessary modules and define all necessary environment variables (as well as source the .bashrc file), because the environment is not automatically preserved on compute nodes.
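
The full example script is not reproduced in this excerpt; the sketch below is only illustrative, with placeholder resource requests and script name, reflecting only the aprun, module-loading and .bashrc points described above:

```bash
#!/bin/bash
#PBS -l nodes=2:ppn=32:xe         # resource request: placeholder values
#PBS -l walltime=04:00:00
#PBS -N halo_database

# The environment is not preserved on compute nodes, so rebuild it explicitly
source ~/.bashrc
module load bwpy                   # module providing python/mpi4py, as described above
export TANGOS_SIMULATION_FOLDER=~/simulations   # plus any other variables your run needs

cd $PBS_O_WORKDIR

# aprun replaces mpirun on BlueWaters; the python script name below is a
# placeholder for whatever database-building command you are running
aprun -n 64 python run_database_calculation.py
```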

19 changes: 10 additions & 9 deletions docs/Data exploration with python.ipynb
@@ -21,8 +21,9 @@
"outputs": [],
"source": [
"%matplotlib inline\n",
"import tangos\n",
"import pylab as p"
"import pylab as p\n",
"\n",
"import tangos"
]
},
{
@@ -397,7 +398,7 @@
"\n",
"p.plot(SFR_time_bins, SFR)\n",
"p.xlabel(\"Time/Gyr\")\n",
"p.ylabel(\"SFR/$M_{\\odot}\\,yr^{-1}$\")"
"p.ylabel(r\"SFR/$M_{\\odot}\\,yr^{-1}$\")"
]
},
{
@@ -509,7 +510,7 @@
"BH_accrate = halo.calculate('BH.BH_mdot_histogram')\n",
"p.plot(SFR_time_bins, BH_accrate)\n",
"p.xlabel(\"Time/Gyr\")\n",
"p.ylabel(\"BH accretion rate/$M_{\\odot}\\,yr^{-1}$\")"
"p.ylabel(r\"BH accretion rate/$M_{\\odot}\\,yr^{-1}$\")"
]
},
{
@@ -565,7 +566,7 @@
"\n",
"p.plot(mass*1e10,vmax,'k.')\n",
"p.loglog()\n",
"p.xlabel(\"$M/h^{-1} M_{\\odot}$\")\n",
"p.xlabel(r\"$M/h^{-1} M_{\\odot}$\")\n",
"p.ylabel(r\"$v_{max}/{\\rm km s^{-1}}$\")"
]
},
@@ -610,7 +611,7 @@
"p.semilogx()\n",
"p.ylim(-0.1,0.9)\n",
"p.xlim(1e12,1e15)\n",
"p.xlabel(\"$M/h^{-1} M_{\\odot}$\")\n",
"p.xlabel(r\"$M/h^{-1} M_{\\odot}$\")\n",
"p.ylabel(\"Fractional growth in mass\")"
]
},
@@ -754,7 +755,7 @@
"p.legend(loc=\"lower right\")\n",
"p.semilogy()\n",
"p.xlabel(\"t/Gyr\")\n",
"p.ylabel(\"SFR/$M_{\\odot}\\,yr^{-1}$\")"
"p.ylabel(r\"SFR/$M_{\\odot}\\,yr^{-1}$\")"
]
},
{
@@ -801,8 +802,8 @@
"p.plot(Mstar_no_AGN, Mstar_AGN, 'k.')\n",
"p.plot([1e6,1e11],[1e6,1e11],'k-',alpha=0.3)\n",
"p.loglog()\n",
"p.xlabel(\"$M_{\\star}/M_{\\odot}$ without AGN\")\n",
"p.ylabel(\"$M_{\\star}/M_{\\odot}$ with AGN\")"
"p.xlabel(r\"$M_{\\star}/M_{\\odot}$ without AGN\")\n",
"p.ylabel(r\"$M_{\\star}/M_{\\odot}$ with AGN\")"
]
}
],
2 changes: 1 addition & 1 deletion docs/_config.yml
@@ -1 +1 @@
theme: jekyll-theme-minimal
2 changes: 1 addition & 1 deletion docs/advanced.md
@@ -9,4 +9,4 @@ topics:
- [Parallelisation strategies](mpi.md)
- The [live calculation](live_calculation.md) system
- Writing [custom handlers](custom_input_handlers.md) for new file formats or libraries
- Understanding [time-histogram properties](histogram_properties.md)
14 changes: 7 additions & 7 deletions docs/black_holes_and_crossmatching.md
@@ -1,7 +1,7 @@
Tangos Tutorial – Black holes
=============================

This tutorial covers adding black holes and crossmatching simulations to make halo-to-halo comparisons.
It extends the [first steps with Changa+AHF tutorial](first_steps_changa+ahf.md)
by adding a simulation with black holes. The simulation is started from identical initial
conditions and therefore serves as a comparison to quantify the effect of AGN feedback.
@@ -12,15 +12,15 @@ Download, add and process the new simulation
First, download the
[raw simulation data](https://zenodo.org/record/5155467/files/tutorial_changa_blackholes.tar.gz?download=1) required
for this tutorial.
Unpack the tar file either in your home folder or in the folder that the `TANGOS_SIMULATION_FOLDER`
environment variable points to.

Next, refer back to the [first steps with Changa+AHF tutorial](first_steps_changa+ahf.md).
Follow all the steps there but replacing `tutorial_changa` with `tutorial_changa_blackholes`.

Note that if you are using Michael Tremmel's black hole implementation in Changa, you need to run his
pre-processing script to generate the black hole logs (such as `.shortened.orbit` and `.mergers`) from
the raw output logs.


@@ -47,7 +47,7 @@ tangos_add_bh --sims tutorial_changa_blackholes
```

This scans through the timesteps, adds black holes from each snapshot, and links them together using merger
information from changa's output `.mergers` file.

However no properties are associated with the black holes until you ask for them. Property calculations
can be applied to black holes (and other objects) in just the same way as halos, using the `tangos write`
@@ -60,14 +60,14 @@ tangos write BH_mass BH_mdot_histogram --for tutorial_changa_blackholes --type bh
Here
- `BH_mass` and `BH_mdot_histogram` are properties referring to, respectively, the mass of
the black hole at a given timestep and the recent accretion history. Note that `BH_mdot_histogram`
is a [histogram property](histogram_properties.md) that can be reassembled across time
in different ways, like `SFR_histogram`.
- `--for tutorial_changa_blackholes` specifies that we are only adding these properties to that
particular simulation
- `--type bh` is a new directive, indicating the writer should be applied to all black hole
objects in the simulation (rather than regular halos).

If you want to speed up the processes above, `tangos_add_bh` and `tangos write` can both
be [MPI parallelised](mpi.md).
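
As a rough sketch only (the backend option name and process count below are assumptions; consult the [MPI parallelisation](mpi.md) page for the actual syntax), a parallel run might look like:

```bash
# Launch under MPI; "--backend mpi4py" is an assumed option name and
# 4 processes an arbitrary choice -- see mpi.md for the real invocation
mpirun -np 4 tangos_add_bh --sims tutorial_changa_blackholes --backend mpi4py
mpirun -np 4 tangos write BH_mass BH_mdot_histogram --for tutorial_changa_blackholes --type bh --backend mpi4py
```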

Explore