Commit

Merge branch 'develop' of https://github.com/jcsda/spack-stack into feature/update_derecho
climbfuji committed Aug 17, 2023
2 parents b7f3b84 + 6b43ce7 commit fffaf05
Showing 7 changed files with 50 additions and 142 deletions.
1 change: 1 addition & 0 deletions configs/common/packages.yaml
@@ -67,6 +67,7 @@
version: ['1.1.0']
fms:
version: ['2023.01']
# when switching to 2023.02, need to add "+use_fmsio" to variants
variants: precision=32,64 +quad_precision +gfs_phys +openmp +pic constants=GFS build_type=Release
g2:
version: ['3.4.5']
5 changes: 4 additions & 1 deletion doc/source/AddingTestPackages.rst
@@ -21,4 +21,7 @@ To install in an additional environment within an official spack-stack space, si
To use the environment, access it in the same way as a regular spack-stack installation, i.e., add the directory provided by ``spack stack setup-meta-modules`` ending with '/modulefiles/Core' to your MODULEPATH variable. This will give access to the upstream modules needed while avoiding package conflicts; do not use the upstream environment in ``$MODULEPATH`` directly.
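
For example, a minimal sketch (the environment path below is illustrative; use the directory reported by ``spack stack setup-meta-modules``):

.. code-block:: console

   # illustrative path only
   module use /path/to/spack-stack/envs/my-test-env/install/modulefiles/Core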

.. note::
The ``--upstream`` option for the ``spack stack create env`` command adds a specified Spack/spack-stack installation path as an upstream environment in the resulting spack.yaml, and can be invoked multiple times in order to include multiple upstream environments. The command will *give a warning but not fail* if an invalid directory is provided (either because it does not end with the typical 'install/' directory name, or because it does not exist). Be mindful of these warnings, and if the path does not exist, check for typos and make sure that you are using the path for the correct system from the table in :numref:`Section %s <Preconfigured_Sites>`.
The ``--upstream`` option for the ``spack stack create env`` command adds a specified Spack/spack-stack installation path as an upstream environment in the resulting spack.yaml, and can be invoked multiple times in order to include multiple upstream environments. The command will *give a warning but not fail* if an invalid directory is provided (either because it does not end with the typical ``install`` directory name, or because it does not exist). Be mindful of these warnings, and if the path does not exist, check for typos and make sure that you are using the path for the correct system from the table in :numref:`Section %s <Preconfigured_Sites>`.
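
For illustration, a hedged sketch of such an invocation (the upstream path is an example; other ``spack stack create env`` options are omitted):

.. code-block:: console

   # hypothetical example: reuse an existing spack-stack installation as an upstream
   spack stack create env --upstream /path/to/spack-stack-1.0.0/envs/jedi-ufs/install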

.. note::
   More details on chaining spack environments and a few words of caution can be found in the `Spack documentation <https://spack.readthedocs.io/en/latest/chain.html?highlight=chaining%20spack%20installations>`_. Those words of caution need to be taken seriously, especially those referring to not deleting modulefiles and dependencies in the upstream spack environment (if one has permission to do so)!
6 changes: 3 additions & 3 deletions doc/source/KnownIssues.rst
@@ -21,7 +21,7 @@ General

4. Installation of duplicate packages ``ecbuild``, ``hdf5``

One reason for this is an external ``cmake@3.20`` installation, which confuses the concretizer when building a complex environment such as the ``skylab-dev`` or ```jedi-ufs-all`` environment. For certain packages (and thus their dependencies), a newer version than ``[email protected]`` is required, for others ``[email protected]`` works, and spack then thinks that it needs to build two identical versions of the same package with different versions of ``cmake``. The solution is to remove any external ``[email protected]`` package (and best also earlier versions) in the site config and run the concretization step again. Another reason on Ubuntu 20 is the presence of external ``openssl`` packages, which should be removed before re-running the concretization step.
   One reason for this is an external ``cmake@3.20`` installation, which confuses the concretizer when building a complex environment such as the ``skylab-dev`` or ``unified-dev`` environment. For certain packages (and thus their dependencies), a newer version than ``[email protected]`` is required, for others ``[email protected]`` works, and spack then thinks that it needs to build two identical versions of the same package with different versions of ``cmake``. The solution is to remove any external ``[email protected]`` package (and ideally also earlier versions) in the site config and run the concretization step again. Another reason on Ubuntu 20 is the presence of external ``openssl`` packages, which should be removed before re-running the concretization step.
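
   As an illustrative sketch (paths and the exact layout of the site config vary by system), one can check for such an external entry and re-run the concretization after removing it:

   .. code-block:: console

      # illustrative only: look for an external cmake entry in the environment's site config
      grep -n -A3 "cmake" envs/my-test-env/site/packages.yaml
      # after removing the external [email protected] (and older) entries, concretize again
      spack concretize --force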

5. Installation of duplicate package ``nco``

@@ -111,9 +111,9 @@ macOS

This error occurs on macOS Monterey with ``mpich-3.4.3`` installed via Homebrew when trying to build the jedi bundles that use ``ecbuild``. The reason was that the C compiler flag ``-fgnu89-inline`` from ``/usr/local/Cellar/mpich/3.4.3/lib/pkgconfig/mpich.pc`` was added to the C++ compiler flags by ecbuild. The solution was to set ``CC=mpicc FC=mpif90 CXX=mpicxx`` when calling ``ecbuild`` for those bundles. Note that it is recommended to install ``mpich`` or ``openmpi`` with spack-stack, not with Homebrew.
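
   A minimal sketch of that workaround (the bundle source path is illustrative):

   .. code-block:: console

      # illustrative: force the MPI compiler wrappers when configuring a jedi bundle with ecbuild
      CC=mpicc FC=mpif90 CXX=mpicxx ecbuild /path/to/jedi-bundle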

2. Installation of ``poetry`` using ``pip3`` or test with ``python3`` fails
2. Installation of ``gdal`` fails with error ``xcode-select: error: tool 'xcodebuild' requires Xcode, but active developer directory '/Library/Developer/CommandLineTools' is a command line tools instance``.

This can happen when multiple versions of Python were installed with Homebrew and ``pip3``/``python3`` point to different versions. Run ``brew doctor`` and check if there are issues with Python not being properly linked. Follow the instructions given by ``brew``, if applicable.
If this happens, install the full ``Xcode`` application in addition to the Apple command line utilities, and switch ``xcode-select`` over with ``sudo xcode-select -s /Applications/Xcode.app/Contents/Developer`` (change the path if you installed Xcode somewhere else).
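
   A short sketch of that fix (assuming Xcode is installed in the default location):

   .. code-block:: console

      sudo xcode-select -s /Applications/Xcode.app/Contents/Developer
      # verify which developer directory is now active
      xcode-select -p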

3. Error ``AttributeError: Can't get attribute 'Mark' on <module 'ruamel.yaml.error' from ...`` when running ``spack install``

110 changes: 6 additions & 104 deletions doc/source/MaintainersSection.rst
@@ -25,16 +25,18 @@ Miniconda (legacy)

miniconda can be used to provide a basic version of Python that spack-stack uses to support its Python packages. This is not recommended on configurable systems (user workstations and laptops using GNU compiler) where Python gets installed by spack. But any system using Intel compilers with spack-stack will need an external Python to build ecflow with Python bindings (because ecflow requires a boost serialization function that does **not** work with Intel, a known yet ignored bug), and then both Python and ecflow are presented to spack as external packages. Often, it is possible to use the default (OS) Python if new enough (3.9+), or a module provided by the system administrators. If none of this works, use the following instructions to install a basic Python interpreter using miniconda:

The following is for the example of `miniconda_ver="py39_4.12.0"` (for which `python_ver=3.9.12`) and `platform="MacOSX-x86_64"` or `platform="Linux-x86_64"`
The following is for the example of ``miniconda_ver="py39_4.12.0"`` (for which ``python_ver=3.9.12``) and ``platform="MacOSX-x86_64"`` or ``platform="Linux-x86_64"``

.. code-block:: console

   cd /path/to/top-level/spack-stack/
   mkdir -p miniconda-${python_ver}/src
   cd miniconda-${python_ver}/src
   wget https://repo.anaconda.com/miniconda/Miniconda3-${miniconda_ver}-${platform}.sh
   sh Miniconda3-${miniconda_ver}-${platform}.sh -u -b -p /path/to/top-level/spack-stack/miniconda-${python_ver}
   eval "$(/path/to/top-level/spack-stack/miniconda-${python_ver}/bin/conda shell.bash hook)"
   conda install -y -c conda-forge libpython-static

After the successful installation, create the modulefile ``/path/to/top-level/spack-stack/modulefiles/miniconda/${python_ver}`` from the template ``doc/modulefile_templates/miniconda`` and update ``MINICONDA_PATH`` and the Python version in this file.
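
A hedged sketch of that step (paths follow the example layout above):

.. code-block:: console

   mkdir -p /path/to/top-level/spack-stack/modulefiles/miniconda
   cp doc/modulefile_templates/miniconda \
      /path/to/top-level/spack-stack/modulefiles/miniconda/${python_ver}
   # edit the copied file and adjust MINICONDA_PATH and the Python version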

.. _MaintainersSection_Qt5:
@@ -819,57 +821,7 @@ The simplest case of adding new packages that are available in spack-stack is de
Chaining spack-stack installations
----------------------------------

Chaining spack-stack installations is a powerful way to test adding new packages without affecting the existing packages. The idea is to define one or more upstream spack installations that the environment can use as dependencies. One possible way to do this is:

1. Mirror the environment config of the upstream repository, i.e. copy the entire directory without the ``install`` and ``.spack-env`` directories and without ``spack.lock``. For example:

.. code-block:: console

   rsync -av --exclude='install' --exclude='.spack-env' --exclude='spack.lock' \
       envs/jedi-ufs/ \
       envs/jedi-ufs-chain-test/

2. Edit ``envs/jedi-ufs-chain-test/spack.yaml`` and add an upstream configuration entry directly under the ``spack:`` config so that the contents look like:

.. code-block:: console

   spack:
     upstreams:
       spack-instance-1:
         install_tree: /path/to/spack-stack-1.0.0/envs/jedi-ufs/install
     concretizer:
       unify: when_possible
     ...

3. Activate the environment
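
For example, a hedged sketch (the exact activation command may vary with the spack version; environment directory as created above):

.. code-block:: console

   spack env activate -p envs/jedi-ufs-chain-test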

4. Install the new packages, for example:

.. code-block:: console

   spack install -v --reuse [email protected]+debug

5. Create modulefiles

.. code-block:: console

   spack module [lmod|tcl] refresh

6. When using ``tcl`` module files, run the ``spack stack setup-meta-modules`` script. This is not needed when using ``lmod`` modulefiles, because the meta modules in ``/path/to/spack-stack-1.0.0/envs/jedi-ufs-chain-test/install/modulefiles/Core`` will be ignored entirely.

To use the chained spack environment, first load the usual modules from the upstream spack environment. Then add the full path to the newly created modules manually, ignoring the meta modules (``.../Core``), for example:

.. code-block:: console

   module use /path/to/spack-stack-1.0.0/envs/jedi-ufs-chain-test/install/modulefiles/openmpi/4.1.3/apple-clang/13.1.6

7. Load the newly created modules. When using ``tcl`` module files, make sure that conflicting modules are unloaded (``lmod`` takes care of this).

.. note::
   After activating the chained environment, ``spack find`` unfortunately does not show the packages installed in the upstream environment.

.. note::
More details and a few words of caution can be found in the `Spack documentation <https://spack.readthedocs.io/en/latest/chain.html?highlight=chaining%20spack%20installations>`_. Those words of caution need to be taken seriously, especially those referring to not deleting modulefiles and dependencies in the upstream spack environment (if having permissions to do so)!
Chaining spack-stack installations is a powerful way to test adding new packages without affecting the existing packages. The idea is to define one or more upstream spack installations that the environment can use as dependencies. This is described in detail in :numref:`Section %s <Add_Test_Packages>`.

----------------------------------------
Testing/adding packages outside of spack
@@ -906,53 +858,3 @@ Python packages can be added in various ways:
.. note::
   Users are equally strongly advised not to use ``conda`` or ``miniconda`` in combination with Python modules provided by spack-stack, and not to install packages other than ``poetry`` in the basic ``miniconda`` installation for spack-stack (if using such a setup).

.. _MaintainersSection_Directory_Layout:

==============================
Recommended Directory Layout
==============================

To support multiple installs, it is recommended to use ``bootstrap.sh`` to set up Miniconda and create a standard directory layout.

After running ``bootstrap.sh -p <prefix>``, the prefix will contain the following directories:

* apps - Externally installed pre-requisites such as Miniconda and git-lfs.
* modulefiles - External modules such as Miniconda that are not tied to Spack.
* src - Prerequisite and spack-stack sources.
* envs - Spack environment installation location.

A single checkout of Spack can support multiple environments. To differentiate them, spack-stack sources in ``src`` and the corresponding environments in ``envs`` should be grouped by major version.

For example, major versions of spack-stack v1.x.y should be checked out in the ``src/spack-stack`` directory as ``v1``, and each corresponding environment should be installed in ``envs/v1``.
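
For illustration, a hedged sketch of such a checkout (the prefix is a placeholder; ``--recursive`` is assumed so that the bundled spack submodule is fetched as well):

.. code-block:: console

   mkdir -p <prefix>/src/spack-stack
   git clone --recursive https://github.com/jcsda/spack-stack.git <prefix>/src/spack-stack/v1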

.. code-block:: console

   spack-stack
   ├── apps
   │   └── miniconda
   │       └── py39_4.12.0
   ├── envs
   │   └── v1
   │       ├── jedi-ufs-all
   │       └── skylab-1.0.0
   ├── modulefiles
   │   └── miniconda
   │       └── py39_4.12.0
   └── src
       ├── miniconda
       │   └── py39_4.12.0
       │       └── Miniconda3-py39_4.12.0-MacOSX-x86_64.sh
       └── spack-stack
           └── v1
               ├── envs
               │   ├── jedi-ufs-all
               │   └── skylab-1.0.0

The install location can be set from the command line with:

.. code-block:: console

   spack config add "config:install_tree:root:<prefix>/envs/v1/jedi-ufs-all"
   spack config add "modules:default:roots:lmod:<prefix>/envs/v1/jedi-ufs-all/modulefiles"