diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml
index c764f37318..966f2b2768 100644
--- a/.github/workflows/main.yml
+++ b/.github/workflows/main.yml
@@ -27,11 +27,6 @@ jobs:
python -m pip install coverage
coverage run --source=reframe ./test_reframe.py
coverage report -m
- - name: Upload Coverage to Codecov
- if: matrix.python-version == '3.8'
- uses: codecov/codecov-action@v4
- with:
- fail_ci_if_error: true
unittest-py36:
runs-on: ubuntu-20.04
@@ -86,7 +81,7 @@ jobs:
run: |
docker run reframe-${{ matrix.modules-version }}:latest
- tutorialtest:
+ eb-spack-howto:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
@@ -99,7 +94,7 @@ jobs:
- name: Build Image for Tutorial Tests
run: |
echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u $ --password-stdin
- docker build -f ci-scripts/dockerfiles/tutorials.dockerfile -t reframe-tutorials:latest .
+ docker build -f ci-scripts/dockerfiles/eb-spack-howto.dockerfile -t reframe-tutorials:latest .
docker logout
- name: Run Tutorial Tests
run: |
diff --git a/.github/workflows/test-flux.yaml b/.github/workflows/test-flux.yaml
index b6f01a48d4..13520b4119 100644
--- a/.github/workflows/test-flux.yaml
+++ b/.github/workflows/test-flux.yaml
@@ -38,6 +38,6 @@ jobs:
run: |
export PATH=$PWD/bin:$PATH
which reframe
- flux start reframe -c tutorials/flux -C tutorials/flux/settings.py -l
- flux start reframe -c tutorials/flux -C tutorials/flux/settings.py --run
- flux start python3 ./test_reframe.py --rfm-user-config=tutorials/flux/settings.py -vvvv
+ flux start reframe -c examples/howto/flux -C examples/howto/flux/settings.py -l
+ flux start reframe -c examples/howto/flux -C examples/howto/flux/settings.py -r
+ flux start python3 ./test_reframe.py --rfm-user-config=examples/howto/flux/settings.py -vvvv
diff --git a/ci-scripts/dockerfiles/tutorials.dockerfile b/ci-scripts/dockerfiles/eb-spack-howto.dockerfile
similarity index 80%
rename from ci-scripts/dockerfiles/tutorials.dockerfile
rename to ci-scripts/dockerfiles/eb-spack-howto.dockerfile
index 1bce7487a6..90e4aeb758 100644
--- a/ci-scripts/dockerfiles/tutorials.dockerfile
+++ b/ci-scripts/dockerfiles/eb-spack-howto.dockerfile
@@ -37,4 +37,4 @@ RUN echo '. /usr/local/lmod/lmod/init/profile && . /home/rfmuser/spack/share/spa
ENV BASH_ENV /home/rfmuser/setup.sh
-CMD ["/bin/bash", "-c", "./bin/reframe -r -C tutorials/config/lmodsys.py -R -c tutorials/build_systems"]
+CMD ["/bin/bash", "-c", "./bin/reframe --system=tutorialsys -r -C examples/tutorial/config/baseline_modules.py -R -c examples/tutorial/easybuild/eb_test.py -c examples/tutorial/spack/spack_test.py"]
diff --git a/docs/_static/img/reframe-system-arch.svg b/docs/_static/img/reframe-system-arch.svg
new file mode 100644
index 0000000000..4bb51a5bac
--- /dev/null
+++ b/docs/_static/img/reframe-system-arch.svg
@@ -0,0 +1,3 @@
+
+
+
\ No newline at end of file
diff --git a/docs/_static/img/test-naming.svg b/docs/_static/img/test-naming.svg
new file mode 100644
index 0000000000..9934cfb73b
--- /dev/null
+++ b/docs/_static/img/test-naming.svg
@@ -0,0 +1,3 @@
+
+
+
\ No newline at end of file
diff --git a/docs/config_reference.rst b/docs/config_reference.rst
index cbe7c8ab26..47f66a013a 100644
--- a/docs/config_reference.rst
+++ b/docs/config_reference.rst
@@ -2,7 +2,7 @@
Configuration Reference
***********************
-ReFrame's behavior can be configured through its configuration file (see :doc:`configure`), environment variables and command-line options.
+ReFrame's behavior can be configured through its configuration file, environment variables and command-line options.
An option can be specified via multiple paths (e.g., a configuration file parameter and an environment variable), in which case command-line options precede environment variables, which in turn precede configuration file options.
This section provides a complete reference guide of the configuration options of ReFrame that can be set in its configuration file or specified using environment variables.
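The precedence rule above (command line over environment variables over configuration file) can be sketched with a small helper. This is an illustration only; the ``effective_value`` function is hypothetical and not part of ReFrame's API:

```python
def effective_value(cli_opt, env_var, config_file, default=None):
    # Command-line options precede environment variables,
    # which in turn precede configuration-file options.
    for value in (cli_opt, env_var, config_file):
        if value is not None:
            return value
    return default

print(effective_value(None, '10', '5'))  # '10' (env var beats config file)
print(effective_value('2', '10', '5'))   # '2'  (CLI option beats everything)
```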
@@ -115,7 +115,7 @@ System Configuration
:required: Yes
A list of hostname regular expression patterns in Python `syntax `__, which will be used by the framework in order to automatically select a system configuration.
- For the auto-selection process, see `here `__.
+ For the auto-selection process, check the configuration of the :attr:`~config.autodetect_methods` option.
.. py:attribute:: systems.max_local_jobs
@@ -244,6 +244,8 @@ System Configuration
This option is broken in 4.0.
+.. _system-partition-configuration:
+
System Partition Configuration
==============================
@@ -484,7 +486,7 @@ System Partition Configuration
.. versionadded:: 4.0.0
ReFrame also allows you to register your own custom launchers simply by defining them in the configuration.
- You can follow a small tutorial `here `__.
+ You can follow a small tutorial :ref:`here `.
.. py:attribute:: systems.partitions.access
@@ -564,7 +566,7 @@ System Partition Configuration
:default: ``8``
The maximum number of concurrent regression tests that may be active (i.e., not completed) on this partition.
- This option is relevant only when ReFrame executes with the `asynchronous execution policy `__.
+ This option is relevant only when ReFrame executes with the :ref:`asynchronous execution policy `.
.. py:attribute:: systems.partitions.prepare_cmds
@@ -582,7 +584,7 @@ System Partition Configuration
:required: No
:default: ``[]``
- A list of job scheduler `resource specification `__ objects.
+ A list of job scheduler `resource specification <#custom-job-scheduler-resources>`__ objects.
.. py:attribute:: systems.partitions.processor
@@ -591,7 +593,24 @@ System Partition Configuration
:default: ``{}``
Processor information for this partition stored in a `processor info object <#processor-info>`__.
- If not set, ReFrame will try to auto-detect this information (see :ref:`proc-autodetection` for more information).
+ If not set, ReFrame will try to determine this information as follows:
+
+ #. If the processor configuration metadata file in ``~/.reframe/topology/{system}-{part}/processor.json`` exists, the topology information is loaded from there.
+ These files are generated automatically by ReFrame from previous runs.
+
+ #. If the corresponding metadata files are not found, the processor information will be auto-detected.
+ If the system partition is local (i.e., ``local`` scheduler + ``local`` launcher), the processor information is auto-detected unconditionally and stored in the corresponding metadata file for this partition.
+ If the partition is remote, ReFrame will not try to auto-detect it unless the :envvar:`RFM_REMOTE_DETECT` or the :attr:`general.remote_detect` configuration option is set.
+ In that case, the steps to auto-detect the remote processor information are the following:
+
+ a. ReFrame creates a fresh clone of itself in a temporary directory created under ``.`` by default.
+ This temporary directory prefix can be changed by setting the :envvar:`RFM_REMOTE_WORKDIR` environment variable.
+ b. ReFrame changes to that directory and launches a job that will first bootstrap the fresh clone and then run that clone with ``{launcher} ./bin/reframe --detect-host-topology=topo.json``.
+ The :option:`--detect-host-topology` option causes ReFrame to detect the topology of the current host,
+ which in this case would be one of the remote compute nodes.
+
+ In case of errors during auto-detection, ReFrame will simply issue a warning and continue.
+
.. versionadded:: 3.5.0
@@ -703,7 +722,7 @@ ReFrame can launch containerized applications, but you need to configure properl
Custom Job Scheduler Resources
==============================
-ReFrame allows you to define custom scheduler resources for each partition that you can then transparently access through the :attr:`~reframe.core.pipeline.RegressionTest.extra_resources` attribute of a regression test.
+ReFrame allows you to define custom scheduler resources for each partition that can then be transparently accessed through the :attr:`~reframe.core.pipeline.RegressionTest.extra_resources` attribute of a test or from an environment.
.. py:attribute:: systems.partitions.resources.name
@@ -967,6 +986,18 @@ They are associated with `system partitions <#system-partition-configuration>`__
It first looks for definitions for the current partition, then for the containing system and, finally, for global definitions (the ``*`` pseudo-system).
+.. py:attribute:: environments.resources
+
+ :required: No
+ :default: ``{}``
+
+ Scheduler resources associated with this environment.
+
+ This is the equivalent of a test's :attr:`~reframe.core.pipeline.RegressionTest.extra_resources`.
+
+ .. versionadded:: 4.6
+
+
.. _logging-config-reference:
Logging Configuration
@@ -1097,6 +1128,7 @@ All logging handlers share the following set of common attributes:
``%(check_executable)s``, The value of the :attr:`~reframe.core.pipeline.RegressionTest.executable` attribute.
``%(check_executable_opts)s``, The value of the :attr:`~reframe.core.pipeline.RegressionTest.executable_opts` attribute.
``%(check_extra_resources)s``, The value of the :attr:`~reframe.core.pipeline.RegressionTest.extra_resources` attribute.
+ ``%(check_fail_reason)s``, The failure reason if the test has failed.
``%(check_hashcode)s``, The unique hash associated with this test.
``%(check_info)s``, Various information about this test; essentially the return value of the test's :func:`~reframe.core.pipeline.RegressionTest.info` function.
``%(check_job_completion_time)s``, Same as the ``(check_job_completion_time_unix)s`` but formatted according to ``datefmt``.
@@ -1169,6 +1201,9 @@ All logging handlers share the following set of common attributes:
.. versionadded:: 4.3
The ``%(check_#ALL)s`` special specifier is added.
+.. versionadded:: 4.7
+ The ``%(check_fail_reason)s`` specifier is added.
+
.. py:attribute:: logging.handlers.format_perfvars
.. py:attribute:: logging.handlers_perflog.format_perfvars
@@ -1216,6 +1251,8 @@ All logging handlers share the following set of common attributes:
In addition to the format directives supported by the standard library's `time.strftime() `__ function, ReFrame allows you to use the ``%:z`` directive -- a GNU ``date`` extension -- that will print the time zone difference in a RFC3339 compliant way, i.e., ``+/-HH:MM`` instead of ``+/-HHMM``.
+.. _file-handler:
+
The ``file`` log handler
------------------------
@@ -1617,14 +1654,6 @@ General Configuration
.. versionadded:: 3.12.0
-.. py:attribute:: general.git_timeout
-
- :required: No
- :default: 5
-
- Timeout value in seconds used when checking if a git repository exists.
-
-
.. py:attribute:: general.dump_pipeline_progress
Dump pipeline progress for the asynchronous execution policy in ``pipeline-progress.json``.
@@ -1636,6 +1665,24 @@ General Configuration
.. versionadded:: 3.10.0
+.. py:attribute:: general.flex_alloc_strict
+
+ :required: No
+ :default: ``False``
+
+ Fail flexible tests if their minimum task requirement is not satisfied.
+
+ .. versionadded:: 4.7
+
+
+.. py:attribute:: general.git_timeout
+
+ :required: No
+ :default: 5
+
+ Timeout value in seconds used when checking if a git repository exists.
+
+
.. py:attribute:: general.pipeline_timeout
Timeout in seconds for advancing the pipeline in the asynchronous execution policy.
@@ -1672,7 +1719,6 @@ General Configuration
Try to auto-detect processor information of remote partitions as well.
This may slow down the initialization of the framework, since it involves submitting auto-detection jobs to the remote partitions.
- For more information on how ReFrame auto-detects processor information, you may refer to :ref:`proc-autodetection`.
.. versionadded:: 3.7.0
diff --git a/docs/configure.rst b/docs/configure.rst
deleted file mode 100644
index 9c2b96376d..0000000000
--- a/docs/configure.rst
+++ /dev/null
@@ -1,515 +0,0 @@
-=================================
-Configuring ReFrame for Your Site
-=================================
-
-ReFrame comes pre-configured with a minimal generic configuration that allows it to run on any system.
-This lets you run simple local tests using the system's default compiler.
-Of course, ReFrame is much more powerful than that.
-This section will guide you through configuring ReFrame for your site.
-
-ReFrame's configuration can be either in JSON or in Python format and can be split into multiple files.
-The Python format is useful in cases where you want to generate configuration parameters on-the-fly, since ReFrame will import that Python file and load the resulting configuration.
-In the following, we will use a single Python-based configuration file, partly for historical reasons: it was the only way to configure ReFrame in versions prior to 3.0.
-
-
-.. versionchanged:: 4.0.0
- The configuration can now be split into multiple files.
-
-
-Loading the configuration
--------------------------
-
-ReFrame builds its final configuration gradually by combining multiple configuration files.
-Each one can have different parts of the configuration, for example different systems, different environments, different general options or different logging handlers.
-This technique allows users to avoid having a single huge configuration file.
-
-The first configuration file loaded in this chain is always the generic builtin configuration located under ``${RFM_INSTALL_PREFIX}/reframe/core/settings.py``.
-This contains everything that ReFrame needs to run on a generic system, as well as basic settings for logging, so subsequent configuration files may skip defining some configuration sections altogether, if they are not relevant.
-
-ReFrame continues on looking for configuration files in the directories defined in :envvar:`RFM_CONFIG_PATH`.
-For each directory, ReFrame will look for a ``settings.py`` or ``settings.json`` file (in that order) and, if found, load it.
-
-Finally, ReFrame processes the :option:`--config-file` option or the :envvar:`RFM_CONFIG_FILES` environment variable to load any specific configuration files passed from the command line.
-
-
-Anatomy of the Configuration File
----------------------------------
-
-The whole configuration of ReFrame is a single JSON object whose properties are responsible for configuring the basic aspects of the framework.
-We'll refer to these top-level properties as *sections*.
-These sections contain other objects which further define in detail the framework's behavior.
-If you are using a Python file to configure ReFrame, this big JSON configuration object is stored in a special variable called ``site_configuration``.
-
-We will explore the basic configuration of ReFrame by looking into the configuration file of the tutorials, which permits ReFrame to run on the Piz Daint supercomputer and a local computer.
-For the complete listing and description of all configuration options, you should refer to the :doc:`config_reference`.
-
-.. literalinclude:: ../tutorials/config/daint.py
- :start-at: site_configuration
-
-There are three required sections that the final ReFrame configuration must have: ``systems``, ``environments`` and ``logging``, but in most cases you will define only the first two, as ReFrame's builtin configuration already defines a reasonable logging configuration. We will first cover these sections and then move on to the optional ones.
-
-.. tip::
-
- These configuration sections may not all be defined in the same configuration file, but can reside in any configuration file that is being loaded.
- This is the case in the example configuration shown above, where the ``logging`` section is "missing", as it is defined in ReFrame's builtin configuration.
-
----------------------
-Systems Configuration
----------------------
-
-ReFrame allows you to configure multiple systems in the same configuration file.
-Each system is a different object inside the ``systems`` section.
-In our example we define only Piz Daint:
-
-.. literalinclude:: ../tutorials/config/daint.py
- :start-at: 'systems'
- :end-before: 'environments'
-
-Each system is associated with a set of properties, which in this case are the following:
-
-* ``name``: The name of the system.
- This should be an alphanumeric string (dashes ``-`` are allowed) and it will be used to refer to this system in other contexts.
-* ``descr``: A detailed description of the system.
-* ``hostnames``: This is a list of hostname patterns following the `Python Regular Expression Syntax `__, which will be used by ReFrame when it tries to automatically select a configuration entry for the current system.
-* ``modules_system``: This refers to the modules management backend which should be used for loading environment modules on this system.
- Multiple backends are supported, as well as the special ``nomod`` backend which implements the different modules system operations as no-ops.
- For the complete list of the supported modules systems, see `here `__.
-* ``partitions``: The list of partitions that are defined for this system.
- Each partition is defined as a separate object.
- We devote the rest of this section to system partitions, since they are an essential part of ReFrame's configuration.
-
-A system partition in ReFrame is not bound to a real scheduler partition.
-It is a virtual partition or separation of the system.
-In the example shown here, we define three partitions, none of which corresponds to a scheduler partition.
-The ``login`` partition refers to the login nodes of the system, whereas the ``gpu`` and ``mc`` partitions refer to two different sets of nodes in the same cluster that are effectively separated using Slurm constraints.
-Let's pick the ``gpu`` partition and look into it in more detail:
-
-.. literalinclude:: ../tutorials/config/daint.py
- :start-at: 'name': 'gpu'
- :end-at: 'max_jobs'
-
-The basic properties of a partition are the following:
-
-* ``name``: The name of the partition.
- This should be an alphanumeric string (dashes ``-`` are allowed) and it will be used to refer to this partition in other contexts.
-* ``descr``: A detailed description of the system partition.
-* ``scheduler``: The workload manager (job scheduler) used in this partition for launching parallel jobs.
- In this particular example, the `Slurm `__ scheduler is used.
- For a complete list of the supported job schedulers, see `here `__.
-* ``launcher``: The parallel job launcher used in this partition.
- In this case, the ``srun`` command will be used.
- For a complete list of the supported parallel job launchers, see `here `__.
-* ``access``: A list of scheduler options that will be passed to the generated job script for gaining access to that logical partition.
- Notice how in this case, the nodes are selected through a constraint and not an actual scheduler partition.
-* ``environs``: The list of environments that ReFrame will use to run regression tests on this partition.
- These are just symbolic names that refer to environments defined in the ``environments`` section described below.
-* ``max_jobs``: The maximum number of concurrent regression tests that may be active (i.e., not completed) on this partition.
- This option is relevant only when ReFrame executes with the `asynchronous execution policy `__.
-
- For more partition configuration options, have a look `here `__.
-
-
---------------------------
-Environments Configuration
---------------------------
-
-We have already seen environments referred to by the ``environs`` property of a partition.
-An environment in ReFrame is simply a collection of environment modules, environment variables and compiler and compiler flags definitions.
-None of these attributes is required.
-An environment can simply be empty, in which case it refers to the actual environment that ReFrame runs in.
-In fact, this is what the generic fallback configuration of ReFrame does.
-
-Environments in ReFrame are configured under the ``environments`` section of the configuration.
-For each environment referenced inside a partition, a definition of it must be present in this section.
-In our example, we define environments for all the basic compilers as well as a default built-in one, which is used with the generic system configuration.
-In certain contexts, it is useful to see a ReFrame environment as a wrapper of a programming toolchain (MPI + compiler combination):
-
-.. literalinclude:: ../tutorials/config/daint.py
- :start-at: 'environments'
- :end-at: # end of environments
-
-Each environment is associated with a name.
-This name will be used to reference this environment in different contexts, as for example in the ``environs`` property of the system partitions.
-A programming environment in ReFrame is essentially a collection of environment modules, environment variables and compiler definitions.
-
-An important feature in ReFrame's configuration is that you can scope the definition of section objects to different systems or system/partition combinations by using the ``target_systems`` property.
-In our example, this means that the ``gnu`` environment will be defined this way only for tests running on the system ``daint``.
-
-
----------------------
-Logging configuration
----------------------
-
-ReFrame has a powerful logging mechanism that gives fine grained control over what information is being logged, where it is being logged and how this information is formatted.
-Additionally, it allows for logging performance data from performance tests into different channels.
-Let's see how logging is defined in the builtin configuration:
-
-.. literalinclude:: ../reframe/core/settings.py
- :start-at: 'logging'
- :end-at: # end of logging
-
-Logging is configured under the ``logging`` section of the configuration, which is a list of logger objects.
-Unless you want to configure logging differently for different systems, a single logger object is enough.
-Each logger object is associated with a `logging level `__ stored in the ``level`` property and has a set of logging handlers that are actually responsible for handling the actual logging records.
-ReFrame's own output is produced through its logging mechanism, which is why the special ``handlers$`` property exists.
-The handler defined in this property in the builtin configuration shown here determines exactly how ReFrame's output is printed.
-You will not have to override this in your configuration files, unless you really need to change how ReFrame's output looks.
-
-As a user you might need to override the ``handlers`` property to define different sinks for ReFrame logs and/or output using different verbosity levels.
-Note that you can use multiple handlers at the same time.
-All handler objects share a set of common properties.
-These are the following:
-
-* ``type``: This is the type of the handler, which determines its functionality.
- Depending on the handler type, handler-specific properties may be allowed or required.
- For a complete list of available log handler types, see `here `__.
-* ``level``: The cut-off level for messages reaching this handler.
- Any message with a lower level number will be filtered out.
-* ``format``: A format string for formatting the emitted log record.
- ReFrame uses the format specifiers from `Python Logging `__, but also defines its own specifiers.
-* ``datefmt``: A time format string for formatting timestamps.
- There are two log record fields that are considered timestamps: (a) ``asctime`` and (b) ``check_job_completion_time``.
- ReFrame follows the time formatting syntax of Python's `time.strftime() `__ with a small tweak allowing full RFC3339 compliance when formatting time zone differences.
-
-We will not go into the details of the individual handlers here.
-In this particular example we use three handlers of two distinct types:
-
-1. A file handler to print debug messages in the ``reframe.log`` file using a more extensive message format that contains a timestamp, the level name etc.
-2. A stream handler to print any informational messages (and warnings and errors) from ReFrame to the standard output.
- This handles essentially the actual output of ReFrame.
-3. A file handler to print the framework's output in the ``reframe.out`` file.
-
-It might initially seem confusing that there are two ``level`` properties: one at the logger level and one at the handler level.
-Logging in ReFrame works hierarchically.
-When a message is logged, a log record is created, which contains metadata about the message being logged (log level, timestamp, ReFrame runtime information etc.).
-This log record first goes into ReFrame's internal logger, where the record's level is checked against the logger's level (here ``debug``).
-If the log record's level exceeds the log level threshold from the logger, it is forwarded to the logger's handlers.
-Then each handler filters the log record differently and takes care of formatting the log record's message appropriately.
-You can view the logger's log level as a general cut-off.
-For example, if we have set it to ``warning``, no debug or informational messages would ever be printed.
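The hierarchy described above mirrors Python's standard ``logging`` module, which ReFrame's logging is built on. A minimal stdlib sketch (not ReFrame code) shows the two-stage filtering:

```python
import io
import logging

stream = io.StringIO()
logger = logging.getLogger('demo')
logger.propagate = False
logger.setLevel(logging.DEBUG)    # logger-level cut-off: everything passes
handler = logging.StreamHandler(stream)
handler.setLevel(logging.INFO)    # the handler filters records further
logger.addHandler(handler)

logger.debug('hidden')  # passes the logger, filtered out by the handler
logger.info('shown')    # passes both levels and reaches the stream
```

Had the logger's level been set to ``warning``, neither record would have reached the handler at all.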
-
-Finally, there is a special set of handlers for handling performance log messages.
-Performance log messages are generated *only* for `performance tests `__, i.e., tests defining the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` or the :attr:`~reframe.core.pipeline.RegressionTest.perf_patterns` attributes.
-The performance log handlers are stored in the ``handlers_perflog`` property.
-The ``filelog`` handler used in this example will create a file per test and per system/partition combination (``.///.log``) and will append to it the obtained performance data every time a performance test is run.
-Notice how the message to be logged is structured in the ``format`` and ``format_perfvars`` properties, such that it can be easily parsed by post-processing tools.
-Apart from file logging, ReFrame offers more advanced performance logging capabilities through Syslog, Graylog and HTTP.
-
-For a complete reference of logging configuration parameters, please refer to the :doc:`config_reference`.
-
-
------------------------------
-General configuration options
------------------------------
-
-General configuration options of the framework go under the ``general`` section of the configuration file.
-This section is optional and, in fact, we do not define it for our tutorial configuration file.
-However, there are several options that can go into this section, but the reader is referred to the :doc:`config_reference` for the complete list.
-
-
----------------------------
-Other configuration options
----------------------------
-
-There is finally one additional optional configuration section that is not discussed here:
-
-The ``modes`` section defines different execution modes for the framework.
-Execution modes are discussed in the :doc:`pipeline` page.
-
-
-.. _building-the-final-config:
-
-Building the Final Configuration
---------------------------------
-
-.. versionadded:: 4.0.0
-
-As mentioned above, ReFrame can build its final configuration incrementally from a series of user-specified configuration files, starting from the basic builtin configuration.
-We discussed briefly at the beginning of this page how ReFrame locates and loads these configuration files and the documentation of the :option:`-C` option provides more detailed information.
-But how are these configuration files actually combined?
-This is what we will discuss in this section.
-
-Configuration objects in the top-level configuration sections can be split in two categories: *named* and *unnamed*.
-Named objects are the systems, the environments and the modes; the rest are unnamed.
-Named objects have a ``name`` property.
-When ReFrame builds its final configuration, named objects from newer configuration files are either appended or prepended to their respective sections, whereas unnamed objects are merged based on their ``target_systems``.
-More specifically, new systems are *prepended* in the list of the already defined, whereas environments and modes are *appended*.
-The reason for that is that systems are tried from the beginning of the list until a match is found.
-See :ref:`pick-system-config` for more information on how ReFrame picks the right system.
-If a system is redefined, ReFrame will warn about it, but it will still use the new definition.
-This is done for backward compatibility with the old configuration mechanism, where users also had to redefine the builtin systems and environments in their configuration.
-Similarly, if an environment or a mode is redefined, ReFrame will issue a warning, but only if the redefinition is at the same scope as the conflicting one.
-Again this is done for backward compatibility.
-
-Given the Piz Daint configuration shown in this section and the ReFrame's builtin configuration, ReFrame will build internally the following configuration:
-
-.. code-block:: python
-
- site_configuration = {
- 'systems': [
- {
- # from the Daint config
- 'name': 'daint',
- ...
- },
- {
- # from the builtin config
- 'name': 'generic',
- ...
- }
- ],
- 'environments': [
- {
- # from the builtin config
- 'name': 'builtin'
- ...
- },
- {
- # from the Daint config
- 'name': 'gnu',
- ...
- }
- ],
- 'logging': [
- # from the builtin config
- ]
- }
-
-You might wonder why you would need to define multiple objects in sections such as ``logging`` or ``general``.
-As mentioned above, ReFrame merges them if they refer to the same target systems; if they don't, they serve as scopes for the configuration parameters they define.
-Imagine the following ``general`` section:
-
-.. code-block:: python
-
- 'general': [
- {
- 'git_timeout': 5
- },
- {
- 'git_timeout': 10,
- 'target_systems': ['daint']
- },
- {
- 'git_timeout': 20,
- 'target_systems': ['tresa']
- }
- ]
-
-This means that the default value for ``git_timeout`` is 5 seconds for any system, but it is 10 for ``daint`` and 20 for ``tresa``.
-The nice thing is that you can spread that in multiple configuration files and ReFrame will combine them internally in a single one with the various configuration options indexed by their scope.
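The scope-based resolution described above can be sketched with a small helper. This is a simplified illustration only; the ``lookup`` function is hypothetical and not ReFrame's actual resolution code:

```python
def lookup(general, key, system):
    """Resolve `key` for `system`: a scoped entry wins over the unscoped default."""
    value = None
    for entry in general:
        if key not in entry:
            continue
        targets = entry.get('target_systems')
        if targets is None and value is None:
            value = entry[key]   # unscoped default
        elif targets is not None and system in targets:
            value = entry[key]   # scoped override for this system
    return value

general = [
    {'git_timeout': 5},
    {'git_timeout': 10, 'target_systems': ['daint']},
    {'git_timeout': 20, 'target_systems': ['tresa']},
]
print(lookup(general, 'git_timeout', 'daint'))  # 10
print(lookup(general, 'git_timeout', 'tresa'))  # 20
print(lookup(general, 'git_timeout', 'eiger'))  # 5 (falls back to the default)
```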
-
-
-.. _pick-system-config:
-
-Picking the Right System Configuration
---------------------------------------
-
-As discussed previously, ReFrame's configuration file can store the configurations for multiple systems.
-When launched, ReFrame will pick the first matching configuration and load it.
-
-ReFrame uses an auto-detection mechanism to get information about the host it is running on and uses that information to pick the right system configuration.
-The default auto-detection method uses the ``hostname`` command, but you can define more methods by setting either the :attr:`autodetect_methods` configuration parameter or the :envvar:`RFM_AUTODETECT_METHODS` environment variable.
-After having retrieved the hostname, ReFrame goes through all the systems in its configuration and tries to match it against the :attr:`~config.systems.hostnames` patterns defined for each system.
-The first system whose :attr:`~config.systems.hostnames` match will become the current system and its configuration will be loaded.
-
-As soon as a system configuration is selected, all configuration objects that have a ``target_systems`` property are resolved against the selected system, and any configuration object that is not applicable is dropped.
-So, internally, ReFrame keeps an *instantiation* of the site configuration for the selected system only.
-To better understand this, let's assume that we have the following ``environments`` defined:
-
-.. code:: python
-
- 'environments': [
- {
- 'name': 'cray',
- 'modules': ['cray']
- },
- {
- 'name': 'gnu',
- 'modules': ['gnu']
- },
- {
- 'name': 'gnu',
- 'modules': ['gnu', 'openmpi'],
- 'cc': 'mpicc',
- 'cxx': 'mpicxx',
- 'ftn': 'mpif90',
- 'target_systems': ['foo']
- }
- ],
-
-
-If the selected system is ``foo``, then ReFrame will use the second definition of ``gnu`` which is specific to the ``foo`` system.
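The selection logic in this example can be sketched as follows. This is a hypothetical helper for illustration, not ReFrame's implementation:

```python
def select_environ(environments, name, system):
    # Prefer a definition scoped to the current system via 'target_systems';
    # otherwise fall back to a global (unscoped) definition.
    scoped = unscoped = None
    for env in environments:
        if env['name'] != name:
            continue
        if system in env.get('target_systems', []):
            scoped = env
        elif 'target_systems' not in env:
            unscoped = env
    return scoped if scoped is not None else unscoped

environments = [
    {'name': 'cray', 'modules': ['cray']},
    {'name': 'gnu', 'modules': ['gnu']},
    {'name': 'gnu', 'modules': ['gnu', 'openmpi'],
     'cc': 'mpicc', 'cxx': 'mpicxx', 'ftn': 'mpif90',
     'target_systems': ['foo']},
]
print(select_environ(environments, 'gnu', 'foo')['cc'])       # mpicc
print(select_environ(environments, 'gnu', 'bar')['modules'])  # ['gnu']
```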
-
-You can override completely the system auto-selection process by specifying a system or system/partition combination with the ``--system`` option, e.g., ``--system=daint`` or ``--system=daint:gpu``.
-
-
-Querying Configuration Options
-------------------------------
-
-ReFrame offers the powerful ``--show-config`` command-line option that allows you to query any configuration parameter of the framework and see how it is set for the selected system.
-Using no arguments or passing ``all`` to this option, the whole configuration for the currently selected system will be printed in JSON format, which you can then pipe to a JSON command line editor, such as `jq `__, and either get a colored output or even generate a completely new ReFrame configuration!
-
-Passing specific configuration keys in this option, you can query specific parts of the configuration.
-Let's see some concrete examples:
-
-* Query the current system's partitions:
-
- .. code-block:: console
-
- ./bin/reframe -C tutorials/config/settings.py --system=daint --show-config=systems/0/partitions
-
- .. code:: javascript
-
- [
- {
- "name": "login",
- "descr": "Login nodes",
- "scheduler": "local",
- "launcher": "local",
- "environs": [
- "gnu",
- "intel",
- "nvidia",
- "cray"
- ],
- "max_jobs": 10
- },
- {
- "name": "gpu",
- "descr": "Hybrid nodes",
- "scheduler": "slurm",
- "launcher": "srun",
- "access": [
- "-C gpu",
- "-A csstaff"
- ],
- "environs": [
- "gnu",
- "intel",
- "nvidia",
- "cray"
- ],
- "max_jobs": 100
- },
- {
- "name": "mc",
- "descr": "Multicore nodes",
- "scheduler": "slurm",
- "launcher": "srun",
- "access": [
- "-C mc",
- "-A csstaff"
- ],
- "environs": [
- "gnu",
- "intel",
- "nvidia",
- "cray"
- ],
- "max_jobs": 100
- }
- ]
-
- Check how the output changes if we explicitly set system to ``daint:login``:
-
- .. code-block:: console
-
- ./bin/reframe -C tutorials/config/settings.py --system=daint:login --show-config=systems/0/partitions
-
-
- .. code:: javascript
-
- [
- {
- "name": "login",
- "descr": "Login nodes",
- "scheduler": "local",
- "launcher": "local",
- "environs": [
- "gnu",
- "intel",
- "nvidia",
- "cray"
- ],
- "max_jobs": 10
- }
- ]
-
-
- ReFrame will internally represent system ``daint`` as having a single partition only.
- Notice also how you can use indexes to objects elements inside a list.
-
-* Query an environment configuration:
-
- .. code-block:: console
-
- ./bin/reframe -C tutorials/config/settings.py --system=daint --show-config=environments/@gnu
-
- .. code:: javascript
-
- {
- "name": "gnu",
- "modules": [
- "PrgEnv-gnu"
- ],
- "cc": "cc",
- "cxx": "CC",
- "ftn": "ftn",
- "target_systems": [
- "daint"
- ]
- }
-
- If an object has a ``name`` property you can address it by name using the ``@name`` syntax, instead of its index.
-
-* Query an environment's compiler:
-
- .. code-block:: console
-
- ./bin/reframe -C tutorials/config/settings.py --system=daint --show-config=environments/@gnu/cxx
-
- .. code:: javascript
-
- "CC"
-
- If you explicitly query a configuration value which is not defined in the configuration file, ReFrame will print its default value.
-
-
-.. _proc-autodetection:
-
-Auto-detecting processor information
-------------------------------------
-
-.. versionadded:: 3.7.0
-
-.. |devices| replace:: :attr:`devices`
-.. _devices: config_reference.html#.systems[].partitions[].devices
-.. |processor| replace:: :attr:`processor`
-.. _processor: config_reference.html#.systems[].partitions[].processor
-.. |detect_remote_system_topology| replace:: :attr:`remote_detect`
-.. _detect_remote_system_topology: config_reference.html#.general[].remote_detect
-
-ReFrame is able to detect the processor topology of both local and remote partitions automatically.
-The processor and device information are made available to the tests through the corresponding attributes of the :attr:`~reframe.core.pipeline.RegressionTest.current_partition` allowing a test to modify its behavior accordingly.
-Currently, ReFrame supports auto-detection of the local or remote processor information only.
-It does not support auto-detection of devices, in which cases users should explicitly specify this information using the |devices|_ configuration option.
-The processor information auto-detection works as follows:
-
-#. If the |processor|_ configuration option is defined, then no auto-detection is attempted.
-
-#. If the |processor|_ configuration option is not defined, ReFrame will look for a processor configuration metadata file in ``~/.reframe/topology/{system}-{part}/processor.json``.
- If the file is found, the topology information is loaded from there.
- These files are generated automatically by ReFrame from previous runs.
-
-#. If the corresponding metadata files are not found, the processor information will be auto-detected.
- If the system partition is local (i.e., ``local`` scheduler + ``local`` launcher), the processor information is auto-detected unconditionally and stored in the corresponding metadata file for this partition.
- If the partition is remote, ReFrame will not try to auto-detect it unless the :envvar:`RFM_REMOTE_DETECT` or the |detect_remote_system_topology|_ configuration option is set.
- In that case, the steps to auto-detect the remote processor information are the following:
-
- a. ReFrame creates a fresh clone of itself in a temporary directory created under ``.`` by default.
- This temporary directory prefix can be changed by setting the :envvar:`RFM_REMOTE_WORKDIR` environment variable.
- b. ReFrame changes to that directory and launches a job that will first bootstrap the fresh clone and then run that clone with ``{launcher} ./bin/reframe --detect-host-topology=topo.json``.
- The :option:`--detect-host-topology` option causes ReFrame to detect the topology of the current host,
- which in this case would be the remote compute nodes.
-
- In case of errors during auto-detection, ReFrame will simply issue a warning and continue.
diff --git a/docs/deferrables.rst b/docs/deferrables.rst
index be78c33265..f3e5f54b5d 100644
--- a/docs/deferrables.rst
+++ b/docs/deferrables.rst
@@ -3,7 +3,7 @@ Understanding the Mechanism of Deferrable Functions
===================================================
This section describes the mechanism behind deferrable functions, which in ReFrame, they are used for sanity and performance checking.
-Generally, writing a new sanity function in a :class:`~reframe.core.pipeline.RegressionTest` is as straightforward as decorating a simple member function with the built-in :func:`~reframe.core.pipeline.RegressionMixin.sanity_function` decorator.
+Generally, writing a new sanity function in a :class:`~reframe.core.pipeline.RegressionTest` is as straightforward as decorating a simple member function with the built-in :func:`~reframe.core.builtins.sanity_function` decorator.
Behind the scenes, this decorator will convert the Python function into a deferrable function and schedule its evaluation for the sanity stage of the test.
However, when dealing with more complex scenarios such as a deferrable function taking as an argument the results from other deferrable functions, it is crucial to understand how a deferrable function differs from a regular Python function, and when is it actually evaluated.
@@ -181,7 +181,7 @@ There are some exceptions to this rule, though.
The logical :keyword:`and`, :keyword:`or` and :keyword:`not` operators as well as the :keyword:`in` operator cannot be deferred automatically.
These operators try to take the truthy value of their arguments by calling :func:`bool ` on them.
As we shall see later, applying the :func:`bool ` function on a deferred expression causes its immediate evaluation and returns the result.
-If you want to defer the execution of such operators, you should use the corresponding :func:`and_ `, :func:`or_ `, :func:`not_ ` and :func:`contains ` functions in :mod:`reframe.utility.sanity`, which basically wrap the expression in a deferrable function.
+If you want to defer the execution of such operators, you should use the corresponding :func:`~reframe.utility.sanity.and_`, :func:`~reframe.utility.sanity.or_`, :func:`~reframe.utility.sanity.not_` and :func:`~reframe.utility.sanity.contains` functions in :mod:`reframe.utility.sanity`, which basically wrap the expression in a deferrable function.
In summary deferrable functions have the following characteristics:
diff --git a/docs/dependencies.rst b/docs/dependencies.rst
index f1b43bc9e5..9ff2634274 100644
--- a/docs/dependencies.rst
+++ b/docs/dependencies.rst
@@ -2,7 +2,7 @@
How Test Dependencies Work In ReFrame
=====================================
-Dependencies in ReFrame are defined at the test level using the :func:`depends_on` function, but are projected to the `test cases `__ space.
+Dependencies in ReFrame are defined at the test level using the :func:`depends_on` function, but are projected to the :doc:`test cases ` space.
We will see the rules of that projection in a while.
The dependency graph construction and the subsequent dependency analysis happen also at the level of the test cases.
@@ -192,43 +192,6 @@ If you end up requiring such type of dependency in your tests, you might have to
Technically, the framework could easily support such types of dependencies, but ReFrame's output would have to change substantially.
-
-Resolving dependencies
-----------------------
-
-As shown in the :doc:`tutorial_deps`, test dependencies would be of limited usage if you were not able to use the results or information of the target tests.
-Let's reiterate over the :func:`set_executable` function of the :class:`OSULatencyTest` that we presented previously:
-
-.. literalinclude:: ../tutorials/deps/osu_benchmarks.py
- :pyobject: OSULatencyTest.set_executable
-
-The ``@require_deps`` decorator does some magic -- we will unravel this shortly -- with the function arguments of the :func:`set_executable` function and binds them to the target test dependencies by their name.
-However, as discussed in this section, dependencies are defined at test case level, so the ``OSUBuildTest`` function argument is bound to a special function that allows you to retrieve an actual test case of the target dependency.
-This is why you need to "call" ``OSUBuildTest`` in order to retrieve the desired test case.
-When no arguments are passed, this will retrieve the test case corresponding to the current partition and the current programming environment.
-We could always retrieve the ``PrgEnv-gnu`` case by writing ``OSUBuildTest('PrgEnv-gnu')``.
-If a dependency cannot be resolved, because it is invalid, a runtime error will be thrown with an appropriate message.
-
-The low-level method for retrieving a dependency is the :func:`getdep() ` method of the :class:`RegressionTest`.
-In fact, you can rewrite :func:`set_executable` function as follows:
-
-.. code:: python
-
- @run_after('setup')
- def set_executable(self):
- target = self.getdep('OSUBuildTest')
- self.executable = os.path.join(
- target.stagedir,
- 'osu-micro-benchmarks-5.6.2', 'mpi', 'pt2pt', 'osu_latency'
- )
- self.executable_opts = ['-x', '100', '-i', '1000']
-
-
-Now it's easier to understand what the ``@require_deps`` decorator does behind the scenes.
-It binds the function arguments to a partial realization of the :func:`getdep` function and attaches the decorated function as an after-setup hook.
-In fact, any ``@require_deps``-decorated function will be invoked before any other after-setup hook.
-
-
.. _cleaning-up-stage-files:
Cleaning up stage files
diff --git a/docs/howto.rst b/docs/howto.rst
new file mode 100644
index 0000000000..cdc4c3d083
--- /dev/null
+++ b/docs/howto.rst
@@ -0,0 +1,1164 @@
+.. currentmodule:: reframe.core.pipeline.RegressionTest
+
+===============
+ReFrame How Tos
+===============
+
+This is a collection of "How To" articles on specific ReFrame usage topics.
+
+
+.. contents:: Table of Contents
+ :local:
+ :depth: 3
+
+
+Working with build systems
+==========================
+
+ReFrame supports building the test's code in many scenarios.
+We have seen in the :doc:`tutorial` how to build the test's code if it is just a single file.
+However, ReFrame knows how to interact with Make, CMake and Autotools.
+Additionally, it supports integration with the `EasyBuild `__ build automation tool as well as the `Spack `__ package manager.
+Finally, if none of the above build systems fit, users can supply their own custom build scripts.
+
+
+Using Make, CMake or Autotools
+------------------------------
+
+We have seen already in the :ref:`tutorial ` how to build a test with a single source file.
+ReFrame can also build test code using common build systems, such as `Make `__, `CMake `__ or `Autotools `__.
+The build system to be used is selected by the :attr:`build_system` test attribute.
+This is a "magic" attribute where you assign it a string and ReFrame will create the appropriate :ref:`build system object `.
+Each build system can define its own properties, but some build systems have a common set of properties that are interpreted accordingly.
+Let's see a version of the STREAM benchmark that uses ``make``:
+
+.. literalinclude:: ../examples/tutorial/stream/stream_make.py
+ :caption:
+ :lines: 5-
+
+Build system properties are set in a pre-compile hook.
+In this case, we set the ``CFLAGS`` and also pass the Makefile target to the Make build system's :attr:`~reframe.core.buildsystems.Make.options`.
+
+.. warning::
+
+ You can't set build system options inside the test class body.
+ The test must be instantiated in order for the conversion from string to build system to happen.
+   The following will therefore yield an error:
+
+ .. code-block:: python
+
+ class build_stream(rfm.CompileOnlyRegressionTest):
+ build_system = 'Make'
+ build_system.flags = ['-O3']
+
+
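+The correct approach is to set such attributes from a pipeline hook, after the test has been instantiated.
+The following sketch fixes the example above:
+
+.. code-block:: python
+
+    class build_stream(rfm.CompileOnlyRegressionTest):
+        build_system = 'Make'
+
+        # By the time the hook runs, `build_system` has been converted
+        # to a `Make` build system object, so its attributes can be set
+        @run_before('compile')
+        def set_flags(self):
+            self.build_system.flags = ['-O3']
+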
+Based on the selected build system, ReFrame will generate the appropriate build script.
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_make.py -p gnu -r
+ cat output/tutorialsys/default/gnu/build_stream_40af02af/rfm_build.sh
+
+.. code-block:: bash
+
+ #!/bin/bash
+
+ _onerror()
+ {
+ exitcode=$?
+ echo "-reframe: command \`$BASH_COMMAND' failed (exit code: $exitcode)"
+ exit $exitcode
+ }
+
+ trap _onerror ERR
+
+ make -j 1 CC="gcc" CXX="g++" FC="ftn" NVCC="nvcc" CFLAGS="-O3 -fopenmp" stream_c.exe
+
+
+Note that ReFrame passes several variables to the ``make`` command apart from those explicitly requested by the test, such as the ``CFLAGS``.
+The rest of the flags are implicitly requested by the selected test environment, in this case ``gnu``, and ReFrame tries its best to ensure that the environment's definition is respected.
+In the case of Autotools and CMake these variables will be set during the "configure" step.
+Users can still override this behaviour and request explicitly to ignore any flags coming from the environment by setting the build system's :attr:`~reframe.core.buildsystems.BuildSystem.flags_from_environ` to :obj:`False`.
+In this case, only the flags requested by the test will be emitted.
+
+The Autotools and CMake build systems are quite similar.
+For passing ``configure`` options, the :attr:`~reframe.core.buildsystems.ConfigureBasedBuildSystem.config_opts` build system attribute should be set, whereas for ``make`` options the :attr:`~reframe.core.buildsystems.ConfigureBasedBuildSystem.make_opts` should be used.
+The :ref:`OSU benchmarks ` in the main tutorial use the Autotools build system.
+
+Finally, in all three build systems, the :attr:`~reframe.core.buildsystems.Make.max_concurrency` can be set to control the number of parallel make jobs.
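+
+For instance, a CMake-based test might set up its build as in the following sketch (the specific ``cmake`` and ``make`` options shown are hypothetical):
+
+.. code-block:: python
+
+    @run_before('compile')
+    def setup_build(self):
+        self.build_system = 'CMake'
+        # Options passed to the `configure` (cmake) step
+        self.build_system.config_opts = ['-DCMAKE_BUILD_TYPE=Release']
+        # Options passed to the subsequent `make` step
+        self.build_system.make_opts = ['all']
+        # Number of parallel make jobs
+        self.build_system.max_concurrency = 4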
+
+
+Integrating with EasyBuild
+--------------------------
+
+.. versionadded:: 3.5.0
+
+
+ReFrame integrates with the `EasyBuild `__ build automation framework, which allows you to use EasyBuild for building the source code of your test.
+
+Let's consider a simple ReFrame test that installs ``bzip2-1.0.6`` given the easyconfig `bzip2-1.0.6.eb `__ and checks that the installed version is correct.
+The following code block shows the check, highlighting the lines specific to this tutorial:
+
+.. literalinclude:: ../examples/tutorial/easybuild/eb_test.py
+ :caption:
+ :start-at: import reframe
+
+
+The test looks pretty standard except that we use the ``EasyBuild`` build system and set some build system-specific attributes.
+More specifically, we set the :attr:`~reframe.core.buildsystems.EasyBuild.easyconfigs` attribute to the list of packages we want to build and install.
+We also pass the ``-f`` option to EasyBuild's ``eb`` command through the :attr:`~reframe.core.buildsystems.EasyBuild.options` attribute, so that we force the build even if the corresponding environment module already exists.
+
+For running this test, we need the following Docker image:
+
+.. code-block:: bash
+ :caption: Run in the EasyBuild+Spack container.
+
+ docker build -t reframe-eb-spack:latest -f examples/tutorial/dockerfiles/eb-spack.dockerfile .
+ docker run -h myhost --mount type=bind,source=$(pwd)/examples/,target=/home/user/reframe-examples --workdir=/home/user/reframe-examples/tutorial -it reframe-eb-spack:latest /bin/bash -l
+
+
+EasyBuild requires a :ref:`modules system ` to run, so we need a configuration file that sets the modules system of the current system:
+
+.. literalinclude:: ../examples/tutorial/config/baseline_modules.py
+ :caption:
+ :lines: 5-
+
+We discuss modules systems and how ReFrame interacts with them in :ref:`working-with-environment-modules`.
+For the moment, we will only use them for running the EasyBuild example:
+
+.. code-block:: bash
+ :caption: Run in the EasyBuild+Spack container.
+
+ reframe -C config/baseline_modules.py -c easybuild/eb_test.py -r
+
+
+ReFrame generates the following commands to build and install the easyconfig:
+
+.. code-block:: bash
+ :caption: Run in the EasyBuild+Spack container.
+
+ cat output/tutorialsys/default/builtin/BZip2EBCheck/rfm_build.sh
+
+.. code-block:: bash
+
+ ...
+ export EASYBUILD_BUILDPATH=${stagedir}/easybuild/build
+ export EASYBUILD_INSTALLPATH=${stagedir}/easybuild
+ export EASYBUILD_PREFIX=${stagedir}/easybuild
+ export EASYBUILD_SOURCEPATH=${stagedir}/easybuild
+ eb bzip2-1.0.6.eb -f
+
+All the files generated by EasyBuild (sources, temporary files, installed software and the corresponding modules) are kept under the test's stage directory, thus the relevant EasyBuild environment variables are set.
+
+.. tip::
+
+ Users may set the EasyBuild prefix to a different location by setting the :attr:`~reframe.core.buildsystems.EasyBuild.prefix` attribute of the build system.
+ This allows you to have the built software installed upon successful completion of the build phase, but if the test fails in a later stage (sanity, performance), the installed software will not be cleaned up automatically.
+
+.. note::
+
+ ReFrame assumes that the ``eb`` executable is available on the system where the compilation is run (typically the local host where ReFrame is executed).
+
+
+To run the freshly built package, the generated environment modules need to be loaded first.
+These can be accessed through the :attr:`~reframe.core.buildsystems.EasyBuild.generated_modules` attribute *after* EasyBuild completes the installation.
+For this reason, we set the test's :attr:`modules` in a pre-run hook.
+The final generated run script is the following:
+
+.. code-block:: bash
+ :caption: Run in the EasyBuild+Spack container.
+
+ cat output/tutorialsys/default/builtin/BZip2EBCheck/rfm_job.sh
+
+.. code-block:: bash
+
+ module use ${stagedir}/easybuild/modules/all
+ module load bzip/1.0.6
+ bzip2 --help
+
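+
+The pre-run hook that picks up the generated modules may look like the following sketch:
+
+.. code-block:: python
+
+    @run_before('run')
+    def prepare_run(self):
+        # `generated_modules` is populated only after EasyBuild has
+        # completed the installation during the test's build phase
+        self.modules = self.build_system.generated_modules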
+
+Packaging the installation
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+The EasyBuild build system offers a way of packaging the installation via EasyBuild's packaging support.
+To use this feature, `the FPM package manager `__ must be available.
+By setting the dictionary :attr:`~reframe.core.buildsystems.EasyBuild.package_opts` in the test, ReFrame will pass ``--package-{key}={val}`` to the EasyBuild invocation.
+For instance, the following can be set to package the installations as an rpm file:
+
+.. code-block:: python
+
+ self.keep_files = ['easybuild/packages']
+ self.build_system.package_opts = {
+ 'type': 'rpm',
+ }
+
+The packages are generated by EasyBuild in the stage directory.
+To retain them after the test succeeds, :attr:`~reframe.core.pipeline.RegressionTest.keep_files` needs to be set.
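+
+The translation of these options into ``eb`` flags is mechanical; the following plain-Python sketch illustrates it (``eb_package_flags`` is a hypothetical helper, not part of ReFrame's API):
+
+.. code-block:: python
+
+    def eb_package_flags(package_opts):
+        '''Render a package_opts dict as EasyBuild command-line flags.'''
+        return [f'--package-{key}={val}' for key, val in package_opts.items()]
+
+    print(eb_package_flags({'type': 'rpm'}))
+    # ['--package-type=rpm']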
+
+
+Integrating with Spack
+----------------------
+
+.. versionadded:: 3.6.1
+
+ReFrame can also use `Spack `__ to build a software package and test it.
+
+The example shown here is equivalent to the `EasyBuild <#integrating-with-easybuild>`__ one that built ``bzip2``.
+Here is the test code:
+
+.. literalinclude:: ../examples/tutorial/spack/spack_test.py
+ :start-at: import reframe
+
+
+When :attr:`~reframe.core.pipeline.RegressionTest.build_system` is set to ``'Spack'``, ReFrame will leverage Spack environments in order to build the test code.
+By default, ReFrame will create a new Spack environment in the test's stage directory and add the requested :attr:`~reframe.core.buildsystems.Spack.specs` to it.
+
+.. note::
+ Optional spec attributes, such as ``target`` and ``os``, should be specified in :attr:`~reframe.core.buildsystems.Spack.specs` and not as install options in :attr:`~reframe.core.buildsystems.Spack.install_opts`.
+
+You can set Spack configuration options for the new environment with the :attr:`~reframe.core.buildsystems.Spack.config_opts` attribute. These options take precedence over Spack's ``spack.yaml`` defaults.
+
+Users may also specify an existing Spack environment by setting the :attr:`~reframe.core.buildsystems.Spack.environment` attribute.
+In this case, ReFrame treats the environment as a *test resource* so it expects to find it under the test's :attr:`~reframe.core.pipeline.RegressionTest.sourcesdir`, which defaults to ``'src'``.
+
+To run this test, use the same container as with EasyBuild:
+
+.. code-block:: bash
+ :caption: Run in the EasyBuild+Spack container.
+
+   docker run -h myhost --mount type=bind,source=$(pwd)/examples/,target=/home/user/reframe-examples --workdir=/home/user/reframe-examples/tutorial -it reframe-eb-spack:latest /bin/bash -l
+
+In contrast to EasyBuild, Spack does not require a modules system to be configured, so you can simply run the test with ReFrame's builtin configuration:
+
+.. code-block:: bash
+ :caption: Run in the EasyBuild+Spack container.
+
+ reframe -c spack/spack_test.py -r
+
+As with every other test, ReFrame will copy the test's resources to its stage directory before building it.
+ReFrame will then activate the generated environment (either the one provided by the user or the one generated by ReFrame), add the given specs using the ``spack add`` command and, finally, install the packages in the environment.
+Here is what ReFrame generates as a build script for this example:
+
+.. code:: bash
+
+ spack env create -d rfm_spack_env
+ spack -e rfm_spack_env config add "config:install_tree:root:opt/spack"
+ spack -e rfm_spack_env add bzip2@1.0.6
+ spack -e rfm_spack_env install
+
+As you might have noticed, ReFrame expects that Spack is already installed on the system.
+The packages specified in the environment and in the test will be installed in the test's stage directory, where the environment is copied before building.
+Here is the stage directory structure:
+
+.. code:: console
+
+ stage/generic/default/builtin/BZip2SpackCheck/
+ ├── rfm_spack_env
+ │  ├── spack
+ │  │  └── opt
+ │  │    └── spack
+ │  │    ├── bin
+ │  │    └── darwin-catalina-skylake
+ │  ├── spack.lock
+ │  └── spack.yaml
+ ├── rfm_BZip2SpackCheck_build.err
+ ├── rfm_BZip2SpackCheck_build.out
+ ├── rfm_BZip2SpackCheck_build.sh
+ ├── rfm_BZip2SpackCheck_job.err
+ ├── rfm_BZip2SpackCheck_job.out
+ └── rfm_BZip2SpackCheck_job.sh
+
+
+Finally, here is the generated run script that ReFrame uses to run the test, once its build has succeeded:
+
+.. code-block:: bash
+
+ #!/bin/bash
+ spack env create -d rfm_spack_env
+ eval `spack -e rfm_spack_env load --sh bzip2@1.0.6`
+ bzip2 --help
+
+From this point on, sanity and performance checking are exactly identical to any other ReFrame test.
+
+.. tip::
+
+ While developing a test using Spack or EasyBuild as a build system, it can be useful to run ReFrame with the :option:`--keep-stage-files` and :option:`--dont-restage` options to prevent ReFrame from removing the test's stage directory upon successful completion of the test.
+ For this particular type of test, these options will avoid having to rebuild the required package dependencies every time the test is retried.
+
+
+
+Custom builds
+-------------
+
+There are cases where you need to test code that does not use any of the build systems supported by ReFrame.
+In this case, you could set the :attr:`build_system` to ``'CustomBuild'`` and supply the exact commands to build the code:
+
+
+.. code-block:: python
+
+ @rfm.simple_test
+ class CustomBuildCheck(rfm.RegressionTest):
+ build_system = 'CustomBuild'
+
+ @run_before('compile')
+ def setup_build(self):
+ self.build_system.commands = [
+ './myconfigure.sh',
+ './build.sh'
+ ]
+
+
+.. warning::
+
+ You should use this build system with caution, because environment management, reproducibility and any potential side effects are all controlled by the custom build system.
+
+
+.. _working-with-environment-modules:
+
+Working with environment modules
+================================
+
+A common practice in HPC environments is to provide the software stack through `environment modules `__.
+An environment module is essentially a set of environment variables that are sourced in the user's current shell in order to make available the requested software stack components.
+
+ReFrame allows users to associate an environment modules system with a system in the configuration file.
+Tests may then specify the environment modules needed for them to run.
+
+We have seen environment modules in practice with the EasyBuild integration.
+Systems that use environment modules must set the :attr:`~config.systems.modules_system` system configuration parameter to the modules system that the system uses.
+
+.. literalinclude:: ../examples/tutorial/config/baseline_modules.py
+ :lines: 5-
+
+
+The tests that require environment modules must simply list the required modules in their :attr:`modules` variable.
+ReFrame will then emit the correct commands to load the modules based on the configured modules system.
+For older modules systems, such as Tmod 3.2, that do not support automatic conflict resolution, ReFrame will also emit commands to unload the conflicted modules before loading the requested ones.
+
+Test environments can also use modules by setting their :attr:`~config.environments.modules` parameter.
+
+.. code-block:: python
+
+ 'environments': [
+ ...
+ {
+ 'name': 'gnu',
+ 'cc': 'gcc',
+ 'cxx': 'g++',
+ 'modules': ['gnu'],
+ 'features': ['openmp'],
+ 'extras': {'omp_flag': '-fopenmp'}
+ }
+ ...
+ ]
+
+
+Environment module mappings
+---------------------------
+
+ReFrame allows you to replace environment modules used in tests with other modules on-the-fly.
+This is quite useful if you want to test a new version of a module or another combination of modules.
+Assume you have a test that loads a ``gromacs`` module:
+
+.. code-block:: python
+
+ class GromacsTest(rfm.RunOnlyRegressionTest):
+ ...
+ modules = ['gromacs']
+
+
+This test would use the default version of the module in the system, but you might want to test another version, before making that new one the default.
+You can ask ReFrame to temporarily replace the ``gromacs`` module with another one as follows:
+
+
+.. code-block:: bash
+
+ reframe -n GromacsTest -M 'gromacs:gromacs/2020.5' -r
+
+
+Every time ReFrame tries to load the ``gromacs`` module, it will replace it with ``gromacs/2020.5``.
+You can specify multiple mappings at once or provide a file with mappings using the :option:`--module-mappings` option.
+You can also replace a single module with multiple modules.
+
+A very convenient feature of ReFrame in dealing with modules is that you do not have to care about module conflicts at all, regardless of the modules system backend.
+ReFrame will take care of unloading any conflicting modules, if the underlying modules system cannot do that automatically.
+In case of module mappings, it will also respect the module order of the replacement modules and will produce the correct series of "load" and "unload" commands needed by the modules system backend used.
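+
+Conceptually, applying module mappings boils down to the following plain-Python sketch (``apply_mappings`` is a hypothetical illustration, not ReFrame's actual implementation):
+
+.. code-block:: python
+
+    def apply_mappings(modules, mappings):
+        '''Replace each module with its mapped replacements, in order.'''
+        result = []
+        for m in modules:
+            # A module without a mapping is kept as-is
+            result.extend(mappings.get(m, [m]))
+        return result
+
+    # A single module may map to multiple replacement modules
+    print(apply_mappings(['gromacs'], {'gromacs': ['gromacs/2020.5', 'plumed']}))
+    # ['gromacs/2020.5', 'plumed']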
+
+
+Manipulating ReFrame's environment
+----------------------------------
+
+ReFrame runs the selected tests in the same environment as the one that it itself executes in.
+It does not unload any environment modules, nor does it set or unset any environment variables.
+Nonetheless, it gives you the opportunity to modify the environment in which the tests execute.
+You can either purge all environment modules completely by passing the :option:`--purge-env` option, or ask ReFrame to load or unload some environment modules before running any tests by using the :option:`-m` and :option:`-u` options, respectively.
+Of course, you could manage the environment manually, but it is more convenient to do this directly through ReFrame's command line.
+If you used an environment module to load ReFrame, e.g., ``reframe``, you can use the :option:`-u` option to have ReFrame unload it before running any tests, so that the tests start in a clean environment:
+
+.. code-block:: bash
+
+ reframe -u reframe ...
+
+
+Working with low-level dependencies
+===================================
+
+We have seen that :ref:`test fixtures ` introduce dependencies between tests along with a scope.
+It is possible to define test dependencies without a scope using the low-level test dependency API.
+In fact, test fixtures translate to that low-level API.
+In this how-to, we will rewrite the :ref:`OSU benchmarks example ` of the main tutorial to use the low-level dependency API.
+
+Here is the full code:
+
+.. literalinclude:: ../examples/tutorial/mpi/osu_deps.py
+ :caption:
+ :lines: 5-
+
+Contrary to when using fixtures, dependencies are now explicitly defined using the :func:`depends_on` method.
+The target test is referenced by name and the optional ``how`` argument defines how the individual cases of the two tests depend on each other.
+Remember that a test generates a test case for each combination of valid systems and valid environments.
+There are some shortcuts for defining common dependency patterns, such as the :obj:`udeps.fully` and :obj:`udeps.by_env`.
+The former defines that all the test cases of the current test depend on all the test cases of the target, whereas the latter defines that test cases depend by environment, i.e., a test case of the current test depends on a test case of the target test only when the environment is the same.
+In our example, the :obj:`build_osu_benchmarks` depends fully on the :obj:`fetch_osu_benchmarks`, whereas the final benchmarks depend on the :obj:`build_osu_benchmarks` by environment.
+This is similar to the session and environment scopes of fixtures, but you have to set the :attr:`valid_systems` and :attr:`valid_prog_environs` of the targets, whereas for fixtures these will be automatically determined by the scope.
+This makes the low-level dependencies less flexible.
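+
+If we model a test case as a ``(partition, environment)`` pair, the two ``how`` callables can be pictured as simple predicates over such pairs; this is a simplified sketch, not ReFrame's actual signatures:
+
+.. code-block:: python
+
+    def fully(src, dst):
+        # Every test case of the current test depends on
+        # every test case of the target test
+        return True
+
+    def by_env(src, dst):
+        # Depend only on target test cases with the same environment
+        return src[1] == dst[1]
+
+    print(by_env(('gpu', 'gnu'), ('login', 'gnu')))    # True
+    print(by_env(('gpu', 'gnu'), ('gpu', 'clang')))    # False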
+
+As with fixtures, you can still fully access the target test, but the way to do so is a bit more involved.
+There are two ways to access the target dependencies:
+
+1. Using the :func:`@require_deps ` decorator.
+2. Using the low-level :func:`getdep` method.
+
+The :func:`@require_deps ` acts as a special post-setup hook (in fact, it is always the first post-setup hook of the test) that binds each argument of the decorated function to the corresponding target dependency.
+For the binding to work correctly, the function arguments must be named after the target dependencies.
+However, referring to a dependency only by the test's name is not enough, since a test might be associated with multiple environments or partitions.
+For this reason, each dependency argument is essentially bound to a function that accepts as argument the name of the target partition and target programming environment.
+If no arguments are passed, the current programming environment is implied, such that ``build_osu_benchmarks()`` is equivalent to ``build_osu_benchmarks(self.current_environ.name, self.current_partition.name)``.
+If the target partition and environment do not match the current ones, we should specify them, as is the case for accessing the :obj:`fetch_osu_benchmarks` dependency.
+This call returns a handle to the actual target test object, exactly as it happens when accessing the fixture handle in a post-setup hook.
+
+Target dependencies can also be accessed directly using the :func:`getdep` function.
+This is what both the :func:`@require_deps ` decorator and fixtures use behind the scenes.
+Let's rewrite the dependency hooks using the low-level :func:`getdep` function:
+
+.. code-block:: python
+
+ @run_before('compile')
+ def prepare_build(self):
+ target = self.getdep('fetch_osu_benchmarks', 'gnu', 'login')
+ ...
+
+ @run_before('run')
+ def prepare_run(self):
+ osu_binaries = self.getdep('build_osu_benchmarks')
+ ...
+
+For running and listing tests with dependencies, the same principles apply as with fixtures, since ReFrame internally only sees dependencies and test cases.
+The only difference in listing is that, contrary to fixtures, there is no scope associated with the dependent tests:
+
+.. code-block:: bash
+ :caption: Run with the Docker compose setup.
+
+ reframe --prefix=/scratch/rfm-stage/ -C config/cluster_mpi.py -c mpi/osu_deps.py -n osu_allreduce_test -l
+
+.. code-block:: console
+
+ [List of matched checks]
+ - osu_allreduce_test /63dd518c
+ ^build_osu_benchmarks /f6911c4c
+ ^fetch_osu_benchmarks /52d9b2c6
+ Found 3 check(s)
+
+
+Resolving dependencies
+----------------------
+
+When defining a low-level dependency using the :func:`depends_on` function, the target test cases must exist, otherwise ReFrame will refuse to load the dependency chain and will issue a warning.
+Similarly, when requesting access to a target test case using :func:`getdep`, if the target test case does not exist, the current test will fail.
+
+To fully understand how the different cases of a test depend on the cases of another test and how to express more complex dependency relations, please refer to :doc:`dependencies`.
+It is generally preferable to use the higher-level fixture API instead of the low-level dependencies as it's more intuitive, less error-prone and offers more flexibility.
+
+
+.. _param_deps:
+
+Depending on parameterized tests
+--------------------------------
+
+As we have seen earlier, tests define their dependencies by referencing the target tests by their unique name.
+This is straightforward when referring to regular tests, where the test name matches the class name, but it becomes cumbersome when referring to a parameterized test, since no safe assumption can be made as to the number of variants of the test or how its parameters are encoded in its name.
+In order to safely and reliably refer to a parameterized test, you should use the :func:`~reframe.core.pipeline.RegressionMixin.get_variant_nums` and :func:`~reframe.core.pipeline.RegressionMixin.variant_name` class methods as shown in the following example:
+
+.. literalinclude:: ../examples/tutorial/deps/parameterized.py
+ :lines: 6-
+
+In this example, :class:`TestB` depends only on selected variants of :class:`TestA`.
+The :func:`get_variant_nums` method accepts a set of key-value pairs representing the target test parameters and selector functions and returns the list of the variant numbers that correspond to these variants.
+Subsequently, using :func:`variant_name`, we can obtain the actual name of each selected variant.
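Conceptually, variant selection maps parameter values back to variant numbers. The following standalone sketch mimics this mapping for a parameter ``z`` with values 0-9, as in the example above; it is an illustration, not the real :func:`get_variant_nums` implementation:

```python
# Mimic of variant selection for a test parameterized on z=0..9.
# Not the real ReFrame implementation; for illustration only.

z_values = list(range(10))

def get_variant_nums(selector):
    """Return the variant numbers whose `z` value satisfies `selector`."""
    return [i for i, z in enumerate(z_values) if selector(z)]

def variant_name(variant_num):
    return f'TestA %z={z_values[variant_num]}'

# Select the variants with z > 5, as TestB does in the example
variants = get_variant_nums(lambda z: z > 5)
print(variants)                   # [6, 7, 8, 9]
print(variant_name(variants[0]))  # TestA %z=6
```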
+
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -c deps/parameterized.py -l
+
+.. code-block:: console
+
+ [List of matched checks]
+ - TestB /cc291487
+ ^TestA %z=9 /ca1c96ee
+ ^TestA %z=8 /75b6718c
+ ^TestA %z=7 /1d87616c
+ ^TestA %z=6 /06c8e673
+ - TestA %z=5 /536115e0
+ - TestA %z=4 /b1aa0bc1
+ - TestA %z=3 /e62d23e8
+ - TestA %z=2 /423a76e9
+ - TestA %z=1 /8258ae7a
+ - TestA %z=0 /7a14ae93
+ Found 11 check(s)
+
+
+.. _generate-ci-pipeline:
+
+Integrating into a CI pipeline
+==============================
+
+.. versionadded:: 3.4.1
+
+Instead of running your tests, you can ask ReFrame to generate a `child pipeline `__ specification for Gitlab CI.
+This will spawn a CI job for each ReFrame test, respecting the test dependencies.
+You could run your tests in a single job of your Gitlab pipeline, but you would not take advantage of the parallelism across different CI jobs.
+Having a separate CI job per test makes it also easier to spot the failing tests.
+
+As soon as you have set up a `runner `__ for your repository, it is fairly straightforward to use ReFrame to automatically generate the necessary CI steps.
+The following is an example of ``.gitlab-ci.yml`` file that does exactly that:
+
+.. code-block:: yaml
+
+ stages:
+ - generate
+ - test
+
+ generate-pipeline:
+ stage: generate
+ script:
+ - reframe --ci-generate=${CI_PROJECT_DIR}/pipeline.yml -c ${CI_PROJECT_DIR}/path/to/tests
+ artifacts:
+ paths:
+ - ${CI_PROJECT_DIR}/pipeline.yml
+
+ test-jobs:
+ stage: test
+ trigger:
+ include:
+ - artifact: pipeline.yml
+ job: generate-pipeline
+ strategy: depend
+
+
+It defines two stages.
+The first one, called ``generate``, will call ReFrame to generate the pipeline specification for the desired tests.
+All the usual :ref:`test selection options ` can be used to select specific tests.
+ReFrame will process them as usual, but instead of running the selected tests, it will generate the correct steps for running each test individually as a Gitlab job in a child pipeline.
+The generated ReFrame command that will run each individual test reuses the :option:`-C`, :option:`-R`, :option:`-v` and :option:`--mode` options passed to the initial invocation of ReFrame that was used to generate the pipeline.
+Users can define CI-specific execution modes in their configuration in order to pass arbitrary options to the ReFrame invocation in the child pipeline.
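Such a CI-specific execution mode could look as follows in the site configuration. This is only a sketch: the mode name and the option list are examples, not a prescribed setup:

```python
# Hypothetical execution mode for CI runs; it would be selected with
# `--mode=ci` in the initial ReFrame invocation.
site_configuration = {
    'modes': [
        {
            'name': 'ci',
            'options': ['--max-retries=1', '--exec-policy=async']
        }
    ]
}
```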
+
+Finally, we pass the generated CI pipeline file to the second phase as an artifact and we are done!
+If the ``image`` keyword is defined in ``.gitlab-ci.yml``, the emitted pipeline will use the same image as the one defined in the parent pipeline.
+In addition, each job in the generated pipeline will output a separate JUnit report, which can be used to create GitLab badges.
+
+The following figure shows one part of an automatically generated pipeline.
+
+.. figure:: _static/img/gitlab-ci.png
+ :align: center
+
+ :sub:`Snapshot of a Gitlab pipeline generated automatically by ReFrame.`
+
+
+.. note::
+
+ The ReFrame executable must be available in the Gitlab runner that will run the CI jobs.
+
+
+Flexible tests
+==============
+
+.. versionadded:: 2.15
+
+ReFrame can automatically set the number of tasks of a particular test, if its :attr:`num_tasks` attribute is set to a negative value or zero.
+In ReFrame's terminology, such tests are called *flexible*.
+Negative values indicate the minimum number of tasks that are acceptable for this test (a value of ``-4`` indicates that at least ``4`` tasks are required).
+A zero value indicates the default minimum number of tasks which is equal to :attr:`num_tasks_per_node`.
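These semantics can be summarized in a small helper function. This is a plain-Python sketch for illustration, not part of ReFrame:

```python
def min_required_tasks(num_tasks, num_tasks_per_node):
    """Minimum acceptable number of tasks for a flexible test.

    A negative `num_tasks` means "at least that many tasks";
    zero means "at least `num_tasks_per_node` tasks".
    """
    if num_tasks > 0:
        raise ValueError('not a flexible test')

    return -num_tasks if num_tasks < 0 else num_tasks_per_node

print(min_required_tasks(-4, 1))  # 4
print(min_required_tasks(0, 8))   # 8
```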
+
+By default, ReFrame will spawn such a test on all the idle nodes of the current system partition, but this behavior can be adjusted with the :option:`--flex-alloc-nodes` command-line option.
+Flexible tests are very useful for multi-node diagnostic tests.
+
+In this example, we demonstrate this feature by forcing flexible execution in the OSU allreduce benchmark.
+
+.. code-block:: bash
+ :caption: Run with the Docker compose setup.
+
+ reframe --prefix=/scratch/rfm-stage/ -C config/cluster_mpi.py -c mpi/osu.py -n osu_allreduce_test -S num_tasks=0 -r
+
+By default, our version of the OSU allreduce benchmark uses two processes, but setting :attr:`num_tasks` to zero will spawn the test on the full pseudo-cluster, occupying all three available nodes:
+
+.. code-block:: console
+
+ admin@login:~$ squeue
+ JOBID PARTITION NAME USER ST TIME NODES NODELIST(REASON)
+ 5 all rfm_osu_ admin R 1:04 3 nid[00-02]
+
+Note that for flexible tests, :attr:`num_tasks` is updated to the actual number of tasks that ReFrame requested just after the test's job is submitted.
+The actual number of tasks can therefore be used in sanity or performance checking.
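For example, a flexible diagnostic test could verify that every spawned task reported back. The snippet below shows the kind of check a sanity function would perform using the resolved number of tasks; the output format and the hard-coded values are made up for illustration:

```python
import re

# Hypothetical output of a flexible test that was given three tasks
test_output = '''Hello from rank 0
Hello from rank 1
Hello from rank 2
'''

# In a real test, ReFrame updates `num_tasks` after job submission;
# here we hard-code the resolved value
num_tasks = 3

matches = re.findall(r'^Hello from rank \d+$', test_output, re.MULTILINE)
assert len(matches) == num_tasks  # every task reported back
print('sanity check passed')
```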
+
+.. tip::
+
+ If you want to run multiple flexible tests at once that compete for the same nodes, you will have to run them using the serial execution policy: the first test will take all the available idle nodes, causing the rest to fail immediately, since no nodes will be left for them.
+
+
+Testing containerized applications
+==================================
+
+.. versionadded:: 2.20
+
+ReFrame can also be used to test applications that run inside a container.
+First, you will need to enable the container platform support in ReFrame's configuration:
+
+.. literalinclude:: ../examples/tutorial/config/baseline_contplatf.py
+ :caption:
+ :lines: 5-
+
+For each partition, users can define a list of all supported container platforms using the :attr:`~config.systems.partitions.container_platforms` configuration parameter.
+In this case, we define the `Docker `__ platform.
+If your system supports multiple container platforms, ReFrame offers more configuration options, such as setting up the environment or indicating which platform is the default one.
+
+To denote that a test should be launched inside a container, the test must set the :attr:`container_platform` variable.
+Here is an example:
+
+
+.. literalinclude:: ../examples/tutorial/containers/container_test.py
+ :caption:
+ :lines: 5-
+
+A container-based test should be written as a :class:`RunOnlyRegressionTest`.
+The :attr:`container_platform` variable accepts a string that corresponds to the name of the container platform that will be used to run the container for this test.
+Setting this variable is not mandatory; if it is not set, the default container platform of the current partition will be used.
+You can still differentiate your test based on the actual container platform that is being used by checking the ``self.container_platform.name`` variable.
+
+As soon as the container platform to be used is determined, you need to specify the container image to use by setting the :attr:`~reframe.core.containers.ContainerPlatform.image` attribute.
+If the image is not specified, then the container logic is skipped and the test executes as if the :attr:`container_platform` was never set.
+
+The :attr:`~reframe.core.containers.ContainerPlatform.image` is the only mandatory attribute for container-based checks.
+It is important to note that the :attr:`executable` and :attr:`executable_opts` attributes of the actual test are ignored if the containerized code path is taken, i.e., when :attr:`~reframe.core.containers.ContainerPlatform.image` is not :obj:`None`.
+
+Running the test, ReFrame will generate a script that will launch and run the container for the given platform:
+
+.. note::
+
+ This example must be run natively.
+
+
+.. code-block:: bash
+ :caption: Run locally.
+
+ reframe -C examples/tutorial/config/baseline_contplatf.py -c examples/tutorial/containers/container_test.py -r
+
+And this is the generated test job script:
+
+.. code-block:: bash
+
+ #!/bin/bash
+ docker pull ubuntu:18.04
+ docker run --rm -v "/Users/karakasv/Repositories/reframe/stage/tutorialsys/default/builtin/ContainerTest":"/rfm_workdir" -w /rfm_workdir ubuntu:18.04 bash -c 'cat /etc/os-release | tee /rfm_workdir/release.txt'
+
+By default, ReFrame will pull the image, but this can be skipped by setting the :attr:`container_platform` 's :attr:`~reframe.core.containers.ContainerPlatform.pull_image` attribute to :obj:`False`.
+Also, ReFrame will mount the stage directory of the test under ``/rfm_workdir`` inside the container.
+Once the commands are executed, the container is stopped and ReFrame goes on with the sanity and performance checks.
+Besides the stage directory, additional mount points can be specified through the :attr:`container_platform` 's :attr:`~reframe.core.containers.ContainerPlatform.mount_points` attribute:
+
+.. code-block:: python
+
+ self.container_platform.mount_points = [('/path/to/host/dir1', '/path/to/container/mount_point1'),
+ ('/path/to/host/dir2', '/path/to/container/mount_point2')]
+
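Conceptually, each ``(src, dst)`` pair becomes a bind-mount option in the generated launch command, with the stage directory always mounted first. The helper below sketches this translation for Docker; it is an illustration, not ReFrame's actual implementation:

```python
def docker_mount_options(stagedir, mount_points):
    """Build the `-v` options of a `docker run` command."""
    # The stage directory is always mounted under /rfm_workdir
    mounts = [(stagedir, '/rfm_workdir')] + list(mount_points)
    opts = []
    for src, dst in mounts:
        opts += ['-v', f'{src}:{dst}']

    return opts

opts = docker_mount_options('/path/to/stagedir', [('/scratch/data', '/data')])
print(opts)
# ['-v', '/path/to/stagedir:/rfm_workdir', '-v', '/scratch/data:/data']
```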
+The container filesystem is ephemeral, therefore, ReFrame mounts the stage directory under ``/rfm_workdir`` inside the container where the user can copy artifacts as needed.
+These artifacts will therefore be available inside the stage directory after the container execution finishes.
+This is very useful if the artifacts are needed for the sanity or performance checks.
+If the copy is not performed by the default container command, the user can override this command by setting the :attr:`container_platform` 's :attr:`~reframe.core.containers.ContainerPlatform.command` attribute so as to include the appropriate copy commands.
+In the current test, the output of the ``cat /etc/os-release`` is available both in the standard output as well as in the ``release.txt`` file, since we have used the command:
+
+.. code-block:: bash
+
+ bash -c 'cat /etc/os-release | tee /rfm_workdir/release.txt'
+
+
+and ``/rfm_workdir`` corresponds to the stage directory on the host system.
+Therefore, the ``release.txt`` file can now be used in the subsequent sanity checks:
+
+.. literalinclude:: ../examples/tutorial/containers/container_test.py
+ :start-at: @sanity_function
+ :end-at: return
+
+
+
+.. versionchanged:: 3.12.0
+ There is no need any more to explicitly set the :attr:`container_platform` in the test.
+ This is automatically initialized from the default platform of the current partition.
+
+
+Generating tests programmatically
+=================================
+
+You can use ReFrame to generate tests programmatically using the special :func:`~reframe.core.meta.make_test` function.
+This function creates a new test type as if you had typed it manually using the :keyword:`class` keyword.
+You can create arbitrarily complex tests that use variables, parameters, fixtures and pipeline hooks.
+
+In this tutorial, we will use :func:`~reframe.core.meta.make_test` to build a simple domain-specific syntax for generating variants of STREAM benchmarks.
+Our baseline STREAM test is the one presented in the :doc:`tutorial` that uses a build fixture:
+
+.. literalinclude:: ../examples/tutorial/stream/stream_fixtures.py
+ :caption:
+ :lines: 5-
+
+For our example, we would like to create a simpler syntax for generating multiple different :class:`stream_test` versions that could run all at once.
+Here is an example specification file for those tests:
+
+.. literalinclude:: ../examples/tutorial/stream/stream_config.yaml
+ :caption:
+ :lines: 6-
+
+
+The :attr:`thread_scaling` configuration parameter for the last workflow will create a parameterized version of the test using a different number of threads.
+In total, we expect six :class:`stream_test` versions to be generated by this configuration.
+
+The process for generating the actual tests from this spec file comprises three steps and everything happens in a somewhat unconventional, though valid, ReFrame test file:
+
+1. We load the test configuration from a spec file that is passed through the ``STREAM_SPEC_FILE`` environment variable.
+2. Based on the loaded test specs, we generate the actual tests using the :func:`~reframe.core.meta.make_test` function.
+3. We register the generated tests with the framework by applying manually the :func:`@simple_test ` decorator.
+
+The whole code for generating the tests is only a few lines long.
+Let's walk through it.
+
+.. literalinclude:: ../examples/tutorial/stream/stream_workflows.py
+ :caption:
+ :lines: 5-
+
+The :func:`load_specs()` function simply loads the test specs from the YAML test spec file and does some simple sanity checking.
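A minimal version of such a loader could look as follows. To keep the sketch self-contained, it validates a plain list of dictionaries instead of parsing the YAML file, and the schema checked here is only an assumption for illustration:

```python
def load_specs(specs):
    """Minimal sanity checking of the loaded test specs."""
    if not isinstance(specs, list):
        raise ValueError('specs must be a list')

    for spec in specs:
        if not isinstance(spec, dict):
            raise ValueError(f'invalid spec: {spec!r}')

    return specs

# Specs mirroring the example configuration: two plain workflows and
# one with thread scaling over four thread counts
specs = load_specs([
    {'elem_type': 'float',  'array_size': 16777216, 'num_iters': 10},
    {'elem_type': 'double', 'array_size': 1048576,  'num_iters': 100},
    {'elem_type': 'double', 'array_size': 16777216, 'num_iters': 10,
     'thread_scaling': [1, 2, 4, 8]},
])

# Thread scaling expands the last spec into four test variants
num_tests = sum(len(s.get('thread_scaling', [None])) for s in specs)
print(num_tests)  # 6
```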
+
+The :func:`generate_tests()` function consumes the test specs and generates a test for each entry.
+Each test inherits from the base :class:`stream_test` and redefines its :attr:`stream_binaries` fixture so that it is instantiated with the set of variables specified in the test spec.
+Remember that all the STREAM test variables in the YAML file refer to its build phase and thus its build fixture.
+We also treat the :attr:`thread_scaling` spec parameter specially.
+In this case, we add a :attr:`num_threads` parameter to the test and add a post-init hook that sets the test's :attr:`~reframe.core.pipeline.RegressionTest.num_cpus_per_task`.
+
+Finally, we register the generated tests using the :func:`rfm.simple_test` decorator directly;
+remember that :func:`~reframe.core.meta.make_test` returns a class.
+
+The equivalent of our test generation for the third spec is exactly the following:
+
+.. code-block:: python
+
+ @rfm.simple_test
+ class stream_test_2(stream_test):
+ stream_binary = fixture(build_stream, scope='environment',
+ variables={'elem_type': 'double',
+ 'array_size': 16777216,
+ 'num_iters': 10})
+ nthr = parameter([1, 2, 4, 8])
+
+ @run_after('init')
+ def _set_num_threads(self):
+ self.num_threads = self.nthr
+
+
+And here is the listing of generated tests:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ STREAM_SPEC_FILE=stream_config.yaml reframe -C config/baseline_environs.py -c stream/stream_workflows.py -l
+
+.. code-block:: console
+
+ [List of matched checks]
+ - stream_test_2 %nthr=8 %stream_binary.elem_type=double %stream_binary.array_size=16777216 %stream_binary.num_iters=10 /04f5cf62
+ ^build_stream %elem_type=double %array_size=16777216 %num_iters=10 ~tutorialsys:default+gnu 'stream_binary /74d12df7
+ ^build_stream %elem_type=double %array_size=16777216 %num_iters=10 ~tutorialsys:default+clang 'stream_binary /f3a963e3
+ - stream_test_2 %nthr=4 %stream_binary.elem_type=double %stream_binary.array_size=16777216 %stream_binary.num_iters=10 /1c09d755
+ ^build_stream %elem_type=double %array_size=16777216 %num_iters=10 ~tutorialsys:default+gnu 'stream_binary /74d12df7
+ ^build_stream %elem_type=double %array_size=16777216 %num_iters=10 ~tutorialsys:default+clang 'stream_binary /f3a963e3
+ - stream_test_2 %nthr=2 %stream_binary.elem_type=double %stream_binary.array_size=16777216 %stream_binary.num_iters=10 /acb6dc4d
+ ^build_stream %elem_type=double %array_size=16777216 %num_iters=10 ~tutorialsys:default+gnu 'stream_binary /74d12df7
+ ^build_stream %elem_type=double %array_size=16777216 %num_iters=10 ~tutorialsys:default+clang 'stream_binary /f3a963e3
+ - stream_test_2 %nthr=1 %stream_binary.elem_type=double %stream_binary.array_size=16777216 %stream_binary.num_iters=10 /e6eebc18
+ ^build_stream %elem_type=double %array_size=16777216 %num_iters=10 ~tutorialsys:default+gnu 'stream_binary /74d12df7
+ ^build_stream %elem_type=double %array_size=16777216 %num_iters=10 ~tutorialsys:default+clang 'stream_binary /f3a963e3
+ - stream_test_1 %stream_binary.elem_type=double %stream_binary.array_size=1048576 %stream_binary.num_iters=100 /514be749
+ ^build_stream %elem_type=double %array_size=1048576 %num_iters=100 ~tutorialsys:default+gnu 'stream_binary /b841f3c9
+ ^build_stream %elem_type=double %array_size=1048576 %num_iters=100 ~tutorialsys:default+clang 'stream_binary /ade049de
+ - stream_test_0 %stream_binary.elem_type=float %stream_binary.array_size=16777216 %stream_binary.num_iters=10 /c0c0f2bf
+ ^build_stream %elem_type=float %array_size=16777216 %num_iters=10 ~tutorialsys:default+gnu 'stream_binary /6767ce8c
+ ^build_stream %elem_type=float %array_size=16777216 %num_iters=10 ~tutorialsys:default+clang 'stream_binary /246007ff
+ Found 6 check(s)
+
+
+.. note::
+
+ The path passed to ``STREAM_SPEC_FILE`` is relative to the test directory.
+ Since version 4.2, ReFrame changes to the test directory before loading a test file.
+ In prior versions you had to specify the path relative to the current working directory.
+
+
+Using the Flux framework scheduler
+==================================
+
+This is a how-to showing how to use ReFrame with the `Flux
+Framework `__. First, build the
+container from the root of the ReFrame source tree.
+
+.. code-block:: bash
+
+ $ docker build -f examples/tutorial/dockerfiles/flux.dockerfile -t flux-reframe .
+
+Then shell inside, optionally binding the present working directory if
+you want to develop.
+
+.. code-block:: bash
+
+ $ docker run -it -v $PWD:/code flux-reframe
+ $ docker run -it flux-reframe
+
+Note that if you build the local repository, you'll need to bootstrap
+and install again, as the ``bin`` directory has been overwritten!
+
+.. code-block:: bash
+
+ # In case of problems with pip, first clean the `external` directory with `rm -rf external`
+ ./bootstrap.sh
+
+And then reframe will again be in the local ``bin`` directory:
+
+.. code-block:: bash
+
+ # which reframe
+ /code/bin/reframe
+
+Then we can run ReFrame with the custom config `config.py `__
+for Flux.
+
+.. code-block:: bash
+
+ # What tests are under examples/howto/flux?
+ $ cd examples/howto/flux
+ $ reframe -c . -C settings.py -l
+
+.. code-block:: console
+
+ [ReFrame Setup]
+ version: 4.0.0-dev.1
+ command: '/code/bin/reframe -c examples/howto/flux -C examples/howto/flux/settings.py -l'
+ launched by: root@b1f6650222bc
+ working directory: '/code'
+ settings file: 'examples/howto/flux/settings.py'
+ check search path: '/code/examples/howto/flux'
+ stage directory: '/code/stage'
+ output directory: '/code/output'
+
+ [List of matched checks]
+ - EchoRandTest /66b93401
+ Found 1 check(s)
+
+ Log file(s) saved in '/tmp/rfm-ilqg7fqg.log'
+
+This also works:
+
+.. code-block:: bash
+ :caption: Run in the Flux container.
+
+ $ reframe -c examples/howto/flux -C examples/howto/flux/settings.py -l
+
+And then to run tests, just replace ``-l`` (for list) with ``-r`` or
+``--run`` (for run):
+
+.. code-block:: bash
+ :caption: Run in the Flux container.
+
+ $ reframe -c examples/howto/flux -C examples/howto/flux/settings.py --run
+
+.. code:: console
+
+ root@b1f6650222bc:/code# reframe -c examples/howto/flux -C examples/howto/flux/settings.py --run
+ [ReFrame Setup]
+ version: 4.0.0-dev.1
+ command: '/code/bin/reframe -c examples/howto/flux -C examples/howto/flux/settings.py --run'
+ launched by: root@b1f6650222bc
+ working directory: '/code'
+ settings file: 'examples/howto/flux/settings.py'
+ check search path: '/code/examples/howto/flux'
+ stage directory: '/code/stage'
+ output directory: '/code/output'
+
+ [==========] Running 1 check(s)
+ [==========] Started on Fri Sep 16 20:47:15 2022
+
+ [----------] start processing checks
+ [ RUN ] EchoRandTest /66b93401 @generic:default+builtin
+ [ OK ] (1/1) EchoRandTest /66b93401 @generic:default+builtin
+ [----------] all spawned checks have finished
+
+ [ PASSED ] Ran 1/1 test case(s) from 1 check(s) (0 failure(s), 0 skipped)
+ [==========] Finished on Fri Sep 16 20:47:15 2022
+ Run report saved in '/root/.reframe/reports/run-report.json'
+ Log file(s) saved in '/tmp/rfm-0avso9nb.log'
+
+For advanced users or developers, here is how to run tests within the container:
+
+Testing
+-------
+
+.. code-block:: console
+
+ ./test_reframe.py --rfm-user-config=examples/howto/flux/settings.py unittests/test_schedulers.py -xs
+
+
+Building test libraries and utilities
+=====================================
+
+ReFrame tests are extremely modular.
+You can create libraries of base tests and utilities that others can use or extend.
+You can organize the source code of a test library as you would with regular Python code.
+Let's see a made-up example for demonstration purposes:
+
+.. code-block:: console
+
+ ~/reframe-examples/howto
+ ├── testlib
+ │  ├── __init__.py
+ │  ├── simple.py
+ │  └── utility
+ │  └── __init__.py
+ └── testlib_example.py
+
+
+The ``testlib_example.py`` is fairly simple:
+it extends the :class:`simple_echo_check` from the test library and sets the message.
+
+.. literalinclude:: ../examples/howto/testlib_example.py
+ :caption:
+ :lines: 6-
+
+The :class:`simple_echo_check` echoes "Hello, " followed by a message and asserts the output.
+It also uses a dummy fixture that it imports from a utility module.
+
+.. literalinclude:: ../examples/howto/testlib/simple.py
+ :caption:
+ :lines: 6-
+
+Note that the :class:`simple_echo_check` is also decorated as a :func:`@simple_test `, meaning that it can be executed as a stand-alone check.
+This is typical when you are building test libraries:
+you want the base tests to be complete and functional, making minimal assumptions about the target system/environment.
+You can then further specialize the derived tests and add more constraints to their :attr:`valid_systems` or :attr:`valid_prog_environs`.
+
+Let's try running both the library and the derived tests:
+
+.. code-block:: bash
+ :caption: Running the derived test (in the single-node container)
+
+ reframe -c reframe-examples/howto/testlib_example.py -r
+
+.. code-block:: console
+
+ [----------] start processing checks
+ [ RUN ] dummy_fixture ~generic:default+builtin /1fae4a8b @generic:default+builtin
+ [ OK ] (1/2) dummy_fixture ~generic:default+builtin /1fae4a8b @generic:default+builtin
+ [ RUN ] HelloFoo /2ecd9f04 @generic:default+builtin
+ [ OK ] (2/2) HelloFoo /2ecd9f04 @generic:default+builtin
+ [----------] all spawned checks have finished
+
+.. code-block:: bash
+ :caption: Running the library test (in the single-node container)
+
+ reframe -c reframe-examples/howto/testlib/simple.py -r
+
+.. code-block:: console
+
+ [----------] start processing checks
+ [ RUN ] dummy_fixture ~generic:default+builtin /1fae4a8b @generic:default+builtin
+ [ OK ] (1/2) dummy_fixture ~generic:default+builtin /1fae4a8b @generic:default+builtin
+ [ RUN ] simple_echo_check /8e1b0090 @generic:default+builtin
+ [ OK ] (2/2) simple_echo_check /8e1b0090 @generic:default+builtin
+ [----------] all spawned checks have finished
+
+
+There is a little trick that makes running both the library test and the derived test so painless, despite the relative import of the :obj:`utility` module by the library test.
+ReFrame loads the test files by importing them as Python modules using the file's basename as the module name.
+It also adds temporarily to the ``sys.path`` the parent directory of the test file.
+This is enough to load the ``testlib.simple`` module in the ``testlib_example.py`` and since the ``simple`` module has a parent, Python knows how to resolve the relative import in ``from .utility import dummy_fixture`` (it will be resolved as ``testlib.utility``).
+However, when loading the test library file directly, Python would not know the parent module of ``utility`` and would complain.
+The trick is to create an empty ``testlib/__init__.py`` file, so as to tell ReFrame to load also ``testlib`` as a parent module.
+Whenever ReFrame encounters an ``__init__.py`` file down the directory path leading to a test file, it will load it as a parent module, thus allowing relative imports to succeed.
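The mechanism can be demonstrated with plain Python. The script below recreates the layout above in a temporary directory and imports the library file as a submodule of its parent package, which is exactly how the relative import gets resolved:

```python
import importlib
import os
import sys
import tempfile

# Recreate the testlib layout in a temporary directory
root = tempfile.mkdtemp()
pkg = os.path.join(root, 'testlib')
util = os.path.join(pkg, 'utility')
os.makedirs(util)

# The __init__.py files turn the directories into packages
open(os.path.join(pkg, '__init__.py'), 'w').close()
with open(os.path.join(util, '__init__.py'), 'w') as f:
    f.write('dummy_fixture = "dummy"\n')

# The library file uses a relative import, as in the example above
with open(os.path.join(pkg, 'simple.py'), 'w') as f:
    f.write('from .utility import dummy_fixture\n')

# Like ReFrame, add the parent directory to sys.path and load the
# test file as a submodule of its parent package
sys.path.insert(0, root)
mod = importlib.import_module('testlib.simple')
print(mod.dummy_fixture)  # dummy
```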
+
+
+Debugging
+=========
+
+ReFrame tests are Python classes inside Python source files, so the usual debugging techniques for Python apply.
+However, ReFrame will filter some errors and stack traces by default in order to keep the output clean.
+Generally, full stack traces for user programming errors will not be printed and will not block the test loading process.
+If a test has errors and cannot be loaded, an error message will be printed and the loading of the remaining tests will continue.
+In the following, we have inserted a small typo in the ``stream_variables.py`` tutorial example:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_variables.py -l
+
+.. code-block:: console
+
+ WARNING: skipping test file '/home/user/reframe-examples/tutorial/stream/stream_variables.py': name error: stream/stream_variables.py:30: name 'varible' is not defined
+ num_threads = varible(int, value=0)
+ (rerun with '-v' for more information)
+
+Rerunning with increased verbosity as the message suggests will give a full traceback.
+
+.. note::
+
+ ReFrame cannot always track a user error back to its source, especially for some of the `builtin `__ functionality.
+ In such cases, ReFrame will just print the error message but not the source code context.
+
+.. tip::
+ The :option:`-v` option can be specified multiple times to increase the verbosity level further.
+
+
+Debugging sanity and performance patterns
+-----------------------------------------
+
+When creating a new test that requires a complex output parsing for the sanity checking or for extracting the figures of merit, tuning the functions decorated by :attr:`@sanity_function` or :attr:`@performance_function` may involve some trial and error to debug the complex regular expressions required.
+For lightweight tests which execute in a few seconds, this trial and error may not be an issue at all.
+However, when dealing with tests which take longer to run, this method can quickly become tedious and inefficient.
+
+.. tip::
+ When dealing with ``make``-based projects which take a long time to compile, you can use the command line option :option:`--dont-restage` in order to speed up the compile stage in subsequent runs.
+
+When a test fails, ReFrame will keep the test output in the stage directory after its execution, which means that one can load this output into a Python shell or another helper script without having to rerun the expensive test again.
+If the test is not failing but the user still wants to experiment or modify the existing sanity or performance functions, the command line option :option:`--keep-stage-files` can be used when running ReFrame to avoid deleting the stage directory.
+With the executable's output available in the stage directory, one can simply use the `re `_ module to debug regular expressions as shown below.
+
+.. code-block:: python
+
+ >>> import re
+
+ >>> # Read the test's output
+ >>> with open(the_output_file, 'r') as f:
+ ... test_output = ''.join(f.readlines())
+ ...
+ >>> # Evaluate the regular expression
+ >>> re.findall(the_regex_pattern, test_output, re.MULTILINE)
+
+As an alternative to the `re `_ module, one could use the utility functions of ReFrame's :mod:`~reframe.utility.sanity` module directly from the Python shell.
+In order to do so, if ReFrame was installed manually using the ``bootstrap.sh`` script, one has to make all the Python modules from the ``external`` directory accessible to the Python shell, as shown below.
+
+.. code-block:: python
+
+ >>> import sys
+ >>> import os
+
+ >>> # Make ReFrame's dependencies available
+ >>> sys.path = ['/path/to/reframe/prefix/external'] + sys.path
+
+ >>> # Import ReFrame-provided sanity functions
+ >>> import reframe.utility.sanity as sn
+
+ >>> # Evaluate the regular expression
+ >>> assert sn.evaluate(sn.assert_found(the_regex_pattern, the_output_file))
+
+
+Debugging test loading
+----------------------
+
+If you are new to ReFrame, you might wonder sometimes why your tests are not loading or why your tests are not running on the partition they were supposed to run.
+This can be due to ReFrame picking the wrong configuration entry or to your test not being written properly (not decorated, no :attr:`~reframe.core.pipeline.RegressionTest.valid_systems` etc.).
+If you try to load a test file and list its tests after increasing the verbosity level twice, you will get enough output to help you debug such issues.
+Let's try loading the ``stream_variables.py`` file:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_variables.py -l -vv
+
+.. literalinclude:: listings/verbose_test_loading.txt
+ :language: console
+
+You can see all the different phases ReFrame's frontend goes through when loading a test.
+After loading the configuration, ReFrame will print out its relevant environment variables and will start examining the given files in order to find and load ReFrame tests.
+Before attempting to load a file, it will validate it and check if it looks like a ReFrame test.
+If it does, it will load that file by importing it.
+This is where any ReFrame tests are instantiated and initialized (see ``Loaded 3 test(s)``) and where the actual test cases (combinations of tests, system partitions and environments) are generated.
+Then the test cases are filtered based on the various filtering command-line options, as well as on the programming environments that are defined for the currently selected system.
+Finally, the test case dependency graph is built and everything is ready for running (or listing).
+
+Try passing a specific system or partition with the :option:`--system` option or modify the test (e.g., removing the decorator that registers it) and see how the logs change.
+
+
+
+Extending the framework
+=======================
+
+.. _custom-launchers:
+
+Implementing a parallel launcher backend
+----------------------------------------
+
+It is not uncommon for sites to supply their own parallel launcher alternatives that build on top of existing launchers to provide additional functionality or implement site-specific policies.
+In ReFrame it is straightforward to implement a custom parallel launcher backend without having to modify the framework code.
+
+Let's see what a builtin launcher looks like.
+The following is the actual implementation of the ``mpirun`` launcher in ReFrame:
+
+.. literalinclude:: ../reframe/core/launchers/mpi.py
+ :pyobject: MpirunLauncher
+
+
+Each launcher must derive from the abstract base class :class:`~reframe.core.launchers.JobLauncher` and needs to implement the :func:`~reframe.core.launchers.JobLauncher.command` method and, optionally, override the default :func:`~reframe.core.launchers.JobLauncher.run_command` method.
+
+The :func:`~reframe.core.launchers.JobLauncher.command` method returns a list of command tokens, which the :func:`~reframe.core.launchers.JobLauncher.run_command` method combines with any user-supplied job launcher :attr:`~reframe.core.launchers.JobLauncher.options` to generate the actual launcher command line.
+Notice that you can use the ``job`` argument to obtain job-specific information that will allow you to construct the correct launcher invocation.
+
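To make this interplay concrete, the following self-contained sketch mimics the ``command``/``run_command`` split outside of ReFrame; the ``Job`` stand-in and its attributes are simplified assumptions, not ReFrame's actual classes:

```python
class Job:
    """Minimal stand-in for a ReFrame job object (assumed attributes)."""
    def __init__(self, num_tasks):
        self.num_tasks = num_tasks


class MpirunLikeLauncher:
    def __init__(self):
        self.options = []   # user-supplied launcher options

    def command(self, job):
        # Job-specific tokens of the launcher invocation
        return ['mpirun', '-np', str(job.num_tasks)]

    def run_command(self, job):
        # Default behaviour: command tokens followed by the user options
        return ' '.join(self.command(job) + self.options)


launcher = MpirunLikeLauncher()
launcher.options = ['--bind-to', 'core']
print(launcher.run_command(Job(num_tasks=4)))  # mpirun -np 4 --bind-to core
```

A custom launcher typically only needs to redefine ``command``; the default ``run_command`` takes care of appending the user options.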
+If you use a Python-based configuration file, you can define your custom launcher directly inside your config as follows:
+
+.. code-block:: python
+
+ from reframe.core.backends import register_launcher
+ from reframe.core.launchers import JobLauncher
+
+
+ @register_launcher('slrun')
+ class MySmartLauncher(JobLauncher):
+ def command(self, job):
+ return ['slrun', ...]
+
+ site_configuration = {
+ 'systems': [
+ {
+ 'name': 'my_system',
+ 'partitions': [
+ {
+ 'name': 'my_partition',
+ 'launcher': 'slrun'
+ ...
+ }
+ ],
+ ...
+ },
+ ...
+ ],
+ ...
+ }
+
+
+.. note::
+
+ In versions prior to 4.0, launchers could only be implemented inside the source code tree of ReFrame.
diff --git a/docs/index.rst b/docs/index.rst
index de85eca7f5..6d35d027a8 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -18,6 +18,7 @@ Finally, ReFrame offers a powerful and efficient runtime for running and managin
Publications
============
+* Slides [`pdf `__][`talk `__] @ `9th EasyBuild User Meeting 2024 `__.
* Slides [`part 1 `__][`part 2 `__][`talk `__] @ `8th EasyBuild User Meeting 2023 `__.
* Slides [`pdf `__] @ `7th EasyBuild User Meeting 2022 `__.
* Slides [`pdf `__] @ `6th EasyBuild User Meeting 2021 `__.
@@ -44,12 +45,11 @@ Webinars and Tutorials
.. toctree::
:caption: Table of Contents
- :maxdepth: 2
started
- whats_new_40
- tutorials
- configure
+ tutorial
+ howto
topics
manuals
+ whats_new_40
hpctestlib
diff --git a/docs/listings/alltests_daint.txt b/docs/listings/alltests_daint.txt
deleted file mode 100644
index 5f2f96c200..0000000000
--- a/docs/listings/alltests_daint.txt
+++ /dev/null
@@ -1,146 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2
- command: './bin/reframe -c tutorials/basics/ -R -n HelloMultiLangTest|HelloThreadedExtended2Test|StreamWithRefTest --performance-report -r'
- launched by: user@host
- working directory: '/home/user/Devel/reframe'
- settings files: '', '/home/user/Devel/reframe/tutorials/config/daint.py'
- check search path: (R) '/home/user/Devel/reframe/tutorials/basics'
- stage directory: '/home/user/Devel/reframe/stage'
- output directory: '/home/user/Devel/reframe/output'
- log files: '/tmp/rfm-nyqs7jb9.log'
-
-[==========] Running 4 check(s)
-[==========] Started on Tue Nov 15 18:20:32 2022
-
-[----------] start processing checks
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:login+builtin
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:login+gnu
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:login+intel
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:login+nvidia
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:login+cray
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:gpu+gnu
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:gpu+intel
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:gpu+nvidia
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:gpu+cray
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:mc+gnu
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:mc+intel
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:mc+nvidia
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @daint:mc+cray
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:login+builtin
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:login+gnu
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:login+intel
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:login+nvidia
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:login+cray
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:gpu+gnu
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:gpu+intel
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:gpu+nvidia
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:gpu+cray
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:mc+gnu
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:mc+intel
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:mc+nvidia
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @daint:mc+cray
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:login+builtin
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:login+gnu
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:login+intel
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:login+nvidia
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:login+cray
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:gpu+gnu
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:gpu+intel
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:gpu+nvidia
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:gpu+cray
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:mc+gnu
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:mc+intel
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:mc+nvidia
-[ RUN ] HelloThreadedExtended2Test /57223829 @daint:mc+cray
-[ RUN ] StreamWithRefTest /f925207b @daint:login+gnu
-[ RUN ] StreamWithRefTest /f925207b @daint:gpu+gnu
-[ RUN ] StreamWithRefTest /f925207b @daint:mc+gnu
-[ OK ] ( 1/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:login+builtin
-[ OK ] ( 2/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:login+gnu
-[ OK ] ( 3/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:login+intel
-[ OK ] ( 4/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:login+nvidia
-[ OK ] ( 5/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:login+cray
-[ OK ] ( 6/42) HelloMultiLangTest %lang=c /7cfa870e @daint:login+builtin
-[ OK ] ( 7/42) HelloMultiLangTest %lang=c /7cfa870e @daint:login+gnu
-[ OK ] ( 8/42) HelloMultiLangTest %lang=c /7cfa870e @daint:login+intel
-[ OK ] ( 9/42) HelloMultiLangTest %lang=c /7cfa870e @daint:login+nvidia
-[ OK ] (10/42) HelloMultiLangTest %lang=c /7cfa870e @daint:login+cray
-[ OK ] (11/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:gpu+cray
-[ OK ] (12/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:mc+nvidia
-[ OK ] (13/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:mc+cray
-[ OK ] (14/42) HelloMultiLangTest %lang=c /7cfa870e @daint:mc+cray
-[ OK ] (15/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:gpu+nvidia
-[ OK ] (16/42) HelloMultiLangTest %lang=c /7cfa870e @daint:gpu+intel
-[ OK ] (17/42) HelloMultiLangTest %lang=c /7cfa870e @daint:gpu+nvidia
-[ OK ] (18/42) HelloMultiLangTest %lang=c /7cfa870e @daint:mc+intel
-[ OK ] (19/42) HelloThreadedExtended2Test /57223829 @daint:login+builtin
-[ OK ] (20/42) HelloThreadedExtended2Test /57223829 @daint:login+gnu
-[ OK ] (21/42) HelloThreadedExtended2Test /57223829 @daint:login+intel
-[ OK ] (22/42) HelloMultiLangTest %lang=c /7cfa870e @daint:gpu+cray
-[ OK ] (23/42) HelloMultiLangTest %lang=c /7cfa870e @daint:mc+gnu
-[ OK ] (24/42) HelloThreadedExtended2Test /57223829 @daint:login+nvidia
-[ OK ] (25/42) HelloThreadedExtended2Test /57223829 @daint:login+cray
-[ OK ] (26/42) HelloMultiLangTest %lang=c /7cfa870e @daint:mc+nvidia
-[ OK ] (27/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:gpu+gnu
-[ OK ] (28/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:gpu+intel
-[ OK ] (29/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:mc+gnu
-[ OK ] (30/42) HelloMultiLangTest %lang=cpp /71bf65a3 @daint:mc+intel
-[ OK ] (31/42) HelloMultiLangTest %lang=c /7cfa870e @daint:gpu+gnu
-[ OK ] (32/42) StreamWithRefTest /f925207b @daint:login+gnu
-P: Copy: 71061.6 MB/s (r:0, l:None, u:None)
-P: Scale: 44201.5 MB/s (r:0, l:None, u:None)
-P: Add: 48178.5 MB/s (r:0, l:None, u:None)
-P: Triad: 48063.3 MB/s (r:0, l:None, u:None)
-[ OK ] (33/42) HelloThreadedExtended2Test /57223829 @daint:mc+cray
-[ OK ] (34/42) HelloThreadedExtended2Test /57223829 @daint:mc+intel
-[ OK ] (35/42) HelloThreadedExtended2Test /57223829 @daint:mc+gnu
-[ OK ] (36/42) HelloThreadedExtended2Test /57223829 @daint:mc+nvidia
-[ OK ] (37/42) StreamWithRefTest /f925207b @daint:mc+gnu
-P: Copy: 52660.1 MB/s (r:0, l:None, u:None)
-P: Scale: 33117.6 MB/s (r:0, l:None, u:None)
-P: Add: 34876.7 MB/s (r:0, l:None, u:None)
-P: Triad: 35150.7 MB/s (r:0, l:None, u:None)
-[ OK ] (38/42) HelloThreadedExtended2Test /57223829 @daint:gpu+intel
-[ OK ] (39/42) HelloThreadedExtended2Test /57223829 @daint:gpu+cray
-[ OK ] (40/42) HelloThreadedExtended2Test /57223829 @daint:gpu+nvidia
-[ OK ] (41/42) HelloThreadedExtended2Test /57223829 @daint:gpu+gnu
-[ OK ] (42/42) StreamWithRefTest /f925207b @daint:gpu+gnu
-P: Copy: 49682.3 MB/s (r:0, l:None, u:None)
-P: Scale: 34452.3 MB/s (r:0, l:None, u:None)
-P: Add: 38030.7 MB/s (r:0, l:None, u:None)
-P: Triad: 38379.0 MB/s (r:0, l:None, u:None)
-[----------] all spawned checks have finished
-
-[ PASSED ] Ran 42/42 test case(s) from 4 check(s) (0 failure(s), 0 skipped)
-[==========] Finished on Tue Nov 15 18:22:48 2022
-
-================================================================================
-PERFORMANCE REPORT
---------------------------------------------------------------------------------
-[StreamWithRefTest /f925207b @daint:login:gnu]
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 71061.6 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 44201.5 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 48178.5 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 48063.3 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamWithRefTest /f925207b @daint:gpu:gnu]
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 49682.3 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 34452.3 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 38030.7 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 38379.0 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamWithRefTest /f925207b @daint:mc:gnu]
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 52660.1 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 33117.6 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 34876.7 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 35150.7 MB/s (r: 0 MB/s l: -inf% u: +inf%)
---------------------------------------------------------------------------------
-Run report saved in '/home/user/.reframe/reports/run-report-1.json'
-Log file(s) saved in '/tmp/rfm-nyqs7jb9.log'
diff --git a/docs/listings/deps_complex_run.txt b/docs/listings/deps_complex_run.txt
index a8fe0ac662..b575f42ea9 100644
--- a/docs/listings/deps_complex_run.txt
+++ b/docs/listings/deps_complex_run.txt
@@ -1,211 +1,116 @@
[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c unittests/resources/checks_unlisted/deps_complex.py -r'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', '/home/user/Repositories/reframe/tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/unittests/resources/checks_unlisted/deps_complex.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-_008n_el.log'
+ version: 4.6.0-dev.2
+ command: '/usr/local/share/reframe/bin/reframe -c deps/deps_complex.py -r --nocolor'
+ launched by: user@myhost
+ working directory: '/home/user/reframe-examples/tutorial'
+ settings files: ''
+ check search path: '/home/user/reframe-examples/tutorial/deps/deps_complex.py'
+ stage directory: '/home/user/reframe-examples/tutorial/stage'
+ output directory: '/home/user/reframe-examples/tutorial/output'
+ log files: '/tmp/rfm-01gkxmq0.log'
[==========] Running 10 check(s)
-[==========] Started on Sat Nov 12 19:01:00 2022
+[==========] Started on Tue Apr 16 21:35:34 2024+0000
[----------] start processing checks
-[ RUN ] T0 /c9c2be9f @tresa:default+gnu
-[ RUN ] T0 /c9c2be9f @tresa:default+clang
-[ OK ] ( 1/20) T0 /c9c2be9f @tresa:default+gnu
-[ OK ] ( 2/20) T0 /c9c2be9f @tresa:default+clang
-[ RUN ] T4 /11ee5e9a @tresa:default+gnu
-[ RUN ] T4 /11ee5e9a @tresa:default+clang
-[ OK ] ( 3/20) T4 /11ee5e9a @tresa:default+gnu
-[ OK ] ( 4/20) T4 /11ee5e9a @tresa:default+clang
-[ RUN ] T5 /020d01e5 @tresa:default+gnu
-[ RUN ] T5 /020d01e5 @tresa:default+clang
-[ OK ] ( 5/20) T5 /020d01e5 @tresa:default+gnu
-[ OK ] ( 6/20) T5 /020d01e5 @tresa:default+clang
-[ RUN ] T1 /1f93603d @tresa:default+gnu
-[ RUN ] T1 /1f93603d @tresa:default+clang
-[ OK ] ( 7/20) T1 /1f93603d @tresa:default+gnu
-[ OK ] ( 8/20) T1 /1f93603d @tresa:default+clang
-[ RUN ] T8 /605fc1d6 @tresa:default+gnu
-[ FAIL ] ( 9/20) T8 /605fc1d6 @tresa:default+gnu
-==> test failed during 'setup': test staged in '/home/user/Repositories/reframe/stage/tresa/default/gnu/T8'
-[ RUN ] T8 /605fc1d6 @tresa:default+clang
-[ FAIL ] (10/20) T8 /605fc1d6 @tresa:default+clang
-==> test failed during 'setup': test staged in '/home/user/Repositories/reframe/stage/tresa/default/clang/T8'
-[ FAIL ] (11/20) T9 /78a78a4e @tresa:default+gnu
+[ RUN ] T0 /c9c2be9f @generic:default+builtin
+[ OK ] ( 1/10) T0 /c9c2be9f @generic:default+builtin
+[ RUN ] T4 /11ee5e9a @generic:default+builtin
+[ OK ] ( 2/10) T4 /11ee5e9a @generic:default+builtin
+[ RUN ] T5 /020d01e5 @generic:default+builtin
+[ OK ] ( 3/10) T5 /020d01e5 @generic:default+builtin
+[ RUN ] T1 /1f93603d @generic:default+builtin
+[ OK ] ( 4/10) T1 /1f93603d @generic:default+builtin
+[ RUN ] T8 /605fc1d6 @generic:default+builtin
+[ FAIL ] ( 5/10) T8 /605fc1d6 @generic:default+builtin
+==> test failed during 'setup': test staged in '/home/user/reframe-examples/tutorial/stage/generic/default/builtin/T8'
+[ FAIL ] ( 6/10) T9 /78a78a4e @generic:default+builtin
==> test failed during 'startup': test staged in None
-[ FAIL ] (12/20) T9 /78a78a4e @tresa:default+clang
+[ RUN ] T6 /6dbdaf93 @generic:default+builtin
+[ OK ] ( 7/10) T6 /6dbdaf93 @generic:default+builtin
+[ RUN ] T2 /0f617ba9 @generic:default+builtin
+[ RUN ] T3 /5dd67f7f @generic:default+builtin
+[ FAIL ] ( 8/10) T2 /0f617ba9 @generic:default+builtin
+==> test failed during 'sanity': test staged in '/home/user/reframe-examples/tutorial/stage/generic/default/builtin/T2'
+[ FAIL ] ( 9/10) T7 /f005e93d @generic:default+builtin
==> test failed during 'startup': test staged in None
-[ RUN ] T6 /6dbdaf93 @tresa:default+gnu
-[ RUN ] T6 /6dbdaf93 @tresa:default+clang
-[ OK ] (13/20) T6 /6dbdaf93 @tresa:default+gnu
-[ OK ] (14/20) T6 /6dbdaf93 @tresa:default+clang
-[ RUN ] T2 /0f617ba9 @tresa:default+gnu
-[ RUN ] T2 /0f617ba9 @tresa:default+clang
-[ RUN ] T3 /5dd67f7f @tresa:default+gnu
-[ RUN ] T3 /5dd67f7f @tresa:default+clang
-[ FAIL ] (15/20) T2 /0f617ba9 @tresa:default+gnu
-==> test failed during 'sanity': test staged in '/home/user/Repositories/reframe/stage/tresa/default/gnu/T2'
-[ FAIL ] (16/20) T2 /0f617ba9 @tresa:default+clang
-==> test failed during 'sanity': test staged in '/home/user/Repositories/reframe/stage/tresa/default/clang/T2'
-[ FAIL ] (17/20) T7 /f005e93d @tresa:default+gnu
-==> test failed during 'startup': test staged in None
-[ FAIL ] (18/20) T7 /f005e93d @tresa:default+clang
-==> test failed during 'startup': test staged in None
-[ OK ] (19/20) T3 /5dd67f7f @tresa:default+gnu
-[ OK ] (20/20) T3 /5dd67f7f @tresa:default+clang
+[ OK ] (10/10) T3 /5dd67f7f @generic:default+builtin
[----------] all spawned checks have finished
-[ FAILED ] Ran 20/20 test case(s) from 10 check(s) (8 failure(s), 0 skipped)
-[==========] Finished on Sat Nov 12 19:01:03 2022
-
+[ FAILED ] Ran 10/10 test case(s) from 10 check(s) (4 failure(s), 0 skipped, 0 aborted)
+[==========] Finished on Tue Apr 16 21:35:36 2024+0000
================================================================================
SUMMARY OF FAILURES
--------------------------------------------------------------------------------
-FAILURE INFO for T8
- * Expanded name: T8
+FAILURE INFO for T8 (run: 1/1)
* Description:
- * System partition: tresa:default
- * Environment: gnu
- * Stage directory: /home/user/Repositories/reframe/stage/tresa/default/gnu/T8
+ * System partition: generic:default
+ * Environment: builtin
+ * Stage directory: /home/user/reframe-examples/tutorial/stage/generic/default/builtin/T8
* Node list:
* Job type: local (id=None)
* Dependencies (conceptual): ['T1']
- * Dependencies (actual): [('T1', 'tresa:default', 'gnu')]
+ * Dependencies (actual): [('T1', 'generic:default', 'builtin')]
* Maintainers: []
* Failing phase: setup
- * Rerun with '-n /605fc1d6 -p gnu --system tresa:default -r'
+ * Rerun with '-n /605fc1d6 -p builtin --system generic:default -r'
* Reason: exception
Traceback (most recent call last):
- File "/home/user/Repositories/reframe/reframe/frontend/executors/__init__.py", line 303, in _safe_call
+ File "/usr/local/share/reframe/reframe/frontend/executors/__init__.py", line 317, in _safe_call
return fn(*args, **kwargs)
- File "/home/user/Repositories/reframe/reframe/core/hooks.py", line 101, in _fn
+ File "/usr/local/share/reframe/reframe/core/hooks.py", line 111, in _fn
getattr(obj, h.__name__)()
- File "/home/user/Repositories/reframe/reframe/core/hooks.py", line 32, in _fn
+ File "/usr/local/share/reframe/reframe/core/hooks.py", line 38, in _fn
func(*args, **kwargs)
- File "/home/user/Repositories/reframe/unittests/resources/checks_unlisted/deps_complex.py", line 180, in fail
+ File "/home/user/reframe-examples/tutorial/deps/deps_complex.py", line 180, in fail
raise Exception
Exception
--------------------------------------------------------------------------------
-FAILURE INFO for T8
- * Expanded name: T8
- * Description:
- * System partition: tresa:default
- * Environment: clang
- * Stage directory: /home/user/Repositories/reframe/stage/tresa/default/clang/T8
- * Node list:
- * Job type: local (id=None)
- * Dependencies (conceptual): ['T1']
- * Dependencies (actual): [('T1', 'tresa:default', 'clang')]
- * Maintainers: []
- * Failing phase: setup
- * Rerun with '-n /605fc1d6 -p clang --system tresa:default -r'
- * Reason: exception
-Traceback (most recent call last):
- File "/home/user/Repositories/reframe/reframe/frontend/executors/__init__.py", line 303, in _safe_call
- return fn(*args, **kwargs)
- File "/home/user/Repositories/reframe/reframe/core/hooks.py", line 101, in _fn
- getattr(obj, h.__name__)()
- File "/home/user/Repositories/reframe/reframe/core/hooks.py", line 32, in _fn
- func(*args, **kwargs)
- File "/home/user/Repositories/reframe/unittests/resources/checks_unlisted/deps_complex.py", line 180, in fail
- raise Exception
-Exception
-
---------------------------------------------------------------------------------
-FAILURE INFO for T9
- * Expanded name: T9
- * Description:
- * System partition: tresa:default
- * Environment: gnu
- * Stage directory: None
- * Node list:
- * Job type: local (id=None)
- * Dependencies (conceptual): ['T8']
- * Dependencies (actual): [('T8', 'tresa:default', 'gnu')]
- * Maintainers: []
- * Failing phase: startup
- * Rerun with '-n /78a78a4e -p gnu --system tresa:default -r'
- * Reason: task dependency error: dependencies failed
---------------------------------------------------------------------------------
-FAILURE INFO for T9
- * Expanded name: T9
+FAILURE INFO for T9 (run: 1/1)
* Description:
- * System partition: tresa:default
- * Environment: clang
+ * System partition: generic:default
+ * Environment: builtin
* Stage directory: None
* Node list:
* Job type: local (id=None)
* Dependencies (conceptual): ['T8']
- * Dependencies (actual): [('T8', 'tresa:default', 'clang')]
+ * Dependencies (actual): [('T8', 'generic:default', 'builtin')]
* Maintainers: []
* Failing phase: startup
- * Rerun with '-n /78a78a4e -p clang --system tresa:default -r'
+ * Rerun with '-n /78a78a4e -p builtin --system generic:default -r'
* Reason: task dependency error: dependencies failed
--------------------------------------------------------------------------------
-FAILURE INFO for T2
- * Expanded name: T2
+FAILURE INFO for T2 (run: 1/1)
* Description:
- * System partition: tresa:default
- * Environment: gnu
- * Stage directory: /home/user/Repositories/reframe/stage/tresa/default/gnu/T2
- * Node list: hostNone
- * Job type: local (id=59611)
+ * System partition: generic:default
+ * Environment: builtin
+ * Stage directory: /home/user/reframe-examples/tutorial/stage/generic/default/builtin/T2
+ * Node list: myhost
+ * Job type: local (id=23)
* Dependencies (conceptual): ['T6']
- * Dependencies (actual): [('T6', 'tresa:default', 'gnu')]
+ * Dependencies (actual): [('T6', 'generic:default', 'builtin')]
* Maintainers: []
* Failing phase: sanity
- * Rerun with '-n /0f617ba9 -p gnu --system tresa:default -r'
+ * Rerun with '-n /0f617ba9 -p builtin --system generic:default -r'
* Reason: sanity error: 31 != 30
+--- rfm_job.out (first 10 lines) ---
+--- rfm_job.out ---
+--- rfm_job.err (first 10 lines) ---
+--- rfm_job.err ---
--------------------------------------------------------------------------------
-FAILURE INFO for T2
- * Expanded name: T2
- * Description:
- * System partition: tresa:default
- * Environment: clang
- * Stage directory: /home/user/Repositories/reframe/stage/tresa/default/clang/T2
- * Node list: hostNone
- * Job type: local (id=59612)
- * Dependencies (conceptual): ['T6']
- * Dependencies (actual): [('T6', 'tresa:default', 'clang')]
- * Maintainers: []
- * Failing phase: sanity
- * Rerun with '-n /0f617ba9 -p clang --system tresa:default -r'
- * Reason: sanity error: 31 != 30
---------------------------------------------------------------------------------
-FAILURE INFO for T7
- * Expanded name: T7
- * Description:
- * System partition: tresa:default
- * Environment: gnu
- * Stage directory: None
- * Node list:
- * Job type: local (id=None)
- * Dependencies (conceptual): ['T2']
- * Dependencies (actual): [('T2', 'tresa:default', 'gnu')]
- * Maintainers: []
- * Failing phase: startup
- * Rerun with '-n /f005e93d -p gnu --system tresa:default -r'
- * Reason: task dependency error: dependencies failed
---------------------------------------------------------------------------------
-FAILURE INFO for T7
- * Expanded name: T7
+FAILURE INFO for T7 (run: 1/1)
* Description:
- * System partition: tresa:default
- * Environment: clang
+ * System partition: generic:default
+ * Environment: builtin
* Stage directory: None
* Node list:
* Job type: local (id=None)
* Dependencies (conceptual): ['T2']
- * Dependencies (actual): [('T2', 'tresa:default', 'clang')]
+ * Dependencies (actual): [('T2', 'generic:default', 'builtin')]
* Maintainers: []
* Failing phase: startup
- * Rerun with '-n /f005e93d -p clang --system tresa:default -r'
+ * Rerun with '-n /f005e93d -p builtin --system generic:default -r'
* Reason: task dependency error: dependencies failed
--------------------------------------------------------------------------------
-Run report saved in '/home/user/.reframe/reports/run-report-326.json'
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-_008n_el.log'
+Log file(s) saved in '/tmp/rfm-01gkxmq0.log'
diff --git a/docs/listings/deps_rerun_t6.txt b/docs/listings/deps_rerun_t6.txt
index 5e7852720f..839d5e8243 100644
--- a/docs/listings/deps_rerun_t6.txt
+++ b/docs/listings/deps_rerun_t6.txt
@@ -1,25 +1,22 @@
[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe --restore-session --keep-stage-files -n T6 -r'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', '/home/user/Repositories/reframe/tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/unittests/resources/checks_unlisted/deps_complex.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-vtnok1ih.log'
+ version: 4.6.0-dev.2
+ command: '/usr/local/share/reframe/bin/reframe --restore-session --keep-stage-files -n T6 -r --nocolor'
+ launched by: user@myhost
+ working directory: '/home/user/reframe-examples/tutorial'
+ settings files: ''
+ check search path: '/home/user/reframe-examples/tutorial/deps/deps_complex.py'
+ stage directory: '/home/user/reframe-examples/tutorial/stage'
+ output directory: '/home/user/reframe-examples/tutorial/output'
+ log files: '/tmp/rfm-5nhx1_74.log'
[==========] Running 1 check(s)
-[==========] Started on Sat Nov 12 19:01:06 2022
+[==========] Started on Wed Apr 3 21:40:44 2024+0000
[----------] start processing checks
-[ RUN ] T6 /6dbdaf93 @tresa:default+gnu
-[ RUN ] T6 /6dbdaf93 @tresa:default+clang
-[ OK ] (1/2) T6 /6dbdaf93 @tresa:default+gnu
-[ OK ] (2/2) T6 /6dbdaf93 @tresa:default+clang
+[ RUN ] T6 /6dbdaf93 @generic:default+builtin
+[ OK ] (1/1) T6 /6dbdaf93 @generic:default+builtin
[----------] all spawned checks have finished
-[ PASSED ] Ran 2/2 test case(s) from 1 check(s) (0 failure(s), 0 skipped)
-[==========] Finished on Sat Nov 12 19:01:07 2022
-Run report saved in '/home/user/.reframe/reports/run-report-328.json'
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-vtnok1ih.log'
+[ PASSED ] Ran 1/1 test case(s) from 1 check(s) (0 failure(s), 0 skipped, 0 aborted)
+[==========] Finished on Wed Apr 3 21:40:44 2024+0000
+Log file(s) saved in '/tmp/rfm-5nhx1_74.log'
diff --git a/docs/listings/deps_run_t6.txt b/docs/listings/deps_run_t6.txt
index 5a92e562cb..7e07f7cb9c 100644
--- a/docs/listings/deps_run_t6.txt
+++ b/docs/listings/deps_run_t6.txt
@@ -1,41 +1,30 @@
[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c unittests/resources/checks_unlisted/deps_complex.py -n T6 -r'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', '/home/user/Repositories/reframe/tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/unittests/resources/checks_unlisted/deps_complex.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-8n8uvclh.log'
+ version: 4.6.0-dev.2
+ command: '/usr/local/share/reframe/bin/reframe -c deps/deps_complex.py -n T6 -r --nocolor'
+ launched by: user@myhost
+ working directory: '/home/user/reframe-examples/tutorial'
+ settings files: ''
+ check search path: '/home/user/reframe-examples/tutorial/deps/deps_complex.py'
+ stage directory: '/home/user/reframe-examples/tutorial/stage'
+ output directory: '/home/user/reframe-examples/tutorial/output'
+ log files: '/tmp/rfm-umx3ijmp.log'
[==========] Running 5 check(s)
-[==========] Started on Sat Nov 12 19:01:07 2022
+[==========] Started on Wed Apr 3 21:41:17 2024+0000
[----------] start processing checks
-[ RUN ] T0 /c9c2be9f @tresa:default+gnu
-[ RUN ] T0 /c9c2be9f @tresa:default+clang
-[ OK ] ( 1/10) T0 /c9c2be9f @tresa:default+gnu
-[ OK ] ( 2/10) T0 /c9c2be9f @tresa:default+clang
-[ RUN ] T4 /11ee5e9a @tresa:default+gnu
-[ RUN ] T4 /11ee5e9a @tresa:default+clang
-[ OK ] ( 3/10) T4 /11ee5e9a @tresa:default+gnu
-[ OK ] ( 4/10) T4 /11ee5e9a @tresa:default+clang
-[ RUN ] T5 /020d01e5 @tresa:default+gnu
-[ RUN ] T5 /020d01e5 @tresa:default+clang
-[ OK ] ( 5/10) T5 /020d01e5 @tresa:default+gnu
-[ OK ] ( 6/10) T5 /020d01e5 @tresa:default+clang
-[ RUN ] T1 /1f93603d @tresa:default+gnu
-[ RUN ] T1 /1f93603d @tresa:default+clang
-[ OK ] ( 7/10) T1 /1f93603d @tresa:default+gnu
-[ OK ] ( 8/10) T1 /1f93603d @tresa:default+clang
-[ RUN ] T6 /6dbdaf93 @tresa:default+gnu
-[ RUN ] T6 /6dbdaf93 @tresa:default+clang
-[ OK ] ( 9/10) T6 /6dbdaf93 @tresa:default+gnu
-[ OK ] (10/10) T6 /6dbdaf93 @tresa:default+clang
+[ RUN ] T0 /c9c2be9f @generic:default+builtin
+[ OK ] (1/5) T0 /c9c2be9f @generic:default+builtin
+[ RUN ] T4 /11ee5e9a @generic:default+builtin
+[ OK ] (2/5) T4 /11ee5e9a @generic:default+builtin
+[ RUN ] T5 /020d01e5 @generic:default+builtin
+[ OK ] (3/5) T5 /020d01e5 @generic:default+builtin
+[ RUN ] T1 /1f93603d @generic:default+builtin
+[ OK ] (4/5) T1 /1f93603d @generic:default+builtin
+[ RUN ] T6 /6dbdaf93 @generic:default+builtin
+[ OK ] (5/5) T6 /6dbdaf93 @generic:default+builtin
[----------] all spawned checks have finished
-[ PASSED ] Ran 10/10 test case(s) from 5 check(s) (0 failure(s), 0 skipped)
-[==========] Finished on Sat Nov 12 19:01:08 2022
-Run report saved in '/home/user/.reframe/reports/run-report-329.json'
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-8n8uvclh.log'
+[ PASSED ] Ran 5/5 test case(s) from 5 check(s) (0 failure(s), 0 skipped, 0 aborted)
+[==========] Finished on Wed Apr 3 21:41:19 2024+0000
+Log file(s) saved in '/tmp/rfm-umx3ijmp.log'
diff --git a/docs/listings/hello1.txt b/docs/listings/hello1.txt
deleted file mode 100644
index b7b4c7351d..0000000000
--- a/docs/listings/hello1.txt
+++ /dev/null
@@ -1,23 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c tutorials/basics/hello/hello1.py -r'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: ''
- check search path: '/home/user/Repositories/reframe/tutorials/basics/hello/hello1.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-tgqpdq_b.log'
-
-[==========] Running 1 check(s)
-[==========] Started on Sat Nov 12 19:00:44 2022
-
-[----------] start processing checks
-[ RUN ] HelloTest /2b3e4546 @generic:default+builtin
-[ OK ] (1/1) HelloTest /2b3e4546 @generic:default+builtin
-[----------] all spawned checks have finished
-
-[ PASSED ] Ran 1/1 test case(s) from 1 check(s) (0 failure(s), 0 skipped)
-[==========] Finished on Sat Nov 12 19:00:45 2022
-Run report saved in '/home/user/.reframe/reports/run-report-319.json'
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-tgqpdq_b.log'
diff --git a/docs/listings/hello2.txt b/docs/listings/hello2.txt
deleted file mode 100644
index c7b0b0701f..0000000000
--- a/docs/listings/hello2.txt
+++ /dev/null
@@ -1,46 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c tutorials/basics/hello/hello2.py -r'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: ''
- check search path: '/home/user/Repositories/reframe/tutorials/basics/hello/hello2.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-krmo7oc3.log'
-
-[==========] Running 2 check(s)
-[==========] Started on Sat Nov 12 19:00:45 2022
-
-[----------] start processing checks
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @generic:default+builtin
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @generic:default+builtin
-[ FAIL ] (1/2) HelloMultiLangTest %lang=cpp /71bf65a3 @generic:default+builtin
-==> test failed during 'compile': test staged in '/home/user/Repositories/reframe/stage/generic/default/builtin/HelloMultiLangTest_71bf65a3'
-rfm_job.out
-[ OK ] (2/2) HelloMultiLangTest %lang=c /7cfa870e @generic:default+builtin
-[----------] all spawned checks have finished
-
-[ FAILED ] Ran 2/2 test case(s) from 2 check(s) (1 failure(s), 0 skipped)
-[==========] Finished on Sat Nov 12 19:00:46 2022
-
-================================================================================
-SUMMARY OF FAILURES
---------------------------------------------------------------------------------
-FAILURE INFO for HelloMultiLangTest_1
- * Expanded name: HelloMultiLangTest %lang=cpp
- * Description:
- * System partition: generic:default
- * Environment: builtin
- * Stage directory: /home/user/Repositories/reframe/stage/generic/default/builtin/HelloMultiLangTest_71bf65a3
- * Node list:
- * Job type: local (id=None)
- * Dependencies (conceptual): []
- * Dependencies (actual): []
- * Maintainers: []
- * Failing phase: compile
- * Rerun with '-n /71bf65a3 -p builtin --system generic:default -r'
- * Reason: build system error: I do not know how to compile a C++ program
---------------------------------------------------------------------------------
-Run report saved in '/home/user/.reframe/reports/run-report-320.json'
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-krmo7oc3.log'
diff --git a/docs/listings/hello2_list_verbose.txt b/docs/listings/hello2_list_verbose.txt
deleted file mode 100644
index ba65ade6a1..0000000000
--- a/docs/listings/hello2_list_verbose.txt
+++ /dev/null
@@ -1,100 +0,0 @@
-Loading user configuration
-Loading the generic configuration
-Loading configuration file: ('tutorials/config/tresa.py',)
-Detecting system using method: 'hostname'
-Using standard hostname...
-Retrieved hostname: 'host'
-Looking for a matching configuration entry
-Configuration found: picking system 'tresa'
-Initializing runtime
-Initializing system partition 'default'
-Initializing system 'tresa'
-Initializing modules system 'nomod'
-detecting topology info for tresa:default
-> found topology file '/home/user/.reframe/topology/tresa-default/processor.json'; loading...
-> device auto-detection is not supported
-[ReFrame Environment]
- RFM_AUTODETECT_FQDN=
- RFM_AUTODETECT_METHOD=
- RFM_AUTODETECT_XTHOSTNAME=
- RFM_CHECK_SEARCH_PATH=
- RFM_CHECK_SEARCH_RECURSIVE=
- RFM_CLEAN_STAGEDIR=
- RFM_COLORIZE=n
- RFM_COMPRESS_REPORT=
- RFM_CONFIG_FILES=/home/user/Repositories/reframe/tutorials/config/tresa.py
- RFM_CONFIG_PATH=
- RFM_DUMP_PIPELINE_PROGRESS=
- RFM_GIT_TIMEOUT=
- RFM_HTTPJSON_URL=
- RFM_IGNORE_CHECK_CONFLICTS=
- RFM_IGNORE_REQNODENOTAVAIL=
- RFM_INSTALL_PREFIX=/home/user/Repositories/reframe
- RFM_KEEP_STAGE_FILES=
- RFM_MODULE_MAPPINGS=
- RFM_MODULE_MAP_FILE=
- RFM_NON_DEFAULT_CRAYPE=
- RFM_OUTPUT_DIR=
- RFM_PERFLOG_DIR=
- RFM_PIPELINE_TIMEOUT=
- RFM_PREFIX=
- RFM_PURGE_ENVIRONMENT=
- RFM_REMOTE_DETECT=
- RFM_REMOTE_WORKDIR=
- RFM_REPORT_FILE=
- RFM_REPORT_JUNIT=
- RFM_RESOLVE_MODULE_CONFLICTS=
- RFM_SAVE_LOG_FILES=
- RFM_STAGE_DIR=
- RFM_SYSLOG_ADDRESS=
- RFM_SYSTEM=
- RFM_TIMESTAMP_DIRS=
- RFM_TRAP_JOB_ERRORS=
- RFM_UNLOAD_MODULES=
- RFM_USER_MODULES=
- RFM_USE_LOGIN_SHELL=
- RFM_VERBOSE=
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -C tutorials/config/tresa.py -c tutorials/basics/hello/hello2.py -l -vv'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', 'tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/tutorials/basics/hello/hello2.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-3gcehyof.log'
-
-Looking for tests in '/home/user/Repositories/reframe/tutorials/basics/hello/hello2.py'
-Validating '/home/user/Repositories/reframe/tutorials/basics/hello/hello2.py': OK
- > Loaded 2 test(s)
-Loaded 2 test(s)
-Generated 4 test case(s)
-Filtering test cases(s) by name: 4 remaining
-Filtering test cases(s) by tags: 4 remaining
-Filtering test cases(s) by other attributes: 4 remaining
-Building and validating the full test DAG
-Full test DAG:
- ('HelloMultiLangTest_1', 'tresa:default', 'gnu') -> []
- ('HelloMultiLangTest_1', 'tresa:default', 'clang') -> []
- ('HelloMultiLangTest_0', 'tresa:default', 'gnu') -> []
- ('HelloMultiLangTest_0', 'tresa:default', 'clang') -> []
-Final number of test cases: 4
-[List of matched checks]
-- HelloMultiLangTest %lang=cpp /71bf65a3
-- HelloMultiLangTest %lang=c /7cfa870e
-Found 2 check(s)
-
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-3gcehyof.log'
->>> profiler report [start] <<<
-main: 0.053832 s
- test processing: 0.012268 s
- RegressionCheckLoader.load_all: 0.008720 s
- TestRegistry.instantiate_all: 0.003012 s
- generate_testcases: 0.000049 s
- main.._sort_testcases: 0.000012 s
- build_deps: 0.000072 s
- validate_deps: 0.000061 s
- toposort: 0.000091 s
- list_checks: 0.001080 s
->>> profiler report [ end ] <<<
diff --git a/docs/listings/hello2_print_stdout.txt b/docs/listings/hello2_print_stdout.txt
deleted file mode 100644
index ada4bd4c2a..0000000000
--- a/docs/listings/hello2_print_stdout.txt
+++ /dev/null
@@ -1,37 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -C tutorials/config/tresa.py -c tutorials/basics/hello/hello2.py -r'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', 'tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/tutorials/basics/hello/hello2.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-b22mnhb0.log'
-
-[==========] Running 2 check(s)
-[==========] Started on Sat Nov 12 19:00:58 2022
-
-[----------] start processing checks
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @tresa:default+gnu
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @tresa:default+clang
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @tresa:default+gnu
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @tresa:default+clang
-rfm_job.out
-rfm_job.out
-[ OK ] (1/4) HelloMultiLangTest %lang=c /7cfa870e @tresa:default+gnu
-rfm_job.out
-rfm_job.out
-[ OK ] (2/4) HelloMultiLangTest %lang=c /7cfa870e @tresa:default+clang
-rfm_job.out
-rfm_job.out
-[ OK ] (3/4) HelloMultiLangTest %lang=cpp /71bf65a3 @tresa:default+gnu
-rfm_job.out
-rfm_job.out
-[ OK ] (4/4) HelloMultiLangTest %lang=cpp /71bf65a3 @tresa:default+clang
-[----------] all spawned checks have finished
-
-[ PASSED ] Ran 4/4 test case(s) from 2 check(s) (0 failure(s), 0 skipped)
-[==========] Finished on Sat Nov 12 19:01:00 2022
-Run report saved in '/home/user/.reframe/reports/run-report-325.json'
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-b22mnhb0.log'
diff --git a/docs/listings/hello2_tresa.txt b/docs/listings/hello2_tresa.txt
deleted file mode 100644
index a594563a79..0000000000
--- a/docs/listings/hello2_tresa.txt
+++ /dev/null
@@ -1,33 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -C tutorials/config/tresa.py -c tutorials/basics/hello/hello2.py -r'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', 'tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/tutorials/basics/hello/hello2.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-e3dlf19_.log'
-
-[==========] Running 2 check(s)
-[==========] Started on Sat Nov 12 19:00:46 2022
-
-[----------] start processing checks
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @tresa:default+gnu
-[ RUN ] HelloMultiLangTest %lang=cpp /71bf65a3 @tresa:default+clang
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @tresa:default+gnu
-[ RUN ] HelloMultiLangTest %lang=c /7cfa870e @tresa:default+clang
-rfm_job.out
-[ OK ] (1/4) HelloMultiLangTest %lang=c /7cfa870e @tresa:default+gnu
-rfm_job.out
-[ OK ] (2/4) HelloMultiLangTest %lang=c /7cfa870e @tresa:default+clang
-rfm_job.out
-[ OK ] (3/4) HelloMultiLangTest %lang=cpp /71bf65a3 @tresa:default+gnu
-rfm_job.out
-[ OK ] (4/4) HelloMultiLangTest %lang=cpp /71bf65a3 @tresa:default+clang
-[----------] all spawned checks have finished
-
-[ PASSED ] Ran 4/4 test case(s) from 2 check(s) (0 failure(s), 0 skipped)
-[==========] Finished on Sat Nov 12 19:00:48 2022
-Run report saved in '/home/user/.reframe/reports/run-report-321.json'
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-e3dlf19_.log'
diff --git a/docs/listings/hello2_typo.txt b/docs/listings/hello2_typo.txt
deleted file mode 100644
index 8446dc8256..0000000000
--- a/docs/listings/hello2_typo.txt
+++ /dev/null
@@ -1,19 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c tutorials/basics/hello -R -l'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', '/home/user/Repositories/reframe/tutorials/config/tresa.py'
- check search path: (R) '/home/user/Repositories/reframe/tutorials/basics/hello'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-ldo5um3v.log'
-
-WARNING: skipping test file '/home/user/Repositories/reframe/tutorials/basics/hello/hello2.py': name error: tutorials/basics/hello/hello2.py:13: name 'paramter' is not defined
- lang = paramter(['c', 'cpp'])
- (rerun with '-v' for more information)
-[List of matched checks]
-- HelloTest /2b3e4546
-Found 1 check(s)
-
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-ldo5um3v.log'
diff --git a/docs/listings/hello2_typo_stacktrace.txt b/docs/listings/hello2_typo_stacktrace.txt
deleted file mode 100644
index 8df6d423e8..0000000000
--- a/docs/listings/hello2_typo_stacktrace.txt
+++ /dev/null
@@ -1,44 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c tutorials/basics/hello -R -l -v'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', '/home/user/Repositories/reframe/tutorials/config/tresa.py'
- check search path: (R) '/home/user/Repositories/reframe/tutorials/basics/hello'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-xs3l6jud.log'
-
-WARNING: skipping test file '/home/user/Repositories/reframe/tutorials/basics/hello/hello2.py': name error: tutorials/basics/hello/hello2.py:13: name 'paramter' is not defined
- lang = paramter(['c', 'cpp'])
- (rerun with '-v' for more information)
-Traceback (most recent call last):
- File "/home/user/Repositories/reframe/reframe/frontend/loader.py", line 205, in load_from_file
- util.import_module_from_file(filename, force)
- File "/home/user/Repositories/reframe/reframe/utility/__init__.py", line 109, in import_module_from_file
- return importlib.import_module(module_name)
- File "/usr/local/Cellar/python@3.10/3.10.7/Frameworks/Python.framework/Versions/3.10/lib/python3.10/importlib/__init__.py", line 126, in import_module
- return _bootstrap._gcd_import(name[level:], package, level)
- File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
- File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
- File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
- File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
- File "<frozen importlib._bootstrap_external>", line 883, in exec_module
- File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
- File "/home/user/Repositories/reframe/tutorials/basics/hello/hello2.py", line 12, in <module>
- class HelloMultiLangTest(rfm.RegressionTest):
- File "/home/user/Repositories/reframe/tutorials/basics/hello/hello2.py", line 13, in HelloMultiLangTest
- lang = paramter(['c', 'cpp'])
-NameError: name 'paramter' is not defined
-
-Loaded 1 test(s)
-Generated 2 test case(s)
-Filtering test cases(s) by name: 2 remaining
-Filtering test cases(s) by tags: 2 remaining
-Filtering test cases(s) by other attributes: 2 remaining
-Final number of test cases: 2
-[List of matched checks]
-- HelloTest /2b3e4546
-Found 1 check(s)
-
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-xs3l6jud.log'
diff --git a/docs/listings/hello2_verbose_load.txt b/docs/listings/hello2_verbose_load.txt
deleted file mode 100644
index b97ee20a93..0000000000
--- a/docs/listings/hello2_verbose_load.txt
+++ /dev/null
@@ -1,80 +0,0 @@
-Loading user configuration
-Loading configuration file: 'tutorials/config/settings.py'
-Detecting system
-Looking for a matching configuration entry for system '1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.ip6.arpa'
-Configuration found: picking system 'generic'
-Selecting subconfig for 'generic'
-Initializing runtime
-Selecting subconfig for 'generic:default'
-Initializing system partition 'default'
-Selecting subconfig for 'generic'
-Initializing system 'generic'
-Initializing modules system 'nomod'
-detecting topology info for generic:default
-> found topology file '/Users/user/.reframe/topology/generic-default/processor.json'; loading...
-> device auto-detection is not supported
-[ReFrame Environment]
- RFM_CHECK_SEARCH_PATH=
- RFM_CHECK_SEARCH_RECURSIVE=
- RFM_CLEAN_STAGEDIR=
- RFM_COLORIZE=
- RFM_COMPACT_TEST_NAMES=
- RFM_CONFIG_FILE=tutorials/config/settings.py
- RFM_GIT_TIMEOUT=
- RFM_GRAYLOG_ADDRESS=
- RFM_HTTPJSON_URL=
- RFM_IGNORE_CHECK_CONFLICTS=
- RFM_IGNORE_REQNODENOTAVAIL=
- RFM_INSTALL_PREFIX=/Users/user/Repositories/reframe
- RFM_KEEP_STAGE_FILES=
- RFM_MODULE_MAPPINGS=
- RFM_MODULE_MAP_FILE=
- RFM_NON_DEFAULT_CRAYPE=
- RFM_OUTPUT_DIR=
- RFM_PERFLOG_DIR=
- RFM_PREFIX=
- RFM_PURGE_ENVIRONMENT=
- RFM_REMOTE_DETECT=
- RFM_REMOTE_WORKDIR=
- RFM_REPORT_FILE=
- RFM_REPORT_JUNIT=
- RFM_RESOLVE_MODULE_CONFLICTS=
- RFM_SAVE_LOG_FILES=
- RFM_STAGE_DIR=
- RFM_SYSLOG_ADDRESS=
- RFM_SYSTEM=
- RFM_TIMESTAMP_DIRS=
- RFM_TRAP_JOB_ERRORS=
- RFM_UNLOAD_MODULES=
- RFM_USER_MODULES=
- RFM_USE_LOGIN_SHELL=
- RFM_VERBOSE=
-[ReFrame Setup]
- version: 3.10.0-dev.2+cb5edd8b
- command: './bin/reframe -C tutorials/config/settings.py -c tutorials/basics/hello/hello2.py -l -vv'
- launched by: user@host
- working directory: '/Users/user/Repositories/reframe'
- settings file: 'tutorials/config/settings.py'
- check search path: '/Users/user/Repositories/reframe/tutorials/basics/hello/hello2.py'
- stage directory: '/Users/user/Repositories/reframe/stage'
- output directory: '/Users/user/Repositories/reframe/output'
-
-Looking for tests in '/Users/user/Repositories/reframe/tutorials/basics/hello/hello2.py'
-Validating '/Users/user/Repositories/reframe/tutorials/basics/hello/hello2.py': OK
- > Loaded 2 test(s)
-Loaded 2 test(s)
-Generated 2 test case(s)
-Filtering test cases(s) by name: 2 remaining
-Filtering test cases(s) by tags: 2 remaining
-Filtering test cases(s) by other attributes: 2 remaining
-Building and validating the full test DAG
-Full test DAG:
- ('HelloMultiLangTest_cpp', 'generic:default', 'builtin') -> []
- ('HelloMultiLangTest_c', 'generic:default', 'builtin') -> []
-Final number of test cases: 2
-[List of matched checks]
-- HelloMultiLangTest %lang=cpp
-- HelloMultiLangTest %lang=c
-Found 2 check(s)
-
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-fpjj5gru.log'
diff --git a/docs/listings/hellomp1.txt b/docs/listings/hellomp1.txt
deleted file mode 100644
index 0dd52677ee..0000000000
--- a/docs/listings/hellomp1.txt
+++ /dev/null
@@ -1,25 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c tutorials/basics/hellomp/hellomp1.py -r'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', '/home/user/Repositories/reframe/tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/tutorials/basics/hellomp/hellomp1.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-v56bz2uo.log'
-
-[==========] Running 1 check(s)
-[==========] Started on Sat Nov 12 19:00:48 2022
-
-[----------] start processing checks
-[ RUN ] HelloThreadedTest /a6fa300f @tresa:default+gnu
-[ RUN ] HelloThreadedTest /a6fa300f @tresa:default+clang
-[ OK ] (1/2) HelloThreadedTest /a6fa300f @tresa:default+gnu
-[ OK ] (2/2) HelloThreadedTest /a6fa300f @tresa:default+clang
-[----------] all spawned checks have finished
-
-[ PASSED ] Ran 2/2 test case(s) from 1 check(s) (0 failure(s), 0 skipped)
-[==========] Finished on Sat Nov 12 19:00:50 2022
-Run report saved in '/home/user/.reframe/reports/run-report-322.json'
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-v56bz2uo.log'
diff --git a/docs/listings/hellomp2.txt b/docs/listings/hellomp2.txt
deleted file mode 100644
index 68072e954d..0000000000
--- a/docs/listings/hellomp2.txt
+++ /dev/null
@@ -1,61 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c tutorials/basics/hellomp/hellomp2.py -r'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', '/home/user/Repositories/reframe/tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/tutorials/basics/hellomp/hellomp2.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-a2tt4eqp.log'
-
-[==========] Running 1 check(s)
-[==========] Started on Sat Nov 12 19:00:50 2022
-
-[----------] start processing checks
-[ RUN ] HelloThreadedExtendedTest /4733a67d @tresa:default+gnu
-[ RUN ] HelloThreadedExtendedTest /4733a67d @tresa:default+clang
-[ FAIL ] (1/2) HelloThreadedExtendedTest /4733a67d @tresa:default+gnu
-==> test failed during 'sanity': test staged in '/home/user/Repositories/reframe/stage/tresa/default/gnu/HelloThreadedExtendedTest'
-[ FAIL ] (2/2) HelloThreadedExtendedTest /4733a67d @tresa:default+clang
-==> test failed during 'sanity': test staged in '/home/user/Repositories/reframe/stage/tresa/default/clang/HelloThreadedExtendedTest'
-[----------] all spawned checks have finished
-
-[ FAILED ] Ran 2/2 test case(s) from 1 check(s) (2 failure(s), 0 skipped)
-[==========] Finished on Sat Nov 12 19:00:52 2022
-
-================================================================================
-SUMMARY OF FAILURES
---------------------------------------------------------------------------------
-FAILURE INFO for HelloThreadedExtendedTest
- * Expanded name: HelloThreadedExtendedTest
- * Description:
- * System partition: tresa:default
- * Environment: gnu
- * Stage directory: /home/user/Repositories/reframe/stage/tresa/default/gnu/HelloThreadedExtendedTest
- * Node list: hostNone
- * Job type: local (id=59525)
- * Dependencies (conceptual): []
- * Dependencies (actual): []
- * Maintainers: []
- * Failing phase: sanity
- * Rerun with '-n /4733a67d -p gnu --system tresa:default -r'
- * Reason: sanity error: 13 != 16
---------------------------------------------------------------------------------
-FAILURE INFO for HelloThreadedExtendedTest
- * Expanded name: HelloThreadedExtendedTest
- * Description:
- * System partition: tresa:default
- * Environment: clang
- * Stage directory: /home/user/Repositories/reframe/stage/tresa/default/clang/HelloThreadedExtendedTest
- * Node list: hostNone
- * Job type: local (id=59528)
- * Dependencies (conceptual): []
- * Dependencies (actual): []
- * Maintainers: []
- * Failing phase: sanity
- * Rerun with '-n /4733a67d -p clang --system tresa:default -r'
- * Reason: sanity error: 11 != 16
---------------------------------------------------------------------------------
-Run report saved in '/home/user/.reframe/reports/run-report-323.json'
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-a2tt4eqp.log'
diff --git a/docs/listings/maketest_mixin.txt b/docs/listings/maketest_mixin.txt
deleted file mode 100644
index 039fc0cc21..0000000000
--- a/docs/listings/maketest_mixin.txt
+++ /dev/null
@@ -1,19 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c tutorials/advanced/makefiles/maketest_mixin.py -l'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', '/home/user/Repositories/reframe/tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/tutorials/advanced/makefiles/maketest_mixin.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-z_z51hkz.log'
-
-[List of matched checks]
-- MakeOnlyTestAlt %elem_type=double /8b62380e
-- MakeOnlyTestAlt %elem_type=float /da39ec20
-- MakefileTestAlt %elem_type=double /89aac4a2
-- MakefileTestAlt %elem_type=float /a998ce67
-Found 4 check(s)
-
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-z_z51hkz.log'
diff --git a/docs/listings/osu_bandwidth_concretized_daint.txt b/docs/listings/osu_bandwidth_concretized_daint.txt
deleted file mode 100644
index d16b17007b..0000000000
--- a/docs/listings/osu_bandwidth_concretized_daint.txt
+++ /dev/null
@@ -1,24 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2
- command: './bin/reframe -c tutorials/fixtures/osu_benchmarks.py -n osu_bandwidth_test -lC'
- launched by: user@host
- working directory: '/home/user/Devel/reframe'
- settings files: '', '/home/user/Devel/reframe/tutorials/config/daint.py'
- check search path: '/home/user/Devel/reframe/tutorials/fixtures/osu_benchmarks.py'
- stage directory: '/home/user/Devel/reframe/stage'
- output directory: '/home/user/Devel/reframe/output'
- log files: '/tmp/rfm-m1w2t4eh.log'
-
-[List of matched checks]
-- osu_bandwidth_test /026711a1 @daint:gpu+gnu
- ^build_osu_benchmarks ~daint:gpu+gnu /f3269d42 @daint:gpu+gnu
- ^fetch_osu_benchmarks ~daint /79cd6023 @daint:gpu+gnu
-- osu_bandwidth_test /026711a1 @daint:gpu+intel
- ^build_osu_benchmarks ~daint:gpu+intel /4d450880 @daint:gpu+intel
- ^fetch_osu_benchmarks ~daint /79cd6023 @daint:gpu+gnu
-- osu_bandwidth_test /026711a1 @daint:gpu+nvidia
- ^build_osu_benchmarks ~daint:gpu+nvidia /e9b8d152 @daint:gpu+nvidia
- ^fetch_osu_benchmarks ~daint /79cd6023 @daint:gpu+gnu
-Concretized 7 test case(s)
-
-Log file(s) saved in '/tmp/rfm-m1w2t4eh.log'
diff --git a/docs/listings/osu_bandwidth_concretized_daint_nvidia.txt b/docs/listings/osu_bandwidth_concretized_daint_nvidia.txt
deleted file mode 100644
index 7ccdba96a9..0000000000
--- a/docs/listings/osu_bandwidth_concretized_daint_nvidia.txt
+++ /dev/null
@@ -1,17 +0,0 @@
-[ReFrame Setup]
- version: 3.10.0-dev.3+605af31a
- command: './bin/reframe -c tutorials/fixtures/osu_benchmarks.py -n osu_bandwidth_test -lC -p nvidia'
- launched by: user@host
- working directory: '/home/user/Devel/reframe'
- settings file: '/home/user/Devel/reframe/tutorials/config/settings.py'
- check search path: '/home/user/Devel/reframe/tutorials/fixtures/osu_benchmarks.py'
- stage directory: '/home/user/Devel/reframe/stage'
- output directory: '/home/user/Devel/reframe/output'
-
-[List of matched checks]
-- osu_bandwidth_test @daint:gpu+nvidia
- ^build_osu_benchmarks ~daint:gpu+nvidia @daint:gpu+nvidia
- ^fetch_osu_benchmarks ~daint @daint:gpu+nvidia
-Concretized 3 test case(s)
-
-Log file(s) saved in '/tmp/rfm-dnfdagj8.log'
diff --git a/docs/listings/osu_bench_deps.txt b/docs/listings/osu_bench_deps.txt
deleted file mode 100644
index 974d410d8f..0000000000
--- a/docs/listings/osu_bench_deps.txt
+++ /dev/null
@@ -1,83 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2
- command: './bin/reframe -c tutorials/deps/osu_benchmarks.py -r'
- launched by: user@host
- working directory: '/home/user/Devel/reframe'
- settings files: '', '/home/user/Devel/reframe/tutorials/config/daint.py'
- check search path: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py'
- stage directory: '/home/user/Devel/reframe/stage'
- output directory: '/home/user/Devel/reframe/output'
- log files: '/tmp/rfm-r1a7v0w3.log'
-
-[==========] Running 8 check(s)
-[==========] Started on Tue Nov 15 18:24:00 2022
-
-[----------] start processing checks
-[ RUN ] OSUDownloadTest /7de668df @daint:login+builtin
-[ OK ] ( 1/22) OSUDownloadTest /7de668df @daint:login+builtin
-[ RUN ] OSUBuildTest /19b4fb56 @daint:gpu+gnu
-[ RUN ] OSUBuildTest /19b4fb56 @daint:gpu+intel
-[ RUN ] OSUBuildTest /19b4fb56 @daint:gpu+nvidia
-[ OK ] ( 2/22) OSUBuildTest /19b4fb56 @daint:gpu+gnu
-[ RUN ] OSUAllreduceTest %mpi_tasks=16 /7f033d39 @daint:gpu+gnu
-[ RUN ] OSUAllreduceTest %mpi_tasks=8 /005fca19 @daint:gpu+gnu
-[ RUN ] OSUAllreduceTest %mpi_tasks=4 /84b85d90 @daint:gpu+gnu
-[ RUN ] OSUAllreduceTest %mpi_tasks=2 /9d550c4f @daint:gpu+gnu
-[ RUN ] OSUBandwidthTest /764cdb0b @daint:gpu+gnu
-[ RUN ] OSULatencyTest /14f35a43 @daint:gpu+gnu
-[ OK ] ( 3/22) OSUBuildTest /19b4fb56 @daint:gpu+intel
-[ RUN ] OSUAllreduceTest %mpi_tasks=16 /7f033d39 @daint:gpu+intel
-[ RUN ] OSUAllreduceTest %mpi_tasks=8 /005fca19 @daint:gpu+intel
-[ RUN ] OSUAllreduceTest %mpi_tasks=4 /84b85d90 @daint:gpu+intel
-[ OK ] ( 4/22) OSUBuildTest /19b4fb56 @daint:gpu+nvidia
-[ RUN ] OSUAllreduceTest %mpi_tasks=16 /7f033d39 @daint:gpu+nvidia
-[ RUN ] OSUAllreduceTest %mpi_tasks=8 /005fca19 @daint:gpu+nvidia
-[ RUN ] OSUAllreduceTest %mpi_tasks=4 /84b85d90 @daint:gpu+nvidia
-[ RUN ] OSUAllreduceTest %mpi_tasks=2 /9d550c4f @daint:gpu+intel
-[ RUN ] OSUAllreduceTest %mpi_tasks=2 /9d550c4f @daint:gpu+nvidia
-[ RUN ] OSUBandwidthTest /764cdb0b @daint:gpu+intel
-[ RUN ] OSUBandwidthTest /764cdb0b @daint:gpu+nvidia
-[ RUN ] OSULatencyTest /14f35a43 @daint:gpu+intel
-[ RUN ] OSULatencyTest /14f35a43 @daint:gpu+nvidia
-[ OK ] ( 5/22) OSUAllreduceTest %mpi_tasks=4 /84b85d90 @daint:gpu+gnu
-P: latency: 5.31 us (r:0, l:None, u:None)
-[ OK ] ( 6/22) OSUAllreduceTest %mpi_tasks=2 /9d550c4f @daint:gpu+intel
-P: latency: 10.07 us (r:0, l:None, u:None)
-[ OK ] ( 7/22) OSUAllreduceTest %mpi_tasks=2 /9d550c4f @daint:gpu+gnu
-P: latency: 1.67 us (r:0, l:None, u:None)
-[ OK ] ( 8/22) OSUAllreduceTest %mpi_tasks=4 /84b85d90 @daint:gpu+intel
-P: latency: 24.97 us (r:0, l:None, u:None)
-[ OK ] ( 9/22) OSUAllreduceTest %mpi_tasks=2 /9d550c4f @daint:gpu+nvidia
-P: latency: 8.92 us (r:0, l:None, u:None)
-[ OK ] (10/22) OSUAllreduceTest %mpi_tasks=16 /7f033d39 @daint:gpu+intel
-P: latency: 14.78 us (r:0, l:None, u:None)
-[ OK ] (11/22) OSULatencyTest /14f35a43 @daint:gpu+nvidia
-P: latency: 2.19 us (r:0, l:None, u:None)
-[ OK ] (12/22) OSULatencyTest /14f35a43 @daint:gpu+gnu
-P: latency: 1.76 us (r:0, l:None, u:None)
-[ OK ] (13/22) OSUAllreduceTest %mpi_tasks=16 /7f033d39 @daint:gpu+gnu
-P: latency: 19.54 us (r:0, l:None, u:None)
-[ OK ] (14/22) OSULatencyTest /14f35a43 @daint:gpu+intel
-P: latency: 4.4 us (r:0, l:None, u:None)
-[ OK ] (15/22) OSUAllreduceTest %mpi_tasks=4 /84b85d90 @daint:gpu+nvidia
-P: latency: 6.88 us (r:0, l:None, u:None)
-[ OK ] (16/22) OSUAllreduceTest %mpi_tasks=8 /005fca19 @daint:gpu+intel
-P: latency: 21.37 us (r:0, l:None, u:None)
-[ OK ] (17/22) OSUAllreduceTest %mpi_tasks=8 /005fca19 @daint:gpu+gnu
-P: latency: 10.15 us (r:0, l:None, u:None)
-[ OK ] (18/22) OSUAllreduceTest %mpi_tasks=8 /005fca19 @daint:gpu+nvidia
-P: latency: 52.87 us (r:0, l:None, u:None)
-[ OK ] (19/22) OSUAllreduceTest %mpi_tasks=16 /7f033d39 @daint:gpu+nvidia
-P: latency: 64.77 us (r:0, l:None, u:None)
-[ OK ] (20/22) OSUBandwidthTest /764cdb0b @daint:gpu+intel
-P: bandwidth: 9118.51 MB/s (r:0, l:None, u:None)
-[ OK ] (21/22) OSUBandwidthTest /764cdb0b @daint:gpu+nvidia
-P: bandwidth: 8476.18 MB/s (r:0, l:None, u:None)
-[ OK ] (22/22) OSUBandwidthTest /764cdb0b @daint:gpu+gnu
-P: bandwidth: 8326.06 MB/s (r:0, l:None, u:None)
-[----------] all spawned checks have finished
-
-[ PASSED ] Ran 22/22 test case(s) from 8 check(s) (0 failure(s), 0 skipped)
-[==========] Finished on Tue Nov 15 18:27:13 2022
-Run report saved in '/home/user/.reframe/reports/run-report-3.json'
-Log file(s) saved in '/tmp/rfm-r1a7v0w3.log'
diff --git a/docs/listings/osu_bench_fixtures_list.txt b/docs/listings/osu_bench_fixtures_list.txt
deleted file mode 100644
index 19fbd22500..0000000000
--- a/docs/listings/osu_bench_fixtures_list.txt
+++ /dev/null
@@ -1,57 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2
- command: './bin/reframe -c tutorials/fixtures/osu_benchmarks.py -l'
- launched by: user@host
- working directory: '/home/user/Devel/reframe'
- settings files: '', '/home/user/Devel/reframe/tutorials/config/daint.py'
- check search path: '/home/user/Devel/reframe/tutorials/fixtures/osu_benchmarks.py'
- stage directory: '/home/user/Devel/reframe/stage'
- output directory: '/home/user/Devel/reframe/output'
- log files: '/tmp/rfm-to7wa4gh.log'
-
-[List of matched checks]
-- osu_allreduce_test %mpi_tasks=16 /1fe48834
- ^build_osu_benchmarks ~daint:gpu+gnu /f3269d42
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+intel /4d450880
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+nvidia /e9b8d152
- ^fetch_osu_benchmarks ~daint /79cd6023
-- osu_allreduce_test %mpi_tasks=8 /ae01c137
- ^build_osu_benchmarks ~daint:gpu+gnu /f3269d42
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+intel /4d450880
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+nvidia /e9b8d152
- ^fetch_osu_benchmarks ~daint /79cd6023
-- osu_allreduce_test %mpi_tasks=4 /2129dc34
- ^build_osu_benchmarks ~daint:gpu+gnu /f3269d42
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+intel /4d450880
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+nvidia /e9b8d152
- ^fetch_osu_benchmarks ~daint /79cd6023
-- osu_allreduce_test %mpi_tasks=2 /9f29c081
- ^build_osu_benchmarks ~daint:gpu+gnu /f3269d42
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+intel /4d450880
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+nvidia /e9b8d152
- ^fetch_osu_benchmarks ~daint /79cd6023
-- osu_bandwidth_test /026711a1
- ^build_osu_benchmarks ~daint:gpu+gnu /f3269d42
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+intel /4d450880
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+nvidia /e9b8d152
- ^fetch_osu_benchmarks ~daint /79cd6023
-- osu_latency_test /d2c978ad
- ^build_osu_benchmarks ~daint:gpu+gnu /f3269d42
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+intel /4d450880
- ^fetch_osu_benchmarks ~daint /79cd6023
- ^build_osu_benchmarks ~daint:gpu+nvidia /e9b8d152
- ^fetch_osu_benchmarks ~daint /79cd6023
-Found 6 check(s)
-
-Log file(s) saved in '/tmp/rfm-to7wa4gh.log'
diff --git a/docs/listings/osu_bench_fixtures_run.txt b/docs/listings/osu_bench_fixtures_run.txt
deleted file mode 100644
index 401ef37227..0000000000
--- a/docs/listings/osu_bench_fixtures_run.txt
+++ /dev/null
@@ -1,83 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2
- command: './bin/reframe -c tutorials/fixtures/osu_benchmarks.py -r'
- launched by: user@host
- working directory: '/home/user/Devel/reframe'
- settings files: '', '/home/user/Devel/reframe/tutorials/config/daint.py'
- check search path: '/home/user/Devel/reframe/tutorials/fixtures/osu_benchmarks.py'
- stage directory: '/home/user/Devel/reframe/stage'
- output directory: '/home/user/Devel/reframe/output'
- log files: '/tmp/rfm-63lwmv4b.log'
-
-[==========] Running 10 check(s)
-[==========] Started on Tue Nov 15 18:27:17 2022
-
-[----------] start processing checks
-[ RUN ] fetch_osu_benchmarks ~daint /79cd6023 @daint:gpu+gnu
-[ OK ] ( 1/22) fetch_osu_benchmarks ~daint /79cd6023 @daint:gpu+gnu
-[ RUN ] build_osu_benchmarks ~daint:gpu+gnu /f3269d42 @daint:gpu+gnu
-[ RUN ] build_osu_benchmarks ~daint:gpu+intel /4d450880 @daint:gpu+intel
-[ RUN ] build_osu_benchmarks ~daint:gpu+nvidia /e9b8d152 @daint:gpu+nvidia
-[ OK ] ( 2/22) build_osu_benchmarks ~daint:gpu+gnu /f3269d42 @daint:gpu+gnu
-[ RUN ] osu_allreduce_test %mpi_tasks=16 /1fe48834 @daint:gpu+gnu
-[ RUN ] osu_allreduce_test %mpi_tasks=8 /ae01c137 @daint:gpu+gnu
-[ RUN ] osu_allreduce_test %mpi_tasks=4 /2129dc34 @daint:gpu+gnu
-[ RUN ] osu_allreduce_test %mpi_tasks=2 /9f29c081 @daint:gpu+gnu
-[ RUN ] osu_bandwidth_test /026711a1 @daint:gpu+gnu
-[ RUN ] osu_latency_test /d2c978ad @daint:gpu+gnu
-[ OK ] ( 3/22) build_osu_benchmarks ~daint:gpu+intel /4d450880 @daint:gpu+intel
-[ OK ] ( 4/22) build_osu_benchmarks ~daint:gpu+nvidia /e9b8d152 @daint:gpu+nvidia
-[ RUN ] osu_allreduce_test %mpi_tasks=16 /1fe48834 @daint:gpu+intel
-[ RUN ] osu_allreduce_test %mpi_tasks=16 /1fe48834 @daint:gpu+nvidia
-[ RUN ] osu_allreduce_test %mpi_tasks=8 /ae01c137 @daint:gpu+intel
-[ RUN ] osu_allreduce_test %mpi_tasks=8 /ae01c137 @daint:gpu+nvidia
-[ RUN ] osu_allreduce_test %mpi_tasks=4 /2129dc34 @daint:gpu+intel
-[ RUN ] osu_allreduce_test %mpi_tasks=4 /2129dc34 @daint:gpu+nvidia
-[ RUN ] osu_allreduce_test %mpi_tasks=2 /9f29c081 @daint:gpu+intel
-[ RUN ] osu_allreduce_test %mpi_tasks=2 /9f29c081 @daint:gpu+nvidia
-[ RUN ] osu_bandwidth_test /026711a1 @daint:gpu+intel
-[ RUN ] osu_bandwidth_test /026711a1 @daint:gpu+nvidia
-[ RUN ] osu_latency_test /d2c978ad @daint:gpu+intel
-[ RUN ] osu_latency_test /d2c978ad @daint:gpu+nvidia
-[ OK ] ( 5/22) osu_allreduce_test %mpi_tasks=2 /9f29c081 @daint:gpu+gnu
-P: latency: 2.76 us (r:0, l:None, u:None)
-[ OK ] ( 6/22) osu_allreduce_test %mpi_tasks=2 /9f29c081 @daint:gpu+intel
-P: latency: 1.68 us (r:0, l:None, u:None)
-[ OK ] ( 7/22) osu_allreduce_test %mpi_tasks=4 /2129dc34 @daint:gpu+intel
-P: latency: 4.89 us (r:0, l:None, u:None)
-[ OK ] ( 8/22) osu_latency_test /d2c978ad @daint:gpu+intel
-P: latency: 1.54 us (r:0, l:None, u:None)
-[ OK ] ( 9/22) osu_latency_test /d2c978ad @daint:gpu+gnu
-P: latency: 1.17 us (r:0, l:None, u:None)
-[ OK ] (10/22) osu_allreduce_test %mpi_tasks=4 /2129dc34 @daint:gpu+gnu
-P: latency: 3.22 us (r:0, l:None, u:None)
-[ OK ] (11/22) osu_allreduce_test %mpi_tasks=16 /1fe48834 @daint:gpu+gnu
-P: latency: 13.84 us (r:0, l:None, u:None)
-[ OK ] (12/22) osu_allreduce_test %mpi_tasks=16 /1fe48834 @daint:gpu+nvidia
-P: latency: 30.77 us (r:0, l:None, u:None)
-[ OK ] (13/22) osu_allreduce_test %mpi_tasks=4 /2129dc34 @daint:gpu+nvidia
-P: latency: 5.74 us (r:0, l:None, u:None)
-[ OK ] (14/22) osu_allreduce_test %mpi_tasks=16 /1fe48834 @daint:gpu+intel
-P: latency: 14.77 us (r:0, l:None, u:None)
-[ OK ] (15/22) osu_allreduce_test %mpi_tasks=2 /9f29c081 @daint:gpu+nvidia
-P: latency: 4.5 us (r:0, l:None, u:None)
-[ OK ] (16/22) osu_allreduce_test %mpi_tasks=8 /ae01c137 @daint:gpu+nvidia
-P: latency: 33.93 us (r:0, l:None, u:None)
-[ OK ] (17/22) osu_allreduce_test %mpi_tasks=8 /ae01c137 @daint:gpu+intel
-P: latency: 20.9 us (r:0, l:None, u:None)
-[ OK ] (18/22) osu_latency_test /d2c978ad @daint:gpu+nvidia
-P: latency: 1.18 us (r:0, l:None, u:None)
-[ OK ] (19/22) osu_allreduce_test %mpi_tasks=8 /ae01c137 @daint:gpu+gnu
-P: latency: 10.14 us (r:0, l:None, u:None)
-[ OK ] (20/22) osu_bandwidth_test /026711a1 @daint:gpu+gnu
-P: bandwidth: 9785.43 MB/s (r:0, l:None, u:None)
-[ OK ] (21/22) osu_bandwidth_test /026711a1 @daint:gpu+intel
-P: bandwidth: 9841.26 MB/s (r:0, l:None, u:None)
-[ OK ] (22/22) osu_bandwidth_test /026711a1 @daint:gpu+nvidia
-P: bandwidth: 9824.01 MB/s (r:0, l:None, u:None)
-[----------] all spawned checks have finished
-
-[ PASSED ] Ran 22/22 test case(s) from 10 check(s) (0 failure(s), 0 skipped)
-[==========] Finished on Tue Nov 15 18:30:34 2022
-Run report saved in '/home/user/.reframe/reports/run-report-4.json'
-Log file(s) saved in '/tmp/rfm-63lwmv4b.log'
diff --git a/docs/listings/osu_bench_list_concretized.txt b/docs/listings/osu_bench_list_concretized.txt
deleted file mode 100644
index 598a42906a..0000000000
--- a/docs/listings/osu_bench_list_concretized.txt
+++ /dev/null
@@ -1,69 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2
- command: './bin/reframe -c tutorials/deps/osu_benchmarks.py -lC'
- launched by: user@host
- working directory: '/home/user/Devel/reframe'
- settings files: '', '/home/user/Devel/reframe/tutorials/config/daint.py'
- check search path: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py'
- stage directory: '/home/user/Devel/reframe/stage'
- output directory: '/home/user/Devel/reframe/output'
- log files: '/tmp/rfm-fremrbwf.log'
-
-[List of matched checks]
-- OSUAllreduceTest %mpi_tasks=16 /7f033d39 @daint:gpu+gnu
- ^OSUBuildTest /19b4fb56 @daint:gpu+gnu
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUAllreduceTest %mpi_tasks=16 /7f033d39 @daint:gpu+intel
- ^OSUBuildTest /19b4fb56 @daint:gpu+intel
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUAllreduceTest %mpi_tasks=16 /7f033d39 @daint:gpu+nvidia
- ^OSUBuildTest /19b4fb56 @daint:gpu+nvidia
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUAllreduceTest %mpi_tasks=8 /005fca19 @daint:gpu+gnu
- ^OSUBuildTest /19b4fb56 @daint:gpu+gnu
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUAllreduceTest %mpi_tasks=8 /005fca19 @daint:gpu+intel
- ^OSUBuildTest /19b4fb56 @daint:gpu+intel
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUAllreduceTest %mpi_tasks=8 /005fca19 @daint:gpu+nvidia
- ^OSUBuildTest /19b4fb56 @daint:gpu+nvidia
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUAllreduceTest %mpi_tasks=4 /84b85d90 @daint:gpu+gnu
- ^OSUBuildTest /19b4fb56 @daint:gpu+gnu
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUAllreduceTest %mpi_tasks=4 /84b85d90 @daint:gpu+intel
- ^OSUBuildTest /19b4fb56 @daint:gpu+intel
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUAllreduceTest %mpi_tasks=4 /84b85d90 @daint:gpu+nvidia
- ^OSUBuildTest /19b4fb56 @daint:gpu+nvidia
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUAllreduceTest %mpi_tasks=2 /9d550c4f @daint:gpu+gnu
- ^OSUBuildTest /19b4fb56 @daint:gpu+gnu
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUAllreduceTest %mpi_tasks=2 /9d550c4f @daint:gpu+intel
- ^OSUBuildTest /19b4fb56 @daint:gpu+intel
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUAllreduceTest %mpi_tasks=2 /9d550c4f @daint:gpu+nvidia
- ^OSUBuildTest /19b4fb56 @daint:gpu+nvidia
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUBandwidthTest /764cdb0b @daint:gpu+gnu
- ^OSUBuildTest /19b4fb56 @daint:gpu+gnu
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUBandwidthTest /764cdb0b @daint:gpu+intel
- ^OSUBuildTest /19b4fb56 @daint:gpu+intel
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSUBandwidthTest /764cdb0b @daint:gpu+nvidia
- ^OSUBuildTest /19b4fb56 @daint:gpu+nvidia
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSULatencyTest /14f35a43 @daint:gpu+gnu
- ^OSUBuildTest /19b4fb56 @daint:gpu+gnu
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSULatencyTest /14f35a43 @daint:gpu+intel
- ^OSUBuildTest /19b4fb56 @daint:gpu+intel
- ^OSUDownloadTest /7de668df @daint:login+builtin
-- OSULatencyTest /14f35a43 @daint:gpu+nvidia
- ^OSUBuildTest /19b4fb56 @daint:gpu+nvidia
- ^OSUDownloadTest /7de668df @daint:login+builtin
-Concretized 22 test case(s)
-
-Log file(s) saved in '/tmp/rfm-fremrbwf.log'
diff --git a/docs/listings/osu_bench_list_concretized_gnu.txt b/docs/listings/osu_bench_list_concretized_gnu.txt
deleted file mode 100644
index 7898775b1d..0000000000
--- a/docs/listings/osu_bench_list_concretized_gnu.txt
+++ /dev/null
@@ -1,18 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2
- command: './bin/reframe -c tutorials/deps/osu_benchmarks.py -n OSULatencyTest -L -p builtin -p gnu'
- launched by: user@host
- working directory: '/home/user/Devel/reframe'
- settings files: '', '/home/user/Devel/reframe/tutorials/config/daint.py'
- check search path: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py'
- stage directory: '/home/user/Devel/reframe/stage'
- output directory: '/home/user/Devel/reframe/output'
- log files: '/tmp/rfm-7hnco47r.log'
-
-[List of matched checks]
-- OSULatencyTest /14f35a43 [variant: 0, file: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py']
- ^OSUBuildTest /19b4fb56 [variant: 0, file: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py']
- ^OSUDownloadTest /7de668df [variant: 0, file: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py']
-Found 3 check(s)
-
-Log file(s) saved in '/tmp/rfm-7hnco47r.log'
diff --git a/docs/listings/osu_latency_list.txt b/docs/listings/osu_latency_list.txt
deleted file mode 100644
index 66840d6cc3..0000000000
--- a/docs/listings/osu_latency_list.txt
+++ /dev/null
@@ -1,18 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2
- command: './bin/reframe -c tutorials/deps/osu_benchmarks.py -n OSULatencyTest -l'
- launched by: user@host
- working directory: '/home/user/Devel/reframe'
- settings files: '', '/home/user/Devel/reframe/tutorials/config/daint.py'
- check search path: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py'
- stage directory: '/home/user/Devel/reframe/stage'
- output directory: '/home/user/Devel/reframe/output'
- log files: '/tmp/rfm-7id7z75s.log'
-
-[List of matched checks]
-- OSULatencyTest /14f35a43
- ^OSUBuildTest /19b4fb56
- ^OSUDownloadTest /7de668df
-Found 3 check(s)
-
-Log file(s) saved in '/tmp/rfm-7id7z75s.log'
diff --git a/docs/listings/osu_latency_unresolved_deps.txt b/docs/listings/osu_latency_unresolved_deps.txt
deleted file mode 100644
index d82a5b97e0..0000000000
--- a/docs/listings/osu_latency_unresolved_deps.txt
+++ /dev/null
@@ -1,41 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2
- command: './bin/reframe -c tutorials/deps/osu_benchmarks.py -n OSULatencyTest --system=daint:gpu -l'
- launched by: user@host
- working directory: '/home/user/Devel/reframe'
- settings files: '', '/home/user/Devel/reframe/tutorials/config/daint.py'
- check search path: '/home/user/Devel/reframe/tutorials/deps/osu_benchmarks.py'
- stage directory: '/home/user/Devel/reframe/stage'
- output directory: '/home/user/Devel/reframe/output'
- log files: '/tmp/rfm-12gjxnvc.log'
-
-WARNING: could not resolve dependency: ('OSUBuildTest', 'daint:gpu', 'gnu') -> 'OSUDownloadTest'
-WARNING: could not resolve dependency: ('OSUBuildTest', 'daint:gpu', 'intel') -> 'OSUDownloadTest'
-WARNING: could not resolve dependency: ('OSUBuildTest', 'daint:gpu', 'nvidia') -> 'OSUDownloadTest'
-WARNING: skipping all dependent test cases
- - ('OSUBuildTest', 'daint:gpu', 'intel')
- - ('OSUBandwidthTest', 'daint:gpu', 'intel')
- - ('OSUBuildTest', 'daint:gpu', 'nvidia')
- - ('OSULatencyTest', 'daint:gpu', 'intel')
- - ('OSUAllreduceTest_3', 'daint:gpu', 'nvidia')
- - ('OSUBuildTest', 'daint:gpu', 'gnu')
- - ('OSUAllreduceTest_1', 'daint:gpu', 'nvidia')
- - ('OSUAllreduceTest_0', 'daint:gpu', 'intel')
- - ('OSUAllreduceTest_2', 'daint:gpu', 'nvidia')
- - ('OSUBandwidthTest', 'daint:gpu', 'gnu')
- - ('OSULatencyTest', 'daint:gpu', 'gnu')
- - ('OSUAllreduceTest_2', 'daint:gpu', 'intel')
- - ('OSUAllreduceTest_3', 'daint:gpu', 'intel')
- - ('OSUAllreduceTest_1', 'daint:gpu', 'intel')
- - ('OSUAllreduceTest_0', 'daint:gpu', 'nvidia')
- - ('OSUBandwidthTest', 'daint:gpu', 'nvidia')
- - ('OSULatencyTest', 'daint:gpu', 'nvidia')
- - ('OSUAllreduceTest_2', 'daint:gpu', 'gnu')
- - ('OSUAllreduceTest_1', 'daint:gpu', 'gnu')
- - ('OSUAllreduceTest_3', 'daint:gpu', 'gnu')
- - ('OSUAllreduceTest_0', 'daint:gpu', 'gnu')
-
-[List of matched checks]
-Found 0 check(s)
-
-Log file(s) saved in '/tmp/rfm-12gjxnvc.log'
diff --git a/docs/listings/param_deps_list.txt b/docs/listings/param_deps_list.txt
deleted file mode 100644
index 400efc053a..0000000000
--- a/docs/listings/param_deps_list.txt
+++ /dev/null
@@ -1,26 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c tutorials/deps/parameterized.py -l'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', '/home/user/Repositories/reframe/tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/tutorials/deps/parameterized.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-u9ryq5d3.log'
-
-[List of matched checks]
-- TestB /cc291487
- ^TestA %z=9 /034f091a
- ^TestA %z=8 /a093d19f
- ^TestA %z=7 /77b4b8e6
- ^TestA %z=6 /40ce4759
-- TestA %z=5 /aa0cffc9
-- TestA %z=4 /83cd5dec
-- TestA %z=3 /1c51609b
-- TestA %z=2 /707b752c
-- TestA %z=1 /c65657d5
-- TestA %z=0 /1b9f44df
-Found 11 check(s)
-
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-u9ryq5d3.log'
diff --git a/docs/listings/perflogs.txt b/docs/listings/perflogs.txt
deleted file mode 100644
index 5751f3a7a9..0000000000
--- a/docs/listings/perflogs.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-job_completion_time,version,display_name,system,partition,environ,jobid,result,Copy_value,Copy_unit,Copy_ref,Copy_lower,Copy_upper,Scale_value,Scale_unit,Scale_ref,Scale_lower,Scale_upper,Add_value,Add_unit,Add_ref,Add_lower,Add_upper,Triad_value,Triad_unit,Triad_ref,Triad_lower,Triad_upper
-2022-10-18T21:41:25,4.0.0-dev.2+90fbd3ef,StreamWithRefTest,catalina,default,gnu,81351,pass,24235.6,MB/s,25200,-0.05,0.05,16044.2,MB/s,16800,-0.05,0.05,17733.7,MB/s,18500,-0.05,0.05,18232.0,MB/s,18800,-0.05,0.05
-2022-10-18T21:41:31,4.0.0-dev.2+90fbd3ef,StreamWithRefTest,catalina,default,gnu,81377,fail,23615.4,MB/s,25200,-0.05,0.05,16394.5,MB/s,16800,-0.05,0.05,17841.3,MB/s,18500,-0.05,0.05,18284.1,MB/s,18800,-0.05,0.05
-2022-10-18T21:46:06,4.0.0-dev.2+90fbd3ef,StreamWithRefTest,catalina,default,gnu,81480,fail,23736.4,MB/s,25200,-0.05,0.05,16242.8,MB/s,16800,-0.05,0.05,17699.1,MB/s,18500,-0.05,0.05,18077.3,MB/s,18800,-0.05,0.05
diff --git a/docs/listings/run-report.json b/docs/listings/run-report.json
deleted file mode 100644
index a31169ecd1..0000000000
--- a/docs/listings/run-report.json
+++ /dev/null
@@ -1,65 +0,0 @@
-{
- "session_info": {
- "cmdline": "./bin/reframe -c tutorials/basics/hello/hello1.py -r",
- "config_file": "",
- "data_version": "2.0",
- "hostname": "host",
- "prefix_output": "/path/to/reframe/output",
- "prefix_stage": "/path/to/reframe/stage",
- "user": "user",
- "version": "3.10.0-dev.3+c22440c1",
- "workdir": "/path/to/reframe",
- "time_start": "2022-01-22T13:21:50+0100",
- "time_end": "2022-01-22T13:21:51+0100",
- "time_elapsed": 0.8124568462371826,
- "num_cases": 1,
- "num_failures": 0
- },
- "runs": [
- {
- "num_cases": 1,
- "num_failures": 0,
- "num_aborted": 0,
- "num_skipped": 0,
- "runid": 0,
- "testcases": [
- {
- "build_stderr": "rfm_HelloTest_build.err",
- "build_stdout": "rfm_HelloTest_build.out",
- "dependencies_actual": [],
- "dependencies_conceptual": [],
- "description": "HelloTest",
- "display_name": "HelloTest",
- "filename": "/path/to/reframe/tutorials/basics/hello/hello1.py",
- "environment": "builtin",
- "fail_phase": null,
- "fail_reason": null,
- "jobid": "43152",
- "job_stderr": "rfm_HelloTest_job.err",
- "job_stdout": "rfm_HelloTest_job.out",
- "maintainers": [],
- "name": "HelloTest",
- "nodelist": [
- "tresa.local"
- ],
- "outputdir": "/path/to/reframe/output/generic/default/builtin/HelloTest",
- "perfvars": null,
- "prefix": "/path/to/reframe/tutorials/basics/hello",
- "result": "success",
- "stagedir": "/path/to/reframe/stage/generic/default/builtin/HelloTest",
- "scheduler": "local",
- "system": "generic:default",
- "tags": [],
- "time_compile": 0.27164483070373535,
- "time_performance": 0.00010180473327636719,
- "time_run": 0.3764667510986328,
- "time_sanity": 0.0006909370422363281,
- "time_setup": 0.007919073104858398,
- "time_total": 0.8006880283355713,
- "unique_name": "HelloTest"
- }
- ]
- }
- ],
- "restored_cases": []
-}
diff --git a/docs/listings/stream1.txt b/docs/listings/stream1.txt
deleted file mode 100644
index 7cf37f4c2a..0000000000
--- a/docs/listings/stream1.txt
+++ /dev/null
@@ -1,40 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c tutorials/basics/stream/stream1.py -r --performance-report'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', '/home/user/Repositories/reframe/tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/tutorials/basics/stream/stream1.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-v0ig7jt4.log'
-
-[==========] Running 1 check(s)
-[==========] Started on Sat Nov 12 19:00:53 2022
-
-[----------] start processing checks
-[ RUN ] StreamTest /cdf4820d @tresa:default+gnu
-[ OK ] (1/1) StreamTest /cdf4820d @tresa:default+gnu
-P: Copy: 24031.8 MB/s (r:0, l:None, u:None)
-P: Scale: 16297.9 MB/s (r:0, l:None, u:None)
-P: Add: 17843.8 MB/s (r:0, l:None, u:None)
-P: Triad: 18278.3 MB/s (r:0, l:None, u:None)
-[----------] all spawned checks have finished
-
-[ PASSED ] Ran 1/1 test case(s) from 1 check(s) (0 failure(s), 0 skipped)
-[==========] Finished on Sat Nov 12 19:00:56 2022
-
-================================================================================
-PERFORMANCE REPORT
---------------------------------------------------------------------------------
-[StreamTest /cdf4820d @tresa:default:gnu]
- num_tasks: 1
- num_gpus_per_node: 0
- performance:
- - Copy: 24031.8 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 16297.9 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 17843.8 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 18278.3 MB/s (r: 0 MB/s l: -inf% u: +inf%)
---------------------------------------------------------------------------------
-Run report saved in '/home/user/.reframe/reports/run-report-324.json'
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-v0ig7jt4.log'
diff --git a/docs/listings/stream3_failure_only.txt b/docs/listings/stream3_failure_only.txt
deleted file mode 100644
index 77f6c2b219..0000000000
--- a/docs/listings/stream3_failure_only.txt
+++ /dev/null
@@ -1,14 +0,0 @@
-FAILURE INFO for StreamWithRefTest
- * Expanded name: StreamWithRefTest
- * Description:
- * System partition: catalina:default
- * Environment: gnu
- * Stage directory: /Users/user/Repositories/reframe/stage/catalina/default/gnu/StreamWithRefTest
- * Node list: tresa.localNone
- * Job type: local (id=4576)
- * Dependencies (conceptual): []
- * Dependencies (actual): []
- * Maintainers: []
- * Failing phase: performance
- * Rerun with '-n /f925207b -p gnu --system catalina:default -r'
- * Reason: performance error: failed to meet reference: Add=19585.3, expected 18500 (l=17575.0, u=19425.0)
diff --git a/docs/listings/stream4_daint.txt b/docs/listings/stream4_daint.txt
deleted file mode 100644
index e4796996f6..0000000000
--- a/docs/listings/stream4_daint.txt
+++ /dev/null
@@ -1,206 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2
- command: './bin/reframe -c tutorials/basics/stream/stream4.py -r --performance-report'
- launched by: user@host
- working directory: '/home/user/Devel/reframe'
- settings files: '', '/home/user/Devel/reframe/tutorials/config/daint.py'
- check search path: '/home/user/Devel/reframe/tutorials/basics/stream/stream4.py'
- stage directory: '/home/user/Devel/reframe/stage'
- output directory: '/home/user/Devel/reframe/output'
- log files: '/tmp/rfm-yf6xjn_4.log'
-
-[==========] Running 1 check(s)
-[==========] Started on Tue Nov 15 18:22:48 2022
-
-[----------] start processing checks
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:login+gnu
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:login+intel
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:login+nvidia
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:login+cray
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:gpu+gnu
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:gpu+intel
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:gpu+nvidia
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:gpu+cray
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:mc+gnu
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:mc+intel
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:mc+nvidia
-[ RUN ] StreamMultiSysTest /eec1c676 @daint:mc+cray
-[ OK ] ( 1/12) StreamMultiSysTest /eec1c676 @daint:login+gnu
-P: Copy: 97772.6 MB/s (r:0, l:None, u:None)
-P: Scale: 69418.6 MB/s (r:0, l:None, u:None)
-P: Add: 71941.0 MB/s (r:0, l:None, u:None)
-P: Triad: 73679.7 MB/s (r:0, l:None, u:None)
-[ OK ] ( 2/12) StreamMultiSysTest /eec1c676 @daint:login+intel
-P: Copy: 85123.0 MB/s (r:0, l:None, u:None)
-P: Scale: 79701.7 MB/s (r:0, l:None, u:None)
-P: Add: 81632.7 MB/s (r:0, l:None, u:None)
-P: Triad: 44391.5 MB/s (r:0, l:None, u:None)
-[ OK ] ( 3/12) StreamMultiSysTest /eec1c676 @daint:login+nvidia
-P: Copy: 76641.4 MB/s (r:0, l:None, u:None)
-P: Scale: 59041.9 MB/s (r:0, l:None, u:None)
-P: Add: 64792.5 MB/s (r:0, l:None, u:None)
-P: Triad: 69441.4 MB/s (r:0, l:None, u:None)
-[ OK ] ( 4/12) StreamMultiSysTest /eec1c676 @daint:login+cray
-P: Copy: 35658.5 MB/s (r:0, l:None, u:None)
-P: Scale: 27732.2 MB/s (r:0, l:None, u:None)
-P: Add: 39037.7 MB/s (r:0, l:None, u:None)
-P: Triad: 45310.3 MB/s (r:0, l:None, u:None)
-[ OK ] ( 5/12) StreamMultiSysTest /eec1c676 @daint:gpu+gnu
-P: Copy: 42666.3 MB/s (r:0, l:None, u:None)
-P: Scale: 38491.0 MB/s (r:0, l:None, u:None)
-P: Add: 43686.4 MB/s (r:0, l:None, u:None)
-P: Triad: 43466.6 MB/s (r:0, l:None, u:None)
-[ OK ] ( 6/12) StreamMultiSysTest /eec1c676 @daint:gpu+intel
-P: Copy: 51726.7 MB/s (r:0, l:None, u:None)
-P: Scale: 54185.6 MB/s (r:0, l:None, u:None)
-P: Add: 57608.3 MB/s (r:0, l:None, u:None)
-P: Triad: 57390.7 MB/s (r:0, l:None, u:None)
-[ OK ] ( 7/12) StreamMultiSysTest /eec1c676 @daint:gpu+nvidia
-P: Copy: 51810.8 MB/s (r:0, l:None, u:None)
-P: Scale: 39653.4 MB/s (r:0, l:None, u:None)
-P: Add: 44008.0 MB/s (r:0, l:None, u:None)
-P: Triad: 44384.4 MB/s (r:0, l:None, u:None)
-[ OK ] ( 8/12) StreamMultiSysTest /eec1c676 @daint:gpu+cray
-P: Copy: 51101.8 MB/s (r:0, l:None, u:None)
-P: Scale: 38568.1 MB/s (r:0, l:None, u:None)
-P: Add: 43193.6 MB/s (r:0, l:None, u:None)
-P: Triad: 43142.9 MB/s (r:0, l:None, u:None)
-[ OK ] ( 9/12) StreamMultiSysTest /eec1c676 @daint:mc+gnu
-P: Copy: 48292.9 MB/s (r:0, l:None, u:None)
-P: Scale: 38499.5 MB/s (r:0, l:None, u:None)
-P: Add: 43555.7 MB/s (r:0, l:None, u:None)
-P: Triad: 43871.4 MB/s (r:0, l:None, u:None)
-[ OK ] (10/12) StreamMultiSysTest /eec1c676 @daint:mc+cray
-P: Copy: 46538.3 MB/s (r:0, l:None, u:None)
-P: Scale: 40133.3 MB/s (r:0, l:None, u:None)
-P: Add: 43363.9 MB/s (r:0, l:None, u:None)
-P: Triad: 43450.3 MB/s (r:0, l:None, u:None)
-[ OK ] (11/12) StreamMultiSysTest /eec1c676 @daint:mc+nvidia
-P: Copy: 46648.2 MB/s (r:0, l:None, u:None)
-P: Scale: 40384.5 MB/s (r:0, l:None, u:None)
-P: Add: 44001.1 MB/s (r:0, l:None, u:None)
-P: Triad: 44489.7 MB/s (r:0, l:None, u:None)
-[ OK ] (12/12) StreamMultiSysTest /eec1c676 @daint:mc+intel
-P: Copy: 51335.9 MB/s (r:0, l:None, u:None)
-P: Scale: 49490.3 MB/s (r:0, l:None, u:None)
-P: Add: 56859.9 MB/s (r:0, l:None, u:None)
-P: Triad: 56544.5 MB/s (r:0, l:None, u:None)
-[----------] all spawned checks have finished
-
-[ PASSED ] Ran 12/12 test case(s) from 1 check(s) (0 failure(s), 0 skipped)
-[==========] Finished on Tue Nov 15 18:24:00 2022
-
-================================================================================
-PERFORMANCE REPORT
---------------------------------------------------------------------------------
-[StreamMultiSysTest /eec1c676 @daint:login:gnu]
- num_cpus_per_task: 10
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 97772.6 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 69418.6 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 71941.0 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 73679.7 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamMultiSysTest /eec1c676 @daint:login:intel]
- num_cpus_per_task: 10
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 85123.0 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 79701.7 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 81632.7 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 44391.5 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamMultiSysTest /eec1c676 @daint:login:nvidia]
- num_cpus_per_task: 10
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 76641.4 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 59041.9 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 64792.5 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 69441.4 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamMultiSysTest /eec1c676 @daint:login:cray]
- num_cpus_per_task: 10
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 35658.5 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 27732.2 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 39037.7 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 45310.3 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamMultiSysTest /eec1c676 @daint:gpu:gnu]
- num_cpus_per_task: 12
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 42666.3 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 38491.0 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 43686.4 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 43466.6 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamMultiSysTest /eec1c676 @daint:gpu:intel]
- num_cpus_per_task: 12
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 51726.7 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 54185.6 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 57608.3 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 57390.7 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamMultiSysTest /eec1c676 @daint:gpu:nvidia]
- num_cpus_per_task: 12
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 51810.8 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 39653.4 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 44008.0 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 44384.4 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamMultiSysTest /eec1c676 @daint:gpu:cray]
- num_cpus_per_task: 12
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 51101.8 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 38568.1 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 43193.6 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 43142.9 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamMultiSysTest /eec1c676 @daint:mc:gnu]
- num_cpus_per_task: 36
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 48292.9 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 38499.5 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 43555.7 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 43871.4 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamMultiSysTest /eec1c676 @daint:mc:intel]
- num_cpus_per_task: 36
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 51335.9 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 49490.3 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 56859.9 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 56544.5 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamMultiSysTest /eec1c676 @daint:mc:nvidia]
- num_cpus_per_task: 36
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 46648.2 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 40384.5 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 44001.1 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 44489.7 MB/s (r: 0 MB/s l: -inf% u: +inf%)
-[StreamMultiSysTest /eec1c676 @daint:mc:cray]
- num_cpus_per_task: 36
- num_gpus_per_node: 0
- num_tasks: 1
- performance:
- - Copy: 46538.3 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Scale: 40133.3 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Add: 43363.9 MB/s (r: 0 MB/s l: -inf% u: +inf%)
- - Triad: 43450.3 MB/s (r: 0 MB/s l: -inf% u: +inf%)
---------------------------------------------------------------------------------
-Run report saved in '/home/user/.reframe/reports/run-report-2.json'
-Log file(s) saved in '/tmp/rfm-yf6xjn_4.log'
diff --git a/docs/listings/stream_params.txt b/docs/listings/stream_params.txt
deleted file mode 100644
index 71836c914b..0000000000
--- a/docs/listings/stream_params.txt
+++ /dev/null
@@ -1,26 +0,0 @@
-[ReFrame Setup]
- version: 4.0.0-dev.2+5ea6b7a6
- command: './bin/reframe -c tutorials/advanced/parameterized/stream.py -l'
- launched by: user@host
- working directory: '/home/user/Repositories/reframe'
- settings files: '', '/home/user/Repositories/reframe/tutorials/config/tresa.py'
- check search path: '/home/user/Repositories/reframe/tutorials/advanced/parameterized/stream.py'
- stage directory: '/home/user/Repositories/reframe/stage'
- output directory: '/home/user/Repositories/reframe/output'
- log files: '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-lou71c2g.log'
-
-[List of matched checks]
-- StreamMultiSysTest %num_bytes=536870912 /cf10843f
-- StreamMultiSysTest %num_bytes=268435456 /97fb363f
-- StreamMultiSysTest %num_bytes=134217728 /7b4d01d3
-- StreamMultiSysTest %num_bytes=67108864 /530b0154
-- StreamMultiSysTest %num_bytes=33554432 /7199fc93
-- StreamMultiSysTest %num_bytes=16777216 /9d1b9ea8
-- StreamMultiSysTest %num_bytes=8388608 /3f29039f
-- StreamMultiSysTest %num_bytes=4194304 /e30054cd
-- StreamMultiSysTest %num_bytes=2097152 /45efaec5
-- StreamMultiSysTest %num_bytes=1048576 /92327981
-- StreamMultiSysTest %num_bytes=524288 /eb104cd0
-Found 11 check(s)
-
-Log file(s) saved in '/var/folders/h7/k7cgrdl13r996m4dmsvjq7v80000gp/T/rfm-lou71c2g.log'
diff --git a/docs/listings/verbose_test_loading.txt b/docs/listings/verbose_test_loading.txt
new file mode 100644
index 0000000000..3140584597
--- /dev/null
+++ b/docs/listings/verbose_test_loading.txt
@@ -0,0 +1,108 @@
+Loading user configuration
+Loading the builtin configuration
+Loading configuration file: 'config/baseline_environs.py'
+Autodetecting system
+Trying autodetection method: 'py::socket.gethostname'
+Retrieved hostname: 'myhost'
+Looking for a matching configuration entry
+Configuration found: picking system 'tutorialsys'
+Initializing runtime
+Initializing system partition 'default'
+Initializing system 'tutorialsys'
+Initializing modules system 'nomod'
+detecting topology info for tutorialsys:default
+> found topology file '/home/user/.reframe/topology/tutorialsys-default/processor.json'; loading...
+> device auto-detection is not supported
+[ReFrame Environment]
+ RFM_AUTODETECT_FQDN=
+ RFM_AUTODETECT_METHOD=
+ RFM_AUTODETECT_METHODS=
+ RFM_AUTODETECT_XTHOSTNAME=
+ RFM_CHECK_SEARCH_PATH=
+ RFM_CHECK_SEARCH_RECURSIVE=
+ RFM_CLEAN_STAGEDIR=
+ RFM_COLORIZE=
+ RFM_COMPRESS_REPORT=
+ RFM_CONFIG_FILES=
+ RFM_CONFIG_PATH=
+ RFM_DUMP_PIPELINE_PROGRESS=
+ RFM_GIT_TIMEOUT=
+ RFM_HTTPJSON_URL=
+ RFM_IGNORE_REQNODENOTAVAIL=
+ RFM_INSTALL_PREFIX=/usr/local/share/reframe
+ RFM_KEEP_STAGE_FILES=
+ RFM_MODULE_MAPPINGS=
+ RFM_MODULE_MAP_FILE=
+ RFM_NON_DEFAULT_CRAYPE=
+ RFM_OUTPUT_DIR=
+ RFM_PERFLOG_DIR=
+ RFM_PERF_INFO_LEVEL=
+ RFM_PIPELINE_TIMEOUT=
+ RFM_PREFIX=
+ RFM_PURGE_ENVIRONMENT=
+ RFM_REMOTE_DETECT=
+ RFM_REMOTE_WORKDIR=
+ RFM_REPORT_FILE=
+ RFM_REPORT_JUNIT=
+ RFM_RESOLVE_MODULE_CONFLICTS=
+ RFM_SAVE_LOG_FILES=
+ RFM_STAGE_DIR=
+ RFM_SYSLOG_ADDRESS=
+ RFM_SYSTEM=
+ RFM_TIMESTAMP_DIRS=
+ RFM_TRAP_JOB_ERRORS=
+ RFM_UNLOAD_MODULES=
+ RFM_USER_MODULES=
+ RFM_USE_LOGIN_SHELL=
+ RFM_VERBOSE=
+[ReFrame Setup]
+ version: 4.6.0-dev.2
+ command: '/usr/local/share/reframe/bin/reframe -C config/baseline_environs.py -c stream/stream_variables.py -l -vv'
+ launched by: user@myhost
+ working directory: '/home/user/reframe-examples/tutorial'
+ settings files: '', 'config/baseline_environs.py'
+ check search path: '/home/user/reframe-examples/tutorial/stream/stream_variables.py'
+ stage directory: '/home/user/reframe-examples/tutorial/stage'
+ output directory: '/home/user/reframe-examples/tutorial/output'
+ log files: '/tmp/rfm-f2v37wl4.log'
+
+Looking for tests in '/home/user/reframe-examples/tutorial/stream/stream_variables.py'
+Validating '/home/user/reframe-examples/tutorial/stream/stream_variables.py': OK
+ > Loaded 3 test(s)
+Loaded 3 test(s)
+Generated 4 test case(s)
+Filtering test cases(s) by name: 2 remaining
+Filtering test cases(s) by tags: 2 remaining
+Filtering test cases(s) by other attributes: 2 remaining
+Building and validating the full test DAG
+Full test DAG:
+ ('stream_test', 'tutorialsys:default', 'gnu') -> [('build_stream_32608d67', 'tutorialsys:default', 'gnu')]
+ ('stream_test', 'tutorialsys:default', 'clang') -> [('build_stream_de1600df', 'tutorialsys:default', 'clang')]
+ ('build_stream_de1600df', 'tutorialsys:default', 'clang') -> []
+ ('build_stream_32608d67', 'tutorialsys:default', 'gnu') -> []
+Pruned test DAG
+ ('stream_test', 'tutorialsys:default', 'gnu') -> [('build_stream_32608d67', 'tutorialsys:default', 'gnu')]
+ ('build_stream_32608d67', 'tutorialsys:default', 'gnu') -> []
+ ('stream_test', 'tutorialsys:default', 'clang') -> [('build_stream_de1600df', 'tutorialsys:default', 'clang')]
+ ('build_stream_de1600df', 'tutorialsys:default', 'clang') -> []
+Final number of test cases: 4
+[List of matched checks]
+- stream_test /2e15a047
+ ^build_stream ~tutorialsys:default+gnu 'stream_binary /40af02af
+ ^build_stream ~tutorialsys:default+clang 'stream_binary /8effd276
+Found 1 check(s)
+
+Log file(s) saved in '/tmp/rfm-f2v37wl4.log'
+>>> profiler report [start] <<<
+main: 0.118834 s
+ test processing: 0.037243 s
+ RegressionCheckLoader.load_all: 0.028813 s
+ TestRegistry.instantiate_all: 0.015122 s
+ generate_testcases: 0.000090 s
+ main.._sort_testcases: 0.000019 s
+ build_deps: 0.000259 s
+ validate_deps: 0.000097 s
+ prune_deps: 0.000199 s
+ toposort: 0.000200 s
+ list_checks: 0.002932 s
+>>> profiler report [ end ] <<<
diff --git a/docs/manpage.rst b/docs/manpage.rst
index 2b56ef1080..387c27f9f0 100644
--- a/docs/manpage.rst
+++ b/docs/manpage.rst
@@ -12,7 +12,7 @@ Synopsis
Description
-----------
-ReFrame provides both a `programming interface `__ for writing regression tests and a command-line interface for managing and running the tests, which is detailed here.
+ReFrame provides both a :doc:`programming interface ` for writing regression tests and a command-line interface for managing and running the tests, which is detailed here.
The ``reframe`` command is part of ReFrame's frontend.
This frontend is responsible for loading and running regression tests written in ReFrame.
ReFrame executes tests by sending them down to a well defined pipeline.
@@ -251,7 +251,7 @@ An action must always be specified.
For more information, have a look in :ref:`generate-ci-pipeline`.
.. note::
- This option will not work with the `test generation options <#test-generators>`.
+ This option will not work with the :ref:`test generation options `.
.. versionadded:: 3.4.1
@@ -385,7 +385,7 @@ Options controlling ReFrame output
Directory prefix for logging performance data.
- This option is relevant only to the ``filelog`` `logging handler `__.
+ This option is relevant only to the ``filelog`` :ref:`logging handler `.
This option can also be set using the :envvar:`RFM_PERFLOG_DIR` environment variable or the :attr:`~config.logging.handlers_perflog..filelog..basedir` logging handler configuration parameter.
@@ -441,7 +441,7 @@ Options controlling ReFrame output
Save ReFrame log files in the output directory before exiting.
- Only log files generated by ``file`` `log handlers `__ will be copied.
+ Only log files generated by ``file`` :ref:`log handlers ` will be copied.
This option can also be set using the :envvar:`RFM_SAVE_LOG_FILES` environment variable or the :attr:`~config.general.save_log_files` general configuration parameter.
@@ -583,7 +583,7 @@ Options controlling ReFrame execution
This is not a problem when rerunning a failed case, since the stage directories of its dependencies are automatically kept, but if you want to rerun a successful test case, you should make sure to have run with the :option:`--keep-stage-files` option.
.. note::
- This option will not work with the `test generation options <#test-generators>`.
+ This option will not work with the :ref:`test generation options `.
.. versionadded:: 3.4
@@ -752,6 +752,18 @@ If no node can be selected, the test will be marked as a failure with an appropr
Slurm OR constraints and parenthesized expressions are supported in flexible node allocation.
+ .. versionchanged:: 4.7
+ The test is not marked as a failure if not enough nodes are available; it is skipped instead.
+ To enforce a failure, use :option:`--flex-alloc-strict`.
+
+.. option:: --flex-alloc-strict
+
+ Fail flexible tests if their minimum task requirement is not satisfied.
+ Otherwise the tests will be skipped.
+
+ .. versionadded:: 4.7
+
+
---------------------------------------
Options controlling ReFrame environment
---------------------------------------
@@ -1175,7 +1187,7 @@ Notice that this example leads to a name conflict with the old naming scheme, si
Each test is also associated with a hash code that is derived from the test name, its parameters and their values.
As in the example listing above, the hash code of each test is printed with the :option:`-l` option and individual tests can be selected by their hash using the :option:`-n` option, e.g., ``-n /1c51609b``.
-The stage and output directories, as well as the performance log file of the ``filelog`` `performance log handler `__ will use the hash code for the test-specific directories and files.
+The stage and output directories, as well as the performance log file of the ``filelog`` :ref:`performance log handler ` will use the hash code for the test-specific directories and files.
This might lead to conflicts for tests as the one above when executing them with the asynchronous execution policy, but ensures consistency of performance record files when parameter values are added to or deleted from a test parameter.
More specifically, the test's hash will not change if a new parameter value is added or deleted or even if the parameter values are shuffled.
Test variants on the other side are more volatile and can change with such changes.
@@ -1425,6 +1437,21 @@ Whenever an environment variable is associated with a configuration option, its
.. versionadded:: 4.0.0
+.. envvar:: RFM_FLEX_ALLOC_STRICT
+
+ Fail flexible tests if their minimum task requirement is not satisfied.
+
+ .. table::
+ :align: left
+
+ ================================== ==================
+ Associated command line option :option:`--flex-alloc-strict`
+ Associated configuration parameter :attr:`~config.general.flex_alloc_strict`
+ ================================== ==================
+
+ .. versionadded:: 4.7
+
+
.. envvar:: RFM_GIT_TIMEOUT
Timeout value in seconds used when checking if a git repository exists.
@@ -1858,13 +1885,10 @@ If no configuration file can be found in any of the predefined locations, ReFram
This configuration file is located in |reframe/core/settings.py|_.
Users may *not* modify this file.
-For a complete reference of the configuration, please refer to |reframe.settings(8)|_ man page.
+For a complete reference of the configuration, please refer to :doc:`reframe.settings(8) ` man page.
.. |reframe/core/settings.py| replace:: ``reframe/core/settings.py``
.. _reframe/core/settings.py: https://github.com/reframe-hpc/reframe/blob/master/reframe/core/settings.py
-.. |reframe.settings(8)| replace:: ``reframe.settings(8)``
-.. _reframe.settings(8): config_reference.html
-
Reporting Bugs
--------------
diff --git a/docs/pipeline.rst b/docs/pipeline.rst
index b6efb3edef..1bea1b4b95 100644
--- a/docs/pipeline.rst
+++ b/docs/pipeline.rst
@@ -14,7 +14,7 @@ The following figure explains in more detail the process:
When ReFrame loads a test from the disk it unconditionally constructs it executing its :func:`__init__` method.
The practical implication of this is that your test will be instantiated even if it will not run on the current system.
-After all the tests are loaded, they are filtered based on the current system and any other criteria (such as programming environment, test attributes etc.) specified by the user (see `Test Filtering `__ for more details).
+After all the tests are loaded, they are filtered based on the current system and any other criteria (such as programming environment, test attributes etc.) specified by the user (see :ref:`Test Filtering ` for more details).
After the tests are filtered, ReFrame creates the actual `test cases` to be run. A test case is essentially a tuple consisting of the test, the system partition and the programming environment to try.
The test that goes into a test case is essentially a `clone` of the original test that was instantiated upon loading.
This ensures that the test case's state is not shared and may not be reused in any case.
@@ -66,7 +66,7 @@ The Run Phase
During this phase a job script associated with the test case will be created and it will be submitted for execution.
If the test is `"run-only," `__ its `resources `__ will be first copied to the test case's stage directory.
ReFrame will temporarily switch to that directory and spawn the test's job from there.
-This phase is executed asynchronously (either a batch job is spawned or a local process is started) and it is up to the selected `execution policy <#execution-policies>`__ to block or not until the associated job finishes.
+This phase is executed asynchronously (either a batch job is spawned or a local process is started) and it is up to the selected :ref:`execution policy ` to block or not until the associated job finishes.
----------------
@@ -96,6 +96,8 @@ More specifically, if the test has finished successfully, all interesting test f
This phase might be deferred in case a test has dependents (see :ref:`cleaning-up-stage-files` for more details).
+.. _execution-policies:
+
Execution Policies
------------------
diff --git a/docs/regression_test_api.rst b/docs/regression_test_api.rst
index 5b546b3e36..16c38feab1 100644
--- a/docs/regression_test_api.rst
+++ b/docs/regression_test_api.rst
@@ -289,7 +289,7 @@ Modules Systems
:members:
:show-inheritance:
-
+.. _build-systems:
-------------
Build Systems
-------------
diff --git a/docs/requirements.txt b/docs/requirements.txt
index 0a7408c2c2..f589cdd2fc 100644
--- a/docs/requirements.txt
+++ b/docs/requirements.txt
@@ -1,9 +1,9 @@
-archspec==0.2.2
+archspec==0.2.4
docutils==0.18.1
jsonschema==3.2.0
semver==2.13.0; python_version == '3.6'
semver==3.0.2; python_version >= '3.7'
Sphinx==5.3.0; python_version < '3.8'
Sphinx==7.1.2; python_version == '3.8'
-Sphinx==7.2.6; python_version >= '3.9'
+Sphinx==7.3.7; python_version >= '3.9'
sphinx-rtd-theme==2.0.0
diff --git a/docs/started.rst b/docs/started.rst
index e3f9c0507e..ccfc1f57ba 100644
--- a/docs/started.rst
+++ b/docs/started.rst
@@ -105,6 +105,10 @@ The ``./bootstrap.sh`` has two additional variant options:
The bootstrap script for ReFrame was added.
For previous ReFrame versions you should install its requirements using ``pip install -r requirements.txt`` in a Python virtual environment.
+ .. versionchanged:: 4.5
+ ReFrame now supports multiarch builds and will place all of its dependencies in an arch-specific directory under its prefix.
+ Also, ``pip`` is no longer required: the bootstrap script will create a virtual environment without ``pip`` and will fetch a fresh ``pip``, which will be used to install the dependencies.
+
Enabling auto-completion
------------------------
@@ -123,10 +127,8 @@ Auto-completion is supported for Bash, Tcsh and Fish shells.
Where to Go from Here
---------------------
-If you are new to ReFrame, the place to start is the first tutorial :doc:`tutorial_basics`, which will guide you step-by-step in both writing your first tests and in configuring ReFrame.
-The rest of the tutorials explore additional capabilities of the framework and cover several topics that you will likely come across when writing your own tests.
-
-The :doc:`configure` page provides more details on how a configuration file is structured and the :doc:`topics` explain some more advanced concepts as well as some implementation details.
-The :doc:`manuals` provide complete reference guides for the command line interface, the configuration parameters and the programming APIs for writing tests.
+If you are new to ReFrame, the place to start is the :doc:`tutorial`, which will guide you through all the concepts of the framework and get you up and running.
+If you are looking for a particular topic that is not covered in the tutorial, you can refer to the :doc:`howto` or the :doc:`topics`.
+For detailed reference guides for the command line, the configuration and the programming API, refer to the :doc:`manuals`.
-Finally, if you are not new to ReFrame and you have been using the 3.x versions, you should read the :doc:`whats_new_40` page, which explains what are the key new features of ReFrame 4.0 as well as all the breaking changes.
+Finally, if you are already a user of ReFrame 3.x, you should read the :doc:`whats_new_40` page, which explains the key new features of ReFrame 4.0 as well as all the breaking changes.
diff --git a/docs/tutorial.rst b/docs/tutorial.rst
new file mode 100644
index 0000000000..9852171216
--- /dev/null
+++ b/docs/tutorial.rst
@@ -0,0 +1,1955 @@
+.. currentmodule:: reframe.core.pipeline.RegressionTest
+
+================
+ReFrame Tutorial
+================
+
+This tutorial will cover the basic concepts of ReFrame and will get you started with the framework.
+For more specific topics, you should refer to ":doc:`howto`" as well as to the ":doc:`topics`" for an in-depth understanding of some of the framework's concepts.
+
+.. contents:: Table of Contents
+ :local:
+ :depth: 3
+
+Requirements
+============
+
+To run this tutorial you need ``docker`` for the local examples and ``docker compose`` for the examples emulating a Slurm cluster.
+Note that the Docker daemon must be running.
+
+The tutorial container images come with the latest ReFrame version pre-installed.
+For installing a stand-alone version of ReFrame, please refer to the ":doc:`started`" guide.
+
+All tutorial examples are located under the ``reframe-examples`` directory inside the container's working directory.
+
+
+Running the local examples
+--------------------------
+
+To run the local examples, launch the single-node tutorial container, bind-mounting the examples:
+
+.. code-block:: bash
+
+ git clone https://github.com/reframe-hpc/reframe.git
+ cd reframe
+ docker build -t reframe-tut-singlenode:latest -f examples/tutorial/dockerfiles/singlenode.Dockerfile .
+ docker run -h myhost -it --mount type=bind,source=$(pwd)/examples/,target=/home/user/reframe-examples reframe-tut-singlenode:latest /bin/bash
+
+
+.. _multi-node-setup:
+
+Running the multi-node examples
+-------------------------------
+
+To run the multi-node examples, you first need to launch a Slurm pseudo-cluster using the provided Docker compose file:
+
+.. code-block:: bash
+
+ git clone https://github.com/reframe-hpc/reframe.git
+ cd reframe
+ docker compose --project-directory=$(pwd) -f examples/tutorial/dockerfiles/slurm-cluster/docker-compose.yml up --abort-on-container-exit --exit-code-from frontend
+
+Once the Docker compose stack is up, execute the following from a different terminal window in order to "log in" to the frontend container:
+
+.. code-block::
+
+ docker exec -it $(docker ps -f name=frontend -q) /bin/bash
+
+ # Inside the container
+ cd reframe-examples/tutorial/
+
+When you are done, press Ctrl-D in the frontend container and Ctrl-C in the Docker compose console window.
+
+
+.. note::
+
+ All examples use the single-node container unless it is otherwise noted.
+
+
+Modifying the examples
+----------------------
+
+In both cases, the tutorial examples are bind-mounted into the container, so you can make changes directly on your host and they will be reflected inside the container, and vice versa.
+
+
+.. _writing-your-first-test:
+
+Writing your first test
+=======================
+
+We will start with the `STREAM benchmark `__.
+This is a standard benchmark for measuring DRAM bandwidth.
+The tutorial container already contains a pre-compiled OpenMP version of the benchmark.
+Our test will run the STREAM executable, validate its output and extract the figures of merit.
+Here is the full ReFrame test:
+
+.. literalinclude:: ../examples/tutorial/stream/stream_runonly.py
+ :caption:
+ :lines: 5-
+
+ReFrame tests are specially decorated classes that ultimately derive from the :class:`~reframe.core.pipeline.RegressionTest` class.
+Since we only want to run an executable in this first test, we derive from the :class:`~reframe.core.pipeline.RunOnlyRegressionTest` class, which essentially short-circuits the "compile" stage of the test.
+The :func:`@simple_test ` decorator registers a test class with the framework and makes it available for running.
+
+Every ReFrame test must define the :attr:`valid_systems` and :attr:`valid_prog_environs` variables.
+These describe the test's constraints and the framework will automatically filter out tests on systems and environments that do not match the constraints.
+We will describe the system and environment abstractions later in this tutorial.
+For this first example, the ``*`` symbol denotes that this test is valid for any system or environment.
+A :class:`~reframe.core.pipeline.RunOnlyRegressionTest` must also define an :attr:`executable` to run.
+
+A test must also define a validation function which is decorated with the :func:`@sanity_function` decorator.
+This function will be used to validate the test's output after it is finished.
+ReFrame, by default, makes no assumption about whether a test is successful or not;
+it is the test's responsibility to define its validation.
+The framework provides a rich set of :doc:`utility functions ` that help match patterns and extract values from the test's output.
+The :attr:`stdout` here refers to the name of the file where the test's standard output is stored.
+
+Finally, a test may optionally define a set of *performance functions* that will extract *figures of merit* for the test.
+These are simple test methods decorated with the :func:`@performance_function ` decorator that return the figure of merit.
+In this example, we extract the ``Copy`` and ``Triad`` bandwidth values and convert them to ``float``.
+These figures of merit or *performance variables* as they are called in ReFrame's nomenclature have a special treatment:
+they are logged in the test's performance logs and a reference value per system may also be assigned to them.
+If that reference value is not met within some user-defined thresholds, the test will fail.
+
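The extraction that these sanity and performance functions perform can be illustrated in plain Python. The sketch below uses a hypothetical excerpt of STREAM output and the standard ``re`` module in place of ReFrame's sanity utilities; it is meant only to show the kind of pattern matching involved:

```python
import re

# Hypothetical excerpt of STREAM output; real numbers vary per run
stdout = """\
Function    Best Rate MB/s  Avg time     Min time     Max time
Copy:           19538.4     0.008300     0.008189     0.008416
Triad:          14883.4     0.016300     0.016124     0.016531
Solution Validates: avg error less than 1.000000e-13 on all three arrays
"""

# Sanity: the benchmark must report a validated solution
assert re.search(r'Solution Validates', stdout)

# Figures of merit: extract the bandwidth values and convert them to float
copy_bw = float(re.search(r'Copy:\s+(\S+)', stdout).group(1))
triad_bw = float(re.search(r'Triad:\s+(\S+)', stdout).group(1))
print(copy_bw, triad_bw)  # 19538.4 14883.4
```

In the actual test, ReFrame's sanity utilities perform equivalent matching lazily on the file behind :attr:`stdout`.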
+
+Running a test
+==============
+
+Running our test is very straightforward:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ cd reframe-examples/tutorial
+ reframe -c stream/stream_runonly.py -r
+
+
+The :option:`-c` option defines the *check path* and it can be specified multiple times.
+It specifies the locations, directories or files, where ReFrame will try to look for tests.
+In this case, we simply pass the path to our test file.
+The :option:`-r` option instructs ReFrame to run the selected tests:
+
+.. code-block:: console
+
+ [ReFrame Setup]
+ version: 4.5.0-dev.1
+ command: '/usr/local/share/reframe/bin/reframe -c stream/stream_runonly.py -r'
+ launched by: user@myhost
+ working directory: '/home/user'
+ settings files: ''
+ check search path: '/home/user/reframe-examples/tutorial/stream/stream_runonly.py'
+ stage directory: '/home/user/stage'
+ output directory: '/home/user/output'
+ log files: '/tmp/rfm-mzynqhye.log'
+
+ [==========] Running 1 check(s)
+ [==========] Started on Mon Nov 27 20:55:17 2023
+
+ [----------] start processing checks
+ [ RUN ] stream_test /2e15a047 @generic:default+builtin
+ [ OK ] (1/1) stream_test /2e15a047 @generic:default+builtin
+ P: copy_bw: 19538.4 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 14883.4 MB/s (r:0, l:None, u:None)
+ [----------] all spawned checks have finished
+
+ [ PASSED ] Ran 1/1 test case(s) from 1 check(s) (0 failure(s), 0 skipped, 0 aborted)
+ [==========] Finished on Mon Nov 27 20:55:25 2023
+ Log file(s) saved in '/tmp/rfm-mzynqhye.log'
+
+
+The verbosity of the output can be increased using the :option:`-v` option or decreased using the :option:`-q` option.
+By default, a log file is generated in the system's temporary directory that contains detailed debug information.
+The :ref:`logging` section describes how logging can be configured in more detail.
+Once a performance test finishes, its figures of merit are printed immediately using the ``P:`` prefix.
+This can be suppressed by increasing the level at which this information is logged using the :envvar:`RFM_PERF_INFO_LEVEL` environment variable.
+
+.. _run-reports-and-performance-logging:
+
+Run reports and performance logging
+-----------------------------------
+
+Once a test session finishes, ReFrame generates a detailed JSON report under ``$HOME/.reframe/reports``.
+Every time ReFrame runs, a new report is generated automatically.
+The latest one is always symlinked as ``latest.json``, unless the :option:`--report-file` option is given.
+
+For performance tests, in particular, an additional CSV file is generated with all the relevant information.
+These files are located by default under ``perflogs/<system>/<partition>/<testname>.log``.
+In our example, this translates to ``perflogs/generic/default/stream_test.log``.
+The information that is being logged is fully configurable and we will cover this in the :ref:`logging` section.
+
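As a small plain-Python illustration of this layout (the path components come from the example above; ReFrame composes this path internally):

```python
import posixpath

# System, partition and test name from the run above
system, partition, testname = 'generic', 'default', 'stream_test'

# Default per-test performance log location
perflog = posixpath.join('perflogs', system, partition, f'{testname}.log')
print(perflog)  # perflogs/generic/default/stream_test.log
```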
+Finally, you can also use the :option:`--performance-report` option, which prints a summary of the results of the performance tests that have run in the current session.
+
+.. code-block:: console
+
+ [stream_test /2e15a047 @generic:default:builtin]
+ num_tasks: 1
+ performance:
+ - copy_bw: 22704.4 MB/s (r: 0 MB/s l: -inf% u: +inf%)
+ - triad_bw: 16040.9 MB/s (r: 0 MB/s l: -inf% u: +inf%)
+
+
+Inspecting the test artifacts
+-----------------------------
+
+When ReFrame executes tests, it first copies over all of the test resources (if any) to a *stage directory*, from which it executes the test.
+Upon successful execution, the test artifacts will be copied over to the *output directory* for archiving.
+The default artifacts for every test are the generated test script as well as the test's standard output and standard error.
+The default locations of the stage and output directories are ``./stage`` and ``./output``.
+These can be changed with the :option:`--stage` and :option:`--output` options or the more general :option:`--prefix` option.
+The test artifacts of our first example can be found in the following location:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ ls output/generic/default/builtin/stream_test/
+
+.. code-block:: console
+
+ rfm_job.err rfm_job.out rfm_job.sh
+
+The ``rfm_job.sh`` is the actual test script that was generated and executed and, as you can see, it was pretty simple for this case:
+
+.. code-block:: bash
+
+ #!/bin/bash
+ stream.x
+
+Inspecting test failures
+-------------------------
+
+When a test fails, ReFrame will not move its artifacts to the output directory and will keep everything inside the stage directory.
+For each failed test, a summary is printed at the end, containing details about the reason for the failure and the location of the test's stage directory.
+Here is an example failure that we induced artificially by changing the validation regular expression:
+
+.. code-block:: console
+
+ FAILURE INFO for stream_test (run: 1/1)
+ * Description:
+ * System partition: generic:default
+ * Environment: builtin
+ * Stage directory: /home/user/stage/generic/default/builtin/stream_test
+ * Node list: myhost
+ * Job type: local (id=19)
+ * Dependencies (conceptual): []
+ * Dependencies (actual): []
+ * Maintainers: []
+ * Failing phase: sanity
+ * Rerun with '-n /2e15a047 -p builtin --system generic:default -r'
+ * Reason: sanity error: pattern 'Slution Validates' not found in 'rfm_job.out'
+ --- rfm_job.out (first 10 lines) ---
+ -------------------------------------------------------------
+ STREAM version $Revision: 5.10 $
+ -------------------------------------------------------------
+ This system uses 8 bytes per array element.
+ -------------------------------------------------------------
+ Array size = 100000000 (elements), Offset = 0 (elements)
+ Memory per array = 762.9 MiB (= 0.7 GiB).
+ Total memory required = 2288.8 MiB (= 2.2 GiB).
+ Each kernel will be executed 10 times.
+ The *best* time for each kernel (excluding the first iteration)
+ --- rfm_job.out ---
+ --- rfm_job.err (first 10 lines) ---
+ --- rfm_job.err ---
+
+
+Adding performance references
+-----------------------------
+
+For each performance variable defined in the test, we can add a reference value and set thresholds of acceptable variations.
+Here is an example for our STREAM benchmark:
+
+.. code-block:: python
+
+ @rfm.simple_test
+ class stream_test(rfm.RunOnlyRegressionTest):
+ ...
+ reference = {
+ 'myhost:baseline': {
+ 'copy_bw': (23_890, -0.10, 0.30, 'MB/s'),
+ 'triad_bw': (17_064, -0.05, 0.50, 'MB/s'),
+ }
+ }
+
+The :attr:`reference` test variable is a multi-level dictionary that defines the expected performance for each of the test's performance variables on all supported systems.
+It is not necessary that all performance variables and all systems have a reference.
+If a reference value is not found, then the obtained performance will be logged, but no performance validation will be performed.
+The reference value is essentially a three- or four-element tuple of the form ``(target_perf, lower_thres, upper_thres, unit)``.
+The ``unit`` is optional, as it is already defined in the :func:`@performance_function ` definitions.
+The lower and upper thresholds are deviations from the target reference expressed as fractional numbers.
+In our example, we allow the ``copy_bw`` to be 10% lower than the target reference and no more than 30% higher.
+Sometimes, especially in microbenchmarks, it is a good practice to set an upper threshold to denote the absolute maximum that cannot be exceeded.
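To make the threshold semantics concrete, here is a small plain-Python sketch; the actual checking is done internally by ReFrame, and the numbers are the ``copy_bw`` reference and a measured value from the examples above:

```python
def perf_window(target, lower_frac, upper_frac):
    """Compute the accepted [min, max] performance window.

    The thresholds are fractional deviations from the target,
    e.g. -0.10 allows results down to 10% below the target.
    """
    return target * (1 + lower_frac), target * (1 + upper_frac)

# Reference for copy_bw: (23_890, -0.10, 0.30, 'MB/s')
lo, hi = perf_window(23_890, -0.10, 0.30)
print(round(lo, 1), round(hi, 1))  # 21501.0 31057.0

# An obtained value passes if it falls within the window
obtained = 22_704.4
assert lo <= obtained <= hi
```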
+
+
+Dry-run mode
+------------
+
+ReFrame provides also a dry-run mode for the tests, which can be enabled by passing :option:`--dry-run` as the action option (instead of :option:`-r` that runs the tests).
+In this mode, ReFrame will generate the test script in the stage directory, but it will not run the test, nor will it perform sanity or performance checking or attempt to extract any of the figures of merit.
+Tests can also modify their behaviour if run in dry-run mode by calling the :meth:`is_dry_run` method.
+Here is an example dry-run of our first version of the STREAM benchmark:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -c stream/stream_runonly.py --dry-run
+
+.. code-block:: console
+
+ [==========] Running 1 check(s)
+ [==========] Started on Wed Jan 10 22:45:49 2024+0000
+
+ [----------] start processing checks
+ [ DRY ] stream_test /2e15a047 @generic:default+builtin
+ [ OK ] (1/1) stream_test /2e15a047 @generic:default+builtin
+ [----------] all spawned checks have finished
+
+ [ PASSED ] Ran 1/1 test case(s) from 1 check(s) (0 failure(s), 0 skipped, 0 aborted)
+ [==========] Finished on Wed Jan 10 22:45:49 2024+0000
+
+
+Note that the ``RUN`` message is replaced by ``DRY`` in the dry-run mode.
+You can also check the generated test script in ``stage/generic/default/builtin/stream_test/rfm_job.sh``.
+
+
+Systems and environments
+========================
+
+The first version of our STREAM test assumes that the environment it runs in already provides the benchmark executable.
+This is totally fine if we want to run a set of run-only tests in a single environment, but it can become a maintenance burden if we want to run our test on different systems and environments.
+In this section, we will introduce how we can define system and environment configurations in ReFrame and match them to tests.
+
+For ReFrame, a *system* is an abstraction of an HPC system that is managed by a workload manager.
+A system can comprise multiple *partitions*, which are collections of nodes with similar characteristics.
+How to define the system partitions is entirely up to the user.
+
+An *environment* is an abstraction of the environment in which a test runs: a collection of environment variables, environment modules and compiler definitions.
+The following picture depicts this architecture.
+
+.. figure:: _static/img/reframe-system-arch.svg
+ :align: center
+
+ :sub:`ReFrame's system architecture`
+
+Tests are associated with systems and environments through their :attr:`valid_systems` and :attr:`valid_prog_environs` variables.
+
+Let's limit the scope of our test by making it require a specific environment, since running it requires an environment that provides STREAM.
+We could do that simply by setting the :attr:`valid_prog_environs` as follows:
+
+.. code-block:: python
+
+ self.valid_prog_environs = ['+stream']
+
+This tells ReFrame that this test is valid only for environments that define the ``stream`` feature.
+If we try to run the test now, nothing will be run:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -c stream/stream_runonly.py -r
+
+
+.. code-block:: console
+
+ [ PASSED ] Ran 0/0 test case(s) from 0 check(s) (0 failure(s), 0 skipped, 0 aborted)
+
+
+This happens because ReFrame by default defines a generic system and environment.
+You may have noticed in our first run the ``@generic:default+builtin`` notation printed after the test name.
+This is the system partition (``generic:default``) and the environment (``builtin``) in which the test runs.
+The ``generic`` system and the ``builtin`` environment come predefined in ReFrame.
+They make the minimum possible assumptions:
+
+- The ``generic`` system defines a single partition, named ``default``, which launches test jobs locally.
+- The ``builtin`` environment assumes only that the ``cc`` compiler is available.
+
+.. note::
+ ReFrame will not complain if a compiler is not installed until your test tries to build something.
+
+
+Let's define our own system and baseline environment in a ReFrame configuration file (``reframe-examples/tutorial/config/baseline.py``):
+
+.. literalinclude:: ../examples/tutorial/config/baseline.py
+ :caption:
+ :lines: 5-
+
+This configuration defines a system named ``tutorialsys`` with a single partition named ``default`` and an environment named ``baseline``.
+Let's look at some key elements of the configuration:
+
+* Each system, partition and environment requires a unique name.
+ The name must contain only alphanumeric characters, underscores or dashes.
+* The :attr:`~config.systems.hostnames` option defines a set of hostname patterns which ReFrame will try to match against the current system's hostname.
+ The first matching system will become the current system and ReFrame will load the corresponding configuration.
+* The :attr:`~config.systems.partitions.scheduler` partition option defines the job scheduler backend to use on this partition.
+ ReFrame supports many `job schedulers `__.
+ The ``local`` scheduler that we use here is the simplest one and it practically spawns a process executing the generated test script.
+* The :attr:`~config.systems.partitions.launcher` partition option defines the parallel launcher to use for spawning parallel programs.
+ ReFrame supports all the major `parallel launchers `__.
+* The :attr:`~config.systems.partitions.environs` partition option is a list of environments to test on this partition.
+ Their definitions are resolved in the :attr:`~config.environments` section.
+* Every partition and environment can define a set of arbitrary features or key/value pairs in the :attr:`~config.environments.features` and :attr:`~config.environments.extras` options respectively.
+ ReFrame will try to match system partitions and environments to a test based on the test's specification in :attr:`valid_systems` and :attr:`valid_prog_environs`.
+
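+The hostname-based system auto-detection described above can be sketched in plain Python.
+This is only an illustrative model, not ReFrame's actual implementation, and the system names and hostname patterns below are hypothetical:
+
+.. code-block:: python
+
+   import re
+
+   # Hypothetical configuration excerpt: each system lists hostname patterns.
+   systems = [
+       {'name': 'tutorialsys', 'hostnames': ['myhost', r'myhost\d+']},
+       {'name': 'generic', 'hostnames': ['.*']},   # builtin catch-all system
+   ]
+
+   def detect_system(hostname, systems):
+       '''Return the name of the first system whose patterns match.'''
+       for system in systems:
+           if any(re.match(patt, hostname) for patt in system['hostnames']):
+               return system['name']
+       return None
+
+   print(detect_system('myhost042', systems))   # 'tutorialsys'
+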
+There are many options that can be defined for systems, partitions and environments.
+We will cover several of them as we go through the tutorial, but for the complete reference you should refer to :doc:`config_reference`.
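+
+For orientation, a minimal configuration along these lines might look as follows.
+This is an illustrative sketch; the names and values are assumptions and may differ from the actual ``baseline.py`` of the examples:
+
+.. code-block:: python
+
+   # Hypothetical minimal ReFrame configuration sketch
+   site_configuration = {
+       'systems': [
+           {
+               'name': 'tutorialsys',
+               'descr': 'Example tutorial system',
+               'hostnames': ['myhost'],
+               'partitions': [
+                   {
+                       'name': 'default',
+                       'descr': 'Example partition',
+                       'scheduler': 'local',   # run test jobs locally
+                       'launcher': 'local',
+                       'environs': ['baseline']
+                   }
+               ]
+           }
+       ],
+       'environments': [
+           {
+               'name': 'baseline',
+               'features': ['stream']   # matched against '+stream' constraints
+           }
+       ]
+   }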
+
+.. note::
+
+ ReFrame supports splitting the configuration in multiple files that can be loaded simultaneously.
+ In fact the builtin configuration is always loaded, therefore the ``generic`` system as well as the ``builtin`` environment are always defined.
+ Additionally, the builtin configuration provides a baseline logging configuration that should cover a wide range of use cases.
+ See :ref:`managing-the-configuration` for more details.
+
+
+Let's try running the constrained version of our STREAM test with the configuration file that we have just created:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline.py -c stream/stream_runonly.py -r
+
+.. code-block:: console
+
+ [ReFrame Setup]
+ version: 4.5.0-dev.1
+ command: '/usr/local/share/reframe/bin/reframe -C config/baseline.py -c stream/stream_runonly.py -r'
+ launched by: user@myhost
+ working directory: '/home/user'
+ settings files: '', 'reframe-examples/tutorial/config/baseline.py'
+ check search path: '/home/user/reframe-examples/tutorial/stream/stream_runonly.py'
+ stage directory: '/home/user/stage'
+ output directory: '/home/user/output'
+ log files: '/tmp/rfm-dz8m5nfz.log'
+
+ <...>
+
+ [----------] start processing checks
+ [ RUN ] stream_test /2e15a047 @tutorialsys:default+baseline
+ [ OK ] (1/1) stream_test /2e15a047 @tutorialsys:default+baseline
+ P: copy_bw: 23135.4 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 16600.5 MB/s (r:0, l:None, u:None)
+ [----------] all spawned checks have finished
+
+ [ PASSED ] Ran 1/1 test case(s) from 1 check(s) (0 failure(s), 0 skipped, 0 aborted)
+
+
+The :option:`-C` option specifies the configuration file that ReFrame should load.
+Note that ReFrame has loaded two configuration files: first the builtin one and then the one we supplied.
+
+Note also that the system and environment specification in the test run output is now ``@tutorialsys:default+baseline``.
+ReFrame has determined that the ``default`` partition and the ``baseline`` environment satisfy the test constraints and thus it has run the test with this partition/environment combination.
+
+
+.. _compiling-the-test-code:
+
+Compiling the test code
+=======================
+
+You can also use ReFrame to compile the test's code.
+To demonstrate this, we will write a different test version of the STREAM benchmark that will also compile the benchmark's source code.
+
+.. literalinclude:: ../examples/tutorial/stream/stream_build_run.py
+ :caption:
+ :pyobject: stream_build_test
+
+The key difference of this test is that it derives from the :class:`~reframe.core.pipeline.RegressionTest` instead of the :class:`~reframe.core.pipeline.RunOnlyRegressionTest` and it specifies how the test's code should be built.
+ReFrame uses a *build system* abstraction for building source code.
+Based on the build system backend used, it will emit the appropriate build instructions.
+All the major build systems are supported as well as the `EasyBuild `__ build automation tool and the `Spack `__ package manager.
+
+In this case, we use the :class:`~reframe.core.build_systems.SingleSource` build system, which is suitable for compiling a single source file.
+The :attr:`sourcepath` variable is used to specify the source file to compile.
+The path is relative to the test's *resource directory*.
+
+Test resources
+--------------
+
+The resource directory is a directory associated with the test, where its static resources are stored.
+During execution the contents of this directory will be copied to the test's stage directory and the test will execute from that directory.
+Here is the directory structure:
+
+.. code::
+
+ stream
+ ├── stream_build_run.py
+ └── src
+ └── stream.c
+
+By default, the test's resource directory is named ``src/`` and is located next to the test's file.
+It can be set to a different location inside the test using the :attr:`sourcesdir` variable.
+
+Pipeline hooks
+--------------
+
+The :func:`prepare_build` test method in our example is a *pipeline hook* that will execute just before the compilation phase and will set the compilation flags based on the current environment.
+Pipeline hooks are a fundamental tool in ReFrame for customizing the test execution.
+Let's explain the concept in more detail.
+
+When executed, every test in ReFrame goes through the following stages:
+(a) *setup*,
+(b) *compile*,
+(c) *run*,
+(d) *sanity*,
+(e) *performance* and
+(f) *cleanup*.
+This is the *test pipeline* and a test can assign arbitrary functions to run before or after any of these stages using the :func:`@run_before ` and :func:`@run_after ` decorators.
+There is also a pseudo-stage called *init* that denotes the instantiation/initialization of the test.
+The :doc:`pipeline` page describes in detail every stage, but the most important stages in terms of the test's lifetime are the "init" and the "setup" stages.
+
+The "init" stage is where the test object is actually instantiated and, for this reason, you cannot define pre-init hooks.
+At this stage, the system partition and the environment where the test will run are not yet determined, therefore the :attr:`current_partition` and :attr:`current_environ` variables are not set.
+These are set during the "setup" stage, by which point all the test's dependencies (if any) have also been executed and their resources can be safely accessed (we will cover test dependencies later in this tutorial).
+Technically, all pipeline hooks could be attached to those two stages, but it is good programming practice to attach them close to the stage that they manipulate, as this makes their intent clearer.
+
+For a detailed description of the pipeline hook API, you may refer to the :ref:`pipeline-hooks` guide.
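+
+The hook mechanism can be modeled in a few lines of plain Python.
+This simplified sketch is not ReFrame's implementation; it only illustrates how functions registered with ``run_before``/``run_after`` attach to pipeline stages:
+
+.. code-block:: python
+
+   # Simplified model: a registry maps (when, stage) -> list of hook functions.
+   _hooks = {}
+
+   def run_before(stage):
+       def deco(fn):
+           _hooks.setdefault(('pre', stage), []).append(fn)
+           return fn
+       return deco
+
+   def run_after(stage):
+       def deco(fn):
+           _hooks.setdefault(('post', stage), []).append(fn)
+           return fn
+       return deco
+
+   executed = []
+
+   @run_before('compile')
+   def prepare_build():
+       executed.append('prepare_build')
+
+   def run_pipeline():
+       for stage in ('setup', 'compile', 'run',
+                     'sanity', 'performance', 'cleanup'):
+           for fn in _hooks.get(('pre', stage), []):
+               fn()
+           executed.append(stage)
+           for fn in _hooks.get(('post', stage), []):
+               fn()
+
+   run_pipeline()
+   print(executed)   # 'prepare_build' runs just before 'compile'
+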
+
+Disabling pipeline hooks
+^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. versionadded:: 3.2
+
+Any pipeline hook can be disabled from the command line using the :option:`--disable-hook` command line option.
+This can be useful to temporarily disable a functionality of the test, e.g., a workaround.
+
+You can view the list of all the hooks of a test using the :option:`--describe` option:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_variables.py --describe | jq .[].pipeline_hooks
+
+.. code-block:: json
+
+ {
+ "post_setup": [
+ "set_executable"
+ ],
+ "pre_run": [
+ "set_num_threads"
+ ]
+ }
+
+
+We could disable the :obj:`set_num_threads` hook by passing ``--disable-hook=set_num_threads``:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_variables.py --disable-hook=set_num_threads --describe | jq .[].pipeline_hooks
+
+.. code-block:: json
+
+ {
+ "post_setup": [
+ "set_executable"
+ ]
+ }
+
+The :option:`--disable-hook` option can be passed multiple times to disable multiple hooks at the same time.
+
+
+Environment features and extras
+-------------------------------
+
+We have shown already in the first example the use of *features* in system partition and environment definitions in the configuration file.
+These can be used in the :attr:`valid_systems` and :attr:`valid_prog_environs` specifications to help ReFrame pick the right system/environment combinations to run the test.
+
+In addition to features, the configuration of a system partition or environment can include extra properties that can be accessed from the test and also be used as constraints in the :attr:`valid_systems` and :attr:`valid_prog_environs`.
+The following shows the use of extras in the ``baseline_environs.py`` file for defining the compiler flag that enables OpenMP compilation:
+
+.. literalinclude:: ../examples/tutorial/config/baseline_environs.py
+ :caption:
+ :lines: 23-42
+
+The :attr:`~config.environments.extras` option is a simple key/value dictionary, where the values can have any type; it is accessible in the test through the :attr:`current_environ` property, as shown in the example above.
+
+
+Execution policies
+------------------
+
+Having explained the key concepts behind compiled tests as well as the test pipeline, it's time to run our updated test.
+However, there is still a small tweak that we need to introduce.
+
+ReFrame executes tests concurrently.
+More precisely, the "compile" and "run" stages of a test execute asynchronously and ReFrame will schedule other tests for running.
+Once any of those stages finishes, it will resume execution of the test.
+However, this is problematic for our local benchmarks since ReFrame would schedule the GNU-based and the Clang-based tests concurrently and therefore the tests would exhibit lower performance.
+For this reason, we will force ReFrame to execute the tests serially with ``--exec-policy=serial``:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_build_run.py --exec-policy=serial -r
+
+.. code-block:: console
+
+ [----------] start processing checks
+ [ RUN ] stream_build_test /6c084d40 @tutorialsys:default+gnu-11.4.0
+ [ OK ] (1/2) stream_build_test /6c084d40 @tutorialsys:default+gnu-11.4.0
+ P: copy_bw: 22273.9 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 16492.8 MB/s (r:0, l:None, u:None)
+ [ RUN ] stream_build_test /6c084d40 @tutorialsys:default+clang-14.0.0
+ [ OK ] (2/2) stream_build_test /6c084d40 @tutorialsys:default+clang-14.0.0
+ P: copy_bw: 22747.9 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 16541.7 MB/s (r:0, l:None, u:None)
+ [----------] all spawned checks have finished
+
+ [ PASSED ] Ran 2/2 test case(s) from 1 check(s) (0 failure(s), 0 skipped, 0 aborted)
+
+
+.. _test-fixtures:
+
+Test fixtures
+=============
+
+Often a test needs some preparation to be done before it runs, but this preparation may need to be run only once per system partition or environment and not every time the test is run.
+A typical example is when we want to build the code of the test once per environment and reuse the executable in multiple different tests.
+We can achieve this in ReFrame using *test fixtures*.
+Test fixtures are normal ReFrame tests like any other, but they have a scope associated with them and can be fully accessed by the tests that define them.
+When test ``A`` is a fixture of test ``B``, then ``A`` will run before ``B`` and ``B`` will have access not only to anything that ``A`` produced, but also to all of its attributes.
+
+Let's see fixtures in practice by separating our compile-and-run STREAM version into two tests:
+a compile-only test that simply builds the benchmark and a run-only version that uses the former as a fixture.
+
+.. literalinclude:: ../examples/tutorial/stream/stream_fixtures.py
+ :caption:
+ :lines: 5-
+
+
+A test fixture is defined with the :func:`~reframe.core.builtins.fixture` builtin:
+
+.. literalinclude:: ../examples/tutorial/stream/stream_fixtures.py
+ :caption:
+ :lines: 25
+
+The first argument is a standard ReFrame test which encompasses the fixture logic and will be executed before the current test.
+Note that there is no need to decorate a fixture with :func:`@simple_test ` as it will run anyway as part of the test that is using it.
+You could still decorate it, though, if you would like to run it independently.
+
+Each fixture is associated with a scope, which determines when it will run.
+The following scopes are available:
+
+- ``session``: The fixture will run once for the whole run session.
+- ``partition``: The fixture will run once per system partition.
+- ``environment``: The fixture will run once per system partition and environment combination.
+- ``test``: The fixture will run every time that the calling test is run.
+
+Finally, the :func:`~reframe.core.builtins.fixture` builtin returns a handle which can be used to access the target test once it has finished.
+This can only be done after the "setup" stage of the current test.
+Any attribute of the target test can be accessed through the fixture handle and, in our example, we use the target test's :attr:`stagedir` to construct the final executable.
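+
+Conceptually, scopes deduplicate fixture executions: fixture instances that resolve to the same scope key share a single run.
+The following is a rough plain-Python model of this idea, not ReFrame's implementation; the partition and environment names are hypothetical:
+
+.. code-block:: python
+
+   # Each fixture instance is keyed by its scope; equal keys share one run.
+   def scope_key(scope, partition, environ):
+       if scope == 'session':
+           return ()
+       if scope == 'partition':
+           return (partition,)
+       if scope == 'environment':
+           return (partition, environ)
+       return (partition, environ, object())   # 'test': always a fresh key
+
+   cases = [('default', 'gnu-11.4.0'), ('default', 'clang-14.0.0')]
+
+   def count_runs(scope):
+       '''Number of distinct fixture executions for the given scope.'''
+       return len({scope_key(scope, p, e) for p, e in cases})
+
+   print(count_runs('environment'))   # 2: one build per environment
+   print(count_runs('session'))       # 1: shared across the whole session
+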
+
+.. note::
+
+ Compile-only tests do not require a validation check, since the test will fail anyway if the compilation fails.
+ But if one is provided, it will be used.
+
+Before running the new test, let's try to list it first:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_fixtures.py -l
+
+.. code-block:: console
+
+ [List of matched checks]
+ - stream_test /2e15a047
+ ^build_stream ~tutorialsys:default+gnu-11.4.0 'stream_binary /2ed36672
+ ^build_stream ~tutorialsys:default+clang-14.0.0 'stream_binary /d19d2d86
+ Found 1 check(s)
+
+We will describe the listing output in more detail later in this tutorial; for now it is enough to see that it gives us all the essential information about the test fixtures: their scope and the test variable that they are bound to.
+Note also that due to the ``environment`` scope, a separate fixture is created for every environment that will be tested.
+
+We can now run the benchmarks in parallel to demonstrate that the execution order of the fixtures is respected:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_fixtures.py -r
+
+.. code-block:: console
+
+ [----------] start processing checks
+ [ RUN ] build_stream ~tutorialsys:default+gnu-11.4.0 /2ed36672 @tutorialsys:default+gnu-11.4.0
+ [ RUN ] build_stream ~tutorialsys:default+clang-14.0.0 /d19d2d86 @tutorialsys:default+clang-14.0.0
+ [ OK ] (1/4) build_stream ~tutorialsys:default+gnu-11.4.0 /2ed36672 @tutorialsys:default+gnu-11.4.0
+ [ OK ] (2/4) build_stream ~tutorialsys:default+clang-14.0.0 /d19d2d86 @tutorialsys:default+clang-14.0.0
+ [ RUN ] stream_test /2e15a047 @tutorialsys:default+gnu-11.4.0
+ [ RUN ] stream_test /2e15a047 @tutorialsys:default+clang-14.0.0
+ [ OK ] (3/4) stream_test /2e15a047 @tutorialsys:default+gnu-11.4.0
+ P: copy_bw: 8182.4 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 9174.3 MB/s (r:0, l:None, u:None)
+ [ OK ] (4/4) stream_test /2e15a047 @tutorialsys:default+clang-14.0.0
+ P: copy_bw: 7974.4 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 18494.1 MB/s (r:0, l:None, u:None)
+ [----------] all spawned checks have finished
+
+ [ PASSED ] Ran 4/4 test case(s) from 3 check(s) (0 failure(s), 0 skipped, 0 aborted)
+
+Note that the two STREAM tests are still independent of each other, so they run in parallel, hence the lower performance.
+
+We will cover more aspects of the fixtures in the following sections, but you are advised to read the API docs of :func:`~reframe.core.builtins.fixture` for a detailed description of all their capabilities.
+
+
+Test variables
+==============
+
+Tests can define *variables* that can be set from the command line.
+These are essentially knobs that allow you to change the test's behaviour on-the-fly.
+All the test's pre-defined attributes that we have seen so far are defined as variables.
+A test variable is defined with the :func:`~reframe.core.builtins.variable` builtin.
+Let's augment our STREAM example by adding a variable to control the number of threads to use.
+
+.. literalinclude:: ../examples/tutorial/stream/stream_variables.py
+ :caption:
+ :lines: 5-
+
+We define a new test variable with the following line:
+
+.. literalinclude:: ../examples/tutorial/stream/stream_variables.py
+ :caption:
+ :lines: 26
+
+Variables are typed and any attempt to assign them a value of different type will cause a :class:`TypeError`.
+Variables can also have a default value as in this case, which is set to ``0``.
+If a variable is not given a value, it is considered undefined.
+Any attempt to read an undefined variable will cause an error.
+It is not necessary for a variable to be assigned a value along with its declaration;
+this can happen anytime before it is accessed.
+Variables are also inherited; this is why we can set the standard variables of a ReFrame test, such as :attr:`valid_systems`, :attr:`valid_prog_environs` etc., in our subclasses.
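+
+The type checking of variables can be illustrated with a small descriptor-like class in plain Python.
+This is a conceptual sketch of the behaviour described above, not the actual implementation of the :func:`~reframe.core.builtins.variable` builtin:
+
+.. code-block:: python
+
+   class TypedVariable:
+       '''Minimal model of a typed test variable with an optional default.'''
+
+       _UNDEFINED = object()
+
+       def __init__(self, typ, value=_UNDEFINED):
+           self.typ = typ
+           self.value = value
+
+       def set(self, value):
+           # Assigning a value of the wrong type raises a TypeError.
+           if not isinstance(value, self.typ):
+               raise TypeError(f'expected {self.typ.__name__}, '
+                               f'got {type(value).__name__}')
+           self.value = value
+
+       def get(self):
+           # Reading an undefined variable is an error.
+           if self.value is self._UNDEFINED:
+               raise AttributeError('variable is undefined')
+           return self.value
+
+   num_threads = TypedVariable(int, value=0)
+   num_threads.set(8)
+   print(num_threads.get())   # 8
+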
+
+Variables are accessed inside the test as normal class attributes.
+In our example, we use the :attr:`num_threads` variable to set the ``OMP_NUM_THREADS`` environment variable accordingly.
+
+.. literalinclude:: ../examples/tutorial/stream/stream_variables.py
+ :caption:
+ :lines: 32-35
+
+Variables can be set from the command-line using the :option:`-S` option as ``-S var=value``:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_variables.py -S num_threads=2 -r
+
+We will not list the command output here, but you could verify that the variable was set by inspecting the generated run script:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ cat output/tutorialsys/default/clang-14.0.0/stream_test/rfm_job.sh
+
+.. code-block:: bash
+
+ #!/bin/bash
+ export OMP_NUM_THREADS=2
+ /home/user/reframe-examples/tutorial/stage/tutorialsys/default/clang-14.0.0/build_stream_d19d2d86/stream.x
+
+Another thing to notice in the output is the following warning:
+
+.. code-block:: console
+
+ WARNING: test 'build_stream': the following variables were not set: 'num_threads'
+
+
+When setting a variable as ``-S var=value``, ReFrame will try to set it on all the selected tests, including any fixtures.
+If the requested variable is not part of the test, the above warning will be issued.
+You can scope the variable assignment on the command line by prefixing the variable name with the test's name, as follows: ``-S stream_test.num_threads=2``.
+In this case, the :attr:`num_threads` variable will be set only in the :class:`stream_test` test.
+
+
+Setting variables in fixtures
+-----------------------------
+
+As we have already mentioned, fixtures are normal ReFrame tests, so they can also define their own variables.
+In our example, it makes sense to define a variable in the :class:`build_stream` fixture to control the size of the arrays involved in the computation.
+Here is the updated :class:`build_stream` fixture:
+
+.. literalinclude:: ../examples/tutorial/stream/stream_variables_fixtures.py
+ :caption:
+ :pyobject: build_stream
+
+.. note::
+ The :attr:`~reframe.core.buildsystems.BuildSystem.cppflags` attribute of the build system refers to the preprocessor flags and not the C++ flags, which are :attr:`~reframe.core.buildsystems.BuildSystem.cxxflags` instead.
+
+We can set the :attr:`array_size` variable inside the build fixture of our final test through the fixture handle (remember that the fixture handle name is printed in the test listing).
+Here is an example:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_variables_fixtures.py --exec-policy=serial -S stream_test.stream_binary.array_size=50000000 -r
+
+If you check the generated build script, you will notice the emitted ``-D`` flag:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ cat output/tutorialsys/default/clang-14.0.0/build_stream_d19d2d86/rfm_build.sh
+
+.. code-block:: bash
+
+ #!/bin/bash
+
+ _onerror()
+ {
+ exitcode=$?
+ echo "-reframe: command \`$BASH_COMMAND' failed (exit code: $exitcode)"
+ exit $exitcode
+ }
+
+ trap _onerror ERR
+
+ clang -DARRAY_SIZE=50000000 -O3 -fopenmp stream.c -o ./stream.x
+
+
+
+Test parameterization
+=====================
+
+It is often the case that we want to test different variants of the same test, such as varying the number of tasks in order to perform a scaling analysis on a parallel program.
+ReFrame offers a powerful multi-dimensional test parameterization mechanism that automatically generates variants of your tests with different parameter values.
+Let's elaborate on this using the STREAM example.
+Suppose we want to scale over the number of threads and also try different thread placements.
+Here is the updated parameterized :class:`stream_test`:
+
+.. literalinclude:: ../examples/tutorial/stream/stream_parameters.py
+ :caption:
+ :pyobject: stream_test
+
+Parameters are defined in ReFrame using the :func:`~reframe.core.builtins.parameter` builtin.
+This builtin simply takes a list of values for the parameter being defined.
+Each parameter is independent and defines a new dimension in the parameterization space.
+Parameters can also be inherited and filtered from base classes.
+For each point in the final parameterization space, ReFrame will instantiate a different test.
+In our example, we expect 12 :class:`stream_test` variants.
+Given that we have two valid programming environments and a build fixture with an environment scope, we expect ReFrame to generate and run 26 tests in total (including the fixtures):
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_parameters.py --exec-policy=serial -r
+
+.. code-block:: console
+
+ [----------] start processing checks
+ [ RUN ] build_stream ~tutorialsys:default+gnu-11.4.0 /2ed36672 @tutorialsys:default+gnu-11.4.0
+ [ OK ] ( 1/26) build_stream ~tutorialsys:default+gnu-11.4.0 /2ed36672 @tutorialsys:default+gnu-11.4.0
+ [ RUN ] build_stream ~tutorialsys:default+clang-14.0.0 /d19d2d86 @tutorialsys:default+clang-14.0.0
+ [ OK ] ( 2/26) build_stream ~tutorialsys:default+clang-14.0.0 /d19d2d86 @tutorialsys:default+clang-14.0.0
+ [ RUN ] stream_test %num_threads=8 %thread_placement=spread /3c8af82c @tutorialsys:default+gnu-11.4.0
+ [ OK ] ( 3/26) stream_test %num_threads=8 %thread_placement=spread /3c8af82c @tutorialsys:default+gnu-11.4.0
+ P: copy_bw: 24020.6 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 15453.1 MB/s (r:0, l:None, u:None)
+ <...omitted...>
+ [----------] all spawned checks have finished
+
+ [ PASSED ] Ran 26/26 test case(s) from 14 check(s) (0 failure(s), 0 skipped, 0 aborted)
+
+
+Note how the fixture mechanism of ReFrame prevents the recompilation of STREAM's source code for every test variant:
+the source code is compiled only once per toolchain.
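+
+The multi-dimensional parameterization space is effectively the Cartesian product of the individual parameters, which can be sketched with :py:func:`itertools.product`.
+The parameter values below are hypothetical; they merely illustrate how 12 variants can arise from two parameters:
+
+.. code-block:: python
+
+   import itertools
+
+   # Hypothetical parameter values; each parameter is one dimension.
+   num_threads = [1, 2, 4, 8]
+   thread_placement = ['spread', 'close', 'master']
+
+   # ReFrame instantiates one test per point of the product space.
+   variants = list(itertools.product(num_threads, thread_placement))
+   print(len(variants))   # 12 test variants
+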
+
+
+Parameterizing existing test variables
+--------------------------------------
+
+We can also parameterize a test on any of its existing variables directly from the command line using the :option:`-P` option.
+For example, we could parameterize the STREAM version in ``stream_variables_fixtures.py`` on :attr:`num_threads` as follows:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_variables_fixtures.py -P num_threads=1,2,4,8 --exec-policy=serial -r
+
+
+Parameterizing a fixture
+------------------------
+
+Fixtures can also be parameterized.
+In this case the tests that use them are also parameterized implicitly.
+Let's see an example by parameterizing the build fixture of the STREAM benchmark, adding a parameter for the element type (``float`` or ``double``):
+
+.. literalinclude:: ../examples/tutorial/stream/stream_parameters_fixtures.py
+ :caption:
+ :pyobject: build_stream
+
+As expected, parameters in fixtures are no different from parameters in a normal test.
+The difference is when you try to list/run the final :class:`stream_test`, where now we have twice as many variants:
+
+.. code-block:: bash
+ :caption: Run in the single-node container.
+
+ reframe -C config/baseline_environs.py -c stream/stream_parameters_fixtures.py -l
+
+
+.. code-block:: console
+
+ - stream_test %num_threads=8 %thread_placement=spread %stream_binary.elem_type=double /ffbd00f1
+ ^build_stream %elem_type=double ~tutorialsys:default+gnu-11.4.0 'stream_binary /099a4f75
+ ^build_stream %elem_type=double ~tutorialsys:default+clang-14.0.0 'stream_binary /7bd4e3bb
+ <...omitted...>
+ - stream_test %num_threads=1 %thread_placement=close %stream_binary.elem_type=float /bc1f32c2
+ ^build_stream %elem_type=float ~tutorialsys:default+gnu-11.4.0 'stream_binary /2ed36672
+ ^build_stream %elem_type=float ~tutorialsys:default+clang-14.0.0 'stream_binary /d19d2d86
+ Found 24 check(s)
+
+Note that the test variant name now contains the parameter coming from the fixture.
+In total, 52 test cases (24 tests x 2 environments + 2 fixtures x 2 environments) will be run from this simple combination of parameterized tests!
+
+
+Pruning the parameterization space
+----------------------------------
+
+Sometimes parameters are not independent of each other and, as a result, some parameter combinations may be invalid for the test at hand.
+There are two ways to overcome this:
+
+a. Skip the test if the parameter combination is invalid.
+b. Use *parameter packs*.
+
+Let's see those two methods in practice with a fictitious test.
+The first method defines two parameters and uses the :func:`skip_if` test method to skip the test if the two parameters have the same value.
+The test will be skipped just after it is initialized and the message supplied will be printed as a warning.
+
+.. literalinclude:: ../examples/tutorial/dummy/params.py
+ :caption:
+ :pyobject: echo_test_v0
+
+The second method uses a single parameter that packs the valid combinations of the ``x`` and ``y`` parameters.
+Note that we also use the optional ``fmt`` argument to provide a more compact formatting for the combined parameter.
+Instead of the skip hook, we simply unpack the combined parameters in the :func:`set_executable_opts` hook.
+
+.. literalinclude:: ../examples/tutorial/dummy/params.py
+ :caption:
+ :pyobject: echo_test_v1
+
+The advantage of using parameter packs instead of explicitly skipping the test is that we do not get a warning message and the test is more compact.
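+
+The difference between the two approaches can be sketched in plain Python: filtering the full cross product versus enumerating only the valid packed combinations.
+The parameter values are hypothetical and the ``x == y`` invalidity condition mirrors the fictitious test above:
+
+.. code-block:: python
+
+   import itertools
+
+   values = [0, 1, 2]
+
+   # Method (a): full cross product, then skip invalid combinations (x == y).
+   skipped = [(x, y) for x, y in itertools.product(values, values) if x == y]
+   valid_a = [(x, y) for x, y in itertools.product(values, values) if x != y]
+
+   # Method (b): a single "parameter pack" listing only valid combinations.
+   xy = [(x, y) for x in values for y in values if x != y]
+
+   print(len(valid_a), len(skipped))   # 6 valid, 3 skipped
+   assert valid_a == xy                # both methods cover the same points
+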
+
+.. note::
+
+ In these tests, we also introduced two more utility functions used in sanity checking: :func:`~reframe.utility.sanity.and_`, which performs a logical AND of its arguments, and :func:`~reframe.utility.sanity.assert_eq`, which asserts that both its arguments are equal.
+ We could have simply written ``return x == self.x and y == self.y`` and the test would still validate, but the utility functions provide more context in case of validation errors.
+ In fact, we could also provide a custom message to be printed in case of errors, which can be helpful in real case scenarios.
+
+
+Mastering sanity and performance checking
+=========================================
+
+The sanity and performance checking in our STREAM example are simple, but they do represent the most commonly used patterns.
+There are cases, however, where we need more elaborate sanity checking or where extracting the performance measure is not so straightforward.
+The sanity and performance functions (see :func:`@sanity_function ` and :func:`@performance_function `) allow us to write arbitrary code to perform the task at hand, but there are a couple of things to keep in mind:
+
+- Both sanity and performance functions execute from the test's stage directory.
+ All relative paths will be resolved against it.
+- A sanity function must return a boolean or raise a :class:`~reframe.core.exceptions.SanityError` with a message.
+ Raising a :class:`~reframe.core.exceptions.SanityError` is the preferred way to denote a sanity error, and this is exactly what the utility :doc:`sanity functions ` do.
+- A performance function must return the value of the extracted figure of merit or raise a :class:`~reframe.core.exceptions.SanityError` in case this is not possible.
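+
+The kind of extraction a performance function typically performs can be sketched in plain Python: pull a figure of merit out of the benchmark's output with a regular expression and raise an error when it cannot be found.
+The ``SanityError`` class and the output format below are stand-ins modeled on STREAM's output, not the real framework types:
+
+.. code-block:: python
+
+   import re
+
+   class SanityError(Exception):
+       '''Stand-in for reframe.core.exceptions.SanityError.'''
+
+   sample_stdout = '''\
+   Function    Best Rate MB/s
+   Copy:           23135.4
+   Triad:          16600.5
+   '''
+
+   def extract_bw(kind, output):
+       '''Return the bandwidth reported for `kind`, e.g. 'Copy'.'''
+       match = re.search(rf'{kind}:\s+(\S+)', output)
+       if match is None:
+           raise SanityError(f'could not extract {kind} bandwidth')
+       return float(match.group(1))
+
+   print(extract_bw('Copy', sample_stdout))   # 23135.4
+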
+
+
+Understanding the builtin sanity functions
+------------------------------------------
+
+All the utility functions provided by the framework for sanity checking and the :attr:`stdout` and :attr:`stderr` test attributes are lazily evaluated:
+when you call these functions or access these attributes, you are not getting their final value, but instead a special object, named *deferred expression*, which is similar in concept to a *future* or *promise*.
+You can include these objects in arbitrary expressions and a new deferred expression will be produced.
+In fact, both sanity and performance functions may return a deferred expression instead of a final value, and this is exactly what our STREAM sanity and performance functions return.
+
+A deferred expression can be evaluated explicitly by calling its :func:`evaluate` method or by passing it to the :func:`~reframe.core.utility.sanity.evaluate` utility function.
+For example, to retrieve the actual :attr:`stdout` value, we should do ``self.stdout.evaluate()`` or ``sn.evaluate(self.stdout)``.
+Deferred expressions are evaluated implicitly in the following situations:
+
+1. When trying to iterate over them in a ``for`` loop.
+2. When trying to include them in an ``if`` expression.
+3. When calling :func:`str` on them.
+
+The ":doc:`deferrables`" page contains details about the underlying mechanism of deferred expressions and gives also concrete examples.
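+
+The essence of deferred expressions can be modeled with a tiny wrapper class.
+This conceptual sketch is far simpler than ReFrame's actual machinery, but it shows why comparing a deferred value yields a new deferred expression rather than a boolean:
+
+.. code-block:: python
+
+   class Deferred:
+       '''A value computed only when `evaluate()` is called.'''
+
+       def __init__(self, fn):
+           self.fn = fn
+
+       def evaluate(self):
+           return self.fn()
+
+       def __eq__(self, other):
+           # Comparison builds a *new* deferred expression, not a boolean.
+           return Deferred(lambda: self.evaluate() == other)
+
+   stdout = Deferred(lambda: 'Solution is valid')
+
+   expr = (stdout == 'Solution is valid')   # still deferred
+   print(type(expr).__name__)               # Deferred
+   print(expr.evaluate())                   # True
+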
+
+.. tip::
+
+ If you are in doubt about the evaluation of a deferred expression, always call :func:`evaluate()` on it.
+ At the point where the test's :func:`@sanity_function ` is called, all test's attributes are safe to access.
+
+
+.. note::
+
+ Why deferred expressions?
+
+ In ReFrame versions prior to 3.7, sanity and performance checking were defined by setting the :attr:`sanity_patterns` and :attr:`perf_patterns` attributes at test initialization.
+ In this case, a lazily evaluated expression was necessary since the test has not yet been executed.
+ The use of :attr:`sanity_patterns` and :attr:`perf_patterns` attributes is still valid today, but it may be deprecated in the future.
+
+
+
+Interacting with workload managers
+==================================
+
+ReFrame integrates with many HPC workload managers (batch job schedulers), including Slurm, PBS Pro, Torque and others.
+The complete list of scheduler backends can be found `here `__.
+Tests in ReFrame are scheduler-agnostic in that they do not need to include any scheduler-specific information.
+Instead, schedulers are associated with system partitions.
+Each system partition in the configuration file defines the scheduler backend to use along with any scheduler-specific options that are needed to grant access to the desired nodes.
+
+HPC systems also come with parallel program launchers which are responsible for launching parallel programs onto multiple nodes.
+ReFrame supports all major `parallel launchers `__ and allows users to easily define their own custom ones.
+Similarly to the batch job schedulers, each system partition is associated with a parallel launcher, which will be used to launch the test's :attr:`executable`.
+
+In the following, we define a configuration for the Slurm-based pseudo cluster of the tutorial.
+We will focus only on the new system configuration as the rest of the configuration remains the same.
+
+.. literalinclude:: ../examples/tutorial/config/cluster.py
+ :caption:
+ :lines: 22-43
+
+We define two partitions, one named ``login`` where we are running tests locally (emulating the login nodes of an HPC cluster) and another one named ``compute`` (emulating the compute nodes of an HPC cluster), where we will be submitting test jobs with Slurm and ``srun``.
+We use the ``squeue`` scheduler backend, which instructs ReFrame to query the job state with the ``squeue`` command, because our Slurm installation does not have job accounting enabled.
+If your Slurm installation has job accounting enabled, you should prefer the ``slurm`` backend, which uses ``sacct`` to retrieve the job state more reliably.
+
+Another important parameter is :attr:`~config.systems.partitions.access`, which denotes the job scheduler options needed to access the desired nodes.
+In our example, it is redundant to define it, since the ``all`` partition is the default, but in most real cases you will have to define the :attr:`~config.systems.partitions.access` options.
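+
+As an illustrative sketch (the exact options are site-specific), a partition entry that explicitly requests the ``all`` Slurm partition through :attr:`~config.systems.partitions.access` could look as follows; the options are passed verbatim to the scheduler and end up as ``#SBATCH -p all`` in the generated job script:
+
+.. code-block:: python
+
+   # Hypothetical partition fragment; values mirror the tutorial's cluster
+   partition = {
+       'name': 'compute',
+       'scheduler': 'squeue',
+       'launcher': 'srun',
+       'access': ['-p all'],
+       'environs': ['gnu', 'clang']
+   }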
+
+Let's run our STREAM example with the new configuration:
+
+.. code-block:: bash
+ :caption: Run with the Docker compose setup.
+
+ reframe --prefix=/scratch/rfm-stage/ -C config/cluster.py -c stream/stream_variables_fixtures.py -r
+
+.. code-block:: console
+
+ [----------] start processing checks
+ [ RUN ] build_stream ~pseudo-cluster:login+gnu /c5e9e6a0 @pseudo-cluster:login+gnu
+ [ RUN ] build_stream ~pseudo-cluster:login+clang /d0622327 @pseudo-cluster:login+clang
+ [ RUN ] build_stream ~pseudo-cluster:compute+gnu /3f5dbfe2 @pseudo-cluster:compute+gnu
+ [ RUN ] build_stream ~pseudo-cluster:compute+clang /78c4801e @pseudo-cluster:compute+clang
+ [ OK ] (1/8) build_stream ~pseudo-cluster:login+gnu /c5e9e6a0 @pseudo-cluster:login+gnu
+ [ OK ] (2/8) build_stream ~pseudo-cluster:login+clang /d0622327 @pseudo-cluster:login+clang
+ [ OK ] (3/8) build_stream ~pseudo-cluster:compute+gnu /3f5dbfe2 @pseudo-cluster:compute+gnu
+ [ OK ] (4/8) build_stream ~pseudo-cluster:compute+clang /78c4801e @pseudo-cluster:compute+clang
+ [ RUN ] stream_test /2e15a047 @pseudo-cluster:login+gnu
+ [ RUN ] stream_test /2e15a047 @pseudo-cluster:login+clang
+ [ RUN ] stream_test /2e15a047 @pseudo-cluster:compute+gnu
+ [ RUN ] stream_test /2e15a047 @pseudo-cluster:compute+clang
+ [ OK ] (5/8) stream_test /2e15a047 @pseudo-cluster:login+gnu
+ P: copy_bw: 9062.2 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 8344.9 MB/s (r:0, l:None, u:None)
+ [ OK ] (6/8) stream_test /2e15a047 @pseudo-cluster:login+clang
+ P: copy_bw: 25823.0 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 12732.2 MB/s (r:0, l:None, u:None)
+ [ OK ] (7/8) stream_test /2e15a047 @pseudo-cluster:compute+clang
+ P: copy_bw: 11215.5 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 7960.5 MB/s (r:0, l:None, u:None)
+ [ OK ] (8/8) stream_test /2e15a047 @pseudo-cluster:compute+gnu
+ P: copy_bw: 10300.7 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 9647.1 MB/s (r:0, l:None, u:None)
+ [----------] all spawned checks have finished
+
+ [ PASSED ] Ran 8/8 test case(s) from 5 check(s) (0 failure(s), 0 skipped, 0 aborted)
+
+
+Note how the test runs on every partition and environment combination.
+For the ``login`` partition the generated script is the same as for the local execution, whereas for the ``compute`` partition ReFrame generates a job script, which it submits with ``sbatch``:
+
+.. code-block:: bash
+ :caption: Run with the Docker compose setup.
+
+ cat /scratch/rfm-stage/output/pseudo-cluster/compute/gnu/stream_test/rfm_job.sh
+
+.. code-block:: bash
+
+ #!/bin/bash
+ #SBATCH --job-name="rfm_stream_test"
+ #SBATCH --ntasks=1
+ #SBATCH --output=rfm_job.out
+ #SBATCH --error=rfm_job.err
+ #SBATCH -p all
+ srun /scratch/rfm-stage/stage/pseudo-cluster/compute/gnu/build_stream_3f5dbfe2/stream.x
+
+
+You may have noticed that we use the :option:`--prefix` option when running ReFrame this time.
+This option changes the prefix of the stage and output directories.
+All scheduler backends except ``ssh`` require the test's stage directory to be shared between the local and remote nodes, so we set it to point under the shared ``/scratch`` volume.
+
+.. note::
+ For running the Slurm-based examples, make sure to follow the instructions in :ref:`multi-node-setup` for bringing up and accessing this cluster.
+
+
+Selecting specific partitions or environments to run
+----------------------------------------------------
+
+ReFrame can generate many test cases if you have many partitions and environments, so you will most likely need to narrow down the test space.
+You could use the :option:`--system` and :option:`-p` options to restrict a test to a single partition and/or a single environment.
+To run only the GCC tests on the compute partition you could do the following:
+
+.. code-block:: bash
+ :caption: Run with the Docker compose setup.
+
+ reframe --prefix=/scratch/rfm-stage/ -C config/cluster.py -c stream/stream_variables_fixtures.py \
+ --system=pseudo-cluster:compute -p gnu -r
+
+
+Compiling remotely
+------------------
+
+By default, ReFrame compiles the test's source code locally on the node where it runs.
+This may be problematic in cases where cross-compilation is not possible and the test's code needs to be compiled on the remote nodes.
+Remote compilation can be enabled by setting the test's :attr:`build_locally` attribute to :obj:`False`, e.g., with ``-S build_locally=0``.
+In this case, ReFrame will generate a job script also for the compilation job and submit it for execution.
+
+Passing additional scheduler options
+------------------------------------
+
+There are two ways to pass additional options to the backend scheduler: either by modifying the :attr:`~reframe.core.schedulers.Job` instance associated with the test or by defining an extra resource in the partition configuration and having the test request it.
+Let's see both methods:
+
+Modifying the test job's options
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+This method is quite straightforward:
+you simply need to define a pre-run hook and set ``self.job.options``.
+For example, to pass the ``--mem`` Slurm option to the job submitted by the test, you could do the following:
+
+.. code-block:: python
+
+ @run_before('run')
+ def set_mem_constraint(self):
+ self.job.options = ['--mem=1000']
+
+The advantage of this method is its simplicity, but it injects system-specific information into the test, tying it to the system's scheduler.
+You could make the test more robust, however, by restricting it to system partitions with Slurm by setting the :attr:`valid_systems` accordingly:
+
+.. code-block:: python
+
+ valid_systems = [r'%scheduler=slurm', r'%scheduler=squeue']
+
+
+Defining extra scheduler resources
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+This method comprises two steps.
+First, we need to define a `resource `__ in the partition configuration:
+
+.. literalinclude:: ../examples/tutorial/config/cluster_resources.py
+ :caption:
+ :lines: 41-46
+
+Each resource has a name and a list of scheduler options that will be emitted in the job script when this resource is requested by the test.
+The scheduler options specification can contain placeholders that will be filled from the test.
+
+Now we can use this resource in the test by setting its :attr:`extra_resources`:
+
+.. code-block:: python
+
+ extra_resources = {
+ 'memory': {'size': '1000'}
+ }
+
+The advantage of this method is that it is completely scheduler-agnostic.
+If the system partition where the test runs does not define the resource, the request will be ignored.
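+
+The placeholder substitution behaves like Python's :func:`str.format`.
+The following standalone sketch assumes a resource named ``memory`` whose scheduler option is ``--mem={size}``; it is an illustration of the mechanism, not ReFrame's internal code:
+
+.. code-block:: python
+
+   # Assumed resource definition from the partition configuration
+   partition_resources = {
+       'memory': ['--mem={size}']
+   }
+
+   # As set in the test's extra_resources
+   extra_resources = {
+       'memory': {'size': '1000'}
+   }
+
+   # Fill each option's placeholders from the test's request
+   emitted = [
+       opt.format(**extra_resources[name])
+       for name, opts in partition_resources.items()
+       if name in extra_resources
+       for opt in opts
+   ]
+   print(emitted)  # ['--mem=1000']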
+
+Both methods of setting additional job options are valid and you may use whichever of the two best fits your use case.
+
+
+Modifying the launch command
+----------------------------
+
+Sometimes it's useful to modify the launch command itself by prepending another program, such as a debugger or profiler.
+You can achieve this by setting the :attr:`~reframe.core.launchers.JobLauncher.modifier` and :attr:`~reframe.core.launchers.JobLauncher.modifier_options` of the test job's launcher:
+
+
+.. code-block:: python
+
+ @run_before('run')
+ def run_with_gdb(self):
+ self.job.launcher.modifier = 'gdb'
+ self.job.launcher.modifier_options = ['-batch', '-ex run', '-ex bt', '--args']
+
+
+
+Replacing the launch command
+----------------------------
+
+Sometimes you may want to completely replace the launcher associated with the partition where the test will run.
+You can do that with the following hook:
+
+.. code-block:: python
+
+ from reframe.core.backends import getlauncher
+ ...
+
+ @run_before('run')
+ def replace_launcher(self):
+ self.job.launcher = getlauncher('local')()
+
+The :func:`~reframe.core.backends.getlauncher` utility function returns the type that implements the launcher with the given name.
+The supported launcher names are those registered with the framework, i.e., all the names listed
+`here `__ as well as any :ref:`user-registered ` launcher.
+Once we have the launcher type, we instantiate it and replace the job's launcher.
+
+
+Multiple job steps
+------------------
+
+A job step is a command launched with the parallel launcher.
+ReFrame will only launch the :attr:`executable` as a job step.
+You can launch multiple job steps by leveraging the :attr:`prerun_cmds` or :attr:`postrun_cmds` test attributes.
+These are commands to be executed before or after the main :attr:`executable` and, normally, they are not job steps: they are simple Bash commands.
+However, you can use the :class:`~reframe.core.launchers.JobLauncher` API to emit the parallel launch command and convert them to job steps, as shown in the following example:
+
+.. literalinclude:: ../examples/tutorial/stream/stream_multistep.py
+ :caption:
+ :lines: 37-40
+
+Here we invoke the job launcher's :func:`~reframe.core.launchers.JobLauncher.run_command` method, which is responsible for emitting the launcher prefix based on the current partition.
+
+
+Generally, ReFrame generates the job shell scripts using the following pattern:
+
+.. code-block:: bash
+
+ #!/bin/bash -l
+ {job_scheduler_preamble}
+ {prepare_cmds}
+ {env_load_cmds}
+ {prerun_cmds}
+ {parallel_launcher} {executable} {executable_opts}
+ {postrun_cmds}
+
+The ``job_scheduler_preamble`` contains the backend job scheduler directives that control the job allocation.
+The ``prepare_cmds`` are commands that can be emitted before the test environment commands.
+These can be specified with the :attr:`~config.systems.partitions.prepare_cmds` partition configuration option.
+The ``env_load_cmds`` are the necessary commands for setting up the environment of the test.
+These include any modules or environment variables set at the `system partition level `__ or any `modules `__ or `environment variables `__ set at the test level.
+Then the commands specified in :attr:`prerun_cmds` follow, while those specified in the :attr:`postrun_cmds` come after the launch of the parallel job.
+The parallel launch itself consists of three parts:
+
+#. The parallel launcher program (e.g., ``srun``, ``mpirun`` etc.) with its options,
+#. the test executable as specified in the :attr:`~reframe.core.pipeline.executable` attribute and
+#. the options to be passed to the executable as specified in the :attr:`executable_opts` attribute.
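+
+As a simplification, the assembly of these parts can be sketched as plain string formatting.
+The values filled in below are illustrative only and do not come from ReFrame itself:
+
+.. code-block:: python
+
+   # Simplified sketch of the job script layout described above
+   template = ('#!/bin/bash -l\n'
+               '{job_scheduler_preamble}\n'
+               '{prepare_cmds}\n'
+               '{env_load_cmds}\n'
+               '{prerun_cmds}\n'
+               '{parallel_launcher} {executable} {executable_opts}\n'
+               '{postrun_cmds}\n')
+
+   script = template.format(
+       job_scheduler_preamble='#SBATCH -p all',  # scheduler directives
+       prepare_cmds='',                          # partition prepare_cmds
+       env_load_cmds='',                         # environment setup
+       prerun_cmds='echo before',                # test prerun_cmds
+       parallel_launcher='srun',                 # partition's launcher
+       executable='./stream.x',                  # test's executable
+       executable_opts='',                       # test's executable_opts
+       postrun_cmds='echo after',                # test postrun_cmds
+   )
+   print(script)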
+
+
+Accessing CPU topology information
+==================================
+
+.. versionadded:: 3.7
+
+Sometimes a test may need to access the processor topology information of the partition it runs on, so as to set up the run better.
+Of course, you could hard-code this information in the test, but the test would not be portable.
+ReFrame auto-detects the local host topology and it can also auto-detect the topology of remote hosts.
+It makes this information available to the test through the :attr:`current_partition`'s :attr:`~reframe.core.systems.SystemPartition.processor` attribute.
+
+Let's use this feature to set the number of threads of our STREAM benchmark to the host's number of cores, if it is not defined otherwise.
+
+.. literalinclude:: ../examples/tutorial/stream/stream_cpuinfo.py
+ :caption:
+ :lines: 32-39
+
+Note also the use of the :func:`skip_if_no_procinfo()` function, which will cause ReFrame to skip the test if no processor information is available.
+
+Let's try running the test on our pseudo-cluster:
+
+.. code-block:: bash
+ :caption: Run with the Docker compose setup.
+
+ reframe --prefix=/scratch/rfm-stage/ -C config/cluster.py -c stream/stream_cpuinfo.py -p gnu -r
+
+.. code-block:: console
+
+ [==========] Running 3 check(s)
+ [==========] Started on Mon Feb 12 21:55:54 2024+0000
+
+ [----------] start processing checks
+ [ RUN ] build_stream ~pseudo-cluster:login+gnu /c5e9e6a0 @pseudo-cluster:login+gnu
+ [ RUN ] build_stream ~pseudo-cluster:compute+gnu /3f5dbfe2 @pseudo-cluster:compute+gnu
+ [ OK ] (1/4) build_stream ~pseudo-cluster:login+gnu /c5e9e6a0 @pseudo-cluster:login+gnu
+ [ OK ] (2/4) build_stream ~pseudo-cluster:compute+gnu /3f5dbfe2 @pseudo-cluster:compute+gnu
+ [ RUN ] stream_test /2e15a047 @pseudo-cluster:login+gnu
+ [ RUN ] stream_test /2e15a047 @pseudo-cluster:compute+gnu
+ [ SKIP ] (3/4) no topology information found for partition 'pseudo-cluster:compute'
+ [ OK ] (4/4) stream_test /2e15a047 @pseudo-cluster:login+gnu
+ P: copy_bw: 36840.6 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 18338.8 MB/s (r:0, l:None, u:None)
+ [----------] all spawned checks have finished
+
+ [ PASSED ] Ran 3/4 test case(s) from 3 check(s) (0 failure(s), 1 skipped, 0 aborted)
+
+
+Indeed, for the ``login`` partition, the generated script contains the correct number of threads:
+
+.. code-block:: bash
+ :caption: Run with the Docker compose setup.
+
+ cat /scratch/rfm-stage/output/pseudo-cluster/login/gnu/stream_test/rfm_job.sh
+
+.. code-block:: bash
+
+ #!/bin/bash
+ export OMP_NUM_THREADS=8
+ /scratch/rfm-stage/stage/pseudo-cluster/login/gnu/build_stream_c5e9e6a0/stream.x
+
+However, as you may have noticed, the ``compute`` partition test was skipped, since no topology information was found.
+By default, ReFrame does not try to auto-detect the topology of remote partitions, because this could be time-consuming.
+To enable remote host auto-detection, we should set the :envvar:`RFM_REMOTE_DETECT` environment variable or the equivalent :attr:`~config.general.remote_detect` configuration option.
+
+.. code-block:: bash
+ :caption: Run with the Docker compose setup.
+
+ RFM_REMOTE_WORKDIR=/scratch/rfm-stage RFM_REMOTE_DETECT=1 reframe --prefix=/scratch/rfm-stage/ -C config/cluster.py -c stream/stream_cpuinfo.py -p gnu -r
+
+
+.. code-block:: console
+
+ ...
+ Detecting topology of remote partition 'pseudo-cluster:compute': this may take some time...
+ ...
+ [ OK ] (3/4) stream_test /2e15a047 @pseudo-cluster:compute+gnu
+ P: copy_bw: 19288.6 MB/s (r:0, l:None, u:None)
+ P: triad_bw: 15243.0 MB/s (r:0, l:None, u:None)
+ ...
+
+.. note::
+
+   In our setup, we also need to set the :envvar:`RFM_REMOTE_WORKDIR`, since the current volume (``/home``) is not shared with the head node.
+
+ReFrame caches the result of host auto-detection, so that it avoids re-detecting the topology every time.
+For a detailed description of the process, refer to the documentation of the :attr:`~config.systems.partitions.processor` configuration option.
+
+
+Device information
+------------------
+
+ReFrame cannot currently auto-detect device information, such as attached accelerators, NICs etc.
+However, you can manually add any device of interest to the configuration and it will be accessible from inside the test through the :attr:`current_partition`.
+For more information check the documentation of the :attr:`~config.systems.partitions.devices` configuration parameter.
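+
+As a sketch, a partition entry that manually declares four GPUs might look like the following.
+The exact fields are described in the :attr:`~config.systems.partitions.devices` documentation; the values here are hypothetical:
+
+.. code-block:: python
+
+   # Hypothetical partition fragment declaring attached devices manually
+   partition = {
+       'name': 'gpu-nodes',
+       'scheduler': 'squeue',
+       'launcher': 'srun',
+       'devices': [
+           {'type': 'gpu', 'num_devices': 4}
+       ]
+   }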
+
+
+.. _multi-node-tests:
+
+Multi-node tests
+================
+
+Multi-node tests are quite straightforward in ReFrame.
+All you need to do is specify the task setup, and the scheduler backend and parallel launcher will emit the right options.
+
+The following tests download, compile and launch the `OSU benchmarks