Benchmarks
Benchmarks are available for the following logics. Note: some of the logics listed below have not yet been entered in the Logics section.
Non-incremental benchmarks
ABVFP, ALIA, AUFBVDTLIA, AUFDTLIA, AUFLIA, AUFLIRA, AUFNIA, AUFNIRA, BV, BVFP, FP, LIA, LRA, NIA, NRA, QF_ABV, QF_ABVFP, QF_ALIA, QF_ANIA, QF_AUFBV, QF_AUFLIA, QF_AUFNIA, QF_AX, QF_BV, QF_BVFP, QF_BVFPLRA, QF_DT, QF_FP, QF_FPLRA, QF_IDL, QF_LIA, QF_LIRA, QF_LRA, QF_NIA, QF_NIRA, QF_NRA, QF_RDL, QF_S, QF_SLIA, QF_UF, QF_UFBV, QF_UFIDL, QF_UFLIA, QF_UFLRA, QF_UFNIA, QF_UFNRA, UF, UFBV, UFDT, UFDTLIA, UFDTNIA, UFIDL, UFLIA, UFLRA, UFNIA

Incremental benchmarks
ABVFP, ALIA, ANIA, AUFNIRA, BV, BVFP, LIA, LRA, QF_ABV, QF_ABVFP, QF_ALIA, QF_ANIA, QF_AUFBV, QF_AUFBVLIA, QF_AUFBVNIA, QF_AUFLIA, QF_BV, QF_BVFP, QF_FP, QF_LIA, QF_LRA, QF_NIA, QF_UF, QF_UFBV, QF_UFBVLIA, QF_UFLIA, QF_UFLRA, QF_UFNIA, UFLRA
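After downloading a release, one way to check which of the logics above it covers is to scan the benchmark files themselves. The following sketch assumes the standard SMT-LIB v2 layout, where each `.smt2` file declares its logic with a `(set-logic ...)` command near the top of the file; the function name `count_by_logic` is illustrative, not part of any SMT-LIB tooling.

```python
# Sketch: inventory a downloaded benchmark release by logic.
# Assumes SMT-LIB v2 files, each declaring its logic via (set-logic ...).
import re
from collections import Counter
from pathlib import Path

SET_LOGIC = re.compile(r"\(\s*set-logic\s+([A-Z0-9_]+)\s*\)")

def count_by_logic(root):
    """Return a Counter mapping logic names to benchmark counts under root."""
    counts = Counter()
    for path in Path(root).rglob("*.smt2"):
        # The set-logic command conventionally appears near the top,
        # so reading a small prefix of each file is enough.
        head = path.read_text(errors="replace")[:4096]
        m = SET_LOGIC.search(head)
        counts[m.group(1) if m else "UNKNOWN"] += 1
    return counts
```

For example, `count_by_logic("QF_BV")` run over an unpacked `QF_BV` directory would report how many benchmarks declare each logic, which is a quick sanity check after a download.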
The latest release of the SMT-LIB benchmark library is hosted on Zenodo under the SMT-LIB community. All previous releases from 2023 onward are also available there.
StarExec

Historical releases of the benchmark library, from 2013 to 2023, are available on StarExec.
Submitting Benchmarks
For details on how to submit benchmarks to SMT-LIB, please refer to the README file in the benchmark-submission SMT-LIB GitHub repository.