diff --git a/README.md b/README.md
index e0614d0..7321395 100644
--- a/README.md
+++ b/README.md
@@ -17,9 +17,9 @@
 $\text{BasicTS}^{+}$ (**Basic** **T**ime **S**eries **P**lus) is an enhanced benchmark and toolbox designed for time series forecasting. BasicTS+ evolved from its predecessor, [BasicTS](https://github.com/zezhishao/BasicTS/tree/c3075025a5d20ef48da62fc85d05621f8f6b15ca), and now has robust support for spatial-temporal forecasting and long time-series forecasting as well as more general tasks, such as M4 competition. For brevity and consistency, we will refer to this project as $\text{BasicTS}^{+}$ and $\text{BasicTS}$ interchangeably.

-On the one hand, $\text{BasicTS}$ utilizes a ***unified and standard pipeline*** to give a ***fair and exhaustive*** reproduction and comparison of popular deep learning-based models.
+On the one hand, BasicTS utilizes a ***unified and standard pipeline*** to give a ***fair and exhaustive*** reproduction and comparison of popular deep learning-based models.

-On the other hand, $\text{BasicTS}$ provides users with ***easy-to-use and extensible interfaces*** to facilitate the quick design and evaluation of new models. At a minimum, users only need to define the model architecture.
+On the other hand, BasicTS provides users with ***easy-to-use and extensible interfaces*** to facilitate the quick design and evaluation of new models. At a minimum, users only need to define the model architecture.

 We are collecting **TODOs** and **HOWTOs**, if you need more features (*e.g.* more datasets or baselines) or have any questions, please feel free to create an issue or leave a comment [here](https://github.com/zezhishao/BasicTS/issues/95).

@@ -46,7 +46,7 @@ Users can control all the details of the pipeline through a config file, such as
 Support All Devices

-$\text{BasicTS}$ supports CPU, GPU and GPU distributed training (both single node multiple GPUs and multiple nodes) thanks to using EasyTorch as the backend. Users can use it by setting parameters without modifying any code.
+BasicTS supports CPU, GPU, and distributed GPU training (both single-node multi-GPU and multi-node) thanks to using EasyTorch as the backend. Users can switch among these by setting parameters, without modifying any code.
@@ -58,7 +58,7 @@ Support `logging` log system and `Tensorboard`, and encapsulate it as a unified
 ### Datasets

-$\text{BasicTS}$ support a variety of datasets, including spatial-temporal forecasting, long time-series forecasting, and large-scale datasets, e.g.,
+BasicTS supports a variety of datasets, including spatial-temporal forecasting, long time-series forecasting, and large-scale datasets, e.g.,

 - METR-LA, PEMS-BAY, PEMS03, PEMS04, PEMS07, PEMS08
 - ETTh1, ETTh2, ETTm1, ETTm2, Electricity, Exchange Rate, Weather, Traffic, Illness, Beijing Air Quality

@@ -67,7 +67,7 @@ $\text{BasicTS}$ support a variety of datasets, including spatial-temporal forec
 ### Baselines

-$\text{BasicTS}$ implements a wealth of models, including both spatial-temporal forecasting models and long time-series forecasting models, e.g.,
+BasicTS implements a wealth of models, including both spatial-temporal forecasting models and long time-series forecasting models, e.g.,

 - DCRNN, Graph WaveNet, MTGNN, STID, D2STGNN, STEP, DGCRN, DGCRN, STNorm, AGCRN, GTS, StemGNN, MegaCRN, STGCN, STWave, STAEformer, GMSDR, ...
 - Informer, Autoformer, FEDformer, Pyraformer, DLinear, NLinear, Triformer, Crossformer, ...

@@ -79,7 +79,7 @@ $\text{BasicTS}$ implements a wealth of models, including both spatial-temporal
 ### OS

-We recommend using $\text{BasicTS}$ on Linux systems (*e.g.* Ubuntu and CentOS).
+We recommend using BasicTS on Linux systems (*e.g.* Ubuntu and CentOS).
 Other systems (*e.g.*, Windows and macOS) have not been tested.

 ### Python

@@ -91,7 +91,7 @@ Python >= 3.6 (recommended >= 3.9).
 ### Other Dependencies
-$\text{BasicTS}$ is built based on PyTorch and [EasyTorch](https://github.com/cnstark/easytorch).
+BasicTS is built on PyTorch and [EasyTorch](https://github.com/cnstark/easytorch).
 You can install PyTorch following the instruction in [PyTorch](https://pytorch.org/get-started/locally/). For example:

 ```bash
@@ -106,7 +106,7 @@ pip install -r requirements.txt
 ### Warning

-$\text{BasicTS}$ is built on PyTorch 1.9.1 or 1.10.0, while other versions have not been tested.
+BasicTS is built on PyTorch 1.9.1 or 1.10.0, while other versions have not been tested.

 ## 🎯 Getting Started of Developing with BasicTS

@@ -138,27 +138,27 @@ $\text{BasicTS}$ is built on PyTorch 1.9.1 or 1.10.0, while other versions have
 - **Define Your Model Architecture**

-  The `forward` function needs to follow the conventions of $\text{BasicTS}$. You can find an example of the Multi-Layer Perceptron (`MLP`) model in [baselines/MLP/mlp_arch.py](baselines/MLP/mlp_arch.py)
+  The `forward` function needs to follow the conventions of BasicTS. You can find an example of the Multi-Layer Perceptron (`MLP`) model in [baselines/MLP/mlp_arch.py](baselines/MLP/mlp_arch.py).

 - **Define Your Runner for Your Model** (Optional)

-  $\text{BasicTS}$ provides a unified and standard pipeline in `basicts.runner.BaseTimeSeriesForecastingRunner`.
+  BasicTS provides a unified and standard pipeline in `basicts.runner.BaseTimeSeriesForecastingRunner`.
   Nevertheless, you still need to define the specific forward process (the `forward` function in the **runner**).
-  Fortunately, $\text{BasicTS}$ also provides such an implementation in `basicts.runner.SimpleTimeSeriesForecastingRunner`, which can cover most of the situations.
+  Fortunately, BasicTS also provides such an implementation in `basicts.runner.SimpleTimeSeriesForecastingRunner`, which covers most situations.
   The runner for the `MLP` model can also use this built-in runner. You can also find more runners in `basicts.runners.runner_zoo` to learn more about the runner design.

 - **Configure your Configuration File**

   You can configure all the details of the pipeline and hyperparameters in a configuration file, *i.e.*, **everything is based on config**.
-  The configuration file is a `.py` file, in which you can import your model and runner and set all the options. $\text{BasicTS}$ uses `EasyDict` to serve as a parameter container, which is extensible and flexible to use.
+  The configuration file is a `.py` file, in which you can import your model and runner and set all the options. BasicTS uses `EasyDict` as a parameter container, which is extensible and flexible to use.
   An example of the configuration file for the `MLP` model on the `METR-LA` dataset can be found in [baselines/MLP/MLP_METR-LA.py](baselines/MLP/MLP_METR-LA.py)

 ### Run It!

 - **Reproducing Built-in Models**

-  $\text{BasicTS}$ provides a wealth of built-in models. You can reproduce these models by running the following command:
+  BasicTS provides a wealth of built-in models. You can reproduce these models by running the following command:

   ```bash
   python experiments/train.py -c baselines/${MODEL_NAME}/${DATASET_NAME}.py --gpus '0'
@@ -208,4 +208,4 @@ Comprehensive Benchmarking and Heterogeneity Analysis](https://arxiv.org/pdf/231
 ## 🔗 Acknowledgement

-$\text{BasicTS}$ is developed based on [EasyTorch](https://github.com/cnstark/easytorch), an easy-to-use and powerful open-source neural network training framework.
+BasicTS is developed based on [EasyTorch](https://github.com/cnstark/easytorch), an easy-to-use and powerful open-source neural network training framework.
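
The "Define Your Model Architecture" hunk above (`@@ -138,27 +138,27 @@`) refers to the BasicTS `forward` convention illustrated by [baselines/MLP/mlp_arch.py](baselines/MLP/mlp_arch.py). As a rough, non-authoritative sketch of that convention, a minimal model could look like the following; the argument names and the assumed `[batch, seq_len, num_nodes, channels]` tensor layout are assumptions to be checked against that file and `basicts.runners.SimpleTimeSeriesForecastingRunner`.

```python
import torch
from torch import nn


class SketchMLP(nn.Module):
    """Illustrative only: a tiny MLP whose forward mimics the assumed BasicTS signature."""

    def __init__(self, history_len: int, prediction_len: int, hidden_dim: int = 64):
        super().__init__()
        self.fc1 = nn.Linear(history_len, hidden_dim)
        self.act = nn.ReLU()
        self.fc2 = nn.Linear(hidden_dim, prediction_len)

    def forward(self, history_data: torch.Tensor, future_data: torch.Tensor,
                batch_seen: int, epoch: int, train: bool, **kwargs) -> torch.Tensor:
        # Assumed input layout: [batch, seq_len, num_nodes, channels]; use channel 0 only.
        x = history_data[..., 0].transpose(1, 2)   # -> [batch, num_nodes, history_len]
        y = self.fc2(self.act(self.fc1(x)))        # -> [batch, num_nodes, prediction_len]
        return y.transpose(1, 2).unsqueeze(-1)     # -> [batch, prediction_len, num_nodes, 1]
```

A model of this shape can then be trained with the built-in `SimpleTimeSeriesForecastingRunner` mentioned in the same hunk, without writing a custom runner.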
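
The "Configure your Configuration File" hunk likewise points to a Python config built on `EasyDict`. The fragment below is only a sketch of that idea with hypothetical option names; the real keys should be copied from [baselines/MLP/MLP_METR-LA.py](baselines/MLP/MLP_METR-LA.py) rather than from here.

```python
from easydict import EasyDict

# Hypothetical keys for illustration; the actual BasicTS options differ and
# should be taken from baselines/MLP/MLP_METR-LA.py.
CFG = EasyDict()
CFG.DESCRIPTION = "Sketch config: MLP on METR-LA"
CFG.MODEL = EasyDict()
CFG.MODEL.NAME = "SketchMLP"
CFG.MODEL.PARAM = {"history_len": 12, "prediction_len": 12, "hidden_dim": 64}
CFG.TRAIN = EasyDict()
CFG.TRAIN.BATCH_SIZE = 64
CFG.TRAIN.NUM_EPOCHS = 100
```

Because everything is driven by such a config, the same `python experiments/train.py -c <config>.py --gpus '0'` entry point shown in the diff can run both built-in baselines and new models.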