Merge pull request #191 from ibm-granite/readme_update
Data download instructions
vijaye12 authored Nov 11, 2024
2 parents d8aad5f + 70fd776 commit f577fc2
Showing 4 changed files with 18 additions and 3 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -4,7 +4,6 @@ Public notebooks, utilities, and serving components for working with Time Series
The core TSFM time series models have been made available on Hugging Face -- details can be found
[here](https://github.com/ibm-granite/granite-tsfm/wiki). Information on the services component can be found [here](services/inference/README.md).


## Python Version
The currently supported Python versions are 3.9, 3.10, 3.11, and 3.12.

@@ -28,6 +27,7 @@ pip install ".[notebooks]"
- Transfer learning with `PatchTSMixer` [[Try it out]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/patch_tsmixer_transfer.ipynb)
- Transfer learning with `PatchTST` [[Try it out]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/patch_tst_transfer.ipynb)
- Getting started with `TinyTimeMixer (TTM)` [[Try it out]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/ttm_getting_started.ipynb)
- `TTM` full benchmarking scripts and results are available [[here]](https://github.com/ibm-granite/granite-tsfm/tree/main/notebooks/hfdemo/tinytimemixer/full_benchmarking)

## 📗 Google Colab Tutorials
Run the TTM tutorial in Google Colab, and quickly build a forecasting application with the pre-trained TSFM models.
Empty file.
15 changes: 14 additions & 1 deletion notebooks/hfdemo/tinytimemixer/full_benchmarking/README.md
@@ -1,13 +1,26 @@
# Steps to run the full benchmarking

## Fetching the data
The evaluation data can be downloaded from any of the earlier time-series GitHub repositories, such as Autoformer, TimesNet, or Informer. [Sample download link](https://drive.google.com/drive/folders/1vE0ONyqPlym2JaaAoEe0XNDR8FS_d322). The ETT datasets can also be downloaded from the [ETT-Github-Repository](https://github.com/zhouhaoyi/ETDataset).
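For example, a minimal fetch of one ETT dataset might look like the following sketch (assuming `wget` is available and the ETDataset repository keeps its `ETT-small/` layout):

```
# Sketch only: fetch ETTh1 straight from the ETDataset repository.
mkdir -p data_root_path/ETTh1
wget -O data_root_path/ETTh1/ETTh1.csv \
  https://raw.githubusercontent.com/zhouhaoyi/ETDataset/main/ETT-small/ETTh1.csv
```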

Download and save the datasets to a directory, for example `data_root_path`.
The CSV for each dataset should reside at `data_root_path/$dataset_name/$dataset_name.csv` so that our data utilities can process it automatically.
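For instance, with the two hourly ETT datasets in place, the tree would look like this (dataset names shown for illustration only):

```
data_root_path/
├── ETTh1/
│   └── ETTh1.csv
└── ETTh2/
    └── ETTh2.csv
```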

## Running the scripts

1. In a terminal, run any one of the three bash scripts `granite-r2.sh`, `granite-r1.sh`, or `research-use-r2.sh`, passing the data directory as an argument.
2. Run `summarize_results.py`. For example,
```
sh granite-r2.sh data_root_path/
python summarize_results.py -rd=results-granite-r2/
```

This runs all of the benchmarks and dumps the results to CSV files.
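For a quick look at the summarized output, you can print the head of the combined CSV (the file name here is hypothetical, following the `combined_avg_results-<run>.csv` pattern listed below):

```
# Hypothetical file name; adjust to whatever summarize_results.py actually writes.
head results-granite-r2/combined_avg_results-granite-r2.csv
```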


## Benchmarking Results
Note that although the random seed has been set, the mean squared error (MSE) scores may not match the scores below exactly, depending on the runtime environment. The following results were obtained on a Unix-based machine equipped with one NVIDIA A100 GPU.

1. TTM-Research-Use model results:
- `combined_results-research-use-r2.csv`: Across all datasets, all TTM models, and all forecast horizons.
- `combined_avg_results-research-use-r2.csv`: Across all datasets and all TTM models, averaged over forecast horizons.
4 changes: 3 additions & 1 deletion notebooks/hfdemo/tinytimemixer/ttm_m4_hourly.ipynb
@@ -27,7 +27,9 @@
"\n",
"Note that other subsets of M4 data like (daily, monthly, quarterly, and yearly)\n",
"are shorter in length and are not suitable for TTM-512-96 or TTM-1024-96 model.\n",
"Stay tuned for more TTM models!"
"Stay tuned for more TTM models!\n",
"\n",
"Dataset download link: [download link](https://drive.google.com/drive/folders/15zio96o3NK4XOoR5L88oaWcJDVOiqQo9)"
]
},
{
