
TIMING: Temporality-Aware Integrated Gradients for Time Series Explanation


TIMING: Temporality-Aware Integrated Gradients for Time Series Explanation
Hyeongwon Jang*, Changhun Kim*, Eunho Yang (*: equal contribution)
International Conference on Machine Learning (ICML), 2025 (Spotlight Presentation, 313/12107=2.6%)

Introduction

Official implementation of TIMING: Temporality-Aware Integrated Gradients for Time Series Explanation. TIMING is implemented in PyTorch and evaluated on a range of time series datasets, including switch-feature, state, MIMIC-III, PAM, Epilepsy, boiler, freezer, and wafer. Our experiments build on time_interpret, ContraLSP, TimeX++, and WinIT. Sincere thanks to each of the original authors!
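As the name suggests, TIMING builds on Integrated Gradients (IG). For orientation only, below is a minimal sketch of vanilla IG for a time series classifier in PyTorch; the function name, input shapes, and zero baseline are illustrative assumptions, not this repository's actual API.

import torch

def integrated_gradients(model, x, target, baseline=None, steps=50):
    # Vanilla IG: attribution = (x - baseline) * average gradient of the
    # target logit along the straight-line path from baseline to x.
    # x: (batch, time, features) input to a time series classifier.
    if baseline is None:
        baseline = torch.zeros_like(x)  # a common (assumed) baseline choice
    total_grads = torch.zeros_like(x)
    for alpha in torch.linspace(0.0, 1.0, steps):
        point = (baseline + alpha * (x - baseline)).detach().requires_grad_(True)
        score = model(point)[:, target].sum()  # target-class logit
        total_grads += torch.autograd.grad(score, point)[0]
    return (x - baseline) * total_grads / steps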

Installation instructions

conda create -n timing python==3.10.16
conda activate timing
pip install -r requirements.txt --no-deps

The requirements.txt file pins the packages needed for the experiments; --no-deps tells pip to skip resolving transitive dependencies, so the pinned versions are installed exactly as listed.
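Because --no-deps skips dependency resolution, a quick import check (illustrative, not part of the repository) is a useful way to confirm the core stack resolved correctly:

# Optional sanity check for the freshly created environment.
import torch

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())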

To run the switch-feature experiments, additional setup is required:

git clone https://github.com/TimeSynth/TimeSynth.git
cd TimeSynth
python setup.py install
cd ..
python synthetic/switchstate/switchgenerator.py
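If the TimeSynth installation succeeded, a short smoke test such as the one below should run without errors. It is adapted from TimeSynth's own README; the exact signals used by synthetic/switchstate/switchgenerator.py may differ.

import timesynth as ts

# Sample 100 regularly spaced time points on [0, 20].
time_sampler = ts.TimeSampler(stop_time=20)
time_points = time_sampler.sample_regular_time(num_points=100)

# A noisy sinusoid, in the spirit of the synthetic generators used here.
signal = ts.signals.Sinusoidal(frequency=0.25)
noise = ts.noise.GaussianNoise(std=0.3)
series = ts.TimeSeries(signal, noise_generator=noise)
samples, signals, errors = series.sample(time_points)
print(samples.shape)  # (100,)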

Reproducing experiments

Our experiments fall into two categories: synthetic and real.

All experiments can be executed using scripts located in scripts/real, scripts/hmm, or scripts/switchfeature.

For example, to train the MIMIC-III classifier and then run TIMING (ours) and the baselines:

bash scripts/real/train.sh
bash scripts/real/run_mimic_our.sh
bash scripts/real/run_mimic_baseline.sh

Due to differences between our training environment and the released code, the paper’s results may not be fully reproducible with the training scripts alone. We have publicly released all model checkpoints, except those trained on the restricted-access MIMIC-III dataset.

Researchers who need the MIMIC-III checkpoints may contact the authors directly ([email protected]).

All results will be stored in the current working directory.

Then parse and save the results:

python real/parse.py --model state --data mimic3 --top_value 100
python real/parse.py --model state --data mimic3 --experiment_name baseline --top_value 100

All parsed results will be saved in the results/ directory.

Citation

@inproceedings{jang2025timing,
  title={{TIMING: Temporality-Aware Integrated Gradients for Time Series Explanation}},
  author={Jang, Hyeongwon and Kim, Changhun and Yang, Eunho},
  booktitle={International Conference on Machine Learning (ICML)},
  year={2025}
}
