TIMING: Temporality-Aware Integrated Gradients for Time Series Explanation
Hyeongwon Jang*, Changhun Kim*, Eunho Yang (*: equal contribution)
International Conference on Machine Learning (ICML), 2025 (Spotlight Presentation, 313/12107=2.6%)
Official implementation of TIMING: Temporality-Aware Integrated Gradients for Time Series Explanation. TIMING is implemented in PyTorch and tested on a range of time series datasets, including switch-feature, state, MIMIC-III, PAM, Epilepsy, boiler, freezer, and wafer. Our experiments build on time_interpret, ContraLSP, TimeX++, and WinIT. Sincere thanks to each of the original authors!
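As a reference point for what the repository computes, the sketch below implements plain Integrated Gradients over a time-series input in PyTorch. It is not TIMING itself (the temporality-aware segment sampling is the paper's contribution; see the code in this repo), only the standard path-integral attribution that TIMING builds on, with an assumed zero baseline and a toy linear model.

```python
import torch

def integrated_gradients(model, x, baseline, steps=32):
    """Plain Integrated Gradients for a time-series input.

    model:    callable mapping a batch (B, T, D) to scalar outputs (B,)
    x:        input of shape (T, D)
    baseline: reference of the same shape (e.g. all zeros)
    """
    # Interpolation coefficients along the straight path baseline -> x.
    alphas = torch.linspace(0.0, 1.0, steps).view(-1, 1, 1)
    interpolated = baseline + alphas * (x - baseline)  # (steps, T, D)
    interpolated.requires_grad_(True)
    outputs = model(interpolated).sum()
    grads = torch.autograd.grad(outputs, interpolated)[0]
    # Riemann approximation of the path integral of the gradients.
    return (x - baseline) * grads.mean(dim=0)

# Toy usage: for a linear model with a zero baseline, the attribution
# equals w * x elementwise (IG's completeness property holds exactly).
torch.manual_seed(0)
w = torch.randn(3)
model = lambda z: (z * w).sum(dim=(1, 2))
x = torch.randn(8, 3)
attr = integrated_gradients(model, x, torch.zeros_like(x))
```

The `steps` count trades off runtime against the accuracy of the Riemann approximation; 32 is enough for the linear toy model, where the gradient is constant along the path.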
conda create -n timing python==3.10.16
conda activate timing
pip install -r requirement.txt --no-deps
This installs the necessary packages into the virtual environment (the --no-deps flag skips dependency resolution).
To test with switch-feature, additional setup is required.
git clone https://github.com/TimeSynth/TimeSynth.git
cd TimeSynth
python setup.py install
cd ..
python synthetic/switchstate/switchgenerator.py
We divide our experiments into two categories: synthetic and real.
All experiments can be executed using scripts located in scripts/real, scripts/hmm, or scripts/switchfeature.
An example execution for MIMIC-III (ours and baselines):
bash scripts/real/train.sh
bash scripts/real/run_mimic_our.sh
bash scripts/real/run_mimic_baseline.sh
Due to differences between our training environment and the released code, the paper's results may not be fully reproducible with the training scripts alone. We have therefore publicly released all model checkpoints, except those trained on the restricted-access MIMIC-III dataset.
Researchers who need MIMIC-III checkpoints may contact the authors directly. ([email protected])
All results will be stored in the current working directory.
Then parse and save the results:
python real/parse.py --model state --data mimic3 --top_value 100
python real/parse.py --model state --data mimic3 --experiment_name baseline --top_value 100
All parsed results are saved in the results/ directory.
@inproceedings{jang2025timing,
title={{TIMING: Temporality-Aware Integrated Gradients for Time Series Explanation}},
author={Jang, Hyeongwon and Kim, Changhun and Yang, Eunho},
booktitle={International Conference on Machine Learning (ICML)},
year={2025}
}
