AKCD: An Attention-Driven Framework with Kernel Conditional Independence Testing for Time Series Causal Discovery

Authors

  • Wenming Lu, Qingdao University, Qingdao 266071, China
  • Dingkai Hu, Qingdao University, Qingdao 266071, China

DOI:

https://doi.org/10.63313/AERpc.9054

Keywords:

Causal discovery, time series, Transformer, KCI test, conditional independence

Abstract

Causal discovery is crucial for understanding the dynamics of complex systems, especially from non-stationary time series data. This paper presents an Attention-Driven Framework with Kernel Conditional Independence Testing (AKCD), a novel two-stage framework that improves the accuracy and robustness of time series causal discovery. In the first stage, AKCD uses a Transformer-based Multi-Scale Temporal Self-Attention Network (MS-TSAN) to capture complex temporal dependencies in the time series, generating attention matrices and hidden representations for subsequent causal inference. In the second stage, it applies kernel-based conditional independence (KCI) tests to the hidden representations to produce a conditional independence (CI) matrix, which serves as prior knowledge guiding causal graph optimization through regularization constraints. This hybrid approach combines the feature-extraction strengths of deep learning with the inferential rigor of statistical tests, making AKCD particularly well suited to non-stationary time series data.
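The second-stage idea described above (kernel-based independence tests over learned representations, with the resulting CI matrix acting as a regularization prior on the candidate causal graph) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the full KCI test is replaced by a simpler unconditional HSIC statistic, and the function names, the decision threshold, and the linear reconstruction term are all assumptions made for this example.

```python
import numpy as np

def rbf_gram(x, sigma=1.0):
    # Gaussian (RBF) kernel Gram matrix for a 1-D sample.
    d = x[:, None] - x[None, :]
    return np.exp(-d ** 2 / (2 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    # Biased HSIC estimate trace(K H L H) / n^2, with H the centering matrix.
    # A stand-in for the full KCI statistic: HSIC tests unconditional
    # independence, whereas KCI conditions on a third variable set.
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    K, L = rbf_gram(x, sigma), rbf_gram(y, sigma)
    return np.trace(K @ H @ L @ H) / n ** 2

def ci_matrix_from_reps(reps, threshold=0.01):
    # reps: dict of variable name -> 1-D hidden representation.
    # ci[i, j] = 1 flags a pair the test deems independent; the threshold
    # here is illustrative (a real test would use a calibrated p-value).
    names = list(reps)
    m = len(names)
    ci = np.zeros((m, m))
    for i in range(m):
        for j in range(m):
            if i != j and hsic(reps[names[i]], reps[names[j]]) < threshold:
                ci[i, j] = 1.0
    return ci

def ci_regularized_loss(W, X, ci, lam=1.0):
    # Linear-SEM reconstruction loss plus a CI-guided L1 penalty:
    # edges whose endpoints the tests flag as independent are pushed
    # toward zero weight, encoding the CI matrix as prior knowledge.
    recon = np.mean((X - X @ W) ** 2)
    return recon + lam * np.sum(np.abs(W) * ci)
```

In this sketch the CI matrix simply masks the sparsity penalty, so statistically independent pairs incur a cost for any nonzero edge weight while other entries are left to the data-fit term.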

References

[1] Granger C W J. Investigating causal relations by econometric models and cross-spectral methods[J]. Econometrica: journal of the Econometric Society, 1969: 424-438.

[2] Spirtes, Peter, Clark N. Glymour, and Richard Scheines. Causation, prediction, and search. MIT press, 2000.

[3] Runge J, Bathiany S, Bollt E, et al. Inferring causation from time series in Earth system sciences[J]. Nature Communications, 2019, 10(1): 2553.

[4] Huang B, Zhang K, Zhang J, et al. Causal discovery from heterogeneous/nonstationary data[J]. Journal of Machine Learning Research, 2020, 21(89): 1-53.

[5] Tank A, Covert I, Foti N, et al. Neural granger causality[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, 44(8): 4267-4279.

[6] Gao S, Addanki R, Yu T, et al. Causal discovery in semi-stationary time series[J]. Advances in Neural Information Processing Systems, 2023, 36: 46624-46657.

[7] Wu H, Hu T, Liu Y, et al. Timesnet: Temporal 2d-variation modeling for general time series analysis[J]. arXiv preprint arXiv:2210.02186, 2022.

[8] Zhang K, Peters J, Janzing D, et al. Kernel-based conditional independence test and application in causal discovery[J]. arXiv preprint arXiv:1202.3775, 2012.

[9] Yu Y, Chen J, Gao T, et al. DAG-GNN: DAG structure learning with graph neural networks[C]//International Conference on Machine Learning. PMLR, 2019: 7154-7163.

[10] Wang L, Zhang C, Ding R, et al. Root cause analysis for microservice systems via hierarchical reinforcement learning from human feedback[C]//Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. 2023: 5116-5125.

[11] Hoyer P, Janzing D, Mooij J M, et al. Nonlinear causal discovery with additive noise models[J]. Advances in Neural Information Processing Systems, 2008, 21.

[12] Eichler M. Causal inference with multiple time series: principles and problems[J]. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2013, 371(1997): 20110613.

[13] Haufe S, Müller K R, Nolte G, et al. Sparse causal discovery in multivariate time series[C]//Causality: Objectives and Assessment. PMLR, 2010: 97-106.

Published

2025-10-27

How to Cite

AKCD: An Attention-Driven Framework with Kernel Conditional Independence Testing for Time Series Causal Discovery. (2025). Advances in Engineering Research: Possibilities and Challenges, 2(3), 27–38. https://doi.org/10.63313/AERpc.9054