Details
| Original language | English |
| --- | --- |
| Pages (from-to) | 6940-6955 |
| Number of pages | 16 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 206 |
| Publication status | Published - 2023 |
| Event | 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023 - Valencia, Spain. Duration: 25 Apr 2023 → 27 Apr 2023 |
Abstract
In this paper, we consider the problem of global change-point detection in event-sequence data, where both the event distributions and the change-points are assumed to be unknown. For this problem, we propose a log-likelihood-ratio-based global change-point detector, which observes the entire sequence and detects a prespecified number of change-points. Building on the Transformer Hawkes Process (THP), a well-known neural temporal point process (TPP) framework, we develop DCPD, a differentiable change-point detector that maintains distinct intensity and mark predictors for each partition. Further, we propose a sliding-window-based extension of DCPD that improves its scalability in the number of events and change-points, with minor sacrifices in performance. Experiments on synthetic datasets explore how run-time, relative complexity, and other aspects of the distributions affect various properties of our change-point detectors, namely robustness, detection accuracy, and scalability, under controlled environments. Finally, we perform experiments on six real-world temporal event sequences collected from diverse domains such as health and geographical regions, and show that our methods either outperform or perform comparably to the baselines.
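To make the log-likelihood-ratio idea concrete, the following is a minimal sketch of a classical single-change-point scan under a homogeneous-Poisson assumption. This is an illustration of the general principle only, not the paper's DCPD method (which is differentiable and built on the Transformer Hawkes Process); the function names `poisson_loglik` and `detect_change_point` are invented for this example.

```python
import math

def poisson_loglik(n, duration):
    # Maximized log-likelihood of a homogeneous Poisson process with
    # n events over an interval of the given duration, at MLE rate n/duration.
    if n == 0 or duration <= 0:
        return 0.0
    rate = n / duration
    return n * math.log(rate) - rate * duration

def detect_change_point(times, horizon):
    """Return the event index that best splits the sequence, by maximizing
    the log-likelihood ratio of a two-rate model over a one-rate model."""
    n = len(times)
    full = poisson_loglik(n, horizon)  # single-rate baseline
    best_idx, best_gain = None, 0.0
    for k in range(1, n):
        t = times[k]
        gain = (poisson_loglik(k, t)
                + poisson_loglik(n - k, horizon - t)
                - full)
        if gain > best_gain:
            best_idx, best_gain = k, gain
    return best_idx, best_gain

# Synthetic sequence: the rate jumps from ~1 to ~5 events/unit at t = 10.
times = [i * 1.0 for i in range(1, 10)] + [10 + i * 0.2 for i in range(1, 50)]
idx, gain = detect_change_point(times, horizon=20.0)  # idx lands at the rate jump
```

DCPD generalizes this scan: instead of closed-form Poisson likelihoods, partition likelihoods come from learned THP intensity and mark predictors, and the change-point locations are optimized differentiably rather than by exhaustive search.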
ASJC Scopus subject areas
- Computer Science (all)
- Artificial Intelligence
- Software
- Engineering (all)
- Control and Systems Engineering
- Mathematics (all)
- Statistics and Probability
Cite this
- Standard
- Harvard
- APA
- Vancouver
- BibTeX
- RIS
In: Proceedings of Machine Learning Research, Vol. 206, 2023, p. 6940-6955.
Research output: Contribution to journal › Conference article › Research › peer review
TY - JOUR
T1 - Differentiable Change-point Detection With Temporal Point Processes
AU - Koley, Paramita
AU - Alimi, Harshavardhan
AU - Singla, Shrey
AU - Bhattacharya, Sourangshu
AU - Ganguly, Niloy
AU - De, Abir
N1 - Funding Information: This research was (partially) funded by the Federal Ministry of Education and Research (BMBF), Germany under the project LeibnizKILabor with grant No. 01DD20003 and an Intel Inc, India project. Abir De acknowledges a Google Faculty Grant and IBM AI Horizon grant.
PY - 2023
Y1 - 2023
N2 - In this paper, we consider the problem of global change-point detection in event-sequence data, where both the event distributions and the change-points are assumed to be unknown. For this problem, we propose a log-likelihood-ratio-based global change-point detector, which observes the entire sequence and detects a prespecified number of change-points. Building on the Transformer Hawkes Process (THP), a well-known neural temporal point process (TPP) framework, we develop DCPD, a differentiable change-point detector that maintains distinct intensity and mark predictors for each partition. Further, we propose a sliding-window-based extension of DCPD that improves its scalability in the number of events and change-points, with minor sacrifices in performance. Experiments on synthetic datasets explore how run-time, relative complexity, and other aspects of the distributions affect various properties of our change-point detectors, namely robustness, detection accuracy, and scalability, under controlled environments. Finally, we perform experiments on six real-world temporal event sequences collected from diverse domains such as health and geographical regions, and show that our methods either outperform or perform comparably to the baselines.
AB - In this paper, we consider the problem of global change-point detection in event-sequence data, where both the event distributions and the change-points are assumed to be unknown. For this problem, we propose a log-likelihood-ratio-based global change-point detector, which observes the entire sequence and detects a prespecified number of change-points. Building on the Transformer Hawkes Process (THP), a well-known neural temporal point process (TPP) framework, we develop DCPD, a differentiable change-point detector that maintains distinct intensity and mark predictors for each partition. Further, we propose a sliding-window-based extension of DCPD that improves its scalability in the number of events and change-points, with minor sacrifices in performance. Experiments on synthetic datasets explore how run-time, relative complexity, and other aspects of the distributions affect various properties of our change-point detectors, namely robustness, detection accuracy, and scalability, under controlled environments. Finally, we perform experiments on six real-world temporal event sequences collected from diverse domains such as health and geographical regions, and show that our methods either outperform or perform comparably to the baselines.
UR - http://www.scopus.com/inward/record.url?scp=85165198413&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85165198413
VL - 206
SP - 6940
EP - 6955
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
T2 - 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023
Y2 - 25 April 2023 through 27 April 2023
ER -