Learning conditional generative models for temporal point processes. Xiao, S., Xu, H., Yan, J., Farajtabar, M., Yang, X., Song, L., & Zha, H. 32nd AAAI Conference on Artificial Intelligence, AAAI 2018, 2018. ISBN: 9781577358008. Publisher: aaai.org. Abstract: Estimating the future event sequence conditioned on current observations is a long-standing and challenging task in temporal analysis. On the one hand, for many real-world problems the underlying dynamics can be very complex and often unknown, so traditional parametric point process models often fail to fit the data because of their limited capacity. On the other hand, long-term prediction suffers from exposure bias, where errors accumulate and propagate into future predictions. Our new model builds upon the sequence-to-sequence (seq2seq) prediction network. Compared with parametric point process models, it has higher modeling capacity and greater flexibility for fitting real-world data. The main novelty of the paper is to mitigate the second challenge by introducing a likelihood-free loss based on the Wasserstein distance between point processes, in addition to the negative log-likelihood loss used in the traditional seq2seq model. The Wasserstein distance, unlike the KL divergence (i.e., the MLE loss), is sensitive to the underlying geometry between samples and can robustly enforce a close geometric structure between them. This technique is shown to improve the vanilla seq2seq model by a notable margin on various tasks.
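The likelihood-free loss the abstract describes relies on the fact that, for one-dimensional sorted event times on a fixed horizon, the 1-Wasserstein distance between two sequences has a simple closed form: a pairwise sum of absolute time differences, with unmatched trailing events charged against the horizon. A minimal sketch of that idea (the function name and the border-cost convention here are illustrative assumptions, not code from the paper):

```python
def w1_event_distance(times_a, times_b, horizon):
    """Approximate 1-Wasserstein distance between two event-time
    sequences observed on the window [0, horizon].

    Sorted event times are matched in order; each matched pair
    contributes its absolute time difference. Trailing events in the
    longer sequence have no partner and are charged their distance to
    the horizon (one common border-cost convention).
    """
    a, b = sorted(times_a), sorted(times_b)
    n = min(len(a), len(b))
    # Matched pairs: |t_i - s_i| for i = 1..min(n, m).
    dist = sum(abs(x - y) for x, y in zip(a, b))
    # Unmatched events in the longer sequence are transported to the horizon.
    dist += sum(horizon - t for t in a[n:] + b[n:])
    return dist
```

Unlike a per-event likelihood, this quantity compares whole sequences geometrically: shifting every event slightly changes the distance proportionally, which is the sensitivity to geometry the abstract contrasts with the KL/MLE loss.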
@inproceedings{xiao_learning_2018,
title = {Learning conditional generative models for temporal point processes},
abstract = {Estimating the future event sequence conditioned on current observations is a long-standing and challenging task in temporal analysis. On the one hand, for many real-world problems the underlying dynamics can be very complex and often unknown, so traditional parametric point process models often fail to fit the data because of their limited capacity. On the other hand, long-term prediction suffers from exposure bias, where errors accumulate and propagate into future predictions. Our new model builds upon the sequence-to-sequence (seq2seq) prediction network. Compared with parametric point process models, it has higher modeling capacity and greater flexibility for fitting real-world data. The main novelty of the paper is to mitigate the second challenge by introducing a likelihood-free loss based on the Wasserstein distance between point processes, in addition to the negative log-likelihood loss used in the traditional seq2seq model. The Wasserstein distance, unlike the KL divergence (i.e., the MLE loss), is sensitive to the underlying geometry between samples and can robustly enforce a close geometric structure between them. This technique is shown to improve the vanilla seq2seq model by a notable margin on various tasks.},
booktitle = {32nd {AAAI} Conference on Artificial Intelligence, {AAAI} 2018},
author = {Xiao, Shuai and Xu, Hongteng and Yan, Junchi and Farajtabar, Mehrdad and Yang, Xiaokang and Song, Le and Zha, Hongyuan},
year = {2018},
note = {ISBN: 9781577358008
Publisher: aaai.org},
keywords = {\#nosource, CD GAN Review, CD GAN methods, F-Read, Highlight and M and Points, Review/Point-process, T-Points, roam},
pages = {6302--6309},
}