Adaptive Incremental Learning for Statistical Relational Models Using Gradient-Based Boosting. Gu, Y. & Missier, P. In *Procs. ILP '17, 27th International Conference on Inductive Logic Programming (late-breaking paper)*, Orleans, France, 2017. CEUR-WS.

Abstract: We consider the problem of incrementally learning models from relational data. Most existing learning methods for statistical relational models use batch learning, which becomes computationally expensive and eventually infeasible for large datasets. The majority of previous work on relational incremental learning assumes the model's structure is given and only its parameters need to be learned. In this paper, we propose algorithms that incrementally learn the model's parameters and structure simultaneously. These algorithms build on the relational functional gradient boosting (RFGB) framework and extend classical propositional ensemble methods to relational learning over evolving data streams.

@inproceedings{gu_adaptive_2017,
address = {Orleans, France},
title = {Adaptive {Incremental} {Learning} for {Statistical} {Relational} {Models} {Using} {Gradient}-{Based} {Boosting}},
url = {https://ilp2017.sciencesconf.org/data/pages/ILP_2017_paper_27.pdf},
abstract = {We consider the problem of incrementally learning models from relational data. Most existing learning methods for statistical relational models use batch learning, which becomes computationally expensive and eventually infeasible for large datasets. The majority of previous work on relational incremental learning assumes the model's structure is given and only its parameters need to be learned. In this paper, we propose algorithms that incrementally learn the model's parameters and structure simultaneously. These algorithms build on the relational functional gradient boosting (RFGB) framework and extend classical propositional ensemble methods to relational learning over evolving data streams.},
booktitle = {Procs. {ILP} '17, 27th {International} {Conference} on {Inductive} {Logic} {Programming} (late-breaking paper)},
publisher = {CEUR-WS},
author = {Gu, Yulong and Missier, Paolo},
year = {2017},
}
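The incremental boosting idea in the abstract can be illustrated with a small, hypothetical propositional example: the model is an additive ensemble of weak learners, each new batch of data is handled by fitting additional learners to the pointwise loss gradients (residuals, under squared loss), and earlier learners are never refitted. The sketch below uses one-dimensional threshold stumps and squared loss for simplicity; it is not the paper's relational algorithm (RFGB boosts first-order regression trees over relational data), and all names here are illustrative.

```python
def fit_stump(xs, residuals):
    """Pick the threshold split minimising squared error on the residuals."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        lv = sum(left) / len(left) if left else 0.0
        rv = sum(right) / len(right) if right else 0.0
        err = sum((r - lv) ** 2 for r in left) + sum((r - rv) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lv, rv)
    _, t, lv, rv = best
    return lambda x: lv if x <= t else rv

class IncrementalBooster:
    """Additive model F(x) = sum of lr * stump_i(x), grown batch by batch."""

    def __init__(self, lr=0.5):
        self.lr = lr
        self.stumps = []

    def predict(self, x):
        return sum(self.lr * s(x) for s in self.stumps)

    def update(self, xs, ys, rounds=10):
        """Consume a new batch: fit stumps to the current residuals
        (functional gradients of squared loss) without refitting old ones."""
        for _ in range(rounds):
            residuals = [y - self.predict(x) for x, y in zip(xs, ys)]
            self.stumps.append(fit_stump(xs, residuals))

booster = IncrementalBooster()
booster.update([0, 1, 2, 3], [0.0, 0.0, 1.0, 1.0])   # first batch
booster.update([4, 5], [2.0, 2.0])                   # later batch; earlier stumps are kept
```

Note that this propositional sketch only grows the ensemble; handling concept drift in the stream (e.g. down-weighting or pruning stale ensemble members) is part of what the paper's adaptive algorithms address.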
