{"_id":"D3Pj2SE7Zdq4qGjEo","bibbaseid":"nichol-achiam-schulman-onfirstordermetalearningalgorithms-2018","author_short":["Nichol, A.","Achiam, J.","Schulman, J."],"bibdata":{"bibtype":"misc","type":"misc","title":"On First-Order Meta-Learning Algorithms","url":"http://arxiv.org/abs/1803.02999","doi":"10.48550/arXiv.1803.02999","abstract":"This paper considers meta-learning problems, where there is a distribution of tasks, and we would like to obtain an agent that performs well (i.e., learns quickly) when presented with a previously unseen task sampled from this distribution. We analyze a family of algorithms for learning a parameter initialization that can be fine-tuned quickly on a new task, using only first-order derivatives for the meta-learning updates. This family includes and generalizes first-order MAML, an approximation to MAML obtained by ignoring second-order derivatives. It also includes Reptile, a new algorithm that we introduce here, which works by repeatedly sampling a task, training on it, and moving the initialization towards the trained weights on that task. We expand on the results from Finn et al. showing that first-order meta-learning algorithms perform well on some well-established benchmarks for few-shot classification, and we provide theoretical analysis aimed at understanding why these algorithms work.","urldate":"2023-10-03","publisher":"arXiv","author":[{"propositions":[],"lastnames":["Nichol"],"firstnames":["Alex"],"suffixes":[]},{"propositions":[],"lastnames":["Achiam"],"firstnames":["Joshua"],"suffixes":[]},{"propositions":[],"lastnames":["Schulman"],"firstnames":["John"],"suffixes":[]}],"month":"October","year":"2018","note":"arXiv:1803.02999 [cs]","keywords":"Computer Science - Machine Learning","bibtex":"@misc{nichol_first-order_2018,\n\ttitle = {On {First}-{Order} {Meta}-{Learning} {Algorithms}},\n\turl = {http://arxiv.org/abs/1803.02999},\n\tdoi = {10.48550/arXiv.1803.02999},\n\tabstract = {This paper considers meta-learning problems, where there is a distribution of tasks, and we would like to obtain an agent that performs well (i.e., learns quickly) when presented with a previously unseen task sampled from this distribution. We analyze a family of algorithms for learning a parameter initialization that can be fine-tuned quickly on a new task, using only first-order derivatives for the meta-learning updates. This family includes and generalizes first-order MAML, an approximation to MAML obtained by ignoring second-order derivatives. It also includes Reptile, a new algorithm that we introduce here, which works by repeatedly sampling a task, training on it, and moving the initialization towards the trained weights on that task. We expand on the results from Finn et al. 
showing that first-order meta-learning algorithms perform well on some well-established benchmarks for few-shot classification, and we provide theoretical analysis aimed at understanding why these algorithms work.},\n\turldate = {2023-10-03},\n\tpublisher = {arXiv},\n\tauthor = {Nichol, Alex and Achiam, Joshua and Schulman, John},\n\tmonth = oct,\n\tyear = {2018},\n\tnote = {arXiv:1803.02999 [cs]},\n\tkeywords = {Computer Science - Machine Learning},\n}\n\n\n\n","author_short":["Nichol, A.","Achiam, J.","Schulman, J."],"key":"nichol_first-order_2018","id":"nichol_first-order_2018","bibbaseid":"nichol-achiam-schulman-onfirstordermetalearningalgorithms-2018","role":"author","urls":{"Paper":"http://arxiv.org/abs/1803.02999"},"keyword":["Computer Science - Machine Learning"],"metadata":{"authorlinks":{}},"html":""},"bibtype":"misc","biburl":"https://bibbase.org/zotero/mh_lenguyen","dataSources":["iwKepCrWBps7ojhDx"],"keywords":["computer science - machine learning"],"search_terms":["first","order","meta","learning","algorithms","nichol","achiam","schulman"],"title":"On First-Order Meta-Learning Algorithms","year":2018}
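The abstract describes the core Reptile loop: sample a task, take a few gradient steps on it, then move the initialization toward the task-adapted weights. Below is a minimal sketch of that outer loop, assuming a toy sine-regression task distribution and a linear model on fixed random features; the task setup, model, and step sizes are illustrative choices, not the paper's experimental configuration.

import numpy as np

rng = np.random.default_rng(0)

# Fixed random features, so the model is linear in w and the
# squared-error gradient has a closed form (an illustrative choice).
D = 40
freqs = rng.normal(size=D)
phases = rng.uniform(0.0, 2.0 * np.pi, size=D)

def features(x):
    # Map inputs x of shape (n,) to feature matrix of shape (n, D).
    return np.cos(np.outer(x, freqs) + phases)

def sample_task():
    # One task = regressing y = a * sin(x + b) with task-specific (a, b).
    a = rng.uniform(0.1, 5.0)
    b = rng.uniform(0.0, np.pi)
    x = rng.uniform(-5.0, 5.0, size=10)
    return x, a * np.sin(x + b)

def inner_sgd(w, x, y, k=5, lr=0.02):
    # k steps of plain SGD on mean squared error, starting from the
    # meta-initialization w; only first-order gradients are used.
    for _ in range(k):
        grad = 2.0 * features(x).T @ (features(x) @ w - y) / len(x)
        w = w - lr * grad
    return w

# Reptile outer loop: repeatedly sample a task, train on it, and move
# the initialization toward the trained weights on that task.
w = np.zeros(D)
eps = 0.1  # outer step size
for _ in range(1000):
    x, y = sample_task()
    w_task = inner_sgd(w.copy(), x, y)
    w = w + eps * (w_task - w)

Note that the outer update never differentiates through the inner SGD steps, which is what makes the method first-order; with a single inner step, the update reduces to ordinary gradient descent on the expected task loss.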