{"_id":"ZmbmrmBm5YKovuPwW","bibbaseid":"munkhdalai-yu-metanetworks-2017","author_short":["Munkhdalai, T.","Yu, H."],"bibdata":{"bibtype":"inproceedings","type":"inproceedings","address":"Sydney, Australia","title":"Meta Networks","volume":"70","abstract":"Neural networks have been successfully applied in applications with a large amount of labeled data. However, the task of rapid generalization on new concepts with small training data while preserving performances on previously learned ones still presents a significant challenge to neural network models. In this work, we introduce a novel meta learning method, Meta Networks (MetaNet), that learns a meta-level knowledge across tasks and shifts its inductive biases via fast parameterization for rapid generalization. When evaluated on Omniglot and Mini-ImageNet benchmarks, our MetaNet models achieve a near human-level performance and outperform the baseline approaches by up to 6% accuracy. We demonstrate several appealing properties of MetaNet relating to generalization and continual learning.","booktitle":"ICML","author":[{"propositions":[],"lastnames":["Munkhdalai"],"firstnames":["Tsendsuren"],"suffixes":[]},{"propositions":[],"lastnames":["Yu"],"firstnames":["Hong"],"suffixes":[]}],"month":"August","year":"2017","pmid":"31106300; PMCID: PMC6519722","pages":"2554–2563","bibtex":"@inproceedings{munkhdalai_meta_2017,\n\taddress = {Sydney, Australia},\n\ttitle = {Meta {Networks}},\n\tvolume = {70},\n\tabstract = {Neural networks have been successfully applied in applications with a large amount of labeled data. However, the task of rapid generalization on new concepts with small training data while preserving performances on previously learned ones still presents a significant challenge to neural network models. In this work, we introduce a novel meta learning method, Meta Networks (MetaNet), that learns a meta-level knowledge across tasks and shifts its inductive biases via fast parameterization for rapid generalization. When evaluated on Omniglot and Mini-ImageNet benchmarks, our MetaNet models achieve a near human-level performance and outperform the baseline approaches by up to 6\\% accuracy. We demonstrate several appealing properties of MetaNet relating to generalization and continual learning.},\n\tbooktitle = {{ICML}},\n\tauthor = {Munkhdalai, Tsendsuren and Yu, Hong},\n\tmonth = aug,\n\tyear = {2017},\n\tpmid = {31106300; PMCID: PMC6519722},\n\tpages = {2554--2563},\n}\n\n","author_short":["Munkhdalai, T.","Yu, H."],"key":"munkhdalai_meta_2017","id":"munkhdalai_meta_2017","bibbaseid":"munkhdalai-yu-metanetworks-2017","role":"author","urls":{},"metadata":{"authorlinks":{}},"html":""},"bibtype":"inproceedings","biburl":"http://fenway.cs.uml.edu/papers/pubs-all.bib","dataSources":["TqaA9miSB65nRfS5H"],"keywords":[],"search_terms":["meta","networks","munkhdalai","yu"],"title":"Meta Networks","year":2017}