Sparse Algorithms Are Not Stable: A No-Free-Lunch Theorem. Xu, H., Caramanis, C., & Mannor, S. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(1):187–193, January 2012.
We consider two desired properties of learning algorithms: *sparsity* and *algorithmic stability*. Both properties are believed to lead to good generalization ability. We show that these two properties are fundamentally at odds with each other: a sparse algorithm cannot be stable and vice versa. Thus, one has to trade off sparsity and stability in designing a learning algorithm. In particular, our general result implies that $\ell_1$-regularized regression (Lasso) cannot be stable, while $\ell_2$-regularized regression is known to have strong stability properties and is therefore not sparse.
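As a rough illustration of the trade-off described in the abstract, the minimal sketch below (not from the paper) uses scikit-learn to measure how much the Lasso and ridge coefficient vectors move when a single training sample is removed, the kind of perturbation algorithmic stability is defined over. The nearly collinear feature pair, the penalty strengths, and the fit_coefs helper are illustrative choices, not the authors' construction.

```python
# Illustrative sketch only: leave-one-out sensitivity of Lasso vs. ridge.
# The data-generating setup and penalty strengths are arbitrary choices.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
n, d = 30, 10
X = rng.normal(size=(n, d))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=n)  # two nearly collinear features
y = X[:, 0] + 0.1 * rng.normal(size=n)

def fit_coefs(model, X, y):
    """Fit the model and return a copy of its coefficient vector."""
    return model.fit(X, y).coef_.copy()

for name, model in [("lasso", Lasso(alpha=0.05)), ("ridge", Ridge(alpha=0.05))]:
    full = fit_coefs(model, X, y)          # fit on all n samples
    loo = fit_coefs(model, X[1:], y[1:])   # fit with one sample removed
    print(f"{name:5s} max |coefficient change| = {np.max(np.abs(full - loo)):.4f}")
```

With correlated features the sparse ($\ell_1$) solution can switch which of the two collinear coordinates it selects when one sample is dropped, whereas the dense ($\ell_2$) solution typically shifts only slightly, which is the qualitative behavior the theorem formalizes.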
@article{xuSparseAlgorithmsAre2012,
  title = {Sparse Algorithms Are Not Stable: A No-Free-Lunch Theorem},
  author = {Xu, Huan and Caramanis, C. and Mannor, S.},
  year = {2012},
  month = jan,
  volume = {34},
  pages = {187--193},
  issn = {0162-8828},
  doi = {10.1109/tpami.2011.177},
  abstract = {We consider two desired properties of learning algorithms: *sparsity* and *algorithmic stability*. Both properties are believed to lead to good generalization ability. We show that these two properties are fundamentally at odds with each other: a sparse algorithm cannot be stable and vice versa. Thus, one has to trade off sparsity and stability in designing a learning algorithm. In particular, our general result implies that $\ell_1$-regularized regression (Lasso) cannot be stable, while $\ell_2$-regularized regression is known to have strong stability properties and is therefore not sparse.},
  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
  keywords = {machine-learning,mathematical-reasoning,modelling,no-free-lunch-theorem,stability-vs-sparsity},
  number = {1}
}
