Escaping the Gravitational Pull of Softmax. Mei, J.; Xiao, C.; Dai, B.; Li, L.; Szepesvári, C.; and Schuurmans, D. In Advances in Neural Information Processing Systems (NeurIPS 2020), December 2020.
The softmax is the standard transformation used in machine learning to map real-valued vectors to categorical distributions. Unfortunately, this transform poses serious drawbacks for gradient descent (ascent) optimization. We reveal this difficulty by establishing two negative results: (1) optimizing any expectation with respect to the softmax must exhibit sensitivity to parameter initialization ("softmax gravity well"), and (2) optimizing log-probabilities under the softmax must exhibit slow convergence ("softmax damping"). Both findings are based on an analysis of convergence rates using the non-uniform Łojasiewicz (NŁ) inequalities. To circumvent these shortcomings we investigate an alternative transformation, the escort mapping, that demonstrates better optimization properties. The disadvantages of the softmax and the effectiveness of the escort transformation are further explained using the concept of the NŁ coefficient. In addition to proving bounds on convergence rates to firmly establish these results, we also provide experimental evidence for the superiority of the escort transformation.
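As a reading aid (not code from the paper), the two transformations contrasted in the abstract can be sketched as follows. The escort form |θ_i|^p / Σ_j |θ_j|^p follows the paper's definition; the parameter value p=2 and all names here are illustrative choices.

```python
import numpy as np

def softmax(theta):
    # Standard softmax: exp(theta_i) / sum_j exp(theta_j),
    # shifted by max(theta) for numerical stability.
    z = np.exp(theta - np.max(theta))
    return z / z.sum()

def escort(theta, p=2.0):
    # Escort mapping: |theta_i|^p / sum_j |theta_j|^p.
    z = np.abs(theta) ** p
    return z / z.sum()

theta = np.array([1.0, 2.0, 3.0])
print(softmax(theta))  # a categorical distribution summing to 1
print(escort(theta))   # [1, 4, 9] / 14, also summing to 1
```

Both maps produce valid categorical distributions, but they respond differently to changes in θ, which is what the paper's gravity-well and damping analyses quantify.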
@inproceedings{MXDLSzS20,
	abstract = {The softmax is the standard transformation used in machine learning to map real-valued vectors to categorical distributions. Unfortunately, this transform poses serious drawbacks for gradient descent (ascent) optimization. We reveal this difficulty by establishing two negative results: (1) optimizing any expectation with respect to the softmax must exhibit sensitivity to parameter initialization (``softmax gravity well''), and (2) optimizing log-probabilities under the softmax must exhibit slow convergence (``softmax damping''). Both findings are based on an analysis of convergence rates using the non-uniform \L{}ojasiewicz (N\L{}) inequalities. To circumvent these shortcomings we investigate an alternative transformation, the \emph{escort} mapping, that demonstrates better optimization properties. The disadvantages of the softmax and the effectiveness of the escort transformation are further explained using the concept of the N\L{} coefficient. In addition to proving bounds on convergence rates to firmly establish these results, we also provide experimental evidence for the superiority of the escort transformation.},
	author = {Mei, J. and Xiao, C. and Dai, B. and Li, L. and Szepesv{\'a}ri, Cs. and Schuurmans, D.},
	crossref = {NeurIPS2020oral},
	title = {Escaping the Gravitational Pull of Softmax},
	url_paper = {NeurIPS2020_pg.pdf},
	url_link = {https://papers.nips.cc/paper/2020/hash/f1cf2a082126bf02de0b307778ce73a7-Abstract.html},
	year = {2020},
	month = {December},
}