Jentzen, A. & von Wurstemberger, P. Lower error bounds for the stochastic gradient descent optimization algorithm: Sharp convergence rates for slowly and fast decaying learning rates. arXiv:1803.08600, March 2018.
@article{jentzen_lower_2018,
	title = {Lower error bounds for the stochastic gradient descent optimization algorithm: {Sharp} convergence rates for slowly and fast decaying learning rates},
	url = {http://arxiv.org/pdf/1803.08600},
	journal = {arXiv:1803.08600},
	author = {Jentzen, Arnulf and Wurstemberger, Philippe von},
	month = mar,
	year = {2018},
	note = {arXiv: 1803.08600},
	keywords = {math.NA, math.PR, stat.ML},
}