AutoHOOT: Automatic High-Order Optimization for Tensors. Ma, L., Ye, J., & Solomonik, E. arXiv:2005.04540 [cs, math], May 2020.
High-order optimization methods, including Newton's method and its variants as well as alternating minimization methods, dominate the optimization algorithms for tensor decompositions and tensor networks. These tensor methods are used for data analysis and simulation of quantum systems. In this work, we introduce AutoHOOT, the first automatic differentiation (AD) framework targeted at high-order optimization for tensor computations. AutoHOOT takes input tensor computation expressions and generates optimized derivative expressions. In particular, AutoHOOT contains a new explicit Jacobian / Hessian expression generation kernel whose outputs maintain the input tensors' granularity and are easy to optimize. The expressions are then optimized using both traditional compiler optimization techniques and tensor-algebra-specific transformations. Experimental results show that AutoHOOT achieves competitive performance for both tensor decomposition and tensor network applications compared to existing AD software and other tensor computation libraries with manually written kernels, on both CPU and GPU architectures. The scalability of the generated kernels matches that of other well-known high-order numerical algorithms, so they can be executed efficiently on distributed parallel systems.
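To make the workflow the abstract describes concrete, below is a minimal sketch in JAX (not AutoHOOT's own API, which this entry does not document) of the pattern such a framework automates: writing a tensor computation expression, here a CP decomposition least-squares loss, and letting AD derive the gradient expressions that a first- or higher-order optimization loop consumes.

# Minimal JAX sketch (an assumption for illustration, not AutoHOOT's API)
# of the pattern the paper automates: derive gradients of a tensor
# computation expression for use in an optimization loop.
import jax
import jax.numpy as jnp

def cp_loss(factors, T):
    # Reconstruct a rank-R third-order tensor from factor matrices
    # A (I x R), B (J x R), C (K x R) and measure the squared error.
    A, B, C = factors
    T_hat = jnp.einsum('ir,jr,kr->ijk', A, B, C)
    return jnp.sum((T - T_hat) ** 2)

key = jax.random.PRNGKey(0)
kA, kB, kC, kT = jax.random.split(key, 4)
I, J, K, R = 8, 9, 10, 3
T = jax.random.normal(kT, (I, J, K))
factors = [jax.random.normal(k, (n, R))
           for k, n in zip((kA, kB, kC), (I, J, K))]

# AD yields the derivative expressions; AutoHOOT's contribution is
# generating and optimizing such expressions so that they preserve the
# input tensors' granularity (e.g., staying at the contraction level).
for _ in range(100):
    grads = jax.grad(cp_loss)(factors, T)
    # Plain gradient descent here; the paper targets higher-order schemes
    # (Newton's method, alternating minimization) built on the same derivatives.
    factors = [F - 1e-3 * g for F, g in zip(factors, grads)]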
@article{ma_autohoot_2020,
	title = {{AutoHOOT}: {Automatic} {High}-{Order} {Optimization} for {Tensors}},
	shorttitle = {{AutoHOOT}},
	url = {http://arxiv.org/abs/2005.04540},
	abstract = {High-order optimization methods, including Newton's method and its variants as well as alternating minimization methods, dominate the optimization algorithms for tensor decompositions and tensor networks. These tensor methods are used for data analysis and simulation of quantum systems. In this work, we introduce AutoHOOT, the first automatic differentiation (AD) framework targeted at high-order optimization for tensor computations. AutoHOOT takes input tensor computation expressions and generates optimized derivative expressions. In particular, AutoHOOT contains a new explicit Jacobian / Hessian expression generation kernel whose outputs maintain the input tensors' granularity and are easy to optimize. The expressions are then optimized using both traditional compiler optimization techniques and tensor-algebra-specific transformations. Experimental results show that AutoHOOT achieves competitive performance for both tensor decomposition and tensor network applications compared to existing AD software and other tensor computation libraries with manually written kernels, on both CPU and GPU architectures. The scalability of the generated kernels matches that of other well-known high-order numerical algorithms, so they can be executed efficiently on distributed parallel systems.},
	urldate = {2020-05-16},
	journal = {arXiv:2005.04540 [cs, math]},
	author = {Ma, Linjian and Ye, Jiayu and Solomonik, Edgar},
	month = may,
	year = {2020},
	note = {arXiv: 2005.04540},
	keywords = {Computer Science - Mathematical Software, Mathematics - Numerical Analysis, uses sympy},
}
