Hardness of High-Dimensional Linear Classification. Munteanu, A., Omlor, S., & Phillips, J. M. March 2026. arXiv:2603.19061 [cs]
We establish new exponential-in-dimension lower bounds for the Maximum Halfspace Discrepancy problem, which models linear classification. Both are fundamental problems in computational geometry and machine learning in their exact and approximate forms. However, only O(n^d) and respectively Õ(1/ε^d) upper bounds are known and complemented by polynomial lower bounds that do not support the exponential-in-dimension dependence. We close this gap up to polylogarithmic terms by reduction from widely-believed hardness conjectures for Affine Degeneracy testing and k-Sum problems. Our reductions yield matching lower bounds of Ω̃(n^d) and respectively Ω̃(1/ε^d) based on Affine Degeneracy testing, and Ω̃(n^{d/2}) and respectively Ω̃(1/ε^{d/2}) conditioned on k-Sum. The first bound also holds unconditionally if the computational model is restricted to make sidedness queries, which corresponds to a widely spread setting implemented and optimized in many contemporary algorithms and computing paradigms.
@misc{munteanu_hardness_2026,
title = {Hardness of {High}-{Dimensional} {Linear} {Classification}},
url = {http://arxiv.org/abs/2603.19061},
doi = {10.48550/arXiv.2603.19061},
abstract = {We establish new exponential-in-dimension lower bounds for the Maximum Halfspace Discrepancy problem, which models linear classification. Both are fundamental problems in computational geometry and machine learning in their exact and approximate forms. However, only O(n^d) and respectively Õ(1/ε^d) upper bounds are known and complemented by polynomial lower bounds that do not support the exponential-in-dimension dependence. We close this gap up to polylogarithmic terms by reduction from widely-believed hardness conjectures for Affine Degeneracy testing and k-Sum problems. Our reductions yield matching lower bounds of Ω̃(n^d) and respectively Ω̃(1/ε^d) based on Affine Degeneracy testing, and Ω̃(n^{d/2}) and respectively Ω̃(1/ε^{d/2}) conditioned on k-Sum. The first bound also holds unconditionally if the computational model is restricted to make sidedness queries, which corresponds to a widely spread setting implemented and optimized in many contemporary algorithms and computing paradigms.},
language = {en},
urldate = {2026-04-06},
publisher = {arXiv},
author = {Munteanu, Alexander and Omlor, Simon and Phillips, Jeff M.},
month = mar,
year = {2026},
note = {arXiv:2603.19061 [cs]},
keywords = {Computer Science - Computational Geometry, Computer Science - Data Structures and Algorithms, Computer Science - Machine Learning, Statistics - Machine Learning},
}