Comparing Comparators in Generalization Bounds. Hellström, F. & Guedj, B. In Proceedings of the 27th International Conference on Artificial Intelligence and Statistics (AISTATS), 2024.
We derive generic information-theoretic and PAC-Bayesian generalization bounds involving an arbitrary convex comparator function, which measures the discrepancy between the training loss and the population loss. The bounds hold under the assumption that the cumulant-generating function (CGF) of the comparator is upper-bounded by the corresponding CGF within a family of bounding distributions. We show that the tightest possible bound is obtained with the comparator being the convex conjugate of the CGF of the bounding distribution, also known as the Cramér function. This conclusion applies more broadly to generalization bounds with a similar structure. This confirms the near-optimality of known bounds for bounded and sub-Gaussian losses and leads to novel bounds under other bounding distributions.
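For orientation, a minimal sketch of the central object in standard notation (the notation below is generic and not taken from the paper's statement): if a bounding distribution has cumulant-generating function

\[ \psi(\lambda) = \log \mathbb{E}\big[e^{\lambda X}\big], \]

then its Cramér function is the convex conjugate

\[ \psi^*(t) = \sup_{\lambda}\,\big(\lambda t - \psi(\lambda)\big). \]

For instance, a Gaussian bounding distribution with variance \(\sigma^2\) has \(\psi(\lambda) = \lambda^2 \sigma^2 / 2\), whose conjugate is \(\psi^*(t) = t^2 / (2\sigma^2)\), the quadratic comparator behind familiar sub-Gaussian generalization bounds.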
