Boosting is a general method to convert a weak learner (which generates hypotheses that are just slightly better than random) into a strong learner (which generates hypotheses that are much better than random). Recently, Arunachalam and Maity [5] gave the first quantum improvement for boosting, by combining Freund and Schapire’s AdaBoost algorithm with a quantum algorithm for approximate counting. Their booster is faster than classical boosting as a function of the VC-dimension of the weak learner’s hypothesis class, but worse as a function of the quality of the weak learner. In this paper we give a substantially faster and simpler quantum boosting algorithm, based on Servedio’s SmoothBoost algorithm [22].
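To make the boosting idea concrete, below is a minimal classical sketch in the AdaBoost style: a weak learner is called repeatedly on a re-weighted sample, and the resulting hypotheses are combined by a weighted majority vote. This is only an illustration of classical boosting, not the quantum SmoothBoost-based algorithm of the paper; the decision-stump weak learner, the toy dataset, and all parameter names are assumptions chosen for exposition.

```python
# Illustrative classical AdaBoost-style sketch (not the paper's quantum algorithm).
# The weak learner, dataset, and parameter names are assumptions for exposition.
import numpy as np

def decision_stump(X, y, w):
    """Hypothetical weak learner: best single-feature threshold under weights w."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= thr, sign, -sign)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    _, j, thr, sign = best
    return lambda Z: np.where(Z[:, j] <= thr, sign, -sign)

def boost(X, y, T=10):
    """Boost the weak learner for T rounds; return the combined (strong) hypothesis."""
    n = len(y)
    w = np.full(n, 1.0 / n)                     # distribution over examples
    hyps, alphas = [], []
    for _ in range(T):
        h = decision_stump(X, y, w)
        pred = h(X)
        err = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this round's hypothesis
        w *= np.exp(-alpha * y * pred)          # re-weight: focus on current mistakes
        w /= w.sum()
        hyps.append(h)
        alphas.append(alpha)
    return lambda Z: np.sign(sum(a * h(Z) for a, h in zip(alphas, hyps)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.sign(X[:, 0] + X[:, 1])              # simple labels in {-1, +1}
    y[y == 0] = 1
    H = boost(X, y, T=20)
    print("training accuracy:", np.mean(H(X) == y))
```

The exponential re-weighting step is specific to AdaBoost; SmoothBoost, on which the paper builds, instead keeps the intermediate distributions close to uniform, which is the property the quantum booster exploits.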

DOI: doi.org/10.4230/LIPIcs.ESA.2023.64
Series: Leibniz International Proceedings in Informatics (LIPIcs)
Conference: 31st Annual European Symposium on Algorithms (ESA 2023)
Project: Quantum Software Consortium, Quantum algorithms and applications
Research group: Algorithms and Complexity

Izdebski, A., & de Wolf, R. (2023). Improved quantum boosting. In Annual European Symposium on Algorithms (pp. 64:1–64:16). doi:10.4230/LIPIcs.ESA.2023.64