Minimization of the rank loss, or, equivalently, maximization of the AUC in bipartite ranking, calls for minimizing the number of disagreements between pairs of instances. Since the complexity of this problem is inherently quadratic in the number of training examples, it is tempting to ask how much is actually lost by instead minimizing, as a surrogate, a simple univariate loss function of the kind used by standard classification methods. In this paper, we first note that minimization of the 0/1 loss is not an option, as it may yield an arbitrarily high rank loss. We show, however, that better results can be achieved by means of a weighted (cost-sensitive) version of the 0/1 loss. Yet, the real gain is obtained through margin-based loss functions, for which we are able to derive proper bounds, not only for the rank risk but, more importantly, also for the rank regret. The paper concludes with an experimental study in which we address specific questions raised by our theoretical analysis.
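As a minimal illustration of the quadratic pairwise criterion mentioned above, the sketch below computes the empirical rank loss (the fraction of positive-negative pairs on which the scores disagree with the labels, with ties counted as one half) and its complement, the AUC. The function name, the label convention y in {0, 1}, and the toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rank_loss(y, s):
    """Empirical rank loss: fraction of positive-negative pairs on which
    the scores s disagree with the binary labels y (ties count as 1/2).
    The double loop makes the quadratic pairwise complexity explicit."""
    pos = s[y == 1]
    neg = s[y == 0]
    errors = 0.0
    for sp in pos:
        for sn in neg:
            if sp < sn:
                errors += 1.0      # discordant pair
            elif sp == sn:
                errors += 0.5      # tie counts as half an error
    return errors / (len(pos) * len(neg))

# Toy example: AUC is the complement of the rank loss, AUC = 1 - rank_loss.
y = np.array([1, 1, 0, 0, 1])
s = np.array([0.9, 0.4, 0.6, 0.2, 0.8])
print(rank_loss(y, s))        # 1/6: one discordant pair out of six
print(1.0 - rank_loss(y, s))  # AUC = 5/6
```

The explicit double loop is only for exposition; the same quantity can be computed in O(n log n) time by sorting the scores. The question studied in the paper is a different one, namely what is lost when the pairwise objective itself is replaced by a univariate surrogate loss during learning.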

Kotlowski, W., Dembczynski, K., & Huellermeier, E. (2011). Bipartite Ranking through Minimization of Univariate Loss. In L. Getoor & T. Scheffer (Eds.), Proceedings of the 28th International Conference on Machine Learning (ICML 2011). Omnipress.