From f13bde844cfac56cb5a82b858689978d5d20fe34 Mon Sep 17 00:00:00 2001
From: Jiaming Yuan
Date: Tue, 28 Feb 2023 21:50:14 +0800
Subject: [PATCH] Update doc.

---
 doc/tutorials/learning_to_rank.rst | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/doc/tutorials/learning_to_rank.rst b/doc/tutorials/learning_to_rank.rst
index 445ea425aba8..2a6f6d820d00 100644
--- a/doc/tutorials/learning_to_rank.rst
+++ b/doc/tutorials/learning_to_rank.rst
@@ -100,6 +100,8 @@ Pairwise
 --------
 The `LambdaMART` algorithm scales the logistic loss with learning to rank metrics like ``NDCG`` in the hope of including ranking information into the loss function. The ``rank:pairwise`` loss is the original version of the pairwise loss, also known as the `RankNet loss` `[7] <#references>`__ or the `pairwise logistic loss`. Unlike ``rank:map`` and ``rank:ndcg``, no scaling is applied (:math:`|\Delta Z_{ij}| = 1`).
 
+Whether scaling with an LTR metric is actually more effective is still up for debate; `[8] <#references>`__ provides a theoretical foundation for general lambda loss functions along with some insights into the framework.
+
 ******************
 Constructing Pairs
 ******************
@@ -153,6 +155,8 @@ References
 
 [7] Chris Burges, Tal Shaked, Erin Renshaw, Ari Lazier, Matt Deeds, Nicole Hamilton, and Greg Hullender. 2005. "`Learning to rank using gradient descent`_". In Proceedings of the 22nd international conference on Machine learning (ICML '05). Association for Computing Machinery, New York, NY, USA, 89–96.
 
+[8] Xuanhui Wang, Cheng Li, Nadav Golbandi, Mike Bendersky, and Marc Najork. 2018. "`The LambdaLoss Framework for Ranking Metric Optimization`_". In Proceedings of the 27th ACM International Conference on Information and Knowledge Management (CIKM '18).
+
 .. _`Learning to Rank for Information Retrieval`: https://doi.org/10.1561/1500000016
 .. _`Learning to rank with nonsmooth cost functions`: https://dl.acm.org/doi/10.5555/2976456.2976481
 .. _`Adapting boosting for information retrieval measures`: https://doi.org/10.1007/s10791-009-9112-1
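
For context, a minimal sketch of how the two objectives discussed in the changed paragraph are selected through XGBoost's scikit-learn interface; the synthetic data, query grouping, and ``n_estimators`` value are illustrative assumptions, not recommendations from this patch.

.. code-block:: python

    # Illustrative sketch: comparing the unscaled pairwise objective with the
    # NDCG-scaled one. Data is synthetic and chosen only to make the code run.
    import numpy as np
    import xgboost as xgb

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 10))            # 100 documents, 10 features
    y = rng.integers(0, 4, size=100)          # graded relevance labels in [0, 3]
    qid = np.sort(rng.integers(0, 10, 100))   # query ids; groups must be contiguous

    # RankNet-style pairwise logistic loss, no metric scaling (|delta Z_ij| = 1).
    pairwise = xgb.XGBRanker(objective="rank:pairwise", n_estimators=32)
    pairwise.fit(X, y, qid=qid)

    # LambdaMART-style loss, scaled with the NDCG metric.
    ndcg = xgb.XGBRanker(objective="rank:ndcg", n_estimators=32)
    ndcg.fit(X, y, qid=qid)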