
Update doc.
trivialfis committed Feb 28, 2023
1 parent d553e7c commit f13bde8
4 changes: 4 additions & 0 deletions doc/tutorials/learning_to_rank.rst
@@ -100,6 +100,8 @@ Pairwise
--------
The `LambdaMART` algorithm scales the logistic loss with learning-to-rank metrics like ``NDCG`` in the hope of incorporating ranking information into the loss function. The ``rank:pairwise`` loss is the original version of the pairwise loss, also known as the `RankNet loss` `[7] <#references>`__ or the `pairwise logistic loss`. Unlike ``rank:map`` and ``rank:ndcg``, no scaling is applied (:math:`|\Delta Z_{ij}| = 1`).
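
As a minimal sketch of how this scaling enters the gradient, assuming the standard RankNet notation (not defined on this page), with :math:`s_i` the model score for document :math:`i` and :math:`\sigma` the sigmoid shape parameter, the lambda gradient for a pair where :math:`i` is more relevant than :math:`j` can be written as:

.. math::

   \lambda_{ij} = \frac{-\sigma}{1 + e^{\sigma (s_i - s_j)}} \cdot |\Delta Z_{ij}|

Here ``rank:pairwise`` keeps :math:`|\Delta Z_{ij}| = 1`, while ``rank:ndcg`` and ``rank:map`` set it to the change in the corresponding metric when documents :math:`i` and :math:`j` swap positions.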

Whether scaling with an LTR metric is actually more effective is still up for debate; `[8] <#references>`__ provides a theoretical foundation for general lambda loss functions and some insights into the framework.
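
As a minimal sketch, assuming the scikit-learn interface of the ``xgboost`` Python package with synthetic data and hyperparameters chosen purely for illustration, both variants can be selected through the ``objective`` parameter:

.. code-block:: python

    import numpy as np
    import xgboost as xgb

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))        # 100 documents, 5 features
    y = rng.integers(0, 4, size=100)     # graded relevance labels in {0, 1, 2, 3}
    qid = np.repeat(np.arange(10), 10)   # 10 queries x 10 documents, sorted by query

    # rank:pairwise - the unscaled pairwise logistic loss, |Delta Z_ij| = 1.
    pairwise = xgb.XGBRanker(objective="rank:pairwise", n_estimators=10)
    pairwise.fit(X, y, qid=qid)

    # rank:ndcg - the same pairwise loss, scaled by the change in NDCG
    # incurred when a pair of documents is swapped.
    lambdamart = xgb.XGBRanker(objective="rank:ndcg", n_estimators=10)
    lambdamart.fit(X, y, qid=qid)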

******************
Constructing Pairs
******************
@@ -153,6 +155,8 @@ References

[7] Chris Burges, Tal Shaked, Erin Renshaw, Ari Lazier, Matt Deeds, Nicole Hamilton, and Greg Hullender. 2005. "`Learning to rank using gradient descent`_". In Proceedings of the 22nd international conference on Machine learning (ICML '05). Association for Computing Machinery, New York, NY, USA, 89–96.

[8] Xuanhui Wang, Cheng Li, Nadav Golbandi, Mike Bendersky, and Marc Najork. 2018. "`The LambdaLoss Framework for Ranking Metric Optimization`_". In Proceedings of the 27th ACM International Conference on Information and Knowledge Management (CIKM '18).

.. _`Learning to Rank for Information Retrieval`: https://doi.org/10.1561/1500000016
.. _`Learning to rank with nonsmooth cost functions`: https://dl.acm.org/doi/10.5555/2976456.2976481
.. _`Adapting boosting for information retrieval measures`: https://doi.org/10.1007/s10791-009-9112-1
