This repository contains some small examples of how to solve standard NLP learning problems using automatic differentiation to compute the derivatives of the objective function.
The included examples are:
- A linear-chain conditional random field
- A log-bilinear language model (Section 4)
For ease of reading, the code is not optimized, but automatic differentiation can be quite fast, and it can be used in large-scale learning problems.
To build the code, you need:
- A C++11 compiler
- The Adept (Automatic Differentiation using Expression Templates) library
- The Eigen linear algebra library (for the LBL example)
Here is a short crash course on automatic differentiation, in the form of a few pointers:
- autodiff.org - a portal with many pointers to automatic differentiation tools and literature
- Fast Reverse-Mode Automatic Differentiation using Expression Templates in C++ - the paper describing the Adept library
- Recipes for Adjoint Code Construction - a readable discussion of how to construct adjoint-mode (backward) automatic differentiation code by hand