On Derivatives

Ceres Solver, like all gradient-based optimization algorithms, depends on being able to evaluate the objective function and its derivatives at arbitrary points in its domain. Indeed, defining the objective function and its Jacobian is the principal task that the user is required to perform when solving an optimization problem using Ceres Solver. The correct and efficient computation of the Jacobian is the key to good performance.

Ceres Solver offers considerable flexibility in how the user can provide derivatives to the solver. She can use:

  1. Analytic Derivatives: The user figures out the derivatives herself, by hand or using a tool like Maple or Mathematica, and implements them in a CostFunction.

  2. Numeric Derivatives: Ceres numerically computes the derivative using finite differences.

  3. Automatic Derivatives: Ceres automatically computes the analytic derivative using C++ templates and operator overloading, as sketched below.
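
To make the third option concrete, here is a minimal sketch of the automatic-derivatives route using the toy residual f(x) = 10 - x. The functor name, block sizes, and initial value are illustrative; AutoDiffCostFunction, Problem, and Solve are the actual Ceres types and functions. The only thing the user writes is a functor templated on the scalar type:

    #include "ceres/ceres.h"

    // Residual f(x) = 10 - x. Because operator() is templated on the scalar
    // type T, Ceres can evaluate it with its Jet type and recover the exact
    // derivative df/dx = -1 via operator overloading.
    struct CostFunctor {
      template <typename T>
      bool operator()(const T* const x, T* residual) const {
        residual[0] = 10.0 - x[0];
        return true;
      }
    };

    int main() {
      double x = 0.5;  // Initial value; the minimizer should drive x to 10.
      ceres::Problem problem;
      // 1 residual, 1 parameter; Ceres differentiates CostFunctor for us.
      problem.AddResidualBlock(
          new ceres::AutoDiffCostFunction<CostFunctor, 1, 1>(new CostFunctor),
          nullptr, &x);
      ceres::Solver::Options options;
      ceres::Solver::Summary summary;
      ceres::Solve(options, &problem, &summary);
      return 0;
    }

An analytic version of the same residual would instead subclass CostFunction (or SizedCostFunction) and fill in the Jacobian entry df/dx = -1 by hand inside Evaluate.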

Which of these three approaches (alone or in combination) should be used depends on the situation and the tradeoffs the user is willing to make. Unfortunately, numerical optimization textbooks rarely discuss these issues in detail and the user is left to her own devices.

The aim of this article is to fill this gap and describe each of these three approaches in the context of Ceres Solver with sufficient detail that the user can make an informed choice.

For the impatient amongst you, here is some high-level advice:

  1. Use Automatic Derivatives.

  2. In some cases it may be worth using Analytic Derivatives.

  3. Avoid Numeric Derivatives. Use them as a measure of last resort, mostly to interface with external libraries, as sketched below.
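
To illustrate that last point, here is a sketch of wrapping a black-box function behind NumericDiffCostFunction. Here black_box_residual and ExternalResidual are hypothetical stand-ins for an external library call; NumericDiffCostFunction and ceres::CENTRAL are the actual Ceres API:

    #include "ceres/ceres.h"

    // Hypothetical stand-in for an external library function whose source we
    // cannot template on the scalar type, ruling out automatic derivatives.
    double black_box_residual(double x) { return 10.0 - x; }

    // A plain double-only functor that forwards to the external call.
    struct ExternalResidual {
      bool operator()(const double* const x, double* residual) const {
        residual[0] = black_box_residual(x[0]);
        return true;
      }
    };

    // Central differences are more accurate than forward differences, at the
    // cost of an extra function evaluation per parameter.
    ceres::CostFunction* cost_function =
        new ceres::NumericDiffCostFunction<ExternalResidual, ceres::CENTRAL,
                                           1, 1>(new ExternalResidual);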

For the rest, read on.