{ "cells": [ { "cell_type": "markdown", "id": "48970ca0", "metadata": {}, "source": [ "# NonlinearConjugateGradientOptimizer Class Documentation\n", "\n", "*Disclaimer: This documentation was generated by AI and may require human revision for accuracy and completeness.*\n", "\n", "## Overview\n", "\n", "The `NonlinearConjugateGradientOptimizer` class in GTSAM is an implementation of the nonlinear conjugate gradient method for optimizing nonlinear functions. This optimizer is particularly useful for solving large-scale optimization problems where the Hessian matrix is not easily computed or stored. The conjugate gradient method is an iterative algorithm that seeks to find the minimum of a function by following a series of conjugate directions.\n", "\n", "## Key Features\n", "\n", "- **Optimization Method**: Implements the nonlinear conjugate gradient method, which is an extension of the linear conjugate gradient method to nonlinear optimization problems.\n", "- **Efficiency**: Suitable for large-scale problems due to its iterative nature and reduced memory requirements compared to methods that require the Hessian matrix.\n", "- **Flexibility**: Can be used with various line search strategies and conjugate gradient update formulas.\n", "\n", "## Main Methods\n", "\n", "### Constructor\n", "\n", "- **NonlinearConjugateGradientOptimizer**: Initializes the optimizer with a given nonlinear factor graph and initial values. The user can specify optimization parameters, including the choice of line search method and conjugate gradient update formula.\n", "\n", "### Optimization\n", "\n", "- **optimize**: Executes the optimization process. This method iteratively updates the solution by computing search directions and performing line searches to minimize the objective function along these directions.\n", "\n", "### Accessors\n", "\n", "- **error**: Returns the current error value of the objective function. This is useful for monitoring the convergence of the optimization process.\n", "- **values**: Retrieves the current estimate of the optimized variables. This allows users to access the solution at any point during the optimization.\n", "\n", "## Mathematical Background\n", "\n", "The nonlinear conjugate gradient method seeks to minimize a nonlinear function $f(x)$ by iteratively updating the solution $x_k$ according to:\n", "\n", "$$ x_{k+1} = x_k + \\alpha_k p_k $$\n", "\n", "where $p_k$ is the search direction and $\\alpha_k$ is the step size determined by a line search. 
"\n", "## Usage Notes\n", "\n", "- The `NonlinearConjugateGradientOptimizer` is most effective when the problem is large and computing or storing the Hessian is impractical.\n", "- Choose the line search method and conjugate gradient update formula to match the characteristics of the optimization problem; no single choice works best for every objective.\n", "- Monitoring the error and values during optimization provides insight into convergence behavior and can help diagnose issues such as stalled progress.\n", "\n", "This class provides an efficient way to solve large nonlinear optimization problems using the conjugate gradient method.\n",
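"\n", "## Example\n", "\n", "As a rough usage illustration, here is a short Python sketch of the workflow described above. It is a sketch under assumptions: it presumes a recent (4.x) GTSAM build whose Python wrapper exposes `NonlinearConjugateGradientOptimizer` and `NonlinearOptimizerParams`, and the toy factor graph (a prior plus a single between-factor on `Pose2` variables) is invented purely for illustration:\n", "\n",
"```python\n",
"import gtsam\n",
"\n",
"# Toy problem: two Pose2 variables, a prior on the first,\n",
"# and one odometry-style constraint between them.\n",
"graph = gtsam.NonlinearFactorGraph()\n",
"noise = gtsam.noiseModel.Isotropic.Sigma(3, 0.1)\n",
"graph.add(gtsam.PriorFactorPose2(1, gtsam.Pose2(0, 0, 0), noise))\n",
"graph.add(gtsam.BetweenFactorPose2(1, 2, gtsam.Pose2(1, 0, 0), noise))\n",
"\n",
"# Deliberately perturbed initial estimate.\n",
"initial = gtsam.Values()\n",
"initial.insert(1, gtsam.Pose2(0.1, -0.1, 0.05))\n",
"initial.insert(2, gtsam.Pose2(0.9, 0.2, -0.05))\n",
"\n",
"params = gtsam.NonlinearOptimizerParams()\n",
"optimizer = gtsam.NonlinearConjugateGradientOptimizer(graph, initial, params)\n",
"result = optimizer.optimize()\n",
"\n",
"print('initial error:', graph.error(initial))\n",
"print('final error:  ', graph.error(result))\n",
"```\n",
"\n", "For a small, well-conditioned problem like this toy example, a second-order optimizer such as `gtsam.LevenbergMarquardtOptimizer` would typically converge in fewer iterations; the conjugate gradient variant pays off when the problem is too large for that to be practical." ] } ], "metadata": {}, "nbformat": 4, "nbformat_minor": 5 }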