Line Search

Line search is a subroutine for computing the step size in descent-direction-based optimization methods. Suppose we want to minimize some function \(f : \mathbb{R}^n \rightarrow \mathbb{R}\) and, at the current iterate \(\mathbf{x} \in \mathbb{R}^n\), we have a descent direction \(\mathbf{d}\). Finding the optimal step size \(\alpha\) amounts to the following 1D optimization problem: \[ \min_\alpha f(\mathbf{x}+\alpha \mathbf{d})\]
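As a small worked example (my own illustration, not taken from the reference), suppose \(f\) is the quadratic \(f(\mathbf{x}) = \frac{1}{2}\mathbf{x}^\top A \mathbf{x} - \mathbf{b}^\top \mathbf{x}\) with \(A\) symmetric positive definite. Setting the derivative of \(f(\mathbf{x}+\alpha\mathbf{d})\) with respect to \(\alpha\) to zero gives a closed-form step size: \[ \frac{\mathrm{d}}{\mathrm{d}\alpha} f(\mathbf{x}+\alpha\mathbf{d}) = \mathbf{d}^\top (A\mathbf{x} - \mathbf{b}) + \alpha\, \mathbf{d}^\top A \mathbf{d} = 0 \quad \Rightarrow \quad \alpha^* = -\frac{\nabla f(\mathbf{x})^\top \mathbf{d}}{\mathbf{d}^\top A \mathbf{d}}, \] which is positive whenever \(\mathbf{d}\) is a descent direction, since then \(\nabla f(\mathbf{x})^\top \mathbf{d} < 0\).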

According to Kochenderfer and Wheeler (2019), the Brent-Dekker method is a commonly used algorithm for such 1D optimization problems.
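As a minimal sketch of an exact line search (my own, not code from the reference), the 1D restriction \(\phi(\alpha) = f(\mathbf{x}+\alpha\mathbf{d})\) can be handed to SciPy's minimize_scalar, whose 'brent' method is a Brent-style derivative-free 1D minimizer; the function name exact_line_search and the example quadratic below are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize_scalar

def exact_line_search(f, x, d):
    # 1D restriction of f along the ray x + alpha * d
    phi = lambda alpha: f(x + alpha * d)
    # Brent-style, derivative-free minimization of phi
    res = minimize_scalar(phi, bracket=(0.0, 1.0), method="brent")
    return res.x

# Illustrative quadratic f(x) = 0.5 x^T A x - b^T x, with d the negative gradient at x0
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
x0 = np.zeros(2)
d = b - A @ x0                        # -grad f(x0), a descent direction
alpha = exact_line_search(f, x0, d)   # close to the closed-form alpha* above (here 1.0)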

However, performing an exact line search at every iteration is often too computationally expensive, so approximate line search methods are commonly employed instead. These methods settle for a value of \(\alpha\) that satisfies conditions guaranteeing sufficient progress, most commonly the Wolfe conditions: a sufficient decrease (Armijo) condition \( f(\mathbf{x}+\alpha\mathbf{d}) \le f(\mathbf{x}) + c_1 \alpha \nabla f(\mathbf{x})^\top \mathbf{d} \) and a curvature condition \( \nabla f(\mathbf{x}+\alpha\mathbf{d})^\top \mathbf{d} \ge c_2 \nabla f(\mathbf{x})^\top \mathbf{d} \), where \(0 < c_1 < c_2 < 1\).
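A rough sketch of one such method (again my own illustration, not code from the reference): backtracking line search starts from an initial step and shrinks it until the sufficient decrease condition holds. The parameter defaults alpha0, rho, and c1 below are conventional but arbitrary choices.

import numpy as np

def backtracking_line_search(f, grad_f, x, d, alpha0=1.0, rho=0.5, c1=1e-4, max_iter=50):
    # Return a step size satisfying the sufficient decrease (Armijo) condition
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ d            # directional derivative; negative for a descent direction
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c1 * alpha * slope:
            break                    # sufficient decrease achieved
        alpha *= rho                 # otherwise shrink the step and try again
    return alpha

# Example with the quadratic f(x) = 0.5 ||x||^2, where -grad f(x) is a descent direction
f = lambda x: 0.5 * x @ x
grad_f = lambda x: x
x0 = np.array([2.0, -1.0])
d = -grad_f(x0)
alpha = backtracking_line_search(f, grad_f, x0, d)   # alpha0 = 1.0 already satisfies the condition here

Note that this sketch only enforces the sufficient decrease condition; enforcing both Wolfe conditions requires a slightly more involved bracketing procedure.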

Thoughts

References

Kochenderfer, Mykel J., and Tim A. Wheeler. 2019. Algorithms for Optimization. The MIT Press.

Author: Nazaal

Created: 2022-04-04 Mon 23:39
