Matches in DBpedia 2014 for { <http://dbpedia.org/resource/Nonlinear_conjugate_gradient_method> ?p ?o. }
Showing items 1 to 31 of 31, with 100 items per page.
- Nonlinear_conjugate_gradient_method abstract "In numerical optimization, the nonlinear conjugate gradient method generalizes the conjugate gradient method to nonlinear optimization. For a quadratic function $f(x) = \|Ax - b\|^2$, the minimum of $f$ is obtained when the gradient is 0: $\nabla_x f = 2A^{\top}(Ax - b) = 0$. Whereas linear conjugate gradient seeks a solution to the linear equation $A^{\top} A x = A^{\top} b$, the nonlinear conjugate gradient method is generally used to find the local minimum of a nonlinear function using its gradient $\nabla_x f$ alone. It works when the function is approximately quadratic near the minimum, which is the case when the function is twice differentiable at the minimum. Given a function $f(x)$ of $N$ variables to minimize, its gradient $\nabla_x f$ indicates the direction of maximum increase. One simply starts in the opposite (steepest descent) direction $\Delta x_0 = -\nabla_x f(x_0)$, with an adjustable step length $\alpha$, and performs a line search in this direction until it reaches the minimum of $f$: $\alpha_0 := \arg\min_\alpha f(x_0 + \alpha \Delta x_0)$, $x_1 = x_0 + \alpha_0 \Delta x_0$. After this first iteration in the steepest direction $\Delta x_0$, the following steps constitute one iteration of moving along a subsequent conjugate direction $s_n$, where $s_0 = \Delta x_0$: (1) calculate the steepest direction $\Delta x_n = -\nabla_x f(x_n)$; (2) compute $\beta_n$ according to one of the formulas below; (3) update the conjugate direction $s_n = \Delta x_n + \beta_n s_{n-1}$; (4) perform a line search $\alpha_n = \arg\min_\alpha f(x_n + \alpha s_n)$; (5) update the position $x_{n+1} = x_n + \alpha_n s_n$. With a pure quadratic function the minimum is reached within $N$ iterations (excepting roundoff error), but a non-quadratic function will make slower progress. Subsequent search directions lose conjugacy, requiring the search direction to be reset to the steepest descent direction at least every $N$ iterations, or sooner if progress stops. However, resetting every iteration turns the method into steepest descent. The algorithm stops when it finds the minimum, determined when no progress is made after a direction reset (i.e. in the steepest descent direction), or when some tolerance criterion is reached. Within a linear approximation, the parameters $\alpha$ and $\beta$ are the same as in the linear conjugate gradient method, but they have been obtained with line searches. The conjugate gradient method can follow narrow (ill-conditioned) valleys, where the steepest descent method slows down and follows a criss-cross pattern. Three of the best known formulas for $\beta_n$ are titled Fletcher–Reeves (FR), Polak–Ribière (PR), and Hestenes–Stiefel (HS) after their developers. They are given by the following formulas: Fletcher–Reeves: $\beta_n^{FR} = \frac{\Delta x_n^{\top} \Delta x_n}{\Delta x_{n-1}^{\top} \Delta x_{n-1}}$; Polak–Ribière: $\beta_n^{PR} = \frac{\Delta x_n^{\top} (\Delta x_n - \Delta x_{n-1})}{\Delta x_{n-1}^{\top} \Delta x_{n-1}}$; Hestenes–Stiefel: $\beta_n^{HS} = -\frac{\Delta x_n^{\top} (\Delta x_n - \Delta x_{n-1})}{s_{n-1}^{\top} (\Delta x_n - \Delta x_{n-1})}$. These formulas are equivalent for a quadratic function, but for nonlinear optimization the preferred formula is a matter of heuristics or taste. A popular choice is $\beta = \max\{0, \beta^{PR}\}$, which provides a direction reset automatically. Newton-based methods (the Newton–Raphson algorithm and quasi-Newton methods such as the BFGS method) tend to converge in fewer iterations, although each iteration typically requires more computation than a conjugate gradient iteration, as Newton-like methods require computing the Hessian (matrix of second derivatives) in addition to the gradient. Quasi-Newton methods also require more memory to operate (see also the limited-memory L-BFGS method).".
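As a concrete illustration of the iteration described in the abstract above, here is a minimal Python sketch of the nonlinear conjugate gradient method using the Polak–Ribière coefficient with the $\max\{0, \beta^{PR}\}$ reset. The backtracking line search, the test problem (the 2-D Rosenbrock function), and all identifiers are illustrative assumptions, not part of the DBpedia record.

```python
import numpy as np

def backtracking_line_search(f, g_x, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking line search, a simple stand-in for an exact line search."""
    fx = f(x)
    # Shrink the step until the sufficient-decrease (Armijo) condition holds.
    while f(x + alpha * d) > fx + c * alpha * (g_x @ d):
        alpha *= rho
        if alpha < 1e-12:
            break
    return alpha

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Nonlinear CG with the Polak-Ribiere beta, clipped at zero for automatic resets."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:     # tolerance-based stopping criterion
            break
        if g @ d >= 0:                  # progress stalled: reset to steepest descent
            d = -g
        alpha = backtracking_line_search(f, g, x, d)
        x = x + alpha * d               # update the position
        g_new = grad(x)
        # Polak-Ribiere coefficient; max(0, .) provides the automatic direction reset.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d           # update the conjugate direction
        g = g_new
    return x

if __name__ == "__main__":
    # Illustrative test problem: the 2-D Rosenbrock function, minimized at (1, 1).
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(nonlinear_cg(f, grad, [-1.2, 1.0]))
```

The Rosenbrock function is a narrow, ill-conditioned valley of exactly the kind the abstract mentions, which is why it is a common sanity check for conjugate gradient variants; an exact or Wolfe-condition line search would typically converge in fewer iterations than this simple Armijo backtracking.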
- Nonlinear_conjugate_gradient_method wikiPageExternalLink painless-conjugate-gradient.pdf.
- Nonlinear_conjugate_gradient_method wikiPageExternalLink bookcpdf.php.
- Nonlinear_conjugate_gradient_method wikiPageID "8980593".
- Nonlinear_conjugate_gradient_method wikiPageRevisionID "546277939".
- Nonlinear_conjugate_gradient_method hasPhotoCollection Nonlinear_conjugate_gradient_method.
- Nonlinear_conjugate_gradient_method subject Category:Gradient_methods.
- Nonlinear_conjugate_gradient_method subject Category:Optimization_algorithms_and_methods.
- Nonlinear_conjugate_gradient_method type Ability105616246.
- Nonlinear_conjugate_gradient_method type Abstraction100002137.
- Nonlinear_conjugate_gradient_method type Act100030358.
- Nonlinear_conjugate_gradient_method type Activity100407535.
- Nonlinear_conjugate_gradient_method type Algorithm105847438.
- Nonlinear_conjugate_gradient_method type Cognition100023271.
- Nonlinear_conjugate_gradient_method type Event100029378.
- Nonlinear_conjugate_gradient_method type GradientMethods.
- Nonlinear_conjugate_gradient_method type Know-how105616786.
- Nonlinear_conjugate_gradient_method type Method105660268.
- Nonlinear_conjugate_gradient_method type OptimizationAlgorithmsAndMethods.
- Nonlinear_conjugate_gradient_method type Procedure101023820.
- Nonlinear_conjugate_gradient_method type PsychologicalFeature100023100.
- Nonlinear_conjugate_gradient_method type Rule105846932.
- Nonlinear_conjugate_gradient_method type YagoPermanentlyLocatedEntity.
- Nonlinear_conjugate_gradient_method comment "In numerical optimization, the nonlinear conjugate gradient method generalizes the conjugate gradient method to nonlinear optimization. For a quadratic function $f(x) = \|Ax - b\|^2$, the minimum of $f$ is obtained when the gradient is 0: $\nabla_x f = 2A^{\top}(Ax - b) = 0$. Whereas linear conjugate gradient seeks a solution to the linear equation $A^{\top} A x = A^{\top} b$, the nonlinear conjugate gradient method is generally used to find the local minimum of a nonlinear function using its gradient $\nabla_x f$ alone.".
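For reference, the step connecting the zero-gradient condition in the comment above to the linear system solved by linear conjugate gradient can be written out as follows (a standard derivation, not part of the DBpedia record):

```latex
\[
  f(x) = \lVert Ax - b \rVert^{2}
  \;\Longrightarrow\;
  \nabla_{x} f = 2A^{\top}(Ax - b) = 0
  \;\Longleftrightarrow\;
  A^{\top} A\, x = A^{\top} b .
\]
```

That is, minimizing the quadratic $f$ is equivalent to solving the normal equations, which is the linear system the linear conjugate gradient method addresses.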
- Nonlinear_conjugate_gradient_method label "Nonlinear conjugate gradient method".
- Nonlinear_conjugate_gradient_method sameAs m.027s3nl.
- Nonlinear_conjugate_gradient_method sameAs Q17086453.
- Nonlinear_conjugate_gradient_method sameAs Nonlinear_conjugate_gradient_method.
- Nonlinear_conjugate_gradient_method wasDerivedFrom Nonlinear_conjugate_gradient_method?oldid=546277939.
- Nonlinear_conjugate_gradient_method isPrimaryTopicOf Nonlinear_conjugate_gradient_method.