Conjugate gradient methods form a class of iterative algorithms that are highly effective for solving large-scale unconstrained optimisation problems. They achieve efficiency by constructing ...
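As a minimal sketch of the idea, the linear conjugate gradient method minimises the quadratic f(x) = ½xᵀAx − bᵀx (equivalently, solves Ax = b for symmetric positive-definite A) by building a sequence of mutually A-conjugate search directions. The function name and the small 2×2 example below are illustrative, not from the source.

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=None):
    """Minimise f(x) = 0.5 x^T A x - b^T x for SPD A via linear CG."""
    x = x0.astype(float).copy()
    r = b - A @ x          # residual = negative gradient of f
    d = r.copy()           # first direction is steepest descent
    if max_iter is None:
        max_iter = len(b)  # CG terminates in at most n steps in exact arithmetic
    for _ in range(max_iter):
        rr = r @ r
        if np.sqrt(rr) < tol:
            break
        Ad = A @ d
        alpha = rr / (d @ Ad)   # exact step length along d
        x += alpha * d
        r -= alpha * Ad
        beta = (r @ r) / rr     # makes the next direction A-conjugate to d
        d = r + beta * d
    return x

# Illustrative SPD system (assumed example)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b, np.zeros(2))
```

Because each new direction is A-conjugate to all previous ones, no progress made along earlier directions is undone, which is what gives the method its efficiency on large problems.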
In this work, we present a robust optimization method that is suited for unconstrained problems with a nonconvex cost function as well as for problems based on simulations, such as large partial ...
The BFGS method for unconstrained optimization, using a variety of line searches, including backtracking, is shown to be globally and superlinearly convergent on uniformly convex problems. The ...
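A minimal sketch of BFGS with a backtracking (Armijo) line search, the combination described above, is given below. The update formula for the inverse-Hessian approximation is the standard one; the test function and all names are illustrative assumptions, not from the source.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=100):
    """BFGS with backtracking line search -- a minimal sketch."""
    n = len(x0)
    x = x0.astype(float).copy()
    H = np.eye(n)                  # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                 # quasi-Newton search direction
        # Backtracking: shrink t until the Armijo sufficient-decrease condition holds
        t, c, shrink = 1.0, 1e-4, 0.5
        while f(x + t * p) > f(x) + c * t * (g @ p):
            t *= shrink
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:             # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Uniformly convex example (assumed): f(x) = x^T A x with SPD A, minimiser at 0
A = np.array([[3.0, 0.5], [0.5, 2.0]])
f = lambda x: x @ A @ x
grad = lambda x: 2.0 * A @ x
x_min = bfgs(f, grad, np.array([5.0, -3.0]))
```

On a uniformly convex problem like this one, the curvature condition sᵀy > 0 holds at every accepted step, which keeps H positive definite and underpins the global and superlinear convergence the snippet refers to.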
In addition, the book includes an introduction to artificial neural networks, convex optimization, multi-objective optimization and applications of optimization in machine learning. About the Purdue ...
See "Nonlinear Optimization and Related Subroutines" for a listing of all NLP subroutines. See Chapter 11, "Nonlinear Optimization Examples," for a description of the inputs to and outputs of all NLP ...