wnconj(3)
NAME
wn_conj_direction_method, wn_force_conj_direction_stop,
wn_conj_gradient_method, wn_force_conj_gradient_stop, wn_barrier,
wn_dbarrier, wn_penalty, wn_dpenalty - conjugate directions function
minimization package
SYNOPSIS
     #include <wn/wnconj.h>

     void wn_conj_direction_method(&code,&val_min,
                                   vect,len,pfunction,max_iterations)
     int code;
     double val_min;
     double vect[];
     int len;
     double (*pfunction)(/* vect */);
     int max_iterations;

     void wn_force_conj_direction_stop(void)

     void wn_conj_gradient_method(&code,&val_min,
                                  vect,len,pfunction,pgradient,max_iterations)
     int code;
     double val_min;
     double vect[];
     int len;
     double (*pfunction)(/* double vect[] */);
     void (*pgradient)(/* double grad[],double vect[] */);
     int max_iterations;

     void wn_force_conj_gradient_stop(void)

     double wn_barrier(x)
     double x;

     double wn_dbarrier(x)
     double x;

     double wn_penalty(x)
     double x;

     double wn_dpenalty(x)
     double x;
DESCRIPTION
This package is useful for minimizing continuous, differentiable
functions of many variables. The routines assume that the function
to minimize is well approximated by a quadratic form with a
positive definite Hessian. For functions which are well approximated
this way, convergence rates are reasonably predictable. If all goes
well, these routines converge to a local minimum; this local minimum
is not necessarily a global minimum, or even a good local minimum.
If the function is not differentiable, the algorithms often get
stuck in very sub-optimal solutions, and no warning is given that
this has happened.
wn_conj_direction_method conducts a conjugate directions search
without the need for derivatives of the function to minimize. That
is, the function must be differentiable, but the derivatives need
not be known explicitly. Consequently, it is the easiest routine to
use, but it runs very slowly for medium-sized and large problems.
code indicates the status of the completed search. val_min is the
objective function value at the final solution. vect should be
loaded with a solution used to begin the search; upon return it
contains the final solution. len is the number of variables in
vect. pfunction is a pointer to the function to minimize.
max_iterations is the maximum number of iterations allowed; set to
WN_IHUGE for no limit.
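
For example, the following minimal sketch minimizes a simple
quadratic with no iteration limit. The objective, the starting
guess, and the callback prototype spelled out here are illustrative
readings of the SYNOPSIS above, not part of the package:

     #include <stdio.h>
     #include <wn/wnconj.h>

     #define LEN 3

     /* illustrative objective: f(v) = sum_i (v[i] - i)^2,
        minimized at v[i] == i */
     static double objective(double vect[])
     {
       int i;
       double sum = 0.0;

       for(i = 0; i < LEN; ++i)
         sum += (vect[i] - i) * (vect[i] - i);

       return sum;
     }

     int main(void)
     {
       int code;
       double val_min;
       double vect[LEN] = {10.0, -5.0, 3.0};   /* starting guess */

       wn_conj_direction_method(&code, &val_min, vect, LEN,
                                objective, WN_IHUGE);

       if(code == WN_SUCCESS)
         printf("minimum %g at (%g, %g, %g)\n",
                val_min, vect[0], vect[1], vect[2]);

       return 0;
     }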
wn_conj_gradient_method conducts a conjugate directions search
using derivatives of the function to minimize. Using derivatives
usually results in a dramatic speedup, but it makes programming
much more complicated. Furthermore, if the derivatives are in
error, no warning is given; the result is either very slow
convergence or convergence to an incorrect result. The arguments
mean the same as the arguments to wn_conj_direction_method. The
additional argument pgradient is a pointer to a function which
computes the gradient of the function to minimize; that is,

     grad[i] = d function / d vect[i]
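
Continuing the sketch above with the same illustrative objective, a
matching gradient callback might look like this:

     /* gradient of the illustrative objective above:
        d/d vect[i] of sum_j (v[j] - j)^2 is 2*(v[i] - i) */
     static void gradient(double grad[], double vect[])
     {
       int i;

       for(i = 0; i < LEN; ++i)
         grad[i] = 2.0 * (vect[i] - i);
     }

     /* called as in the previous example, with the gradient added */
     wn_conj_gradient_method(&code, &val_min, vect, LEN,
                             objective, gradient, WN_IHUGE);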
Because of the difficulty of ensuring that the derivatives are
correct, it is usually best to start by experimenting on a small
problem using wn_conj_direction_method, even if you ultimately
intend to use wn_conj_gradient_method. The solutions obtained from
wn_conj_gradient_method using derivatives should be similar or
identical to the solutions obtained from wn_conj_direction_method
not using derivatives. If this is not so, your derivatives are
probably faulty.
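
One way to catch faulty derivatives early is to compare each
component of the analytic gradient against a central finite
difference. The helper below is a sketch, not part of this package;
check_gradient, h, and tol are illustrative names:

     #include <math.h>
     #include <stdio.h>
     #include <stdlib.h>

     /* returns 1 if every component of the analytic gradient agrees
        with a central finite difference to within tol, 0 otherwise */
     static int check_gradient(double (*pfunction)(double vect[]),
                               void (*pgradient)(double grad[],
                                                 double vect[]),
                               double vect[], int len,
                               double h, double tol)
     {
       int i, ok = 1;
       double *grad = (double *)malloc(len * sizeof(double));

       pgradient(grad, vect);

       for(i = 0; i < len; ++i)
       {
         double save = vect[i], fplus, fminus, fd;

         vect[i] = save + h;
         fplus = pfunction(vect);
         vect[i] = save - h;
         fminus = pfunction(vect);
         vect[i] = save;

         fd = (fplus - fminus) / (2.0 * h);

         if(fabs(fd - grad[i]) > tol)
         {
           printf("component %d: analytic %g, finite difference %g\n",
                  i, grad[i], fd);
           ok = 0;
         }
       }

       free(grad);
       return ok;
     }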
wn_barrier, wn_dbarrier, wn_penalty, and wn_dpenalty are useful
for implementing barrier and penalty methods of non-linear
programming. They are designed to implement the constraint x >= 0.
wn_barrier(x) returns +infinity for x <= 0 and decreases in a
continuous and differentiable way for x > 0.

wn_dbarrier(x) returns the derivative of wn_barrier(x).

wn_penalty(x) returns 0 for x >= 0 and x^2 for x < 0. It is
continuous and differentiable for all x.

wn_dpenalty(x) returns the derivative of wn_penalty(x).
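
As a sketch of how these combine with the minimizers above, a
constraint vect[0] >= 0 can be folded into the illustrative
objective and gradient from the earlier examples. PENALTY_WEIGHT
and the wrapper functions below are illustrative, not part of the
package:

     #define PENALTY_WEIGHT 1000.0   /* illustrative weight */

     static double penalized_objective(double vect[])
     {
       return objective(vect) + PENALTY_WEIGHT * wn_penalty(vect[0]);
     }

     static void penalized_gradient(double grad[], double vect[])
     {
       gradient(grad, vect);
       grad[0] += PENALTY_WEIGHT * wn_dpenalty(vect[0]);
     }

The wrapped callbacks are then passed to wn_conj_gradient_method in
place of the originals; penalty methods typically re-solve with
progressively larger weights.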
RESOURCES
wn_conj_direction_method runs with
AVERAGE CASE:
time = len^2 * (len^2 + <time to evaluate function>)
stack memory = 1
dynamic memory = len*len
wn_conj_gradient_method runs with
AVERAGE CASE:
time = len * (2*<time to evaluate function> + <time to
       evaluate gradient>)
stack memory = 1
dynamic memory = len

Run time depends heavily on the problem being solved. If the
function is badly behaved, convergence can take much longer than
these times. If the function is not too badly behaved and the
Hessian has distinct eigenvalues, the times given here usually
apply. If the eigenvalues of the Hessian are not distinct, much
faster convergence times are possible.
DIAGNOSTICS
code == WN_SUCCESS means successful completion, optimum found

code == WN_SUBOPTIMAL means termination before optimum found

code == WN_UNBOUNDED means function is unbounded
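
A caller might branch on code as follows (the fragment assumes the
variables from the examples above; the messages are illustrative):

     switch(code)
     {
       case WN_SUCCESS:
         printf("optimum found: %g\n", val_min);
         break;
       case WN_SUBOPTIMAL:
         printf("stopped before optimum was found: %g\n", val_min);
         break;
       case WN_UNBOUNDED:
         printf("function appears to be unbounded\n");
         break;
     }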
AUTHOR
Will Naylor

WNLIB                        August 23, 1998