/* -*- mode: c++; tab-width: 4; indent-tabs-mode: nil; c-basic-offset: 4 -*- */
/*
 Copyright (C) 2001, 2002, 2003 Nicolas Di Césaré

 This file is part of QuantLib, a free-software/open-source library
 for financial quantitative analysts and developers - http://quantlib.org/

 QuantLib is free software: you can redistribute it and/or modify it
 under the terms of the QuantLib license. You should have received a
 copy of the license along with this program; if not, please email
 <quantlib-dev@lists.sf.net>. The license is also available online at
 <http://quantlib.org/license.shtml>.

 This program is distributed in the hope that it will be useful, but WITHOUT
 ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
 FOR A PARTICULAR PURPOSE. See the license for more details.
*/

#include <ql/math/optimization/armijo.hpp>
#include <ql/math/optimization/method.hpp>
#include <ql/math/optimization/problem.hpp>
namespace QuantLib {
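
    /* Backtracking (Armijo) rule, as implemented below.

       Writing q(t) = f(x + t*d) for the cost function along the search
       direction d, q0 = q(0), and qpt_ >= 0 for minus the directional
       derivative at t = 0 (the squared gradient norm is used as a fallback
       when no gradient has been cached yet), the initial step t_ini is
       repeatedly multiplied by beta_ in (0,1) until

           q(t) - q0 <= -alpha_ * t * qpt_        (sufficient decrease)

       holds while the previous, larger trial step t/beta_ still violates it,
       so that the accepted step is not shrunk more than necessary.  The
       search also gives up, setting succeed_ to false, once the maximum
       number of iterations allowed by the end criteria is reached. */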
    Real ArmijoLineSearch::operator()(Problem& P,
                                      EndCriteria::Type& ecType,
                                      const EndCriteria& endCriteria,
                                      const Real t_ini) {
        //OptimizationMethod& method = P.method();
        Constraint& constraint = P.constraint();
        succeed_ = true;
        bool maxIter = false;
        Real qtold, t = t_ini;
        Size loopNumber = 0;

        Real q0 = P.functionValue();
        Real qp0 = P.gradientNormValue();

        qt_ = q0;
        qpt_ = (gradient_.empty()) ? qp0 : -DotProduct(gradient_, searchDirection_);
        // Initialize gradient
        gradient_ = Array(P.currentValue().size());
        // Compute new point
        xtd_ = P.currentValue();
        t = update(xtd_, searchDirection_, t, constraint);
        // Compute function value at the new point
        qt_ = P.value(xtd_);

        // Enter the loop if the Armijo criterion is not satisfied
        if ((qt_ - q0) > -alpha_*t*qpt_) {
            do {
                loopNumber++;
                // Decrease the step
                t *= beta_;
                // Store the old function value
                qtold = qt_;
                // Compute the new trial point
                xtd_ = P.currentValue();
                t = update(xtd_, searchDirection_, t, constraint);
                // Compute the function value and gradient at the new point
                qt_ = P.value(xtd_);
                P.gradient(gradient_, xtd_);
                // Check whether the iteration limit has been reached
                maxIter = endCriteria.checkMaxIterations(loopNumber, ecType);
            } while (
                (((qt_ - q0) > (-alpha_ * t * qpt_)) ||
                 ((qtold - q0) <= (-alpha_ * t * qpt_ / beta_))) &&
                (!maxIter));
        }

        if (maxIter)
            succeed_ = false;

        // Compute the gradient at the new point
        P.gradient(gradient_, xtd_);
        // and its squared norm
        qpt_ = DotProduct(gradient_, gradient_);

        // Return the new step value
        return t;
    }

}
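
/* A minimal usage sketch: ArmijoLineSearch is normally not called directly
   but handed to a line-search-based optimizer.  The snippet below assumes
   the standard QuantLib optimization headers and a user-defined cost
   function (the hypothetical `MyCostFunction`, deriving from CostFunction);
   exact smart-pointer types may differ between QuantLib versions.

       #include <ql/math/optimization/conjugategradient.hpp>
       #include <ql/math/optimization/constraint.hpp>
       #include <ql/math/optimization/endcriteria.hpp>

       using namespace QuantLib;

       MyCostFunction costFunction;   // hypothetical user-defined cost function
       NoConstraint constraint;
       Array initialGuess(2, 0.0);

       ext::shared_ptr<LineSearch> lineSearch(
           new ArmijoLineSearch(1e-8,    // eps, forwarded to LineSearch
                                0.05,    // alpha: sufficient-decrease factor
                                0.65));  // beta: step-shrinking factor
       ConjugateGradient optimizer(lineSearch);

       Problem problem(costFunction, constraint, initialGuess);
       EndCriteria endCriteria(1000,    // max iterations
                               100,     // max stationary-state iterations
                               1e-8,    // root epsilon
                               1e-8,    // function epsilon
                               1e-8);   // gradient-norm epsilon
       EndCriteria::Type ecType = optimizer.minimize(problem, endCriteria);
       Array x = problem.currentValue();
*/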