File: Algorithms-using-Derivatives.html

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<!-- Copyright (C) 1996, 1997, 1998, 1999, 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014, 2015, 2016 The GSL Team.

Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.3 or
any later version published by the Free Software Foundation; with the
Invariant Sections being "GNU General Public License" and "Free Software
Needs Free Documentation", the Front-Cover text being "A GNU Manual",
and with the Back-Cover Text being (a) (see below). A copy of the
license is included in the section entitled "GNU Free Documentation
License".

(a) The Back-Cover Text is: "You have the freedom to copy and modify this
GNU Manual." -->
<!-- Created by GNU Texinfo 5.1, http://www.gnu.org/software/texinfo/ -->
<head>
<title>GNU Scientific Library &ndash; Reference Manual: Algorithms using Derivatives</title>

<meta name="description" content="GNU Scientific Library &ndash; Reference Manual: Algorithms using Derivatives">
<meta name="keywords" content="GNU Scientific Library &ndash; Reference Manual: Algorithms using Derivatives">
<meta name="resource-type" content="document">
<meta name="distribution" content="global">
<meta name="Generator" content="makeinfo">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<link href="index.html#Top" rel="start" title="Top">
<link href="Function-Index.html#Function-Index" rel="index" title="Function Index">
<link href="Multidimensional-Root_002dFinding.html#Multidimensional-Root_002dFinding" rel="up" title="Multidimensional Root-Finding">
<link href="Algorithms-without-Derivatives.html#Algorithms-without-Derivatives" rel="next" title="Algorithms without Derivatives">
<link href="Search-Stopping-Parameters-for-the-multidimensional-solver.html#Search-Stopping-Parameters-for-the-multidimensional-solver" rel="previous" title="Search Stopping Parameters for the multidimensional solver">
<style type="text/css">
<!--
a.summary-letter {text-decoration: none}
blockquote.smallquotation {font-size: smaller}
div.display {margin-left: 3.2em}
div.example {margin-left: 3.2em}
div.indentedblock {margin-left: 3.2em}
div.lisp {margin-left: 3.2em}
div.smalldisplay {margin-left: 3.2em}
div.smallexample {margin-left: 3.2em}
div.smallindentedblock {margin-left: 3.2em; font-size: smaller}
div.smalllisp {margin-left: 3.2em}
kbd {font-style:oblique}
pre.display {font-family: inherit}
pre.format {font-family: inherit}
pre.menu-comment {font-family: serif}
pre.menu-preformatted {font-family: serif}
pre.smalldisplay {font-family: inherit; font-size: smaller}
pre.smallexample {font-size: smaller}
pre.smallformat {font-family: inherit; font-size: smaller}
pre.smalllisp {font-size: smaller}
span.nocodebreak {white-space:nowrap}
span.nolinebreak {white-space:nowrap}
span.roman {font-family:serif; font-weight:normal}
span.sansserif {font-family:sans-serif; font-weight:normal}
ul.no-bullet {list-style: none}
-->
</style>


</head>

<body lang="en" bgcolor="#FFFFFF" text="#000000" link="#0000FF" vlink="#800080" alink="#FF0000">
<a name="Algorithms-using-Derivatives"></a>
<div class="header">
<p>
Next: <a href="Algorithms-without-Derivatives.html#Algorithms-without-Derivatives" accesskey="n" rel="next">Algorithms without Derivatives</a>, Previous: <a href="Search-Stopping-Parameters-for-the-multidimensional-solver.html#Search-Stopping-Parameters-for-the-multidimensional-solver" accesskey="p" rel="previous">Search Stopping Parameters for the multidimensional solver</a>, Up: <a href="Multidimensional-Root_002dFinding.html#Multidimensional-Root_002dFinding" accesskey="u" rel="up">Multidimensional Root-Finding</a> &nbsp; [<a href="Function-Index.html#Function-Index" title="Index" rel="index">Index</a>]</p>
</div>
<hr>
<a name="Algorithms-using-Derivatives-1"></a>
<h3 class="section">36.6 Algorithms using Derivatives</h3>

<p>The root finding algorithms described in this section make use of both
the function and its derivative.  They require an initial guess for the
location of the root, but there is no absolute guarantee of
convergence&mdash;the function must be suitable for this technique and the
initial guess must be sufficiently close to the root for it to work.
When these conditions are satisfied, convergence is quadratic.
</p>
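<p>All of the algorithms below are used through the <code>gsl_multiroot_fdfsolver</code>
interface described earlier in this chapter.  The following sketch shows a
typical iteration loop; the system <var>fdf</var>, its dimension <code>n</code> and the
initial guess <var>x</var> are assumed to have been prepared already, and the
tolerance and iteration limit are arbitrary choices for illustration,
</p>
<div class="example">
<pre class="example">#include &lt;gsl/gsl_errno.h&gt;
#include &lt;gsl/gsl_vector.h&gt;
#include &lt;gsl/gsl_multiroots.h&gt;

/* Iterate a derivative-based solver on the system 'fdf' of dimension n,
   starting from the guess in 'x'.  Returns the final status code. */
static int
solve_with_derivatives (gsl_multiroot_function_fdf * fdf,
                        gsl_vector * x, size_t n)
{
  const gsl_multiroot_fdfsolver_type * T
    = gsl_multiroot_fdfsolver_hybridsj;
  gsl_multiroot_fdfsolver * s = gsl_multiroot_fdfsolver_alloc (T, n);

  int status;
  size_t iter = 0;

  gsl_multiroot_fdfsolver_set (s, fdf, x);

  do
    {
      iter++;
      status = gsl_multiroot_fdfsolver_iterate (s);

      if (status)       /* the solver is stuck or an error occurred */
        break;

      /* stop when the residual vector f is sufficiently small */
      status = gsl_multiroot_test_residual (s-&gt;f, 1e-7);
    }
  while (status == GSL_CONTINUE &amp;&amp; iter &lt; 1000);

  gsl_multiroot_fdfsolver_free (s);
  return status;
}
</pre></div>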

<a name="index-HYBRID-algorithms-for-nonlinear-systems"></a>
<dl>
<dt><a name="index-gsl_005fmultiroot_005ffdfsolver_005fhybridsj"></a>Derivative Solver: <strong>gsl_multiroot_fdfsolver_hybridsj</strong></dt>
<dd><a name="index-HYBRIDSJ-algorithm"></a>
<a name="index-MINPACK_002c-minimization-algorithms"></a>
<p>This is a modified version of Powell&rsquo;s Hybrid method as implemented in
the <small>HYBRJ</small> algorithm in <small>MINPACK</small>.  Minpack was written by Jorge
J. Mor&eacute;, Burton S. Garbow and Kenneth E. Hillstrom.  The Hybrid
algorithm retains the fast convergence of Newton&rsquo;s method but will also
reduce the residual when Newton&rsquo;s method is unreliable. 
</p>
<p>The algorithm uses a generalized trust region to keep each step under
control.  In order to be accepted a proposed new position <em>x'</em> must
satisfy the condition <em>|D (x' - x)| &lt; \delta</em>, where <em>D</em> is a
diagonal scaling matrix and <em>\delta</em> is the size of the trust
region.  The components of <em>D</em> are computed internally, using the
column norms of the Jacobian to estimate the sensitivity of the residual
to each component of <em>x</em>.  This improves the behavior of the
algorithm for badly scaled functions.
</p>
<p>On each iteration the algorithm first determines the standard Newton
step by solving the system <em>J dx = - f</em>.  If this step falls inside
the trust region it is used as a trial step in the next stage.  If not,
the algorithm uses the linear combination of the Newton and gradient
directions which is predicted to minimize the norm of the function while
staying inside the trust region,
</p>
<div class="example">
<pre class="example">dx = - \alpha J^{-1} f(x) - \beta \nabla |f(x)|^2.
</pre></div>

<p>This combination of Newton and gradient directions is referred to as a
<em>dogleg step</em>.
</p>
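<p>The gradient term in this expression can be computed directly from the
Jacobian, since <em>\nabla |f(x)|^2 = 2 J^T f(x)</em>.  The following sketch
(illustrative only, not the solver&rsquo;s internal code) evaluates the
corresponding direction with the <small>GSL</small> <small>BLAS</small> interface,
</p>
<div class="example">
<pre class="example">#include &lt;gsl/gsl_matrix.h&gt;
#include &lt;gsl/gsl_vector.h&gt;
#include &lt;gsl/gsl_blas.h&gt;

/* g = J^T f, proportional to the gradient of |f(x)|^2 (the full
   gradient is 2 J^T f).  J is n-by-n, f and g are vectors of length n. */
static void
residual_gradient (const gsl_matrix * J, const gsl_vector * f,
                   gsl_vector * g)
{
  gsl_blas_dgemv (CblasTrans, 1.0, J, f, 0.0, g);
}
</pre></div>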
<p>The proposed step is now tested by evaluating the function at the
resulting point, <em>x'</em>.  If the step reduces the norm of the function
sufficiently then it is accepted and the size of the trust region is
increased.  If the proposed step fails to improve the solution then the
size of the trust region is decreased and another trial step is
computed.
</p>
<p>The speed of the algorithm is increased by computing the changes to the
Jacobian approximately, using a rank-1 update.  If two successive
attempts fail to reduce the residual then the full Jacobian is
recomputed.  The algorithm also monitors the progress of the solution
and returns an error if several steps fail to make any improvement,
</p>
<dl compact="compact">
<dt><code>GSL_ENOPROG</code></dt>
<dd><p>the iteration is not making any progress, preventing the algorithm from
continuing.
</p>
</dd>
<dt><code>GSL_ENOPROGJ</code></dt>
<dd><p>re-evaluations of the Jacobian indicate that the iteration is not
making any progress, preventing the algorithm from continuing.
</p></dd>
</dl>
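<p>These return codes can be tested explicitly in an iteration loop.  A minimal
sketch, assuming a solver <var>s</var> set up as in the loop shown at the start of
this section,
</p>
<div class="example">
<pre class="example">status = gsl_multiroot_fdfsolver_iterate (s);

if (status == GSL_ENOPROG || status == GSL_ENOPROGJ)
  {
    /* the solver cannot make further progress from this point;
       give up, or restart from a different initial guess */
    break;
  }
</pre></div>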

</dd></dl>

<dl>
<dt><a name="index-gsl_005fmultiroot_005ffdfsolver_005fhybridj"></a>Derivative Solver: <strong>gsl_multiroot_fdfsolver_hybridj</strong></dt>
<dd><a name="index-HYBRIDJ-algorithm"></a>
<p>This algorithm is an unscaled version of <code>hybridsj</code>.  The steps are
controlled by a spherical trust region <em>|x' - x| &lt; \delta</em>, instead
of a generalized region.  This can be useful if the generalized region
estimated by <code>hybridsj</code> is inappropriate.
</p></dd></dl>
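<p>Switching between the scaled and unscaled variants only changes the solver
type passed to <code>gsl_multiroot_fdfsolver_alloc</code>, for example,
</p>
<div class="example">
<pre class="example">const gsl_multiroot_fdfsolver_type * T = gsl_multiroot_fdfsolver_hybridj;
gsl_multiroot_fdfsolver * s = gsl_multiroot_fdfsolver_alloc (T, n);  /* n = system dimension */
</pre></div>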


<dl>
<dt><a name="index-gsl_005fmultiroot_005ffdfsolver_005fnewton"></a>Derivative Solver: <strong>gsl_multiroot_fdfsolver_newton</strong></dt>
<dd><a name="index-Newton_0027s-method-for-systems-of-nonlinear-equations"></a>

<p>Newton&rsquo;s Method is the standard root-polishing algorithm.  The algorithm
begins with an initial guess for the location of the solution.  On each
iteration a linear approximation to the function <em>F</em> is used to
estimate the step which will zero all the components of the residual.
The iteration is defined by the following sequence,
</p>
<div class="example">
<pre class="example">x -&gt; x' = x - J^{-1} f(x)
</pre></div>

<p>where the Jacobian matrix <em>J</em> is computed from the derivative
functions provided by <var>f</var>.  The step <em>dx = x' - x</em> is obtained by solving
the linear system,
</p>
<div class="example">
<pre class="example">J dx = - f(x)
</pre></div>

<p>using LU decomposition.  If the Jacobian matrix is singular, an error
code of <code>GSL_EDOM</code> is returned.
</p></dd></dl>
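<p>For illustration, a single step of this kind can be written directly with the
<small>GSL</small> linear algebra routines.  The following sketch is not the solver&rsquo;s
internal code; it assumes that <var>J</var> and <var>f</var> have already been evaluated
at the current point,
</p>
<div class="example">
<pre class="example">#include &lt;gsl/gsl_vector.h&gt;
#include &lt;gsl/gsl_matrix.h&gt;
#include &lt;gsl/gsl_permutation.h&gt;
#include &lt;gsl/gsl_linalg.h&gt;

/* Solve J dx = -f(x) by LU decomposition.  J is overwritten by its
   LU factors; dx receives the Newton step. */
static int
newton_step (gsl_matrix * J, const gsl_vector * f, gsl_vector * dx)
{
  int signum, status;
  gsl_permutation * p = gsl_permutation_alloc (f-&gt;size);
  gsl_vector * rhs = gsl_vector_alloc (f-&gt;size);

  gsl_vector_memcpy (rhs, f);
  gsl_vector_scale (rhs, -1.0);        /* right-hand side is -f(x) */

  gsl_linalg_LU_decomp (J, p, &amp;signum);
  status = gsl_linalg_LU_solve (J, p, rhs, dx);

  gsl_vector_free (rhs);
  gsl_permutation_free (p);
  return status;
}
</pre></div>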


<dl>
<dt><a name="index-gsl_005fmultiroot_005ffdfsolver_005fgnewton"></a>Derivative Solver: <strong>gsl_multiroot_fdfsolver_gnewton</strong></dt>
<dd><a name="index-Modified-Newton_0027s-method-for-nonlinear-systems"></a>
<a name="index-Newton-algorithm_002c-globally-convergent"></a>
<p>This is a modified version of Newton&rsquo;s method which attempts to improve
global convergence by requiring every step to reduce the Euclidean norm
of the residual, <em>|f(x)|</em>.  If the Newton step leads to an increase
in the norm then a reduced step of relative size,
</p>
<div class="example">
<pre class="example">t = (\sqrt(1 + 6 r) - 1) / (3 r)
</pre></div>

<p>is proposed, with <em>r</em> being the ratio of squared norms
<em>|f(x')|^2/|f(x)|^2</em>.  This procedure is repeated until a suitable step
size is found.
</p></dd></dl>
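<p>As a concrete illustration of this formula (the function name below is
hypothetical, not part of the <small>GSL</small> API),
</p>
<div class="example">
<pre class="example">#include &lt;math.h&gt;

/* step-length factor used when the full Newton step would increase
   the residual norm; r = |f(x')|^2 / |f(x)|^2 */
static double
gnewton_step_factor (double r)
{
  return (sqrt (1.0 + 6.0 * r) - 1.0) / (3.0 * r);
}
</pre></div>

<p>For <em>r = 1</em>, where the trial point leaves the residual norm unchanged, this
gives <em>t &asymp; 0.55</em>; larger values of <em>r</em> produce correspondingly
smaller trial steps.
</p>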


<hr>
<div class="header">
<p>
Next: <a href="Algorithms-without-Derivatives.html#Algorithms-without-Derivatives" accesskey="n" rel="next">Algorithms without Derivatives</a>, Previous: <a href="Search-Stopping-Parameters-for-the-multidimensional-solver.html#Search-Stopping-Parameters-for-the-multidimensional-solver" accesskey="p" rel="previous">Search Stopping Parameters for the multidimensional solver</a>, Up: <a href="Multidimensional-Root_002dFinding.html#Multidimensional-Root_002dFinding" accesskey="u" rel="up">Multidimensional Root-Finding</a> &nbsp; [<a href="Function-Index.html#Function-Index" title="Index" rel="index">Index</a>]</p>
</div>



</body>
</html>