File: scipy/optimize/__init__.py

"""
=====================================================
Optimization and root finding (:mod:`scipy.optimize`)
=====================================================

.. currentmodule:: scipy.optimize

Optimization
============

Local Optimization
------------------

.. autosummary::
   :toctree: generated/

   minimize - Unified interface for minimizers of multivariate functions
   minimize_scalar - Unified interface for minimizers of univariate functions
   OptimizeResult - The optimization result returned by some optimizers
   OptimizeWarning - Warning issued when the optimization encounters problems

The `minimize` function supports the following methods:

.. toctree::

   optimize.minimize-neldermead
   optimize.minimize-powell
   optimize.minimize-cg
   optimize.minimize-bfgs
   optimize.minimize-newtoncg
   optimize.minimize-lbfgsb
   optimize.minimize-tnc
   optimize.minimize-cobyla
   optimize.minimize-slsqp
   optimize.minimize-dogleg
   optimize.minimize-trustncg

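For example, a minimal sketch of the unified interface (the Rosenbrock test
objective and the starting point are illustrative)::

    from scipy.optimize import minimize, rosen, rosen_der

    # Quasi-Newton (BFGS) minimization with an analytic gradient.
    res = minimize(rosen, [1.3, 0.7, 0.8], jac=rosen_der, method='BFGS')
    print(res.x, res.success)
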
The `minimize_scalar` function supports the following methods:

.. toctree::

   optimize.minimize_scalar-brent
   optimize.minimize_scalar-bounded
   optimize.minimize_scalar-golden

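A minimal sketch of `minimize_scalar` (the quadratic objective and bounds
are illustrative; the 'bounded' method requires the ``bounds`` keyword)::

    from scipy.optimize import minimize_scalar

    f = lambda x: (x - 2.0) ** 2
    res = minimize_scalar(f, method='brent')
    res_b = minimize_scalar(f, bounds=(0, 10), method='bounded')
    print(res.x, res_b.x)
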
The specific optimization method interfaces below in this subsection are
not recommended for use in new scripts; all of these methods are accessible
via a newer, more consistent interface provided by the functions above.

General-purpose multivariate methods:

.. autosummary::
   :toctree: generated/

   fmin - Nelder-Mead Simplex algorithm
   fmin_powell - Powell's (modified) conjugate direction method
   fmin_cg - Non-linear (Polak-Ribiere) conjugate gradient algorithm
   fmin_bfgs - Quasi-Newton method (Broyden-Fletcher-Goldfarb-Shanno)
   fmin_ncg - Line-search Newton Conjugate Gradient

Constrained multivariate methods:

.. autosummary::
   :toctree: generated/

   fmin_l_bfgs_b - Zhu, Byrd, and Nocedal's constrained optimizer
   fmin_tnc - Truncated Newton code
   fmin_cobyla - Constrained optimization by linear approximation
   fmin_slsqp - Minimization using sequential least-squares programming
   differential_evolution - Stochastic minimization using differential evolution

Univariate (scalar) minimization methods:

.. autosummary::
   :toctree: generated/

   fminbound - Bounded minimization of a scalar function
   brent - 1-D function minimization using Brent's method
   golden - 1-D function minimization using the golden-section method

Equation (Local) Minimizers
---------------------------

.. autosummary::
   :toctree: generated/
   
   leastsq - Minimize the sum of squares of M equations in N unknowns
   least_squares - Feature-rich least-squares minimization
   nnls - Linear least-squares problem with non-negativity constraint
   lsq_linear - Linear least-squares problem with bound constraints

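A minimal sketch of `least_squares`, which minimizes the sum of squares of
a user-supplied residual vector (the two-equation system is illustrative)::

    import numpy as np
    from scipy.optimize import least_squares

    def residuals(x):
        # Residuals of the illustrative system x0**2 + x1 = 1, x0 = x1.
        return np.array([x[0] ** 2 + x[1] - 1.0, x[0] - x[1]])

    res = least_squares(residuals, x0=[0.5, 0.5])
    print(res.x)
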
Global Optimization
-------------------

.. autosummary::
   :toctree: generated/

   basinhopping - Basinhopping stochastic optimizer
   brute - Brute force searching optimizer
   differential_evolution - Stochastic minimization using differential evolution

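A minimal sketch of `differential_evolution` (the box bounds are
illustrative; fixing ``seed`` makes the stochastic search reproducible)::

    from scipy.optimize import differential_evolution, rosen

    bounds = [(-5, 5), (-5, 5)]
    result = differential_evolution(rosen, bounds, seed=1)
    print(result.x, result.fun)
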
Rosenbrock function
-------------------

.. autosummary::
   :toctree: generated/

   rosen - The Rosenbrock function.
   rosen_der - The derivative of the Rosenbrock function.
   rosen_hess - The Hessian matrix of the Rosenbrock function.
   rosen_hess_prod - Product of the Rosenbrock Hessian with a vector.

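These are standard test objects for exercising optimizers; the global
minimum is at ``x = [1, ..., 1]``, where the function value is 0::

    import numpy as np
    from scipy.optimize import rosen, rosen_der

    x = np.ones(5)
    print(rosen(x))      # 0.0 at the global minimum
    print(rosen_der(x))  # the gradient vanishes there as well
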
Fitting
=======

.. autosummary::
   :toctree: generated/

   curve_fit - Fit a curve to a set of points

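A minimal sketch of `curve_fit` (the exponential model and the synthetic,
noisy data are illustrative)::

    import numpy as np
    from scipy.optimize import curve_fit

    def model(x, a, b):
        return a * np.exp(-b * x)

    xdata = np.linspace(0, 4, 50)
    ydata = model(xdata, 2.5, 1.3) + 0.05 * np.random.randn(xdata.size)
    popt, pcov = curve_fit(model, xdata, ydata)  # fitted (a, b) and covariance
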
Root finding
============

Scalar functions
----------------
.. autosummary::
   :toctree: generated/

   brentq - Brent's method with inverse quadratic interpolation
   brenth - Brent's method, modified by Harris, with hyperbolic extrapolation
   ridder - Ridder's method
   bisect - Bisection method
   newton - Newton's method (falls back to the secant method if no derivative is given)

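The bracketing solvers (`brentq`, `brenth`, `ridder`, `bisect`) require an
interval over which the function changes sign. A minimal sketch with
`brentq` (the function and bracket are illustrative)::

    from scipy.optimize import brentq

    root = brentq(lambda x: x ** 2 - 2.0, 0.0, 2.0)  # converges to sqrt(2)
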
Fixed point finding:

.. autosummary::
   :toctree: generated/

   fixed_point - Single-variable fixed-point solver

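A minimal sketch of `fixed_point`, solving the illustrative equation
``x = cos(x)`` (the starting guess is arbitrary)::

    import numpy as np
    from scipy.optimize import fixed_point

    xstar = fixed_point(np.cos, 1.0)
    print(xstar)  # approximately 0.739
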
Multidimensional
----------------

General nonlinear solvers:

.. autosummary::
   :toctree: generated/

   root - Unified interface for nonlinear solvers of multivariate functions
   fsolve - Non-linear multi-variable equation solver
   broyden1 - Broyden's first method
   broyden2 - Broyden's second method

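A minimal sketch of the unified `root` interface (the two-equation system
below is illustrative)::

    from scipy.optimize import root

    def fun(x):
        return [x[0] + 0.5 * (x[0] - x[1]) ** 3 - 1.0,
                0.5 * (x[1] - x[0]) ** 3 + x[1]]

    sol = root(fun, [0.0, 0.0], method='hybr')
    print(sol.x, sol.success)
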
The `root` function supports the following methods:

.. toctree::

   optimize.root-hybr
   optimize.root-lm
   optimize.root-broyden1
   optimize.root-broyden2
   optimize.root-anderson
   optimize.root-linearmixing
   optimize.root-diagbroyden
   optimize.root-excitingmixing
   optimize.root-krylov
   optimize.root-dfsane

Large-scale nonlinear solvers:

.. autosummary::
   :toctree: generated/

   newton_krylov
   anderson

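These solvers need only the residual function and approximate the action of
the Jacobian internally, so they scale to large systems. A minimal sketch
with `newton_krylov` (the small residual below is purely illustrative; the
method is aimed at much larger problems)::

    import numpy as np
    from scipy.optimize import newton_krylov

    def residual(x):
        return np.cos(x) + x[::-1] - np.array([1.0, 2.0, 3.0, 4.0])

    x = newton_krylov(residual, np.ones(4))
    print(x)
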
Simple iterations:

.. autosummary::
   :toctree: generated/

   excitingmixing
   linearmixing
   diagbroyden

Additional information on the nonlinear solvers is available in
:mod:`scipy.optimize.nonlin`.

Linear Programming
==================

Simplex Algorithm:

.. autosummary::
   :toctree: generated/

   linprog - Linear programming using the simplex algorithm
   linprog_verbose_callback - Sample callback function for linprog

The `linprog` function supports the following methods:

.. toctree::

   optimize.linprog-simplex

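A minimal sketch of `linprog`, which minimizes ``dot(c, x)`` subject to
``A_ub * x <= b_ub`` and variable bounds (the data below are illustrative)::

    from scipy.optimize import linprog

    c = [-1.0, 4.0]                      # minimize -x0 + 4*x1
    A_ub = [[-3.0, 1.0], [1.0, 2.0]]
    b_ub = [6.0, 4.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None), (-3, None)])
    print(res.x, res.fun)
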
Assignment problems:

.. autosummary::
   :toctree: generated/

   linear_sum_assignment - Solve the linear sum assignment problem

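A minimal sketch of `linear_sum_assignment` (the cost matrix is
illustrative; rows are workers, columns are jobs)::

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    cost = np.array([[4, 1, 3], [2, 0, 5], [3, 2, 2]])
    row_ind, col_ind = linear_sum_assignment(cost)
    print(cost[row_ind, col_ind].sum())  # minimal total cost: 5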

Utilities
=========

.. autosummary::
   :toctree: generated/

   approx_fprime - Approximate the gradient of a scalar function
   bracket - Bracket a minimum, given two starting points
   check_grad - Check the supplied derivative using finite differences
   line_search - Return a step that satisfies the strong Wolfe conditions

   show_options - Show documentation for additional options of optimization solvers
   LbfgsInvHessProduct - Linear operator for L-BFGS approximate inverse Hessian

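For example, `check_grad` can verify an analytic gradient against a
finite-difference approximation (the test point is illustrative)::

    import numpy as np
    from scipy.optimize import check_grad, rosen, rosen_der

    err = check_grad(rosen, rosen_der, np.array([1.3, 0.7, 0.8]))
    print(err)  # a small value means the gradient is consistent
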
"""

from __future__ import division, print_function, absolute_import

from .optimize import *
from ._minimize import *
from ._root import *
from .minpack import *
from .zeros import *
from .lbfgsb import fmin_l_bfgs_b, LbfgsInvHessProduct
from .tnc import fmin_tnc
from .cobyla import fmin_cobyla
from .nonlin import *
from .slsqp import fmin_slsqp
from .nnls import nnls
from ._basinhopping import basinhopping
from ._linprog import linprog, linprog_verbose_callback
from ._hungarian import linear_sum_assignment
from ._differentialevolution import differential_evolution
from ._lsq import least_squares, lsq_linear


__all__ = [s for s in dir() if not s.startswith('_')]
# Expose numpy's test runner: ``scipy.optimize.test()`` runs the test suite
# and ``scipy.optimize.bench()`` runs the benchmarks.
from numpy.testing import Tester
test = Tester().test
bench = Tester().bench