esys.downunder.splitminimizers Package¶
Classes¶
class esys.downunder.splitminimizers.AbstractMinimizer(J=None, m_tol=0.0001, J_tol=None, imax=300)¶
Bases: object
Base class for function minimization methods.
__init__(J=None, m_tol=0.0001, J_tol=None, imax=300)¶
Initializes a new minimizer for a given cost function.
Parameters:
- J (CostFunction) – the cost function to be minimized
- m_tol (float) – terminate iterations when the relative change of the level set function is less than or equal to m_tol
getCostFunction()¶
Returns the cost function to be minimized.
Return type: CostFunction
getOptions()¶
Returns a dictionary of minimizer-specific options.
getResult()¶
Returns the result of the minimization.
logSummary()¶
Outputs a summary of the completed minimization process to the logger.
run(x0)¶
Executes the minimization algorithm for f starting with the initial guess x0.
Returns: the result of the minimization
setCallback(callback)¶
Sets a callback function to be called after every iteration. It is up to the specific implementation what arguments are passed to the callback. Subclasses should at least pass the current iteration number k, the current estimate x, and possibly f(x), grad f(x), and the current error.
setCostFunction(J)¶
Sets the cost function to be minimized.
Parameters: J (CostFunction) – the cost function to be minimized
setMaxIterations(imax)¶
Sets the maximum number of iterations before the minimizer terminates.
setOptions(**opts)¶
Sets minimizer-specific options. For a list of possible options see getOptions().
setTolerance(m_tol=0.0001, J_tol=None)¶
Sets the tolerance for the stopping criterion. The minimizer stops when an appropriate norm is less than m_tol.
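As a sketch of how this interface is meant to be used, the toy classes below mimic the AbstractMinimizer surface (constructor arguments, setCallback(), run(), getResult()). ToyGradientDescent and Quadratic are hypothetical stand-ins written for illustration; they are not part of esys.downunder, and the real minimizers are considerably more sophisticated.

```python
# Illustrative sketch only: a toy minimizer with the same interface shape
# as AbstractMinimizer. Not the real esys.downunder implementation.
import numpy as np

class ToyGradientDescent:
    def __init__(self, J=None, m_tol=1e-4, imax=300):
        self._J = J              # cost function providing getValue/getGradient
        self._m_tol = m_tol      # relative-change stopping tolerance
        self._imax = imax        # iteration cap
        self._callback = None
        self._result = None

    def setCallback(self, callback):
        self._callback = callback

    def getResult(self):
        return self._result

    def run(self, x0, step=0.1):
        x = np.asarray(x0, dtype=float)
        for k in range(self._imax):
            g = self._J.getGradient(x)
            x_new = x - step * g
            if self._callback is not None:
                self._callback(k, x_new, self._J.getValue(x_new))
            # stop when the relative change of the estimate is small enough
            if np.linalg.norm(x_new - x) <= self._m_tol * max(np.linalg.norm(x), 1.0):
                x = x_new
                break
            x = x_new
        self._result = x
        return x

class Quadratic:
    """Toy cost function J(x) = 0.5*||x||^2 with gradient x."""
    def getValue(self, x):
        return 0.5 * float(np.dot(x, x))
    def getGradient(self, x):
        return x
```

A subclass of the real AbstractMinimizer would follow the same shape, with run() implementing the actual algorithm.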
class esys.downunder.splitminimizers.ArithmeticTuple(*args)¶
Bases: object
Tuple supporting inplace update x+=y and scaling x=a*y where x, y are ArithmeticTuple instances and a is a float.
Example of usage:
from esys.escript import Data
from numpy import array
a=Data(...)
b=array([1.,4.])
x=ArithmeticTuple(a,b)
y=5.*x
__init__(*args)¶
Initializes the object with elements args.
Parameters: args – tuple of objects that support inplace add (x+=y) and scaling (x=a*y)
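The update-and-scale semantics can be sketched in plain Python. MiniArithmeticTuple below is a hypothetical stand-in (not the real class) showing why each element only needs to support x+=y and x=a*y:

```python
# Minimal sketch of the ArithmeticTuple idea: a tuple whose components can
# all be updated in place and scaled, so the whole tuple behaves like a
# vector. Illustrative stand-in, not the real esys class.
import numpy as np

class MiniArithmeticTuple:
    def __init__(self, *args):
        self._items = list(args)

    def __len__(self):
        return len(self._items)

    def __getitem__(self, i):
        return self._items[i]

    def __iadd__(self, other):
        # component-wise in-place add: x += y
        for i in range(len(self._items)):
            self._items[i] = self._items[i] + other[i]
        return self

    def __rmul__(self, a):
        # component-wise scaling: x = a*y
        return MiniArithmeticTuple(*[a * item for item in self._items])

x = MiniArithmeticTuple(np.array([1.0, 4.0]), 2.0)
y = 5.0 * x          # scaling produces a new tuple
y += x               # in-place component-wise update
```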
class esys.downunder.splitminimizers.FunctionJob(fn, *args, **kwargs)¶
Bases: esys.escriptcore.splitworld.Job
Takes a python function (with only self and keyword params) to be called as the work method.
__init__(fn, *args, **kwargs)¶
It ignores all of its parameters except that it requires the following as keyword arguments:
Variables:
- domain – Domain to be used as the basis for all Data and PDEs in this Job.
- jobid – sequence number of this job. The first job has id=1.
clearExports()¶
Removes exported values from the map.
clearImports()¶
Removes imported values from their map.
declareImport(name)¶
Adds name to the list of imports.
exportValue(name, v)¶
Makes value v available to other Jobs under the label name. name must have already been registered with the SplitWorld instance. For use inside the work() method.
Variables:
- name – registered label for exported value
- v – value to be exported
setImportValue(name, v)¶
Use to make a value available to the job (i.e. called from outside the job).
Variables:
- name – label used to identify this import
- v – value to be imported
work()¶
Needs to be overloaded for the job to actually do anything. A return value of True indicates this job thinks it is done. A return value of False indicates work still to be done.
class esys.downunder.splitminimizers.Job(*args, **kwargs)¶
Bases: object
Describes a sequence of work to be carried out in a subworld. The instances of this class used in the subworlds will be constructed by the system. To do specific work, this class should be subclassed with the work() (and possibly __init__) methods overloaded. The majority of the work done by the job will be in the overloaded work() method. The work() method should retrieve values from the outside using importValue() and pass values to the rest of the system using exportValue(). The rest of the methods should be considered off limits.
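The subclass-and-overload pattern described above can be sketched as follows. MiniJob is a hypothetical stand-in for the real Job base class, so the import/export flow can be shown without a SplitWorld instance; only the shape of the pattern is meant to carry over.

```python
# Sketch of the intended Job usage pattern: subclass, overload work(),
# pull inputs with importValue() and publish outputs with exportValue().
# MiniJob is an illustrative stand-in, not esys.escriptcore.splitworld.Job.

class MiniJob:
    def __init__(self, **kwargs):
        self.jobid = kwargs.get("jobid", 1)   # sequence number, first job has id=1
        self._imports = {}
        self._exports = {}

    def setImportValue(self, name, v):
        # called from outside the job to make a value available to it
        self._imports[name] = v

    def importValue(self, name):
        return self._imports[name]

    def exportValue(self, name, v):
        # makes a value available to other jobs under the given label
        self._exports[name] = v

class DoubleJob(MiniJob):
    def work(self):
        # read an input, do the actual work, publish the result
        value = self.importValue("input")
        self.exportValue("output", 2 * value)
        return True      # True signals that this job considers itself done

job = DoubleJob(jobid=1)
job.setImportValue("input", 21)
done = job.work()
```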
__init__(*args, **kwargs)¶
It ignores all of its parameters except that it requires the following as keyword arguments:
Variables:
- domain – Domain to be used as the basis for all Data and PDEs in this Job.
- jobid – sequence number of this job. The first job has id=1.
clearExports()¶
Removes exported values from the map.
clearImports()¶
Removes imported values from their map.
declareImport(name)¶
Adds name to the list of imports.
exportValue(name, v)¶
Makes value v available to other Jobs under the label name. name must have already been registered with the SplitWorld instance. For use inside the work() method.
Variables:
- name – registered label for exported value
- v – value to be exported
setImportValue(name, v)¶
Use to make a value available to the job (i.e. called from outside the job).
Variables:
- name – label used to identify this import
- v – value to be imported
work()¶
Needs to be overloaded for the job to actually do anything. A return value of True indicates this job thinks it is done. A return value of False indicates work still to be done.
class esys.downunder.splitminimizers.SplitInversionCostFunction(numLevelSets=None, numModels=None, numMappings=None, splitworld=None, worldsinit_fn=None)¶
Bases: esys.downunder.costfunctions.MeteredCostFunction
Class to define the cost function J(m) for inversion with one or more forward models based on a multi-valued level set function m:
J(m) = J_reg(m) + sum_f mu_f * J_f(p)
where J_reg(m) is the regularization and cross gradient component of the cost function applied to a level set function m, J_f(p) are the data defect cost functions involving a physical forward model using the physical parameter(s) p, and mu_f is the trade-off factor for model f.
A forward model depends on a set of physical parameters p which are constructed from components of the level set function m via mappings.
Example 1 (single forward model):
m=Mapping()
f=ForwardModel()
J=InversionCostFunction(Regularization(), m, f)
Example 2 (two forward models on a single-valued level set):
m0=Mapping()
m1=Mapping()
f0=ForwardModel()
f1=ForwardModel()
J=InversionCostFunction(Regularization(), mappings=[m0, m1], forward_models=[(f0, 0), (f1, 1)])
Example 3 (two forward models on a 2-valued level set):
m0=Mapping()
m1=Mapping()
f0=ForwardModel()
f1=ForwardModel()
J=InversionCostFunction(Regularization(numLevelSets=2), mappings=[(m0, 0), (m1, 0)], forward_models=[(f0, 0), (f1, 1)])
Note: If provides_inverse_Hessian_approximation is true, the class provides an approximate inverse of the Hessian operator.
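The composite structure J(m) = J_reg(m) + sum_f mu_f * J_f(p) can be illustrated numerically. Every function below (the regularization, the mapping, and the data-defect term) is a toy stand-in chosen for the sketch, not an esys regularization or forward model:

```python
# Toy numeric illustration of J(m) = J_reg(m) + sum_f mu_f * J_f(p(m)).
# All functions and values here are hypothetical stand-ins.
import numpy as np

def J_reg(m):
    # toy regularization: smoothness penalty on the level-set vector
    return float(np.sum(np.diff(m) ** 2))

def mapping(m):
    # toy mapping from the level set function to a physical parameter
    return np.exp(m)

def J_f(p, data):
    # toy data-defect term for one forward model
    return 0.5 * float(np.sum((p - data) ** 2))

def J(m, data_sets, mus):
    # composite cost: regularization plus weighted data-defect terms
    p = mapping(m)
    return J_reg(m) + sum(mu * J_f(p, d) for mu, d in zip(mus, data_sets))
```

Increasing a trade-off factor mu_f weights the corresponding data defect more heavily relative to the regularization, which is the role the mu arguments play throughout this class.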
__init__(numLevelSets=None, numModels=None, numMappings=None, splitworld=None, worldsinit_fn=None)¶
fill this in.
calculateGradient(vnames1, vnames2)¶
The gradient operation produces two components (designated (Y^,X) in the non-split version). vnames1 gives the variable name(s) where the first component should be stored; vnames2 gives the variable name(s) where the second component should be stored.
static calculatePropertiesHelper(self, m, mappings)¶
Returns a list of the physical properties from a given level set function m using the mappings of the cost function.
Parameters: m (Data) – level set function
Return type: list of Data
calculateValue(vname)¶
createLevelSetFunction(*props)¶
Returns an instance of an object used to represent a level set function initialized with zeros. Components can be overwritten by physical properties props. If present, entries must correspond to the mappings arguments in the constructor. Use None for properties for which no value is given.
static createLevelSetFunctionHelper(self, regularization, mappings, *props)¶
Returns an object initialized with zeros. Components can be overwritten by physical properties props. If present, entries must correspond to the mappings arguments in the constructor. Use None for properties for which no value is given.
static formatMappings(mappings, numLevelSets)¶
static formatModels(forward_models, numMappings)¶
getArguments(x)¶
Returns precalculated values that are shared in the calculation of f(x), grad f(x) and the Hessian operator.
Note: The tuple returned by this call will be passed back to this CostFunction in other calls (e.g. getGradient). Its contents are not specified at this level because no code, other than the CostFunction which created it, will be interacting with it. That is, the implementor can put whatever information they find useful in it.
Parameters: x (x-type) – location of derivative
Return type: tuple
getComponentValues(m, *args)¶
getDomain()¶
Returns the domain of the cost function.
Return type: Domain
getDualProduct(x, r)¶
Returns the dual product of x and r.
Return type: float
getForwardModel(idx=None)¶
Returns the idx-th forward model.
Parameters: idx (int) – model index. If the cost function contains one model only, idx can be omitted.
getGradient(x, *args)¶
Returns the gradient of f at x using the precalculated values for x.
Parameters:
- x (x-type) – location of derivative
- args – precalculated values for x from getArguments()
Return type: r-type
getInverseHessianApproximation(x, r, *args)¶
Returns an approximate evaluation p of the inverse of the Hessian operator of the cost function for a given gradient r at a given location x: H(x) p = r.
Note: In general it is assumed that the Hessian H(x) needs to be calculated in each call for a new location x. However, the solver may suggest that this is not required, typically when the iteration is close to completion.
Parameters:
- x (x-type) – location of Hessian operator to be evaluated
- r (r-type) – a given gradient
- args – precalculated values for x from getArguments()
Return type: x-type
static getModelArgs(self, fwdmodels)¶
Attempts to import the arguments for forward models; if they are not available, computes and exports them.
getNorm(x)¶
Returns the norm of x.
Return type: float
getNumTradeOffFactors()¶
Returns the number of trade-off factors being used, including the trade-off factors used in the regularization component.
Return type: int
getProperties(m, return_list=False)¶
Returns a list of the physical properties from a given level set function m using the mappings of the cost function.
Parameters:
- m (Data) – level set function
- return_list (bool) – if True a list is returned
Return type: list of Data
getRegularization()¶
Returns the regularization.
Return type: Regularization
getTradeOffFactors(mu=None)¶
Returns a list of the trade-off factors.
Return type: list of float
getTradeOffFactorsModels()¶
Returns the trade-off factors for the forward models.
Return type: float or list of float
getValue(x, *args)¶
Returns the value f(x) using the precalculated values for x.
Parameters: x (x-type) – a solution approximation
Return type: float
provides_inverse_Hessian_approximation = True¶
resetCounters()¶
Resets all statistical counters.
setPoint()¶
setTradeOffFactors(mu=None)¶
Sets the trade-off factors for the forward model and regularization terms.
Parameters: mu (list of float) – list of trade-off factors
setTradeOffFactorsModels(mu=None)¶
Sets the trade-off factors for the forward model components.
Parameters: mu (float in case of a single model, or list of float with length equal to the number of models) – list of the trade-off factors. If not present, ones are used.
setTradeOffFactorsRegularization(mu=None, mu_c=None)¶
Sets the trade-off factors for the regularization component of the cost function; see Regularization for details.
Parameters:
- mu – trade-off factors for the level-set variation part
- mu_c – trade-off factors for the cross gradient variation part
static subworld_setMu_model(self, **args)¶
updateHessian()¶
Notifies the class that the Hessian operator needs to be updated.
static update_point_helper(self, newpoint)¶
Call within a subworld to set 'current_point' to newpoint and update all the cached args info.
class esys.downunder.splitminimizers.SplitMinimizerLBFGS(J=None, m_tol=0.0001, J_tol=None, imax=300)¶
Bases: esys.downunder.minimizers.AbstractMinimizer
Minimizer that uses the limited-memory Broyden-Fletcher-Goldfarb-Shanno method, in a version modified to fit with the split world.
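The core of the L-BFGS method this class implements is the two-loop recursion, which applies an implicit inverse-Hessian approximation built from recent step/gradient-difference pairs. The sketch below is a minimal, illustrative numpy version with a fixed unit step; it is not the esys implementation, which additionally performs a line search and distributes the work across subworlds.

```python
# Illustrative L-BFGS sketch (two-loop recursion), plain numpy.
# Assumptions: fixed unit step (no line search), small dense problems.
import numpy as np

def lbfgs(grad, x0, m_tol=1e-4, imax=300, history=8):
    x = np.asarray(x0, dtype=float)
    s_list, y_list = [], []            # stored steps and gradient differences
    g = grad(x)
    for _ in range(imax):
        # two-loop recursion: apply the implicit inverse Hessian to g
        q = g.copy()
        alphas = []
        for s, y in zip(reversed(s_list), reversed(y_list)):
            rho = 1.0 / np.dot(y, s)
            a = rho * np.dot(s, q)
            alphas.append((a, rho, s, y))
            q -= a * y
        if y_list:
            s, y = s_list[-1], y_list[-1]
            q *= np.dot(s, y) / np.dot(y, y)   # initial Hessian scaling
        for a, rho, s, y in reversed(alphas):
            b = rho * np.dot(y, q)
            q += (a - b) * s
        x_new = x - q                   # unit step; real code line-searches here
        g_new = grad(x_new)
        s_list.append(x_new - x)
        y_list.append(g_new - g)
        if len(s_list) > history:       # keep only the last `history` pairs
            s_list.pop(0)
            y_list.pop(0)
        # stop on small relative change of the estimate
        if np.linalg.norm(x_new - x) <= m_tol * max(np.linalg.norm(x), 1.0):
            return x_new
        x, g = x_new, g_new
    return x
```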
__init__(J=None, m_tol=0.0001, J_tol=None, imax=300)¶
Initializes a new minimizer for a given cost function.
Parameters:
- J (CostFunction) – the cost function to be minimized
- m_tol (float) – terminate iterations when the relative change of the level set function is less than or equal to m_tol
getCostFunction()¶
Returns the cost function to be minimized.
Return type: CostFunction
getOptions()¶
Returns a dictionary of minimizer-specific options.
getResult()¶
Returns the result of the minimization.
logSummary()¶
Outputs a summary of the completed minimization process to the logger.
static move_point_from_base(self, **kwargs)¶
run()¶
This version relies on the cost function already having an initial guess loaded. It also does not return the result, meaning a job needs to be submitted to get the result out.
setCallback(callback)¶
Sets a callback function to be called after every iteration. It is up to the specific implementation what arguments are passed to the callback. Subclasses should at least pass the current iteration number k, the current estimate x, and possibly f(x), grad f(x), and the current error.
setCostFunction(J)¶
Sets the cost function to be minimized.
Parameters: J (CostFunction) – the cost function to be minimized
setMaxIterations(imax)¶
Sets the maximum number of iterations before the minimizer terminates.
setOptions(**opts)¶
Sets minimizer-specific options. For a list of possible options see getOptions().
setTolerance(m_tol=0.0001, J_tol=None)¶
Sets the tolerance for the stopping criterion. The minimizer stops when an appropriate norm is less than m_tol.