File: plot_propagate_kriging_ishigami.py

"""
Kriging: propagate uncertainties
================================

In this example we propagate uncertainties through a Kriging metamodel of the :ref:`Ishigami model<use-case-ishigami>`.
"""

import openturns as ot
import matplotlib.pyplot as plt
import openturns.viewer as otv


# %%
# We first build the metamodel and then estimate its mean with a Monte Carlo
# simulation.
#
# We load the Ishigami model from the usecases module:

from openturns.usecases import ishigami_function

im = ishigami_function.IshigamiModel()
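
# %%
# This object gathers the exact Ishigami function and the distribution of its
# three inputs. As a quick optional check, we can print the input dimension:
print("Input dimension:", im.inputDistribution.getDimension())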

# %%
# We build a design of experiments with Latin Hypercube Sampling (LHS) for the
# three input variables, assumed to be independent.
# LHS design of size 30 (alwaysShuffle=False, randomShift=True)
experiment = ot.LHSExperiment(im.inputDistribution, 30, False, True)
xdata = experiment.generate()

# %%
# We get the exact model and evaluate it at the input training data `xdata` to
# build the output data `ydata`.
model = im.model
ydata = model(xdata)

# %%
# We define our Kriging strategy:
#
#  - a constant basis in :math:`\mathbb{R}^3`;
#  - a squared exponential covariance function.
#
dimension = 3
basis = ot.ConstantBasisFactory(dimension).build()
covarianceModel = ot.SquaredExponential([0.1] * dimension, [1.0])
algo = ot.KrigingAlgorithm(xdata, ydata, covarianceModel, basis)
algo.run()
result = algo.getResult()
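
# %%
# As an optional check, we can inspect the covariance model with its optimized
# scale and amplitude parameters, together with the trend coefficients, both
# stored in the Kriging result:
print(result.getCovarianceModel())
print(result.getTrendCoefficients())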

# %%
# We finally get the metamodel to use in the Monte Carlo simulation.
metamodel = result.getMetaModel()
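
# %%
# Before propagating uncertainties, it is good practice to check the quality of
# the metamodel. Below is a minimal, illustrative sketch (not required for the
# propagation): we compare the metamodel with the exact model on a fresh test
# sample and print the standard deviation of the residuals.
xtest = im.inputDistribution.getSample(1000)
residuals = model(xtest) - metamodel(xtest)
print("Residual standard deviation:", residuals.computeStandardDeviation()[0])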

# %%
# We want to estimate the mean of the Ishigami model with Monte Carlo sampling,
# using the metamodel instead of the exact model.
#
# We first create a random vector following the input distribution:
X = ot.RandomVector(im.inputDistribution)

# %%
# Then we create a random vector as the image of the input random vector
# through the metamodel:
Y = ot.CompositeRandomVector(metamodel, X)
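
# %%
# As a quick illustrative check, we can draw a few realizations of this output
# random vector:
print(Y.getSample(5))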

# %%
# We now set up our :class:`~openturns.ExpectationSimulationAlgorithm` object:
algo = ot.ExpectationSimulationAlgorithm(Y)
algo.setMaximumOuterSampling(50000)
algo.setBlockSize(1)
algo.setCoefficientOfVariationCriterionType("NONE")

# %%
# We run the algorithm and store the result:
algo.run()
result = algo.getResult()
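
# %%
# Since the coefficient of variation stopping criterion is disabled, the
# algorithm runs all the requested outer iterations. As an optional check, this
# can be read from the result:
print("Number of outer iterations:", result.getOuterSampling())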

# %%
# The expectation :math:`\mathbb{E}(Y)` of the output is obtained with:
expectation = result.getExpectationEstimate()
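
# %%
# The result also quantifies the dispersion of this estimator; as a quick
# optional check, we can print its standard deviation and coefficient of
# variation:
print("Standard deviation of the estimator:", result.getStandardDeviation()[0])
print("Coefficient of variation:", result.getCoefficientOfVariation()[0])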

# %%
# The mean estimate obtained with the metamodel is:
print("Mean of the Ishigami metamodel : %.3e" % expectation[0])

# %%
# We draw the convergence history.
graph = algo.drawExpectationConvergence()
view = otv.View(graph)

# %%
# For reference, the exact mean of the Ishigami model is:
print("Mean of the Ishigami model : %.3e" % im.expectation)

plt.show()