"""
===========================
Partial Dependence Plots 2D
===========================
Hvass-Labs Dec 2017
Holger Nahrstaedt 2020
.. currentmodule:: skopt
Simple example to show the new 2D plots.
"""
print(__doc__)
from math import exp

import matplotlib.pyplot as plt
import numpy as np

from skopt import gp_minimize
from skopt.plots import plot_histogram, plot_objective, plot_objective_2D
from skopt.space import Categorical, Integer, Real
from skopt.utils import point_asdict

np.random.seed(123)
#############################################################################
dim_learning_rate = Real(name='learning_rate', low=1e-6, high=1e-2, prior='log-uniform')
dim_num_dense_layers = Integer(name='num_dense_layers', low=1, high=5)
dim_num_dense_nodes = Integer(name='num_dense_nodes', low=5, high=512)
dim_activation = Categorical(name='activation', categories=['relu', 'sigmoid'])
dimensions = [
    dim_learning_rate,
    dim_num_dense_layers,
    dim_num_dense_nodes,
    dim_activation,
]
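#############################################################################
# Sanity check (an illustrative addition, not part of the original example):
# draw a few random points from the combined search space with
# ``skopt.space.Space.rvs`` to confirm every dimension samples as expected.
from skopt.space import Space

for sample in Space(dimensions).rvs(n_samples=3, random_state=1):
    print(sample)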
default_parameters = [1e-4, 1, 64, 'relu']
def model_fitness(x):
    learning_rate, num_dense_layers, num_dense_nodes, activation = x
    fitness = (
        ((exp(learning_rate) - 1.0) * 1000) ** 2
        + num_dense_layers ** 2
        + (num_dense_nodes / 100) ** 2
    )
    fitness *= 1.0 + 0.1 * np.random.rand()
    if activation == 'sigmoid':
        fitness += 10
    return fitness
print(model_fitness(x=default_parameters))
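#############################################################################
# Illustrative check (an addition to this example): a ``sigmoid`` activation
# adds a flat penalty of 10, so the same configuration should always score
# noticeably worse than with ``relu``.
print(model_fitness([1e-4, 1, 64, 'sigmoid']))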
#############################################################################
search_result = gp_minimize(
    func=model_fitness,
    dimensions=dimensions,
    n_calls=30,
    x0=default_parameters,
    random_state=123,
)
print(search_result.x)
print(search_result.fun)
#############################################################################
for fitness, x in sorted(zip(search_result.func_vals, search_result.x_iters)):
    print(fitness, x)
#############################################################################
space = search_result.space
print(search_result.x_iters)
search_space = {name: space[name][1] for name in space.dimension_names}
print(point_asdict(search_space, default_parameters))
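#############################################################################
# The best point found can be shown the same way; a small addition reusing
# ``point_asdict`` so the values are labelled by dimension name.
print(point_asdict(search_space, search_result.x))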
#############################################################################
print("Plotting now ...")
_ = plot_histogram(result=search_result, dimension_identifier='learning_rate', bins=20)
plt.show()
#############################################################################
_ = plot_objective_2D(
    result=search_result,
    dimension_identifier1='learning_rate',
    dimension_identifier2='num_dense_nodes',
)
plt.show()
#############################################################################
_ = plot_objective_2D(
    result=search_result,
    dimension_identifier1='num_dense_layers',
    dimension_identifier2='num_dense_nodes',
)
plt.show()
#############################################################################
_ = plot_objective(
    result=search_result,
    plot_dims=['num_dense_layers', 'num_dense_nodes'],
)
plt.show()