<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html><head>
<meta http-equiv="content-type" content="text/html; charset=ISO-8859-1">
</head><body>
<h2>Multi-Layer Perceptron</h2>
<br>
The MLP is a feed-forward Artificial Neural Network composed of a
number of separate (hidden) layers. Information is passed from one
layer to the next, with each neuron applying a given activation
function to its weighted inputs. The MLP is trained by
back-propagation.<br>
<br>
More information on <a href="http://en.wikipedia.org/wiki/Multi-layer_perceptron">Wikipedia</a>.<br>
<br>
Parameters:<br>
<ul>
<li># Neurons: number of neurons per hidden layer; input and output nodes are not counted</li>
<li># Layers: number of hidden layers</li>
<li>Activation function: output function applied by every neuron in the network</li>
<ul>
<li>sigmoid: beta*(1-exp(-alpha*x)) / (1 + exp(-alpha*x))</li>
<li>gaussian: beta*exp(-alpha*x*x)</li>
</ul>
</ul>
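As a rough sketch of how these pieces fit together (the parameter names <code>alpha</code> and <code>beta</code>, and the list-of-layers representation, are illustrative assumptions rather than this tool's actual API), the two activation functions above and a feed-forward pass could be written as:

```python
import math

def sigmoid(x, alpha=1.0, beta=1.0):
    # Symmetric sigmoid from the list above:
    # beta*(1 - exp(-alpha*x)) / (1 + exp(-alpha*x))
    return beta * (1.0 - math.exp(-alpha * x)) / (1.0 + math.exp(-alpha * x))

def gaussian(x, alpha=1.0, beta=1.0):
    # Bell-shaped activation from the list above: beta*exp(-alpha*x*x)
    return beta * math.exp(-alpha * x * x)

def forward(x, layers, act=sigmoid):
    # Feed-forward pass: 'layers' is a hypothetical list of
    # (weights, biases) pairs, one per hidden/output layer, where
    # weights[i][j] connects input j to neuron i of that layer.
    for weights, biases in layers:
        x = [act(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(weights, biases)]
    return x
```

Note that this symmetric sigmoid ranges from -beta to +beta (it equals beta*tanh(alpha*x/2)), rather than the 0-to-1 range of the logistic function.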
</body></html>