When do we say that an artificial neural network is a multilayer perceptron?
And when do we say that an artificial neural network is multilayer?
Is the term "perceptron" related to the learning rule used to update the weights?
Or is it related to the neuron units?
Asked By : Mohammad
Answered By : Pål GD
A perceptron is always feedforward, that is, all the arrows are going in the direction of the output. Neural networks in general might have loops, and if so, are often called recurrent networks. A recurrent network is much harder to train than a feedforward network.
In addition, in a perceptron all the arrows go from layer $i$ to layer $i+1$, and it is also usual (at least to start with) that all the arcs from layer $i$ to layer $i+1$ are present, i.e., the layers are fully connected.
Finally, having multiple layers means having more than two layers, that is, hidden layers. A perceptron is a network with two layers, one input and one output. A multilayer network has at least one hidden layer (all layers between the input and output layers are called hidden).
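To make the structural difference concrete, here is a minimal NumPy sketch contrasting the two architectures. The layer sizes, activation functions, and function names are illustrative assumptions, not part of the original answer: a perceptron has a single input-to-output weight matrix, while a multilayer perceptron inserts at least one hidden layer, with every unit in layer $i$ feeding every unit in layer $i+1$.

```python
import numpy as np

def step(x):
    # Threshold activation used by the classic perceptron (assumed choice).
    return (x > 0).astype(float)

def perceptron_forward(x, W, b):
    # Two layers only: input -> output, one weight matrix, no hidden layer.
    return step(W @ x + b)

def mlp_forward(x, W1, b1, W2, b2):
    # Multilayer: input -> hidden -> output; feedforward and fully connected,
    # so every unit in layer i feeds every unit in layer i+1.
    h = np.tanh(W1 @ x + b1)   # hidden layer
    return step(W2 @ h + b2)   # output layer

rng = np.random.default_rng(0)
x = rng.normal(size=3)         # 3 input units (illustrative size)

# Perceptron: 3 inputs -> 2 outputs, no hidden layer.
y_perceptron = perceptron_forward(x, rng.normal(size=(2, 3)), np.zeros(2))

# Multilayer perceptron: 3 inputs -> 4 hidden units -> 2 outputs.
y_mlp = mlp_forward(x,
                    rng.normal(size=(4, 3)), np.zeros(4),
                    rng.normal(size=(2, 4)), np.zeros(2))

print(y_perceptron, y_mlp)
```

The only structural change from the first function to the second is the extra weight matrix and nonlinearity for the hidden layer; that is exactly what makes the network "multilayer" in the sense described above.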
Best Answer from Computer Science Stack Exchange
Question Source : http://cs.stackexchange.com/questions/53521