Matches in DBpedia 2014 for { <http://dbpedia.org/resource/Quickprop> ?p ?o. }
Showing items 1 to 16 of 16, with 100 items per page.
- Quickprop abstract "Quickprop is an iterative method for determining the minimum of the loss function of an artificial neural network, using an algorithm inspired by Newton's method. It is sometimes classified as a second-order learning method. From the previous gradient step and the current gradient it builds a quadratic approximation of the loss function, under the assumption that the loss is locally approximately quadratic and can be described by an upward-opening parabola; the minimum is sought at the vertex of that parabola. The procedure requires only local information about the artificial neuron to which it is applied. The k-th approximation step is given by: Δ^(k) w_ij = Δ^(k−1) w_ij · ( ∇_ij E^(k) / ( ∇_ij E^(k−1) − ∇_ij E^(k) ) ), where w_ij is the weight of input i of neuron j and E is the loss function. Quickprop generally converges faster than the error backpropagation algorithm, but the network can behave chaotically during the learning phase due to large step sizes.".
- Quickprop wikiPageExternalLink qp-tr.ps.
- Quickprop wikiPageID "41575347".
- Quickprop wikiPageRevisionID "589769552".
- Quickprop subject Category:Computational_neuroscience.
- Quickprop subject Category:Machine_learning_algorithms.
- Quickprop subject Category:Neural_networks.
- Quickprop comment "Quickprop is an iterative method for determining the minimum of the loss function of an artificial neural network, using an algorithm inspired by Newton's method. It is sometimes classified as a second-order learning method.".
- Quickprop label "Quickprop".
- Quickprop sameAs Quickprop.
- Quickprop sameAs m.0_1k3tx.
- Quickprop sameAs Q1326757.
- Quickprop wasDerivedFrom Quickprop?oldid=589769552.
- Quickprop isPrimaryTopicOf Quickprop.
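The update rule described in the abstract can be sketched in one dimension as follows. This is a minimal illustration, not part of the DBpedia resource: the helper name `quickprop_step`, the toy quadratic loss, and the bootstrap learning rate are all our assumptions.

```python
def quickprop_step(dw_prev, grad, grad_prev):
    # Quickprop update: fit a parabola through the previous and current
    # gradients and jump to its vertex, the estimated local minimum.
    return dw_prev * grad / (grad_prev - grad)

# Toy quadratic loss E(w) = (w - 3)^2 with gradient dE/dw = 2(w - 3)
grad_fn = lambda w: 2.0 * (w - 3.0)

w = 0.0
grad_prev = grad_fn(w)
dw_prev = -0.1 * grad_prev        # bootstrap with a plain gradient step
w += dw_prev                      # w = 0.6
grad = grad_fn(w)
w += quickprop_step(dw_prev, grad, grad_prev)  # w = 3.0
```

Because the toy loss really is quadratic, the parabola fit is exact and a single Quickprop step lands on the minimum; on a genuinely non-quadratic loss the large vertex jumps are what can make training behave chaotically, as the abstract notes.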