Applications and Architectures for Chaotic ICs: An Introduction
Published in M.P. Kennedy, R. Rovatti, G. Setti, Chaotic Electronics in Telecommunications, 2018
Angel Rodriguez-Vázquez, Manuel Delgado-Restituto, Rocío del Rio, Belén Pérez-Verdú
Another neural network model that makes use of chaos for information storage has been proposed in [42]. The model is formally identical to a Hopfield network, in the sense that it consists of a fully recurrent structure which encodes input patterns as stable equilibrium points. Moreover, as in the Hopfield network, the stored patterns are binary vectors and the synaptic weights are learned using the outer-product (Hebbian) rule. The difference between the two models is that the processing units are not threshold elements but chaotic oscillators. Each chaotic oscillator consists of two mutually coupled discrete maps whose interconnection strength is a weighted sum of the neuron inputs. The output of the processing unit is binary: it takes a '1' value if the coupled maps synchronize within some error bound, and a '0' value if there is no synchronization. It is shown that the model is able to retrieve any of the stored patterns when sufficient partial information about that pattern is presented to the network.
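As an illustration of this mechanism, the sketch below implements one such processing unit using two mutually coupled logistic maps. The specific map, the symmetric coupling scheme, and all parameter names are our assumptions for illustration, not the exact formulation of [42]: the coupling strength is the weighted sum of the inputs, and the unit outputs '1' only if the two maps synchronize within an error bound.

```python
import numpy as np

def chaotic_neuron(inputs, weights, r=3.9, n_iter=500, transient=400, eps=1e-3, seed=0):
    """One processing unit built from two mutually coupled chaotic maps.

    The coupling strength is a weighted sum of the neuron inputs; the unit
    outputs 1 if the two maps synchronize within `eps`, and 0 otherwise.
    (Logistic maps and symmetric coupling are illustrative assumptions.)
    """
    rng = np.random.default_rng(seed)
    x, y = rng.random(2)                          # independent initial conditions
    c = float(np.clip(np.dot(weights, inputs), 0.0, 1.0))  # coupling from inputs

    def f(z):                                     # chaotic logistic map
        return r * z * (1.0 - z)

    for k in range(n_iter):
        x, y = (1.0 - c) * f(x) + c * f(y), (1.0 - c) * f(y) + c * f(x)
        if k >= transient and abs(x - y) < eps:
            return 1                              # maps synchronized -> output '1'
    return 0                                      # no synchronization -> output '0'

# Moderate coupling (c = 0.45 here) typically drives the two maps to synchronize.
print(chaotic_neuron(inputs=[1, 1, 1], weights=[0.15, 0.15, 0.15]))
```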
Symmetric Weights and Deep Belief Networks
Published in Stephen Marsland, Machine Learning, 2014
We give an input to the Hopfield network by setting the activations of the neurons (the $s_i$) and then running the update equation until the neurons stop changing values. So once the weights are set, the remembering is very simple. The learning is also simple: the Hopfield network learns the weights using Hebb's rule:

$$w_{ij} = \frac{1}{N} \sum_{n=1}^{N} s_i(n)\, s_j(n),$$
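A minimal sketch of this procedure, assuming bipolar ±1 activations, a sign threshold, and synchronous updates (details left to the surrounding text of the book), might look like this:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebb's rule: w_ij = (1/N) * sum_n s_i(n) * s_j(n), with no self-connections."""
    patterns = np.asarray(patterns, dtype=float)   # one bipolar pattern per row
    n_patterns, _ = patterns.shape
    w = patterns.T @ patterns / n_patterns         # sum of outer products, scaled by 1/N
    np.fill_diagonal(w, 0.0)                       # w_ii = 0
    return w

def recall(w, s, max_steps=100):
    """Set the activations s_i, then run the update equation until nothing changes."""
    s = np.asarray(s, dtype=float)
    for _ in range(max_steps):
        s_new = np.where(w @ s >= 0, 1.0, -1.0)    # synchronous sign update
        if np.array_equal(s_new, s):               # neurons stopped changing values
            break
        s = s_new
    return s

patterns = [[1, -1, 1, -1, 1, -1],
            [1, 1, 1, -1, -1, -1]]
w = train_hopfield(patterns)
print(recall(w, [1, -1, 1, -1, 1, 1]))             # noisy cue of the first pattern
```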
D
Published in Philip A. Laplante, Comprehensive Dictionary of Electrical Engineering, 2018
discrete Hopfield network a single-layer, fully connected network that stores (usually bipolar) patterns by setting its weight value $w_{ij}$ equal to the (i, j) entry in the sum of the outer products of the patterns. The network can be used as an associative memory so long as the number of stored patterns is less than about 14% of the number of neural elements. Compare with continuous Hopfield network.
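As a rough illustration of this definition (the network size, number of patterns, and synchronous update scheme below are arbitrary choices, not part of the dictionary entry), one can build the weight matrix from the sum of outer products of bipolar patterns and use the network as an associative memory:

```python
import numpy as np

rng = np.random.default_rng(42)
n_neurons = 64
n_patterns = int(0.14 * n_neurons)              # rule-of-thumb storage limit (~14%)
patterns = rng.choice([-1, 1], size=(n_patterns, n_neurons))

# w_ij = (i, j) entry of the sum of the outer products of the stored patterns
weights = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(weights, 0.0)                  # no self-connections

# Associative-memory use: corrupt a stored pattern and let the network settle.
probe = patterns[0].copy()
probe[:5] *= -1                                 # flip 5 of the 64 bits
for _ in range(20):                             # synchronous sign updates
    probe = np.where(weights @ probe >= 0, 1, -1)
print("pattern recovered:", np.array_equal(probe, patterns[0]))
```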
Virtual metrology in long batch processes using machine learning
Published in Materials and Manufacturing Processes, 2023
Ritam Guha, Anirudh Suresh, Jared DeFrain, Kalyanmoy Deb
After DNNs started gaining popularity as an important estimation tool, researchers realized that a recurrent structure was needed in the NN to handle recurring estimations with varying input dimensions. In 1982, Hopfield[18] proposed a preliminary version of the recurrent neural network (RNN), known as the Hopfield Network, in which a recurrent structure was introduced in the nodes. The modern unfolding structure of RNNs was introduced by Jürgen Schmidhuber.[19] But even though RNNs can be recurrently unfolded over any number of time steps, they were unable to capture long-term dependencies. This problem of long-term dependency retrieval was addressed by the Long Short-Term Memory (LSTM) architecture.