Deep Learning for IoT-Healthcare Based on Physiological Signals
Published in Jacques Bou Abdo, Jacques Demerjian, Abdallah Makhoul, 5G Impact on Biomedical Engineering, 2022
Joseph Azar, Raphaël Couturier
A Restricted Boltzmann Machine (RBM) is a neural model proposed by Smolensky in [41]. An RBM is composed of a layer of visible (input) neurons and a layer of hidden (latent) neurons. Stacking several RBMs yields a layer-by-layer architecture called a Deep Belief Network (DBN). An RBM is a generative probabilistic model that learns the process generating the observed data, using latent variables to model the internal relationships among the observed units. Because it learns the joint probability distribution over the observable nodes through the hidden nodes, the RBM is well suited to capturing high-order dependencies in physiological signals. For instance, the authors in [39] used an RBM to capture relations between EEG signals and peripheral physiological signals for a better feature representation. The RBM model is also suitable for mining massive unlabelled physiological data: the authors in [47] used an RBM to process and classify a large scale of unlabeled ECG data.
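The two-layer structure described above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the chapter: the unit counts are arbitrary, the weights are random rather than trained, and binary stochastic units are assumed throughout. It shows how visible and hidden units condition on each other through a single weight matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy RBM: 6 visible units, 3 hidden units (sizes are illustrative).
n_visible, n_hidden = 6, 3
W = rng.normal(0, 0.1, size=(n_visible, n_hidden))  # visible-hidden weights
b = np.zeros(n_visible)   # visible biases
c = np.zeros(n_hidden)    # hidden biases

def sample_hidden(v):
    """P(h=1 | v) and a binary sample, given a visible vector v."""
    p = sigmoid(v @ W + c)
    return p, (rng.random(p.shape) < p).astype(float)

def sample_visible(h):
    """P(v=1 | h) and a binary sample, given a hidden vector h."""
    p = sigmoid(h @ W.T + b)
    return p, (rng.random(p.shape) < p).astype(float)

v0 = rng.integers(0, 2, n_visible).astype(float)  # e.g. a binarized signal frame
ph, h0 = sample_hidden(v0)   # infer latent features from the observation
pv, v1 = sample_visible(h0)  # reconstruct the observation from the features
```

The hidden activations `h0` play the role of the learned feature representation that the cited works extract from physiological signals.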
Recent Advancements in Automatic Sign Language Recognition (SLR)
Published in Sourav De, Paramartha Dutta, Computational Intelligence for Human Action Recognition, 2020
Varshini Prakash, B.K. Tripathy
Being an energy-based model, the RBM is used as a generative model in various applications to estimate a probability or data distribution. An RBM is composed of two layers, one of visible units and one of hidden units, with no connections within a layer: the units of each layer are conditionally independent given the units of the other layer, which govern their activations. The RBM is trained with the gradient-based Contrastive Divergence algorithm, which uses Gibbs sampling to obtain a workable estimate of the log-likelihood gradient. Convergence of the model depends on proper adjustment of the RBM parameters, such as the learning rate, weights, number of hidden units, momentum, and so on. With fewer parameters, an RBM can be a good alternative in cases where a CNN fails to generalize well.
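A single step of the Contrastive Divergence training just described (CD-1: one positive phase, one Gibbs step, one negative phase) can be sketched as follows. This is an illustrative implementation under assumed hyperparameters (learning rate, layer sizes, random toy data), not the configuration of any cited work.

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden, lr = 6, 4, 0.1   # illustrative sizes and learning rate
W = rng.normal(0, 0.1, size=(n_visible, n_hidden))
b, c = np.zeros(n_visible), np.zeros(n_hidden)

def cd1_update(v0):
    """One CD-1 step: positive phase, one Gibbs step, negative phase."""
    global W, b, c
    ph0 = sigmoid(v0 @ W + c)                        # P(h | v0), positive phase
    h0 = (rng.random(ph0.shape) < ph0).astype(float) # stochastic hidden sample
    pv1 = sigmoid(h0 @ W.T + b)                      # reconstruction of v0
    ph1 = sigmoid(pv1 @ W + c)                       # negative phase
    # Gradient approximation: <v h>_data - <v h>_model
    W += lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))
    b += lr * (v0 - pv1)
    c += lr * (ph0 - ph1)
    return np.mean((v0 - pv1) ** 2)                  # reconstruction error

data = rng.integers(0, 2, size=(32, n_visible)).astype(float)
for epoch in range(50):
    err = np.mean([cd1_update(v) for v in data])
```

The reconstruction error is the quantity one typically monitors while tuning the learning rate, hidden-unit count, and momentum mentioned above.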
Privacy towards GIS Based Intelligent Tourism Recommender System in Big Data Analytics
Published in Siddhartha Bhattacharyya, Václav Snášel, Indrajit Pan, Debashis De, Hybrid Computational Intelligence, 2019
Abhaya Kumar Sahoo, Chittaranjan Pradhan, Siddhartha Bhattacharyya
Multilayer Perceptron (MLP), Auto-encoder, Convolutional Neural Network, Recurrent Neural Network, Restricted Boltzmann Machine, Neural Autoregressive Distribution Estimation, and Adversarial Networks are the main components of the deep learning method. MLP is a feed-forward neural network with multiple hidden layers in which each perceptron uses an arbitrary activation function. An auto-encoder is an unsupervised model that attempts to reconstruct its input data in the output layer. A Convolutional Neural Network is a special kind of feed-forward neural network whose convolution layers capture global and local features, enhancing efficiency and accuracy. A Recurrent Neural Network is used for modeling sequential data. A Restricted Boltzmann Machine is a two-layer neural network consisting of a visible layer and a hidden layer, with no intra-layer communication within either the visible or the hidden layer. Neural Autoregressive Distribution Estimation is an unsupervised autoregressive model composed of feed-forward neural networks. An Adversarial Network is a generative neural network consisting of a discriminator and a generator; the two networks are trained simultaneously by competing with each other in a minimax game framework.
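Of the unsupervised models listed, the auto-encoder's reconstruction objective is easy to make concrete. The sketch below is a hypothetical minimal example with random, untrained weights and arbitrary dimensions; it only shows the encode-then-decode structure and the reconstruction error that training would minimize.

```python
import numpy as np

rng = np.random.default_rng(4)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Tiny auto-encoder: encode an 8-dim input to a 3-dim code, then decode.
n_in, n_code = 8, 3
W_enc = rng.normal(0, 0.1, (n_in, n_code))   # encoder weights
W_dec = rng.normal(0, 0.1, (n_code, n_in))   # decoder weights

def autoencode(x):
    """Reconstruct the input through a low-dimensional bottleneck."""
    code = sigmoid(x @ W_enc)       # encoder: input -> latent code
    return sigmoid(code @ W_dec)    # decoder: code -> reconstruction

x = rng.random(n_in)
x_hat = autoencode(x)
loss = np.mean((x - x_hat) ** 2)   # reconstruction error to be minimized
```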
Applying transfer learning to achieve precision marketing in an omni-channel system – a case study of a sharing kitchen platform
Published in International Journal of Production Research, 2021
Ming-Chuan Chiu, Kai-Hsiang Chuang
Deep learning algorithms can be grouped into two categories based on their architecture: Restricted Boltzmann Machines (RBM), introduced by Ackley, Hinton, and Sejnowski (1985), and convolutional neural networks (CNN). An RBM is a stochastic neural network used to identify patterns and rules within data; it consists of two layers, a visible layer and a hidden layer. Alternatively, CNNs can achieve high performance in image recognition through pattern analysis (Karpathy et al. 2014). A CNN is a specific type of deep neural network that contains convolutional layers and pooling layers followed by fully connected layers. Because of its performance capabilities, the CNN has been applied widely in many fields, especially in image recognition. Esteva et al. (2017) proposed its use in developing CNN-PA, a method for diagnosing skin cancer using transfer learning with deep neural networks. Wang et al. (2018) established a multi-column CNN for enhancing gastric cancer screening. In studies, Google© Inception v3 has proven to be an outstanding CNN model, designed for image classification tasks and trained on ImageNet Large Scale Visual Recognition Challenge (ILSVRC) data (Szegedy et al. 2016).
Application of Deep Belief Network for Critical Heat Flux Prediction on Microstructure Surfaces
Published in Nuclear Technology, 2020
Deep belief network is a class of feed-forward deep neural networks composed of multiple hidden layers of graphical models having both directed and undirected edges, capable of revealing subtle patterns deeply rooted in data sets and of modeling complex nonlinear relationships between variables. As shown in Fig. 2, a DBN is composed of two modules: an unsupervised feature extraction module sequentially stacked by restricted Boltzmann machines (RBMs), and a supervised perceptron for data classification and regression.45 In the stacked structure of a DBN, the hidden (latent) layer of one RBM serves as the visible layer of the next RBM. An RBM is a two-layer (visible layer and hidden layer) probabilistic neural network. Both the visible and hidden (latent) units of an RBM are binary and stochastic in terms of the data characteristics, and after unsupervised training of an RBM its visible units can be accurately reconstructed from its hidden units. This suggests that the visible units can be closely represented by the hidden units in a different dimensional space, with little information lost during the transformation. Representing a data set in a different dimensional (especially high-dimensional) space improves the ability of the network to reveal the hidden patterns rooted in the training data sets.
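The greedy layer-wise stacking described above (each RBM's hidden activations becoming the input data for the next RBM) can be sketched as follows. The layer widths, learning rate, epoch count, and toy data are all assumptions for illustration; the second RBM is trained on the first RBM's hidden probabilities, a common simplification of the stacking scheme.

```python
import numpy as np

rng = np.random.default_rng(2)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, lr=0.1, epochs=20):
    """Pretrain one RBM with CD-1 and return its weights and hidden biases."""
    n_visible = data.shape[1]
    W = rng.normal(0, 0.1, size=(n_visible, n_hidden))
    b, c = np.zeros(n_visible), np.zeros(n_hidden)
    for _ in range(epochs):
        for v0 in data:
            ph0 = sigmoid(v0 @ W + c)
            h0 = (rng.random(ph0.shape) < ph0).astype(float)
            pv1 = sigmoid(h0 @ W.T + b)
            ph1 = sigmoid(pv1 @ W + c)
            W += lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))
            b += lr * (v0 - pv1)
            c += lr * (ph0 - ph1)
    return W, c

# Greedy layer-wise pretraining: the hidden activations of each RBM
# become the "visible" data for the next RBM in the stack.
X = rng.integers(0, 2, size=(64, 8)).astype(float)  # toy binary data
layer_sizes = [6, 4]                                # illustrative widths
stack, layer_input = [], X
for n_h in layer_sizes:
    W, c = train_rbm(layer_input, n_h)
    stack.append((W, c))
    layer_input = sigmoid(layer_input @ W + c)      # propagate features up
# layer_input now holds top-level features for the supervised module.
```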
Hybrid features-enabled dragon deep belief neural network for activity recognition
Published in The Imaging Science Journal, 2018
DDBN is designed by modifying the DBN through the integration of DA into the weight update rule, such that a feasible weight is selected. The DBN [32] is composed of multiple layers of Restricted Boltzmann Machines (RBMs) and a Multilayer Perceptron (MLP). An RBM is a stochastic Artificial Neural Network (ANN) that learns based on the probability distribution of its input set. Meanwhile, the MLP maps the input datasets to a set of outputs, where each neuron in the hidden and output layers is processed using an activation function. In DDBN, the effectiveness of DA with respect to multiple objectives in solving unknown search space problems [33] improves the performance of the proposed classifier model. The proposed DDBN classifier is comprised of two RBM layers and an MLP layer, as shown in the architecture depicted in Figure 2.
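The supervised MLP stage of this architecture (mapping the features extracted by the RBM layers to class scores through activation functions) can be sketched as below. The layer sizes and class count are hypothetical, the weights are random rather than trained, and the DA-based weight selection of the DDBN is not modeled here.

```python
import numpy as np

rng = np.random.default_rng(3)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def softmax(z):
    """Row-wise softmax for class probabilities."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical sizes: 4-dim features from the pretrained RBM layers
# feed a one-hidden-layer MLP that scores 3 activity classes.
n_features, n_hidden, n_classes = 4, 5, 3
W1 = rng.normal(0, 0.1, (n_features, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_classes))
b2 = np.zeros(n_classes)

def mlp_forward(features):
    """Map RBM-extracted features to class probabilities."""
    h = sigmoid(features @ W1 + b1)   # hidden-layer activation
    return softmax(h @ W2 + b2)       # output-layer activation

probs = mlp_forward(rng.random((10, n_features)))  # 10 feature vectors
```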