Off-Line Handwritten Word Recognition Using Hidden Markov Models
Published in Lakhmi C. Jain, Beatrice Lazzerini, Knowledge-Based Intelligent Techniques in Character Recognition, 2020
A. El-Yacoubi, R. Sabourin, M. Gilloux, C.Y. Suen
A hidden Markov model is a doubly stochastic process, with an underlying stochastic process that is not observable (hence the word hidden) but can be observed through another stochastic process that produces the sequence of observations [14]. The hidden process consists of a set of states connected to each other by transitions with associated probabilities, while the observed process consists of a set of outputs or observations, each of which may be emitted by each state according to some output probability density function (pdf). Depending on the nature of this pdf, several kinds of HMMs can be distinguished. If the observations are naturally discrete, or are quantized using scalar or vector quantization [31] and drawn from an alphabet or a codebook, the HMM is called discrete [16], [17]. If the observations are continuous, we are dealing with a continuous HMM [17], [32], whose continuous pdf is usually approximated by a mixture of normal distributions. Another family of HMMs, a compromise between discrete and continuous HMMs, is the semi-continuous HMM [33], which jointly optimizes the vector-quantized codebook and the HMM parameters under a unified probabilistic framework.
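The discrete HMM described above can be sketched compactly: the likelihood of an observation sequence under the model is obtained with the forward algorithm over the transition and output probabilities. The parameter values below are invented for illustration, not taken from the chapter:

```python
import numpy as np

# Hypothetical two-state discrete HMM with a two-symbol alphabet.
A = np.array([[0.7, 0.3],    # state transition probabilities
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],    # output probabilities: P(symbol | state)
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])    # initial state distribution

def forward_likelihood(obs):
    """P(observation sequence | model) via the forward algorithm."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

print(forward_likelihood([0, 1, 0]))
```

Summing the forward variable over states at the last time step marginalizes out the hidden state path, which is exactly what makes the underlying process "hidden".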
God's algorithm: The expectation-maximization algorithm
Published in Jun Wu, Rachel Wu, Yuxi Candice Wang, The Beauty of Mathematics in Computer Science, 2018
Many algorithms we have introduced in previous chapters are EM algorithms: for example, the hidden Markov model training method, the Baum-Welch algorithm, as well as the maximum entropy model training method, the GIS algorithm. In the Baum-Welch algorithm, the E-step computes, according to the current model, the expected number of transitions between each pair of states (which can be fractional) and the expected number of times each state produces each output; the M-step re-estimates the hidden Markov model's parameters from these counts. Note that the objective function to be maximized in training an HMM is the probability of the observations. In the maximum entropy model's generalized iterative scaling algorithm, GIS, the E-step computes each feature's mathematical expectation according to the current model, and the M-step adjusts the model parameters according to the ratios of these expectations to the actual observed values. Here, the objective function to be maximized is the entropy of the probability model.
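The E-step/M-step split described above can be sketched for a discrete HMM. The function below performs a single Baum-Welch iteration; the structure (forward-backward posteriors, then re-estimation from expected counts) follows the standard algorithm, but the variable names and test parameters are our own illustrative choices:

```python
import numpy as np

def baum_welch_step(A, B, pi, obs):
    """One EM (Baum-Welch) iteration for a discrete HMM -- an illustrative sketch."""
    obs = np.asarray(obs)
    N, T = A.shape[0], len(obs)
    # E-step: forward-backward recursions give the posterior probability of
    # being in each state at each time (gamma) and expected transition counts (xi).
    alpha = np.zeros((T, N))
    beta = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    likelihood = alpha[-1].sum()          # P(obs | current model)
    gamma = alpha * beta / likelihood     # gamma[t, i] = P(state_t = i | obs)
    xi = np.zeros((N, N))                 # expected transition counts (may be fractional)
    for t in range(T - 1):
        xi += np.outer(alpha[t], B[:, obs[t + 1]] * beta[t + 1]) * A / likelihood
    # M-step: re-estimate the parameters from the expected counts.
    A_new = xi / gamma[:-1].sum(axis=0)[:, None]
    B_new = np.vstack([gamma[obs == k].sum(axis=0) for k in range(B.shape[1])]).T
    B_new /= gamma.sum(axis=0)[:, None]
    pi_new = gamma[0]
    return A_new, B_new, pi_new, likelihood
```

Iterating this step never decreases the probability of the observations, which is the EM guarantee the passage refers to.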
Human Activity Recognition
Published in Maheshkumar H. Kolekar, Intelligent Video Surveillance Systems, 2018
HMM is used for modeling time-sequential activity. The Baum-Welch algorithm is used for training. Once the classifier is trained, test videos are classified by extracting features from the video and assigning the activity with the maximum likelihood. Here, we have designed an HMM for each of five activities: bending, boxing, clapping, hand-wave, and walking. The trained state transition matrices for the bending, boxing, clapping, hand-wave, and walking activities are given in Tables , , 5.4, , and 5.5, respectively. The overall accuracy achieved was 90%. The applied techniques were found to be scale- and direction-invariant owing to the fusion of optical and shape-based features: the optical feature is direction-invariant, and the extracted shape feature is normalized to make it scale-invariant. Snapshots of the successfully recognized clips are shown in Figure 5.7.
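The maximum-likelihood classification step can be sketched as follows: one HMM per activity, with a test sequence assigned to the model that scores it highest. Only two of the five activities are shown, and all parameter values are invented placeholders, not the trained matrices from the tables:

```python
import numpy as np

# Hypothetical per-activity models (A, B, pi); values are illustrative only.
models = {
    "walking": (np.array([[0.8, 0.2], [0.3, 0.7]]),
                np.array([[0.9, 0.1], [0.2, 0.8]]),
                np.array([0.6, 0.4])),
    "boxing":  (np.array([[0.5, 0.5], [0.6, 0.4]]),
                np.array([[0.3, 0.7], [0.8, 0.2]]),
                np.array([0.5, 0.5])),
}

def log_likelihood(model, obs):
    """Scaled forward algorithm: log P(obs | model) for a discrete HMM."""
    A, B, pi = model
    alpha = pi * B[:, obs[0]]
    logp = np.log(alpha.sum())
    alpha = alpha / alpha.sum()       # rescale to avoid numerical underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        logp += np.log(s)
        alpha = alpha / s
    return logp

def classify(obs):
    """Pick the activity whose HMM assigns the sequence the highest likelihood."""
    return max(models, key=lambda name: log_likelihood(models[name], obs))
```

Working in log space with per-step rescaling is the usual way to keep long feature sequences from underflowing.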
Speaker Identification in Interactions between Mothers and Children with Down Syndrome via Audio Analysis: A Case Study in Mexico
Published in International Journal of Human–Computer Interaction, 2023
Carlos R. Flores-Carballo, Gabriel A. Molina-Arenas, Adrian Macias, Karina Caro, Jessica Beltran, Luis A. Castro
We trained Hidden Markov Models (HMMs), a statistical approach that assumes a Markov process with hidden, unknown parameters (Rabiner & Juang, 1986). This technique is used with observable sequential information, such as audio, and has been widely applied to problems of speech and speaker recognition, among many others (Mor et al., 2021). Although approaches with better performance have emerged in recent years, HMMs are still widely used because of their simplicity and because their performance is close to that of more recent methods (Hamidi et al., 2020; Todkar et al., 2018). Moreover, approaches based on deep learning require extensive datasets and computing power. In this work, we have the constraint of a small dataset that was collected in a semi-naturalistic scenario with conversations between mothers and children with DS. Also, the purpose of this work is to explore the potential of using ML techniques in systems that require human tasks, for which performance can be improved in future work.
A survey of intelligent building automation with machine learning and IoT
Published in Advances in Building Energy Research, 2023
Mona Masroor, Javad Rezazadeh, John Ayoade, Mehdi Aliehyaei
The hidden Markov model can model complex Markov processes whose states emit observations according to some probability distribution. For example, if that distribution is Gaussian, the outputs of the states in such a hidden Markov model also follow a Gaussian distribution. The hidden Markov model can also capture more complex behaviours, where the output of a state follows a mixture of two or more Gaussian distributions; in that case, the probability of producing an observation is the weighted sum of the probabilities of producing it from each component Gaussian. The use of statistical methods such as the hidden Markov model (Franzese & Iuliano, 2019) has certain advantages: it rests on a strong mathematical foundation that can be applied in many domains; if the model is used correctly, it supports many important practical applications; and, in software libraries, it can be combined with other algorithms.
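The mixture-of-Gaussians output described above can be made concrete: a state's emission probability is the weighted sum of its component Gaussian densities. A minimal sketch, with weights, means, and standard deviations invented purely for illustration:

```python
import math

def gmm_emission(x, weights, means, stds):
    """P(x | state) for a Gaussian-mixture emission: weighted sum of the
    component Gaussian densities (mixture weights must sum to 1)."""
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
               for w, m, s in zip(weights, means, stds))

# Example: a two-component mixture emission for one hypothetical state.
p = gmm_emission(1.5, weights=[0.6, 0.4], means=[0.0, 3.0], stds=[1.0, 0.5])
```

With a single component this reduces to the plain Gaussian case the passage mentions first; adding components is what lets a state model multimodal observations.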
Identifying untrusted interactive behaviour in Enterprise Resource Planning systems based on a big data pattern recognition method using behavioural analytics
Published in Behaviour & Information Technology, 2022
Qian Yi, Mengyao Xu, Shuping Yi, Shiquan Xiong
HMM can be used to construct accurate models, offers good detection efficiency, and can process sequential data. It is often used to analyze real-world behaviours, which are continuous and multi-characteristic. Chung and Liu (2008) presented a hierarchical context HMM for behaviour understanding from video streams in a nursing center. Doulamis and Varvarigou (2010) used HMM to analyze human behaviour on automobile assembly lines and attempted to discover abnormal behaviour. Bunian et al. (2016) explored the use of HMM to model players’ individual differences, and their results showed predictive power for some of the personality traits. Gong, Chen, and Peng (2019) proposed a recognition method that combines deep learning with HMM with the aim of improving the accuracy of two-person interaction behaviour recognition. Zhang et al. (2020) established a Gaussian Mixture Model-HMM behaviour recognition model to predict the future behaviour of traffic vehicles. These articles show that HMM is effective in behaviour analysis. However, no studies have yet been found that analyze network behaviour using HMM.