Speech Signal Processing
Published in Richard C. Dorf, Circuits, Signals, and Speech and Image Processing, 2018
Jerry D. Gibson, Bo Wei, Hui Dong, Yariv Ephraim, Israel Cohen, Jesse W. Fussell, Lynn D. Wilcox, Marcia A. Bush
HMM-based continuous speech recognition involves determining an optimal word sequence using the Viterbi algorithm. This algorithm uses dynamic programming to find the optimal state sequence through an HMM network representing the recognizer vocabulary and grammar. The optimal state sequence $Q^* = (q_1^*, \dots, q_T^*)$ is defined as the sequence which maximizes $P(Q \mid O, \lambda)$, or equivalently $P(Q, O \mid \lambda)$. Let $\delta_t(i)$ be the joint probability of the optimal state sequence and the observations up to time $t$, ending in state $S_i$ at time $t$. Then

$$\delta_t(i) = \max_{q_1, \dots, q_{t-1}} P(q_1 \dots q_{t-1},\, q_t = S_i,\, O_1 \dots O_t \mid \lambda)$$
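The recursion that makes the dynamic programming work follows directly from this definition; a standard sketch (the symbols $\pi_i$, $a_{ij}$, $b_j(O_t)$, and $N$ are the usual HMM parameters, assumed here rather than quoted from the chapter):

```latex
% The best path into state S_j at time t must pass through some state
% S_i at time t-1, so \delta can be built up left to right:
\delta_1(i) = \pi_i \, b_i(O_1), \qquad
\delta_t(j) = \Bigl[\max_{1 \le i \le N} \delta_{t-1}(i)\, a_{ij}\Bigr] b_j(O_t),
% and the probability of the optimal state sequence is recovered at the end:
P^* = \max_{1 \le i \le N} \delta_T(i).
```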
Smartphone-Based Human Activity Recognition
Published in Yufeng Wang, Athanasios V. Vasilakos, Qun Jin, Hongbo Zhu, Device-to-Device based Proximity Service, 2017
Yufeng Wang, Athanasios V. Vasilakos, Qun Jin, Hongbo Zhu
HMMs can be considered as a set of states which are traversed in a sequence hidden to an observer. The only thing that is visible is a sequence of observed symbols, emitted by the hidden states as they are traversed. Three efficient algorithms are used with HMMs for learning and recognition:

Forward–backward algorithm: used for determining the probability that an emission sequence was generated by a given HMM. The forward pass computes, for each state, the joint probability of the observations up to time t and of being in that state at time t; summing over all states at the final time then gives the sequence likelihood (see the sketch after this list).

Baum–Welch algorithm: used for estimating the transition and emission probabilities of an HMM, given an observation sequence and initial guesses for these values. As an expectation-maximization algorithm, it iteratively searches for the parameters with the highest likelihood.

Viterbi algorithm: used to find the most likely sequence of hidden states, given an HMM and an observation sequence (a decoding algorithm, not to be confused with Viterbi training). It recursively computes the likelihood of being in each state at the next time step until the end of the sequence, at which point the algorithm backtracks to give the most likely state sequence.
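As a concrete illustration of the forward pass described above, here is a minimal NumPy sketch; the two-state model at the bottom is hypothetical, not taken from the chapter:

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward algorithm: returns P(obs | model).

    pi  : (N,)   initial state probabilities
    A   : (N, N) transition probabilities, A[i, j] = P(state j | state i)
    B   : (N, M) emission probabilities,   B[i, k] = P(symbol k | state i)
    obs : (T,)   observed symbol indices
    """
    alpha = pi * B[:, obs[0]]          # alpha_1(i) = pi_i * b_i(o_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # alpha_t(j) = sum_i alpha_{t-1}(i) a_ij * b_j(o_t)
    return alpha.sum()                 # sum over all states at the final time

# Hypothetical two-state example
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.5],
              [0.1, 0.9]])
print(forward(pi, A, B, [0, 1, 1]))
```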
Factor Graphs and Message Passing Algorithms
Published in Erchin Serpedin, Thomas Chen, Dinesh Rajan, Mathematical Foundations for SIGNAL PROCESSING, COMMUNICATIONS, AND NETWORKING, 2012
Aitzaz Ahmad, Erchin Serpedin, Khalid A. Qaraqe
The Viterbi algorithm, proposed in [9], is a recursive algorithm to determine the most likely state sequence that resulted in the given set of observations. In the context of the hidden Markov model of Figure 13.13, the Viterbi algorithm finds the configuration that has the largest a posteriori probability. For equally likely codes, this problem is recast as maximum-likelihood sequence detection (MLSD) [5].
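Viewed as message passing on the chain of Figure 13.13, the Viterbi algorithm is an instance of the max-product algorithm (in the log domain, max-sum): forward messages carry the best score into each state, and backtracking recovers the maximizing configuration. A minimal sketch under that framing; the function name and log-domain parameterization are assumptions, not the book's notation:

```python
import numpy as np

def max_sum_chain(log_pi, log_A, log_B, obs):
    """MAP state sequence on an HMM chain via max-sum message passing.

    log_pi : (N,)   log initial probabilities
    log_A  : (N, N) log transition probabilities
    log_B  : (N, M) log emission probabilities
    obs    : (T,)   observed symbol indices
    """
    T, N = len(obs), len(log_pi)
    back = np.zeros((T, N), dtype=int)        # best predecessor of each state
    mu = log_pi + log_B[:, obs[0]]            # message into the first node
    for t in range(1, T):
        scores = mu[:, None] + log_A          # scores[i, j]: best score ending with i -> j
        back[t] = scores.argmax(axis=0)
        mu = scores.max(axis=0) + log_B[:, obs[t]]
    # Backtrack the configuration with the largest a posteriori probability
    path = [int(mu.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```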
A Machine Learning Approach for Quantifying the Design Error Propagation in Safety Critical Software System
Published in IETE Journal of Research, 2022
The work carried out in this research provides an opportunity to identify hidden design errors in safety-critical systems. The whole process is explained through a case study, an anti-lock braking system (ABS). Through the hidden Markov model we could evaluate the current system state with respect to various operating conditions. Explorative analysis of the system has been carried out for both non-failure and failure cases, in order to relate software error occurrence with the operational parameters of the anti-lock braking system. The probability of error occurrence is re-estimated using the Baum–Welch algorithm and depicted through various transition and observation matrices. The most probable sequence of states is computed using the Viterbi algorithm. This helped to visualize the maximum probability of reaching the error state during abnormal conditions. The HMM-based machine-learning approach helped to find more detailed information on the temporal sequences of error occurrence, resulting in a promising recall as depicted in Table 7. In future work, the identified errors may be analyzed further for their contribution towards the reliability of the system.
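For reference, one Baum–Welch re-estimation step has the following shape. This is a generic, unscaled NumPy sketch of the standard update equations, not the authors' implementation, and the matrices it operates on are whatever the case study supplies:

```python
import numpy as np

def baum_welch_step(pi, A, B, obs):
    """One EM re-estimation of HMM initial, transition, and observation matrices."""
    obs = np.asarray(obs)
    N, T = len(pi), len(obs)
    # Forward and backward passes (unscaled, for brevity; real code should scale)
    alpha = np.zeros((T, N)); beta = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    likelihood = alpha[-1].sum()
    # Posteriors: gamma[t, i] = P(state i at t | obs), xi[t, i, j] = P(i -> j at t | obs)
    gamma = alpha * beta / likelihood
    xi = (alpha[:-1, :, None] * A[None] *
          (B[:, obs[1:]].T * beta[1:])[:, None, :]) / likelihood
    # Re-estimated parameters
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.zeros_like(B)
    for k in range(B.shape[1]):
        new_B[:, k] = gamma[obs == k].sum(axis=0) / gamma.sum(axis=0)
    return new_pi, new_A, new_B
```

Iterating this step until the likelihood stops improving yields the re-estimated transition and observation matrices the study reports.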
Estimating Forest Fire Losses Using Stochastic Approach: Case Study of the Kroumiria Mountains (Northwestern Tunisia)
Published in Applied Artificial Intelligence, 2018
Ahmed Toujani, Hammadi Achour, Sami Faïz
Since recognition of burned-area size is the main interest in this work, a confusion matrix is used to store the correct and incorrect classifications made by the HMM model. In order to use the proposed HMM model as a classifier, the Viterbi algorithm was employed to decide the most likely state sequence given the model and its matching observation sequence. To recognize the burned-area class for a given fire, two steps are considered: first, the clusters of fire factors characterizing the fire in question and the previous fires that occurred in the corresponding parcel are reorganized into an ordered sequence. Once the sequence of clusters has been constructed, the state sequence is determined by the Viterbi process, and the end state gives the corresponding burned-area class (see the sketch below).
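In code, the two-step procedure reduces to decoding the ordered cluster sequence and reading off the final state. A hedged sketch reusing the max_sum_chain decoder defined earlier; the names classify_burned_area and cluster_sequence are illustrative, not from the paper:

```python
import numpy as np

def classify_burned_area(cluster_sequence, log_pi, log_A, log_B):
    """Step 1: the fire-factor clusters for this fire and the previous fires
    in the same parcel, ordered in time, form the observation sequence.
    Step 2: Viterbi decoding gives the most likely state sequence, whose
    end state is taken as the burned-area class."""
    states = max_sum_chain(log_pi, log_A, log_B, np.asarray(cluster_sequence))
    return states[-1]  # the end state is the predicted burned-area class
```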
Refinement of HMM Model Parameters for Punjabi Automatic Speech Recognition (PASR) System
Published in IETE Journal of Research, 2018
Virender Kadyan, Archana Mantri, R. K. Aggarwal
Pattern matching of the input signal is performed using the knowledge of the trained HMM model and the language model. Training the untrained HMM phone models requires a pronunciation lexicon and a well-defined phone set. With the help of these files, the system is ready to train the transition matrix A (with entries $a_{ij}$) and the output likelihood estimator B (with entries $b_j(o_t)$) for the HMMs. The Viterbi algorithm is used to determine the most suitable sequence of hidden states of an HMM model using the following recursion, where $i = 1, 2, 3, \dots, n$:

$$\delta_t(j) = \max_{1 \le i \le n} \bigl[\delta_{t-1}(i)\, a_{ij}\bigr]\, b_j(o_t)$$

The algorithm initializes the probability using Equation (22) by taking the product of the initial hidden state probabilities with the associated observation probabilities using Equation (23). It then finds the most probable way to the next state; this is achieved by taking the product of the maximal probabilities derived in Equation (24) from the earlier step with the transition probabilities [14]. To follow the most probable route, the backtracking of Equation (25) is used. The sequence $i_1 \dots i_T$ then holds the most probable sequence of hidden states. The Viterbi algorithm derives its efficiency from concentrating on the survivor paths of the trellis. In the pattern-matching process this is achieved by passing the parameters SEQ (the observation sequence), A (the transition probability matrix), and B (the emission probability matrix) to the Viterbi function, which calculates the most likely path through the HMM specified by these matrices.
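Putting the four described steps together, a minimal NumPy sketch of such a Viterbi function follows. It is an assumed implementation, not the authors' code, and it takes an initial distribution pi in addition to the SEQ, A, B parameters named in the passage:

```python
import numpy as np

def viterbi(SEQ, A, B, pi):
    """Most likely hidden state sequence for observation sequence SEQ,
    given transition matrix A, emission matrix B, and initial
    distribution pi (pi is an assumption; the passage passes SEQ, A, B)."""
    SEQ = np.asarray(SEQ)
    N, T = A.shape[0], len(SEQ)
    delta = np.zeros((T, N))            # delta[t, i]: best path probability so far
    psi = np.zeros((T, N), dtype=int)   # psi[t, i]: best predecessor state
    # Initialization: initial state probabilities times the associated
    # observation probabilities
    delta[0] = pi * B[:, SEQ[0]]
    # Recursion: maximal probabilities from the earlier step times the
    # transition probabilities, times the emission probability
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, SEQ[t]]
    # Backtracking: follow the most probable route backwards through psi
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]                   # the sequence i_1 ... i_T
```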