Preliminary Test and Evaluation
Published in Udo W. Pooch, Alan D. George, Lois Wright Hawkes, Microprocessor-Based Parallel Architecture for Reliable Digital Signal Processing Systems, 2018
Alan D. George, Lois Wright Hawkes
As previously discussed, the reliability R(t) of a component or system represents the probability that the device will operate correctly throughout some interval [t0, t], given that the device was operating at initial time t0. The assumption of a constant failure rate λ leads to the exponential failure law [JOHN89]:

R(t) = e^(−λt)

The most widely used techniques for reliability analysis of systems are the analytical methods. Of these, the two most commonly used are the combinatorial and Markov modeling approaches. Combinatorial methods use probability theory to estimate the reliability of a system based on the reliability of its individual components and their interconnections. Markov models use the concepts of system state and state transition to develop a state diagram that describes the system.
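The exponential failure law above is straightforward to evaluate directly. The following sketch computes R(t) for an illustrative failure rate; the numeric values are examples only, not taken from the text:

```python
import math

def reliability(t, lam):
    """Reliability under the exponential failure law: R(t) = exp(-lambda * t)."""
    return math.exp(-lam * t)

# Illustrative: a component with failure rate 1e-4 failures/hour, over 1000 hours
r = reliability(1000, 1e-4)
# R(1000) = e^(-0.1) ≈ 0.905
```

Note that R(0) = 1 by construction: a device assumed operational at t0 is operational with certainty at that instant.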
Sensor- and Recognition-Based Input for Interaction
Published in Julie A. Jacko, The Human–Computer Interaction Handbook, 2012
One popular special case of the DBN is the Hidden Markov Model (HMM), often used to model time-varying signals such as speech, gesture, pen strokes, and so on. HMMs model observations yt conditioned on a state variable xt, which evolves over time as P(x_t | x_{t−1}). As with many probabilistic models, HMMs are generative in nature, meaning one of the ways we can understand them is to consider “running them forward” to generate new observations: an HMM can be thought of as a stochastic finite state machine (Markov model) that generates observations drawn from a distribution associated with each state. With each time step, the Markov model takes a transition to a new state. In the inference (recognition) process, the posterior distribution over the hidden state variable xt is computed from the observation sequence yt (Rabiner 1989). HMMs have been applied to many types of observation sequences, including hand gestures, handwriting, speech, and so on. In the simplest application paradigm, a separate HMM is trained for each class.
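The generative view described above ("running the model forward") can be sketched directly. The two-state model below, with its state names and probabilities, is purely hypothetical and chosen only to illustrate the mechanics of a discrete-emission HMM:

```python
import random

def sample_hmm(transition, emission, start_state, steps, seed=0):
    """Run an HMM 'forward': walk the Markov chain over hidden states and
    draw one observation from each state's emission distribution."""
    rng = random.Random(seed)
    state = start_state
    observations = []
    for _ in range(steps):
        # Emit an observation from the current state's distribution
        symbols, probs = zip(*emission[state].items())
        observations.append(rng.choices(symbols, weights=probs)[0])
        # Take a transition to the next hidden state
        next_states, tprobs = zip(*transition[state].items())
        state = rng.choices(next_states, weights=tprobs)[0]
    return observations

# Hypothetical two-state model with discrete emissions (illustrative values)
transition = {"rest": {"rest": 0.8, "move": 0.2},
              "move": {"move": 0.7, "rest": 0.3}}
emission = {"rest": {"low": 0.9, "high": 0.1},
            "move": {"low": 0.2, "high": 0.8}}
obs = sample_hmm(transition, emission, "rest", 10)
```

Inference runs the other way: given a sequence like `obs`, the forward-backward algorithm (Rabiner 1989) recovers the posterior over the hidden states that generated it.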
Probability and Statistics
Published in Jerry C. Whitaker, Electronic Systems Maintenance Handbook, 2017
It only takes an intuitive understanding of three basic principles to construct Markov models. These are (1) constant rate processes, (2) independent competing events, and (3) independent sequential events. These three principles are described, and it is shown (by several examples) that an intuitive understanding of these principles is sufficient to construct Markov models. A Markov model consists of states and transitions. The states in the model correspond to identifiable states in the system, and the transitions in the model correspond to transitions between the states of the system.
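The correspondence between model states and system states can be made concrete with a minimal sketch. The two-state "up/down" system and its per-step probabilities below are assumed for illustration, not drawn from the text:

```python
# States and per-time-step transition probabilities (illustrative values):
# "up" -> "down" is a failure transition, "down" -> "up" is a repair transition.
P = {
    "up":   {"up": 0.99, "down": 0.01},
    "down": {"up": 0.10, "down": 0.90},
}

def step_distribution(dist, P):
    """Propagate a probability distribution over states through one transition."""
    out = {s: 0.0 for s in P}
    for s, p in dist.items():
        for s2, q in P[s].items():
            out[s2] += p * q
    return out

# Start fully in the "up" state and step the model forward three times
dist = {"up": 1.0, "down": 0.0}
for _ in range(3):
    dist = step_distribution(dist, P)
```

Each application of `step_distribution` is one matrix-vector product in disguise; the distribution always remains normalized because each row of `P` sums to one.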
A Stochastic Model for Performance Evaluation of Hybrid Network Architectures of IoT with an Improved Design
Published in IETE Journal of Research, 2023
The data transmission in a real-world IoT system may fail for several reasons, including sensor node failure, link failure, packet collision in multiple-access scenarios, buffer overflow at access points under heavy traffic, etc. This paper presents a stochastic model for assessing the reliability of data transmission, focusing on the failure of sensor nodes caused by link failures at the perception layer. The other factors are not considered, in order to keep the stochastic model simple and tractable. System reliability is inversely related to system failure. Since the failure of a system is a random or stochastic process occurring over time, it is justifiable to use the Poisson process to describe system failure [23]. Markov models can be used for stochastic system modeling to represent a system in a particular state at a particular time. Therefore, Markov chains [24] have been used to describe the transition of front-end sensor nodes from a healthy state to a failed state and vice versa. The failure of front-end sensor node links (wired or wireless) follows a Poisson process with rate λ, and they are repaired at a rate μ [25]. Table 1 gives the important notations commonly used in this paper along with their description.
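A failure/repair process with rates λ and μ is a two-state continuous-time Markov chain, whose long-run availability has the closed form μ/(λ + μ). The sketch below checks that closed form against a direct simulation with exponential holding times; the specific rate values are illustrative assumptions:

```python
import random

def steady_state_availability(lam, mu):
    """Long-run probability of the healthy state for a two-state chain
    with failure rate lam and repair rate mu: mu / (lam + mu)."""
    return mu / (lam + mu)

def simulate_uptime(lam, mu, horizon, seed=1):
    """Simulate failures (rate lam) and repairs (rate mu) as a two-state
    continuous-time Markov chain; return the fraction of time healthy."""
    rng = random.Random(seed)
    t, up_time, healthy = 0.0, 0.0, True
    while t < horizon:
        rate = lam if healthy else mu
        dwell = min(rng.expovariate(rate), horizon - t)
        if healthy:
            up_time += dwell
        t += dwell
        healthy = not healthy
    return up_time / horizon

# Illustrative rates: one link failure per 10 time units, repairs 10x faster
avail = simulate_uptime(0.1, 1.0, 10000.0)
```

Over a long horizon the simulated fraction converges to the analytical value, here 1.0/1.1 ≈ 0.909.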
Condition prediction and estimation of service life in the presence of data censoring and dependent competing risks
Published in International Journal of Pavement Engineering, 2019
Valentin Donev, Markus Hoffmann
The Markov chain approach is popular because it allows for modelling of M&R treatments and the use of standard optimisation techniques (dynamic programming). An often-made assumption is that within one transition the condition will remain in the current state or will change to the next worse state (Butt et al. 1987, Abaza 2017). Although this assumption seems plausible, it is often made without justification in order to reduce the number of unknown probabilities and to simplify the computations. In general, limitations of Markov models include cumbersome estimation of transition probability matrices, a small number of observed transitions to states with poorer condition, and no section-specific prediction. Explanatory variables may be considered only by further segmentation of the observations into homogeneous groups, at the cost of reducing the sample size (Madanat et al. 1995). Furthermore, the assumption of constant transition probabilities in the homogeneous case does not hold true for pavement deterioration due to cumulative traffic and ageing (Li et al. 1996, Abaza 2017, Hoffmann and Donev 2016).
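The transition probability matrices discussed above are typically estimated from observed condition histories by normalizing transition counts row by row. The sketch below shows this maximum-likelihood count-based estimate; the inspection sequences and the three condition states are hypothetical:

```python
from collections import Counter

def estimate_tpm(sequences, states):
    """Maximum-likelihood transition probability matrix from observed
    state sequences: count each (from, to) pair, then normalize each row."""
    counts = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
    tpm = {}
    for s in states:
        row_total = sum(counts[(s, t)] for t in states)
        tpm[s] = {t: (counts[(s, t)] / row_total if row_total else 0.0)
                  for t in states}
    return tpm

# Hypothetical inspection histories over condition states 1 (best) to 3 (worst)
seqs = [[1, 1, 2, 2, 3], [1, 2, 2, 3, 3], [1, 1, 1, 2, 2]]
tpm = estimate_tpm(seqs, [1, 2, 3])
```

Note how the sparsity problem mentioned in the text shows up here: with few observed transitions into the worst states, the corresponding rows of the matrix rest on very little data, and the "stay or move one state worse" assumption is often imposed precisely to avoid estimating the missing entries.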
Application of Markov models to predict changes in nasal carriage of Staphylococcus aureus among industrial hog operations workers
Published in Journal of Occupational and Environmental Hygiene, 2022
Melissa G. Edmondson, Christopher D. Heaney, Meghan F. Davis, Gurumurthy Ramachandran
In this paper, the authors propose a Markov model approach to predict changes in a worker’s nasal carriage of S. aureus over time. A Markov model comprises two or more states and corresponding transition probabilities that indicate the probability of moving from one state to another during a defined time. The probability of moving into a different state, or staying in the same state, depends only on the current state. The simplest form of a Markov model is a Markov chain, where all states are observable. Markov chain models have previously been proposed as a generalized approach to describe changes in health status over time and as a tool for medical decision making (Sonnenberg and Beck 1993). Examples in the literature include breast cancer (Fujii et al. 2019; Huang et al. 2020), infertility (Srinivasa Rao and Diamond 2020), hepatocellular carcinoma (Ishida et al. 2008), and HIV (Simpson et al. 2009). Batina and colleagues have previously used Markov models to describe how residents of community nursing homes transition between states of methicillin-resistant S. aureus carriage (Batina et al. 2016). In this paper, we constructed a Markov chain model to describe how a cohort of industrial hog operation workers transition between states of nasal carriage over a period of 4 months. Since the use of face masks has been found to influence worker carriage status in this cohort previously (Nadimpalli et al. 2018), we stratified our models by face mask use to demonstrate the effectiveness of exposure controls to modify long-term carriage rates. Finally, we selectively used different amounts of data to calculate transition probabilities between different states and to determine the minimum set of repeated, sequential measures needed to reasonably predict long-term nasal carriage rates.
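The long-term carriage rates such a model predicts are the stationary distribution of the chain, which can be approximated by stepping an initial distribution forward until it stops changing. The two carriage states and their transition probabilities below are illustrative assumptions, not values from the study:

```python
def stationary_distribution(P, iters=200):
    """Approximate long-run state occupancy by repeatedly applying the
    transition matrix to an initial distribution (power iteration)."""
    states = list(P)
    dist = {s: 1.0 / len(states) for s in states}
    for _ in range(iters):
        new = {s: 0.0 for s in states}
        for s, p in dist.items():
            for t, q in P[s].items():
                new[t] += p * q
        dist = new
    return dist

# Hypothetical two-state carriage model (illustrative probabilities only)
P = {"carrier":     {"carrier": 0.7, "non_carrier": 0.3},
     "non_carrier": {"carrier": 0.2, "non_carrier": 0.8}}
pi = stationary_distribution(P)
# Long-run carrier fraction: 0.2 / (0.2 + 0.3) = 0.4
```

Stratifying by an exposure control, as the authors do for face masks, amounts to estimating a separate `P` for each stratum and comparing the resulting stationary distributions.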