The Inquirer: Sensing and Learning in Swan Lake
Published in Çağrı Haksöz, Risk Intelligent Supply Chains, 2018
Surely, managing unexpected event risks is not trivial. In our context, to analyze extreme events in a supply chain network scientifically, we need a proper set of methods and tools. One useful candidate is extreme value theory, which studies the statistical properties of extreme events in a variety of disciplines such as climate science, insurance, financial markets, and earthquake modeling. I should note that people's worries and concerns within the supply chain network affect how effectively these methods are used.
Compressive Sensing Meets Noise Radar
Published in Moeness Amin, Compressive Sensing for Urban Radar, 2017
Mahesh C. Shastry, Ram M. Narayanan, Muralidhar Rangaswamy
There have been extensions to Monte Carlo simulations to accommodate the occurrence of rare events in the context of radar signal processing (Broadwater and Chellappa 2010). One such approach involves estimating probabilities using extreme value theory (Ozturk et al. 1996; Broadwater and Chellappa 2010). We propose applying an approach based on limit theorems from extreme value theory to compressive noise radar. Extreme value theory is the study of the probabilities of rare events in stochastic systems; its basic results concern the statistics of the extremes of ordered random variables. It has previously been applied to problems in finance, climate science, and geophysical modeling. In electrical engineering, the utility of extreme value theory was first proposed for problems in detection theory by Ozturk et al. (1996). In the context of compressive sensing, our proposal is to extrapolate the probabilities of rare events from a few instances of solving the convex optimization problem. A manageable number of instances of the convex optimization problem is used to generate the statistics of compressive sensing for various values of (ρ, δ), and these statistics are used to compute thresholds for small values of P_fa (< 10⁻⁴).
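As a rough illustration of this extrapolation idea (our own sketch, not the authors' code: the function name, the tail fraction, and the stand-in noise statistics are all assumptions), one can fit a generalized Pareto distribution to the exceedances of a manageable number of noise-only test statistics and invert the fitted tail for a threshold at a target P_fa far below 1/N:

```python
# Sketch: peaks-over-threshold extrapolation of a detection threshold
# for a small false-alarm probability from limited Monte Carlo runs.
import numpy as np
from scipy.stats import genpareto

def evt_threshold(noise_stats, p_fa, tail_frac=0.1):
    """Estimate the threshold achieving a target P_fa by fitting a
    generalized Pareto distribution (GPD) to tail exceedances."""
    u = np.quantile(noise_stats, 1.0 - tail_frac)      # high tail cutoff
    exceedances = noise_stats[noise_stats > u] - u
    shape, _, scale = genpareto.fit(exceedances, floc=0.0)
    # P(X > u + y) ~= tail_frac * (1 - F_GPD(y)); solve for y at P_fa
    y = genpareto.ppf(1.0 - p_fa / tail_frac, shape, loc=0.0, scale=scale)
    return u + y

# e.g., 10^4 noise-only instances extrapolated to a P_fa = 1e-6 threshold
rng = np.random.default_rng(0)
stats = rng.standard_normal(10_000) ** 2               # stand-in statistics
print(evt_threshold(stats, p_fa=1e-6))
```

The point of the extrapolation is that a direct empirical quantile would require on the order of 1/P_fa simulation runs, whereas the fitted tail needs only enough runs to estimate the GPD parameters.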
Count-based change point detection via multi-output log-Gaussian Cox processes
Published in IISE Transactions, 2020
Once the change scores of the measured responses are computed, a decision can be made as to whether the system has experienced a change in its characteristics. There are various methods to construct criteria for such decision-making. One simple approach is to set a hard threshold and trigger alarms whenever the change score goes beyond the predefined threshold (Liu, 1995). Due to its simplicity and ease of implementation, this method has been extensively used in various industries (we will see an example in Section 4.3). A probabilistic approach, whereby a threshold is determined from the historical data of the change scores, can also be used. This approach can be reliable when there is a vast amount of historical data containing sufficient instances of changes. If such data are not available, this approach may not determine the thresholds accurately, which in turn results in poor change detection performance. To overcome this accuracy problem, there has been extensive research on how to determine a threshold for decision-making in change point detection algorithms. For example, Extreme Value Theory (EVT), which models the distribution of extremely rare events, has been employed for various applications such as natural disaster analysis and financial risk analysis (McNeil and Frey, 2000; Haan et al., 2007).
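A minimal sketch of one such EVT-based threshold rule (our illustration, not the paper's method; the block size, the target exceedance rate, and the synthetic in-control scores are assumptions): fit a Gumbel distribution to block maxima of historical in-control change scores and place the alarm threshold at a high quantile of the fitted law, so that an alarm during normal operation is a controlled rare event.

```python
# Sketch: EVT (block-maxima / Gumbel) alarm threshold for change scores.
import numpy as np
from scipy.stats import gumbel_r

def evt_alarm_threshold(change_scores, block_size=50, alpha=1e-3):
    """Threshold that an in-control block maximum exceeds with
    probability alpha under the fitted Gumbel model."""
    n_blocks = len(change_scores) // block_size
    maxima = change_scores[: n_blocks * block_size] \
        .reshape(n_blocks, block_size).max(axis=1)
    loc, scale = gumbel_r.fit(maxima)
    return gumbel_r.ppf(1.0 - alpha, loc=loc, scale=scale)

rng = np.random.default_rng(1)
scores = rng.gamma(2.0, 1.0, size=5_000)  # stand-in in-control change scores
print(evt_alarm_threshold(scores))
```

Because only in-control data are needed, this style of rule sidesteps the scarcity of labeled change cases that hampers the purely empirical probabilistic approach.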
Developing proximal safety indicators for assessment of un-signalized intersection – a case study in Surat city
Published in Transportation Letters, 2020
Srinivasula Reddy K.A., Akhilesh Chepuri, Shriniwas S. Arkatkar, Gaurang Joshi
For the present case of an un-signalized intersection, extreme value theory has been applied to both scenarios, with and without speed bumps. For this, the PET data set was tested against the candidate statistical distributions. Goodness-of-fit results from the K-S test showed that the Generalized Extreme Value (GEV) distribution is the best fit among the candidates. The GEV distribution is a family of continuous probability distributions developed within extreme value theory to combine the Gumbel, Fréchet, and Weibull families, also known as the type I, II, and III extreme value distributions. By the extreme value theorem, the GEV distribution is the only possible limit distribution of properly normalized maxima of a sequence of independent and identically distributed random variables. This distribution has three parameters, namely the location parameter (µ), the scale parameter (σ), and the shape parameter (k ≠ 0). The values of these parameters for the scenarios with and without speed bumps are shown in Table 6. There is a significant variation in the location parameter (µ) compared with the other two parameters. The mean PET value decreased from 0.961 s to 0.736 s in the presence of speed bumps, thereby enhancing safety.
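To make the fitting and distribution-selection step concrete (a hedged sketch with made-up PET values, not the paper's data; note that scipy parameterizes the GEV with shape c, where c = −k relative to the convention above):

```python
# Sketch: fitting a three-parameter GEV to PET observations and
# checking the fit with the K-S test, mirroring the paper's procedure.
import numpy as np
from scipy.stats import genextreme, kstest

pet = np.array([0.52, 0.61, 0.73, 0.88, 0.94,
                1.02, 1.10, 1.25, 1.31, 1.48])   # illustrative PET values (s)

c, mu, sigma = genextreme.fit(pet)               # shape, location, scale
print(f"k = {-c:.3f}, mu = {mu:.3f} s, sigma = {sigma:.3f} s")

# K-S goodness-of-fit against the fitted GEV
print(kstest(pet, genextreme(c, loc=mu, scale=sigma).cdf))
```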
Statistical analysis of the largest possible earthquake magnitudes on the Ecuadorian coast for selected return periods
Published in Georisk: Assessment and Management of Risk for Engineered Systems and Geohazards, 2020
Sandra García-Bustos, Jimmy Landín, Ricardo Moreno, A.S.E. Chong, Maurizio Mulas, Mónica Mite, Nadia Cárdenas
Initially, the unusual behaviour of a random variable, that is, its maximum or minimum, was studied empirically (Castillo 1988). Fréchet (1927) and Fisher and Tippett (1928) proposed the Extreme Value Theorem, which states that the limiting distributions of the maximum or minimum of a random variable belong to only three parametric families (Gumbel, Weibull and Fréchet). Gnedenko proved the Extreme Value Theorem in 1943 (Gnedenko 1943). In general, the main applications of extreme value theory focus on calculating the probabilities of atypical floods, of an insurer's large losses exceeding a certain threshold, of financial risks, or of large forest fires. This type of analysis is very useful in fields such as neurology, hydrology, economics, and actuarial science, among others. The methodology is also valid for modelling earthquakes: Burton and Makropoulus (1985) used the Gumbel distribution to model the seismic risk of Circum-Pacific earthquakes, and Zimbidis, Frangos, and Pantelous (2007) analysed historical data on earthquakes in the border area of Greece to produce a reliable model of the risk of earthquake magnitudes. Other related works can be found in Ameer et al. (2004), Pisarenko, Sornette, and Rodkin (2010), Singh, Shanker, and Ali (2015) and Pavlenko (2017).
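As a worked illustration of how such a Gumbel model connects to return periods (standard EVT, not a result from the paper): if annual maximum magnitudes follow a Gumbel distribution with location µ and scale σ, the magnitude exceeded on average once every T years is obtained by inverting the Gumbel CDF at probability 1 − 1/T:

\[
F(m) = \exp\!\left(-e^{-(m-\mu)/\sigma}\right), \qquad
m_T = F^{-1}\!\left(1 - \tfrac{1}{T}\right)
    = \mu - \sigma \ln\!\left(-\ln\!\left(1 - \tfrac{1}{T}\right)\right).
\]

For example, with µ = 6.0 and σ = 0.5 (hypothetical values), the 100-year return level is m_100 = 6.0 − 0.5 ln(−ln 0.99) ≈ 8.3.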