High-Level Modeling and Design Techniques
Published in Soumya Pandit, Chittaranjan Mandal, Amit Patra, Nano-Scale CMOS Analog Circuits, 2018
Soumya Pandit, Chittaranjan Mandal, Amit Patra
The time complexity of an algorithm is calculated from the number of elementary computational steps the algorithm takes to compute the function it was developed for. The number of steps is interpreted as a function of the input size. However, it may be noted that the total number of elementary computational steps usually varies from input to input of the same size, because of the presence of conditional statements such as an if-else statement. Average-case complexity is therefore considered a more meaningful characterization of the algorithm. However, accurately determining the average-case complexity of an algorithm is not an easy task, which necessitates the use of the worst-case complexity metric. The worst-case complexity of an algorithm is its complexity with respect to the worst possible inputs; it gives an upper bound on the average-case complexity.
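As a minimal illustration of why the step count varies from input to input, consider linear search (a standard textbook example, not drawn from this chapter): the conditional early exit makes the number of comparisons depend on where, and whether, the target occurs.

# Linear search: the number of elementary comparisons depends on the input,
# not just on its size n, because of the conditional early exit.
def linear_search(items, target):
    steps = 0
    for i, value in enumerate(items):
        steps += 1                      # one elementary comparison per iteration
        if value == target:             # conditional: early exit shortens the run
            return i, steps
    return -1, steps                    # worst case: all n elements examined

data = [7, 3, 9, 1, 5]
print(linear_search(data, 3))   # found early  -> (1, 2): 2 steps
print(linear_search(data, 5))   # found last   -> (4, 5): 5 steps
print(linear_search(data, 8))   # absent       -> (-1, 5): worst case, n steps

Averaging over all target positions gives roughly (n+1)/2 comparisons, while the worst case (target last or absent) is exactly n, which bounds the average from above.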
Optimization Using Heuristic Search
Published in Kaushik Kumar, Divya Zindani, J. Paulo Davim, Optimizing Engineering Problems through Heuristic Techniques, 2020
Kaushik Kumar, Divya Zindani, J. Paulo Davim
Worst-case analysis: An example that exposes the weakness of the algorithm, usually referred to as a pathological example, needs to be constructed. However, finding such an example is difficult, especially for complex problems. A major drawback of this theoretically strong analysis is that the instances arising in the problem under study rarely resemble the worst case. It is therefore beneficial to understand the problem under consideration well enough to judge whether it truly resembles the worst-case example. Worst-case analysis nevertheless provides a useful measure, because it guarantees a bound on the algorithm's performance that holds for every input, including the instances met in real life.
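The passage does not name a particular algorithm, but quicksort with a fixed first-element pivot is a standard illustration of a pathological example: an already-sorted input forces maximally unbalanced partitions and drives the comparison count from roughly n log n to n(n-1)/2. A sketch, assuming this textbook variant:

import random

def quicksort(items):
    """Quicksort with a first-element pivot; returns (sorted list, #comparisons)."""
    if len(items) <= 1:
        return list(items), 0
    pivot = items[0]
    left = [x for x in items[1:] if x < pivot]
    right = [x for x in items[1:] if x >= pivot]
    comparisons = len(items) - 1               # each element compared to the pivot once
    left_sorted, c_left = quicksort(left)
    right_sorted, c_right = quicksort(right)
    return left_sorted + [pivot] + right_sorted, comparisons + c_left + c_right

n = 200
_, c_random = quicksort(random.sample(range(n), n))
_, c_sorted = quicksort(list(range(n)))        # the pathological example
print(c_random)                                # typically on the order of n log n
print(c_sorted)                                # exactly n(n-1)/2 = 19900

Constructing the pathological input here is easy; for complex heuristics, as the passage notes, finding such an input is itself a hard task.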
Analysis of maximum flow algorithms for ultimate pit contour problems
Published in Heping Xie, Yuehan Wang, Yaodong Jiang, Computer Applications in the Mineral Industries, 2020
The complexity of algorithms is stated for worst-case scenarios. What is the real complexity when the algorithm is applied to G(x)? At first sight, this complexity may look disastrous for the UPC problem, since the capacity on arcs (i,j) ∈ A is a very large number (theoretically this capacity is set to infinity). However, these arcs won't be part of the minimal cut; otherwise, we would have an unbounded problem. So for the following explanations, we will consider U as the upper bound on the absolute value of a block in the mine, i.e. the maximal capacity among outgoing arcs of node s (arcs (i,j) ∈ A+) or incoming arcs of node t (arcs (i,j) ∈ A−). In practice we know that the number of augmenting paths will be bounded by O(β), where β = min{β1, β2}, β1 = ∑i∈V+ ci and β2 = ∑i∈V− |ci|. Since β ≪ nU, it represents a better upper bound.
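A minimal sketch of this bound on a hypothetical list of block values ci (positive = ore, negative = waste; the numbers are illustrative, not from the paper). In the max-flow formulation, the source s feeds every positive-value block and every negative-value block feeds the sink t, so β = min{β1, β2} caps the total capacity available to augmenting paths.

# Hypothetical block values c_i for a toy block model
block_values = [50, -12, 30, -8, -25, 40, -60, 15]

beta1 = sum(c for c in block_values if c > 0)    # sum over i in V+ of c_i
beta2 = sum(-c for c in block_values if c < 0)   # sum over i in V- of |c_i|
beta = min(beta1, beta2)

print(beta1, beta2, beta)   # 135 105 105 -> at most O(beta) augmenting paths

Even on this tiny instance, β = 105 while nU = 8 × 60 = 480; on realistic block models the gap is far larger, which is why β ≪ nU in practice.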
Uncertain frequency responses of CNT – reinforced polymeric graded structure using fuzzified elastic properties – fuzzy finite element approach
Published in Waves in Random and Complex Media, 2022
Sridhar Bondla, Nitin Sharma, Subrata Kumar Panda, Chetan Kumar Hirwani, S.R. Mahmoud, Vikash Kumar
The mechanical behavior of composite structures under uncertain, fuzzy, or random material properties is essential in the design of nanocomposites. An analysis is reliable and safe only when the uncertain material properties that cause this variability are taken into account. To solve this problem, many authors have used probabilistic [2] and non-probabilistic methods [3]. Still, probabilistic techniques such as the stochastic finite element method (FEM), Monte Carlo simulation (MCS), etc., require a large amount of sample data, which becomes expensive for realistic models. There is a need to develop methods that are reliable and that reduce computational cost. Hence, computational non-probabilistic methods such as fuzzy logic [4], neural networks [5], fuzzy finite element methods [6], etc., are needed. The fuzzy logic technique can address the problems mentioned above: it is capable of handling fuzzy information that is linguistic, qualitative, vague, and imprecise, and it is a more reliable design and analysis tool under uncertainty. Its underlying mathematical formulation is based on worst-case situations, which helps to build a simplified and reliable analysis method.
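As a sketch of the worst-case reasoning underlying such fuzzy formulations, the following propagates a triangular fuzzy elastic modulus through a simple monotone response using α-cuts. The fuzzy number, density, geometry, and bar-frequency formula are illustrative assumptions, not the formulation used in the paper.

import math

def alpha_cut(low, peak, high, alpha):
    """Interval of a triangular fuzzy number (low, peak, high) at membership level alpha."""
    return (low + alpha * (peak - low), high - alpha * (high - peak))

# Hypothetical fuzzified elastic modulus of the composite (GPa)
E_low, E_peak, E_high = 60.0, 70.0, 80.0
rho = 1600.0                        # assumed density, kg/m^3
L = 0.5                             # assumed bar length, m

def frequency(E_gpa):
    """Fundamental axial frequency of a fixed-free bar: f = sqrt(E/rho) / (4L)."""
    return math.sqrt(E_gpa * 1e9 / rho) / (4.0 * L)

# Worst-case (interval endpoint) propagation at each alpha level: since f is
# monotone in E, the endpoints of the alpha-cut bound the fuzzy response.
for alpha in (0.0, 0.5, 1.0):
    E_min, E_max = alpha_cut(E_low, E_peak, E_high, alpha)
    print(f"alpha={alpha:.1f}: f in [{frequency(E_min):.0f}, {frequency(E_max):.0f}] Hz")

Only the interval endpoints need to be evaluated at each α level, which is what keeps the worst-case formulation simple compared with sampling-based probabilistic methods.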
Worst-case analysis for a leader–follower partially observable stochastic game
Published in IISE Transactions, 2022
Yanling Chang, Chelsea C. White
Worst-case analysis is an effective tool for analyzing hard problems that lack polynomial-time algorithms (e.g., vehicle routing), and it has been used to obtain several important performance guarantees (e.g., lower bounds) that are useful for evaluating attractive algorithms or policies for these problems (Bramel and Simchi-Levi, 1997). Second, it reduces the data requirements on the adversary and has been widely used in single-period games and stochastic games to cope with uncertainty about an agent's behavior towards others (Aghassi and Bertsimas, 2006; Kardes et al., 2011). More importantly, although worst-case analysis can sometimes be very conservative, it provides a useful guide for many risk-averse situations, especially those with severe consequences (e.g., death, nuclear war). For example, the benchmarks provided by such analyses form a necessary basis for: (i) evaluating alternative control strategies (e.g., the trade-off between risk and system productivity); and (ii) quantifying the value of acquiring more data (presumably at a cost) to improve the understanding of adversary behavior for resource justification. In addition, a defending agent may still need to protect itself when facing a new, unknown adversary, even before it is able to set up and collect information on that adversary.
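A minimal sketch of this worst-case reasoning on a hypothetical payoff matrix (not from the paper): with no model of the adversary, the defending agent can select the maximin action, whose worst-case payoff is a guaranteed lower bound against any adversary behavior.

# Maximin (worst-case) action selection. Rows = defender actions,
# columns = adversary actions, entries = defender's payoff (hypothetical).
payoffs = [
    [4, -2, 1],   # action 0
    [2,  1, 0],   # action 1
    [5, -6, 3],   # action 2
]

worst_case = [min(row) for row in payoffs]            # adversary picks the worst column
best_action = max(range(len(payoffs)), key=lambda a: worst_case[a])

print(worst_case)    # [-2, 0, -6]
print(best_action)   # 1 -> guarantees payoff >= 0 against any adversary

Action 2 has the highest best-case payoff but the worst guarantee; the maximin choice trades optimism for a floor on performance, which is exactly the conservatism, and the value, of worst-case analysis noted above.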