Parallel Algorithms
Published in Vivek Kale, Parallel Computing Architectures and APIs, 2019
The question NC = P? is an open problem in parallel complexity theory. A P-complete problem is a problem in P to which every other problem in P can be reduced in polylogarithmic time using a polynomial number of processors. Consequently, if a polylogarithmic-time algorithm using a polynomial number of processors were ever found for any P-complete problem, then every problem in P would be efficiently parallelizable and NC = P. Some of these problems have been known for years and studied extensively by numerous researchers, so the lack of efficient parallel algorithms for them strongly supports the conjecture that NC ≠ P.
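The NC-style computations mentioned above can be illustrated with a toy simulation (a hedged sketch, not from the text): summing n numbers by tree reduction takes O(log n) parallel rounds, where each round combines all adjacent pairs simultaneously using O(n) processors.

```python
def parallel_sum_rounds(values):
    """Simulate a tree reduction; return (total, number_of_parallel_rounds).

    Each round pairs up adjacent partial sums, halving the problem size,
    so the round count is ceil(log2(n)) -- polylogarithmic depth.
    """
    vals = list(values)
    rounds = 0
    while len(vals) > 1:
        # one simulated parallel round: every adjacent pair is combined at once
        paired = [vals[i] + vals[i + 1] for i in range(0, len(vals) - 1, 2)]
        if len(vals) % 2:          # odd element carries over unchanged
            paired.append(vals[-1])
        vals = paired
        rounds += 1
    return vals[0], rounds

total, rounds = parallel_sum_rounds(range(16))  # total = 120, rounds = 4
```

With 16 inputs the reduction finishes in log2(16) = 4 rounds, the kind of polylogarithmic depth that characterizes NC.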
Scene recognition of road traffic accident based on an improved faster R-CNN algorithm
Published in International Journal of Crashworthiness, 2022
Fenghui Wang, Jie Qiao, Lingyi Li, Yongtao Liu, Lang Wei
In order to improve computational efficiency, the box-classification convolution layer and the box-regression convolution layer are merged into a single convolution layer in the MSRPN. The structure of the merged convolution layer is 1 × 1 × (4 + 1)k: for each of the k region candidate boxes it outputs four coordinates and one score. The score reflects the probability that the region candidate box contains the target to be detected. In other words, the MSRPN uses only a single probability value p to discriminate the region candidate boxes: when the probability value p of a region candidate box is greater than a certain threshold, the box is considered foreground; otherwise, it is considered background.
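The merged-head idea can be sketched as follows (an illustrative NumPy example; the variable names, shapes, and threshold are assumptions, not the paper's implementation): at one spatial location the merged 1 × 1 convolution emits (4 + 1) values per candidate, and a single threshold on p separates foreground from background.

```python
import numpy as np

k = 2  # region candidate boxes per location (illustrative)
# Assumed flat head output for k = 2 candidates: [x, y, w, h, p] per candidate
head_out = np.array([10.0, 12.0, 5.0, 8.0, 0.9,
                     40.0, 42.0, 6.0, 6.0, 0.2])

per_box = head_out.reshape(k, 4 + 1)
boxes   = per_box[:, :4]   # coordinates of the k region candidate boxes
scores  = per_box[:, 4]    # one probability value p per candidate box

threshold = 0.5  # assumed value of "a certain threshold"
is_foreground = scores > threshold  # True -> foreground, False -> background
```

Here the first candidate (p = 0.9) is kept as foreground and the second (p = 0.2) is discarded as background, matching the single-probability discrimination described above.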
A stochastic joint replenishment problem considering transportation and warehouse constraints with gainsharing by Shapley Value allocation
Published in International Journal of Production Research, 2019
Carlos Otero-Palencia, René Amaya–Mier, Ruben Yie-Pinedo
The heuristic procedure integrates GAs and the Shapley function, as shown in Figure 2. The proposed evolutionary process required to assess the Shapley value must be executed an exponential number of times, once per characteristic-function evaluation. In our case, this function was shown in (5) and depends on the players' parameters. Computing the Shapley value is a #P-complete problem (Deng and Papadimitriou 1994); since the number of iterations is exponential, the required computations become impractical for particularly complex characteristic functions. We use GAs to reduce the computational cost of each JRP iteration, thereby decreasing the overall burden of the problem. This illustrates the unique and seamless interaction of the S-CJRP components, given that only by means of GAs can the numerous JRP instances needed to assess the Shapley function be computed. On the other hand, although our GA (Step 2) is capable of solving large JRP problems, it evaluates the Shapley function by exact methods. This implies that for problems with more than 30 items, the calculations could be prohibitive on a conventional computer. Fortunately, this is a minor issue, since in the model one item can represent a family composed of tens or hundreds of aggregated items that can be jointly replenished. The items in a family often share similar demand patterns, uses, and physical features, which is why they can be unified in the model.
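The exponential cost of exact Shapley computation can be seen in a minimal sketch (not the paper's code): the classic formula sums a weighted marginal contribution over every coalition, so the characteristic function `v` is evaluated O(2^n) times. The toy characteristic function below is an assumption for illustration, not the S-CJRP function from Eq. (5).

```python
from itertools import combinations
from math import factorial

def shapley_values(players, v):
    """Exact Shapley values by enumerating all coalitions (O(2^n) calls to v)."""
    n = len(players)
    phi = {p: 0.0 for p in players}
    for p in players:
        others = [q for q in players if q != p]
        for size in range(n):
            for coalition in combinations(others, size):
                s = set(coalition)
                # weight |S|! (n - |S| - 1)! / n! for player p joining coalition S
                weight = factorial(len(s)) * factorial(n - len(s) - 1) / factorial(n)
                phi[p] += weight * (v(s | {p}) - v(s))
    return phi

# Toy characteristic function: a coalition's value is its size squared.
v = lambda s: len(s) ** 2
phi = shapley_values(["A", "B", "C"], v)
# symmetric game: each player receives v({A,B,C}) / 3 = 3 (up to float rounding)
```

Because the enumeration doubles with every added player, this exact approach becomes prohibitive around a few dozen items, which is the 30-item limit the authors note.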
Combining bounded solving and controllable randomization for approximate model counting
Published in Journal of Experimental & Theoretical Artificial Intelligence, 2022
Shuai Lü, Tongbo Zhang, Yue Xu, Wenbo Zhou, Yong Lai
Propositional model counting, also known as #SAT, is the problem of computing the model count — the number of satisfying truth assignments — of a given CNF formula. As an important extension of propositional satisfiability (SAT) with higher complexity, #SAT is #P-complete rather than NP-complete (Toda, 1989): it asks for the exact model count of a formula, not merely whether it is satisfiable. Many problems in artificial intelligence with complexity higher than NP can be transformed into model counting problems, such as probabilistic inference (Chakraborty et al., 2014, 2016) and conformant probabilistic planning (Domshlak & Hoffmann, 2007).
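A brute-force model counter makes the definition concrete (a hedged sketch, exponential in the number of variables, so only usable on tiny formulas — exact #SAT is #P-complete). Clauses are written DIMACS-style: the integer i denotes variable i, and -i its negation.

```python
from itertools import product

def count_models(n_vars, clauses):
    """Count satisfying assignments of a CNF formula by full enumeration."""
    count = 0
    for assignment in product([False, True], repeat=n_vars):
        # literal lit is satisfied when its sign matches the variable's value
        if all(any((lit > 0) == assignment[abs(lit) - 1] for lit in clause)
               for clause in clauses):
            count += 1
    return count

# (x1 OR x2) AND (NOT x1 OR x2): satisfied exactly when x2 is true -> 2 models
print(count_models(2, [[1, 2], [-1, 2]]))  # 2
```

The loop visits all 2^n assignments, which is exactly the blow-up that approximate model counters such as the one in this article aim to avoid.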