Introduction
Published in Chandrasekar Vuppalapati, Democratization of Artificial Intelligence for the Future of Humanity, 2021
In classical algorithm analysis, time complexity is a function describing the amount of time an algorithm takes as a function of the size of its input. “Time” here counts the number of memory accesses performed, the number of comparisons between integers, and the number of iterations the inner loop executes, because many factors unrelated to the algorithm can affect the real running time (the language used, the type of computing hardware, the proficiency of the programmer, optimizations in the compiler, etc.) [9]. For AI algorithms, the time complexity varies with the data size, the number of data-processing iterations, and the data depth level.
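To make the distinction between wall-clock time and counted operations concrete, the following minimal Python sketch (illustrative only; the function and numbers are not from the cited chapter) counts comparisons in a linear search, a quantity that depends only on the input size and not on the hardware, compiler, or programmer:

```python
def linear_search(data, target):
    """Return (index, comparisons).

    The comparison count is the machine-independent quantity that
    time-complexity analysis reasons about, unlike wall-clock time.
    """
    comparisons = 0
    for i, value in enumerate(data):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

# The worst-case comparison count grows linearly with n, i.e. O(n),
# regardless of where or how the code is run.
for n in (100, 1_000, 10_000):
    data = list(range(n))
    _, comps = linear_search(data, -1)  # target absent -> worst case
    print(f"n = {n:>6}: {comps} comparisons")
```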
Published in Fadi Al-Turjman, Smart Things and Femtocells, 2018
In terms of time complexity analysis, we use big-O notation, meaning that the time taken to solve a problem of size n can be described by O(f(n)). The traditional LS+ algorithm has a complexity of O(n) when non-enumerative (data-dependent) search is allowed [29], where n is the total count of APs in our proposed two-tier architecture. Our improved LS+, in Algorithm 3, has a time complexity of O(n log n), and thus the total time complexity of the proposed HCPF approach, in Algorithm 4, ends up being O(n² log n).
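How an O(n) routine grows to O(n log n) and then to O(n² log n) when wrapped in another linear loop can be seen in the toy sketch below; the functions are placeholders that only illustrate how the bounds compose, not the actual LS+ or HCPF procedures:

```python
def scan_best(aps):
    """O(n): a single data-dependent pass over the n access points."""
    return max(aps)

def scan_sorted(aps):
    """O(n log n): the sort dominates the constant-time lookup that follows."""
    ordered = sorted(aps)   # O(n log n)
    return ordered[-1]      # O(1) after sorting

def nested(aps):
    """O(n^2 log n): an O(n log n) step repeated once per access point."""
    return [scan_sorted(aps) for _ in aps]   # n * O(n log n)

aps = [3, 7, 2, 9, 4]
print(scan_best(aps), scan_sorted(aps), len(nested(aps)))
```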
Mobile Couriers and the Grid
Published in Fadi Al-Turjman, Smart Grid in IoT-Enabled Spaces, 2020
In terms of time complexity analysis, we use big-O notation, meaning that the time taken to solve a problem of size n can be described by O(f(n)). The traditional LS+ algorithm has a complexity of O(n) when non-enumerative (data-dependent) search is allowed [32], where n is the total count of APs in our proposed two-tier architecture. Our improved LS+, in Algorithm 6.3, has a time complexity of O(n log n), and thus the total time complexity of the proposed HCPF approach, in Algorithm 6.4, ends up being O(n² log n).
Condition and criticality-based predictive maintenance prioritisation for networks of bridges
Published in Structure and Infrastructure Engineering, 2022
Georgios M. Hadjidemetriou, Manuel Herrera, Ajith K. Parlikad
Optimising the group maintenance policy has a time complexity of O(2ⁿ), using Landau’s ‘big-O’ notation, where n is the number of maintenance activities. The time complexity can be decreased to O(n²) if every group of elements consists of consecutive activities, as proved by Wildeman et al. (1997). The time complexity of a given computational process can be defined as the amount of time taken by an algorithm to run as a function of the length of its input, and thus depends on the number of operations needed to solve or approach such a process (Rosen, 1999). ‘Big-O’ notation has been extensively used to approximate the number of operations an algorithm performs as its input grows, and hence indicates whether a particular algorithm is practical for solving a problem. Therefore, for optimising the maintenance schedule of stochastically deteriorating bridge elements, a genetic algorithm is used to provide robust results with limited computational capability; the genetic algorithm effectively avoids local optima of the overall maintenance cost function.
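The gap between the two bounds can be seen by simply counting candidate groups: n activities admit 2ⁿ − 1 non-empty subsets, but only n(n + 1)/2 groups of consecutive activities. The sketch below illustrates that counting argument only; it is not the procedure of Wildeman et al. (1997) or the genetic algorithm used in the paper:

```python
from itertools import combinations

def unrestricted_groups(n):
    """All non-empty subsets of n activities: 2**n - 1 candidates, hence O(2^n)."""
    acts = range(n)
    return [c for r in range(1, n + 1) for c in combinations(acts, r)]

def consecutive_groups(n):
    """Only runs of consecutive activities: n*(n+1)/2 candidates, hence O(n^2)."""
    return [tuple(range(i, j + 1)) for i in range(n) for j in range(i, n)]

n = 10
print(len(unrestricted_groups(n)))  # 1023  (2**10 - 1)
print(len(consecutive_groups(n)))   # 55    (10 * 11 / 2)
```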
Optimal scheduling of vehicle-to-Grid power exchange using particle swarm optimization technique
Published in International Journal of Computers and Applications, 2022
Time complexity is defined as the execution time of an algorithm as a function of input size. The size of the input data directly affects the number of steps or instructions an algorithm executes, which in turn affects the execution time. Time complexity therefore describes how the execution time varies as the input size increases, and it allows us to compare different algorithms developed for the same objective: since such algorithms are not run with the same input data or on the same workstation, they cannot be compared by execution time alone. In computer science, time complexity is generally represented by Big O notation, written in the form O(n), where O denotes the order of growth and n stands for the input size. There are different types of time complexities, as illustrated in Figure 18.
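As a rough illustration of those different complexity classes (the exact set plotted in Figure 18 is not reproduced here), the short sketch below tabulates how the step counts of a few common classes grow with the input size n; the constants are arbitrary, since only the growth rate matters for Big O:

```python
import math

# Step counts for common complexity classes as the input size n grows.
classes = {
    "O(1)":       lambda n: 1,
    "O(log n)":   lambda n: math.log2(n),
    "O(n)":       lambda n: n,
    "O(n log n)": lambda n: n * math.log2(n),
    "O(n^2)":     lambda n: n ** 2,
}

for n in (10, 100, 1_000):
    row = ", ".join(f"{name}={f(n):.0f}" for name, f in classes.items())
    print(f"n = {n:>5}: {row}")
```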
Neighborhood optimization of intelligent wireless mobile network based on big data technology
Published in International Journal of Computers and Applications, 2021
The time complexity of an algorithm is usually measured by the number of arithmetic operations, such as addition, subtraction, multiplication, and division, required to complete it. This paper evaluates time complexity by comparing the time required for different algorithms to complete the same prediction. Secondly, the prediction accuracy is analysed. Because time-series prediction does not yield a simple percentage score, a predicted value is considered correct if its error falls within the set error range; the prediction accuracy can also be judged from the arithmetic average or standard deviation of the prediction error. Finally, the prediction stability is analysed, and it can be divided into three categories: the stability of the results within a single prediction, the stability of the results of different prediction runs under the same initial conditions, and the stability of long-term versus short-term prediction results.
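To make the accuracy criterion concrete, the following minimal sketch treats a prediction as correct when its error lies within the set error range and also reports the arithmetic average of the error; the data, tolerance, and function names are illustrative assumptions, not values from the study:

```python
def prediction_accuracy(actual, predicted, tolerance):
    """Share of predictions whose absolute error falls within `tolerance`
    (a prediction counts as correct when its error is inside the set range)."""
    correct = sum(1 for a, p in zip(actual, predicted) if abs(a - p) <= tolerance)
    return correct / len(actual)

def mean_abs_error(actual, predicted):
    """Arithmetic average of the prediction error, the other criterion mentioned."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

actual    = [10.0, 12.0, 11.5, 13.0]
predicted = [10.4, 11.8, 12.6, 13.1]
print(prediction_accuracy(actual, predicted, tolerance=0.5))  # 0.75
print(mean_abs_error(actual, predicted))                      # 0.45
```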