Recognition of Emotions in the Elderly through Facial Expressions: A Machine Learning-Based Approach
Published in Wellington Pinheiro dos Santos, Juliana Carneiro Gomes, Valter Augusto de Freitas Barbosa, Swarm Intelligence Trends and Applications, 2023
Arianne Sarmento Torcate, Maíra Araújo Santana, Juliana Carneiro Gomes, Ingrid Bruno Nunes, Flávio Secco Fonseca, Gisele M.M. Moreno, Wellington Pinheiro dos Santos
PSO is a technique from the field of Evolutionary Computation (EC) (Xue et al., 2014) based on the collective movement of a group of particles: the particle swarm (Rodrigues et al., 2019). The algorithm starts with a population of particles with random positions and velocities. At each iteration, the fitness of every particle is evaluated and compared with the fitness of its pbest (the best position from the particle's personal experience); if the current position is better, it becomes the new pbest. Particle velocities are then updated using the best positions found so far, both personal (pbest) and global (gbest) (Barbosa et al., 2021b; Chrouta et al., 2021; Harb and Desuky, 2014; Rodrigues et al., 2019; Zomorodi-Moghadam et al., 2021).
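The loop described above (evaluate fitness, update pbest and gbest, update velocities and positions) can be sketched in Python as follows. This is a minimal illustration, not the implementation used by the authors; the parameter values (inertia weight w, acceleration coefficients c1 and c2, search bounds) are common defaults chosen here for demonstration.

```python
import random

def pso(fitness, dim, n_particles=30, iters=100,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal PSO minimizing `fitness` over the box [lo, hi]^dim."""
    # Population of random positions and velocities
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in x]                  # personal best positions
    pbest_f = [fitness(p) for p in x]          # personal best fitness values
    g = pbest_f.index(min(pbest_f))
    gbest, gbest_f = pbest[g][:], pbest_f[g]   # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia plus pulls toward pbest and gbest
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]
            f = fitness(x[i])
            if f < pbest_f[i]:                 # better position: update pbest
                pbest[i], pbest_f[i] = x[i][:], f
                if f < gbest_f:                # and, if best so far, gbest
                    gbest, gbest_f = x[i][:], f
    return gbest, gbest_f
```

For example, minimizing the sphere function `lambda p: sum(xi * xi for xi in p)` over two dimensions drives the swarm toward the origin.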
Scope of Optimization in Plant Leaf Disease Detection using Deep Learning and Swarm Intelligence
Published in Shikha Agrawal, Manish Gupta, Jitendra Agrawal, Dac-Nhuong Le, Kamlesh Kumar Gupta, Swarm Intelligence and Machine Learning, 2022
Vishakha A Metre, Sudhir D Sawarkar
Swarm intelligence (SI) emerged as an innovative branch of artificial intelligence (AI) based on observing the behaviour of various natural beings. The basic idea is that the social behaviour within ant colonies, flocks of birds, beehives, schools of fish, and other groups can inspire exceptional solutions to various complex problems. The optimization paradigms motivated by swarm intelligence found in the SI taxonomy include, in alphabetical order, ant colony optimization (ACO), bee colony optimization (BCO), fish school optimization (FSO), gray wolf optimization (GWO), particle swarm optimization (PSO), and others. Figure 5 shows the SI scenario. Although various optimization algorithms have been studied and used in many notable applications, PSO has its own relevance in handling complex and nonlinear optimization problems.
Memetic Algorithms with Extremal Optimization
Published in Yong-Zai Lu, Yu-Wang Chen, Min-Rong Chen, Peng Chen, Guo-Qiang Zeng, Extremal Optimization, 2018
Yong-Zai Lu, Yu-Wang Chen, Min-Rong Chen, Peng Chen, Guo-Qiang Chen
The PSO algorithm is a recent addition to the list of global-search methods. This derivative-free method is particularly suited to continuous-variable problems and has received increasing attention in the optimization community. PSO was originally developed by Kennedy and Eberhart (1995) and was inspired by the paradigm of birds flocking. PSO consists of a swarm of particles, and each particle flies through the multidimensional search space with a velocity that is constantly updated by the particle's previous best performance and by the previous best performance of the particle's neighbors. PSO can be easily implemented and is computationally inexpensive in terms of both memory requirements and CPU speed (Kennedy and Eberhart, 1995). However, even though PSO is a good and fast search algorithm, it suffers from premature convergence, especially in complex multi-peak search problems. This means that it does not "know how" to sacrifice short-term fitness to gain longer-term fitness. The likelihood of this occurring depends on the shape of the fitness landscape: certain problems may provide an easy ascent toward a global optimum, while others may steer the search into local optima. Many researchers have devoted themselves to dealing with this problem (Shelokar et al., 2007; Jin et al., 2008; Chen and Zhao, 2009).
Increasing importance of genetic algorithms in science and technology: Linear trends over the period from year 1989 to 2022
Published in Materials and Manufacturing Processes, 2023
After some incubation period, fast PSO development started around 2003 at a level of 5,000 yearly publications, reaching about 35,000 at present.[47] PSO is classified as one of the three most popular numerical optimization methods[4]: the most popular ones are genetic algorithms, simulated annealing (SA), and particle swarm optimization (PSO), with ~24,000, ~11,000, and ~9,000 entries in the Scopus database, respectively (with the popularity of SA decreasing over the analyzed 2011–2020 period). It has been found that the number of PSO-related publications increased exponentially in the period 2000–2006 and then, until the considered year 2013, stabilized at a level of about 1,000 with some fluctuations.[48] However, using the methods of the present study applied to GAs, it is found, for the simply defined Web of Science "particle swarm optimization" topic, that steady growth for PSO takes place (see Fig. 10), with a run similar to that of Fig. 1 but shifted by about 10–15 years toward a later time (compare Fig. 10 with Fig. 1). This observation, being in line with that of Nayak[47], makes PSO a possible candidate for the source of the segmentation in the F(t) and G(t) curves. Namely, the appearance and development of PSO (and/or other related methods) can hypothetically reflect researchers' interest switching to the new method and consequently be the reason for the segmentation of the curves in Figs. 2–9.
Optimization of unsubsidized and subsidized customized bus services
Published in Transportation Planning and Technology, 2023
Siqing Wang, Jian Wang, Xiaowei Hu
Because of its simple operation, fast convergence, and strong global search capability, PSO has been widely used in many fields, such as function optimization (Niu et al. 2015). Moreover, PSO algorithms can efficiently handle nonlinear mathematical programming problems with mixed-integer variables. However, the convergence speed of the PSO algorithm is constantly at odds with the diversity of the population during optimization. Therefore, improvements to the traditional PSO algorithm always aim to enhance the algorithm's local search ability while maintaining the population's diversity, that is, to prevent the algorithm from converging prematurely while still converging rapidly. This paper makes two improvements based on traditional PSO (Marini and Walczak 2015).
Improved parameters in update rules
Optimal experimental designs for ordinal models with mixed factors for industrial and healthcare applications
Published in Journal of Quality Technology, 2022
Joshua Lukemire, Abhyuday Mandal, Weng Kee Wong
In PSO, birds are drawn to both their pbest position and the gbest position at each iteration, and they "fly" through the search space at a velocity determined by their distance from these two positions. In doing so, the birds get the chance to explore many new solutions between the pbest and gbest positions, hopefully identifying better ones. Some of the most appealing features of PSO are that it is virtually assumption-free, flexible, and easy to use. For example, the behavior of the swarm is governed by two simple equations: the velocity update equation and the position update equation. As such, the PSO algorithm and its many variants have been widely applied to many types of optimization problems in engineering and computer science research, and increasingly in many other disciplines.
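The two governing equations mentioned above can be written out for a single particle as follows. This is a generic sketch, not code from the article; the parameter values (inertia weight w and acceleration coefficients c1, c2) are common defaults, and the random factors r1, r2 are drawn once per step here for brevity (canonical PSO often draws them per dimension).

```python
import random

def update_particle(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One PSO step for a single particle; all arguments are
    equal-length lists of coordinates."""
    r1, r2 = random.random(), random.random()
    # Velocity update: inertia plus attraction toward pbest and gbest
    v = [w * vi + c1 * r1 * (pi - xi) + c2 * r2 * (gi - xi)
         for xi, vi, pi, gi in zip(x, v, pbest, gbest)]
    # Position update: move with the new velocity
    x = [xi + vi for xi, vi in zip(x, v)]
    return x, v
```

A particle at the origin whose pbest and gbest both lie at (1, 1) is pulled toward that point, landing somewhere between its old position and the attractors.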