Scheduling
Published in Susmita Bandyopadhyay, Production and Operations Analysis, 2019
Scheduling is an essential task in manufacturing, since it determines whether jobs can be processed on time and in the appropriate order and manner. Scheduling can be defined as “the actual assignment of starting and/or completion dates to operations or groups of operations to show when these must be done if the manufacturing order is to be completed on time” (Cox et al., 1992). The term covers the tasks of both sequencing and scheduling, where sequencing finds the order in which jobs are processed and scheduling prepares the timetable for the jobs to be processed. The concept of manufacturing scheduling started during the eighteenth century with mills, workshops, and projects. From then onward, scheduling technology flourished along with the associated technologies, which have brought the concept to the era of smart manufacturing as it is now.
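The sequencing/scheduling distinction above can be illustrated with a minimal single-machine sketch; the job names, processing times, and the shortest-processing-time (SPT) rule used for sequencing are illustrative assumptions, not taken from the text:

```python
def spt_sequence(jobs):
    """Sequencing: choose the order of jobs, here by shortest processing time."""
    return sorted(jobs, key=lambda j: jobs[j])

def build_schedule(jobs, order):
    """Scheduling: assign start and completion dates to the chosen order."""
    schedule, t = {}, 0
    for j in order:
        schedule[j] = (t, t + jobs[j])  # (start, completion)
        t += jobs[j]
    return schedule

jobs = {"A": 4, "B": 2, "C": 3}          # processing times (e.g. hours)
order = spt_sequence(jobs)               # sequencing step
schedule = build_schedule(jobs, order)   # scheduling step
print(order, schedule)
```

Sequencing alone yields only the order `['B', 'C', 'A']`; the scheduling step then attaches concrete start and completion times to each job.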
Introduction
Published in Joseph Y.-T. Leung, Handbook of SCHEDULING, 2004
Since the concept of “time” is of such importance in real-time application systems, and since these systems typically involve the sharing of one or more resources among various contending processes, the concept of scheduling is integral to real-time system design and analysis. Scheduling theory as it pertains to a finite set of requests for resources is a well-researched topic. However, requests in real-time environments are often of a recurring nature. Such systems are typically modeled as finite collections of simple, highly repetitive tasks, each of which generates jobs in a very predictable manner. These jobs have upper bounds on their worst-case execution requirements and associated deadlines. For example, in a periodic task system [1-6] each task makes a resource request at regular periodic intervals. The processing time and the time elapsed between the request and the deadline are always the same for each request of a particular task; they may, however, be different for different tasks. A sporadic task system [5,7-12] is similar, except that the period parameter specifies only a minimum, rather than an exact, separation between successive requests of a task.
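The periodic task model described above can be sketched in a few lines. Each task is a pair of worst-case execution time C and period T (with deadline equal to the period), and the classical Liu-Layland bound n·(2^(1/n) − 1) gives a sufficient schedulability test under rate-monotonic scheduling; the task parameters below are illustrative assumptions, not from the text:

```python
def utilization(tasks):
    """Total processor utilization of a set of (C, T) periodic tasks."""
    return sum(c / t for c, t in tasks)

def rm_sufficient(tasks):
    """Liu-Layland sufficient test for rate-monotonic schedulability."""
    n = len(tasks)
    return utilization(tasks) <= n * (2 ** (1 / n) - 1)

tasks = [(1, 4), (1, 5), (2, 10)]   # (worst-case execution time, period)
u = utilization(tasks)              # 0.25 + 0.20 + 0.20 = 0.65
print(u, rm_sufficient(tasks))
```

Here the utilization 0.65 falls below the three-task bound of roughly 0.78, so the test accepts the set; the test is sufficient but not necessary, so a set that fails it may still be schedulable.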
Convergence Technologies
Published in K.R. Rao, Zoran S. Bojkovic, Dragorad A. Milovanovic, Wireless Multimedia Communications, 2018
K.R. Rao, Zoran S. Bojkovic, Dragorad A. Milovanovic
This is because HSDPA allows the packet scheduler to better exploit the varying channel conditions of different users in its scheduling decisions and to increase the granularity of the scheduling process. It should be noted that favoring users with good channel conditions may prevent those with bad channel conditions from being served. A well-designed scheduling algorithm should therefore not only maximize system throughput but also be fair to users who use the same service and pay the same amount of money. That is, scheduling algorithms should balance the trade-off between maximizing throughput and fairness.
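One common way to balance this trade-off is proportional-fair (PF) scheduling: in each scheduling interval, the user maximizing the ratio of instantaneous rate to smoothed average throughput is served, so good channels are exploited without starving users in poor conditions. The sketch below is a simplified illustration; the user names, rates, and smoothing factor are assumptions, not from the text:

```python
def pf_select(inst_rates, avg_tput):
    """Pick the user with the highest PF metric r_i / R_i."""
    return max(inst_rates, key=lambda u: inst_rates[u] / avg_tput[u])

def pf_update(avg_tput, served, inst_rates, alpha=0.1):
    """Exponentially smooth each user's average throughput."""
    for u in avg_tput:
        r = inst_rates[u] if u == served else 0.0
        avg_tput[u] = (1 - alpha) * avg_tput[u] + alpha * r

inst_rates = {"u1": 10.0, "u2": 2.0}   # current achievable rates (Mbit/s)
avg_tput = {"u1": 5.0, "u2": 0.5}      # smoothed past throughput
served = pf_select(inst_rates, avg_tput)   # u2 wins: 2/0.5 = 4 > 10/5 = 2
pf_update(avg_tput, served, inst_rates)
print(served, avg_tput)
```

Note that u2 is served despite the weaker channel, because its past throughput is low; a pure max-rate scheduler would always pick u1 here.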
An Energy-Aware Agent-Based Resource Allocation Using Targeted Load Balancer for Improving Quality of Service in Cloud Environment
Published in Cybernetics and Systems, 2023
Umamageswaran Jambulingam, K. Balasubadra
The benefits of cloud computing include flexibility, high performance, pay-per-use, and on-demand service. Task scheduling is one of the crucial research questions in cloud computing. Scheduling has the dual goals of allocating tasks to available resources and achieving particular objectives more efficiently. Metaheuristic and hybrid metaheuristic algorithms have been created to solve job scheduling problems in cloud computing settings (Aktan and Bulut 2022). Metaheuristic methods based on the genetic algorithm (GA), differential evolution (DE), and simulated annealing (SA) have also been developed, with SA integrated with a greedy approach (GR). The greedy methodology was likewise combined with hybrid metaheuristic algorithms such as DE-SA and GA-SA. The suggested methods were assessed in terms of turnaround time and virtual machine load balancing.
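The greedy component in such hybrids can be sketched as a simple load-balancing heuristic: assign each task, longest first, to the currently least-loaded virtual machine. This is only an illustrative sketch of the general greedy idea, not the specific GR procedure of the cited work; the task lengths and VM count are invented:

```python
def greedy_assign(task_lengths, n_vms):
    """Longest-task-first greedy placement onto the least-loaded VM."""
    loads = [0.0] * n_vms
    placement = {}
    for task, length in sorted(task_lengths.items(), key=lambda kv: -kv[1]):
        vm = min(range(n_vms), key=loads.__getitem__)  # least-loaded VM
        placement[task] = vm
        loads[vm] += length
    return placement, loads

tasks = {"t1": 8, "t2": 5, "t3": 4, "t4": 3}   # task lengths (e.g. MI)
placement, loads = greedy_assign(tasks, 2)
print(placement, loads)   # makespan = max(loads)
```

The resulting makespan, max(loads), is the quantity a metaheuristic such as GA or SA would then try to reduce further by perturbing the placement.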
The Rαβγ categorisation framework for dexterous robotic manufacturing processes
Published in International Journal of Production Research, 2022
The classical [α|β|γ] manufacturing categorisation scheme was originally developed for production scheduling (Graham et al. 1979). Scheduling is an important activity in industrial manufacturing processes (Cancino et al. 2017; Wen et al. 2022). The [α|β|γ] framework (α − machine, β − job, γ − objective) established a unified language for scheduling research and facilitated evaluation of scheduling complexity. The [α|β|γ] categorisation is credited with revolutionising scheduling research (Potts and Strusevich 2009). Since its inception, the [α|β|γ] framework has been adapted to many additional fields, including sports timetabling (Van Bulck et al. 2020), the sharing economy (Boysen, Briskorn, and Schwerdfeger 2019), and genetic algorithms (Akgündüz and Tunalı 2011). However, the classical [α|β|γ] scheme is not suitable for representing robotic manufacturing processes, since its machine (α) and job (β) tiers do not facilitate representation of the core robotic process attributes. For example, in the machine tier, core robotic system components, e.g. the end-effector, manipulator, and sensors, are not represented. Similarly, in the job tier, significant robotic process attributes are missing, e.g. processing activity type, automation level, and software environment.
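To make the three-tier structure concrete, an [α|β|γ] problem descriptor could be encoded as a small data structure; the field names and the example problem below are illustrative choices, not part of the Graham et al. notation itself:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SchedulingProblem:
    alpha: str   # machine environment, e.g. "1", "Pm", "Fm"
    beta: str    # job characteristics/constraints, e.g. "prec", "pmtn"
    gamma: str   # optimality criterion, e.g. "Cmax", "sum Cj"

    def __str__(self):
        return f"{self.alpha} | {self.beta or '.'} | {self.gamma}"

# Single machine, precedence constraints, minimize makespan:
p = SchedulingProblem(alpha="1", beta="prec", gamma="Cmax")
print(p)
```

The gap the article identifies is precisely that no comparable fields exist in the classical α and β tiers for robotic attributes such as end-effector type or automation level.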
Embedded real-time systems in cyber-physical applications: a frequency domain analysis methodology
Published in International Journal of General Systems, 2020
Claudio Aciti, Ricardo Cayssials, Edgardo Ferro, José Urriza, Javier Orozco
Embedded real-time systems are used to implement CPSs, but there is no straightforward relationship between embedded real-time systems and physical applications. Real-time theory deals with periodic tasks, missed deadlines, jitter, utilization factors, and scheduling disciplines. Classical signal processing and control theory applies frequency domain techniques to design and evaluate the performance of filters and controllers. Filters and controllers are usually designed to produce a linear time-invariant response. An inadequate implementation of the filters or controllers may cause undesirable and unpredictable consequences for a cyber-physical application. However, real-time scheduling analysis cannot be used to evaluate the performance of filters or controllers implemented as real-time tasks.