Classical Optimization Methods
Published in A Vasuki, Nature-Inspired Optimization Algorithms, 2020
The revised simplex method is a variant of the original simplex method with reduced storage requirements and computational time. When a problem has many variables and constraints, the simplex method demands a large amount of computer memory and considerable computation time: in the original method, a new tableau has to be computed and stored in each iteration of the algorithm, yet most of this stored information is never used. The revised simplex method reduces this memory requirement by storing only the basis of the constraint matrix; the quantities required in each iteration are computed from the inverse of the current basis matrix, which makes the computations more efficient.
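To make this concrete, below is a minimal NumPy sketch of a revised simplex iteration that derives every required quantity from the inverse of the current basis matrix, assuming a feasible starting basis is already known. The function name, tolerances, and the explicit call to `np.linalg.inv` are illustrative choices, not taken from the excerpt; production codes maintain and update a factorization of the basis rather than inverting it every iteration.

```python
import numpy as np

def revised_simplex(A, b, c, basis, tol=1e-9):
    """Sketch of the revised simplex method for min c@x s.t. A@x = b, x >= 0.
    `basis` lists m column indices forming a feasible starting basis; finding
    one (e.g. with artificial variables) is assumed to be done elsewhere."""
    m, n = A.shape
    basis = list(basis)
    while True:
        B_inv = np.linalg.inv(A[:, basis])   # inverse of the current basis matrix
        x_B = B_inv @ b                      # values of the basic variables
        y = c[basis] @ B_inv                 # simplex multipliers y = c_B B^{-1}
        reduced = c - y @ A                  # reduced costs of all columns
        q = int(np.argmin(reduced))          # candidate entering column
        if reduced[q] >= -tol:               # no improving column: optimal
            x = np.zeros(n)
            x[basis] = x_B
            return x, c @ x
        d = B_inv @ A[:, q]                  # entering column in basis coordinates
        if np.all(d <= tol):
            raise ValueError("problem is unbounded")
        ratios = np.full(m, np.inf)          # ratio test over rows with d > 0
        pos = d > tol
        ratios[pos] = x_B[pos] / d[pos]
        r = int(np.argmin(ratios))           # leaving variable
        basis[r] = q                         # update the basis and repeat

# Usage: max 3*x1 + 2*x2 s.t. x1 + x2 <= 4, x1 + 3*x2 <= 6, x >= 0,
# written as a minimization in standard form with slack variables x3, x4.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
c = np.array([-3.0, -2.0, 0.0, 0.0])
x, obj = revised_simplex(A, b, c, basis=[2, 3])  # the slacks form the initial basis
print(x, -obj)                                   # optimum x1 = 4, x2 = 0, objective 12
```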
The Revised Simplex Method
Published in Shashi Kant Mishra, Bhagwat Ram, Introduction to Linear Programming with MATLAB®, 2017
Consider a linear programming problem in standard form with a simplex tableau A of size $m \times n$. Suppose that we wish to solve this problem using the simplex method. Experience suggests that if the simplex tableau A has far fewer rows m than columns n, then in most instances pivots occur in only a small fraction of the columns of A. The pivoting operation updates all the columns of the simplex tableau to move from one iteration to the next in search of an improved solution. However, if a particular column of A never enters the basis during the entire simplex procedure, then the computations performed on that column are never actually used, and the effort expended on many such columns is wasted. The revised simplex method reduces the amount of computation leading to an optimal solution by eliminating operations on columns of A that do not enter the basis.
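The saving described above can be illustrated with a small NumPy comparison; the dimensions and the assumption that the first m columns form a nonsingular basis are invented for illustration. A tableau-based pivot touches all n columns, whereas the revised method only brings the single entering column into the coordinates of the current basis.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 5_000
A = rng.standard_normal((m, n))
B_inv = np.linalg.inv(A[:, :m])   # pretend the first m columns form the current basis

# Tableau-style view: the stored tableau equals B^{-1} A, so a pivot
# updates all n columns, even those that may never enter the basis.
full_tableau = B_inv @ A          # roughly O(m^2 * n) work, every column touched

q = n - 1                         # index of the column chosen to enter the basis
# Revised simplex view: only the entering column is transformed.
entering_col = B_inv @ A[:, q]    # roughly O(m^2) work, one column touched
```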
Introduction to Optimization Models
Published in William P. Fox, Nonlinear Optimization, 2020
The revised simplex method is mathematically equivalent to the standard simplex method but differs in implementation. Instead of maintaining a tableau that explicitly represents the constraints adjusted to the current set of basic variables, it maintains a representation of a basis of the constraint matrix, as we describe using the computer chip problem.
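A rough back-of-the-envelope comparison (the dimensions below are invented for illustration) shows why keeping a basis representation rather than an explicit tableau matters when n is much larger than m:

```python
# Dense storage for a problem with m = 200 constraints and n = 100_000 variables.
m, n = 200, 100_000
tableau_entries = m * (n + 1)   # standard method: full updated tableau plus RHS column
basis_entries = m * m           # revised method: dense basis inverse (the original A is
                                # also kept, but it is typically sparse and never modified)
print(tableau_entries, basis_entries)   # 20_000_200 entries vs 40_000 entries
```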
The ‘Idiot’ crash quadratic penalty algorithm for linear programming and its application to linearizations of quadratic assignment problems
Published in Optimization Methods and Software, 2020
Forrest's aim in developing ICA for LP problems was to determine a point that, when used to obtain a starting basis for the primal revised simplex method, results in a significant reduction in the time required to solve the problem. This paper has distilled the essence of ICA and presented it in algorithmic form for the first time. Practical experiments have demonstrated that, for some large-scale LP test problems, Forrest's aim is achieved. For LP problems for which ICA is not advantageous, this is identified without meaningful detriment to the performance of Clp. For the best case, in which the ICA subproblems are solved exactly, Theorem 3.1 shows that every limit point of the sequence of ICA iterates is a solution of the corresponding LP problem. It is observed empirically that, typically, the lower the condition number of the constraint matrix A, the closer the point obtained by ICA is to being an optimal solution of the LP problem. For linearizations of quadratic assignment problems, it has been demonstrated that ICA consistently yields near-optimal solutions, achieving this in minutes for instances that are intractable on the same machine using commercial LP solvers. Thus, in addition to achieving Forrest's initial aim, ICA is seen as being useful in its own right as a fast solver for amenable LP problems.
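For readers unfamiliar with the quadratic penalty idea underlying ICA, the following is a generic projected-gradient sketch of minimizing an LP objective plus a quadratic penalty on the equality constraints. It is not Forrest's algorithm as implemented in Clp; the function name and every parameter value below are illustrative assumptions.

```python
import numpy as np

def quadratic_penalty_point(A, b, c, lo, hi, mu=1.0, growth=10.0,
                            outer=6, inner=500, step=1e-4):
    """Approximately minimize c@x + (mu/2)*||A@x - b||^2 subject to lo <= x <= hi,
    increasing mu so the iterates move toward feasibility for A@x = b. The resulting
    point could be used to seed a crash basis for the primal revised simplex method.
    This is a generic illustration of the penalty idea, not ICA itself."""
    x = np.clip(np.zeros(A.shape[1]), lo, hi)
    for _ in range(outer):
        for _ in range(inner):
            grad = c + mu * (A.T @ (A @ x - b))   # gradient of the penalized objective
            x = np.clip(x - step * grad, lo, hi)  # projected gradient step onto the bounds
        mu *= growth                              # tighten the penalty on A@x = b
    return x
```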