Linear Programming
Find numbers x1 and x2 that maximize the sum x1 + x2 subject to the constraints
x1 ≥ 0, x2 ≥ 0, and
x1 + 2x2 ≤ 4
4x1 + 2x2 ≤ 12
−x1 + x2 ≤ 1
In this problem there are two unknowns and five constraints. All the constraints
are inequalities, and they are all linear in the sense that each involves an
inequality in some linear function of the variables. The first two constraints,
x1 ≥ 0 and x2 ≥ 0, are special. These are called nonnegativity constraints and are
often found in linear programming problems. The other constraints are then called
the main constraints. The function to be maximized (or minimized) is called the
objective function. Here, the objective function is
x1 + x2
Since there are only two variables, we can solve this problem by graphing the
set of points in the plane that satisfy all the constraints (called the constraint set)
and then finding which point of this set maximizes the value of the objective function.
Each inequality constraint is satisfied by a half-plane of points, and the constraint set
is the intersection of all the half-planes. In the present example, the constraint set is
the five-sided figure shaded in Figure 1.
We seek the point (x1, x2) that achieves the maximum of x1 + x2 as (x1, x2)
ranges over this constraint set. The function x1 + x2 is constant on lines with slope −1,
for example the line x1 + x2 = 1, and as we move this line farther from the origin, up and
to the right, the value of x1 + x2 increases. Therefore, we seek the line of slope −1 that
is farthest from the origin and still touches the constraint set. This occurs at the
intersection of the lines x1 + 2x2 = 4 and 4x1 + 2x2 = 12, namely, (x1, x2) = (8/3, 2/3). The
value of the objective function there is (8/3) + (2/3) = 10/3.
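Because the optimum of a linear objective over a polygonal constraint set occurs at a corner point, the example above can be checked numerically by intersecting the constraint boundary lines pairwise, keeping the feasible intersections, and evaluating the objective at each. The following Python sketch does this for the example problem; the tolerance values are illustrative choices, not part of the original text:

```python
from itertools import combinations

# Each constraint written as a1*x1 + a2*x2 <= b; the nonnegativity
# constraints x1 >= 0 and x2 >= 0 become -x1 <= 0 and -x2 <= 0.
constraints = [
    (1, 2, 4),     # x1 + 2*x2 <= 4
    (4, 2, 12),    # 4*x1 + 2*x2 <= 12
    (-1, 1, 1),    # -x1 + x2 <= 1
    (-1, 0, 0),    # x1 >= 0
    (0, -1, 0),    # x2 >= 0
]

def intersect(c, d):
    """Intersection of the boundary lines of two constraints (Cramer's rule)."""
    a1, a2, b1 = c
    a3, a4, b2 = d
    det = a1 * a4 - a2 * a3
    if abs(det) < 1e-12:          # parallel boundary lines: no unique point
        return None
    return ((b1 * a4 - a2 * b2) / det, (a1 * b2 - b1 * a3) / det)

def feasible(p):
    return all(a1 * p[0] + a2 * p[1] <= b + 1e-9 for a1, a2, b in constraints)

# Corner points of the constraint set: feasible pairwise intersections.
vertices = [p for c, d in combinations(constraints, 2)
            if (p := intersect(c, d)) is not None and feasible(p)]

best = max(vertices, key=lambda p: p[0] + p[1])
print(best, best[0] + best[1])    # maximum at (8/3, 2/3) with value 10/3
```

Enumerating all five corners of the shaded figure and comparing objective values confirms that (8/3, 2/3) is the maximizer.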
To solve a standard maximization problem using the simplex method, we take the
following steps:
Step 3. Select the pivot column: Choose the negative number with the largest
magnitude in the bottom row (excluding the rightmost entry). Its column is the pivot
column. (If there are two candidates, choose either one.) If all the numbers in the
bottom row (excluding the rightmost entry) are zero or positive, then you are done:
the basic solution maximizes the objective function (see below for the basic solution).
Step 4. Select the pivot in the pivot column: The pivot must always be a
positive number. For each positive entry b in the pivot column, compute the ratio a/b,
where a is the number in the Answer column in that row. Of these test ratios, choose
the smallest one. The corresponding number b is the pivot.
Step 5. Use the pivot to clear the column in the normal manner (taking care to
follow the exact prescription for formulating the row operations described in the
tutorial on the Gauss-Jordan method), and then relabel the pivot row with the
label from the pivot column. The variable originally labelling the pivot row is the
departing or exiting variable, and the variable labelling the column is the entering
variable.
Step 6. Go to Step 3.
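Steps 3 through 6 above can be sketched as a small tableau routine. This is a minimal illustration, not a full production simplex (no degeneracy handling and no row labels); the tableau layout assumes slack variables s1, s2, s3 for the example problem, with the objective row −x1 − x2 at the bottom and the Answer column on the right:

```python
def simplex(tableau):
    """Run simplex Steps 3-6 on a standard-form max tableau.

    The last row is the objective row and the last column is the
    Answer column. The tableau is modified in place.
    """
    m = len(tableau) - 1                  # number of constraint rows
    while True:
        # Step 3: pivot column = most negative entry in the bottom row.
        bottom = tableau[-1][:-1]
        col = min(range(len(bottom)), key=lambda j: bottom[j])
        if bottom[col] >= 0:
            return tableau                # no negative entries: optimal
        # Step 4: pivot row = smallest test ratio over positive entries.
        ratios = [(tableau[i][-1] / tableau[i][col], i)
                  for i in range(m) if tableau[i][col] > 0]
        if not ratios:
            raise ValueError("problem is unbounded")
        _, row = min(ratios)
        # Step 5: normalize the pivot row, then clear the pivot column.
        pivot = tableau[row][col]
        tableau[row] = [v / pivot for v in tableau[row]]
        for i in range(m + 1):
            if i != row and tableau[i][col] != 0:
                factor = tableau[i][col]
                tableau[i] = [v - factor * p
                              for v, p in zip(tableau[i], tableau[row])]
        # Step 6: go back to Step 3 (the while loop).

# The example problem, with columns x1, x2, s1, s2, s3, Answer:
t = [[ 1,  2, 1, 0, 0,  4],
     [ 4,  2, 0, 1, 0, 12],
     [-1,  1, 0, 0, 1,  1],
     [-1, -1, 0, 0, 0,  0]]
simplex(t)
print(t[-1][-1])                          # maximum value of x1 + x2: 10/3
```

Running this on the example takes two pivots and leaves 10/3 in the bottom-right corner, matching the graphical solution.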
In summary, the general simplex method consists of three stages. In the first
stage, all equality constraints and unconstrained variables are pivoted (and removed
if desired). In the second stage, one uses the simplex pivoting rules to obtain a
feasible solution for the problem or its dual. In the third stage, one improves the
feasible solution according to the simplex pivoting rules until the optimal solution is
found.
For an LP problem whose feasible region is unbounded:
1. Bound the feasible region by adding a vertical line to the right of the rightmost
corner point, and a horizontal line above the highest corner point.
2. Calculate the coordinates of the new corner points you obtain.
3. Find the corner point that gives the optimal value of the objective function.
4. If this optimal value occurs at a point of the original (unbounded) region, then
the LP problem has a solution at that point. If not, then the LP problem has no
optimal solution.
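The four-step procedure above can be sketched by enumerating corner points before and after bounding. The example below (minimize x1 + 2x2 subject to x1 + x2 ≥ 2, x1 ≥ 0, x2 ≥ 0, with bounding lines x1 ≤ 10 and x2 ≤ 10) is a hypothetical choice for illustration:

```python
from itertools import combinations

# Hypothetical unbounded problem: minimize x1 + 2*x2 subject to
# x1 + x2 >= 2 (written as -x1 - x2 <= -2), x1 >= 0, x2 >= 0.
original = [(-1, -1, -2), (-1, 0, 0), (0, -1, 0)]

# Step 1: add a vertical and a horizontal bounding line beyond the
# rightmost and highest original corner points, (2, 0) and (0, 2).
bounded = original + [(1, 0, 10), (0, 1, 10)]     # x1 <= 10, x2 <= 10

def intersect(c, d):
    """Intersection of the boundary lines of two constraints (Cramer's rule)."""
    a1, a2, b1 = c
    a3, a4, b2 = d
    det = a1 * a4 - a2 * a3
    if abs(det) < 1e-12:
        return None
    return ((b1 * a4 - a2 * b2) / det, (a1 * b2 - b1 * a3) / det)

def corners(cons):
    """Step 2: corner points = feasible pairwise boundary intersections."""
    def ok(p):
        return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in cons)
    return [p for c, d in combinations(cons, 2)
            if (p := intersect(c, d)) is not None and ok(p)]

# Step 3: the corner of the bounded region that minimizes the objective.
best = min(corners(bounded), key=lambda p: p[0] + 2 * p[1])

# Step 4: the answer is valid only if the optimum lies on an original
# corner, i.e. not on one of the artificial bounding lines.
on_artificial = abs(best[0] - 10) < 1e-9 or abs(best[1] - 10) < 1e-9
print(best, "has solution" if not on_artificial else "no optimal solution")
```

Here the minimum is attained at (2, 0), an original corner, so the unbounded problem does have an optimal solution there; had the best corner landed on an artificial bounding line, the procedure would report that no optimal solution exists.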
In everyday life people are interested in knowing the most efficient way of
carrying out a task or achieving a goal. For example, a farmer might want to know
how many crops to plant during a season in order to maximise yield (produce) or a
stock broker might want to know how much to invest in stocks in order to maximise
profit. These are examples of optimisation problems, where by optimising we mean
finding the maxima or minima of a function.
More formally, linear programming is a technique for the optimization of a linear
objective function subject to linear equality and linear inequality constraints.
Given a polytope and a real-valued affine function defined on this polytope, a
linear programming method will find a point of the polytope where this function has
the smallest (or largest) value, if such a point exists, by searching through the
polytope's vertices.
Certain special cases of linear programming, such as network flow problems and
multicommodity flow problems, are considered important enough to have generated
much research on specialized algorithms for their solution. A number of
algorithms for other types of optimization problems work by solving LP problems as
sub-problems. Historically, ideas from linear programming have inspired many of the
central concepts of optimization theory, such as duality, decomposition, and the
importance of convexity and its generalizations. Likewise, linear programming is
heavily used in microeconomics and company management for problems such as planning,
production, transportation, and technology.
Dantzig was working for the US Department of the Air Force on a program called
Project SCOOP (Scientific Computation of Optimum Programs), after World War II.
The military needed to organize and expedite supplies to troops. “Programming”
was a military term that, at that time, referred to plans or schedules for training,
logistical supply, or deployment of men. Dantzig mechanized the planning process by
introducing “programming in a linear structure”, where “programming” has the military
meaning explained above.
Dantzig devised what is commonly known as the “simplex method” for solving
linear programming problems. This method was created, in part, to employ the use
of computers (which were in their beginning stages) to do numerous calculations
quickly and accurately. Dantzig (and colleagues) spent the better part of a year
assessing thousands of situations taken from their wartime experience and deciding
whether this model could be used to formulate realistic scheduling problems. They
reviewed each of the “ground rules” for planning and scheduling individually and
showed that nearly every one could be converted into linear programming format
(with the exception of those involving discreteness and non-convexity).
It was actually in the summer of 1947 that Dantzig set out to create an algorithm
for solving real planning problems (under pressure from the Air Force). The first
thing he observed was that these problems could be represented graphically,
specifically as a polyhedral set. He noticed that the solutions improved as he moved
along the edges of the convex body from one vertex to the next, eventually reaching
the “optimal” solution. At that time, he wasn’t using an objective function, and he soon
felt that this method was extremely inefficient, so he rejected the idea in search of
something better.
In June of this same year, Dantzig got in touch with T.J. Koopmans of the Cowles
Foundation in Chicago. At first Koopmans seemed unresponsive to the presentation
but suddenly became very interested in the linear programming model, “as if he
suddenly saw its significance to economic theory.” In fact, Koopmans was eventually
awarded the Nobel Prize in 1975, the culmination of heading a group of economists
who developed the theory of the allocation of resources and its relation to linear
programming.
Leonid Hurwicz, a student of Koopmans, helped Dantzig with what they called
“climbing up the beanpole,” the predecessor to the simplex algorithm. It was
enhanced by eliminating the convexity constraint and assuming the variables
summed to unity. However, Dantzig still felt that, though this model was more
efficient, it was probably very impractical.
Dantzig consulted with Johnny von Neumann about solution procedures the
following fall and, surprisingly (having searched in vain for any literature on the
subject), received a lecture on the theory of linear programs. It turned out that von
Neumann (and Oskar Morgenstern) had just completed a text on game theory, and von
Neumann suspected that the theory behind Dantzig’s model was an analogue of the one
they had developed for games.
Meanwhile, Dantzig’s group at the Pentagon was testing his simplex method and
finding that it was working very well. According to Dantzig, this was unforeseen;
he hadn’t trusted his intuition in higher dimensions. He thought “that the procedure
would require too many steps wandering from one adjacent vertex to the next,” when
in fact, most of the time, his technique solved problems with m equations in
2m or 3m steps. “Truly amazing,” thought Dantzig.
Dantzig wrote a book (over a nine-year period) on his work, entitled Linear
Programming and Extensions, published in 1963. He also received many awards for
his work.
Linear Programming has a wide range of practical applications including
business, economics, scheduling, agriculture, medicine, natural science, social
science, transportation, and even nutrition.