The Indian Institute of Planning & Management: Operational Research
Project Trimester
(Fall - Winter 2008-10)
OPERATIONAL RESEARCH
Ananth Kumar P.
ACKNOWLEDGEMENT
We would like to express our gratitude and thanks to Prof. Mashkur Zafar for their invaluable
co-operation, continued support, guidance and healthy criticism throughout this work. Their able
guidance, encouragement and valuable suggestions helped us through the most difficult periods
of this project.
We would also like to thank the faculty and staff of IIPM, without whose help this project would
not have been completed.
INTRODUCTION
Operations Research is an interdisciplinary branch of applied mathematics that uses methods
such as mathematical modeling, statistics, and algorithms to arrive at optimal or near optimal
solutions to complex problems. It is typically concerned with determining the maxima (of profit,
assembly line performance, crop yield, bandwidth, etc.) or minima (of loss, risk, etc.) of some
objective function. Operations research helps management achieve its goals using scientific
methods. OR is the discipline of applying advanced analytical methods to help make better
decisions.
Basic OR concepts:-
Linear programming - Consider a typical small linear programming problem with two unknowns
and five constraints. All the constraints are inequalities, and they are all linear in the sense that
each involves an inequality in some linear function of the variables. The first two constraints,
x1 ≥ 0 and x2 ≥ 0, are special. These are called non-negativity constraints and are often found in
linear programming problems. The other constraints are then called the main constraints. The
function to be maximized (or minimized) is called the objective function; here, for example, the
objective function is x1 + x2.
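As an illustration only, such a problem can be handed to a standard LP routine; the three main
constraints used below are assumed for the example and do not come from the report:

# Illustrative only: the main-constraint coefficients below are assumed,
# not taken from the report. Maximise x1 + x2 subject to three main
# constraints and the non-negativity constraints x1 >= 0, x2 >= 0.
from scipy.optimize import linprog

# linprog minimises, so negate the objective to maximise x1 + x2
c = [-1, -1]
A_ub = [[1, 2],    # hypothetical main constraint: x1 + 2*x2 <= 4
        [4, 2],    # hypothetical main constraint: 4*x1 + 2*x2 <= 12
        [-1, 1]]   # hypothetical main constraint: -x1 + x2 <= 1
b_ub = [4, 12, 1]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("x1, x2 =", res.x, "maximum of x1 + x2 =", -res.fun)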
4. Simulation - Simulation is the imitation of some real thing, state of affairs, or process. The act
of simulating something generally entails representing certain key characteristics or behaviours
of a selected physical or abstract system. Simulation is used in many contexts, including the
modeling of natural systems or human systems in order to gain insight into their functioning. [1]
Other contexts include simulation of technology for performance optimization, safety
engineering, testing, training and education. Simulation can be used to show the eventual real
effects of alternative conditions and courses of action. Key issues in simulation include acquisition of valid
source information about the relevant selection of key characteristics and behaviours, the use of
simplifying approximations and assumptions within the simulation, and fidelity and validity of
the simulation outcome.
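A minimal simulation sketch is given below; the demand distribution and cost figures are assumed
purely for illustration and are not taken from the report:

# Minimal Monte Carlo simulation sketch (illustrative; the demand
# distribution and cost figures are assumed, not from the report).
import random

def simulate_day(stock=50, unit_profit=30, unit_cost=10):
    demand = random.randint(20, 80)          # assumed uniform daily demand
    sold = min(demand, stock)
    unsold = stock - sold
    return sold * unit_profit - unsold * unit_cost

random.seed(1)
profits = [simulate_day() for _ in range(10_000)]
print("estimated average daily profit:", sum(profits) / len(profits))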
5. Queuing - Queuing theory is the mathematical study of waiting lines (or queues). The theory
enables mathematical analysis of several related processes, including arriving at the (back of the)
queue, waiting in the queue (essentially a storage process), and being served by the server(s) at
the front of the queue. The theory permits the derivation and calculation of several performance
measures including the average waiting time in the queue or the system, the expected number
waiting or receiving service and the probability of encountering the system in certain states, such
as empty, full, having an available server or having to wait a certain time to be served.
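For the simplest single-server (M/M/1) queue, these performance measures have standard
closed-form expressions; the arrival and service rates used below are assumed example values:

# Performance measures for a single-server M/M/1 queue (standard textbook
# formulas; the arrival rate lam and service rate mu below are assumed examples).
def mm1_measures(lam, mu):
    rho = lam / mu                 # server utilisation (must be < 1)
    L   = rho / (1 - rho)          # expected number in the system
    Lq  = rho**2 / (1 - rho)       # expected number waiting in the queue
    W   = 1 / (mu - lam)           # expected time in the system
    Wq  = rho / (mu - lam)         # expected waiting time in the queue
    p0  = 1 - rho                  # probability the system is empty
    return rho, L, Lq, W, Wq, p0

print(mm1_measures(lam=4, mu=6))   # e.g. 4 arrivals per hour, 6 served per hour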
Linear regression
In statistics, linear regression refers to any approach to modeling the relationship between one
or more variables denoted y and one or more variables denoted X, such that the model depends
linearly on the unknown parameters to be estimated from the data. Such a model is called a
“linear model”. Most commonly, linear regression refers to a model in which the conditional
mean of y given the value of X is an affine function of X. Less commonly, linear regression could
refer to a model in which the median, or some other quantile of the conditional distribution of y
given X is expressed as a linear function of X. Like all forms of regression analysis, linear
regression focuses on the conditional probability distribution of y given X, rather than on the
joint probability distribution of y and X, which is the domain of multivariate analysis.
Linear regression was the first type of regression analysis to be studied rigorously, and to be used
extensively in practical applications. This is because models which depend linearly on their
unknown parameters are easier to fit than models which are non-linearly related to their
parameters and because the statistical properties of the resulting estimators are easier to
determine.
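A minimal sketch of fitting such a linear model by least squares is shown below, using invented
data points:

# Simple linear regression sketch using least squares (illustrative data,
# not from the report): fit y = b0 + b1*x with numpy.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])     # assumed predictor values
y = np.array([2.1, 4.3, 6.2, 8.1, 9.9])     # assumed responses

b1, b0 = np.polyfit(x, y, 1)                # slope and intercept
print("fitted line: y =", b0, "+", b1, "* x")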
Multiple regression
The general purpose of multiple regression (the term was first used by Pearson, 1908) is to learn
more about the relationship between several independent or predictor variables and a dependent
or criterion variable. For example, a real estate agent might record for each listing the size of the
house (in square feet), the number of bedrooms, the average income in the respective
neighborhood according to census data, and a subjective rating of appeal of the house. Once this
information has been compiled for various houses it would be interesting to see whether and how
these measures relate to the price for which a house is sold. For example, you might learn that
the number of bedrooms is a better predictor of the price for which a house sells in a particular
neighborhood than how "pretty" the house is (subjective rating). You may also detect "outliers,"
that is, houses that should really sell for more, given their location and characteristics.
Similarly, in personnel research, once a regression line relating salary to measurable job
characteristics has been determined, the analyst can easily construct a graph of the expected
(predicted) salaries and the actual salaries of job incumbents in his or her company. Thus, the
analyst is able to determine which positions are underpaid (below the regression line), overpaid
(above the regression line), or paid equitably.
In the social and natural sciences multiple regression procedures are very widely used in
research. In general, multiple regression allows the researcher to ask (and hopefully answer) the
general question "what is the best predictor of ...". For example, educational researchers might
want to learn what are the best predictors of success in high-school. Psychologists may want to
determine which personality variable best predicts social adjustment. Sociologists may want to
find out which of the multiple social indicators best predict whether or not a new immigrant
group will adapt and be absorbed into society.
Computational Approach
The general computational problem that needs to be solved in multiple regression
analysis is to fit a straight line to a number of points.
In the simplest case - one dependent and one independent variable - you can
visualize this in a scatterplot.
Least Squares
In the scatterplot, we have an independent or X variable, and a dependent or Y
variable. These variables may, for example, represent IQ (intelligence as measured
by a test) and school achievement (grade point average; GPA), respectively. Each
point in the plot represents one student, that is, the respective student's IQ and
GPA. The goal of linear regression procedures is to fit a line through the points.
Specifically, the program will compute a line so that the squared deviations of the
observed points from that line are minimized. Thus, this general procedure is
sometimes also referred to as least squares estimation.
For example, a two-dimensional regression line can be plotted together with confidence bands at
three different confidence levels (90%, 95% and 99%).
In the multivariate case, when there is more than one independent variable, the
regression line cannot be visualized in the two dimensional space, but can be
computed just as easily. For example, if in addition to IQ we had additional
predictors of achievement (e.g., Motivation, Self- discipline) we could construct a
linear equation containing all those variables. In general then, multiple regression
procedures will estimate a linear equation of the form
Y = b0 + b1*X1 + b2*X2 + ... + bk*Xk,
where b0 is the intercept and b1, ..., bk are the regression coefficients estimated from the data.
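A sketch of this computation, using invented IQ, motivation and self-discipline scores, is shown
below; the coefficients are obtained by ordinary least squares:

# Multiple regression sketch: estimate Y = b0 + b1*X1 + b2*X2 + b3*X3 by
# ordinary least squares. The student scores below are invented examples.
import numpy as np

# columns: IQ, Motivation, Self-discipline
X = np.array([[110, 7, 6],
              [ 95, 5, 7],
              [120, 8, 8],
              [100, 6, 5],
              [130, 9, 7],
              [105, 4, 6]], dtype=float)
gpa = np.array([3.2, 2.6, 3.8, 2.9, 3.9, 2.7])       # dependent variable

X_design = np.column_stack([np.ones(len(gpa)), X])   # add intercept column
coeffs, *_ = np.linalg.lstsq(X_design, gpa, rcond=None)
print("b0, b1, b2, b3 =", coeffs)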
OPTIMISING PROBLEM

Service unit | Meals sold | Labour hours | Material dollars
unit 1       | 100        | 2            | 200
unit 2       | 100        | 4            | 150
unit 3       | 100        | 4            | 100
unit 4       | 100        | 6            | 100
unit 5       | 100        | 8            | 80
unit 6       | 100        | 10           | 50
Answer Report

Adjustable Cells
Cell  | Name     | Original Value | Final Value
$B$16 | ratio u1 | 0.01           | 0.01
$C$16 | ratio v1 | 0.166666667    | 0.166666667
$D$16 | ratio v2 | 0.003333333    | 0.003333333

Constraints
Cell  | Name       | Cell Value   | Formula      | Status      | Slack
$B$21 | unit1 Lhs  | 0            | $B$21<=$D$21 | Binding     | 0
$B$22 | unit2 Lhs  | -0.166666667 | $B$22<=$D$22 | Not Binding | 0.166666667
$B$23 | unit3 Lhs  | 1.38778E-15  | $B$23<=$D$23 | Binding     | 0
$B$24 | unit4 Lhs  | -0.333333333 | $B$24<=$D$24 | Not Binding | 0.333333333
$B$25 | unit5 Lhs  | -0.6         | $B$25<=$D$25 | Not Binding | 0.6
$B$26 | unit6 Lhs  | -0.833333333 | $B$26<=$D$26 | Not Binding | 0.833333333
$B$27 | constr Lhs | 1            | $B$27=$D$27  | Not Binding | 0
Sensitivity Report

Adjustable Cells
Cell  | Name     | Final Value | Reduced Cost | Objective Coefficient | Allowable Increase | Allowable Decrease
$B$16 | ratio u1 | 0.01        | 0 | 100 | 1E+30 | 100
$C$16 | ratio v1 | 0.166666667 | 0 | 0   | 0     | 3
$D$16 | ratio v2 | 0.003333333 | 0 | 0   | 300   | 0

Constraints
Cell  | Name       | Final Value  | Shadow Price | Constraint R.H. Side | Allowable Increase | Allowable Decrease
$B$21 | unit1 Lhs  | 0            | 1 | 0 | 1     | 0.384615385
$B$22 | unit2 Lhs  | -0.166666667 | 0 | 0 | 1E+30 | 0.166666667
$B$23 | unit3 Lhs  | 1.38778E-15  | 0 | 0 | 0.2   | 1
$B$24 | unit4 Lhs  | -0.333333333 | 0 | 0 | 1E+30 | 0.333333333
$B$25 | unit5 Lhs  | -0.6         | 0 | 0 | 1E+30 | 0.6
$B$26 | unit6 Lhs  | -0.833333333 | 0 | 0 | 1E+30 | 0.833333333
$B$27 | constr Lhs | 1            | 1 | 1 | 1E+30 | 1
Limits Report

Target
Cell  | Name        | Value
$B$18 | Maximise u1 | 1

Adjustable Cells
Cell  | Name     | Value       | Lower Limit | Target Result | Upper Limit | Target Result
$B$16 | ratio u1 | 0.01        | 0           | 0             | 0.01        | 1
$C$16 | ratio v1 | 0.166666667 | 0.166666667 | 1             | 0.166666667 | 1
$D$16 | ratio v2 | 0.003333333 | 0.003333333 | 1             | 0.003333333 | 1
unit1

units  | u1  | v1  | v2
unit1  | 100 | -2  | -200
unit2  | 100 | -4  | -150
unit3  | 100 | -4  | -100
unit4  | 100 | -6  | -100
unit5  | 100 | -8  | -80
unit6  | 100 | -10 | -50
constr | 0   | 2   | 200

Model
Decision variables: u1 = 0.01, v1 = 0.166667, v2 = 0.003333
Maximise = 1

Constraint | Lhs      |    | Rhs
unit1      | 0        | <= | 0
unit2      | -0.16667 | <= | 0
unit3      | 1.39E-15 | <= | 0
unit4      | -0.33333 | <= | 0
unit5      | -0.6     | <= | 0
unit6      | -0.83333 | <= | 0
constr     | 1        | =  | 1
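Reading the worksheets above as a standard efficiency-ratio (DEA-type) model, in which 100*u1 is
maximised subject to 100*u1 - v1*(labour hours of unit j) - v2*(material dollars of unit j) <= 0 for
every unit j, and with v1, v2 scaled so that the evaluated unit's weighted inputs equal 1, the same
linear programme can be reproduced with an off-the-shelf solver. This reading is an interpretation
of the spreadsheet rather than something stated in the report, and the code below is only a sketch:

# Sketch of the efficiency (ratio) model shown above for unit 1, assuming
# the spreadsheet maximises 100*u1 subject to
#   100*u1 - v1*labour_j - v2*materials_j <= 0   for every unit j, and
#   v1*labour_1 + v2*materials_1 = 1             for the unit being evaluated.
from scipy.optimize import linprog

meals     = [100, 100, 100, 100, 100, 100]
labour    = [2, 4, 4, 6, 8, 10]
materials = [200, 150, 100, 100, 80, 50]
k = 0                                 # index of the unit being evaluated (unit 1)

c = [-meals[k], 0, 0]                 # maximise meals[k]*u1 -> minimise its negative
A_ub = [[meals[j], -labour[j], -materials[j]] for j in range(6)]
b_ub = [0.0] * 6
A_eq = [[0, labour[k], materials[k]]]
b_eq = [1.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 3)
print("u1, v1, v2 =", res.x, " efficiency =", -res.fun)
# For unit 1 this reproduces the spreadsheet result: efficiency = 1.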
Microsoft Excel 12.0 Answer Report
Worksheet: [optimising problem.xlsx]unit3
Report Created: 2/17/2010 11:03:34 AM

Adjustable Cells
Cell  | Name     | Original Value | Final Value
$B$16 | ratio u1 | 0.01           | 0.01
$C$16 | ratio v1 | 0.166666667    | 0.0625
$D$16 | ratio v2 | 0.003333333    | 0.0075

Constraints
Cell  | Name       | Cell Value  | Formula      | Status      | Slack
$B$21 | unit1 Lhs  | -0.625      | $B$21<=$D$21 | Not Binding | 0.625
$B$22 | unit2 Lhs  | -0.375      | $B$22<=$D$22 | Not Binding | 0.375
$B$23 | unit3 Lhs  | 4.55191E-14 | $B$23<=$D$23 | Binding     | 0
$B$24 | unit4 Lhs  | -0.125      | $B$24<=$D$24 | Not Binding | 0.125
$B$25 | unit5 Lhs  | -0.1        | $B$25<=$D$25 | Not Binding | 0.1
$B$26 | unit6 Lhs  | 1.36335E-13 | $B$26<=$D$26 | Binding     | 0
$B$27 | constr Lhs | 1           | $B$27=$D$27  | Not Binding | 0
Microsoft Excel 12.0 Sensitivity Report
Worksheet: [optimising problem.xlsx]Sheet3
Report Created: 2/17/2010 10:52:48 AM

Adjustable Cells
Cell  | Name     | Final Value | Reduced Cost | Objective Coefficient | Allowable Increase | Allowable Decrease
$B$16 | ratio u1 | 0.008888889 | 0 | 100 | 1E+30       | 100
$C$16 | ratio v1 | 0.055555556 | 0 | 0   | 2           | 7
$D$16 | ratio v2 | 0.006666667 | 0 | 0   | 116.6666667 | 33.33333333

Constraints
Cell  | Name       | Final Value  | Shadow Price | Constraint R.H. Side | Allowable Increase | Allowable Decrease
$B$21 | unit1 Lhs  | -0.555555556 | 0           | 0 | 1E+30       | 0.555555556
$B$22 | unit2 Lhs  | -0.333333333 | 0           | 0 | 1E+30       | 0.333333333
$B$23 | unit3 Lhs  | 3.63043E-14  | 0.777777778 | 0 | 0.142857143 | 0.5
$B$24 | unit4 Lhs  | -0.111111111 | 0           | 0 | 1E+30       | 0.111111111
$B$25 | unit5 Lhs  | -0.088888889 | 0           | 0 | 1E+30       | 0.088888889
$B$26 | unit6 Lhs  | 1.2701E-13   | 0.222222222 | 0 | 0.153846154 | 0.625
$B$27 | constr Lhs | 1            | 0.888888889 | 1 | 1E+30       | 1
unit4

units  | u1  | v1  | v2   | Available
unit1  | 100 | -2  | -200 | 0
unit2  | 100 | -4  | -150 | 0
unit3  | 100 | -4  | -100 | 0
unit4  | 100 | -6  | -100 | 0
unit5  | 100 | -8  | -80  | 0
unit6  | 100 | -10 | -50  | 0
constr | 0   | 6   | 100  | 1

Model
Decision variables: u1 = 0.008889, v1 = 0.055556, v2 = 0.006667
Maximise = 0.888889

Constraint | Lhs      |    | Rhs
unit1      | -0.55556 | <= | 0
unit2      | -0.33333 | <= | 0
unit3      | 3.63E-14 | <= | 0
unit4      | -0.11111 | <= | 0
unit5      | -0.08889 | <= | 0
unit6      | 1.27E-13 | <= | 0
constr     | 1        | =  | 1
unit6

units  | u1  | v1  | v2   | Available
unit1  | 100 | -2  | -200 | 0
unit2  | 100 | -4  | -150 | 0
unit3  | 100 | -4  | -100 | 0
unit4  | 100 | -6  | -100 | 0
unit5  | 100 | -8  | -80  | 0
unit6  | 100 | -10 | -50  | 0
constr | 0   | 10  | 50   | 1

Model
Decision variables: u1 = 0.01, v1 = 0.0625, v2 = 0.0075
Maximise = 1

Constraint | Lhs    |    | Rhs
unit1      | -0.625 | <= | 0
unit2      | -0.375 | <= | 0
unit3      | 0      | <= | 0
unit4      | -0.125 | <= | 0
unit5      | -0.1   | <= | 0
unit6      | 0      | <= | 0
constr     | 1      | =  | 1
Linear Programming Problem
RMC

MATERIAL REQUIREMENT
MATERIAL       | FUEL ADDITIVE | SOLVENT BASE | AMOUNT AVAILABLE
MATERIAL 1     | 0.4           | 0.5          | 20
MATERIAL 2     | 0             | 0.2          | 5
MATERIAL 3     | 0.6           | 0.3          | 21
PROFIT PER TON | 40            | 30           |

MODEL
DECISION VARIABLE | FUEL ADDITIVE | SOLVENT BASE
TONS PRODUCED     | 25            | 20
MAX TOTAL PROFIT = 1600
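The RMC model above can be checked with a small LP solver; the coefficients are taken directly
from the material-requirements table, and the result should match the Solver output that follows:

# Check of the RMC model above: maximise 40*F + 30*S subject to the three
# material constraints from the table (F = tons of fuel additive,
# S = tons of solvent base).
from scipy.optimize import linprog

c = [-40, -30]                       # negate profit to maximise
A_ub = [[0.4, 0.5],                  # material 1
        [0.0, 0.2],                  # material 2
        [0.6, 0.3]]                  # material 3
b_ub = [20, 5, 21]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("tons produced (F, S) =", res.x, " max profit =", -res.fun)
# Expected: F = 25, S = 20, profit = 1600, matching the Solver output.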
Constraints
Cell  | Name                         | Cell Value | Formula      | Status      | Slack
$B$17 | MATERIAL 1 AMOUNT USED (LHS) | 20         | $B$17<=$D$17 | Binding     | 0
$B$18 | MATERIAL 2 AMOUNT USED (LHS) | 4          | $B$18<=$D$18 | Not Binding | 1
$B$19 | MATERIAL 3 AMOUNT USED (LHS) | 21         | $B$19<=$D$19 | Binding     | 0
Adjustable Cells
Cell  | Name                        | Final Value | Reduced Gradient
$B$12 | TONS PRODUCED FUEL ADDITIVE | 25          | 0
$C$12 | TONS PRODUCED SOLVENT BASE  | 20          | 0

Constraints
Cell  | Name                         | Final Value | Lagrange Multiplier
$B$17 | MATERIAL 1 AMOUNT USED (LHS) | 20          | 33.33333366
$B$18 | MATERIAL 2 AMOUNT USED (LHS) | 4           | 0
$B$19 | MATERIAL 3 AMOUNT USED (LHS) | 21          | 44.44444213
Target
Cell  | Name                           | Value
$B$14 | MAX TOTAL PROFIT FUEL ADDITIVE | 1600

Adjustable Cells (Limits Report)
Cell  | Name                        | Value | Lower Limit | Target Result | Upper Limit | Target Result
$B$12 | TONS PRODUCED FUEL ADDITIVE | 25    | 0           | 600           | 25          | 1600
$C$12 | TONS PRODUCED SOLVENT BASE  | 20    | 0           | 1000          | 20          | 1600
Microsoft Excel 10.0 Answer Report
Worksheet: [SHAKTI SOLVER.xls]RMC

Target Cell: $B$14 MAX TOTAL PROFIT FUEL ADDITIVE
Adjustable Cells: $B$12 TONS PRODUCED FUEL ADDITIVE; $C$12 TONS PRODUCED SOLVENT BASE
Constraints: $B$17 MATERIAL 1 AMOUNT USED (LHS); $B$18 MATERIAL 2 AMOUNT USED (LHS);
$B$19 MATERIAL 3 AMOUNT USED (LHS)
Adjustable Cells
Cell  | Name                        | Final Value | Reduced Cost | Objective Coefficient | Allowable Increase | Allowable Decrease
$B$12 | TONS PRODUCED FUEL ADDITIVE | 25          | 0 | 40 | 20 | 16
$C$12 | TONS PRODUCED SOLVENT BASE  | 20          | 0 | 30 | 20 | 10

Constraints
Cell  | Name                         | Final Value | Shadow Price | Constraint R.H. Side | Allowable Increase | Allowable Decrease
$B$17 | MATERIAL 1 AMOUNT USED (LHS) | 20 | 33.33333333 | 20 | 1.5   | 1E+30
$B$18 | MATERIAL 2 AMOUNT USED (LHS) | 4  | 0           | 5  | 1E+30 | 1
$B$19 | MATERIAL 3 AMOUNT USED (LHS) | 21 | 44.44444444 | 21 | 1E+30 | 2.25
Target
Cell  | Name                           | Value
$B$14 | MAX TOTAL PROFIT FUEL ADDITIVE | 1600
Adjustable Cells
Cell  | Name                     | Original Value | Final Value
$B$12 | UNITS PRODUCED AEROPLANE | 0              | 84
$C$12 | UNITS PRODUCED BOAT      | 0              | 44

Constraints
Cell  | Name                           | Cell Value | Formula      | Status      | Slack
$B$17 | LABOUR AMOUNT USED (LHS)       | 560        | $B$17<=$D$17 | Binding     | 0
$B$18 | MACHINE TIME AMOUNT USED (LHS) | 300        | $B$18<=$D$18 | Binding     | 0
$B$19 | P&V AMOUNT USED (LHS)          | 128        | $B$19<=$D$19 | Not Binding | 12
SHAKTI
MATERIAL REQUIREMENT
RESOURCE        | AEROPLANE | BOAT | AMOUNT AVAILABLE
LABOUR          | 3         | 7    | 560
MACHINE TIME    | 2         | 3    | 300
P&V             | 1         | 1    | 140
PROFIT PER UNIT | 5         | 9    |

MODEL
DECISION VARIABLE | AEROPLANE | BOAT
UNITS PRODUCED    | 84        | 44
MAX TOTAL PROFIT = 816

CONSTRAINTS (AMOUNT USED, LHS)
LABOUR       | 560 | <= | 560
MACHINE TIME | 300 | <= | 300
P&V          | 128 | <= | 140

Microsoft Excel 10.0 Limits Report
Worksheet: [SHAKTI SOLVER.xls]SHAKTI

Target
Cell  | Name                       | Value
$B$14 | MAX TOTAL PROFIT AEROPLANE | 816

Adjustable Cells
Cell  | Name                     | Value | Lower Limit | Target Result
$B$12 | UNITS PRODUCED AEROPLANE | 84    | 0           | 396
$C$12 | UNITS PRODUCED BOAT      | 44    | 0           | 420
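The SHAKTI model above can be verified in the same way; the coefficients come from the
material-requirements table for aeroplanes and boats:

# Check of the SHAKTI model above: maximise 5*A + 9*B (A = aeroplanes,
# B = boats) subject to the labour, machine-time and P&V constraints.
from scipy.optimize import linprog

c = [-5, -9]                         # negate profit to maximise
A_ub = [[3, 7],                      # labour hours
        [2, 3],                      # machine time
        [1, 1]]                      # P&V
b_ub = [560, 300, 140]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("units produced (A, B) =", res.x, " max profit =", -res.fun)
# Expected: A = 84, B = 44, profit = 816, matching the Solver output.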
ANOVA for Multiple Linear Regression
STOCKS TAKEN: BHARTI, ACC, DR.REDDY, GAIL, HDFC, ICICI, NTPC, ONGC; NSE (index)
Regression Statistics
Multiple R        | 0.985220985
R Square          | 0.970660389
Adjusted R Square | 0.967488539
Standard Error    | 29.6098491
Observations      | 83
ANOVA
           | df | SS          | MS          | F           | Significance F
Regression | 8  | 2146431.618 | 268303.9523 | 306.0234324 | 1.78981E-53
Residual   | 74 | 64878.99411 | 876.7431636 |             |
Total      | 82 | 2211310.612 |             |             |
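As a quick consistency check (not part of the original output), R Square equals the regression sum
of squares divided by the total sum of squares:

# Consistency check of the regression summary against the ANOVA table:
# R-square = SS_regression / SS_total.
ss_regression = 2146431.618
ss_total      = 2211310.612
print(ss_regression / ss_total)      # about 0.97066, matching R Square above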
             | Coefficients | Standard Error | t Stat       | P-value     | Lower 95%    | Upper 95%
Intercept    | -546.2945726 | 131.7849181    | -4.145349715 | 8.93226E-05 | -808.88175   | -283.7073952
X Variable 1 | 1.114234324  | 0.299353937    | 3.722130182  | 0.000382631 | 0.517758593  | 1.710710056
X Variable 2 | 0.572745909  | 0.121036434    | 4.73201243   | 1.04156E-05 | 0.331575554  | 0.813916264
X Variable 3 | 0.128491035  | 0.084734122    | 1.516402504  | 0.133678019 | -0.040345386 | 0.297327455
X Variable 4 | 0.945501782  | 0.34220857     | 2.762940106  | 0.007222806 | 0.263636333  | 1.627367232
X Variable 5 | 0.768518765  | 0.083556033    | 9.197645467  | 7.04942E-14 | 0.602029738  | 0.935007792
X Variable 6 | 0.93653779   | 0.150406245    | 6.226721457  | 2.61521E-08 | 0.63684681   | 1.23622877
X Variable 7 | 4.621500229  | 0.622053595    | 7.429424516  | 1.54342E-10 | 3.382031403  | 5.860969055
X Variable 8 | 0.954455322  | 0.13087434     | 7.292914119  | 2.78481E-10 | 0.693682511  | 1.215228133
RESIDUAL OUTPUT
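For completeness, a sketch of how an equivalent regression, including the residual output, could be
produced outside Excel is given below. The file name and column labels are assumed for
illustration; the original analysis was produced with Excel's Data Analysis tool:

# Sketch of reproducing the regression output above with statsmodels.
# The CSV file name and column names are assumed for illustration; the
# original analysis was run with Excel's Data Analysis add-in.
import pandas as pd
import statsmodels.api as sm

data = pd.read_csv("stock_prices.csv")             # hypothetical data file
X = data[["BHARTI", "ACC", "DR_REDDY", "GAIL",
          "HDFC", "ICICI", "NTPC", "ONGC"]]
y = data["NSE"]

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary())                             # coefficients, R-square, ANOVA F
# model.resid would supply the values for the residual output section above.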