3.2 Unconstrained Optimization Multiple Variables

This document discusses unconstrained optimization problems involving multiple variables. It uses the example of a Cournot duopoly to demonstrate how to set up and solve an optimization problem with two choice variables (quantity for each of two firms). The optimization problems are set up by defining the profit functions for each firm in terms of their own and the other firm's quantity. Taking derivatives of the profit functions yields a system of first-order conditions that can be solved to find the profit-maximizing quantities. The Hessian matrix is then used to check that these quantities represent a maximum rather than a minimum.


Unconstrained Optimization: Multiple Variables

Basic Math for Economics – Refresher

Eric Dunaway epdunaway@gmail.com


Introduction
 While working with just one variable is simple, it’s rare in economics.
 Often, we must work with many variables all at the same time.
 This complicates our optimization slightly, but the same rules still apply.
 Let’s take a look at another unconstrained optimization problem, but this time through the lens of a Cournot duopoly.
 Recall that in a Cournot duopoly, two firms simultaneously choose their quantities, but face the same market price.



Unconstrained Optimization
 Let’s use the same market as last time, but rather than there being a monopolist, now there are two firms and they compete in quantities.
 Both firms face the inverse demand function
𝑝 = 250 − 2𝑞1 − 2𝑞2
and face a constant marginal cost of production of 𝑐 = 50.
 As before, we will set up our problem and then optimize.
 Note that we actually have two optimization problems here. Each firm maximizes its own profits, so we need to treat them as separate optimization decisions.
Unconstrained Optimization
 Let’s start with firm 1.
 They want to maximize their profits, and the only thing they can choose is their own quantity, 𝑞1. Thus, we have,
max_{𝑞1} 𝜋1
 Next, they want to maximize their own profits (they don’t care about firm 2’s profits at all), but their inverse demand function is also a function of firm 2’s quantity, so we must include it.
max_{𝑞1} (250 − 2𝑞1 − 2𝑞2)𝑞1 − 50𝑞1



Unconstrained Optimization
max_{𝑞1} (250 − 2𝑞1 − 2𝑞2)𝑞1 − 50𝑞1
 From here, we can calculate a first-order condition as before (remember to treat 𝑞2 as a constant),
𝜕𝜋1/𝜕𝑞1 = 250 − 4𝑞1 − 2𝑞2 − 50 = 0
 We can perform the same analysis for firm 2, obtaining the following optimization problem,
max_{𝑞2} (250 − 2𝑞1 − 2𝑞2)𝑞2 − 50𝑞2
and first-order condition,
𝜕𝜋2/𝜕𝑞2 = 250 − 2𝑞1 − 4𝑞2 − 50 = 0
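As a quick sanity check (a sketch, not part of the slides), firm 1’s profit function and the hand-derived partial derivative can be compared against a central finite difference; the function names here are made up for illustration:

```python
# Sketch: verify the hand-derived first-order condition numerically.
# The profit function comes from the slides; the helper names are mine.

def profit1(q1, q2):
    # pi_1 = (250 - 2*q1 - 2*q2) * q1 - 50 * q1
    return (250 - 2 * q1 - 2 * q2) * q1 - 50 * q1

def foc1(q1, q2):
    # Analytic derivative from the slides: d(pi_1)/d(q1)
    return 250 - 4 * q1 - 2 * q2 - 50

h = 1e-6
q1, q2 = 30.0, 40.0
numeric = (profit1(q1 + h, q2) - profit1(q1 - h, q2)) / (2 * h)
print(abs(numeric - foc1(q1, q2)) < 1e-4)  # True: the derivative matches
```

Because the profit function is quadratic in 𝑞1, the central difference agrees with the analytic derivative up to floating-point error.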



Unconstrained Optimization
𝜕𝜋1/𝜕𝑞1 = 250 − 4𝑞1 − 2𝑞2 − 50 = 0
𝜕𝜋2/𝜕𝑞2 = 250 − 2𝑞1 − 4𝑞2 − 50 = 0
 This is just a system of two equations in two unknowns, and it is fairly easy to solve.
 Doing so yields 𝑞1∗ = 𝑞2∗ = 33.3, which is notably smaller than what a monopolist produces (which is expected).
 We still need to check whether this is a maximum, however.
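As a sketch of the arithmetic (not from the slides), the two conditions rearrange to 4𝑞1 + 2𝑞2 = 200 and 2𝑞1 + 4𝑞2 = 200, which Cramer’s rule solves directly:

```python
# Sketch: solve the first-order conditions as a 2x2 linear system
# via Cramer's rule. Coefficients come from rearranging the FOCs:
#   4*q1 + 2*q2 = 200
#   2*q1 + 4*q2 = 200
a, b, e = 4.0, 2.0, 200.0   # first equation
c, d, f = 2.0, 4.0, 200.0   # second equation

det = a * d - b * c          # 16 - 4 = 12
q1_star = (e * d - b * f) / det
q2_star = (a * f - e * c) / det
print(round(q1_star, 2), round(q2_star, 2))  # 33.33 33.33
```

The exact solution is 𝑞1∗ = 𝑞2∗ = 100/3, which the slides round to 33.3.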
Unconstrained Optimization
𝜕𝜋1/𝜕𝑞1 = 250 − 4𝑞1 − 2𝑞2 − 50 = 0
𝜕𝜋2/𝜕𝑞2 = 250 − 2𝑞1 − 4𝑞2 − 50 = 0
 This is a bit harder, and requires the use of a matrix called the Hessian.
 To calculate the Hessian, we need to take the derivative of each of our first-order conditions with respect to each of our choice variables (4 derivatives total in this case).
 The first row, first column of our Hessian is the derivative of our first first-order condition with respect to our first variable.
 The first row, second column of our Hessian is the derivative of our first first-order condition with respect to our second variable.



Unconstrained Optimization
 Calculating these derivatives,
𝐻 = [ 𝜕²𝜋1/𝜕𝑞1² = −4      𝜕²𝜋1/𝜕𝑞1𝜕𝑞2 = −2
      𝜕²𝜋2/𝜕𝑞2𝜕𝑞1 = −2    𝜕²𝜋2/𝜕𝑞2² = −4 ]
 Or, reducing this matrix a little bit, we have
𝐻 = [ −4  −2
      −2  −4 ]
 Our problem is maximized if this matrix is negative semidefinite.
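The 2x2 check described next can be written out in a few lines. A minimal sketch (the variable names are mine): test the diagonal entries and the determinant of the Hessian above.

```python
# Sketch: the 2x2 second-order check -- diagonal entries <= 0
# and determinant > 0 -- applied to the Hessian from the slides.
H = [[-4.0, -2.0],
     [-2.0, -4.0]]

diag_ok = H[0][0] <= 0 and H[1][1] <= 0
det_H = H[0][0] * H[1][1] - H[0][1] * H[1][0]   # 16 - 4 = 12
is_max = diag_ok and det_H > 0
print(is_max)  # True: the candidate quantities are a maximum
```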
Unconstrained Optimization
𝐻 = [ −4  −2
      −2  −4 ]
 Negative semidefinite?
 Leaving out most of the math, it means that certain elements of the Hessian must take certain values.
 For a 2x2 matrix, if all the elements along the main diagonal (the entries that make up the trace) are less than or equal to zero and the determinant is positive, then the matrix is negative semidefinite.
 First, let’s check the diagonal. Both elements along it are equal to −4, so they satisfy the requirement that they are less than or equal to zero.
Unconstrained Optimization
𝐻 = [ −4  −2
      −2  −4 ]
 Now, let’s calculate the determinant.
|𝐻| = (−4)(−4) − (−2)(−2) = 16 − 4 = 12
 Since the determinant of the Hessian is positive, that along with all the diagonal elements being negative means that our matrix is negative semidefinite and we have a maximum rather than a minimum.
 It’s useful to check this when dealing with less well-behaved functional forms.



Unconstrained Optimization
 If you are looking for a minimum, rather than a maximum, the Hessian needs to be positive semidefinite.
 For a 2x2 Hessian, all the elements along the main diagonal must be greater than or equal to zero while the determinant is still positive.
 If we move beyond a 2x2 Hessian (i.e., we have more than two choice variables), this becomes more complicated.
 Our leading principal minors must have alternating signs based upon their order: for a maximum, the 𝑘-th leading principal minor takes the sign of (−1)^𝑘.
 A linear algebra course would be helpful for this.
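The general alternating-sign test can be sketched directly. This version checks the strict (negative definite) condition, which is sufficient for a maximum; the determinant helper is a naive Laplace expansion, fine for small matrices, and all names here are mine:

```python
# Sketch: check negative definiteness of an n x n Hessian by testing
# that the k-th leading principal minor has sign (-1)^k.

def det(m):
    # Determinant by Laplace expansion along the first row.
    if len(m) == 1:
        return m[0][0]
    total = 0.0
    for j in range(len(m)):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def negative_definite(H):
    # Sufficient condition for a maximum: leading principal minors
    # alternate in sign, starting negative for k = 1.
    for k in range(1, len(H) + 1):
        leading = [row[:k] for row in H[:k]]
        if (-1) ** k * det(leading) <= 0:
            return False
    return True

H = [[-4.0, -2.0],
     [-2.0, -4.0]]
print(negative_definite(H))  # True
```

For the duopoly Hessian, the minors are −4 (sign of (−1)¹) and 12 (sign of (−1)²), so the test passes, matching the 2x2 check on the previous slides.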