
Quantitative Analysis

FRM 2013 Study Notes Part1.Topic2

By David Harper, CFA FRM CIPM www.bionicturtle.com

Table of Contents
Miller, Chapter 2: Probabilities .................................................................. 2
Miller, Chapter 3: Basic Statistics .............................................................. 20
Miller, Chapter 4: Distributions ................................................................. 36
Miller, Chapter 5: Hypothesis Testing and Confidence Intervals ................................... 60
Stock & Watson's Probability and Statistics Review (Chapters 2 & 3) ............................. 75
Stock, Chapter 4: Linear Regression with One Regressor .......................................... 78
Stock, Chapter 5: Single Regression: Hypothesis Tests and Confidence Intervals .................. 90
Stock, Chapter 6: Linear Regression with Multiple Regressors .................................... 95
Stock, Chapter 7: Hypothesis Tests and Confidence Intervals in Multiple Regression ............. 100
Jorion, Chapter 12: Monte Carlo Methods ........................................................ 104
Hull, Chapter 22: Estimating Volatilities and Correlations ..................................... 115
Allen, Boudoukh, and Saunders, Chapter 2: Quantifying Volatility in VaR Models ................. 125


Miller, Chapter 2: Probabilities
In this chapter
- Describe the concept of probability.
- Describe and distinguish between continuous and discrete random variables.
- Define and distinguish between the probability density function, the cumulative distribution function and the inverse cumulative distribution function, and calculate probabilities based on each of these functions.
- Calculate the probability of an event given a discrete probability function.
- Distinguish between independent and mutually exclusive events.
- Define joint probability, describe a probability matrix and calculate joint probabilities using probability matrices.
- Define and calculate a conditional probability, and distinguish between conditional and unconditional probabilities.
- Describe Bayes' Theorem and apply it in the calculation of conditional probabilities.

Terminology (for discrete/compound, these do not need to be memorized)


- Statistical or random experiment: an observation or measurement process with multiple but uncertain outcomes.
- Population or sample space: the set of all possible outcomes of an experiment.
- Sample point: each member or outcome of the sample space.
- Outcome: the result of a single trial. For example, if we roll two dice, an outcome might be a three (3) and a four (4); a different outcome might be a five (5) and a two (2).
- Event: a result that reflects none, one, or more outcomes in the sample space. Events can be simple or compound. An event is a subset of the sample space. If we roll two dice, an example of an event might be rolling a seven (7) in total.
- Random variable (or stochastic variable): a variable whose value is determined by the outcome of an experiment.
- Discrete random variable: an r.v. that can take a finite (or countably infinite) number of values. For example, a coin, a six-sided die, a bond default (yes or no).
- Continuous random variable: an r.v. that can take any value in some interval. For example, asset returns, time.
- Mutually exclusive events: events which cannot simultaneously occur. If A and B are mutually exclusive, the probability of (A and B) is zero. Put another way, their intersection is the null set.
- Collectively exhaustive events (a.k.a., cumulatively exhaustive): events that cumulatively describe all possible outcomes.


Describe the concept of probability.


Assume a discrete random variable X, which can take various values, x(i). The probability of any given x(i) occurring is p(i), which is represented by the following:

P[X = x(i)] = p(i),   s.t.   x ∈ {x1, x2, ..., xn}

where P[ ] is the probability operator

The probability (p) of an event always falls between zero (0) and one (1.0 or 100%). In the case of a discrete random variable, the probability function gives the probability that the variable will equal a certain value. In the case of a continuous random variable, the probability function gives the likelihood that the random variable will fall within a specified interval. If f(x) is a probability function, in order to satisfy the definition of a probability the following two conditions must be true:

- The probability function f(x) must be greater than (or equal to) zero and less than (or equal to) one; for example, there is no such thing as a -30% probability of occurrence or a 120% probability of occurrence.
- The probabilities of the mutually exclusive outcomes must sum to one: if the outcomes are mutually exclusive and cumulatively exhaustive (we include all possible outcomes), the sum of their probabilities must equal one.

In mathematical terms, these conditions are represented as properties of a probability, where f(x) characterizes a discrete random variable:

1st condition:  0 ≤ f(x) ≤ 1
2nd condition:  Σ f(x) = 1  (summed over all values of x)

A two-sided coin is the classic discrete random variable (Bernoulli). If we flip a coin, there are two possible outcomes: heads or tails. The first condition above insists that the probability of flipping a head (or a tail) must lie between 0% and 100%; for a fair coin, the probability of a head is 50%. The second condition insists that the probabilities must sum to 100%: a 50% probability of heads plus a 50% probability of tails = 100% (1.0). In other words, outcome = heads and outcome = tails cover all the possible outcomes; we aren't omitting another possible outcome.
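To make these two conditions concrete, here is a minimal Python sketch (an illustration added to these notes, not part of the Miller reading) that checks both conditions for a fair coin and a fair six-sided die:

def is_valid_probability_function(probs):
    """Every f(x) must lie in [0, 1], and the probabilities must sum to one."""
    each_in_range = all(0.0 <= p <= 1.0 for p in probs.values())
    sums_to_one = abs(sum(probs.values()) - 1.0) < 1e-12
    return each_in_range and sums_to_one

coin = {"heads": 0.5, "tails": 0.5}          # Bernoulli: two outcomes
die = {face: 1 / 6 for face in range(1, 7)}  # discrete uniform on 1..6
bad = {"heads": 0.5, "tails": 0.7}           # sums to 1.2, violates condition 2

print(is_valid_probability_function(coin))   # True
print(is_valid_probability_function(die))    # True
print(is_valid_probability_function(bad))    # False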

In contrast to the case of a discrete random variable, a continuous random variable (for example, an asset return) does not take exact outcomes like 1.0 or 3.5. Strictly, we must define continuous outcomes over an interval: instead of P[X = x] we need P[x1 < X < x2]. For a continuous variable, the probability of any specific value occurring is zero.


Describe and distinguish between continuous and discrete random variables.


We characterize (describe) a random variable with a probability distribution. The random variable can be discrete or continuous; and in either the discrete or continuous case, the probability can be local (PMF, PDF) or cumulative (CDF). A random variable is a variable whose value is determined by the outcome of an experiment (a.k.a., a stochastic variable). "A random variable is a numerical summary of a random outcome. The number of times your computer crashes while you are writing a term paper is random and takes on a numerical value, so it is a random variable." (Stock & Watson)
                 Probability function (pdf, pmf)        Cumulative distribution function (CDF)
Continuous       Pr(c1 ≤ Z ≤ c2) = F(c2) - F(c1)        Pr(Z ≤ c) = F(c)
Discrete         Pr(X = 3), for example                 Pr(X ≤ 3), for example

where F( ) denotes the cumulative distribution function of Z.
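As an illustration of the four cells in the table above, the following sketch evaluates a "local" and a cumulative probability for one discrete and one continuous variable. It uses scipy.stats, which is a library choice made for this illustration and is not part of the notes; the Binomial(10, 0.5) and standard normal parameters are likewise arbitrary examples.

from scipy.stats import binom, norm

# Discrete: X ~ Binomial(n=10, p=0.5)
print(binom.pmf(3, n=10, p=0.5))    # Pr(X = 3), the "local" probability mass
print(binom.cdf(3, n=10, p=0.5))    # Pr(X <= 3), the cumulative probability

# Continuous: Z ~ standard normal, with CDF F(.)
c1, c2 = -1.0, 1.0
print(norm.cdf(c2) - norm.cdf(c1))  # Pr(c1 <= Z <= c2) = F(c2) - F(c1), ~0.6827
print(norm.cdf(1.645))              # Pr(Z <= c) = F(c), ~0.95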

Continuous random variable


A continuous random variable (X) has an infinite number of values within an interval:

P(a ≤ X ≤ b) = ∫[a, b] f(x) dx
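A quick numerical check of this identity, assuming (purely for illustration) that f(x) is the standard normal density:

from scipy.integrate import quad
from scipy.stats import norm

a, b = -1.96, 1.96
area, _ = quad(norm.pdf, a, b)      # numerically integrate f(x) over [a, b]
print(area)                         # ~0.95
print(norm.cdf(b) - norm.cdf(a))    # the same probability via the CDF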


Discrete random variable


A discrete random variable (X) assumes a value among a finite set including x1, x2, x3 and so on. The probability function is expressed by:

P(X = xk) = f(xk)

Notes on continuous versus discrete random variables


- Discrete random variables can be counted; continuous random variables must be measured. Examples of a discrete random variable include: a coin toss (heads or tails, nothing in between); a roll of the dice (1, 2, 3, 4, 5, 6); and "did the fund beat the benchmark?" (yes, no). In risk, common discrete random variables are default/no default (0/1) and loss frequency. Examples of continuous random variables include distance and time. A common example of a continuous variable, in risk, is loss severity.
- Note the similarity between the summation (Σ) under the discrete variable and the integral (∫) under the continuous variable. The summation (Σ) of all discrete outcomes must equal one. Similarly, the integral (∫) captures the area under the continuous distribution function. The total area under this curve, from (-∞) to (+∞), must equal one; see the short sketch after this list.
- All four of the so-called sampling distributions (each of which converges to the normal) are continuous: normal, Student's t, chi-square, and the F distribution.
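The sketch below (an added illustration, not from the reading) confirms the point in the second bullet: a discrete probability mass function sums to one, and a continuous density integrates to one from -∞ to +∞. The Poisson(3) and standard normal choices are arbitrary.

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm, poisson

# Discrete: the Poisson pmf summed over a wide range of counts
k = np.arange(0, 100)
print(poisson.pmf(k, mu=3).sum())   # ~1.0

# Continuous: the normal pdf integrated over the whole real line
area, _ = quad(norm.pdf, -np.inf, np.inf)
print(area)                         # ~1.0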


Summary

                      Discrete                                   Continuous
                      Are counted                                Are measured
                      Finite                                     Infinite

Examples in finance   Default (1, 0)                             Distance, time
                      Frequency of loss                          Severity of loss
                                                                 Asset returns

For example           Bernoulli (0/1)                            Normal
                      Binomial (series of i.i.d. Bernoullis)     Student's t
                      Poisson                                    Chi-square
                      Logarithmic                                F distribution
                                                                 Lognormal
                                                                 Exponential
                                                                 Gamma, Beta
                                                                 EVT distributions (GPD, GEV)
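For readers who want to experiment with the distributions in this table, the mapping below shows where many of them live in scipy.stats. The mapping is an illustration added to these notes; the scipy names (such as genpareto for the GPD and genextreme for the GEV) are the library's conventions, not terminology from the reading.

from scipy import stats

continuous = {
    "Normal": stats.norm, "Student's t": stats.t, "Chi-square": stats.chi2,
    "F": stats.f, "Lognormal": stats.lognorm, "Exponential": stats.expon,
    "Gamma": stats.gamma, "Beta": stats.beta,
    "GPD": stats.genpareto, "GEV": stats.genextreme,
}
discrete = {
    "Bernoulli": stats.bernoulli, "Binomial": stats.binom,
    "Poisson": stats.poisson, "Logarithmic": stats.logser,
}
print(sorted(continuous))  # continuous examples from the table
print(sorted(discrete))    # discrete examples from the table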

