Quantitative Analysis: FRM 2013 Study Notes - Part1.Topic2
Table of Contents
Miller, Chapter 2: Probabilities
Miller, Chapter 3: Basic Statistics
Miller, Chapter 4: Distributions
Miller, Chapter 5: Hypothesis Testing and Confidence Intervals
Stock & Watson's Probability and Statistics Review (Chapters 2 & 3)
Stock, Chapter 4: Linear Regression with One Regressor
Stock, Chapter 5: Single Regression: Hypothesis Tests and Confidence Intervals
Stock, Chapter 6: Linear Regression with Multiple Regressors
Stock, Chapter 7: Hypothesis Tests and Confidence Intervals in Multiple Regression
Jorion, Chapter 12: Monte Carlo Methods
Hull, Chapter 22: Estimating Volatilities and Correlations
Allen, Boudoukh, and Saunders, Chapter 2: Quantifying Volatility in VaR Models
Miller, Chapter 2:
Probabilities
In this chapter
- Describe the concept of probability.
- Describe and distinguish between continuous and discrete random variables.
- Define and distinguish between the probability density function, the cumulative distribution function and the inverse cumulative distribution function, and calculate probabilities based on each of these functions.
- Calculate the probability of an event given a discrete probability function.
- Distinguish between independent and mutually exclusive events.
- Define joint probability, describe a probability matrix and calculate joint probabilities using probability matrices.
- Define and calculate a conditional probability, and distinguish between conditional and unconditional probabilities.
- Describe Bayes' Theorem and apply it in the calculation of conditional probabilities.
For a discrete random variable, each possible outcome is assigned a probability:

$$P[X = x_i] = p_i \quad \text{s.t.} \quad x \in \{x_1, x_2, \ldots, x_n\}$$

where $P[\cdot]$ denotes the probability operator and $p_i$ is the probability that $X$ takes the value $x_i$.
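As a simple numeric illustration (this example is mine, not from Miller's text), a fair six-sided die assigns equal probability to each of its six outcomes:

$$P[X = x_i] = \frac{1}{6} \quad \text{for each } x_i \in \{1, 2, 3, 4, 5, 6\}$$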
The probability (p) of an event always falls between zero (0) and one (1.0 or 100%). In the case of a discrete random variable, the probability function gives the probability that the variable will equal a certain value. In the case of a continuous random variable, the probability function gives the likelihood that the random variable will fall within a specified interval. If f(x) is a probability function, the following two conditions must be true in order to satisfy the definition of a probability:

- The probability function f(x) must be greater than (or equal to) zero and less than (or equal to) one; for example, there is no such thing as a -30% probability of occurrence or a 120% probability of occurrence.
- The sum of the mutually exclusive probabilities must equal one: if the probabilities are mutually exclusive and collectively exhaustive (we include all possible outcomes), the sum of those probabilities must equal one.
In mathematical terms, these conditions are represented as properties of a probability, where f(x) characterizes a discrete random variable:
$$0 \le f(x) \le 1, \qquad \sum_{x} f(x) = 1$$
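The sketch below (my own illustration, not from the source notes) checks these two conditions for a discrete probability function, using a fair coin as the random variable:

```python
def is_valid_pmf(pmf, tol=1e-12):
    """Check the two conditions for a discrete probability function:
    each probability lies in [0, 1] and the probabilities sum to 1."""
    in_range = all(0.0 <= p <= 1.0 for p in pmf.values())
    sums_to_one = abs(sum(pmf.values()) - 1.0) < tol
    return in_range and sums_to_one

# Fair coin: two mutually exclusive, collectively exhaustive outcomes
coin = {"heads": 0.5, "tails": 0.5}
print(is_valid_pmf(coin))                           # True
print(is_valid_pmf({"heads": 1.2, "tails": -0.2}))  # False: probabilities outside [0, 1]
```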
A two-sided coin is the classic discrete random variable (Bernoulli). If we flip a coin, there are two possible outcomes: heads or tails. The first condition above insists that the probability of flipping a head (or a tail) must lie between 0% and 100%; the probability of heads is 50%. The second condition insists that all probability functions must sum to 100%: 50% probability of heads plus 50% probability of tails = 100% (1.0). In other words, outcome = heads and outcome = tails cover all the possible outcomes; we aren't omitting another possible outcome.
In contrast to the case of a discrete random variable, a continuous random variable does not have exact outcomes like 1.0 or 3.5; asset returns, for example, are continuous. Strictly, we must define continuous outcomes on an interval: instead of P[X = x] we need P[x1 < X < x2]. For a continuous variable, the probability of any specific value occurring is zero.
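To make this concrete, here is a minimal sketch (my own example; the return distribution and its parameters are assumptions, not from the notes) computing an interval probability for a continuous variable:

```python
from scipy.stats import norm

# Hypothetical daily asset return: assumed normal with mean 0%, volatility 1%
mu, sigma = 0.0, 0.01

# P[x1 < X < x2] = F(x2) - F(x1), where F is the cumulative distribution function
x1, x2 = -0.01, 0.02
prob = norm.cdf(x2, loc=mu, scale=sigma) - norm.cdf(x1, loc=mu, scale=sigma)
print(prob)  # ~0.8186

# The probability of any single exact value is zero for a continuous variable
print(norm.cdf(0.01, loc=mu, scale=sigma) - norm.cdf(0.01, loc=mu, scale=sigma))  # 0.0
```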
Cumulative Distribution Function (CDF)

The CDF returns the probability that a random variable will be less than or equal to a given value. For the standard normal variable Z, the CDF is denoted by Phi:

$$\Pr(Z \le c) = \Phi(c)$$

For a discrete variable, contrast the probability of a single outcome, $\Pr(X = 3)$, with the cumulative probability $\Pr(X \le 3)$. For a continuous variable, the probability of falling within an interval is the area under the probability density function:

$$P(a \le X \le b) = \int_{a}^{b} f(x)\,dx$$
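A short sketch (my own illustration, not from the source) of the CDF relationships above, using the standard normal from scipy.stats:

```python
from scipy.stats import norm
from scipy.integrate import quad

# Standard normal CDF: Pr(Z <= c) = Phi(c)
c = 1.645
print(norm.cdf(c))  # ~0.95

# For a continuous variable, P(a <= X <= b) is the integral of the pdf,
# which equals the difference of CDF values F(b) - F(a)
a, b = -1.0, 1.0
integral, _ = quad(norm.pdf, a, b)
print(integral)                   # ~0.6827
print(norm.cdf(b) - norm.cdf(a))  # ~0.6827 (matches)
```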
For a discrete random variable, the probability function (probability mass function) returns the probability that X equals a specific value:

$$P(X = x_k) = f(x_k)$$
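Below is a brief sketch (my own example; the portfolio and default probability are hypothetical) of evaluating a discrete probability function, here the binomial pmf from scipy.stats:

```python
from scipy.stats import binom

# Hypothetical portfolio of 10 independent credits, each with a 2% default probability
n, p = 10, 0.02

# P(X = k): probability of exactly k defaults, from the probability mass function
for k in range(3):
    print(k, binom.pmf(k, n, p))
# 0 0.8171...
# 1 0.1667...
# 2 0.0153...
```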
Summary: discrete vs. continuous random variables

Discrete
- Are counted; finite number of outcomes
- Examples in finance: default (1, 0); frequency of loss
- Example distributions: Bernoulli (0/1), Binomial (series of i.i.d. Bernoullis), Poisson, Logarithmic

Continuous
- Are measured; can take any value within an interval
- Examples in finance: asset returns
- Example distributions: Normal, Student's t, Chi-square, F distribution, Lognormal, Exponential, Gamma, Beta, EVT distributions (GPD, GEV)
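For readers who want to experiment, the following sketch (my own mapping, not part of the original notes) pairs the distributions listed above with their scipy.stats counterparts; the shape parameters shown are arbitrary placeholders:

```python
from scipy import stats

# Continuous distributions from the summary and their scipy.stats counterparts
continuous = {
    "Normal": stats.norm,
    "Student's t": stats.t(df=10),
    "Chi-square": stats.chi2(df=5),
    "F": stats.f(dfn=5, dfd=10),
    "Lognormal": stats.lognorm(s=0.5),
    "Exponential": stats.expon,
    "Gamma": stats.gamma(a=2.0),
    "Beta": stats.beta(a=2.0, b=5.0),
    "GPD": stats.genpareto(c=0.1),
    "GEV": stats.genextreme(c=-0.1),
}

# Discrete distributions from the summary
discrete = {
    "Bernoulli (0/1)": stats.bernoulli(p=0.5),
    "Binomial": stats.binom(n=10, p=0.5),
    "Poisson": stats.poisson(mu=3.0),
    "Logarithmic (log-series)": stats.logser(p=0.5),
}

# Each (frozen) distribution exposes cdf(), rvs(), and pdf() or pmf()
print(stats.norm.cdf(0))             # 0.5
print(stats.poisson(mu=3.0).pmf(2))  # ~0.224
```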