


Physics 127a: Class Notes

Lecture 2: A Simple Probability Example


The "equally likely" of the fundamental postulate reminds us of a coin flip, and in fact a very simple probability problem actually gives us useful insights into statistical mechanics issues. The problem is: what is the probability of getting m heads in a sequence of N flipped coins? We will denote this P(m, N). In particular, we will compare the probability of getting N/2 heads (m = N/2) with the probability of getting N heads (m = N). This is actually the same problem as a number of simple, but not uninteresting, statistical mechanics problems:

- For a magnetic system of N noninteracting magnetic moments that can each point either up or down (e.g. a set of spin-1/2 atoms), what is the probability of a state with m up moments and N − m down moments, i.e. a magnetization of 2m − N times the individual moment?
- Given an ideal gas of N molecules, what is the probability of finding m molecules in one half of the box and N − m molecules in the other half, and in particular the probability of an equal number of molecules in each half, compared with the probability of all the molecules being in one half?
- A drunkard's walk along a path, or a one-dimensional random walk: at each step the drunkard may go one step forwards or one step backwards, with equal probability. What is the probability of finding the drunkard at position x after N steps? This is P(m, N) with x = 2m − N (m forward steps and N − m backward steps).

Back to the coin problem. Consider first N = 4 coins. Since any sequence, e.g. HHTT or HTHT, is equally likely (assuming unbiased coins), we can calculate P(m, 4) by counting the number of sequences, or microstates, that are consistent with each macrostate m:

m   microstates                              no.   P(m, 4)
0   TTTT                                     1     1/16 = 0.0625
1   HTTT, THTT, TTHT, TTTH                   4     4/16 = 0.25
2   HHTT, HTHT, THHT, HTTH, THTH, TTHH       6     6/16 = 0.375
3   THHH, HTHH, HHTH, HHHT                   4     4/16 = 0.25
4   HHHH                                     1     1/16 = 0.0625
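As a cross-check on this counting, here is a minimal Python sketch (not part of the original notes) that enumerates all 2^N equally likely sequences for N = 4 and tallies how many fall in each macrostate m:

```python
# Enumerate all 2^N coin sequences for N = 4 and count the microstates
# consistent with each macrostate m (the number of heads).
from itertools import product

N = 4
counts = {m: 0 for m in range(N + 1)}
for seq in product("HT", repeat=N):   # all 2^4 = 16 equally likely sequences
    counts[seq.count("H")] += 1

for m, n in counts.items():
    print(f"m = {m}: {n} microstates, P(m, 4) = {n / 2**N:.4f}")
```

The output reproduces the 1, 4, 6, 4, 1 counts and the probabilities in the table above.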

Already P(N/2) is several times P(N), and the ratio increases rapidly as N increases. The general expression for P(m, N) for two outcomes A and B with individual probabilities p_A and p_B = 1 − p_A is

P(m, N) = p_A^m \, p_B^{N-m} \, \frac{N!}{m!\,(N-m)!}    (1)

known as the binomial distribution. Here the first factor is the probability of a particular sequence with m outcomes A, and the second factor counts how many such sequences there are. The coin problem is this result with p_A = p_B = 1/2. It is then possible to show directly (and see below) that for large N

P(N/2, N) \simeq \sqrt{\frac{2}{\pi N}}

and P(N, N) = 2^{−N}, so that for large N we see P(N/2, N) ≫ P(N, N). You will investigate P(m, N) for large N using Stirling's approximation for factorials of large numbers in the homework.
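To see these magnitudes concretely, the following sketch (not from the notes; the helper name log_P is illustrative) evaluates Eq. (1) at p_A = p_B = 1/2 through logarithms, using math.lgamma so that large N does not underflow:

```python
# Evaluate the binomial probability P(m, N) for unbiased coins via log-gamma,
# and compare P(N/2, N) with sqrt(2/(pi N)) and with ln P(N, N) = -N ln 2.
import math

def log_P(m, N):
    """ln P(m, N) from Eq. (1) with p_A = p_B = 1/2."""
    return (math.lgamma(N + 1) - math.lgamma(m + 1)
            - math.lgamma(N - m + 1) - N * math.log(2))

for N in (4, 100, 10_000):
    print(f"N = {N}:",
          f"P(N/2, N) = {math.exp(log_P(N // 2, N)):.3e},",
          f"sqrt(2/piN) = {math.sqrt(2 / (math.pi * N)):.3e},",
          f"ln P(N, N) = {-N * math.log(2):.1f}")
```

Already at N = 100 the √(2/πN) estimate is accurate to about a percent, while P(N, N) is utterly negligible.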

We can actually get the interesting properties from a couple of simple arguments and a profound result. The simple arguments are:

Mean: The mean or average value of the number of heads is

\bar{m} = \Big\langle \sum_{i=1}^{N} x_i \Big\rangle    (2)

where x_i is a random variable which takes on the values 1 (heads) with probability 1/2 and 0 (tails) with probability 1/2. The angle brackets ⟨ ⟩ stand for the ensemble average. We can interchange the order of the sum and the average, so

\bar{m} = \sum_{i=1}^{N} \langle x_i \rangle = \frac{N}{2}.    (3)
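A quick Monte Carlo check of Eq. (3), as a sketch with illustrative parameters (not from the notes):

```python
# Estimate the mean number of heads for N = 100 coins by simulation;
# each x_i is an independent 0/1 coin, so the average should be near N/2 = 50.
import random

N, trials = 100, 20_000
mean_m = sum(sum(random.randint(0, 1) for _ in range(N))
             for _ in range(trials)) / trials
print(mean_m)  # typically within ~0.1 of 50
```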

Variance: The variance or mean square fluctuation in the number of heads is

\sigma_N^2 = \Big\langle \Big( \sum_{i=1}^{N} x_i - \bar{m} \Big)^2 \Big\rangle = \Big\langle \Big( \sum_{i=1}^{N} \big( x_i - \tfrac{1}{2} \big) \Big)^2 \Big\rangle    (4)

= \Big\langle \sum_{i=1}^{N} \big( x_i - \tfrac{1}{2} \big) \sum_{j=1}^{N} \big( x_j - \tfrac{1}{2} \big) \Big\rangle    (5)

= \sum_{i=1}^{N} \Big\langle \big( x_i - \tfrac{1}{2} \big)^2 \Big\rangle + \sum_{i=1}^{N} \sum_{\substack{j=1 \\ j \neq i}}^{N} \Big\langle \big( x_i - \tfrac{1}{2} \big) \big( x_j - \tfrac{1}{2} \big) \Big\rangle.    (6)
Interchanging the order of averaging and summing as before, the first term is N/4 (N terms each equal to 1/4) and the second gives zero since the two factors are each equally likely to be ±1/2. So σ_N = √N/2, which gives the width of the distribution.

Finally, we use the central limit theorem, which tells us that the probability distribution of a quantity Q formed as the sum of a large number of random variables with any distribution (with finite mean and variance) is Gaussian, i.e.

P(Q) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-(Q-\bar{Q})^2/2\sigma^2}    (7)

with \bar{Q} the mean and \sigma^2 the variance. Thus for large N we have for the coin toss problem, since m is the sum of N random variables x_i,

P(m, N) = \frac{1}{\sqrt{2\pi\sigma_N^2}} \, e^{-(m-N/2)^2/2\sigma_N^2}    (8)
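As an illustrative check (not from the notes), the sketch below simulates many runs of N = 100 flips, confirms the width σ_N = √N/2 = 5, and compares the histogram of m with Eq. (8):

```python
# Compare the simulated distribution of heads m (N = 100 flips per trial)
# with the Gaussian of Eq. (8), whose width is sigma_N = sqrt(N)/2.
import math, random
from collections import Counter

N, trials = 100, 50_000
samples = [sum(random.randint(0, 1) for _ in range(N)) for _ in range(trials)]

mean = sum(samples) / trials
sigma = math.sqrt(sum((m - mean) ** 2 for m in samples) / trials)
print(f"mean = {mean:.2f} (expect 50), sigma = {sigma:.2f} (expect 5.00)")

sigma_N = math.sqrt(N) / 2
hist = Counter(samples)
for m in range(45, 56):  # the region around the most probable value N/2
    gauss = (math.exp(-(m - N / 2) ** 2 / (2 * sigma_N ** 2))
             / math.sqrt(2 * math.pi * sigma_N ** 2))
    print(f"m = {m}: simulated {hist[m] / trials:.4f}, Eq. (8) {gauss:.4f}")
```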

with σ_N = √N/2 as we have just calculated. Note that this expression for P(m, N) is accurate only for m not too far from the most probable value N/2 (i.e. not too many σ_N away), which however is the only region where the probability is significantly nonzero. Where the probability is very small the result is inaccurate. For example, for m = N we know P(N, N) = 2^{−N} = e^{−N ln 2}, which is very different from the result given by Eq. (8). As N gets large, the width of the probability distribution of the number of heads m also becomes large, proportional to √N. But this is small compared to the range N, and becomes very small in this comparison for N equal to the number of molecules in a macroscopic object, say 10^{24} (cf. the gas-in-the-box problem). The number of heads is an extensive quantity: the mean value grows proportional to the size of the system N.
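To make the tail inaccuracy concrete, this sketch (illustrative, not from the notes) compares the exact ln P(N, N) = −N ln 2 with the value Eq. (8) would give at m = N:

```python
# In the far tail Eq. (8) fails: its exponent at m = N is -N/2 (plus a small
# prefactor term), while the exact answer is ln P(N, N) = -N ln 2 ~ -0.693 N.
import math

for N in (10, 100, 1000):
    sigma2 = N / 4  # sigma_N^2 = N/4
    ln_gauss = -(N / 2) ** 2 / (2 * sigma2) - 0.5 * math.log(2 * math.pi * sigma2)
    print(f"N = {N}: exact {-N * math.log(2):.1f}, Eq. (8) gives {ln_gauss:.1f}")
```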

It is often convenient to introduce an intensive variable, such as the fraction of heads f = m/N, with a mean that does not grow with system size. Since it is natural to consider f as a continuous variable for large N, we introduce the probability density p(f) such that the probability of a fraction between f and f + df for small df is p(f) df. Then since P(fN, N) = p(f) · (1/N) we have

p(f) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-(f-1/2)^2/2\sigma^2}    (9)

with σ = 1/(2√N). For the intensive variable the probability distribution gets narrower, proportional to N^{−1/2}, as N gets large. Again the Gaussian distribution is only good not too many σ away from the most probable value, i.e. for f within of order N^{−1/2} of 1/2, but again this is the only region where p(f) is significantly nonzero. In fact the probability of any fraction f not equal to one half (e.g. f = 1/3 or f = 1/29) is exponentially small for large N, i.e. of order e^{−aN} with a a number of order unity that depends on f but not on N.
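This exponential smallness can be checked directly from the binomial distribution, Eq. (1); the sketch below (illustrative, reusing the hypothetical log_P helper from above) computes a = −ln P(fN, N)/N for f = 1/3 and shows it settling to a constant of order unity:

```python
# For f = 1/3 the rate a = -ln P(fN, N) / N tends to a constant
# (ln 2 + f ln f + (1-f) ln(1-f), about 0.057), so P ~ e^{-aN}.
import math

def log_P(m, N):
    """ln P(m, N) for unbiased coins (Eq. (1) with p_A = p_B = 1/2)."""
    return (math.lgamma(N + 1) - math.lgamma(m + 1)
            - math.lgamma(N - m + 1) - N * math.log(2))

for N in (30, 300, 3000, 30_000):
    m = N // 3  # number of heads corresponding to f = 1/3
    print(f"N = {N}: a = {-log_P(m, N) / N:.4f}")
```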

For example, the probability of finding on one side of the box a fraction f of gas molecules that is not equal to one half (e.g. one third) is of order

P(f \neq \tfrac{1}{2}) \sim 10^{-10^{24}}    (10)

i.e. 10^{−1000000000000000000000000}. (Note that a × 10^{24} is of order 10^{24} for any reasonable number a!) This is a number that for all physical purposes is zero (much, much, much smaller than one over the number of atoms in the universe, etc.). This way that probabilities become certainties for N large, corresponding to the number of molecules in a macroscopic sample, will be a recurring theme in the application of statistical mechanics to macroscopic systems. In general we will find for the probability distribution of macroscopic quantities:

- For extensive quantities: a mean proportional to N (e.g. the total energy), with fluctuations that are relatively small, of order √N, with a Gaussian distribution about the mean;
- For intensive quantities: a mean of order unity (e.g. the temperature), with small fluctuations, of order 1/√N, again with a Gaussian distribution about the mean.

The distribution is so narrow that we can replace averages of quantities over the distribution by the result evaluated at the most probable value. Fluctuations far away from the mean have a probability that is exponentially small in N, and for N ≈ 10^{24} can be considered as never happening.
