
Preface

This book is a natural outgrowth of probability and distribution theory. It is written as a textbook for courses at the graduate or postgraduate level in all fields of science and engineering. The text covers a wide range of material, so readers can select topics according to their needs for their respective university or college examinations. It is enriched with many interesting and challenging problems, both as solved examples and as exercises, and is also suitable for self-study and for short, intensive courses in continuing education. The book is written in such a manner that no previous knowledge of probability and distribution theory is required to understand the material. Theoretical aspects of probability and its practical applications are discussed in every chapter. The book offers theoretical problems as well as numerical illustrations which can be profitably used by students at the graduate, postgraduate and research levels. The introduction of each chapter contains motivational points, real-life examples and theoretical demonstrations which will develop the reader's interest in probability and distribution theory. Several pictorial demonstrations presented in this book will help the reader towards easy understanding. Moreover, the solved problems and exercises will help the reader in self-practice and self-improvement; they are designed to help students prepare for university and college examinations.

   

  / 

‘–‡– Pages Preface...................................................................................................I Acknowledgement .................................................................................... II Content................................................................................................IIIíIV Chapter 1 Probability Theory««««««««««««««««««««í x History............................................................................... . 1 x Important terminologies........................................................ 2 x &ODVVLFDOGHILQLWLRQDQG.ROPRJRURY¶Vdefinition of Probability.............................................................................. í x Important Theorems on Probability........................................ 6í9 x Illustrative examples and Exercise ........................................ 9í15 Chapter 2 &RQGLWLRQDO3UREDELOLW\DQG%D\HV¶7KHRUHP............................................í x Introduction...................................................................................16 x Multiplication rule..........................................................................19 x Stochastic Independence...........................................................í x Theorem on total probability.....................................................20í22 x %D\H¶V7heorem...............................................................................23 x Illustrative examples and exercise.............................................24í27 Chapter 3 Random Variables, Moments and Expectations...........................................28í49 x Definition of Random variable, discrete r.v., pmf....................28í30 x Pdf, Cdf....................................................................................31í32 x Expectation and properties........................................................32í34 x Mean and Variance....................................................................34í36 x 
Moments.....................................................................................36-37 x Moment generating function .....................................................37-38 x &KHE\VKHY¶V,QHTXDOLW\..............................................................38-39 x Joint Probability Distribution....................................................39í45 /// 

x Illustrative examples and exercise

45í49

Chapter 4 Discrete Probability Distributions..........................................................49í70 x Introduction.....................................................................................49 x Geometric Distribution............................................... 49í53 x Binomial Distribution.................................................................53-55 x Negative Binomial Distribution................................................55í57 x Multinomial Distribution ..........................................................57í59 x Poisson Distribution.............................................................59í61 x Hyper-geometric distribution.................................................61í62 x Illustrative examples and exercise...........................................62í70 Chapter 5 Continuous Distributions..........................................................................í x Introduction........................................................................................71 x Uniform Distribution6.................................................................í3 x Gamma Distribution..................................................................í x Exponential Distribution .............................................................7í77 x Laplace (Double Exponential) Distribution..................................77-78 x Weibul Distribution......................................................................79í80 x Normal Distribution......................................................................80-90 x Log Normal Distribution...................................................................90 x Bivariate Normal Distribution .....................................................90í93 x Illustrative examples and exercise................................................93í99 Appendix x Standard statistical Table of Normal curve.......................................100 
Bibliography....................................................................................................101

/s 

 

Chapter 1 Probability Theory

Information: the negative logarithm of probability ----- Claude Shannon

History: Although gambling dates back thousands of years, the birth of modern probability is considered to be a 1654 letter from the French aristocrat and notorious gambler Chevalier de Méré to the mathematician and philosopher Blaise Pascal. In essence the letter said: I used to bet even money that I would get at least one 6 in four rolls of a fair die. The probability of this is 4 times the probability of getting a 6 in a single roll, i.e., 4 × (1/6) = 2/3; clearly I had an advantage, and indeed I was making money. Now I bet even money that within 24 rolls of two dice I get at least one double 6. This has the same advantage (24 × (1/36) = 2/3), but now I am losing money. Why? As Pascal discussed in his correspondence with Pierre de Fermat, de Méré's reasoning was faulty; after all, if the number of rolls were 7 in the first game, the same logic would give the nonsensical probability 7/6. We'll come back to this later.

1. Introduction and Meaning: In our daily life we often face situations which cannot be predicted with absolute certainty, and sometimes we cannot even say what the future event will be. The likelihood of occurrence of an event is termed its probability. Initially the application of probability was restricted to games of chance, but in course of time it has been incorporated into business, defence and decision-making platforms. Probability theory has the great advantage of quantifying how likely an event is.

Example: A patient is admitted to the hospital and a potentially life-saving drug is administered. The following dialog takes place between the nurse and a concerned relative.
RELATIVE: Nurse, what is the probability that the drug will work?
NURSE: I hope it works; we'll know tomorrow.
RELATIVE: Yes, but what is the probability that it will?
NURSE: Each case is different; we have to wait.
RELATIVE: But let's see, out of a hundred patients that are treated under similar conditions, how many times would you expect it to work?
NURSE (somewhat annoyed): I told you, every person is different; for some it works, for some it doesn't.
RELATIVE (insisting): Then tell me, if you had to bet whether it will work or not, which side of the bet would you take?
NURSE: If I had to bet, I'd bet that it will work.

 

1.2. Important Terminologies
1.2.1. Random Experiment: A random experiment is an experiment whose possible outcomes depend on chance and which can be repeated under identical conditions, but whose outcome on any particular trial cannot be predicted in advance. A random experiment is denoted by E. A trial is any particular performance of a random experiment. E.g. E: tossing one or many unbiased coins simultaneously, throwing one or many unbiased dice, etc.
1.2.2 Sample space: The set which consists of all possible outcomes of a random experiment E is called the sample space of E and is denoted by S(E).
e.g. 1. E: Tossing two unbiased coins simultaneously; S(E) = {HH, HT, TH, TT}.
2. E: Throwing one unbiased die; S(E) = {1, 2, 3, 4, 5, 6}.
1.2.3. Event: Any subset of the sample space of a random experiment is called an event.
Note: 1. If n unbiased coins are tossed simultaneously, the total number of elements in the sample space is 2ⁿ. E.g. if n = 2 then S(E) = {HH, HT, TH, TT}, i.e. the sample space has 2² = 4 elements.
2. If n unbiased dice are thrown, the total number of elements in the sample space is 6ⁿ.
Conclusion of Note: If a random experiment has m outcomes in a single trial, then the total number of outcomes in n trials is mⁿ.

Suppose 2 unbiased coins are tossed simultaneously; then the sample space is S(E) = {HH, HT, TH, TT}. Let A be the event "at least one Head". Then A = {HT, TH, HH} and obviously A ⊆ S(E).
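The counting in Note 1 above can be checked by enumerating the sample space directly. A minimal sketch in Python (the coin labels 'H'/'T' are just illustrative):

```python
from itertools import product

# Enumerate the sample space for tossing n unbiased coins;
# Note 1 says it has 2**n elements.
n = 2
S = [''.join(t) for t in product('HT', repeat=n)]
assert len(S) == 2**n

# Event A: "at least one Head" is a subset of S
A = [s for s in S if 'H' in s]
print(S)
print(A)
```

Raising `n` confirms the general count 2ⁿ; replacing `'HT'` with `'123456'` gives the 6ⁿ count for dice.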

1.2.4 Mutually Exclusive Events: If two or more events cannot take place at the same time, they are called mutually exclusive events. E.g. if one card is drawn from a pack of cards, it may be a Spade or a Heart or a Diamond or a Club; these 4 suits are mutually exclusive events.
1.2.5 Equally Likely Events: Two or more events are said to be equally likely if none can be expected in preference to the others. As an example, the outcomes 1, 2, 3, 4, 5, 6 of throwing an unbiased die are considered equally likely events.

 

• The number of ways in which n different things can be arranged in r different groups is r(r+1)(r+2)(r+3)…(r+n−1), blank groups being admissible.
• The number of ways in which n different things may be distributed into r different parcels where blank lots are acceptable is rⁿ.
• The number of ways in which n different things may be distributed among r different parcels where no blank lots are admissible is
rⁿ − C(r,1)(r−1)ⁿ + C(r,2)(r−2)ⁿ − …… + (−1)^(r−1) C(r, r−1)·1ⁿ.
e.g. 1) If 7 apples are to be distributed into 3 different boxes in such a way that at least one apple must be in each box, the total number of ways of distribution is 3⁷ − C(3,1)·2⁷ + C(3,2)·1⁷ = 2187 − 384 + 3 = 1806.
e.g. 2) If 15 dates are to be named by 7 days, the total number of possible ways is 7¹⁵.
• The number of ways in which n things of the same sort can be distributed over r distinct parcels where blank lots are not admissible is C(n−1, r−1).
• The number of ways in which n things of the same sort can be distributed over r different parcels where blank lots are permissible is C(n+r−1, r−1).
e.g. 1) If 7 identical balls are to be distributed among 3 boxes where each box must contain at least 1 ball, the total number of ways of distribution = C(7−1, 3−1) = C(6, 2) = 15.
e.g. 2) If 7 identical balls are to be distributed among 3 different boxes where boxes may remain empty, the total number of distributions = C(7+3−1, 3−1) = C(9, 2) = 36.
1.4.4 Combination: If r objects are to be selected out of n objects, the total number of selections is called a combination:
C(n, r) = n! / (r!(n−r)!).
If 2 balls are to be selected from 5 balls, the total number of selections is C(5, 2) = 5!/(2!·3!) = 10.
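These counting formulas are easy to verify by brute force. The sketch below (with the box labels chosen arbitrarily) checks the apple example against direct enumeration, and the two identical-ball examples against the stars-and-bars formulas:

```python
from itertools import product
from math import comb

# Distinct apples into distinct boxes, no box empty:
# inclusion-exclusion formula vs. brute-force enumeration
formula = 3**7 - comb(3, 1) * 2**7 + comb(3, 2) * 1**7
brute = sum(1 for boxes in product(range(3), repeat=7)
            if set(boxes) == {0, 1, 2})   # every box used
assert formula == brute == 1806

# Identical balls into distinct boxes (stars and bars):
assert comb(7 - 1, 3 - 1) == 15       # each box non-empty
assert comb(7 + 3 - 1, 3 - 1) == 36   # empty boxes allowed
print(formula)
```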

1.5. Classical Definition of Probability
The classical definition was first proposed by Marquis Pierre Simon de Laplace (1749–1827), who presented it in his publication Théorie analytique des probabilités. He defined the probability of an event in the following way: if the sample space S(E) of a random experiment E contains a finite number of elements, say n, and an event A ⊆ S(E) has m favourable cases, then by the classical definition the probability of the event A is given by
P(A) = (number of cases favourable to A) / (total number of possible outcomes in the sample space) = m/n.

Illustration 1. Random Experiment E: tossing 2 coins simultaneously. S(E) = {HH, HT, TH, TT}. Event A: getting at least one H = {HT, TH, HH}. Therefore n = n(S(E)) = 4 and m = n(A) = 3. Hence P(A) = 3/4.

Illustration 2. Random Experiment E: drawing 2 cards from a well-shuffled pack of cards. Then n = total number of possible outcomes = C(52, 2). For an event A with m = C(4, 1) × C(4, 1) favourable cases (for instance, drawing one ace and one king), the required probability is
P(A) = [C(4, 1) × C(4, 1)] / C(52, 2).
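Assuming the event in Illustration 2 is "one ace and one king" (the text gives only the counts, so the event name is an assumption), the classical probability m/n can be computed with `math.comb`:

```python
from math import comb

# Classical probability P(A) = m/n for Illustration 2, assuming the
# event A is "one ace and one king" (only the counts appear in the text)
m = comb(4, 1) * comb(4, 1)   # favourable cases
n = comb(52, 2)               # all 2-card hands
print(m, n, round(m / n, 5))
```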

Limitations of the Classical Definition: The classical definition fails when the sample space becomes infinite; the probabilities of all events of such a random experiment would then tend to 0, which cannot be accepted in practice. Moreover, if the cases in the sample space are not equally probable (equally likely), it becomes difficult to find the probability of an event from that sample space.

1.6 Kolmogorov's Definition of Probability
Let S(E) be any nonempty set and let α be a nonempty collection of subsets of S(E). Then α is called a σ-algebra if
i) A ∈ α implies Ā ∈ α;
ii) Ai ∈ α, i = 1, 2, ….., n implies ⋃ᵢ Ai ∈ α.
Let S(E) be a nonempty set and let α be a σ-algebra of random events (subsets) defined on S(E). Then the probability P(A) of an event A ∈ α is a real function defined on α such that
i) P(S(E)) = 1;
ii) P(Ai) ≥ 0 for all i = 1, 2, ….., n;
iii) P(⋃ᵢ Ai) = Σᵢ P(Ai) for pairwise disjoint events Ai.

 

Hence

P(A − B) = P(A) − P(A∩B).

The proof of (ii) is left as an exercise for the reader.

1.6.4 Theorem (Boole's Inequality): For any two nonempty events A and B, P(A∪B) ≤ P(A) + P(B).
Proof: From Theorem 1.6.1 we know that P(A∪B) = P(A) + P(B) − P(A∩B). Since P(A∩B) ≥ 0, it follows that P(A∪B) ≤ P(A) + P(B). Equality holds when P(A∩B) = 0.

1.6.5 Theorem: For 3 nonempty events A, B and C (not necessarily mutually exclusive) the total probability is given by
P(A∪B∪C) = P(A) + P(B) + P(C) − P(A∩B) − P(B∩C) − P(A∩C) + P(A∩B∩C).
The proof is left as an exercise for the reader.

1.6.6 Theorem (Generalised Addition Law): If A1, A2, A3, ……, An are not necessarily mutually exclusive events, then
P(A1∪A2∪……∪An) = Σᵢ P(Ai) − Σ_{1≤i<j≤n} P(Ai∩Aj) + …… + (−1)^(n−1) P(A1∩A2∩……∩An).
Proof: From Theorem 1.6.1 the result holds for n = 2. We shall prove the theorem with the help of the Principle of Mathematical Induction. Let the result hold for n = r:
P(A1∪A2∪……∪Ar) = Σᵢ P(Ai) − Σ_{1≤i<j≤r} P(Ai∩Aj) + …… + (−1)^(r−1) P(A1∩A2∩……∩Ar).
Then
P(A1∪……∪Ar∪A(r+1)) = P[(A1∪……∪Ar) ∪ A(r+1)]
= P(A1∪……∪Ar) + P(A(r+1)) − P[(A1∪……∪Ar) ∩ A(r+1)]   [since the result holds for n = 2]
= P(A1∪……∪Ar) + P(A(r+1)) − P[(A1∩A(r+1)) ∪ …… ∪ (Ar∩A(r+1))]   [distributive law].
Expanding the first and third terms by the induction hypothesis and collecting the single events, the pairs, the triples, and so on, the sums of like order combine to give
= Σ_{i=1}^{r+1} P(Ai) − Σ_{1≤i<j≤r+1} P(Ai∩Aj) + …… + (−1)^r P(A1∩A2∩……∩A(r+1)).
The result holds for n = r + 1 whenever it holds for n = r. Hence, by PMI, the result holds for all n ∈ ℕ.
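The generalised addition law can be sanity-checked on a small sample space. Here is a sketch for n = 3 on a single die roll, using exact fractions; the three events are chosen only for illustration:

```python
from fractions import Fraction

# Check the generalised addition law for n = 3 on one die roll;
# the events A, B, C below are illustrative choices.
S = set(range(1, 7))
A = {x for x in S if x % 2 == 0}   # even
B = {x for x in S if x > 3}        # greater than 3
C = {x for x in S if x % 3 == 0}   # multiple of 3

def P(E):
    return Fraction(len(E), len(S))

lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(B & C) - P(A & C)
       + P(A & B & C))
assert lhs == rhs
print(lhs)
```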

1.7. Illustrative examples:
1.7.1 An urn contains 5 Red, 3 Green and 4 Yellow balls. Three balls are drawn at random. Determine the probabilities that i) all 3 balls are red, ii) one is red and 2 are green, iii) at least one is green.
Soln. Total number of balls = 12. Therefore the total number of mutually exclusive and equally likely cases = C(12, 3) = 220.
i) Let A be the event of drawing 3 red balls. The number of cases favourable to A = C(5, 3) = 10. Hence, by the classical definition, P(A) = 10/220 = 0.0455.
ii) Let B be the event of drawing 1 Red and 2 Green balls. The number of cases favourable to B = C(5, 1) × C(3, 2) = 15. Required probability P(B) = 15/220 = 0.0682.
iii) Let C be the event of drawing at least one Green ball. Then C̄ is the event of drawing no green ball. Cases favourable to C̄: (2R, 1Y) + (2Y, 1R) + (3R) + (3Y). Hence
n(C̄) = C(5,2)×C(4,1) + C(4,2)×C(5,1) + C(5,3) + C(4,3) = 40 + 30 + 10 + 4 = 84.
Hence P(C̄) = 84/220 = 0.3818, and the required P(C) = 1 − P(C̄) = 1 − 0.3818 = 0.6182.
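The counts in Example 1.7.1 can be recomputed directly with binomial coefficients:

```python
from math import comb

# Recomputing Example 1.7.1 with binomial coefficients
total = comb(12, 3)                        # 220 equally likely cases
p_i = comb(5, 3) / total                   # all three red
p_ii = comb(5, 1) * comb(3, 2) / total     # one red, two green
p_iii = 1 - comb(9, 3) / total             # at least one green, via complement
print(total, round(p_i, 4), round(p_ii, 4), round(p_iii, 4))
```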

1.7.2 Suppose 3 students A, B, C are in a batch. What is the probability that i) all three have different birthdays, ii) exactly 2 of them have the same birthday, iii) at least 2 have the same birthday?

 

Soln. The 3 birthdays may fall on 365 days in (365)³ ways, which is the number of mutually exclusive and equally likely cases.
i) Let X be the event that all three have different birthdays. A may have his birthday on any of 365 days; B may then have his birthday on any of the remaining 364 days; and since 363 days remain for C's birthday, C may have his on any of 363 days. Therefore the number of cases favourable to X = 365 × 364 × 363, and the required probability is
P(X) = (365 × 364 × 363)/(365)³ = (364 × 363)/(365 × 365) = 0.9918.
ii) Let Y be the event that exactly 2 have the same birthday. Among the 3 students, the pair sharing a birthday may be selected in C(3, 2) = 3 ways. The pair may have its common birthday on any of the 365 days, and the remaining student on any of the other 364 days. So the number of cases favourable to Y = 3 × 365 × 364, and the required probability is
P(Y) = (3 × 365 × 364)/(365)³ = 0.0082.
iii) Let Z be the event that at least 2 have the same birthday. Then Z̄ denotes the event that no two share a birthday, i.e. Z̄ = X. Therefore P(Z) = 1 − P(Z̄) = 1 − 0.9918 = 0.0082.

1.7.3 From a lottery of 20 tickets numbered 1, 2, 3, ….., 20, three tickets are drawn at random. What is the probability that the numbers drawn are in Arithmetic Progression?
Soln. If 3 tickets are drawn from 20 tickets, the total number of mutually exclusive and equally likely cases = C(20, 3) = 1140. Let A be the event of drawing 3 numbers in A.P. If the common difference (CD) is 1, the first sequence is (1, 2, 3) and the last is (18, 19, 20), i.e. 18 cases. If the CD is 2, the first sequence is (1, 3, 5) and the last is (16, 18, 20), i.e. 16 cases.

 

In a similar way, if the CD is 9 the cases are (1, 10, 19) and (2, 11, 20), i.e. 2 cases. Therefore the total number of cases favourable to A = 2 + 4 + 6 + …… + 18 = 90, and the required probability is
P(A) = 90/1140 = 0.0789.
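Example 1.7.3 is small enough to check by exhaustive enumeration; `itertools.combinations` yields each triple in increasing order, so the A.P. condition is just that the middle gaps are equal:

```python
from itertools import combinations

# Brute-force check of Example 1.7.3: three of the tickets 1..20 in A.P.
triples = list(combinations(range(1, 21), 3))   # sorted triples
favourable = sum(1 for a, b, c in triples if b - a == c - b)
print(favourable, len(triples), round(favourable / len(triples), 4))
```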

1.7.4 If 20 dates are chosen at random from a calendar year, find the probability that exactly 3 of the dates are Sundays.
Soln. By the counting principle, each of the 20 dates may fall on any of the 7 days of the week, so the total number of ways of naming the 20 dates is 7²⁰. Let A be the event that exactly 3 of the dates are Sundays. The 3 Sunday dates may be selected in C(20, 3) = 1140 ways, and the remaining (20 − 3) = 17 dates are to be named from the remaining (7 − 1) = 6 days, in 6¹⁷ ways. Therefore the total number of cases favourable to A is 1140 × 6¹⁷, and the required probability is
P(A) = (1140 × 6¹⁷)/7²⁰.
(Ans)
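The symbolic answer of Example 1.7.4 can be evaluated numerically in one line:

```python
from math import comb

# Numeric value of Example 1.7.4: P(exactly 3 of 20 dates are Sundays)
p = comb(20, 3) * 6**17 / 7**20
print(round(p, 4))
```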

1.7.5 Two students X and Y appeared for an examination. The probabilities that X and Y pass the examination are 0.05 and 0.10 respectively, and the probability that both qualify is 0.02. Find the probability that i) both will not qualify, ii) at least one of them will qualify, iii) only one of them will qualify.
Soln. Define the events A: X will qualify; B: Y will qualify; A∩B: both will qualify.
i) Then Ā∩B̄ is the event that both fail to qualify:
P(Ā∩B̄) = 1 − P(A∪B) = 1 − [P(A) + P(B) − P(A∩B)] = 1 − [0.05 + 0.10 − 0.02] = 0.87.
ii) P(at least one will qualify) = 1 − P(both will not qualify) = 1 − 0.87 = 0.13.
iii) Only one of them qualifying means either "A qualifies and B does not" or "B qualifies and A does not"; these two events are mutually disjoint. Hence
P(only one will qualify) = P(Ā∩B) + P(A∩B̄) = [P(B) − P(A∩B)] + [P(A) − P(A∩B)] = 0.08 + 0.03 = 0.11.

 

1.7.8 When a computer goes down, there is a 70% chance of a hardware problem and a 25% chance of a software problem. There is an 85% chance that it goes down due to a hardware or a software problem. What is the probability that both of these problems are at fault?
Soln. Let A be the event of having a hardware problem and B the event of having a software problem, so that A∪B is the event that the computer goes down for either fault and A∩B denotes that the computer goes down for both faults. From the given conditions, P(A) = 0.7, P(B) = 0.25, P(A∪B) = 0.85. By the addition theorem of probability,
P(A∪B) = P(A) + P(B) − P(A∩B)
or, 0.85 = 0.7 + 0.25 − P(A∩B)
or, P(A∩B) = 0.95 − 0.85 = 0.10.
(Ans)

Exercise
1. An urn contains 3 red, 5 blue and 7 green balls. A set of 3 of the balls is randomly selected. (a) What is the probability that all 3 of the selected balls are red? (b) What is the probability that all 3 have different colours?
2. Two cards are chosen at random from a deck of 52 cards. What is the probability that they (a) are from different suits? (b) are both 4s?
3. How many people need to be in a room in order that the probability that at least two of them have the same birthday is at least 0.8?
4. A coin is flipped until heads has appeared four times. What is the probability that the fourth head appears on the tenth flip?
5. A blue die and a red die are rolled. What is the probability that (a) their sum is at least 6, given that the red die is even? (b) the red die is 3, given that their sum is at least 6?
6. Urn A has 99 red balls and 1 green ball. Urn B has 1 red ball and 99 green balls. An urn is picked at random and a ball is chosen from it. If the ball is red, what is the probability that it came from urn A?

13 

 

7. In a class, there are 5 girls who are at least 20, 7 boys who are under 20 and 9 girls who are under 20. Is it possible for the events "at least 20" and "girl" to be independent if a student is drawn at random from the class?
8. Compute the probability that a bridge hand (13 cards) has at least two cards from every suit.
9. Mr. Jones has two children. The older child is a girl. What is the probability that both children are girls?
10. In a class of 50 students, 30 opted for Mathematics, 25 opted for Statistics and 20 opted for both subjects. If a student is chosen at random, find the probability that i) the student opted for neither Statistics nor Mathematics, ii) the student opted for Statistics but not Mathematics.
11. If the letters of the word PROBABILITY are arranged in random order, what is the probability of getting the word PROBABILITY again?
12. Cards are drawn one by one from a full deck. What is the probability that exactly 10 cards will precede the first ace?
13. 12 pairs of shoes are kept in a bucket. If 6 shoes are selected at random, find the probability that there is at least one complete pair among the 6 selected shoes.
14. An eight-faced die is so biased that it is thrice as likely to show an even number as an odd number when thrown. If it is thrown twice, find the probability that the sum of the two numbers thrown is odd.
15. Sam and Harry throw a die alternately. If Sam gets a 5 he wins; if Harry gets a 4 he wins. Find the probability that Harry wins when i) Sam starts and ii) Harry starts the game.
16. If 3 arbitrary people in the street are questioned about their birthdays, what is the probability that i) all 3 were born on a Monday? ii) none was born on a Tuesday? iii) one was born on a Sunday and the other two on a Thursday?
17. A bag contains 5 white and 7 red balls. Three balls are drawn at random, twice in succession. Find the probability of drawing 3 red balls in the first draw and 3 white balls in the second draw when the balls are not replaced after the first draw.
18. The odds against A solving a certain problem are 4 to 3, and the odds in favour of B solving the same problem are 7 to 5. What is the probability that the problem will be solved?
19. Show that the probability of occurrence of exactly one of the events A and B is P(A) + P(B) − 2P(A∩B).
20. A pack of 2n cards, n of which are red and n black, is divided into two equal parts and a card is drawn from each. Find the probability that the cards drawn are of the same colour.

 

21. If n biscuits are distributed at random among k dogs, what is the probability that a particular dog receives r biscuits?
22. If n letters are randomly placed in n addressed envelopes, prove that the probability that exactly r letters are placed in the correct envelopes is given by
(1/r!) Σ_{k=0}^{n−r} (−1)^k / k!,   r = 1, 2, 3, ….., n.
23. A committee has 5 male and 4 female members. If all of them sit together, find the probability that no 2 female members sit side by side.


Chapter 2 Conditional Probability and Bayes' Theorem

… the proportion n(A∩B)/n(B) represents, among the cases favourable to B, the proportion that are favourable to A as well. Therefore it may be represented as
P(A/B) = [n(A∩B)/n(S)] / [n(B)/n(S)] = P(A∩B)/P(B),   (P(B) > 0).
Similarly, the other conditional probability is given by
P(B/A) = P(A∩B)/P(A),   (P(A) > 0).

*** P(A∩B) = P(A) × P(B/A) = P(B) × P(A/B).

Theorem 2.1 If A1, A2, ……, An are n events connected to a random experiment, then show that
P(A1∩A2∩……∩An) ≥ Σᵢ P(Ai) − (n − 1).
Proof: This theorem is the extended form of Bonferroni's inequality. For n = 2 we have
P(A1∪A2) = P(A1) + P(A2) − P(A1∩A2).
Again, P(A1∪A2) ≤ 1,
or, P(A1) + P(A2) − P(A1∩A2) ≤ 1,
or, P(A1∩A2) ≥ P(A1) + P(A2) − 1.
Let the result hold for n = r:
P(A1∩A2∩……∩Ar) ≥ Σ_{i=1}^{r} P(Ai) − (r − 1).
Now,
P(A1∩……∩Ar∩A(r+1)) = P[(A1∩……∩Ar) ∩ A(r+1)]
≥ P(A1∩……∩Ar) + P(A(r+1)) − 1   [case n = 2]
≥ Σ_{i=1}^{r} P(Ai) − (r − 1) + P(A(r+1)) − 1
= Σ_{i=1}^{r+1} P(Ai) − r.
Therefore the result holds for n = r + 1, and hence by the principle of mathematical induction
P(A1∩A2∩……∩An) ≥ Σᵢ P(Ai) − (n − 1).
(Proved)
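Theorem 2.1 can be sanity-checked empirically; a small Monte Carlo sketch with three die events chosen only for illustration (the bound here is Σ P(Ai) − 2 for n = 3):

```python
import random

# Monte Carlo check of Bonferroni's inequality (Theorem 2.1) for
# three illustrative die events: P(A1∩A2∩A3) >= P(A1)+P(A2)+P(A3) - 2
random.seed(0)
rolls = [random.randint(1, 6) for _ in range(100_000)]

def p(pred):
    return sum(1 for x in rolls if pred(x)) / len(rolls)

p1 = p(lambda x: x >= 2)                  # at least 2
p2 = p(lambda x: x >= 3)                  # at least 3
p3 = p(lambda x: x != 6)                  # not a six
p_inter = p(lambda x: 3 <= x <= 5)        # all three events at once
assert p_inter >= p1 + p2 + p3 - 2
print(p_inter >= p1 + p2 + p3 - 2)
```

The exact values are 1/2 on the left and 5/6 + 4/6 + 5/6 − 2 = 1/3 on the right, so the inequality holds with room to spare.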

 

Theorem 2.2 (Multiplication Rule): For n events A1, A2, ……, An,
P(A1∩A2∩……∩An) = P(A1) × P(A2/A1) × P(A3/A1∩A2) × …… × P(An/A1∩A2∩……∩A(n−1)).
Proof: We shall prove this with the help of PMI. Let us consider two events A1 and A2. Then
P(A1∩A2) = P(A1) × P(A2/A1),
by conditional probability. Again, taking 3 events A1, A2 and A3,
P(A1∩A2∩A3) = P(A1 ∩ (A2∩A3)) = P(A1) × P(A2∩A3 / A1) = P(A1) × P(A2/A1) × P(A3/A1∩A2).
Therefore the result holds for n = 2 and n = 3. Let us consider that the result holds for n = r:
P(A1∩A2∩……∩Ar) = P(A1) × P(A2/A1) × …… × P(Ar/A1∩A2∩……∩A(r−1)).
Now,
P(A1∩A2∩……∩Ar∩A(r+1)) = P(A1∩A2∩……∩Ar) × P(A(r+1)/A1∩A2∩……∩Ar)
= P(A1) × P(A2/A1) × …… × P(Ar/A1∩……∩A(r−1)) × P(A(r+1)/A1∩……∩Ar).
Hence the result holds for n = r + 1. Therefore, by PMI,
P(A1∩A2∩……∩An) = P(A1) × P(A2/A1) × P(A3/A1∩A2) × …… × P(An/A1∩A2∩……∩A(n−1)).







 

2. Stochastic Independence
If the occurrence of B is not affected by the occurrence of A, then from conditional probability we have P(B/A) = P(B) and P(A/B) = P(A). Thus, from conditional probability,
P(B/A) = P(A∩B)/P(A) = P(B),
or, P(A∩B) = P(A) × P(B).
Therefore two nonempty events A and B are called independent of each other if
P(A∩B) = P(A) × P(B).
Corollary 1. If A and B are two independent events then
i) Aᶜ and Bᶜ are also independent;
ii) A and Bᶜ are independent;
iii) Aᶜ and B are independent.
Proof: Since A and B are independent events, P(A∩B) = P(A) × P(B) ……(1)
i) Now, P(Aᶜ ∩ Bᶜ) = P((A∪B)ᶜ)   [De Morgan's law]
= 1 − P(A∪B) = 1 − [P(A) + P(B) − P(A∩B)]
= 1 − P(A) − P(B) + P(A)P(B)   [from (1)]
= {1 − P(A)}{1 − P(B)} = P(Aᶜ) P(Bᶜ).
Hence Aᶜ and Bᶜ are independent. (Proved)
ii) P(A ∩ Bᶜ) = P(A − B) = P(A) − P(A∩B) = P(A) − P(A)P(B) = P(A)[1 − P(B)] = P(A) P(Bᶜ).
Hence A and Bᶜ are independent. (Proved)
iii) The proof is similar.
Corollary 2. If A, B and C are mutually independent events then A∪B and C are also independent.
Proof: We need to show that P[(A∪B) ∩ C] = P(A∪B) P(C). Now,
P[(A∪B) ∩ C] = P[(A∩C) ∪ (B∩C)]
= P(A∩C) + P(B∩C) − P(A∩B∩C)
= P(A)P(C) + P(B)P(C) − P(A)P(B)P(C)
= P(C)[P(A) + P(B) − P(A)P(B)]

 

= P(C)[P(A) + P(B) − P(A∩B)] = P(C) P(A∪B)   [addition theorem].
Hence proved.
Corollary 3. For any two nonempty events A and B,
P(A∩B) ≤ P(A) ≤ P(A∪B) ≤ P(A) + P(B).
Proof left as an exercise.
Example 2.1 If 3 workers A, B and C are working independently on a project, with respective probabilities 0.5, 0.33 and 0.67 of completing the job, find the probability that the project will be completed.
Method I: Since A, B and C work independently, the project is completed if at least one of them completes it. So the required probability is
P(A∪B∪C) = P(A) + P(B) + P(C) − P(A∩B) − P(A∩C) − P(B∩C) + P(A∩B∩C)
= P(A) + P(B) + P(C) − P(A)P(B) − P(A)P(C) − P(B)P(C) + P(A)P(B)P(C)   [since A, B and C are independent]
= 0.5 + 0.33 + 0.67 − 0.5×0.33 − 0.5×0.67 − 0.33×0.67 + 0.5×0.33×0.67 = 0.88945.
(Ans)
Alternative Method: Let Aᶜ, Bᶜ and Cᶜ be the events that A, B and C respectively do not complete the project. Hence the probability that the project is not completed is
P(Aᶜ∩Bᶜ∩Cᶜ) = P(Aᶜ) P(Bᶜ) P(Cᶜ) = (1 − P(A))(1 − P(B))(1 − P(C)) = 0.5 × 0.67 × 0.33 = 0.11055.
Hence the required probability that the project will be completed is 1 − 0.11055 = 0.88945. (Ans)
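The two methods of Example 2.1 can be checked against each other in a few lines, assuming as in the text that A, B and C act independently:

```python
# Example 2.1 both ways, assuming A, B, C work independently
pA, pB, pC = 0.5, 0.33, 0.67

# Method I: inclusion-exclusion, with P(X∩Y) = P(X)P(Y) by independence
union = (pA + pB + pC
         - pA*pB - pA*pC - pB*pC
         + pA*pB*pC)

# Alternative method: complement of "nobody completes the project"
complement = 1 - (1 - pA) * (1 - pB) * (1 - pC)

assert abs(union - complement) < 1e-12
print(round(union, 5))
```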

Theorem 2.3 (Theorem on total probability): Let A1, A2, ……, An be pairwise mutually exclusive events and let A be an event which can occur only when one of the Ai occurs. Then
P(A) = P(A1) P(A/A1) + P(A2) P(A/A2) + …… + P(An) P(A/An).

[Figure 3. Partition of a sample space with mutually disjoint sets A1, A2, ……, An, each intersecting the event A.]

Proof: From the diagram we observe that the Ai, i = 1, 2, 3, ….., n are mutually disjoint sets and that A ⊆ A1∪A2∪……∪An. Therefore the sets Ai ∩ A, i = 1, 2, 3, ….., n are also mutually disjoint, and
A = (A1∩A) ∪ (A2∩A) ∪ …… ∪ (An∩A) ……(1)
By the addition theorem of probability,
P(A) = Σᵢ P(Ai ∩ A) = P(A1∩A) + P(A2∩A) + …… + P(An∩A)
= P(A1) P(A/A1) + P(A2) P(A/A2) + …… + P(An) P(A/An)   [conditional probability].
Hence proved.
Corollary 4. If A and B are two events such that P(B) ≠ 1, then
P(A/Bᶜ) = [P(A) − P(A∩B)] / [1 − P(B)].
Proof: P(A/Bᶜ) = P(A ∩ Bᶜ)/P(Bᶜ) = [P(A) − P(A∩B)] / [1 − P(B)].
Proved.

Example 2.2: In a class 67% of the students are boys and the rest are girls. The probability that a girl gets outstanding marks is 0.25 and that a boy does is 0.28. If a student is chosen at random, find the probability that the student gets outstanding marks.
Soln. Let
A: event of being a boy student
B: event of being a girl student
X: event that a student gets outstanding marks.
Therefore P(A) = 0.67, P(B) = 0.33, P(X/A) = 0.28, P(X/B) = 0.25.
Hence the required probability for a student to get outstanding marks is
P(X) = P(A)P(X/A) + P(B)P(X/B) = 0.67×0.28 + 0.33×0.25 = 0.2701. (Ans)
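The total-probability computation of Example 2.2 is a one-line sum; a small sketch with the numbers from the example:

```python
# Law of total probability: P(X) = sum_i P(A_i) * P(X / A_i).
def total_probability(priors, conditionals):
    assert abs(sum(priors) - 1.0) < 1e-9       # the A_i partition S
    return sum(p * c for p, c in zip(priors, conditionals))

# Example 2.2: boys 0.67 with P(outstanding)=0.28, girls 0.33 with 0.25.
p_outstanding = total_probability([0.67, 0.33], [0.28, 0.25])
print(round(p_outstanding, 4))                 # 0.2701
```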

Example 2.3: The chances that a doctor A will diagnose correctly are 60%. The chance that a patient will die after a correct diagnosis is 40%, and after a wrong diagnosis 70%. What is the probability that the patient will die after diagnosis?
Soln. Let the events be
A: correct diagnosis
B: wrong diagnosis
C: patient dies.
From the given conditions, P(A) = 0.6, P(B) = 0.4, P(C/A) = 0.4, P(C/B) = 0.7.
Hence the required probability is
P(C) = P(patient dies after diagnosis) = P(A)P(C/A) + P(B)P(C/B) = 0.6×0.4 + 0.4×0.7 = 0.52 (Ans)

Example 2.4: A and B are solving a problem of statistics. Their chances of solving the problem are 1/8 and 1/12 respectively. The chance of their making a common mistake is 1/1001. They obtain the same answer; find the probability that the answer is correct.
Left as an exercise. [Hint: the same answer may be correct or incorrect.]

Theorem 2.4: Bayes' Theorem
Statement: If A1, A2, ……, An are pairwise mutually exclusive events with P(Ai) ≠ 0, i = 1, 2, …, n, then for any arbitrary event A ⊆ ∪_{i=1}^{n} Ai,
P(Ai/A) = P(Ai)P(A/Ai) / P(A),
where P(A) = P(A1)P(A/A1) + P(A2)P(A/A2) + ………… + P(An)P(A/An).
Proof: From theorem 2.3 we may derive
P(A) = P(A1)P(A/A1) + P(A2)P(A/A2) + ………… + P(An)P(A/An) ................(1)
Again, from conditional probability,
P(Ai/A) = P(Ai ∩ A)/P(A) = P(Ai)P(A/Ai)/P(A). Proved.

Remark: The probabilities P(Ai) are called prior probabilities. The probabilities P(Ai/A) are called posterior probabilities. The probabilities P(A/Ai) are called likelihood probabilities.
Example 2.5: From an urn containing 3 white and 5 black balls, 4 balls are transferred into an empty urn. From this vessel a ball is drawn and is found to be white. What is the probability that out of the four balls transferred, 3 are white and 1 is black?
Soln. Let Ai be the event of transferring i white balls to the empty urn, and let A be the event of drawing a white ball from that urn. Since 8C4 = 70,
P(A0) = 5C4 / 8C4 = 5/70 = 1/14,
P(A1) = 3C1·5C3 / 8C4 = 30/70 = 3/7,
P(A2) = 3C2·5C2 / 8C4 = 30/70 = 3/7,
P(A3) = 3C3·5C1 / 8C4 = 5/70 = 1/14,
P(A4) = 0 (there are only 3 white balls).
Also P(A/A0) = 0, P(A/A1) = 1/4, P(A/A2) = 1/2, P(A/A3) = 3/4.
Therefore the probability that the drawn ball is white is
P(A) = P(A0)P(A/A0) + P(A1)P(A/A1) + P(A2)P(A/A2) + P(A3)P(A/A3)
= 0 + (3/7)×(1/4) + (3/7)×(1/2) + (1/14)×(3/4) = 3/8.
Therefore the required probability is
P(A3/A) = P(A3)P(A/A3) / P(A) = [(1/14)×(3/4)] / (3/8) = 1/7. (Ans)
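Example 2.5 can be verified exactly with rational arithmetic; the sketch below (helper names are ours, not the text's) recomputes the hypergeometric priors and the posterior.

```python
from fractions import Fraction
from math import comb

# Example 2.5: 4 of (3 white, 5 black) balls are transferred, one is drawn.
# A_i = "i white balls among the 4 transferred": hypergeometric priors.
prior = {i: Fraction(comb(3, i) * comb(5, 4 - i), comb(8, 4)) for i in range(4)}
like = {i: Fraction(i, 4) for i in range(4)}     # P(white draw / A_i)

p_white = sum(prior[i] * like[i] for i in range(4))
posterior_3 = prior[3] * like[3] / p_white       # P(A_3 / white)
print(p_white, posterior_3)                      # 3/8 and 1/7
```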

Example 2.6: It is reported that 50% of all computer chips produced are defective. Inspection ensures that only 5% of the chips legally marketed are defective. Unfortunately, some chips are stolen before inspection. If 1% of all chips on the market are stolen, find the probability that a given chip is stolen given that it is defective.
Soln. Let B1 and B2 denote the two mutually disjoint events "legally marketed chips" and "stolen chips" respectively, and let A be the event that a chip is defective.
Now, P(B2) = 1% = 0.01, P(A/B1) = 5% = 0.05 and P(A) = 50% = 0.5.
Therefore P(B1) = 1 − P(B2) = 1 − 0.01 = 0.99.
From the theorem on total probability,
P(A) = P(B1)P(A/B1) + P(B2)P(A/B2)
Or, 0.5 = 0.99×0.05 + P(B2)P(A/B2)
Or, P(B2)P(A/B2) = 0.5 − 0.0495 = 0.4505.
Therefore, by Bayes' theorem, the required probability is
P(B2/A) = P(B2)P(A/B2) / P(A) = 0.4505/0.5 = 0.901 (Ans)

Example 2.7: A bag contains six balls of different colours and a ball is drawn from it at random. A speaks the truth 3 times out of 4 and B speaks the truth 7 times out of 10. If both A and B say that a red ball was drawn, find the probability of their joint statement being true.
Soln. Let E1 and E2 denote the events that the joint statement of A and B is true and false respectively. By the given condition,
P(E1) = 1/6 and P(E2) = 5/6.
Let A be the event that A and B both assert that the red ball was drawn. Then
P(A/E1) = (3/4)×(7/10) = 21/40,
and when the statement is false each of them must lie and must name red out of the five wrong colours:
P(A/E2) = (1/4)×(1/5) × (3/10)×(1/5) = 3/1000.
Then
P(A) = P(E1)P(A/E1) + P(E2)P(A/E2)
= (1/6)×(21/40) + (5/6)×(3/1000) = 7/80 + 1/400 = 9/100.
Hence, by Bayes' theorem, the required probability is
P(E1/A) = P(E1)P(A/E1) / P(A) = (7/80) / (9/100) = 35/36 (Ans)

Example 2.8: There are 5 bags, each containing 10 balls. The ith bag contains i defective balls. A bag is drawn at random and one ball is chosen. (i) Find the probability that the chosen ball is defective. (ii) If the chosen ball is defective, find the probability that it came from the ith bag.
Soln. Let Bi denote the event of choosing the ith bag, i = 1, 2, 3, 4, 5, and let A be the event of choosing a defective ball. Then by the conditions,
P(Bi) = 1/5 and P(A/Bi) = i/10, i = 1, 2, 3, 4, 5.
Hence P(Bi)P(A/Bi) = (1/5)×(i/10) = i/50, i = 1, 2, 3, 4, 5.
(i) The probability that the chosen ball is defective is
P(A) = Σ_{i=1}^{5} P(Bi)P(A/Bi) = (1+2+3+4+5)/50 = 3/10.
(ii) The probability that the chosen ball came from the ith bag is
P(Bi/A) = P(Bi)P(A/Bi) / P(A) = (i/50)/(3/10) = i/15. (Ans)
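The same Bayes computation for Example 2.8, done exactly with fractions (variable names are illustrative):

```python
from fractions import Fraction

# Example 2.8: bag i (i = 1..5) holds i defective balls out of 10.
priors = {i: Fraction(1, 5) for i in range(1, 6)}
like = {i: Fraction(i, 10) for i in range(1, 6)}   # P(defective / bag i)

p_def = sum(priors[i] * like[i] for i in priors)   # total probability
posterior = {i: priors[i] * like[i] / p_def for i in priors}
print(p_def)                                       # 3/10
print([str(posterior[i]) for i in range(1, 6)])    # i/15 for each bag
```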

Exercise
1. In a factory which manufactures bulbs, machines X, Y and Z manufacture respectively 1000, 2000 and 3000 of the bulbs. Of their outputs, 1%, 1.5% and 2% respectively are defective. A bulb is drawn at random from the product and is found to be defective. What is the probability that it was manufactured by machine X?
2. A and B fire at the same target. A fires on average 9 shots in the time in which B fires 10. A hits the target on average 8 times out of 10 and B 7 times out of 10. The target is hit; find the probability that A hit it.
3. A and B, two students, appear for an examination. Their chances of solving a problem in mathematics are 1/8 and 1/12 respectively. If the probability of their making a common mistake is 1/1001, and they obtain the same answer, find the chance of the answer being correct.
4. Let us consider that 5 men out of 100 men and 25 women out of 1000 women have IQ below 40. A person with IQ below 40 is chosen. Find the probability that the person is male.
5. Three machines A, B and C have capacities in the ratio 2:3:4. The probabilities that they produce defective items are 0.1, 0.2 and 0.1 respectively. An item is selected at random and is found to be defective. What is the probability that it came from the first machine?
6. In a multiple choice test a question was answered by an examinee who knows the answer with probability p. Assume that the probability of answering a question correctly is unity for an examinee who knows the answer and 1/m for an examinee who guesses. Suppose the examinee answers it correctly. Find the probability that he knew the answer.
7. The probability that a newborn baby is blue eyed is 1/4 and black eyed is 3/4. What is the probability that a newborn baby is a blue-eyed girl?
8. Out of 400 patients, 150 consume alcohol and 92 of them have high cholesterol. Again, out of 200 non-alcoholic patients, 44 have a high cholesterol level. If a patient is drawn at random, find the probability of the patient being non-alcoholic.
9. There are 4 buckets, each containing 15 apples; they contain 1, 4, 6 and 3 rotten apples respectively. One apple is drawn from a bucket and found to be in good condition. What is the probability that it came from the 2nd bucket?
10. A sample space contains the integers from 1 to 2n, which are assigned probabilities proportional to their logarithms. Find the probabilities and show that the conditional probability of the integer 2, given that an even integer occurs, is
log 2 / [n log 2 + log(n!)]

consider a function (random variable) X which denotes the number of H (heads). Then
X(HH) = 2, X(HT) = 1 = X(TH), X(TT) = 0.
Therefore, from defn. 2.1, the set K = {0, 1, 2} represents the spectrum of the r.v. X.
e.g. 2.2 (Maximum roll of a die): A six-faced die is rolled twice and the maximum value is noted. If the outcome is (3, 5) then 5 is taken as the output.

[Figure: visualisation of example 2.2 — the maximum of the two rolls.]

Definition 2.1: Any random variable X is said to be a discrete random variable if the sample space S is finite or countably infinite. In e.g. 2.1 the random variable X is a discrete random variable.
Definition 2.2: Let us consider a discrete random variable X which assumes the values x1, x2, x3, ………, xn. The probabilities for the assumed values are called the probability mass function (PMF) of the discrete r.v., and they are given by P(X = xi) = pi.

Σ_{i=1}^{5} P(X = xi) = 1
⇒ 15c = 1, or c = 1/15.
P(2 < X ≤ 4 / X ≤ 4) = P(2 < X ≤ 4) / P(X ≤ 4)
= [f(3) + f(4)] / [f(1) + f(2) + f(3) + f(4)]
= (7/15) / (10/15) = 7/10 (Ans)

Question 2.2: Let X be a discrete r.v. having pmf f(x) = kx, x = 1, 2, ……, n. (i) Determine k. (ii) Find P(X ≥ 2).
Soln. The probability distribution of X is given by
X : 1, 2, 3, ……, n−1, n
pi : k, 2k, 3k, ……, (n−1)k, nk
From the properties of the pmf,
Σ_{i=1}^{n} P(X = xi) = 1
⇒ k + 2k + 3k + ……… + nk = 1
⇒ (1 + 2 + … + n)k = 1
⇒ n(n+1)k/2 = 1
⇒ k = 2/(n(n+1))
Again, P(X ≥ 2) = 1 − P(X = 1) = 1 − 2/(n(n+1)). (Ans)
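A quick check of Question 2.2's normalising constant, with n = 10 chosen purely for illustration:

```python
from fractions import Fraction

# Question 2.2: pmf f(x) = k*x, x = 1..n, forces k = 2/(n(n+1)).
def pmf_constant(n):
    k = Fraction(2, n * (n + 1))
    assert sum(k * x for x in range(1, n + 1)) == 1   # pmf sums to 1
    return k

k = pmf_constant(10)            # n = 10 chosen for illustration
p_ge_2 = 1 - k * 1              # P(X >= 2) = 1 - P(X = 1)
print(k, p_ge_2)                # 1/55 and 54/55
```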

Definition 2.3: Let X be any discrete random variable. Then for any real number x, the function F(x) = P(X ≤ x) is called the Cumulative Distribution Function (cdf), or simply the distribution function (df), of X.
e.g. 2.4.1: Let X = number of heads in tossing two fair coins simultaneously. Then the probability distribution is given by
X : 0, 1, 2 | Total
P(X = xi) : 0.25, 0.5, 0.25 | 1
Then the cdf is given by
F(0) = P(X ≤ 0) = 0.25,
F(1) = P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.25 + 0.5 = 0.75,
F(2) = P(X ≤ 2) = p(0) + p(1) + p(2) = 1.
Definition 2.4: A continuous random variable is a random variable which may assume uncountably many values; the spectrum of a continuous r.v. is an interval (or a union of intervals).
Definition 2.5: The probability distribution of any continuous r.v. is called a continuous distribution. The curve describing the distribution is called the density curve.

  

[Figure: representation of probability by the area under the density curve — P(X < a), and P(a < X < b) between x = a and x = b.]

Definition 2.6: If X is any continuous random variable assuming all real x, then the function f(x) is called the Probability Density Function (PDF) of the continuous random variable X. The distribution function or CDF is given by F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t)dt.
Properties of the pdf:
1. f(x) ≥ 0 for all x
2. ∫_{−∞}^{∞} f(x)dx = 1

Expectation of a random variable
The behaviour of any random variable may be completely characterised by its density function. Some constants or parameters are also associated with the random variable, and knowledge of their numerical values gives the researcher a quick insight into the nature of the random variable.
For example, if an unbiased die is rolled, the possible outcomes 1, 2, 3, 4, 5, 6 are the values assumed by the random variable X, each with probability 1/6. If we ask for the long-run theoretical average of X, the notion of expectation arises: we are asking for the average value of X as the number of rolls approaches infinity. In the long run we expect 1 and 6 about equally often, similarly 2 and 5, and so on, so each pairing averages to 3.5. Now
1×(1/6) + 2×(1/6) + 3×(1/6) + … + 6×(1/6) = 3.5
is the expected value of X.
Definition 3.7: For any discrete random variable X assuming the values xi, i = 1, 2, …, n with corresponding probability mass function pi = P(X = xi), the expectation of X is given by
E[X] = Σ_{i=1}^{n} xi pi

   





Definition 3.8: For any continuous random variable X with probability density function f(x), the expectation of X is given by
E[X] = ∫_{−∞}^{∞} x f(x) dx
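The long-run average of the die discussed above can be computed directly from Definition 3.7; a minimal sketch:

```python
from fractions import Fraction

# E[X] = sum x_i * p_i for a discrete random variable (Definition 3.7).
def expectation(values, probs):
    return sum(x * p for x, p in zip(values, probs))

faces = range(1, 7)
probs = [Fraction(1, 6)] * 6      # fair die
ex = expectation(faces, probs)
print(ex)                         # 7/2, the long-run average 3.5
```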

Properties of Expectation:
1. For any constant c, E[c] = c.
2. For any random variable X and constant c, E[cX] = cE[X].
3. For any two random variables X and Y, E[X + Y] = E[X] + E[Y].
4. If X and Y are independent random variables, then E[XY] = E[X]E[Y].
5. For any random variable X, |E[X]| ≤ E[|X|], provided all the expectations exist.
Proof: 1) Let the random variable X have pdf f(x). Then
E[c] = ∫_{−∞}^{∞} c f(x)dx = c ∫_{−∞}^{∞} f(x)dx = c × 1 = c [from the property of the pdf].
2) With E[X] = ∫_{−∞}^{∞} x f(x)dx, we have
E[cX] = ∫_{−∞}^{∞} cx f(x)dx = c ∫_{−∞}^{∞} x f(x)dx = cE[X].
5) Since −|X| ≤ X ≤ |X|, taking expectations,
E[X] ≤ E[|X|] ............... (a)
and E[X] ≥ −E[|X|] ............... (b)
Therefore, from (a) and (b), |E[X]| ≤ E[|X|].

Mean and Variance of any Random Variable:
The mean of any random variable X is its expectation. For any discrete random variable X the mean is given by E[X] = Σ_{i=1}^{n} xi pi, where the xi's are the assumed values of X and the pi's are the corresponding pmf values. Similarly, the mean of any continuous random variable X is given by E[X] = ∫_{−∞}^{∞} x f(x)dx, where f(x) is the pdf of X.
Variance is a measurement of variability. It is a parameter which reflects consistency, or its lack — that is, how much a random variable fluctuates about its mean.
Definition 3.9: Let X be a random variable with mean E[X] = m. The variance of X is denoted by
Var[X] = σ² = E[X − m]²
The computational formula for the variance may be given in the forms
1) Var[X] = σ² = E[X²] − m²
2) Var[X] = σ² = E[X(X−1)] − m(m−1)
Proof: 1) Var[X] = σ² = E[X − m]² = E[X² − 2mX + m²] = E[X²] − 2mE[X] + m² = E[X²] − 2m² + m² = E[X²] − m². (Proved)
2) Var[X] = σ² = E[X − m]² = E[X(X−1) + X] − 2m² + m² = E[X(X−1)] + E[X] − m² = E[X(X−1)] + m − m² = E[X(X−1)] − m(m−1). (Proved)
The Standard Deviation of any random variable X is given by SD(X) = σ = √Var(X).
Properties of Variance:
1. Var(k) = 0, where k is any constant.
2. Var(aX + b) = a²Var(X).
Proof: 1) Here the mean is m = E[k] = k and E[k²] = k², so
Var(k) = E[k²] − (E[k])² = k² − k² = 0.
The same result holds when X is a continuous r.v.
2) Let m1 = E[X], so that the mean is m = E[aX + b] = am1 + b. Then
Var(aX + b) = E[(aX + b) − (am1 + b)]² = E[a²(X − m1)²] = a²E[(X − m1)²] = a²Var(X). (Proved)
Covariance: If X and Y are two random variables, then the covariance between them is given by
Cov(X, Y) = E[{X − E(X)}{Y − E(Y)}] = E(XY) − E(X)E(Y)
Remarks: Cov(aX, bY) = ab Cov(X, Y); Cov(aX + b, cY + d) = ac Cov(X, Y).
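The two computational forms of the variance can be checked against the definition on any small pmf; the sketch below uses an illustrative four-point distribution.

```python
from fractions import Fraction

# Check both computational forms of the variance against the definition
# on a small illustrative pmf.
xs = [0, 1, 2, 3]
ps = [Fraction(1, 3), Fraction(1, 2), Fraction(1, 24), Fraction(1, 8)]

E = lambda g: sum(g(x) * p for x, p in zip(xs, ps))
m = E(lambda x: x)                                  # mean
var_def = E(lambda x: (x - m) ** 2)                 # E[(X - m)^2]
var_raw = E(lambda x: x ** 2) - m ** 2              # E[X^2] - m^2
var_fact = E(lambda x: x * (x - 1)) - m * (m - 1)   # E[X(X-1)] - m(m-1)
print(var_def, var_raw, var_fact)                   # all three agree
```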

 



Question 2.3: Let X be a random variable with probability distribution
X : 0, 1, 2, 3
P(x) : 1/3, 1/2, 1/24, 1/8
Find the expected value of Y = (X − 1)².
Soln. Here
E[X] = 0×(1/3) + 1×(1/2) + 2×(1/24) + 3×(1/8) = 23/24,
E[X²] = 0²×(1/3) + 1²×(1/2) + 2²×(1/24) + 3²×(1/8) = 43/24.
Therefore
E[Y] = E[(X − 1)²] = E[X²] − 2E[X] + 1 = 43/24 − 2×(23/24) + 1 = 7/8. (Ans)

Moments:
The "moments" of a random variable (or of its distribution) are expected values of powers or related functions of the random variable. The rth moment of X is E(X^r). In particular, the first moment is the mean, μ_X = E(X). The mean is a measure of the "center" or "location" of a distribution. Another measure of the center or location is a median, defined as a value m such that P(X < m) ≤ 1/2 and P(X ≤ m) ≥ 1/2. If there is only one such value, then it is called the median.
The rth central moment of X is m_r = E[(X − μ_X)^r]. In particular, the second central moment is the variance,
σ²_X = Var(X) = E[(X − μ_X)²].
The standard deviation of a random variable is the (nonnegative) square root of the variance: σ_X = Sd(X). The variance and standard deviation are measures of the spread or dispersion of a distribution. The standard deviation is measured in the same units as X, while the variance is in X-units squared.
Raw moments: the rth moment of a variable x about any point x = A is denoted by m_r′ and is given by
m_r′ = (1/N) Σ_i f_i (x_i − A)^r, where Σ_i f_i = N
Results:
1. m_0 = 1, m_1 = 0, m_2 = Var(X).
2. m_2 = m_2′ − (m_1′)², m_3 = m_3′ − 3m_2′m_1′ + 2(m_1′)³.

Moment Generating Function: If X is any random variable with probability function f(x), then its mgf (moment generating function) is given by
M_X(t) = E[e^{tX}] = ∫ e^{tx} f(x)dx (continuous r.v.) or Σ_x e^{tx} f(x) (discrete r.v.),
where t is a real parameter and the integral or summation is taken over the entire range of x. Now
E[e^{tX}] = E[1 + tX + (tX)²/2! + (tX)³/3! + (tX)⁴/4! + ……]
= 1 + tE[X] + (t²/2!)E[X²] + (t³/3!)E[X³] + (t⁴/4!)E[X⁴] + ……
= 1 + tμ_1′ + (t²/2!)μ_2′ + (t³/3!)μ_3′ + ………… ............................(a)
where μ_r′ represents the rth raw moment about the origin.
From (a): d^r{M_X(t)}/dt^r |_{t=0} = μ_r′.

Theorem 3.1: M_{cX}(t) = M_X(ct), c being a constant.
Proof: M_{cX}(t) = E[e^{tcX}] and M_X(ct) = E[e^{ctX}], so L.H.S = R.H.S. (Proved)

 

Theorem 3.2: If X1, X2, X3, ………, Xn are n independent random variables, then
M_{X1+X2+⋯+Xn}(t) = M_{X1}(t) M_{X2}(t) ………… M_{Xn}(t)
Proof:
M_{X1+X2+⋯+Xn}(t) = E[e^{t(X1+X2+⋯+Xn)}] = E[e^{tX1} e^{tX2} …… e^{tXn}]
= E(e^{tX1}) E(e^{tX2}) ………… E(e^{tXn}) [by independence]
= M_{X1}(t) M_{X2}(t) ………… M_{Xn}(t). (Proved)
Theorem 3.3: If X1, X2, X3, ………, Xn are n i.i.d. (independent and identically distributed) r.v.'s with mgf M(t), then M_{X̄}(t) = [M(t/n)]^n, where X̄ = Σ_{i=1}^{n} Xi / n.
Proof left as an exercise for the reader.

Theorem 3.4 (Chebyshev's Inequality): If X is any random variable (discrete or continuous) with mean μ and standard deviation σ, and k is some positive number, then
P(|X − μ| ≥ kσ) ≤ 1/k²
Proof: Let X be a continuous r.v. with pdf f(x). Therefore
Var[X] = E[X − E(X)]² = ∫_{−∞}^{∞} (x − μ)² f(x)dx.
Let k > 0 and c = k²σ². Therefore
σ² = ∫_{−∞}^{μ−√c} (x − μ)² f(x)dx + ∫_{μ−√c}^{μ+√c} (x − μ)² f(x)dx + ∫_{μ+√c}^{∞} (x − μ)² f(x)dx.
Since (x − μ)² f(x) ≥ 0, the middle integral is ≥ 0, hence
σ² ≥ ∫_{−∞}^{μ−√c} (x − μ)² f(x)dx + ∫_{μ+√c}^{∞} (x − μ)² f(x)dx.
Note that over both the regions of integration (x − μ)² ≥ c. So it may be concluded that
σ² ≥ ∫_{−∞}^{μ−√c} c f(x)dx + ∫_{μ+√c}^{∞} c f(x)dx
Or, σ² ≥ cP[X ≤ μ − √c] + cP[X ≥ μ + √c]
Or, σ² ≥ cP[X − μ ≤ −√c] + cP[X − μ ≥ √c]
By rearranging we may write
P[X − μ ≤ −√c] + P[X − μ ≥ √c] ≤ σ²/c
Or, P[−√c ≤ X − μ ≤ √c] ≥ 1 − σ²/c
Or, P[−kσ ≤ X − μ ≤ kσ] ≥ 1 − 1/k²
which is equivalent to the stated inequality.
Alternative statement of Chebyshev's inequality: Let X be a random variable and let g(x) be a non-negative function. Then for r > 0,
P[g(X) ≥ r] ≤ E[g(X)]/r
Proof: we have
E[g(X)] = ∫_{−∞}^{∞} g(x) f_X(x)dx ≥ ∫_{[x: g(x) ≥ r]} g(x) f_X(x)dx ≥ r ∫_{[x: g(x) ≥ r]} f_X(x)dx
= rP[g(X) ≥ r] ⇒ P[g(X) ≥ r] ≤ E[g(X)]/r. Proved.
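Chebyshev's bound can be checked empirically; the sketch below simulates a sample from an arbitrarily chosen exponential distribution and compares the observed tail mass with 1/k².

```python
import random
import statistics

# Empirical check of P(|X - mu| >= k*sigma) <= 1/k^2 on a simulated
# sample (the exponential distribution here is an arbitrary choice).
random.seed(1)
xs = [random.expovariate(1.0) for _ in range(100_000)]
mu, sigma = statistics.fmean(xs), statistics.pstdev(xs)

for k in (1.5, 2.0, 3.0):
    tail = sum(abs(x - mu) >= k * sigma for x in xs) / len(xs)
    print(k, round(tail, 4), round(1 / k**2, 4))   # the bound is loose here
    assert tail <= 1 / k**2
```

As the output shows, the actual tail probability is usually far below the Chebyshev bound; the inequality trades tightness for complete generality.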

Joint Probability Distribution
Joint Probability: If two random variables X and Y are defined on the same probability space, then they are said to be jointly distributed. The joint probability of X and Y for some event A is given by
P_{X,Y}(x, y) = P[(X, Y) ∈ A]
Joint Probability Mass Function: Let us consider the sample space S(E) of a random experiment E, and let X and Y be two random variables on S(E). The spectra of X and Y are respectively
{x1, x2, x3, ……, xm} and {y1, y2, y3, ……, yn}.
The Cartesian product of these two sets generates a set of 2-tuples, each element of the form (xi, yj), where i = 1, 2, …, m and j = 1, 2, …, n. The joint probability function is then defined by
P(X = xi ∩ Y = yj) = p_{i,j} = P(xi, yj).

 

The following table shows the joint probability function:

X \ Y |  y1     y2     y3    ……   yn    | Total
x1    |  p1,1   p1,2   p1,3  ……   p1,n  | p1.
x2    |  p2,1   p2,2   p2,3  ……   p2,n  | p2.
x3    |  p3,1   p3,2   p3,3  ……   p3,n  | p3.
 ⋮    |   ⋮      ⋮      ⋮          ⋮    |  ⋮
xm    |  pm,1   pm,2   pm,3  ……   pm,n  | pm.
Total |  p.1    p.2    p.3   ……   p.n   | 1

From the above table the property may be stated as
Σ_{i=1}^{m} Σ_{j=1}^{n} P(xi, yj) = 1

Marginal Probability Mass Function: If X and Y are two random variables defined on the same sample space with image sets {x1, x2, ……, xm} and {y1, y2, ……, yn} respectively, then
P(X = xi) = P(X = xi ∩ Y = y1) + P(X = xi ∩ Y = y2) + ……………… + P(X = xi ∩ Y = yn)
= p_{i1} + p_{i2} + p_{i3} + ………… + p_{in}
= Σ_{j=1}^{n} p_{ij} = pi.
This is called the marginal probability mass function of the random variable X. Similarly, the marginal probability function of the random variable Y is given by
P_Y(yj) = P(Y = yj) = p.j
Remark: From the marginal probability mass functions and the table of the joint probability distribution we get
Σ_{i=1}^{m} pi. = 1 and Σ_{j=1}^{n} p.j = 1

Coin Flips: Consider the flip of two fair coins, and let A and B be the discrete random variables associated with the outcomes of the first and second coin flips respectively. If a coin displays "heads" the associated random variable is 1, and 0 otherwise. The joint probability mass function of A and B defines probabilities for each pair of outcomes. All possible outcomes are (A=0, B=0), (A=0, B=1), (A=1, B=0), (A=1, B=1). Since each outcome is equally likely, the joint probability mass function becomes
P(A, B) = 1/4 for (A, B) ∈ {0, 1}².
Since the coin flips are independent, the joint probability mass function is the product of the marginals: P(A, B) = P(A)P(B). In general, each coin flip is a Bernoulli trial, and the number of heads in a sequence of flips follows a binomial distribution.
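The coin-flip joint pmf above, together with its marginals and the independence factorisation, can be written out in a few lines (a sketch):

```python
from fractions import Fraction
from itertools import product

# Joint pmf of the two fair-coin indicators A and B: uniform on {0,1}^2.
joint = {(a, b): Fraction(1, 4) for a, b in product((0, 1), repeat=2)}

# Marginals: p_A(a) = sum_b p(a, b), and similarly for B.
pA = {a: sum(joint[(a, b)] for b in (0, 1)) for a in (0, 1)}
pB = {b: sum(joint[(a, b)] for a in (0, 1)) for b in (0, 1)}

assert sum(joint.values()) == 1                   # joint pmf sums to 1
# Independence: the joint pmf factors into the product of the marginals.
assert all(joint[(a, b)] == pA[a] * pB[b] for a, b in joint)
print(pA, pB)
```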

Joint Probability Distribution Function: Let X and Y be two random variables whose joint probability mass function is p_{XY}(x, y). Then the distribution function is given by
F_{XY}(x, y) = P(X ≤ x, Y ≤ y) = Σ_{u ≤ x} Σ_{v ≤ y} p(u, v)
Properties:
1. F(−∞, y) = 0 = F(x, −∞); F(∞, ∞) = 1.
2. P(a1 < X ≤ b1, a2 < Y ≤ b2) = F_{XY}(a1, a2) + F_{XY}(b1, b2) − F_{XY}(a1, b2) − F_{XY}(b1, a2).
The marginal distribution functions are given by
F_X(x) = P(X ≤ x) = P(X ≤ x, Y < ∞) = F_{XY}(x, ∞) = Σ_y P(X ≤ x, Y = y)
F_Y(y) = P(Y ≤ y) = P(X < ∞, Y ≤ y) = F_{XY}(∞, y) = Σ_x P(X = x, Y ≤ y)

Joint Probability Distribution for continuous variables: If X and Y are two continuous random variables with joint distribution function F_{XY}(x, y), then the joint probability density function (pdf) is given by
f_{XY}(x, y) = ∂²F_{XY}(x, y) / ∂x∂y
The marginal probability functions of X and Y are given by
f_Y(y) = ∫_{−∞}^{∞} f_{XY}(x, y)dx and f_X(x) = ∫_{−∞}^{∞} f_{XY}(x, y)dy
Conditional distribution function and pdf: First consider the case when X and Y are both discrete. The marginal pdf's (or pmf's = probability mass functions, if you prefer this terminology for discrete random variables) are defined by f_Y(y) = P(Y = y) and f_X(x) = P(X = x). The joint pdf is, similarly, f_{X,Y}(x, y) = P(X = x and Y = y). The conditional pdf of the conditional distribution Y|X is
f_{Y|X}(y|x) = P(Y = y | X = x) = P(X = x and Y = y) / P(X = x) = f_{X,Y}(x, y) / f_X(x)
For continuous random variables X and Y, the conditional probability density is defined analogously:
f_{Y|X}(y|x) = f_{X,Y}(x, y) / f_X(x)

Exercise
5. … (ii) (a) Find an expression for P(Y ≥ y). (b) Hence deduce the probability function of Y.
6. Let X be geometric with probability of success p. Prove that when x is a positive integer, F(x) = 1 − q^x.
7. A discrete random variable has moment generating function M_X(t) = e^{2(e^t − 1)}. Find E[X], E[X²] and Var[X].
8. Suppose that X is hypergeometric with N = 25, r = 12 and n = 6. What are the mean and the variance?
9. It has been found that 80% of all printers used on home computers operate correctly at the time of installation; the rest require some adjustment. A particular dealer sells 10 units during a month. (i) Find the probability that at least 9 printers work correctly upon installation. (ii) Consider 5 months in which 10 units are sold per month. What is the probability that at least 9 units operate correctly in each of the 5 months?
10. Assume that each time a metal detector at an airport signals, there is a 25% chance that the cause is change in the passenger's pocket. During a given hour, 15 passengers are stopped because the detector signalled. (a) Find the probability that at least 3 of these persons were stopped due to change in their pockets. (b) Would it be unusual for none of the 15 to have been stopped due to change in the pocket?

Chapter 5 Continuous Distributions

Introduction
As we discussed different discrete random variables in chapter 4, we may consider the non-discrete random variables, which are continuous. If we consider a time period [0, 12], where 0 denotes 12 noon and 12 denotes midnight at a call centre, the time at which the number of phone calls is at its peak cannot be given as any particular instant. For example, if someone asks how many calls are served at exactly 13.1234 hours, no data can be given for that precise instant, and the probability of that exact value is taken to be 0. These two properties — the possible values occurring as intervals, and the prior probability of assuming any specific value being 0 — are the characteristics which identify a random variable as continuous. In this chapter we will discuss some important continuous distributions.

5.1. Uniform Distribution
The concept of this distribution comes from the occurrence of events with equal, or uniform, probability. If the random variable assumes any value in an interval (a, b), its probability density remains the same everywhere.
Definition 5.1 (Rectangular or Uniform distribution): A random variable X is said to have a continuous uniform distribution over an interval (a, b) if its probability density function is given by
f(x) = k when a < x < b; 0 otherwise.
Since the total of the pdf is 1, k = 1/(b−a).

[Figure: the uniform density function f(x) = 1/(b−a) on (a, b).]

5.1.1. Moment Generating Function of Uniform Distribution
Let X ~ U(a, b). The pdf is given by
f(x) = 1/(b−a) when a < x < b; 0 otherwise.
Therefore
M_X(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx} f(x)dx = ∫_a^b e^{tx}/(b−a) dx = [1/(t(b−a))] e^{tx} |_a^b = (e^{tb} − e^{ta}) / (t(b−a))
Hence M_X(t) = (e^{tb} − e^{ta}) / (t(b−a)), t ≠ 0.

5.1.2. Mean and Variance of Uniform Distribution
We have from 5.1.1,
M_X(t) = (e^{tb} − e^{ta}) / (t(b−a)) ..................(1)
Differentiating (1) w.r.t. t we get
dM_X(t)/dt = [(b e^{tb} − a e^{ta})t − (e^{tb} − e^{ta})] / ((b−a)t²)
But at t = 0 this expression is of the indeterminate form 0/0, so the moments cannot be read off directly. Here we find a drawback of the moment generating function.
Remark: Although the moments of the uniform distribution over any interval [a, b] exist, they cannot be derived from the moment generating function in the usual way, which is a drawback of the moment generating function.
Therefore, by the conventional method,
Mean = E[X] = ∫_{−∞}^{∞} x f(x)dx = ∫_a^b x/(b−a) dx = [1/(2(b−a))] x² |_a^b = (b² − a²) / (2(b−a)) = (a+b)/2
Again,
E[X²] = ∫_{−∞}^{∞} x² f(x)dx = ∫_a^b x²/(b−a) dx = [1/(3(b−a))] x³ |_a^b = (b³ − a³) / (3(b−a)) = (a² + ab + b²)/3
Hence Variance = Var[X] = σ² = E[X²] − [E[X]]²
= (a² + ab + b²)/3 − ((a+b)/2)²
= (b − a)²/12

Let X ~ U(a, b). Then Mean = E[X] = (a+b)/2 and Variance = Var[X] = σ² = (b−a)²/12.


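The boxed formulas can be checked by simulation; a sketch (the values of a, b and the sample size are arbitrary):

```python
import random
import statistics

# Uniform(a, b): mean (a+b)/2 and variance (b-a)^2/12, checked by
# simulation (a, b and the sample size are arbitrary).
a, b = 2.0, 7.0
random.seed(0)
xs = [random.uniform(a, b) for _ in range(200_000)]

mean_th, var_th = (a + b) / 2, (b - a) ** 2 / 12
print(statistics.fmean(xs), mean_th)       # both close to 4.5
print(statistics.pvariance(xs), var_th)    # both close to 25/12
```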

5.2. Gamma Distribution
The gamma probability density function is useful in reliability models of lifetimes. The gamma distribution is more flexible than the exponential distribution in that the probability of a product surviving an additional period may depend on its current age. The exponential and χ² distributions are special cases of the gamma distribution. Before going through the distribution we first discuss the Gamma function, which is given by
Γ(α) = ∫_0^∞ e^{−t} t^{α−1} dt, where α > 0
5.2.1. Properties of the Gamma function
1. Γ(1) = 1
2. If n is a positive real number, then Γ(n+1) = nΓ(n).
3. If n is a positive integer, then Γ(n+1) = n!.
Definition 5.2 (Gamma distribution): A random variable X is said to follow a Gamma distribution with parameters α and β if its pdf is given by
f(x) = [1/(Γ(α)β^α)] x^{α−1} e^{−x/β}, α > 0, β > 0, x > 0

5.2.2. Moment Generating Function of Gamma Distribution
Let X be a gamma random variable with parameters α and β. Its pdf is given by
f(x) = [1/(Γ(α)β^α)] x^{α−1} e^{−x/β}, α > 0, β > 0, x > 0
The moment generating function will be
M_X(t) = E[e^{tX}] = ∫_0^∞ e^{tx} [1/(Γ(α)β^α)] x^{α−1} e^{−x/β} dx
= [1/(Γ(α)β^α)] ∫_0^∞ x^{α−1} e^{−(1/β − t)x} dx
= [1/(Γ(α)β^α)] (β/(1−βt))^α ∫_0^∞ z^{α−1} e^{−z} dz [substituting (1/β − t)x = z]
= (1−βt)^{−α} Γ(α)/Γ(α)
= (1−βt)^{−α}, provided t < 1/β.
5.2.3. Mean and Variance of Gamma Distribution
We have, for any gamma random variable X with parameters α and β,
M_X(t) = (1−βt)^{−α} ............(1)
Differentiating (1) w.r.t. t we get
dM_X(t)/dt = αβ(1−βt)^{−α−1}
Therefore Mean = E[X] = dM_X(t)/dt |_{t=0} = αβ(1−βt)^{−α−1} |_{t=0} = αβ.
Again, differentiating (1) twice with respect to t we get
d²M_X(t)/dt² = α(α+1)β²(1−βt)^{−α−2}
Next, E[X²] = d²M_X(t)/dt² |_{t=0} = α(α+1)β²(1−βt)^{−α−2} |_{t=0} = α(α+1)β².
Therefore Variance = Var[X] = σ² = E[X²] − [E[X]]² = α(α+1)β² − (αβ)² = αβ².

If X is any gamma random variable with parameters α and β, then Mean = αβ and Variance = αβ².
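The gamma mean αβ and variance αβ² can also be checked by a crude numerical integration of the pdf; a sketch (the parameter values and the integration grid are ad hoc):

```python
import math

# Crude numerical check of the Gamma(alpha, beta) moments:
# mean = alpha*beta, variance = alpha*beta^2 (grid and range are ad hoc).
alpha, beta = 3.0, 2.0
norm = math.gamma(alpha) * beta**alpha
pdf = lambda x: x**(alpha - 1) * math.exp(-x / beta) / norm

h = 0.001
grid = [h * i for i in range(1, 200_000)]        # integrate over (0, 200)
m1 = sum(x * pdf(x) * h for x in grid)
m2 = sum(x * x * pdf(x) * h for x in grid)
print(m1, m2 - m1**2)    # close to alpha*beta = 6 and alpha*beta^2 = 12
```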

5.3. Exponential Distribution
In probability theory and statistics, the exponential distribution is the probability distribution that describes the time between events in a Poisson process, i.e. a process in which events occur continuously and independently at a constant average rate. It is a particular case of the Gamma distribution, and it is a continuous memoryless distribution.
Let us consider a Poisson process with parameter λ, where discrete events occur over a continuous time interval, and let W denote the time of occurrence of the first event. The first occurrence of the event takes place after time w exactly when no occurrences are recorded in the time interval [0, w]. The distribution function of W is given by
F(w) = P(W ≤ w) = 1 − P(W > w).
Let X denote the number of occurrences of the event in this time interval; X is a Poisson random variable with parameter λw. Therefore
P(W > w) = P(X = 0) = e^{−λw}(λw)⁰/0! = e^{−λw}.
Therefore F(w) = P(W ≤ w) = 1 − P(W > w) = 1 − e^{−λw}.
Hence the pdf will be f(w) = F′(w) = λe^{−λw}.
Definition 5.3 (Exponential distribution): A random variable X is said to follow an exponential distribution with parameter λ if its pdf is given by
f(x) = λe^{−λx}, x > 0

[Figure: the exponential density function f(x).]

5.3.1. Moment Generating Function of Exponential Distribution
Let X be an exponential random variable with parameter λ; its pdf is f(x) = λe^{−λx}, x > 0. Hence the mgf is given by
M_X(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx} f(x)dx
= ∫_0^∞ e^{tx} λe^{−λx}dx
= λ ∫_0^∞ e^{−(λ−t)x}dx
= [λ/(−(λ−t))] e^{−(λ−t)x} |_0^∞
= λ/(λ−t), provided t < λ.

 

5.3.2. Mean and Variance of Exponential Distribution
If X is an exponential random variable with parameter λ, then from its mgf we have

M_X(t) = λ/(λ − t)    (1)

Differentiating (1) w.r.t. t we get

dM_X(t)/dt = λ/(λ − t)²

Therefore Mean = E[X] = dM_X(t)/dt |_{t=0} = λ/(λ − t)² |_{t=0} = 1/λ.

Differentiating (1) twice with respect to t we get

d²M_X(t)/dt² = 2λ/(λ − t)³

Therefore E[X²] = d²M_X(t)/dt² |_{t=0} = 2/λ²

Hence Var[X] = E[X²] − (E[X])² = 2/λ² − 1/λ² = 1/λ²

If X is an exponential random variable with parameter λ, then Mean = 1/λ and Variance = 1/λ².
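A simulation sketch (not from the text; λ = 0.5 and the seed are arbitrary) that draws exponential variates by the inverse-transform method and checks the sample mean and variance against 1/λ and 1/λ²:

```python
import math
import random

random.seed(42)
lam = 0.5
n = 200_000
# Inverse-transform sampling: if U ~ Uniform(0,1), then X = -ln(1 - U)/lam ~ Exp(lam)
samples = [-math.log(1.0 - random.random()) / lam for _ in range(n)]
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n
print(mean, var)  # close to 1/lam = 2 and 1/lam^2 = 4
```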

5.4 Laplace (Double Exponential) Distribution
Definition 5.4 (Laplace / double exponential distribution): A random variable X is said to follow the Laplace distribution if its pdf is given by

f(x) = (1/2) e^(−|x|),  where −∞ < x < ∞.

 

5.4.1. MGF of Laplace (Double Exponential) Distribution
We have

M_X(t) = E[e^(tX)] = ∫_{−∞}^∞ e^(tx) f(x) dx
= (1/2) ∫_{−∞}^∞ e^(tx) e^(−|x|) dx
= (1/2) [∫_0^∞ e^(tx) e^(−x) dx + ∫_{−∞}^0 e^(tx) e^(x) dx]
= (1/2) [∫_0^∞ e^(−(1−t)x) dx + ∫_{−∞}^0 e^((1+t)x) dx]
= (1/2) [(−1/(1−t)) {e^(−(1−t)x)}_0^∞ + (1/(1+t)) {e^((1+t)x)}_{−∞}^0]
= (1/2) [1/(1−t) + 1/(1+t)]
= 1/(1 − t²),  for |t| < 1.

5.4.2. Mean and Variance of Laplace (Double Exponential) Distribution
We have from 5.4.1,

M_X(t) = 1/(1 − t²)    (1)

Differentiating (1) w.r.t. t we get

dM_X(t)/dt = 2t/(1 − t²)²

Therefore, Mean = E[X] = dM_X(t)/dt |_{t=0} = 2t/(1 − t²)² |_{t=0} = 0

Again, differentiating (1) w.r.t. t twice we get

d²M_X(t)/dt² = (2 + 6t²)/(1 − t²)³

Therefore E[X²] = d²M_X(t)/dt² |_{t=0} = 2

Hence Variance = Var[X] = E[X²] − (E[X])² = 2 − 0 = 2

If X is a standard double exponential random variable, then Mean = 0 and Variance = 2.
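These values can be checked by simulation. One convenient route (an aside, not from the text) uses the fact that the difference of two independent Exp(1) variates has the standard Laplace density (1/2)e^(−|x|):

```python
import math
import random

random.seed(7)

def exp1():
    # Standard exponential variate by inverse transform
    return -math.log(1.0 - random.random())

n = 200_000
# X = E1 - E2 with E1, E2 independent Exp(1) is standard Laplace
samples = [exp1() - exp1() for _ in range(n)]
mean = sum(samples) / n
var = sum(x * x for x in samples) / n - mean ** 2
print(mean, var)  # near 0 and 2
```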

 

5.5 Weibull Distribution
The family of Weibull distributions was introduced by the Swedish physicist Waloddi Weibull in 1939.

Definition 5.5 (Weibull distribution): A random variable X is said to follow the Weibull distribution with parameters α and β (α > 0, β > 0) if it has the pdf

f(x; α, β) = (α/β^α) x^(α−1) e^(−(x/β)^α),  x ≥ 0
f(x; α, β) = 0,  x < 0

Remark: If we take α = 1, then f(x) = (1/β) e^(−x/β); that is, the distribution reduces to an exponential distribution with parameter λ = 1/β.

[Figure: Weibull density function f(x) for β = 2 and α = 0.9, 0.7, 0.5]

5.5.1 Moment Generating Function of Weibull Distribution
We have X ~ Weib(α, β). Then

M_X(t) = E[e^(tX)] = ∫_{−∞}^∞ e^(tx) f(x) dx = (α/β^α) ∫_0^∞ e^(tx) x^(α−1) e^(−(x/β)^α) dx

which has no simple closed form for general α.

5.5.2 Mean and Variance of Standard Weibull Distribution
The Weibull distribution may be presented in another way by taking the pdf of the Weibull variable X as

f_X(x) = (k/α) ((x − μ)/α)^(k−1) e^(−((x − μ)/α)^k),  where x > μ and k > 0

The standard Weibull distribution is obtained when α = 1 and μ = 0.

Then, since Y = X^k is a standard exponential variable for the standard Weibull, the r-th moment is

μ′_r = E[X^r] = E[Y^(r/k)] = ∫_0^∞ y^(r/k) e^(−y) dy = Γ(1 + r/k)

Hence Mean = E[X] = Γ(1 + 1/k)

Now, E[X²] = Γ(1 + 2/k)

Therefore Variance = Var[X] = E[X²] − (E[X])² = Γ(1 + 2/k) − (Γ(1 + 1/k))².
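A sketch comparing these gamma-function formulas with Monte Carlo estimates for the standard Weibull (the shape value k = 1.5 and the seed are arbitrary choices, not from the text). `random.weibullvariate(1.0, k)` samples the Weibull with scale 1 and shape k, i.e. the standard Weibull above:

```python
import math
import random

k = 1.5  # shape parameter of the standard Weibull (alpha = 1, mu = 0)
mean_formula = math.gamma(1 + 1 / k)
var_formula = math.gamma(1 + 2 / k) - math.gamma(1 + 1 / k) ** 2

random.seed(1)
n = 200_000
samples = [random.weibullvariate(1.0, k) for _ in range(n)]
mean_mc = sum(samples) / n
var_mc = sum((x - mean_mc) ** 2 for x in samples) / n
print(mean_formula, mean_mc)
print(var_formula, var_mc)
```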

5.6 Normal Distribution
The normal distribution was first introduced by De Moivre in the year 1733 as the limiting form of the binomial density function when the number of trials tends to infinity. Half a century later the same distribution was rediscovered by Gauss and Laplace, who used it for the calculation of errors of measurement in astronomy. This distribution is sometimes called the Gaussian distribution.





[Fig. 5.6.1]  [Fig. 5.6.2]  [Fig. 5.6.3]

 

In Fig. 5.6.1 the data are scattered more to the left, and in Fig. 5.6.2 the data are spread more to the right. But in Fig. 5.6.3 we see that the data tend to accumulate around some central value, with no bias to the left or right. The "bell curve" is a normal distribution, and the yellow histogram shows some data that follow it closely, though not perfectly (which is usual).

Definition 5.6.1 (Normal distribution): A continuous random variable X is said to follow a normal distribution with parameters μ and σ if its pdf is given by

f(x) = (1/(√(2π) σ)) e^(−(x−μ)²/(2σ²)),  where −∞ < x < ∞.

Theorem 5.6.1 If X is a normal random variable with parameters μ and σ, then its mean is μ and its standard deviation is σ.
Proof. Let X ~ N(μ, σ). Then its pdf is

f(x) = (1/(√(2π) σ)) e^(−(x−μ)²/(2σ²)),  where −∞ < x < ∞

Then Mean = E[X] = ∫_{−∞}^∞ x f(x) dx = (1/(√(2π) σ)) ∫_{−∞}^∞ x e^(−(x−μ)²/(2σ²)) dx

= (1/√π) ∫_{−∞}^∞ (μ + √2 σz) e^(−z²) dz   [substitute z = (x − μ)/(√2 σ)]
= (1/√π) [∫_{−∞}^∞ μ e^(−z²) dz + ∫_{−∞}^∞ √2 σz e^(−z²) dz]
= (1/√π) [μ ∫_{−∞}^∞ e^(−z²) dz + 0]   [since z e^(−z²) is an odd function]
= (2μ/√π) ∫_0^∞ e^(−z²) dz   [since e^(−z²) is an even function]
= (2μ/√π) ∫_0^∞ e^(−t) (1/2) t^(−1/2) dt   [substitute z² = t, 2z dz = dt, dz = dt/(2√t)]
= (μ/√π) Γ(1/2) = (μ/√π) √π = μ

 

Now, the Variance = E[(X − μ)²] = ∫_{−∞}^∞ (x − μ)² f(x) dx
= (1/(√(2π) σ)) ∫_{−∞}^∞ (x − μ)² e^(−(x−μ)²/(2σ²)) dx
= (1/(√(2π) σ)) ∫_{−∞}^∞ 2σ² z² e^(−z²) √2 σ dz   [substitute z = (x − μ)/(√2 σ)]
= (2σ²/√π) ∫_{−∞}^∞ z² e^(−z²) dz
= (4σ²/√π) ∫_0^∞ z² e^(−z²) dz   [since z² e^(−z²) is an even function]
= (4σ²/√π) (1/2) ∫_0^∞ t^(1/2) e^(−t) dt   [substitute z² = t]
= (2σ²/√π) Γ(3/2)
= (2σ²/√π) (1/2) Γ(1/2) = (σ²/√π) √π = σ²

Therefore standard deviation = sd(X) = √variance = σ.

If X ~ N(μ, σ), then Mean = μ and Variance = σ².

Figure 5.6.4 shows a normal curve with mean = 50 and standard deviation = 10.

[Fig. 5.6.4: normal curve f(x) with mean 50 and s.d. 10; inflection points marked]

5.6.1 Characteristics of Normal Curve
The normal curve with mean μ and sd σ has pdf

 

f(x) = (1/(√(2π) σ)) e^(−(x−μ)²/(2σ²)),  where −∞ < x < ∞

and it has the following characteristics:
1. Mean = Median = Mode.
2. The curve is bell shaped and symmetrical about the mean μ.
3. The x-axis is an asymptote to the normal curve.
4. A linear combination of independent normal variates is also a normal variate.
5. The points of inflection of the curve are given by x = μ ± σ, at which f(x) = (1/(√(2π) σ)) e^(−1/2).

5.6.2 Standardisation of Normal Curve
Let X be a normal random variable with mean μ and standard deviation σ. Then the random variable Z = (X − μ)/σ is called the standard normal random variable. There are many normal curves, one for each pair of parameters μ and σ. The probability over a particular interval is calculated by integrating the normal curve over it, but owing to the complexity of the integrand, numerical integration would be required for each such curve. To overcome this difficulty the standardising transformation is used. The standard normal variate has mean = 0 and s.d. = 1.
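In practice the standardised probability is then read from a table or evaluated via the error function. A small sketch (not from the text; the interval and parameters are illustrative) that standardises the endpoints and uses Φ(z) = (1 + erf(z/√2))/2:

```python
import math

def Phi(z):
    # Standard normal cdf expressed through the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_prob(a, b, mu, sigma):
    # P(a < X < b) for X ~ N(mu, sigma), computed by standardising the endpoints
    za, zb = (a - mu) / sigma, (b - mu) / sigma
    return Phi(zb) - Phi(za)

# Example: for X ~ N(50, 10), about 68.27% of the mass lies within one s.d. of the mean
p = normal_prob(40, 60, 50, 10)
print(p)
```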

[Figure: standard normal curve, with μ = 0]



Figure 5.6.5

Definition 5.6.2 (Standard Normal distribution): A continuous random variable Z is said to follow a standard normal distribution if its pdf is given by

φ(z) = (1/√(2π)) e^(−z²/2),  −∞ < z < ∞.



Theorem 5.6.2. If Z is a standard normal variate, then its mean = 0 and sd = 1.
Proof: Let Z be a standard normal variate; then its pdf is given by

φ(z) = (1/√(2π)) e^(−z²/2),  −∞ < z < ∞

Therefore mean = E[Z] = ∫_{−∞}^∞ z φ(z) dz = (1/√(2π)) ∫_{−∞}^∞ z e^(−z²/2) dz = 0   [since z e^(−z²/2) is an odd function]

Now, E[Z²] = (1/√(2π)) ∫_{−∞}^∞ z² e^(−z²/2) dz = (2/√(2π)) ∫_0^∞ z² e^(−z²/2) dz   [since z² e^(−z²/2) is an even function]
= (2/√π) ∫_0^∞ t^(1/2) e^(−t) dt   [substitute z² = 2t, so that z² dz = √2 √t dt]
= (2/√π) Γ(3/2) = (2/√π) (1/2) Γ(1/2) = (1/√π) √π = 1

Therefore Var[Z] = E[Z²] − (E[Z])² = 1 − 0 = 1. Hence SD[Z] = √Var[Z] = 1. (proved)

5.6.3 Distribution Function of Standard Normal Random Variable
If Z is a standard normal random variable, then the cumulative distribution function of Z is given by

Φ(z) = P[Z ≤ z] = ∫_{−∞}^z (1/√(2π)) e^(−t²/2) dt    (1)

If X ~ N(μ, σ), then it is first converted into the standard normal variate Z = (X − μ)/σ. The total area under the standard normal probability curve is 1, and the above expression (1) gives the area under the standard normal curve between −∞ and z. Fig. 5.6.6 illustrates the cdf of the standard normal variate.

Properties of the CDF are the following:
1. Φ(0) = 0.5
2. Φ(∞) = 1

 

3. P(z₁ ≤ Z ≤ z₂) = Φ(z₂) − Φ(z₁)
4. Φ(−z) = 1 − Φ(z) = P(Z > z)
5. P(Z > z₁) = 1 − Φ(z₁)
6. P(−z₁ ≤ Z ≤ z₁) = 2Φ(z₁) − 1

The following figures illustrate the cdf of the standard normal variate.

[Figs. 5.6.6–5.6.9: areas under the standard normal curve illustrating the cdf and its properties]
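Properties 1, 4 and 6 above can be verified mechanically with an erf-based Φ (a side check, not part of the text):

```python
import math

def Phi(z):
    # Standard normal cdf: Phi(z) = (1 + erf(z/sqrt(2))) / 2
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

z1 = 0.7
p1 = Phi(0.0)                                              # property 1: should equal 0.5
p4_lhs, p4_rhs = Phi(-z1), 1.0 - Phi(z1)                   # property 4
p6_lhs, p6_rhs = Phi(z1) - Phi(-z1), 2.0 * Phi(z1) - 1.0   # property 6
print(p1, p4_lhs, p4_rhs, p6_lhs, p6_rhs)
```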

5.6.4 Moment Generating Function of Normal Distribution
Let X ~ N(μ, σ). Then its pdf is given by

f(x) = (1/(√(2π) σ)) e^(−(x−μ)²/(2σ²)),  where −∞ < x < ∞

Then M_X(t) = E[e^(tX)] = ∫_{−∞}^∞ e^(tx) (1/(√(2π) σ)) e^(−(x−μ)²/(2σ²)) dx

= (1/√π) ∫_{−∞}^∞ e^(t(μ + √2 σz)) e^(−z²) dz   [substitute z = (x − μ)/(√2 σ)]
= e^(tμ) (1/√π) ∫_{−∞}^∞ e^(−(z² − √2 σtz)) dz
= e^(tμ) (1/√π) ∫_{−∞}^∞ e^(−(z − σt/√2)²) e^(σ²t²/2) dz
= e^(tμ + σ²t²/2) (2/√π) ∫_0^∞ e^(−u²) du   [substitute z − σt/√2 = u, dz = du; e^(−u²) is even]
= e^(tμ + σ²t²/2) (2/√π) (√π/2)

Therefore M_X(t) = e^(μt + σ²t²/2)

Remark:
dM_X(t)/dt = (μ + σ²t) e^(μt + σ²t²/2)

Mean = E[X] = dM_X(t)/dt |_{t=0} = (μ + σ²t) e^(μt + σ²t²/2) |_{t=0} = μ

Again,
d²M_X(t)/dt² = [σ² + (μ + σ²t)²] e^(μt + σ²t²/2)

Hence E[X²] = d²M_X(t)/dt² |_{t=0} = [σ² + (μ + σ²t)²] e^(μt + σ²t²/2) |_{t=0} = σ² + μ²

Var[X] = E[X²] − (E[X])² = σ² + μ² − μ² = σ².
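The moment computations in the remark can be cross-checked by differentiating M_X(t) = e^(μt + σ²t²/2) numerically with central differences (illustrative values μ = 1.5, σ = 2; not from the text):

```python
import math

mu, sigma = 1.5, 2.0

def M(t):
    # mgf of N(mu, sigma): M(t) = exp(mu*t + sigma^2 * t^2 / 2)
    return math.exp(mu * t + 0.5 * sigma ** 2 * t ** 2)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)              # central difference for M'(0) = E[X] = mu
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h ** 2  # central difference for M''(0) = E[X^2]
var = m2 - m1 ** 2                         # should be close to sigma^2
print(m1, var)
```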

5.6.5 Normal Approximation to Binomial Distribution
It is to be noted from Fig. 5.6.10 that a smooth bell-shaped curve can be fitted closely to the block diagram. This suggests that binomial probabilities, represented by one or more blocks in the figure, can be approximated reasonably well by a carefully selected area under an appropriately chosen normal curve. It also suggests that the normal variate selected should have the same mean and variance as the binomial distribution.

Writing x = np + δ, we have

ln(np/x) = ln(np/(np + δ)) = −ln(1 + δ/(np))
ln(nq/(n − x)) = ln(nq/(nq − δ)) = −ln(1 − δ/(nq))

Using the expansion ln(1 + x) = x − (1/2)x² + O(x³),

ln[(np/x)^x (nq/(n − x))^(n−x)] = x ln(np/x) + (n − x) ln(nq/(n − x))
= −(np + δ)[δ/(np) − δ²/(2n²p²) + O(δ³/n³)] − (nq − δ)[−δ/(nq) − δ²/(2n²q²) + O(δ³/n³)]
= −δ − δ²/(2np) + δ − δ²/(2nq) + O(δ³/n²)
= −δ²/(2npq) + O(δ³/n²)   [since p + q = 1]

Exponentiating this result, it may be concluded that the product of the first two factors of eqn. (1) is

(np/x)^x (nq/(n − x))^(n−x) = e^(−δ²/(2npq)) [1 + O(δ³/n²)]    (2)

Moreover, the square-root factor of eqn. (1) can be approximated as

√(n/(2πx(n − x))) = √(n/(2π(np + δ)(nq − δ))) = √(1/(2πnpq)) (1 + O(δ/n))    (3)

It may be argued that x should differ from the mean μ = np by a small number of standard deviations σ = √(npq); in particular, this number should be of O(1) as n is taken large. Since x = np + δ, this means that at worst δ ~ O(√n) for large values of n. In this case, both the O(δ³/n²) term in eq. (2) and the O(δ/n) term in eq. (3) behave as O(1/√n) as n → ∞. Hence, the binomial probability function can be written as

f(x) = √(1/(2πnpq)) e^(−(x − np)²/(2npq)) [1 + O(1/√n)]    (4)

which is the normal distribution with parameters μ = np and σ² = npq, up to corrections that vanish as n → ∞. The mean value μ and the standard deviation σ of the normal approximation are identical to the mean value and the standard deviation

 

of the original binomial distribution, respectively. That is,

φ(x) = √(1/(2πnpq)) e^(−(x − np)²/(2npq)),  where q = 1 − p.

The normal approximation to the binomial distribution holds for values of x within some number of standard deviations of the average value np, where this number is of O(1) as n → ∞, which corresponds to the central part of the bell curve. As previously noted, f(x) is small anyway in the other parts of the domain, so we can ignore the fact that our approximation may not be good there. Eq. (4) also reveals the size of the first correction to the normal approximation to the binomial distribution. Note that the O(1/n) term in eq. (1) has been dropped, as this term is much smaller than the O(1/√n) correction term that appears in eq. (4).

Remark: The above result is vague in terms of the word "large"; in mathematics, "large" in the strictest sense means infinity. For most practical cases the approximation is acceptable for values of n and p such that either p ≤ 0.5 and np > 5, or p > 0.5 and n(1 − p) > 5.
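The quality of the approximation in eq. (4) can be seen directly by comparing the exact binomial pmf with the normal density (a sketch, not from the text; n = 400, p = 0.5 are arbitrary values satisfying the rule of thumb in the remark):

```python
import math

def binom_pmf(n, k, p):
    # Exact binomial probability
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def normal_approx(n, k, p):
    # phi(k) = exp(-(k - np)^2 / (2npq)) / sqrt(2*pi*npq), q = 1 - p
    q = 1 - p
    return math.exp(-(k - n * p) ** 2 / (2 * n * p * q)) / math.sqrt(2 * math.pi * n * p * q)

n, p = 400, 0.5  # mean np = 200, sd sqrt(npq) = 10
rel_err_centre = abs(normal_approx(n, 200, p) - binom_pmf(n, 200, p)) / binom_pmf(n, 200, p)
rel_err_one_sd = abs(normal_approx(n, 210, p) - binom_pmf(n, 210, p)) / binom_pmf(n, 210, p)
print(rel_err_centre, rel_err_one_sd)  # both well under 1% at this n
```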

5.7 Log Normal Distribution
A random variable X is said to have the lognormal distribution with parameters μ ∈ ℝ and σ > 0 if ln(X) has the normal distribution with mean μ and standard deviation σ. Equivalently, X = e^Y where Y is normally distributed with mean μ and standard deviation σ.

P(X > 15) = e^(−15λ) = 0.22
P(X > 25 | X > 15) = P(X > 10) = e^(−10λ)   [by the memoryless property]

5.9.3. The length of bolts produced by a machine is normally distributed with mean 4 and s.d. 0.5. A bolt is defective if its length does not lie in the interval (3.8, 4.3). Find the percentage of defective bolts produced by the machine.
Solution: Let x be the random variable denoting the length of a bolt. By the problem, x has a normal distribution with mean μ = 4 and s.d. σ = 0.5. First we shall find the probability P(3.8 < x < 4.3).

Now, z = (x − μ)/σ has a standard normal distribution. It is observed that when x = 3.8, z = (3.8 − 4)/0.5 = −0.4, and when x = 4.3, z = (4.3 − 4)/0.5 = 0.6.

∴ P(3.8 < x < 4.3) = P(−0.4 < z < 0.6) = the area under the standard normal curve enclosed between the ordinates z = −0.4 and z = 0.6.

 

Hence, using the standard normal table it can be derived that

P(3.8 < x < 4.3) = (1/√(2π)) ∫_{−∞}^{0.6} e^(−z²/2) dz − (1/√(2π)) ∫_{−∞}^{−0.4} e^(−z²/2) dz

It can be concluded that (1/√(2π)) ∫_{−∞}^{0.6} e^(−z²/2) dz = 0.7257 and

(1/√(2π)) ∫_{−∞}^{−0.4} e^(−z²/2) dz = (1/√(2π)) ∫_{0.4}^{∞} e^(−z²/2) dz, due to the symmetry of the standard normal curve.

∴ P(3.8 < x < 4.3) = 0.7257 − (1/√(2π)) ∫_{0.4}^{∞} e^(−z²/2) dz = 0.7257 − (1 − (1/√(2π)) ∫_{−∞}^{0.4} e^(−z²/2) dz) = 0.7257 − (1 − 0.6554) = 0.3811

Hence, the probability that the length of a bolt lies between 3.8 and 4.3 is 0.3811.
∴ The probability that the length of a bolt does not lie between 3.8 and 4.3 is 1 − 0.3811 = 0.6189, and the percentage of defective bolts produced is 0.6189 × 100 = 61.89.

5.9.4. In a sample of 1000 cases, the mean of a certain test is 14 and the standard deviation is 2.5. Assuming normality of the distribution, find (i) how many candidates score between 12 and 15; (ii) how many score below 8; (iii) the probability that a candidate selected at random will score above 15. Given that

P(0 ≤ z ≤ 2.4) = 0.4918,  P(0 ≤ z ≤ 0.8) = 0.2881,  P(0 ≤ z ≤ 0.4) = 0.1554,

where P(0 ≤ z ≤ z₁) = (1/√(2π)) ∫_0^{z₁} e^(−z²/2) dz.

Solution: Let x be the random variable denoting the score of a candidate in the test. Here μ = 14 and σ = 2.5, so z = (x − μ)/σ = (x − 14)/2.5 is a standard normal variable.

(i) The area of the probability curve lying between x = 12 and x = 15 is given by

P(12 ≤ x ≤ 15) = (1/√(2π)) ∫_{z₁}^{z₂} e^(−z²/2) dz,  where z₁ = (12 − 14)/2.5 = −0.8 and z₂ = (15 − 14)/2.5 = 0.4.

Using the data given above it can be derived that

P(12 ≤ x ≤ 15) = P(z ≤ 0.4) − P(z ≤ −0.8) = P(0 ≤ z ≤ 0.8) + P(0 ≤ z ≤ 0.4) = 0.2881 + 0.1554 = 0.4435

It may be noted, due to the symmetry of the normal curve, that P(0 ≤ z ≤ 0.8) = P(−0.8 ≤ z ≤ 0).
Hence, the number of candidates scoring between 12 and 15 is 0.4435 × 1000 ≈ 444.

(ii) The probability of scoring below 8 is given by

P(x < 8) = P((x − 14)/2.5 < (8 − 14)/2.5) = P(z < −2.4)

Using the given data it can be derived that

P(z < −2.4) = P(−∞ < z ≤ 0) − P(0 ≤ z < 2.4) = 0.5 − 0.4918 = 0.0082

It may be noted that P(−∞ < z ≤ 0) = P(0 ≤ z < ∞) = 0.5 due to the symmetry of the normal curve.
Therefore, the number of candidates scoring below 8 is 0.0082 × 1000 = 82.

(iii) The probability of scoring above 15 is given by

P(x > 15) = P((x − 14)/2.5 > (15 − 14)/2.5) = P(z > 0.4)

It may be noted that p(- f < z d 0) = p(0 d z < f) = 0.5due to symmetry on normal curve. Therefore, the number of candidates score below 8 is = 0.0082 u1000 = 82. (iii) Probability of scoring above 15 is given by § x-14 15 -14 · p( x > 8) = p ¨ ! ¸ p(z > 0.4) 2.5 ¹ © 2.5

Using the given data it can be derived as p(z > 0.4) = p( 0 d z 60) =

5 100

z=

x - 50 is ı

0.05 .

60 - 50 10 = V ı

When x = 60,

z=

Which shows

60 - 50 · § x - 50 § 10 · p(x > 60) = p¨ > ¸ =p¨ z> ¸ =0.05 ı ¹ ı¹ © ı ©

From the above supplied data we have Therefore,

a standard normal variable.

(1)

p(0 < z < 1.64) = 0.45 .

(2)

p(z > 1.64)= 0.5-0.45= 0.05

Comparing equation (1) and (2) we get

10

V

= 1.64 so V = 6.097 (Answer).
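The table-based answers of problems 5.9.3–5.9.5 can be reproduced in closed form with the error function (a verification aside, not part of the text):

```python
import math

def Phi(z):
    # Standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# 5.9.3: x ~ N(4, 0.5); P(3.8 < x < 4.3) = Phi(0.6) - Phi(-0.4)
p_good = Phi((4.3 - 4.0) / 0.5) - Phi((3.8 - 4.0) / 0.5)
pct_defective = (1.0 - p_good) * 100.0   # about 61.89 %

# 5.9.4(i): x ~ N(14, 2.5); P(12 <= x <= 15) = Phi(0.4) - Phi(-0.8)
p_i = Phi((15.0 - 14.0) / 2.5) - Phi((12.0 - 14.0) / 2.5)

# 5.9.5: 10/sigma = 1.64 gives the standard deviation
sigma = 10.0 / 1.64
print(p_good, pct_defective, p_i, sigma)
```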

5.9.6. Among 10,000 random digits, find the probability that the digit 3 appears at most 950 times.
Solution: Let x denote the number of times the digit 3 appears. Then x is a b(n, p) variate (binomial variate with parameters n and p) with n = 10,000 and p = 1/10.

∴ np = 1000 and np(1 − p) = 900.

Since n is large, we find P(x ≤ 950) by approximating with the normal distribution: x is approximately a normal variate with mean np = 1000 and s.d. √(np(1 − p)) = 30, so z = (x − 1000)/30 is approximately a standard normal variate. Since a discrete variate is approximated by a continuous variate, we apply the continuity correction and find P(x ≤ 950.5). Now

x = 950.5 ⟹ z = (950.5 − 1000)/30 = −1.65

Therefore, the required probability is P(x ≤ 950.5) = P(z ≤ −1.65) = 0.5 − P(−1.65 ≤ z