Let a random experiment consist of tossing a coin two times.
Let 'S' be the sample space associated with the random experiment.
S = {(HH), (HT), (TT), (TH)}
Consider the number of heads (0, 1 or 2) in an outcome W as the variable of interest.
Thus to each outcome W of S there corresponds a real number X(W); that is, for each outcome of S we can define a real number denoted by X(W).
Definition: Let S be the sample space associated with a random experiment. A real-valued function X defined on S, which assigns to each outcome W a real number X(W), is called a random variable.
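The coin-toss example above can be sketched in code. This is a minimal illustration (the function name `X` and the string encoding of outcomes are assumptions for this sketch): the random variable X assigns to each outcome the number of heads it contains.

```python
# Sample space of two coin tosses, as in the example above.
S = ["HH", "HT", "TT", "TH"]

def X(w):
    """Random variable: assign to outcome w the real number X(w) = number of heads."""
    return w.count("H")

# X maps each outcome W of S to a real number X(W):
values = {w: X(w) for w in S}
# HH -> 2, HT -> 1, TH -> 1, TT -> 0
```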
BINOMIAL DISTRIBUTION
The Binomial distribution was first discovered by James Bernoulli (1654-1705).
If n is a positive integer, p and q are constants such that p + q = 1, and
P(X = x) = nCx p^x q^(n-x),  x = 0, 1, 2, …, n,
then X is said to follow a Binomial distribution (or Bernoulli distribution).
Here x = No. of successes (favourable events)
n = No. of trials
p = Probability of success
q = Probability of failure
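The probability formula above can be evaluated directly. A minimal sketch (the function name `binomial_pmf` is an assumption), using the two-coin-toss experiment as a check: with n = 2 and p = 0.5, P(X = 1) = 2C1 (0.5)(0.5) = 0.5.

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) = nCx * p^x * q^(n-x), where q = 1 - p."""
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

# Two fair coin tosses: n = 2, p = 0.5
# P(X = 1) = 0.5, and the probabilities over x = 0, 1, 2 sum to 1.
```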
Characteristics and Properties
» The number of trials n must be finite and the trials must be independent.
» The probability of success p remains constant from trial to trial.
* Mean of the Binomial distribution: np
* Variance of the Binomial distribution: npq
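The mean and variance formulas can be verified numerically by summing over the distribution. A short sketch (parameter values n = 10, p = 0.3 are an arbitrary choice for illustration): the computed mean matches np = 3.0 and the variance matches npq = 2.1.

```python
from math import comb

def binomial_pmf(x, n, p):
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

n, p = 10, 0.3
q = 1 - p

# Mean = sum of x * P(X = x); Variance = sum of (x - mean)^2 * P(X = x)
mean = sum(x * binomial_pmf(x, n, p) for x in range(n + 1))
var = sum((x - mean) ** 2 * binomial_pmf(x, n, p) for x in range(n + 1))
# mean = np = 3.0,  var = npq = 2.1 (up to floating-point error)
```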
Poisson Distribution
* In a Binomial distribution we deal with a sample of definite size (n is precisely known). But there are situations where this is not possible (n is either very large or unpredictable). The basic reason is that the event is rare and casual.
* Also we can say that the successful events in the total event space are few. e.g. the accidents in a factory, the goals scored in a hockey match.
* In such cases we know the number of times an event occurs but not the number of times it does not occur.
* To all such cases the Binomial distribution is inapplicable.
* The Poisson distribution is very suitable in such cases. It was derived in 1837 by the French mathematician S.D. Poisson (1781-1840).
* If P(X = x) = (e^(-λ) λ^x) / x!,  x = 0, 1, 2, …, with λ > 0, then X is said to follow a Poisson distribution with parameter λ.
* Mean and variance of a Poisson distribution: λ.