Bernoulli distribution


In probability theory and statistics, the Bernoulli distribution, named after the Swiss scientist Jacob Bernoulli, is the probability distribution of a random variable which takes the value 1 with success probability p and the value 0 with failure probability q = 1 − p. It can be used, for example, to represent the toss of a coin, where "1" is defined to mean "heads" and "0" is defined to mean "tails" (or vice versa).


If X is a random variable with this distribution, we have:

    Pr(X = 1) = p = 1 − Pr(X = 0) = 1 − q.

A classical example of a Bernoulli experiment is a single toss of a coin. The coin might come up heads with probability p and tails with probability q = 1 − p. The experiment is called fair if p = 1/2, indicating the origin of the terminology in betting (the bet is fair if both possible outcomes have the same probability).

The probability mass function f of this distribution, over possible outcomes k, is

    f(k; p) = p          if k = 1,
    f(k; p) = 1 − p      if k = 0.

This can also be expressed as

    f(k; p) = p^k (1 − p)^(1 − k)    for k ∈ {0, 1}.
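The closed-form expression can be sketched in Python; the function name `bernoulli_pmf` is illustrative, not from any particular library:

```python
def bernoulli_pmf(k, p):
    """Probability mass function p**k * (1 - p)**(1 - k) for k in {0, 1}."""
    if k not in (0, 1):
        return 0.0  # the distribution puts no mass outside {0, 1}
    return p ** k * (1 - p) ** (1 - k)

# For p = 0.3: success has probability 0.3, failure 0.7.
print(bernoulli_pmf(1, 0.3))
print(bernoulli_pmf(0, 0.3))
```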

The expected value of a Bernoulli random variable X is E[X] = p, and its variance is

    Var[X] = p(1 − p) = pq.
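A quick standard-library simulation illustrates both moments (the sample size and seed are arbitrary choices):

```python
import random

random.seed(0)
p, n = 0.3, 100_000

# Draw n Bernoulli(p) samples: 1 with probability p, else 0.
sample = [1 if random.random() < p else 0 for _ in range(n)]

mean = sum(sample) / n
var = sum((x - mean) ** 2 for x in sample) / n

print(mean)  # close to p = 0.3
print(var)   # close to p * (1 - p) = 0.21
```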

The Bernoulli distribution is a special case of the binomial distribution with n = 1.

The kurtosis goes to infinity for high and low values of p, but for p = 1/2 the Bernoulli distribution has a lower excess kurtosis than any other probability distribution, namely −2.
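The excess kurtosis of a Bernoulli(p) variable is (1 − 6pq)/(pq) with q = 1 − p, which makes both claims easy to check numerically:

```python
def excess_kurtosis(p):
    """Excess kurtosis (1 - 6*p*q) / (p*q) of Bernoulli(p), where q = 1 - p."""
    q = 1 - p
    return (1 - 6 * p * q) / (p * q)

print(excess_kurtosis(0.5))    # -2.0: the minimum over all distributions
print(excess_kurtosis(0.001))  # very large: the kurtosis blows up as p -> 0 or 1
```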

The Bernoulli distributions for 0 ≤ p ≤ 1 form an exponential family.
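One way to see the exponential-family structure: the pmf can be rewritten as (1 − p)·exp(k·θ) with natural parameter θ = log(p/(1 − p)), the log-odds of success. A small numerical check that the two forms agree:

```python
import math

def pmf_direct(k, p):
    return p ** k * (1 - p) ** (1 - k)

def pmf_exp_family(k, p):
    # Natural parameter of the Bernoulli family: the log-odds of success.
    theta = math.log(p / (1 - p))
    return (1 - p) * math.exp(k * theta)

p = 0.3
for k in (0, 1):
    print(k, pmf_direct(k, p), pmf_exp_family(k, p))  # the two forms agree
```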

The maximum likelihood estimator of p based on a random sample is the sample mean.
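A sketch of the estimator: simulate a sample with a hidden success probability, then recover it as the sample mean (the seed and sample size are arbitrary):

```python
import random

random.seed(1)
true_p = 0.7
sample = [1 if random.random() < true_p else 0 for _ in range(50_000)]

# The maximum likelihood estimate of p is the sample mean.
p_hat = sum(sample) / len(sample)
print(p_hat)  # close to true_p = 0.7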

Related distributions

If X_1, …, X_n are independent, identically distributed random variables, all Bernoulli distributed with success probability p, then

    X_1 + … + X_n ~ B(n, p)    (binomial distribution).
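A small simulation, using only the standard library, compares the empirical distribution of such a sum with the binomial pmf (n, p, the seed, and the trial count are arbitrary choices):

```python
import math
import random

random.seed(2)
n, p, trials = 10, 0.4, 200_000

# Tally the sum of n independent Bernoulli(p) draws, repeated many times.
counts = [0] * (n + 1)
for _ in range(trials):
    s = sum(1 if random.random() < p else 0 for _ in range(n))
    counts[s] += 1

# Compare against the Binomial(n, p) pmf: C(n, k) * p^k * (1 - p)^(n - k).
for k in range(n + 1):
    binom_pmf = math.comb(n, k) * p ** k * (1 - p) ** (n - k)
    print(k, counts[k] / trials, round(binom_pmf, 4))
```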

The Bernoulli distribution is simply B(1, p).

References

  • Johnson, N. L., Kotz, S., Kemp, A. (1993). Univariate Discrete Distributions (2nd edition). Wiley. ISBN 0-471-54897-9.
