The Gaussian distribution is the most common distribution encountered in the wild. This is because the Gaussian is the distribution that
all others tend towards, thanks to the central limit theorem: if you convolve any probability distribution with itself (or with other
distributions of finite variance) often enough, the result approaches a Gaussian.
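
This convergence under repeated convolution can be sketched numerically. The snippet below (all names are my own choices, not from the text) convolves a uniform density on \([0, 1]\) with itself a few times; the result is already visibly bell-shaped and symmetric about its center.

```python
import numpy as np

# Sketch: repeated convolution of a uniform density with itself,
# illustrating the central limit theorem numerically.
x = np.linspace(0.0, 1.0, 201)
dx = x[1] - x[0]
density = np.ones_like(x)  # uniform density on [0, 1], value 1 everywhere

result = density
for _ in range(5):
    # np.convolve gives the discrete convolution; the dx factor keeps
    # the result approximately normalized as a density
    result = np.convolve(result, density) * dx

# After several convolutions the result should still integrate to ~1
# and peak at the center of its support, like a Gaussian would.
area = result.sum() * dx
```

Plotting `result` against an equally spaced grid shows the familiar bell shape after only a handful of convolutions.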

The Gaussian distribution is defined by: $$ f(x, \mu, \sigma) = \frac{1}{\sqrt{2\pi}\sigma} \cdot e^{-\frac{(x-\mu)^2}{2\sigma^2}}$$ where \(\mu\)
and \(\sigma\) are parameters that define the position and width respectively.
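
A minimal sketch of this density in code (the helper name `gaussian_pdf` is my own, not from the text):

```python
import math

# Direct transcription of the Gaussian density f(x, mu, sigma).
def gaussian_pdf(x, mu, sigma):
    norm = 1.0 / (math.sqrt(2.0 * math.pi) * sigma)
    return norm * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

# At x = mu the exponent vanishes, leaving only the normalization
# factor 1/(sqrt(2*pi)*sigma).
peak = gaussian_pdf(0.0, 0.0, 1.0)  # 1/sqrt(2*pi), about 0.3989
```

Note the minus sign in the exponent: without it the function would grow without bound instead of falling off away from \(\mu\).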

The single-dimensional Gaussian can be generalized to multiple dimensions. To do this, \( x \) and \( \mu \) are interpreted as
vectors, while the width \( \sigma \) is replaced by the covariance matrix \( V \). So that the square of \( \vec{x} - \vec{\mu} \) can be calculated, the inverse covariance matrix \( V^{-1} \) must be inserted in the exponent.
The full form looks like this:

$$ f(\vec{x}, \vec{\mu}, V) = \frac{1}{\sqrt{(2\pi)^k |V|}} \, e^{-\frac{1}{2} (\vec{x} - \vec{\mu})^T V^{-1} (\vec{x} - \vec{\mu})} $$

where \( k \) is the number of dimensions and \( |V| \) is the determinant of the covariance matrix.
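
Assuming the standard multi-dimensional Gaussian with mean vector \( \vec{\mu} \) and covariance matrix \( V \), a minimal sketch looks like this (the helper name `mv_gaussian_pdf` is mine):

```python
import numpy as np

# Multi-dimensional Gaussian density with covariance matrix V.
def mv_gaussian_pdf(x, mu, V):
    x = np.asarray(x, dtype=float)
    mu = np.asarray(mu, dtype=float)
    V = np.asarray(V, dtype=float)
    k = len(mu)
    diff = x - mu
    # normalization uses the determinant of V; the exponent uses its inverse
    norm = 1.0 / np.sqrt((2.0 * np.pi) ** k * np.linalg.det(V))
    exponent = -0.5 * diff @ np.linalg.inv(V) @ diff
    return norm * np.exp(exponent)

# With an identity covariance, the density at the mean is 1/(2*pi)^(k/2);
# in two dimensions that is simply 1/(2*pi).
val = mv_gaussian_pdf([0.0, 0.0], [0.0, 0.0], np.eye(2))
```

With a diagonal covariance the density factorizes into a product of one-dimensional Gaussians, which is a quick sanity check on any implementation.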

The Poisson distribution describes the probability of \( n \) events occurring in a given interval \( A \) when \( \nu \) are to be expected.
The Poisson distribution is discrete, as the number of events that are observed can only be an integer. The expectation value \( \nu \), on the
other hand, can be continuous. As the standard deviation is a function of the expected number of events \( \nu \) and \( \nu \) is the mean, the
Poisson distribution has only one parameter, namely \( \nu \). The standard deviation in terms of \( \nu \) is defined as: \( \sigma =
\sqrt{\nu} \).

The Poisson distribution is often used in nuclear and particle physics. Its most common use is to give the uncertainty for histogram
bins, as the number of events in a histogram bin is exactly the answer to the question of how many events are observed within the bin boundaries. The
error for a histogram bin can therefore be easily calculated as \( \sqrt{N} \), where \( N \) is the number of entries in the bin.
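
As a sketch of this use case (the sample data below is made up purely for illustration):

```python
import numpy as np

# Fill a histogram with some toy data and assign each bin a Poisson
# (sqrt(N)) uncertainty, as described above.
rng = np.random.default_rng(42)
data = rng.normal(loc=0.0, scale=1.0, size=1000)  # arbitrary toy sample

counts, edges = np.histogram(data, bins=20, range=(-4.0, 4.0))
# sigma = sqrt(nu), with nu estimated by the observed bin count
errors = np.sqrt(counts)
```

These `errors` are what would typically be drawn as vertical error bars on each bin.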

The Poisson distribution is defined as:

$$ f(n, \nu) = \frac{\nu^n}{n!} e^{-\nu} $$
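
A direct transcription of this formula in code (the helper name `poisson_pmf` is my own) also lets us check the claim \( \sigma = \sqrt{\nu} \) numerically:

```python
import math

# Poisson probability f(n, nu) = nu^n / n! * exp(-nu).
def poisson_pmf(n, nu):
    return nu ** n / math.factorial(n) * math.exp(-nu)

nu = 4.0
# Summing over a generous range of n captures essentially all probability.
probs = [poisson_pmf(n, nu) for n in range(100)]
total = sum(probs)                                        # normalization, ~1
mean = sum(n * p for n, p in zip(range(100), probs))      # should be nu
var = sum((n - mean) ** 2 * p for n, p in zip(range(100), probs))
# sqrt(var) should reproduce sigma = sqrt(nu) = 2 for nu = 4.
```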

For large \( \nu \) the Poisson distribution tends towards the Gaussian distribution, following the central limit theorem.
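
This convergence can be sketched by comparing a large-\( \nu \) Poisson with the Gaussian that has \( \mu = \nu \) and \( \sigma = \sqrt{\nu} \) (all helper names below are my own; `lgamma` is used only to avoid overflow in \( n! \) for large \( n \)):

```python
import math

# Poisson probability evaluated in log space to keep n! from overflowing.
def poisson_pmf(n, nu):
    return math.exp(n * math.log(nu) - math.lgamma(n + 1) - nu)

def gaussian_pdf(x, mu, sigma):
    norm = 1.0 / (math.sqrt(2.0 * math.pi) * sigma)
    return norm * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

nu = 10000.0
sigma = math.sqrt(nu)
# Largest absolute difference between the two within a few sigma of the peak:
max_diff = max(
    abs(poisson_pmf(n, nu) - gaussian_pdf(n, nu, sigma))
    for n in range(int(nu - 5 * sigma), int(nu + 5 * sigma))
)
```

At \( \nu = 10000 \) the two curves agree to well below the peak height of the distribution, which is the central limit theorem at work.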

The binomial distribution is the last of the three "basic distributions". I say basic here because together with the Poisson and the Gaussian
distribution the binomial distribution appears very often in very many places. The binomial distribution describes how probable it is to get exactly
\(k\) events of probability \(p\) in \(n\) tries if there are only two outcomes (either something happens or it does not happen). The binomial distribution
is written as:

$$ f(k, n, p) = \binom{n}{k} p^k (1-p)^{n-k} $$

Where \( \binom{n}{k} \) is the binomial coefficient which is defined as

$$ \binom{n}{k} = \frac{n!}{k!(n-k)!} $$
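
A minimal sketch of the binomial probability (the helper name `binomial_pmf` is mine; Python's `math.comb` computes the binomial coefficient \( n!/(k!(n-k)!) \) directly):

```python
import math

# Binomial probability f(k, n, p) = C(n, k) * p^k * (1-p)^(n-k).
def binomial_pmf(k, n, p):
    return math.comb(n, k) * p ** k * (1.0 - p) ** (n - k)

# Sanity check: the probabilities over all k = 0..n must sum to one.
total = sum(binomial_pmf(k, 10, 0.3) for k in range(11))
```

For example, `binomial_pmf(2, 5, 0.5)` gives the probability of exactly two heads in five fair coin flips.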

It should be noted that \( k \) can't get larger than \( n \), as one observation can never generate more than one result (one coin flip will never
generate more than one heads or tails). As \( n \) is the number of tries and \( k \) the number of results, both \( n \) and \( k
\) must be non-negative integers to make sense.