The Gaussian distribution is the most common distribution encountered in the wild. The reason is the central limit theorem: the sum of many independent random variables tends towards a Gaussian, regardless of their individual distributions (as long as they have finite variance). In terms of densities, this means that if you convolve a probability distribution with itself often enough, the result approaches a Gaussian, as the sketch below illustrates.
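A minimal sketch of this behaviour, assuming NumPy; the grid spacing and the choice of a uniform starting density are assumptions made for the example, not part of the text:

```python
import numpy as np

# Sketch: repeatedly convolving a uniform density with itself yields a shape
# that approaches a Gaussian, as the central limit theorem predicts.
dx = 0.01
x = np.arange(0.0, 1.0, dx)
density = np.ones_like(x)
density /= density.sum() * dx        # normalize so the density integrates to 1

convolved = density
for _ in range(10):                  # ~ density of a sum of 11 uniform variables
    convolved = np.convolve(convolved, density) * dx

print("integral of result ≈", convolved.sum() * dx)   # stays ≈ 1, shape becomes bell-like
```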
The Gaussian distribution is defined by: $$ f(x, \mu, \sigma) = \frac{1}{\sqrt{2\pi}\sigma} \cdot e^{-\frac{(x-\mu)^2}{2\sigma^2}}$$ where the mean \(\mu\) and the standard deviation \(\sigma\) are parameters that set the position and width respectively.
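As a quick sanity check, here is a minimal Python sketch of this density; the function name `gaussian_pdf` and the comparison against SciPy are illustrative assumptions, not part of the text:

```python
import numpy as np
from scipy.stats import norm

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """One-dimensional Gaussian density, following the formula above."""
    prefactor = 1.0 / (np.sqrt(2.0 * np.pi) * sigma)
    return prefactor * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

# Agrees with SciPy's reference implementation.
x = np.linspace(-3.0, 3.0, 7)
assert np.allclose(gaussian_pdf(x, mu=0.5, sigma=2.0),
                   norm.pdf(x, loc=0.5, scale=2.0))
```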
The one-dimensional Gaussian can be generalized to multiple dimensions. To do this, \( \vec{x} \) and \( \vec{\mu} \) become vectors, and the width is described by the covariance matrix \( \mathbf{V} \). In place of the squared term \( (x - \mu)^2 \), the exponent contains the quadratic form \( (\vec{x}-\vec{\mu})^T \mathbf{V}^{-1} (\vec{x}-\vec{\mu}) \), built with the inverse covariance matrix \( \mathbf{V}^{-1} \). The full form looks like this:
$$ f(\vec{x}, \vec{\mu}, \mathbf{V}) = \frac{1}{(2\pi)^{N/2}\left|\mathbf{V}\right|^{1/2}} \cdot e^{-\frac{1}{2}(\vec{x}-\vec{\mu})^T\mathbf{V}^{-1}(\vec{x} -\vec{\mu})} $$
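A minimal sketch of the multivariate density, assuming NumPy and SciPy; the function name `multivariate_gaussian_pdf` and the two-dimensional test values are illustrative, not from the text:

```python
import numpy as np
from scipy.stats import multivariate_normal

def multivariate_gaussian_pdf(x, mu, V):
    """N-dimensional Gaussian density with mean vector mu and covariance matrix V."""
    x, mu = np.asarray(x, dtype=float), np.asarray(mu, dtype=float)
    N = mu.size
    diff = x - mu
    prefactor = 1.0 / ((2.0 * np.pi) ** (N / 2) * np.sqrt(np.linalg.det(V)))
    return prefactor * np.exp(-0.5 * diff @ np.linalg.inv(V) @ diff)

# Cross-check against SciPy for a two-dimensional example.
mu = np.array([1.0, -2.0])
V = np.array([[2.0, 0.3],
              [0.3, 1.0]])
x = np.array([0.5, -1.5])
assert np.isclose(multivariate_gaussian_pdf(x, mu, V),
                  multivariate_normal(mean=mu, cov=V).pdf(x))
```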