Gaussian Mixture Models


Thanks!


2 Answers

Anonymous 0 Comments

It’s possible to approximate almost any signal with a mixture of sine waves; this is the idea behind the Fourier Transform. The same can be done with a mixture of Gaussian functions, which is called a Gaussian Mixture Model (GMM). The FT is usually the more efficient choice for signals that repeat with a specific period, whereas a GMM tends to be most efficient for statistical distributions. Both of these approximations are much more compact and faster to evaluate than something like a least-squares polynomial fit.
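To make the "mixture of Gaussian functions" idea concrete, here's a minimal Python sketch of evaluating a GMM density as a weighted sum of Gaussian bumps. The weights, means, and standard deviations here are made-up illustrative values, not anything from the answer above; in practice they would be fitted to data.

```python
import numpy as np
from scipy.stats import norm

# Made-up mixture parameters for illustration.
weights = [0.2, 0.5, 0.3]   # mixing weights, must sum to 1
means   = [-3.0, 0.0, 4.0]  # center of each Gaussian bump
stds    = [1.0, 0.7, 1.5]   # width of each Gaussian bump

x = np.linspace(-8, 10, 500)

# The GMM density is just a weighted sum of Gaussian pdfs.
density = sum(w * norm.pdf(x, loc=m, scale=s)
              for w, m, s in zip(weights, means, stds))

# `density` now traces out whatever multi-peaked shape the
# chosen (or fitted) weights/means/stds represent.
```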

Anonymous 0 Comments

Gaussian is, for our purposes, a fancy way of referring to the Normal distribution. A mixture model is one that combines two or more distributions. Suppose you have two Gaussian/Normal distributions, N(0,1) and N(10,1). Every time you draw a number, you first determine which of those distributions you’re drawing from: maybe you have a 30% chance of drawing from the N(0,1) distribution and a 70% chance of drawing from the N(10,1). Then you draw a number from whichever distribution was chosen. The resulting random number can be said to come from a Gaussian mixture model. If you plotted the distribution, you would see a peak at both 0 and 10. Mixture models in general are a good way to model data with this kind of multi-peak structure.
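Here's a short Python sketch of that two-step sampling process for the 30%/70% mixture of N(0,1) and N(10,1) described above. The sample size and random seed are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Step 1: pick which component each draw comes from
# (30% chance of component 0, 70% chance of component 1).
component = rng.choice([0, 1], size=n, p=[0.3, 0.7])

# Step 2: draw from the chosen Gaussian: N(0, 1) or N(10, 1).
means = np.array([0.0, 10.0])
stds  = np.array([1.0, 1.0])
samples = rng.normal(means[component], stds[component])

# A histogram of `samples` shows two peaks, one near 0 and one near 10.
```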