GaussianMixtureModel-ExpectationMaximization Algorithm

This repository implements the Expectation-Maximization (EM) algorithm to train a Gaussian Mixture Model (GMM).

For the algorithms used and experimental results:

Check File: FinalReport3.doc

Gaussian Mixture Model:

A Gaussian Mixture Model (GMM) is a parametric probability density function represented as a weighted sum of Gaussian component densities. GMMs are commonly used as a parametric model of the probability distribution of continuous measurements or features in a biometric system, such as vocal-tract related spectral features in a speaker recognition system. GMM parameters are estimated from training data using the iterative Expectation-Maximization (EM) algorithm or Maximum A Posteriori (MAP) estimation from a well-trained prior model.
For a good summary of Gaussian Mixture Models:
Check File: FinalReport3.doc
Visit Link: http://en.wikipedia.org/wiki/Mixture_model
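
For concreteness, here is a minimal sketch (not the repository's code) of evaluating a GMM density as a weighted sum of Gaussian component densities; the 3-component parameters below are hypothetical:

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Multivariate normal density N(x | mu, sigma) at a d-dimensional point x."""
    d = len(mu)
    diff = x - mu
    inv = np.linalg.inv(sigma)
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(sigma))
    return np.exp(-0.5 * diff @ inv @ diff) / norm

def gmm_density(x, pis, mus, sigmas):
    """GMM density: p(x) = sum_k pi_k * N(x | mu_k, Sigma_k)."""
    return sum(pi * gaussian_pdf(x, mu, s) for pi, mu, s in zip(pis, mus, sigmas))

# Hypothetical 3-component mixture in 2-D, mirroring the 3-group setup used here.
pis = [0.5, 0.3, 0.2]
mus = [np.array([0.0, 0.0]), np.array([3.0, 3.0]), np.array([-3.0, 2.0])]
sigmas = [np.eye(2), 0.5 * np.eye(2), np.eye(2)]
print(gmm_density(np.array([1.0, 1.0]), pis, mus, sigmas))
```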

Expectation Maximization:

An expectation–maximization (EM) algorithm is an iterative method for finding maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models where the model depends on unobserved latent variables. EM is the most popular technique used to determine the parameters of a mixture with an a priori given number of components, and is a particular way of implementing maximum likelihood estimation for this problem. EM is of particular appeal for finite normal mixtures, where closed-form expressions are possible. The EM iteration alternates between an expectation (E) step, which constructs the expectation of the log-likelihood evaluated using the current parameter estimates, and a maximization (M) step, which computes parameters maximizing the expected log-likelihood found in the E step. These parameter estimates are then used to determine the distribution of the latent variables in the next E step.
In state-space models, filtering and smoothing EM algorithms arise by repeating the following two-step procedure:
E-Step
    Operate a minimum-variance smoother designed with the current parameter estimates to obtain updated state estimates.
M-Step
    Use the filtered or smoothed state estimates within maximum-likelihood calculations to obtain updated parameter estimates.
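
Below is a minimal sketch of the E-step/M-step alternation for a GMM, assuming NumPy; it is illustrative only and not the repository's exact implementation. The loop stops when the log-likelihood gain falls below a threshold, which is the role the varying thresholds play in the outputs described later.

```python
import numpy as np

def em_gmm(X, k=3, tol=1e-4, max_iter=200, seed=0):
    """EM for a k-component GMM on data X of shape (n, d): a sketch, not the repo's code."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pis = np.full(k, 1.0 / k)                      # equal mixing weights to start
    mus = X[rng.choice(n, size=k, replace=False)]  # random data points as initial means
    sigmas = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(k)])
    prev_ll = -np.inf
    for _ in range(max_iter):
        # E-step: responsibilities r[i, j] proportional to pi_j * N(x_i | mu_j, Sigma_j)
        dens = np.empty((n, k))
        for j in range(k):
            diff = X - mus[j]
            inv = np.linalg.inv(sigmas[j])
            norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(sigmas[j]))
            dens[:, j] = pis[j] * np.exp(-0.5 * np.sum(diff @ inv * diff, axis=1)) / norm
        totals = dens.sum(axis=1, keepdims=True)   # p(x_i) under the current mixture
        r = dens / totals
        # M-step: closed-form updates maximizing the expected log-likelihood
        nk = r.sum(axis=0)
        pis = nk / n
        mus = (r.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - mus[j]
            sigmas[j] = (r[:, j, None] * diff).T @ diff / nk[j] + 1e-6 * np.eye(d)
        ll = float(np.log(totals).sum())           # data log-likelihood
        if ll - prev_ll < tol:                     # converged at this threshold
            break
        prev_ll = ll
    return pis, mus, sigmas
```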
For more information on the EM algorithm:
Visit Link: http://en.wikipedia.org/wiki/Expectation–maximization_algorithm

Input file containing data points:

'samples_add.text'

Output:

The GMM parameters Pi (mixing weight), Mu (mean), and Sigma (covariance) are output for each group of a 3-group mixture at varying convergence thresholds.
Sample output images when the input file is samples_add.text:
     Output_BestSetOfParametersAtThreshold_0.0001.png
     Output_BestSetOfParametersAtThreshold_0.001.png
     Output_BestSetOfParametersAtThreshold_0.01.png
     Output_BestSetOfParametersAtThreshold_0.1.png
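
For illustration only, a hypothetical driver could reproduce a run at each of these four thresholds, reusing the em_gmm sketch above and assuming samples_add.text contains whitespace-separated numeric columns readable by np.loadtxt:

```python
import numpy as np

# Assumes samples_add.text holds whitespace-separated numeric columns.
X = np.loadtxt("samples_add.text")
for tol in (0.1, 0.01, 0.001, 0.0001):
    pis, mus, sigmas = em_gmm(X, k=3, tol=tol)  # em_gmm from the sketch above
    print(f"threshold={tol}: Pi={pis}\nMu={mus}")
```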
