Why would you want to have a generative model?
A simplified yet accurate description of the data: “the data follows this distribution”
The derived model can be used to test hypotheses
Potential for Normalization
Generation of synthetic data
Briefly explain MLE (maximum likelihood estimation)
Allows us (using the observed samples X) to estimate the parameters of the probability distribution that “generated” the sample
The distribution has to be decided upon beforehand
The likelihood function is a function of the parameters only
=> The log-likelihood is often much easier to work with (it is maximized at the same parameter values)
The likelihood function L(θ;X) is the joint probability (or density) of the observed data, viewed as a function of the parameters θ
For independent and identically distributed (i.i.d.) data, the likelihood function is the product of the individual probabilities (or densities); see the sketch below
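A minimal MLE sketch in Python, assuming a Poisson model purely for illustration (the true rate 3.5 and the sample size are made-up values, not from the cards): the log-likelihood of i.i.d. samples is the sum of the individual log-probabilities, and it is maximized numerically.

```python
# Minimal MLE sketch (illustrative Poisson example, assumed parameters).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

rng = np.random.default_rng(0)
x = rng.poisson(lam=3.5, size=500)          # "observed" samples, true lambda = 3.5

def neg_log_likelihood(lam):
    # i.i.d. data: the log-likelihood is the SUM of the log pmf values
    # (the log of the product of the individual probabilities)
    return -poisson.logpmf(x, mu=lam).sum()

fit = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 50), method="bounded")
print("numerical MLE of lambda:", fit.x)
print("closed-form MLE        :", x.mean())  # for the Poisson, the MLE is the sample mean
```

For the Poisson the maximizer also has a closed form (the sample mean), which the numerical result should reproduce; for many other distributions only the numerical route is available.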
What's one of the main issues with 16S sequencing data?
A ton of zeros (zero inflation)
What are two-component mixture models?
A model that is a mixture of 2 distributions
e.g: Zero inflated distributions
e.g: Sampling from 2 normal distributions
2-step approach: 1. decide from which normal distribution to sample, 2. sample from the selected distribution
e.g.: the density of all observed values is then the (weighted) sum of the 2 component densities (see the sketch below)
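A minimal sketch of the two-step sampling scheme and the resulting mixture density, assuming a mixture of two normals; the mixture proportion and the component parameters are made-up values for illustration.

```python
# Sketch of a two-component normal mixture (illustrative parameters).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 10_000
pi = 0.3                                    # mixture proportion of component 1
mu, sd = (0.0, 4.0), (1.0, 1.5)             # component means and standard deviations

# Step 1: decide for each draw which component it comes from
component = rng.random(n) < pi
# Step 2: sample from the selected normal distribution
samples = np.empty(n)
samples[component] = rng.normal(mu[0], sd[0], component.sum())
samples[~component] = rng.normal(mu[1], sd[1], (~component).sum())

# The mixture density is the weighted sum of the two component densities
def mixture_density(y):
    return pi * norm.pdf(y, mu[0], sd[0]) + (1 - pi) * norm.pdf(y, mu[1], sd[1])

print(mixture_density(np.array([0.0, 4.0])))
```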
What makes fitting mixture models especially hard?
This is again an optimization problem, but with extra parameters:
the number k of components (the number of distributions [or the same distribution with different parameters])
the mixture proportions
The EM algorithm is used for this (see the sketch below)
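A minimal sketch of fitting a mixture with the EM algorithm, here using scikit-learn's GaussianMixture on simulated data; the choice k = 2 and the simulated component parameters are assumptions for illustration, not from the cards.

```python
# Sketch: fitting a two-component normal mixture via EM (scikit-learn).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# simulated data: 30% from N(0, 1), 70% from N(4, 1.5)
data = np.concatenate([rng.normal(0, 1, 3_000), rng.normal(4, 1.5, 7_000)])

gm = GaussianMixture(n_components=2, random_state=0).fit(data.reshape(-1, 1))
print("estimated means      :", gm.means_.ravel())
print("estimated proportions:", gm.weights_)   # the mixture proportions
```

Note that k itself is not estimated by EM here; it has to be chosen (e.g. by model selection criteria), which is part of what makes fitting mixtures hard.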
What are infinite mixtures?
Mixtures that have as many components as there are samples (potentially infinitely many)
e.g.: each value in a range of 1-1000 has its own distribution
e.g.: a Laplace distribution can be generated by sampling from 1000 normal distributions with different (randomly drawn) parameters (see the sketch below)
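A minimal sketch of this idea: each observation gets its own normal distribution whose variance is itself drawn at random. With exponentially distributed variances this construction yields a Laplace distribution; the rate, scale, and sample size below are illustrative assumptions.

```python
# Sketch of an infinite mixture: normals with random (exponential) variances.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
v = rng.exponential(scale=1.0, size=n)            # one variance draw per observation
x = rng.normal(loc=0.0, scale=np.sqrt(2.0 * v))   # each draw from its own N(0, 2*v)

# x is (approximately) Laplace(0, 1); e.g. its mean absolute value should be close to 1
print(np.mean(np.abs(x)))
```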
What's a Gamma-Poisson mixture?
Equivalent to the negative binomial distribution
A distribution generated by a 2-step sampling approach:
Sample lambda values from a Gamma distribution
Sample from Poisson distributions that have the respective previously sampled lambda values (see the sketch below)
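A minimal sketch of the two-step Gamma-Poisson sampling and its negative binomial equivalent; the Gamma shape and scale values are illustrative assumptions.

```python
# Sketch: Gamma-Poisson mixture vs. negative binomial (illustrative parameters).
import numpy as np
from scipy.stats import nbinom

rng = np.random.default_rng(4)
shape, scale = 2.0, 3.0                             # Gamma parameters (assumed)

# Step 1: sample one lambda per observation from a Gamma distribution
lam = rng.gamma(shape=shape, scale=scale, size=100_000)
# Step 2: sample each count from a Poisson with its own lambda
counts = rng.poisson(lam)

# The marginal distribution is negative binomial with n = shape, p = 1 / (1 + scale)
nb = nbinom(n=shape, p=1.0 / (1.0 + scale))
print("simulated mean / variance    :", counts.mean(), counts.var())
print("neg. binomial mean / variance:", nb.mean(), nb.var())
```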
What's the Dirichlet distribution?
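A minimal sampling sketch: the Dirichlet distribution produces probability vectors (non-negative entries that sum to 1) and generalizes the Beta distribution to more than two categories; the concentration parameters alpha below are illustrative assumptions.

```python
# Sketch: drawing probability vectors from a Dirichlet distribution.
import numpy as np

rng = np.random.default_rng(5)
alpha = [2.0, 5.0, 3.0]                 # one concentration parameter per component (assumed)
p = rng.dirichlet(alpha, size=4)        # each row is a probability vector

print(p)
print(p.sum(axis=1))                    # rows sum to 1 (points on the simplex)
```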