How did we define the likelihood function and the score function?
Likelihood
L(π|y) = ∏_i f_i(π|y_i)
Score-Function
u(π|y) = d/dπ log L(π|y)
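To make these definitions concrete, here is a minimal sketch (my own Bernoulli example, not from the lecture): for n independent Bernoulli(π) observations, L(π|y) = ∏_i π^(y_i) (1-π)^(1-y_i) and u(π|y) = Σ_i y_i/π - (n - Σ_i y_i)/(1-π).

```python
import numpy as np

def log_likelihood(pi, y):
    # log L(pi|y) for i.i.d. Bernoulli(pi) observations y_i in {0, 1}
    return np.sum(y * np.log(pi) + (1 - y) * np.log(1 - pi))

def score(pi, y):
    # u(pi|y) = d/dpi log L(pi|y)
    return np.sum(y) / pi - (len(y) - np.sum(y)) / (1 - pi)

y = np.array([1, 0, 1, 1, 0, 1])   # made-up toy data
print(log_likelihood(0.5, y), score(0.5, y))
```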
Prove that the expected value of the score function is 0.
E[u(π|y)]
= ∫ u(π|y) f(π|y) dy
= ∫ f(π|y) · d/dπ log f(π|y) dy
= ∫ f(π|y) · (d/dπ f(π|y)) / f(π|y) dy
= ∫ d/dπ f(π|y) dy
= d/dπ ∫ f(π|y) dy   (interchanging differentiation and integration)
= d/dπ 1
= 0
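A quick Monte Carlo sanity check of this result, using the same assumed Bernoulli(π) example: simulate many samples, evaluate the score at the true π, and average.

```python
import numpy as np

rng = np.random.default_rng(1)
pi_true, n, reps = 0.3, 50, 20000

def score(pi, y):
    # u(pi|y) for i.i.d. Bernoulli(pi) data
    return np.sum(y) / pi - (len(y) - np.sum(y)) / (1 - pi)

scores = [score(pi_true, rng.binomial(1, pi_true, size=n)) for _ in range(reps)]
print(np.mean(scores))   # close to 0, as E[u(pi|y)] = 0
```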
How did we define the Fisher Information Matrix?
The Fisher information matrix equals the variance of the score function: I(π) = Var(u(π|y)).
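Sticking with the assumed Bernoulli example from above: its Fisher information is I(π) = n / (π(1-π)), and the empirical variance of the score over simulated samples matches it.

```python
import numpy as np

rng = np.random.default_rng(2)
pi_true, n, reps = 0.3, 50, 20000

def score(pi, y):
    return np.sum(y) / pi - (len(y) - np.sum(y)) / (1 - pi)

scores = [score(pi_true, rng.binomial(1, pi_true, size=n)) for _ in range(reps)]
print(np.var(scores))                   # empirical variance of the score
print(n / (pi_true * (1 - pi_true)))    # Fisher information n / (pi (1 - pi))
```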
How can you express the Fisher Information Matrix in terms of the log likelihood function?
What is the observed information matrix?
What is the maximum likelihood estimator?
The maximum likelihood estimator is the π that maximises the likelihood L(π|y), i.e. (for an interior maximum) the π at which the score function equals 0.
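For the assumed Bernoulli example used above, setting u(π|y) = 0 and solving gives the closed-form MLE π̂ = ȳ; a numerical maximiser of the log-likelihood reproduces this (the toy data below are made up).

```python
import numpy as np
from scipy.optimize import minimize_scalar

y = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1])   # toy data

def neg_log_likelihood(pi):
    # minimising -log L(pi|y) is the same as maximising L(pi|y)
    return -np.sum(y * np.log(pi) + (1 - y) * np.log(1 - pi))

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1 - 1e-6), method="bounded")
print(res.x, np.mean(y))   # numerical MLE vs closed form pi_hat = mean(y)
```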
If we send the number of observations towards infinity, what distribution does the maximum likelihood estimator converge to?
How does the Fisher scoring algorithm work?
Basically, we use Newton's method to find the MLE, but with the expected (Fisher) information in place of the observed information. In the end, one iteration looks like
π* = π + I^-1 u(π|y)
This can be derived from the linear approximation of u(π*|y) around π, which is u(π*|y) ≈ u(π|y) + u'(π|y)(π* - π).
Setting u(π*|y) = 0 and replacing -u'(π|y) by its expectation I (since E[-u'(π|y)] = I), we get π* = π + I^-1 u(π|y).
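A minimal sketch of this iteration applied to a logistic regression with simulated data (model, data, and starting value are my own assumptions, not from the course): here the score is u(β|y) = X'(y - μ) and the Fisher information is I(β) = X'WX with W = diag(μ_i(1-μ_i)), so each step solves I·step = u and sets β* = β + step.

```python
import numpy as np

rng = np.random.default_rng(0)

# simulated logistic-regression data (assumed example)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # design matrix: intercept + one covariate
beta_true = np.array([-0.5, 1.0])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))

beta = np.zeros(2)                                  # starting value
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))                # fitted probabilities
    u = X.T @ (y - mu)                              # score u(beta|y)
    I = X.T @ (X * (mu * (1 - mu))[:, None])        # Fisher information X' W X
    step = np.linalg.solve(I, u)                    # I^{-1} u
    beta = beta + step                              # beta* = beta + I^{-1} u(beta|y)
    if np.max(np.abs(step)) < 1e-8:                 # stop when the update is tiny
        break

print(beta)   # close to beta_true for moderately large n
```

With the canonical logit link the observed and expected information coincide, so Fisher scoring is identical to Newton's method in this particular example.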
What is the log-likelihood-ratio test?
Chapter 2
How did we define the linear predictor?
What does the design matrix in the linear predictor look like?
What columns do we add to the design matrix if we have some non-numerical variables?
What do we add to the design matrix if we want to see whether two variables interact?
What is an offset variable?
What is provably the MLE for π?
How did we define a GLM?
How did we define the exponential family?
What is the variance function?
What is the link function?
What is the canonical link function?