STAT 305 – Introduction to Statistical Inference was a pretty difficult course in my opinion. It was very theory-based, with few concrete examples. My favorite unit was probably maximum likelihood estimation. I felt as if I could follow the same game-plan for most questions:

1) Find the likelihood function by taking the product of the n probability density functions (assuming the observations are independent)

2) Take its log to get the log-likelihood, which is easier to work with

3) Take the first derivative of the log-likelihood, set it equal to zero, and solve for the parameter of interest to find the MLE

4) Take the second derivative of the log-likelihood; if it is negative at the MLE, that confirms you have a maximum rather than a minimum

5) The Fisher information is −E(second derivative of the log-likelihood)

6) The (asymptotic) variance estimate of the MLE is just 1/(Fisher information)
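The six steps above can be sketched numerically. As a worked example (the course review doesn't single out a distribution, so the Exponential with rate λ is my own choice, and names like `loglik` and `lam_hat` are made up for illustration):

```python
import numpy as np

# Simulated Exponential(rate lam = 2) sample; numpy's `scale` is 1/rate.
rng = np.random.default_rng(0)
y = rng.exponential(scale=1 / 2.0, size=10_000)
n = y.size

# Steps 1-2: likelihood is prod lam * exp(-lam * y_i), so the
# log-likelihood is l(lam) = n*log(lam) - lam * sum(y).
def loglik(lam):
    return n * np.log(lam) - lam * y.sum()

# Step 3: dl/dlam = n/lam - sum(y) = 0  =>  lam_hat = n / sum(y)
lam_hat = n / y.sum()

# Step 4: d2l/dlam2 = -n/lam^2 < 0 for all lam > 0, so this is a maximum.
# Step 5: Fisher information I(lam) = -E[d2l/dlam2] = n / lam^2,
# evaluated at the MLE:
fisher = n / lam_hat**2

# Step 6: variance estimate of lam_hat is 1 / I(lam_hat)
var_hat = 1 / fisher

print(lam_hat, var_hat)
```

With 10,000 draws the closed-form MLE lands close to the true rate of 2, and you can sanity-check step 4 by verifying the log-likelihood really is lower at nearby parameter values.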

Some topics or concepts covered in this course (off the top of my head):

- Moment-generating functions. The first derivative evaluated at t = 0 gives E(Y), the mean, while the second derivative at t = 0 gives E(Y²). Var(Y) = E(Y²) − E(Y)²
- Likelihood functions
- Maximum likelihood estimators (MLE’s)
- Bayesian prior/posterior
- Hessian matrix
- Fisher information
- Wilks's and Pearson's statistics
- Paired comparisons/comparing 2 multinomial distributions
- Hypothesis testing using the Neyman-Pearson lemma. Significance level, power, and p-value.
- Pooled samples
- Categorical data with free parameters
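The MGF fact in the list above (derivatives at t = 0 give moments) can be checked numerically. This is a rough sketch of my own, not from the course: it estimates the MGF from a sample and takes finite differences at t = 0, again using an Exponential example of my choosing:

```python
import numpy as np

# Sample from Exponential with rate 2, so E[Y] = 0.5 and E[Y^2] = 2/4 = 0.5.
rng = np.random.default_rng(1)
y = rng.exponential(scale=0.5, size=200_000)

def mgf(t):
    # Empirical moment-generating function M(t) = E[e^{tY}]
    return np.mean(np.exp(t * y))

h = 1e-3
m1 = (mgf(h) - mgf(-h)) / (2 * h)            # central difference ~ M'(0)  = E[Y]
m2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2  # second difference  ~ M''(0) = E[Y^2]
var = m2 - m1**2                             # Var(Y) = E[Y^2] - E[Y]^2

print(m1, m2, var)
```

The estimates should come out near 0.5, 0.5, and 0.25 respectively, matching the theoretical moments of this distribution.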