
STK4011 – Statistical Inference Theory

From Chapter 1: Exercises 1, 2, 3, 4, 7, 8, 9, 10, 12, 15, 16, 17, 18, 21, 23, 24, 29, 30, 31, 33, 40, 41, 43, 45, 49, 50. Gently asterisked: 12, 30, 33, 41, 45. From Chapter 2: Exercises 1, 2, 6, 7, 8, 9, 10, 11, 12, 13, 14, 18, 19, 21, 23, 29, 32, 33, 39, 41, 42, 43, 52, 54, 55. Gently asterisked: 2, 6, 10, 19, 21, 29, 32, 41, 42. From Chapter 3: Exercises 1, 2, 4, 5, 6, 7, 8, 9, 11, 12, 15, 16, 17, 18, 24, 27, 30, 31, 32, 34. Gently asterisked: 1, 6, 11, 17, 18, 24, 30, 31. From Chapter 4: Exercises 1, 2, 3, 4, 5, 7, 8, 9, 12, 16, 18, 24, 25, 26, 27, 29, 32, 34, 35, 41. Gently asterisked: 1, 2, 5, 7, 16, 18, 24, 25, 26, 32, 34, 41. From Chapter 5 (in the PartOne version from October 2024): Exercises 1, 2, 6, 14, 15, 16, 17, 18, 23, 29, 30, 31, 42, 44, 46, 51. Gently asterisked: 2, 14, 15, 17, 42, 44. From Chapter 6 (in the PartOne version from October 2024): Exercises 1, 2, 3, 4, 6, 8, 15, 16, 27, 28. From Chapter 7 (in the PartOne version from October 2024): Exercises 1, 2, 3, 6, 9, 10, 12, 14. From Chapter 8 (in the PartOne version from October 2024): Exercises 1, 2, 3, 4, 7, 8, 9, 10. From the Statistical Stories: v.6, Who wins?, dataset: denmark-norway-handballnov2022; v.7, The turn-around operation, dataset: 0-2 to 3-2; iv.1, New Haven temperatures, dataset: newhaven-data; iv.3, How special are You, dataset: sleep14; vi.1, Checking out the CLT; i.6, Mothers, babies, birthweights, factors; vii.4, Time-to-failure for machine components; iv.6, Birds on islands outside Ecuador; ...

I Short & crisp

1 Statistical models

Normal, bi- and multinomial, exponential, gamma, mixing
The normal distribution
Exercise 1.1
Normal sums
Exercise 1.2
Binomial distribution
Exercise 1.3
Trinomial probabilities
Exercise 1.4
Hazard rates and survival functions
Exercise 1.7
The exponential distribution
Exercise 1.8
The Gamma distribution
Exercise 1.9
Mixing the exponential
Exercise 1.10
Transformations, uniform, Pareto, Cauchy, Beta, Dirichlet
Transformation from \(X\) to \(Y\)
Exercise 1.12
Ordering exponentials
Exercise 1.15
The Pareto distribution
Exercise 1.16
Maxima of i.i.d. samples
Exercise 1.17
Ratios and the Cauchy
Exercise 1.18
Dirichlets inside Dirichlets
Exercise 1.21
The Dirichlet-multinomial distribution
Exercise 1.23
Laws of small numbers, Poisson, geometric, negative binomial
The Poisson distribution
Exercise 1.24
Conditioning on Poisson sums and a generalised binomial
Exercise 1.29
Moments, moment-generating functions, characteristic functions
Moments
Exercise 1.30
Moment-generating functions
Exercise 1.31
The Laplace distribution
Exercise 1.33
Sums of random lengths and the compound Poisson
Exercise 1.40
The logarithmic distribution
Exercise 1.41
The multinormal, the t, the chi-squared, the F
The multinormal distribution
Exercise 1.43
How tall is Nils?
Exercise 1.45
The t distribution
Exercise 1.49
The noncentral t distribution
Exercise 1.50

2 Large-sample theory

Modes of convergence
The normal distribution
Exercise 2.1
The Borel–Cantelli lemma and convergence almost surely
Exercise 2.2
Scheffé’s lemma
Exercise 2.6
From discrete to continuous
Exercise 2.7
Many small probabilities give a Poisson
Exercise 2.8
Maximum of uniforms
Exercise 2.9
Stochastic \(O_{pr}\) and \(o_{pr}\) symbols
Exercise 2.10
Convergence in probability and tail bounds
Markov, Chebyshov, and the Law of Large Numbers
Exercise 2.11
The binomial and the empirical distribution function
Exercise 2.12
Continuous mapping for convergence in probability
Exercise 2.13
Toeplitz lemma and convergence in probability
Exercise 2.14
Further tail bound inequalities
Exercise 2.18
Bernshteı̆n and Weierstraß
Exercise 2.19
Convergence in distribution
More in the Portmanteau
Exercise 2.21
Skorokhod’s theorem
Exercise 2.23
Central limit theorems
The central limit theorem, Lindeberg’s proof
Exercise 2.29
Showing convergence in distribution via moment-generating functions
Exercise 2.32
Proving the CLT with moment-generating functions
Exercise 2.33
Lindeberg, Liapunov, etc.
Exercise 2.39
Higher-order expansions of m.g.f.s
Exercise 2.41
Continuity theorem for vector variables
Exercise 2.42
The multi-dimensional CLT
Exercise 2.43
The delta method
The empirical correlation coefficient, general case
Exercise 2.52
Stretching the delta method
Exercise 2.54
The strong law of large numbers
The Strong Law of Large Numbers: the basics
Exercise 2.55

3 Parameters, estimators, precision, confidence

Precise estimation in a few classical models
Mean squared error
Exercise 3.1
Binomial estimation
Exercise 3.2
Estimating the normal variance
Exercise 3.4
Confidence interval
Exercise 3.5
Confidence interval for a normal variance
Exercise 3.6
Confidence interval for a normal mean
Exercise 3.7
Normal quantiles: estimation and confidence
Exercise 3.8
The empirical distribution function
Exercise 3.9
Confidence via normal approximations
Confidence intervals via normal approximations
Exercise 3.11
Confidence intervals for the standard deviation, outside normality
Exercise 3.12
The binomial, the normal approximation, and confidence intervals
Exercise 3.15
Quantiles and sample quantiles
Uniform ordering
Exercise 3.16
Min and max of two uniforms
Exercise 3.17
The sample median
Exercise 3.18
Moment matching methods
Moment matching estimators
Exercise 3.24
Moment method estimators for the exponential family
Exercise 3.27
Quantile matching methods
Moment fitting and quantile fitting for the Weibull
Exercise 3.30
Minimum sum of squares and linear regression
Linear regression and least squares estimation
Exercise 3.31
Linear multiple regression and least squares
Exercise 3.32
Separate and joint confidence intervals
Exercise 3.34

4 Testing, sufficiency, power

Testing, testing
Testing a null hypothesis
Exercise 4.1
Connections from confidence intervals to testing
Exercise 4.2
Confidence intervals for quantiles
Exercise 4.3
t testing, one and two samples
Exercise 4.4
Wald tests
Exercise 4.5
Neyman–Pearson optimal testing
The Neyman–Pearson Lemma
Exercise 4.7
The Neyman–Pearson Lemma: more details
Exercise 4.8
The Neyman–Pearson Lemma: applications
Exercise 4.9
The t test and its power
Exercise 4.12
Optimal average power
Exercise 4.16
Sufficiency, factorisation theorem, completeness
The factorisation theorem
Exercise 4.18
Completeness for the exponential family
Exercise 4.24
Basu’s Lemma
Exercise 4.25
Optimal conditional testing
Conditional tests
Exercise 4.26
Conditional tests: pairs of exponentials
Exercise 4.27
Conditional tests: Poisson
Exercise 4.29
Conditional tests: multiparameter exponential models
Exercise 4.32
A generalised Poisson distribution
Exercise 4.34
Testing for correlation
Exercise 4.35
Testing equality across groups, vector parameter case
Exercise 4.41

5 Minimum divergence and maximum likelihood

Maximising the log-likelihood
Exercise 5.1
The likelihood and log-likelihood functions
Exercise 5.2
The Kullback–Leibler divergence and the maximum likelihood method
Exercise 5.6
Score functions and the Fisher information matrix
Exercise 5.14
Cramér–Rao lower bounds for estimators
Exercise 5.15
Cramér–Rao bounds for the multidimensional case
Exercise 5.16
Fisher information and parameter transformations
Exercise 5.17
Maximum likelihood estimators
Exercise 5.18
Completing increasingly simple tasks
Exercise 5.23
Wilks theorems
Exercise 5.29
Log-likelihood and ML for the binomial, trinomial, multinomial
Exercise 5.30
The maximum likelihood estimator: examples
Exercise 5.31
Extending theory and methods to regression setups
Exercise 5.42
Logistic regression
Exercise 5.44
Poisson regression
Exercise 5.46
We can do things
Exercise 5.51

6 Bayesian inference and computation

Poisson data with gamma priors
Exercise 6.1
The Bayesian Master Recipe
Exercise 6.2
Some loss functions and their associated Bayes rules
Exercise 6.3
How many streetcars in San Francisco?
Exercise 6.4
A Bayesian take on hypothesis testing
Exercise 6.6
The binomial-beta setup
Exercise 6.8
MCMC, II: simulating from the posterior
Exercise 6.15
The normal prior and posterior with normal data
Exercise 6.16
Bernshteı̆n–von Mises approximations
Exercise 6.27
Multiparameter inference, I
Exercise 6.28

7 CDs, confidence curves, combining information

The probability transform
Exercise 7.1
Recipe One: via the c.d.f. of an estimator
Exercise 7.2
Confidence distribution and confidence curve for the normal standard deviation
Exercise 7.3
Computing a CD with simulation and isotonic repair
Exercise 7.6
Recipe Four: confidence curves via Wilks theorems
Exercise 7.9
Median age for men and women in Roman Era Egypt
Exercise 7.10
Recipe Six: CDs in exponential families
Exercise 7.12
Bayesian posteriors as approximate CDs
Exercise 7.14

8 Loss, risk, performance, optimality

Coin tossing
Exercise 8.1
Uniformly minimum variance unbiased estimators
Exercise 8.2
Cramér–Rao and Cauchy–Schwarz
Exercise 8.3
Sufficiency and Rao–Blackwell
Exercise 8.4
A weird unbiased estimator
Exercise 8.7
More Rao–Blackwellisation
Exercise 8.8
Tools for minimaxity
Exercise 8.9
Minimax estimators for binomial and multinomial parameters
Exercise 8.10

II Stories

Who wins? Computing probabilities as a function of match time
Story v.6
The turn-around operation: from 0-2 to 3-2
Story v.7
New Haven annual temperatures 1912–1971
Story iv.1
Mammals and their bodies and brains
Story iv.3
Checking out the CLT
Story vi.1
Mothers, babies, birthweights, factors
Story i.6
Time-to-failure for machine components
Story vii.4
Birds on islands outside Ecuador
Story iv.6