Appendix — Reference Material
Quick Reference for Papers, Textbooks, and Coursework
1 Overview
This appendix serves as a standalone reference companion to the main course modules. It is designed for fast lookup: when you encounter an unfamiliar symbol in a paper, need to verify a matrix calculus identity, or want to double-check a proof structure, you can consult it without re-reading a full chapter.
The material here does not introduce new concepts. It collects, organizes, and presents the most frequently needed reference material from across all course modules in a scannable format.
2 Contents
2.1 Greek Alphabet Reference
A complete table of all 24 Greek letters with their standard uses in statistics, econometrics, and machine learning. Covers both lowercase and uppercase forms, pronunciation guide, and a supplementary section on common mathematical operators and symbols (\(\in\), \(\nabla\), \(\partial\), \(\sim\), \(\propto\), etc.).
Use when: You encounter \(\xi\), \(\psi\), \(\nu\), \(\iota\), or any other symbol whose meaning you cannot recall immediately.
2.2 Common Mathematical Proofs
Step-by-step proofs for the results most frequently cited without derivation in econometrics and ML textbooks. Organized into three sections:
- Linear Algebra: Transpose of a product, inverse of a product, trace cycling, \(\det(AB) = \det(A)\det(B)\), trace = sum of eigenvalues, det = product of eigenvalues
- Calculus / Optimization: Matrix derivative of \(Ax\), quadratic form derivative, OLS normal equations derivation
- Probability / Statistics: Unbiasedness of the sample mean and sample variance, Cauchy-Schwarz, Jensen’s inequality, non-negativity of KL divergence, unbiasedness of OLS, and a proof sketch of the Gauss-Markov theorem
Use when: A paper states “it can be shown that…” and you want to actually verify it.
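Several of the linear algebra results listed above are easy to spot-check numerically before working through the full proof. The sketch below is an illustrative sanity check (not part of the appendix itself), using NumPy on random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
C = rng.standard_normal((4, 4))

# Trace cycling: tr(ABC) = tr(BCA) = tr(CAB)
t1 = np.trace(A @ B @ C)
t2 = np.trace(B @ C @ A)
t3 = np.trace(C @ A @ B)
assert np.allclose([t1, t2], [t3, t3])

# Trace equals the sum of eigenvalues (complex in general; the sum is real here)
assert np.isclose(np.trace(A), np.linalg.eigvals(A).sum().real)

# Multiplicativity of the determinant: det(AB) = det(A) det(B)
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
```

A numerical check on one random instance is of course not a proof, but it catches transcription errors quickly before you invest in the derivation.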
2.3 Notation Guide
A systematic reference for mathematical notation conventions used across this course, aligned with Hayashi (Econometrics) and Bishop (PRML). Organized by category:
- Scalars, vectors, matrices, random variables, sets
- Linear algebra operators (transpose, inverse, trace, determinant, norms, Kronecker product, etc.)
- Probability and statistics notation
- Convergence notation (\(\xrightarrow{p}\), \(\xrightarrow{d}\), \(o_p\), \(O_p\))
- Calculus and optimization (\(\nabla\), \(H\), \(\text{argmin}\), \(L^p\) norms)
- Machine learning specific (\(\mathcal{L}\), \(\odot\), \(D_{KL}\), indicator functions)
- Econometrics specific (\(M_X\), \(P_X\), \(\hat{\beta}\), \(\text{plim}\), within-group notation)
Use when: A notation convention is ambiguous or you want to confirm how a symbol is used in this course.
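As a concrete example of the econometrics notation listed above, the projection matrix \(P_X\) and annihilator \(M_X\) can be constructed and checked directly. This is an illustrative sketch (the variable names and sizes are arbitrary), assuming the standard definitions \(P_X = X(X'X)^{-1}X'\) and \(M_X = I - P_X\):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.standard_normal((n, k))
y = rng.standard_normal(n)

# P_X projects onto the column space of X; M_X = I - P_X is the residual maker
P = X @ np.linalg.inv(X.T @ X) @ X.T
M = np.eye(n) - P

# Both are symmetric and idempotent, and they are mutually orthogonal
assert np.allclose(P @ P, P) and np.allclose(M @ M, M)
assert np.allclose(P @ M, np.zeros((n, n)))

# M_X y returns the OLS residuals: y - X beta_hat
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(M @ y, y - X @beta_hat)
```

Seeing \(M_X y\) reproduce the OLS residuals is often the fastest way to internalize why \(M_X\) is called the residual maker.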
2.4 Formula Cheat Sheets
This is the most comprehensive single-document reference in the appendix, with seven sections of quick-lookup formulas:
- Linear Algebra Essentials — product rules, inverse rules, Woodbury identity, SVD, eigenvalue identities
- Matrix Calculus Quick Reference — table of derivatives of scalar expressions w.r.t. vectors and matrices
- Probability Distribution Summary — 19 distributions with parameters, PDF/PMF, mean, variance, MGF
- OLS Cheat Sheet — estimator, variance, residuals, \(R^2\), t-stat, F-stat, GLS, IV, 2SLS
- Optimization Cheat Sheet — FOC/SOC, Lagrangian, KKT conditions, Ridge, LASSO soft threshold
- Information Theory Quick Reference — entropy, KL divergence, cross-entropy, mutual information
- Key Theorems Quick Reference — Gauss-Markov, FWL, CLT, LLN, Delta Method, Jensen, Bayes, and more
Use when: You need a formula immediately and do not want to search through chapter notes.
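Two cheat-sheet entries lend themselves to quick numerical verification: the Woodbury identity and the LASSO soft-threshold operator. The sketch below assumes the standard forms \((A + UCV)^{-1} = A^{-1} - A^{-1}U(C^{-1} + VA^{-1}U)^{-1}VA^{-1}\) and \(S(z, \lambda) = \operatorname{sign}(z)\max(|z| - \lambda, 0)\); the function name `soft_threshold` is ours, not from the appendix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Woodbury identity on a random low-rank update of a diagonal matrix
A = np.diag(rng.uniform(1.0, 2.0, size=5))
U = rng.standard_normal((5, 2))
C = np.eye(2)
V = rng.standard_normal((2, 5))

Ainv = np.linalg.inv(A)
lhs = np.linalg.inv(A + U @ C @ V)
rhs = Ainv - Ainv @ U @ np.linalg.inv(np.linalg.inv(C) + V @ Ainv @ U) @ V @ Ainv
assert np.allclose(lhs, rhs)

def soft_threshold(z, lam):
    """Soft-thresholding: shrink z toward 0 by lam, zeroing anything within [-lam, lam]."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

z = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
assert np.allclose(soft_threshold(z, 1.0), [-2.0, 0.0, 0.0, 0.0, 2.0])
```

The Woodbury check also illustrates why the identity matters in practice: when \(A\) is diagonal and the update is low rank, the right-hand side only ever inverts small matrices.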
3 How to Use the Appendix
The appendix is most useful as a companion to reading, not as a primary learning resource.
Recommended workflow:
- While reading a paper or textbook, bookmark this appendix in a separate tab.
- When you encounter an unfamiliar symbol, check Greek Alphabet Reference first.
- When notation is ambiguous (e.g., is \(\frac{\partial f}{\partial \mathbf{x}}\) a row or column vector?), check Notation Guide.
- When a result is cited without proof and you want to verify it, check Common Proofs.
- When you need a formula fast (OLS formula, a distribution’s variance, a matrix derivative), go directly to Cheat Sheets.
4 Notation Conventions
All notation in this course follows the conventions established in:
- Hayashi, F. (2000). Econometrics. Princeton University Press.
- Bishop, C.M. (2006). Pattern Recognition and Machine Learning. Springer.
In the rare cases where these sources conflict, the econometrics convention is used for econometrics topics and the ML convention for ML topics. Deviations are noted explicitly in the relevant sections.
Navigate to individual appendix files using the sidebar or the links above.