Moment Generating & Characteristic Functions
Heavy Machinery for Distribution Analysis
Why Does This Matter?
Why do we need MGFs and characteristic functions when we already have PDFs and CDFs?
- The MGF uniquely determines a distribution: if two distributions have the same MGF, they are identical. This is very useful for proving the distribution of a sum of random variables.
- Moments are easy to compute: every moment (\(E[X], E[X^2], E[X^3], \ldots\)) can be obtained directly by differentiating the MGF at \(t=0\).
- Characteristic functions exist for ALL distributions (unlike the MGF). Even the Cauchy distribution, which has no MGF, still has a characteristic function.
- Key to proving the CLT: the standard proof of the CLT uses convergence of characteristic functions (or MGFs). Without these tools, proving the CLT is much harder.
- Method of Moments and GMM: the "moment conditions" in GMM relate directly to derivatives of the MGF, and cumulants map directly to distribution parameters.
For researchers: you may rarely compute an MGF by hand, but understanding MGFs helps you see why certain distributional properties hold and why certain estimators work.
1. Moment Generating Function (MGF)
For a random variable \(X\), the moment generating function (MGF) is:
\[M_X(t) = E[e^{tX}] = \begin{cases} \sum_x e^{tx} p(x) & \text{(discrete)} \\ \int_{-\infty}^{\infty} e^{tx} f(x)\,dx & \text{(continuous)} \end{cases}\]
The MGF is defined for \(t \in (-h, h)\) for some \(h > 0\) (a neighborhood of 0).
Important: the MGF does not always exist! Heavy-tailed distributions such as the Pareto or Cauchy have no MGF (because \(E[e^{tX}] = \infty\) for \(t > 0\)).
Why Is It Called "Moment Generating"?
Because the MGF "generates" all moments via differentiation:
\[E[X^k] = M_X^{(k)}(0) = \left.\frac{d^k M_X(t)}{dt^k}\right|_{t=0}\]
Proof (sketch): Taylor-expand \(e^{tX}\): \[E[e^{tX}] = E\left[\sum_{k=0}^{\infty} \frac{(tX)^k}{k!}\right] = \sum_{k=0}^{\infty} \frac{t^k}{k!}E[X^k]\]
Differentiating \(k\) times and setting \(t=0\) gives \(E[X^k]\).
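A quick numerical illustration of this idea (a sketch with arbitrary choices: \(X \sim \text{Exponential}(1)\), so \(E[X] = 1\)): approximate the MGF by Monte Carlo and recover the first moment by a central difference at \(t = 0\):

```r
# Sketch: recover E[X] by numerically differentiating a Monte Carlo MGF.
# Illustrative choices: X ~ Exponential(1), so the target is E[X] = 1.
set.seed(2024)
X <- rexp(100000, rate = 1)

mgf_hat <- function(t) mean(exp(t * X))   # Monte Carlo estimate of M_X(t)

h <- 1e-4
EX_hat <- (mgf_hat(h) - mgf_hat(-h)) / (2 * h)   # central difference ~ M'_X(0)
cat(sprintf("E[X] via MGF derivative: %.4f (theory: 1)\n", EX_hat))
```

The central difference of the Monte Carlo MGF converges to the sample mean as \(h \to 0\), which in turn converges to \(E[X]\).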
2. MGFs for Common Distributions
Normal Distribution
For \(X \sim N(\mu, \sigma^2)\):
\[M_X(t) = \exp\left(\mu t + \frac{\sigma^2 t^2}{2}\right)\]
Derivation for \(N(0,1)\): \[M_Z(t) = \int_{-\infty}^{\infty} e^{tz} \frac{1}{\sqrt{2\pi}} e^{-z^2/2}\,dz = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-(z^2-2tz)/2}\,dz\]
Complete the square: \(z^2 - 2tz = (z-t)^2 - t^2\): \[= e^{t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-(z-t)^2/2}\,dz = e^{t^2/2}\]
For \(X = \mu + \sigma Z\): \(M_X(t) = E[e^{t(\mu+\sigma Z)}] = e^{\mu t} M_Z(\sigma t) = e^{\mu t + \sigma^2 t^2/2}\).
Verifying moments from the normal MGF:
\[M'_Z(t) = te^{t^2/2} \Rightarrow M'_Z(0) = 0 = E[Z]\] \[M''_Z(t) = e^{t^2/2} + t^2 e^{t^2/2} \Rightarrow M''_Z(0) = 1 = E[Z^2]\] \[E[Z^3] = M'''_Z(0) = 0 \text{ (odd moment = 0 for symmetric)}\] \[E[Z^4] = M^{(4)}_Z(0) = 3\]
This gives excess kurtosis \(E[Z^4] - 3 = 0\) for the normal distribution (mesokurtic).
Bernoulli Distribution
For \(X \sim \text{Bernoulli}(p)\):
\[M_X(t) = E[e^{tX}] = e^{t \cdot 0}(1-p) + e^{t \cdot 1}p = (1-p) + pe^t = 1 - p + pe^t\]
Moments:
- \(M'_X(t) = pe^t \Rightarrow E[X] = M'_X(0) = p\)
- \(M''_X(t) = pe^t \Rightarrow E[X^2] = M''_X(0) = p\)
- \(\text{Var}(X) = E[X^2] - (E[X])^2 = p - p^2 = p(1-p)\)
Poisson Distribution
For \(X \sim \text{Poisson}(\lambda)\):
\[M_X(t) = E[e^{tX}] = \sum_{k=0}^{\infty} e^{tk} \frac{e^{-\lambda}\lambda^k}{k!} = e^{-\lambda}\sum_{k=0}^{\infty} \frac{(\lambda e^t)^k}{k!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t-1)}\]
Moments:
- \(M'_X(t) = \lambda e^t e^{\lambda(e^t-1)} \Rightarrow E[X] = \lambda\)
- \(M''_X(t)|_{t=0} = \lambda^2 + \lambda \Rightarrow \text{Var}(X) = E[X^2] - \lambda^2 = \lambda\) ✓
Exponential Distribution
For \(X \sim \text{Exponential}(\lambda)\) with \(f(x) = \lambda e^{-\lambda x}\):
\[M_X(t) = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x}\,dx = \lambda \int_0^{\infty} e^{-(\lambda-t)x}\,dx = \frac{\lambda}{\lambda - t}, \quad t < \lambda\]
Moments:
- \(M'_X(t) = \lambda/(\lambda-t)^2 \Rightarrow E[X] = 1/\lambda\)
- \(M''_X(t) = 2\lambda/(\lambda-t)^3 \Rightarrow E[X^2] = 2/\lambda^2\)
- \(\text{Var}(X) = 2/\lambda^2 - 1/\lambda^2 = 1/\lambda^2\)
Gamma Distribution
For \(X \sim \text{Gamma}(\alpha, \beta)\) with \(f(x) = \frac{\beta^\alpha}{\Gamma(\alpha)}x^{\alpha-1}e^{-\beta x}\):
\[M_X(t) = \left(\frac{\beta}{\beta - t}\right)^\alpha, \quad t < \beta\]
Chi-squared as a special case: \(\chi^2(k) = \text{Gamma}(k/2, 1/2)\), so: \[M_{\chi^2(k)}(t) = \left(\frac{1}{1-2t}\right)^{k/2}, \quad t < 1/2\]
3. MGFs and Independence
If \(X\) and \(Y\) are independent, then:
\[M_{X+Y}(t) = M_X(t) \cdot M_Y(t)\]
Proof: \[M_{X+Y}(t) = E[e^{t(X+Y)}] = E[e^{tX}e^{tY}] = E[e^{tX}]E[e^{tY}] = M_X(t)M_Y(t)\]
(Independence \(\Rightarrow\) the expectation factorizes.)
Direct applications:
Sum of iid normals: if \(X_i \sim N(\mu, \sigma^2)\) iid, then \(\sum X_i \sim N(n\mu, n\sigma^2)\):
- \(M_{\sum X_i}(t) = \prod M_{X_i}(t) = (e^{\mu t + \sigma^2 t^2/2})^n = e^{n\mu t + n\sigma^2 t^2/2}\)
- This is the MGF of \(N(n\mu, n\sigma^2)\) ✓
Sum of independent Poissons: if \(X_i \sim \text{Poisson}(\lambda_i)\) are independent, then \(\sum X_i \sim \text{Poisson}(\sum \lambda_i)\).
Chi-squared additivity: for independent variables, \(\chi^2(m) + \chi^2(n) = \chi^2(m+n)\) (since \(M(t) = (1-2t)^{-m/2}(1-2t)^{-n/2} = (1-2t)^{-(m+n)/2}\))
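These additivity results are easy to spot-check by simulation. A minimal sketch for the chi-squared case (degrees of freedom chosen arbitrarily), comparing the first two moments of the simulated sum with those of \(\chi^2(m+k)\):

```r
# Sketch: verify chi-squared additivity chi2(m) + chi2(k) = chi2(m+k)
# by comparing simulated moments of the sum with theory (mean m+k, var 2(m+k)).
set.seed(2024)
m <- 3; k <- 4; N <- 100000
S <- rchisq(N, df = m) + rchisq(N, df = k)   # independent draws

cat(sprintf("Mean: simulated %.3f, theory %d\n", mean(S), m + k))
cat(sprintf("Var:  simulated %.3f, theory %d\n", var(S), 2 * (m + k)))
```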
4. Proving the CLT with the MGF
Sketch of the CLT proof using MGFs (simplified version with \(\mu=0, \sigma=1\), assuming the MGF exists in a neighborhood of 0):
Goal: show that \(Z_n = \frac{1}{\sqrt{n}}\sum_{i=1}^n X_i \xrightarrow{d} N(0,1)\) when the \(X_i\) are iid with \(E[X_i]=0\) and \(\text{Var}(X_i)=1\).
Step 1: the MGF of \(Z_n\): \[M_{Z_n}(t) = \left[M_X\left(\frac{t}{\sqrt{n}}\right)\right]^n\]
Step 2: Taylor-expand \(M_X(t/\sqrt{n})\) around 0: \[M_X\left(\frac{t}{\sqrt{n}}\right) = 1 + \frac{t}{\sqrt{n}}M'_X(0) + \frac{t^2}{2n}M''_X(0) + O(n^{-3/2})\] \[= 1 + 0 + \frac{t^2}{2n} + O(n^{-3/2}) \quad \text{(since } E[X]=0, E[X^2]=1\text{)}\]
Step 3: take the \(n\)-th power: \[M_{Z_n}(t) = \left(1 + \frac{t^2}{2n} + O(n^{-3/2})\right)^n \to e^{t^2/2} \quad \text{as } n \to \infty\]
(using \(\lim_{n\to\infty}(1 + a/n)^n = e^a\))
Step 4: \(e^{t^2/2}\) is the MGF of \(N(0,1)\). By the uniqueness and continuity theorems for MGFs, \(Z_n \xrightarrow{d} N(0,1)\). QED.
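The convergence in Step 3 can be watched numerically. A sketch using a standardized uniform, \(X \sim \text{Uniform}(-\sqrt{3}, \sqrt{3})\) (mean 0, variance 1), whose MGF \(M_X(t) = \sinh(\sqrt{3}\,t)/(\sqrt{3}\,t)\) is available in closed form:

```r
# Sketch: [M_X(t/sqrt(n))]^n -> exp(t^2/2) for a standardized Uniform(-sqrt(3), sqrt(3)),
# whose MGF is sinh(sqrt(3)*t) / (sqrt(3)*t).
M_unif <- function(t) ifelse(t == 0, 1, sinh(sqrt(3) * t) / (sqrt(3) * t))

t <- 1
for (n in c(1, 10, 100, 1000)) {
  cat(sprintf("n = %4d: [M(t/sqrt(n))]^n = %.6f\n", n, M_unif(t / sqrt(n))^n))
}
cat(sprintf("Limit exp(t^2/2)       = %.6f\n", exp(t^2 / 2)))
```

Already at \(n = 1000\) the product is within about \(10^{-4}\) of \(e^{1/2} \approx 1.6487\).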
5. Characteristic Function
The characteristic function (CF) of \(X\) is:
\[\varphi_X(t) = E[e^{itX}] = E[\cos(tX)] + i\,E[\sin(tX)], \quad t \in \mathbb{R}\]
where \(i = \sqrt{-1}\).
Advantages over the MGF:
- It always exists, for every distribution (because \(|e^{itX}| = 1\))
- The MGF may not exist for heavy-tailed distributions
- The CF fully determines the distribution (uniqueness theorem)
Relation to the MGF: if the MGF \(M_X(t)\) exists in a neighborhood of 0, then \(\varphi_X(t) = M_X(it)\).
CFs for some distributions:
| Distribution | Characteristic Function \(\varphi_X(t)\) |
|---|---|
| \(N(\mu, \sigma^2)\) | \(\exp(i\mu t - \sigma^2 t^2/2)\) |
| \(\text{Poisson}(\lambda)\) | \(\exp(\lambda(e^{it}-1))\) |
| \(\text{Exponential}(\lambda)\) | \(\lambda/(\lambda - it)\) |
| \(\text{Cauchy}(0,1)\) | \(\exp(-|t|)\) |
Note: the Cauchy distribution has no MGF (heavy tails) but does have a CF. This shows that the CF is the more general tool.
Inversion Formula
If \(\varphi_X(t)\) is known (and absolutely integrable), we can recover the PDF: \[f(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-itx}\varphi_X(t)\,dt\]
This is Fourier inversion! The CF is the Fourier transform of the distribution.
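A numerical sketch of the inversion: for \(N(0,1)\), whose CF \(e^{-t^2/2}\) is real-valued, integrating the CF recovers the density at a point (here \(x = 0\)):

```r
# Sketch: Fourier inversion of the N(0,1) characteristic function at x = 0.
# f(0) = (1/2pi) * Integral of exp(-t^2/2) dt = 1/sqrt(2*pi)
phi <- function(t) exp(-t^2 / 2)             # CF of N(0,1) (real-valued here)
f0 <- integrate(phi, -Inf, Inf)$value / (2 * pi)
cat(sprintf("Inverted f(0) = %.6f, dnorm(0) = %.6f\n", f0, dnorm(0)))
```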
6. Cumulant Generating Function
The cumulant generating function (CGF) is:
\[K_X(t) = \log M_X(t)\]
The cumulants \(\kappa_r\) are the coefficients in the Taylor expansion: \[K_X(t) = \sum_{r=1}^{\infty} \kappa_r \frac{t^r}{r!}\]
The first cumulants and their interpretations:
- \(\kappa_1 = \mu = E[X]\) (mean)
- \(\kappa_2 = \sigma^2 = \text{Var}(X)\) (variance)
- \(\kappa_3 = E[(X-\mu)^3]\) (third central moment; governs skewness)
- \(\kappa_4 = E[(X-\mu)^4] - 3\sigma^4\) (governs excess kurtosis)
Standardized skewness: \(\gamma_1 = \kappa_3/\sigma^3\)
Excess kurtosis: \(\gamma_2 = \kappa_4/\sigma^4\) (= 0 for normal, > 0 for heavy tails)
Key advantage of the CGF: cumulants are additive for independent variables! \(K_{X+Y}(t) = K_X(t) + K_Y(t)\).
CGF of the normal: \(K_X(t) = \log(e^{\mu t + \sigma^2 t^2/2}) = \mu t + \sigma^2 t^2/2\)
All cumulants of order \(\geq 3\) are zero for the normal distribution; this is a characterizing property of the normal.
7. Worked Example: Verifying Moments from the Normal MGF
Problem: for \(X \sim N(\mu, \sigma^2)\) with \(M_X(t) = e^{\mu t + \sigma^2 t^2/2}\), verify that \(E[X] = \mu\), \(E[X^2] = \mu^2 + \sigma^2\), and \(\text{Var}(X) = \sigma^2\).
Step 1: First derivative. \[M'_X(t) = (\mu + \sigma^2 t)e^{\mu t + \sigma^2 t^2/2}\] \[E[X] = M'_X(0) = \mu e^0 = \mu \checkmark\]
Step 2: Second derivative. \[M''_X(t) = \sigma^2 e^{\mu t + \sigma^2 t^2/2} + (\mu + \sigma^2 t)^2 e^{\mu t + \sigma^2 t^2/2}\] \[= e^{\mu t + \sigma^2 t^2/2}[\sigma^2 + (\mu + \sigma^2 t)^2]\] \[E[X^2] = M''_X(0) = e^0[\sigma^2 + \mu^2] = \sigma^2 + \mu^2 \checkmark\]
Step 3: Variance. \[\text{Var}(X) = E[X^2] - (E[X])^2 = \sigma^2 + \mu^2 - \mu^2 = \sigma^2 \checkmark\]
Step 4: Kurtosis (for \(Z \sim N(0,1)\)). \[M_Z^{(4)}(0) = E[Z^4] = 3\] Excess kurtosis = \(3 - 3 = 0\) (the normal is mesokurtic) ✓
# Numerical verification using R
# Compare theoretical moments with simulated moments
set.seed(2024)
n <- 100000
mu <- 3; sigma <- 2
X <- rnorm(n, mean=mu, sd=sigma)
cat("=== MOMENT VERIFICATION ===\n")
cat(sprintf("E[X]: Theoretical=%g, Simulated=%.4f\n", mu, mean(X)))
cat(sprintf("E[X^2]: Theoretical=%g, Simulated=%.4f\n",
mu^2 + sigma^2, mean(X^2)))
cat(sprintf("Var(X): Theoretical=%g, Simulated=%.4f\n", sigma^2, var(X)))
cat(sprintf("E[X^3]: Theoretical=%g, Simulated=%.4f\n",
mu^3 + 3*mu*sigma^2, mean(X^3)))
# CGF approach
# K(t) = mu*t + sigma^2*t^2/2
# K'(0) = mu, K''(0) = sigma^2, K'''(0) = 0, K''''(0) = 0
# Verify: for Z~N(0,1), E[Z^4] = 3
Z <- rnorm(n)
cat(sprintf("\nFor Z~N(0,1):\n"))
cat(sprintf("E[Z^4]: Theoretical=3, Simulated=%.4f\n", mean(Z^4)))
cat(sprintf("Excess kurtosis: Theoretical=0, Simulated=%.4f\n",
mean(Z^4) - 3))
# MGF via numerical integration
mgf_normal <- function(t, mu=0, sigma=1) exp(mu*t + sigma^2*t^2/2)
t_vals <- seq(-1, 1, by=0.1)
mgf_theoretical <- mgf_normal(t_vals, mu=mu, sigma=sigma)
# Monte Carlo approximation of MGF at different t
mgf_simulated <- sapply(t_vals, function(t) mean(exp(t * X)))
cat("\n=== MGF COMPARISON ===\n")
cat("t\tTheoretical\tSimulated\n")
for(i in seq(1, length(t_vals), by=3)) {
cat(sprintf("%.1f\t%.4f\t\t%.4f\n",
t_vals[i], mgf_theoretical[i], mgf_simulated[i]))
}
8. Application: Proving the Distribution of a Sum
Claim: if \(X \sim N(\mu_1, \sigma_1^2)\) and \(Y \sim N(\mu_2, \sigma_2^2)\) are independent, then \(X+Y \sim N(\mu_1+\mu_2, \sigma_1^2+\sigma_2^2)\).
Proof via MGF: \[M_{X+Y}(t) = M_X(t) \cdot M_Y(t)\] \[= e^{\mu_1 t + \sigma_1^2 t^2/2} \cdot e^{\mu_2 t + \sigma_2^2 t^2/2}\] \[= e^{(\mu_1+\mu_2)t + (\sigma_1^2+\sigma_2^2)t^2/2}\]
This is the MGF of \(N(\mu_1+\mu_2, \sigma_1^2+\sigma_2^2)\).
Since the MGF uniquely determines the distribution, \(X+Y \sim N(\mu_1+\mu_2, \sigma_1^2+\sigma_2^2)\).
Analogously, for independent \(X_i \sim N(\mu_i, \sigma_i^2)\): \[\sum_{i=1}^n a_i X_i \sim N\left(\sum_i a_i\mu_i, \sum_i a_i^2\sigma_i^2\right)\]
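A quick simulation check of this linear-combination result (weights and parameters below are arbitrary illustrative choices):

```r
# Sketch: check a1*X1 + a2*X2 ~ N(a1*mu1 + a2*mu2, a1^2*s1^2 + a2^2*s2^2)
# for independent normals, by comparing simulated mean and variance with theory.
set.seed(2024)
N <- 100000
a1 <- 2; a2 <- -1
X1 <- rnorm(N, mean = 1, sd = 2)   # N(1, 4)
X2 <- rnorm(N, mean = 3, sd = 1)   # N(3, 1)
W <- a1 * X1 + a2 * X2

cat(sprintf("Mean: simulated %.3f, theory %.1f\n", mean(W), a1 * 1 + a2 * 3))      # -1
cat(sprintf("Var:  simulated %.3f, theory %.1f\n", var(W), a1^2 * 4 + a2^2 * 1))   # 17
```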
9. Connections to GMM and the Method of Moments
Method of Moments (MOM): estimate parameters by matching sample moments to population moments. The population moments \(E[X^k]\) come directly from the MGF; the sample moments are \(\hat{\mu}_k = \frac{1}{n}\sum_i X_i^k\).
GMM (Generalized Method of Moments): Hansen (1982). "Moment conditions" are statements of the form \(E[g(X_i, \theta)] = 0\). This generalizes MOM by allowing moment functions \(g\) more general than power functions.
Example: IV estimation uses the moment condition \(E[(y_i - x_i\beta)z_i] = 0\), where \(z_i\) is the instrument. This is a statement about the covariance between the error and the instrument.
Cumulant matching in econometrics: in GARCH model estimation, we often match the empirical kurtosis to the model-implied kurtosis. Kurtosis is a function of the cumulant \(\kappa_4\).
Skewness and kurtosis in asset returns: financial returns are typically non-normal, and normality tests rely on \(\kappa_3\) (skewness) and \(\kappa_4\) (excess kurtosis). The Jarque-Bera test is exactly a joint test that \(\kappa_3 = 0\) and \(\kappa_4 = 0\).
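As a sketch (sample size and seed are arbitrary), the Jarque-Bera statistic can be computed by hand from sample skewness and kurtosis; under normality it is asymptotically \(\chi^2(2)\):

```r
# Sketch: manual Jarque-Bera statistic JB = n * (S^2/6 + (K - 3)^2/24),
# where S is sample skewness and K sample kurtosis. Under H0 (normality), JB ~ chi2(2).
set.seed(2024)
n <- 100000
x <- rnorm(n)                         # data drawn under the null here
z <- x - mean(x)
S <- mean(z^3) / (mean(z^2))^(3/2)    # sample skewness
K <- mean(z^4) / (mean(z^2))^2        # sample kurtosis
JB <- n * (S^2 / 6 + (K - 3)^2 / 24)
cat(sprintf("JB = %.3f, p-value = %.3f (chi-sq, 2 df)\n",
            JB, pchisq(JB, df = 2, lower.tail = FALSE)))
```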
10. R Code: MGF Computations
# ============================================================
# MANUAL MGF COMPUTATION
# ============================================================
# MGF via Monte Carlo integration
mgf_montecarlo <- function(t, data) {
mean(exp(t * data))
}
# Compare to theoretical
set.seed(2024)
n <- 100000
# Normal(3, 4)
X_norm <- rnorm(n, mean=3, sd=2)
t_test <- 0.5
cat("Normal(3,4) MGF at t=0.5:\n")
cat("Theoretical:", exp(3*0.5 + 4*0.5^2/2), "\n") # exp(mu*t + sigma^2*t^2/2)
cat("Monte Carlo:", mgf_montecarlo(0.5, X_norm), "\n")
# Poisson(2)
X_pois <- rpois(n, lambda=2)
cat("\nPoisson(2) MGF at t=0.3:\n")
cat("Theoretical:", exp(2*(exp(0.3)-1)), "\n") # exp(lambda*(e^t - 1))
cat("Monte Carlo:", mgf_montecarlo(0.3, X_pois), "\n")
# ============================================================
# CUMULANTS: Compute from data
# ============================================================
library(moments) # for skewness() and kurtosis()
# Simulate from various distributions and check cumulants
check_cumulants <- function(x, dist_name) {
cat(sprintf("\n=== %s ===\n", dist_name))
cat(sprintf("Mean (kappa_1): %.4f\n", mean(x)))
cat(sprintf("Variance (kappa_2): %.4f\n", var(x)))
cat(sprintf("Skewness (gamma_1): %.4f\n", skewness(x)))
cat(sprintf("Excess kurtosis: %.4f\n", kurtosis(x) - 3))
}
n <- 100000
check_cumulants(rnorm(n), "Normal(0,1) [should: 0, 1, 0, 0]")
check_cumulants(rexp(n, rate=1), "Exp(1) [should: 1, 1, 2, 6]")
check_cumulants(rpois(n, lambda=5), "Poisson(5) [should: 5, 5, skew=1/sqrt(5), kurt=1/5]")
check_cumulants(rchisq(n, df=5), "Chi-sq(5) [should: 5, 10, skew=sqrt(8/5), kurt=12/5]")
# ============================================================
# CHARACTERISTIC FUNCTION: visualize
# ============================================================
# Empirical characteristic function
ecf <- function(t, data) {
n <- length(data)
cf_vals <- complex(real = mean(cos(t * data)), imaginary = mean(sin(t * data)))
return(cf_vals)
}
# For Standard Normal: phi(t) = exp(-t^2/2)
n <- 1000; Z <- rnorm(n)
t_vals <- seq(-3, 3, by=0.1)
# Empirical vs theoretical CF
ecf_real <- sapply(t_vals, function(t) Re(ecf(t, Z))) # Real part
theoretical_cf <- exp(-t_vals^2 / 2) # Theoretical CF of N(0,1) is exp(-t^2/2)
plot(t_vals, ecf_real, type="l", col="blue", lwd=2,
main="Characteristic Function of N(0,1): Real Part",
xlab="t", ylab="Re[phi(t)]")
lines(t_vals, theoretical_cf, col="red", lwd=2, lty=2)
legend("topleft", c("Empirical CF", "Theoretical exp(-t^2/2)"),
col=c("blue","red"), lwd=2, lty=c(1,2))
# ============================================================
# MGF-BASED MOMENT COMPUTATION: Numerical differentiation
# ============================================================
# For X ~ Gamma(alpha=3, beta=2)
alpha <- 3; beta_param <- 2
X_gamma <- rgamma(n, shape=alpha, rate=beta_param)
# Theoretical: E[X] = alpha/beta = 1.5, Var(X) = alpha/beta^2 = 0.75
cat("\n=== Gamma(3,2) Moments ===\n")
cat(sprintf("E[X]: Theoretical=%.4f, Sample=%.4f\n", alpha/beta_param, mean(X_gamma)))
cat(sprintf("Var(X): Theoretical=%.4f, Sample=%.4f\n", alpha/beta_param^2, var(X_gamma)))
# Numerical differentiation of MGF
h <- 1e-5
M_Gamma <- function(t) (beta_param/(beta_param-t))^alpha # Theoretical MGF
# Numerical first derivative at t=0
dM_dt_at_0 <- (M_Gamma(h) - M_Gamma(-h)) / (2*h)
cat(sprintf("E[X] from MGF diff: %.4f\n", dM_dt_at_0))
# Numerical second derivative at t=0
d2M_dt2_at_0 <- (M_Gamma(h) - 2*M_Gamma(0) + M_Gamma(-h)) / h^2
cat(sprintf("E[X^2] from MGF: %.4f (Theoretical: %.4f)\n",
d2M_dt2_at_0, alpha*(alpha+1)/beta_param^2))
Practice Problems
Problem 1: MGF of the Binomial.
For \(X \sim \text{Binomial}(n, p)\):
- Derive \(M_X(t)\) (hint: \(X = \sum_{i=1}^n X_i\) where the \(X_i \sim \text{Bernoulli}(p)\) are iid)
- Use the MGF to compute \(E[X]\) and \(\text{Var}(X)\)
- Verify that you obtain \(np\) and \(np(1-p)\)
Answer: \(M_{X_i}(t) = 1-p+pe^t\). Since the \(X_i\) are iid: \(M_X(t) = (1-p+pe^t)^n\).
\(M'_X(t) = n(1-p+pe^t)^{n-1} \cdot pe^t\)
\(E[X] = M'_X(0) = n(1)^{n-1} \cdot p = np\) ✓
\(M''_X(t) = n(n-1)(1-p+pe^t)^{n-2}(pe^t)^2 + n(1-p+pe^t)^{n-1}pe^t\)
\(E[X^2] = n(n-1)p^2 + np\)
\(\text{Var}(X) = n(n-1)p^2 + np - n^2p^2 = np - np^2 = np(1-p)\) ✓
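A simulation sanity check of these binomial moments (the values of \(n\) and \(p\) below are arbitrary):

```r
# Sketch: compare simulated Binomial(n, p) moments with np and np(1-p).
set.seed(2024)
size <- 20; p <- 0.3
X <- rbinom(100000, size = size, prob = p)
cat(sprintf("E[X]:   simulated %.3f, theory %.1f\n", mean(X), size * p))           # 6
cat(sprintf("Var(X): simulated %.3f, theory %.2f\n", var(X), size * p * (1 - p)))  # 4.2
```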
Problem 2: Skewness and kurtosis of the Poisson.
For \(X \sim \text{Poisson}(\lambda)\) with \(M_X(t) = e^{\lambda(e^t-1)}\) and \(K_X(t) = \lambda(e^t-1)\):
- Compute all cumulants \(\kappa_r\) (hint: \(K_X^{(r)}(0) = \kappa_r\))
- Compute the skewness \(\gamma_1 = \kappa_3/\sigma^3\)
- What happens to the skewness as \(\lambda \to \infty\)?
Answer: \(K_X^{(r)}(t) = \lambda e^t\) for all \(r\), so \(\kappa_r = \lambda\) for all \(r \geq 1\).
\(\gamma_1 = \kappa_3/\sigma^3 = \lambda/\lambda^{3/2} = 1/\sqrt{\lambda}\)
As \(\lambda \to \infty\), the skewness \(\to 0\): the Poisson approaches the normal for large \(\lambda\) (consistent with the CLT).
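The \(1/\sqrt{\lambda}\) decay of the skewness is easy to see in simulation (sample skewness computed in base R; the \(\lambda\) grid is arbitrary):

```r
# Sketch: sample skewness of Poisson(lambda) should track 1/sqrt(lambda).
set.seed(2024)
skew <- function(x) { z <- x - mean(x); mean(z^3) / (mean(z^2))^(3/2) }
for (lam in c(1, 4, 25, 100)) {
  x <- rpois(100000, lambda = lam)
  cat(sprintf("lambda = %3d: sample skewness %.4f, theory %.4f\n",
              lam, skew(x), 1 / sqrt(lam)))
}
```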
Problem 3: Distribution of a sum via the MGF.
Given that \(M_{X+Y}(t) = M_X(t)M_Y(t)\) for independent \(X\) and \(Y\), use this fact to show that if \(Z_1, Z_2 \sim N(0,1)\) iid, then \(Z_1 + Z_2 \sim N(0,2)\).
Answer: \(M_{Z_1+Z_2}(t) = M_{Z_1}(t)M_{Z_2}(t) = e^{t^2/2} \cdot e^{t^2/2} = e^{2t^2/2} = e^{0 \cdot t + 2 t^2/2}\)
This is the MGF of \(N(0, 2)\). QED.
Problem 4: MGF and the log-normal.
If \(X \sim N(\mu, \sigma^2)\) and \(Y = e^X\) (log-normal), derive \(E[Y]\) and \(\text{Var}(Y)\) using the MGF of \(X\).
Answer: \(E[Y] = E[e^X] = M_X(1) = e^{\mu + \sigma^2/2}\)
\(E[Y^2] = E[e^{2X}] = M_X(2) = e^{2\mu + 2\sigma^2}\)
\(\text{Var}(Y) = E[Y^2] - (E[Y])^2 = e^{2\mu + 2\sigma^2} - e^{2\mu + \sigma^2} = e^{2\mu+\sigma^2}(e^{\sigma^2}-1)\)
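A simulation check of these log-normal moments (\(\mu\) and \(\sigma\) below are arbitrary illustrative values):

```r
# Sketch: verify E[Y] = exp(mu + sigma^2/2) and
# Var(Y) = exp(2*mu + sigma^2) * (exp(sigma^2) - 1) for Y = exp(X), X ~ N(mu, sigma^2).
set.seed(2024)
mu <- 0.2; sigma <- 0.5
Y <- exp(rnorm(100000, mean = mu, sd = sigma))
EY_theory   <- exp(mu + sigma^2 / 2)
VarY_theory <- exp(2 * mu + sigma^2) * (exp(sigma^2) - 1)
cat(sprintf("E[Y]:   simulated %.4f, theory %.4f\n", mean(Y), EY_theory))
cat(sprintf("Var(Y): simulated %.4f, theory %.4f\n", var(Y), VarY_theory))
```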