Probability and Statistics 2
She invoked Bayes' rule: Posterior ∝ Likelihood × Prior. Using Markov Chain Monte Carlo (MCMC), a computational method for sampling from complex posterior distributions, she showed that neither guild was entirely wrong. The Drift had a hidden Markov structure: it switched between “tide-like” and “random walk” states at random intervals, and the probability of switching was itself a parameter.
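The switching process Elara describes can be sketched as a small simulation. This is a minimal illustration, not her actual model: the two regimes, the step sizes, and the per-step switching probability `p_switch` are all made-up values chosen only to show the structure.

```python
import random

def simulate_drift(n_steps, p_switch=0.2, seed=7):
    """Simulate a two-state switching process: a 'tide' regime
    (regular push-pull) vs a 'walk' regime (pure random walk).
    p_switch is the per-step probability of changing state --
    itself a parameter, as in the story. Illustrative values only."""
    random.seed(seed)
    state = "tide"
    x = 0.0
    path, states = [], []
    for t in range(n_steps):
        if random.random() < p_switch:
            state = "walk" if state == "tide" else "tide"
        if state == "tide":
            x += 0.5 * (1 if t % 12 < 6 else -1)  # tide-like oscillation
        else:
            x += random.gauss(0.0, 1.0)           # random-walk noise
        path.append(x)
        states.append(state)
    return path, states
```

With a per-step switching probability of `p_switch`, the expected time between switches is 1/`p_switch` steps, which is exactly the quantity the inference later pins down.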
A debate ensued. Elara stepped in. “In Stat 1, you compare point estimates. In Stat 2, you compare entire distributions of belief.”
She introduced the law of total variance: Var(Y) = E[Var(Y|X)] + Var(E[Y|X]). The fishermen scratched their heads. She explained: “The total uncertainty of your position comes from two things: the average internal chaos (the Drift’s random variance) plus the uncertainty in the Drift’s mean behavior.”
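Elara's decomposition can be checked numerically. The sketch below, with invented regime means and variances, draws Y from a two-regime mixture and compares the Monte Carlo variance of Y with the two analytic terms: the average within-regime variance plus the variance of the regime means.

```python
import random
import statistics

def total_variance_demo(n=200_000, seed=1):
    """Monte Carlo check of Var(Y) = E[Var(Y|X)] + Var(E[Y|X]).
    X picks one of two Drift regimes with equal probability;
    Y | X is Gaussian with regime-specific mean and sd.
    The regime parameters are illustrative, not from the story."""
    random.seed(seed)
    regimes = {"tide": (2.0, 0.5), "walk": (0.0, 2.0)}  # (mean, sd)
    ys = []
    for _ in range(n):
        mu, sd = regimes[random.choice(list(regimes))]
        ys.append(random.gauss(mu, sd))
    mc_var = statistics.pvariance(ys)
    # E[Var(Y|X)]: the average internal chaos of each regime
    e_var = 0.5 * 0.5**2 + 0.5 * 2.0**2
    # Var(E[Y|X]): uncertainty about the mean behavior (overall mean is 1.0)
    var_e = 0.5 * (2.0 - 1.0)**2 + 0.5 * (0.0 - 1.0)**2
    return mc_var, e_var + var_e
```

With these numbers the analytic total is 2.125 + 1.0 = 3.125, and the simulated variance of Y lands close to it.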
They ran a Gibbs sampler (a type of MCMC) overnight. By dawn, the chains had converged. The posterior distribution revealed that the Drift switched states every 3.2 days on average. Now they could build a real-time predictor. For the next hour’s Drift speed, they used a Kalman filter, a recursive algorithm that updates predictions as new data arrives.
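The predict-then-update cycle of such a filter can be shown in one dimension. This is a minimal sketch assuming a random-walk state model; the process noise `q` and measurement noise `r` are illustrative values, not anything estimated in the story.

```python
def kalman_step(x_est, p_est, z, q=0.1, r=1.0):
    """One recursive update of a 1-D Kalman filter for Drift speed.
    x_est, p_est: previous estimate and its variance;
    z: the new noisy measurement;
    q, r: process and measurement noise variances (assumed values)."""
    # Predict: the state carries over, but uncertainty grows by q
    x_pred = x_est
    p_pred = p_est + q
    # Update: blend prediction and measurement via the Kalman gain
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new
```

Fed a stream of measurements, the estimate is pulled toward the data while its variance settles at a steady state, which is what makes the filter usable for hour-by-hour prediction.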