Preliminaries


Expected Values


$E[X]$ is the mean of $X$.

For a discrete random variable $X$,

$$ E[X] = \sum_{i=1}^{N} x_i P(x_i) $$

For a continuous random variable $X$,

$$ E[X]=\int_a^b xf(x) \,dx $$

where $f(x)$ is the probability density function of the random variable $X$ and $[a, b]$ is its support.
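
As a quick illustration, here is a minimal Python sketch (assuming NumPy and SciPy are available) of both definitions: the discrete sum for a fair six-sided die and a numerical approximation of the continuous integral for a standard normal. Both distributions are hypothetical examples, not from the source.

```python
import numpy as np
from scipy import integrate, stats

# Discrete: E[X] = sum of x_i * P(x_i), here for a fair six-sided die (example choice)
x = np.arange(1, 7)
p = np.full(6, 1 / 6)
e_discrete = np.sum(x * p)  # 3.5

# Continuous: E[X] = integral of x * f(x) dx, here for a standard normal (example choice)
f = stats.norm().pdf
e_continuous, _ = integrate.quad(lambda t: t * f(t), -np.inf, np.inf)  # ~0.0

print(e_discrete, e_continuous)
```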

Properties of Expectation Values

Some useful properties of expectation values:

  • $E[a] = a$
  • $E[aX] = aE[X] = a\mu$
  • $E[aX + b] = aE[X] + b = a\mu + b$

where $a$ and $b$ are constants.
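
As a sanity check, a minimal sketch verifying $E[aX + b] = a\mu + b$ on a simulated sample; the distribution, sample size, and constants are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)  # mu = 5 (example choice)
a, b = 3.0, 7.0                                   # arbitrary constants

lhs = np.mean(a * x + b)   # sample estimate of E[aX + b]
rhs = a * np.mean(x) + b   # a * E[X] + b
print(lhs, rhs)            # agree up to sampling error
```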

Variance


Definition of variance of a distribution

$$ Var(X) = E[(X-\mu)^2] = \int_{-\infty}^\infty (x-\mu)^2 f(x) \,dx = \sigma^2 $$

Estimation of variance

$$ \hat{Var}(X) = \sum_{i=1}^{N} p_i (x_i - \bar{x})^2 $$

where $p_i = P(x_i)$ is the probability mass of $X$ at $x_i$ and $\bar{x} = \sum_{i=1}^{N} p_i x_i$ is the estimated mean.
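
A minimal sketch of this weighted estimate for a hypothetical four-point distribution; the support values and point masses are made up for illustration.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])   # support points (example choice)
p = np.array([0.1, 0.2, 0.3, 0.4])   # point masses p_i; must sum to 1

x_bar = np.sum(p * x)                    # estimated mean
var_hat = np.sum(p * (x - x_bar) ** 2)   # sum of p_i * (x_i - x_bar)^2
print(x_bar, var_hat)                    # 3.0, 1.0
```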

Autocorrelation


Independence

An event is independent of past events if the probability of its occurrence does not depend on them. In terms of conditional probability,

$$ P(x_{t+1} \mid x_t) = P(x_{t+1}) $$

Thm: If two random variables are independent, their correlation is 0.

Corollary: If the correlation between two variables is non-zero, they are not independent.
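
A quick simulation of the theorem: two independently generated samples should have a sample correlation near 0 (not exactly 0 in a finite sample). This is a minimal sketch; the normal distributions and sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = rng.normal(size=100_000)  # drawn independently of x

r = np.corrcoef(x, y)[0, 1]   # sample correlation
print(r)                      # close to 0, up to sampling error
```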

Serial Dependence

Serial dependence can be assessed with autocorrelation. Autocorrelation is denoted $\rho_k$, where $k$ represents the number of lags. Autocorrelation is expressed as

$$ \rho_k = \frac{E[(X_t-\mu)(X_{t+k}-\mu)]}{\sigma_x^2} $$

Let $\gamma_k$ be defined as

$$ \gamma_k = E[(X_t-\mu)(X_{t+k}-\mu)] $$

and note that the variance is the lag-zero autocovariance:

$$ \sigma_x^2 = E[(X_t-\mu)(X_{t+k}-\mu)] \big|_{k=0} = E[(X_t-\mu)^2] = \gamma_0 $$

Then

$$ \rho_k = \frac{\gamma_k}{\gamma_0} $$

Note: $\gamma_k$ is known as the autocovariance at lag $k$.
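
A minimal sketch computing the sample version $\hat{\rho}_k = \hat{\gamma}_k / \hat{\gamma}_0$ for a simulated serially dependent series; the AR(1)-style generator with coefficient 0.7 is a hypothetical example, not from the source.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.normal()  # serially dependent series (example)

x_bar = x.mean()

def gamma_hat(k):
    # sample autocovariance at lag k
    return np.mean((x[: n - k] - x_bar) * (x[k:] - x_bar))

rho_hat = [gamma_hat(k) / gamma_hat(0) for k in range(5)]
print(rho_hat)  # roughly 0.7**k at lag k for this generator
```

Dividing by $\hat{\gamma}_0$ normalizes the autocovariance, so $\hat{\rho}_0 = 1$ and $|\hat{\rho}_k| \le 1$ for all lags.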
