1. Linear Applications: Markov Process and OLS
  1. Markov Revisited
  2. Earlier we considered finding the stationary distribution of a discrete Markov process. That is, $P$ is a $J\times J$ transition matrix for a system that takes on one of $J$ discrete states each period. Then $f^{t+1} = P f^t$ is how the system evolves over time. A stationary distribution over states is an $f^\star$ such that $f^\star = P f^\star.$ One way to find $f^\star$ is to iterate on $f^{t+1} = P f^t$ from some initial distribution $f^0$ until $\|f^{t+1}-f^t\|$ is small enough.

    For this solution $f^\star$ to be a proper distribution the initial value $f^0$ must be such that $\sum_i f^0_i = 1.$ Then if $P$ is a valid transition matrix its columns add up to 1, so $Pf^t$ will always sum to 1. (In words: start with a distribution that adds up to 100%; the transition simply moves that distribution around but keeps the total at 100%. No mass is lost or gained with a valid transition matrix $P$.)

    Another way to think about this process is as a linear system. That is, $$f^\star = P f^\star \quad\Rightarrow \quad (I-P)f^\star = \overrightarrow{0}. \tag{SD}\label{SD}$$ So the stationary distribution solves a particular system of equations of the usual form $Ax=b$: subtract $P$ from $I$ to get $A$ and set $b$ to a vector of 0s. However, notice something about this. Suppose we have a solution $f^\star$ and we double all the values: $$(I-P)2f^\star = 2(I-P)f^\star = 2\overrightarrow{0} = \overrightarrow{0}.\tag{SS}\label{SS}$$ Since $b$ is the zero vector, if there is one solution to the stationary values there are lots of other solutions: any scaling of a solution is still a solution. There is some linear algebra going on here: any vector in the "null space" of $I-P$ solves \eqref{SD}, so if a stationary distribution exists the $J\times J$ matrix $I-P$ must not be invertible. And if the system is not invertible then many vectors will solve \eqref{SD}.
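    This singularity is easy to check numerically. A quick sketch, again using a hypothetical $3\times 3$ column-stochastic $P$ (the specific numbers are illustrative):

```python
import numpy as np

# Hypothetical transition matrix; columns sum to 1.
P = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.3],
              [0.0, 0.1, 0.7]])

A = np.eye(3) - P  # the matrix in (I - P) f = 0

print(np.linalg.matrix_rank(A))  # 2, not 3: the system is singular
print(np.linalg.det(A))          # determinant is (numerically) zero
```

    Any vector in the null space of $A$, and any scaling of it, satisfies $(I-P)f = \overrightarrow{0}$, which is exactly why an extra condition is needed to pin down a unique distribution.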

    However, as discussed above, a proper stationary distribution must have values that sum to 1 (and are between 0 and 1). This is an additional linear equation that the solution must satisfy: $$\sum_i f^\star_i = 1 \quad\Rightarrow\quad \pmatrix{1 & 1 &\cdots & 1} f^\star = 1.\tag{Sum1}\label{Sum1}$$ It turns out we can substitute this row for any of the rows of the system in \eqref{SD}. If this is done, two things happen. First, the system is no longer singular (it becomes invertible). Second, the right-hand side is no longer just 0s. It turns out that if a stationary distribution exists, solving this new system will find it, and the distribution is unique.

    The idea is simple to describe but the notation is tricky: compute $I-P$, then replace the last row of that matrix with a row of 1s, and replace the last 0 on the right-hand side with a 1. Writing $(I-P)_{j*}$ for row $j$ of $I-P$, the result might be written: $$\pmatrix{ (I-P)_{0*} \cr (I-P)_{1*} \cr \vdots \cr (I-P)_{J-2,*} \cr 1 & \cdots & 1}f^\star = \pmatrix{0 \cr 0 \cr \vdots \cr 0 \cr 1}$$ Giving names to the matrix and vector so constructed, we can write this as $$Qf^\star = b^\star.\tag{SD2}\label{SD2}$$ Notice that this will give you the same result as iterating on the Markov process. The two approaches can differ in how long they take.
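    A minimal sketch of this construction in Python with NumPy, using a hypothetical $3\times 3$ column-stochastic matrix $P$ for illustration:

```python
import numpy as np

# Hypothetical transition matrix; columns sum to 1.
P = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.3],
              [0.0, 0.1, 0.7]])
J = P.shape[0]

Q = np.eye(J) - P  # start from I - P
Q[-1, :] = 1.0     # replace the last row with 1s (the Sum1 condition)

b = np.zeros(J)    # right-hand side: 0s ...
b[-1] = 1.0        # ... except the last entry, which is 1

f_star = np.linalg.solve(Q, b)
print(f_star)        # stationary distribution, unique if one exists
print(f_star.sum())  # sums to 1 by construction
```

    Unlike $I-P$, the modified matrix $Q$ is invertible, so a single call to a linear solver replaces the whole iteration.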
  3. OLS
  4. Not completed yet ... see lectures.

Exercises