Stationary Distributions:
Let $\mathbf{P}$ be the transition probability matrix of a homogeneous Markov chain $\{X_n, n \geq 0\}$. If there exists a probability vector $\mathbf{\pi}$ such that $$\mathbf{\pi} \mathbf{P} = \mathbf{\pi} \:\:\:\:\:\:\: (1)$$
then $\mathbf{\pi}$ is called a stationary distribution for the Markov chain.
Equation $(1)$ indicates that a stationary distribution $\mathbf{\pi}$ is a (left) eigenvector of $\mathbf{P}$ with eigenvalue $1$. Any nonzero scalar multiple of $\mathbf{\pi}$ is also such an eigenvector, but the stationary distribution is pinned down by the requirement that it be a probability vector;
that is, its components are nonnegative and sum to unity.
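As a numerical illustration, the following sketch solves $(1)$ for a hypothetical two-state chain (the matrix $\mathbf{P}$ below is an assumed example, not taken from the text) by computing the left eigenvector of $\mathbf{P}$ associated with eigenvalue $1$ and normalizing it to sum to unity:

```python
import numpy as np

# Hypothetical two-state transition matrix (illustrative only); rows sum to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# pi P = pi  <=>  P^T pi^T = pi^T, so pi is the eigenvector of P^T
# (equivalently, a left eigenvector of P) associated with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))   # column closest to eigenvalue 1
pi = np.real(eigvecs[:, idx])

# Any nonzero multiple of this vector is also an eigenvector; requiring the
# components to sum to unity fixes the stationary distribution.
pi = pi / pi.sum()

print(pi)                        # [0.83333333 0.16666667]
print(np.allclose(pi @ P, pi))   # True: pi P = pi
```

For this matrix the result is $\mathbf{\pi} = (5/6, \; 1/6)$, which can be checked by hand from $\mathbf{\pi}\mathbf{P} = \mathbf{\pi}$ together with $\pi_1 + \pi_2 = 1$.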
Limiting Distributions:
A Markov chain is called regular if there is a finite positive integer $m$ such that after $m$ time-steps every state has a nonzero probability of being occupied, no matter what the initial state. Write $A > 0$ to mean that every element $a_{ij}$ of $A$ satisfies $a_{ij} > 0$. Then, for a regular Markov chain with transition probability matrix $\mathbf{P}$, there exists a positive integer $m$ such that $\mathbf{P}^m > 0$; this condition can be checked directly by computing successive powers of $\mathbf{P}$, as in the sketch below.
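A minimal sketch of such a check, using an assumed $2 \times 2$ chain whose one-step matrix contains a zero entry (again a hypothetical example, not from the text):

```python
import numpy as np

# Hypothetical chain: from state 0 the next step is always state 1,
# so P itself has a zero entry.
P = np.array([[0.0, 1.0],
              [0.5, 0.5]])

def smallest_regularity_power(P, max_power=20):
    """Return the smallest m <= max_power with every entry of P^m > 0, else None."""
    Pm = np.eye(len(P))
    for m in range(1, max_power + 1):
        Pm = Pm @ P
        if np.all(Pm > 0):
            return m
    return None

print(smallest_regularity_power(P))   # 2: P has a zero entry, but P^2 > 0
```

Here $\mathbf{P}^2 > 0$, so the chain is regular even though $\mathbf{P}$ itself is not strictly positive. (The cutoff `max_power=20` is an arbitrary bound chosen for the illustration.)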
For a regular homogeneous Markov chain we have the following theorem:
Theorem:
Let $\{X_n, n \geq 0\}$ be a regular homogeneous finite-state Markov chain with transition matrix $\mathbf{P}$. Then
$$\lim \limits_{n \to \infty} \mathbf{P}^n = \mathbf{\hat{P}}$$
where $\mathbf{\hat{P}}$ is a matrix whose rows are identical and equal to the stationary distribution $\mathbf{\pi}$ for the Markov chain defined by $(1)$. In particular, regularity is a sufficient condition for $\mathbf{\pi}$ to be unique, and $P(X_n = j) \to \pi_j$ as $n \to \infty$ regardless of the initial distribution.
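The convergence can be seen numerically by raising the hypothetical two-state matrix from the earlier sketch to a high power:

```python
import numpy as np

# Same hypothetical two-state chain as above.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# High powers of P approach a matrix whose rows are all equal to pi.
Pn = np.linalg.matrix_power(P, 100)
print(Pn)
# [[0.83333333 0.16666667]
#  [0.83333333 0.16666667]]
```

Both rows agree with the stationary distribution $\mathbf{\pi} = (5/6, \; 1/6)$ computed earlier, so the distribution of $X_n$ forgets the initial state as $n$ grows.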