X: Branching Processes
We shall now apply our knowledge of Markov chains to an analysis of population growth. At each time epoch $t$, we let $X_t$ denote the number of individuals alive at time $t$. Each time epoch will represent a generation of a species, so that at each time interval, offspring are generated, and the current population dies off. We now make the assumption that each individual, independently of all others, produces a random number of offspring, and that these offspring counts share a common distribution across individuals and generations.
Now if $X_t = k$, and $Y_1, \dots, Y_k$ are independent random variables with the offspring distribution described above, then \begin{equation} \mathbf{P}(X_{t+1} = k' | X_t = k) = \mathbf{P}\left(\sum_{i = 1}^k Y_i = k' \right) \end{equation} Let $\mu$ denote the mean number of offspring produced by a single individual. Then by (1), \begin{equation} \mathbf{E}(X_{t+1}|X_t = k) = \mathbf{E}\left(\sum_{i = 1}^k Y_i \right) = \sum_{i = 1}^k \mathbf{E}(Y_i) = k\mu \end{equation} Conditioning on $X_t$ and applying (2) gives \begin{equation} \mathbf{E}(X_{t+1}) = \mathbf{E}[\mathbf{E}(X_{t+1}|X_t)] = \sum_{k = 0}^\infty \mathbf{P}(X_t = k) k \mu = \mu \mathbf{E}(X_t) \end{equation} and therefore, by induction, $\mathbf{E}(X_n) = \mu^n \mathbf{E}(X_0)$.
This already tells us the intuitive fact that if $\mu < 1$, the expected population size decays geometrically to zero, while if $\mu > 1$, it grows geometrically without bound.
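The identity $\mathbf{E}(X_n) = \mu^n \mathbf{E}(X_0)$ is easy to check by simulation. Below is a minimal Monte Carlo sketch; the offspring distribution $p_0 = 1/4$, $p_1 = 1/4$, $p_2 = 1/2$ (so $\mu = 5/4$) is an illustrative choice, not one fixed by the text.

```python
import random

def step(population, offspring_sampler):
    """Advance one generation: each individual independently produces
    offspring according to the common distribution, then dies."""
    return sum(offspring_sampler() for _ in range(population))

def simulate(x0, generations, offspring_sampler):
    """Return X_n, the population after the given number of generations."""
    x = x0
    for _ in range(generations):
        x = step(x, offspring_sampler)
    return x

# Illustrative offspring distribution: 0, 1, or 2 children with
# probabilities 1/4, 1/4, 1/2, so mu = 0/4 + 1/4 + 2/2 = 5/4.
sampler = lambda: random.choices([0, 1, 2], weights=[1, 1, 2])[0]

random.seed(0)
trials, n = 20000, 5
avg = sum(simulate(1, n, sampler) for _ in range(trials)) / trials
# The sample mean of X_5 should be close to mu^5 * E(X_0) = (5/4)^5.
print(avg, (5 / 4) ** n)
```

With $20{,}000$ trials the sample mean typically lands within a few hundredths of $(5/4)^5 \approx 3.05$, in line with the derivation above.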
Even if $\mu \geq 1$, there is a chance that the population will become extinct. The problem of this section is to derive this probability in terms of the reproduction probabilities. Let $a_n(k)$ denote the probability of extinction within $n$ generations, given that we start with $k$ individuals. Since the event of extinction within $n$ generations increases to the event of eventual extinction, the probability of eventual extinction from $k$ individuals is $a(k) = \lim_{n \to \infty} a_n(k)$. Since all $k$ branches of the population act independently, we have $a_n(k) = a_n(1)^k$, and it suffices to determine $a(1)$, which we shall denote $a$. Furthermore, for computation, denote the probability of $k$ offspring from a single individual by $p_k$. If we look one generation ahead, then \begin{equation} a = \mathbf{P}(\text{extinction}|X_0 = 1) = \sum_{k = 0}^\infty \mathbf{P}(X_1 = k | X_0 = 1) \mathbf{P}(\text{extinction} | X_1 = k) = \sum_{k = 0}^\infty p_k a^k \end{equation} Thus $a = \varphi_1(a)$, where $\varphi_1$ is the generating function of $X_1$, assuming $X_0 = 1$. If we let $\varphi_n$ be the generating function of $X_n$, then \begin{align} \varphi_n(a) &= \sum_{k = 0}^\infty \mathbf{P}(X_n = k) a^k = \sum_{k = 0}^\infty \sum_{j = 0}^\infty \mathbf{P}(X_1 = j) \mathbf{P}(X_n = k | X_1 = j) a^k\\ &= \sum_{j = 0}^\infty p_j \sum_{k = 0}^\infty \mathbf{P}(X_n = k | X_1 = j) a^k = \sum_{j = 0}^\infty p_j \sum_{k = 0}^\infty \mathbf{P}(X_{n-1} = k | X_0 = j) a^k \end{align} where the last equality uses the Markov property and the time homogeneity of the chain. Now, conditional on $X_0 = j$, the variable $X_{n-1}$ is distributed as the sum of $j$ independent random variables $Y_1, \dots, Y_j$, each distributed as $X_{n-1}$ conditional on $X_0 = 1$, so that each has the generating function $\varphi_{n-1}$. In tandem with (6), \begin{align} \varphi_n(a) = \sum_{j = 0}^\infty p_j \mathbf{E}(a^{X_{n-1}} | X_0 = j) = \sum_{j = 0}^\infty p_j \varphi_{n-1}(a)^j = \varphi_1(\varphi_{n-1}(a)) \end{align} Using (7), we can recursively determine $a_n(1) = \mathbf{P}(X_n = 0 | X_0 = 1) = \varphi_n(0)$ by iterating $\varphi_n(0) = \varphi_1(\varphi_{n-1}(0))$; letting $n \to \infty$ then yields the extinction probability $a$.
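The recursion $\varphi_n(0) = \varphi_1(\varphi_{n-1}(0))$ can be carried out numerically as a fixed-point iteration starting from $0$. A minimal sketch, again using the illustrative offspring distribution $p_0 = 1/4$, $p_1 = 1/4$, $p_2 = 1/2$ (so $\mu = 5/4 > 1$):

```python
def extinction_probability(p, tol=1e-12, max_iter=100000):
    """Compute a = lim a_n(1) by iterating a_n(1) = phi_1(a_{n-1}(1)),
    where p[k] is the probability of k offspring from one individual."""
    phi = lambda a: sum(pk * a ** k for k, pk in enumerate(p))
    a = 0.0
    for _ in range(max_iter):
        a_next = phi(a)
        if abs(a_next - a) < tol:
            break
        a = a_next
    return a

# Illustrative distribution: phi_1(a) = 1/4 + a/4 + a^2/2, whose fixed
# points solve 2a^2 - 3a + 1 = 0, i.e. a = 1/2 or a = 1.  Starting the
# iteration at 0 converges to the smaller root, 1/2.
print(extinction_probability([0.25, 0.25, 0.5]))
```

Note that $a = 1$ is always a fixed point of $\varphi_1$ (since $\varphi_1(1) = \sum_k p_k = 1$); starting from $\varphi_1(0) = p_0$ and iterating picks out the limit of $a_n(1)$, which here is $1/2$, not the trivial root.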