Statistics 650 Applied Stochastic Processes
Spring 2018, MW 5-6:15pm, Mth 0409
A directory containing old final exams and sample exam problems, and also a list of topics we emphasized in this term's course, can be found here.
Please fill out the on-line evaluation form for this course and instructor at http://CourseEvalUM.umd.edu. Thank you.
Instructor: Eric Slud, Statistics Program, Math. Dept.
Office: Mth 2314, x5-5469, email evs@math.umd.edu
Office hours: M 10:30-11:30am, W 1:30pm, or by appointment.
Required Course Text:
R. Durrett, Essentials of Stochastic Processes,
1st ed. 1999, 2nd ed. 2010, 3rd ed. 2016, Springer.
This is a good book, with good intuitive explanations and many
interesting problems.
The beta 2nd edition is available free online and also here.
Recommended Texts: Karlin, S. and Taylor, H.,
A First Course in Stochastic Processes,
2nd ed. 1975,
Academic Press.
This is an excellent standard book, with thorough treatment of all important topics and
a broad selection of applications. However, its problems are (much) too hard.
Additional Recommended Course Texts (used in Spring 2016):
R. Serfozo, Basics of Applied Stochastic Processes,
2009 Springer.
M. Lefebvre, Applied Stochastic Processes,
2007 Springer.
Both of these books can be freely downloaded as e-books (through the University of Maryland libraries) by currently registered University of Maryland College Park students, who can also purchase hard-copies for $25.00. (Inquire at the EPSL Library for details.)
Both books are well-written mathematical treatments of our subject, but the first is written at a slightly more advanced level, assuming measure-theoretic probability. I may draw some problems from these books (but will reproduce them in a convenient form).
Updated Homework Solutions
Info about Mid-term Test
Info about Final Examination
Sample Final-Exam Problems
Course Coverage: The core material of the course consists of Chapters 1-4 in the Serfozo book, with considerable skipping of more advanced topics, and Chapters 3, 5, and 6 in the Lefebvre book: the primary topics are Discrete Time Markov Chains, Poisson Processes, Continuous Time Markov Chains, and Renewal Processes. See detailed Course Outline below. The primary applications are population and queueing processes, but many other special applied examples occur in both books.
Overview: This course is about Stochastic Processes, or
time-evolving collections of random variables, primarily about the discrete-state
continuous time Markov chains which arise in applications in a variety of disciplines.
For the first part of the course, both the random variables and the time index-set are discrete:
in this setting, our object of study is discrete-time discrete-state Markov chains.
Examples of "states" which arise in applications include the size of a population or a
waiting-line, or the state ("in control" versus "out of control") of a manufacturing process,
or other indicators such as "married" or "employed" etc. for an individual. "Markov chains" are
time-evolving systems whose future trajectory does not depend on their past history,
given their current state. But many of the most interesting applications involve the generalization
of the same ideas to continuous time.
Probability theory material needed throughout this course includes joint probability laws,
probability mass functions and densities, conditional expectations, moment generating functions,
and an understanding of the various kinds of probabilistic convergence, including the Law of
Large Numbers.
Various technical tools developed in the course and used in an essential way include: Laplace
transforms and moment generating functions, methods of solving recursions and systems of
difference equations, ergodic and renewal theorems for Markov chains, and (discrete-time)
martingales.
Prerequisite: Stat 410, or Math 410 plus one semester of probability theory.
Course requirements and Grading: There will be 8 graded homework sets (one every 1½ weeks), which together will count for 50% of the course grade. In all homework problems, your solution will be graded correct only if you fully justify your reasoning. There will also be an in-class test and a final examination, which will count 20% and 30%, respectively, toward the overall course grade.
Assignment 1.
(First 1.5 weeks of course, HW due Fri., Feb. 2).
Read the Review of Probability Chapter plus Sections 1.1 to 1.3.
Then solve and hand in the following problems (8 in all):
(1) Which of the following two functions F(x,y) can possibly be the joint
distribution function P(X ≤ x, Y ≤ y) of a pair of jointly distributed
random variables X, Y? For each part, if F is a proper joint d.f.,
find the marginal density functions for X and Y.
(i) F(x,y) = 1 - e^{-x-y} if x,y ≥ 0,
and = 0 otherwise.
(ii) F(x,y) = 1 - e^{-x} - x e^{-y} if 0 ≤ x ≤ y,
1 - e^{-y} - y e^{-y} if 0 ≤ y ≤ x,
and = 0 otherwise.
(2) Show that it is not possible to load [assign unequal probabilities to the possible outcomes
1,..,6 for each of] two dice in such a way that when they are tossed independently, the total
number of dots showing is equally likely to be any of the integers 2, 3, ..., 12.
(3) In the game of craps when the "shooter's point is 4", his objective is to throw a sum of 4 dots showing
with two [independent, fair] dice before a sum of 7 dots appears, in repeated tosses of the two dice. Calculate in
three different ways the probability x that he succeeds:
(i) Consider the outcome of the
first roll of the dice, to conclude that x = 3/36 + (27/36) x and solve.
(ii) Let A_{k} be the event that the sum is k and justify why
x = P(A_{4}) / P( A_{4} ∪ A_{7}).
(iii) Calculate directly
and then sum up over k the probabilities that 4 occurs before 7 and this happens on the
k'th trial.
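After solving the recursion in (i) by hand, a quick seeded Monte Carlo run is one way to sanity-check the value it produces (the sample size and seed below are arbitrary choices):

```python
import random

# Exact value from the first-roll recursion in (i): x = 3/36 + (27/36) x,
# since P(sum = 4) = 3/36, P(sum = 7) = 6/36, P(neither) = 27/36.
x_exact = (3 / 36) / (1 - 27 / 36)

def four_before_seven(rng):
    """Roll two fair dice repeatedly; True if a sum of 4 appears before a 7."""
    while True:
        s = rng.randint(1, 6) + rng.randint(1, 6)
        if s == 4:
            return True
        if s == 7:
            return False

rng = random.Random(650)
n = 100_000
x_mc = sum(four_before_seven(rng) for _ in range(n)) / n
print(x_exact, round(x_mc, 4))
```

The Monte Carlo estimate should agree with the recursion's solution to within ordinary sampling error.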
(4) For two independent random variables X ~ Uniform[0,4] and Y ~ Uniform[2,5],
find (a) E(Y-X | Y > X) , and (b) the joint
probability distribution of (min(X,Y), Y - min(X,Y)). Note that the second
component random variable in this pair has a probability atom at 0, i.e. a positive
probability of being equal to 0.
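For checking an answer to part (a) of problem (4), a seeded Monte Carlo sketch of the conditional expectation is easy to write (the sample size and seed are arbitrary; this is a numerical check, not a substitute for the exact calculation):

```python
import random

rng = random.Random(2018)
n = 200_000
num = 0.0   # running sum of (Y - X) on the event {Y > X}
cnt = 0     # number of samples with Y > X
for _ in range(n):
    x = rng.uniform(0, 4)   # X ~ Uniform[0, 4]
    y = rng.uniform(2, 5)   # Y ~ Uniform[2, 5]
    if y > x:
        num += y - x
        cnt += 1
est = num / cnt  # Monte Carlo estimate of E(Y - X | Y > X)
print(round(est, 3))
```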
plus 4 more problems about Markov chain basics:
Lefebvre book
#12 p.145 and #13 p.151,
and Durrett (online Beta 2nd ed. version), #1.7 and 1.8(c),(d) pp. 87-88.
Assignment 2.
(Second 2 weeks of course, HW due Fri., Feb. 16, 11:59pm).
Read Chapter 1 Sections 1.3 to 1.5.
Then solve and hand in the following problems (7 in all):
(1) Prove the next-to-last statement in the proof of (1.2) [mistakenly cited as "(4.1)"] on
page 16, namely:
prove that for all i, j, k ∈ S and all
m, n ≥ 1 , P(X_{m+n}=j | X_{m}=k,
X_{0}=i) = P(X_{m+n}=j | X_{m}=k).
(2) Suppose that (X_{k}, k=0,1,2, ...) is a temporally homogeneous irreducible Markov
chain on a state-space S with a finite number K
of states and with K x K transition matrix Q with entries
Q_{ij} = p(i,j). Prove that all entries of Q^{n} for
n ≥ K are strictly positive under the additional assumption that
Q_{ii} > 0 for all i ∈ S, but not (in general) without that
assumption.
(3) Suppose that (X_{k}, k=0,1,2, ...) is a temporally homogeneous Markov
chain on state-space S equal to the nonnegative integers. Suppose that
p(i,j)=0 whenever j > i+20, that p(i,i+1)>0 for all i,
and that for all n ≥ 10, p(n,0) ≥ 1/n^{1/2}.
Prove from these assumptions that
(a) for all initial states i, P(min{k ≥ 1: X_{k}
∈ {0,1,...,9}} < ∞ | X_{0}=i) = 1, and that therefore
(b) the Markov chain is irreducible and recurrent.
plus problems # 1.13 and 1.15 on pp.92-93 and 1.47 on p.98 of Durrett,
and Question #20 on pp.147-148 of Lefebvre.
Assignment 3.
(Next 2 weeks of course, HW due Fri., March 2, 11:59pm).
Read the rest of Chapter 1 of Durrett.
Then solve and hand in the following problems (7 in all):
(1) (a) Prove that the Branching Process Markov Chain on {0,1,2,...} with offspring distribution p(k) = 1/4
for k=1,2 and p(0) = 1/2 has all states transient other than {0} which is the
only irreducible closed class. (b) Prove that if the transitions from 0 in the chain
in (a) are modified so that p(0,1) = .2 and p(0,0) = 0.8, then the chain
becomes irreducible and positively recurrent. If you can, find an upper bound on
E_{0}(T_{0}).
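The chain in problem (1)(a) has mean offspring number 1·(1/4) + 2·(1/4) = 3/4, so it is subcritical. A seeded simulation (the run count and generation horizon below are arbitrary illustrative choices) shows the population dying out in every run, consistent with the transience of the positive states:

```python
import random

def next_generation(x, rng):
    """One branching-process step: each of x individuals independently has
    0, 1, or 2 offspring with probabilities 1/2, 1/4, 1/4 (mean 3/4)."""
    total = 0
    for _ in range(x):
        u = rng.random()
        if u < 0.5:
            total += 0
        elif u < 0.75:
            total += 1
        else:
            total += 2
    return total

rng = random.Random(123)
n_runs, horizon = 2000, 200
extinct = 0
for _ in range(n_runs):
    x = 1                      # start from a single ancestor
    for _ in range(horizon):
        if x == 0:
            break
        x = next_generation(x, rng)
    if x == 0:
        extinct += 1
print(extinct / n_runs)
```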
(2) Recall the discussion early in the course about the (random) time T_{(1,2,3)} to
first see 1,2,3 appear successively in an iid sequence of uniform{0,1,...,9} random
digits. (a) Formulate the sequence of successive triplets seen as a Markov Chain on
{0,1,..,9}^{3}, and show that this chain is
irreducible aperiodic and not reversible. (b) Find E(T_{(1,2,3)}). I do not specify an
initial state because I have in mind the expected waiting time from the beginning of generating the
iid sequence of random digits. In the context of the Markov Chain that you define in part (a), you
can take as your initial state any triplet not ending in 1,2,or 3.
plus problems # 1.52 and 1.54 on pp.99-100, 1.63 and 1.67 on pp.102-103, and 1.75 on p.105
of Durrett.
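An answer to problem (2)(b) above can be sanity-checked by simulating the digit stream directly; the replication count and seed below are arbitrary choices for illustration:

```python
import random

def waiting_time_123(rng):
    """Number of iid Uniform{0,...,9} digits drawn until the pattern
    1,2,3 first appears as a consecutive triple."""
    last3 = []
    t = 0
    while True:
        t += 1
        last3 = (last3 + [rng.randrange(10)])[-3:]  # keep last 3 digits
        if last3 == [1, 2, 3]:
            return t

rng = random.Random(42)
n = 3000
est = sum(waiting_time_123(rng) for _ in range(n)) / n
print(round(est, 1))
```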
Assignment 4.
(Next 2 weeks of course, HW due Wed., March 14).
Read Chapter 2 of Durrett. Begin Chapter 4 by reading all of Section 4.1.
After reading Chapter 2, solve and hand in the following problems (8 in all, worth 9 points each):
problems # 2.14, 2.17, 2.24, 2.35, 2.38, 2.43, 2.48, 2.56 on pp.129-136 of Durrett.
Assignment 5.
(Next 2 weeks of course, HW due Fri., March 30, 11:59pm.).
Read Chapter 4 of Durrett through Section 4.3.
Solve and hand in the following problems (6 in all, worth a total of 65 points):
problems # 4.3, 4.8, 4.22, 4.28, 4.29 of Durrett each worth 10 points, plus 1 additional problem worth
15 points:
(I) Suppose that X_{t} for t ≥ 0 is a homogeneous continuous-time Markov chain with
states S = {1,2,3} and transition rates q_{ab} = b for all a,b ∈ S.
(a) Find E_{b}(T_{b}).
(b) Find the limit for large t of P_{1}(X(t) = b).
(c) Find the limit for large integer n of E_{1}(τ_{n}/n), where τ_{0} = 0 and
for k ≥ 0, τ_{k+1} = inf{ t > τ_{k}: X(t) ≠ X(τ_{k}) }.
Assignment 6.
(Next 2 weeks of course, HW due Fri., April 13, 6pm.).
Read the rest of Chapter 4 of Durrett, and also Sections 3.1 and 3.2.
Solve and hand in the following problems (7 in all):
problems # 4.24, 4.32, 3.9, and 3.13 of Durrett, plus 3 additional problems:
In connection with problems (A)-(B) below, note that a continuous-time
irreducible HMC is recurrent if and only if its embedded discrete-time chain is
recurrent. (The reason is that recurrence is equivalent to saying
that with probability 1 the chain starting from state i
returns to i infinitely often, and this has to do only with
the sequence of successive states entered by the chain and not at all with the
time it takes to make the transitions. So this holds even if the chain is explosive
and makes infinitely many jumps in finite time.)
For clarity in these problems, assume that all Markov chains
are irreducible and in (A) interpret "positive-recurrent" to mean that the expected
time to re-enter any specified state is finite.
(A) Show that the finite-state continuous-time Markov Chain X(t) is positive-recurrent
if its discrete-time embedded chain X(τ_{k}) is positive-recurrent.
(B) Give an example, in terms of the continuous-time ladder chain X(t) with state-space
S = {0,1,2,...}, with rates q_{k,0} = a_{k}, q_{k,k+1} = b_{k} and all other rates
q_{ij} = 0, of a positively recurrent continuous-time chain which has a null-recurrent
embedded discrete-time chain X(τ_{k}).
(C) Find the exact transition probabilities P_{t}(a,b) of the HMC with 3x3
intensity matrix Q which has entries 1 off the diagonal and -2 on the diagonal,
by solving the forward or backward differential equation system explicitly.
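Whatever closed form you derive in (C) can be checked numerically by approximating exp(tQ) with a truncated power series; this is only a verification tool (the term count is an arbitrary choice, adequate because the entries of tQ are small here), not a solution method:

```python
def mat_mul(A, B):
    """Product of two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm_series(Q, t, terms=60):
    """Approximate P_t = exp(tQ) by summing (tQ)^k / k! for k < terms."""
    n = len(Q)
    P = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    term = [row[:] for row in P]                                # (tQ)^0 / 0!
    for k in range(1, terms):
        term = mat_mul(term, [[t * q / k for q in row] for row in Q])
        P = [[P[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return P

# Intensity matrix with entries 1 off the diagonal and -2 on it.
Q = [[-2, 1, 1], [1, -2, 1], [1, 1, -2]]
P1 = expm_series(Q, 1.0)   # transition probabilities at t = 1
print([round(p, 6) for p in P1[0]])
```

Each row of the output should sum to 1, and evaluating your closed form at the same t should reproduce these entries.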
Assignment 7.
(Next 2 weeks of course, HW due Fri., April 27, 11:59pm.).
Read the rest of Chapter 3 of Durrett, and also Section 5.2.
Solve and hand in the following problems (7 in all):
problems # 3.18, 3.20 in Durrett [and for 3.20 derive a renewal equation for the desired probability
and use the Key Renewal Theorem], and problems 4, 14 and 19 in Serfozo's book pp. 157-160, plus the
following 2 extra problems:
(I) Suppose that Y ~ F is a service-time random variable, and that N(t) is an
independent Poisson(λ) process. Let M denote the number of jumps in N(t), viewed
as new queueing-system arrivals, over the course of the service-time Y.
(a) Find the probability generating function of the nonnegative-integer-valued
random variable M, and use it to find the mean and variance of M.
(b) Use the result in (a), together with what you know about branching-process
extinction probabilities, to give another proof that the M/G/1 queue has
probability 1 of emptying out, regardless of the length of its initial
waiting-line, when λ < μ, using the idea of Example 1.51 covered in class.
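In (I), M given Y is Poisson(λY), so M can be simulated as a two-stage draw. The sketch below uses the illustrative choice Y ~ Exponential(1) (my assumption purely for the demonstration; the problem leaves F general) and compares the sample mean of M against λ times the sample mean of Y, as a numerical check on whatever mean formula you derive in (a):

```python
import math
import random

def poisson_sample(mu, rng):
    """Knuth's product method for drawing Poisson(mu), fine for small mu."""
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(7)
lam = 0.8          # illustrative Poisson arrival rate
n = 100_000
ys, ms = [], []
for _ in range(n):
    y = rng.expovariate(1.0)                    # illustrative Y ~ Exp(1)
    ys.append(y)
    ms.append(poisson_sample(lam * y, rng))     # M | Y ~ Poisson(lam * Y)
mean_m = sum(ms) / n
mean_y = sum(ys) / n
print(round(mean_m, 3), round(lam * mean_y, 3))
```

The two printed numbers should agree up to Monte Carlo error.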
(II) Show that it follows from the Blackwell Renewal Theorem for a continuous-time
irreducible null-recurrent HMC X(t) on a countable state-space S that for any
states a, b ∈ S, lim_{t → ∞} p_{t}(a,b) = 0.
Assignment 8.
(Next 2 weeks of course, HW due Fri., May 11, 4:30pm.).
Read the rest of Chapter 5 of Durrett, and also Ch. 6.
Solve and hand in the following problems (5 in all, worth a total of 75 points):
(A) (10 points) Suppose that X_{k+1} = ∑_{1 ≤ j ≤ X_{k}} Z_{k,j}
is a Branching process, where Z_{k,j} are all iid
(nondegenerate) integer-valued random variables with mean
μ, where X_{0}=1. Show
that X_{k}/μ^{k} is a
martingale, and conclude that
X_{k} converges to 0 with probability 1 if μ ≤ 1. Hint:
Use the fact
that X_{k} is both an HMC and converges to a finite-valued random
variable X with mean ≤ 1.
(B) (20 points) Let X_{n} be a discrete-time Birth-death HMC with
0 < λ_{k} ≤ μ_{k} for all k ≥ 1, and
λ_{0}=1.
(a). Show that this HMC is irreducible and recurrent.
(b). If λ_{k} = μ_{k} for all k ≥ 1, and N_{k}(0) is the number of
j ∈ {1,...,k} for which X_{j}=0, then show that
E_{0}(X_{n}) = E_{0}(N_{n-1}(0)) + 1 for all n ≥ 1,
where N_{0}(0) is defined to be 0.
(c). Use (b) or another method to develop a
formula for the limit for large n of E_{0}(N_{n+k}(0)-N_{n}(0)).
(d). Still under the assumption that λ_{k} = μ_{k} for k ≥ 1,
find P_{1}( X_{n} hits M before 0 ).
(C) (10 points) Suppose that X_{n} is a fair random walk on the d-dimensional
Cartesian product S of the
(positive and negative) integers, i.e. the set of vectors of integers
of length d, with each
coordinate of X_{k+1}- X_{k} equal to ± 1 with
probability 1/2, independently of the other coordinates.
Prove that X_{n} visits all
2^{d} orthants of S infinitely often, with probability 1, even though (it can
be shown, but you need not show it for this exercise) for all d ≥ 3,
{ X_{n} } is an irreducible transient HMC.
(D) (10 points) If Y_{n} is a Homogeneous Markov Chain
(discrete-time) on the state-space
S = {0,1,2,... } with transitions p(k,0) ≥ (1/k)
∑_{j: j ≥ k} (j-k) p(k,j) for all k > 100 and p(x,x+1) > 0
whenever x ≥ 0, then show that the chain Y_{n} is irreducible
recurrent.
This problem has been corrected (inserting the term j-k inside the sum
over j ≥ k) at a relatively late stage.
So if you have already spent time on the problem and
hand in a fully developed counterexample to
the incorrect older statement, you will get full
credit. Otherwise do the problem in its current version.
(E) (25 points) Consider an M/G/1 queueing system with Poisson(0.5) arrivals and
Uniform[0,1] service
times (in hours), with the additional protocol or rule of operation that
whenever there are at least 2 persons
waiting on line (in addition to the one currently in
service), the doors of the service facility close so that
no new arrivals can enter the system
until the system empties out (i.e. until no one is left waiting and the
last person's service
time is complete), at which time the doors re-open.
(a). Find the long-term proportion of the time during which the system is busy.
(b). Find the long-term proportion of arriving customers who are allowed to enter the system.
(c). Find the average number of hours that an arriving customer who enters the system spends
until that
person's service is completed.
(d). Suppose that the system's owners receive $10 from each customer who enters and completes
his service
in less than 1 hour and $5 from each customer who enters and completes his
service in > 1 hour. Derive
a renewal equation for the expected income of the service
facility during the first t hours of operation (for
real values of t ≥ 0).
Handouts and Other Materials
(1). Handout on a non-Markovian example constructed
by lumping 2 of the states in a 3-state Homogeneous Markov Chain.
(2). Handout of 5 pages from Durrett's "Essentials of Stochastic Processes" book
containing proofs for equivalent conditions to positive-recurrence,
for aperiodic irreducible HMC's.
(3). Handout clarifying two loose ends from recent classes: (1) the
condition for existence of a stationary probability distribution for recurrent Birth-Death chains, and (2) the
notational verification as in Durrett that the "cycle trick" provides a stationary distribution for an
irreducible positive-recurrent HMC.
(4). Handout on existence of invariant vectors for countable-state irreducible
transient Homogeneous Markov Chains.
(5). Handout on the uniqueness of solutions of the
Kolmogoroff forward and backward equations when the rates q_i of leaving states are uniformly bounded.
(6). A Sample Test containing short-answer problems like the ones to appear on the April 2
in-class test can be found here. Note that this sample test does
not have Poisson process problems, although they are in scope for the Spring 2018 mid-term.
(7). Handout on the main limit theorems of renewal theory,
taken from 2 pages of the Grimmett and Stirzaker book, "Probability and Random Processes".
Mid-Term Test
A mid-term test will be given in class on Monday, April 2, 2018. It will be easier than the HW problems, consisting of short-answer questions emphasizing definitions and the application of theorems. On the test, you will not be permitted to use your books, but you may bring up to two notebook sheets of notes for reference.
Sample problems and information about specific coverage of the test can be
found here.
Sample Final Exam
You can find sample final-exam problems here. By all
means try them and ask questions about them, either in class or in
the Review Session to be held Friday, May 11, in our classroom from 4:30 to 6pm.
The Final Examination will be held from 4 to 6pm on Wednesday, May 16, 2018 in the
regular classroom. It will have 5 or 6 problems. The coverage is cumulative of
all material from the Durrett book assigned and covered in class throughout the
semester. Also, you will be allowed two 2-sided notebook sheets of reference material
for the exam. But the exam will otherwise be closed-book.
There will be a review session on Friday, May 11, 4:30-6pm, for the Final Exam, in
the regular classroom.
The UMCP Math Department home page.
The University of Maryland home page.
My home page.
© Eric V Slud, May 11, 2018.