Statistics 650  Applied Stochastic Processes

Spring 2020, MW 5-6:15pm, MTH (Kirwan Hall) 0103

After Spring Break, starting March 30, 2020, this course will be delivered only in an online format, through ELMS/Canvas and Zoom tools. Major announcements related to homework assignments will continue to be posted on this website as well as on ELMS. However, essential course materials including lecture materials, tests, discussion boards, and online office hours will be accessible only through your student accounts on ELMS, into which you log in with University credentials at https://elms.umd.edu/.

This web-page for STAT 650 is adapted from one last used in Spring 2018. The main difference is that in Spring 2020 the required textbook is the Serfozo book listed below, which is available free as an e-book through the University of Maryland Library catalog. The texts listed below other than Karlin and Taylor -- including the Durrett book required in Spring 2018 -- are also available as free e-books and are still Recommended Texts for Spring 2020.


A directory containing old final exams and sample exam problems, and also a list of topics we emphasized in the Spring 2018 term's course, can be found here.


Instructor: Eric Slud, Statistics Program, Math. Dept.

Office:    MTH (Kirwan Hall) 2314, x5-5469, email slud@umd.edu

Office hours:    M 11am-12noon, W 1-2pm, or by appointment.

Required Course Text:
R. Serfozo, Basics of Applied Stochastic Processes, 2009 Springer.


This book is somewhat more formal about theorems and proofs than Durrett, and uses some real-analysis language and the notation of "sigma-algebras", but is presented at the level of a slightly strengthened version of undergraduate Probability (STAT 410) and does not require any knowledge of measure theory.

Recommended Texts:
Karlin, S. and Taylor, H., A First Course in Stochastic Processes, 2nd ed. 1975, Academic Press.

This is an excellent standard book, with thorough treatment of all important topics and lots of applications. However, its problems are (much) too hard.

Additional Recommended Course Texts (used in Spring 2016 and 2018):
R. Durrett, Essentials of Stochastic Processes, 1999, 2nd ed. 2010, 3rd ed. 2016, Springer.

This is a good, readable book, with intuitive explanations and many interesting problems. The beta 2nd edition is available free online and also here.

M. Lefebvre, Applied Stochastic Processes, 2007 Springer.

This book is definitely easier than the others; it is readable at a more basic level and is a source for easier problems and more straightforward applications.

The Serfozo and Lefebvre books can be freely downloaded as e-books (through the University of Maryland libraries) by currently registered University of Maryland College Park students, who can also purchase hard-copies for $25.00. (Inquire at the EPSL (STEM) Library for details.)


Current HW Assignment   |   Updated HW Solutions   |   Info about Mid-term Test   |   Info about Final Examination   |   Sample Final-Exam Problems


Course Coverage: The core material of the course consists of Chapters 1-4 in the Serfozo book, with considerable skipping of more advanced topics, and Chapters 3, 5, and 6 in the Lefebvre book: the primary topics are Discrete Time Markov Chains, Poisson Processes, Continuous Time Markov Chains, and Renewal Processes. See detailed Course Outline below. The primary applications are population and queueing processes, but many other special applied examples occur in both books.

Overview:    This course is about Stochastic Processes, that is, time-evolving collections of random variables, and primarily about the discrete-state Markov chains (in discrete and, later, continuous time) which arise in applications in a variety of disciplines. For the first part of the course, both the random variables and the time index-set are discrete: in this setting, our object of study is discrete-time, discrete-state Markov chains. Examples of "states" which arise in applications include the size of a population or a waiting line, the state ("in control" versus "out of control") of a manufacturing process, or other indicators such as "married" or "employed" for an individual. "Markov chains" are time-evolving systems whose future trajectory does not depend on their past history, given their current state. Many of the most interesting applications involve the generalization of the same ideas to continuous time.
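In symbols (a standard formulation for a time-homogeneous discrete-time chain, stated here only as an illustration; the notation X_n for the state at time n and p_{i,j} for the one-step transition probability is generic, not tied to any particular text), the Markov property says that

    P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0)  =  P(X_{n+1} = j | X_n = i)  =  p_{i,j}

for every time n, all states i, j, and every past history i_0, ..., i_{n-1} for which the conditioning event has positive probability.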

Probability theory material needed throughout this course includes joint probability laws, probability mass functions and densities, conditional expectations, moment generating functions, and an understanding of the various kinds of probabilistic convergence, including the Law of Large Numbers.

Various technical tools developed in the course and used in an essential way include: Laplace transforms and moment generating functions, methods of solving recursions and systems of difference equations, ergodic and renewal theorems for Markov chains, and (discrete-time) martingales.
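As a rough, optional illustration of the ergodic theorem in the discrete-time, discrete-state setting, here is a minimal Python sketch (not course material; the 3-state transition matrix P is a made-up example) comparing the long-run fraction of time a simulated chain spends in each state with the stationary distribution π obtained by solving π P = π, Σ_i π_i = 1:

    import numpy as np

    # Hypothetical 3-state transition matrix (rows sum to 1); purely illustrative.
    P = np.array([[0.5, 0.3, 0.2],
                  [0.2, 0.6, 0.2],
                  [0.3, 0.3, 0.4]])

    rng = np.random.default_rng(0)
    n_steps = 100_000
    state = 0
    counts = np.zeros(3)
    for _ in range(n_steps):
        state = rng.choice(3, p=P[state])   # one step of the chain
        counts[state] += 1
    empirical = counts / n_steps            # long-run occupation frequencies

    # Stationary distribution: solve pi P = pi together with sum(pi) = 1.
    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    print("empirical frequencies:", np.round(empirical, 3))
    print("stationary pi:        ", np.round(pi, 3))

By the ergodic theorem for irreducible positive-recurrent chains, the two printed vectors should agree up to simulation error.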

Prerequisite:   Stat 410 plus additional background in mathematical analysis and proofs, or Math 410 plus one semester of probability theory.

Course requirements and Grading: There will be 6 or 7 graded homework sets (one every 1½ to 2 weeks), which together will count for 50% of the course grade. In all homework problems, your solution will be graded correct only if you fully justify your reasoning. There will also be an in-class test and a final examination, which will count 20% and 30%, respectively, toward the overall course grade.
NOTE (as of March 30): There will be 6 HW's in all. The Midterm will be an online 60-minute quiz administered through ELMS. The Final will be take-home. The weights used in computing the final grade will be changed to: HW 60%, Midterm 15%, Final 25%. Details of these course changes, the revised Syllabus, and the mode of delivery of Lectures, Office Hours and Discussions can be found in the ELMS Pages and Files and here.



STAT 650 SYLLABUS / COURSE OUTLINE    Revised 3/13/2020 syllabus here

0. Probability Review.     (Durrett Appendix A, Lefebvre, Chapter 1; Serfozo, Sec.1.22 & Ch.6.)     1--2 Lectures
(a) Probability spaces, countable additivity.
(b) Conditional expectation, change-of-variable and transformations of r.v.'s.

1. General Definitions and Properties of Stochastic Processes     (Class Notes and Lefebvre, Chapter 2.)     1 Lecture
(a) Spaces of trajectories, Kolmogorov Consistency Theorem (infinite sequences of states).
(b) Independent increments, stationarity and wide-sense stationarity, Markov property, ergodicity

2. Discrete-time Discrete-State Markov Chains.     (Serfozo, Chapter 1 and Durrett Chapter 1)     5 Lectures
(a) Markov property. Examples of Markov & non-Markov random sequences.
(b) Multistep transition probabilities. Chapman-Kolmogorov equation.
(c) "First-step analysis" and branching processes
(d) Classification of states.
(e) Notions of limiting behavior. Reducibility. Recurrence. Steady state.
(f) Time reversibility, exit distributions, and other topics.

3. Poisson Processes.     (Serfozo Chapter 3, and Durrett Chapter 2)     4 Lectures
(a) Equivalent Definitions of Poisson Process
(b) Nonhomogeneous Poisson, Compound Poisson, Superposition and Thinning

4. Renewal Processes.     (Serfozo Chapter 2, and Durrett Chapter 3)     3 Lectures
(a) Laws of Large Numbers, Renewal Equation.
(b) Queueing theory examples. Asymptotic behavior of renewal reward processes.

5. Continuous time Markov Chains.     (Serfozo Chapter 4, and Durrett Chapter 4)     4 Lectures
(a) Kolmogorov forward and backward equations. Birth-death processes.
(b) Embedded discrete chains.
(c) Limiting behavior and stationary distributions.

6. Martingales & Applications.     (Serfozo Sections 5.5--5.7, and Durrett Chapter 5)     2 Lectures
(a) Definitions and Optional Sampling Theorem.
(b) Expectation calculations in examples.

7. Miscellaneous Applications.     (Serfozo Sections 5.1--5.2, and Durrett Ch.6, Secs. 4.4-4.5.)     3 Lectures
(a) Brownian Motion.
(b) Finance and Black-Scholes Formula.
(c) Reversibility in Queueing & Markov Chain Monte Carlo.


Homework

HW Assignment 1 (due in class Monday, Feb. 10):
Read Serfozo Ch.6 (Review) Sec.1.22, Secs. 6.1 and 6.6--6.9, plus Ch.1, Sections 1.1-1.6.
Problems to Hand in from Serfozo Ch.1, pp.84-98: #8, 13, 17, 26.
Problems in Lefebvre book, pp.145-6: #12, 15.
Additional Problems to Hand in:
(I) For two independent random variables   X ~ Uniform[0,2]   and   Y ~ Expon(1),   find   (a)   E(Y-X | Y > X),   and   (b)   the joint probability distribution of   (min(X,Y), Y - min(X,Y)).   Note that the second component random variable in this pair has a probability atom at 0, i.e. a positive probability of being equal to 0.
(II) In the game of craps when the "shooter's point is 4", the player's objective is to throw a sum of 4 dots showing with two [independent, fair] dice before a sum of 7 dots appears, in repeated tosses of the two dice. Calculate in three different ways the probability x that he succeeds:   
(i) Consider the outcome of the first roll of the dice, to conclude that    x = 3/36 + (27/36) x    and solve.
(ii) Let   A_k   be the event that the sum is   k,   and justify why    x = P(A_4) / P(A_4 ∪ A_7).
(iii) Calculate directly and then sum up over k the probabilities that 4 occurs before 7 and that this happens on the   k-th trial.

HW Assignment 2 (due in class Monday, Feb. 24):
Read Serfozo Ch.1 Sections 1.5-1.13 and 1.20-1.22.
Problems to hand in:
Problems in Serfozo book, pp.89-92: # 25, 36, 37, 44.
Problems in Lefebvre book, pp.145-149: # 11, 18, 20, 21.

HW Assignment 3 (due in class Wednesday, March 11):
Read Serfozo Ch.1 Sections 1.8, 1.12, 1.13, 1.17, 1.18, 1.20-1.21, Ch. 2 through Sec.2.2.
Problems to hand in:
Problems in Serfozo book, pp.89-92: # 27, 39, 50.
Problems in Lefebvre book, pp.153-155: # 38, 41.
3 Additional Problems to Hand in:
(A) Establish the complete set of values a, b for which the Markov Chain with states 0, 1, 2 and transition matrix defined by
p_{0,1} = p_{1,0} = p_{1,1} = p_{1,2} = p_{2,1} = 1/3,       p_{0,2} = a ,     p_{2,0} = b,       is reversible.
(B) Suppose that (X_k,    k=0,1,2, ...) is a temporally homogeneous Markov chain on state-space   S   equal to the nonnegative integers. Suppose that   p(i,j)=0   whenever   j > i+20,   that   p(i,i+1)>0   for all   i,   and that for all   n ≥ 10,   p(n,0) ≥ 1/n^{1/2}.   Prove from these assumptions that
(a) for all initial states   i,     Pr( min{k ≥ 1:   X_k ∈ {0,1,...,9}} < ∞   |   X_0=i ) = 1,   and that therefore
(b) the Markov chain is irreducible and recurrent.
(C) Let {X_n: n=0,1,2,...} be a branching process for which X_0=1 and the offspring prob. mass function is    p_Z(0)=0.1,   p_Z(1)=0.6, and   p_Z(2)=0.3.
(a) Find the extinction probability   q = P(X_n=0 for some n>0 | X_0=1).
(b) Find the expected value and variance of X_4.
(c) Find E(X_4 | X_3>1).

HW Assignment 4, six (6) problems due April 13 (by 11:59pm) in Assignments on ELMS.
Read Serfozo Ch.2 Sections 2.1-2.2, Ch.3 through Sec.3.3, Lefebvre Ch.5 Sec.5.1 through p.247.
Problems to hand in:
Problems in Serfozo book: Ch.3 #15 on p.228, and Ch.2 # 53 on p.167.
Problems in Lefebvre book: Ch.3 #17, 60 pp.147-160, and Ch.5 #10 pp.290-292.
1 Additional Problem to Hand in:
(A) Calculate the variance of X defined in Example (c) at the end of 3/30 Lecture part A.

HW Assignment 5, six (6) problems due Monday April 27 (by 11:59pm) in Assignments on ELMS.
Read Lefebvre 3.3.2-3.3.5 (Continuous-time MC's) and Sec. 5.2 (Nonhomogeneous Poisson),
and also Serfozo Ch.4 through 4.5 plus Sec. 4.7.
Problems to hand in:
Problems in Serfozo book: Ch.3 #27 and #30 on p.231, and Ch.4 #20 on p.328.
Problems in Lefebvre book: Ch.3 #75 p.164, #81, #82 pp.166-167.

HW Assignment 6, six (6) problems due Wednesday May 13 (by 11:59pm) in Assignments on ELMS.
Read Serfozo Sec. 1.8 (MCMC), Durrett Example 1.35 and Chapter 5, and Serfozo Sections 5.5-5.6.
4 Problems to hand in from Serfozo:
Serfozo text, Ch.1 #55, p.95; Ch.4 #14 p.326, #19 p.328; and #24 pp.329-330.
2 Additional Problems to hand in:
(A)(a) Show that the continuous-time Birth-Death Process with   λ_n = μ_n = n λ , with   λ > 0,
is a martingale, and use this to show that E_1(X(t)) = 1.
(b) Argue that the birth-death process X(t) in (a), starting at X(0)=m, is equal in distribution
to the algebraic sum of m independent processes X_j(t) of the same Birth-Death type but
starting at X_j(0)=1. Use this to obtain the formula for   P_m(X(t) = 0).
(B) Suppose that N_{rc} are independent Poisson(λ p_r q_c)   random variables, for r=1,...,R and c=1,...,C,
where   (p_1, p_2, ..., p_R)   and   (q_1, q_2, ..., q_C)   are probability vectors.
Find the generalized Hypergeometric joint probability mass function for these random
variables N_{rc} conditionally given   Σ_{c=1}^C N_{rc} = m_r   and   Σ_{r=1}^R N_{rc} = n_c. This joint probability
mass function should depend on the row and column marginal sums   m_r   and   n_c   but turn
out not to depend on the values   p_r   and   q_c.


Handouts

(1). Handout on a non-Markovian example constructed by lumping 2 of the states in a 3-state Homogeneous Markov Chain.

(2). Handout of 5 pages from Durrett's "Essentials of Stochastic Processes" book containing proofs for equivalent conditions to positive-recurrence, for aperiodic irreducible HMC's.

(3). Handout clarifying two loose ends from recent classes: (1) the condition for existence of a stationary probability distribution for recurrent Birth-Death chains, and (2) the notational verification as in Durrett that the "cycle trick" provides a stationary distribution for an irreducible positive-recurrent HMC.

(4). Handout on existence of invariant vectors for countable-state irreducible transient Homogeneous Markov Chains.

(5). Handout on the uniqueness of solutions of the Kolmogorov forward and backward equations when the rates q_i of leaving states are uniformly bounded.

(6). A Sample Test containing short-answer problems like the ones to appear on the March 30 in-class test can be found here. Note that this sample test does not have Poisson process problems, although they are in scope for the mid-term.

(7). Handout on the main limit theorems of renewal theory, taken from 2 pages of the Grimmett and Stirzaker book, "Probability and Random Processes".

(8). Handout on Introduction to Markov Chain Monte Carlo. This is a statistical computing topic based on a strong theoretical foundation involving Markov Chains in discrete-time on discrete or continuous state spaces. This is unit 7 on my STAT 705 web-page, for which you can find lecture notes on Metropolis-Hastings and the Gibbs Sampler here. In addition, you can find lectures from a mini-course pair of lectures I gave on the topic in this folder.
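For a quick sense of how such a chain is built in practice, here is a minimal Python sketch of a random-walk Metropolis sampler targeting a standard normal distribution (an illustration only, not taken from the handout or the STAT 705 notes; the target and the proposal scale are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(1)

    def log_target(x):
        # log of the N(0,1) density, up to an additive constant
        return -0.5 * x**2

    x = 0.0
    samples = []
    for _ in range(50_000):
        proposal = x + rng.normal(scale=1.0)      # symmetric random-walk proposal
        log_ratio = log_target(proposal) - log_target(x)
        if np.log(rng.uniform()) < log_ratio:     # Metropolis accept/reject step
            x = proposal
        samples.append(x)

    samples = np.array(samples)
    print("sample mean (should be near 0):", round(samples.mean(), 3))
    print("sample variance (should be near 1):", round(samples.var(), 3))

The constructed chain is a discrete-time Markov chain on a continuous state space whose stationary distribution is the target, which is exactly the theoretical setting the handout describes.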


Mid-Term Test

In light of the campus closure and online course format from March 30 through April 10, a mid-term test will be given online, to be taken through ELMS between April 8 and 10, 2020. It will be easier than the HW problems, consisting of short-answer questions emphasizing definitions and the application of results of theorems, plus a true-false component. Sample problems can be found here.

Sample problems and information about specific coverage of the test can be found here.


Sample Final Exam

You can find sample final-exam problems here. By all means try them and ask questions about them.


Final Examination

The Final Examination will be held from 4 to 6pm on Monday, May 18, 2020 in the regular classroom if campus re-opens by then. If the campus does not re-open by then, the final will be a set of take-home problems to be solved over 3-4 days. If the final is in-class, it will have 5 or 6 problems. The Final is cumulative, covering all material from the Course Outline assigned and covered in class throughout the semester. For an in-class final, you will be allowed two 2-sided notebook sheets of reference material; the exam will otherwise be closed-book.

If the Final Exam is to be in-class, there will be a review session for the Final on Friday, May 15, 4:30-6pm, in the regular classroom.



Important Dates, Spring 2020


The UMCP Math Department home page.

The University of Maryland home page.

My home page.

© Eric V Slud, May 11, 2020.