Instructions: You may consult only the textbook and lecture notes; do not consult any other materials or other persons. Reference any results you use from the textbook or lecture notes. Solutions must be produced using LaTeX and submitted as hardcopy. Handwritten solutions will not be accepted.

1) Let $S$ and $T$ be independent random variables that each have the uniform distribution on the interval $[-1, 1]$. Put $R = \sqrt{S^2 + T^2}$.
Show that, conditional on the event $\{R \le 1\}$, the two random variables $X := \frac{S}{R}\sqrt{-2 \log R^2}$ and $Y := \frac{T}{R}\sqrt{-2 \log R^2}$ are independent standard normal random variables. [10 points]
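Optional numerical sanity check (Python with NumPy; not part of the required solution): the sketch below samples $(S, T)$, discards pairs with $R > 1$, forms $X$ and $Y$ as above, and checks that their empirical means, variances, and correlation look like those of independent standard normals. The sample size and seed are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)

# Sample (S, T) uniformly on the square [-1, 1]^2 and keep only pairs with 0 < R <= 1.
S = rng.uniform(-1.0, 1.0, size=1_000_000)
T = rng.uniform(-1.0, 1.0, size=1_000_000)
R2 = S**2 + T**2
keep = (R2 <= 1.0) & (R2 > 0.0)
S, T, R2 = S[keep], T[keep], R2[keep]

# X = (S / R) sqrt(-2 log R^2) and Y = (T / R) sqrt(-2 log R^2).
factor = np.sqrt(-2.0 * np.log(R2) / R2)
X, Y = S * factor, T * factor

# Independent standard normals have mean 0, variance 1, and zero correlation.
print("mean X, Y:", X.mean(), Y.mean())
print("var  X, Y:", X.var(), Y.var())
print("corr(X, Y):", np.corrcoef(X, Y)[0, 1])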

2) Let $X$ and $Y$ be independent nonnegative random variables with density functions $f$ and $g$ that are continuous on $(0, \infty)$. Suppose for any $u > 0$ that the conditional distribution of $X$ given $X + Y = u$ is uniform on the interval $[0, u]$.
Show that $X$ and $Y$ are identically distributed and that their common distribution is exponential. [10 points]
Hint: Write $h$ for the density function of $X + Y$. Explain why the density function of $(X, X+Y)$ is given by $f_{X, X+Y}(x, u) = \frac{1}{u} h(u)$ for $0 \le x \le u$. Then explain why $\frac{1}{x+y} h(x+y) = f(x) g(y)$ for $x, y \ge 0$. By considering what happens when $x = 0$ and $y = 0$, show that $f = g$.
Conclude that $f(x+y) f(0) = f(x) f(y)$ for $x, y \ge 0$. Define a function $\bar f$ by $\bar f(z) = f(z)/f(0)$.
Show that $\bar f(x+y) = \bar f(x) \bar f(y)$ for $x, y \ge 0$. Conclude that $\bar f(z) = e^{-\lambda z}$ for $z \ge 0$, for some $\lambda > 0$. Lastly, observe that $f(x) = \lambda e^{-\lambda x}$ for $x \ge 0$.
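Optional numerical sanity check (Python with NumPy; not part of the required solution): this sketch illustrates the converse direction of the claim, namely that when $X$ and $Y$ are independent exponentials with a common rate, the conditional law of $X$ given $X + Y \approx u$ looks uniform on $[0, u]$. The conditioning is approximated by a small window around $u$; the rate, the value of $u$, the window width, and the sample size are all arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
lam, u, tol = 1.0, 2.0, 0.02

# Independent Exp(lam) samples for X and Y.
X = rng.exponential(1.0 / lam, size=2_000_000)
Y = rng.exponential(1.0 / lam, size=2_000_000)

# Condition (approximately) on X + Y = u by keeping pairs with |X + Y - u| < tol.
sel = np.abs(X + Y - u) < tol
Xc = X[sel]

# If the conditional law of X given X + Y = u is uniform on [0, u], then X/u
# restricted to this event should be roughly uniform on [0, 1].
print("kept", Xc.size, "samples")
print("conditional mean of X (uniform predicts u/2 = %.2f):" % (u / 2), Xc.mean())
print("empirical deciles of X/u:", np.quantile(Xc / u, np.linspace(0.1, 0.9, 9)).round(3))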

3) Given two $n \times n$ matrices $A = (a_{ij})$ and $B = (b_{ij})$, their Schur product is the $n \times n$ matrix $C = (c_{ij})$ with $i,j$ entry $c_{ij} = a_{ij} b_{ij}$ for $1 \le i, j \le n$. Suppose that $\Sigma'$ and $\Sigma''$ are two $n \times n$ nonnegative definite matrices with Schur product $\Sigma$.
Show that $\Sigma$ is also nonnegative definite. [10 points]

Hint: Explain why we may assume that there are independent random vectors $X'$ and $X''$ such that $X'$ and $X''$ both have mean vector $0$, $X'$ has variance-covariance matrix $\Sigma'$, and $X''$ has variance-covariance matrix $\Sigma''$. Then consider the variance-covariance matrix of the random vector $X$, where $X_i = X'_i X''_i$ for $1 \le i \le n$.
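Optional numerical sanity check (Python with NumPy; not part of the required solution): the sketch below builds two random nonnegative definite matrices, takes their Schur (entrywise) product, and checks that the smallest eigenvalue is nonnegative up to floating-point error. The dimension and seed are arbitrary.

import numpy as np

rng = np.random.default_rng(0)
n = 6

# Build two random nonnegative definite matrices as A A^T and B B^T.
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
Sigma1 = A @ A.T
Sigma2 = B @ B.T

# Schur (entrywise) product.
Sigma = Sigma1 * Sigma2

# A symmetric nonnegative definite matrix has all eigenvalues >= 0.
print("smallest eigenvalue of the Schur product:", np.linalg.eigvalsh(Sigma).min())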

4) Alice and Bob are having a snowball fight. The times at which Alice throws her snowballs are the arrival times of a Poisson process with (constant) intensity $\lambda$. The times at which Bob throws his snowballs are the arrival times of a Poisson process with (constant) intensity $\mu$.

Assume that these two Poisson processes are independent. Consider the event that in the time interval $[0, T]$ Alice and Bob throw the same non-zero number of snowballs and that they alternate throws (for example, one way this could happen is that they each throw two snowballs and the sequence of throws is Alice, Bob, Alice, Bob).
What is the conditional probability, given this event, that Alice throws the first snowball? [10 points]
Hint: Use the coloring theorem to build the Poisson processes for Alice and Bob from a single Poisson process with intensity $\lambda + \mu$.
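Optional Monte Carlo sketch (Python with NumPy; not part of the required solution): following the hint, the code below generates the merged process on $[0, T]$, colors each arrival Alice with probability $\lambda/(\lambda+\mu)$, keeps the trials on which the alternation event occurs, and estimates the conditional probability that Alice throws first. The values of $\lambda$, $\mu$, $T$, and the number of trials are arbitrary.

import numpy as np

rng = np.random.default_rng(0)
lam, mu, T, trials = 1.5, 1.0, 3.0, 400_000

hits = first_alice = 0
for _ in range(trials):
    # Merged process on [0, T]: Poisson(lam + mu) many arrivals.
    N = rng.poisson((lam + mu) * T)
    if N == 0 or N % 2 == 1:
        continue  # equal non-zero counts force an even, positive total
    # Color the time-ordered arrivals independently: Alice with probability lam / (lam + mu).
    alice = rng.random(N) < lam / (lam + mu)
    # Alternating throws with equal counts: the colors alternate along the whole sequence.
    if np.all(alice[1:] != alice[:-1]):
        hits += 1
        first_alice += int(alice[0])

print("conditional probability estimate:", first_alice / hits if hits else float("nan"))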

5) Suppose that the random variable $U$ has the uniform distribution on $[0, 1]$ and that, conditional on $U = u$, the random variable $X$ has the binomial distribution with number of trials $n$ and success probability $u$.
Show, using generating functions or otherwise, that $P\{X = x\} = \frac{1}{n+1}$ for $x \in \{0, 1, \ldots, n\}$. [10 points]
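Optional numerical sanity check (Python with NumPy; not part of the required solution): the sketch samples $U$ uniformly, then $X \sim \text{Binomial}(n, U)$, and compares the empirical distribution of $X$ with the claimed constant value $1/(n+1)$. The choice $n = 7$ and the sample size are arbitrary.

import numpy as np

rng = np.random.default_rng(0)
n, trials = 7, 1_000_000

# Draw U ~ Uniform[0, 1], then X ~ Binomial(n, U) given U.
U = rng.uniform(size=trials)
X = rng.binomial(n, U)

# The claim is P{X = x} = 1/(n + 1) for x = 0, 1, ..., n.
empirical = np.bincount(X, minlength=n + 1) / trials
print("empirical:", empirical.round(4))
print("claimed:  ", round(1 / (n + 1), 4))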

6) Let $0 = S_0, S_1, S_2, \ldots$ be a simple random walk on the integers with $p = q = \frac{1}{2}$. Recall that $p_0(n)$, $n \ge 0$, is the probability that the walk is at position $0$ at time $n$ and $f_0(n)$, $n \ge 1$, is the probability that the walk first returns to position $0$ at time $n$. We have the generating functions $P_0(s) = \sum_{n=0}^{\infty} p_0(n) s^n = \sum_{k=0}^{\infty} p_0(2k) s^{2k}$ and $F_0(s) = \sum_{n=1}^{\infty} f_0(n) s^n = \sum_{k=1}^{\infty} f_0(2k) s^{2k}$. Put $a(0) = 1$ and $a(2k) = P\{S_1 \ne 0, S_2 \ne 0, \ldots, S_{2k} \ne 0\}$ for $k \ge 1$.
Show that $a(2k) = p_0(2k) = \binom{2k}{k} \left(\frac{1}{2}\right)^{2k}$ for $k \ge 0$. [10 points]
Hint: Explain why $a(2k) = \sum_{\ell=k+1}^{\infty} f_0(2\ell)$. Put $A(s) = \sum_{k=0}^{\infty} a(2k) s^{2k}$ and show that $A(s) = \frac{1 - F_0(s)}{1 - s^2}$.
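Optional numerical sanity check (Python with NumPy; not part of the required solution): the sketch estimates $a(2k) = P\{S_1 \ne 0, \ldots, S_{2k} \ne 0\}$ for the symmetric walk by simulation and compares it with $\binom{2k}{k}(\frac{1}{2})^{2k}$. The value $k = 5$ and the number of trials are arbitrary.

import numpy as np
from math import comb

rng = np.random.default_rng(0)
k, trials = 5, 500_000

# Steps of a symmetric simple random walk: +1 or -1 with probability 1/2 each.
steps = rng.choice([-1, 1], size=(trials, 2 * k))
paths = np.cumsum(steps, axis=1)          # S_1, ..., S_{2k} for each trial

# a(2k): the walk avoids 0 at every time 1, ..., 2k.
a_hat = np.mean(np.all(paths != 0, axis=1))

# p_0(2k) = C(2k, k) (1/2)^{2k}: the probability of being back at 0 at time 2k.
p0 = comb(2 * k, k) * 0.5 ** (2 * k)

print("estimated a(2k):", a_hat)
print("binom(2k, k) (1/2)^(2k):", p0)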

7) Let $0 = S_0, S_1, S_2, \ldots$ be a simple random walk on the integers with $0 < p < q < 1$.
Say that the time $n \ge 0$ is an upcrossing time for the random walk if $S_n < S_{n+1}$ (that is, $S_{n+1} = S_n + 1$). Let $T_{-1} = \min\{n \ge 0 : S_n = -1\}$ be the first time that the random walk visits $-1$.
Find the probability generating function for the number of upcrossing times that occur before time $T_{-1}$. [10 points]
Hint: Put $Z_0 = 1$ and define $Z_m$ for $m \ge 1$ by $Z_m = |\{0 \le n < T_{-1} : S_n = m - 1, S_{n+1} = m\}|$; that is, $Z_m$ for $m \ge 1$ is the number of times the random walk makes an upcrossing from state $m-1$ to state $m$ prior to the time $T_{-1}$. Show that $Z_0, Z_1, Z_2, \ldots$ is a branching process with $P\{\text{\# of offspring} = k\} = p^k q$ for $k \ge 0$.
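Optional simulation sketch (Python; not part of the required solution): the code below runs the walk until it hits $-1$, counts the upcrossing times along the way, and records their empirical distribution, which can be compared with whatever generating function you derive. The choice $p = 0.4$ (so $p < q$) and the number of trials are arbitrary.

import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
p, trials = 0.4, 200_000          # p < q = 1 - p, so T_{-1} is finite almost surely

counts = Counter()
for _ in range(trials):
    position, upcrossings = 0, 0
    while position >= 0:
        if rng.random() < p:      # step up: this time is an upcrossing time
            position += 1
            upcrossings += 1
        else:                     # step down; reaching -1 ends the walk
            position -= 1
    counts[upcrossings] += 1

# Empirical distribution of the number of upcrossing times before T_{-1}.
for k in sorted(counts)[:8]:
    print(k, counts[k] / trials)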