6. Non-homogeneous Poisson Processes

Basic Theory

A non-homogeneous Poisson process is similar to an ordinary Poisson process, except that the average rate of arrivals is allowed to vary with time. Many applications that generate random points in time are modeled more faithfully with such non-homogeneous processes. The mathematical cost of this generalization, however, is that we lose the property of stationary increments.

Non-homogeneous Poisson processes are best described in measure-theoretic terms. Thus, you may need to review the sections on measure theory, probability measures, and distribution functions. Our basic measure space in this section is $[0, \infty)$ with the σ-algebra of Borel measurable subsets (named for Émile Borel). As usual, $\lambda$ denotes Lebesgue measure on this space, named for Henri Lebesgue. Recall that the Borel σ-algebra is the one generated by the intervals, and $\lambda$ is the generalization of length on intervals.

Definition and Basic Properties

Of our various characterizations of the ordinary Poisson process (in terms of the inter-arrival times, the arrival times, and the counting process), the characterization involving the counting process leads to the most natural generalization to non-homogeneous processes.

Consider a process that generates random points in time. As usual, let $N_t$ denote the number of random points in the interval $(0, t]$ for $t \in [0, \infty)$, so that $N = \{N_t: t \ge 0\}$ is the counting process. More generally, $N(A)$ denotes the number of random points in $A$, for measurable $A \subseteq [0, \infty)$.

So $N$ is a random counting measure, and as before, $t \mapsto N_t$ is a (random) distribution function and $A \mapsto N(A)$ is the (random) measure associated with this distribution function.

Suppose now that $r: [0, \infty) \to [0, \infty)$ is measurable.

  a. Define $m: [0, \infty) \to [0, \infty)$ by \[ m(t) = \int_0^t r(s) \, ds \] Then $m$ is increasing and right-continuous on $[0, \infty)$, and hence is a distribution function.
  b. The positive measure on $[0, \infty)$ associated with $m$ (also denoted by $m$) is defined for measurable $A \subseteq [0, \infty)$ by \[ m(A) = \int_A r(s) \, ds \] The measure $m$ is absolutely continuous with respect to $\lambda$, with density function $r$.
Details:

Technically, the integrals are Lebesgue integrals (integrals with respect to $\lambda$). The properties of $m$ follow from general properties of the integral.

So $m(t) = m(0, t]$ for $t \in [0, \infty)$, and for $s, t \in [0, \infty)$ with $s < t$, $m(s, t] = m(t) - m(s)$. Note the parallels between the random distribution function and measure $N$ and the deterministic distribution function and measure $m$. With the setup involving $r$ and $m$ complete, we are ready for our main definition.
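As a simple running illustration (our choice of rate function, not one from the text), take $r(t) = 2 t$ for $t \in [0, \infty)$. Then \[ m(t) = \int_0^t 2 s \, ds = t^2, \qquad m(s, t] = t^2 - s^2 \text{ for } 0 \le s < t \] so, for example, the mean measure assigns weight 1 to the interval $(0, 1]$ but weight 3 to the interval $(1, 2]$, even though the two intervals have the same length (Lebesgue measure).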

A process that produces random points in time is a non-homogeneous Poisson process with rate function r if the counting process N satisfies the following properties:

  a. If $\{A_i: i \in I\}$ is a countable, disjoint collection of measurable subsets of $[0, \infty)$ then $\{N(A_i): i \in I\}$ is a collection of independent random variables.
  b. If $A \subseteq [0, \infty)$ is measurable then $N(A)$ has the Poisson distribution with parameter $m(A)$: \[ P[N(A) = n] = e^{-m(A)} \frac{[m(A)]^n}{n!}, \quad n \in \mathbb{N} \]

Property (a) is our usual property of independent increments, while property (b) is a natural generalization of the property of Poisson distributed increments. Clearly, if $r$ is a positive constant, then $m(t) = r t$ for $t \in [0, \infty)$ and, as a measure, $m$ is proportional to Lebesgue measure $\lambda$. In this case, the non-homogeneous process reduces to an ordinary, homogeneous Poisson process with rate $r$. However, if $r$ is not constant, then $m$ is not linear and, as a measure, is not proportional to Lebesgue measure. In this case, the process does not have stationary increments with respect to $\lambda$, but of course does have stationary increments with respect to $m$. That is, if $A, B$ are measurable subsets of $[0, \infty)$ and $\lambda(A) = \lambda(B)$, then $N(A)$ and $N(B)$ will not in general have the same distribution, but they will have the same distribution if $m(A) = m(B)$.
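To make the last point concrete, here is a minimal computational sketch, again using the illustrative rate function $r(t) = 2 t$; the function names are ours and not part of the text. It computes $P[N(A) = n]$ for an interval $A = (s, t]$ by numerically integrating $r$ to get $m(A)$ and then applying the Poisson formula in property (b).

```python
from math import exp, factorial
from scipy.integrate import quad

# Illustrative rate function (our assumption, not from the text): r(t) = 2t.
def r(t):
    return 2.0 * t

def m_measure(s, t):
    """m(s, t]: the integral of the rate function r over (s, t]."""
    value, _err = quad(r, s, t)
    return value

def prob_n_arrivals(s, t, n):
    """P[N(s, t] = n] = exp(-m(s, t]) * m(s, t]^n / n!"""
    mu = m_measure(s, t)
    return exp(-mu) * mu ** n / factorial(n)

# (0, 1] and (1, 2] have the same length, but m(0, 1] = 1 while m(1, 2] = 3,
# so the numbers of arrivals in the two intervals have different distributions.
print([prob_n_arrivals(0, 1, n) for n in range(4)])
print([prob_n_arrivals(1, 2, n) for n in range(4)])
```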

In particular, recall that the parameter of the Poisson distribution is both the mean and the variance.

The function m is called the mean function.

  a. $E[N(A)] = \operatorname{var}[N(A)] = m(A)$ for measurable $A \subseteq [0, \infty)$.
  b. $E(N_t) = \operatorname{var}(N_t) = m(t)$ for $t \in [0, \infty)$.
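For instance, with the illustrative rate function $r(t) = 2 t$ introduced above, $E(N_t) = \operatorname{var}(N_t) = t^2$ for $t \in [0, \infty)$.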

Since $m^\prime(t) = r(t)$ (if $r$ is continuous at $t$), it makes sense to refer to $r$ as the rate function. Locally, at time $t$, the arrivals are occurring at an average rate of $r(t)$ per unit time. As before, from a modeling point of view, the property of independent increments can reasonably be evaluated. But we need something more primitive to replace the property of Poisson increments. Here is the main theorem.

A process that produces random points in time is a non-homogeneous Poisson process with rate function r if and only if the counting process N satisfies the following properties:

  a. If $\{A_i: i \in I\}$ is a countable, disjoint collection of measurable subsets of $[0, \infty)$ then $\{N(A_i): i \in I\}$ is a collection of independent random variables.
  b. For $t \in [0, \infty)$, \[ \frac{P[N(t, t + h] = 1]}{h} \to r(t) \text{ as } h \downarrow 0, \qquad \frac{P[N(t, t + h] > 1]}{h} \to 0 \text{ as } h \downarrow 0 \]

So if $h$ is small, the probability of a single arrival in $(t, t + h]$ is approximately $r(t) h$, while the probability of more than one arrival in this interval is negligible.
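As a quick numerical check of these limits (again with the illustrative rate function $r(t) = 2 t$, so $m(t) = t^2$), the exact Poisson probabilities behave as claimed when $h$ is small. This sketch is only an illustration under that assumed rate.

```python
import math

# Illustrative rate function (our assumption, not from the text): r(t) = 2t,
# with mean function m(t) = t^2.
def r(t):
    return 2.0 * t

def m(t):
    return t ** 2

t = 1.5   # here r(t) = 3
for h in [0.1, 0.01, 0.001]:
    mu = m(t + h) - m(t)                   # mean number of arrivals in (t, t+h]
    p_one = mu * math.exp(-mu)             # P[N(t, t+h] = 1] for a Poisson(mu) count
    p_more = 1.0 - math.exp(-mu) - p_one   # P[N(t, t+h] > 1]
    print(h, p_one / h, p_more / h)        # p_one/h -> r(t) = 3, p_more/h -> 0
```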

Arrival Times and Time Change

Suppose that we have a non-homogeneous Poisson process as defined in [1], [2], and [3], with continuous rate function $r$. As with the ordinary Poisson process, we have an inverse relation between the counting process and the arrival time sequence.

Let $T_0 = 0$ and let $T_n$ denote the time of the $n$th arrival for $n \in \mathbb{N}_+$. The counting process $N = \{N_t: t \in [0, \infty)\}$ and the arrival time sequence $T = \{T_n: n \in \mathbb{N}\}$ are related as follows:

  a. $T_n = \min\{t \in [0, \infty): N_t = n\}$ for $n \in \mathbb{N}$.
  b. $N_t = \#\{n \in \mathbb{N}_+: T_n \le t\}$ for $t \in [0, \infty)$.
  c. $\{T_n \le t\} = \{N_t \ge n\}$ for $n \in \mathbb{N}$ and $t \in [0, \infty)$.
Details:

Both events in part (c) mean that there are at least n random points in (0,t].

Part (c) of [6] will allow us to get the distribution of $T_n$.

For $n \in \mathbb{N}_+$, $T_n$ has probability density function $f_n$ given by \[ f_n(t) = \frac{m^{n-1}(t)}{(n-1)!} r(t) e^{-m(t)}, \quad t \in [0, \infty) \]

Details:

From [6], the distribution function of $T_n$ is \[ P(T_n \le t) = P(N_t \ge n) = \sum_{k=n}^\infty e^{-m(t)} \frac{m^k(t)}{k!}, \quad t \in [0, \infty) \] Differentiating with respect to $t$ gives \[ f_n(t) = \sum_{k=n}^\infty \left[ -m^\prime(t) e^{-m(t)} \frac{m^k(t)}{k!} + e^{-m(t)} \frac{k \, m^{k-1}(t) \, m^\prime(t)}{k!} \right] = r(t) e^{-m(t)} \sum_{k=n}^\infty \left[ \frac{m^{k-1}(t)}{(k-1)!} - \frac{m^k(t)}{k!} \right] \] The last sum telescopes to $m^{n-1}(t) / (n-1)!$, since successive terms cancel and $m^k(t) / k! \to 0$ as $k \to \infty$.

In particular, $T_1$ has probability density function $f_1$ given by \[ f_1(t) = r(t) e^{-m(t)}, \quad t \in [0, \infty) \] Recall that in reliability terms, $r$ is the failure rate function, $m$ is the cumulative failure rate function, and the reliability function is the right distribution function: \[ F_1^c(t) = P(T_1 > t) = e^{-m(t)}, \quad t \in [0, \infty) \] In general, the functional form of $f_n$ is clearly similar to the probability density function of the gamma distribution, and indeed, $T_n$ can be transformed into a random variable with a gamma distribution. This amounts to a time change which will give us additional insight into the non-homogeneous Poisson process.
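For example, with the illustrative rate function $r(t) = 2 t$ used above, $m(t) = t^2$ and \[ f_1(t) = 2 t e^{-t^2}, \quad t \in [0, \infty) \] so the first arrival time $T_1$ has the Weibull distribution with shape parameter 2 (and scale parameter 1).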

Let $U_n = m(T_n)$ for $n \in \mathbb{N}_+$. Then $U_n$ has the gamma distribution with shape parameter $n$ and rate parameter 1.

Details:

Let $g_n$ denote the PDF of $U_n$. Since $m$ is strictly increasing and differentiable, we can use the standard change of variables formula. So letting $u = m(t)$, the relationship is \[ g_n(u) = f_n(t) \frac{dt}{du} \] Since $dt / du = 1 / m^\prime(t) = 1 / r(t)$, simplifying gives $g_n(u) = u^{n-1} e^{-u} / (n-1)!$ for $u \in [0, \infty)$.

So the time change u=m(t) transforms the non-homogeneous Poisson process into a standard (rate 1) Poisson process. Here is an equivalent way to look at the time change result.
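Continuing the illustration with $r(t) = 2 t$ and $m(t) = t^2$: here $U_1 = T_1^2$, and $P(U_1 > u) = P(T_1 > \sqrt{u}) = e^{-u}$ for $u \ge 0$, so $U_1$ has the standard exponential distribution, that is, the gamma distribution with shape parameter 1, as the theorem states.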

For $u \in [0, \infty)$, let $M_u = N_t$ where $t = m^{-1}(u)$. Then $\{M_u: u \in [0, \infty)\}$ is the counting process for a standard, rate 1 Poisson process.

Details:
  a. Suppose that $(u_1, u_2, \ldots)$ is a sequence of points in $[0, \infty)$ with $0 \le u_1 < u_2 < \cdots$. Since $m^{-1}$ is strictly increasing, we have $0 \le t_1 < t_2 < \cdots$, where of course $t_i = m^{-1}(u_i)$. By assumption, the sequence of random variables $(N_{t_1}, N_{t_2} - N_{t_1}, \ldots)$ is independent, but this is also the sequence $(M_{u_1}, M_{u_2} - M_{u_1}, \ldots)$.
  b. Suppose that $u, v \in [0, \infty)$ with $u < v$, and let $s = m^{-1}(u)$ and $t = m^{-1}(v)$. Then $s < t$ and so $M_v - M_u = N_t - N_s$ has the Poisson distribution with parameter $m(t) - m(s) = v - u$.

Equivalently, we can transform a standard (rate 1) Poisson process into a non-homogeneous Poisson process with a time change.

Suppose that $M = \{M_u: u \in [0, \infty)\}$ is the counting process for a standard Poisson process, and let $N_t = M_{m(t)}$ for $t \in [0, \infty)$. Then $\{N_t: t \in [0, \infty)\}$ is the counting process for a non-homogeneous Poisson process with mean function $m$ (and rate function $r$).

Details:
  a. Let $(t_1, t_2, \ldots)$ be a sequence of points in $[0, \infty)$ with $0 \le t_1 < t_2 < \cdots$. Since $m$ is strictly increasing, we have $0 \le m(t_1) < m(t_2) < \cdots$. Hence $(M_{m(t_1)}, M_{m(t_2)} - M_{m(t_1)}, \ldots)$ is a sequence of independent variables. But this sequence is simply $(N_{t_1}, N_{t_2} - N_{t_1}, \ldots)$.
  b. If $s, t \in [0, \infty)$ with $s < t$, then $N_t - N_s = M_{m(t)} - M_{m(s)}$ has the Poisson distribution with parameter $m(t) - m(s)$.
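The time-change result also suggests a simple way to simulate a non-homogeneous Poisson process. The sketch below is one possible implementation under assumed choices (the illustrative rate $r(t) = 2 t$, so $m(t) = t^2$ and $m^{-1}(u) = \sqrt{u}$; the function names are ours): generate the arrival times of a standard Poisson process as partial sums of independent exponential variables, then map them through $m^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(seed=2024)

# Illustrative choices (assumptions, not from the text): rate r(t) = 2t,
# so m(t) = t^2 and the inverse time change is m_inv(u) = sqrt(u).
def m(t):
    return t ** 2

def m_inv(u):
    return np.sqrt(u)

def simulate_arrivals(t_max, rng):
    """Arrival times in (0, t_max] of the non-homogeneous process,
    obtained by time-changing a standard (rate 1) Poisson process."""
    u_max = m(t_max)
    # Standard Poisson arrival times: partial sums of Exp(1) inter-arrival
    # times.  n_guess is large enough to cover (0, u_max] with overwhelming
    # probability.
    n_guess = int(u_max + 10 * np.sqrt(u_max) + 10)
    U = np.cumsum(rng.exponential(1.0, size=n_guess))
    U = U[U <= u_max]
    return m_inv(U)          # T_k = m^{-1}(U_k)

# Check: E[N_t] should equal m(t) = t^2.
t_max = 3.0
counts = [len(simulate_arrivals(t_max, rng)) for _ in range(10_000)]
print(np.mean(counts), m(t_max))   # both near 9
```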