Chapter 12

Square-Integrable Continuous-Time Processes

12.1. Definitions

Let (Xt, t ∈ T) be a real, square-integrable process defined on the probability space images. We suppose T to be an interval of finite or infinite length of images, and we set:

images

where m is the mean of (Xt) and C is its covariance.

In the following, unless otherwise indicated, we suppose that m = 0.

12.2. Mean-square continuity

(Xt) is said to be continuous in mean square at t0 ∈ T if t → t0 (t ∈ T) implies E(Xt − Xt0)2 → 0.

It is equivalent to say that images is a continuous mapping from T to images and that (Xt) is continuous in mean square (on all of T).

THEOREM 12.1.– The following properties are equivalent:

1) (Xt) is continuous in mean square.

2) C is continuous on the diagonal of T × T.

3) C is continuous on T × T.

PROOF.–

– (1) ⇒ (3): as (s, s′) → (t, t′), we have images, and by the bicontinuity of the scalar product, E(XsXs′) → E(XtXt′), i.e. C(s, s′) → C(t, t′).

– (3) ⇒ (2): evidently.

– (2) ⇒ (1) as, if s → t, we have:

images

12.3. Mean-square integration

Let (Xt) be a square-integrable, centered, and measurable process (i.e. images is measurable) defined on T = [a, b], −∞ < a < b < +∞.

To define its mean-square Riemann integral on [a, b], we set:

images

where a = tn,0 < tn,1 < ··· < tn,kn = b; sn, i ∈ [tn, i−1, tn, i].

If images when n → ∞ with supi (tn,i − tn,i−1) → 0, and if I does not depend on the chosen sequence of partitions, then (Xt) is said to be mean-square (Riemann) integrable on [a, b] and we write:

images

I is therefore, by definition, a square-integrable, real random variable, and EI = lim EΔn = 0.

THEOREM 12.2.– (Xt) is mean-square Riemann integrable on [a, b] if and only if C is Riemann integrable on [a, b] × [a, b].

PROOF.– Let (Δn) and (Δm) be the Riemann sums associated with two partition sequences (tn, i) and (tm, i). (Xt) is then integrable, with integral I, if and only if:

images

for every pair (tn, i), (tm, i).

These conditions are equivalent to E(Δn − Δm)2 → 0, hence to the convergence of E(ΔnΔm) to a common finite limit, ∀(Δn), ∀(Δm), which is written as:

images

and the latter condition means that C is Riemann integrable on [a, b].

THEOREM 12.3.– If C is continuous on the diagonal of [a, b] × [a, b], and if f and g are continuous functions on [a, b], then

images

In particular

[12.1] images

PROOF.– From Theorem 12.1, C is continuous on [a, b] × [a, b]. Furthermore, the processes (f(t)Xt) and (g(t)Xt) have respective covariances images and images. These functions being continuous on [a, b] × [a, b], Theorem 12.2 leads to the integrability of (f(t)Xt) and (g(t)Xt) on [a, b]. Now, in clear notation,

images

and

images

and by the bicontinuity of the scalar product

images

Moreover

images

Hence, the stated equation.
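Equation [12.1] with f = g = 1 gives the variance of the mean-square integral as the double integral of C. A minimal numerical sketch (assuming, for illustration, a standard Wiener process on [0, 1], whose covariance min(s, t) has double integral 1/3):

```python
import numpy as np

# Hedged illustration of [12.1] with f = g = 1: for a standard Wiener
# process on [0, 1], Var(I) = iint min(s, t) ds dt = 1/3.
rng = np.random.default_rng(0)
k, n_paths = 512, 20000
dt = 1.0 / k
# Paths as cumulative sums of independent N(0, dt) increments.
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, k)), axis=1)
# Riemann sum Delta_n = sum_i W(t_i) (t_i - t_{i-1}), taking s_i = t_i.
I = W.sum(axis=1) * dt
print(I.mean(), I.var())   # EI near 0, Var(I) near 1/3
```

The empirical variance of the Riemann sums approaches the double integral of the covariance, as [12.1] predicts.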

COMMENT 12.1.– It may be shown that [12.1] remains valid when it is only supposed that C is integrable on [a, b]2.

When (Xt) is not centered, we say that it is mean-square integrable if m(t) is integrable on [a, b] and if (Xt − m(t)) is mean-square integrable. Then:

images

and

images

Using the integral

Consider the following input–output schema:

images

EXAMPLE 12.1.–

– An emission of particles → their recording by a computer.

– An arrival of customers to a service window → the service time.

– The exchange rate of the dollar → foreign trade.

Let h(t, s) be the response at time t to a signal of unit intensity emitted at time s. In many systems, we have:

images

If the intensity Xs of the signal is assumed to be random, and the system begins to work at time 0, then the response Yt at time t is the “sum” of the responses at time t, to the signals at the times s ∈ (0, t). Hence

images

Extending the mean-square integral to intervals of infinite length, we may define responses of the type:

images

Note that in discrete time, the formula would be written as:

images

which defines (Yt) as the transform of (Xt) by a realizable linear filter.
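The discrete-time formula above can be sketched with a convolution; the impulse response h chosen here is an arbitrary assumption made only for illustration:

```python
import numpy as np

# Minimal sketch of a realizable (causal) linear filter in discrete time:
#     Y_t = sum_{s <= t} h(t - s) X_s.
# The exponential impulse response h is an assumption for illustration.
rng = np.random.default_rng(1)
X = rng.normal(size=200)              # input process (white noise here)
h = np.exp(-0.3 * np.arange(50))      # causal, decaying impulse response
Y = np.convolve(X, h)[: len(X)]       # Y_t uses only X_s with s <= t
```

Truncating the full convolution to the first len(X) terms keeps exactly the causal part: each Y[t] depends only on past and present inputs.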

12.4. Mean-square differentiation

(Xt, t ∈ T) is said to be mean-square differentiable at t0 ∈ T if there exists a square-integrable random variable images such that:

images

THEOREM 12.4.– The following two conditions are equivalent:

1) (Xt) is mean-square differentiable at t0.

2) C has a generalized derivative at t0, i.e.

images

PROOF.–

– (1) ⇒ (2) by the bicontinuity of the scalar product, and furthermore images.

– (2) ⇒ (1) as, if we set images, we have E(YhYk) → ℓ, therefore

images

and the Cauchy criterion implies the convergence in quadratic mean of (Yh).

REMARK 12.1.–

1) The covariance of the process images is the generalized derivative of C.

2) If a two-variable function has a generalized derivative, it has mixed second derivatives. Conversely, a function that has continuous mixed second derivatives has a generalized derivative.

3) The trajectories of a mean-square differentiable process are not necessarily differentiable functions in the usual sense, or even continuous functions. For example, the process images, t ∈ [0, 1], with basis space images where λ is the Lebesgue measure, is mean-square differentiable whereas its trajectories are not continuous.

12.5. The Karhunen–Loeve theorem

THEOREM 12.5.– Let (Xt) be a centered, square-integrable and measurable process, which is continuous in mean square; T is a compact interval in images. Under these conditions, there exist an orthonormal sequence (φn, n ≥ 0) in images and a sequence of orthogonal random variables (ξn, n ≥ 0) such that:

images

where the series converges in quadratic mean.

PROOF.– From Theorem 12.1, C is continuous. As it is symmetric and positive semidefinite, Mercer’s theorem lets us write:

images

where (φn) is an orthonormal sequence in images and the λn are real and such that:

images

Moreover, the series converges uniformly to C and the φn are continuous.

Now, from Theorem 12.2, the process (Xt φn(t), tT) is mean-square integrable on T and we may set:

images

Then, from Theorem 12.3,

images

Therefore, the ξn are pairwise orthogonal.

Furthermore

images

Consequently

images

If the process (Xt) is Gaussian, the random variables ξn are Gaussian (a limit in quadratic mean of Gaussian variables is Gaussian; see the definition of the integral), and they are independent. In addition, it may be shown that the convergence of the KL series is almost sure.

12.6. Wiener processes

DEFINITION 12.1.– images is said to be a Wiener process or a Brownian motion process if:

1) Wt follows the N(0, σ2t) distribution, t ≥ 0, where σ2 is a strictly positive constant.

2) (Wt) has independent increments: ∀k ≥ 3, ∀0 ≤ t1 < t2 < … < tk, the random variables Wt2 − Wt1, Wt3 − Wt2, …, Wtk − Wtk−1 are independent.

INTERPRETATION 12.1.– A particle immersed in a motionless homogeneous fluid is subjected to molecular impacts which constantly modify its trajectory. For images, we denote by Wt the abscissa of the projection of the particle onto any axis, with origin W0; images is then a Wiener process.

Brown (1827) observed this phenomenon for the first time. Einstein (1905) showed that:

images

where R is the gas constant, T the absolute temperature, N Avogadro’s number, and f the friction coefficient. Wiener (1923) gave the precise mathematical definition of (Wt).

THEOREM 12.6.–

1) A Wiener process (Wt) has stationary increments and covariance:

images

2) Conversely, every centered Gaussian process with covariance σ2 min (s, t) is a Wiener process.

PROOF.–

1) Let φh,t be the characteristic function of Wt+h − Wt. Since

images

we have, by independence of the increments,

images

Hence

images

and consequently images: the increments are stationary.

Furthermore, for s < t,

images

We therefore have C(s, t) = σ2 min (s, t).

2) It is sufficient to show that the increments are independent. Now, for t1 < t2 ≤ t3 < t4, we have:

images

The increments are therefore orthogonal and thus independent, since the process is Gaussian.
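A numerical sanity check of part 1 of the theorem (a sketch under the assumption σ2 = 1, simulating paths on a regular grid):

```python
import numpy as np

# Sketch: simulate standard Brownian paths (sigma^2 = 1) and check that
# the empirical covariance Cov(W_s, W_t) is close to min(s, t).
rng = np.random.default_rng(2)
n_paths, k = 50000, 100
dt = 1.0 / k
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, k)), axis=1)
s_idx, t_idx = 29, 79                      # times s = 0.30, t = 0.80
cov = np.mean(W[:, s_idx] * W[:, t_idx])   # the process is centered
print(cov)                                 # close to min(0.3, 0.8) = 0.3
```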

MEAN-SQUARE PROPERTIES.– Since σ2 min (s, t) is continuous, (Wt) is continuous in mean square (Theorem 12.1), and mean-square integrable when it is measurable (Theorem 12.2). It is not mean-square differentiable, as

images

PROPERTIES OF THE TRAJECTORIES.– The trajectories of (Wt) are continuous, but (almost surely) not differentiable. This property is delicate to establish. We will only show the following result:

THEOREM 12.7.– The trajectories of (Wt) are (almost surely) not functions of bounded variation.

Recall that the total variation of a numerical function f, defined on [α, β], is written as:

images

A monotone function is of bounded variation; a function whose first derivative is bounded on [α, β] is likewise; every function of bounded variation is equal to the difference of two monotone functions.

PROOF.– (Wt) may always be assumed to be standard (i.e. σ2 = 1) and [α, β] = [0, 1].

We set:

images

Then let N be a random variable with distribution images, and let us set:

images

thus

images

Hence, from the independence of the increments,

images

Applying Tchebychev’s inequality, we obtain:

images

The Borel–Cantelli lemma therefore leads to:

images

Hence

images

and the variation of (Wt) on [0, 1] is +∞ a.s.
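Theorem 12.7 can be illustrated numerically: along dyadic partitions of [0, 1], the absolute variation of a simulated standard path grows without bound, while the quadratic variation stays near 1. A minimal sketch:

```python
import numpy as np

# Sketch: refine dyadic partitions of [0, 1] on one simulated standard
# Brownian path; the absolute variation blows up (~ 2^{n/2} sqrt(2/pi)),
# while the quadratic variation hovers around 1.
rng = np.random.default_rng(3)
k = 2 ** 16
dW = rng.normal(0.0, np.sqrt(1.0 / k), size=k)      # finest increments
W = np.concatenate(([0.0], np.cumsum(dW)))          # path on the grid
for n in (4, 8, 12, 16):
    step = k // 2 ** n
    incr = np.diff(W[::step])                       # 2^n increments
    print(n, np.abs(incr).sum(), (incr ** 2).sum())
# First column: grows without bound. Second column: stays near 1.
```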

12.6.1. Karhunen–Loeve decomposition

THEOREM 12.8.– The Karhunen–Loeve decomposition of a standard Wiener process is written as:

images

where the ξn are independent and have the respective distributions:

images

PROOF.– It is sufficient to determine the eigenfunctions and eigenvalues of min (s, t), determined by:

images

that is

[12.2] images

Differentiating, we obtain:

[12.3] images

then

images

The general solution of this equation is of the form

[12.4] images

From [12.2], φn(0) = 0, therefore A = 0, and from [12.3], φ′n(1) = 0, therefore

images

Hence

images

and, since

images

we have images. We may choose images, as φn and ξn are defined only up to sign.

Consequently

images

and

images

is centered Gaussian with variance λn. The result is deduced by applying Theorem 12.5.
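A sketch of Theorem 12.8: simulating the truncated KL series with the explicit forms derived in the proof (eigenfunctions √2 sin((n + 1/2)πt), eigenvalues λn = ((n + 1/2)π)−2), and checking the variance at one time t against min(t, t) = t:

```python
import numpy as np

# Sketch: build W_t from the truncated Karhunen-Loeve series
#     W_t ~ sum_{n < N} xi_n * sqrt(2) sin((n + 1/2) pi t),
# with independent xi_n ~ N(0, lambda_n), lambda_n = ((n + 1/2) pi)^{-2},
# and check Var(W_t) against min(t, t) = t.
rng = np.random.default_rng(4)
N_terms, n_paths, t = 200, 20000, 0.7
n = np.arange(N_terms)
lam = 1.0 / ((n + 0.5) * np.pi) ** 2
xi = rng.normal(0.0, np.sqrt(lam), size=(n_paths, N_terms))
phi = np.sqrt(2.0) * np.sin((n + 0.5) * np.pi * t)
W_t = xi @ phi
print(W_t.var())   # close to Var(W_0.7) = 0.7
```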

COMMENT 12.2.– It may be shown that the series from Theorem 12.8 converges uniformly (a.s.) which leads to the (a.s.) continuity of the trajectories of (Wt).

12.6.2. Statistics of Wiener processes

Let (Wt) be a Wiener process observed on the time interval [0, τ]. The only unknown parameter is σ2. In the (purely theoretical) case where (Wt, 0 ≤ tτ) is entirely observed, we consider the estimator:

images

To study the asymptotic behavior of (Zn), we will use the preliminary result:

LEMMA 12.1.– Let X1, …, Xn be independent random variables with the same distribution, such that images and EXi = 0; then

[12.5] images

PROOF.–

images

By independence, the expectations factorize, from which we deduce [12.5].

Let us set:

images

The images are independent and follow a χ2 (1) distribution. Then, successively applying Tchebychev’s inequality and Lemma 12.1, we obtain:

images

where images and images are constant. Consequently,

images

where c is constant and, from the Borel–Cantelli lemma,

[12.6] images

Therefore, for any τ, images is almost surely equal to the unknown parameter σ2!

This result is not usable in practice, as the great irregularity of the trajectories of (Wt) does not allow us to observe them completely.

In fact, [12.6] may be interpreted as a convergence result: an estimator Zn of σ2 is constructed from the observations (W(k/2n), 0 ≤ k ≤ 2n), and Zn tends almost surely to σ2 when n tends to infinity. It is to be noted that this “asymptotic” is not the usual asymptotic of discrete-time processes, which corresponds here to τ → +∞: considering the observations (Wkh, 0 ≤ k ≤ n), where nh = τ, we may construct the estimator:

images

where h is fixed and n → +∞. images is then unbiased and

[12.7] images

[12.7] may be established again using Lemma 12.1.
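A simulation sketch of the dyadic estimator Zn built from the quadratic variation (σ2 = 2.5 and τ = 1 are arbitrary assumptions for the illustration):

```python
import numpy as np

# Sketch of [12.6]: from observations W(k/2^n), 0 <= k <= 2^n, the
# quadratic-variation estimator
#     Z_n = sum_k (W(k/2^n) - W((k-1)/2^n))^2
# recovers sigma^2 on [0, 1]. Here sigma^2 = 2.5 is an assumed value.
rng = np.random.default_rng(5)
sigma2, n = 2.5, 16
k = 2 ** n
dW = rng.normal(0.0, np.sqrt(sigma2 / k), size=k)   # dyadic increments
Z_n = (dW ** 2).sum()
print(Z_n)   # close to sigma^2 = 2.5
```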

12.7. Notions on weakly stationary continuous-time processes

Let images be a weakly stationary process. Its autocovariance is defined by setting:

images

If (γt) is integrable on images, then the spectral density of (Xt) is defined by:

images

and by the inverse Fourier transform:

images

provided that f is integrable.

From Theorem 4.1, (Xt) is continuous in mean square if and only if γt is continuous at the origin, and γt is then continuous everywhere. Consequently, if (γt) is continuous at the origin, (Xt) is mean-square integrable on each bounded interval.

If (Xt) is mean-square differentiable, then (γt) is twice differentiable, and image is a weakly stationary process with autocovariance image.

12.7.1. Estimating the mean

If (Xt), with mean m, is observed in the interval [0, τ], we consider the unbiased estimator:

images

which is defined when (Xt) is mean-square integrable on each bounded interval.

Then, from Theorem 12.3,

images

If, for example, γ is integrable on images, then

images

The rate of convergence may be specified, as in the discrete case.
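A simulation sketch of the time-average estimator above; the discretized Ornstein–Uhlenbeck dynamics are an assumption chosen only to produce a weakly stationary path with known mean:

```python
import numpy as np

# Sketch: time-average estimator of the mean of a weakly stationary
# process. The Euler scheme for dX_t = -theta (X_t - m) dt + dW_t below
# (parameters arbitrary) only serves to generate a stationary path.
rng = np.random.default_rng(6)
m, theta, tau, dt = 1.5, 1.0, 2000.0, 0.1
k = int(tau / dt)
x = np.empty(k)
x[0] = m
for i in range(1, k):
    x[i] = x[i - 1] - theta * (x[i - 1] - m) * dt + rng.normal(0.0, np.sqrt(dt))
m_hat = x.mean()   # discretized version of (1/tau) * integral_0^tau X_t dt
print(m_hat)       # close to m = 1.5
```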

12.7.2. Estimating the autocovariance

The empirical autocovariance is defined by:

images

and the empirical spectral density by:

images

Thus, under conditions analogous to those of the discrete case, when τ → ∞, we have:

images

but

images

and the introduction of weighting functions allows us to obtain convergent estimators of the spectral density.

12.7.3. The case of a process observed at discrete instants

When Xt is observed at the instants 0, h, …, (n − 1) h, where h is a fixed interval of time, the mean may be estimated by setting:

images

We may also estimate the autocovariance for t = jh, 0 ≤ j ≤ n − 1, by:

images

but we cannot estimate γt for other values of t!

This drawback is called “aliasing”. It reappears in the estimation of the spectral density, because the spectral density of a discretized process does not permit the reconstruction of that of the initial process. To avoid aliasing, we may observe (Xt) at random times T1 < T2 < ··· < Tn < ··· associated with a Poisson process.
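Aliasing can be seen directly: sampled at spacing h, the autocovariances cos(λt) and cos((λ + 2π/h)t) agree at every observation instant t = kh, so the corresponding spectral densities cannot be told apart. A small sketch (the values of h and λ are arbitrary):

```python
import numpy as np

# Sketch of aliasing: two autocovariances whose frequencies differ by
# 2*pi/h coincide at all sampling instants t = k*h.
h, lam = 0.5, 1.0
lam_alias = lam + 2 * np.pi / h
t = h * np.arange(20)            # observation instants 0, h, 2h, ...
g1 = np.cos(lam * t)
g2 = np.cos(lam_alias * t)
print(np.max(np.abs(g1 - g2)))   # zero up to rounding error
```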

12.8. Exercises

EXERCISE 12.1. (Ornstein–Uhlenbeck process).– Let images be a centered, stationary, Markovian, Gaussian process with autocovariance (γ(h), images) such that γ(0) > 0. The autocorrelation of (Xt) is defined by setting:

images

Supposing that ρ is continuous, and that there exists δ > 0 such that 0 < ρ(δ) < 1,

1) Establish the relation

images

2) Show that there exists a > 0 such that:

images

EXERCISE 12.2.– Let images be a centered Gaussian process with covariance:

images

where u and v are continuous and such that φ(t) = u(t)/v(t) is continuous and strictly increasing.

1) Show that the process:

images

is a standard Wiener process.

2) Apply this result to the Ornstein–Uhlenbeck process for images (see Exercise 12.1).

EXERCISE 12.3.– Let (Wt, t ≥ 0) be a standard Wiener process, and set:

images

where images is an unknown parameter.

1) Determine EXt and E(XsXt) – E(Xs)E(Xt) = Cov(Xs, Xt), s, t ≥ 0. Is this process Gaussian? Is it stationary?

2) (Xt, 0 ≤ tT) (T > 0) are observed and an estimator of θ is defined by setting:

images

where dXt = θ cos t dt + dWt.

Determine images and deduce from this an asymptotic equivalent of images when T → ∞. Show that images has a limit in distribution and specify it.

3) Setting

images

show that

images

determining the function φ.

4) Verify that the statistical predictor images is unbiased, i.e. images.

5) Study the asymptotic behavior of images in quadratic mean and in distribution.

6) Construct an asymptotic prediction interval for rT(XT, θ), and then for XT+h.

7) What happens if h = 2kπ (k, an integer)?

8) How do you calculate images in practice?

EXERCISE 12.4.– Let images be a sequence of independent random Gaussian variables with the same distribution images. Consider the continuous-time process:

images

1) Show that images is a Gaussian process. Determine its mean and covariance.

2) Show that the Xt all follow the same distribution. Is the process (Xt) stationary?

3) Study the continuity and differentiability of (Xt) in the usual sense and in mean square.

4) Supposing (Xt) to be observed on the interval [0, T], propose an estimator of σ2 and study its asymptotic properties.

EXERCISE 12.5.– If (Xt) is continuous in mean square on [a, b], show that, as defined in L2,

images

EXERCISE 12.6.–

1) (W1, …, Wn) is said to be a standard Wiener process in images if the components W1, …, Wn are standard (i.e. of variance 1), mutually independent Wiener processes. Show that, for every images is again a Wiener process.

2) The converse is not necessarily true. Let (W1, W2) be a standard Wiener process in images and let:

images

Show that every linear combination of γ1 and γ2 is a Wiener process. Why is (γ1, γ2) not a Wiener process in the plane (i.e. a Gaussian process with values in images and with independent and stationary increments)?

EXERCISE 12.7.– Let (Wt, t ≥ 0) be a standard Wiener process, and let images. Show that:

1) W is a martingale with respect to images, i.e. for all s ≤ t, images;

2) Wt2 − t is a martingale;

3) exp(λWt − λ2t/2) is a martingale for every fixed images.

EXERCISE 12.8.–

1) Show that, if X follows a standard normal distribution, then, for all λ > 0,

images

2) Deduce that, if W is a standard Wiener process, then

images

Hint: You may choose

images

It is to be noted that this result is much weaker than the law of the iterated logarithm:

images

EXERCISE 12.9.– Supposing that W is a standard Wiener process and letting images, determine the joint distribution of (X, Y, W1).

EXERCISE 12.10.– Setting, for t > 0, images, where (W(t), t ≥ 0) is a Brownian motion process with variance σ2, show that, for s ≤ t,

images

and from this, deduce Cov (Xt, Xs).

EXERCISE 12.11.– Let (W(t), t ≥ 0) be a Brownian motion process. Setting t = exp (u) and images, prove that images is strictly stationary.

EXERCISE 12.12.– Let images be the class of real, measurable, continuous-time processes images such that Xt has a density f (not depending on t) belonging to the class C2 and bounded, as are its derivatives. We seek to estimate the derivative f′ of f.

1) i) Show that no unbiased estimator of f′, based on the observation of X0, exists on the class images (making the necessary regularity hypotheses).

ii) Deduce from i) that no unbiased estimator of f′, based on the observation of (Xt, 0 ≤ tT), exists.

2) Consider the estimator:

images

where images.

   i) Show that it is asymptotically unbiased if limT→+∞ hT = 0.

   ii) Study its convergence and its rate of convergence in quadratic mean when (Xt) is strictly stationary and such that gu = f(X0, Xu) − f ⊗ f exists for u ≠ 0, is continuous at every (x, x), images, and images is integrable on ]0, +∞[.
