All About Stochastic Difference Equations, Part 1

“All about X, part 1” is really a fancy way of saying a little bit about X.

How do you eat an elephant?
One bite at a time.¹

So let’s eat stochastic difference equations one bite at a time. Let’s consider a first-order linear equation driven by a Gaussian random variable.

$$x_{n+1} = a x_n + w_n, \qquad w_n \sim N(0, \sigma^2), \qquad |a| < 1.$$

Let’s dissect this one bit at a time. $x_n$ is our system’s state, or value, at time $n$. Our time variable $n$ takes on integer values: ..., -2, -1, 0, 1, 2, and so on. $w_n$ is our noise term: a Gaussian (normally distributed) random variable with mean 0 and standard deviation $\sigma$ at each time $n$, and the noise values at different times are uncorrelated. We require $|a| < 1$ so that our equation is stable, that is, does not blow up.
What does this series look like? Here is a sample run of 5000 steps for $a = 0.9967$ and $\sigma = 0.082$. The significance of these numbers will be revealed later.

[Figure: sample run of 5000 steps with $a = 0.9967$, $\sigma = 0.082$]
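The code behind that plot isn’t included in the post, so here is a minimal sketch of how such a sample path can be generated, assuming NumPy and Matplotlib; the random seed and styling are arbitrary choices.

```python
import numpy as np
import matplotlib.pyplot as plt

# Parameters quoted in the post for the sample run.
a, sigma, n_steps = 0.9967, 0.082, 5000

rng = np.random.default_rng(0)          # arbitrary seed, for reproducibility
x = np.zeros(n_steps)
for n in range(n_steps - 1):
    # x_{n+1} = a * x_n + w_n,  with w_n ~ N(0, sigma^2)
    x[n + 1] = a * x[n] + rng.normal(0.0, sigma)

plt.plot(x)
plt.xlabel("n")
plt.ylabel("x_n")
plt.title("Sample run: a = 0.9967, sigma = 0.082")
plt.show()
```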

Let’s derive some basic properties of our stochastic series $x_n$. Recall that $\mathbb{E}(\cdot)$ is the expectation operator, meaning it computes the expected value of its random-variable argument.



What is the average (expected) value of $x_n$? Since our process is stationary, that is, $a$ is fixed and $w_n$ does not vary its statistics over time, $\mathbb{E}(x_n)$ is constant; let’s call it $\mathbb{E}(x)$. Then

$$\begin{aligned}
\mathbb{E}(x_{n+1}) &= a\,\mathbb{E}(x_n) + \mathbb{E}(w_n) \\
\mathbb{E}(x)(1-a) &= \mathbb{E}(w_n) \\
\mathbb{E}(x) &= 0
\end{aligned}$$
where we used $\mathbb{E}(w_n) = 0$ for the final step.

What about the variance of $x$?
$$\begin{aligned}
\mathrm{Var}(x_{n+1}) &= a^2\,\mathrm{Var}(x_n) + \mathrm{Var}(w_n) \\
\mathrm{Var}(x)(1-a^2) &= \sigma^2 \\
\mathrm{Var}(x) &= \frac{\sigma^2}{1-a^2}
\end{aligned}$$
And the covariance between adjacent values:
$$\begin{aligned}
\mathrm{Cov}(x_{n+1}, x_n) &= \mathbb{E}\big[(x_{n+1} - \mathbb{E}(x_{n+1}))(x_n - \mathbb{E}(x_n))\big] = \mathbb{E}[x_{n+1} x_n] \\
&= \mathbb{E}[(a x_n + w_n) x_n] = \mathbb{E}[a x_n^2] = \frac{a\sigma^2}{1-a^2}
\end{aligned}$$
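These stationary moments are easy to sanity-check numerically. Here is a small Monte Carlo sketch; the values $a = 0.9$, $\sigma = 0.5$, the path length, and the seed are illustrative choices, not from the post.

```python
import numpy as np

# Monte Carlo check of the stationary mean, variance, and lag-1 covariance.
# a, sigma, path length, and seed are arbitrary illustrative choices.
a, sigma, n_steps = 0.9, 0.5, 200_000
rng = np.random.default_rng(1)

x = np.zeros(n_steps)
for n in range(n_steps - 1):
    x[n + 1] = a * x[n] + rng.normal(0.0, sigma)

var_theory = sigma**2 / (1 - a**2)
print(f"mean    sample {x.mean():+.4f}   theory 0")
print(f"var     sample {x.var():.4f}   theory {var_theory:.4f}")
print(f"cov(1)  sample {np.mean(x[1:] * x[:-1]):.4f}   theory {a * var_theory:.4f}")
```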

Let’s look at the increments:
$$\begin{aligned}
\Delta x_{n+1} &= x_{n+1} - x_n = (a-1)x_n + w_n \\
\mathrm{Var}(\Delta x) &= (a-1)^2\,\mathrm{Var}(x) + \sigma^2 \\
\mathrm{Var}(\Delta x) &= \sigma^2\left[\frac{(1-a)^2}{1-a^2} + 1\right] = \frac{2\sigma^2}{1+a}.
\end{aligned}$$
Note: if $a$ is close to 1, then the variance of the increments is close to $\sigma^2$.
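A quick way to see this numerically, using the same $a$ and $\sigma$ as the sample run above (path length and seed are arbitrary):

```python
import numpy as np

# Check Var(x_{n+1} - x_n) = 2*sigma^2/(1 + a), using the a and sigma from the
# sample run above (path length and seed are arbitrary).
a, sigma, n_steps = 0.9967, 0.082, 500_000
rng = np.random.default_rng(2)

x = np.zeros(n_steps)
for n in range(n_steps - 1):
    x[n + 1] = a * x[n] + rng.normal(0.0, sigma)

dx = np.diff(x)
print(f"sample Var(dx)            {dx.var():.6f}")
print(f"theory 2*sigma^2/(1 + a)  {2 * sigma**2 / (1 + a):.6f}")
print(f"sigma^2 (limit as a -> 1) {sigma**2:.6f}")
```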

Now, let’s get a little more involved and compute the variance over $k$ steps. From some basic system theory (or inspection):
$$x_k = x_0 a^k + \sum_{s=0}^{k-1} a^{k-s-1} w_s$$
To make sure we have the right limits, let’s check for $k = 1$:
$$x_1 = a x_0 + w_0$$
That looks right, let’s continue on.
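If you want to convince yourself of the closed form without the algebra, here is a small check that iterates the recursion and compares it against the sum above, using the same noise draws; the specific values of $x_0$, $a$, $\sigma$, and $k$ are arbitrary.

```python
import numpy as np

# Compare the closed form x_k = x_0*a^k + sum_{s=0}^{k-1} a^(k-s-1)*w_s
# against simply iterating the recursion with the same noise draws.
# The parameters x_0, a, sigma, and k are arbitrary.
a, sigma, k = 0.9, 0.5, 50
x0 = 1.0
rng = np.random.default_rng(3)
w = rng.normal(0.0, sigma, size=k)

x = x0
for s in range(k):
    x = a * x + w[s]                       # iterate x_{s+1} = a*x_s + w_s

x_closed = x0 * a**k + sum(a**(k - s - 1) * w[s] for s in range(k))

print(x, x_closed)   # the two values agree to floating-point precision
```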
$$\mathrm{Var}(x_k - x_0) = (a^k - 1)^2\,\mathrm{Var}(x) + \sigma^2 \sum_{s=0}^{k-1} \left(a^{k-s-1}\right)^2$$
Now, using the Sum of a Geometric Series, we have
$$\mathrm{Var}(x_k - x_0) = \frac{(a^k - 1)^2}{1-a^2}\sigma^2 + \sigma^2\,\frac{1 - a^{2k}}{1-a^2} = \sigma^2\left[\frac{(a^k - 1)^2 + 1 - a^{2k}}{1-a^2}\right] = \frac{2\sigma^2\left(1 - a^k\right)}{1-a^2}.$$
Checking when $k = 1$, we see it matches our earlier result for $\mathrm{Var}(\Delta x)$.
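Here is a Monte Carlo sketch of this $k$-step result, drawing $x_0$ from the stationary distribution $N(0, \sigma^2/(1-a^2))$; the parameters and seed are again arbitrary illustrative choices.

```python
import numpy as np

# Monte Carlo check of Var(x_k - x_0) = 2*sigma^2*(1 - a^k)/(1 - a^2),
# with x_0 drawn from the stationary distribution N(0, sigma^2/(1 - a^2)).
# Parameters and seed are arbitrary illustrative choices.
a, sigma, k, n_paths = 0.9, 0.5, 20, 200_000
rng = np.random.default_rng(4)

x0 = rng.normal(0.0, sigma / np.sqrt(1 - a**2), size=n_paths)
x = x0.copy()
for _ in range(k):
    x = a * x + rng.normal(0.0, sigma, size=n_paths)

print(f"sample  {np.var(x - x0):.4f}")
print(f"theory  {2 * sigma**2 * (1 - a**k) / (1 - a**2):.4f}")
```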

In Part 2 we will look more closely at some simulations and trading results on these series.

Notes:

  1. Generally credited to General Creighton Williams Abrams, Jr., Chief of Staff of the United States Army, 1972–1974, but an earlier reference to the concept is from Frank Cody, Detroit’s Superintendent of Schools, in 1921 link ↩

Sum of a Geometric Series

What is the sum of the following geometric series?
$$\sum_{i=0}^{k} a^i = 1 + a + a^2 + \cdots + a^k$$
We will frequently need a simple formula for this finite series. It is called a geometric series because each term is related by a multiple to the previous one. If each term was related by a fixed difference, it would be called an arithmetic series; but that’s for another day.
Let us define the sum as $S$. Then writing the equation for $S$ and $aS$ with some clever alignment:

$$\begin{aligned}
S &= 1 + a + a^2 + \cdots + a^k \\
aS &= \phantom{1 + {}} a + a^2 + \cdots + a^k + a^{k+1}
\end{aligned}$$
Subtract the equations
$$S(1-a) = 1 - a^{k+1}$$
or
$$S = \begin{cases} \dfrac{1 - a^{k+1}}{1-a} & a \neq 1 \\[1ex] k+1 & \text{otherwise} \end{cases}$$
where we have to be careful to divide by $1-a$ only if $a \neq 1$, and the answer for $a = 1$ is determined by inspection.
Q.E.D.
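A two-line numerical check of the formula, including the $a = 1$ case (purely illustrative):

```python
def geometric_sum(a: float, k: int) -> float:
    """Sum of a^i for i = 0..k, using the closed form when a != 1."""
    if a == 1:
        return k + 1
    return (1 - a**(k + 1)) / (1 - a)

# Compare the closed form against the brute-force sum for a few values of a.
for a in (0.5, 1.0, 2.0):
    k = 10
    brute = sum(a**i for i in range(k + 1))
    print(a, brute, geometric_sum(a, k))
```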

How does that limit for e work?

You may have seen the following equation:
$$\lim_{n\to\infty}\left(1 + \frac{1}{n}\right)^n = e,$$

which expresses the fundamental constant e as a limit. Wondering how that can be proven?

Consider this graph:

[Figure: the curve $1/t$, with the area under it on the interval $(1, 1+\frac{1}{n})$ highlighted]

One key fact we will use is

$$\int_1^x \frac{1}{t}\,dt = \ln x$$

Looking at the graph, you can see that the area under the $1/t$ curve on the interval $(1, 1+\frac{1}{n})$ is bounded by

$$\frac{1}{n}\left(\frac{1}{1+\frac{1}{n}}\right) \;\le\; \int_1^{1+\frac{1}{n}} \frac{1}{t}\,dt \;\le\; \frac{1}{n}(1)$$

Using the definition of $\ln x$ above, we have

$$\frac{1}{1+n} \;\le\; \ln\left(1 + \frac{1}{n}\right) \;\le\; \frac{1}{n}$$

Raising $e$ to the power of each term,

$$e^{\frac{1}{1+n}} \;\le\; 1 + \frac{1}{n} \;\le\; e^{\frac{1}{n}}$$

Raising each term to the power $n$,
$$e^{\frac{n}{1+n}} \;\le\; \left(1 + \frac{1}{n}\right)^n \;\le\; e$$

As $n$ approaches infinity, the left-hand term approaches $e$, therefore
$$\lim_{n\to\infty}\left(1 + \frac{1}{n}\right)^n = e.$$

Q.E.D.
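A quick numerical illustration of the sandwich $e^{n/(1+n)} \le \left(1 + \frac{1}{n}\right)^n \le e$ and of the convergence (the values of $n$ are chosen arbitrarily):

```python
import math

# Illustrate the sandwich e^(n/(n+1)) <= (1 + 1/n)^n <= e and the convergence.
for n in (10, 100, 10_000, 1_000_000):
    lower = math.exp(n / (n + 1))
    value = (1 + 1 / n) ** n
    print(f"n = {n:>9}:  {lower:.8f} <= {value:.8f} <= {math.e:.8f}")
```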