San José State University

applet-magic.com
Thayer Watkins
Silicon Valley
USA

# Theorems Concerning the Existence of Solutions to Systems of Ordinary Differential Equations

The solution of differential equations has been the heart and soul of the physical sciences since the time of Isaac Newton. The question of whether or not a solution actually exists for an equation involving a derivative is vital. Fortunately, for linear equations, constructive proofs of the existence of solutions are readily available. One goal of the analysis is to be able to say that for a system of n linear equations there exist n independent solutions.

## Preliminaries

For a square matrix M define its exponential function exp(M), which maps square matrices into square matrices of the same dimension, as

#### exp(M) = I + M + M²/2! + M³/3! + …

where I is the identity matrix of the same dimensions as M. Questions of convergence will be dealt with later.
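As a quick numerical sketch of this series definition (using NumPy; the matrix M and the 30-term cutoff are arbitrary choices, not from the text), the truncated series can be compared against a matrix whose exponential is known in closed form:

```python
import numpy as np

def exp_series(M, terms=30):
    """Approximate exp(M) by the truncated series I + M + M²/2! + M³/3! + …"""
    n = M.shape[0]
    result = np.eye(n)
    term = np.eye(n)
    for k in range(1, terms):
        term = term @ M / k          # term is now M^k / k!
        result = result + term
    return result

# For this M, exp(M) is known in closed form: a rotation by one radian
M = np.array([[0.0, 1.0], [-1.0, 0.0]])
approx = exp_series(M)
exact = np.array([[np.cos(1.0), np.sin(1.0)],
                  [-np.sin(1.0), np.cos(1.0)]])
print(np.allclose(approx, exact))    # True
```

For matrices of moderate norm the factorials in the denominators make the series converge rapidly, which is why a short truncation suffices here.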

For a matrix M multiplied by a scalar t the definition reduces to

#### exp(Mt) = I + Mt + M²t²/2! + M³t³/3! + …

## Systems of Linear Homogeneous Differential Equations and Their Solution

A set of linear differential equations of the form

#### dx1/dt = a1,1x1 + a1,2x2 + … + a1,nxn
#### dx2/dt = a2,1x1 + a2,2x2 + … + a2,nxn
#### …
#### dxn/dt = an,1x1 + an,2x2 + … + an,nxn

can be expressed as

#### dX/dt = AX(t)

where X(t) is an n dimensional column vector and A is the n×n matrix of the coefficients ai,j. The solution must also satisfy the initial condition X(0) = X0.

Now consider the exponential matrix function

#### exp(At) = I + At + A²t²/2! + A³t³/3! + …

The derivative of the right-hand side (RHS) of the above with respect to t gives

#### d(exp(At))/dt = A + A²(2t)/2! + A³(3t²)/3! + …

which reduces to

#### d(exp(At))/dt = A + A²t + A³t²/2! + …

The matrix A can be factored as a premultiplier from each term so

#### d(exp(At))/dt = A[ I + At + A²t²/2! + …]

What is left in the brackets is none other than exp(At). Therefore

#### d(exp(At))/dt = A[exp(At)]

Now consider X(t) = exp(At)X0. Differentiation by t shows that

#### dX/dt = A(exp(At)X0) = AX(t)

Thus the function X(t) = exp(At)X0 satisfies the system of differential equations dX/dt = AX, and since exp(A·0) = I it also satisfies the initial condition X(0) = X0.
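A minimal numerical check of this result, assuming NumPy and SciPy are available (the matrix A and initial vector X0 are arbitrary examples): the derivative of exp(At)X0, estimated by a central difference, should match AX(t).

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # an arbitrary example matrix
X0 = np.array([1.0, 0.0])

def X(t):
    return expm(A * t) @ X0                 # proposed solution X(t) = exp(At)X0

# Estimate dX/dt at t = 0.5 by a central difference and compare with AX(t)
t, h = 0.5, 1e-6
dX_num = (X(t + h) - X(t - h)) / (2 * h)
print(np.allclose(dX_num, A @ X(t), atol=1e-5))   # True
print(np.allclose(X(0.0), X0))                    # True: exp(A·0) = I
```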

## Systems of Linear Inhomogeneous Equations

Consider a system of equations of the form

#### dX/dt = AX + C

where C is a vector of constants. If A has an inverse then the system can be converted into the form

#### dX/dt = A(X+D)

where D = A⁻¹C.

Let Y(t)=X(t)+D. Then the system becomes

#### dY/dt = AY

This system has the solution Y(t) = exp(At)Y(0) and thus

#### X(t) = exp(At)(X0 + D) − D
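A sketch of a numerical check for this inhomogeneous case (the particular A, C, and X0 are arbitrary choices), verifying both the differential equation and the initial condition:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[-1.0, 0.0], [0.0, -2.0]])   # arbitrary invertible example
C = np.array([1.0, 4.0])
D = np.linalg.solve(A, C)                  # D = A⁻¹C
X0 = np.array([3.0, 3.0])

def X(t):
    return expm(A * t) @ (X0 + D) - D      # X(t) = exp(At)(X0 + D) − D

t, h = 0.7, 1e-6
dX_num = (X(t + h) - X(t - h)) / (2 * h)
print(np.allclose(dX_num, A @ X(t) + C, atol=1e-5))  # True: dX/dt = AX + C
print(np.allclose(X(0.0), X0))                       # True
```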

## Systems with Variable Coefficients

The systems previously considered all had constant coefficients. If any coefficient is a non-trivial function of time then the system

#### dX/dt = A(t)X

has the solution

#### X(t) = exp(∫₀ᵗA(s)ds)X(0)

provided A(t) commutes with its integral ∫₀ᵗA(s)ds, as it does, for example, when A(t) = a(t)A0 for a scalar function a(t) and a constant matrix A0.

If A(t) has an inverse for all t then the solution to the inhomogeneous system

#### dX/dt = A(t)X + C(t)

can be constructed as

#### X(t) = exp(∫₀ᵗA(s)ds)(X(0) + D(0)) − D(t)

where D(t) = A(t)⁻¹C(t).
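A sketch of a numerical check of the homogeneous variable-coefficient formula for the special case A(t) = a(t)·A0, a scalar function times a constant matrix, so that A(t) commutes with its own integral (the particular a(t) and A0 are arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm

A0 = np.array([[0.0, 1.0], [-1.0, 0.0]])   # constant matrix (arbitrary)
a = lambda t: 1.0 + t                       # scalar coefficient, A(t) = a(t)·A0
ia = lambda t: t + t**2 / 2.0               # ∫₀ᵗ a(s) ds in closed form
X0 = np.array([1.0, 0.0])

def X(t):
    # X(t) = exp(∫₀ᵗ A(s) ds) X(0); here the integral is ia(t)·A0
    return expm(ia(t) * A0) @ X0

t, h = 0.4, 1e-6
dX_num = (X(t + h) - X(t - h)) / (2 * h)
print(np.allclose(dX_num, a(t) * (A0 @ X(t)), atol=1e-5))  # True
```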

## Questions of Convergence

A symmetric square matrix M can be represented as

#### M = PΛPᵀ

where P is an orthogonal matrix; i.e., P⁻¹ = Pᵀ, the inverse of P being equal to the transpose of P. The matrix Λ is a diagonal matrix. Since PᵀP = I, every power of M satisfies Mᵏ = PΛᵏPᵀ.

This means that

#### exp(M) = P·exp(Λ)Pᵀ

The diagonal elements of Λ are called the eigenvalues of the matrix M and the columns of P are its eigenvectors.

#### If Λ = Diag(λ1, λ2, …, λn) then exp(Λ) = Diag(exp(λ1), exp(λ2), …, exp(λn)).
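A short check of this identity for a symmetric matrix (the particular M is an arbitrary example), using NumPy's `eigh` to obtain the orthogonal decomposition:

```python
import numpy as np
from scipy.linalg import expm

M = np.array([[2.0, 1.0], [1.0, 2.0]])     # symmetric, so M = PΛPᵀ with orthogonal P
eigvals, P = np.linalg.eigh(M)

expM = P @ np.diag(np.exp(eigvals)) @ P.T  # P·exp(Λ)·Pᵀ
print(np.allclose(expM, expm(M)))          # True
```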

One small complication is that the eigenvalues of a matrix may be nonreal. That is no major difficulty, in that exp(x+iy) is well defined: it is exp(x)(cos(y) + i·sin(y)). A more serious complication occurs when an eigenvalue occurs with a multiplicity greater than unity. If an eigenvalue is repeated then the second occurrence does not constitute another independent solution.

If an eigenvalue λ occurs with a multiplicity p then

#### {exp(λt), t·exp(λt), t²·exp(λt), …, tᵖ⁻¹·exp(λt)}

are all solutions.
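This can be illustrated with a 2×2 Jordan block, whose matrix exponential mixes exp(λt) and t·exp(λt) (the value of λ here is an arbitrary choice):

```python
import numpy as np
from scipy.linalg import expm

lam = -0.5                                   # arbitrary repeated eigenvalue
J = np.array([[lam, 1.0], [0.0, lam]])       # 2×2 Jordan block, multiplicity 2

t = 2.0
# exp(Jt) = exp(λt)·[[1, t], [0, 1]]: entries combine exp(λt) and t·exp(λt)
expected = np.exp(lam * t) * np.array([[1.0, t], [0.0, 1.0]])
print(np.allclose(expm(J * t), expected))    # True
```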

(To be continued.)

## Higher Order Differential Equations

A differential equation of order k can be converted into a system of k first order equations. This just involves defining new variables such that

#### x1 = x, x2 = dx/dt, x3 = d²x/dt², …, xk = dᵏ⁻¹x/dtᵏ⁻¹

so that dxj/dt = xj+1 for j < k, with dxk/dt given by the original equation solved for the highest derivative.
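As a hypothetical example (the equation y'' + 3y' + 2y = 0 is my choice, not from the text), the conversion lets the matrix-exponential solution of the first order system reproduce the analytic solution of the original scalar equation:

```python
import numpy as np
from scipy.linalg import expm

# y'' + 3y' + 2y = 0 with x1 = y, x2 = dy/dt becomes dX/dt = AX
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
X0 = np.array([1.0, 0.0])                   # y(0) = 1, y'(0) = 0

t = 1.5
y_from_system = (expm(A * t) @ X0)[0]
y_exact = 2.0 * np.exp(-t) - np.exp(-2.0 * t)   # analytic solution
print(np.isclose(y_from_system, y_exact))       # True
```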

## Systems of Nonlinear Differential Equations

Suppose a system of ordinary differential equations is represented as

#### dX/dt = F(X)

One approach to a solution is by iteration. Start with an arbitrary X0(t); then construct X1(t) by integration of

#### dX1/dt = F(X0(t))

and likewise for X2(t) and beyond as

#### dXn+1/dt = F(Xn(t))
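A sketch of this iteration for a scalar example dx/dt = −x², x(0) = 1, whose exact solution is 1/(1+t) (the example equation and the trapezoid-rule integration are my choices, not from the text):

```python
import numpy as np

# Picard-style iteration for dx/dt = F(x) = -x², x(0) = 1; exact solution 1/(1+t)
F = lambda x: -x**2
ts = np.linspace(0.0, 0.5, 101)
x = np.ones_like(ts)                # X0(t): start from the constant initial value

for n in range(30):
    # X_{n+1}(t) = x(0) + ∫₀ᵗ F(X_n(s)) ds, integrated by the trapezoid rule
    f = F(x)
    integral = np.concatenate(([0.0], np.cumsum((f[1:] + f[:-1]) / 2.0 * np.diff(ts))))
    x = 1.0 + integral

print(np.allclose(x, 1.0 / (1.0 + ts), atol=1e-4))   # True
```

Each pass integrates F along the previous iterate, so successive iterates are successive approximations to the solution of the integral form of the equation.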

The question of convergence can be examined by looking at the differences yn=Xn(t)−Xn-1(t). Then

#### dyn/dt = F(Xn)−F(Xn-1)

The RHS of the above can be approximated by (∂F/∂X)·yn(t). Thus

#### dyn/dt = (∂F/∂X)·yn(t)

The y's would go asymptotically to zero if the eigenvalues of the matrices (∂F/∂X) all have negative real parts.

(To be continued.)