Section 8.5 is entitled “Properties of Linear Systems.” I have a feeling a lot of this section will be direct quotes from the book, considering it contains quite a few proofs.
“Suppose x1 and x2 are solutions to the homogeneous linear system x’ = Ax. If C1 and C2 are any constants, then x = C1x1 + C2x2 is also a solution to [this system]” (362).
Using the properties of matrix multiplication (plus the linearity of the derivative), we can prove this:

x’ = (C1x1 + C2x2)’ = C1x1’ + C2x2’ = C1Ax1 + C2Ax2 = A(C1x1 + C2x2) = Ax.

Note that this theorem only works because the system is linear and homogeneous; for a nonhomogeneous system x’ = Ax + f, a linear combination of solutions is generally no longer a solution.
“Suppose that x1, x2, … , and xk are all solutions to the homogeneous linear system x’ = Ax. Then any linear combination of x1, x2, … , and xk is also a solution. Thus for any constants C1, C2, … , Ck, the function x = C1x1 + C2x2 + … + Ckxk is a solution to x’ = Ax” (363).
So suppose we have a system, and we want a solution x that can be expressed as x = C1x1 + C2x2. In this case, x1 and x2 are solutions to our system, and C1 and C2 are arbitrary constants. Our goal is to find C1 and C2 that make this expression for x satisfy a given condition for all t. If we know something about our solutions x1 and x2, like what their values are at a specific time/point t0, we can solve for C1 and C2 at that time/point. We then have a handy existence/uniqueness theorem that implies our equation is satisfied for all t. Hooray!
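For instance (with made-up numbers of my own, not from the book), suppose we know x1(t0) = (1, -1) and x2(t0) = (1, -2), and we want x(t0) = (3, -4). Then finding C1 and C2 is just a 2 × 2 linear solve:

import numpy as np

# values of the two solutions at t0 (hypothetical numbers), stored as columns
M = np.array([[1.0, 1.0],
              [-1.0, -2.0]])
b = np.array([3.0, -4.0])   # the value we want x(t0) to have

C = np.linalg.solve(M, b)   # solves M @ C = b
print(C)                    # [2. 1.], so x = 2*x1 + 1*x2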
However, note that we can only solve for our constants provided the matrix

[ x11  x21 ]
[ x12  x22 ]

is nonsingular. The matrix will be nonsingular if x1(t0) and x2(t0) are linearly independent. In this case x11 and x12 are the values for x1 at our specific value of t, and x21 and x22 are the values for x2 at our specific value of t. I suppose I could have made this a little more general, and kept adding rows down to the values x1n and x2n, but you get the idea.
This works (just in case you’re a little foggy on matrix multiplication) because x(t0) can be compacted to

x(t0) = [ x1(t0)  x2(t0) ] [ C1 ]
                           [ C2 ]

The dimension of the first matrix is n × 2, and the dimension of the second matrix is 2 × 1, so the multiplication is valid. Double hooray!
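Concretely (reusing my made-up numbers from above), the compact form is just a matrix-vector product, and it reproduces the linear combination C1x1(t0) + C2x2(t0):

import numpy as np

x1_t0 = np.array([1.0, -1.0])        # x1(t0), hypothetical values from before
x2_t0 = np.array([1.0, -2.0])        # x2(t0)
C = np.array([2.0, 1.0])             # the constants we solved for

M = np.column_stack([x1_t0, x2_t0])  # the n x 2 matrix [x1(t0) x2(t0)]
print(M @ C)                         # [ 3. -4.], the same as 2*x1_t0 + 1*x2_t0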
Let’s look at a proposition concerning
this.
“Suppose y1(t), y2(t), … , and yk(t) are solutions to the n-dimensional system y’ = Ay defined on the interval I = (α, β).
1. If the vectors y1(t0), y2(t0), … , and yk(t0) are linearly dependent for some t0 ∈ I, then there are constants C1, C2, … , and Ck, not all zero, such that C1y1(t) + C2y2(t) + … + Ckyk(t) = 0 for all t ∈ I. In particular, y1(t), y2(t), … , and yk(t) are linearly dependent for all t ∈ I.
2. If for some t0 ∈ I the vectors y1(t0), y2(t0), … , and yk(t0) are linearly independent, then y1(t), y2(t), … , and yk(t) are linearly independent for all t ∈ I” (365).
The consequence of this is fairly straightforward. If the solutions to y’ = Ay are linearly independent at even one value of t, then the set of all k solutions is linearly independent for all t in the interval.
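Here’s a quick numerical check of my own (not from the book), using A = [[0, 1], [0, 0]]. Two solutions of y’ = Ay are y1(t) = (1, 0) and y2(t) = (t, 1); they are independent at t = 0, and sure enough the determinant of [y1(t) y2(t)] never vanishes:

import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

def y1(t):
    return np.array([1.0, 0.0])   # constant solution: y1' = 0 = A @ y1

def y2(t):
    return np.array([t, 1.0])     # y2' = (1, 0) = A @ y2

for t in [-5.0, 0.0, 3.0, 10.0]:
    M = np.column_stack([y1(t), y2(t)])  # columns are the solution values at t
    print(t, np.linalg.det(M))           # prints 1.0 at every t: never zero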
“Suppose y1, …, and yn are linearly independent solutions to the n-dimensional linear system y’(t) = Ay(t). Then any solution y can be expressed as a linear combination of y1, …, and yn. That is, there are constants C1, …, and Cn such that y = C1y1 + … + Cnyn for all t” (365).
If our homogeneous linear system has a set of n linearly independent solutions, that set is called a fundamental set of solutions.
Our last theorem provides us with a way of finding general solutions to the homogeneous system y’ = Ay:
1. Find n solutions y1, y2, …, yn.
2. Show the n solutions are linearly independent.
3. Make the general solution y = C1y1 + C2y2 + … + Cnyn, where C1, C2, …, and Cn are arbitrary constants.
Elaborating on step 2, we only have to show linear independence for one value of t. A great way to do this is to use determinants and the Wronskian, W(t) = det[y1(t) y2(t) … yn(t)]. I barely covered this in section 4.1 (which turned out to not be a thing in this class for a while). Hooray! My inability to follow the weekly schedule early on has finally paid off! If the Wronskian is not equal to zero for one value of t, then the n solutions are linearly independent.
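To tie the whole procedure together, here’s a small example of my own (the matrix A is made up, not from the book). For y’ = Ay with A = [[0, 1], [-2, -3]], the eigenvalues are -1 and -2, giving the solutions y1 = e^(-t)(1, -1) and y2 = e^(-2t)(1, -2). The Wronskian at t = 0 is -1, which is nonzero, so y = C1y1 + C2y2 is the general solution:

import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

def y1(t):
    return np.exp(-t) * np.array([1.0, -1.0])      # eigenvalue -1 solution

def y2(t):
    return np.exp(-2 * t) * np.array([1.0, -2.0])  # eigenvalue -2 solution

# sanity check at a sample point: each yi really satisfies yi' = A @ yi
t = 0.7
print(np.allclose(-1 * y1(t), A @ y1(t)))   # True, since y1' = -y1
print(np.allclose(-2 * y2(t), A @ y2(t)))   # True, since y2' = -2*y2

# Wronskian at t = 0: determinant of the matrix with columns y1(0), y2(0)
print(np.linalg.det(np.column_stack([y1(0), y2(0)])))   # -1.0, nonzero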
That’s all for chapter 8! I know I’ll have the
time, but if I have the motivation, then I’ll get started on chapter 9.
I’ll see you when I see you.