#definition[
*Equilibrium solutions* for the ODE
$ y' = F(x,y) $
are solutions $y(x)$ such that $y'(x) = 0$, that is, $y(x)$ is *constant*.
]

#example[
techniques.
]

= Principle of superposition, the Wronskian, complex roots

== Review
$
y_2(t) = 1 / (2i) [z_1(t) - z_2(t)] = e^(lambda t) sin mu t, "imaginary part of" z_1(t) \
$

(In fact, this is a variant of the Laplace transform.)

By the superposition principle, they are solutions. Are they a fundamental set
of solutions? Are they a basis for the solution space?
== Linear systems of differential equations

Consider the following *linear system* of ODEs:
$
x' = x + 2y \
y' = 2x - 2y
$
We want functions $x(t)$ and $y(t)$ that together solve this system. For instance,
$
x(t) = 2e^(2t), y(t) = e^(2t)
$
We write this as the vector
$
arrow(x)(t) = vec(2e^(2t), e^(2t))
$
We express our system above in matrix form:
$
arrow(x)'(t) = mat(1,2;2,-2) arrow(x)(t)
$
The solution $arrow(x)(t)$ is a *vector-valued function*: it maps $t in RR$ to a point in $RR^2$, so $arrow(x) : RR -> RR^2$.
We may interpret $x$ and $y$ as populations of competing species or
concentrations of two compounds in a mixture.
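Before moving on, the claimed solution $(x, y) = (2e^(2t), e^(2t))$ can be sanity-checked numerically. A minimal plain-Python sketch (not part of the original notes), which plugs the exact derivatives into both equations:

```python
import math

def x(t): return 2 * math.exp(2 * t)   # proposed solution x(t) = 2e^(2t)
def y(t): return math.exp(2 * t)       # proposed solution y(t) = e^(2t)

# Exact derivatives of the proposed solutions
def dx(t): return 4 * math.exp(2 * t)
def dy(t): return 2 * math.exp(2 * t)

# Check x' = x + 2y and y' = 2x - 2y at a few sample times
for t in [-1.0, 0.0, 0.5, 2.0]:
    assert math.isclose(dx(t), x(t) + 2 * y(t), rel_tol=1e-12)
    assert math.isclose(dy(t), 2 * x(t) - 2 * y(t), rel_tol=1e-12)
print("(x, y) solves the system at all sampled t")
```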
== General first order system

#fact[
The general first order linear system is
$
arrow(x)'(t) = A(t) arrow(x)(t) + arrow(g)(t)
$
where $A(t)$ is an $n times n$ matrix and $arrow(g)$ and $arrow(x)$ are
vectors of length $n$:
$
arrow(g)(t) = vec(g_1 (t), g_2 (t), dots.v, g_n (t)) \
arrow(x)(t) = vec(mu_1 (t), mu_2 (t), dots.v, mu_n (t))
$
]

== Superposition principle for linear system

Consider a matrix $A$ and a solution vector $x$ satisfying
$
x'(t) = A(t) x(t)
$

#fact[Superposition principle for linear system][
If $x^((1)) (t)$ and $x^((2)) (t)$ are solutions, then
$
c_1 x^((1)) (t) + c_2 x^((2)) (t)
$
is also a solution, for any constants $c_1$, $c_2$.
]
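The superposition principle can be checked numerically for the earlier system $x' = x + 2y$, $y' = 2x - 2y$. A small plain-Python sketch (not from the notes; the second solution $e^(-3t) vec(1,-2)$ comes from the other eigenvalue $r = -3$, a computation assumed here):

```python
import math

# Coefficient matrix of the system x' = x + 2y, y' = 2x - 2y
A = [[1.0, 2.0], [2.0, -2.0]]

# Two closed-form solutions and their exact derivatives
def x1(t): return [2 * math.exp(2 * t), math.exp(2 * t)]
def x2(t): return [math.exp(-3 * t), -2 * math.exp(-3 * t)]
def dx1(t): return [4 * math.exp(2 * t), 2 * math.exp(2 * t)]
def dx2(t): return [-3 * math.exp(-3 * t), 6 * math.exp(-3 * t)]

def matvec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1], M[1][0] * v[0] + M[1][1] * v[1]]

c1, c2 = 1.5, -0.75  # arbitrary constants
for t in [-1.0, 0.0, 2.0]:
    combo = [c1 * a + c2 * b for a, b in zip(x1(t), x2(t))]
    dcombo = [c1 * a + c2 * b for a, b in zip(dx1(t), dx2(t))]
    rhs = matvec(A, combo)
    assert all(math.isclose(d, r, rel_tol=1e-12) for d, r in zip(dcombo, rhs))
print("c1*x1 + c2*x2 is again a solution")
```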

== Homogeneous case

This is when $arrow(g)(t) = 0$. Consider
$
arrow(x)'(t) = A arrow(x)(t)
$
Then there exist $n$ solutions
$
arrow(x)^((1)) (t), dots, arrow(x)^((n)) (t)
$
such that any solution $arrow(x) (t)$ is a (unique) linear combination
$
arrow(x)(t) = c_1 arrow(x)^((1)) (t) + dots + c_n arrow(x)^((n)) (t)
$

=== The Wronskian, back again!

Alternatively, form an $n times n$ matrix $X(t)$ by arranging the fundamental solutions in columns.

#definition[
This is called a *fundamental matrix*:
$
X(t) = mat(x^((1)) (t), dots.c, x^((n)) (t))
$
]

#definition[
$det X(t)$ is called the *Wronskian* of $x^((1)) (t), dots.c, x^((n)) (t)$.
]

=== Finding solutions

I want to solve $arrow(x)' = A arrow(x)$, where
$
arrow(x) = vec(mu_1 (t), mu_2 (t), dots.v, mu_n (t))
$
Similar to the scalar case, we look for solutions in terms of exponential
functions. Substitute
$
arrow(x)(t) = e^(r t) arrow(v)
$
where $r$ is a constant and $arrow(v)$ is a column vector in $RR^n$.

Then we have
$
arrow(x)'(t) = r e^(r t) arrow(v) \
A arrow(x)(t) = A e^(r t) arrow(v) = e^(r t) A arrow(v)
$

Then $arrow(x)(t)$ is a solution if
$
r arrow(v) = A arrow(v)
$

Recall that for a linear transformation from $RR^m$ to $RR^m$, an
*eigenvector* is a nonzero vector whose only transformation is that it gets
scaled: applying the linear transformation is equivalent to multiplying the
vector by a scalar. The corresponding *eigenvalue* is the scalar that scales
the vector under the transformation. That is, for a linear transformation
$A : RR^m -> RR^m$, a vector $arrow(v)$ in $RR^m$, and a scalar $lambda$, if
$A arrow(v) = lambda arrow(v)$, then $lambda$ is an eigenvalue with
eigenvector $arrow(v)$. A complex eigenvalue corresponds to rotation, but we
will not discuss its geometric interpretation here.

To find an eigenvector for a given eigenvalue, one only needs to solve the
equation $(A - lambda I) arrow(v) = 0$.
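For a $2 times 2$ matrix this procedure can be carried out entirely by hand: the eigenvalues are the roots of $r^2 - (op("tr") A) r + det A = 0$, and the first row of $(A - r I) arrow(v) = 0$ pins down an eigenvector. A plain-Python sketch with a hypothetical matrix (chosen for illustration, not from the notes):

```python
import math

# Hypothetical 2x2 matrix [[4, 1], [2, 3]] for illustration
a, b, c, d = 4.0, 1.0, 2.0, 3.0

# Eigenvalues are roots of r^2 - (tr A) r + det A = 0
tr, det = a + d, a * d - b * c
disc = math.sqrt(tr * tr - 4 * det)   # assuming real, distinct roots here
r1, r2 = (tr + disc) / 2, (tr - disc) / 2

# From (A - rI) v = 0, the first row gives (a - r) v1 + b v2 = 0,
# so (with b != 0) v = (b, r - a) is an eigenvector.
for r in (r1, r2):
    v = (b, r - a)
    Av = (a * v[0] + b * v[1], c * v[0] + d * v[1])
    assert math.isclose(Av[0], r * v[0], rel_tol=1e-9, abs_tol=1e-9)
    assert math.isclose(Av[1], r * v[1], rel_tol=1e-9, abs_tol=1e-9)
print("eigenpairs:", (r1, (b, r1 - a)), (r2, (b, r2 - a)))
```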
Returning our attention to our equation above, we see that $arrow(v)$ is an
eigenvector of the coefficient matrix $A$ with eigenvalue $r$.

If an $n times n$ coefficient matrix $A$ has $n$ linearly independent
eigenvectors (an eigenbasis)
$
arrow(v)^((1)), ..., arrow(v)^((n))
$
with corresponding eigenvalues
$
r_1, ..., r_n
$
we guess, for each $k$,
$
arrow(x)^((k)) (t) = e^(r_k t) arrow(v)^((k))
$
Now we want the eigenvectors for our eigenvalues. Find an eigenvector
corresponding to $r_1 = 1$.

For $r_1 = 1$:
$
(A - I)v^((1)) = 0 \
mat(1,1;1,1) v^((1)) = 0 \
v^((1)) = vec(1,-1)
$

For $r_2 = 3$:
$
(A - 3I)v^((2)) = 0 \
mat(-1,1;1,-1) v^((2)) = 0 \
v^((2)) = vec(1,1)
$

Then our fundamental solutions are
$
x^((1)) (t) &= e^(r_1 t) arrow(v)^((1)) = e^t vec(1,-1) \
x^((2)) (t) &= e^(r_2 t) arrow(v)^((2)) = e^(3t) vec(1,1)
$
Forming a fundamental matrix:
$
X(t) = mat(e^t,e^(3t);-e^t,e^(3t))
$
This represents the general solution
$
arrow(x)(t) = c_1 arrow(x)^((1)) (t) + c_2 arrow(x)^((2)) (t) = c_1 e^t vec(1,-1) + c_2 e^(3t) vec(1,1)
$
or, written with the fundamental matrix,
$
arrow(x)(t) = X(t) arrow(c) = mat(e^t,e^(3t);-e^t,e^(3t)) vec(c_1,c_2)
$
]<general-solution-system>
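The eigenpairs in this example can be verified numerically. A plain-Python sketch (the coefficient matrix $mat(2,1;1,2)$ is inferred from the $(A - I)$ and $(A - 3I)$ computations above, not stated explicitly in this excerpt):

```python
import math

# Coefficient matrix inferred from (A - I) and (A - 3I) above
A = [[2.0, 1.0], [1.0, 2.0]]

def matvec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1], M[1][0] * v[0] + M[1][1] * v[1]]

# Check the eigenpairs used in the example
assert matvec(A, [1, -1]) == [1.0, -1.0]   # A v1 = 1 * v1
assert matvec(A, [1, 1]) == [3.0, 3.0]     # A v2 = 3 * v2

# Wronskian of X(t): det = e^t * e^(3t) - e^(3t) * (-e^t) = 2e^(4t)
def wronskian(t):
    return math.exp(t) * math.exp(3 * t) - math.exp(3 * t) * (-math.exp(t))

for t in [-2.0, 0.0, 1.0]:
    assert wronskian(t) > 0   # never zero, so the set is fundamental
assert math.isclose(wronskian(0.0), 2.0)
print("eigenpairs verified; W(t) = 2e^(4t) != 0")
```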

#example[Continuing @general-solution-system][
Now consider the initial value $arrow(x)(0) = arrow(x)_0 = vec(2,-1)$ and
solve the initial value problem.

The general solution is
$
arrow(x)(t) = X(t) arrow(c) = mat(e^t,e^(3t);-e^t,e^(3t)) vec(c_1,c_2)
$
Imposing the initial condition,
$
arrow(x)(0) = X(0) arrow(c) = arrow(x)_0 => arrow(c) = X(0)^(-1) arrow(x)_0 \
arrow(c) = mat(1,1;-1,1)^(-1) vec(2,-1) = 1 / 2 mat(1,-1;1,1) vec(2,-1) = vec(3/2,1/2)
$
Finally, this gives us
$
arrow(x)(t) = X(t) arrow(c) = mat(e^t,e^(3t);-e^t,e^(3t)) vec(3/2,1/2)
= vec(3/2 e^t + 1/2 e^(3t), -3/2 e^t + 1/2 e^(3t))
$
]
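The step $arrow(c) = X(0)^(-1) arrow(x)_0$ is just a $2 times 2$ linear solve, which can be reproduced with the explicit inverse formula. A plain-Python sketch of this computation (not part of the notes):

```python
# Solve X(0) c = x0 for the example's initial value x0 = (2, -1)
X0 = [[1.0, 1.0], [-1.0, 1.0]]   # X(0): columns are v1, v2
x0 = [2.0, -1.0]

det = X0[0][0] * X0[1][1] - X0[0][1] * X0[1][0]   # = 2
# Apply the 2x2 inverse formula to x0
c1 = ( X0[1][1] * x0[0] - X0[0][1] * x0[1]) / det
c2 = (-X0[1][0] * x0[0] + X0[0][0] * x0[1]) / det
assert (c1, c2) == (1.5, 0.5)   # matches c = (3/2, 1/2)

# Sanity check: c1*v1 + c2*v2 recovers the initial value
assert [c1 * 1 + c2 * 1, c1 * -1 + c2 * 1] == x0
print("c =", (c1, c2))
```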

== Visualizing solutions

We discuss visualizing the solutions in @general-solution-system. The general
solution is a vector-valued function:
$
arrow(x)(t) = c_1 arrow(x)^((1)) (t) + c_2 arrow(x)^((2)) (t) = c_1 e^t vec(1,-1) + c_2 e^(3t) vec(1,1)
$

As $t$ varies, each solution $arrow(x)(t)$ traces a curve in the plane. When
$c_1 = c_2 = 0$, $arrow(x) = vec(0,0)$ is the *equilibrium solution*.

When $c_1 != 0$ and $c_2 = 0$, the solution is a scalar multiple of $vec(1,-1)$,
and its magnitude tends to $infinity$ as $t -> infinity$ and to $0$ as
$t -> -infinity$.

When both $c_1$ and $c_2$ are nonzero, the full solution is a linear
combination. As $t -> infinity$ the magnitude of $arrow(x)$ tends to
$infinity$; as $t -> -infinity$ it tends to $0$.

The equilibrium solution $arrow(x)(t) = vec(0,0)$ stays at the origin for all
$t$ (it doesn't depend on $t$). Since both eigenvalues are positive, every
other solution moves away from it as $t$ increases, so this equilibrium is
*unstable*. It's an example of a *node*.
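The growth behavior described above can be sampled directly. A plain-Python sketch (assuming $c_1 = c_2 = 1$; not part of the notes):

```python
import math

# Sample the general solution x(t) = c1 e^t (1,-1) + c2 e^(3t) (1,1)
def sol(t, c1=1.0, c2=1.0):
    return (c1 * math.exp(t) + c2 * math.exp(3 * t),
            -c1 * math.exp(t) + c2 * math.exp(3 * t))

def mag(v): return math.hypot(*v)

# Magnitude grows as t -> infinity and shrinks toward 0 as t -> -infinity
assert mag(sol(10.0)) > mag(sol(0.0)) > mag(sol(-10.0))
assert mag(sol(-30.0)) < 1e-10
print("trajectory leaves the origin as t increases")
```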

= Repeated eigenvalues, nonhomogeneous systems

== Classification of equilibria, $n=2$

We discuss the classification of the possible equilibria at $0$ for a system
$arrow(x)' = A arrow(x)$ when $n=2$.

If we have real eigenvalues $r_1, r_2 != 0$, then
- $r_1, r_2 < 0$ means we have an *asymptotically stable* node
- $r_1, r_2 > 0$ means we have an *unstable* node
- $r_1 < 0 < r_2$ means we have an *unstable* saddle

== Repeated eigenvalues

Consider the system
$
arrow(x)' = mat(1,-2;2,5) arrow(x)
$
on $[a,b]$. It has one (repeated) eigenvalue, with a single eigenvector:
$
r_1 = 3, arrow(v)_1 = vec(1,-1)
$
So we have one solution:
$
arrow(x)_1 (t) = e^(3t) vec(1,-1)
$

How do we obtain the rest of our fundamental set? We need to try
$
arrow(x)_2 (t) = t e^(3t) arrow(v)_1 + e^(3t) arrow(u)
$
and find the right choice of $arrow(u)$.

As long as $arrow(u)$ solves $(A - r_1 I) arrow(u) = arrow(v)_1$, it works.

#definition[
We call such a vector $arrow(u)$ a *generalized eigenvector*.
]

#remark[
We can *always* find the rest of our solution space with this method.
]
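The generalized-eigenvector equation for this example can be checked by hand. A plain-Python sketch (the particular choice $arrow(u) = vec(-1/2, 0)$ is one valid solution of the singular system, assumed here rather than taken from the notes):

```python
# A = [[1,-2],[2,5]] with repeated eigenvalue r = 3, eigenvector v1 = (1,-1).
# Solve (A - 3I) u = v1; u = (-1/2, 0) is one valid choice (assumed here).
A = [[1.0, -2.0], [2.0, 5.0]]
r, v1 = 3.0, [1.0, -1.0]
u = [-0.5, 0.0]

B = [[A[0][0] - r, A[0][1]], [A[1][0], A[1][1] - r]]   # A - rI
Bu = [B[0][0] * u[0] + B[0][1] * u[1], B[1][0] * u[0] + B[1][1] * u[1]]
assert Bu == v1   # u is a generalized eigenvector
print("(A - 3I)u = v1 confirmed")
```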

== Nonhomogeneous linear system

Like before, when we have a set of fundamental solutions to the homogeneous
system, we only need to find any particular solution:
$
arrow(x)(t) = c_1 arrow(x)_1 (t) + c_2 arrow(x)_2 (t) + arrow(x)_p (t)
$

== Methods for finding a particular solution

- When $arrow(g)(t) = arrow(g)$ is a constant vector, there may exist an
  equilibrium solution, which can then be used as a particular solution.
- Method of undetermined coefficients: can be used in the constant coefficient
  case if $arrow(g)(t)$ has a special form. Very limited.
- Variation of parameters: more general, but leads to messy integrals.

== Equilibrium solution as particular solution

Let $arrow(g) in RR^n$ be a constant vector. Find a particular solution to
$
arrow(x)' = A arrow(x) + arrow(g)
$
Solve the linear system
$
A arrow(x) + arrow(g) = 0
$
to find a constant equilibrium solution.
If $A$ is invertible, then
$
arrow(x) = -A^(-1) arrow(g)
$
is an equilibrium solution, so
$
arrow(x)_p (t) = -A^(-1) arrow(g)
$
is a particular solution of the system.
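The recipe $arrow(x)_p = -A^(-1) arrow(g)$ is easy to check numerically. A plain-Python sketch with a hypothetical invertible matrix and forcing vector (chosen for illustration, not from the notes):

```python
# Hypothetical example: x' = A x + g with constant forcing g
A = [[2.0, 1.0], [1.0, 3.0]]   # assumed invertible (det = 5)
g = [1.0, -1.0]

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
# x_p = -A^(-1) g, via the 2x2 inverse formula
xp = [-( A[1][1] * g[0] - A[0][1] * g[1]) / det,
      -(-A[1][0] * g[0] + A[0][0] * g[1]) / det]

# Verify it is an equilibrium: A x_p + g = 0
res = [A[0][0] * xp[0] + A[0][1] * xp[1] + g[0],
       A[1][0] * xp[0] + A[1][1] * xp[1] + g[1]]
assert all(abs(v) < 1e-12 for v in res)
print("x_p =", xp)
```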

== Undetermined coefficients

Consider
$
arrow(x)' (t) = mat(1,2;2,1) arrow(x)(t) + vec(1,1)
$
We want a particular solution $arrow(x)_p$. Since the forcing term is a
constant vector, let's try a constant
$
arrow(x)_p = vec(A,B)
$

The general solution looks like
$
arrow(x)(t) = c_1 e^(-t) vec(1,-1) + c_2 e^(3t) vec(1,1) + arrow(x)_p (t)
$
Substituting the constant guess into the system, $arrow(x)_p ' = 0$, so
$
0 = mat(1,2;2,1) arrow(x)_p + vec(1,1)
$
giving $arrow(x)_p = -mat(1,2;2,1)^(-1) vec(1,1) = vec(-1/3, -1/3)$.
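For this constant forcing, the constant particular solution can be computed and checked numerically. A plain-Python sketch (not part of the notes):

```python
import math

# For x' = [[1,2],[2,1]] x + (1,1), a constant particular solution satisfies
# 0 = A x_p + g, so x_p = -A^(-1) g.
A = [[1.0, 2.0], [2.0, 1.0]]
g = [1.0, 1.0]

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # = -3
xp = [-( A[1][1] * g[0] - A[0][1] * g[1]) / det,
      -(-A[1][0] * g[0] + A[0][0] * g[1]) / det]
assert all(math.isclose(v, -1/3) for v in xp)

# Check: A x_p + g = 0
res = [A[0][0] * xp[0] + A[0][1] * xp[1] + g[0],
       A[1][0] * xp[0] + A[1][1] * xp[1] + g[1]]
assert all(abs(v) < 1e-12 for v in res)
print("x_p = (-1/3, -1/3)")
```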