auto-update(nvim): 2025-03-03 18:25:32
parent 243c74438a · commit fb5cfda895
2 changed files with 283 additions and 0 deletions

@@ -1103,6 +1103,49 @@ The equilibrium solution $arrow(x)(t) = vec(0,0)$ is stable (as $t$ moves in
either direction it tends to $vec(0,0)$ namely because it doesn't depend on
$t$). It's an example of a *node*.

== Imaginary eigenvalues

Consider the $2 times 2$ system
$
arrow(x)' = mat(1,-1;2,3) arrow(x)
$
Then the eigenvalues/vectors are
$
r_1 = 2 + i, arrow(v)_1 = vec(-1,2) + i vec(1,0) \
r_2 = 2 - i, arrow(v)_2 = vec(-1,2) - i vec(1,0)
$
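These come from the characteristic equation
$
det(A - r I) = (1 - r)(3 - r) + 2 = r^2 - 4 r + 5 = 0
$
whose roots are $r = 2 plus.minus i$.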

We have complex solutions
$
arrow(x)(t) = e^((2+i) t) (vec(-1,2) + i vec(1,0)) \
= e^(2t) (cos t + i sin t) (vec(-1,2) + i vec(1,0))
$
Now we can build two linearly independent real solutions out of this one complex solution.

Check that
$
arrow(x)_"Re" (t) = e^(2t) (cos t vec(-1,2) - sin t vec(1,0)) = vec(-e^(2t) cos t - e^(2t) sin t, 2e^(2t) cos t)
$
is a solution to the system.

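Likewise, the imaginary part gives a second real solution,
$
arrow(x)_"Im" (t) = e^(2t) (sin t vec(-1,2) + cos t vec(1,0)) = vec(e^(2t) cos t - e^(2t) sin t, 2e^(2t) sin t)
$
and these two are the linearly independent real solutions we wanted.
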
In general it turns out that for an $n times n$ system
$
arrow(x)' = A arrow(x)
$
if a pair of eigenvalues/eigenvectors is
$
r_(1,2) = lambda plus.minus i mu
$
and
$
arrow(v)_(1,2) = arrow(a) plus.minus i arrow(b)
$
then we have two real and fundamental solutions
$
arrow(x)_"Re" (t) = e^(lambda t) (cos(mu t) arrow(a) - sin(mu t) arrow(b)) \
arrow(x)_"Im" (t) = e^(lambda t) (sin(mu t) arrow(a) + cos(mu t) arrow(b))
$

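For the $2 times 2$ example above this is exactly what we found, with $lambda = 2$, $mu = 1$, $arrow(a) = vec(-1,2)$, and $arrow(b) = vec(1,0)$, so the general solution there is
$
arrow(x)(t) = c_1 arrow(x)_"Re" (t) + c_2 arrow(x)_"Im" (t)
$
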
= Repeated eigenvalues, nonhomogeneous systems

== Classification of equilibria $n=2$

@@ -1146,6 +1189,32 @@ As long as $arrow(u)$ solves $(A - r_1 I) arrow(u) = arrow(v)_1$, it works.
We can *always* find the rest of our solution space with this method.
]

#example[
Consider the system
$
arrow(x)' = mat(1,-2;2,-3) arrow(x)
$
on $[a,b]$.

The matrix has the repeated eigenvalue and corresponding eigenvector
$
r_1 = -1 "and" arrow(v)_1 = vec(1,1)
$
Find a generalized eigenvector $arrow(u)$ that satisfies
$
(A - r_1 I) arrow(u) = arrow(v)_1
$

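Writing this out with $r_1 = -1$,
$
(A + I) arrow(u) = mat(2,-2;2,-2) arrow(u) = vec(1,1)
$
which reduces to the single equation $2 u_1 - 2 u_2 = 1$.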
Such an $arrow(u)$ could be $vec(1/2, 0)$. Now we can write two linearly independent solutions
$
arrow(x)_1 (t) = e^(-t) vec(1,1), arrow(x)_2 (t) = t e^(-t) vec(1,1) + e^(-t) vec(1/2,0) = e^(-t) vec(t + 1/2, t)
$
And the general solution is
$
arrow(x) (t) = c_1 e^(-t) vec(1,1) + c_2 e^(-t) vec(t + 1/2, t)
$
]

== Nonhomogeneous linear system

Like before, when we have a set of fundamental solutions to the homogeneous

@@ -2647,3 +2647,217 @@ where $f_(X,Y) (x,y) >= 0$ for all possible $(x,y)$ and $integral _y integral _x
$
where any $f(x_1,...,x_n) >= 0$ and we always integrate to unity.
]

= Lecture #datetime(day: 3, month: 3, year: 2025).display()

== Conditioning on an event

Let the event be $A = {X=k}$ for a discrete random variable $X$. Then:

#definition[
Let $X$ be a discrete random variable and $B$ an event with $P(B) > 0$. Then
the *conditional probability mass function* of $X$, given $B$, is the
function $p_(X | B)$ defined as follows for all possible values $k$ of $X$:
$
p_(X|B) (k) = P(X = k | B) = P({X=k} sect B) / P(B)
$
]

#definition[
Let $X$ be a discrete random variable and $B$ an event with $P(B) > 0$. Then
the *conditional expectation* of $X$, given the event $B$, is denoted
$EE[X|B]$ and defined as:
$
EE[X | B] = sum_k k p_(X|B) (k) = sum_k k P(X=k | B)
$
where the sum ranges over all possible values $k$ of $X$.
]

== Law of total probability

#fact[
Let $Omega$ be a sample space, $X$ a discrete random variable on $Omega$, and
$B_1,...,B_n$ a partition of $Omega$ such that each $P(B_i) > 0$. Then the
(unconditional) probability mass function of $X$ can be calculated by
averaging the conditional probability mass functions,
$
p_X (k) = sum_(i=1)^n p_(X|B_i) (k) P(B_i)
$
]

#example[
Let $X$ denote the number of customers that arrive in my store tomorrow. If the day is rainy, $X$ is $"Poisson"(lambda)$ and if the day is dry, $X$ is $"Poisson"(mu)$. Suppose the probability it rains tomorrow is 0.10. Find the probability mass function and expectation of $X$.

Let $B$ be the event that it rains tomorrow. Then $P(B) = 0.10$. The conditional PMFs and conditional expectations are given by
$
p_(X|B) (k) = e^(-lambda) lambda^k / k!, p_(X|B^c) (k) = e^(-mu) mu^k / k!
$
and
$
EE[X|B] = lambda, EE[X|B^c] = mu
$
The unconditional PMF is given by
$
p_X (k) &= P(B) p_(X|B) (k) + P(B^c) p_(X|B^c) (k) \
&= 1 / 10 e^(-lambda) lambda^k / k! + 9 / 10 e^(-mu) mu^k / k!
$
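and the expectation comes from averaging the conditional expectations in the same way,
$
EE[X] = P(B) EE[X|B] + P(B^c) EE[X|B^c] = 1 / 10 lambda + 9 / 10 mu
$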
]

== Conditioning on a random variable

#definition[
Let $X$ and $Y$ be discrete random variables. Then the *conditional probability mass function* of $X$ given $Y=y$ is the following:
$
p_(X|Y) (x|y) = P(X=x | Y = y) = P(X = x, Y = y) / P(Y=y) = (p_(X,Y) (x,y)) / (p_Y (y))
$
]

#definition[
The *conditional expectation* of $X$ given $Y = y$ is
$
EE[X | Y=y] = sum_x x p_(X|Y) (x|y)
$
]

#remark[
These definitions are valid for all $y$ such that $P(Y = y) > 0$.
]

#example[
Suppose an insect lays a random number of eggs, $X$. Each egg hatches with
probability $p$ (write $q = 1 - p$), independently of the others. Let $Y$ be
the number of eggs that hatch.

Then
$
X ~ "Pois"(lambda) \
Y | X = x ~ "Bin"(x,p)
$

What is the marginal distribution of $Y$, $p_Y (y)$?
$
p_(X,Y) (x,y) &= p_X (x) dot p_(Y|X=x) (y) \
&= e^(-lambda) lambda^x / x! dot binom(x,y) p^y q^(x-y) \
&= e^(-lambda) lambda^x / cancel(x!) dot cancel(x!) / (y! (x-y)!) p^y q^(x-y) \
p_Y (y) &= sum_x p_(X,Y) (x,y) = sum_(x=y)^infinity p_(X,Y) (x,y) \
&= e^(-lambda) (lambda p)^y / y! sum_(x=y)^infinity (lambda q)^(x-y) / (x-y)! \
&= e^(-lambda) (lambda p)^y / y! dot e^(lambda q) = e^(-lambda (1-q)) (lambda p)^y / y! = e^(-lambda p) (lambda p)^y / y!
$
That is, $Y ~ "Pois"(lambda p)$.
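In particular $EE[Y] = lambda p$, which we can also get by averaging the conditional expectations:
$
EE[Y] = sum_x EE[Y | X = x] p_X (x) = sum_x x p dot p_X (x) = p EE[X] = lambda p
$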
]

== Continuous marginal/conditional distributions

The conditional probability density function of $X$ given $Y = y$ is given by
$
X | Y = y ~ f_(X|Y) (x|y) = (f_(X,Y) (x,y)) / (f_Y (y))
$
and correspondingly the conditional probability density function of $Y$ given $X = x$ is
$
Y | X = x ~ f_(Y|X) (y|x) = (f_(X,Y) (x,y)) / (f_X (x))
$

For example, take $X = "height"$ and $Y = "weight"$. Conditioning on $Y = 150$ lbs gives the distribution of heights among people whose weight is fixed at $y = 150$.

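As a concrete illustration, suppose $(X,Y)$ is uniform on the triangle ${(x,y) : 0 < x < y < 1}$, so $f_(X,Y) (x,y) = 2$ on that region. Then for $0 < y < 1$,
$
f_Y (y) = integral_0^y 2 dif x = 2y, quad f_(X|Y) (x|y) = 2 / (2y) = 1 / y "for" 0 < x < y
$
so $X | Y = y$ is uniform on $(0, y)$.
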
#definition[
The conditional probability that $X in A$, given $Y = y$, is
$
PP(X in A | Y = y) = integral_A f_(X|Y) (x|y) dif x
$
The conditional expectation of $g(X)$, given $Y = y$, is
$
EE[g(X) | Y = y] = integral_(-infinity)^infinity g(x) f_(X|Y) (x|y) dif x
$
]

#fact[
Let $X$ and $Y$ be jointly continuous. Then
$
f_X (x) = integral_(-infinity)^infinity f_(X|Y) (x|y) f_Y (y) dif y
$
For any function $g$ for which the expectation makes sense,
$
EE[g(X)] = integral^infinity_(-infinity) EE[g(X) | Y = y] f_Y (y) dif y
$
]

#definition[
Let $X$ and $Y$ be discrete or jointly continuous random variables. The
*conditional expectation* of $X$ given $Y$, denoted $EE[X|Y]$, is by
definition the RV $v(Y)$, where the function $v$ is defined by
$v(y) = EE[X | Y = y]$.
]

#remark[
Note the distinction between $EE[X | Y=y]$ and $EE[X|Y]$. The first is a number; the second is a random variable. The possible values (support) of $EE[X|Y]$ are precisely the numbers $EE[X | Y = y]$ as $y$ varies. The terminology:
- $EE[X | Y = y]$ is the expectation of $X$ given $Y = y$
- $EE[X|Y]$ is the expectation of $X$ given $Y$
]

#example[
Suppose $X$ and $Y$ are ${0,1}$-valued random variables with joint PMF
#table(
  columns: 3,
  rows: 3,
  [$X \\ Y$], [0], [1],
  [0], [$3 / 10$], [$2 / 10$],
  [1], [$1 / 10$], [$4 / 10$],
)
Find the conditional PMF and conditional expectation of $X$ given $Y = y$.

The marginal PMFs come from summing the respective rows and columns,
$
p_Y (0) = 4 / 10 "and" p_Y (1) = 6 / 10
$
and
$
p_X (0) = 5 / 10 "and" p_X (1) = 5 / 10
$
The conditional PMF of $X$ given $Y = 0$ is
$
p_(X|Y) (0|0) = (p_(X,Y) (0,0)) / (p_Y (0)) = 3 / 4 \
p_(X|Y) (1|0) = (p_(X,Y) (1,0)) / (p_Y (0)) = 1 / 4
$
Similarly, the conditional PMF of $X$ given $Y = 1$ is
$
p_(X|Y) (0|1) = 1 / 3 \
p_(X|Y) (1|1) = 2 / 3
$

The conditional expectations of $X$ come from computing expectations with the
conditional PMF,
$
EE[X | Y = 0] = 0 dot p_(X|Y) (0|0) + 1 dot p_(X|Y) (1|0) = 0 dot 3 / 4 + 1 / 4 = 1 / 4 \
EE[X | Y = 1] = 0 dot p_(X|Y) (0|1) + 1 dot p_(X|Y) (1|1) = 0 dot 1 / 3 + 2 / 3 = 2 / 3
$
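This makes $EE[X|Y]$ concrete as a random variable: it equals $1/4$ when $Y = 0$ (probability $4/10$) and $2/3$ when $Y = 1$ (probability $6/10$). As a check,
$
EE[EE[X|Y]] = 4 / 10 dot 1 / 4 + 6 / 10 dot 2 / 3 = 1 / 10 + 4 / 10 = 1 / 2 = EE[X]
$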
]

== Sums of independent random variables

We derive distributions for sums of independent random variables. We show how
symmetry can help simplify calculations. If we know the joint distribution of
two random variables $X$ and $Y$, then we know everything about them and can
describe any random variable of the form $g(X,Y)$.

In particular, we focus on $g(X,Y) = X + Y$ in both the discrete and
continuous cases.

Suppose $X$ and $Y$ are discrete with joint PMF $p_(X,Y)$. Then $X + Y$ is also discrete and its PMF can be computed by breaking up the event ${X+Y = n}$
$
{X+Y = n} = union.big_("all possible" k) {X = k, Y = n - k}
$
into the disjoint union of the events ${X=k,Y=n-k}$.

So,
$
p_(X+Y) (n) = P(X + Y = n) = sum_k PP(X=k, Y = n - k) \
= sum_k p_(X,Y) (k,n-k)
$

If $X$ and $Y$ are independent, then we can rewrite this as
$
p_(X+Y) (n) = sum_k p_X (k) p_Y (n-k) \
= sum_l p_X (n-l) p_Y (l) \
= (p_X convolve p_Y) (n)
$
where $p_X convolve p_Y$ is the _convolution_ of $p_X$ and $p_Y$.
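
For example, if $X ~ "Pois"(lambda)$ and $Y ~ "Pois"(mu)$ are independent, then for $n >= 0$
$
p_(X+Y) (n) = sum_(k=0)^n e^(-lambda) lambda^k / k! dot e^(-mu) mu^(n-k) / (n-k)! = e^(-(lambda + mu)) (lambda + mu)^n / n!
$
where the sum collapses by the binomial theorem, so $X + Y ~ "Pois"(lambda + mu)$.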