Just record the probability of the respective events.
For example, the probability that $X$ is 0 and $Y$ is 1 is $p_(X,Y) (0,1) = 2 / 8$.
2. Suppose we are playing a game where each tails earns 3 dollars, and if the first flip is tails, each reward is doubled. What is the expected reward? (A numerical check is sketched after this example.)
Note that we can record this reward as the function $g(x,y) = 3(1+x)y$. Then the expected reward is just
$
EE[g(X,Y)] = sum_(k=0)^1 sum_(l=0)^3 g(k,l) p_(X,Y) (k,l) = sum_(k=0)^1 sum_(l=0)^3 3(1+k) l p_(X,Y) (k,l) \
= sum_(l=0)^3 3l p_(X,Y) (0,l) + sum_(l=0)^3 6l p_(X,Y) (1,l) = 7 + 1 / 2
$
]
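As a sanity check on the arithmetic, here is a minimal Python sketch. It assumes (based on the joint PMF values above, not stated explicitly here) that the experiment is three fair coin flips, with $X$ the indicator that the first flip is tails and $Y$ the total number of tails:
```python
from itertools import product

# Assumed setup (from the earlier parts of this example): three fair coin flips,
# X = 1 if the first flip is tails else 0, Y = total number of tails.
outcomes = list(product("HT", repeat=3))   # 8 equally likely outcomes

expected_reward = 0.0
for flips in outcomes:
    x = 1 if flips[0] == "T" else 0
    y = flips.count("T")
    g = 3 * (1 + x) * y                    # 3 dollars per tail, doubled if first flip is tails
    expected_reward += g / len(outcomes)   # each outcome has probability 1/8

print(expected_reward)   # 7.5 = 7 + 1/2
```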
== Discrete marginal distributions
Consider the joint PMF $p(x,y)$ for a random vector $vec(X,Y)$. The marginal PMF for $X$ is found by
$
P(X=x) &= p_X (x) = P(union.big_("all" y) {X=x,Y=y}) \
&= sum_("all" y) P(X = x, Y = y) \
&= sum_("all" y) p(x,y)
$
We sum over all possible values of the other random variable.
In general, for an $n$-vector, we do the same thing, summing over all possible
combinations of values of the other variables.
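As an illustration (not from the text), here is a minimal sketch of reading marginals off a joint PMF table by summing rows and columns, using the joint PMF assumed for the coin-flip example above:
```python
import numpy as np

# Assumed joint PMF of (X, Y) from the coin-flip example:
# rows are x = 0, 1; columns are y = 0, 1, 2, 3.
p_xy = np.array([
    [1/8, 2/8, 1/8, 0/8],   # x = 0 (first flip heads)
    [0/8, 1/8, 2/8, 1/8],   # x = 1 (first flip tails)
])

p_x = p_xy.sum(axis=1)   # marginal of X: sum over all y -> [0.5, 0.5]
p_y = p_xy.sum(axis=0)   # marginal of Y: sum over all x -> [0.125, 0.375, 0.375, 0.125]

print(p_x, p_y)
```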
== Multinomial distribution
#definition[
Let $n$ and $r$ be positive integers and let $p_1, p_2, ..., p_r$ be positive reals such that $p_1 + p_2 + dots.c + p_r = 1$. Then the random vector $(X_1, ..., X_r)$ has the *multinomial distribution* with parameters $n$, $r$, and $p_1,p_2,...,p_r$ if its possible values are the integer vectors $(k_1,k_2,...,k_r)$ such that $k_j >= 0$ and $k_1 + k_2 + dots.c + k_r = n$, and the joint PMF is given by
$
P(X_1 = k_1, X_2 = k_2, ..., X_r = k_r) = (n !) / (k_1 ! k_2 ! dots.c k_r !) p_1^(k_1) p_2^(k_2) dots.c p_r^(k_r)
$
$
Abbreviate this by $(X_1,...,X_r) ~ "Mult"(n,r,p_1,...,p_r)$.
]
#example[
Suppose an urn contains 1 green, 2 red, and 3 yellow balls. We sample a ball
with replacement 10 times. Find the probability that green appeared 3 times,
red twice, and yellow 5 times.
Let $X_g, X_r, X_y$ be the number of green, red and yellow balls in the
sample. Then
$
(X_g,X_r,X_y) ~ "Mult"(n=10,r=3,1 / 6,2 / 6,3 / 6)
$
and
$
PP(X_g = 3, X_r = 2, X_y = 5) = 10! / (3!2!5!) (1 / 6)^3 (2 / 6)^2 (3 / 6)^5 \
approx 0.0405
$
]
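A minimal sketch (standard library only) that recomputes this probability directly from the multinomial PMF formula:
```python
from math import factorial

def multinomial_pmf(counts, probs):
    # P(X_1 = k_1, ..., X_r = k_r) for Mult(n, r, p_1, ..., p_r), with n = sum of counts
    coeff = factorial(sum(counts))
    for k in counts:
        coeff //= factorial(k)
    result = float(coeff)
    for k, p in zip(counts, probs):
        result *= p ** k
    return result

print(multinomial_pmf([3, 2, 5], [1 / 6, 2 / 6, 3 / 6]))   # ~0.0405
```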
== Jointly continuous
Let the pair of continuous random variables $(X,Y)$, defined on a common sample space $Omega$, have joint PDF $f_(X,Y) (x,y)$, where $f$ is a function on $RR^2$ such that for any subset $B subset.eq RR^2$,
$
P((X,Y) in B) = integral.double_B f(x,y) dif x dif y
$
where $f_(X,Y) (x,y) >= 0$ for all $(x,y)$ and $integral.double_(RR^2) f(x,y) dif x dif y = 1$.
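For a concrete check, here is a short sketch using an illustrative density (an assumption, not one from the text): $f(x,y) = 6 e^(-2x - 3y)$ for $x, y >= 0$, the joint PDF of two independent exponentials, which integrates to 1 and gives $P((X,Y) in B)$ for a rectangle $B$ by a double integral:
```python
from math import exp
from scipy.integrate import dblquad

# Illustrative joint density (an assumption for this sketch, not from the text):
# independent exponentials with rates 2 and 3, f(x, y) = 6 exp(-2x - 3y), x, y >= 0.
def f(y, x):                 # dblquad integrates func(y, x)
    return 6 * exp(-2 * x - 3 * y)

# Total mass over (a generous truncation of) the support -- should be ~1.
total, _ = dblquad(f, 0, 20, lambda x: 0, lambda x: 20)

# P((X, Y) in B) for the rectangle B = [0, 1] x [0, 1].
prob, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)

print(total)   # ~1.0
print(prob)    # (1 - e^-2)(1 - e^-3) ~ 0.82
```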
#definition[
In general, random variables $X_1, X_2, ..., X_n$ are *jointly continuous* if there exists a joint density function $f$ on $RR^n$ such that for any subset $B subset.eq RR^n$,
$
P((X_1,...,X_n) in B) = underbrace(integral dots.c integral, B) f(x_1, ..., x_n) dif x_1 dots.c dif x_n
$
where $f(x_1,...,x_n) >= 0$ everywhere and $f$ integrates to 1 over $RR^n$.
]