How do you find a joint probability distribution function?
Answer:
The joint probability mass function of two discrete random variables $X$ and $Y$ is:
$$P(X=x \text{ and } Y=y) = P(Y=y \mid X=x)\cdot P(X=x) = P(X=x \mid Y=y)\cdot P(Y=y),$$
where $P(Y=y \mid X=x)$ is the probability of $Y=y$ given that $X=x$.
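As a concrete illustration, here is a minimal Python sketch (the numbers are made up) that builds a joint pmf table from a marginal $P(X)$ and a conditional $P(Y \mid X)$, then checks that the other factorization $P(X \mid Y)\,P(Y)$ recovers the same table:

```python
import numpy as np

p_x = np.array([0.4, 0.6])                 # P(X = x) for x in {0, 1}
p_y_given_x = np.array([[0.7, 0.3],        # P(Y = y | X = 0)
                        [0.2, 0.8]])       # P(Y = y | X = 1)

# Joint pmf: P(X = x and Y = y) = P(Y = y | X = x) * P(X = x)
joint = p_y_given_x * p_x[:, None]

# Recover the other factorization: P(X = x | Y = y) * P(Y = y)
p_y = joint.sum(axis=0)                    # marginal of Y
p_x_given_y = joint / p_y[None, :]         # Bayes' rule
assert np.allclose(joint, p_x_given_y * p_y[None, :])
print(joint)
```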
The generalization of the preceding two-variable case is the joint probability distribution of $n$ discrete random variables $X_1, X_2, \dots, X_n$, which is:
$$\begin{aligned} P(X_1=x_1,\dots,X_n=x_n) &= P(X_1=x_1) \times P(X_2=x_2 \mid X_1=x_1) \\ &\quad \times P(X_3=x_3 \mid X_1=x_1, X_2=x_2) \\ &\quad \times \cdots \\ &\quad \times P(X_n=x_n \mid X_1=x_1, X_2=x_2, \dots, X_{n-1}=x_{n-1}). \end{aligned}$$
This identity is known as the chain rule of probability.
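Here is a small sketch of the chain rule with three hypothetical binary variables: the joint pmf is assembled as the product of $P(X_1)$, $P(X_2 \mid X_1)$, and $P(X_3 \mid X_1, X_2)$.

```python
import numpy as np

p1 = np.array([0.5, 0.5])                        # P(X1)
p2_given_1 = np.array([[0.9, 0.1],               # P(X2 | X1 = 0)
                       [0.4, 0.6]])              # P(X2 | X1 = 1)
# P(X3 | X1, X2), indexed as [x1, x2, x3]
p3_given_12 = np.array([[[0.8, 0.2], [0.5, 0.5]],
                        [[0.3, 0.7], [0.1, 0.9]]])

# Chain rule: joint[x1, x2, x3] = P(X1=x1) * P(X2=x2 | X1=x1) * P(X3=x3 | X1=x1, X2=x2)
joint = p1[:, None, None] * p2_given_1[:, :, None] * p3_given_12
assert np.isclose(joint.sum(), 1.0)              # a valid pmf sums to 1
```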
Since these are probabilities, we have in the two-variable case
$$\sum_i \sum_j P(X=x_i \text{ and } Y=y_j) = 1,$$
which generalizes for $n$ discrete random variables $X_1, X_2, \dots, X_n$ to
$$\sum_i \sum_j \cdots \sum_k P(X_1=x_{1i}, X_2=x_{2j}, \dots, X_n=x_{nk}) = 1.$$
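A quick numerical check of this normalization condition, using an arbitrary 2×3 joint pmf table:

```python
import numpy as np

joint = np.array([[0.10, 0.20, 0.10],
                  [0.25, 0.15, 0.20]])   # rows: values x_i, columns: values y_j
assert np.isclose(joint.sum(), 1.0)      # the double sum over all (x_i, y_j) is 1
```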
Continuous case
The joint probability density function $f_{X,Y}(x, y)$ for two continuous random variables is equal to:
$$f_{X,Y}(x,y) = f_{Y \mid X}(y \mid x)\, f_X(x) = f_{X \mid Y}(x \mid y)\, f_Y(y),$$
where $f_{Y \mid X}(y \mid x)$ and $f_{X \mid Y}(x \mid y)$ are the conditional distributions of $Y$ given $X = x$ and of $X$ given $Y = y$ respectively, and $f_X(x)$ and $f_Y(y)$ are the marginal distributions for $X$ and $Y$ respectively.
Again, since these are probability distributions, one has
$$\int_x \int_y f_{X,Y}(x,y)\; dy\; dx = 1.$$
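The sketch below illustrates both points for an assumed pair of densities: $X \sim N(0,1)$ and $Y \mid X = x \sim N(x, 1)$. For this choice the marginal of $Y$ works out to $N(0, 2)$ and $X \mid Y = y$ to $N(y/2, 1/2)$, so the two factorizations can be compared directly, and the double integral of the joint density is numerically 1:

```python
from scipy import integrate
from scipy.stats import norm

def f_joint(x, y):
    # f_{Y|X}(y|x) * f_X(x), with X ~ N(0,1) and Y|X=x ~ N(x, 1)
    return norm.pdf(y, loc=x, scale=1.0) * norm.pdf(x, loc=0.0, scale=1.0)

def f_joint_alt(x, y):
    # f_{X|Y}(x|y) * f_Y(y), with Y ~ N(0, 2) and X|Y=y ~ N(y/2, 1/2)
    return norm.pdf(x, loc=y / 2, scale=0.5 ** 0.5) * norm.pdf(y, loc=0.0, scale=2 ** 0.5)

print(f_joint(0.3, -1.2), f_joint_alt(0.3, -1.2))    # same value

# Normalization: the double integral over the plane is 1
# (the Gaussian tails beyond +/-8 are negligible).
total, _ = integrate.dblquad(lambda y, x: f_joint(x, y), -8, 8, -8, 8)
print(total)    # ~1.0
```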
Mixed case
The "mixed joint density" may be defined where one or more random variables are continuous and the other random variables are discrete, or vice versa. With one variable of each type we have
$$f_{X,Y}(x,y) = f_{X \mid Y}(x \mid y)\, P(Y=y) = P(Y=y \mid X=x)\, f_X(x).$$
One example of a situation in which one may wish to find the cumulative distribution of one random variable which is continuous and another which is discrete arises when using logistic regression to predict the probability of a binary outcome $Y$ conditional on the value of a continuously distributed variable $X$. One must use the "mixed" joint density when finding the cumulative distribution of this binary outcome, because the input variables $(X, Y)$ were initially defined in such a way that one could not collectively assign them either a probability density function or a probability mass function. Formally, $f_{X,Y}(x, y)$ is the probability density function of $(X, Y)$ with respect to the product measure on the respective supports of $X$ and $Y$. Either of these two decompositions can then be used to recover the joint cumulative distribution function:
$$F_{X,Y}(x,y) = \sum_{t \le y} \int_{-\infty}^{x} f_{X,Y}(s,t)\; ds.$$
The definition generalizes to a mixture of arbitrary numbers of discrete and continuous random variables.
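Below is a sketch of the mixed case in exactly this logistic-regression setting (the coefficients `a` and `b` are made up for illustration): $X \sim N(0,1)$ is continuous, $Y \in \{0, 1\}$ is binary with $P(Y=1 \mid X=x)$ following a logistic curve, and the joint CDF is computed by summing over $t \le y$ and integrating over $s \le x$ as in the formula above.

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

a, b = -0.5, 2.0                                   # hypothetical logistic coefficients

def p_y_given_x(y, x):
    p1 = 1.0 / (1.0 + np.exp(-(a + b * x)))        # P(Y = 1 | X = x)
    return p1 if y == 1 else 1.0 - p1

def f_mixed(x, y):
    # Mixed joint density: P(Y = y | X = x) * f_X(x)
    return p_y_given_x(y, x) * norm.pdf(x)

def F(x, y):
    # Joint CDF: sum over discrete t <= y, integrate over continuous s <= x
    return sum(integrate.quad(lambda s, t=t: f_mixed(s, t), -np.inf, x)[0]
               for t in (0, 1) if t <= y)

print(F(np.inf, 1))    # total probability, ~1.0
print(F(0.0, 0))       # P(X <= 0 and Y = 0)
```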
Please mark this answer as Brainliest!