Math, asked by technology57, 4 months ago

If x + \frac{1}{x}, find the value of x^2 + \frac{1}{x^2}.
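The given value of x + \frac{1}{x} has not survived in the question. Assuming the intended condition was x + \frac{1}{x} = k for some stated k (an assumption, since the number is missing here), the standard identity gives:

\left(x + \frac{1}{x}\right)^2 = x^2 + 2 + \frac{1}{x^2} \quad \Longrightarrow \quad x^2 + \frac{1}{x^2} = k^2 - 2.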

Please Don't Spam⚕️


technology57: It's Urgent...
technology57: Please don't Spam...
technology57: Only 4 min. left for Test to end...
technology57: Be fast Please...

Answers

Answered by puttu33331

Answer:


6.1.6 Solved Problems

Problem

Let X,Y and Z be three jointly continuous random variables with joint PDF

f_{XYZ}(x,y,z) = \begin{cases} \frac{1}{3}(x+2y+3z) & 0 \leq x, y, z \leq 1 \\ 0 & \text{otherwise} \end{cases}

Find the joint PDF of X and Y, f_{XY}(x,y).
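As a quick sanity check on this marginalization (a sketch of mine, not part of the text), sympy can integrate z out of the joint PDF directly:

import sympy as sp

x, y, z = sp.symbols('x y z', nonnegative=True)
f_xyz = sp.Rational(1, 3) * (x + 2*y + 3*z)   # joint PDF on the unit cube

# Integrate out z to obtain f_XY(x, y) for 0 <= x, y <= 1
f_xy = sp.integrate(f_xyz, (z, 0, 1))
print(sp.simplify(f_xy))   # prints x/3 + 2*y/3 + 1/2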

Solution

Problem

Let X, Y, and Z be three independent random variables with X \sim N(\mu, \sigma^2), and Y, Z \sim Uniform(0,2). We also know that

E[X^2 Y + XYZ] = 13, \quad E[XY^2 + ZX^2] = 14.

Find μ and σ.
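One way to see how the two conditions determine \mu and \sigma (a sketch using my own expansion of the moments, via independence and the Uniform(0,2) facts EY = EZ = 1 and E[Y^2] = 4/3):

import sympy as sp

mu, sigma = sp.symbols('mu sigma', positive=True)

# E[X^2 Y + XYZ] = E[X^2]E[Y] + E[X]E[Y]E[Z] = (mu^2 + sigma^2) + mu
eq1 = sp.Eq(mu**2 + sigma**2 + mu, 13)
# E[X Y^2 + Z X^2] = E[X]E[Y^2] + E[Z]E[X^2] = (4/3) mu + mu^2 + sigma^2
eq2 = sp.Eq(sp.Rational(4, 3)*mu + mu**2 + sigma**2, 14)

print(sp.solve([eq1, eq2], [mu, sigma]))   # [(3, 1)]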

Solution

Problem

Let X_1, X_2, and X_3 be three i.i.d. Bernoulli(p) random variables and

Y_1 = \max(X_1, X_2), \quad Y_2 = \max(X_1, X_3), \quad Y_3 = \max(X_2, X_3), \quad Y = Y_1 + Y_2 + Y_3.

Find EY and Var(Y).
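A Monte Carlo check of EY and Var(Y) (a sketch; the value p = 0.5 is an arbitrary choice of mine):

import numpy as np

rng = np.random.default_rng(0)
p, n = 0.5, 10**6

X1, X2, X3 = rng.binomial(1, p, size=(3, n))   # three i.i.d. Bernoulli(p) samples
Y = np.maximum(X1, X2) + np.maximum(X1, X3) + np.maximum(X2, X3)

# Each Y_i is Bernoulli(1 - (1-p)^2), so EY = 3(1 - (1-p)^2) = 2.25 for p = 0.5
print(Y.mean(), Y.var())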

Solution

Problem

Let M_X(s) be finite for s \in [-c, c], where c > 0. Show that the MGF of Y = aX + b is given by

M_Y(s) = e^{sb} M_X(as),

and that it is finite for s \in \left[-\frac{c}{|a|}, \frac{c}{|a|}\right].
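A numeric check of the identity, taking X standard normal so that M_X(u) = e^{u^2/2} is available in closed form (my choice for illustration):

import numpy as np

rng = np.random.default_rng(1)
a, b, s = 2.0, 1.0, 0.3                 # arbitrary illustration values
X = rng.standard_normal(10**6)

lhs = np.mean(np.exp(s * (a * X + b)))         # Monte Carlo estimate of M_Y(s)
rhs = np.exp(s * b) * np.exp((a * s)**2 / 2)   # e^{sb} M_X(as) in closed form
print(lhs, rhs)                                # should agree closely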

Solution

Problem

Let Z \sim N(0,1). Find the MGF of Z. Extend your result to X \sim N(\mu, \sigma^2).
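The expected answer, M_Z(s) = e^{s^2/2}, can be eyeballed numerically (a sketch, not part of the text):

import numpy as np

rng = np.random.default_rng(2)
Z = rng.standard_normal(10**6)

for s in (0.5, 1.0, 1.5):
    # Monte Carlo estimate of E[e^{sZ}] versus the closed form e^{s^2/2}
    print(s, np.mean(np.exp(s * Z)), np.exp(s**2 / 2))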

Solution

Problem

Let Y = X_1 + X_2 + \cdots + X_n, where the X_i's are independent and X_i \sim Poisson(\lambda_i). Find the distribution of Y.
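The expected result, Y \sim Poisson(\lambda_1 + \cdots + \lambda_n), can be checked by simulation; a sketch with rates of my own choosing:

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
lams = [0.5, 1.0, 2.5]                            # arbitrary rates
Y = sum(rng.poisson(lam, 10**6) for lam in lams)  # sum of independent Poissons

# Empirical PMF of Y versus the PMF of Poisson(sum(lams))
for k in range(6):
    print(k, np.mean(Y == k), stats.poisson.pmf(k, sum(lams)))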

Solution

Problem

Probability Generating Functions (PGFs): For many important discrete random variables, the range is a subset of {0, 1, 2, ...}. For these random variables, it is usually more useful to work with the probability generating function (PGF), defined as

G_X(z) = E[z^X] = \sum_{n=0}^{\infty} P(X=n) z^n,

for all z \in \mathbb{R} for which G_X(z) is finite.

Show that G_X(z) is always finite for |z| \leq 1.

Show that if X and Y are independent, then

G_{X+Y}(z) = G_X(z) G_Y(z).

Show that

\frac{1}{k!} \left. \frac{d^k G_X(z)}{dz^k} \right|_{z=0} = P(X=k).

Show that

\left. \frac{d^k G_X(z)}{dz^k} \right|_{z=1} = E[X(X-1)(X-2) \cdots (X-k+1)].
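A sympy sketch exercising the last two claims on a concrete PGF, here X \sim Poisson(\lambda) with G_X(z) = e^{\lambda(z-1)} (my example, not the text's):

import sympy as sp

z, lam = sp.symbols('z lambda')
G = sp.exp(lam * (z - 1))   # PGF of Poisson(lambda)
k = 3

# (1/k!) d^k G/dz^k at z = 0 should equal P(X = k) = e^{-lambda} lambda^k / k!
print(sp.simplify(sp.diff(G, z, k).subs(z, 0) / sp.factorial(k)))

# d^k G/dz^k at z = 1 should equal E[X(X-1)...(X-k+1)], which is lambda^k here
print(sp.simplify(sp.diff(G, z, k).subs(z, 1)))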

Solution

Problem

Let M_X(s) be finite for s \in [-c, c], where c > 0. Prove

\lim_{n \to \infty} \left[ M_X\!\left(\frac{s}{n}\right) \right]^n = e^{sEX}.
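A numeric illustration with X \sim Exponential(1), whose MGF M_X(s) = 1/(1-s) (for s < 1) and mean EX = 1 are known in closed form (my choice for illustration):

import numpy as np

s, EX = 0.7, 1.0
M = lambda u: 1.0 / (1.0 - u)   # MGF of Exponential(1), valid for u < 1

for n in (10, 100, 10000):
    # [M_X(s/n)]^n should approach e^{s EX} = e^{0.7}
    print(n, M(s / n)**n, np.exp(s * EX))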

Solution

Problem

Let M_X(s) be finite for s \in [-c, c], where c > 0. Assume EX = 0 and Var(X) = 1. Prove

\lim_{n \to \infty} \left[ M_X\!\left(\frac{s}{\sqrt{n}}\right) \right]^n = e^{s^2/2}.
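The same kind of illustration for this limit, now with a zero-mean, unit-variance choice: X uniform on (-\sqrt{3}, \sqrt{3}) has MGF \sinh(\sqrt{3}u)/(\sqrt{3}u) (again my choice, not the text's):

import numpy as np

a = np.sqrt(3.0)                        # Uniform(-a, a): mean 0, variance a^2/3 = 1
M = lambda u: np.sinh(a * u) / (a * u)  # its MGF, for u != 0
s = 0.8

for n in (10, 100, 10000):
    # [M_X(s/sqrt(n))]^n should approach e^{s^2/2}
    print(n, M(s / np.sqrt(n))**n, np.exp(s**2 / 2))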

Note: From this, we can prove the Central Limit Theorem (CLT) which is discussed in Section 7.1.

Solution

Problem

We can define the MGF for jointly distributed random variables as well. For example, for two random variables (X, Y), the MGF is defined by

M_{XY}(s,t) = E[e^{sX+tY}].

Similar to the MGF of a single random variable, the MGF of the joint distribution uniquely determines the joint distribution. Let X and Y be two jointly normal random variables with EX = \mu_X, EY = \mu_Y, Var(X) = \sigma_X^2, Var(Y) = \sigma_Y^2, and \rho(X,Y) = \rho. Find M_{XY}(s,t).
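The answer can be checked against simulation. A sketch assuming the standard bivariate-normal form M_{XY}(s,t) = \exp(s\mu_X + t\mu_Y + (s^2\sigma_X^2 + 2\rho st\sigma_X\sigma_Y + t^2\sigma_Y^2)/2), with parameters of my choosing:

import numpy as np

rng = np.random.default_rng(4)
muX, muY, sX, sY, rho = 1.0, -0.5, 1.0, 2.0, 0.3   # arbitrary parameters
cov = [[sX**2, rho*sX*sY], [rho*sX*sY, sY**2]]

X, Y = rng.multivariate_normal([muX, muY], cov, 10**6).T
s, t = 0.2, 0.1

mc = np.mean(np.exp(s*X + t*Y))   # Monte Carlo estimate of M_XY(s, t)
closed = np.exp(s*muX + t*muY
                + (s**2*sX**2 + 2*rho*s*t*sX*sY + t**2*sY**2) / 2)
print(mc, closed)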

Solution

Problem

Let X = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} be a normal random vector with the following mean vector and covariance matrix:

m = \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \quad C = \begin{bmatrix} 1 & -1 \\ -1 & 2 \end{bmatrix}.

Let also

A = \begin{bmatrix} 1 & 2 \\ 2 & 1 \\ 1 & 1 \end{bmatrix}, \quad b = \begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix}, \quad Y = \begin{bmatrix} Y_1 \\ Y_2 \\ Y_3 \end{bmatrix} = AX + b.

Find P(0 \leq X_2 \leq 1).

Find the expected value vector of Y, m_Y = EY.

Find the covariance matrix of Y, C_Y.

Find P(Y_3 \leq 4).
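A numpy/scipy sketch of the standard computations for the four parts: X_2 \sim N(1, 2) directly from m and C, while Y = AX + b gives m_Y = Am + b and C_Y = ACA^T, and Y_3 is normal with the mean and variance read off from those (a sketch of the usual approach, not the book's worked solution):

import numpy as np
from scipy.stats import norm

m = np.array([0.0, 1.0])
C = np.array([[1.0, -1.0], [-1.0, 2.0]])
A = np.array([[1.0, 2.0], [2.0, 1.0], [1.0, 1.0]])
b = np.array([0.0, 1.0, 2.0])

# X2 ~ N(1, 2): standardize with sd = sqrt(2)
print(norm.cdf((1 - 1) / np.sqrt(2)) - norm.cdf((0 - 1) / np.sqrt(2)))

mY = A @ m + b        # mean vector of Y
CY = A @ C @ A.T      # covariance matrix of Y
print(mY)
print(CY)

# Y3 ~ N(mY[2], CY[2, 2])
print(norm.cdf((4 - mY[2]) / np.sqrt(CY[2, 2])))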

Solution

Problem

(Whitening/decorrelating transformation) Let X be an n-dimensional zero-mean random vector. Since C_X is a real symmetric matrix, we conclude that it can be diagonalized. That is, there exists an n-by-n matrix Q such that

Q Q^T = I \text{ (the identity matrix)}, \quad C_X = Q D Q^T,

where D is a diagonal matrix:

D = \begin{bmatrix} d_{11} & 0 & \cdots & 0 \\ 0 & d_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_{nn} \end{bmatrix}.

Now suppose we define a new random vector Y as Y = Q^T X; thus

X = QY.

Show that Y has a diagonal covariance matrix, and conclude that the components of Y are uncorrelated, i.e., Cov(Y_i, Y_j) = 0 if i \neq j.
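A short numpy sketch of the decorrelating transform on a small example covariance (mine, for illustration):

import numpy as np

rng = np.random.default_rng(5)
CX = np.array([[2.0, 0.8], [0.8, 1.0]])   # example covariance matrix

# Real symmetric => orthogonal eigendecomposition CX = Q D Q^T
d, Q = np.linalg.eigh(CX)

# Sample zero-mean X with covariance CX, then decorrelate: Y = Q^T X
L = np.linalg.cholesky(CX)
X = L @ rng.standard_normal((2, 10**6))
Y = Q.T @ X

print(np.cov(Y))   # approximately diagonal, entries close to the eigenvalues d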

Solution



Introduction to Probability by Hossein Pishro-Nik is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.
