Computer Science, asked by SaEram, 10 months ago

Consider the case where two classes follow Gaussian distribution which are centered at (−2, 4) and (2, 8) and have identity covariance matrix. Which of the following is the separating decision boundary?

Answers

Answered by namoarihantanam

Can you specify which topic this is? Is it in Python?

Answered by KajalBarad

Answer:

For the case where two classes follow Gaussian distributions centered at (−2, 4) and (2, 8) with identity covariance matrices, the LDA decision boundary, assuming equal priors, is x + y = 6.

Given:

The class means are (−2, 4) and (2, 8).

So:

LDA arises when the covariance matrices of the K classes are assumed to be equal; that is, rather than each class having its own covariance matrix, all classes share the same one. With identity covariance and equal priors, the boundary is the perpendicular bisector of the segment joining the two means: it passes through the midpoint (0, 6) and is normal to the direction μ₂ − μ₁ = (4, 4), which gives 4x + 4(y − 6) = 0, i.e. x + y = 6.
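As a check, the boundary can be computed directly from the standard LDA formula for two classes with shared covariance Σ and equal priors: the boundary is w·x + b = 0 with w = Σ⁻¹(μ₂ − μ₁) and b = −½(μ₂ − μ₁)ᵀΣ⁻¹(μ₂ + μ₁). A minimal sketch with NumPy, using the means given in the question:

```python
import numpy as np

# Class means (given) and shared identity covariance
mu1 = np.array([-2.0, 4.0])
mu2 = np.array([2.0, 8.0])
sigma_inv = np.linalg.inv(np.eye(2))  # identity, so the inverse is itself

# Equal-prior LDA boundary: w . x + b = 0, where
#   w = Sigma^{-1} (mu2 - mu1)
#   b = -0.5 * (mu2 - mu1)^T Sigma^{-1} (mu2 + mu1)
w = sigma_inv @ (mu2 - mu1)
b = -0.5 * (mu2 - mu1) @ sigma_inv @ (mu2 + mu1)

print(w, b)  # w = [4. 4.], b = -24.0  ->  4x + 4y - 24 = 0  ->  x + y = 6
```

Dividing 4x + 4y − 24 = 0 by 4 recovers the answer x + y = 6.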

The classifier is linear if there exists a function H(x) = β₀ + βᵀx such that h(x) = I(H(x) > 0); H(x) is called a linear discriminant function. The decision boundary is then the set {x ∈ ℝᵈ : H(x) = 0}, which is a (d − 1)-dimensional hyperplane within the d-dimensional input space X.
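To make the discriminant concrete, here is a small sketch using the coefficients that follow from this problem (β = (4, 4) and β₀ = −24, i.e. the boundary 4x + 4y − 24 = 0): each class mean falls on the correct side, and the midpoint (0, 6) lies exactly on the hyperplane.

```python
import numpy as np

# Linear discriminant H(x) = beta0 + beta^T x, with coefficients taken
# from the LDA solution for this problem (identity covariance, equal priors)
beta = np.array([4.0, 4.0])
beta0 = -24.0

def H(x):
    return beta0 + beta @ np.asarray(x, dtype=float)

def h(x):
    # Indicator classifier h(x) = I(H(x) > 0): 1 for class 2, 0 for class 1
    return int(H(x) > 0)

print(h([-2, 4]))  # mean of class 1 -> 0  (H = -16 < 0)
print(h([2, 8]))   # mean of class 2 -> 1  (H = 16 > 0)
print(H([0, 6]))   # midpoint of the means -> 0.0 (on the boundary)
```

The boundary {x : H(x) = 0} here is the line x + y = 6, a 1-dimensional hyperplane in the 2-dimensional input space, matching the general (d − 1)-dimensional statement above.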
