Computer Science, asked by dhipinsahni2917, 4 months ago

Naive Bayes works based on the idea that the predictor variables in the ML process are __________ of each other.

Answers

Answered by pitamberpatel1678
8

Explanation:

Bayes’ theorem finds many uses in probability theory and statistics. There is only a small chance that you have never heard of this theorem. It has also found its way into the world of machine learning, where it forms the basis of one of the most widely used algorithms. In this article, we will learn all about the Naive Bayes algorithm, along with its variations for different purposes in machine learning.

As you might have guessed, this requires us to view things from a probabilistic point of view. Just as in machine learning, we have attributes, response variables, and predictions or classifications. Using this algorithm, we deal with the probability distributions of the variables in the dataset and predict the probability that the response variable takes a particular value, given the attributes of a new instance. Let’s start by reviewing Bayes’ theorem.
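In classification terms, Bayes’ theorem relates the posterior probability of a class y to the class prior and the likelihood of the observed attributes (here written for attributes x_1, …, x_n):

```latex
P(y \mid x_1, \ldots, x_n) = \frac{P(x_1, \ldots, x_n \mid y)\, P(y)}{P(x_1, \ldots, x_n)}
```

The denominator is the same for every class, so for classification it is enough to compare the numerators across classes.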

Answered by aryansuts01
0

Answer:

Independent

The foundation of Naive Bayes is the notion that each predictor variable used in machine learning is independent of the others.

Explanation:

The Naive Bayes algorithm is a supervised learning method for classification problems that is based on Bayes’ theorem. It is mostly employed in text categorization with large training sets.

It is a classification method built on Bayes’ theorem and predicated on the assumption of predictor independence. Put simply, a Naive Bayes classifier assumes that the presence of one feature in a class is unrelated to the presence of any other feature.
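The independence assumption means the class-conditional likelihood factorizes into a product of per-feature probabilities. A minimal sketch in plain Python, using hypothetical prior and conditional probabilities (the "play tennis" style values below are made up for illustration):

```python
# Naive Bayes scoring under the independence assumption:
# P(class | features) is proportional to P(class) * product of P(feature_i | class)

# Hypothetical probabilities estimated from a toy training set
priors = {"yes": 0.6, "no": 0.4}
likelihoods = {
    "yes": {"outlook=sunny": 0.2, "wind=strong": 0.3},
    "no":  {"outlook=sunny": 0.6, "wind=strong": 0.5},
}

def unnormalized_posterior(cls, features):
    score = priors[cls]
    for f in features:
        score *= likelihoods[cls][f]  # independence: per-feature probabilities multiply
    return score

features = ["outlook=sunny", "wind=strong"]
scores = {c: unnormalized_posterior(c, features) for c in priors}
total = sum(scores.values())
posteriors = {c: s / total for c, s in scores.items()}
```

With these toy numbers, "no" gets the higher posterior (0.4 × 0.6 × 0.5 = 0.12 versus 0.6 × 0.2 × 0.3 = 0.036 before normalization), so the classifier would predict "no".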

The Naive Bayes model is simple to build and especially helpful for very large datasets. In addition to being simple, Naive Bayes is known to hold its own against even highly sophisticated classification methods.

  • It is quick and simple to predict the class of a test data set, and it excels at multi-class prediction.
  • When the independence assumption holds, a Naive Bayes classifier performs better than other models, such as logistic regression, and requires less training data.
  • It performs well with categorical input variables compared with numerical ones; numerical features are assumed to follow a normal distribution.
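The last point, that numerical features are assumed to be normally distributed, is what Gaussian Naive Bayes does: each numeric feature is modeled with a per-class mean and variance. A minimal sketch with a single hypothetical feature and made-up class statistics:

```python
import math

def gaussian_pdf(x, mean, var):
    # Normal density; Gaussian Naive Bayes assumes each numeric
    # feature is normally distributed within each class
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Hypothetical per-class (mean, variance) for one numeric feature
stats = {"A": (1.0, 0.25), "B": (3.0, 0.25)}
priors = {"A": 0.5, "B": 0.5}

def classify(x):
    # Score each class by prior times the Gaussian likelihood of x
    scores = {c: priors[c] * gaussian_pdf(x, *stats[c]) for c in priors}
    return max(scores, key=scores.get)

print(classify(1.2))  # 1.2 is close to class A's mean, so "A" is predicted
```

In practice, the means and variances are estimated from the training data for each class and feature rather than written down by hand.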

