Physics, asked by Sonamsinha4826, 1 year ago

A classifier that can compute using numeric as well as categorical values is called

Answers

Answered by shoaibahmad131

For numerical data there are many choices, starting from basic decision trees, naive Bayes, SVM, logistic regression, ensemble methods (bagging, boosting), Random Forest, multi-layer perceptrons, etc.

For categorical data: naive Bayes, decision trees and their ensembles (including Random Forest), and minimum-distance or KNN-type classifiers with a cost function other than Euclidean distance, e.g. Hamming distance.
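As a minimal sketch of that last option, here is a nearest-neighbour classifier for purely categorical data using scikit-learn (my own illustration, not part of the original answer); the data, column values, and choice of library are assumptions. The categories are ordinal-encoded so that the Hamming metric simply counts mismatched attributes.

```python
# Hypothetical toy data: two categorical attributes, two classes.
from sklearn.preprocessing import OrdinalEncoder
from sklearn.neighbors import KNeighborsClassifier

X_raw = [["red", "small"], ["red", "large"], ["blue", "small"], ["blue", "large"]]
y = [0, 0, 1, 1]

enc = OrdinalEncoder()
X = enc.fit_transform(X_raw)   # map each category to an integer code

# metric="hamming" replaces the default Euclidean distance:
# distance = fraction of attributes on which two records disagree.
knn = KNeighborsClassifier(n_neighbors=3, metric="hamming")
knn.fit(X, y)
print(knn.predict(enc.transform([["blue", "small"]])))
```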

For 'mixed data', one option is to go with decision trees; another possibility is naive Bayes, where you model the numeric attributes with a Gaussian distribution or kernel density estimation. You can also employ a minimum-distance or KNN-based approach; however, the cost function must be able to handle both types of data together. If these approaches don't work, then try ensemble techniques: bagging with decision trees, or else Random Forest, which combines bagging and random subspaces. With mixed data the choices are limited, and you need to be cautious and creative with your choices. (Taken from Shehroz Khan's answer to "Which algorithm fits best for categorical and continuous independent variables with categorical response in Machine Learning?")
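Below is a minimal sketch of the decision-tree option for mixed data, assuming scikit-learn and pandas; the column names, values, and pipeline layout are made up for illustration and are not from the quoted answer. The categorical column is one-hot encoded, the numeric columns pass through unchanged, and everything feeds a single decision tree.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder
from sklearn.tree import DecisionTreeClassifier

# Hypothetical mixed data set: two numeric attributes, one categorical,
# and a binary class label.
df = pd.DataFrame({
    "age":    [25, 40, 35, 50, 23, 44],
    "income": [30_000, 80_000, 52_000, 90_000, 28_000, 61_000],
    "city":   ["NY", "SF", "NY", "LA", "SF", "LA"],
    "bought": [0, 1, 0, 1, 0, 1],
})
X, y = df.drop(columns="bought"), df["bought"]

# One-hot encode the categorical column; leave numeric columns as they are.
pre = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore"), ["city"])],
    remainder="passthrough",
)

clf = Pipeline([("pre", pre), ("tree", DecisionTreeClassifier(max_depth=3))])
clf.fit(X, y)
print(clf.predict(X.head(2)))
```

The same preprocessing pipeline works unchanged if the tree is swapped for a RandomForestClassifier, which is the bagging-plus-random-subspace option mentioned above.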

Answered by zunnairailyas177

A classifier that can compute using numeric as well as categorical values is called a decision tree classifier.

The decision tree classifier is one of the oldest and most intuitive classification algorithms in existence. In machine learning, decision trees have been used for many years as effective and easily comprehensible data classifiers (contrast that with the many black-box classifiers in existence).

Classification is a systematic approach to building classification models from an input data set. For example, decision tree classifiers, rule-based classifiers, neural networks, support vector machines, and naive Bayes classifiers are different techniques for solving a classification problem. Each technique applies a learning algorithm to identify the model that best fits the relationship between the attribute set and the class label of the input data. Therefore, a key objective of the learning algorithm is to build a predictive model that accurately predicts the class labels of previously unseen records.

The decision tree classifier is a simple and widely used classification technique. It applies a straightforward idea to solve the classification problem: it poses a series of carefully crafted questions about the attributes of the test record.
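To make "a series of carefully crafted questions" concrete, here is a minimal sketch, assuming scikit-learn and its built-in iris data set (neither of which is mentioned in the answer above): the fitted tree is just a chain of threshold tests on the attributes, which export_text prints as readable rules.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(iris.data, iris.target)

# Each printed line is one yes/no question the tree asks about a test record.
print(export_text(tree, feature_names=list(iris.feature_names)))
```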

