Naive Bayes Classification uses probability to classify data points.
Naive Bayes is easier to understand once the concepts of conditional probability and Bayes' rule are clear. Naive Bayes makes a strong assumption that the features are independent of one another.
Conditional probability is the probability that an event will happen given that some other event has already happened, i.e. it is calculated on the basis of past event occurrences. For example:
- Event A is that it is raining outside, and it has a 0.4 (40%) chance of raining today.
- Event B is that you will play outside, and that has a probability of 0.6 (60%).
The conditional probability of interest is the probability that you will play given that it is raining, which Bayes' rule expresses as:

P(Play | Rain) = P(Rain | Play) * P(Play) / P(Rain)
Thus the probability of an event is calculated from what is already known about related past events.
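As a quick check of the arithmetic, here is the rain/play example in Python. P(Rain) = 0.4 and P(Play) = 0.6 come from the text above; the value for P(Rain | Play) is a hypothetical one chosen purely for illustration.

```python
# Bayes' rule applied to the rain/play example above.
# P(Rain) and P(Play) come from the text; P(Rain | Play) is assumed.
p_rain = 0.4               # P(Rain), from the text
p_play = 0.6               # P(Play), from the text
p_rain_given_play = 0.2    # P(Rain | Play), hypothetical value for illustration

# P(Play | Rain) = P(Rain | Play) * P(Play) / P(Rain)
p_play_given_rain = p_rain_given_play * p_play / p_rain
print(p_play_given_rain)   # ~0.3
```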
Naive Bayes is well suited to multiclass prediction, and it performs well with both categorical and numeric data.
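To make the multiclass point concrete, here is a minimal sketch using scikit-learn's GaussianNB on numeric features; the toy data and class labels below are invented for illustration. (For purely categorical features, scikit-learn also provides CategoricalNB.)

```python
# Minimal multiclass Naive Bayes sketch; the toy data is made up.
from sklearn.naive_bayes import GaussianNB

X = [[5.1, 3.5], [4.9, 3.0], [6.2, 2.8],
     [5.9, 3.2], [6.7, 3.1], [6.3, 2.5]]
y = [0, 0, 1, 1, 2, 2]  # three class labels, so this is multiclass

model = GaussianNB()
model.fit(X, y)

print(model.predict([[6.0, 3.0]]))        # predicted class label
print(model.predict_proba([[6.0, 3.0]]))  # one probability per class
```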
However, the strong independence assumption that Naive Bayes makes is also its main weakness, since in real life we hardly ever find features that are completely independent of each other.
Naive Bayes finds application in spam filtering, recommender systems, sentiment analysis, and more.
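As one of those applications, below is a brief sketch of a tiny spam filter built with scikit-learn's CountVectorizer and MultinomialNB; the example messages and labels are made up for illustration.

```python
# Toy spam filter sketch; the messages and labels are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

messages = [
    "win a free prize now",
    "limited offer claim your reward",
    "meeting rescheduled to friday",
    "lunch at noon tomorrow?",
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = not spam

vectorizer = CountVectorizer()          # word-count features
X = vectorizer.fit_transform(messages)

clf = MultinomialNB()
clf.fit(X, labels)

test = vectorizer.transform(["claim your free prize"])
print(clf.predict(test))  # expected: [1], i.e. spam
```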