1. Give at least three examples of NLP applications.
2. Compare stemming with lemmatization.
3. Compare sentence tokenization with word tokenization.
Answers
1. Email filters
Smart assistants
Search results
Predictive text
2. Stemming simply removes (stems) the last few characters of a word, which often produces incorrect spellings or meanings. Lemmatization considers the context and converts the word to its meaningful base form, which is called the lemma.
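The contrast above can be sketched in plain Python. This is a toy illustration, not a real stemmer or lemmatizer: the suffix rules and the lemma dictionary below are made up for the example (real systems use algorithms such as the Porter stemmer and lexical resources such as WordNet).

```python
def stem(word):
    """Crude stemmer: blindly strip a common suffix, with no
    knowledge of the word's meaning or part of speech."""
    for suffix in ("ies", "ing", "ed", "es", "s"):
        if word.endswith(suffix):
            return word[: -len(suffix)]
    return word


# Toy lemma dictionary standing in for a real lexical resource.
LEMMAS = {"studies": "study", "better": "good", "ran": "run"}


def lemmatize(word):
    """Dictionary-based lemmatizer: map a word to its base form (lemma)."""
    return LEMMAS.get(word, word)


for w in ("studies", "better", "ran"):
    print(w, "-> stem:", stem(w), "| lemma:", lemmatize(w))
```

Note how the stemmer turns "studies" into the non-word "stud" by chopping a suffix, while the lemmatizer maps it to the valid base form "study"; and an irregular form like "ran" defeats suffix stripping entirely but is handled by the lemma lookup.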
3. Sentence tokenization is the process of splitting text into individual sentences, typically by detecting sentence-ending punctuation while handling exceptions such as abbreviations.
Word tokenization splits a phrase, sentence, paragraph, or an entire text document into smaller units, such as individual words or terms. Each of these smaller units is called a token; tokens can be words, numbers, or punctuation marks.
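Both kinds of tokenization can be sketched with simple regular expressions. This is a naive sketch that ignores the hard cases (abbreviations, decimal numbers, contractions) that real tokenizers handle; the function names are chosen here for illustration.

```python
import re


def sent_tokenize(text):
    """Naive sentence tokenizer: split after terminal punctuation
    (., !, ?) followed by whitespace."""
    return [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]


def word_tokenize(sentence):
    """Naive word tokenizer: runs of word characters, or any single
    non-space punctuation character, each become one token."""
    return re.findall(r"\w+|[^\w\s]", sentence)


text = "NLP is fun. Tokenization splits text!"
print(sent_tokenize(text))
print(word_tokenize(sent_tokenize(text)[0]))
```

The example text yields two sentence tokens, and the first sentence breaks into the word tokens "NLP", "is", "fun", and the punctuation token ".".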