Computer Science, asked by madhavsharma2407, 7 months ago

"Tokenization" is when the
characters are translated into a
sequence of tokens, classifying
each by it's
category.
O Lexicon
O Lexical
O Logical
O Syntactical
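For context, this character-to-token step is what a lexical analyzer (lexer) does: it scans the raw character stream, groups characters into tokens, and labels each token with a category such as identifier, number, or operator, before syntactic analysis (parsing) takes over. Below is a minimal illustrative sketch in Python; the token categories and regular expressions are assumptions chosen only for this example, not part of the original question.

import re

# Hypothetical token categories and patterns, for illustration only.
TOKEN_SPEC = [
    ("NUMBER",   r"\d+"),
    ("IDENT",    r"[A-Za-z_]\w*"),
    ("OPERATOR", r"[+\-*/=]"),
    ("SKIP",     r"\s+"),        # whitespace is matched but not emitted
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Translate a character stream into (category, lexeme) pairs."""
    tokens = []
    for match in MASTER_RE.finditer(source):
        category = match.lastgroup
        if category != "SKIP":
            tokens.append((category, match.group()))
    return tokens

print(tokenize("total = count + 42"))
# [('IDENT', 'total'), ('OPERATOR', '='), ('IDENT', 'count'),
#  ('OPERATOR', '+'), ('NUMBER', '42')]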

Answers

Answered by swadhaszar426

Answer:

I think it is Logical, but I am not sure.

Answered by Krishnasahu2020

Answer:

Logical is the answer.
