What is lexical analysis? How is it different from parsing? Explain.
Answer:
In computer science, lexical analysis (also called lexing or tokenization) is the process of converting a sequence of characters into a sequence of tokens. A program that performs lexical analysis may be termed a lexer, tokenizer, or scanner, though "scanner" is also used for just the first stage of a lexer.
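For example, a minimal hand-written lexer for arithmetic expressions might look like the sketch below (written in Python; the token names and regular expressions are illustrative assumptions, not taken from any particular compiler):

import re

# Token kinds and the patterns that match them (illustrative only).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("PLUS",   r"\+"),
    ("STAR",   r"\*"),
    ("LPAREN", r"\("),
    ("RPAREN", r"\)"),
    ("SKIP",   r"\s+"),   # whitespace is scanned but not emitted as a token
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Convert a character stream into a list of (kind, lexeme) tokens."""
    tokens = []
    for match in MASTER_RE.finditer(source):
        kind = match.lastgroup
        if kind != "SKIP":
            tokens.append((kind, match.group()))
    return tokens

print(tokenize("price * (2 + 10)"))
# [('IDENT', 'price'), ('STAR', '*'), ('LPAREN', '('), ('NUMBER', '2'),
#  ('PLUS', '+'), ('NUMBER', '10'), ('RPAREN', ')')]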
The main difference between lexical analysis and syntax analysis (parsing) is that lexical analysis reads the source code one character at a time and groups the characters into meaningful lexemes, which it emits as tokens, whereas syntax analysis takes those tokens and produces a parse tree as output.
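To see the contrast, here is a matching sketch of a recursive-descent parser (again in Python, with an assumed toy grammar for '+', '*' and parentheses) that consumes the tokens produced by tokenize() above and builds a nested-tuple parse tree:

# Grammar assumed for this sketch:
#   expr   -> term ('+' term)*
#   term   -> factor ('*' factor)*
#   factor -> NUMBER | IDENT | '(' expr ')'

def parse(tokens):
    pos = 0

    def peek():
        return tokens[pos][0] if pos < len(tokens) else None

    def take(kind):
        nonlocal pos
        assert peek() == kind, f"expected {kind}, got {peek()}"
        tok = tokens[pos]
        pos += 1
        return tok

    def factor():
        if peek() == "LPAREN":
            take("LPAREN")
            node = expr()
            take("RPAREN")
            return node
        return take("NUMBER") if peek() == "NUMBER" else take("IDENT")

    def term():
        node = factor()
        while peek() == "STAR":
            take("STAR")
            node = ("mul", node, factor())
        return node

    def expr():
        node = term()
        while peek() == "PLUS":
            take("PLUS")
            node = ("add", node, term())
        return node

    tree = expr()
    assert pos == len(tokens), "trailing tokens after expression"
    return tree

print(parse(tokenize("price * (2 + 10)")))
# ('mul', ('IDENT', 'price'), ('add', ('NUMBER', '2'), ('NUMBER', '10')))

Notice how the lexer only looks at individual characters and patterns, while the parser never looks at characters at all: it only cares about the order of whole tokens and the grammar rules they must satisfy.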
Hope it helps :)