Lexical analysis

In computer science, lexical analysis is the process of converting a sequence of characters into a sequence of tokens. A program or function that performs lexical analysis is called a lexical analyzer, lexer, tokenizer, or scanner, though "scanner" is also used for the first stage of a lexer. A lexer is generally combined with a parser, and together they analyze the syntax of computer languages; this pairing is used in compilers for programming languages as well as in HTML parsers in web browsers, among other applications.

The above text is a snippet from Wikipedia: Lexical analysis
and as such is available under the Creative Commons Attribution/Share-Alike License.
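
To make the character-to-token conversion concrete, here is a minimal tokenizer sketch in Python. The token categories, regular expressions, and input string are illustrative assumptions for this example, not taken from either snippet on this page.

import re

# A minimal, illustrative tokenizer. The token categories and patterns are
# assumptions chosen for this example, not any particular language's rules.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),           # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"),  # identifiers
    ("OP",     r"[+\-*/=]"),      # single-character operators
    ("SKIP",   r"\s+"),           # whitespace, discarded below
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

def tokenize(text):
    """Convert a sequence of characters into a sequence of (kind, value) tokens."""
    pos = 0
    while pos < len(text):
        match = MASTER_RE.match(text, pos)
        if match is None:
            raise SyntaxError(f"unexpected character {text[pos]!r} at position {pos}")
        pos = match.end()
        if match.lastgroup != "SKIP":      # whitespace produces no token
            yield match.lastgroup, match.group()

print(list(tokenize("total = price + 42")))
# [('IDENT', 'total'), ('OP', '='), ('IDENT', 'price'), ('OP', '+'), ('NUMBER', '42')]

Discarding whitespace inside the lexer is one small example of the simplification mentioned above: whatever consumes the token stream never has to see it.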

lexical analysis

Noun

  1. The conversion of a stream of characters to a stream of meaningful tokens; normally to simplify parsing.
    While it's often not difficult to identify tokens while parsing, having a separate stage for lexical analysis simplifies the structure of your compiler.


The above text is a snippet from Wiktionary: lexical analysis
and as such is available under the Creative Commons Attribution/Share-Alike License.
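
The usage note above observes that a separate lexing stage simplifies the structure of a compiler. As a small illustration, the sketch below assumes tokens of the same (kind, value) shape as in the earlier example and a toy grammar invented for this page: the parser matches token kinds and never has to deal with individual characters or whitespace.

def parse_assignment(tokens):
    """Parse the toy grammar  IDENT '=' operand (OP operand)*  into a nested tuple AST."""
    toks = list(tokens)
    pos = 0

    def expect(kind):
        nonlocal pos
        if pos >= len(toks) or toks[pos][0] != kind:
            raise SyntaxError(f"expected {kind} at token {pos}")
        value = toks[pos][1]
        pos += 1
        return value

    def operand():
        # An operand is either a number literal or a variable name.
        if pos < len(toks) and toks[pos][0] == "NUMBER":
            return ("num", expect("NUMBER"))
        return ("var", expect("IDENT"))

    target = expect("IDENT")
    if expect("OP") != "=":               # '=' is just another OP token here
        raise SyntaxError("expected '='")
    node = operand()
    while pos < len(toks):                # left-associative chain of operators
        op = expect("OP")
        node = (op, node, operand())
    return ("assign", target, node)

# Tokens as a lexer like the earlier sketch might produce them.
tokens = [("IDENT", "total"), ("OP", "="), ("IDENT", "price"),
          ("OP", "+"), ("NUMBER", "42")]
print(parse_assignment(tokens))
# ('assign', 'total', ('+', ('var', 'price'), ('num', '42')))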
