Published in Phillip A. Laplante, Dictionary of Computer Science, Engineering, and Technology, 2017
lex: a program that takes a description of a set of regular expressions defining different tokens and produces a lexical analyzer for those tokens, which can then be integrated into the application. See also yacc.
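As a minimal sketch of this idea (the token names and patterns below are illustrative, not drawn from the dictionary entry), a lex specification pairs each regular expression with an action written as a C fragment:

%{
/* Minimal illustrative lex specification: each regular expression
   is paired with a C action; lex turns this file into a scanner
   whose entry point is the generated function yylex(). */
#include <stdio.h>
%}

%%
[0-9]+                  { printf("NUMBER: %s\n", yytext); }
[A-Za-z_][A-Za-z0-9_]*  { printf("IDENTIFIER: %s\n", yytext); }
[ \t\n]+                ;   /* skip whitespace */
.                       { printf("OTHER: %s\n", yytext); }
%%

int yywrap(void) { return 1; }
int main(void)   { return yylex(); }

Running lex (or flex) on this file produces a C source file (lex.yy.c) containing the scanner, which can then be compiled and linked into the application.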
LR-SDiscr: a novel and scalable merging and splitting discretization framework using a lexical generator
Published in Journal of Information and Telecommunication, 2019
Habiba Drias, Hadjer Moulai, Nourelhouda Rehkab
Numeric discretization consists of recognizing values of the same significance or dependency and inserting them into intervals. Lex, a lexical analyzer generator (Lesk, 1975; Lesk & Schmidt, 2016), is a tool for automatically and rapidly implementing a lexer for a programming language or a sequence of recognizable objects. It is widely used in compiler construction, but it is also prevalent in many areas that require pattern recognition, such as word processing and natural language processing. Since the discretization mechanism is based on recognizing intervals of values in order to classify them according to their importance, Lex is an appropriate and convenient tool for this task. As shown in Figure 1, Lex automatically generates the discretization program from a source file containing specifications of entities and actions. Each entity is described by a regular expression followed by an action consisting of a program fragment. The action is executed each time the corresponding interval is recognized during the discretization process.
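As a hypothetical sketch of this mechanism (the interval boundaries and labels below are illustrative assumptions, not taken from the paper), each interval of values can be described by a regular expression whose action labels the recognized value with its interval:

%{
/* Hypothetical discretization specification: each pattern recognizes
   one interval of integer values, and the associated action assigns
   the value its interval label.  Boundaries and labels are made up
   for illustration; lex's longest-match rule picks the right one. */
#include <stdio.h>
%}

%%
[0-9]              { printf("%s -> LOW    [0, 9]\n",      yytext); }
[1-9][0-9]         { printf("%s -> MEDIUM [10, 99]\n",    yytext); }
[1-9][0-9][0-9]+   { printf("%s -> HIGH   [100, +inf)\n", yytext); }
[ \t\n]+           ;   /* separators between values */
.                  ;   /* ignore anything else */
%%

int yywrap(void) { return 1; }
int main(void)   { return yylex(); }

Feeding such a specification to lex yields the discretization program itself, matching the flow sketched in Figure 1: a source of entity/action specifications in, an executable discretizer out.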
Development of a CNC interpretation service with good performance and variable functionality
Published in International Journal of Computer Integrated Manufacturing, 2022
Since G code is a context-sensitive language, its analysis part consists of lexical analysis, syntactic analysis, and semantic analysis. Lex&Yacc, an off-the-shelf compilation tool, is widely used to implement the analysis part (Xu et al. 2007; Xu and Ye 2007; Wang and Zhou 2017). The Lex tool implements the lexical analysis by splitting the source program into tokens according to the lex specification of the source language. The Yacc tool implements the syntactic analysis by finding the hierarchical structure of the source program according to the yacc specification of the source language. The lex specification defines a set of patterns of the source language in regular expression format. The yacc specification describes syntax rules and some semantic rules of the source language in Backus-Naur Form (BNF) (Levine, Mason, and Brown 1995). However, Lex&Yacc was developed in the mid-1970s, and its development predates newer technologies in computer science such as the object-oriented paradigm (OOP) (Ivantsov 2008). Its disadvantages include the following: both the lex specification and the yacc specification have a fixed and complex programming structure and are written in special meta-languages; both the Lex tool and the Yacc tool must be installed; and the lexical analysis and the syntactic analysis are implemented separately.

ANTLR is another off-the-shelf compilation tool that has been used to build G code interpreters (Yu 2008). Its parsers use a newer parsing technology called Adaptive LL(*) (Parr 2013). LL(*) parsers are easier to read and write than LR-style parsers such as those generated by Yacc; on the other hand, they are less powerful and accept a much smaller set of grammars (Aho et al. 2007).
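To make this division of labour concrete, here is a minimal, hypothetical Lex&Yacc pair for a simplified G code fragment in which a block is a sequence of address-letter/number words (e.g. G01 X10.5 F100). The token names, grammar rules, and file names are illustrative assumptions, not taken from the cited interpreters.

/* gcode.y -- illustrative yacc specification: a program is a
   sequence of blocks; a block is one or more words ended by EOL. */
%{
#include <stdio.h>
int yylex(void);
void yyerror(const char *s) { fprintf(stderr, "error: %s\n", s); }
%}
%union { char letter; double value; }
%token <letter> ADDRESS
%token <value>  NUMBER
%token EOL
%%
program : /* empty */
        | program block
        ;
block   : words EOL          { printf("end of block\n"); }
        ;
words   : word
        | words word
        ;
word    : ADDRESS NUMBER     { printf("word %c%g\n", $1, $2); }
        ;
%%
int main(void) { return yyparse(); }

/* gcode.l -- illustrative lex specification: splits a block into
   ADDRESS and NUMBER tokens according to the patterns below. */
%{
#include <stdlib.h>
#include "y.tab.h"   /* token codes and yylval shared with yacc */
%}
%%
[A-Za-z]                  { yylval.letter = yytext[0]; return ADDRESS; }
[-+]?[0-9]+(\.[0-9]+)?    { yylval.value = atof(yytext); return NUMBER; }
\n                        { return EOL; }
[ \t]+                    ;   /* skip blanks within a block */
%%
int yywrap(void) { return 1; }

Built with yacc -d gcode.y, lex gcode.l, and cc y.tab.c lex.yy.c, the pair yields a parser in which yyparse() repeatedly calls the generated yylex(). The two analyses are generated separately and only meet through the shared token codes in y.tab.h, which illustrates the separate-implementation drawback noted above.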