
PPL Unit-I


Principles of Programming Languages
Unit-I
Prepared By:
Mrs. K. Pranathi
Asst. Professor (CSE)
GLWEC
Course objectives
• To briefly describe various programming paradigms.
• To provide conceptual understanding of High level design and
implementation.
• To introduce the power of scripting languages.
• To provide an introduction to formalisms for specifying syntax and
semantics of programming languages.
• To provide an exposure to core concepts and principles in
contemporary programming languages.
• To analyze and optimize the complexity of programming languages.
Outcomes
• Ability to express syntax and semantics in formal notation.
• Ability to apply suitable programming paradigm for the application.
• Gain knowledge of, and compare, the features of programming
languages.
• Identify and describe semantic issues associated with variable
binding, scoping rules, parameter passing and exception handling.
• Understand the design issues of object-oriented and functional
languages.
Syllabus
Unit-I
PRELIMINARY CONCEPTS: Reasons for Studying, Concepts of Programming Languages,
Programming Domains, Language Evaluation Criteria, Influences on Language Design, Language
Categories, Programming Paradigms - Imperative, Functional Programming, Logic Programming.
Programming Language Implementation - Compilation and Virtual Machines, Programming Environments.
SYNTAX AND SEMANTICS: The General Problems of Describing Syntax and Semantics, Formal
Methods of Describing Syntax - BNF, EBNF for common programming language features, Parse Trees,
Ambiguous Grammars, Attribute Grammars, Denotational Semantics and Axiomatic Semantics for common
programming language features.

Unit-II
Data Types: Introduction, Primitive, Character, User-Defined, Array, Associative, Record, Union, Pointer and
Reference Types, design and implementation uses related to these types. Names, Variables, Concept of
Binding, Type Checking, Strong Typing, Type Compatibility, Named Constants, Variable Initialization.
Expressions and Statements & Control Structures: Arithmetic, Relational and Boolean Expressions,
Short-Circuit Evaluation, Mixed-Mode Assignment, Assignment Statements, Control Structures - Statement
Level, Compound Statements, Selection, Iteration, Unconditional Statements, Guarded Commands.
Syllabus(contd..)
Unit-III
Subprograms and Blocks: Fundamentals of Subprograms, Scope and Lifetime of Variables,
Static and Dynamic Scope, Design Issues of Subprograms and Operations,
Local Referencing Environments, Parameter Passing Methods,
Overloaded Subprograms, Generic Subprograms, Parameters that are Subprogram Names,
Design Issues for Functions, User-Defined Overloaded Operators, Co-routines.

Unit-IV
Abstract Data Types: Data Abstraction and Encapsulation, Introduction to Data Abstraction, Design
Issues, Language Examples, C++ Parameterized ADTs, Object-Oriented Programming in Smalltalk,
C++, Java, C#, Ada 95.
Concurrency: Subprogram-Level Concurrency, Semaphores, Monitors, Message Passing, Java
Threads, C# Threads. Exception Handling: Exceptions, Exception Propagation, Exception Handlers in
Ada, C++ and Java. Logic Programming Languages: Introduction and Overview of Logic
Programming, Basic Elements of Prolog, Applications of Logic Programming.
Syllabus(contd..)
Unit-V
Functional Programming Languages: Introduction, Fundamentals of FPLs, LISP, ML, Haskell,
Application of Functional Programming Languages, Comparison of Functional and
Imperative Languages.
Scripting Language: Pragmatics, Key Concepts.
Case Study: Python - Values and Types, Variables, Storage and Control,
Bindings and Scope, Procedural Abstraction, Data Abstraction,
Separate Compilation, Module Library.
Text Books

1. Concepts of Programming Languages, Robert W. Sebesta, 8th Edition, Pearson Education, 2008.
2. Programming Language Design Concepts, D. A. Watt, Wiley Dreamtech, rp-2007.
3. Programming Languages, 2nd Edition, A. B. Tucker and R. E. Noonan, TMH.
4. Programming Languages, K. C. Louden, 2nd Edition, Thomson, 2003.
5. LISP, Patrick Henry Winston and Paul Horn, Pearson Education.
6. Programming in Prolog, W. F. Clocksin and C. S. Mellish, 5th Edition, Springer.
7. Programming Python, M. Lutz, 3rd Edition, O'Reilly, SPD, rp-2007.
8. Core Python Programming, Chun, 2nd Edition, Pearson Education, 2007.
9. Guide to Programming with Python, Michael Dawson, Thomson, 2008.
Unit-I
Syllabus
• PRELIMINARY CONCEPTS: Reasons for Studying, Concepts of
Programming Languages, Programming Domains, Language Evaluation
Criteria, Influences on Language Design, Language Categories,
Programming Paradigms - Imperative, Functional Programming, Logic
Programming. Programming Language Implementation - Compilation and
Virtual Machines, Programming Environments.
• SYNTAX AND SEMANTICS: The General Problems of Describing Syntax
and Semantics, Formal Methods of Describing Syntax - BNF, EBNF for
common programming language features, Parse Trees, Ambiguous
Grammars, Attribute Grammars, Denotational Semantics and Axiomatic
Semantics for common programming language features.
Preliminary Concepts

Reasons for Studying PPL


• Increased ability to express ideas
• Improved background for choosing appropriate languages
• Increased ability to learn new languages
• Better understanding of significance of implementation
• Overall advancement of computing
Programming Domains

Scientific Applications
• In the early 1940s, computers were invented for scientific applications.
• These applications require large numbers of floating-point
computations.
• Fortran was the first language developed for scientific applications.
• ALGOL 60 was intended for the same use.
Business applications
• The first successful language for business was COBOL.
• Business applications produce reports and use decimal arithmetic and character data.
• The arrival of PCs started new ways for businesses to use computers
• Spreadsheets and database systems were developed for business.
Programming Domains(continued..)
Artificial intelligence
• AI applications manipulate symbols rather than numbers.
• Symbolic computation is more suitably done with linked lists than
arrays.
• LISP was the first widely used AI programming language.
Systems programming
• The operating system and all of the programming support tools of a
computer are collectively known as its system software.
• System software needs efficiency because it is in continuous use.
Programming Domains(continued..)
Scripting languages
• Put a list of commands, called a script, in a file to be executed.
• PHP is a scripting language used on Web server systems. Its code is
embedded in HTML documents. The code is interpreted on the
server before the document is sent to a requesting browser.
• Special-purpose languages: languages designed for a single, narrow application area.
Language Evaluation Criteria

• Readability - Overall simplicity, Orthogonality, Control statements,
Data types and structures, Syntax considerations
• Writability - Simplicity and orthogonality, Support for abstraction,
Expressivity
• Reliability - Type checking, Exception handling, Aliasing,
Readability and writability
• Cost
Influences on Language Design
Computer architecture
• Von Neumann
• We use imperative languages, at least in part, because we use von
Neumann machines
• Data and programs stored in same memory
• Memory is separate from CPU
• Instructions and data are piped from memory to CPU
• Results of operations in the CPU must be moved back to memory
• Basis for imperative languages
– Variables model memory cells
– Assignment statements model piping
– Iteration is efficient
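A minimal C++ sketch of these three features, for illustration only: the variable total models a memory cell, each assignment moves a computed value back into it, and the loop is the iteration that von Neumann hardware executes efficiently.

#include <iostream>

int main() {
    int total = 0;                  // variable: models a memory cell
    for (int i = 1; i <= 10; ++i)   // iteration: efficient on von Neumann machines
        total = total + i;          // assignment: result is moved back to memory
    std::cout << total << '\n';     // prints 55
    return 0;
}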
The Von Neumann Machine
Programming Methodologies

• 1950s and early 1960s: Simple applications; worry about machine
efficiency
• Late 1960s: People efficiency became important; readability, better
control structures
– Structured programming
– Top-down design and step-wise refinement
• Late 1970s: Process-oriented to data-oriented and data abstraction
• Middle 1980s: Object-oriented programming

Language Categories
• A programming language defines a set of instructions that are put
together to perform a specific task on the CPU (Central Processing
Unit).
• Programming languages are broadly categorized into two types:
• Low-level languages (Assembly-level and Machine-level languages
come into this category)
• High-level languages
Low Level Languages
• Low-level languages are used to write programs that relate to the
specific architecture and hardware of a particular type of computer.
• They are closer to the native language of a computer (binary), making
them harder for programmers to understand.
• Programs written in low-level languages are fast and memory efficient.
• However, it is a nightmare for programmers to write, debug and maintain
low-level programs.
• They are mostly used to develop operating systems, device drivers,
databases and applications that require direct hardware access.
• Low-level languages are further classified into two categories –
Machine language and Assembly language.
Advantages of Low-Level
Languages
• Programs developed using low-level languages are fast and memory
efficient.
• Programmers can utilize processor and memory in a better way
using a low-level language.
• There is no need for a compiler or an interpreter to translate the
source into machine code, which cuts out the compilation and
interpretation time.
• Low-level languages provide direct manipulation of computer
registers and storage.
• They can communicate directly with hardware devices.
Disadvantages of Low-Level
Languages
• Programs developed using low-level languages are machine
dependent and are not portable.
• They are difficult to develop, debug and maintain.
• Low-level programs are more error-prone.
• Low-level programming usually results in poor programming
productivity.
• A programmer must have additional knowledge of the computer
architecture of the particular machine in order to program in a
low-level language.
Machine Level Languages
• Machine language is the lowest level of programming language. It
handles binary data, i.e. 0s and 1s, and directly interacts with the system.
• Machine language is difficult for human beings to understand as it is
made up of combinations of 0s and 1s.
• There is software which translates programs into machine-level
language.
• Examples include operating systems like Linux, UNIX, Windows,
etc.
• In this language there is no need for compilers or interpreters, and
hence less conversion time is consumed. However, it is not portable
and not readable by humans.
Assembly Level Language

• It consists of a set of instructions in a specific format called commands.
• It uses symbols (mnemonics) to represent the fields of instructions. It is
very close to machine-level language.
• The computer must have an assembler to translate an assembly-level
program into a machine-level program.
• Examples include x86, ARM and MIPS assembly languages. Assembly
is in a human-readable format and takes less time to write a program and
debug it than machine language. However, it is a machine-dependent
language.
Pictorial Representation of Assembly
Language and Machine Code
High-level Language

• High-level languages are similar to human languages. They are
programmer friendly, easy to code, debug and maintain, and they
provide a higher level of abstraction from machine language.
• They do not interact directly with the hardware. Rather, they focus
more on complex arithmetic operations, optimal program
efficiency and ease of coding.
• Programs in a high-level language are written using English-like
statements (in languages such as Python, Java, C++, etc.).
High-level Language(continued…)

• High-level programs require compilers/interpreters to translate
source code to machine language.
• We can compile source code written in a high-level language for
multiple target machines. Thus, high-level languages are machine
independent.
• High-level languages are grouped into two categories based on the
execution model – compiled or interpreted languages.
• A high-level language uses a format that is most familiar to
users. The instructions in this language are called code or scripts.
• The computer needs a compiler and interpreter to convert high-level
language program to machine level language.
• Examples include C++, Python, Java, etc.
Following is a simple example in a high-level language, written here as a complete C program (the age value is a sample):

#include <stdio.h>

int main(void)
{
    int age = 20;   /* sample value for illustration */
    if (age < 18)
        printf("You are not eligible to vote");
    else
        printf("You are eligible to vote");
    return 0;
}
Classification of High-level
Languages
• We can also classify high-level languages into several other categories based on the
programming paradigm.
• Structured programming (sometimes known as modular programming) is a
programming paradigm aimed at improving the clarity, quality, and development
time of a computer program by making extensive use of the structured control
flow constructs of selection (if/then/else) and repetition (while and for), block
structures, and subroutines.
• This makes programs more efficient and easier to understand and modify.
• Structured programming frequently employs a top-down design model, in which
developers map out the overall program structure into separate subsections.
• Note, it is possible to do structured programming in any programming language.
Object Oriented Programming
• In procedural programs, any given procedure might be called at any point during a program's
execution, including by other procedures or itself; object-oriented programming organizes code differently.
• Object-oriented programming is a programming paradigm based on the concept
of "objects", which may contain data, in the form of fields, often known as
attributes; and code, in the form of procedures, often known as methods.
• A feature of objects is that an object's procedures can access and often modify
the data fields of the object with which they are associated.
• Thus, programmers define not only the data type of a data structure but also the
types of operations (functions) that can be applied to the data structure. In this
way, the data structure becomes an object that includes both data and functions.
In addition, programmers can create relationships between one object and
another.
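A small C++ sketch of these ideas (the class name Account and its members are invented purely for illustration): the object bundles a data field with the methods allowed to read and modify it.

#include <iostream>

class Account {
    double balance;                          // field (attribute): data owned by the object
public:
    explicit Account(double b) : balance(b) {}
    void deposit(double amount) { balance += amount; }   // method: code that modifies the object's data
    double getBalance() const { return balance; }        // method: code that reads the object's data
};

int main() {
    Account a(100.0);
    a.deposit(50.0);
    std::cout << a.getBalance() << '\n';     // prints 150
    return 0;
}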
Procedural Programming
• Procedural programming is a programming paradigm, derived
from structured programming, based upon the concept of the
procedure call. Procedures, also known as routines, subroutines, or
functions, simply contain a series of computational steps to be
carried out.
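For illustration, a short C++ sketch in the procedural style (the function name average is made up): the program is organised as a procedure that carries out a series of computational steps and is invoked through a procedure call.

#include <iostream>

// A procedure (function): a named series of computational steps.
double average(const double values[], int n) {
    double sum = 0.0;
    for (int i = 0; i < n; ++i)
        sum += values[i];
    return sum / n;
}

int main() {
    double marks[] = {70.0, 82.5, 91.0};
    std::cout << average(marks, 3) << '\n';  // procedure call
    return 0;
}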
Advantages of High-Level
Languages
• High-level languages are programmer friendly. They are easy to
write, debug and maintain.
• They provide a higher level of abstraction from machine languages.
• They are machine independent.
• Easy to learn.
• Less error-prone, easy to find and debug errors.
• High-level programming results in better programming
productivity.
Disadvantages of High Level languages
• It takes additional translation time to translate the source code to
machine code.
• High-level programs are comparatively slower than low-level
programs.
• Compared to low-level programs, they are generally less memory
efficient.
• Cannot communicate directly with the hardware.
Programming Paradigms

• Imperative - Central features are variables, assignment statements,
and iteration
Ex: C, Pascal
• Functional - The main means of making computations is by applying
functions to given parameters
Ex: LISP, Scheme
• Logic - Rule-based; rules are specified in no particular order
Ex: Prolog
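C++ is not a functional or logic language, but the contrast between the first two paradigms can be sketched in one language: the imperative version relies on a variable, assignment and iteration, while the functional-style version expresses the same result purely by applying a function (here via recursion, in the spirit of LISP/Scheme). This sketch is only illustrative.

#include <iostream>

// Imperative style: variables, assignment statements, iteration.
int sumImperative(int n) {
    int total = 0;
    for (int i = 1; i <= n; ++i)
        total = total + i;
    return total;
}

// Functional flavour: the result is obtained by applying a function,
// with no assignment to an existing variable.
int sumFunctional(int n) {
    return (n == 0) ? 0 : n + sumFunctional(n - 1);
}

int main() {
    std::cout << sumImperative(5) << ' ' << sumFunctional(5) << '\n';  // prints 15 15
    return 0;
}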
Programming Language
Implementation
• Compilation-Programs are translated into machine language
• Pure Interpretation-Programs are interpreted by another program
known as an interpreter
• Hybrid Implementation Systems- A compromise between
compilers and pure interpreters
Compilation
• Translates a high-level program (source language) into machine code
(machine language)
• Slow translation, fast execution
• The compilation process has several phases:
-Lexical analysis: converts characters in the source program into
lexical units
-Syntax analysis: transforms lexical units into parse trees which
represent the syntactic structure of the program
-Semantic analysis: generates intermediate code
-Code generation: machine code is generated
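A worked illustration of the phases, not tied to any particular compiler, for the single C++ statement rate = count * 60; (the names rate and count are assumed to be declared elsewhere):

Source statement:    rate = count * 60;
Lexical analysis:    lexemes rate, =, count, *, 60, ; become tokens
                     (identifier, assign_op, identifier, mult_op, int_literal, semicolon)
Syntax analysis:     the token stream is grouped into a parse tree for an assignment
Semantic analysis:   types are checked and intermediate code is produced, e.g.
                     t1 = count * 60
                     rate = t1
Code generation:     machine instructions (load count, multiply by 60, store into rate)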
The Compilation Process
Pure Interpretation

• No translation
• Easier implementation of programs (run-time errors can easily and
immediately be displayed)
• Slower execution (10 to 100 times slower than compiled programs)
• Often requires more space
• Becoming rare on high-level languages
Hybrid Implementation
Systems(Virtual Machines)
• A compromise between compilers and pure interpreters
• A high-level language program is translated to an intermediate
language that allows easy interpretation
• Faster than pure interpretation
Examples
• Perl programs are partially compiled to detect errors before
interpretation.
• Initial implementations of Java were hybrid; the intermediate form,
byte code, provides portability to any machine that has a byte code
interpreter and a run-time system (together, these are called the
Java Virtual Machine)
Just-in-Time Implementation Systems

• Initially translate programs to an intermediate language
• Then compile the intermediate language into machine code
• The machine-code version is kept for subsequent calls
• JIT systems are widely used for Java programs
• .NET languages are implemented with a JIT system
Preprocessors

• Preprocessor macros (instructions) are commonly used to specify
that code from another file is to be included
• A preprocessor processes a program immediately before the
program is compiled, to expand embedded preprocessor macros
• A well-known example: the C preprocessor
• Ex: #include, #define, and similar macros (a small sketch follows below)
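A small C/C++ sketch of these macros (the macro names PI and AREA are invented for illustration): before compilation proper, the preprocessor copies the included header into the file and textually replaces every use of the macros.

#include <cstdio>                   // file inclusion: the header's text is copied in here

#define PI 3.14159                  // object-like macro: every PI below becomes 3.14159
#define AREA(r) (PI * (r) * (r))    // function-like macro, expanded textually

int main() {
    std::printf("%f\n", AREA(2.0)); // after preprocessing: (3.14159 * (2.0) * (2.0))
    return 0;
}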
Programming Environments

• A programming environment is the collection of tools used in the
development of software.
• This collection may consist of:
• A file system,
• A text editor,
• A compiler,
• A linker,
• Integrated tools
Text Editor and Compiler
• A text editor is software that is used to write computer
programs. Windows machines, for example, come with Notepad,
which can be used to type programs.
• As the computer cannot understand your program given directly in
text format, we need to convert this program into a binary format
which can be understood by the computer.
• The conversion from a text program to a binary file is done by
another piece of software called a compiler, and this process of
conversion from a text-formatted program to a binary-format file
is called program compilation. Finally, you can execute the binary
file to perform the programmed task.
Linker
• A linker is a special program that combines the object files generated
by the compiler/assembler with other pieces of code to produce an
executable file (on Windows, one with a .exe extension). The linker
searches the object files and appends all libraries needed for execution.
It determines the memory space that will hold the code from each module.
• The different pieces of code, written in the programming language, are
called modules.
• Linking is the process that gathers and combines the different pieces of
code into a single executable file. With the help of a linker, a module can
also be linked against the system library.
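For illustration, a typical compile-and-link sequence with the GNU toolchain (the file names main.cpp, utils.cpp and app are made up): each module is compiled to an object file, and the linker then combines the object files and the needed libraries into a single executable.

g++ -c main.cpp              (compile: produces the object file main.o)
g++ -c utils.cpp             (compile: produces the object file utils.o)
g++ main.o utils.o -o app    (link: combine object files and libraries into the executable app)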
Interpreters
• There are other programming languages such as Python, PHP, and
Perl, which do not need any compilation into a binary format; rather,
an interpreter can be used to read such programs line by line and
execute them directly without any further conversion.
• So, if you are going to write your programs in PHP, Python, Perl,
Ruby, etc., then you will need to install their interpreters before you
start programming.
• Online Compilation
Programming Environments
• Some of the examples of programming environments are-
• Microsoft Visual Studio .NET, which is a large collection of
software development tools used through a Windows interface. It is
used to develop software in the following languages:
• C#,
• Visual Basic .NET,
• JScript(MS JavaScript version),
• J# (MS Java version)
• NetBeans
• Turbo C, C++
• Dreamweaver
• Arduino, etc.
Syntax And Semantics
• Syntax: the form or structure of the expressions, statements, and
program units
• Semantics: the meaning of the expressions, statements, and
program units
• Syntax and semantics provide a language's definition
Who must use Language Definitions?

– Other language designers
– Implementors
– Programmers (the users of the language)
The General Problem of
Describing Syntax
• A sentence is a string of characters over some alphabet
• A language is a set of sentences
• A lexeme is the lowest level syntactic unit of a language (e.g., *,
sum, begin)
• A token is a category of lexemes (e.g., identifier)
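For example, in the C-style statement below (an illustrative statement, not taken from the slides), each lexeme belongs to a token category:

index = 2 * count + 17;

Lexeme    Token
index     identifier
=         equal_sign
2         int_literal
*         mult_op
count     identifier
+         plus_op
17        int_literal
;         semicolon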
The General Problem of Describing
Syntax (continued..)
• Language Recognizers – A recognition device reads input strings
of the language and decides whether the input strings belong to the
language
• Language Generators – A device that generates sentences of a
language. One can determine if the syntax of a particular sentence is
correct by comparing it to the structure of the generator
Language Recognizers

• Suppose we have a language L that uses an alphabet ∑ of characters. To define
L formally using the recognition method, we would need to construct a
mechanism R, called a recognition device, capable of reading strings of
characters from the alphabet ∑. R would indicate whether a given input string
was or was not in L. In effect, R would either accept or reject the given string.
Such devices are like filters, separating legal sentences from those that are
incorrectly formed. If R, when fed any string of characters over ∑, accepts it
only if it is in L, then R is a description of L. Because most useful languages
are, for all practical purposes, infinite, this might seem like a lengthy and
ineffective process. Recognition devices, however, are not used to enumerate all
of the sentences of a language—they have a different purpose.
• The syntax analysis part of a compiler is a recognizer for the language the
compiler translates. In this role, the recognizer need not test all possible strings
of characters from some set to determine whether each is in the language.
Language Generators

• A language generator is a device that can be used to generate the
sentences of a language.
• The syntax-checking portion of a compiler (a language
recognizer) is not as useful a language description for a
programmer because it can be used only in trial-and-error mode.
• For example, to determine the correct syntax of a particular
statement using a compiler, the programmer can only submit a
speculated version and note whether the compiler accepts it. On
the other hand, it is often possible to determine whether the
syntax of a particular statement is correct by comparing it with
the structure of the generator.
Context-Free Grammars

• Developed by Noam Chomsky in the mid-1950s
• Language generators, meant to describe the syntax of natural
languages
• Define a class of languages called context-free languages
• A rule has a left-hand side (LHS) and a right-hand side (RHS), and
consists of terminal and nonterminal symbols
Formal Methods of Describing
Syntax
• Backus-Naur Form and Context-Free Grammars – Most widely
known method for describing programming language syntax
• Extended BNF – Improves readability and writability of BNF
• Backus-Naur Form (BNF)
Backus-Naur Form was invented by John Backus to describe
ALGOL 58 in the year 1959.
• BNF is a metalanguage used to describe another language.
• It is used to specify the syntax of computer programming
languages and command/instruction sets. BNF is applied wherever
language descriptions are required.
Formal Methods of Describing
Syntax(continued…)
• Backus-Naur notation (BNF for short) is a formal mathematical way to
describe a language (that is, to describe the syntax of programming
languages).
• The Backus-Naur Form is a way of defining syntax.
• It consists of
o a set of terminal symbols
o a set of non-terminal symbols
o a set of production rules of the form Left-Hand-Side ::= Right-Hand-Side
• Terminal (or Terminal symbol):
Terminals are strings written within quotes. They are meant to be used as
they are; nothing is hidden behind them.
- For example "code" or "principles".
Formal Methods of Describing
Syntax(continued…)
• Non-terminal (or Non-terminal symbol):
Sometimes we need a name to refer to something else.
• These are called non-terminals.
• In BNF, non-terminal names are written within angle brackets
(for example <statement>)
Ex: <something> ::= "content"
• Each rule in BNF (also in EBNF) has three parts:
• Left-hand side: Here we write the non-terminal being defined. In
the above example, it is <something>.
• ::= : This character group separates the left-hand side from the
right-hand side. Read this symbol as "is defined as".
• Right-hand side: The definition of the non-terminal, given on the
right-hand side. In the above example, it's "content".
• The general structure of BNF is given below –
name ::= expansion
• The symbol ::= means “may expand into” and “may get replaced
with.”
• Every name in Backus-Naur form is surrounded by angle brackets,
< >, whether it appears on the left- or right-hand side of the rule.
• An expansion is an expression containing terminal symbols and
non-terminal symbols, joined together by sequencing and
selection.
• A terminal symbol may be a literal like (“+” or “function”) or a
category of literals (like integer).
• A vertical bar | indicates choice.
Defining Grammar

• A grammar is a finite nonempty set of rules. An abstraction (or
nonterminal symbol) can have more than one RHS:

<stmt> -> <single_stmt>
        | begin <stmt_list> end
Defining Grammar(continued…)

• Naturally, we can define a grammar for the rules of BNF itself –

rule → name ::= expansion
name → <identifier>
expansion → expansion expansion
expansion → expansion | expansion
expansion → name
expansion → terminal
Defining Grammar(continued…)

• We might define identifiers using the regular expression [-A-Za-z_0-9]+.
• A terminal could be a quoted literal (like "+", "switch" or "<<=")
or the name of a category of literals (like integer).
• The name of a category of literals is typically defined by other
means, such as a regular expression.
Defining Grammar(continued…)

• A derivation is a repeated application of rules, starting with the start
symbol and ending with a sentence (all terminal symbols).
• For example:
<breakfast> ::= <drink> " and biscuit"
<drink> ::= "tea"
• It means the only option for breakfast for you is "tea and biscuit". Note that
here, the order of symbols is important.
• Let's say someday you want to drink coffee instead of tea. In this case, you
can express your possible breakfast items like below:

<breakfast> ::= <drink> " and biscuit"
<drink> ::= "tea" | "coffee"
Defining Grammar(continued…)

• The | operator indicates that the parts separated by it are choices, which
means the non-terminal on the left can be any such part. Here the order
is unimportant, that is, there is no difference between "tea" |
"coffee" and "coffee" | "tea".
• Example 2: let's see how you express one or more digits in BNF:
<digits> ::= <digit> | <digit> <digits>
<digit> ::= "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"
• Every string of symbols in the derivation is a sentential
form.
• A sentence is a sentential form that has only terminal
symbols.
• A leftmost derivation is one in which the leftmost
nonterminal in each sentential form is the one that is
expanded
• A derivation may be neither leftmost nor rightmost
• A parse tree is a hierarchical representation of a derivation
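As a worked example, using the <digits> grammar given earlier, here is a leftmost derivation of the sentence "35". Each line is a sentential form; the last line contains only terminal symbols, so it is a sentence.

<digits> => <digit> <digits>
         => "3" <digits>
         => "3" <digit>
         => "3" "5"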
Parse Tree
Parse Tree(continued…)
Example:
Define a grammar for the letter language. A letter is a lower-case Latin letter
between a and z.
letter ::= "a" | "b" | "c" | "d" | ... | "z"
The | symbol indicates alternate choices for the rule's definition. It's actually
shorthand for multiple rules for the same nonterminal:
letter ::= "a"
letter ::= "b"
letter ::= "c"
letter ::= "d"
...
letter ::= "z"
Extended BNF
• Optional parts are placed in brackets ([ ]):
<proc_call> -> ident [ ( <expr_list> ) ]
• Put alternative parts of RHSs in parentheses and separate them
with vertical bars:
<term> -> <term> (+ | -) const
• Put repetitions (0 or more) in braces ({ }):
<ident> -> letter { letter | digit }
Extended BNF (continued…)
• For example, the previous example can be written in EBNF like
below:

digits = digit { digit }


digit = "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"

• The braces above mean that its inner part may be repeated 0 or more
times. It frees your mind from getting lost in recursion.
Extended BNF (continued…)
• Everything you can express in EBNF can also be expressed in BNF.
• EBNF usually uses a slightly different notation than BNF. For
example:

::= becomes just =.


• There are no angle brackets around non-terminals.
Extended BNF (continued…)
• EBNF extends BNF by adding the following 3 operations:
 Option
 Repetition
 Grouping
Option

• Option uses square brackets to make the inner content optional.

Example:
thing = "water" [ "melon" ]
So the above thing is either water or watermelon.
Repetition

• Curly braces indicate the inner content may be repeated 0 or more
times.

long_google = "Goo" { "o" } "gle"

• So "Google", "Gooogle" and "Gooooooogle" are all valid instances
of the long_google non-terminal.
Grouping

• Parentheses can be used to indicate grouping. Everything
they wrap can be replaced with any of the valid strings that the
contents of the group represent according to the rules of EBNF.
fly = ("fire" | "fruit") "fly"
Here fly is either "firefly" or "fruitfly".

• With BNF we could not do that in one line. It would look like the
following in BNF:
<fly> ::= <type> "fly"
<type> ::= "fire" | "fruit"
Parse Trees
• A hierarchical representation of a derivation. Every internal node of
a parse tree is labeled with a nonterminal symbol; every leaf is
labeled with a terminal symbol.
• Every subtree of a parse tree describes one instance of an abstraction
in the sentence
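As a sketch, the parse tree for the sentence "35" under the earlier <digits> grammar: every internal node is a nonterminal, every leaf is a terminal.

        <digits>
        /      \
   <digit>    <digits>
      |          |
     "3"      <digit>
                 |
                "5"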
Parse Trees(continued…)
• A grammar is ambiguous if it generates a sentential form that has two or
more distinct parse trees. An ambiguous expression grammar:
<expr> -> <expr> <op> <expr> | const
<op> -> / | -
• If we use the parse tree to indicate precedence levels of the operators, we
cannot have ambiguity. An unambiguous expression grammar:
<expr> -> <expr> - <term> | <term>
<term> -> <term> / const | const
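As a worked example, consider the sentence const - const / const. Under the ambiguous grammar it has two distinct parse trees, so the grammar does not say which operator applies first; the unambiguous grammar forces / lower in the tree, giving it higher precedence than -.

Ambiguous grammar, parse 1 (- at the root):   const - (const / const)
Ambiguous grammar, parse 2 (/ at the root):   (const - const) / const
Unambiguous grammar, only one parse:          const - (const / const)
  <expr> => <expr> - <term> => <term> - <term> => const - <term>
         => const - <term> / const => const - const / const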
Example
