Toc Lecture Notes
II B.Tech II Sem
Computable function
Computable functions are the basic objects of study in computability theory. Computable
functions are the formalized analogue of the intuitive notion of algorithm. They are used to
discuss computability without referring to any concrete model of computation such as Turing
machines or register machines.
According to the Church–Turing thesis, computable functions are exactly the functions that can
be calculated using a mechanical calculation device given unlimited amounts of time and storage
space.
Each computable function f takes a fixed, finite number of natural numbers as arguments. A
function which is defined for all possible arguments is called total. If a computable function is
total, it is called a total computable function or total recursive function.
The basic characteristic of a computable function is that there must be a finite procedure (an
algorithm) telling how to compute the function. The models of computation listed above give
different interpretations of what a procedure is and how it is used, but these interpretations share
many properties.
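The distinction between total and partial computable functions can be sketched in Python (an illustrative sketch, not part of the original notes): a total function halts on every argument, while a partial one performs an unbounded search and may never halt.

```python
def total_add(x, y):
    # Total computable function: halts on every pair of natural numbers.
    return x + y

def partial_sqrt(n):
    # Partial computable function: halts (returning k) only when n is a
    # perfect square; on any other input the search below never terminates,
    # mimicking an unbounded search for a witness.
    k = 0
    while k * k != n:
        k += 1
    return k
```

Calling `partial_sqrt(16)` returns 4, but `partial_sqrt(2)` would loop forever — exactly the behavior that distinguishes a partial computable function from a total one.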
Counter machine
A counter machine is an abstract machine used in formal logic and theoretical computer science
to model computation. It is the most primitive of the four types of register machines. A counter
machine comprises a set of one or more unbounded registers, each of which can hold a single
non-negative integer, and a list of (usually sequential) arithmetic and control instructions for the
machine to follow.
The primitive model register machine is, in effect, a multitape 2-symbol Post-Turing machine
with its behavior restricted so its tapes act like simple "counters".
By the time of Melzak, Lambek, and Minsky the notion of a "computer program" produced a
different type of simple machine with many left-ended tapes cut from a Post-Turing tape. In all
cases the models permit only two tape symbols { mark, blank }.[3]
Some versions represent the positive integers as only a string/stack of marks in a
"register" (i.e. a left-ended tape), with a blank tape representing the count "0". Minsky eliminated
the PRINT instruction at the expense of providing his model with a mandatory single mark at the
left-end of each tape.[3]
In this model the single-ended tapes-as-registers are thought of as "counters", their instructions
restricted to only two (or three if the TEST/DECREMENT instruction is atomized). Two
common instruction sets are the following:
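One common Minsky-style repertoire uses an increment instruction and a combined test-and-decrement instruction. A minimal interpreter sketch, assuming that two-instruction set (the names INC and JZDEC are illustrative, not from the notes):

```python
def run_counter_machine(program, registers):
    """Interpret a Minsky-style counter machine.

    program: list of instructions, each either
        ("INC", r, next)     increment register r, jump to instruction `next`
        ("JZDEC", r, z, nz)  if register r == 0 jump to `z`,
                             else decrement r and jump to `nz`
    Execution starts at instruction 0 and halts when the program counter
    leaves the program.
    """
    pc = 0
    while 0 <= pc < len(program):
        op = program[pc]
        if op[0] == "INC":
            _, r, nxt = op
            registers[r] += 1
            pc = nxt
        else:  # "JZDEC"
            _, r, z, nz = op
            if registers[r] == 0:
                pc = z
            else:
                registers[r] -= 1
                pc = nz
    return registers

# Example program: add register 1 into register 0 (leaves r1 = 0).
add_program = [
    ("JZDEC", 1, 2, 1),   # 0: if r1 == 0 halt (jump past end), else r1 -= 1
    ("INC", 0, 0),        # 1: r0 += 1, loop back to instruction 0
]
```

Running `run_counter_machine(add_program, {0: 3, 1: 4})` leaves register 0 holding 7, showing how arithmetic reduces to these two counter operations.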
UNIT – V
Syllabus:
Computability Theory: Chomsky hierarchy of languages, Linear Bounded Automata and
Context Sensitive Language, LR(0) grammar, Decidability of problems, Universal Turing
Machine, Undecidability of Post's Correspondence Problem, Turing Reducibility, Definition of P
and NP problems.
According to Noam Chomsky, there are four types of grammars − Type 0, Type 1, Type 2, and
Type 3. The following table shows how they differ from each other −
Grammar Type | Grammar Accepted     | Language Accepted               | Automaton
Type 0       | Unrestricted grammar | Recursively enumerable language | Turing Machine
Take a look at the following illustration. It shows the scope of each type of grammar −
Prepared by Y. Nagender, Asst. Prof & G. Sunil Reddy, Asst. Prof, CSE, SREC, Warangal Page 1
Grammar Type | Production Rules | Language Accepted | Automata | Closed Under
Type-2 (Context Free Grammar) | A → ρ where A ∈ N and ρ ∈ (T∪N)* | Context Free | Push Down Automata | Union, Concatenation, Kleene Closure
Type-1 (Context Sensitive Grammar) | α → β where α, β ∈ (T∪N)*, len(α) <= len(β), and α contains at least 1 non-terminal | Context Sensitive | Linear Bounded Automata | Union, Intersection, Complementation, Concatenation, Kleene Closure
Type-0 (Recursively Enumerable) | α → β where α, β ∈ (T∪N)* and α contains at least 1 non-terminal | Recursively Enumerable | Turing Machine | Union, Intersection, Concatenation, Kleene Closure
Type - 3 Grammar:
Type-3 grammars generate regular languages. Type-3 grammars must have a single non-
terminal on the left-hand side and a right-hand side consisting of a single terminal or single
terminal followed by a single non-terminal.
The productions must be in the form X → a or X → aY
where X, Y ∈ N (Non terminal)
and a ∈ T (Terminal)
The rule S → ε is allowed if S does not appear on the right side of any rule.
Example
X→
X → a |aY
Y→b
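A right-linear grammar like the example above can be checked mechanically. A small sketch (the helper names and representation are illustrative, not from the notes) enumerates the strings derivable from X → ε | a | aY, Y → b:

```python
from collections import deque

def generate(grammar, start, limit=10):
    """Enumerate terminal strings derivable from a right-linear grammar.
    grammar maps a non-terminal to a list of right-hand sides, each a
    (terminal_string, next_nonterminal_or_None) pair; None means the
    production ends in terminals only."""
    results = set()
    queue = deque([("", start)])
    while queue and len(results) < limit:
        prefix, nt = queue.popleft()
        for terminals, nxt in grammar[nt]:
            if nxt is None:
                results.add(prefix + terminals)   # derivation finished
            else:
                queue.append((prefix + terminals, nxt))
    return results

# X -> epsilon | a | aY,   Y -> b
grammar = {
    "X": [("", None), ("a", None), ("a", "Y")],
    "Y": [("b", None)],
}
```

Here `generate(grammar, "X")` yields the regular language {ε, a, ab} generated by the example grammar.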
Type - 2 Grammar:
Type - 1 Grammar:
Type - 0 Grammar:
Definition
A Linear Bounded Automaton is a single-tape Turing Machine with two special tape symbols,
the left marker < and the right marker >.
The transitions should satisfy these conditions:
It should not replace the marker symbols by any other symbol.
It should not write on cells beyond the marker symbols.
Thus the initial configuration will be:
< q0a1a2a3a4a5.......an >
Formal Definition:
Formally, a Linear Bounded Automaton is a non-deterministic Turing Machine,
M = (Q, P, Γ, δ, F, q0, <, >, t, r) where
Q is the set of all states
P is the set of all terminals
Γ is the set of all tape alphabets, P ⊂ Γ
δ is the set of transitions
F is the blank symbol
q0 is the initial state
< is the left marker and > is the right marker
t is the accept state
r is the reject state
Context sensitive languages are the languages accepted by linear bounded
automata. These languages are defined by a Context Sensitive Grammar. In this
grammar, more than one symbol (terminal or non-terminal) may appear on the left-hand
side of a production rule. In addition, a context sensitive grammar follows these rules:
The number of symbols on the left-hand side must not exceed the number of symbols on the
right-hand side.
A rule of the form A → ε is not allowed unless A is the start symbol and A does not occur
on the right-hand side of any rule.
The classic example of a context sensitive language is L = { a^n b^n c^n | n ≥ 1 }.
If G is a Context Sensitive Grammar, then
L(G) = { w | w ∈ Σ* and S ⇒+ w }
CSG for L = { a^n b^n c^n | n ≥ 1 }:
N = {S, B, C} and T = {a, b, c}
P : S → aSBC | aBC, CB → BC, aB → ab, bB → bb, bC → bc, cC → cc
Derivation of aabbcc:
S ⇒ aSBC (rule S → aSBC)
⇒ aaBCBC (rule S → aBC)
⇒ aaBBCC (rule CB → BC)
⇒ aabBCC (rule aB → ab)
⇒ aabbCC (rule bB → bb)
⇒ aabbcC (rule bC → bc)
⇒ aabbcc (rule cC → cc)
NOTE: The language a^n b^n c^n where n ≥ 1 can be represented by a context sensitive grammar
but it cannot be represented by a context free grammar.
Every context sensitive language can be represented by LBA.
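Using the standard CSG for { a^n b^n c^n } (S → aSBC | aBC, CB → BC, aB → ab, bB → bb, bC → bc, cC → cc), such derivations can be confirmed mechanically by a brute-force string-rewriting search. This is an illustrative sketch, exploiting the fact that CSG productions are non-contracting, so sentential forms never need to exceed the target length:

```python
def derivable(rules, start, target, max_len=None):
    """Breadth-first search over sentential forms of a rewriting grammar.
    rules: list of (lhs, rhs) string-rewriting productions. Because a CSG
    is non-contracting, forms longer than the target can be pruned."""
    if max_len is None:
        max_len = len(target)
    seen = {start}
    frontier = [start]
    while frontier:
        nxt = []
        for form in frontier:
            if form == target:
                return True
            for lhs, rhs in rules:
                i = form.find(lhs)
                while i != -1:
                    new = form[:i] + rhs + form[i + len(lhs):]
                    if len(new) <= max_len and new not in seen:
                        seen.add(new)
                        nxt.append(new)
                    i = form.find(lhs, i + 1)
        frontier = nxt
    return target in seen

# Standard CSG for { a^n b^n c^n | n >= 1 }
csg = [("S", "aSBC"), ("S", "aBC"), ("CB", "BC"),
       ("aB", "ab"), ("bB", "bb"), ("bC", "bc"), ("cC", "cc")]
```

With this, `derivable(csg, "S", "aabbcc")` is True, while strings outside the language such as "aabbc" are rejected.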
Closure Properties
Context Sensitive Languages are closed under
Union
Concatenation
Reversal
Kleene Star
Intersection
All of the above except Intersection can be proved by modifying the grammar.
Proof of Intersection needs a machine model for CSG
LR-Grammar:
Some Notation
⇒* = zero or more steps in a derivation
⇒*rm = zero or more rightmost derivation steps
⇒rm = a single step in a rightmost derivation
More terms
Handle
A substring which matches the right-hand side of a production and represents one
step in the derivation.
Or more formally:
(of a right-sentential form αβw for CFG G)
β is a substring such that:
S ⇒*rm αAw ⇒rm αβw
If the grammar is unambiguous:
There are no useless symbols
The rightmost derivation (in right-sentential form) and the handle are unique
Example
Given our example grammar:
o S' → Sc, S → SA | A, A → aSb | ab
An example rightmost derivation:
o S' ⇒ Sc ⇒ SAc ⇒ SaSbc
Therefore we can say that SaSbc is in right-sentential form
o The handle is aSb
Viable Prefix
o (of a right-sentential form γ for G)
o Is any prefix of γ ending no farther right than the right end of a handle of
γ.
Complete item
An item where the dot is the rightmost symbol
Example
Given our example grammar:
o S' → Sc, S → SA | A, A → aSb | ab
The right-sentential form abc:
o S' ⇒*rm Ac ⇒rm abc
Valid items:
o A → ab• is valid for the prefix ab
Example Grammar
1. S → E$
2. E → E+(E)
3. E → id
States are composed of closures constructed from items. Initially the only closure is built from
{S → • E$}. We construct the closure like so:
Closure(I) = I ∪ {B → • γ | A → α • B β ∈ I, B → γ is a production}
Basically, for every non-terminal B in I with a • immediately before it, add all items of the form
B → • γ.
To create more closures we define a "goto" function. Given a closure I and a symbol a
(terminal or non-terminal):
goto(I, a) = Closure({B → α a • β | B → α • a β ∈ I})
Basically, for every item in I that has a • before a, we create a new item by pushing the •
one symbol forward, then close the result. For instance, given our example closure and the symbol E we get:
goto({S → • E$, E → • E+(E), E → • id}, E) = {S → E • $, E → E • +(E)}
Now, for each of these items we create a closure, and for each of those closures we create all
possible goto sets. We keep going until there are no more new states.
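The closure and goto computations above can be sketched in Python for the example grammar (S → E$, E → E+(E), E → id). Items are represented here as (lhs, rhs, dot) triples; this is an illustrative sketch, not a full LR(0) table builder:

```python
# Grammar: S -> E$ ,  E -> E+(E) | id.  Each production RHS is a tuple of symbols.
GRAMMAR = {
    "S": [("E", "$")],
    "E": [("E", "+", "(", "E", ")"), ("id",)],
}

def closure(items):
    """Close a set of LR(0) items: whenever the dot sits immediately before
    a non-terminal B, add B -> . gamma for every production of B."""
    items = set(items)
    changed = True
    while changed:
        changed = False
        for lhs, rhs, dot in list(items):
            if dot < len(rhs) and rhs[dot] in GRAMMAR:
                for prod in GRAMMAR[rhs[dot]]:
                    item = (rhs[dot], prod, 0)
                    if item not in items:
                        items.add(item)
                        changed = True
    return items

def goto(items, symbol):
    """Advance the dot over `symbol` in every item that allows it,
    then close the result."""
    moved = {(lhs, rhs, dot + 1)
             for lhs, rhs, dot in items
             if dot < len(rhs) and rhs[dot] == symbol}
    return closure(moved)

# State 0 is the closure of the initial item S -> . E$
state0 = closure({("S", ("E", "$"), 0)})
```

Here `goto(state0, "E")` reproduces the example in the text: it contains exactly S → E • $ and E → E • +(E).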
The table is indexed by state and symbol. We created the states already and the symbols are given
by the grammar; now we need to create the actions within the cells. The goto function defines the
transitions between the closures: there is a transition from state q1 to state q2 on symbol a iff
goto(closure(q1), a) = closure(q2).
States | id | +  | (  | )  | $  | S | E
0      | s1 |    |    |    |    |   | g2
2      |    | s4 |    |    | s3 |   |
4      |    |    | s5 |    |    |   |
5      | s1 |    |    |    |    |   | g6
6      |    | s4 |    | s7 |    |   |
Indications of a conflict
Any grammar with an ε-production cannot be LR(0). This is because there is no input to reduce,
so at any point that production can be used to reduce (push the rule's LHS non-terminal onto the
stack).
Decidability of problems:
Undecidable Problems –
The problems for which we cannot construct an algorithm that answers correctly in finite time
are termed Undecidable Problems. These problems may be partially decidable, but they will
never be decidable: there will always be some input that leads the Turing Machine into an
infinite loop without ever providing an answer.
We can build intuition for undecidable problems by considering Fermat's Last Theorem, which
states that no three positive integers a, b and c can satisfy the equation a^n + b^n = c^n for
any n ≥ 3.
If we feed this problem to a Turing machine that searches for a counterexample, the machine
might run forever trying values of n, a, b and c. From the search alone we are never sure
whether a counterexample exists, and this kind of unbounded search is what makes a problem
only partially decidable.
Examples – These are few important Undecidable Problems:
Whether a CFG generates all the strings over its alphabet or not?
Since a CFG may generate infinitely many strings, we can never check them all, and the
problem is Undecidable.
Whether two CFGs L and M are equal?
Since we cannot enumerate and compare all the strings of a CFG, we cannot decide whether two
CFGs are equal.
Ambiguity of a CFG?
There exists no algorithm that can check whether a given CFG is ambiguous. We can
only show ambiguity positively: if some particular string of the language has two different
parse trees, then the grammar is ambiguous.
Is it possible to convert a given ambiguous CFG into an equivalent non-ambiguous CFG?
This is also an Undecidable Problem, as there exists no algorithm for converting an
ambiguous CFG into a non-ambiguous CFG.
Given a context-free language L, is L regular?
This is an Undecidable Problem, as we cannot tell from the production rules of the CFG
whether the language it generates is regular or not.
Some more Undecidable Problems related to Turing machine:
Membership problem of a Turing Machine?
Finiteness of a Turing Machine?
Emptiness of a Turing Machine?
Whether the language accepted by Turing Machine is regular or CFL?
A language is called Decidable or Recursive if there is a Turing machine that halts on every
input string w, accepting exactly the strings in the language. Every decidable language is
Turing-Acceptable.
For a decidable language, for each input string, the TM halts either at the accept or the reject
state as depicted in the following diagram −
Example 1
Find out whether the following problem is decidable or not −
Is a number ‘m’ prime?
Solution
Prime numbers = {2, 3, 5, 7, 11, 13, …………..}
Divide the number ‘m’ by all the numbers between ‘2’ and ‘√m’ starting from ‘2’.
If any of these numbers produces a remainder of zero, the machine goes to the "Rejected state";
otherwise it goes to the "Accepted state". So the answer here is a definite 'Yes' or 'No'.
Hence, it is a decidable problem.
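The procedure above can be sketched directly in Python, making the "halts on every input" property visible: the trial-division loop runs at most √m times.

```python
import math

def is_prime(m):
    """Decide primality by trial division: divide m by every integer
    from 2 up to sqrt(m); a zero remainder means 'reject'."""
    if m < 2:
        return False
    for d in range(2, math.isqrt(m) + 1):
        if m % d == 0:
            return False   # divisor found: "Rejected state"
    return True            # no divisor found: "Accepted state"
```

Since the loop is bounded, the procedure halts on every input with a definite yes or no, which is exactly what makes the problem decidable.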
Example 2
Given a regular language L and string w, how can we check if w ∈ L?
Solution
Take the DFA that accepts L and check if w is accepted
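This decision procedure can be sketched in Python. The DFA below (even number of 1s) is a hypothetical example, not one from the notes:

```python
def dfa_accepts(transitions, start, accepting, w):
    """Simulate a DFA on input w. The loop runs exactly len(w) steps,
    so membership in a regular language is decidable."""
    state = start
    for symbol in w:
        state = transitions[(state, symbol)]
    return state in accepting

# Hypothetical DFA for L = strings over {0,1} with an even number of 1s.
even_ones = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}
```

For example, `dfa_accepts(even_ones, "even", {"even"}, "1010")` is True (two 1s), while "100" is rejected.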
Turing machines are abstract computing devices. Each Turing machine represents a particular
algorithm. Hence we can think of Turing machines as being "hard-wired".
Is there a programmable Turing machine that can solve any problem solved by a "hard-wired"
Turing machine?
The answer is "yes"; such a programmable Turing machine is called a "universal Turing machine".
Basic Idea:
The Universal TM will take as input a description of a standard TM and an input w in the
alphabet of the standard TM, and will halt if and only if the standard TM halts on w.
The Post Correspondence Problem (PCP), introduced by Emil Post in 1946, is an undecidable
decision problem. The PCP problem over an alphabet ∑ is stated as follows −
Given the following two lists, M and N of non-empty strings over ∑ −
M = (x1, x2, x3,………, xn)
N = (y1, y2, y3,………, yn)
We say there is a Post Correspondence Solution if, for some sequence of indices i1, i2, ..., ik
with 1 ≤ ij ≤ n, the condition xi1 ... xik = yi1 ... yik is satisfied.
Example 1
Find whether the lists M = (abb, aa, aaa) and N = (bba, aaa, aa) have a Post Correspondence
Solution?
Solution
X1 X2 X3
M abb aa aaa
N bba aaa aa
Here,
x2x1x3 = ‘aaabbaaa’
and y2y1y3 = ‘aaabbaaa’
We can see that
x2x1x3 = y2y1y3
Hence, a solution is the index sequence (2, 1, 3).
Example 2
Find whether the lists M = (ab, bab, bbaaa) and N = (a, ba, bab) have a Post Correspondence
Solution?
Solution
X1 X2 X3
M ab bab bbaaa
N a ba bab
In this case there is no Post Correspondence Solution: |xi| > |yi| for every i, so any
concatenation of strings from M is strictly longer than the corresponding concatenation from N,
and the two can never be equal.
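PCP itself is undecidable, but a bounded search for short solutions is computable and makes the definition concrete. This sketch (the bound `max_len` is an assumption; a bounded search can never prove unsolvability in general) tries all index sequences up to a fixed length:

```python
from itertools import product

def pcp_solution(M, N, max_len=6):
    """Search for an index sequence i1..ik (k <= max_len) with
    x_i1...x_ik == y_i1...y_ik. Returns the 1-based index sequence,
    or None if no solution of that length exists."""
    n = len(M)
    for k in range(1, max_len + 1):
        for seq in product(range(n), repeat=k):
            if "".join(M[i] for i in seq) == "".join(N[i] for i in seq):
                return [i + 1 for i in seq]
    return None
```

On Example 1 this returns the shortest solution, (2, 3) (since x2x3 = aa·aaa = aaaaa = aaa·aa = y2y3), while the notes exhibit the longer solution (2, 1, 3); both are valid. On Example 2 it finds nothing, consistent with the length argument above.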
Turing Reducibility:
Definition of P:
P is the class of languages that are decidable in polynomial time on a deterministic single-tape
Turing machine. In other words,
P = ⋃k TIME(n^k)
Motivation: To define a class of problems that can be solved efficiently.
P is invariant for all models of computation that are polynomially equivalent to the
deterministic single-tape Turing Machine.
P roughly corresponds to the class of problems that are realistically solvable on a
computer.
Definition of NP:
The term NP comes from nondeterministic polynomial time and has an alternative
characterization by using nondeterministic polynomial time Turing machines.
Theorem
A language is in NP iff it is decided by some nondeterministic polynomial time Turing machine.
Proof.
(⇒) Convert a polynomial time verifier V to an equivalent polynomial time NTM N. On input w
of length n:
Nondeterministically select a string c of length at most n^k (assuming that V runs in time
n^k).
Run V on input <w, c>.
If V accepts, accept; otherwise, reject.
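As an illustration of this verifier-based view of NP, here is a sketch of a polynomial-time verifier for SUBSET-SUM (a standard NP problem, chosen here for illustration rather than taken from the notes). The certificate c is a proposed subset of indices, and V checks it in time linear in the input:

```python
def verify_subset_sum(numbers, target, certificate):
    """Polynomial-time verifier V(<w, c>): w = (numbers, target),
    c = a list of indices claimed to sum to target. Runs in O(n)."""
    if len(set(certificate)) != len(certificate):
        return False                     # indices must be distinct
    if any(i < 0 or i >= len(numbers) for i in certificate):
        return False                     # indices must be in range
    return sum(numbers[i] for i in certificate) == target
```

Verifying a certificate is fast, but finding one seems to require searching the 2^n possible subsets — exactly the gap the NTM construction hides behind its nondeterministic choice of c.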
P vs. NP
If you spend time in or around the programming community you probably hear the term “P
versus NP” rather frequently.
The Problem
P vs. NP
The P vs. NP problem asks whether every problem whose solution can be quickly
verified by a computer can also be quickly solved by a computer.
P problems are easily solved by computers. NP problems are not necessarily easy to solve, but if
you present a potential solution it's easy to verify whether it's correct or not.
As you can see from the diagram above, all P problems are NP problems. That is, if it’s easy for
the computer to solve, it’s easy to verify the solution. So the P vs NP problem is just asking if
these two problem types are the same, or if they are different, i.e. that there are some problems
that are easily verified but not easily solved.
It currently appears that P ≠ NP, meaning we have plenty of examples of problems that we can
quickly verify potential answers to, but that we can't solve quickly. Let's look at a few examples:
A traveling salesman wants to visit 100 different cities by driving, starting and ending his
trip at home. He has a limited supply of gasoline, so he can only drive a total of 10,000
kilometers. He wants to know if he can visit all of the cities without running out of
gasoline. (from Wikipedia)
A farmer wants to take 100 watermelons of different masses to the market. She needs to
pack the watermelons into boxes. Each box can only hold 20 kilograms without breaking.
The farmer needs to know if 10 boxes will be enough for her to carry all 100
watermelons to market.
These problems share a common characteristic that is the key to understanding the intrigue
of P versus NP: in order to solve them, it seems you have to try all combinations.
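The watermelon example above can be sketched in Python on a toy instance (the function names and the tiny instance are illustrative): verifying a proposed packing is fast, while finding one by brute force means trying boxes^melons combinations.

```python
from itertools import product

def verify_packing(masses, assignment, boxes=10, capacity=20):
    """Quick (polynomial-time) check that a proposed assignment of
    melons to boxes respects every box's capacity."""
    loads = [0] * boxes
    for mass, box in zip(masses, assignment):
        loads[box] += mass
    return all(load <= capacity for load in loads)

def solve_packing(masses, boxes=10, capacity=20):
    """Brute force: try every assignment of melons to boxes.
    boxes**len(masses) combinations -- hopeless for 100 melons."""
    for assignment in product(range(boxes), repeat=len(masses)):
        if verify_packing(masses, assignment, boxes, capacity):
            return assignment
    return None
```

With 100 melons and 10 boxes the brute-force search would need 10^100 attempts, yet checking any single proposed packing takes only 100 additions — the verify/solve gap at the heart of P vs. NP.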
The Solution
This is why the answer to the P vs. NP problem is so interesting to people. If anyone were able to
show that P is equal to NP, it would make difficult real-world problems trivial for computers.
Summary
1. P vs. NP deals with the gap between computers being able to quickly solve problems vs.
just being able to test proposed solutions for correctness.
2. As such, the P vs. NP problem is the search for a way to solve problems that require the
trying of millions, billions, or trillions of combinations without actually having to try
each one.
3. Solving this problem would have profound effects on computing, and therefore on our
society.