VLSI DESIGN AUTOMATION
3. Transistor-level Design
Logic gates are composed of transistors. Designing at the transistor level requires
its own design tools, most of which are simulation tools. Depending on the
accuracy required, transistors can be simulated at different levels. At the switch
level, transistors are modelled as ideal bidirectional switches and the signals are
essentially digital, although the model is often augmented to deal with different
signal strengths, capacitances of nodes, etc. At the timing level, analog signals are
considered, but the transistors have simple models (e.g. piecewise linear functions).
At the circuit level, more accurate models of the transistors are used which often
involve nonlinear differential equations for the currents and voltages. The equations
are then solved by numerical integration. The more accurate the model, the more
computer time is necessary for simulation and, therefore, the lower the maximum
size of the circuit that can be simulated in reasonable time. The fact that an
integrated circuit will be realized in a mainly two-dimensional physical medium has
implications for design decisions at many levels. Transistor-level simulation requires a
circuit description that is obtained from the mask patterns by circuit extraction; circuit
extraction is especially important when performing full-custom design. In the case of standard cells
(semicustom design), the so-called characterization of the cells, i.e. the
determination of their timing behavior, is done once by the library developer rather
than by the designer who makes use of the library.
4. Layout Design
The problem is to compose the layout of the entire integrated circuit. It is often
solved in two stages. First, a position in the plane is assigned to each sub-block,
trying to minimize the area to be occupied by interconnections. This is called the
placement problem. The next step is to generate the wiring patterns that realize the
correct interconnections between these blocks. This is called the routing problem.
The goal of placement and routing is to achieve the smallest possible chip area
while satisfying some constraints. Constraints may e.g. be derived from timing
requirements.
The partitioning problem concerns the grouping of the sub-blocks in a structural
description such that those sub-blocks that are tightly connected are put in the same
group, while the number of connections from one group to another is kept low.
This problem is not strictly a layout problem. Partitioning can also help to solve the
placement problem.
The simultaneous development of structure and layout is called floor-planning. In a
top-down design methodology, when making the transition from a behavioral
description to a structure, one also fixes the relative positions of the sub-blocks.
Through floor-planning, layout information becomes available at early stages of the
design. It gives early feedback on e.g. long wires in the layout and may lead to a
reconsideration of the decisions on structural decomposition. The floor-planning
problem is closely related to the placement problem with the difference that
detailed layout information is available in placement whereas floor-planning has
mainly to deal with estimations.
A cell compiler generates the layout for a network of transistors. A problem
somewhat related to cell compilation is module generation. A module is normally
understood to be a hardware block, the layout of which can be composed by an
arrangement of cells from a small subset. These elementary cells are sometimes
called microcells. They have a complexity of around 10 transistors.
Working at the mask level gives the freedom of manipulating the layout at the
lowest level, but this increased freedom is also a source of errors. In a correct
design, the mask patterns should obey certain rules, e.g. on minimum distances and
minimum widths, called design rules. Tools that analyse a layout to detect violations
of these rules are called design-rule checkers. A somewhat related tool that also
takes the mask patterns as its input is the circuit extractor. It constructs a circuit of
transistors, resistors and capacitances that can then be simulated. Both design-rule
checking and circuit extraction lean on knowledge from the field called
"computational geometry".
One serious disadvantage of full-custom design is that the layout has to be
redesigned when the technology changes. As a remedy to this problem and to speed
up the design time in general, symbolic layout has been proposed. In symbolic
layout, the widths and distances of mask patterns are irrelevant; what matters is the
position of the patterns relative to each other, the so-called topology of the design.
Symbolic layout can only be used in combination with a compactor. This is a tool
that takes the symbolic description, assigns widths to all patterns and spaces the
patterns such that all design rules are satisfied.
5. Verification Methods
There are two ways of checking the correctness of an integrated circuit without
actually fabricating it:
Simulation, i.e. making a computer model of all relevant aspects of the circuit,
executing the model for a set of input signals, and observing the output signals.
Simulation has the disadvantage that it is impossible to have an exhaustive test of a
circuit of reasonable size, as the set of all possible input signals and internal states
grows too large. One has to be satisfied with a subset that gives sufficient
confidence in the correctness of the circuit. So, simulation that does not check all
possible input patterns and internal states always includes the risk of overlooking
some errors.
Formal verification, i.e. the use of mathematical methods to prove that a circuit is
correct. A mathematical proof, as opposed to simulation, gives certainty on the
correctness of the circuit. The problem is that performing these proofs by hand is
too time-consuming. Therefore, attention is focused on those techniques that can
be performed by computers. Formal verification methods consider different aspects
of VLSI design. The most common problem is to check the equivalence of two
descriptions of a circuit, especially a behavioral description and its structural
decomposition. In this context the behavioral description is called the specification
and the structural one its implementation.
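As a small illustration of equivalence checking, the following Python sketch compares a specification and an implementation of a combinational circuit by exhaustively enumerating all input patterns. This brute-force approach is only feasible for small input counts (real tools use techniques such as binary decision diagrams or SAT solving), and the function names and the representation of circuits as Python callables are assumptions made for the example.

    from itertools import product

    def equivalent(spec, impl, n_inputs):
        # spec, impl: functions mapping a tuple of n_inputs booleans to a
        # boolean; returns True iff both agree on every input pattern.
        return all(spec(bits) == impl(bits)
                   for bits in product((False, True), repeat=n_inputs))

    # Example: an AND/OR/NOT implementation of XOR against its specification:
    # spec = lambda b: b[0] != b[1]
    # impl = lambda b: (b[0] or b[1]) and not (b[0] and b[1])
    # equivalent(spec, impl, 2)  ->  True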
There are tools that are not directly related to the progress of the design itself, but
are indispensable in a CAD system. First of all, CAD tools consume and produce
design data in different design domains and at different levels of abstraction. These
data have to be stored in databases. The quantity of data for a VLSI chip can be
enormous and appropriate data management techniques have to be used to store and
retrieve them efficiently. Besides, design is an iterative activity: a designer might
modify a design description in several steps and sometimes discard some
modifications if they turn out to be unsatisfactory. Version management allows for
the possibility of undoing some design decisions and proceeding with the design.
A well-known standard format for the exchange of design data is EDIF (Electronic Design Interchange Format).
Algorithmic graph theory, as opposed to pure graph theory, emphasizes the design
of algorithms that operate on graphs, instead of concentrating on mathematical
properties of graphs and theorems expressing those properties. The distinction
between the two is not very sharp, however, and algorithmic graph theory certainly
benefits from results in pure graph theory.
The big-O notation describes an upper bound on a function. If one wants to
describe a lower bound, the big-Omega notation is used: f(n) = Ω(g(n)).
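As a short worked example: f(n) = 3n^2 + 5n satisfies both f(n) = O(n^2) and f(n) = Ω(n^2), since n^2 ≤ 3n^2 + 5n ≤ 8n^2 for all n ≥ 1.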
Depending on the magnitude of the input size, a number of different criteria can be
used for qualifying an algorithm:
Polynomial vs. exponential order. As an exponential function grows faster than any
polynomial and the exponents of a polynomial tend to be small, an algorithm with a
polynomial time complexity is to be preferred over an exponential algorithm.
Linear vs. quadratic order. Suppose that the input size of an algorithm is determined
by the number of transistors in a circuit and that the algorithm has to be applied to a
VLSI circuit containing some 10^6 transistors. Then, running an algorithm with a
linear time complexity is feasible on a computer with a realistic speed, but an
algorithm with quadratic time complexity is not.
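To make this concrete, assume (purely for illustration) a computer executing about 10^8 elementary operations per second: a linear-time algorithm then needs roughly 10^6 operations, i.e. about 0.01 seconds, whereas a quadratic-time algorithm needs roughly 10^12 operations, i.e. on the order of 10^4 seconds, which is several hours.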
Graph Algorithms:
a. Depth First Search
b. Breadth First Search
c. Dijkstra's Shortest-path Algorithm
d. Prim's Algorithm for Minimum Spanning Trees
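As an illustration of one of these, the following Python sketch implements Dijkstra's shortest-path algorithm; the graph representation (a dictionary mapping each vertex to a list of (neighbor, weight) pairs with non-negative weights) is an assumption made for the example.

    import heapq

    def dijkstra(graph, source):
        # graph: dict mapping vertex -> list of (neighbor, weight) pairs;
        # all weights must be non-negative for Dijkstra's algorithm to work.
        dist = {v: float('inf') for v in graph}
        dist[source] = 0
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist[u]:
                continue  # skip stale heap entries
            for v, w in graph[u]:
                if d + w < dist[v]:
                    dist[v] = d + w
                    heapq.heappush(heap, (d + w, v))
        return dist

    # Example: dijkstra({'a': [('b', 2)], 'b': [('c', 1)], 'c': []}, 'a')
    # returns {'a': 0, 'b': 2, 'c': 3}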
Eulerian Cycle:
All vertices with non-zero degree are connected.
All vertices have even degree.
Eulerian Path:
All vertices with non-zero degree are connected.
Exactly zero or two vertices have odd degree; all other vertices have even degree.
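These two conditions can be checked directly. The Python sketch below classifies an undirected graph accordingly; the adjacency-list representation is an assumption made for the example.

    def euler_classification(graph):
        # graph: dict mapping vertex -> list of neighbors (undirected;
        # every edge appears in the adjacency lists of both endpoints).
        active = [v for v in graph if graph[v]]  # vertices with non-zero degree
        if active:
            seen, stack = set(), [active[0]]
            while stack:  # depth-first search for connectivity
                u = stack.pop()
                if u not in seen:
                    seen.add(u)
                    stack.extend(graph[u])
            if not seen.issuperset(active):
                return 'neither'  # non-zero-degree vertices not connected
        odd = sum(len(graph[v]) % 2 for v in graph)  # odd-degree vertex count
        if odd == 0:
            return 'Eulerian cycle'
        if odd == 2:
            return 'Eulerian path'
        return 'neither'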
MODULE 2
Layout Compaction
1. Design Rules:
The mask patterns that are used for the fabrication of an integrated circuit have to
obey certain restrictions on their shapes and sizes. These restrictions are called the
design rules. Sticking to the design rules decreases the probability that the
fabricated circuit will not work due to short circuits, disconnections in wires,
parasitics, etc. The shape of the patterns is often restricted to rectilinear polygons,
i.e. polygons that are made of horizontal and vertical segments only. Some
technologies also allow 45-degree segments in polygons, segments that are parallel
to the lines y = x or y = -x on an x-y plane. There are design rules for layout
elements located in the same fabrication layer and rules for elements in different
layers. If patterns in two specific layers are constrained by one or more design
rules, the layers are said to interact. For example, polysilicon and diffusion are
interacting layers as their overlapping creates a transistor, whereas polysilicon and
metal form non-interacting layers (if one ignores parasitic capacitances). Design
rules can be quite complex. However, most of them can be expressed as minimum-
distance rules. As the minimum feature size that can be realized on a chip is subject
to continual change, distances are often expressed in integer multiples (or small
fractions) of a relative length unit, the λ, rather than absolute length units. In this
way, designers can deal with simple expressions independent of actual length
values. This means that all mask patterns are drawn along the lines of a so-called
lambda grid.
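Since most design rules can be expressed as minimum-distance rules, the basic check boils down to a separation test between pairs of patterns. The Python sketch below checks one such rule for two axis-parallel rectangles with coordinates in lambda units. It is a strong simplification (real design-rule checkers handle arbitrary rectilinear polygons, many layers and many rule types), and the Euclidean distance measure is an assumption made for the example.

    def min_distance_violation(r1, r2, d_min):
        # r1, r2: rectangles given as (x1, y1, x2, y2) with x1 < x2 and
        # y1 < y2, in lambda units; d_min: required minimum separation.
        dx = max(r1[0] - r2[2], r2[0] - r1[2], 0)  # horizontal gap (0 if overlapping)
        dy = max(r1[1] - r2[3], r2[1] - r1[3], 0)  # vertical gap
        return dx * dx + dy * dy < d_min * d_min  # True means a violation

    # Example: two 2-lambda squares whose facing edges are 1 lambda apart
    # violate a 2-lambda spacing rule:
    # min_distance_violation((0, 0, 2, 2), (3, 0, 5, 2), 2)  ->  True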
2. Problem Formulation:
Application of Compaction –
Layout compaction can be applied in four situations:
Converting symbolic layout to geometric layout.
Removing redundant area from geometric layout.
Adapting geometric layout to a new technology. A new technology means that
the design rules have changed; as long as the new and old technologies are
compatible, this adaptation can be done automatically, by means of so-called
mask-to-symbolic extraction. In such a case, the geometric layout in the old
technology is first converted to a symbolic layout, and the design rules of the
new technology are then used for the generation of the new geometric layout.
Correcting small design-rule errors. If there are methods to put layout elements
closer to each other to remove redundant space, it is reasonable to assume that
pulling layout elements apart when they are too close to each other can be done
similarly. This is true as long as the layout with design-rule errors is
topologically correct, that is, the relative ordering of the rectangle edges in
interacting layers should be the same as in the correct design.
Graph-Theoretical Formulation –
In one-dimensional, say horizontal, compaction a rigid rectangle can be represented by
one x-coordinate and a stretchable one by two. For the purpose of the algorithms to be
explained, it is assumed that there are n distinct x-coordinates. They will be indicated
as x1, x2, . . . , xn. A minimum-distance design rule between two rectangle edges can
now be expressed as an inequality:
xj – xi >= dij        (1)
Each such inequality can be represented in a so-called constraint graph, which has a
vertex vi for each coordinate xi and an edge (vi, vj) with weight dij for each constraint.
Computing the lengths of the longest paths to all vertices in the constraint graph then
results in a solution for the one-dimensional compaction problem.
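If only minimum-distance constraints are present, the constraint graph is acyclic and the longest paths can be computed in topological order. The Python sketch below does this; the vertex numbering and the edge-list representation are assumptions made for the example.

    from collections import defaultdict

    def longest_paths_dag(n, edges):
        # n: number of coordinates x0..x(n-1); edges: list of (i, j, d)
        # triples, one per constraint xj - xi >= dij. Assumes an acyclic graph.
        succ, indeg = defaultdict(list), [0] * n
        for i, j, d in edges:
            succ[i].append((j, d))
            indeg[j] += 1
        order = [v for v in range(n) if indeg[v] == 0]  # topological sort
        for u in order:
            for v, _ in succ[u]:
                indeg[v] -= 1
                if indeg[v] == 0:
                    order.append(v)
        x = [0] * n  # vertices without predecessors are placed at 0
        for u in order:
            for v, d in succ[u]:
                x[v] = max(x[v], x[u] + d)  # relax edges in topological order
        return x  # x[j] = length of the longest path to vertex j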
Maximum-Distance Constraints –
Maximum-distance constraints can in general be written as:
xj – xi <= cij
which is equivalent to: xi – xj >= –cij. The last inequality has the same form as
Inequality (1) and can be represented in the constraint graph by an edge (vj, vi) with
weight dji = –cij. The addition of this type of edges can create cycles in the constraint
graph. In the presence of cycles, the solution of the compaction problem still amounts
to computing the lengths of the longest paths.
3. Algorithms:
https://youtu.be/jdTnoCBSOVM
Bellman-Ford Algorithm –
https://youtu.be/FtN3BYH2Zes
The algorithm does not discriminate between forward and backward edges. It is
comparable to the longest path algorithm for DAGs with the difference that several
iterations through the graph are necessary before the lengths of the longest paths have
been computed.
The time complexity of the Bellman-Ford algorithm is O(n × |E|), as each iteration
visits all edges at most once and there are at most n iterations. If the graph is dense, i.e.
the number of edges is O(n^2), this would mean a worst-case time complexity of O(n^3).
However, under assumptions that are realistic for compaction, the average time
complexity turns out to be O(n^1.5).
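A minimal Python sketch of this algorithm, adapted to the longest-path formulation used in compaction, is given below; the edge-list representation and the choice of vertex 0 as the source are assumptions made for the example.

    def bellman_ford_longest(n, edges, source=0):
        # n: number of vertices; edges: list of (i, j, d) triples, one per
        # constraint xj - xi >= dij. Returns the longest-path lengths from
        # the source, or None if a positive cycle makes the constraints
        # infeasible.
        x = [float('-inf')] * n
        x[source] = 0
        for _ in range(n - 1):
            changed = False
            for i, j, d in edges:
                if x[i] != float('-inf') and x[i] + d > x[j]:
                    x[j] = x[i] + d
                    changed = True
            if not changed:
                break  # early termination once the values have converged
        for i, j, d in edges:
            if x[i] != float('-inf') and x[i] + d > x[j]:
                return None  # a path can still grow: positive cycle
        return x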
1. Circuit representation:
2. Placement Algorithms
Placement algorithms can be grouped into two categories:
Constructive placement: the algorithm is such that once the coordinates of a
cell have been fixed, they are not modified anymore.
Iterative placement: all cells already have some coordinates, and cells are
moved around, their positions are interchanged, etc., in order to obtain a new
configuration.
Most placement algorithms combine both approaches: an initial placement is obtained
in a constructive way, and attempts are then made to increase the quality of the
placement by iterative improvement.
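As an illustration of iterative improvement, the Python sketch below repeatedly interchanges the positions of two cells and keeps an interchange only if it reduces the total wire length; the half-perimeter wire-length cost model and the data representation are assumptions made for the example.

    def improve_by_interchange(coords, nets):
        # coords: dict mapping cell name -> (x, y) position;
        # nets: list of nets, each a list of cell names.
        def cost():  # total half-perimeter wire length over all nets
            total = 0
            for net in nets:
                xs = [coords[c][0] for c in net]
                ys = [coords[c][1] for c in net]
                total += (max(xs) - min(xs)) + (max(ys) - min(ys))
            return total
        cells, best, improved = list(coords), cost(), True
        while improved:
            improved = False
            for i in range(len(cells)):
                for j in range(i + 1, len(cells)):
                    a, b = cells[i], cells[j]
                    coords[a], coords[b] = coords[b], coords[a]  # trial swap
                    c = cost()
                    if c < best:
                        best, improved = c, True  # keep the improvement
                    else:
                        coords[a], coords[b] = coords[b], coords[a]  # undo
        return coords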