
2017 IEEE International Conference on Technological Advancements in Power and Energy (TAP Energy)

UVM Based Testbench Architecture for Logic Sub-System Verification

Pavithran T M1, Ramesh Bhakthavatchalu2
Department of Electronics and Communication Engineering, Amrita School of Engineering,
Amrita Vishwa Vidyapeetham, India1,2
[email protected], [email protected]

Abstract—Functional verification is one of the main bottlenecks in the design of complex systems, and it consumes almost 70% of the project cycle. In the present scenario, verification using directed testing is a tedious and time consuming task. Also, there will be many uncovered scenarios left out. It is of advantage if the testbench is reusable, robust, and scalable in SoC verification. Universal Verification Methodology (UVM) along with System Verilog helps in building a coverage driven, constrained random verification environment. This paper analyzes the use of UVM in creating a testbench by taking a synchronous FIFO as the subsystem under verification. FIFOs are an integral part of almost all SoCs. They are widely used as buffers, queues, flow control etc. in data applications. The Design Under Verification is tested and coverage models are implemented to verify the DUV.

Keywords- Universal Verification Methodology (UVM), Functional Verification, Functional Coverage.

I. INTRODUCTION

Functional verification is the process of validating the design such that all the design specifications are met. Several testcases may be created to demonstrate that the intent of the Design Under Test is well-preserved in its implementation [1]. Today, the complexity of designs has reached the stage where conventional directed testing methods no longer prove useful. In addition, the verification process consumes almost 70-80% of the product cycle. Constrained random verification covers almost 80% of the scenarios; the random stimuli are complemented with directed testcases to reach the hard-to-reach scenarios.

The Universal Verification Methodology (UVM) along with System Verilog offers base class libraries for generating and organizing testbenches. A methodology outlines a set of rules and procedures for doing things in a systematic way [2]. The main benefits of UVM are: wide-ranging base class library support, the IEEE 1800.2-2017 standard, the ability to perform coverage-driven constrained random verification, full support by major tool vendors, and maintenance by Accellera (an industry recognized body).

This work validates the application of UVM in SoC verification by presenting the analysis of a logic sub-system verification, which here is a synchronous FIFO. The methodology of a UVM based testbench for other units in a SoC looks similar and can sometimes have more functionality than the testbench for the synchronous FIFO. The focus of this work is to analyze the UVM features.

The rest of the paper is organized as follows. Section II presents a literature survey of conventional testbench architectures. It explains the main structures provided by UVM based test bench architecture in relation to Verilog test benches and System Verilog only test benches. Section III describes the UVM base class library in detail. It also explains the UVM Component class and the UVM Transaction class. Section IV describes the Design Under Verification, a synchronous FIFO. Section V explains how UVM features were used to verify the design. Section VI presents the results and analysis. The coverage report is analyzed to see if the functionality of the design is implemented as per the planned design specification. Finally, Section VII concludes the paper with the future scope.

II. PREVIOUS WORK

Referring to previous works, verification has been done using diverse languages and methodologies. The most conventional or common way is to write the test bench in VHDL or Verilog HDL. The main drawbacks of Verilog and VHDL test benches are given in paper [3]. Both Verilog and VHDL lack features to support high-level data types, OOP, assertions, functional coverage, and constraints.

System Verilog aims to offer solutions for all the mentioned downsides of the Verilog and VHDL HDLs [4]. The key features of System Verilog, namely the program block, clocking block and interfaces, are described in detail in [5]. The paper [6] also lists some key features of System Verilog. The System Verilog LRM has over 200 highlighted keywords which are sufficient to implement complex verification components. But the language had numerous practical limitations. The code implemented in one vendor's tool would not run on another vendor's tool. Even though System Verilog offered features for creating stable environments, code reuse was a big issue. With short time to market, reusability of code still remained a hurdle.

Methodologies were presented to offer a set of procedures and rules for test bench architecture. The Verification Methodology Manual (VMM) [7] and the Reference Verification Methodology (RVM) provided rich class libraries and OOP functionality for making test benches. All these methodologies were vendor dependent. Eventually Accellera was formed and a Universal Verification Methodology (UVM) was developed that could be used with all major tool vendors. This ensured interoperability. UVM also helps in creating robust, scalable and reusable test benches. It ensured that all the verification engineers
do the same thing in the same way, which made the verification components portable and understandable.

In this paper, UVM based architecture is used for verification of a logic sub-system which corresponds to a sub-system in a SoC. Here, a synchronous FIFO is being verified and the result is analyzed using functional coverage.

III. UNIVERSAL VERIFICATION METHODOLOGY

A UVM based testbench helps to create constrained random input stimuli, and it also helps in reaching verification goals by providing functional coverage details [8]. The main purposes of a testbench are: to generate input stimuli, drive the stimuli to the DUV, monitor the response, check for correctness, and measure coverage details to determine the progress of verification.

A. UVM Architecture

The UVM testbench architecture consists of the following parts: a top level harness module which instantiates the DUV, the interface and the Verification Components of the test bench [9]. The tests are separated from the test bench in UVM based testbench architecture. The test bench consists of a uvm environment where master and slave agents are defined along with the scoreboard and the coverage collector. The uvm test class is used to state the various test scenarios. The test generates the environment where all other components are created. The virtual interface is used to point to the physical interface, which helps in communication between the dynamic test bench components and the static modules. The top level harness module calls run_test(), which creates the test class object, and the test is executed in phases.

Fig. 1. UVM Testbench Architecture.

B. UVM Base Class Library

The UVM has a rich class library which aids in making robust, scalable, and reusable verification components. Some of the key classes of the UVM class tree [10] are:
1) uvm_component: All the verification components which exist throughout the simulation are derived from the uvm_component class [11]. It facilitates the use of shared operations like create, copy, compare and print. The various components include:
i) Sequencer: - The sequences run on the sequencer. The generated sequence items are sent to the driver through the sequencer whenever the driver demands them.
ii) Driver: - The driver drives the signals of the DUV. The transactions from the sequencer are mapped to the corresponding signals in the DUV through the virtual interface.
iii) Monitor: - The monitor scans the transactions from the driver to the DUV and captures the responses of the DUV. The captured information is sent to the scoreboard and the coverage collector via a subscriber, in the form of packets.
iv) Agent: - The agent is a container which holds the driver, sequencer and monitor. The agent can be active or passive based on its operation. The active agent has a driver, monitor and sequencer and drives the transactions, whereas the passive agent just monitors the transactions and responses.
v) Scoreboard: - The scoreboard compares the responses of the DUV with the expected responses. It shows the number of failed/passed responses.
vi) Coverage collector: - It helps to determine the progress of verification. It helps to find the untested areas of the Design Under Verification.
vii) Environment: - The environment holds all the above components. A UVM test bench can have one or more environments. The entire environment can be configured differently using factory settings.
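To make the roles of these components concrete, the following is a minimal sketch of an agent and an environment for the FIFO testbench. The class names (fifo_agent, fifo_env, fifo_driver, fifo_monitor, fifo_scoreboard, fifo_coverage, fifo_item) are illustrative assumptions rather than code from the paper, and the driver and monitor (with an analysis port named ap) are assumed to be defined elsewhere along the lines sketched later in this section.

import uvm_pkg::*;
`include "uvm_macros.svh"

class fifo_agent extends uvm_agent;
  `uvm_component_utils(fifo_agent)

  fifo_driver                drv;  // drives DUV pins through the virtual interface
  uvm_sequencer #(fifo_item) sqr;  // hands sequence items to the driver on demand
  fifo_monitor               mon;  // observes pin activity and publishes transactions

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    mon = fifo_monitor::type_id::create("mon", this);
    if (get_is_active() == UVM_ACTIVE) begin  // a passive agent builds only the monitor
      drv = fifo_driver::type_id::create("drv", this);
      sqr = uvm_sequencer#(fifo_item)::type_id::create("sqr", this);
    end
  endfunction

  function void connect_phase(uvm_phase phase);
    if (get_is_active() == UVM_ACTIVE)
      drv.seq_item_port.connect(sqr.seq_item_export);  // driver pulls items from the sequencer
  endfunction
endclass

class fifo_env extends uvm_env;
  `uvm_component_utils(fifo_env)

  fifo_agent      master_agt;  // active agent driving the FIFO inputs
  fifo_agent      slave_agt;   // agent configured as passive, observing the FIFO outputs
  fifo_scoreboard scb;
  fifo_coverage   cov;

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    master_agt = fifo_agent::type_id::create("master_agt", this);
    slave_agt  = fifo_agent::type_id::create("slave_agt", this);
    scb        = fifo_scoreboard::type_id::create("scb", this);
    cov        = fifo_coverage::type_id::create("cov", this);
  endfunction

  function void connect_phase(uvm_phase phase);
    master_agt.mon.ap.connect(scb.exp_export);       // data written into the DUV (expected)
    slave_agt.mon.ap.connect(scb.act_export);        // data read out of the DUV (actual)
    master_agt.mon.ap.connect(cov.analysis_export);  // coverage collector subscribes to the same stream
  endfunction
endclass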
2) uvm_transaction: The uvm_transaction class [11] is extended from the uvm_object class. It is used to create data items. All the transaction items are extended from uvm_sequence_item, which itself is extended from uvm_transaction. These include:
i) uvm_sequence: - A uvm_sequence defines the transactions to be driven to the Device Under Verification. Sequence items are used to create a uvm_sequence. They are used to provide inputs to the Design Under Verification.
ii) uvm_sequence_item: - uvm_sequence_items are the transactions created. They are used to make up a uvm_sequence.
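A minimal sketch of a corresponding transaction and sequence for the FIFO follows. The fields (wr_en, rd_en, wdata, rdata) and the constraint are assumptions about the command format, not the paper's actual code.

class fifo_item extends uvm_sequence_item;
  rand bit        wr_en;   // write command
  rand bit        rd_en;   // read command
  rand bit [31:0] wdata;   // data to be written into the FIFO
  bit      [31:0] rdata;   // filled in by the slave monitor with the observed output data

  `uvm_object_utils_begin(fifo_item)
    `uvm_field_int(wr_en, UVM_ALL_ON)
    `uvm_field_int(rd_en, UVM_ALL_ON)
    `uvm_field_int(wdata, UVM_ALL_ON)
    `uvm_field_int(rdata, UVM_ALL_ON)
  `uvm_object_utils_end

  function new(string name = "fifo_item");
    super.new(name);
  endfunction
endclass

class fifo_rand_seq extends uvm_sequence #(fifo_item);
  `uvm_object_utils(fifo_rand_seq)

  int unsigned num_items = 50;

  function new(string name = "fifo_rand_seq");
    super.new(name);
  endfunction

  task body();
    repeat (num_items) begin
      fifo_item item = fifo_item::type_id::create("item");
      start_item(item);
      // at least one of read or write is asserted in every transaction
      if (!item.randomize() with { wr_en || rd_en; })
        `uvm_error(get_type_name(), "Randomization failed")
      finish_item(item);
    end
  endtask
endclass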
3) UVM Phases: In UVM architecture based testbenches, all the components and transactions operate in predefined phases. The different phases in UVM are [12]:
i) build phase: - The build phase runs in a top-down manner. The higher level components can decide whether to build the child components based on the configuration.
ii) connect phase: - The TLM ports are connected in the connect phase. It runs in a bottom-up manner.
iii) end of elaboration phase: - This phase is used to check the connections in the test bench and to print the topology. Fine adjustment of the test bench is done in this phase.
iv) start of simulation phase: - This phase also runs in a bottom-up manner and is used for final configuration and for printing testbench information just before simulation starts.
v) run phase: - It is the only phase defined as a task, as it consumes simulation time. The simulation starts in this phase. The run phase is further sub-divided into many sub-phases.
vi) extract phase: - All data are extracted from the testbench components.
vii) check phase: - This phase checks for unexpected conditions from the test bench components.
viii) report phase: - The scoreboard and other checkers report the simulation results in this phase.
ix) final phase: - The simulation is about to end in this phase. It is used to print final messages.
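Continuing the same illustrative sketch, a base test shows how a few of these phases are typically overridden: the environment is built in the build phase, the topology is printed at the end of elaboration, stimulus runs inside the run phase under an objection, and results are summarized in the report phase. The names fifo_base_test, fifo_env and fifo_rand_seq refer to the earlier sketches and are assumptions, not the paper's code.

class fifo_base_test extends uvm_test;
  `uvm_component_utils(fifo_base_test)

  fifo_env env;

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  // build_phase runs top-down: construct the environment here
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    env = fifo_env::type_id::create("env", this);
  endfunction

  // end_of_elaboration_phase: print the testbench topology for inspection
  function void end_of_elaboration_phase(uvm_phase phase);
    uvm_top.print_topology();
  endfunction

  // run_phase is the only task-based (time-consuming) phase
  task run_phase(uvm_phase phase);
    fifo_rand_seq seq = fifo_rand_seq::type_id::create("seq");
    phase.raise_objection(this);    // keep the run phase alive while stimulus is applied
    seq.start(env.master_agt.sqr);  // send transactions through the master sequencer
    phase.drop_objection(this);
  endtask

  // report_phase: summarize results after simulation ends
  function void report_phase(uvm_phase phase);
    `uvm_info(get_type_name(), "Test completed", UVM_LOW)
  endfunction
endclass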
IV. SYNCHRONOUS FIFO DESIGN PRINCIPLES

A FIFO (First In First Out) is a memory/buffer used to transfer data between two buses/devices which work at different rates. The data sent from the transmitter is stored in the FIFO if the receiver is not able to collect the information at the same speed [13]. The FIFO empty and full flags are generated to avoid FIFO overflow and FIFO underflow conditions.

Fig. 2. Synchronous FIFO.

A. Architecture of Synchronous FIFO

Fig. 2 shows the architecture of the synchronous FIFO which is implemented for this work. The FIFO read and FIFO write processes take place on the same free running clock. The input data is written to the FIFO on the positive edge of the clock if the write enable signal is high and the FIFO full signal is deasserted. Similarly, a read operation happens when the read enable signal is high and FIFO empty is deasserted. The FIFO design has an asynchronous reset which sets the output data signal to zero.
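A behavioral sketch of such a synchronous FIFO is given below. The module name, port names and parameter values (sync_fifo, wr_en, rd_en, din, dout, full, empty, WIDTH, DEPTH) are illustrative assumptions and need not match the actual DUV used in this work.

module sync_fifo #(parameter WIDTH = 32, DEPTH = 64) (
  input  logic             clk,
  input  logic             rst_n,   // asynchronous reset, clears the output data
  input  logic             wr_en,
  input  logic             rd_en,
  input  logic [WIDTH-1:0] din,
  output logic [WIDTH-1:0] dout,
  output logic             full,
  output logic             empty
);
  logic [WIDTH-1:0]       mem [0:DEPTH-1];
  logic [$clog2(DEPTH):0] wr_ptr, rd_ptr, count;

  assign full  = (count == DEPTH);
  assign empty = (count == 0);

  always_ff @(posedge clk or negedge rst_n) begin
    if (!rst_n) begin
      wr_ptr <= '0; rd_ptr <= '0; count <= '0; dout <= '0;
    end else begin
      if (wr_en && !full) begin                // write on the positive edge when not full
        mem[wr_ptr] <= din;
        wr_ptr <= (wr_ptr == DEPTH-1) ? '0 : wr_ptr + 1'b1;
      end
      if (rd_en && !empty) begin               // read on the same free-running clock when not empty
        dout   <= mem[rd_ptr];
        rd_ptr <= (rd_ptr == DEPTH-1) ? '0 : rd_ptr + 1'b1;
      end
      case ({wr_en && !full, rd_en && !empty})
        2'b10:   count <= count + 1'b1;        // write only
        2'b01:   count <= count - 1'b1;        // read only
        default: count <= count;               // simultaneous read/write or idle
      endcase
    end
  end
endmodule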
B. Verification Plan

The verification of a design necessitates more effort than writing the RTL code for the design [14]. Therefore, verification planning is imperative before starting verification. The verification plan covers the total process of verification: i) it defines what the planned design does, ii) it defines what the intended design should not do, iii) it defines the requirements of the design, iv) it defines the coverage model based on the requirements, and much more. The abstract level of steps involved in verifying our design is shown in the flowchart in Fig. 3.

Fig. 3. Verification Cycle.

V. UVM TESTBENCH FOR SYNCHRONOUS FIFO

A. Top Harness Module

The top level harness module instantiates the DUV and the interface connecting the DUV and the UVM testbench. It imports the uvm package, which provides many inbuilt functionalities. It also defines the clock used by the interface. The run_test() method is called inside an initial block; it creates the object of the uvm test, and the verification process starts from this point.
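A minimal sketch of such a top harness is shown below, reusing the sync_fifo RTL sketch above and the fifo_if interface sketched in the next subsection. The uvm_config_db call that publishes the virtual interface to the testbench components is a typical usage assumption, not something stated explicitly in the paper.

module top_harness;
  import uvm_pkg::*;
  `include "uvm_macros.svh"

  bit clk;
  always #5 clk = ~clk;               // free-running clock used by the interface

  fifo_if vif (clk);                  // physical interface instance

  // DUV instantiation, connected through the interface signals
  sync_fifo #(.WIDTH(32), .DEPTH(64)) duv (
    .clk   (clk),
    .rst_n (vif.rst_n),
    .wr_en (vif.wr_en),
    .rd_en (vif.rd_en),
    .din   (vif.din),
    .dout  (vif.dout),
    .full  (vif.full),
    .empty (vif.empty)
  );

  initial begin
    // make the physical interface visible to the dynamic testbench components
    uvm_config_db #(virtual fifo_if)::set(null, "*", "vif", vif);
    run_test("fifo_base_test");       // creates the uvm_test object and starts the phases
  end
endmodule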
B. Interface

The interface is used to define the signals of the DUV. It is through this interface that the driver and the DUV communicate with each other. The virtual interface handle is set to the interface instance in the top harness module.
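A sketch of such an interface, matching the port list assumed in the RTL and top-harness sketches above, is given below; the clocking blocks are a common convenience for the driver and monitors and are an assumption, not a detail from the paper.

interface fifo_if (input logic clk);
  logic        rst_n;
  logic        wr_en;
  logic        rd_en;
  logic [31:0] din;
  logic [31:0] dout;
  logic        full;
  logic        empty;

  // clocking block used by the driver to drive the DUV pins synchronously
  clocking drv_cb @(posedge clk);
    output rst_n, wr_en, rd_en, din;
    input  dout, full, empty;
  endclocking

  // clocking block used by the monitors to sample pin activity
  clocking mon_cb @(posedge clk);
    input rst_n, wr_en, rd_en, din, dout, full, empty;
  endclocking
endinterface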
C. Verification Components

In this work, master and slave agents are used in the environment. The master agent has a sequence item which defines the sequence, a sequencer which sends the created sequence to the driver, and a driver which drives the inputs to the pins of the DUV through the interface. The master monitor takes note of the input data being fed to the DUV.

The slave agent has a slave monitor which takes note of the output from the DUV. The data from both the master monitor and the slave monitor are sent to the scoreboard. The scoreboard reference model compares the output response with the expected response and indicates whether the test has passed or failed.

The coverage collector has defined covergroups which have cover points and associated bins. The coverage report gives the progress of the verification process.
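A sketch of such a scoreboard is shown below, using a SystemVerilog queue as the reference model for the FIFO. The two analysis implementations (exp_export for data written into the DUV, act_export for data read out of it) match the environment sketch given earlier; all names are illustrative assumptions.

`uvm_analysis_imp_decl(_exp)   // export receiving write-side transactions (expected data)
`uvm_analysis_imp_decl(_act)   // export receiving read-side transactions (actual data)

class fifo_scoreboard extends uvm_scoreboard;
  `uvm_component_utils(fifo_scoreboard)

  uvm_analysis_imp_exp #(fifo_item, fifo_scoreboard) exp_export;
  uvm_analysis_imp_act #(fifo_item, fifo_scoreboard) act_export;

  bit [31:0] ref_q[$];           // reference model: a simple queue
  int pass_count, fail_count;

  function new(string name, uvm_component parent);
    super.new(name, parent);
    exp_export = new("exp_export", this);
    act_export = new("act_export", this);
  endfunction

  // every word written into the DUV is pushed into the reference queue
  function void write_exp(fifo_item t);
    if (t.wr_en) ref_q.push_back(t.wdata);
  endfunction

  // every word read out of the DUV is compared against the head of the queue
  function void write_act(fifo_item t);
    bit [31:0] expected;
    if (!t.rd_en) return;
    if (ref_q.size() == 0) begin
      `uvm_error(get_type_name(), "Read observed while the reference model is empty")
      fail_count++;
      return;
    end
    expected = ref_q.pop_front();
    if (t.rdata == expected) pass_count++;
    else begin
      fail_count++;
      `uvm_error(get_type_name(), $sformatf("Mismatch: expected %0h, got %0h", expected, t.rdata))
    end
  endfunction

  // number of passed/failed responses reported at the end of simulation
  function void report_phase(uvm_phase phase);
    `uvm_info(get_type_name(), $sformatf("Passed = %0d, Failed = %0d", pass_count, fail_count), UVM_LOW)
  endfunction
endclass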
VI. SIMULATION AND RESULTS

The synchronous FIFO is designed and synthesized using Xilinx ISE. The RTL view is shown in Fig. 4. The UVM based test bench is created for verification of the synchronous FIFO [15]. A scoreboard was implemented to check the correctness of the DUV [16].
Functional coverage [17] was calculated using a coverage model [18], and good coverage results were achieved as per the verification plan. The simulation was done in Synopsys VCS as well as Aldec Riviera, and the output waveform is shown in Fig. 5.

Fig. 4. RTL View of Synchronous FIFO.

Fig. 5. Synchronous FIFO output waveform.

TABLE I
FUNCTIONAL COVERAGE REPORT FOR SYNCHRONOUS FIFO

Category      Hits (%)   Goal   FIFO Size
covergroup1   95.83      100    32 x 64
covergroup2   83.33      100    32 x 256
covergroup3   83.33      100    32 x 1K
covergroup4   75.74      100    32 x 4K

TABLE II
CROSS COVERAGE REPORT FOR SYNCHRONOUS FIFO

Cross        Hits (%)   Goal   FIFO Size
readXwrite   100        100    32 x 64
readXwrite   100        100    32 x 256
readXwrite   100        100    32 x 1K
readXwrite   100        100    32 x 4K

The coverage report is generated for the different FIFO sizes. The functional coverage report is shown in Table I. The coverage includes line, toggle and conditional coverage. UVM allows all the covergroups to be defined, and different tests are written to remove the coverage holes in verification. Line coverage checks whether all lines of code are exercised. Similarly, toggle coverage checks for the transition of signals from 0 to 1 and vice versa. Conditional coverage makes sure that all the conditional statements are tested with all possible combinations.
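The covergroups inside the coverage collector can be sketched as follows, continuing the earlier class sketches: coverpoints on the read and write commands and on the data pattern, and a cross of read and write similar to the readXwrite cross reported in Table II. The bin choices are assumptions, not taken from the paper.

class fifo_coverage extends uvm_subscriber #(fifo_item);
  `uvm_component_utils(fifo_coverage)

  fifo_item tr;   // transaction currently being sampled

  covergroup fifo_cg;
    cp_write : coverpoint tr.wr_en;
    cp_read  : coverpoint tr.rd_en;
    cp_data  : coverpoint tr.wdata {
      bins zeros  = {32'h0000_0000};   // all 0's pattern
      bins ones   = {32'hFFFF_FFFF};   // all 1's pattern
      bins others = default;
    }
    rw_cross : cross cp_read, cp_write; // simultaneous read/write combinations
  endgroup

  function new(string name, uvm_component parent);
    super.new(name, parent);
    fifo_cg = new();
  endfunction

  // called automatically for every transaction published by the monitor
  function void write(fifo_item t);
    tr = t;
    fifo_cg.sample();
  endfunction
endclass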

Table II shows the cross coverage report for the different FIFO sizes. The cross coverage checks for simultaneous read and write operations. It checks for cases like: i) write only, ii) simultaneous write and read, and iii) no read and no write. Reading an empty FIFO is not valid and is considered as an illegal bin in the cross coverage. Table III gives the entire coverage summary, including line coverage, conditional coverage and toggle coverage, for the different FIFO sizes.

TABLE III
TOTAL COVERAGE SUMMARY FOR SYNCHRONOUS FIFO

Name        Score   Line    Conditional   Toggle   FIFO Depth
sync_fifo   98.25   94.74   100           100      32 x 64
sync_fifo   92.56   94.74   83.33         99.62    32 x 256
sync_fifo   93.08   94.74   83.33         99.9     32 x 1K
sync_fifo   92.68   94.74   83.33         99.98    32 x 4K

The various important scenarios that were tested on the DUV are: i) read only operation and write only operation, ii) read one data and write one data, iii) simultaneous write and read operation, iv) empty to full and back to empty, v) write to the maximum depth and read it back completely, vi) write after the FIFO is full, vii) read when the FIFO is empty, viii) reset the FIFO in the middle of operation, and ix) writing all data as 0's followed by all 1's and reading the same. There are many more tests which are executed to verify the DUV.
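One of these directed scenarios, writing the FIFO to its maximum depth and then reading it back completely, could be expressed as a sequence along the following lines; the depth value and the class name are illustrative assumptions.

class fifo_fill_drain_seq extends uvm_sequence #(fifo_item);
  `uvm_object_utils(fifo_fill_drain_seq)

  int unsigned depth = 64;   // FIFO depth being targeted

  function new(string name = "fifo_fill_drain_seq");
    super.new(name);
  endfunction

  task body();
    // write until the FIFO is full
    repeat (depth) begin
      fifo_item item = fifo_item::type_id::create("item");
      start_item(item);
      if (!item.randomize() with { wr_en == 1; rd_en == 0; })
        `uvm_error(get_type_name(), "Randomization failed")
      finish_item(item);
    end
    // then read it back completely, down to empty
    repeat (depth) begin
      fifo_item item = fifo_item::type_id::create("item");
      start_item(item);
      if (!item.randomize() with { wr_en == 0; rd_en == 1; })
        `uvm_error(get_type_name(), "Randomization failed")
      finish_item(item);
    end
  endtask
endclass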

VII. CONCLUSION AND FUTURE SCOPE

In this paper, a UVM based architecture for logic sub-system verification is outlined with the example of a synchronous FIFO design. As designs are getting increasingly complex, UVM helps in vertical reuse of the testbench across an entire project. More than 70-80% of the project cycle time is spent on verification, and the UVM methodology, with its strong base class library and the power of System Verilog, helps in reducing the verification time. UVM also provides a constrained randomization feature which significantly helps in defining input stimuli. The UVM based testbench offers reusability of verification components. Currently, the UVM environment has master and slave agents along with a coverage collector and a scoreboard. The UVM configuration database can be used to make the verification components more controllable. The coverage groups can be enhanced to achieve 100% coverage.

REFERENCES
[1] Juan Francesconi, J. Agustin Rodriguez and Pedro M. Julian, "UVM Based Testbench Architecture for Unit Verification," Argentine School of Micro-Nanoelectronics, Technology and Applications, IEEE Catalog Number CFP1454E-CDR, ISBN: 978-987-1907-86-1, pp. 89–94, 2014.
[2] Khaled Salah A, "A UVM-Based Smart Functional Verification Platform: Concepts, Pros, Cons, and Opportunities," 9th International Design and Test Symposium, IEEE, pp. 94–99, 2014.
[3] J. Bergeron, "Writing Testbenches Using SystemVerilog," www.Verificationguild.com, p. 24, 2006.
[4] Jonathan Bromley, "If SystemVerilog Is So Good, Why Do We Need the UVM?," pp. 1–7.
[5] C. Spear, "SystemVerilog for Verification: A Guide to Learning the Testbench Language Features," 3rd edition, Springer, pp. 79–124, 2012.
[6] Deepa Kaith, Janakkumar B. Patel and Neeraj Gupta, "A Technical Road Map from SystemVerilog to UVM," International Journal on Recent and Innovation Trends in Computing and Communication, Volume 3, Issue 3, pp. 1302–1306, 2015.
[7] J. Bergeron, E. Cerny, A. Hunter and A. Nightingale, "Verification Methodology Manual for SystemVerilog," 2006.
[8] Devika K N, Ramesh Bhakthavatchalu, "Programmable MISR modules for Logic BIST based VLSI testing," International Conference on Control, Instrumentation, Communication and Computational Technologies (ICCICCT), pp. 699–703, 2016.
[9] "UVM User Guide 1.2, Accellera," pp. 1–8, October 2015.
[10] Bhaumik Vaidya, Nayan Pithadiya, "An Introduction to Universal Verification Methodology," Journal of Information Knowledge and Research in Electronics and Communication Engineering, ISSN: 0975–6779, Volume 02, pp. 420–424, 2013.
[11] Wei Ni, Jichun Zhang, "Research of Reusability Based on UVM Verification," IEEE, 2015.
[12] Nilay Jayeshkumar Doshi, Sheetal Suryawanshi, Gardas Nareshkumar, "Development of Generic Verification Environment Based on UVM With Case Study on HMC Controller," IEEE International Conference On Recent Trends In Electronics Information Communication Technology, pp. 550–553, 2016.
[13] Navaid Z. Rizvi, Rajat Arora and Niraj Agrawa, "Implementation and Verification of Synchronous FIFO using System Verilog Verification Methodology," Journal of Communications Technology, Electronics and Computer Science, 2015.
[14] Bhavana B M and Ajaykumar D, "Implementation of Low Power Interface for Verification IP (VIP) of AXI 4 Protocol," International Journal of Current Trends in Engineering Research (IJCTER), Volume 2, pp. 1–8, 2016.
[15] Midhun K Dinesh and Ramesh Bhakthavatchalu, "Storage memory/NVM based executable memory interface IP for advanced IoT applications," Fifth International Conference On Recent Trends In Information Technology, p. 8, 2016.
[16] Khaled Khalifa, Khaled Salah, "Implementation and Verification of A Generic Universal Memory Controller Based On UVM," 10th International Conference on Design Technology of Integrated Systems in Nanoscale Era (DTIS), 2015.
[17] Serdar Tasiran, Kurt Keutzer, "Coverage Metrics for Functional Validation of Hardware Designs," IEEE Design & Test of Computers, 2011.
[18] Chien-Nan Jimmy Liu, Chen-Yi Chang, Jing-Yang Jou, Ming-Chih Lai and Hsing-Ming Juan, "A Novel Approach for Functional Coverage Measurement in HDL," IEEE International Symposium on Circuits and Systems, Geneva, Switzerland, pp. 217–220, 2000.
