Transitioning to Advanced Verification Techniques for FPGAs – Catch-22?

A Guest Blog by TVS Founder and CEO, Dr. Mike Bartley


Many FPGA designers find themselves in a catch-22: they recognise that their designs are becoming too complex for their current verification strategies, but adopting advanced verification techniques, such as the Universal Verification Methodology (UVM), seems even more complex. In this blog we provide some practical guidance on a way out.

 

Firstly, let’s consider the structure of a test bench. Test benches need two types of components: active and passive. Active components stimulate the design under test (DUT) to cause activity. Passive components only observe the DUT.

 

So let’s consider an FPGA designer who is used to performing directed testing (the active component) with manual checks of waveforms (the passive component). UVM is a big mountain to climb from such a base camp, but there are some notable staging posts along the way, offered by passive test bench elements. The first is coverage – code coverage and functional coverage – which gives an indication of how much of the DUT has been exercised. The former is automated by the simulator and tells you how much of the code has been exercised or, more importantly, what has not been exercised (e.g. this line has not been exercised, this decision has not been fully exercised, this sensitivity list has not been fully exercised). Exercising every line should be a minimum requirement (do you really trust something where parts of the code have never been tested?). Functional coverage allows you to instrument the DUT to define the scenarios you want to exercise and check that they have been exercised. For example: have I seen back-to-back transactions on the bus; have I seen a master lock the bus for burst transactions; have I seen single-bit and double-bit errors in my ECC checkers?
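
As a sketch of what functional coverage can look like, the SystemVerilog covergroup below captures those three example scenarios; the signal names (back_to_back, bus_locked, ecc_err_count) are hypothetical stand-ins for whatever your design actually exposes.

```systemverilog
// Minimal functional coverage sketch (hypothetical signal names),
// sampled on every bus clock. Each coverpoint mirrors one of the
// scenarios above: back-to-back transactions, locked bursts and
// single/double-bit ECC errors.
covergroup bus_cg @(posedge clk);
  cp_back_to_back : coverpoint back_to_back {
    bins seen = {1'b1};          // a transaction started the cycle after one ended
  }
  cp_locked_burst : coverpoint bus_locked {
    bins locked = {1'b1};        // a master locked the bus for a burst
  }
  cp_ecc_errors : coverpoint ecc_err_count {
    bins single_bit = {1};       // correctable single-bit error observed
    bins double_bit = {2};       // detectable double-bit error observed
  }
endgroup

bus_cg cg = new();  // instantiate once; the simulator then reports hit and missed bins
```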

 

Coverage gives us a better view of what we have tested – that is, the actual scenarios that were reached by the active components. Assertions allow us to add passive checkers to the test bench. For example, if we expect a group of signals to be one-hot then we can add an assertion that checks this is the case on every cycle. If it finds a cycle where it is not the case then the assertion will fire and can take a variety of actions, including stopping the simulation. Such failures are usually quicker to debug because the simulation stops on the cycle where the bug occurs and, in this example, points the designer straight at the drivers on the bus. The above one-hot bus check is combinatorial (i.e. it checks relationships between signals at a particular time) but we can also add sequential checks. For example, we may want to check that “if a master is granted the bus then the master must have previously requested the bus”.
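
Both styles of check can be written in a few lines of SystemVerilog Assertions (SVA). The sketch below assumes hypothetical clk, rst_n, grant and request signals; $onehot0 allows the idle “no grant” case.

```systemverilog
// Assertion sketches (hypothetical signal names).

// Combinatorial check: the grant vector must be one-hot (or all zero,
// i.e. no master granted) on every cycle.
property p_grant_onehot;
  @(posedge clk) disable iff (!rst_n) $onehot0(grant);
endproperty
a_grant_onehot : assert property (p_grant_onehot)
  else $error("grant is not one-hot: %b", grant);

// Sequential check: any master granted the bus this cycle must have
// been requesting it on the previous cycle.
property p_grant_implies_request;
  @(posedge clk) disable iff (!rst_n) (grant & ~$past(request)) == '0;
endproperty
a_grant_implies_request : assert property (p_grant_implies_request)
  else $error("grant asserted without a prior request");
```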

 

Assertions are unlikely to provide a complete check of the responses from the DUT. However, manual checks are time-consuming and error-prone, so an alternative solution is required, and this can be provided by passive scoreboards. First, passive monitors are added to the inputs and to the outputs/state variables. The former capture the stimulus from the active components and the latter capture the responses from the DUT. The monitors should perform some level of abstraction – e.g. a bus monitor might abstract the bus signal activity into the address and data sent over the bus, and similarly for an Ethernet or PCIe monitor. The address and data from the input and output monitors are passed to the scoreboard, where they are matched and checked. Obviously the scoreboard needs to apply quite a lot of intelligence; for example, data sent to a register in the DUT creates an entry in the scoreboard that expects to be matched when a monitor detects the register update.
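
A minimal sketch of the idea, assuming a simple bus where every input transaction should reappear unchanged at the output: two monitors (not shown) push abstracted address/data transactions into the mailboxes below, and the scoreboard matches them in order. A real scoreboard would model the DUT’s behaviour and cope with out-of-order responses.

```systemverilog
// Passive scoreboard sketch. The monitors (not shown) abstract pin-level
// activity into bus_txn objects and push them into the two mailboxes.

class bus_txn;
  bit [31:0] addr;
  bit [31:0] data;
endclass

class scoreboard;
  mailbox #(bus_txn) exp_mb = new();  // filled by the input-side monitor
  mailbox #(bus_txn) act_mb = new();  // filled by the output-side monitor

  task run();
    bus_txn exp, act;
    forever begin
      exp_mb.get(exp);   // expected: stimulus observed at the DUT inputs
      act_mb.get(act);   // actual: response observed at the DUT outputs
      if (exp.addr != act.addr || exp.data != act.data)
        $error("Scoreboard mismatch: expected %h/%h, got %h/%h",
               exp.addr, exp.data, act.addr, act.data);
    end
  endtask
endclass
```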

 

The final step would be to add active components such as automated stimulus generators. If the passive checkers (assertions and/or scoreboard) are in place then the test bench can be left to run without manual intervention, and the passive coverage collection will tell us exactly what we have actually tested.
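
As an illustration of what “automated stimulus” can mean in practice, here is a constrained-random sketch; the transaction fields, the constraints and the drive_bus() task are all hypothetical.

```systemverilog
// Constrained-random stimulus sketch (hypothetical fields and driver).
// The passive checkers decide pass/fail; coverage says when to stop.
class bus_txn_gen;
  rand bit [31:0] addr;
  rand bit [31:0] data;
  rand bit        lock;  // occasionally lock the bus for a burst

  constraint c_addr { addr inside {[32'h0000_0000 : 32'h0000_FFFF]}; }
  constraint c_lock { lock dist {1'b0 := 9, 1'b1 := 1}; }  // lock roughly 10% of the time
endclass

initial begin
  bus_txn_gen txn = new();
  repeat (1000) begin
    if (!txn.randomize()) $fatal(1, "randomize() failed");
    // drive_bus(txn.addr, txn.data, txn.lock);  // hypothetical driver task
  end
end
```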

 

This blog has touched upon three passive test bench components that can be applied to the existing designer’s test bench with minimal change, plus active test bench components that go beyond directed testing strategies. The first two steps – coverage and assertions – can be relatively quick to adopt, with minimal learning and disruption to existing practices. The final two steps – passive scoreboards and active stimulus generation – can be quite time-consuming and might be best approached via Verification IP (VIP). VIP consists of ready-made test bench components, containing all the required passive and active elements, that can be easily connected to the DUT interfaces. So, for example, you could buy an AHB VIP, connect it to the AHB interface, measure the functional coverage and apply a number of assertions to check your AHB interface. In time you could attach the monitor, connect it to a scoreboard and finally start to generate stimulus.
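
With a UVM-style VIP, this staged adoption often comes down to a single configuration switch: most VIP agents can be run passive-only (monitor, coverage and assertions) first, then flipped to active (stimulus generation) later. A sketch, assuming a hypothetical agent at env.ahb_agent; the exact mechanism varies by vendor.

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

// Hypothetical test: run the VIP agent passive-only, so it just monitors,
// collects coverage and runs its assertions alongside existing tests.
class ahb_passive_test extends uvm_test;
  `uvm_component_utils(ahb_passive_test)

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  virtual function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // env construction omitted for brevity
    uvm_config_db#(uvm_active_passive_enum)::set(
      this, "env.ahb_agent", "is_active", UVM_PASSIVE);
    // Later, a derived test can set UVM_ACTIVE instead, so the agent's
    // sequencer and driver start generating stimulus as well.
  endfunction
endclass
```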

 

FPGA designs are becoming more complex and so are the strategies we use to verify them. If you are at base camp, then code coverage, functional coverage and assertions are good first steps. These passive approaches are much easier to adopt and should offer a high return on investment (ROI). Adding automated checks through passive monitors and scoreboards should make verification more efficient (automated rather than manual checking) and open up the prospect of active components to automate stimulus generation. This should create a very effective verification environment, taking you closer to the peak!

 

FPGA design verification will be discussed in detail at Verification Futures, held on February 5th in Reading and online. Verification Futures is a unique, free one-day conference, exhibition and industry networking event organised by TVS to discuss the challenges faced in hardware verification. The event gives end users the opportunity to define their current and future verification challenges and collaborate with the vendors to create solutions. Registration is free via Eventbrite.

Mike Bartley has a PhD in Mathematics from Bristol University, an MSc in Software Engineering and an MBA from the Open University, and over 25 years of experience in software testing and hardware verification. He has built and managed state-of-the-art test and verification teams in a number of companies (including STMicroelectronics, Infineon and Elixent/Panasonic) which still use the methodologies he established. Since founding TVS he has consulted on multiple verification projects for respected organisations including ARM and Infineon.

Dr. Bartley was Chairman of the Bristol branch of the British Computer Society for ten years and is currently Chairman of the Bristol Local Enterprise Partnership (LEP) and a Technical Advisor to the National Microelectronics Institute (NMI). He has had over 20 articles published on the subject of verification and outsourcing.
