EDA News
Monday
June 6, 2005
From: EDACafe

About This Issue

Mentor's Questa Verification Products


May 30 - June 3, 2005
By Dr. Jack Horgan
Read business product alliance news and analysis of weekly happenings

Introduction

On May 16, 2005, Mentor announced its new line of Questa verification products. The Questa products offer built-in support for testbench automation, coverage-driven verification (CDV), assertion-based verification (ABV), and transaction-level modeling (TLM). This initial release includes two new products: Questa SystemVerilog and Questa Advanced Functional Verification (AFV).

I had an opportunity to discuss these products at length with Robert Hum, vice president and general manager of Mentor Graphics Design Verification and Test division.

Whenever a new product is introduced I am always interested in what problem it is trying to solve. Robert referenced the well-known 2002 study by Collett International Research, which surveyed users about their design processes. In particular it tracked the causes of IC/ASIC mask re-spins. The survey found that the leading factor was logic/functional flaws, which occurred in 71% of the cases. The second biggest factor, clocking, occurred in just over 20%. In fact, a 2004 study by the same firm showed that functional errors were an even larger factor. In some sense this is counterintuitive: in a 40- or 50-million-gate design, one would expect that at these fine geometries crosstalk, parasitics or some other weird analog effect would cause the chip not to work. Yet the overwhelming complexity and sheer amount of logic on these chips means that logical or functional errors remain the single largest category.

Robert also shared a slide from Intel, published at DAC 2003, which showed the number of bugs found for each generation of the Pentium. The graph was a straight line on a semilog plot; the bug count correlated well with the number of transistors. Clearly, whatever is happening in terms of design methodology and tools is not reducing the total number of bugs. The indication is that no amount of tool optimization is really going to help. You need a change in methodology.

Given the consequences of functional bugs in terms of the cost of re-spins and the opportunity cost of longer time to market (TTM), it is not surprising that a number of new approaches are being tried: assertion-based verification, functional coverage, constrained-random testing, dynamic-formal verification, transaction-level verification and so on. There are lots of different ways of coming at the problem. The question is whether all of these methodologies will be used, whether one particular methodology will dominate, or how this whole thing will turn out.

Verification as we practice it today is basically testbenches, some sort of executable model of your product, and a simulator. That's today's dominant methodology. That methodology is no longer powerful enough; the task now is to take some of these new methods, create tools and flows around them, and come at verification from that point of view. Methodology shifts require change: the EDA industry has to react and designers have to react.

Robert stressed the importance of standards as an enabler of methodological change. Standards really change the rules; wherever you have standards, the industry can move forward much more quickly. It was really when Verilog was standardized and entered the public domain that the market took off for hardware description languages and RTL design. One thing we're sensitive to is that when something is standardized and all vendors have access to it, a methodology can preferentially take off.

We have taken SystemVerilog, a standard we believe the industry is willing to adopt, and created an implementation that embodies a powerful simulation engine based upon ModelSim. We have included assertion-based verification with both PSL and SystemVerilog. We have coverage-driven verification with native SystemVerilog coverage directives and things like that. And we have a transaction modeling approach that permeates the way our graphical user interface works.

The table below summarizes the features of the three verification products.

    Product                 Languages                        Assertions   Testbench automation
    ModelSim                Verilog, Verilog 2001, VHDL      No           No
    Questa SystemVerilog    + SystemVerilog                  Yes          Yes
    Questa AFV              + SystemVerilog, PSL, SystemC    Yes          Yes

The standard ModelSim product supports Verilog, Verilog 2001 and VHDL, but if you buy that product you don't get assertions and you don't get testbench automation. Questa SystemVerilog adds all the SystemVerilog capability. Questa AFV has not only SystemVerilog but also PSL and SystemC; it is the highest-priced product we have. There are upgrade paths if you already have ModelSim and want to do SystemVerilog; you can buy an upgrade.

Will ModelSim go away at some point?
ModelSim is the most popular simulator in the market; there are over 140,000 licenses out there. ModelSim itself is not going to go away as long as VHDL and Verilog remain in use. When we talk about the industry transitioning to SystemVerilog, of course this will not happen overnight. A large portion of the population will continue to use ModelSim as it is. As people convert over, they get the benefit of having SystemVerilog available, plus the benefit of having assertion-based verification and testbench support in there. I think the conversion in the industry will happen quickly over the next three years; you will see rapid adoption of the SystemVerilog environment. From our point of view ModelSim is the product we've developed and will support until people stop buying it.

Because of the architecture of what we have, ModelSim in a sense is continuously supported, because the simulator is a single kernel, the ModelSim kernel. ModelSim itself is the real guts; it does the execution. We created from scratch the constraint solver, which is the heart of the directed-random testbench. The functional coverage module collects all the coverage data and presents it. This single-kernel environment executes Verilog, VHDL, SystemVerilog and SystemC, all in one nice environment tied together. This makes it easy for us to support and maintain. It's standards based. And it gives higher performance than separate architectures: if you want to use e in a VCS environment, that has to be done through a PLI link, and it slows everything down.

What percent of the 140,000 ModelSim users use SystemVerilog?
That's really hard for us to say because of the way we package ModelSim. We have packages where you can use either Verilog or VHDL, and we have other packages where you can use both Verilog and VHDL. We have not been able to successfully break out the statistics of what people run. It is true to say, from an overall point of view, that VHDL usage is not growing. It is also true that more VHDL users are importing Verilog modules, so if you have a pure VHDL design you might be importing something that comes from a Verilog design. Most of what we have seen is that pure VHDL has been largely replaced by mixed language. Most of what we sell now is mixed language.

The trend seems to be that the VHDL community is not growing. That does not imply that it is shrinking; it is just that we haven't seen growth in this area. We do not see new customers adopting it, but there are lots of VHDL users that are integrating bits and pieces of SystemVerilog.

When you are faced with debugging a design of 40 or 50 million gates, one of the problems is: Where do you start looking? If you find a problem, how do you narrow down where the bug is, and how do you solve it? ModelSim has had its own GUI for years; it is one of its strengths. What we've done is add debug and analysis: transactions, coverage, a whole lot of things. This new GUI will add lots of productivity. People find that if you're using PSL standalone and writing assertions, the assertions themselves sometimes have bugs in them. The question is: How do you debug the assertions? You can't debug them in a normal simulator GUI because that only has waveforms. So inside the GUI we now have an assertion debugger and a bunch of support to really help people be productive with whatever language they want to use.

If you look at verification, it can be divided into three main pieces: ABV, CDV and AVM.
ABV - Assertion-Based Verification. The aim is to improve the time to finding bugs. ABV depends on designers embedding assertions in their designs.

CDV - Coverage-Driven Verification. The aim is to improve the time to reach coverage. CDV depends on being able to search and measure coverage. Coverage is the metric you use to figure out whether you have done enough verification. You might say, "I have run all my tests and haven't found any bugs," but that doesn't mean you have covered enough of the circuit.

AVM - Advanced Verification Methods. The aim is to improve design quality. There are some things you would do in simulation, but there are other things you could do in simulation that are better off done in another tool. An example is clock domain crossing. The issue is that you have many clock domains in your design; we have seen designs with 90 clock domains in them. One of the problems is how to make sure that your signals are handed off properly from one domain to the other. You have to use FIFOs, arbiters and things like that. To make sure your design works, you would write a testbench to simulate the data transfer and handoff and verify that it all works. It's very time consuming if you have to write a testbench that specifically sets things up that way. But with the technology in Questa that comes out of the 0-In acquisition we did last year, we can now do that kind of clock domain analysis statically; you don't need a testbench. That's an example of an advanced verification method. We can also do things like metastability analysis and a few other things that get people off the hook of having to write testbenches. Rather than sitting there writing testbenches, why not try another way of verifying what you're trying to do, and not rely only on the simulator.
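
To make the idea concrete, here is a minimal SystemVerilog sketch of the kind of structure a static CDC tool looks for at a domain boundary: a conventional two-flop synchronizer carrying a single-bit signal into a new clock domain. The module and signal names are illustrative, not taken from Questa or 0-In.

    // Sketch of the structure static CDC analysis expects at a domain
    // boundary: a two-flop synchronizer for a single-bit signal.
    // All names here are illustrative.
    module sync_2ff (
      input  logic clk_dst,   // destination-domain clock
      input  logic rst_n,     // active-low reset, destination domain
      input  logic d_src,     // signal launched from the source domain
      output logic d_dst      // synchronized copy in the destination domain
    );
      logic meta;             // first stage; may briefly go metastable

      always_ff @(posedge clk_dst or negedge rst_n) begin
        if (!rst_n) begin
          meta  <= 1'b0;
          d_dst <= 1'b0;
        end else begin
          meta  <= d_src;     // capture the asynchronous input
          d_dst <= meta;      // second flop filters the metastability out
        end
      end
    endmodule

A static tool can check that every crossing passes through a structure like this without any testbench stimulus at all, which is exactly the saving Hum describes.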

Probably the best-known example of an AVM would be something like logic equivalence checking. Mentor has been selling that for years, but now with new technology we have some other AVMs.

By the way, what is your definition of an assertion?
Basically an assertion is a statement of design intent. Assertions can be either embedded in your design file or put in another file and linked in through another method. An assertion is a statement of intent like: if signal A rises and signal B falls within 3 clock cycles, then C has to be zero. You put that in, and as you run your simulation the assertion sits there; whenever signals A and B behave in the way the assertion triggers on, it verifies that within some period of time the other signal will be zero. Assertions can be much more complicated than that. You can do things like data use: you can say I have this register that fans out to four places, and within 3 clock cycles of setting up the register that data has to be latched into one of those four places. There is a library of these assertions. Part of it is standardized, part of it is stuff we have in our system, and then there are some you write yourself. These assertions are integrated with our kernel execution engine and tied into the metrics system.
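
Hum's example translates almost directly into a SystemVerilog assertion. The following is a minimal sketch with hypothetical signal names (a, b, c, clk); it is not taken from any Questa library.

    // If a rises and b then falls within 3 clock cycles,
    // c must be zero at that point.
    module intent_check (input logic clk, a, b, c);
      property a_rise_b_fall_c_zero;
        @(posedge clk) $rose(a) ##[1:3] $fell(b) |-> (c == 1'b0);
      endproperty

      assert property (a_rise_b_fall_c_zero)
        else $error("Design intent violated: c was not 0");
    endmodule

The simulator evaluates the property on every clock edge but checks the consequent only when the triggering sequence actually occurs, just as described above.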

Assertions have been around since the mid-nineties. Kantrowitz and Noack reported in 1996 that 34% of all bugs found on the DEC Alpha 21164 project were identified by assertions. Foster and Coelho reported in 1998 that 85% of all bugs found on an HP project were identified by assertions.

Nine years ago people were complaining about assertions and questioning whether they were a viable way of doing verification. Here we are nine years later, and finally there is a standard language out there that everybody can jump on the bandwagon with. People who were at the leading edge in 1996 already saw the value of these assertions, and now many people at the leading edge think this kind of design methodology will be the one that dominates.

It's not something totally new that the world doesn't understand; it's been around for years. The language elements of SystemVerilog have been well thought out. I think things are going to happen now.

Once you have assertions you move into coverage. You begin to ask questions like: Are there pieces of my circuit that I haven't adequately verified? What are those pieces? You start asking questions like that to try to understand when you have done enough. When is enough, enough? The philosophy a lot of people have is: if it isn't verified, you have to assume it is broken.

In today's world you have to come at coverage from two points of view: top-down and bottom-up.

In the top-down approach you say: whatever I design has to meet the specification. You're not worried about the implementation per se; you are worried about whether the functionality you have included in your design meets the spec.

In the bottom-up approach, given that the design does include the functionality that is in the spec, you worry about whether you have implemented it properly. Things like clock domain crossing issues, metastability issues and so on are all about implementation. Did you implement the circuit in a way that is robust under all process conditions and all sets of manufacturing rules?

As a verification vendor we need to cover both the top-down specification verification and the bottom-up implementation verification.

Languages like e and Vera have come at things either exclusively from the specification or exclusively from the implementation. It has really only been since SystemVerilog showed up that you could address both the top-down and the bottom-up ways of working.

In order to answer the question "How do I know I have done enough simulation?", you need some notion of what you are measuring. Structural coverage asks things like: did I execute enough of the statements in my model? For sure, if you haven't executed a statement in your model, then you haven't verified anything about it. But the issue with that kind of coverage is that even if you cover all the statements in your model, you can still have functional bugs in there. What happens when your FIFO overflows or underflows? Does the circuit handle the error condition properly? So structural coverage is easy to do but doesn't tell you a lot. We have to move to things like transaction coverage, which is about protocol; functional coverage, which is about how well the design matches the spec; and assertion coverage, which is about implementation. Code coverage is whether all the lines in your model have actually been executed. All of these metrics have to be asking the question: Have I simulated enough, have I verified enough? Inside this Questa release all these metrics work together.
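
As a hedged illustration of the difference, the SystemVerilog covergroup below records whether the FIFO error conditions just mentioned were ever exercised, something statement coverage alone cannot tell you. The signal names are hypothetical.

    // Functional-coverage sketch: did we ever attempt a write while
    // full, or a read while empty? Auto bins 0/1 record each case.
    module fifo_cov (input logic clk, full, empty, wr_en, rd_en);
      covergroup fifo_conditions @(posedge clk);
        overflow  : coverpoint (full  && wr_en);  // write attempted when full
        underflow : coverpoint (empty && rd_en);  // read attempted when empty
      endgroup

      fifo_conditions cg = new();  // sample on every clock edge
    endmodule

Every statement in the FIFO could be executed while both of these bins remain at zero, which is precisely the gap functional coverage is meant to expose.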

The model will have embedded assertions that instrument certain areas inside the circuit and provide feedback around coverage; that feedback is sent back to the testbench, which makes decisions about the next set of tests that ought to be run. This is the way you get into the directed-random testbench pioneered by Verisity. Inside you now have everything that PSL can do and everything that SystemVerilog can do. You have coverage directives and coverage metrics, and we have created a GUI and database where we collect together everything that is possible to know about what is covered and what is not covered in your circuit, all color coded: red is bad and green is good. It's kind of a cockpit that tells you whether or not you have holes in your verification and test plan. A very powerful thing. It enables designers and design groups to answer the question: Am I done verifying yet? The way we have done this, you can actually see where you haven't got coverage. Not only can you answer the question "Am I verifying enough?", it tells you that the green areas are done while the red ones need more work, and it tells you why something hasn't been verified.

The effect of adding metrics and assertions to your circuit is really to increase your productivity. You no longer have to write testbenches that are as long, because of directed-random technology; SystemVerilog takes care of fleshing out the actual data. We support random, reactive and transaction-level modeling, all different approaches to writing testbenches. You can use them together or one at a time.

The testbench measures coverage by monitoring some of the assertions. Having measured coverage, it goes back in a loop and modifies its constraints to go on to explore other paths and other functionality it has not already exercised. The key to the whole thing is to have enough assertions in your design to monitor what is going on; you have to be able to measure something in order to tell whether it has been covered or not.
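
Here is a minimal SystemVerilog sketch of that loop, assuming a hypothetical bus transaction: the default constraint keeps stimulus in one address range, and when coverage feedback shows the other range untouched, the testbench swaps constraints and randomizes again.

    // Constrained-random stimulus whose constraints are steered by
    // coverage feedback. All names and ranges are hypothetical.
    class bus_txn;
      rand bit [7:0]  addr;
      rand bit [31:0] data;
      rand bit        is_write;

      constraint c_addr { addr inside {[8'h00:8'h7F]}; }  // default range
    endclass

    module tb;
      initial begin
        bus_txn t = new();

        // First batch: explore the default (lower) address range.
        repeat (10) begin
          if (!t.randomize()) $error("randomize failed");
        end

        // Suppose coverage shows the upper range was never hit: disable
        // the default constraint and steer the solver there instead.
        t.c_addr.constraint_mode(0);
        repeat (10) begin
          if (!t.randomize() with { addr inside {[8'h80:8'hFF]}; })
            $error("randomize failed");
        end
      end
    endmodule

In a real flow the decision to retarget would come from the coverage database rather than being hard-coded, but the mechanism, constraints modified between randomization batches, is the same.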

In Questa you can now officially use transactions, which are a method of abstraction. We all know that when you use abstraction you increase productivity, leaving the details either to some kind of automation, or allowing a division of responsibility so that more people can work on something.

Transaction-level modeling is primarily a methodology, but it is also supported by some functionality and capabilities that we have built into Questa. The point of TLM is to interface your testbench to your circuit at a level higher than the pin level. The problem with an interface at the pin level is that you have to completely specify the exact timing of all the data that goes in there; it becomes quite tedious. Testbenches talk to chips through protocols, and a protocol is really a macro version of some kind of handshake or timing diagram: a transaction between one entity and another. We now give people the capability to create these so-called transactors. We have seen testbench speedups of 1000x. It helps you write more efficient testbenches. A lot of people in the industry have been talking about this, but it has now become part of the Questa environment.
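
The sketch below shows the idea in SystemVerilog terms: the testbench hands a whole transaction to a driver, and only the driver knows the cycle-by-cycle pin protocol. The transaction, interface and handshake here are hypothetical, not Questa's own API.

    // A transactor: tests deal in transactions, the driver deals in pins.
    class mem_txn;
      rand bit [15:0] addr;
      rand bit [31:0] data;
    endclass

    interface mem_if (input logic clk);
      logic        valid;
      logic [15:0] addr;
      logic [31:0] data;
    endinterface

    class mem_driver;
      virtual mem_if vif;
      function new(virtual mem_if vif); this.vif = vif; endfunction

      // Expand one transaction into the exact pin-level timing.
      task drive(mem_txn t);
        @(posedge vif.clk);
        vif.valid <= 1'b1;
        vif.addr  <= t.addr;
        vif.data  <= t.data;
        @(posedge vif.clk);
        vif.valid <= 1'b0;
      endtask
    endclass

Because tests never touch the pins directly, the same test can be reused unchanged if the protocol timing is revised; only the driver is rewritten.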

Rather than looking at pure waveforms that are just 1s and 0s, you now have boxes that represent the entire protocol. Inside the boxes we've extracted the fields and can tell you the value of each field, so that you can quit worrying about the details of each signal. That's been very helpful to people; the feedback has been tremendous.

Our philosophy has been to use public standards. We believe our industry needs a methodology shift for verification; it is not going to come from just faster simulators, not just 5x or 10x. You need a different way of eliminating the functional errors that are occurring in large designs. So we think that supporting a standard like SystemVerilog is the best way. We also believe that the next generation of verification technology will be based on assertions; they deliver higher quality and find more bugs faster. Also in Questa we have an engine and coverage metrics for coverage-driven verification, where you can connect your testbench to the coverage metrics and then modify your constraints based upon what is happening inside your circuit. Testbench automation is the piece that makes coverage-driven verification work.

What is the availability and pricing of these new products?
The Questa AFV platform will ship in Q2 2005 and is listed at $42K (USD) perpetual. Questa SystemVerilog will also ship in Q2 2005 and is listed at $28K (USD) perpetual. Term pricing is available. ModelSim is listed at $21K (USD) perpetual.

How many of the 140,000 ModelSim users currently use assertions?
We have had PSL available for about a year; PSL has been the only assertion language available out there in public. It is hard to tell how many people use it because it is part of the package, so everybody who has renewed now has it. We can judge a bit from hotline calls and things like that. I would say on the assertion side maybe 35%. Some of the licenses we have out there are on farms, so that is 35% of the users, not 35% of the licenses. Some of our largest customers have been using it for years, and larger customers account for a larger number of seats and hence represent more in the license space. It's not 100%, but it is a surprisingly large number.

Where does one insert assertions in the flow?
If you are a designer, you put them in while you capture your RTL. Those are assertions generally around the implementation, like: if a FIFO overflows, I have to see that signal within so many cycles. Those assertions are put in as you capture the design because they are used in block testing. There are assertions the verification team puts in later, not as part of the design process but as part of the verification process. These assertions are put in for very different reasons.
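
A hedged sketch of the kind of implementation assertion a designer might embed at RTL capture, using hypothetical FIFO signals: a write into a full FIFO must raise the overflow flag on the following cycle.

    // Designer-embedded implementation check for a FIFO.
    module fifo_checks (input logic clk, full, wr_en, overflow);
      assert property (@(posedge clk) (full && wr_en) |=> overflow)
        else $error("write while full did not raise overflow");
    endmodule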

There is a third reason for using assertions. If I am building a block that I know I am going to reuse, I will put assertions both inside and outside that block to make sure that whoever uses it doesn't violate my interface. Those assertions are not there because of the design; they are there to protect the block. They can be put in either during the design or afterwards in an ad hoc manner.

The idea is: let me protect this design by adding assertions that will fire so the user sees a failure; then he can diagnose why the failure occurred. Those assertions can be put in before or after the design is done.
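
For that third, reuse-protection case, a block author might ship a checker like the hypothetical one below and bind it to the block's boundary, so that any integrator who violates the protocol sees an immediate, diagnosable failure. All names are invented for illustration.

    // Interface-protection checker for a reusable block: fires if a
    // user drives req while the block is not ready.
    module handshake_guard (input logic clk, req, ready);
      assert property (@(posedge clk) req |-> ready)
        else $error("protocol violation: req asserted while not ready");
    endmodule

    // The author can attach it without touching the user's code:
    // bind my_block handshake_guard guard_i (.clk(clk), .req(req), .ready(ready));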

Would existing users of ModelSim have to change their existing designs to use Questa?
There is nothing that forces you to add assertions. I can take a block that has no assertions and use it. It will work just fine. You can adopt SystemVerilog and assertions piecemeal.

Aren't the benefits of Questa linked to the use of assertions?
The more places you use assertions, the better off you are. In fact, one coverage metric is assertion density, which measures whether you have sprinkled assertions throughout your design. Your insight is correct.

If you have a methodology change that says your customers must convert 100% of their designs before they see any benefit, people will say they can't afford to do that. What happens here is people say: while I don't get 100% of the benefit, I have so many problems with functional bugs that I am going to get some benefit. I'm going to get as much benefit as my investment: if I invest a lot, I will get a lot of benefit; if I invest a little, I will get a little benefit. You have a linear relationship, and this helps people adopt. If SystemVerilog were a brand new language that required wholesale conversion, it would be a disaster. You would be telling people: oh, by the way, you have to convert all of your designs into SystemVerilog before they will run. That would be a non-starter; there would be no way to have the community wrap their arms around it. That's why we have things like PSL in the market that work with Verilog and VHDL. You can take a design that's up and running and say: I'm having trouble understanding whether my arbiter works, so I'll put in some assertions around that, and at least I will make a little progress. People learn; you start with training wheels and get better and better at writing these assertions. It's a piecemeal strategy that people can use to get some value immediately, right out of the blocks, rather than wait for 9 months and not get anything. I think that's important.

Do existing ModelSim customers use Mentor design tools or tools from other vendors?
ModelSim users are spread all over the place. We have interfaces to other vendors' tools (Verisity, for example). We also have another product called Seamless, which is kind of an ESL product. It's all over the map.

According to the latest EDAC numbers, Mentor is now the largest verification company, having surpassed our friends at Cadence: analog, mixed-signal, functional verification, emulation and so on. People choose their simulators based upon the productivity they get in verification. The statistics you get from Ron Collett are that verification takes up 60% to 80% of the overall design cycle. Simulation is used very early, when people start writing RTL; they then start to package it up and test. There are enough standards in place, Verilog and VHDL, that you can capture in one vendor's tools and verify in another's. The point of standardization is that customers can have a flow even if they do not buy all of the tools from one vendor. We still manage to differentiate the kind of value we offer.



The top five articles over the last two weeks, as determined by the number of readers, were:

TransEDA announces Assertain™, the first independent Verification Closure Management tool. Covering all front-end design stages from original text specification through to validated RTL, Assertain monitors, measures and manages the verification process in one integrated environment. The tool seamlessly brings together rule, protocol and assertion checking; code and assertion coverage; design and assertion coverability analysis; test grading and optimization, linked to specification coverage using proven requirements traceability techniques.

Mentor Graphics Users Conference (U2U 2005) Proceedings Papers Now Available

Mentor Graphics Strengthens its Automotive Solutions Portfolio with the Acquisition of Volcano Communications Technologies. Volcano's automotive networking series includes network design tools, embedded software and test and validation tools for all major automotive networks. Terms of the deal were not disclosed.

New Cadence PowerMeter Technology Enables Signoff-Quality Dynamic Power Rail Verification; VoltageStorm Dynamic IR Drop Analysis Enhanced with Sophisticated Power Consumption Analysis Capabilities. PowerMeter enables design teams to accurately calculate and distribute leakage, internal and switching power consumption for every instance of their design

Celoxica ESL Tool Gets Faster and More Physical; The Leading ESL Design Suite for Algorithm Acceleration Sets New Standards in Performance and Ease-of-Use, Optimizes the Connection to SoC Flows. DK4 introduces new VHDL and Verilog output optimizations for interfacing with Design Compiler from Synopsys. In addition to RTL input for the SoC flow, DK4 also supports automatic scripting for SoC test bench generation. This interface bridges the gap between ESL and latest SoC physical design and verification flows.



Other EDA News

Calypto and Mentor Graphics Integrate Tools for Verifiable, Automated Path from System to RTL; Calypto Joins Mentor OpenDoor Program

EVE, Novas Complete Integration Between EVE's Hardware-Assisted Verification Platform, Novas Debug System; Optimized Tool Flow Streamlines SoC Debug with EVE's ZeBu

OEA International Enhances DP-PLAN™, Dynamic Power Planning Tool with Visual Feedback

OEA International Boosts Performance of P-GRID™ Power Distribution Analysis Tool

OEA International Boosts NET-AN™ 3D Critical Net Extractor Performance for Nano-Meter Technologies

Design For Debug Meeting To Be Held At 42nd Design Automation Conference

Cadence President & Chief Executive Officer, Mike Fister, to Present at the SG Cowen Technology Conference

At DAC 2005 OEA International Invites You to Attend High Speed Digital & RF Design Presentations and Demos

OEA International Announces Enhancements to Spiral Inductor Design and Synthesis Tool

Summit Design Expands Its ESL Solution Suite with Vista 1.1 to Deliver Advanced Analysis and Debug for Expert and Novice SystemC Users; Powerful Automatic Transaction-Level Modeling Viewer Provides Comprehensive System-Level Design Observability

HP Strengthens Adaptive Enterprise Solutions with New Offerings; Company Launches World's Most Reliable Server, First Unified Infrastructure Management Software

Silicon Dimensions adds new levels of POWER analysis, modeling and prediction to Chip2nite

VaST Systems Releases Virtual Processor Model Transformer, Latest in Line of Powerful "Constructor" Tools

Sigrity Solution Optimizes IR Drop Analysis for Packages and Boards

Huawei Adopts Synopsys VCS Native Testbench to Accelerate Verification of Networking and Communications ASICs

Synopsys Advances VCS Solution by Adding Assertion IP Library and Native Testbench Support for SystemVerilog

Other IP & SoC News

EZchip Doubles the Price-Performance of 2.5-Gigabit Network Processors; Provides the NP-2/5, a 5Gbps duplex NPU with 10xGE ports and Traffic Manager

TI Provides Free SPICE Circuit Simulation Program for High-Performance Analog Designs

SiRF Technology Acquires Motorola's GPS Chip Set Product Lines

Agere Systems Announces Money-Saving Software, a Higher-Performance Network Processor, and Multiservice Convergence Demo at SUPERCOMM

Cygnus Communications, Inc. Completes Acquisition of SiWorks, Inc.

Conexant Unveils Complete Family of VDSL and VDSL2 Semiconductor System Solutions; Accelity Production Chipsets Shipping to Customers for Global VDSL2 Deployments

AMCC Demonstrates ATCA-Based, Multiservice Evaluation Platform at Supercomm 2005; Enables OEMs to Drastically Reduce Costs, Risk and Time-to-Market When Developing Next-Generation Networking Solutions

TI Introduces Class-D Audio Power Amplifier for Flat Panel Displays

Fairchild Semiconductor Updates Guidance for the Second Quarter

FSA Finds Funding of Fabless Companies Rises 36% QoQ; Segment Raised $357.9M in Q1 2005

Actel Broadens Popular PCI Product Family With CorePCIF

National Semiconductor Introduces Energy-Efficient, Integrated AC-DC PWM Controller For Power Systems

TDK Corporation announces Acquisition

TI and RadioScape Launch First Chips and Modules for Digital Radio Mondiale Standard

Altera Narrows Second Quarter Revenue Guidance

New Analog Devices Instrumentation Amplifier Delivers Precision Performance to Low-Voltage Applications

Zarlink Introduces World's First Wireless Chip Designed Specifically for In-Body Communication Systems

Atmel Announces First High-Speed 8-bit MCU With Integrated USB to Serial Bridge Interface Capabilities


Copyright © 2005, Internet Business Systems, Inc. — 496 Salmar Ave. Campbell, CA 95008 — 888-44-WEB-44 — All rights reserved.