Dear friends and colleagues,
The demand for analog verification continues to grow, and that growth is now limited by a shortage of analog verification engineers. As such, we have begun offering training classes in analog verification. So far these classes have been offered on-site to several of our customers and have always been well received. To open these classes to a wider audience, we have arranged our own training facilities. Our first class will be held in Santa Clara at the end of September. Our article Introducing Analog Verification describes what you can expect to learn in the class.
As always, feedback is greatly appreciated.
Henry Chang and Ken Kundert
Verification Training Class
30 September to 3 October, 2008 in Santa Clara, California
This challenging four-day course provides participants with the tools they need to take on the task of verifying complex analog, RF, and mixed-signal integrated circuits. It combines lectures with a substantial amount of time in the lab to teach the overall analog verification process. You will learn how to develop a verification plan, functional models of analog blocks, regression tests for those models, and a fully verified Verilog model of the entire analog portion of the design for use in chip-level verification.
The class is intended for anyone who would benefit from a working knowledge of analog verification. This includes analog verification engineers, analog designers, and digital verification and CAD engineers who meet the prerequisites.
The class will be taught by Ken Kundert and Henry Chang.
Students should have a working knowledge of Verilog-A, analog circuits, and the Cadence design environment. It is also helpful to have gone through Verilog-AMS training. The better prepared you are, the more you will get from the class.
The class will be held from 30 September to 3 October in Santa Clara, CA. The price is $2250 until August 15, when it rises to $2600.
For more information or to sign up, visit www.designers-guide.com.
The Designer’s Guide Community website has two new boards on its Forum dedicated to verification. You can use these boards to discuss or ask questions about verification. In fact, if you have any questions or comments about this newsletter, we encourage posting them there.
We’d like to announce that Designer’s Guide Consulting has grown: Xiaoyang (Sunny) Zhang has recently joined our team. We have seen a growing need from customers for help writing models and test benches for their analog blocks, and Sunny is available to provide it. She follows our strict guidelines and best practices for writing high-quality, efficient models and test benches for analog and mixed-signal blocks. Please contact us if you have such a need.
Designer’s Guide Sponsors 2008 Custom Integrated Circuits Conference
Having attended CICC consistently for more than a decade, we have found it to be a great place for analog, mixed-signal, and RF designers to hone their design knowledge. CICC presents first-published work in the areas you are likely working in, such as high-speed serial I/O, data converters, RF design, analog circuit design, and power management. There are also many CAD papers aimed at helping the analog designer. CICC is, at its core, an educational conference: presenters tend to explain both what was done and how it was done to aid other designers. There are also educational sessions and a poster session to foster communication and learning.
Introducing Analog Verification
By Henry Chang and Ken Kundert
Currently, 90% of all SOCs contain analog circuitry, and the analog content of these SOCs averages a relatively constant 20% of the area of the SOC. This analog circuitry is implemented in CMOS and is relatively complicated, with hundreds and often thousands of digital control signals. Without a methodical and well-designed verification process, the complexity of these circuits results in increasing numbers of functional errors. Generally one or more ‘test chips’ are planned during the development of an SOC to test new circuits and architectures, tune yield, and catch errors. However, unlike a missed performance goal, functional errors are problematic because they considerably reduce the value of the test chip: they degrade your ability to test and verify important aspects of your chip and software, and can make the test chip worthless as a demonstrator for your customers.
Today, very few design groups employ a systematic analog verification methodology. Designer’s Guide Consulting is working to establish just such a methodology, one that has been well tested and shown to be both effective and practical. It is this methodology that is described in this article and taught in our class. If you are in charge of producing a chip, you might ask about its benefits and costs. If you are involved in the chip's design, you might ask how it would affect you: will it be a burden or a help? If you are interested in becoming involved in the verification itself, you might ask whether this is something that fits your skills and interests. This article attempts to give you the information to answer these questions at a conceptual level while filling in some of the details by way of a simple example.
Our methodology tends to find three types of errors. First, it finds errors within individual analog blocks that implement many settings. These errors are often subtle problems in the control logic that the designer misses when there are simply too many settings to test. For example, when designing a programmable gain amplifier with 64 gain settings, the typical designer will only simulate a few representative settings: the highest, the lowest, and maybe one or two in the middle. With this approach, errors in the least significant bits of the gain control will likely go unnoticed. The second type of error is the inter-block communication error; chicken-and-egg problems and the like. As an example, consider an input bias block, perhaps responsible for keeping a large decoupling capacitor charged and ready for rapid start-up while the chip is in a power-down state, that is itself dependent on the main bias generator, which is disabled in that power-down state. Such errors go undetected because blocks from different designers would have to be simulated together for them to be noticed, and that is rarely, if ever, done. The third type of error is in the digital circuitry that controls the analog, produces its input, or processes its output, or in the interface between the analog and digital circuitry. These generally go undetected because it is not possible to co-simulate the analog and digital sections in any meaningful way (transistor-level simulations are much too slow).
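To make the first class of error concrete, consider a small numerical sketch in Python. The PGA, its gain law, and the particular wiring bug below are all hypothetical, invented for illustration: two least-significant control bits are accidentally swapped, so spot checks at the lowest, middle, and highest settings all pass while an exhaustive sweep immediately exposes the problem.

```python
def pga_gain(code):
    """Intended gain law of a hypothetical 64-setting PGA: gain = 1 + code."""
    return 1 + code

def pga_gain_buggy(code):
    """The same PGA with control bits 0 and 1 accidentally swapped in the wiring."""
    b0 = (code >> 0) & 1
    b1 = (code >> 1) & 1
    return 1 + ((code & ~0b11) | (b0 << 1) | b1)

# Spot checks at the lowest, middle, and highest settings all pass,
# because swapping two equal bits changes nothing at those codes...
spot_ok = all(pga_gain_buggy(c) == pga_gain(c) for c in (0, 32, 63))

# ...but an exhaustive sweep flags every setting where the two bits differ.
failures = [c for c in range(64) if pga_gain_buggy(c) != pga_gain(c)]
print(spot_ok, len(failures))   # prints: True 32
```

Half of the 64 settings are wrong, yet the conventional spot-check strategy reports a clean bill of health, which is exactly why our methodology sweeps every setting.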
Our methodology addresses these potential errors with two new tools: exhaustive self-checking test benches based on Verilog-AMS, and pin-accurate functional models written in either Verilog or Verilog-AMS. To see the importance of both, consider the equalizer shown below, as might be used as part of a high-speed digital data transmission system. During the course of the design the equalizer itself must be verified to function correctly. In addition, usually it is desirable to build a Verilog model of the equalizer that can be used when verifying the overall system. Our methodology fulfills both of these needs.
Writing an exhaustive equalizer test bench for a Spice simulator would be very difficult, but Verilog-AMS provides a rich language that can be used to easily describe the needed tests. An example is given below. It consists of 20-30 lines of code that are tailored to the device under test and are used to thoroughly exercise it and confirm that it produces the expected output (for brevity, boilerplate code is not shown). The test bench operates by applying a unit impulse to the input and then monitoring the output as the impulse propagates through the delay line. On the first clock cycle the output should be proportional to k0, on the second to k1, and so on. After the unit impulse exits the delay line, the coefficients are changed and a new impulse is fed in. Notice that each coefficient is stepped through all of its 16 possible values, and each coefficient is always set differently from the others. This minimizes the likelihood that a wiring or logic error will be missed.
module testbench ();
    reg [3:0] k0, k1, k2, k3, k4;
    integer i;
    logic2p5 in, clk;
    supply2p5 vdd, gnd;

    equalizer DUT(.out(out), .in(in), .clk(clk), .k0(k0), .k1(k1), .k2(k2), .k3(k3), .k4(k4), .vdd(vdd), .gnd(gnd));

    always #1 clk = ~clk;   // free-running clock

    initial begin
        clk = 0;
        k0 = 0;
        k1 = 0;
        k2 = 0;
        k3 = 0;
        k4 = 0;
        in = 0;
        // clear out the delay line
        for (i=0; i<5; i=i+1)
            @(posedge clk);
        // send an impulse through delay line and check coefficients
        for (i=0; i<16; i=i+1) begin
            k0 = (i+0);
            k1 = (i+1);
            k2 = (i+2);
            k3 = (i+3);
            k4 = (i+4);
            @(negedge clk) in = 1;   // apply a one-cycle unit impulse
            @(negedge clk) in = 0;
            checkOutput(k0/75.0, V(out), "k0", k0, 10m);
            @(negedge clk);
            checkOutput(k1/75.0, V(out), "k1", k1, 10m);
            @(negedge clk);
            checkOutput(k2/75.0, V(out), "k2", k2, 10m);
            @(negedge clk);
            checkOutput(k3/75.0, V(out), "k3", k3, 10m);
            @(negedge clk);
            checkOutput(k4/75.0, V(out), "k4", k4, 10m);
        end
        $finish;
    end
endmodule
When run on the circuit, the test bench produces the waveforms shown below. However, one does not need to view the waveforms to determine whether the circuit is working correctly. This test bench runs 80 separate tests on the equalizer, all while driving each of the inputs independently through all possible values. When run on the circuit, it produces a summary output that indicates which tests the circuit passed and which it failed. If the test bench is written exclusively from the functional specifications for the block, if the tests are comprehensive, and if they all pass, then the circuit implementation has been verified to be functionally equivalent to the specification. This in itself is generally much more verification than is done for analog blocks today. However, this is only the first of three sets of tests; it verifies that each individual analog or mixed-signal block is implemented correctly. The next two sets verify that the blocks operate together as expected and that the entire analog subsystem inter-operates properly with the digital subsystem.
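The checkOutput task itself is part of the boilerplate omitted from the listing. Purely as an illustration of what such a self-checking comparison does, here is a Python analogue; the name, argument order, and reporting format are our assumptions, not the actual Verilog-AMS task:

```python
# Running tallies of the pass/fail summary the test bench prints at the end.
passed = 0
failed = 0

def check_output(expected, measured, name, setting, abstol=0.010):
    """Hypothetical analogue of the checkOutput task: compare a measured
    output against its expected value within an absolute tolerance (here
    10 mV, matching the 10m in the listing) and tally pass/fail."""
    global passed, failed
    if abs(measured - expected) <= abstol:
        passed += 1
    else:
        failed += 1
        print(f"FAIL: {name}={setting}: expected {expected:.4f}, got {measured:.4f}")

# Example: coefficient k0 = 9 should give a normalized output of 9/75.0 = 0.12.
check_output(9/75.0, 0.1205, "k0", 9)   # within 10m of 0.12, so it passes
check_output(9/75.0, 0.2000, "k0", 9)   # off by 80m, so it fails
print(f"{passed} tests passed, {failed} failed")
```

The point is that pass/fail is decided by the test bench itself, so a run of 80 tests reduces to a one-line summary rather than 80 waveforms to inspect.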
To take the next step, it is necessary to have a pin accurate functional model of each analog or mixed-signal circuit block in the analog subsystem. The model for our equalizer is shown below.
module equalizer(out, in, clk, k0, k1, k2, k3, k4, vdd, gnd);
    output out; electrical out;
    input in, clk, vdd, gnd;
    input [3:0] k0, k1, k2, k3, k4;
    reg z0, z1, z2, z3, z4;
    logic2p5 in, clk;
    supply2p5 vdd, gnd;
    integer result;

    // delay line
    always @(posedge clk) begin
        z0 <= in;
        z1 <= z0;
        z2 <= z1;
        z3 <= z2;
        z4 <= z3;
    end

    // weighted summer
    always @(*) begin
        result = z0*k0 + z1*k1 + z2*k2 + z3*k3 + z4*k4;
        if ((^result === 1'bx) || (vdd !== 1) || (gnd !== 0))
            result = 0; // set output to 0 if there is a problem with vdd or gnd, or if result contains unknowns (x)
    end

    analog V(out) <+ transition( result/(5*15.0), 0, 0.5n );
endmodule
While very simple, this is a complete functional model of the equalizer that can be used in a full mixed-signal simulation.
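To see the behavior this model captures without a simulator at hand, the same five-tap delay line and weighted summer can be sketched in plain Python. This numerical mock-up is ours, for illustration only; the normalization by 75 (5 taps times a maximum coefficient of 15) matches the model above:

```python
class Equalizer:
    """Behavioral mock-up of the five-tap equalizer model: a one-bit delay
    line feeding a weighted summer, output normalized by 75 = 5 * 15 so it
    never exceeds 1.0."""
    def __init__(self, k):
        assert len(k) == 5 and all(0 <= ki <= 15 for ki in k)
        self.k = list(k)
        self.z = [0] * 5   # delay line state, z0 first

    def clock(self, bit):
        """Advance one clock cycle: shift the input bit into the delay
        line and return the normalized weighted sum."""
        self.z = [bit] + self.z[:-1]
        return sum(zi * ki for zi, ki in zip(self.z, self.k)) / 75.0

# Impulse response: each clock after the impulse exposes one coefficient
# in turn, which is exactly what the test bench checks.
eq = Equalizer([3, 4, 5, 6, 7])
outputs = [eq.clock(1)] + [eq.clock(0) for _ in range(4)]
# outputs == [3/75, 4/75, 5/75, 6/75, 7/75]
```

Because the impulse occupies only one tap at a time, the output sequence reads back the coefficients one per cycle, which is why an impulse makes such an effective exhaustive stimulus.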
Once the model is written, there is the question: does it faithfully represent the circuit? At this point the question is easy to answer with authority: simply run the test bench on the model. Since the model is pin accurate, doing so is a simple matter of changing the configuration. If all tests pass, the model and the circuit are functionally equivalent.
It is important to recognize that while it was easy for us to confirm that the model matched the circuit, it was only possible because we invested in building a comprehensive test bench. Most people who write models for their blocks do not perform this verification because they have not developed a test bench. This is a very dangerous situation: errors in the model may go unfound, and those errors may result in compensating errors being injected into the implementation of surrounding blocks.
With a verified model of the block, it is now possible to take the next step: verifying that all of the analog blocks operate properly when connected together. To do so, one writes a test bench for the entire analog subsystem and applies it to the top-level schematic of the analog section, where each block is represented by its fully verified model. For our example, one might combine the equalizer with the cable driver and the receiver and perform loopback testing. This would be much too expensive if all of the blocks were at the transistor level, but is very reasonable when all of the blocks are represented by functional models. For more complete testing, one can drop one block at a time down to the transistor level while continuing to use models for the remaining blocks. This is referred to as mixed-level simulation. While more expensive than pure model-level simulation, it verifies each block at the transistor level within the context of the full subsystem.
With the models already created it is often possible to verify that the analog and digital subsystems operate together as expected. Simply write a test bench for the entire system and simulate the analog, represented at the model level, and the digital, represented with RTL, together in a Verilog-AMS simulator. While this is the simplest solution, it can be problematic in certain cases. It may be that the analog models are too slow to allow a thorough top-level verification, or it may be that there are constraints on the top-level verification, such as the need for SystemVerilog, that cannot be satisfied with existing Verilog-AMS simulators. In these cases a new model is created, often one written purely in Verilog. Again, this is a functional model, and so is generally not difficult to write. And again, the existence of a test bench means that the model can be verified to match the implementation. The importance of verifying the model cannot be overstated. As mentioned before, without verification the model could contain errors, which creates the risk that an otherwise working design will be modified to operate properly with the model, thereby breaking the implementation.
The Need for Verification Engineers
Analog verification requires a change in the way things are done in most analog design groups. New skills must be learned and new types of engineers must be found, hired, and trained. Why would one go to all this trouble? The answer is easy: there is no other way to assure that your design will function properly before you build it. Despite what simulation vendors may want you to believe, exhaustive regression testing on transistor-level schematics is completely impractical, nor will it become practical in the foreseeable future, because the complexity of the circuitry is increasing faster than the speed of the simulators and the computers they run on. It is important to realize that the complexity of analog circuits increases in three independent ways simultaneously: the circuits become larger, they become algorithmically more complex, and the number of modes and settings they support grows. Furthermore, once adopted, our approach to analog verification proves preferable to the old transistor-simulation-only methodology in two ways. First, our approach is based on the use of models, and so can begin much earlier in the design cycle. It can find errors before the transistor-level circuits are designed, which can save significant design effort. Second, the regression tests themselves are of much higher quality when they are developed in concert with the models, which allow the tests themselves to be much more fully exercised. Regression tests developed in concert with models are more comprehensive and more sophisticated than those developed in concert with transistor-level circuits alone.
To undertake this methodology one needs engineers trained in the art of analog functional modeling and testing and focused on verification. Such engineers are currently very rare. The alternative, using designers to both design and verify, is generally not as successful: it can be difficult to convince them to do it, they generally do not have the skills, and they tend to prioritize design activities over verification activities.
The most likely path to success is to identify engineers that have a basic understanding of analog circuitry and have a natural interest in programming, to train them in analog verification, and to arrange for coaching and guidance for them as they take on their first few projects. And it is to that end that Designer’s Guide Consulting has arranged its offerings. If you are interested in adopting analog verification in your designs, give us a call. We will visit and assess your current methodology, train your engineers, and then provide guidance and coaching in the form of consulting for your first few projects. If needed, we can also drive the verification process and provide supplementary verification services. The training can either be private on-site training at your facilities or open training programs at facilities that we arrange. The first of our open training sessions is coming up at the end of September in Santa Clara, California. We hope to see you there.
You can access a fully complete and executable version of this circuit and test bench at www.designers-guide.com/newsletters/0807/example.tgz.
Disclaimer: We strive to provide information that is both helpful and accurate. Designer’s Guide Consulting, Inc., the creator of this newsletter, makes no representations or guarantees regarding the newsletter's contents and assumes no liability in connection with the information contained in it.
Copyright © 2008 Designer’s Guide Consulting, Inc. No reproduction of this newsletter may be made without the express permission of Designer’s Guide Consulting, Inc.