Designer's Guide Consulting
Analog Verification Newsletter

Issue #2 - August 2007


GREETINGS

Dear friends and colleagues,

This month’s newsletter features an article describing how analog verification helps validate analog, mixed-signal, and RF design specifications and keep them synchronized with the implementation and with all model representations. At the end of the article, we pose a few questions. We’d greatly appreciate it if you could send us your thoughts. We’ll remove any identifying information and summarize the responses we receive in next month’s newsletter. We haven’t had time to set up the blog, so we’ll have to go with this approach for now.

We’ve also included a career-oriented article in this newsletter. As Designer’s Guide is a sponsor of the IEEE Custom Integrated Circuits Conference (CICC) and Henry is on its steering committee, this article discusses how one goes about joining the technical program committee of a conference.

Finally, we have several announcements, including another talk Ken is giving, an announcement for CICC, and a new service Designer’s Guide Consulting is offering.

We were gratified to hear that several of you attended the talk Ken gave at last month’s Agilent EEsof RFIC Seminar, “Tackling the Tough Problems,” as a result of reading our last newsletter.

If you think someone you know would find this newsletter useful, please suggest that he or she subscribe.

Sincerely,
Ken Kundert, President, Designer's Guide Consulting, Inc.
Henry Chang, Vice President, Designer's Guide Consulting, Inc.


ANNOUNCEMENTS

New Simulation Tune-Up Service Offered

We are offering a new service called the 'simulation tune-up'. Ken visits your site, gives a lecture of your choice in the morning, and then spends the rest of the day discussing simulation and modeling issues with your analog engineering staff. You can choose a lecture from the following list:

  • Verification of complex analog integrated circuits
  • Using RF simulators on analog circuits: how to perform simulations you never thought were possible
  • An introduction to cyclostationary noise: noise in mixers, oscillators, samplers, and logic
  • Simulation of switched-capacitor and other sampled-data circuits
  • Determining the stability of feedback loops (including switched-capacitor and switching power supplies)

Or, you can request a lecture customized to your needs. The discussion is often the most valuable part of this service. It allows your designers to pose questions and present their problems to one of the world's leading experts on simulation. Your designers usually benefit from an improved understanding of the simulator's capabilities and of how they can be used to perform challenging simulations. Design teams usually make much more effective use of their simulators after a simulation tune-up.

If you are interested, please email consulting@designers-guide.com.


Ken to Give Educational Session at CICC on Analog Verification
Sunday, September 16, at the DoubleTree Hotel in San Jose, California.

Abstract

Verification is becoming widely recognized as one of the most important issues in designing large, complex analog and RF mixed-signal circuits. As a result, design methodologies are starting to change, mirroring a change that occurred in digital design 10-15 years ago. In this session I will show why the problem has become so significant and what people are doing to control it. The presentation is targeted at design management, design engineers, and verification engineers. It outlines a practical and proven methodology for performing the complete functional verification of the most complex analog SoCs, using examples to illustrate the essential points. This methodology not only assures that the implementation is functionally consistent with the specification, but also produces a high-level Verilog or VHDL model that is shown, through exhaustive transistor-level testing, to be functionally equivalent to the implementation. Use of this methodology also leads naturally to the adoption of a top-down design style and aids performance verification.

Register


Designer’s Guide Sponsors 2007 Custom Integrated Circuits Conference

Having attended CICC consistently for more than a decade, we have found it to be a great place for analog, mixed-signal, and RF designers to hone their design knowledge. CICC presents first-published work in the areas you are likely working in, such as high-speed serial I/O, data converters, RF design, analog circuit design, and power management. Many CAD papers aimed at helping the analog designer are presented as well. CICC is intended to be an educational conference: presenters tend to explain both what was done and how it was done in order to aid other designers, and there are educational sessions and a poster session to foster communication and learning. Henry and Ken will be at CICC. Please stop by and see us.

More information


Designer's Guide Newsletters are now on the Web

You can find all back issues of this newsletter on our consulting website, www.designers-guide.com/newsletters.


ANALOG VERIFICATION

Validating Specifications and Keeping Them Synchronized
By Henry Chang and Ken Kundert

Whether analog/mixed-signal/RF design is conducted top-down, bottom-up, or middle-out, and regardless of the level of re-use, in all cases we’ve seen there are always specifications for the design. They may or may not represent the true start of the design. They may or may not be of high quality. They could be captured loosely across a set of software products, or they could live in an incredibly complex specification tracking system. Anecdotally, all of us suspect or know that specifications never fully match the current state of the design, that they are never descriptive enough, and that keeping them in good shape is rarely high on anyone’s priority list. We live with them nonetheless, because specifications are a necessity. This article describes why this is the case, what is done in practice, and how analog verification addresses many of the issues associated with specifications.

Specifications for the analog, mixed-signal or RF block represent a primary means of communication. They are used within the analog design team. They are delivered to the end customer. And most importantly, they are used by the rest of the chip team. This team includes the test engineers, the digital system design team, and the digital implementation team. These teams typically lack the skills or time to understand what the analog design does from looking at the schematics, and thus rely on the specifications.

The system team either develops the specifications or uses them to determine whether or not the analog design fits the needs of the system. The design team uses them to understand what they are to implement. With any set of specifications, one must always consider the following two questions.

  • Are the specifications correct and complete?
  • Does the implementation match the specifications?

An important third follow-on question from the program management point-of-view is:

  • If the specifications change, because of a new requirement from the system team, the implementation team, or the customer, how effectively can the teams react?

In general, what we’ve observed is that these questions are answered in an ad hoc manner. A common methodology is that the system team starts with a high-level model (e.g. in Matlab). They partition the system algorithm into a digital and an analog portion. When the digital implementation team finishes the RTL, the system or digital team takes the RTL and applies vectors to it to confirm that it matches what they saw in their high-level model. The methodology breaks down with the analog group. Typically, there are two scenarios. In the first, no model, or only a stub model, representing the analog block’s implementation is given to the system team. In this case, no verification of the analog portion of the design within the context of the overall system can be performed. In the second, a behavioral model (e.g. in Verilog, VHDL, or Verilog-AMS) is developed for the analog block. Usually it is based on reading the specifications, or on a combination of reading the specifications and talking to the designers. Unfortunately, most authors of behavioral models do not use a simulator to systematically verify that the behavioral model and the implementation are functionally consistent. In that case the models have limited value because they cannot be trusted, and if they contain errors, having them may be worse than having no models at all.
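To make this concrete, here is a minimal sketch of what such a behavioral model might look like. It is purely illustrative and not taken from any real design: the block is a hypothetical variable gain amplifier (the same example we return to below), the pin names are invented, and the gain table stands in for whatever the written specification actually requires.

    // A minimal, hypothetical Verilog-AMS behavioral model of a variable gain
    // amplifier. The pin names and gain table are invented for illustration;
    // in practice they would come directly from the written specification.
    `include "disciplines.vams"

    module vga (out, in, gain_sel, en);
        output out;  electrical out;   // amplifier output
        input  in;   electrical in;    // amplifier input
        input  [1:0] gain_sel;         // gain control bits defined in the spec
        input  en;                     // enable; block is powered down when low

        real gain;                     // linear voltage gain

        // map the control bits to the gain exactly as the specification states
        always @(gain_sel or en) begin
            case (gain_sel)
                2'b00: gain = 2.0;     //  6 dB
                2'b01: gain = 4.0;     // 12 dB
                2'b10: gain = 8.0;     // 18 dB
                2'b11: gain = 16.0;    // 24 dB
            endcase
            if (!en) gain = 0.0;       // powered down: no signal path
        end

        analog V(out) <+ transition(gain, 0, 1n) * V(in);
    endmodule

A model like this is only a few lines long, yet it captures exactly the functional behavior the rest of the chip team needs. The hard part, as noted above, is demonstrating that it actually matches the transistor-level implementation.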

For the second question, “do the specifications match the implementation?”, what we have observed is that analog designers spend most of their time verifying that the performance specifications are met, but relatively little time verifying all of the functional aspects of the specifications. Because the digital parts are “simple,” visual inspections and one-time simulations of small blocks are often all that is done. Yet it is the functional specifications of the analog block that detail what all of the I/O pins do. They explain the key sequences, such as power-up, power-down, how to get into loopback, how to enter the test modes, how to enter the sleep modes, any sequences associated with the control bits, and how calibration is to work. For an entire analog front end, such verification is typically deemed infeasible because circuit simulation times are too long, even with timing (“FastMOS”) simulators. Designers hope that the functional errors that do arise can be fixed in firmware; unfortunately, expensive FIBs and re-spins are often required. Again, as a result, the question of whether or not the specifications match the implementation is mostly left unanswered.
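These functional sequences are exactly the sort of thing that is cheap to check against a model and, with mixed-level simulation, against the implementation. As a hedged illustration, continuing the hypothetical VGA sketch above (pin names, timings, and limits are all invented), a directed test for its power-down and power-up behavior might look like:

    // Hypothetical directed test for the power-down/power-up behavior of the
    // vga sketch above. All numbers are illustrative stand-ins for the spec.
    `include "disciplines.vams"
    `timescale 1ns / 1ns

    module vga_powerup_test;
        electrical in, out;
        reg [1:0] gain_sel;
        reg       en;
        integer   errors;

        vga dut (.out(out), .in(in), .gain_sel(gain_sel), .en(en));

        analog V(in) <+ 0.1;               // small DC test input

        initial begin
            errors = 0;
            gain_sel = 2'b00;               // minimum gain setting (nominal gain of 2)
            en = 0;                         // start powered down
            #100;
            if (V(out) > 1.0e-3) begin      // spec: no signal path when powered down
                $display("FAIL: output active while powered down");
                errors = errors + 1;
            end
            en = 1;                         // power the block up
            #100;                           // allow the (illustrative) settling time
            if (V(out) < 0.19 || V(out) > 0.21) begin
                $display("FAIL: wrong output after power-up, V(out) = %g", V(out));
                errors = errors + 1;
            end
            if (errors == 0)
                $display("PASS: power-down/power-up sequence");
            $finish;
        end
    endmodule

Each of the other documented sequences (loopback, test modes, sleep modes, calibration) would get a similar directed test.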

On the last question, we’ve observed very little rigor when it comes to specification changes. Often specification changes are communicated verbally or in e-mail messages. Many times, designers keep their own private versions of the specifications. They make local changes, such as on the schematic itself, and little effort is made to fold those changes back into the top-level specification in a timely fashion. Specification changes happen. Because of the loose manner in which these changes propagate, design changes often take longer than necessary. We can’t say this is a huge problem, but it does add inefficiency to the design process. In design groups whose business is driven by specifications, we’ve seen complex specification tracking systems discussed.

Analog verification offers a solution for answering these three questions. One of the key attributes of analog verification is that it validates that the specifications, the transistor-level implementation, and any model representation of the implementation are consistent. The models can include a pure Verilog or VHDL behavioral model of the analog design. The validation is done with comprehensive automated regression tests. At a minimum, consistency is checked in terms of function – all of the control bits, bias inputs, supply levels, test modes, loopback, power-up/power-down sequences, etc. Function does not mean just digital behavior; it includes the nominal behavior of the analog circuit. For example, in a variable gain amplifier, specification correctness is checked in terms of how the gain control bits affect the actual gain. Functional correctness can also be checked at a higher level, where the entire analog front end is placed in the context of the complete system and verified with a system-level regression test.
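Continuing the hypothetical VGA example (with the same invented gain table), a minimal sketch of such a check simply steps through every control setting and compares the measured gain against the values written in the specification:

    // Hypothetical self-checking regression test: the expected gains are copied
    // straight from the specification table assumed in the vga sketch above, so
    // a mismatch in the model or the implementation causes the test to fail.
    `include "disciplines.vams"
    `timescale 1ns / 1ns

    module vga_gain_test;
        electrical in, out;
        reg [1:0] gain_sel;
        reg       en;
        real      expected, measured;
        integer   i, errors;

        vga dut (.out(out), .in(in), .gain_sel(gain_sel), .en(en));

        analog V(in) <+ 0.1;                // small DC test input

        initial begin
            errors = 0;
            en = 1;
            for (i = 0; i < 4; i = i + 1) begin
                gain_sel = i;               // step through every control setting
                case (i)                    // expected gain, from the specification
                    0: expected = 2.0;
                    1: expected = 4.0;
                    2: expected = 8.0;
                    3: expected = 16.0;
                endcase
                #100;                       // let the output settle
                measured = V(out) / 0.1;
                if (measured < 0.99*expected || measured > 1.01*expected) begin
                    $display("FAIL: gain_sel = %0d, expected %g, measured %g",
                             i, expected, measured);
                    errors = errors + 1;
                end
            end
            if (errors == 0)
                $display("PASS: all gain settings match the specification");
            $finish;
        end
    endmodule

The same test can be run against the behavioral model, against the transistor-level implementation, or against a mixed-level configuration of the two; only the binding of the device under test changes.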

This assurance comes from the fact that the expected results, which are coded into the regression tests, are based strictly on the specifications. If the regression tests fail when run on any of the models or on the transistor-level implementation, then it is clear there is an inconsistency. As long as care is taken to ensure that the regression tests change only when the specifications change, this methodology works. If a change is ever made to any of these design representations and is not propagated to the other views, the inconsistency will be caught. And because we apply mixed-level simulation techniques, the regression tests, even for large mixed-signal blocks, can be run overnight.

Going back to the original questions: are the specifications correct and complete? The process of building the models, building the tests, and running the tests against the implementation, all of which exercise the specifications, will find most of the errors and holes in the specifications. In addition, a fully validated model is produced that acts as an executable specification, which greatly reduces the chance that the specification is misinterpreted. On the second question, does the analog design match the specifications? This is the primary mission of analog verification as described above, and it is assured if the methodology is properly followed. Finally, how fast can the design team react to specification changes? We can’t say how fast the design team will react, but the models and tests already in place as a result of the verification methodology should speed up the process. Perhaps more importantly, the exhaustive suite of regression tests will ensure that no new errors are created as a result of a last-minute change.

We all have mixed feelings when it comes to specifications, but we all know deep down that they are necessary. Analog verification offers an approach to making sure that the specifications stay synchronized with the design and with all of the model representations.

Questions:

Think about how specifications are used in your design.

1. How confident are you that your specifications accurately reflect the design? (confident, somewhat confident, not very confident, no confidence)

2. What errors have you seen arise as a result of incorrect specifications? (examples would be great)

3. Are specification changes effectively communicated? (yes, somewhat, no)

4. What benefits can you see from having specifications checked automatically against the design and against all of the models? (again, examples would be great)

Please mail us your thoughts.

CAREER

Getting Onto a Conference Technical Program Committee
By Henry Chang

Engineers starting out often ask me how to get onto a conference technical program committee (TPC). I’ve served on many TPCs. The key thing to understand is that the TPC members of technical conferences are typically volunteers. Thus, TPCs are always looking for motivated individuals who can bring energy, enthusiasm, and technical talent to the conference. Conferences are typically measured on the number of papers they attract, the quality of those papers, and the number of attendees, so TPCs are also looking for technically connected individuals who can bring in high quality papers and colleagues.

Usually, joining a TPC requires only that you have a resume, that your company supports your participation in terms of time and expense, and that you can commit to carrying out the responsibilities outlined by the TPC chair. I can’t say this is true for all conferences, but usually the first step to getting on a TPC is to find the TPC chair and open a dialog with him or her. The TPC chair can almost always be found on the conference website. Start by letting him or her know why you are interested in joining the TPC. Convey enthusiasm and explain how you think you can help the conference. For the Custom Integrated Circuits Conference, requirements include attending the conference and the two TPC meetings, as well as participating in some of the non-technical aspects, such as running the education sessions and panels, soliciting sponsorships and exhibits, helping with publicity, and helping with the best-paper process. These duties usually benefit the TPC member as well, since they make a great excuse to network.

As a member of the CICC steering committee, I know that CICC is always looking for new TPC members. CICC begins rebuilding the next TPC right after the conference, so the best time to show your interest is at the conference itself. Any CICC TPC member will be happy to forward your information to next year’s TPC chair. If you are interested in joining the CICC TPC, please approach any of the TPC members while at the conference and help make the engineering community a better place. See you at the conference!



Disclaimer: We strive to provide information that is both accurate and helpful. Designer’s Guide Consulting, Inc., the creator of this newsletter, makes no representations or guarantees regarding the newsletter contents and assumes no liability in connection with the information contained in it.

Copyright © 2007 Designer’s Guide Consulting, Inc. No reproduction of this newsletter may be made without the express written permission of Designer’s Guide Consulting, Inc.
